
Commit eb94674

EnayatUllah authored and facebook-github-bot committed

Fix DistributedDP Optimizer for Fast Gradient Clipping (#662)

Summary:
Pull Request resolved: #662

The step function incorrectly called "original_optimizer.original_optimizer" instead of "original_optimizer". Fixed it now.

Reviewed By: HuanyuZhang

Differential Revision: D60484128

fbshipit-source-id: 1bde00292b2afccc31803ebefb2c361dc7e9bb77

1 parent 4804a51 · commit eb94674

File tree

2 files changed: +3 −2 lines changed

opacus/__init__.py

Lines changed: 2 additions & 1 deletion

@@ -14,14 +14,15 @@
 # limitations under the License.

 from . import utils
-from .grad_sample import GradSampleModule
+from .grad_sample import GradSampleModule, GradSampleModuleFastGradientClipping
 from .privacy_engine import PrivacyEngine
 from .version import __version__


 __all__ = [
     "PrivacyEngine",
     "GradSampleModule",
+    "GradSampleModuleFastGradientClipping",
     "utils",
     "__version__",
 ]
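Adding `GradSampleModuleFastGradientClipping` to `__all__` makes the class part of the package's declared public API, so `from opacus import *` picks it up. A minimal sketch of the `__all__` semantics involved, using an invented in-memory module (`toy_pkg` and its names are hypothetical, not part of Opacus):

```python
# Demonstrates that a star-import only pulls in names listed in __all__.
# "toy_pkg", "PublicThing", and "Internal" are invented for illustration.
import sys
import types

# Build a tiny in-memory module with two names but only one exported.
mod = types.ModuleType("toy_pkg")
exec(
    "__all__ = ['PublicThing']\n"
    "PublicThing = 'exported'\n"
    "Internal = 'defined, but not in __all__'\n",
    mod.__dict__,
)
sys.modules["toy_pkg"] = mod

ns = {}
exec("from toy_pkg import *", ns)
print("PublicThing" in ns)  # True: listed in __all__
print("Internal" in ns)     # False: omitted from __all__
```

Without this commit, `GradSampleModuleFastGradientClipping` had to be imported from the `opacus.grad_sample` submodule; the re-export makes it reachable directly from the top-level package.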

opacus/optimizers/ddpoptimizer_fast_gradient_clipping.py

Lines changed: 1 addition & 1 deletion

@@ -76,6 +76,6 @@ def step(

         if self.pre_step():
             self.reduce_gradients()
-            return self.original_optimizer.original_optimizer.step()
+            return self.original_optimizer.step()
         else:
             return None
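The bug is the kind of double-delegation mistake that wrapper optimizers invite: the DistributedDP optimizer already holds the real optimizer in `self.original_optimizer`, so reaching one level deeper asks a plain torch optimizer for an attribute it does not have. A minimal sketch of the wrapper pattern (toy classes, not the Opacus implementation):

```python
# Toy stand-ins for illustration only; ToySGD mimics a plain optimizer
# and ToyDPOptimizer mimics Opacus' wrapping pattern around it.
class ToySGD:
    """Plain optimizer: updates params and has no 'original_optimizer'."""

    def __init__(self, params, lr):
        self.params = params
        self.lr = lr

    def step(self):
        for p in self.params:
            p["value"] -= self.lr * p["grad"]


class ToyDPOptimizer:
    """Wrapper that clips/reduces, then delegates to the real optimizer."""

    def __init__(self, original_optimizer):
        self.original_optimizer = original_optimizer

    def pre_step(self):
        return True  # stand-in for the clipping/noising check

    def reduce_gradients(self):
        pass  # stand-in for the distributed gradient reduction

    def step(self):
        if self.pre_step():
            self.reduce_gradients()
            # Buggy version reached one level too deep:
            #   self.original_optimizer.original_optimizer.step()
            # which raises AttributeError, since ToySGD has no such field.
            return self.original_optimizer.step()
        else:
            return None


params = [{"value": 1.0, "grad": 0.5}]
opt = ToyDPOptimizer(ToySGD(params, lr=0.1))
opt.step()
print(params[0]["value"])  # 1.0 - 0.1 * 0.5 = 0.95
```

The fix simply removes the extra level of indirection: the wrapper delegates once, to the optimizer it wraps.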
