
Commit 16bdbcd

oops

1 parent 771341a

2 files changed (+3, -3 lines)


adam_atan2_pytorch/adopt.py

Lines changed: 2 additions & 2 deletions

@@ -98,15 +98,15 @@ def step(
     next_m = grad.div(v.sqrt().clamp(min = eps)) # they claim that a max(value, eps) performs better than adding the epsilon

     if steps > 1:
-        m.lerp_(next_m, 1. - beta2)
+        m.lerp_(next_m, 1. - beta1)

     # then update parameters

     p.add_(m, alpha = -lr)

     # update exp grad sq (v)

-    v.lerp_(grad_sq, 1. - beta1)
+    v.lerp_(grad_sq, 1. - beta2)

     # increment steps
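The fix swaps which beta drives each interpolation: the first moment `m` should decay with `beta1` and the second moment `v` with `beta2`, matching the ADOPT-style rule of normalizing the gradient by the previous `v` before applying momentum. Below is a minimal, self-contained sketch of one such update on a single tensor; it is an illustration of the corrected beta roles, not the library's actual API, and all names (`adopt_step`, `param`, `grad`, `m`, `v`, `steps`, `lr`, `betas`, `eps`) are hypothetical.

```python
import torch

def adopt_step(param, grad, m, v, steps, lr = 1e-3, betas = (0.9, 0.9999), eps = 1e-6):
    # hypothetical sketch of an ADOPT-style update; not the repo's implementation
    beta1, beta2 = betas

    if steps == 0:
        # first step only initializes the second moment estimate
        v.copy_(grad.square())
        return

    # normalize the gradient with the *previous* second moment estimate,
    # clamping by eps instead of adding it
    next_m = grad.div(v.sqrt().clamp(min = eps))

    if steps > 1:
        # first moment (m) is interpolated with beta1 -- what this commit fixes
        m.lerp_(next_m, 1. - beta1)
    else:
        m.copy_(next_m)

    # parameter update
    param.add_(m, alpha = -lr)

    # second moment (v) is interpolated with beta2 -- the other half of the fix
    v.lerp_(grad.square(), 1. - beta2)
```

With the betas swapped as in the pre-fix code, the slow-moving coefficient would have governed the momentum and the fast one the variance estimate, which is the reverse of the intended behaviour.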

pyproject.toml

Lines changed: 1 addition & 1 deletion

@@ -1,6 +1,6 @@
 [project]
 name = "adam-atan2-pytorch"
-version = "0.1.2"
+version = "0.1.4"
 description = "Adam-atan2 for Pytorch"
 authors = [
     { name = "Phil Wang", email = "lucidrains@gmail.com" }
