You don't need to skip `lr`. The type would be `Callable[[torch.nn.Parameter], torch.optim.Optimizer]`. If a class path is given for an optimizer that requires `lr`, then `lr` must also be provided. The fact that `lr` is positional is not important. At least, this is how it is supposed to work. Did something not work correctly for you?
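To make that concrete, here is a minimal sketch of the callable-type approach, assuming a recent jsonargparse (the parser setup and the `OptimizerCallable` alias are illustrative, not taken from this issue):

```python
# A minimal sketch of the callable-type approach, assuming a recent jsonargparse.
# The parser setup and the "OptimizerCallable" alias are illustrative, not from
# this issue.
from typing import Callable, Iterable

import torch
from jsonargparse import ArgumentParser

OptimizerCallable = Callable[[Iterable[torch.nn.Parameter]], torch.optim.Optimizer]

parser = ArgumentParser()
parser.add_argument("--optimizer", type=OptimizerCallable, default=torch.optim.Adam)

# A class path selects the optimizer; every init arg except `params` (the one
# callable input) is configurable, so lr is given here like any other init arg.
cfg = parser.parse_args(["--optimizer=torch.optim.SGD", "--optimizer.lr=0.01"])
init = parser.instantiate_classes(cfg)

model = torch.nn.Linear(4, 2)
opt = init.optimizer(model.parameters())  # returns SGD(params, lr=0.01)
```

With this typing, `lr` stays a regular init arg: it can be set from the command line or a config file, and is only mandatory when the selected class gives it no default.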
Sorry, what I meant is that I want to skip the `lr`: I don't want to expose it to the command line or config. My code will set a value automatically (not represented in the repro snippet above).
In this example, how would I skip the `lr`? I tried a couple of things, but they do nothing.
I noticed that typing the optimizer as a callable with two inputs will work, assuming that `lr` is the second positional argument of all optimizers, which might not always be true. Thanks for the help!
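For reference, a sketch of what that workaround might look like, again assuming a recent jsonargparse (names are illustrative): declaring two callable inputs makes the first two positional parameters of the chosen class come from code rather than from the config.

```python
# A sketch of the two-input-callable workaround, assuming a recent jsonargparse.
# Names are illustrative. With two callable inputs, the first two positional
# parameters of the chosen class are supplied by code instead of by the config,
# so lr disappears from the command line.
from typing import Callable, Iterable

import torch
from jsonargparse import ArgumentParser

OptimizerFactory = Callable[
    [Iterable[torch.nn.Parameter], float], torch.optim.Optimizer
]

parser = ArgumentParser()
parser.add_argument("--optimizer", type=OptimizerFactory, default=torch.optim.Adam)

cfg = parser.parse_args(["--optimizer=torch.optim.SGD"])  # no lr exposed
init = parser.instantiate_classes(cfg)

model = torch.nn.Linear(4, 2)
lr = 0.01  # value the code computes automatically
# Passing lr positionally only works while lr is the second positional
# parameter of every optimizer class, which is the caveat noted above.
opt = init.optimizer(model.parameters(), lr)
```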