Description
It is always good to have more options to choose from, so it would be a good idea to add more optimizers. The steps are the following:
- in conf/optimizer add a config for a new optimizer
- if this optimizer requires some other library, update requirements
- run the tests to check that everything works, with the command `pytest`
Example: https://github.com/Erlemar/pytorch_tempest/blob/master/conf/optimizer/adamw.yaml
```yaml
# @package _group_
class_name: torch.optim.AdamW
params:
  lr: ${training.lr}
  weight_decay: 0.001
```
- `# @package _group_` - a default line necessary for hydra
- `class_name` - the full name/path to the object
- `params` - the parameters which are overridden. If the optimizer has more parameters than those defined in the config, the default values will be used.
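To make the role of `class_name` and `params` concrete, here is a minimal standalone sketch of how such a config can be turned into an optimizer instance. The `load_obj` helper and the inline config are illustrative assumptions for this example and are not necessarily how the repository wires it up (in the real project the config comes from hydra and `lr` is interpolated from `training.lr`):

```python
import importlib

import torch
from omegaconf import OmegaConf


def load_obj(obj_path: str):
    """Resolve a dotted path such as 'torch.optim.AdamW' to the class it names (assumed helper)."""
    module_path, obj_name = obj_path.rsplit(".", 1)
    return getattr(importlib.import_module(module_path), obj_name)


# Hypothetical standalone usage: the config is created inline here and the
# ${training.lr} interpolation is replaced with a plain value.
cfg = OmegaConf.create(
    {"class_name": "torch.optim.AdamW", "params": {"lr": 1e-3, "weight_decay": 0.001}}
)
model = torch.nn.Linear(10, 2)
optimizer = load_obj(cfg.class_name)(model.parameters(), **cfg.params)
```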
There are 3 possible cases when adding an optimizer:
- a default pytorch optimizer. Simply add a config for it.
- an optimizer from another library. Add this library to the requirements and define a config with `class_name` based on the library, for example `adamp.AdamP`.
- an optimizer from a custom class. Add the class to src/optimizers and add a config with the full path to the class (see the sketch after this list).
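For the custom-class case, here is a minimal sketch of what such an optimizer might look like. The file name, class name, and hyperparameters are made up for illustration only:

```python
# src/optimizers/sgd_clip.py (hypothetical file)
import torch
from torch.optim import Optimizer


class SGDWithClipping(Optimizer):
    """Plain SGD with per-parameter gradient clipping (illustrative example only)."""

    def __init__(self, params, lr: float = 1e-3, clip_value: float = 1.0):
        defaults = dict(lr=lr, clip_value=clip_value)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                # Clip each gradient element, then take a plain SGD step.
                grad = p.grad.clamp(-group["clip_value"], group["clip_value"])
                p.add_(grad, alpha=-group["lr"])
        return loss
```

The matching config would then mirror the AdamW example above, with `class_name` pointing at the full path of the class (e.g. `src.optimizers.sgd_clip.SGDWithClipping` for this hypothetical file).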