
Add new optimizers #15

@Erlemar

Description

It is always good to have more options to choose from, so it would be a good idea to add more optimizers. The steps are the following:

  • in conf/optimizer, add a config for the new optimizer
  • if the optimizer requires another library, update the requirements
  • run the tests with the pytest command to check that everything works

Example: https://github.com/Erlemar/pytorch_tempest/blob/master/conf/optimizer/adamw.yaml

```yaml
# @package _group_
class_name: torch.optim.AdamW
params:
  lr: ${training.lr}
  weight_decay: 0.001
```
  • # @package _group_ - a required line for Hydra config groups
  • class_name - the full import path of the class
  • params - the parameters to override. If the optimizer has more parameters than are defined in the config, their default values will be used.
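
To show how such a config turns into an optimizer instance, here is a minimal sketch of the dynamic-import approach; the helper name `load_obj` and the resolved config values below are illustrative, the exact utility used in pytorch_tempest may differ.

```python
import importlib

import torch
from omegaconf import OmegaConf


def load_obj(obj_path: str):
    """Import an object from its full dotted path, e.g. 'torch.optim.AdamW'."""
    module_path, _, obj_name = obj_path.rpartition(".")
    module = importlib.import_module(module_path)
    return getattr(module, obj_name)


# Hypothetical config mirroring conf/optimizer/adamw.yaml,
# with ${training.lr} already resolved to a concrete value.
cfg = OmegaConf.create(
    {"optimizer": {"class_name": "torch.optim.AdamW",
                   "params": {"lr": 1e-3, "weight_decay": 0.001}}}
)

model = torch.nn.Linear(10, 2)
optimizer_cls = load_obj(cfg.optimizer.class_name)
# Parameters not listed under `params` (betas, eps, ...) keep their defaults.
optimizer = optimizer_cls(model.parameters(), **cfg.optimizer.params)
```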

There are three possible cases when adding an optimizer:

  • a default PyTorch optimizer: simply add a config for it.
  • an optimizer from another library: add the library to the requirements and define a config whose class_name points into that library, for example adamp.AdamP.
  • an optimizer from a custom class: add the class to src/optimizers and add a config with the full path to the class (see the sketch after this list).
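
For the third case, here is a minimal sketch of what a custom optimizer under src/optimizers could look like; the file and class names are hypothetical and the optimizer itself is just plain SGD with weight decay, shown only to illustrate the structure.

```python
# Hypothetical file: src/optimizers/custom_sgd.py
import torch
from torch.optim import Optimizer


class CustomSGD(Optimizer):
    """Bare-bones SGD variant, used here only to show where a custom optimizer lives."""

    def __init__(self, params, lr=1e-3, weight_decay=0.0):
        defaults = dict(lr=lr, weight_decay=weight_decay)
        super().__init__(params, defaults)

    @torch.no_grad()
    def step(self, closure=None):
        loss = None
        if closure is not None:
            with torch.enable_grad():
                loss = closure()
        for group in self.param_groups:
            for p in group["params"]:
                if p.grad is None:
                    continue
                d_p = p.grad
                if group["weight_decay"] != 0:
                    # Add L2 penalty directly to the gradient.
                    d_p = d_p.add(p, alpha=group["weight_decay"])
                # Plain gradient descent update.
                p.add_(d_p, alpha=-group["lr"])
        return loss
```

The matching config would then set class_name to the full path of the class, e.g. src.optimizers.custom_sgd.CustomSGD, with lr and weight_decay under params.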
