Extend optimizers in HAT.
| Member | Summary |
|---|---|
| legacy_nadam_ex.LegacyNadamEx | Nadam optimizer. |
| lion.Lion | Implements the Lion algorithm (<https://arxiv.org/pdf/2302.06675.pdf>). |
| optim_param_wrap.custom_param_optimizer | Returns an optimizer with custom parameter settings. |
Nadam optimizer.
This optimizer computes weight decay in a non-standard (legacy) way, but achieves noticeably better performance.
Perform a single optimization step.
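For orientation, a single Nadam step can be sketched for a scalar parameter. This is a simplified textbook version (Adam with a Nesterov-style look-ahead on the first moment); it omits Nadam's momentum-decay schedule and the legacy weight-decay handling, so it does not reproduce LegacyNadamEx exactly:

```python
import math

def nadam_step(param, grad, m, v, t, lr=2e-3, beta1=0.9, beta2=0.999, eps=1e-8):
    """One simplified Nadam step for a scalar parameter (sketch only).

    t is the 1-based step count, m and v are the running moment estimates.
    """
    m = beta1 * m + (1 - beta1) * grad            # first moment (EMA of gradients)
    v = beta2 * v + (1 - beta2) * grad * grad     # second moment (EMA of squared gradients)
    m_hat = m / (1 - beta1 ** t)                  # bias correction
    v_hat = v / (1 - beta2 ** t)
    # Nesterov look-ahead: mix the corrected momentum with the current gradient.
    update = beta1 * m_hat + (1 - beta1) * grad / (1 - beta1 ** t)
    param = param - lr * update / (math.sqrt(v_hat) + eps)
    return param, m, v
```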
Implements Lion algorithm <https://arxiv.org/pdf/2302.06675.pdf>.
Note docs: https://horizonrobotics.feishu.cn/docx/AsStdmRXIoeSyZxk9OPccxvnn0u
Perform a single optimization step.
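The Lion update is simple enough to sketch in a few lines. The scalar version below follows the update rule from the paper linked above; it is only an illustration, not the lion.Lion implementation (which operates on torch tensors and param groups):

```python
def lion_step(param, grad, m, lr=1e-4, beta1=0.9, beta2=0.99, weight_decay=0.0):
    """One Lion update for a single scalar parameter (illustrative sketch).

    Update rule from the Lion paper:
      c      = beta1 * m + (1 - beta1) * grad      # interpolation used for the update
      param -= lr * (sign(c) + weight_decay * param)
      m      = beta2 * m + (1 - beta2) * grad      # momentum (EMA of gradients)
    """
    sign = lambda x: (x > 0) - (x < 0)
    c = beta1 * m + (1 - beta1) * grad
    param = param - lr * (sign(c) + weight_decay * param)
    m = beta2 * m + (1 - beta2) * grad
    return param, m
```

Because the update applies only the sign of the interpolated momentum, the step magnitude is controlled entirely by the learning rate, which is why Lion typically uses a smaller lr than Adam-family optimizers.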
Return optimizer with custom params setting.
The custom_param_mapper has the following key characteristics:
Key Matching:
- Class of torch.nn.Module: The keys can directly match the corresponding parameters of the model.
- Predefined types: Keys can be chosen from predefined types. Supported types include ["norm_types", ].
- String match: Keys can be matched against param_names.
- A tuple of any of the previous three kinds of keys.
Value Setting: Optimizer parameters can be set, e.g., {"weight_decay": 1e-4, "lr": 0.01}.
Example:
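To make the string-match rule concrete, here is a toy sketch of how a custom_param_mapper could be turned into per-parameter optimizer groups. The function name and signature are hypothetical, not the actual hat API; it only illustrates the key-matching idea described above:

```python
def build_param_groups(named_params, custom_param_mapper, base_lr=0.01, base_wd=1e-4):
    """Hypothetical sketch: parameters whose names contain a string key of
    ``custom_param_mapper`` get that key's overrides; the rest keep defaults.

    ``named_params`` is a list of (name, param) pairs, e.g. from
    ``model.named_parameters()``.
    """
    groups = []
    for name, param in named_params:
        opts = {"lr": base_lr, "weight_decay": base_wd}
        for key, overrides in custom_param_mapper.items():
            if isinstance(key, str) and key in name:  # string match on param name
                opts.update(overrides)
        groups.append({"params": [param], "name": name, **opts})
    return groups

# e.g. disable weight decay for all bias parameters:
mapper = {"bias": {"weight_decay": 0.0}}
```

A groups list shaped like this can be passed directly as the first argument to a torch optimizer such as torch.optim.SGD.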