composer.optim

Optimizers and learning rate schedulers.

Composer is compatible with optimizers based on PyTorch's native Optimizer API, including common optimizers such as torch.optim.SGD and torch.optim.Adam. However, where applicable, it is recommended to use the optimizers provided in decoupled_weight_decay, since they improve on their PyTorch equivalents.
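For example, a minimal sketch of swapping in one of the decoupled optimizers, assuming DecoupledAdamW is exported from composer.optim (it is defined in decoupled_weight_decay):

```python
# Minimal sketch: use Composer's DecoupledAdamW in place of torch.optim.AdamW.
# Assumes DecoupledAdamW is importable from composer.optim.
import torch
from composer.optim import DecoupledAdamW

model = torch.nn.Linear(10, 2)

# The weight decay term is applied independently of the learning rate,
# rather than being scaled by it as in torch.optim.AdamW.
optimizer = DecoupledAdamW(model.parameters(), lr=1e-3, weight_decay=1e-5)
```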

PyTorch schedulers can be used with Composer, but this is discouraged. Instead, it is recommended to use schedulers based on Composer's ComposerScheduler API, which allows more flexibility and configuration when writing schedulers.
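As an illustration, the sketch below configures one of the built-in ComposerScheduler-based schedulers; the class name and time-string values are assumptions based on typical Composer usage and may differ between versions:

```python
# Minimal sketch: a ComposerScheduler-style scheduler configured with
# time strings. Assumes MultiStepScheduler is exported from composer.optim.
from composer.optim import MultiStepScheduler

# Milestones are given as time strings ('30ep' = 30 epochs) rather than raw
# step counts, part of the added flexibility over PyTorch schedulers.
scheduler = MultiStepScheduler(milestones=['30ep', '60ep'], gamma=0.1)

# The scheduler is then passed to the Trainer, e.g.:
#   trainer = Trainer(..., schedulers=scheduler, max_duration='90ep')
```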

Functions

Classes