Schedulers

🤗 Diffusers provides many scheduler functions for the diffusion process. A scheduler takes a model’s output (the sample the diffusion process is iterating on) and a timestep, and returns a denoised sample. The timestep is important because it dictates where in the diffusion process the step is: data is generated by iterating forward n timesteps, and inference occurs by propagating backward through the timesteps. Based on the timestep, a scheduler may be discrete, in which case the timestep is an `int`, or continuous, in which case the timestep is a `float`.

Depending on the context, a scheduler defines how to iteratively add noise to an image (for example, during training) or how to update a sample based on a model’s output (during inference).
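As a rough illustration of the inference direction, here is a minimal sketch of a single Euler-style denoising step. It is a toy example, not any specific 🤗 Diffusers scheduler: the sample is nudged from the current noise level `sigma` toward the next one using the model’s denoised prediction.

```python
def euler_step(sample: float, denoised: float, sigma: float, sigma_next: float) -> float:
    """One Euler-style update (toy sketch, not a Diffusers scheduler):
    move the sample along the estimated derivative
    d = (sample - denoised) / sigma from sigma to sigma_next."""
    d = (sample - denoised) / sigma
    return sample + d * (sigma_next - sigma)


# With a "model" that predicts the clean sample perfectly, stepping
# all the way down to sigma_next=0 recovers the clean sample exactly.
clean = 1.0
noisy = clean + 10.0 * 0.5  # clean sample plus sigma * noise
print(euler_step(noisy, clean, sigma=10.0, sigma_next=0.0))  # -> 1.0
```

Real schedulers refine this basic idea with higher-order solvers, ancestral noise injection, or SDE formulations, which is what distinguishes the entries in the table below.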

Many schedulers are implemented from the k-diffusion library by Katherine Crowson, and they’re also widely used in A1111. To help you map the schedulers from k-diffusion and A1111 to the schedulers in 🤗 Diffusers, take a look at the table below:

| A1111/k-diffusion | 🤗 Diffusers | Usage |
|---|---|---|
| DPM++ 2M | DPMSolverMultistepScheduler | |
| DPM++ 2M Karras | DPMSolverMultistepScheduler | init with `use_karras_sigmas=True` |
| DPM++ 2M SDE | DPMSolverMultistepScheduler | init with `algorithm_type="sde-dpmsolver++"` |
| DPM++ 2M SDE Karras | DPMSolverMultistepScheduler | init with `use_karras_sigmas=True` and `algorithm_type="sde-dpmsolver++"` |
| DPM++ 2S a | N/A | very similar to `DPMSolverSinglestepScheduler` |
| DPM++ 2S a Karras | N/A | very similar to `DPMSolverSinglestepScheduler(use_karras_sigmas=True, ...)` |
| DPM++ SDE | DPMSolverSinglestepScheduler | |
| DPM++ SDE Karras | DPMSolverSinglestepScheduler | init with `use_karras_sigmas=True` |
| DPM2 | KDPM2DiscreteScheduler | |
| DPM2 Karras | KDPM2DiscreteScheduler | init with `use_karras_sigmas=True` |
| DPM2 a | KDPM2AncestralDiscreteScheduler | |
| DPM2 a Karras | KDPM2AncestralDiscreteScheduler | init with `use_karras_sigmas=True` |
| DPM adaptive | N/A | |
| DPM fast | N/A | |
| Euler | EulerDiscreteScheduler | |
| Euler a | EulerAncestralDiscreteScheduler | |
| Heun | HeunDiscreteScheduler | |
| LMS | LMSDiscreteScheduler | |
| LMS Karras | LMSDiscreteScheduler | init with `use_karras_sigmas=True` |
| N/A | DEISMultistepScheduler | |
| N/A | UniPCMultistepScheduler | |
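To make the mapping concrete, the first few rows of the table can be written as a small lookup from A1111 sampler names to a 🤗 Diffusers scheduler class name plus init kwargs. The dict itself is only an illustration (not a Diffusers API); the class names and kwargs come from the table above.

```python
# Illustrative lookup built from the table above (not a Diffusers API).
# Each entry maps an A1111/k-diffusion sampler name to the Diffusers
# scheduler class name and the kwargs to pass at init time.
A1111_TO_DIFFUSERS = {
    "DPM++ 2M": ("DPMSolverMultistepScheduler", {}),
    "DPM++ 2M Karras": ("DPMSolverMultistepScheduler", {"use_karras_sigmas": True}),
    "DPM++ 2M SDE": ("DPMSolverMultistepScheduler", {"algorithm_type": "sde-dpmsolver++"}),
    "DPM++ 2M SDE Karras": (
        "DPMSolverMultistepScheduler",
        {"use_karras_sigmas": True, "algorithm_type": "sde-dpmsolver++"},
    ),
    "Euler": ("EulerDiscreteScheduler", {}),
    "Euler a": ("EulerAncestralDiscreteScheduler", {}),
}

cls_name, kwargs = A1111_TO_DIFFUSERS["DPM++ 2M Karras"]
```

In practice you would resolve the class from `diffusers` and swap it into a pipeline, e.g. `pipe.scheduler = DPMSolverMultistepScheduler.from_config(pipe.scheduler.config, use_karras_sigmas=True)`.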

Noise schedules and schedule types

| A1111/k-diffusion | 🤗 Diffusers |
|---|---|
| Karras | init with `use_karras_sigmas=True` |
| sgm_uniform | init with `timestep_spacing="trailing"` |
| simple | init with `timestep_spacing="trailing"` |
| exponential | init with `timestep_spacing="linspace"`, `use_exponential_sigmas=True` |
| beta | init with `timestep_spacing="linspace"`, `use_beta_sigmas=True` |
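The Karras schedule in the table above interpolates noise levels in `sigma**(1/rho)` space, which concentrates steps near the low-noise end. A minimal sketch of how such a sigma ramp can be computed follows; the function name and default values are illustrative, not the 🤗 Diffusers implementation.

```python
def karras_sigmas(n: int, sigma_min: float = 0.1, sigma_max: float = 10.0, rho: float = 7.0):
    """Sketch of a Karras-style noise schedule: interpolate linearly
    in sigma**(1/rho) space, then raise back to the rho-th power,
    producing a decreasing ramp that is dense near sigma_min."""
    max_inv = sigma_max ** (1 / rho)
    min_inv = sigma_min ** (1 / rho)
    return [
        (max_inv + i / (n - 1) * (min_inv - max_inv)) ** rho
        for i in range(n)
    ]


sigmas = karras_sigmas(10)  # decreasing ramp from sigma_max to sigma_min
```

Schedulers initialized with `use_karras_sigmas=True` use a ramp of this shape instead of their default sigma spacing.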

All schedulers are built from the base SchedulerMixin class, which implements low-level utilities shared by all schedulers.

SchedulerMixin

Base class for all schedulers.

SchedulerMixin contains common functions shared by all schedulers such as general loading and saving functionalities.

ConfigMixin takes care of storing the configuration attributes (like `num_train_timesteps`) that are passed to the scheduler’s `__init__` function, and the attributes can be accessed by `scheduler.config.num_train_timesteps`.
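The behavior described above can be sketched roughly as follows. This toy class is not the real `ConfigMixin`; it only illustrates the pattern of capturing `__init__` arguments under a `.config` attribute.

```python
from types import SimpleNamespace


class TinyScheduler:
    """Toy stand-in (not the real ConfigMixin) illustrating the pattern:
    init arguments are recorded on self.config for later inspection."""

    def __init__(self, num_train_timesteps: int = 1000, beta_start: float = 0.0001):
        self.config = SimpleNamespace(
            num_train_timesteps=num_train_timesteps,
            beta_start=beta_start,
        )


scheduler = TinyScheduler()
print(scheduler.config.num_train_timesteps)  # -> 1000
```

The real mixin additionally handles serializing this configuration to JSON, which is what makes `save_pretrained()` and `from_pretrained()` below possible.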


from_pretrained

`( pretrained_model_name_or_path: typing.Union[str, os.PathLike, NoneType] = None, subfolder: typing.Optional[str] = None, return_unused_kwargs = False, **kwargs )`

Instantiate a scheduler from a pre-defined JSON configuration file in a local directory or Hub repository.

To use private or gated models, log in with `huggingface-cli login`. You can also activate the special “offline-mode” to use this method in a firewalled environment.

save_pretrained

`( save_directory: typing.Union[str, os.PathLike], push_to_hub: bool = False, **kwargs )`

Save a scheduler configuration object to a directory so that it can be reloaded using the `from_pretrained()` class method.

SchedulerOutput

class diffusers.schedulers.scheduling_utils.SchedulerOutput

`( prev_sample: Tensor )`

Base class for the output of a scheduler’s step function.

KarrasDiffusionSchedulers

KarrasDiffusionSchedulers are a broad generalization of schedulers in 🤗 Diffusers. The schedulers in this class are distinguished at a high level by their noise sampling strategy, the type of network and scaling, the training strategy, and how the loss is weighted.

The different schedulers in this class, depending on the ordinary differential equation (ODE) solver type, fall into the above taxonomy and provide a good abstraction for the design of the main schedulers implemented in 🤗 Diffusers.

PushToHubMixin

class diffusers.utils.PushToHubMixin


A Mixin to push a model, scheduler, or pipeline to the Hugging Face Hub.

push_to_hub

`( repo_id: str, commit_message: typing.Optional[str] = None, private: typing.Optional[bool] = None, token: typing.Optional[str] = None, create_pr: bool = False, safe_serialization: bool = True, variant: typing.Optional[str] = None )`

Upload model, scheduler, or pipeline files to the 🤗 Hugging Face Hub.

Examples:

```python
from diffusers import UNet2DConditionModel

unet = UNet2DConditionModel.from_pretrained("stabilityai/stable-diffusion-2", subfolder="unet")

# push to your personal namespace with the repo name "my-finetuned-unet"
unet.push_to_hub("my-finetuned-unet")

# push to an organization with the repo name "my-finetuned-unet"
unet.push_to_hub("your-org/my-finetuned-unet")
```
