Description
Is your feature request related to a problem? Please describe.
The diffusers library now has many new diffusion schedulers that might improve training quality, while MONAI offers a smaller number of schedulers and scheduler features.
Of course, it would make sense to try incorporating those, but since the diffusers library has grown so much, and the MONAI code was made relatively compatible with Hugging Face's, I wonder whether we could make our DiffusionInferer and LatentDiffusionInferer classes compatible with the schedulers from the diffusers library.
Describe the solution you'd like
The ability to use schedulers from diffusers with the MONAI inferers in the same way as the MONAI schedulers. Example:
```python
import diffusers
from monai.inferers import LatentDiffusionInferer

scheduler = diffusers.DDPMScheduler(
    num_train_timesteps=1000,
    beta_schedule='scaled_linear',
    beta_start=0.0015,
    beta_end=0.0205,
    prediction_type='epsilon',
    timestep_spacing='trailing',
)
inferer = LatentDiffusionInferer(scheduler=scheduler, scale_factor=0.5)
```
I've only had to make a few tweaks to get it working. For instance, the `step` function of the diffusers DDPMScheduler returns a special output dataclass by default, which causes an error, so when calling it on a diffusers scheduler you have to pass `return_dict=False` (an argument that doesn't exist in the MONAI version).
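One way to smooth over that difference without touching the inferers is a thin wrapper around the diffusers scheduler. This is only a minimal sketch; `DiffusersSchedulerAdapter` is a hypothetical name, not part of MONAI or diffusers, and the attribute forwarding assumes the inferer only needs what the wrapped scheduler already exposes:

```python
# Hypothetical adapter (not part of MONAI or diffusers): delegate everything to
# the wrapped diffusers scheduler, but make `step` return a plain tuple, as the
# MONAI inferers expect, instead of the diffusers output dataclass.
import diffusers
import torch
from monai.inferers import LatentDiffusionInferer


class DiffusersSchedulerAdapter:
    def __init__(self, scheduler):
        self.scheduler = scheduler

    def __getattr__(self, name):
        # Forward everything else (set_timesteps, timesteps, add_noise, ...)
        # to the wrapped diffusers scheduler.
        return getattr(self.scheduler, name)

    def step(self, model_output: torch.Tensor, timestep: int, sample: torch.Tensor):
        # return_dict=False makes diffusers return a tuple rather than a dataclass.
        return self.scheduler.step(model_output, timestep, sample, return_dict=False)


scheduler = DiffusersSchedulerAdapter(
    diffusers.DDPMScheduler(
        num_train_timesteps=1000,
        beta_schedule='scaled_linear',
        prediction_type='epsilon',
    )
)
inferer = LatentDiffusionInferer(scheduler=scheduler, scale_factor=0.5)
```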
Describe alternatives you've considered
Not using the MONAI inferers at all.
Things to consider
Perhaps we need to list which schedulers can be made compatible. I've only tried it with a simple DDPM scheduler; I assume there are others that require more modifications.
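To start building such a list, a rough probe could iterate over the scheduler classes exported by diffusers and check that each exposes the methods the MONAI inferers rely on. This is a hypothetical sketch; the `REQUIRED` names are assumptions based on the MONAI scheduler API and would need to be confirmed, and a passing check here doesn't guarantee full compatibility:

```python
# Hypothetical compatibility probe: the REQUIRED attribute names are assumed
# from the MONAI scheduler API and may need adjusting.
import inspect
import diffusers

REQUIRED = ("set_timesteps", "step", "add_noise")

for name, cls in inspect.getmembers(diffusers, inspect.isclass):
    if not name.endswith("Scheduler"):
        continue
    try:
        sched = cls()  # most diffusers schedulers have usable defaults
    except Exception:
        continue  # skip schedulers that need required constructor arguments
    missing = [attr for attr in REQUIRED if not hasattr(sched, attr)]
    status = "looks compatible" if not missing else "missing " + ", ".join(missing)
    print(f"{name}: {status}")
```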