I used this code to train on the CIFAR-10 dataset. Following the DDPM paper, my batch size is 128, the optimizer is Adam, the learning rate is 2e-4, and I use the L2 loss. The training loss keeps fluctuating between 0.015 and 0.030. Do I need to reduce the learning rate? Is there a training log I can refer to?
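For reference, here is a minimal sketch of the training setup described above in plain PyTorch. The `SimpleUNet` stand-in model and the explicit forward-diffusion noising step are my own illustrative assumptions, not this repository's actual code; only the hyperparameters (batch size 128, Adam with lr 2e-4, L2 loss on the predicted noise) come from the question.

```python
import torch
import torch.nn as nn
from torch.utils.data import DataLoader
from torchvision import datasets, transforms

# Hypothetical stand-in for the repository's UNet; swap in the real model.
class SimpleUNet(nn.Module):
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(3, 64, 3, padding=1), nn.SiLU(),
            nn.Conv2d(64, 3, 3, padding=1),
        )

    def forward(self, x, t):
        return self.net(x)

T = 1000
betas = torch.linspace(1e-4, 0.02, T)            # DDPM linear beta schedule
alphas_cumprod = torch.cumprod(1.0 - betas, dim=0)

dataset = datasets.CIFAR10(root="./data", train=True, download=True,
                           transform=transforms.ToTensor())
loader = DataLoader(dataset, batch_size=128, shuffle=True)

model = SimpleUNet()
optimizer = torch.optim.Adam(model.parameters(), lr=2e-4)  # settings from the DDPM paper

for x0, _ in loader:
    t = torch.randint(0, T, (x0.size(0),))
    noise = torch.randn_like(x0)
    a_bar = alphas_cumprod[t].view(-1, 1, 1, 1)
    x_t = a_bar.sqrt() * x0 + (1 - a_bar).sqrt() * noise  # forward process q(x_t | x_0)
    pred = model(x_t, t)
    loss = nn.functional.mse_loss(pred, noise)             # L2 loss on predicted noise
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

Because the loss averages over randomly sampled timesteps, some fluctuation in the 0.015-0.030 range is expected even when training is healthy; sample quality is usually a better signal than the raw loss curve.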