
reconstruction loss becomes negative or 'nan' #1

Open
invinceleo opened this issue Nov 15, 2021 · 0 comments


invinceleo commented Nov 15, 2021

Thanks for sharing your code; I believe your work is very good.
I ran your code on a multi-class dataset because I want to do multi-category generation.
I didn't modify your overall framework or loss calculations, but I changed the network slightly to process the category vector so that the latent vector z contains categorical information.
But after 4 or 5 epochs of training, the reconstruction loss became negative. According to equations (23) and (26) of http://arxiv.org/abs/1308.0850, if pi * Norm is greater than 1, then the log value is positive and the loss becomes negative in
`result1 = -torch.log(result1 + epsilon)`.
But Norm is the probability density of a bivariate Gaussian, so I expect Norm ∈ [0, 1] and pi ∈ (0, 1).
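For reference, here is a minimal sketch of how I understand this loss from equations (23)–(26) of the Graves paper. This is not your repository's code; the function names, tensor shapes, and epsilon value are my assumptions:

```python
import math
import torch

def bivariate_normal_pdf(dx, dy, mu_x, mu_y, sigma_x, sigma_y, rho):
    # Density of a bivariate Gaussian, eqs. (24)-(25) of Graves (2013).
    z_x = ((dx - mu_x) / sigma_x) ** 2
    z_y = ((dy - mu_y) / sigma_y) ** 2
    z_xy = (dx - mu_x) * (dy - mu_y) / (sigma_x * sigma_y)
    z = z_x + z_y - 2.0 * rho * z_xy
    denom = 2.0 * math.pi * sigma_x * sigma_y * torch.sqrt(1.0 - rho ** 2)
    return torch.exp(-z / (2.0 * (1.0 - rho ** 2))) / denom

def reconstruction_loss(pi, dx, dy, mu_x, mu_y, sigma_x, sigma_y, rho,
                        epsilon=1e-5):
    # Mixture density, eq. (23), then negative log-likelihood, eq. (26).
    # pi and the Gaussian parameters have shape (batch, M); dx, dy are (batch,).
    pdf = bivariate_normal_pdf(dx.unsqueeze(-1), dy.unsqueeze(-1),
                               mu_x, mu_y, sigma_x, sigma_y, rho)
    mixture = torch.sum(pi * pdf, dim=-1)             # "pi * Norm", summed over M
    return torch.mean(-torch.log(mixture + epsilon))  # the line quoted above
```

If this is roughly what the code does, would computing the mixture in log space with torch.logsumexp (and lower-bounding the predicted sigmas) be a reasonable way to stabilize it against nan?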

How can I handle this? Could you please give me some advice?
By the way, I wonder whether this may be caused by the number of training iterations: there are over 400K sketches in the training dataset, so one epoch may contain over 4K g_steps.
