
KL_divergence subtraction z_dim #1

Open
lw3259111 opened this issue Jul 4, 2018 · 1 comment

@lw3259111

```python
KL_divergence = 0.5 * tf.reduce_mean(
    tf.reduce_sum(tf.exp(self.enc_logvar) - self.enc_logvar + self.enc_mean**2, axis=1)
    - self.z_dim
)
```
Thanks for your code, but I have a question about it. On line 137, why do you subtract z_dim?
I can't find this operation in the original PyTorch code.
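
For context, this line appears to implement the standard closed-form KL divergence between the diagonal-Gaussian posterior and the standard normal prior; writing $\log\sigma_j^2$ for `enc_logvar` and $d$ for `z_dim`:

$$
\mathrm{KL}\!\left(\mathcal{N}(\mu, \operatorname{diag}(\sigma^2)) \,\big\|\, \mathcal{N}(0, I)\right)
= \frac{1}{2}\sum_{j=1}^{d}\left(\sigma_j^2 + \mu_j^2 - 1 - \log\sigma_j^2\right).
$$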

@abdulfatir

In the KL expression there is a -1 term inside the sum over the latent dimension. Summing -1 over all z_dim dimensions gives exactly -z_dim, which is what is written: the constant has simply been pulled out of the sum.
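
A quick numerical check of that equivalence (a minimal NumPy sketch; `mean` and `logvar` here are hypothetical stand-ins for `self.enc_mean` and `self.enc_logvar`):

```python
import numpy as np

rng = np.random.default_rng(0)
z_dim = 8
mean = rng.normal(size=(4, z_dim))    # stand-in for a batch of encoder means
logvar = rng.normal(size=(4, z_dim))  # stand-in for a batch of encoder log-variances

# Per-dimension form, as in the PyTorch code: 0.5 * sum_j (exp(logvar) + mean^2 - 1 - logvar)
kl_per_dim = 0.5 * np.sum(np.exp(logvar) + mean**2 - 1.0 - logvar, axis=1)

# Factored form from line 137: the summed -1 terms become a single -z_dim
kl_factored = 0.5 * (np.sum(np.exp(logvar) - logvar + mean**2, axis=1) - z_dim)

assert np.allclose(kl_per_dim, kl_factored)
print(kl_per_dim.mean())  # batch-mean KL, matching tf.reduce_mean in the repo
```

Folding the summed -1 terms into a single -z_dim is purely algebraic, so line 137 computes the same quantity as the per-dimension formula in the original PyTorch code.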
