Hi spaVAE team,
I tried running the model on a new dataset and ran into a case where the GP KLD loss becomes negative. After some debugging, I found that the GP KLD loss starts out positive at initialization but turns negative after a number of gradient-descent steps. Is this behavior expected?
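For context on why the sign surprised me: my understanding is that an exact KL divergence is non-negative by definition, so a negative value would have to come from an approximation (e.g. a sampling-based estimate) or a numerical issue. Below is a minimal self-contained sketch, not taken from the spaVAE code (the function name and distributions are mine), illustrating that the closed-form Gaussian KL is always >= 0 while single-sample Monte Carlo estimates of the same quantity can legitimately dip below zero:

```python
import math
import random

def kl_diag_gaussians(mu_q, logvar_q, mu_p, logvar_p):
    """Closed-form KL(q || p) between diagonal Gaussians, summed over
    dimensions. This exact quantity is non-negative by definition."""
    kl = 0.0
    for mq, lq, mp, lp in zip(mu_q, logvar_q, mu_p, logvar_p):
        vq, vp = math.exp(lq), math.exp(lp)
        kl += 0.5 * (lp - lq + (vq + (mq - mp) ** 2) / vp - 1.0)
    return kl

# q = N(0, 1), p = N(0.5, 1): exact KL = 0.5 * 0.5**2 = 0.125, positive.
exact = kl_diag_gaussians([0.0], [0.0], [0.5], [0.0])

# Single-sample Monte Carlo estimates of the same KL: log q(z) - log p(z)
# with z drawn from q. These are unbiased but NOT sign-constrained, so
# individual estimates (and small-batch averages) can come out negative.
random.seed(0)
estimates = []
for _ in range(1000):
    z = random.gauss(0.0, 1.0)  # z ~ q
    log_q = -0.5 * (math.log(2 * math.pi) + z ** 2)
    log_p = -0.5 * (math.log(2 * math.pi) + (z - 0.5) ** 2)
    estimates.append(log_q - log_p)
```

So my question boils down to: is the GP KLD term in spaVAE computed in closed form (in which case a negative value would indicate a bug or numerical problem), or is it estimated from samples (in which case occasional negative values may be harmless)?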