Double Stochasticity #4
This relates to a more general question of how to support different approaches to gradient estimation. At the moment the way to do that is by overloading
I think we could do something better than this though, e.g. by defining something like an expectation "operator".
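Purely as a hypothetical illustration of what I mean (none of the type or function names below exist in the package, they're made up for the sketch), such an operator could look roughly like:

```julia
# Hypothetical sketch of an "expectation operator"; all names are made up.
using Distributions, Random

abstract type AbstractExpectation end

# Monte Carlo operator: E_q[f(z)] ≈ (1/S) Σ_s f(z_s), with z_s ~ q
struct MonteCarloExpectation <: AbstractExpectation
    nsamples::Int
end

function expectation(op::MonteCarloExpectation, f, q::Distribution; rng = Random.default_rng())
    return sum(f(rand(rng, q)) for _ in 1:op.nsamples) / op.nsamples
end

# E_q[z^2] ≈ 1 for q = N(0, 1)
expectation(MonteCarloExpectation(1_000), z -> z^2, Normal())
```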
From my experience with AGP.jl: I created two different approaches for VI, one with sampling (the actual ADVI) and one with quadrature. For the quadrature I am directly using FastGaussQuadrature, as Expectations.jl is mostly a wrapper around it. Replacing
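For concreteness, here is a minimal sketch of the two routes for a univariate Gaussian q. The helper names are made up for illustration; `gausshermite` is the actual FastGaussQuadrature call:

```julia
# Sketch: E_{z ~ N(μ, σ²)}[f(z)] via Gauss–Hermite quadrature vs. plain sampling.
using FastGaussQuadrature, Distributions, Random

function expect_quadrature(f, μ, σ; npoints = 20)
    x, w = gausshermite(npoints)                  # nodes/weights for ∫ e^{-x²} g(x) dx
    # change of variables z = μ + √2 σ x turns this into E_{N(μ,σ²)}[f]
    return sum(w .* f.(μ .+ sqrt(2) * σ .* x)) / sqrt(π)
end

function expect_sampling(f, μ, σ; nsamples = 10_000, rng = Random.default_rng())
    q = Normal(μ, σ)
    return sum(f(rand(rng, q)) for _ in 1:nsamples) / nsamples
end

# Both should be close to E[z²] = μ² + σ² = 1.25 for μ = 0.5, σ = 1.0
expect_quadrature(z -> z^2, 0.5, 1.0)
expect_sampling(z -> z^2, 0.5, 1.0)
```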
I didn't mean using the existing Expectations.jl for expectation estimation (other than in the univariate case, for exactly the reason you mention). I just meant that I think the interface is nice and that it would be great to have something similar for this package. A lot of different VI algorithms differ only in how they estimate the objective, e.g. MC, importance-weighted, semi-implicit, so I would love to have some way to only implement a new estimator and then plug it into, say,

And this is why it's taking me so bloody long (sorry about that, again) to do a proper write-up of my thoughts on the topic, since I'll need to do a proper review of existing VI methods to get a good understanding of what functionality we need to cover.
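To make the "plug in a new estimator" idea a bit more concrete, here is a hypothetical sketch (the estimator types and `estimate_objective` are made-up names, not an existing API) of how a plain MC and an importance-weighted estimator of the objective could share one interface:

```julia
# Hypothetical sketch of pluggable objective estimators; all names are made up.
using Distributions, Random, Statistics

abstract type AbstractEstimator end

struct MonteCarloEstimator <: AbstractEstimator
    nsamples::Int
end

struct ImportanceWeightedEstimator <: AbstractEstimator
    nsamples::Int
end

# Plain MC estimate of the ELBO: E_q[log p(x, z) - log q(z)]
function estimate_objective(est::MonteCarloEstimator, logjoint, q; rng = Random.default_rng())
    return mean(logjoint(z) - logpdf(q, z) for z in (rand(rng, q) for _ in 1:est.nsamples))
end

# IWAE-style estimate: numerically stable log of the averaged importance weights
function estimate_objective(est::ImportanceWeightedEstimator, logjoint, q; rng = Random.default_rng())
    logw = [logjoint(z) - logpdf(q, z) for z in (rand(rng, q) for _ in 1:est.nsamples)]
    m = maximum(logw)
    return m + log(mean(exp.(logw .- m)))
end
```

A single optimization loop could then just dispatch on the estimator type, so adding a new VI variant only means defining a new `estimate_objective` method.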
One thing I forgot to mention is the "stochastic" or mini-batch VI in your initial comment.
Related discussion in
Closed in favor of #38
ADVI and other methods (SVGD, etc.) can exploit double stochasticity (stochastic estimation of the expectation via samples, and stochastic estimation of the log joint via mini-batches):

M. Titsias and M. Lázaro-Gredilla. Doubly stochastic variational Bayes for non-conjugate inference. ICML, 2014.
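As a rough sketch of what the two sources of noise look like in practice (illustrative names only, assuming a mean-field Gaussian q with the reparameterization trick, and mini-batches drawn with replacement for simplicity):

```julia
# Illustrative sketch of a doubly stochastic ELBO estimate for q(z) = N(μ, diag(exp(ω))²):
#   1. the expectation over q is approximated with reparameterized MC samples,
#   2. the log joint is approximated on a random mini-batch, rescaled by N/B.
# All names are illustrative, not part of any package API.
using Random

function elbo_estimate(logprior, loglik, data, μ, ω;
                       nsamples = 4, batchsize = 32, rng = Random.default_rng())
    N = length(data)
    batch = rand(rng, data, batchsize)            # stochasticity 2: mini-batch of the data
    σ = exp.(ω)
    est = 0.0
    for _ in 1:nsamples                           # stochasticity 1: samples from q
        z = μ .+ σ .* randn(rng, length(μ))       # reparameterization z = μ + σ ⊙ ε
        est += logprior(z) + (N / batchsize) * sum(loglik(z, x) for x in batch)
    end
    entropy_q = sum(ω) + length(μ) / 2 * (1 + log(2π))  # entropy of N(μ, diag(σ²))
    return est / nsamples + entropy_q
end
```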