Support for self attention guidance #150
Comments
@Mut1nyJD thanks! the more i think about this paper, the more excited i am. thank you for bringing this to my attention 😄
I also quite like this one: Fast Sampling of Diffusion Models via Operator Learning. Seems like a clever idea to speed up sampling, and it definitely feels better than distillation to me. Unfortunately the architecture description is quite vague, so it is hard to really understand their temporal convolution operator (for me it leaves too many open questions).
@Mut1nyJD ohh yes, i remember this paper, though i am not as excited about that one as about the guidance on the self-attention map
@Mut1nyJD from that group, i am most excited about https://arxiv.org/abs/2209.15171, will be circling back to bio soon
will definitely try this out this week, and if it pans out, abstract it into a framework so one can try guidance on signals other than the attention map
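The framework idea above could be as simple as parameterizing the guidance over an arbitrary degradation of the input. A hypothetical sketch of that generalization (all names here are assumptions for illustration, not an existing API in this repo):

```python
from typing import Callable
import numpy as np

def guided_prediction(
    pred_fn: Callable[[np.ndarray], np.ndarray],
    degrade_fn: Callable[[np.ndarray], np.ndarray],
    x: np.ndarray,
    scale: float,
) -> np.ndarray:
    """Generic extrapolation guidance: push the model's prediction away from
    the prediction made on a degraded copy of the input. Self-attention
    guidance would be the special case where `degrade_fn` blurs the regions
    highlighted by the self-attention map (names are hypothetical)."""
    strong = pred_fn(x)               # prediction on the clean input
    weak = pred_fn(degrade_fn(x))     # prediction on the degraded input
    return weak + scale * (strong - weak)
```

With `scale=0` this falls back to the weak (degraded) prediction, and `scale=1` recovers the clean prediction; values above 1 extrapolate past it, mirroring how classifier-free guidance scales work.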
Awesome! Sounds like a great plan. Unfortunately I have not had time to give it a try myself yet.
@lucidrains asked me to split this out of #141 into a new issue.
The authors' implementation can be found here: https://github.com/PnDong/Self-Attention-Guidance