
Support for self attention guidance #150

Open
Mut1nyJD opened this issue Jan 11, 2023 · 6 comments

Comments

@Mut1nyJD

@lucidrains asked me to split this out of #141 into a new issue.
The authors' implementation can be found here: https://github.com/PnDong/Self-Attention-Guidance

@lucidrains
Owner

@Mut1nyJD thanks! the more i think about this paper, the more excited i am

thank you for bringing this to my attention 😄

@Mut1nyJD
Author

I also quite like this one:

Fast Sampling of Diffusion Models via Operator Learning
https://arxiv.org/abs/2211.13449

Seems like a clever idea to speed up sampling, and it definitely feels better than distillation to me, but unfortunately the architecture description is too vague to really understand their temporal convolution operator (for me it leaves too many open questions).

@lucidrains
Owner

@Mut1nyJD ohh yes, i remember this paper, though i'm not as excited about that one as about the guidance on the self-attention map

@lucidrains
Owner

@Mut1nyJD from that group, i am most excited about https://arxiv.org/abs/2209.15171, will be circling around back to bio soon

@lucidrains
Owner

will definitely try this out this week, and if it pans out, abstract this into a framework so one can try guidance on signals other than the attention map
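For anyone following along, here is a rough sketch of what such a guidance hook might look like. This is my own minimal NumPy illustration, not the paper's or this repo's actual code: the function names (`attention_mask`, `selective_blur`, `sag_combine`) are hypothetical, a box blur stands in for the Gaussian blur the SAG paper uses, and the mask threshold (mean of the attention map) is an assumption. The key idea is just: degrade the regions the model attends to, re-predict the noise on the degraded input, and push the final prediction away from that degraded prediction.

```python
import numpy as np

def attention_mask(attn_map, threshold=None):
    # hypothetical helper: binary mask of high-attention regions.
    # assumption: threshold at the mean of the attention map
    if threshold is None:
        threshold = attn_map.mean()
    return attn_map > threshold

def blur(x, k=3):
    # simple box blur as a stand-in for the Gaussian blur in the paper
    pad = k // 2
    padded = np.pad(x, pad, mode="edge")
    out = np.zeros_like(x, dtype=float)
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i, j] = padded[i : i + k, j : j + k].mean()
    return out

def selective_blur(x, mask, k=3):
    # degrade only the masked (high-attention) regions
    return np.where(mask, blur(x, k), x)

def sag_combine(eps_orig, eps_degraded, scale=1.0):
    # guidance step: move the prediction away from the
    # degraded-input prediction, analogous to classifier-free guidance
    return eps_orig + scale * (eps_orig - eps_degraded)
```

The appeal of abstracting this into a framework is that `sag_combine` only needs a pair of predictions; any signal you can mask and degrade (attention map, feature map, frequency band) would slot into the same interface.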

@Mut1nyJD
Author

> will definitely try this out this week, and if it pans out, abstract this into a framework so one can try guidance on signals other than the attention map

Awesome! Sounds like a great plan. Unfortunately I have not had time to give it a try myself yet.
