
Trying to run torch_compile_with_torchao cannot import name 'TorchAoConfig' from 'transformers' #32

Open
lukaLLM opened this issue Sep 9, 2024 · 2 comments

Comments

@lukaLLM

lukaLLM commented Sep 9, 2024

Hi, while running the notebook from the recipes repo (https://github.com/huggingface/huggingface-llama-recipes/blob/main/torch_compile_with_torchao.ipynb) I got `cannot import name 'TorchAoConfig' from 'transformers'`. I installed the newest PyTorch using `conda install pytorch torchvision torchaudio pytorch-cuda=12.4 -c pytorch-nightly -c nvidia` and followed the installation instructions at https://github.com/pytorch/ao. Am I doing something wrong? I use Meta-Llama-3.1-8B-Instruct locally.

@ariG23498
Collaborator

Hey @lukaLLM

Could you give https://github.com/huggingface/huggingface-llama-recipes/blob/main/performance_optimization/torch_compile_with_torchao.ipynb another try with the latest version of transformers?
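Since the import error usually means the installed transformers predates `TorchAoConfig`, a quick version check before importing can confirm this. A minimal stdlib-only sketch; the `(4, 45)` minimum version is an assumption, not something confirmed in this thread:

```python
from importlib import metadata


def meets_min(version: str, minimum: tuple) -> bool:
    """True if a 'major.minor[.patch]' version string is >= (major, minor)."""
    parts = tuple(int(p) for p in version.split(".")[:2])
    return parts >= minimum


def transformers_has_torchao_config(minimum=(4, 45)) -> bool:
    """Guess whether the installed transformers is new enough for TorchAoConfig.

    The (4, 45) cutoff is an assumption; returns False if transformers
    is not installed at all.
    """
    try:
        return meets_min(metadata.version("transformers"), minimum)
    except metadata.PackageNotFoundError:
        return False
```

If the check comes back `False`, upgrading with `pip install -U transformers` (or installing from source) should make the import available.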

@lukaLLM
Author

lukaLLM commented Oct 2, 2024

I will need some time for this, as I get dependency conflicts in the project if I do that. I think I should also configure it differently on Windows. Still not sure if it's faster than just FlashAttention 2.
