
ERROR: No matching distribution found for flash-attn==2.6.3+cu123torch2.4cxx11abifalse #1423

Open
carolynsoo opened this issue Jan 6, 2025 · 0 comments


carolynsoo commented Jan 6, 2025

In the last month or two, did something change with respect to the distribution of flash-attention, specifically flash-attn 2.6.3+cu123?

flash-attn is a dependency I rely on both directly and indirectly (via the dependencies of other wheels). I last had to reinstall the requirements in my venv in late 2024, and historically flash-attn was fragile but workable. However, the exact same install script and files now give me:

```
Processing /tmp/tmp6h9bcpvl/local_packages/pypi/flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
Discarding file:///tmp/tmp6h9bcpvl/local_packages/pypi/flash_attn-2.6.3%2Bcu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl: Requested flash-attn==2.6.3+cu123torch2.4cxx11abifalse from file:///tmp/tmp6h9bcpvl/local_packages/pypi/flash_attn-2.6.3%2Bcu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl has inconsistent version: expected '2.6.3+cu123torch2.4cxx11abifalse', but metadata has '2.6.3'

Pretty-printed STDERR:
ERROR: Ignored the following yanked versions: 1.0.3
ERROR: Ignored the following versions that require a different python version: 0.23.0 Requires-Python >=3.6, <3.10; 0.36.0 Requires-Python >=3.6,<3.10; 0.37.0 Requires-Python >=3.7,<3.10; 0.52.0 Requires-Python >=3.6,<3.9; 0.52.0rc3 Requires-Python >=3.6,<3.9; 0.53.0 Requires-Python >=3.6,<3.10; 0.53.0rc1.post1 Requires-Python >=3.6,<3.10; 0.53.0rc2 Requires-Python >=3.6,<3.10; 0.53.0rc3 Requires-Python >=3.6,<3.10; 0.53.1 Requires-Python >=3.6,<3.10; 0.54.0 Requires-Python >=3.7,<3.10; 0.54.0rc2 Requires-Python >=3.7,<3.10; 0.54.0rc3 Requires-Python >=3.7,<3.10; 0.54.1 Requires-Python >=3.7,<3.10; 0.6.0 Requires-Python >=3.5,<3.9; 0.6.0.1 Requires-Python >=3.5,<3.9; 0.6.1 Requires-Python >=3.5,<3.9; 0.6.1.1 Requires-Python >=3.5,<3.9; 0.6.2 Requires-Python >=3.5,<3.9; 0.6.3 Requires-Python >=3.5,<3.9; 0.6.4 Requires-Python >=3.5,<3.9; 0.6.5 Requires-Python >=3.5,<3.9; 0.6.5.1 Requires-Python >=3.5,<3.9; 0.6.5.2 Requires-Python >=3.5,<3.9; 0.6.6 Requires-Python >=3.5,<3.9; 0.7.0 Requires-Python >=3.5,<3.9; 1.6.2 Requires-Python >=3.7,<3.10; 1.6.3 Requires-Python >=3.7,<3.10; 1.7.0 Requires-Python >=3.7,<3.10; 1.7.1 Requires-Python >=3.7,<3.10
ERROR: Could not find a version that satisfies the requirement flash-attn==2.6.3+cu123torch2.4cxx11abifalse (from versions: 0.2.0, 0.2.1, 0.2.2, 0.2.3, 0.2.4, 0.2.5, 0.2.6.post1, 0.2.7, 0.2.8, 1.0.0, 1.0.1, 1.0.2, 1.0.3.post0, 1.0.4, 1.0.5, 1.0.6, 1.0.7, 1.0.8, 1.0.9, 2.0.0.post1, 2.0.1, 2.0.2, 2.0.3, 2.0.4, 2.0.5, 2.0.6, 2.0.6.post2, 2.0.7, 2.0.8, 2.0.9, 2.1.0, 2.1.1, 2.1.2.post3, 2.2.0, 2.2.1, 2.2.2, 2.2.3.post2, 2.2.4, 2.2.4.post1, 2.2.5, 2.3.0, 2.3.1.post1, 2.3.2, 2.3.3, 2.3.4, 2.3.5, 2.3.6, 2.4.0.post1, 2.4.1, 2.4.2, 2.4.3.post1, 2.5.0, 2.5.1.post1, 2.5.2, 2.5.3, 2.5.4, 2.5.5, 2.5.6, 2.5.7, 2.5.8, 2.5.9.post1, 2.6.0.post1, 2.6.1, 2.6.2, 2.6.3, 2.6.3+cu123torch2.4cxx11abifalse, 2.7.0.post2, 2.7.1.post4, 2.7.2.post1)
ERROR: No matching distribution found for flash-attn==2.6.3+cu123torch2.4cxx11abifalse
```
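
If I'm reading the `Discarding` line correctly, pip rejects the wheel because the requested version carries the `+cu123torch2.4cxx11abifalse` local segment while the wheel's own METADATA reports only `2.6.3`. For anyone who wants to check the same thing, a minimal sketch (not anything official, and the wheel path is just an example) that reads the Version field out of the wheel:

```python
# Minimal sketch: read the Version field from the wheel's METADATA to see
# whether it reports "2.6.3" or "2.6.3+cu123torch2.4cxx11abifalse".
# The wheel path below is an example; point it at the file pip was processing.
import zipfile
from email.parser import Parser

wheel_path = "flash_attn-2.6.3+cu123torch2.4cxx11abiFALSE-cp310-cp310-linux_x86_64.whl"

with zipfile.ZipFile(wheel_path) as whl:
    metadata_name = next(
        name for name in whl.namelist() if name.endswith(".dist-info/METADATA")
    )
    metadata = Parser().parsestr(whl.read(metadata_name).decode("utf-8"))

print(metadata["Version"])  # in my case this appears to be "2.6.3"
```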

I believe this may be a dependency of a dependency: for my own code I had been specifying https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl in my requirements/venv rather than the version listed in the error (flash-attn==2.6.3+cu123torch2.4cxx11abifalse). I'm not sure where to start debugging, since I can't find many reports of similar breakages.
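
For context, my own pin is a direct wheel reference rather than a `==` pin with the local version suffix; roughly this line in my requirements (the cu118/torch2.3 filename is just the build I happen to use):

```
flash-attn @ https://github.com/Dao-AILab/flash-attention/releases/download/v2.6.3/flash_attn-2.6.3+cu118torch2.3cxx11abiFALSE-cp310-cp310-linux_x86_64.whl
```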

FWIW, dependencies are auto-reconciled with pip-compile (which has worked fine until now).

Has anyone encountered this problem before?
