
Unable to install flash-attn even if I first install torch alone #1421

Closed
ytxmobile98 opened this issue Jan 3, 2025 · 2 comments

Comments


ytxmobile98 commented Jan 3, 2025

My environment:

  • OS: Ubuntu 24.04.1 LTS
  • Python version: 3.10.15
  • PIP version: 24.3.1
  • Torch version: 2.5.1

It came to my attention that `pip install flash_attn` does not work. When I try it, the error I get is: `No module named 'torch'`.

This issue happens even if I install torch first, then install flash-attn afterwards.

$ pip install flash-attn
Looking in indexes: https://pypi.tuna.tsinghua.edu.cn/simple/
Collecting flash-attn
  Using cached https://pypi.tuna.tsinghua.edu.cn/packages/83/29/48df18cb51902a7cb7a0ee13327bb2cf50b6ba24bd2e8283d0a9538dde52/flash_attn-2.7.2.post1.tar.gz (3.1 MB)
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [17 lines of output]
      Traceback (most recent call last):
        File "/home/tianxing/1/.venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/tianxing/1/.venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
        File "/home/tianxing/1/.venv/lib/python3.10/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
        File "/tmp/pip-build-env-k5ezeotm/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 334, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
        File "/tmp/pip-build-env-k5ezeotm/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 304, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-k5ezeotm/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 522, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/tmp/pip-build-env-k5ezeotm/overlay/lib/python3.10/site-packages/setuptools/build_meta.py", line 320, in run_setup
          exec(code, locals())
        File "<string>", line 21, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> See above for output.

note: This error originates from a subprocess, and is likely not a problem with pip.

Output of pip list:

$ pip list
Package                  Version
------------------------ ----------
filelock                 3.16.1
fsspec                   2024.12.0
Jinja2                   3.1.5
MarkupSafe               3.0.2
mpmath                   1.3.0
networkx                 3.4.2
nvidia-cublas-cu12       12.4.5.8
nvidia-cuda-cupti-cu12   12.4.127
nvidia-cuda-nvrtc-cu12   12.4.127
nvidia-cuda-runtime-cu12 12.4.127
nvidia-cudnn-cu12        9.1.0.70
nvidia-cufft-cu12        11.2.1.3
nvidia-curand-cu12       10.3.5.147
nvidia-cusolver-cu12     11.6.1.9
nvidia-cusparse-cu12     12.3.1.170
nvidia-nccl-cu12         2.21.5
nvidia-nvjitlink-cu12    12.4.127
nvidia-nvtx-cu12         12.4.127
pip                      24.3.1
setuptools               65.5.0
sympy                    1.13.1
torch                    2.5.1
triton                   3.1.0
typing_extensions        4.12.2
ytxmobile98 changed the title from "Unable to install flash_attn even with torch installed standalone" to "Unable to install flash-attn even if I first install torch alone" on Jan 3, 2025

LoicPZ commented Jan 7, 2025

Try:
pip install psutil
pip install flash_attn --no-build-isolation
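For context on why this works: flash-attn's setup.py imports torch at build time, but pip's default PEP 517 build isolation runs the build in a fresh temporary environment that does not contain the torch you already installed, which is what produces the `No module named 'torch'` error. A sketch of the full sequence (the extra `packaging` and `ninja` installs follow flash-attn's documented prerequisites; adjust to your environment):

```shell
# flash-attn's setup.py imports torch (and uses psutil) while pip is
# still collecting build requirements. Under default build isolation,
# that import runs in a throwaway environment without torch installed,
# so it fails with "No module named 'torch'".

# 1. Install the build-time dependencies into your own environment first:
pip install torch psutil packaging ninja

# 2. Then build flash-attn against those already-installed packages
#    by turning build isolation off:
pip install flash_attn --no-build-isolation
```

With `--no-build-isolation`, pip skips the temporary build environment entirely, so the build sees the same site-packages as your shell, including torch.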

ytxmobile98 (Author) commented

Thanks!
