Installation on NVidia Linux fails due to unmet dependency #25

Open
jpodivin opened this issue Dec 6, 2024 · 2 comments

jpodivin commented Dec 6, 2024

The step-by-step installation procedure fails while installing the flash-attn package.
The failure itself is easy to resolve, but it points to an underspecified dependency of the package.

The document should be revised to mention the dependency on torch:
https://docs.instructlab.ai/getting-started/linux_nvidia/

Traceback:

Collecting flash-attn>=2.4.0 (from instructlab-training[cuda]>=0.6.0; extra == "cuda"->instructlab[cuda])
  Downloading flash_attn-2.7.0.post2.tar.gz (2.7 MB)
     ━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.7/2.7 MB 9.2 MB/s eta 0:00:00
  Installing build dependencies ... done
  Getting requirements to build wheel ... error
  error: subprocess-exited-with-error
  
  × Getting requirements to build wheel did not run successfully.
  │ exit code: 1
  ╰─> [20 lines of output]
      Traceback (most recent call last):
        File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
          main()
        File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
          json_out['return_val'] = hook(**hook_input['kwargs'])
                                   ^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
          return hook(config_settings)
                 ^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 334, in get_requires_for_build_wheel
          return self._get_build_requires(config_settings, requirements=[])
                 ^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
        File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 304, in _get_build_requires
          self.run_setup()
        File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 522, in run_setup
          super().run_setup(setup_script=setup_script)
        File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 320, in run_setup
          exec(code, locals())
        File "<string>", line 21, in <module>
      ModuleNotFoundError: No module named 'torch'
      [end of output]
  
  note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error

The issue is well known and documented, but still unresolved on the flash-attention side:
Dao-AILab/flash-attention#246
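
The workaround discussed there is to install torch (along with packaging and ninja) before flash-attn, and to disable pip's build isolation so the build can see the already-installed torch. A minimal sketch, assuming an otherwise working CUDA toolchain in the active virtualenv:

    # Satisfy flash-attn's build-time requirements first.
    pip install torch packaging ninja
    # Build without an isolated environment so setup.py can import torch.
    pip install flash-attn --no-build-isolation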

aoyawale commented Dec 7, 2024

Having the same issue installing on the AMD side, using Python 3.11.9.

jpodivin commented Dec 9, 2024

@aoyawale AFAIK the only solution is to install torch manually. That takes some effort, since you need to make sure you have the environment ready. The good news is that it has been worked through several times.
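
Roughly, the manual step looks like this (a sketch, assuming a CUDA 12.x setup; the cu121 index URL is illustrative, pick the wheel index matching your CUDA or ROCm version):

    # Install a torch build matching your accelerator stack first.
    pip install torch --index-url https://download.pytorch.org/whl/cu121
    # Then retry the install; flash-attn's setup.py can now import torch.
    pip install 'instructlab[cuda]'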
