The step-by-step installation procedure fails while installing the flash-attn package.
This is easy to resolve by itself, but it points to underspecified dependencies of the package.
Collecting flash-attn>=2.4.0 (from instructlab-training[cuda]>=0.6.0; extra == "cuda"->instructlab[cuda])
Downloading flash_attn-2.7.0.post2.tar.gz (2.7 MB)
━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━━ 2.7/2.7 MB 9.2 MB/s eta 0:00:00
Installing build dependencies ... done
Getting requirements to build wheel ... error
error: subprocess-exited-with-error
× Getting requirements to build wheel did not run successfully.
│ exit code: 1
╰─> [20 lines of output]
Traceback (most recent call last):
File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 353, in <module>
main()
File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 335, in main
json_out['return_val'] = hook(**hook_input['kwargs'])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/home/jpodivin/ilab/venv-instructlab-0.18-3.11/lib/python3.11/site-packages/pip/_vendor/pyproject_hooks/_in_process/_in_process.py", line 118, in get_requires_for_build_wheel
return hook(config_settings)
^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 334, in get_requires_for_build_wheel
return self._get_build_requires(config_settings, requirements=[])
^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^
File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 304, in _get_build_requires
self.run_setup()
File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 522, in run_setup
super().run_setup(setup_script=setup_script)
File "/tmp/pip-build-env-ph7nok6z/overlay/lib/python3.11/site-packages/setuptools/build_meta.py", line 320, in run_setup
exec(code, locals())
File "<string>", line 21, in <module>
ModuleNotFoundError: No module named 'torch'
[end of output]
note: This error originates from a subprocess, and is likely not a problem with pip.
error: subprocess-exited-with-error
The issue is well known and documented, but still unresolved on the flash-attention side: Dao-AILab/flash-attention#246
@aoyawale AFAIK the only solution is to install torch manually. That takes some effort, since you need to make sure the environment is ready first. The good news is that this has been worked through several times.
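A minimal sketch of that workaround, assuming an activated virtual environment and a CUDA-capable setup (the exact torch build you need depends on your CUDA version):

```shell
# Install torch first, so it is present in the environment before
# flash-attn's setup.py runs (setup.py imports torch at build time).
pip install torch

# Then install flash-attn with PEP 517 build isolation disabled, so the
# build can see the torch that was just installed instead of an empty
# isolated build environment.
pip install flash-attn --no-build-isolation
```

The `ModuleNotFoundError: No module named 'torch'` in the log above comes from pip's isolated build environment, which does not contain torch; `--no-build-isolation` is the usual way around it until flash-attn declares the dependency properly.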
The document should be revised to mention the dependency on torch:
https://docs.instructlab.ai/getting-started/linux_nvidia/