ModuleNotFoundError: No module named 'exllamav2' #121
Comments
try "pip install exllamav2"? and "pip show exllamav2" to check if installed successfully |
(1) Did. Got: During handling of the above exception, another exception occurred: Traceback (most recent call last):
Hello, you can upgrade your Python version to 3.9 first and try again. Our setup works normally with python=3.10 and CUDA 11.6; the recommended Python version is >=3.9. We will also investigate a fix as soon as possible.
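A small diagnostic along these lines can confirm the interpreter and CUDA build before retrying (a sketch, assuming torch is already installed):

import sys
import torch

# Interpreter version: 3.10 is the tested configuration, >=3.9 is recommended.
print("python:", sys.version)
# Torch build and the CUDA toolkit it was compiled against (e.g. 2.0.1+cu117 / 11.7).
print("torch:", torch.__version__, "cuda:", torch.version.cuda)
print("cuda available:", torch.cuda.is_available())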
conda create --name tigerbot python=3.8
pip install torch==2.0.1+cu117 torchvision torchaudio --extra-index-url https://mirror.sjtu.edu.cn/pytorch-wheels/cu117
Yes, installed torch==2.0.1+cu117.
Yes, it's a bug: exllamav2 will first check the local path. You can download the model weights from HF manually; see the sketch below. We will fix it soon.
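Until that is fixed, one workaround is to fetch the weights yourself with huggingface_hub and point --model_path at the local copy (a sketch; downloads into the default HF cache):

from huggingface_hub import snapshot_download

# Download the full model repo and print its local path;
# pass this path to --model_path so the local check succeeds.
local_dir = snapshot_download(repo_id="TigerResearch/tigerbot-13b-chat-4bit-exl2")
print(local_dir)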
Thank you! Another option is to do 4-bit quantization with AutoGPTQ only.
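For reference, a minimal AutoGPTQ quantization sketch following the auto_gptq README pattern; the fp16 base model id, calibration text, and output directory here are assumptions:

from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM, BaseQuantizeConfig

base_model = "TigerResearch/tigerbot-13b-chat"  # assumed fp16 base weights
quantize_config = BaseQuantizeConfig(bits=4, group_size=128, desc_act=False)

tokenizer = AutoTokenizer.from_pretrained(base_model, use_fast=True)
model = AutoGPTQForCausalLM.from_pretrained(base_model, quantize_config)

# Calibration data: a real run should use more, and more varied, samples.
examples = [tokenizer("TigerBot is a large language model released by Tiger Research.")]
model.quantize(examples)
model.save_quantized("tigerbot-13b-chat-gptq-4bit")  # hypothetical output dir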
Original issue
Test environment: the server OS is Ubuntu 22.04 and a T4 GPU is available.
(1) Did the install and inference steps below according to https://huggingface.co/TigerResearch/tigerbot-13b-chat-4bit-exl2:
conda create --name tigerbot python=3.8
conda activate tigerbot
conda install pytorch torchvision torchaudio pytorch-cuda=11.7 -c pytorch -c nvidia
git clone https://github.com/TigerResearch/TigerBot
cd TigerBot
pip install -r requirements.txt
git clone https://github.com/turboderp/exllamav2
cd exllamav2
pip install -r requirements.txt
cd ..
CUDA_VISIBLE_DEVICES=0 python other_infer/exllamav2_hf_infer.py --model_path TigerResearch/tigerbot-13b-chat-4bit-exl2
(2) Got the error:
Traceback (most recent call last):
File "other_infer/exllamav2_hf_infer.py", line 9, in
from exllamav2 import ExLlamaV2, ExLlamaV2Cache, ExLlamaV2Config
ModuleNotFoundError: No module named 'exllamav2'
(3) Please advise how to fix this issue.