The basic environment was successfully installed without any errors.
git clone https://github.com/FasterDecoding/Medusa.git
cd Medusa
pip install -e .
Running python -m medusa.inference.cli then fails with an error:
❯ python -m medusa.inference.cli --model FasterDecoding/medusa-1.0-vicuna-13b-v1.5
Traceback (most recent call last):
File "/usr/local/anaconda3/envs/medusa/lib/python3.10/runpy.py", line 196, in _run_module_as_main
return _run_code(code, main_globals, None,
File "/usr/local/anaconda3/envs/medusa/lib/python3.10/runpy.py", line 86, in _run_code
exec(code, run_globals)
File "/data/lab/Medusa/medusa/inference/cli.py", line 24, in <module>
from medusa.model.medusa_model import MedusaModel
File "/data/lab/Medusa/medusa/model/medusa_model.py", line 3, in <module>
from .modeling_llama_kv import LlamaForCausalLM as KVLlamaForCausalLM
File "/data/lab/Medusa/medusa/model/modeling_llama_kv.py", line 22, in <module>
from transformers.utils import (
ImportError: cannot import name 'is_flash_attn_available' from 'transformers.utils' (/medusa/lib/python3.10/site-packages/transformers/utils/__init__.py)
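The ImportError suggests the installed transformers package is newer than what Medusa's modeling_llama_kv.py expects: recent transformers releases removed is_flash_attn_available in favor of is_flash_attn_2_available. One possible workaround (a sketch, assuming that rename is the cause) is to resolve whichever name exists at import time. The helper import_first_available below is hypothetical, and the stdlib demo at the bottom only illustrates the pattern without requiring transformers:

```python
import importlib

def import_first_available(module_name, candidate_names):
    """Return the first attribute on `module_name` found among
    `candidate_names`; raise ImportError if none is present.

    Hypothetical compatibility helper: the transformers names below assume
    the library renamed is_flash_attn_available to is_flash_attn_2_available
    in newer releases.
    """
    mod = importlib.import_module(module_name)
    for name in candidate_names:
        if hasattr(mod, name):
            return getattr(mod, name)
    raise ImportError(f"none of {candidate_names!r} found in {module_name!r}")

# In medusa/model/modeling_llama_kv.py one could then write (sketch):
# is_flash_attn_available = import_first_available(
#     "transformers.utils",
#     ["is_flash_attn_available", "is_flash_attn_2_available"],
# )

# Self-contained demo against the standard library, so the sketch runs
# even without transformers installed:
isfinite = import_first_available("math", ["no_such_name", "isfinite"])
print(isfinite(1.0))  # -> True
```

Alternatively, pinning an older transformers release that still exports the old name should avoid patching the source; the exact version to pin is worth checking against Medusa's requirements file.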
I got this error while following https://github.com/FasterDecoding/Medusa to prepare to run the demo:

python -m medusa.inference.cli

which produces the ImportError shown above.