running of v0.7.1 problem #56
0.7.1 requires a new torch_npu version, which will be released in the next 2-3 days. We'll release the official 0.7.1 soon after. Please wait a bit longer. Thanks.
OK, I tried the main branch and it works. The project README may need a more detailed explanation of the version branches and their corresponding features to help people install the package.
Yes! We're working on updating the docs and will make sure they work before the first release.
Please try again with the torch-npu listed here:
Yes, please try the new release https://github.com/vllm-project/vllm-ascend/releases/tag/v0.7.1rc1 |
@wangxiyuan |
@dawnranger you should install a new torch-npu as well. Please follow the guide.
@wangxiyuan |
base env:
torch 2.5.1+cpu
torch-npu 2.5.1rc1
torch-optimizer 0.3.0
torchaudio 2.5.1+cpu
torchmetrics 0.10.0
torchscale 0.2.0
torchtext 0.18.0+cpu
torchvision 0.20.1+cpu
CANN 8.0
install version branch:
vllm 0.7.1
vllm-ascend 0.7.1-release
When I run the vLLM server, I encounter:
AttributeError: module 'torch_npu' has no attribute 'npu_rope'
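Errors like this usually mean the installed torch_npu build predates the operator vLLM Ascend expects. A quick preflight check is to confirm the module imports and exposes the attribute before starting the server. Below is a minimal sketch of such a check; the helper name `has_npu_op` is hypothetical, and the runnable examples use a stdlib module since torch_npu is only available on an Ascend host.

```python
import importlib


def has_npu_op(module_name: str, op_name: str) -> bool:
    """Return True if the module can be imported and exposes the attribute.

    A missing attribute (as in the AttributeError above) or a failed import
    both indicate the installed package doesn't provide the operator.
    """
    try:
        mod = importlib.import_module(module_name)
    except ImportError:
        return False
    return hasattr(mod, op_name)


# Runnable demonstration with a stdlib module:
print(has_npu_op("math", "sqrt"))      # existing attribute
print(has_npu_op("math", "npu_rope"))  # missing attribute

# On the Ascend host, the actual check would be:
#   has_npu_op("torch_npu", "npu_rope")
```

If the check returns False for `torch_npu`, upgrading torch-npu to the version listed in the release notes (rather than 2.5.1rc1) should resolve the AttributeError.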