
Problem running v0.7.1 #56

Open
hz0ne opened this issue Feb 13, 2025 · 10 comments
Labels
documentation Improvements or additions to documentation

Comments

@hz0ne commented Feb 13, 2025

base env:
torch 2.5.1+cpu
torch-npu 2.5.1rc1
torch-optimizer 0.3.0
torchaudio 2.5.1+cpu
torchmetrics 0.10.0
torchscale 0.2.0
torchtext 0.18.0+cpu
torchvision 0.20.1+cpu

CANN 8.0

install version branch:
vllm 0.7.1
vllm-ascend 0.7.1-release

When I run the vLLM server, I encounter `AttributeError: module 'torch_npu' has no attribute 'npu_rope'`:

  File "/opt/conda/lib/python3.10/site-packages/torch/nn/modules/module.py", line 1747, in _call_impl
    return forward_call(*args, **kwargs)
  File "/opt/conda/lib/python3.10/site-packages/vllm/model_executor/custom_op.py", line 23, in forward
    return self._forward_method(*args, **kwargs)
  File "/home/admin/runtime_package/vllm-ascend/vllm_ascend/ops/rotary_embedding.py", line 41, in rope_forward_oot
    torch_npu.npu_rope(
AttributeError: module 'torch_npu' has no attribute 'npu_rope'
Traceback (most recent call last):
  File "/opt/conda/bin/vllm", line 8, in <module>
    sys.exit(main())
  File "/opt/conda/lib/python3.10/site-packages/vllm/scripts.py", line 202, in main
    args.dispatch_function(args)
  File "/opt/conda/lib/python3.10/site-packages/vllm/scripts.py", line 42, in serve
    uvloop.run(run_server(args))
  File "/opt/conda/lib/python3.10/site-packages/uvloop/__init__.py", line 82, in run
    return loop.run_until_complete(wrapper())
  File "uvloop/loop.pyx", line 1518, in uvloop.loop.Loop.run_until_complete
  File "/opt/conda/lib/python3.10/site-packages/uvloop/__init__.py", line 61, in wrapper
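Until a torch_npu build that ships the fused op is installed, one quick pre-flight check is to probe the module before starting the server. This is a minimal sketch, not part of vllm-ascend; `module_has_attr` is a hypothetical helper name:

```python
# Hypothetical pre-flight check: verify that an installed module exposes
# a given attribute before relying on it (e.g. torch_npu.npu_rope).
import importlib
import importlib.util


def module_has_attr(module_name: str, attr: str) -> bool:
    """Return True if module_name is installed and exposes attr."""
    # find_spec returns None for a missing top-level module, so we avoid
    # raising ImportError just to discover the module is absent.
    if importlib.util.find_spec(module_name) is None:
        return False
    module = importlib.import_module(module_name)
    return hasattr(module, attr)


if __name__ == "__main__":
    # On a machine with a matching torch_npu build this should print True;
    # with the older torch_npu described in this issue it prints False.
    print(module_has_attr("torch_npu", "npu_rope"))
```

Running this before launching `vllm serve` surfaces the version mismatch immediately instead of failing deep inside the model's forward pass.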
@wangxiyuan (Collaborator) commented
0.7.1 requires a new torch_npu version, which will be released in the next 2-3 days, and we'll publish the official 0.7.1 release soon. Please wait a bit longer. Thanks.

@hz0ne (Author) commented Feb 13, 2025

0.7.1 requires a new torch_npu version, which will be released in the next 2-3 days, and we'll publish the official 0.7.1 release soon. Please wait a bit longer. Thanks.

OK, I tried the main branch, and it works. The project README may need a more detailed explanation of the version branches and their corresponding features to help people install the right package.

@wangxiyuan (Collaborator) commented
Yes! We're working on doc updates and will make sure the docs are in order before the first release.

@Yikun (Collaborator) commented Feb 15, 2025

@hz0ne I added some notes on versioning, branches, and the release policy. Would you mind taking a look and letting me know whether it's clear enough?

Many thanks for your feedback again.

#62

@hz0ne (Author) commented Feb 16, 2025

@hz0ne I added some notes on versioning, branches, and the release policy. Would you mind taking a look and letting me know whether it's clear enough?

Many thanks for your feedback again.

#62

This versioning and release strategy document is quite good. Thank you for your support!

@Yikun added the documentation (Improvements or additions to documentation) label Feb 17, 2025
@MengqingCao (Contributor) commented
Please try again with the torch-npu version listed here:
https://vllm-ascend.readthedocs.io/en/latest/installation.html#setup-vllm-and-vllm-ascend

@wangxiyuan (Collaborator) commented
Yes, please try the new release https://github.com/vllm-project/vllm-ascend/releases/tag/v0.7.1rc1

@dawnranger commented
Yes, please try the new release https://github.com/vllm-project/vllm-ascend/releases/tag/v0.7.1rc1

@wangxiyuan
This release does not work, and the problem remains unsolved. Only installing from the main branch works.

@wangxiyuan (Collaborator) commented
@dawnranger you should install a new torch-npu as well. Please follow the guide
https://vllm-ascend.readthedocs.io/en/v0.7.1rc1/installation.html#setup-vllm-and-vllm-ascend

@dawnranger commented
@dawnranger you should install a new torch-npu as well. Please follow the guide https://vllm-ascend.readthedocs.io/en/v0.7.1rc1/installation.html#setup-vllm-and-vllm-ascend

@wangxiyuan torch_npu-2.5.1.dev20250218 has import error, see: #117
