
Releases: vllm-project/vllm-ascend

v0.7.1rc1

19 Feb 09:19
f17417f
Pre-release

🎉 Hello, World!

We are excited to announce the first release candidate of v0.7.1 for vllm-ascend.

vLLM Ascend Plugin (vllm-ascend) is a community-maintained hardware plugin for running vLLM on the Ascend NPU. With this release, users can now enjoy the latest features and improvements of vLLM on the Ascend NPU.

Please visit the official doc to start the journey: https://vllm-ascend.readthedocs.io/en/v0.7.1rc1

Note that this is a release candidate and may contain bugs or issues. We appreciate your feedback and suggestions here.

Highlights

  • Initial support for the Ascend NPU in vLLM. #3
  • DeepSeek is now supported. #88 #68
  • The Qwen and Llama series, as well as other popular models, are also supported; see more details here.

Core

  • Added the Ascend quantization config option; the implementation is coming soon. #7 #73
  • Added the silu_and_mul and rope ops, and integrated mixed ops into the attention layer. #18
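For context, the fused silu_and_mul op splits the last dimension of its input in half, applies SiLU (x · sigmoid(x)) to the first half, and multiplies it elementwise by the second half. Below is a minimal pure-Python sketch of that semantics for illustration only; it is not the NPU kernel itself, and the function name here is our own:

```python
import math


def silu_and_mul(x):
    """Reference semantics of a fused silu_and_mul activation:
    split the input in half, apply SiLU to the first half,
    and multiply elementwise by the second half."""
    d = len(x) // 2
    # SiLU(a) = a * sigmoid(a); multiply by the gating half b.
    return [(a / (1.0 + math.exp(-a))) * b for a, b in zip(x[:d], x[d:])]


# Example: first half [0.0, 1.0] is activated, second half [2.0, 3.0] gates it.
result = silu_and_mul([0.0, 1.0, 2.0, 3.0])
print(result)  # first element is 0.0, since SiLU(0) = 0
```

Fusing the activation and the elementwise multiply into one op avoids materializing the intermediate SiLU output, which is why it is commonly implemented as a single kernel.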

Other

  • [CI] Enable Ascend CI to actively monitor and improve quality for vLLM on Ascend. #3
  • [Docker] Add vllm-ascend container image #64
  • [Docs] Add a live doc #55

Known issues

  • This release relies on an unreleased torch_npu version, which is already installed in the official container image. Please install it manually if you are using a non-container environment.
  • Logs like No platform detected, vLLM is running on UnspecifiedPlatform or Failed to import from vllm._C with ModuleNotFoundError("No module named 'vllm._C'") may appear when running vllm-ascend. They do not affect functionality or performance and can be safely ignored. This has been fixed in this PR, which will be included in v0.7.3 soon.
  • Logs like # CPU blocks: 35064, # CPU blocks: 2730 may appear when running vllm-ascend, where the label should be # NPU blocks: . This does not affect functionality or performance and can be safely ignored. It has been fixed in this PR, which will be included in v0.7.3 soon.

Full Changelog: https://github.com/vllm-project/vllm-ascend/commits/v0.7.1rc1