| Documentation | Blog | User Forum | Developer Slack |
Upcoming Events 🔥
- Join us at the PyTorch Conference, October 22-23 in San Francisco!
- Join us at Ray Summit, November 3-5 in San Francisco!
- Join us at JAX DevLab on November 18th in Sunnyvale!
Latest News 🔥
vLLM TPU is now powered by tpu-inference, an expressive and powerful new hardware plugin that unifies JAX and PyTorch under a single lowering path within the vLLM project. The new backend provides a framework for developers to:
- Push the limits of TPU hardware performance in open source.
- Give JAX and PyTorch users more flexibility: PyTorch model definitions run performantly on TPU without any additional code changes, and JAX gains native support.
- Retain vLLM standardization: keep the same user experience, telemetry, and interface, as the sketch below illustrates.
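As a hedged illustration of that last point: because the serving interface is unchanged, any OpenAI-compatible client talks to a TPU-backed vLLM server exactly as it would to any other vLLM deployment. The endpoint, model name, and prompt below are assumptions for the sketch, not part of the project docs.

```python
# Sketch only: a standard OpenAI-compatible client call against a local
# vLLM server. Nothing here is TPU-specific; that's the point.
from openai import OpenAI

# vLLM's OpenAI-compatible server listens on port 8000 by default;
# the API key is unused locally but required by the client.
client = OpenAI(base_url="http://localhost:8000/v1", api_key="EMPTY")

completion = client.chat.completions.create(
    model="meta-llama/Llama-3.1-8B-Instruct",  # assumed model name
    messages=[{"role": "user", "content": "What is a TPU?"}],
)
print(completion.choices[0].message.content)
```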
Although vLLM TPU’s new unified backend makes high-performance serving possible out of the box for any model supported in vLLM, a few core components are still being implemented.
For this reason, we’ve provided a Recommended Models and Features page detailing the models and features that are validated through unit, integration, and performance testing.
Get started with vLLM on TPUs by following the quickstart guide.
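For a flavor of what the quickstart covers, here is a minimal offline-inference sketch using vLLM's standard Python API. It assumes vLLM and the TPU plugin are already installed per the quickstart, and the model name is illustrative.

```python
from vllm import LLM, SamplingParams

# Standard vLLM offline-inference API; with the TPU plugin installed,
# the same unmodified code runs on TPU.
llm = LLM(model="meta-llama/Llama-3.1-8B-Instruct")  # assumed model
params = SamplingParams(temperature=0.7, max_tokens=64)

outputs = llm.generate(["What is a TPU?"], params)
print(outputs[0].outputs[0].text)
```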
Visit our documentation to learn more.
Compatible TPU Generations
- Recommended: v5e, v6e
- Experimental: v3, v4, v5p
Check out a few v6e recipes here!
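As a hedged sketch of scaling across a multi-chip slice (the chip count and model below are assumptions; consult the recipes above for validated configurations):

```python
from vllm import LLM

# Hypothetical config for a v6e-8 slice: shard the model across all
# 8 chips with vLLM's standard tensor-parallel knob. The value must
# match the number of chips in your slice.
llm = LLM(
    model="meta-llama/Llama-3.1-70B-Instruct",  # assumed model
    tensor_parallel_size=8,
)
```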
We're always looking for ways to partner with the community to accelerate vLLM TPU development. If you're interested in contributing to this effort, check out the Contributing guide and Issues to start. We recommend filtering Issues by the good first issue label if it's your first time contributing.
- For technical questions and feature requests, open a GitHub Issue
- For discussing with fellow users, use the TPU support topic in the vLLM Forum
- For coordinating contributions and development, use the Developer Slack
- For collaborations and partnerships, contact us at [email protected]