
llm-d-1.0.20


@bumper-bot-llm-d released this 23 Jun 21:10
4ef77d3

[Released chart test status badge]

llm-d is a Kubernetes-native, high-performance distributed LLM inference framework.

What's Changed

  • [quickstart] Add option -j/--gateway to llmd-installer.sh by @maugustosilva in #316
  • Fix: --download-model CLI parameter now correctly overrides template … by @yossiovadia in #320
  • Add "release name" as cli parameter (-r/--release) for llmd-installer.sh by @maugustosilva in #326
  • Add the ability to dispatch a PR or branch in ec2-e2e by @nerdalert in #325
  • e2e lint fix by @nerdalert in #333
  • fix: populate gateway.gatewayClassName helm value when the gateway type is not istio by @chewong in #327
  • fix: correct --download-model to use HuggingFace URI instead of PVC by @yossiovadia in #328
  • Detect minikube context for uninstall by @nerdalert in #278
  • Add the ability to define envs to the vLLM containers in sample app by @nerdalert in #301

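Taken together, the installer changes above add two flags to llmd-installer.sh (-j/--gateway and -r/--release) and rework --download-model to take a HuggingFace model identifier rather than a PVC reference. A minimal sketch of how they might be combined is below; the gateway type, release name, and model identifier are illustrative placeholders rather than values from this release, and the exact syntax should be confirmed against the installer's own usage output.

```bash
# Sketch only: the gateway type, release name, and model identifier below
# are placeholders, not values taken from these release notes.
./llmd-installer.sh \
  --gateway istio \
  --release llm-d-demo \
  --download-model meta-llama/Llama-3.2-1B-Instruct
```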
New Contributors

Full Changelog: llm-d-1.0.19...llm-d-1.0.20