docs/source/getting_started: 1 file changed, +1 -1

@@ -77,7 +77,7 @@ ENABLE_OLLAMA=ollama INFERENCE_MODEL="llama3.2:3b" llama stack build --template
 You can use a container image to run the Llama Stack server. We provide several container images for the server
 component that works with different inference providers out of the box. For this guide, we will use
 `llamastack/distribution-starter` as the container image. If you'd like to build your own image or customize the
-configurations, please check out [this guide](../references/index.md).
+configurations, please check out [this guide](../distributions/building_distro.md).
 First, let's set up some environment variables and create a local directory to mount into the container’s file system.
 ```bash
 export INFERENCE_MODEL="llama3.2:3b"