Install this plugin in the same environment as LLM. Note that this plugin only works on macOS, since it depends on Apple's MLX framework.
llm install llm-mlx
To install an MLX model from Hugging Face, use the llm mlx download-model command:
llm mlx download-model mlx-community/Llama-3.2-3B-Instruct-4bit
Then run prompts like this:
llm -m mlx-community/Llama-3.2-3B-Instruct-4bit 'Capital of France?' -s 'you are a pelican'
The mlx-community organization is a useful source for compatible models.
To set up this plugin locally, first check out the code, then create a new virtual environment:
cd llm-mlx
python -m venv venv
source venv/bin/activate
Now install the dependencies and test dependencies:
llm install -e '.[test]'
To run the tests:
python -m pytest