llm-mlx

Support for MLX models in LLM

Installation

Install this plugin in the same environment as LLM. Because it depends on Apple's MLX framework, this plugin likely only works on macOS (Apple Silicon).

llm install llm-mlx
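
Once installed, the plugin should show up in the output of the llm plugins command (a quick sanity check; llm plugins is part of LLM itself, not this plugin):

llm plugins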

Usage

To install an MLX model from Hugging Face, use the llm mlx download-model command:

llm mlx download-model mlx-community/Llama-3.2-3B-Instruct-4bit
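
Once the download completes, the model should appear in the output of llm models (a quick way to confirm it registered; llm models is a core LLM command):

llm models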

Then run prompts like this:

llm -m mlx-community/Llama-3.2-3B-Instruct-4bit 'Capital of France?' -s 'you are a pelican'

The mlx-community organization on Hugging Face is a useful source for compatible models.
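
You can also call the model from Python through LLM's Python API. This is a minimal sketch, assuming the model has already been downloaded with the command above and resolves by the same ID it uses on the command line:

import llm

# Look up the downloaded MLX model by the same ID used on the command line
model = llm.get_model("mlx-community/Llama-3.2-3B-Instruct-4bit")

# prompt() accepts an optional system prompt, mirroring the -s flag
response = model.prompt("Capital of France?", system="you are a pelican")
print(response.text())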

Development

To set up this plugin locally, first check out the code. Then create a new virtual environment:

cd llm-mlx
python -m venv venv
source venv/bin/activate

Now install the dependencies and test dependencies:

llm install -e '.[test]'

To run the tests:

python -m pytest
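
Standard pytest selection flags also work here, for example stopping on the first failure with -x or filtering by keyword with -k ("download" below is a placeholder; substitute a keyword matching the tests you want to run):

python -m pytest -x -k download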
