
Mistral

sagatake edited this page Mar 18, 2025 · 8 revisions

The Mistral module can be used to converse with a Mistral language model through Greta.

Installation

To run the model locally

Warning: This requires a GPU with at least 6 GB of VRAM.

Warning: Temporarily unavailable due to an httpx module version incompatibility between the openai and mistralai packages.

  • Install LM Studio: https://lmstudio.ai/
  • Download the model TheBloke/Mistral-7B-Instruct-v0.2-GGUF
  • Run the local server in LM Studio with the following setting:
  • port: 1234
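Once the server is running, the endpoint can be exercised outside Greta to verify the setup. A minimal sketch, assuming LM Studio's default OpenAI-compatible endpoint at http://localhost:1234/v1/chat/completions (the helper name `build_request` is illustrative, not part of Greta):

```python
import json
import urllib.request

def build_request(prompt, system_prompt=None,
                  url="http://localhost:1234/v1/chat/completions"):
    """Build a chat-completion request for LM Studio's local server.

    The model name matches the download step above; LM Studio serves
    whichever model is currently loaded.
    """
    messages = []
    if system_prompt:
        messages.append({"role": "system", "content": system_prompt})
    messages.append({"role": "user", "content": prompt})
    payload = {
        "model": "TheBloke/Mistral-7B-Instruct-v0.2-GGUF",
        "messages": messages,
        "temperature": 0.7,
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )

# Sending the request (requires the LM Studio server to be running):
# with urllib.request.urlopen(build_request("Hello")) as resp:
#     reply = json.loads(resp.read())["choices"][0]["message"]["content"]
```

If this round-trip works in a terminal, Greta's Mistral module should be able to reach the same server.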

To run the model online

Warning: This is not free, except during the two-week trial period in which €5 of test credit is offered.
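The hosted service exposes a similar chat-completions endpoint; the main differences from the local setup are the URL, the hosted model name, and the API key. A sketch, assuming an API key from the Mistral console (the helper name `build_online_request` and the default model are illustrative):

```python
import json
import urllib.request

def build_online_request(prompt, api_key,
                         model="mistral-small-latest",
                         url="https://api.mistral.ai/v1/chat/completions"):
    """Build a chat-completion request for the hosted Mistral API.

    Authentication uses a Bearer token, so the key must be passed in
    the Authorization header rather than in the payload.
    """
    payload = {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        url,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {api_key}",
        },
    )
```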

Use in Greta

  • Add a Mistral Module to your Configuration
  • Add a link between the Mistral module and the Behavior Planner
  • Set up the Port and Address (the defaults can usually be kept)
  • Check Enable
  • Choose language and model (local or online)
  • (Optional) Enter a system prompt if you want your agent to behave a certain way, e.g.: "You are a museum guide who makes a lot of jokes. You always answer in rhymes."
  • Enter your request in the request panel
  • Click Send

Tips

You can use a different model by modifying the model name in Mistral.py.
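For illustration, the change might look like the following (the variable names are hypothetical; check Mistral.py for the actual identifiers):

```python
# Hypothetical model-name constants as they might appear in Mistral.py;
# the real variable names in Greta's Mistral.py may differ.
LOCAL_MODEL = "TheBloke/Mistral-7B-Instruct-v0.2-GGUF"  # served by LM Studio
ONLINE_MODEL = "mistral-small-latest"                   # hosted Mistral API
```

For the local case, the name must match a model actually loaded in LM Studio; for the online case, it must be a model offered by the Mistral API.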

An example of this module integrated into a demo is available at LLM DeepASR integration.
