Interactive CLI chat with an AI agent using ElizaOS. Supports multiple LLM providers.
The chat will automatically use the first provider with a valid API key:
| Provider | API Key Variable | Get Your Key |
|---|---|---|
| OpenAI | OPENAI_API_KEY | platform.openai.com |
| Anthropic (Claude) | ANTHROPIC_API_KEY | console.anthropic.com |
| xAI (Grok) | XAI_API_KEY | console.x.ai |
| Google GenAI (Gemini) | GOOGLE_GENERATIVE_AI_API_KEY | aistudio.google.com |
| Groq | GROQ_API_KEY | console.groq.com |
1. Install dependencies

   ```
   bun install
   ```

2. Configure your API key

   ```
   cp .env.example .env
   # Edit .env and add at least one API key
   ```

3. Run the chat

   ```
   bun run start
   ```
```
🚀 Starting Eliza Chat...
✅ Using OpenAI for language model
💬 Chat with Eliza (type 'exit' to quit)
You: Hello!
Eliza: Hello! How can I help you today?
You: exit
👋 Goodbye!
```
If multiple API keys are set, the chat will use them in this order:
- OpenAI
- Anthropic (Claude)
- xAI (Grok)
- Google GenAI (Gemini)
- Groq
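The fallback order above can be sketched as a simple first-match scan over environment variables. This is an illustrative sketch, not the actual ElizaOS implementation; the `selectProvider` function and `PROVIDER_PRIORITY` list are hypothetical names.

```typescript
// Hypothetical sketch of provider fallback selection.
// Providers are checked in priority order; the first one with a
// non-empty API key in the environment wins.
const PROVIDER_PRIORITY: Array<{ name: string; envVar: string }> = [
  { name: "OpenAI", envVar: "OPENAI_API_KEY" },
  { name: "Anthropic (Claude)", envVar: "ANTHROPIC_API_KEY" },
  { name: "xAI (Grok)", envVar: "XAI_API_KEY" },
  { name: "Google GenAI (Gemini)", envVar: "GOOGLE_GENERATIVE_AI_API_KEY" },
  { name: "Groq", envVar: "GROQ_API_KEY" },
];

// Returns the name of the first configured provider, or null if none is set.
function selectProvider(env: Record<string, string | undefined>): string | null {
  for (const { name, envVar } of PROVIDER_PRIORITY) {
    if (env[envVar]) return name;
  }
  return null;
}
```

For example, with both `GROQ_API_KEY` and `ANTHROPIC_API_KEY` set, Anthropic is chosen because it appears earlier in the priority list.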
You can override the default models for each provider. See .env.example for all options.
```
# Use GPT-5 instead of the default
OPENAI_LARGE_MODEL=gpt-5
# Use Claude Opus
ANTHROPIC_LARGE_MODEL=claude-opus-4-7
# Use Llama 70B on Groq
GROQ_LARGE_MODEL=llama-3.3-70b-versatile
```

Run with hot reload:

```
bun run dev
```

License: MIT