Does it support Qwen series hosted model? #1572
Comments
Certainly! Let's dive straight into a concrete example of how you can integrate DeepSeek V3 from OpenRouter.ai into your Open Interpreter setup. This lets you use DeepSeek's language model capabilities from within Open Interpreter seamlessly.

## Step-by-Step Integration Guide

### 1. Obtain Your OpenRouter.ai API Key

First, sign up for an account on [OpenRouter.ai](https://openrouter.ai/) and generate an API key.

### 2. Install Open Interpreter

If you haven't installed Open Interpreter yet, you can do so with pip:

```bash
pip install open-interpreter
```

### 3. Configure Open Interpreter to Use DeepSeek V3 via OpenRouter.ai

There are two primary ways to configure Open Interpreter to use DeepSeek V3: the command line and a Python script.

#### A. Using the Command Line

Open your terminal and run the following command, replacing `YOUR_OPENROUTER_API_KEY` with your actual key:

```bash
interpreter --api_base "https://openrouter.ai/api/v1" --api_key "YOUR_OPENROUTER_API_KEY" --model "deepseek/deepseek-chat"
```

Example:

```bash
interpreter --api_base "https://openrouter.ai/api/v1" --api_key "sk-1234567890abcdef" --model "deepseek/deepseek-chat"
```

This command sets the API base to OpenRouter's endpoint, provides your API key, and specifies the DeepSeek V3 model.

#### B. Using a Python Script

If you prefer configuring within a Python environment, use a script like the following:
```python
from interpreter import interpreter

# Configure the interpreter to use DeepSeek V3 via OpenRouter.ai
interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_key = "YOUR_OPENROUTER_API_KEY"
interpreter.llm.api_base = "https://openrouter.ai/api/v1"

# Start an interactive chat session
interpreter.chat()
```
Ensure you replace `YOUR_OPENROUTER_API_KEY` with your actual API key, then save the script (for example as `use_deepseek.py`) and run it:

```bash
python use_deepseek.py
```

This script initializes Open Interpreter with DeepSeek V3 and starts an interactive chat session.

### 4. Testing the Integration

After configuring, it's essential to verify that the integration works as expected. A quick check is sketched below.
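For instance, you can send a single test message instead of opening an interactive session. This is a minimal sketch; it assumes `interpreter.chat()` accepts a message string and returns the conversation messages, which matches Open Interpreter's documented Python usage, but verify against the version you have installed:

```python
from interpreter import interpreter

# Same configuration as in the script above
interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_key = "YOUR_OPENROUTER_API_KEY"
interpreter.llm.api_base = "https://openrouter.ai/api/v1"

# Send one test prompt; a sensible reply confirms the key, endpoint,
# and model identifier are all wired up correctly.
messages = interpreter.chat("Say hello and name the model you are running.")
print(messages)
```

If this raises an authentication error, double-check the API key; if it reports an unknown model, verify the model identifier against OpenRouter's model listing.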
### 5. Advanced Configuration (Optional)

You can further customize your setup based on your requirements; a sketch follows below.
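For example, you can cap response length, adjust the context window, or require confirmation before generated code runs. The attribute names below (`context_window`, `max_tokens`, `temperature`, `auto_run`) come from Open Interpreter's Python API, but treat this as a sketch and confirm them against your installed version; the values shown are illustrative, not recommendations:

```python
from interpreter import interpreter

interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_key = "YOUR_OPENROUTER_API_KEY"
interpreter.llm.api_base = "https://openrouter.ai/api/v1"

# Optional tuning -- check these attributes exist in your version
interpreter.llm.context_window = 64000  # tokens of context the model may use
interpreter.llm.max_tokens = 4096       # upper bound on each response
interpreter.llm.temperature = 0.2       # lower values = more deterministic output
interpreter.auto_run = False            # ask before executing generated code

interpreter.chat()
```

Keeping `auto_run` off is the safer default, since the model's generated code is executed on your machine.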
### 6. Troubleshooting Tips
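If requests fail, a useful first step is to enable verbose logging so you can inspect the underlying API calls. Recent Open Interpreter releases expose this as `interpreter.verbose` (older releases used `interpreter.debug_mode`), so check which flag your version provides:

```python
from interpreter import interpreter

# Verbose logging prints the requests Open Interpreter sends, which helps
# spot authentication failures, a mistyped model name, or a wrong api_base.
interpreter.verbose = True

interpreter.llm.model = "deepseek/deepseek-chat"
interpreter.llm.api_key = "YOUR_OPENROUTER_API_KEY"
interpreter.llm.api_base = "https://openrouter.ai/api/v1"
interpreter.chat()
```

Common culprits are an invalid or expired API key, a model identifier that doesn't match OpenRouter's listing, and a malformed base URL.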
## Conclusion

By following the above steps, you should have successfully integrated DeepSeek V3 via OpenRouter.ai into your Open Interpreter setup. This integration harnesses the advanced capabilities of DeepSeek, providing a more powerful and flexible language model experience. If you encounter any further issues or have specific questions, feel free to reach out by commenting on [Issue #1572](#1572) or joining the [Open Interpreter Discord](https://discord.com/invite/open-interpreter) for real-time support. Happy coding!
wow so we can use o1, deepseek, whatever we want! thank you
**Is your feature request related to a problem? Please describe.**

No response

**Describe the solution you'd like**

Hi. I want to use the Qwen-turbo model in Open Interpreter. I checked the documentation and did not find the Qwen series hosted models. How can I add support for them? Is there a tutorial or guide to follow? Thanks.

**Describe alternatives you've considered**

No response

**Additional context**

No response