
Does it support Qwen series hosted model? #1572

Open
ruiqurm opened this issue Dec 22, 2024 · 2 comments


ruiqurm commented Dec 22, 2024

Is your feature request related to a problem? Please describe.

No response

Describe the solution you'd like

Hi. I want to use the Qwen-turbo model with Open Interpreter. I checked the documentation but did not find the Qwen series of hosted models. How can I add support for them? Is there a tutorial or guide to follow?
Thanks.

Describe alternatives you've considered

No response

Additional context

No response


ebowwa commented Dec 27, 2024

Here is a concrete example of how to integrate DeepSeek V3 from OpenRouter.ai into your Open Interpreter setup, so you can use DeepSeek's language-model capabilities from within Open Interpreter.

Step-by-Step Integration Guide

1. Obtain Your OpenRouter.ai API Key

First, you need an account on [OpenRouter.ai](https://openrouter.ai/) and an API key:

  1. Sign up or log in at https://openrouter.ai/.

  2. Generate an API key from your account settings.

2. Install Open Interpreter

If you haven't installed Open Interpreter yet, you can do so using pip:

pip install open-interpreter

3. Configure Open Interpreter to Use DeepSeek V3 via OpenRouter.ai

There are two primary ways to configure Open Interpreter to use DeepSeek V3: from the command line or from a Python script.


A. Using Command Line

Open your terminal and run the following command, replacing YOUR_OPENROUTER_API_KEY with the API key you obtained:

interpreter --api_base "https://openrouter.ai/api/v1" --api_key "YOUR_OPENROUTER_API_KEY" --model "deepseek/deepseek-chat"

Example:

interpreter --api_base "https://openrouter.ai/api/v1" --api_key "sk-1234567890abcdef" --model "deepseek/deepseek-chat"

This command sets the API base to OpenRouter's endpoint, provides your API key, and specifies the DeepSeek V3 model.


B. Using a Python Script

If you prefer configuring within a Python environment, follow these steps:

  1. Create a Python script (e.g., use_deepseek.py):

     from interpreter import interpreter

     # Configure the interpreter to use DeepSeek V3 via OpenRouter.ai
     interpreter.llm.model = "deepseek/deepseek-chat"
     interpreter.llm.api_key = "YOUR_OPENROUTER_API_KEY"
     interpreter.llm.api_base = "https://openrouter.ai/api/v1"

     # Start an interactive chat session
     interpreter.chat()

  2. Replace the API key:

     Ensure you replace "YOUR_OPENROUTER_API_KEY" with your actual API key.

  3. Run the script:

     python use_deepseek.py

This script initializes Open Interpreter with DeepSeek V3 and starts an interactive chat session.
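Since the original question was about Qwen: the same configuration pattern should work for any hosted model behind an OpenAI-compatible endpoint. Below is a sketch only, not a verified recipe: the `openai/qwen-turbo` model name and the DashScope compatible-mode base URL are assumptions you should check against Alibaba Cloud's DashScope documentation for your account and region.

```python
from interpreter import interpreter

# Sketch: point Open Interpreter at DashScope's OpenAI-compatible endpoint.
# Both the base URL and the model name below are assumptions -- verify them
# in Alibaba Cloud's DashScope docs before relying on this.
interpreter.llm.model = "openai/qwen-turbo"  # "openai/" prefix = use the OpenAI-compatible protocol
interpreter.llm.api_key = "YOUR_DASHSCOPE_API_KEY"
interpreter.llm.api_base = "https://dashscope.aliyuncs.com/compatible-mode/v1"

# Start an interactive chat session against the Qwen endpoint
interpreter.chat()
```

If this works, any other Qwen hosted model should be reachable the same way by changing the model name.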


4. Testing the Integration

After configuring, it's essential to verify that the integration works as expected.

  1. Start Open Interpreter:

    • Command Line: If you used the command line method, running the command as shown above will start the interpreter.
    • Python Script: Running your Python script will start the interpreter.
  2. Execute a Test Command:

    Once the interpreter is running, try a simple command to ensure everything is functioning.

    Example:

    interpreter> Plot the stock prices of AAPL and MSFT for the last month.

    Expected Outcome:

    The interpreter should process the command using DeepSeek V3 and generate the corresponding plot.


5. Advanced Configuration (Optional)

You can further customize your setup based on your requirements.

  • Setting Environment Variables:

    Instead of passing the API key directly in commands or scripts, you can set it as an environment variable for better security.

    export OPENROUTER_API_KEY="YOUR_OPENROUTER_API_KEY"
    interpreter --api_base "https://openrouter.ai/api/v1" --api_key "$OPENROUTER_API_KEY" --model "deepseek/deepseek-chat"
  • Using a .env File:

    Create a .env file in your project directory:

    OPENROUTER_API_KEY=your_api_key_here

    Then, modify your Python script to load environment variables:

    import os
    from interpreter import interpreter
    from dotenv import load_dotenv
    
    load_dotenv()  # Load variables from .env
    
    interpreter.llm.model = "deepseek/deepseek-chat"
    interpreter.llm.api_key = os.getenv("OPENROUTER_API_KEY")
    interpreter.llm.api_base = "https://openrouter.ai/api/v1"
    
    interpreter.chat()

    Install python-dotenv if not already installed:

    pip install python-dotenv

6. Troubleshooting Tips

  • Invalid API Key:

    • Ensure that the API key is correctly copied without any extra spaces or characters.
  • Network Issues:

    • Verify your internet connection.
    • Ensure that https://openrouter.ai/api/v1 is accessible from your network.
  • Model Not Found:

    • Double-check the model name. It should be exactly "deepseek/deepseek-chat".
  • Verbose Mode:

    • Enable verbose mode to get detailed logs which can help in debugging.

    Command Line:

    interpreter --api_base "https://openrouter.ai/api/v1" --api_key "YOUR_OPENROUTER_API_KEY" --model "deepseek/deepseek-chat" --verbose

    Python Script:

    interpreter.verbose = True

Conclusion

By following the above steps, you should have successfully integrated DeepSeek V3 via OpenRouter.ai into your Open Interpreter setup. This integration harnesses the advanced capabilities of DeepSeek, providing a more powerful and flexible language model experience.

If you encounter any further issues or have specific questions, feel free to reach out by commenting on [Issue #1572](#1572) or joining the [Open Interpreter Discord](https://discord.com/invite/open-interpreter) for real-time support.

Happy coding!

@tristayunsub

wow so we can use o1, deepseek whatever we want! thank you
