Add CometAPI documentation to open-interpreter/docs/language-models/hosted-models/ #1645

@TensorNull

Description

Is your feature request related to a problem? Please describe.

CometAPI is already integrated with LiteLLM, so it should work with Open Interpreter like any other provider, but there is no page in the hosted-models documentation showing users how to configure and use it.

Describe the solution you'd like

Add CometAPI documentation to the open-interpreter/docs/language-models/hosted-models/ directory that includes:

  • Configuration instructions for CometAPI
  • API key setup steps
  • Available models and usage examples
  • Integration patterns with Open Interpreter
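As a starting point, the configuration instructions could follow the pattern used by other OpenAI-compatible providers in the hosted-models docs. A minimal sketch, assuming Open Interpreter's standard `--api_base`/`--model` CLI options; the model name is a placeholder, not a confirmed CometAPI listing:

```shell
# Point Open Interpreter at CometAPI's OpenAI-compatible endpoint.
# The key is read from the environment; substitute a model CometAPI offers.
export OPENAI_API_KEY="sk-..."   # your CometAPI key (placeholder)
interpreter --api_base https://api.cometapi.com/v1/ --model openai/gpt-4o-mini
```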

Describe alternatives you've considered

Users currently need to figure out CometAPI configuration through trial and error or by referencing LiteLLM documentation directly.

Additional context

CometAPI is an OpenAI-compatible API (base URL: https://api.cometapi.com/v1/) that works with existing OpenAI SDKs and LiteLLM. Since Open Interpreter uses LiteLLM, CometAPI should work seamlessly as a provider.
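To illustrate what "OpenAI-compatible" means here: requests follow the standard `/chat/completions` path and JSON schema, so existing OpenAI-style tooling only needs the base URL swapped. A minimal stdlib-only sketch (the API key and model name are placeholders, not values from CometAPI's docs):

```python
import json
from urllib.request import Request

COMETAPI_BASE = "https://api.cometapi.com/v1"  # base URL from this issue
API_KEY = "sk-..."                             # placeholder CometAPI key

def build_chat_request(model: str, user_message: str) -> Request:
    """Build an OpenAI-style chat completion request against CometAPI.

    Because the API is OpenAI-compatible, the endpoint path and JSON
    body follow the standard chat-completions schema.
    """
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": user_message}],
    }).encode()
    return Request(
        f"{COMETAPI_BASE}/chat/completions",
        data=body,
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("gpt-4o-mini", "Hello")  # placeholder model name
print(req.full_url)  # https://api.cometapi.com/v1/chat/completions
```

In practice Open Interpreter would not issue this request by hand; LiteLLM constructs the same shape internally once the base URL and key are set, which is why no code changes should be needed, only documentation.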

We can assist with writing the documentation according to your project's standards and guidelines.
