Description
Is your feature request related to a problem? Please describe.
CometAPI is already integrated with LiteLLM and should work with Open Interpreter like other providers, but there's no documentation in the hosted models section to guide users on how to configure and use CometAPI.
Describe the solution you'd like
Add CometAPI documentation to the open-interpreter/docs/language-models/hosted-models/ directory that includes:
- Configuration instructions for CometAPI
- API key setup steps
- Available models and usage examples
- Integration patterns with Open Interpreter
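As a starting point for the configuration instructions above, here is a minimal sketch of how CometAPI could be wired into Open Interpreter through its Python API. The `interpreter.llm.*` attributes follow Open Interpreter's documented LiteLLM passthrough settings; the model name is illustrative and would need to be replaced with one from CometAPI's model list.

```python
# Configuration sketch (not verified against CometAPI): route Open
# Interpreter's LiteLLM backend at CometAPI's OpenAI-compatible endpoint.
from interpreter import interpreter

interpreter.llm.api_base = "https://api.cometapi.com/v1/"
interpreter.llm.api_key = "YOUR_COMETAPI_KEY"       # from the CometAPI console
interpreter.llm.model = "openai/gpt-4o-mini"        # illustrative model name;
                                                    # "openai/" tells LiteLLM to
                                                    # use the OpenAI-compatible path

interpreter.chat("Hello")
```

This mirrors the pattern used by the other hosted-model pages in `docs/language-models/hosted-models/`, which generally show both the Python attributes and the equivalent CLI flags (`--api_base`, `--api_key`, `--model`).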
Describe alternatives you've considered
Users currently need to figure out CometAPI configuration through trial and error, or by referencing the LiteLLM documentation directly.
Additional context
CometAPI is an OpenAI-compatible API (base URL: https://api.cometapi.com/v1/) that works with existing OpenAI SDKs and LiteLLM. Since Open Interpreter uses LiteLLM, CometAPI should work seamlessly as a provider.
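To illustrate what "OpenAI-compatible" means in practice, the sketch below builds (without sending) a standard OpenAI-style chat-completions request against the CometAPI base URL. The endpoint path and payload shape are the standard OpenAI chat-completions format; the model name is a placeholder, not taken from CometAPI's model list.

```python
# Builds the request an OpenAI SDK or LiteLLM would send to CometAPI.
# No network call is made; this only demonstrates the wire format.
import json

BASE_URL = "https://api.cometapi.com/v1"

def build_chat_request(api_key: str, model: str, prompt: str):
    """Return (url, headers, body) for an OpenAI-style chat completion."""
    url = f"{BASE_URL}/chat/completions"
    headers = {
        "Authorization": f"Bearer {api_key}",   # standard bearer-token auth
        "Content-Type": "application/json",
    }
    body = json.dumps({
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    })
    return url, headers, body

url, headers, body = build_chat_request("sk-example", "gpt-4o-mini", "Hello")
print(url)  # https://api.cometapi.com/v1/chat/completions
```

Because this is exactly the format LiteLLM emits for OpenAI-compatible providers, no code changes should be needed in Open Interpreter itself, only documentation.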
CometAPI Resources:
- Website: https://www.cometapi.com/?utm_source=open-interpreter&utm_campaign=integration&utm_medium=integration&utm_content=integration
- Documentation: https://api.cometapi.com/doc
- Models: https://api.cometapi.com/v1/models
- API Key: https://api.cometapi.com/console/token
- Pricing: https://api.cometapi.com/pricing
We can assist with writing the documentation according to your project's standards and guidelines.