
Other LLM models support? #8

Open
Pythonpa opened this issue Jul 5, 2024 · 6 comments
Labels
enhancement New feature or request help wanted Extra attention is needed

Comments


Pythonpa commented Jul 5, 2024

Hey, thanks for your hard work.
Do you have a plan for this project to support other LLM APIs or local models, such as Kimi, Tongyi Qianwen (通义千问), iFlytek Spark (讯飞星火), or Ollama?

yinan-c (Owner) commented Jul 6, 2024

Thanks for your attention. Other LLMs would be great, but unfortunately they are not planned at the moment, since I haven't really used them myself. Hopefully someone else can submit a PR on this!

@yinan-c yinan-c added enhancement New feature or request help wanted Extra attention is needed labels Jul 6, 2024
@yinan-c yinan-c mentioned this issue Jul 6, 2024
yinan-c (Owner) commented Jul 21, 2024

As a workaround, with updates 3b1aec6 and 96a7e0b, you should be able to use OneAPI (thanks to its OpenAI-compatible API) together with a customised model name.
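The workaround above relies on the gateway speaking the same wire protocol as OpenAI, so only the base URL and model name need to change. A minimal sketch of such a request, with a hypothetical local OneAPI endpoint and placeholder model name (the URL, key, and model below are illustrative, not the project's actual configuration):

```python
# Sketch: building an OpenAI-style chat request for an OpenAI-compatible
# gateway such as OneAPI. Endpoint, key, and model name are placeholders.
import json
import urllib.request


def build_chat_request(base_url: str, api_key: str, model: str, prompt: str):
    """Build an OpenAI-compatible /chat/completions request."""
    payload = {
        "model": model,  # any name the gateway maps to a backend model
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{base_url}/chat/completions",
        data=json.dumps(payload).encode(),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
    )


req = build_chat_request(
    "http://localhost:3000/v1",  # hypothetical OneAPI endpoint
    "sk-placeholder",            # key issued by the gateway, not by OpenAI
    "qwen-turbo",                # customised model name
    "Summarize this RSS entry.",
)
# urllib.request.urlopen(req) would send it; omitted here (no live gateway).
print(req.full_url)
```

Because the request shape is unchanged, the same code path works whether the gateway forwards to OpenAI, Ollama, or another backend.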

Felix2yu commented Oct 2, 2024

Tested Ollama and it works correctly. Right now every subscription requires the model to be filled in manually. If possible, I'd like a global setting for adding custom models so they only need to be picked from a dropdown; being able to set a default model would be even better.

te-chan2 (Contributor) commented

Hi!
I encountered a model support problem while using Azure OpenAI.

How about using litellm for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI in this project as well.

NOTE:

There seem to be models that cannot specify response_format: {type: "json"}.
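For context, LiteLLM routes by a provider prefix in the model name and reads Azure credentials from environment variables. A sketch of the setup described above; all values are placeholders, and the actual call (commented out) requires `pip install litellm` plus a live Azure deployment:

```python
# Sketch (untested against a live deployment): routing completions through
# LiteLLM so the same code can target Azure OpenAI. Placeholders throughout.
import os

# LiteLLM reads Azure credentials from these environment variables.
AZURE_ENV = {
    "AZURE_API_KEY": "<key from the Azure portal>",
    "AZURE_API_BASE": "https://<resource>.openai.azure.com/",
    "AZURE_API_VERSION": "2024-02-15-preview",  # example version string
}
os.environ.update(AZURE_ENV)

# The "azure/" prefix selects the Azure provider; the remainder is the
# deployment name you chose in Azure, not the underlying model id.
MODEL = "azure/gpt-4o"

# With the environment set, the call mirrors the OpenAI SDK:
#   import litellm
#   response = litellm.completion(
#       model=MODEL,
#       messages=[{"role": "user", "content": "Hello"}],
#   )
#   print(response.choices[0].message.content)
```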

yinan-c (Owner) commented Oct 29, 2024

Hi! I encountered a model support problem while using Azure OpenAI.

How about using litellm for model support? In my case, by specifying 'azure/gpt-4o' as the model name and setting the environment variables correctly, I was able to use Azure OpenAI in this project as well.

Thanks. I will check out the repo when I have time. In the meantime, I am not familiar with Azure OpenAI: do you need to set the base_url to call the Azure OpenAI models?

NOTE:

There seem to be models that cannot specify response_format: {type: "json"}.

I think it will be fine. I have set up a fallback option to call the AI model without JSON mode.
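The fallback described above can be expressed as a small wrapper: try structured JSON output first, and retry without `response_format` when the provider rejects it. A sketch, where `call_model` is a hypothetical stand-in for the real completion call (e.g. an OpenAI or LiteLLM client method):

```python
# Sketch of a JSON-mode fallback. `call_model` is a hypothetical callable
# standing in for the real completion function.
def complete_with_json_fallback(call_model, messages):
    """Try structured JSON output first; fall back to plain text on failure."""
    try:
        return call_model(messages, response_format={"type": "json_object"})
    except Exception:
        # Some models/providers reject response_format entirely, so retry
        # the identical request without it.
        return call_model(messages, response_format=None)
```

In practice the plain-text fallback path then needs its own parsing, since the model is no longer guaranteed to emit valid JSON.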

te-chan2 (Contributor) commented

do you need to set the base_url to call the azure openai models?

Yes. Azure OpenAI requires base_url, api_key, api_version, and deployment_name.
(This is useful for specifying the deployment location.)

Using LiteLLM, these variables can be configured via environment variables.
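This explains why all four values are needed: Azure's REST endpoint embeds the deployment name and API version in the URL itself, while the key travels in a header. A sketch of the URL shape (the resource name and version below are placeholders):

```python
# Sketch of Azure OpenAI's chat completions URL: deployment name and
# api-version are part of the URL; the api_key goes in a request header.
def azure_chat_url(base_url: str, deployment: str, api_version: str) -> str:
    """Build the Azure OpenAI chat completions endpoint URL."""
    return (
        f"{base_url.rstrip('/')}/openai/deployments/{deployment}"
        f"/chat/completions?api-version={api_version}"
    )


print(azure_chat_url(
    "https://<resource>.openai.azure.com/",  # placeholder resource
    "gpt-4o",                                # deployment name, not model id
    "2024-02-15-preview",                    # example api-version
))
```

LiteLLM assembles this URL internally from the environment variables, which is why only the `azure/<deployment_name>` model string appears in application code.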

Development

No branches or pull requests

4 participants