feat: move /completions under /v1/completions #112
Conversation
Can confirm this works 👌 with Continue.dev using
LGTM! 👍 - this is somewhat related to #42, but we can ponder on that later and just get this more compatible with other APIs.
We will need to fix the wiki next: https://github.com/louisgv/local.ai/wiki
Side-note: I wonder if we can add both v1 and also the base API? (So that when we update to v2, we can point the base to the latest version :-? ...)
@tomasmcm can you investigate if that's possible?
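For illustration, here is a minimal sketch of what serving both paths could look like. It uses FastAPI (the framework behind the llama-cpp-python server referenced below); local.ai's actual server is implemented differently, so the handler shape and response body here are assumptions:

```python
from fastapi import FastAPI

app = FastAPI()

# Hypothetical handler - the real completion logic lives in the server.
async def create_completion() -> dict:
    return {"object": "text_completion", "choices": []}

# Register the same handler under the versioned path and a base alias.
# When a v2 API lands, the base alias can be re-pointed at the v2 handler.
app.add_api_route("/v1/completions", create_completion, methods=["POST"])
app.add_api_route("/completions", create_completion, methods=["POST"])
```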
Actually... it would be much better if we exposed the baseURL as a config instead - so that folks can put
EDIT: scratch this idea - it's basically #42
@louisgv agreed, one step at a time. Having a compatible API is essential for many use cases. Making it customisable is even better but that can be done next 😊
This PR updates the server endpoints in local.ai to match the endpoints from llama-cpp-python.
Specifically, it changes:
- `/completions` to `/v1/completions`, as in llama-cpp-python/llama_cpp/server/app.py:599
- `/model` to `/v1/models`, as in llama-cpp-python/llama_cpp/server/app.py:805
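To sanity-check the renamed endpoints, a quick client-side sketch; the base URL and port are assumptions, so adjust them to wherever the local.ai server is listening:

```python
import requests

BASE = "http://localhost:8000"  # assumed address of the local.ai server

# Previously GET /model; now matches llama-cpp-python's GET /v1/models
print(requests.get(f"{BASE}/v1/models").json())

# Previously POST /completions; now POST /v1/completions
resp = requests.post(
    f"{BASE}/v1/completions",
    json={"prompt": "Hello, world", "max_tokens": 16},
)
print(resp.json())
```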