Just to clarify, you want your phone to act as a server?
Yes
Why not use Ollama, Llama.cpp, etc.?
That would require users to set up Termux and install (compile from source) those apps, since they are not available as standard Android apps. Besides, why use a separate LLM backend for the server when it can all be done in one app? And since llama.rn is built on llama.cpp, I think it should have a built-in server.
and how would you like to connect to this phone? from another phone?
From other apps on the same phone; an option to expose it to other devices on the network would also be nice.
can you please elaborate regarding the use case?
To use it as a backend for other apps/services, such as a GPT-powered keyboard, or for developers to build on and experiment with the API from their own apps.
Description
I want to use the openai library to talk to local models hosted in PocketPal. Is it possible to start PocketPal as a daemon, similar to Ollama?
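For context, here is a minimal sketch of what calling such a server from another app could look like, assuming a hypothetical OpenAI-compatible endpoint at `http://localhost:8080/v1`. The port, path, and model name are placeholders for illustration; PocketPal does not expose any of this today:

```python
import json
from urllib import request

# Hypothetical local endpoint -- PocketPal does not currently expose this;
# the port and path are placeholders for illustration only.
BASE_URL = "http://localhost:8080/v1"

def build_chat_request(prompt: str) -> request.Request:
    """Build an OpenAI-style /chat/completions request for a local server."""
    payload = {
        "model": "local-model",  # placeholder model name
        "messages": [{"role": "user", "content": prompt}],
    }
    return request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={"Content-Type": "application/json"},
        method="POST",
    )

# The request is only built here, not sent, since no server is running;
# sending it would be `request.urlopen(req)` against a live endpoint.
req = build_chat_request("Hello from another app on the same phone")
```

Any client that speaks the OpenAI chat-completions wire format (including the openai library, by setting its `base_url`) could then target the phone the same way it targets Ollama or a llama.cpp server.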
Use Case
Allow offline, on-device LLM calls from other apps.