I would suggest adding this local LLM runner as well. It's an Ollama wrapper that lets you use open-source LLM models without any API key.
Link: https://msty.app/
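For context, Ollama-style runners serve a keyless HTTP API on localhost, which is what makes the "no API key" part work. A minimal sketch of calling such an endpoint (the default Ollama port and the model name are assumptions, and Msty's own setup may differ):

```python
import json
import urllib.request

# Assumed default Ollama-compatible endpoint; everything stays on your machine.
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(prompt: str, model: str = "llama3") -> dict:
    # Note: no api_key field anywhere; local runners don't authenticate.
    return {"model": model, "prompt": prompt, "stream": False}

def ask_local_llm(prompt: str, model: str = "llama3") -> str:
    req = urllib.request.Request(
        OLLAMA_URL,
        data=json.dumps(build_payload(prompt, model)).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

Point the URL at whatever local server the wrapper starts and it should work the same way.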