It's basically just a wrapper. Call it a framework, if you like: one with a standard way of doing things (dictated by the specification).
Or, in other words:
You could take your locally installed Ollama with a locally installed model that supports tool calling, and write custom tools for everything you need. That is time-consuming, and a repetitive task that everyone has to redo.
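To make that "custom tools" half concrete, here is a minimal sketch of plain function calling against a local Ollama instance. The endpoint and payload shape follow Ollama's chat API; the `get_weather` tool and its schema are hypothetical, invented for illustration, and the model name is an assumption (any tool-calling-capable model works):

```python
import json
import requests

# Hypothetical tool schema, hand-written for one function (assumption:
# "get_weather" is illustrative; you would implement and describe it yourself).
tools = [{
    "type": "function",
    "function": {
        "name": "get_weather",
        "description": "Get the current weather for a city",
        "parameters": {
            "type": "object",
            "properties": {"city": {"type": "string"}},
            "required": ["city"],
        },
    },
}]

# Ask a locally running, tool-calling-capable model (model name is an assumption).
response = requests.post("http://localhost:11434/api/chat", json={
    "model": "llama3.1",
    "messages": [{"role": "user", "content": "What's the weather in Berlin?"}],
    "tools": tools,
    "stream": False,
})
message = response.json()["message"]

# The model does not run anything; it only *asks* for a call.
# You have to dispatch each request to your own implementation by hand.
for call in message.get("tool_calls", []):
    fn = call["function"]
    print(fn["name"], json.dumps(fn["arguments"]))
```

Every function like this has to be written, schema-described, and dispatched by hand, per project, which is exactly the repetitive part.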
Or you could write a specification in the hope that it becomes the next standard (probably the first standard) for how AIs should talk to the outside world, and let other people do the job of writing those functions by implementing their own MCP servers: a shop integrating its API, a business owner integrating their HTTP endpoints, a big online company integrating its OpenAPI spec, and so on. Then you sit back and connect your tool-calling-capable LLM to whatever service you want. Now it is just one server you connect to, and you get 50 tools back (hence why I said it is just a wrapper).
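And here is the MCP side of the same idea, sketched with the official Python SDK (the `mcp` package): one connection to a server, and the server hands you its whole catalog of tools. Treat this as a sketch based on the SDK's documented client usage, not a drop-in snippet; the `server.py` command and the `some_tool` name are placeholders for whatever MCP server you actually connect to:

```python
import asyncio

from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

async def main() -> None:
    # Placeholder: launch whatever MCP server you want to talk to
    # (a shop's API wrapper, a business's HTTP endpoints, ...).
    params = StdioServerParameters(command="python", args=["server.py"])

    async with stdio_client(params) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()

            # One connection, and the server advertises all of its tools:
            # this is the "connect once, get 50 tools back" part.
            listing = await session.list_tools()
            for tool in listing.tools:
                print(tool.name, "-", tool.description)

            # Calling any advertised tool goes through the same generic method,
            # so the client-side code never changes per service.
            result = await session.call_tool("some_tool", arguments={"arg": "value"})
            print(result)

asyncio.run(main())
```

The tool schemas you would otherwise hand-write come back from `list_tools()`, already described by the server, so they can be forwarded to any tool-calling model unchanged.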
I have always said to people that LLMs are just the "brain" (if you will); they are missing hands and feet.
Think of this effort of making LLMs tool-calling capable and defining a standard around it as giving LLMs tentacles that spread around the world.
Could you explain the difference between MCP and function calling? Thanks.