MCP supports providing completion suggestions for prompt arguments and resource template parameters. With the context parameter, servers can provide completions based on previously resolved values:
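For example, a server-side completion handler might look like the following sketch, which uses the Python SDK's FastMCP `completion()` decorator (decorator and type names follow recent SDK versions; the URI template and suggestion values are purely illustrative):

```python
from mcp.server.fastmcp import FastMCP
from mcp.types import (
    Completion,
    CompletionArgument,
    CompletionContext,
    PromptReference,
    ResourceTemplateReference,
)

mcp = FastMCP(name="completion-example")


@mcp.completion()
async def handle_completion(
    ref: PromptReference | ResourceTemplateReference,
    argument: CompletionArgument,
    context: CompletionContext | None,
) -> Completion | None:
    # Suggest values for the {repo} parameter of a resource template, narrowing the
    # suggestions when the {owner} argument was already resolved (available via context).
    if isinstance(ref, ResourceTemplateReference):
        if ref.uri == "github://repos/{owner}/{repo}" and argument.name == "repo":
            if context and context.arguments and context.arguments.get("owner") == "modelcontextprotocol":
                repos = ["python-sdk", "typescript-sdk", "inspector"]
                return Completion(values=[r for r in repos if r.startswith(argument.value)])
    return None
```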
examples/clients/simple-chatbot/README.MD
+6 −4 lines changed: 6 additions & 4 deletions
@@ -25,11 +25,12 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
```plaintext
LLM_API_KEY=your_api_key_here
```
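The chatbot reads this key at startup. A minimal sketch of the usual python-dotenv pattern (an assumption; see `main.py` for the actual loading code):

```python
import os

from dotenv import load_dotenv

load_dotenv()  # pulls LLM_API_KEY from the .env file into the environment
api_key = os.getenv("LLM_API_KEY")
if not api_key:
    raise ValueError("LLM_API_KEY not found in environment")
```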
+
**Note:** The current implementation is configured to use the Groq API endpoint (`https://api.groq.com/openai/v1/chat/completions`) with the `llama-3.2-90b-vision-preview` model. If you plan to use a different LLM provider, you'll need to modify the `LLMClient` class in `main.py` to use the appropriate endpoint URL and model parameters.
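Switching providers only touches the LLM client. A hypothetical sketch of what that change looks like (the class, attribute, and method names here are assumptions, not the actual code in `main.py`):

```python
import httpx


class LLMClient:
    """Minimal client for an OpenAI-compatible chat completions endpoint (sketch only)."""

    def __init__(self, api_key: str) -> None:
        self.api_key = api_key
        # Swap these two values to target a different provider and model.
        self.endpoint = "https://api.groq.com/openai/v1/chat/completions"
        self.model = "llama-3.2-90b-vision-preview"

    def get_response(self, messages: list[dict[str, str]]) -> str:
        headers = {"Authorization": f"Bearer {self.api_key}"}
        payload = {"model": self.model, "messages": messages}
        response = httpx.post(self.endpoint, headers=headers, json=payload, timeout=30.0)
        response.raise_for_status()
        return response.json()["choices"][0]["message"]["content"]
```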
3. **Configure servers:**
- The `servers_config.json` follows the same structure as Claude Desktop, allowing for easy integration of multiple servers.
+ The `servers_config.json` follows the same structure as Claude Desktop, allowing for easy integration of multiple servers.
Here's an example:
```json
@@ -46,9 +47,11 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
}
}
```
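For orientation, a minimal illustrative `servers_config.json` (the server name, command, and arguments below are placeholders, not necessarily the example's actual entries):

```json
{
  "mcpServers": {
    "sqlite": {
      "command": "uvx",
      "args": ["mcp-server-sqlite", "--db-path", "./test.db"]
    }
  }
}
```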
+
Environment variables are supported as well. Pass them as you would with the Claude Desktop App.
Example:
+
```json
{
"mcpServers": {
@@ -72,7 +75,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
```
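As a sketch, a server entry that passes environment variables through to the server process (all names and values below are placeholders):

```json
{
  "mcpServers": {
    "example-server": {
      "command": "npx",
      "args": ["-y", "example-mcp-server"],
      "env": {
        "API_KEY": "your_api_key_here"
      }
    }
  }
}
```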
2. **Interact with the assistant:**
-
+
The assistant will automatically detect available tools and can respond to queries based on the tools provided by the configured servers.
3. **Exit the session:**
@@ -86,6 +89,7 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
- **Server Integration**: Supports any MCP-compatible server, tested with various server implementations including Uvicorn and Node.js.
### Class Structure
+
- **Configuration**: Manages environment variables and server configurations
- **Server**: Handles MCP server initialization, tool discovery, and execution
- **Tool**: Represents individual tools with their properties and formatting
@@ -107,5 +111,3 @@ This example demonstrates how to integrate the Model Context Protocol (MCP) into
- If it's a direct response → return to user
- Tool results are sent back to LLM for interpretation
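The flow above could be sketched roughly as follows (method and helper names are assumptions rather than the example's actual code; `parse_tool_call` in particular is a hypothetical helper):

```python
async def chat_loop(session, llm_client) -> None:
    """Route LLM output: execute tool calls via MCP, otherwise reply directly."""
    messages: list[dict[str, str]] = []
    while True:
        user_input = input("You: ")
        if user_input.lower() in ("quit", "exit"):
            break
        messages.append({"role": "user", "content": user_input})
        reply = llm_client.get_response(messages)

        tool_call = parse_tool_call(reply)  # hypothetical helper: detect a JSON tool request
        if tool_call is None:
            # Direct response -> return it to the user.
            print(f"Assistant: {reply}")
            messages.append({"role": "assistant", "content": reply})
        else:
            # Execute the tool on the MCP server, then send the result back to
            # the LLM so it can interpret it for the user.
            result = await session.call_tool(tool_call["tool"], tool_call["arguments"])
            messages.append({"role": "system", "content": f"Tool result: {result}"})
            final = llm_client.get_response(messages)
            print(f"Assistant: {final}")
            messages.append({"role": "assistant", "content": final})
```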