<img src=".github/resources/web-interface.png" alt="MCP Web Interface" width="700">
</p>

### LLM Integration
MCP Tools includes a powerful LLM integration that enables AI models to interact with MCP servers through natural language. The LLM command creates an interactive chat session where AI can discover and use tools from one or more MCP servers on your behalf.
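As a minimal starting point, a session against a single filesystem server can be launched with the `-M` flag (this assumes `-M` accepts the full server command, as in the example session later in this section):

```bash
mcp llm -M "npx -y @modelcontextprotocol/server-filesystem ~"
```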
- **Multi-Provider Support**: Works with major LLM providers:
  - OpenAI (default) - Uses API key from `OPENAI_API_KEY`
  - Anthropic - Uses API key from `ANTHROPIC_API_KEY`

- **Multiple Server Integration**: Connect up to 3 MCP servers simultaneously:

  ```bash
  mcp llm -M "server1" -M "server2" -M "server3"
  ```

  Tools are automatically prefixed with server IDs (`s1_`, `s2_`, `s3_`) to avoid naming conflicts.

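For example, if two connected servers each expose a `read_file` tool, the model sees them under distinct prefixed names (an illustrative listing, not actual command output):

```
s1_read_file   # read_file on server 1
s2_read_file   # read_file on server 2
```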
- **Tool Execution**: LLMs can:
  - Discover available tools across all connected servers
  - Call tools with proper parameters
  - Receive and process tool results
  - Make multiple tool calls in a single turn

- **Server Aliases**: Use server aliases for simpler commands:

  ```bash
  # Using aliases for servers
  mcp llm -M fs-server -M github-server
  ```

#### Example Session
```
mcp > MCP LLM Shell
mcp > Connecting to server 1: npx -y @modelcontextprotocol/server-filesystem ~
mcp > Server 1: Registered 8 tools
mcp > Using provider: openai, model: gpt-4o
mcp > Total registered tools: 8
mcp > Type 'exit' to quit

user > What files are in my current directory?

agent > I'll check the files in your current directory.

[Calling s1_list_dir]
mcp > [Server 1 running list_dir with params {"path":"."}]
{
  "entries": [
    {
      "name": "README.md",
      "type": "file",
      "size": 12345,
      "modTime": "2023-05-01T12:34:56Z"
    },
    {
      "name": "src",
      "type": "directory"
    }
  ]
}

In your current directory, you have:

1. README.md (file, 12.1 KB)
2. src (directory)

user > Show me the contents of README.md

agent > I'll show you the contents of README.md.

[Calling s1_read_file]
mcp > [Server 1 running read_file with params {"path":"README.md"}]
{
  "content": "# My Project\nThis is a sample README file."
}

Here's the content of README.md:

# My Project
This is a sample README file.
```
#### Configuration Options
446
+
447
+
```bash
# Provider selection
mcp llm --provider openai    # Default
mcp llm --provider anthropic

# Model selection
mcp llm --model gpt-4o             # Default for OpenAI
mcp llm --model claude-3.7-sonnet  # Default for Anthropic

# API key override (otherwise uses environment variables)
mcp llm --api-key "your-api-key-here"

# Display options
mcp llm --no-color  # Disable colored output
```
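These flags can be combined in a single invocation; for instance, a sketch using only the options documented above:

```bash
mcp llm --provider anthropic --model claude-3.7-sonnet -M fs-server --no-color
```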
The LLM integration is designed to make AI-driven workflow automation with MCP tools intuitive and powerful, allowing models to perform complex tasks by combining available tools through natural language requests.
### Project Scaffolding
MCP Tools provides a scaffolding feature to quickly create new MCP servers with TypeScript: