Ready-to-use template for building multi-agent apps with Smolagents. It ships with:
- Gradio chat UI
- MCP client manager
- A manager agent coordinating two worker agents (tool-calling + code)
- OpenTelemetry tracing support with Phoenix
Use this as a minimal, extensible starting point to create your own agents, tools, and UIs.
- Manager agent delegates to two workers:
  - Tool-calling worker uses DeepWiki MCP tools (queries public GitHub repos).
  - Code worker uses a native Joke toolkit (`get_joke`).
- `MCPManager` loads clients from JSON and exposes their tools to agents.
- `GradioAgentUI` launches a chat interface for the manager agent.
Simple flow:

```mermaid
flowchart LR
    User[User] --> UI[Gradio UI]
    UI --> Manager["Manager (Tool-Calling Agent)"]
    Manager --> DeepwikiWorker["Worker (Tool-Calling Agent)"]
    Manager --> JokeWorker["Worker (Code Agent)"]
    DeepwikiWorker --> DeepWiki["DeepWiki MCP Tools"]
    JokeWorker --> JokeAPI["Joke Toolkit (Native Tools)"]
```
Prereqs: Python 3.13+ and uv for dependency management.
```bash
# Clone and enter the project
git clone <this-repo-url>
cd smolagents-quickstart-template

# Create your environment from pyproject/uv.lock
uv sync

# Copy env and fill required values
cp env.example .env
```

Edit `.env` and set at least:
- `LITELLM_MODEL_ID` (e.g., `gemini/gemini-2.5-flash` or your preferred model id)
- `LITELLM_API_KEY` (API key for the selected model provider)
- `MCP_CONFIG_PATH=mcp_manager/mcp_config.json` (path to the included MCP config)
Notes:
- The included MCP config defines a `deepwiki` client using streamable-http.
- Other env entries are optional unless you run your own MCP servers.
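Put together, a filled-in `.env` might look like this (values illustrative; substitute your own model id and key):

```
LITELLM_MODEL_ID=gemini/gemini-2.5-flash
LITELLM_API_KEY=your-api-key-here
MCP_CONFIG_PATH=mcp_manager/mcp_config.json
```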
This template uses `LiteLLMModel` in the example agents so you can plug into many providers with a single interface.
- Configure via `.env`: set `LITELLM_MODEL_ID` and `LITELLM_API_KEY`.
- Examples: `gemini/gemini-2.5-flash`, `openai/gpt-4o-mini`, etc. (provider/model naming per LiteLLM conventions).
- You can swap to other Smolagents-supported backends by replacing `LiteLLMModel` in your agents.
See Smolagents docs for supported model backends and parameters:
- Docs: https://huggingface.co/docs/smolagents
- GitHub (examples): https://github.com/huggingface/smolagents
Launch the manager agent with a Gradio chat UI:
```bash
uv run python app.py
```

The terminal will print the Gradio URLs (local and a temporary share link). Open the printed URL in your browser to chat with the manager agent.
This template includes OpenTelemetry tracing support using Phoenix and OpenInference for Smolagents. Tracing is enabled by default to monitor agent interactions and performance.
To inspect traces from your runs:
```bash
python -m phoenix.server.main serve
```

You can then navigate to http://0.0.0.0:6006/projects/ to inspect your run!
- Manager Agent (`ExampleManagerAgent`): Orchestrates work and delegates to workers.
- Tool-Calling Worker (`ExampleToolCallingAgent`): Uses DeepWiki MCP tools for repo knowledge.
- Code Worker (`ExampleCodeAgent`): Uses a native Joke toolkit (`get_joke`).
High-level flow (from `main.py`):
- Initialize `MCPManager` and set up MCP clients (`setup_clients`).
- Fetch tools: MCP tools from `deepwiki` and native tools from `ExampleJokeToolkit`.
- Create workers: `ExampleToolCallingAgent` (MCP tools) and `ExampleCodeAgent` (native tools).
- Create `ExampleManagerAgent` with `managed_agents=[tool_calling_agent.agent, code_agent.agent]`.
- Launch the Gradio chat UI via `GradioAgentUI(agent=manager_agent).launch()`.
- On exit, disconnect MCP clients (`mcp_manager.disconnect_all()`).
Simple phases diagram:

```mermaid
flowchart LR
    MCP[MCP setup + tools] --> Agents[Agent creation]
    Agents --> UI[Gradio UI + chat loop]
```
- `app.py`: Entrypoint (`python app.py` calls `main()`).
- `main.py`: Wires MCP clients, builds agents, launches the Gradio UI.
- `agents/base_agent.py`: Base classes for simple and manager agents.
- `agents/example_tool_calling_agent.py`: Tool-calling worker using DeepWiki MCP tools.
- `agents/example_code_agent.py`: Code worker using a native Joke toolkit.
- `agents/example_manager_agent.py`: Manager agent coordinating the workers.
- `toolkits/example_joke_toolkit.py`: Native tool (`@tool get_joke`) exposed to agents.
- `prompts/prompts.py`: Instruction strings for each example agent.
- `mcp_manager/mcp_manager.py`: Loads MCP client configs and exposes tools.
- `mcp_manager/mcp_config.json`: Example config with a `deepwiki` client.
- `ui/gradio_agent_ui.py`: Minimal Gradio chat wrapper for an agent.
- `ui/base_ui.py`: UI abstraction.
- `env.example`: Template for `.env` variables.
- `pyproject.toml`: Project metadata and dependencies.
- Add a file under `agents/` and inherit from `BaseAgent` (or `BaseManagerAgent`).
- Inside `__init__`, build a Smolagents agent (`ToolCallingAgent` or `CodeAgent`) and assign it to `self.agent`.
Example (tool-calling agent skeleton, with an illustrative model and instructions configuration):

```python
# agents/my_agent.py
import os

from smolagents import LiteLLMModel, Tool
from smolagents.agents import ToolCallingAgent

from agents.base_agent import BaseAgent
from prompts import prompts


class MyAgent(BaseAgent):
    def __init__(self, tools: list[Tool]):
        super().__init__()
        self.agent = ToolCallingAgent(
            tools=tools,
            model=LiteLLMModel(
                model_id=os.getenv("LITELLM_MODEL_ID"),
                api_key=os.getenv("LITELLM_API_KEY"),
            ),
            instructions=prompts.MY_AGENT_INSTRUCTIONS,  # define in prompts/prompts.py
        )

    def run(self, message, history=None):
        return self.agent.run(message)
```

Example (code agent skeleton):
```python
# agents/my_code_agent.py
import os

from smolagents import LiteLLMModel, Tool
from smolagents.agents import CodeAgent

from agents.base_agent import BaseAgent
from prompts import prompts


class MyCodeAgent(BaseAgent):
    def __init__(self, tools: list[Tool]):
        super().__init__()
        self.agent = CodeAgent(
            tools=tools,
            model=LiteLLMModel(
                model_id=os.getenv("LITELLM_MODEL_ID"),
                api_key=os.getenv("LITELLM_API_KEY"),
            ),
            instructions=prompts.MY_CODE_AGENT_INSTRUCTIONS,  # define in prompts/prompts.py
        )

    def run(self, message, history=None):
        return self.agent.run(message)
```

Tip: Both agents accept additional parameters (e.g., `return_full_result`, step limits, etc.). See Smolagents docs for details:
- Docs: https://huggingface.co/docs/smolagents
- API reference and examples: https://github.com/huggingface/smolagents
- Put instruction strings in `prompts/prompts.py` (e.g., `MY_AGENT_INSTRUCTIONS = "..."`).
- Reference them via the `instructions=` parameter when constructing the agent.
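For instance, an entry might look like this (the constant name follows the convention above; the wording is illustrative):

```python
# prompts/prompts.py (illustrative entry)
MY_AGENT_INSTRUCTIONS = (
    "You are a focused worker agent. "
    "Use your tools to answer the user's question, then report back concisely."
)
```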
- Create a function in `toolkits/` and decorate it with `@tool`.
- Return it in a `get_tools()` list for easy import.
```python
# toolkits/my_tools.py
from smolagents import tool


@tool
def my_utility(x: int) -> int:
    """Double the given integer.

    Args:
        x: The integer to double.
    """
    return x * 2


class MyToolkit:
    @staticmethod
    def get_tools():
        return [my_utility]
```

Note: the `@tool` decorator requires a docstring that describes the tool and each argument; it is used to build the tool's schema for the agent.

- Edit `mcp_manager/mcp_config.json` to add a client, for example:
```json
{
  "deepwiki": { "url": "https://mcp.deepwiki.com/mcp", "transport": "streamable-http" },
  "my_client": { "url": "https://my-mcp.example.com/mcp", "transport": "streamable-http" }
}
```

- Ensure `.env` has `MCP_CONFIG_PATH=mcp_manager/mcp_config.json` (or your custom path).
- In `main.py`, fetch tools with `mcp_manager.get_tools("my_client")`.
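Each top-level key in the config names a client. The template's `MCPManager` handles the actual loading; as a standalone sketch (not the template's loader), parsing this config shape is just JSON keyed by client name:

```python
import json

# Same shape as mcp_manager/mcp_config.json: one entry per MCP client.
CONFIG_JSON = """
{
  "deepwiki": {"url": "https://mcp.deepwiki.com/mcp", "transport": "streamable-http"},
  "my_client": {"url": "https://my-mcp.example.com/mcp", "transport": "streamable-http"}
}
"""

config = json.loads(CONFIG_JSON)
for name, client in config.items():
    print(f"{name}: {client['transport']} -> {client['url']}")
```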
- Import your agents and toolkits, instantiate them, pass tools, then include them in the manager's `managed_agents` list and UI:
```python
import os

from mcp_manager.mcp_manager import MCPManager
from toolkits.my_tools import MyToolkit
from agents.example_tool_calling_agent import ExampleToolCallingAgent
from agents.example_code_agent import ExampleCodeAgent
from agents.example_manager_agent import ExampleManagerAgent
from ui.gradio_agent_ui import GradioAgentUI

# 1) Setup MCP manager and clients
mcp_manager = MCPManager(os.getenv("MCP_CONFIG_PATH", "mcp_manager/mcp_config.json"))
mcp_manager.setup_clients()

# 2) Fetch tools
deepwiki_tools = mcp_manager.get_tools("deepwiki")
joke_tools = MyToolkit.get_tools()

# 3) Create workers
tool_worker = ExampleToolCallingAgent(tools=deepwiki_tools)
code_worker = ExampleCodeAgent(tools=joke_tools)

# 4) Create manager and launch UI
manager_agent = ExampleManagerAgent(tools=[], managed_agents=[tool_worker.agent, code_worker.agent])
ui = GradioAgentUI(agent=manager_agent)
ui.launch()
```

- Change host/sharing in `ui/gradio_agent_ui.py` (`server_name`, `share`).
- No MCP tools? Verify `MCP_CONFIG_PATH` points to a valid JSON file (e.g., `mcp_manager/mcp_config.json`).
- Model/auth errors? Set `LITELLM_MODEL_ID` and `LITELLM_API_KEY` in `.env`.
- Connectivity issues? Ensure your network can reach the MCP endpoint(s).
- Smolagents documentation: https://huggingface.co/docs/smolagents
- Smolagents GitHub (examples, issues): https://github.com/huggingface/smolagents