
Commit ca3fc05

update docs with tool calling helpers info (#833)
Signed-off-by: Filinto Duran <[email protected]>
1 parent 41f44c5 commit ca3fc05

File tree

2 files changed: +302 −0 lines changed

daprdocs/content/en/python-sdk-docs/_index.md

Lines changed: 7 additions & 0 deletions
@@ -67,6 +67,13 @@ Python SDK imports are subpackages included with the main SDK install, but need
     <a href="{{% ref python-actor %}}" class="stretched-link"></a>
   </div>
 </div>
+<div class="card">
+  <div class="card-body">
+    <h5 class="card-title"><b>Conversation</b></h5>
+    <p class="card-text">Use the Dapr Conversation API (Alpha) for LLM interactions, tools, and multi-turn flows.</p>
+    <a href="{{% ref conversation %}}" class="stretched-link"></a>
+  </div>
+</div>
 </div>

 Learn more about _all_ of the [available Dapr Python SDK imports](https://github.com/dapr/python-sdk/tree/master/dapr).
Lines changed: 295 additions & 0 deletions
@@ -0,0 +1,295 @@
---
title: "Conversation API (Python) – Recommended Usage"
linkTitle: "Conversation"
weight: 11000
type: docs
description: Recommended patterns for using the Dapr Conversation API in Python with and without tools, including multi-turn flows and safety guidance.
---

The Dapr Conversation API is currently in alpha. This page presents the recommended, minimal patterns to use it effectively with the Python SDK:

- Plain requests (no tools)
- Requests with tools (functions as tools)
- Multi-turn flows with tool execution
- Async variants
- Important safety notes for executing tool calls

## Prerequisites

- [Dapr CLI]({{% ref install-dapr-cli.md %}}) installed
- Initialized [Dapr environment]({{% ref install-dapr-selfhost.md %}})
- [Python 3.9+](https://www.python.org/downloads/) installed
- [Dapr Python package]({{% ref "python#installation" %}}) installed
- A configured LLM component (for example, OpenAI or Azure OpenAI) in your Dapr environment

For full, end-to-end flows and provider setup, see the SDK examples under Conversation:

- [TOOL-CALL-QUICKSTART.md](https://github.com/dapr/python-sdk/blob/main/examples/conversation/TOOL-CALL-QUICKSTART.md)
- [real_llm_providers_example.py](https://github.com/dapr/python-sdk/blob/main/examples/conversation/real_llm_providers_example.py)

## Plain conversation (no tools)

```python
from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

# Build a single-turn Alpha2 input
user_msg = conversation.create_user_message("What's Dapr?")
alpha2_input = conversation.ConversationInputAlpha2(messages=[user_msg])

with DaprClient() as client:
    resp = client.converse_alpha2(
        name="echo",  # replace with your LLM component name
        inputs=[alpha2_input],
        temperature=1,
    )

for msg in resp.to_assistant_messages():
    if msg.of_assistant.content:
        print(msg.of_assistant.content[0].text)
```

Key points:

- Use `conversation.create_user_message` to build messages.
- Wrap messages into `ConversationInputAlpha2(messages=[...])` and pass them to `converse_alpha2`.
- Use `response.to_assistant_messages()` to iterate over assistant outputs.

## Tools: decorator-based (recommended)

Decorator-based tools offer a clean, ergonomic approach. Define a function with clear type hints and a detailed docstring (both are important for the LLM to understand how and when to invoke the tool), then decorate it with `@conversation.tool`. Registered tools can be passed to the LLM and invoked via tool calls.

```python
from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_weather(location: str, unit: str = 'fahrenheit') -> str:
    """Get current weather for a location."""
    # Replace with a real implementation
    return f"Weather in {location} (unit={unit})"

user_msg = conversation.create_user_message("What's the weather in Paris?")
alpha2_input = conversation.ConversationInputAlpha2(messages=[user_msg])

with DaprClient() as client:
    response = client.converse_alpha2(
        name="openai",  # your LLM component
        inputs=[alpha2_input],
        tools=conversation.get_registered_tools(),  # tools registered by @conversation.tool
        tool_choice='auto',
        temperature=1,
    )

# Inspect assistant messages, including any tool calls
for msg in response.to_assistant_messages():
    if msg.of_assistant.tool_calls:
        for tc in msg.of_assistant.tool_calls:
            print(f"Tool call: {tc.function.name} args={tc.function.arguments}")
    elif msg.of_assistant.content:
        print(msg.of_assistant.content[0].text)
```

Notes:

- Use `conversation.get_registered_tools()` to collect all functions decorated with `@conversation.tool`.
- The binder validates and coerces parameters using your function signature. Keep annotations accurate.

## Minimal multi-turn with tools

This is the go-to loop for tool-using conversations:

{{% alert title="Warning" color="warning" %}}
Do not blindly auto-execute tool calls returned by the LLM unless you trust all registered tools. Treat tool names and arguments as untrusted input.
- Validate inputs and enforce guardrails (allow-listed tools, argument schemas, side-effect constraints).
- For async or I/O-bound tools, prefer `conversation.execute_registered_tool_async(..., timeout=...)` and set conservative timeouts.
- Consider adding a policy layer or a user confirmation step before execution in sensitive contexts.
- Log and monitor tool usage; fail closed when validation fails.
{{% /alert %}}

```python
from dapr.clients import DaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_weather(location: str, unit: str = 'fahrenheit') -> str:
    return f"Weather in {location} (unit={unit})"

history: list[conversation.ConversationMessage] = [
    conversation.create_user_message("What's the weather in San Francisco?")
]

with DaprClient() as client:
    # Turn 1
    resp1 = client.converse_alpha2(
        name="openai",
        inputs=[conversation.ConversationInputAlpha2(messages=history)],
        tools=conversation.get_registered_tools(),
        tool_choice='auto',
        temperature=1,
    )

    # Append assistant messages; execute tool calls; append tool results
    for msg in resp1.to_assistant_messages():
        history.append(msg)
        for tc in msg.of_assistant.tool_calls:
            # IMPORTANT: validate inputs and enforce guardrails in production
            tool_output = conversation.execute_registered_tool(
                tc.function.name, tc.function.arguments
            )
            history.append(
                conversation.create_tool_message(
                    tool_id=tc.id, name=tc.function.name, content=str(tool_output)
                )
            )

    # Turn 2 (LLM sees the tool result)
    history.append(conversation.create_user_message("Should I bring an umbrella?"))
    resp2 = client.converse_alpha2(
        name="openai",
        inputs=[conversation.ConversationInputAlpha2(messages=history)],
        tools=conversation.get_registered_tools(),
        temperature=1,
    )

    for msg in resp2.to_assistant_messages():
        history.append(msg)
        if not msg.of_assistant.tool_calls and msg.of_assistant.content:
            print(msg.of_assistant.content[0].text)
```

Tips:

- Always append assistant messages to history.
- Execute each tool call (with validation) and append a tool message with the tool output.
- The next turn includes these tool results so the LLM can reason with them.

## Functions as tools: alternatives

When decorators aren’t practical, two options exist.

A) Automatic schema from a typed function:

```python
from enum import Enum

from dapr.clients.grpc import conversation

class Units(Enum):
    CELSIUS = 'celsius'
    FAHRENHEIT = 'fahrenheit'

def get_weather(location: str, unit: Units = Units.FAHRENHEIT) -> str:
    return f"Weather in {location}"

fn = conversation.ConversationToolsFunction.from_function(get_weather)
weather_tool = conversation.ConversationTools(function=fn)
```

B) Manual JSON Schema (fallback):

```python
from dapr.clients.grpc import conversation

fn = conversation.ConversationToolsFunction(
    name='get_weather',
    description='Get current weather',
    parameters={
        'type': 'object',
        'properties': {
            'location': {'type': 'string'},
            'unit': {'type': 'string', 'enum': ['celsius', 'fahrenheit']},
        },
        'required': ['location'],
    },
)
weather_tool = conversation.ConversationTools(function=fn)
```

## Async variant

Use the asynchronous client and async tool execution helpers as needed.

```python
import asyncio

from dapr.aio.clients import DaprClient as AsyncDaprClient
from dapr.clients.grpc import conversation

@conversation.tool
def get_time() -> str:
    return '2025-01-01T12:00:00Z'

async def main():
    async with AsyncDaprClient() as client:
        msg = conversation.create_user_message('What time is it?')
        inp = conversation.ConversationInputAlpha2(messages=[msg])
        resp = await client.converse_alpha2(
            name='openai', inputs=[inp], tools=conversation.get_registered_tools()
        )
        for m in resp.to_assistant_messages():
            if m.of_assistant.content:
                print(m.of_assistant.content[0].text)

asyncio.run(main())
```

If you need to execute tools asynchronously (for example, for network I/O), implement async functions and use `conversation.execute_registered_tool_async` with timeouts.

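The timeout guidance above can be sketched with plain `asyncio` from the standard library. This is illustrative only and does not use the Dapr SDK helper; `slow_tool` and `run_with_timeout` are hypothetical names:

```python
import asyncio

# Hypothetical slow tool used only for illustration.
async def slow_tool() -> str:
    await asyncio.sleep(0.01)
    return 'done'

async def run_with_timeout() -> str:
    # Analogous to execute_registered_tool_async(..., timeout=...):
    # bound the tool's execution time and fail closed on expiry.
    try:
        return await asyncio.wait_for(slow_tool(), timeout=1.0)
    except asyncio.TimeoutError:
        return 'tool timed out'

print(asyncio.run(run_with_timeout()))
```

The same fail-closed pattern applies when passing `timeout=` to the SDK helper: treat a timeout as a validation failure and report it to the user rather than retrying blindly.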
## Safety and validation (must-read)

An LLM may suggest tool calls. Treat all model-provided parameters as untrusted input.

Recommendations:
- Register only trusted functions as tools. Prefer the `@conversation.tool` decorator for clarity and automatic schema generation.
- Use precise type annotations and docstrings. The SDK converts function signatures to JSON Schema and binds parameters with type coercion and rejection of unexpected/invalid fields.
- Add guardrails for tools that can cause side effects (filesystem, network, subprocess). Consider allow-lists, sandboxing, and limits.
- Validate arguments before execution. For example, sanitize file paths or restrict URLs/domains.
- Consider timeouts and concurrency controls. For async tools, pass a timeout to `execute_registered_tool_async(..., timeout=...)`.
- Log and monitor tool usage. Fail closed: if validation fails, avoid executing the tool and inform the user safely.

See also the inline notes in `dapr/clients/grpc/conversation.py` (e.g., `tool()`, `ConversationTools`, `execute_registered_tool`) for parameter binding and error handling details.

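The allow-list and validation recommendations above can be sketched as a small wrapper in plain Python. This is illustrative only, not an SDK API: `ALLOWED_TOOLS`, `guarded_execute`, and the `registry` dict are hypothetical names standing in for whatever policy layer you put in front of `execute_registered_tool`:

```python
import json

# Hypothetical allow-list: only these tool names may ever run.
ALLOWED_TOOLS = {'get_weather'}

def guarded_execute(name, arguments, registry):
    """Execute a tool only if it is allow-listed and its arguments parse cleanly."""
    if name not in ALLOWED_TOOLS:
        # Fail closed: an unknown or disallowed tool name is rejected outright.
        raise PermissionError(f'Tool {name!r} is not allow-listed')
    # LLMs commonly return arguments as a JSON string; parse before use.
    params = json.loads(arguments) if isinstance(arguments, str) else dict(arguments)
    if not isinstance(params, dict):
        raise ValueError('Tool arguments must be a JSON object')
    return registry[name](**params)

# Hypothetical in-process registry mapping tool names to callables.
registry = {'get_weather': lambda location, unit='fahrenheit': f'Weather in {location}'}
print(guarded_execute('get_weather', '{"location": "Paris"}', registry))
```

In a real deployment the wrapper would also enforce argument schemas and side-effect constraints before delegating to `conversation.execute_registered_tool`.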
## Key helper methods (quick reference)

This section summarizes the helper utilities in `dapr.clients.grpc.conversation` used throughout the examples.

- `create_user_message(text: str) -> ConversationMessage`
  - Builds a user-role message for Alpha2. Use in history lists.
  - Example: `history.append(conversation.create_user_message("Hello"))`

- `create_system_message(text: str) -> ConversationMessage`
  - Builds a system message to steer the assistant’s behavior.
  - Example: `history = [conversation.create_system_message("You are a concise assistant.")]`

- `create_assistant_message(text: str) -> ConversationMessage`
  - Useful for injecting assistant text in tests or controlled flows.

- `create_tool_message(tool_id: str, name: str, content: Any) -> ConversationMessage`
  - Converts a tool’s output into a tool message the LLM can read next turn.
  - `content` can be any object; it is stringified safely by the SDK.
  - Example: `history.append(conversation.create_tool_message(tool_id=tc.id, name=tc.function.name, content=conversation.execute_registered_tool(tc.function.name, tc.function.arguments)))`

- `get_registered_tools() -> list[ConversationTools]`
  - Returns all tools currently registered in the in-process registry.
  - Includes tools created via the `@conversation.tool` decorator (auto-registered by default) and via `ConversationToolsFunction.from_function` with `register=True` (the default).
  - Pass this list in `converse_alpha2(..., tools=...)`.

- `register_tool(name: str, t: ConversationTools)` / `unregister_tool(name: str)`
  - Manually manage the tool registry (e.g., advanced scenarios, tests, cleanup).
  - Names must be unique; unregister to avoid collisions in long-lived processes.

- `execute_registered_tool(name: str, params: Mapping | Sequence | str | None) -> Any`
  - Synchronously executes a registered tool by name.
  - `params` accepts kwargs (mapping), args (sequence), a JSON string, or `None`. If a JSON string is provided (as commonly returned by LLMs), it is parsed for you.
  - Parameters are validated and coerced against the function signature/schema; unexpected or invalid fields raise errors.
  - Security: treat `params` as untrusted; add guardrails for side effects.

- `execute_registered_tool_async(name: str, params: Mapping | Sequence | str | None, *, timeout: float | None = None) -> Any`
  - Async counterpart. Supports timeouts, which are recommended for I/O-bound tools.
  - Prefer this for async tools or when using the aio client.

- `ConversationToolsFunction.from_function(func: Callable, register: bool = True) -> ConversationToolsFunction`
  - Derives a JSON schema from a typed Python function (annotations plus an optional docstring) and optionally registers a tool.
  - Typical usage: `spec = conversation.ConversationToolsFunction.from_function(my_func)`; then either rely on auto-registration or wrap with `ConversationTools(function=spec)` and call `register_tool(spec.name, tool)` or pass `[tool]` directly to `tools=`.

- `ConversationResponseAlpha2.to_assistant_messages() -> list[ConversationMessage]`
  - Convenience method that transforms the response outputs into assistant `ConversationMessage` objects you can append to history directly (including `tool_calls` when present).

Tip: The `@conversation.tool` decorator is the easiest way to create a tool. It auto-generates the schema from your function, allows an optional namespace/name override, and auto-registers the tool (set `register=False` to defer registration).
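As a rough illustration of the kind of conversion `from_function` performs, here is a simplified plain-Python sketch of deriving a JSON-Schema-style dict from a typed signature. `schema_from_function` and `PY_TO_JSON` are hypothetical; the SDK's real implementation handles more types, enums, docstrings, and validation:

```python
import inspect

# Minimal mapping from Python annotations to JSON Schema type names.
PY_TO_JSON = {str: 'string', int: 'integer', float: 'number', bool: 'boolean'}

def schema_from_function(func):
    """Derive a simple JSON-Schema-style parameters dict from type hints."""
    sig = inspect.signature(func)
    props, required = {}, []
    for name, param in sig.parameters.items():
        props[name] = {'type': PY_TO_JSON.get(param.annotation, 'string')}
        # Parameters without a default value are required.
        if param.default is inspect.Parameter.empty:
            required.append(name)
    return {'type': 'object', 'properties': props, 'required': required}

def get_weather(location: str, unit: str = 'fahrenheit') -> str:
    return f'Weather in {location}'

print(schema_from_function(get_weather))
```

The derived dict has the same shape as the `parameters=` argument shown in the manual JSON Schema example above, which is why accurate annotations matter: they are the source of the schema the LLM sees.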
