## Pydantic Logfire
[Pydantic Logfire](https://pydantic.dev/logfire) is an observability platform developed by the team who created and maintain Pydantic and PydanticAI. Logfire aims to let you understand your entire application: Gen AI, classic predictive AI, HTTP traffic, database queries and everything else a modern application needs, all using OpenTelemetry.
!!! tip "Pydantic Logfire is a commercial product"
21
21
Logfire is a commercially supported, hosted platform with an extremely generous and perpetual [free tier](https://pydantic.dev/pricing/).
Here's a minimal example of instrumenting PydanticAI with Logfire:

```python
import logfire

from pydantic_ai import Agent

logfire.configure()  # (1)!
logfire.instrument_pydantic_ai()  # (2)!

agent = Agent('openai:gpt-4o', instructions='Be concise, reply with one sentence.')
result = agent.run_sync('Where does "hello world" come from?')  # (3)!
print(result.output)
"""
The first known use of "hello, world" was in a 1974 textbook about the C programming language.
"""
```
1. [`logfire.configure()`][logfire.configure] configures the SDK; by default it will find the write token from the `.logfire` directory, but you can also pass a token directly (see the sketch after this example).
2. [`logfire.instrument_pydantic_ai()`][logfire.Logfire.instrument_pydantic_ai] enables instrumentation of PydanticAI.
3. Since we've enabled instrumentation, a trace will be generated for each run, with spans emitted for model calls and tool function execution.

_(This example is complete, it can be run "as is")_
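As noted in annotation 1 above, `logfire.configure()` looks for a write token in the `.logfire` directory by default, but you can also pass one explicitly. Here's a minimal sketch, assuming the token is available in an environment variable of your choosing (the name `MY_LOGFIRE_WRITE_TOKEN` is purely illustrative):

```python
import os

import logfire

# Pass a Logfire write token explicitly instead of relying on the `.logfire`
# directory; reading it from an environment variable is only one option.
logfire.configure(token=os.environ['MY_LOGFIRE_WRITE_TOKEN'])
logfire.instrument_pydantic_ai()
```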
Once you have Logfire set up, there are two primary ways it can help you understand your application:

* **Debugging** — Using the live view to see what's happening in your application in real-time.
* **Monitoring** — Using SQL and dashboards to observe the behavior of your application; Logfire is effectively a SQL database that stores information about how your application is running.

The [logfire documentation](https://logfire.pydantic.dev/docs/) has more details on how to use Logfire, including how to instrument other libraries like [HTTPX](https://logfire.pydantic.dev/docs/integrations/http-clients/httpx/) and [FastAPI](https://logfire.pydantic.dev/docs/integrations/web-frameworks/fastapi/).

Since Logfire is built on [OpenTelemetry](https://opentelemetry.io/), you can use the Logfire Python SDK to send data to any OpenTelemetry collector; see [below](#using-opentelemetry).
### Debugging
### Monitoring HTTP Requests

!!! tip "F**k you, show me the prompt."

    As per Hamel Husain's influential 2024 blog post ["Fuck You, Show Me The Prompt."](https://hamel.dev/blog/posts/prompt/)
    (bear with the capitalization, the point is valid), it's often useful to be able to view the raw HTTP requests and responses made to model providers.

To observe raw HTTP requests made to model providers, you can use `logfire`'s [HTTPX instrumentation](https://logfire.pydantic.dev/docs/integrations/http-clients/httpx/) since all provider SDKs use the [HTTPX](https://www.python-httpx.org/) library internally.
```python
import logfire

from pydantic_ai import Agent

logfire.configure()
logfire.instrument_pydantic_ai()
logfire.instrument_httpx(capture_all=True)  # (1)!

agent = Agent('openai:gpt-4o')
result = agent.run_sync('What is the capital of France?')
print(result.output)
#> Paris
```

1. See the [`logfire.instrument_httpx` docs][logfire.Logfire.instrument_httpx] for more details; `capture_all=True` means both headers and body are captured for both the request and response.


`httpx` instrumentation might be of particular utility if you're using a custom `httpx` client in your model in order to get insights into your custom requests.
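For example, here's a minimal sketch of instrumenting a custom `httpx` client that is then handed to a model. The `OpenAIModel`/`OpenAIProvider` classes and their `http_client` parameter are assumptions based on PydanticAI's OpenAI provider; adjust for your provider of choice:

```python
import httpx
import logfire

from pydantic_ai import Agent
from pydantic_ai.models.openai import OpenAIModel
from pydantic_ai.providers.openai import OpenAIProvider

logfire.configure()
logfire.instrument_pydantic_ai()

# Instrument only this client, so the captured requests are exactly the ones
# your custom client sends.
http_client = httpx.AsyncClient(timeout=30)
logfire.instrument_httpx(http_client, capture_all=True)

model = OpenAIModel('gpt-4o', provider=OpenAIProvider(http_client=http_client))
agent = Agent(model)
result = agent.run_sync('What is the capital of France?')
print(result.output)
```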
## Using OpenTelemetry
PydanticAI's instrumentation uses [OpenTelemetry](https://opentelemetry.io/) (OTel), which Logfire is based on.

This means you can debug and monitor PydanticAI with any OpenTelemetry backend.

PydanticAI follows the [OpenTelemetry Semantic Conventions for Generative AI systems](https://opentelemetry.io/docs/specs/semconv/gen-ai/), so while we think you'll have the best experience using the Logfire platform :wink:, you should be able to use any OTel service with GenAI support.
### Logfire with an alternative OTel backend

You can use the Logfire SDK completely freely and send the data to any OpenTelemetry backend.

Here's an example of configuring the Logfire library to send data to the excellent [otel-tui](https://github.com/ymtdzzz/otel-tui) — an open source terminal based OTel backend and viewer (no association with Pydantic).

Run `otel-tui` with docker (see [the otel-tui readme](https://github.com/ymtdzzz/otel-tui) for more instructions):

```txt title="Terminal"
docker run --rm -it -p 4318:4318 --name otel-tui ymtdzzz/otel-tui:latest
```

then run,
```python
import os

import logfire

from pydantic_ai import Agent

os.environ['OTEL_EXPORTER_OTLP_ENDPOINT'] = 'http://localhost:4318'  # (1)!

logfire.configure(send_to_logfire=False)  # (2)!
logfire.instrument_pydantic_ai()

agent = Agent('openai:gpt-4o')
result = agent.run_sync('What is the capital of France?')
print(result.output)
#> Paris
```

1. Set the `OTEL_EXPORTER_OTLP_ENDPOINT` environment variable to the URL of your OpenTelemetry backend. If you're using a backend that requires authentication, you may need to set [other environment variables](https://opentelemetry.io/docs/languages/sdk-configuration/otlp-exporter/). Of course, these can also be set outside the process, e.g. with `export OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318`.
2. We [configure][logfire.configure] Logfire to disable sending data to the Logfire OTel backend itself. If you removed `send_to_logfire=False`, data would be sent to both Logfire and your OpenTelemetry backend.
Running the above code will send tracing data to `otel-tui`, which will display like this:



Running the [weather agent](examples/weather-agent.md) example connected to `otel-tui` shows how it can be used to visualise a more complex trace.

For more information on using the Logfire SDK to send data to alternative backends, see the [Alternative backends](https://logfire.pydantic.dev/docs/how-to-guides/alternative-backends/) guide in the Logfire documentation.

### OpenTelemetry without Logfire

You can also emit OpenTelemetry data from PydanticAI without using Logfire at all.
To do this, you'll need to install and configure the OpenTelemetry packages you need. To run the following examples, use:

```txt title="Terminal"
uv run \
  --with 'pydantic-ai-slim[openai]' \
  --with opentelemetry-sdk \
  --with opentelemetry-exporter-otlp \
  raw_otel.py
```
```python {title="raw_otel.py" test="skip"}
import os

from opentelemetry.exporter.otlp.proto.http.trace_exporter import OTLPSpanExporter
from opentelemetry.sdk.trace import TracerProvider
from opentelemetry.sdk.trace.export import BatchSpanProcessor
from opentelemetry.trace import set_tracer_provider

from pydantic_ai import Agent

# Export spans over OTLP to the backend started above (otel-tui listening on port 4318).
os.environ['OTEL_EXPORTER_OTLP_ENDPOINT'] = 'http://localhost:4318'
tracer_provider = TracerProvider()
tracer_provider.add_span_processor(BatchSpanProcessor(OTLPSpanExporter()))
set_tracer_provider(tracer_provider)

agent = Agent('openai:gpt-4o', instrument=True)  # enable PydanticAI's OTel instrumentation
result = agent.run_sync('What is the capital of France?')
print(result.output)
#> Paris
```
## Data format
PydanticAI follows the [OpenTelemetry Semantic Conventions for Generative AI systems](https://opentelemetry.io/docs/specs/semconv/gen-ai/), with one caveat. The semantic conventions specify that messages should be captured as individual events (logs) that are children of the request span. By default, PydanticAI instead collects these events into a JSON array which is set as a single large attribute called `events` on the request span. To change this, use [`InstrumentationSettings(event_mode='logs')`][pydantic_ai.agent.InstrumentationSettings].
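For instance, here's a minimal sketch of opting into the event-per-log format; the model name is arbitrary, and passing the same settings to `Agent.instrument_all()` applies them to every agent:

```python
from pydantic_ai import Agent
from pydantic_ai.agent import InstrumentationSettings

# Emit each message as an individual OTel event (log) instead of the default
# single JSON-array `events` attribute on the request span.
settings = InstrumentationSettings(event_mode='logs')

agent = Agent('openai:gpt-4o', instrument=settings)
# or apply to every agent:
# Agent.instrument_all(settings)
```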