7 changes: 5 additions & 2 deletions docs/content/docs/agent-sdk/integrations/meta.json
```diff
@@ -1,4 +1,7 @@
 {
   "title": "Integrations",
-  "pages": ["hud"]
-}
+  "pages": [
+    "hud",
+    "observability"
+  ]
+}
```
62 changes: 62 additions & 0 deletions docs/content/docs/agent-sdk/integrations/observability.mdx
@@ -0,0 +1,62 @@
---
title: Observability
description: Trace CUA execution steps and sessions
---

## Observability

CUA has a native integration with [Laminar](https://laminar.sh/), an open-source platform for tracing, evals, and labeling of autonomous AI agents. Read more in the [Laminar docs](https://docs.lmnr.ai/).

## Setup

Register on [Laminar Cloud](https://laminar.sh/) or spin up a [local instance](https://github.com/lmnr-ai/lmnr), then copy the project API key from your project settings and set it as the `LMNR_PROJECT_API_KEY` environment variable.

```bash
pip install 'lmnr[all]'
export LMNR_PROJECT_API_KEY=your-key
```
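If you prefer not to rely on the environment variable, the key can also be passed explicitly when initializing the SDK. A minimal sketch, assuming the `project_api_key` argument of `Laminar.initialize` in the lmnr Python SDK (a self-hosted instance additionally needs its own connection settings; see the Laminar docs):

```python
import os

from lmnr import Laminar

# Pass the key explicitly instead of relying on the LMNR_PROJECT_API_KEY
# environment variable being picked up automatically. For a self-hosted
# instance, also point the SDK at your deployment (see the Laminar docs).
Laminar.initialize(project_api_key=os.getenv("LMNR_PROJECT_API_KEY"))
```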

## Usage

Then, initialize Laminar at the entry point of your application and register the Laminar LiteLLM callback; every CUA step will be traced automatically.

```python
import asyncio
import os

import litellm

from agent import ComputerAgent
from computer import Computer
from lmnr import Laminar, LaminarLiteLLMCallback # [!code highlight]

Laminar.initialize() # [!code highlight]
litellm.callbacks.append(LaminarLiteLLMCallback()) # [!code highlight]

computer = Computer(
    os_type="linux",
    provider_type="cloud",
    name=os.getenv("CUA_CONTAINER_NAME"),
    api_key=os.getenv("CUA_API_KEY"),
)

agent = ComputerAgent(
    model="openai/computer-use-preview",
    tools=[computer],
)

async def main():
    async for step in agent.run("Create a new file called 'test.txt' in the current directory"):
        print(step["output"])

if __name__ == "__main__":
    asyncio.run(main())
```
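To group a whole task under a single parent span (useful when one task triggers many LLM calls and computer actions), you can wrap your entry point with Laminar's `observe` decorator. This is a sketch of that pattern, reusing the `agent` from the example above and assuming the LiteLLM callback attaches its spans to the surrounding trace context:

```python
import asyncio

from lmnr import observe

@observe(name="cua_task")  # parent span for the whole task
async def run_task(prompt: str):
    # `agent` is the ComputerAgent defined in the example above
    async for step in agent.run(prompt):
        print(step["output"])

if __name__ == "__main__":
    asyncio.run(run_task("Create a new file called 'test.txt' in the current directory"))
```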

## Viewing traces

You can view traces in the Laminar UI by going to the traces tab in your project. When you select a trace,
you will see all the agent execution steps, including computer actions, LLM calls, and screenshots.

For each step, you will see the LLM call and the resulting computer action; computer actions are highlighted in yellow in the timeline.

<img src="/docs/img/laminar_trace_example.png" alt="Example trace in Laminar showing the litellm.response span and its output." width="800px" />
2 changes: 1 addition & 1 deletion docs/content/docs/meta.json
```diff
@@ -16,4 +16,4 @@
   "---[CodeXml]API Reference---",
   "...libraries"
 ]
-}
+}
```
Binary file added docs/public/img/laminar_trace_example.png