SigNoz/signoz-mcp-demo
LangChain SigNoz MCP Demo

Using OpenInference and OpenTelemetry to send traces from your LangChain MCP agent app to SigNoz.

Getting Started

First, install all the necessary dependencies for the backend:

Optional: create a Python virtual environment:

python -m venv myenv && \
source myenv/bin/activate

Then:

pip install -r requirements.txt

Install all the necessary dependencies for the frontend:

cd frontend && \
npm install

Next, create a .env file in the root directory with the following:

OPENAI_API_KEY=<your-openai-api-key>
SIGNOZ_INGESTION_KEY=<your-signoz-ingestion-key>
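The backend reads these values from the environment. A minimal stdlib sketch of loading a `.env` file is shown below; the parser here is illustrative (the project may well use `python-dotenv` instead):

```python
import os
from pathlib import Path


def load_env(path: str = ".env") -> dict:
    """Parse simple KEY=VALUE lines from a .env file and export them.

    Blank lines and '#' comments are skipped; existing environment
    variables are not overwritten.
    """
    values = {}
    env_file = Path(path)
    if env_file.exists():
        for line in env_file.read_text().splitlines():
            line = line.strip()
            if not line or line.startswith("#") or "=" not in line:
                continue
            key, _, value = line.partition("=")
            values[key.strip()] = value.strip()
            os.environ.setdefault(key.strip(), value.strip())
    return values
```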

Run the FastAPI backend:

uvicorn main:app --reload --port 8001

Run the frontend:

cd frontend && \
npm start

Set up the MCP server: follow the README in the linked repo to set up and deploy the SigNoz MCP server locally (default port: 8000).

Open http://localhost:3000 with your browser to see the result and interact with the application.

After using the application, you should be able to view traces in SigNoz Cloud:

Traces


You can also create custom dashboards using these traces and span attributes:

Import Dashboard

1. Go to the Dashboards tab in SigNoz.
2. Click on + New Dashboard.
3. Go to Import JSON.
4. Import the langchain-mcp-dashboard.json file from the repo.

Your dashboard should now be imported.
