Fixed Grafana / Tempo communication
Grafana was no longer able to communicate with the Tempo backend after some recent config changes.

Checking the Grafana logs, I found:

```
2024-12-27 04:52:35 logger=grafana-apiserver t=2024-12-27T09:52:35.291900754Z level=info msg="[core] [Channel #2 SubChannel #3]grpc: addrConn.createTransport failed to connect to {Addr: \"tempo:3200\", ServerName: \"tempo:3200\", }. Err: connection error: desc = \"error reading server preface: http2: frame too large\""
```

After some googling, this appeared to indicate that the client (Grafana) was trying to connect to the backend (Tempo) over HTTP/2, while the backend only supported HTTP/1.1 on that port.

In particular, Tempo uses gRPC for internal communication, but by default it exposes its gRPC capabilities on the gRPC port, 4317; however, we had Grafana connecting to the HTTP port, 3200.
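
For context, here's a minimal sketch of how the Grafana side is typically provisioned; the file path and exact contents are illustrative, not this repo's actual config. The point is that the datasource `url` targets the HTTP port, 3200, which is where the failing gRPC connection was aimed:

```yaml
# Hypothetical provisioning file, e.g. provisioning/datasources/tempo.yml
apiVersion: 1

datasources:
  - name: Tempo
    type: tempo
    access: proxy
    url: http://tempo:3200   # HTTP port, not the gRPC port (4317)
```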

One option would be to expose the gRPC port and point Grafana at that instead (a rough sketch of what that could look like is below), but it seems this is common enough that there's a built-in [flag](https://github.com/grafana/tempo/blob/main/example/docker-compose/shared/tempo.yaml#L1) for it: `stream_over_http_enabled`.
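
A minimal sketch of that first option, assuming a docker-compose setup (the service name, image tag, and layout are illustrative; this is not what the commit does):

```yaml
# Hypothetical docker-compose fragment: declare Tempo's gRPC port alongside
# the HTTP port, so the datasource could be pointed at it directly.
services:
  tempo:
    image: grafana/tempo:latest
    expose:
      - "3200"   # HTTP API (where Grafana currently points)
      - "4317"   # gRPC port mentioned above
```

The Grafana datasource would then need to be repointed at the gRPC port, which is extra churn the `stream_over_http_enabled` flag avoids.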

I went with the flag instead: setting `stream_over_http_enabled: true` causes Tempo to expose its gRPC API over the HTTP port, so the HTTP/2 request that initiates the gRPC connection now succeeds. That's also how the [examples](https://github.com/grafana/tempo/blob/main/example/docker-compose/shared/tempo.yaml#L1) I saw were structured, and it was the first thing that worked.

See also this GitHub discussion for context:
open-telemetry/opentelemetry-collector#7680
Quantumplation committed Dec 27, 2024 · commit 1335555 (parent 85016df)
Showing 1 changed file with 1 addition and 0 deletions: monitoring/grafana-tempo/tempo.yml

```diff
@@ -1,3 +1,4 @@
+stream_over_http_enabled: true
 server:
   http_listen_port: 3200
```
