TempMap Renderer is a FastAPI service plus a lightweight browser editor for building multi-floor temperature maps and rendering live heatmap images for Home Assistant dashboards.
It runs as a single service, stores floorplans on disk, polls Home Assistant for sensor states, and renders PNGs (and timelapses) on demand.
- You draw your floorplan in a simple canvas editor (`/editor`).
- You bind sensors to spots on the floorplan.
- The backend polls Home Assistant for those sensor values.
- The renderer solves a heatmap and serves it as an image (`/render/live/{floor_id}.png`).
- Home Assistant displays the image on your dashboard.
Everything is stored on disk (defaults to /data) so you can restart the container without losing your work.
```bash
cp backend/config.example.yaml backend/config.yaml
python -m venv .venv
source .venv/bin/activate
pip install -r backend/requirements.txt
uvicorn backend.main:app --reload --host 0.0.0.0 --port 8000
```

Open the editor at:

http://localhost:8000/editor
Floorplans are stored under:
/data/floorplans/{floor_id}.json
```bash
docker build -f docker/Dockerfile -t tempmap-renderer .
docker run --rm -p 8000:8000 -v $(pwd)/data:/data tempmap-renderer
```

Then visit:

http://localhost:8000/editor
Note: In the container, the config path is `/app/backend/config.yaml` and the data path is `/data`.
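If you keep `config.yaml` on the host, you can bind-mount it to that path alongside the data directory. A minimal sketch (host paths are just examples):

```bash
docker run --rm -p 8000:8000 \
  -v $(pwd)/data:/data \
  -v $(pwd)/backend/config.yaml:/app/backend/config.yaml \
  tempmap-renderer
```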
Import the template in Unraid → Docker → Add Container → Template.
If you keep config.yaml in Unraid appdata, bind-mount it to:
/app/backend/config.yaml
- Open the editor at `/editor`.
- Create a floor (top-left menu).
- Draw walls and doors using the toolbar.
- Drop sensors where your physical sensors live.
- Name each sensor and assign a Home Assistant entity ID.
- Save the floorplan.
When saved, the floorplan becomes a JSON file at /data/floorplans/{floor_id}.json.
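To confirm the save, you can fetch the stored floorplan back through the API (the endpoint is listed in the API section below); `floor1` here is just an example floor id:

```bash
# Returns the saved floorplan JSON for the given floor id
curl http://localhost:8000/api/floorplans/floor1
```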
Copy the example config and edit the Home Assistant section:
```yaml
home_assistant:
  base_url: http://homeassistant.local:8123
  token: YOUR_LONG_LIVED_TOKEN
  refresh_seconds: 15
```

Tips:

- `base_url` must be reachable from the renderer (container → HA); a quick connectivity check is sketched below.
- Use a long-lived access token from your Home Assistant profile.
- `refresh_seconds` controls how often sensor states are pulled.
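One way to verify the URL and token from the machine (or container) running the renderer is to query Home Assistant's REST API directly; the entity ID below is just an example:

```bash
# Should return a JSON state object if the URL and token are valid
curl -s \
  -H "Authorization: Bearer YOUR_LONG_LIVED_TOKEN" \
  http://homeassistant.local:8123/api/states/sensor.living_room_temperature
```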
Once you have a floorplan, the live image URL is:
GET /render/live/{floor_id}.png
Example (floor id = floor1):
http://YOUR_HOST:8000/render/live/floor1.png
The renderer recalculates the heatmap each time the image is requested (using cached sensor values).
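To sanity-check the endpoint outside Home Assistant, you can pull the image (and, per the API list below, its JSON counterpart) with curl:

```bash
# Save the rendered heatmap locally
curl -o floor1.png http://YOUR_HOST:8000/render/live/floor1.png

# Inspect the JSON variant of the same render
curl -s http://YOUR_HOST:8000/render/live/floor1.json
```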
On a Home Assistant dashboard, a simple picture card can display the live image, for example:

```yaml
type: picture-entity
entity: sensor.living_room_temperature
image: http://YOUR_HOST:8000/render/live/floor1.png
name: Floor 1 Heatmap
```
The backend reads config once at startup. Use backend/config.example.yaml as a base.
```yaml
server:
  host: 0.0.0.0
  port: 8000
data:
  path: /data
home_assistant:
  base_url: http://homeassistant.local:8123
  token: YOUR_LONG_LIVED_TOKEN
  refresh_seconds: 15
render:
  default_grid:
    width: 400
    height: 250
  default_legend:
    min_f: 60
    max_f: 80
timelapse:
  frame_retention_hours: 48
  window_hours: 48
  sampling_seconds: 120
  target_duration_seconds: 60
  fps: 10
  output_path: /data/timelapses
  rolling_enabled: true
  rolling_interval_seconds: 900
  stitch_multi_floor: true
  border_px: 12
  label_font_size: 18
```

- `path`: base directory for floorplans, frames, and timelapses. Defaults to `/data`.
- Override with `TEMP_MAP_DATA_PATH` if you need a different path (see the example after this list).
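For example, to point the service at a different data directory when running it directly (assuming the variable is read at startup, as described above):

```bash
# /srv/tempmap is just an example host path
TEMP_MAP_DATA_PATH=/srv/tempmap \
  uvicorn backend.main:app --host 0.0.0.0 --port 8000
```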
Defaults used for new floorplans created in the editor.
- `default_grid`: solver grid size (lower = faster, higher = smoother).
- `default_legend`: default min/max temperature range for the legend.
Controls rolling timelapse generation and on-demand timelapses.
- `frame_retention_hours`: how long to keep cached PNG frames.
- `window_hours`: rolling timelapse time window.
- `sampling_seconds`: base sampling cadence for frames.
- `target_duration_seconds`: target output video length.
- `fps`: frames per second for the output MP4.
- `output_path`: directory for rendered MP4s.
- `rolling_enabled`: enable periodic rolling generation.
- `rolling_interval_seconds`: how often to regenerate rolling timelapses.
- `stitch_multi_floor`: generate a combined `all/rolling.mp4`.
- `border_px`: padding between floors in stitched outputs.
- `label_font_size`: font size for floor labels.
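With rolling generation enabled, you can confirm that output is being produced by listing the configured `output_path` (the per-floor layout is an assumption; the stitched file is documented as `all/rolling.mp4`):

```bash
# List rendered timelapses under the configured output_path
ls -lR /data/timelapses
```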
Generate a timelapse on demand:
GET /api/timelapse/{floor_id}?window=48h&sampling_seconds=120&target_duration_seconds=60&fps=10&stitch=true
Parameters:
- `window`: duration string (`30m`, `12h`, `2d`) or hours as a number.
- `sampling_seconds`: base sampling interval in seconds.
- `target_duration_seconds`: target length of the output video.
- `fps`: frames per second.
- `stitch`: set `true` to stitch all floors when `floor_id=all`.
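For example (saving the response to a file, on the assumption that the endpoint returns the rendered video):

```bash
# Single floor, 48-hour window
curl -o floor1_timelapse.mp4 \
  "http://YOUR_HOST:8000/api/timelapse/floor1?window=48h&sampling_seconds=120&fps=10"

# All floors stitched into one video
curl -o all_floors.mp4 \
  "http://YOUR_HOST:8000/api/timelapse/all?window=48h&stitch=true"
```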
The service exposes the following endpoints:

- `GET /editor`
- `GET /api/floorplans`
- `GET /api/floorplans/{floor_id}`
- `PUT /api/floorplans/{floor_id}`
- `POST /api/floorplans/{floor_id}/validate`
- `POST /api/ha/test`
- `GET /render/live/{floor_id}.png`
- `GET /render/live/{floor_id}.json`
- `GET /render/timelapse.gif?floor=&window=&step=&width=`
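A couple of quick checks from the command line (response formats are assumptions; `/api/ha/test` is assumed to take no request body):

```bash
# List saved floorplans
curl http://YOUR_HOST:8000/api/floorplans

# Test the Home Assistant connection configured in config.yaml
curl -X POST http://YOUR_HOST:8000/api/ha/test
```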
Floorplans are stored as JSON files at:
/data/floorplans/{floor_id}.json
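Since everything lives under `/data`, backing up that directory captures floorplans, cached frames, and timelapses. For example:

```bash
# Archive the whole data directory (adjust the path if you mounted it elsewhere)
tar czf tempmap-backup.tar.gz -C /data .
```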
The editor works without Home Assistant, but live rendering requires sensor values. You can still render if your floorplan defines fallback values.
Auto-cropping trims blank space around your floorplan. You can disable it or adjust padding in each floorplan’s render section.
- `backend/` — FastAPI app, rendering logic, configuration
- `frontend/` — Vanilla JS canvas editor
- `docker/` — Dockerfile for container builds
- `unraid/` — Unraid template