A sophisticated monitoring system for tracking Christmas tree water levels with unnecessary precision.
This project monitors and visualizes the water level of a Christmas tree through four main components:
- MQTT Logger - Captures MQTT messages from IoT sensors and stores them in DuckDB
- Uploader - Queries DuckDB, aggregates data, and uploads to S3
- Infrastructure - AWS CDK scripts that provision S3 bucket and IAM credentials
- Static Site - Vite-powered visualization dashboard served via GitHub Pages
```
IoT Sensors (ESP8266, Yolink, etc.)
        ↓ (MQTT messages)
MQTT Broker (Mosquitto)
        ↓
MQTT Logger (Docker/systemd)
        ↓
DuckDB File (batched writes)
        ↓
Uploader (Docker daemon)
        ↓ (aggregated data)
S3 Bucket (gzipped JSON, public)
        ↓
GitHub Pages (static site)
        ↓
User's Browser (Chart.js visualizations)
```
Prerequisites:

- Python 3.11+ with uv installed
- Node.js 18+ with npm (for the CDK CLI via `npx`, and Vite)
- Docker
- AWS CLI configured with appropriate permissions
- MQTT broker (e.g., Mosquitto) for IoT sensor data
The MQTT Logger captures sensor data and stores it in DuckDB. See mqtt_logger/README.md for complete documentation.
Quick Start with Docker:
```bash
cd mqtt_logger
docker build -t mqtt-logger .
docker run -d \
  --name mqtt-logger \
  --restart unless-stopped \
  -v $(pwd)/data:/app/data \
  -e MQTT_BROKER=mqtt.example.com \
  -e TOPICS="xmas/tree/water/raw:water_level:Water level readings" \
  mqtt-logger
```

Or with systemd: see mqtt_logger/QUICKSTART.md.
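Under the hood, the logger buffers incoming readings and flushes them to the database in batches rather than writing on every message. A minimal sketch of that pattern (hypothetical names; the stdlib `sqlite3` module stands in here so the example is self-contained, while the real logger writes to DuckDB via an MQTT client callback):

```python
import sqlite3
import time

class BatchedLogger:
    """Buffer sensor readings and flush them to the DB in batches (sketch)."""

    def __init__(self, db_path, batch_size=50, flush_interval=5.0):
        self.conn = sqlite3.connect(db_path)
        self.conn.execute(
            "CREATE TABLE IF NOT EXISTS water_level (ts REAL, topic TEXT, value REAL)"
        )
        self.batch_size = batch_size          # flush after this many rows...
        self.flush_interval = flush_interval  # ...or after this many seconds
        self.buffer = []
        self.last_flush = time.monotonic()

    def on_message(self, topic, payload):
        # In the real logger this would be the MQTT client's message callback.
        self.buffer.append((time.time(), topic, float(payload)))
        due = time.monotonic() - self.last_flush >= self.flush_interval
        if len(self.buffer) >= self.batch_size or due:
            self.flush()

    def flush(self):
        if not self.buffer:
            return
        # One multi-row INSERT per flush keeps write amplification low.
        self.conn.executemany(
            "INSERT INTO water_level VALUES (?, ?, ?)", self.buffer
        )
        self.conn.commit()
        self.buffer.clear()
        self.last_flush = time.monotonic()

logger = BatchedLogger(":memory:", batch_size=2)
logger.on_message("xmas/tree/water/raw", "123.4")
logger.on_message("xmas/tree/water/raw", "122.9")  # second row triggers a flush
count = logger.conn.execute("SELECT COUNT(*) FROM water_level").fetchone()[0]
```

The batching matters because DuckDB (like most analytical stores) is much happier with grouped inserts than with one transaction per sensor reading.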
The uploader runs as a long-running daemon that queries DuckDB and uploads to S3.
```bash
cd uploader
uv sync
```

Build and run the Docker container:

```bash
cd uploader
docker build -t treelemetry-uploader .
docker run -d \
  --name treelemetry-uploader \
  --restart unless-stopped \
  -e AWS_ACCESS_KEY_ID=xxx \
  -e AWS_SECRET_ACCESS_KEY=yyy \
  -v /path/to/mqtt_logs.db:/data/tree.duckdb:ro \
  treelemetry-uploader
```

The container runs continuously, uploading every 30 seconds.
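The consumption analysis the uploader produces boils down to fitting a slope to recent readings and extrapolating to a refill threshold. A self-contained sketch of that calculation (pure stdlib, hypothetical function names; the actual uploader does its aggregation in DuckDB SQL, and the 50 mm threshold mirrors the `time_to_50mm_hours` field in the data format below):

```python
def slope_mm_per_hr(samples):
    """Least-squares slope of (hours, level_mm) pairs."""
    n = len(samples)
    mean_t = sum(t for t, _ in samples) / n
    mean_y = sum(y for _, y in samples) / n
    num = sum((t - mean_t) * (y - mean_y) for t, y in samples)
    den = sum((t - mean_t) ** 2 for t, _ in samples)
    return num / den

def hours_until(samples, threshold_mm=50.0):
    """Extrapolate the current trend to the refill threshold."""
    slope = slope_mm_per_hr(samples)
    current = samples[-1][1]
    if slope >= 0:
        return None  # level is flat or rising (someone watered the tree)
    return (threshold_mm - current) / slope

# Level dropping 2 mm per hour from 70 mm:
readings = [(0.0, 70.0), (1.0, 68.0), (2.0, 66.0)]
hrs = hours_until(readings)  # (50 - 66) / -2 = 8.0 hours
```

Sign conventions are an assumption here; the published JSON may report the consumption rate as a positive magnitude instead.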
Deploy the CDK stack to create IAM credentials:
```bash
cd infrastructure
uv sync
npx aws-cdk bootstrap   # First time only
npx aws-cdk deploy
```

After deployment, the stack outputs will include the IAM credentials needed for the uploader.
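For orientation, a stack like this typically amounts to a public bucket, an IAM user scoped to it, and an access key surfaced as outputs. A hedged sketch of what `infrastructure/app.py` might provision (construct IDs and properties here are illustrative assumptions, not the project's actual code):

```python
from aws_cdk import App, CfnOutput, Stack
from aws_cdk import aws_iam as iam
from aws_cdk import aws_s3 as s3
from constructs import Construct

class TreelemetryStack(Stack):
    def __init__(self, scope: Construct, id: str, **kwargs) -> None:
        super().__init__(scope, id, **kwargs)

        # Public bucket serving the gzipped JSON snapshots.
        bucket = s3.Bucket(
            self, "DataBucket",
            public_read_access=True,
            block_public_access=s3.BlockPublicAccess.BLOCK_ACLS,
        )

        # IAM user the uploader daemon authenticates as; write-only access.
        uploader = iam.User(self, "UploaderUser")
        bucket.grant_write(uploader)

        # Access key emitted as stack outputs for the uploader's env vars.
        key = iam.CfnAccessKey(self, "UploaderKey", user_name=uploader.user_name)
        CfnOutput(self, "AccessKeyId", value=key.ref)
        CfnOutput(self, "SecretAccessKey", value=key.attr_secret_access_key)

app = App()
TreelemetryStack(app, "TreelemetryStack")
app.synth()
```

Emitting a secret key via `CfnOutput` is convenient for a hobby stack but visible in CloudFormation; a production setup would use Secrets Manager instead.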
```bash
cd site
npm install
npm run dev     # Development server
npm run build   # Production build to ../docs/
```

The static site is built to the docs/ directory and served via GitHub Pages:

- Push changes to GitHub
- Go to repository Settings → Pages
- Set source to "Deploy from a branch"
- Select the `main` branch and the `/docs` folder
- The site will be available at: https://treelemetry.tomlee.space
The JSON file uploaded to S3 includes:
- Raw measurements (last 10 minutes)
- Aggregated data (1m, 5m, 1h intervals)
- Consumption analysis (detected segments, slopes, predictions)
- Statistics (min, max, avg, stddev)
All data is gzip-compressed for efficient transfer.
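Since the payload is just gzipped JSON, decoding it is a two-step affair on any client. A sketch (the S3 URL below is a placeholder, and the real dashboard does the equivalent in the browser, not in Python):

```python
import gzip
import json

def decode_payload(raw: bytes) -> dict:
    """Decompress a gzipped JSON payload into a dict."""
    return json.loads(gzip.decompress(raw))

# Fetching from S3 would look like (placeholder URL):
#   import urllib.request
#   raw = urllib.request.urlopen("https://<bucket>.s3.amazonaws.com/tree.json.gz").read()
#   data = decode_payload(raw)

# Local round-trip to demonstrate the decode step without a network call:
sample = {"generated_at": "2025-12-05T12:00:00Z", "measurements": []}
raw = gzip.compress(json.dumps(sample).encode("utf-8"))
data = decode_payload(raw)
```

Serving the object with `Content-Encoding: gzip` would let browsers decompress transparently; shipping a `.gz` file and decompressing in code is the simpler alternative.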
Sample structure:
```json
{
  "generated_at": "2025-12-05T12:00:00Z",
  "measurements": [...],
  "agg_1m": { "data": [...] },
  "agg_5m": { "data": [...] },
  "agg_1h": { "data": [...] },
  "analysis": {
    "segments": [...],
    "current_prediction": {
      "slope_mm_per_hr": 2.5,
      "time_to_50mm_hours": 8.5
    }
  }
}
```

Project structure:

```
treelemetry/
├── mqtt_logger/        # MQTT message logger
│   ├── Dockerfile
│   ├── main.py
│   ├── config/         # Configuration templates
│   ├── src/            # Logger implementation
│   └── tests/          # Test suite
├── uploader/           # Data aggregation & S3 uploader
│   ├── Dockerfile
│   ├── pyproject.toml
│   ├── sample_data.py  # Test data generator
│   └── src/
├── infrastructure/     # AWS CDK for S3 & IAM
│   ├── app.py
│   ├── cdk.json
│   └── infrastructure/
├── site/               # Vite static site
│   ├── index.html
│   ├── package.json
│   └── src/
└── docs/               # Built site (GitHub Pages)
```
See PROJECT_STRUCTURE.md for complete details.
License: MIT