A modern, production-ready FastAPI template utilizing a clean, layered (hexagonal-style) architecture. This template is designed for building robust and scalable backend services with clear separation of concerns, easy testing, and high maintainability.
While FastAPI is incredibly fast and flexible, it doesn't enforce a specific project structure. As projects grow, they often turn into a tangled mess of tightly coupled route handlers, business logic, and database calls. This template provides enterprise-grade readiness from minute zero.
- Pre-Configured Tooling: `uv`, `pytest`, `ruff`, and `alembic` are pre-integrated. No wrestling with environment setups.
- Scalable Architecture: Extends beyond simple MVC. It isolates API routing, business logic (services), and database interactions (repositories), making the app highly testable and maintainable.
- Production-Ready Security & Observability: JWT authentication, HttpOnly cookies for refresh tokens, token blacklisting, password hashing, and rate limiting are baked in, plus out-of-the-box Sentry integration for robust error monitoring.
- Consistent Responses: Global exception handlers use centralized success and error messages, preventing hardcoded strings and standardizing the API response structure.
- Docker First: A `docker-compose.yaml` is ready to spin up your backend, PostgreSQL database, and Redis instances instantly.
This backend template is designed to seamlessly integrate with the companion Next.js 16 + React 19 + TypeScript Enterprise Template. You can find the frontend template here: kemalcalak/NextJS-Template.
The two templates share the same auth contract: HttpOnly access_token / refresh_token cookies, /api/v1 prefix, X-Requested-With CSRF header, and a uniform { success, data, message, error } response envelope.
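As a dependency-free sketch of that shared envelope (the template itself models responses with Pydantic v2; a plain dataclass is used here so the example stands alone):

```python
from dataclasses import dataclass
from typing import Any, Optional

# Sketch of the shared { success, data, message, error } envelope; field
# names follow the contract above, the class itself is illustrative.


@dataclass
class ApiResponse:
    success: bool
    data: Any = None
    message: Optional[str] = None
    error: Optional[str] = None


# A successful response carries the payload and a human-readable message:
ok = ApiResponse(success=True, data={"id": 1}, message="User fetched")
# An error response carries no data, only the error detail:
err = ApiResponse(success=False, error="User not found")
```

Both sides can rely on every response having the same four keys, which keeps frontend error handling uniform.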
This template integrates the best-in-class Python ecosystem tools to provide a seamless developer experience:

- Framework: FastAPI for building APIs with Python 3.12+ based on standard Python type hints.
- Architecture: Strict layered architecture separating routers, services, repositories, use cases, and models, fully utilizing FastAPI's dependency injection.
- Database & ORM: SQLAlchemy 2.0 with `asyncpg` for non-blocking operations, and Alembic for schema migrations.
- Observability & Error Tracking: Built-in Sentry integration for tracking unhandled exceptions and performance tracing.
- Caching: Redis integration using `redis.asyncio` for robust, high-performance distributed caching.
- Validation & Config: Pydantic v2 and `pydantic-settings` for robust data validation and environment management.
- Security & Auth: JWT access/refresh tokens accepted via either HttpOnly cookies or `Authorization: Bearer`, `bcrypt` password hashing, Redis-backed token blacklist (logout invalidation), strict origin-check middleware (returns 404 for foreign origins), and slowapi rate limiting for brute-force protection.
- Account Lifecycle: Email verification, password reset, password change, and soft delete with a grace period: accounts marked for deletion can be reactivated until the cron worker purges them.
- Background Jobs: arq worker (separate container in compose) runs cron jobs such as `delete_expired_accounts` at the configured time.
- Audit Trail: The `user_activity` table records auth events and CRUD actions with IP / user agent. The `audit_unexpected_failure` decorator captures unexpected route failures.
- Smart Email Validation & Delivery: Built-in asynchronous email sending with SMTP, domain MX record checking using `dnspython`, and auto-updating disposable email provider filtering via Redis cache.
- Standardized API Responses: Global exception handlers standardize success/error schemas, using a centralized `messages` module (`app/core/messages/`) to prevent hardcoded responses.
- First Superuser Seed: On startup, an initial admin is created from `FIRST_SUPERUSER` / `FIRST_SUPERUSER_PASSWORD` if none exists.
- Tooling: uv for blazing-fast package management, and Ruff for linting and formatting.
- Testing: Comprehensive async testing setup with `pytest` and `pytest-asyncio`, in-memory SQLite via `aiosqlite`, `fakeredis`, and autouse SMTP/MX patches: tests never hit Postgres or the network.
The repository structure supports standard Continuous Integration pipelines out-of-the-box. Ensure you configure your CI (GitHub Actions, GitLab CI, etc.) to run:

- Dependency Install: `uv sync`
- Linting: `uv run ruff check .`
- Formatting Check: `uv run ruff format --check .`
- Unit & Integration Tests: `uv run pytest`
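As a sketch, a GitHub Actions workflow running those four steps might look like this (the workflow layout and action versions are assumptions, not part of the template):

```yaml
name: ci
on: [push, pull_request]

jobs:
  test:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: astral-sh/setup-uv@v5   # installs uv on the runner
      - run: uv sync                  # dependency install
      - run: uv run ruff check .      # linting
      - run: uv run ruff format --check .  # formatting check
      - run: uv run pytest            # unit & integration tests
```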
- Python >= 3.12
- Docker & Docker Compose (for local database and Redis)
- `uv` package manager (recommended)
Clone the repository to your local machine:
```bash
git clone https://github.com/kemalcalak/fastapi-template.git
cd fastapi-template
```

Create a `.env` file from the provided template:

```bash
cp .env.example .env
```

Your `.env` file should look like this, filled with your actual configuration:

```env
# Application Settings
PROJECT_NAME="FastAPI Template"
SECRET_KEY="changethis"
ENVIRONMENT="local"
FIRST_SUPERUSER="admin@example.com"
FIRST_SUPERUSER_PASSWORD="changethis"
FRONTEND_HOST="http://localhost:5173"

# Database Settings
POSTGRES_SERVER="localhost"
POSTGRES_PORT=5432
POSTGRES_USER="postgres"
POSTGRES_PASSWORD="changethis"
POSTGRES_DB="app"

# Redis Cache Settings
REDIS_URL="redis://localhost:6379/0"

# Email / SMTP Settings (Optional)
SMTP_HOST="smtp.example.com"
SMTP_PORT=465
SMTP_USE_STARTTLS=True
SMTP_USE_SSL=False
SMTP_USER="smtp_username"
SMTP_PASSWORD="smtp_password"
EMAILS_FROM_EMAIL="noreply@example.com"

# Sentry (only initialized when ENVIRONMENT != "local")
SENTRY_DSN=
```

This project provides a `docker-compose.yaml` to spin up the entire stack, including the backend service, a local PostgreSQL instance, a Redis container, and the arq background worker:

```bash
docker-compose up -d --build
```

The API will be available at http://localhost:8000. You can test the endpoints via the Swagger UI available at http://localhost:8000/docs.
To run the worker manually (e.g. when developing the API on the host):

```bash
uv run arq app.worker.settings.WorkerSettings
```

To apply the database migrations with Alembic, run the migration command inside the backend container:

```bash
# Apply existing migrations
docker-compose exec backend uv run alembic upgrade head
```

If you modify models in `app/models/` and need to generate a new migration script:

```bash
docker-compose exec backend uv run alembic revision --autogenerate -m "description_of_changes"
docker-compose exec backend uv run alembic upgrade head
```

Tests are written using pytest and configured for async execution with pytest-asyncio. Configuration details can be found in `pytest.ini`.
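For orientation, an async test in this suite has roughly the following shape. The handler and assertions here are hypothetical stand-ins; the real tests drive the FastAPI app through an async HTTP client against in-memory SQLite and fakeredis, so nothing touches Postgres or the network:

```python
import asyncio

# Illustrative async test shape (names are hypothetical).


async def fake_login(email: str, password: str) -> dict:
    # stands in for an awaited call like client.post("/api/v1/...") in a real test
    await asyncio.sleep(0)  # yield control, as a real awaited call would
    ok = password == "correct-horse"
    return {"success": ok, "error": None if ok else "Invalid credentials"}


async def test_login_rejects_bad_password() -> None:
    body = await fake_login("admin@example.com", "wrong")
    assert body["success"] is False
    assert body["error"] == "Invalid credentials"
```

With pytest-asyncio configured for the suite, such `async def test_*` functions are collected and awaited like ordinary tests.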
To run the test suite locally:
```bash
uv run pytest
```

This project uses Ruff for both code linting and formatting. The configurations are specified in `pyproject.toml`.
- Check for issues: `uv run ruff check .`
- Format code: `uv run ruff format .`
- Auto-fix lint issues: `uv run ruff check --fix .`
The repo ships a `.pre-commit-config.yaml`. After cloning, install both hook stages once:

```bash
uv run pre-commit install --hook-type pre-commit --hook-type pre-push
```

pre-commit (~5-15 s) runs on every `git commit` against staged files:

- trim trailing whitespace, `end-of-file-fixer`, `check-yaml`, `check-toml`, `check-added-large-files`, `check-merge-conflict`, `detect-private-key`
- `ruff check --fix` (auto-fix lint)
- `ruff format`

pre-push (~30-60 s) runs on every `git push`:

- `uv run pytest`: the full test suite must pass before code leaves the machine.

To run all hooks manually against the whole repo: `uv run pre-commit run --all-files`.
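For orientation, a config wiring those two stages might look roughly like this (the repo's actual `.pre-commit-config.yaml` is authoritative; the `rev` pins here are placeholders):

```yaml
repos:
  - repo: https://github.com/pre-commit/pre-commit-hooks
    rev: v5.0.0        # placeholder pin
    hooks:
      - id: trailing-whitespace
      - id: end-of-file-fixer
      - id: check-yaml
      - id: check-toml
  - repo: https://github.com/astral-sh/ruff-pre-commit
    rev: v0.8.0        # placeholder pin
    hooks:
      - id: ruff
        args: [--fix]
      - id: ruff-format
  - repo: local
    hooks:
      - id: pytest
        name: pytest
        entry: uv run pytest
        language: system
        pass_filenames: false
        stages: [pre-push]   # heavy hook only runs on git push
```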
prometheus-fastapi-instrumentator collects metrics for every handled request. The /metrics endpoint (root path, outside /api/v1, hidden from Swagger) exposes them in the standard Prometheus exposition format. Default metrics: request count, latency histograms, in-progress requests, exceptions per handler, plus the standard Python runtime + process metrics. Health endpoints and /metrics itself are excluded from instrumentation to avoid noise.
Auth model, in three layers:

- `include_in_schema=False`: the endpoint is invisible in Swagger / OpenAPI.
- Environment-gated bearer token: outside `ENVIRONMENT="local"`, the endpoint requires `Authorization: Bearer ${METRICS_TOKEN}`. Mismatched or missing tokens return 404 (not 401/403) so the endpoint's existence is not disclosed.
- `origin_check_middleware`: browser cross-origin requests with a foreign `Origin` header are rejected by the global middleware, regardless of the token.
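The token layer can be sketched as a small pure function (illustrative only; the template's actual check lives in the route code). The key point is returning 404 rather than 401/403 so probing never reveals the route:

```python
import hmac

# Illustrative gate for /metrics. Returns the HTTP status the endpoint
# would answer with; 404 for bad tokens keeps the route's existence hidden.


def metrics_status(environment: str, metrics_token: str, auth_header: str | None) -> int:
    if environment == "local":
        return 200  # open access in local dev
    expected = f"Bearer {metrics_token}"
    # constant-time comparison avoids leaking token prefixes via timing
    if auth_header is None or not hmac.compare_digest(auth_header, expected):
        return 404  # missing or wrong token: pretend the route doesn't exist
    return 200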
Local dev (open access):

```bash
curl http://localhost:8000/metrics
```

Production / staging (bearer token required):
```bash
# .env (or your secret store)
METRICS_TOKEN=$(openssl rand -hex 32)
```

```yaml
# Prometheus scrape config (prometheus.yml)
scrape_configs:
  - job_name: fastapi
    authorization:
      type: Bearer
      credentials: <your METRICS_TOKEN>
    static_configs:
      - targets: ['api.example.com:8000']
```

Even with the token, prefer to also restrict `/metrics` at the reverse proxy (Prometheus scraper IP allowlist or VPC-internal-only exposure). Defense in depth: the token is one layer, network policy is another.
Metrics tell you what is slow ("p99 latency on /users/{id} doubled at 14:02"). Traces tell you why: a single request's full waterfall of route handler → SQLAlchemy queries → Redis calls → outbound httpx requests, each with its own span and timing.

Opt-in by design. When `OTEL_EXPORTER_OTLP_ENDPOINT` is unset, `init_telemetry()` returns early and there is zero overhead: no spans created, no exporter started, no extra allocations. Local dev and the test suite stay clean.
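A minimal sketch of that guard (the real `init_telemetry()` in `app/core/telemetry.py` additionally wires the OTel SDK exporter and instrumentors):

```python
import os

# Sketch of the opt-in guard: telemetry is a no-op unless an OTLP endpoint
# is configured, so local dev and tests pay zero cost.


def init_telemetry() -> bool:
    """Return True only when tracing was actually initialized."""
    if not os.getenv("OTEL_EXPORTER_OTLP_ENDPOINT"):
        return False  # no endpoint configured: no spans, no exporter
    # ...set up TracerProvider, OTLP exporter, and instrumentors here...
    return True
```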
Wiring (`app/core/telemetry.py`):

| Layer | Instrumentation | What you get |
|---|---|---|
| FastAPI | `FastAPIInstrumentor` | Per-request span named after the route template (`/users/{id}`, not `/users/42`); `/metrics` and `/health` excluded |
| SQLAlchemy | `SQLAlchemyInstrumentor` | One span per query, with the SQL statement and duration |
| Redis | `RedisInstrumentor` | One span per Redis call (cache hits, rate-limit checks, pub/sub) |
| httpx | `HTTPXClientInstrumentor` | Outbound HTTP spans (e.g. disposable-email blocklist refresh) |
Enabling traces: point `OTEL_EXPORTER_OTLP_ENDPOINT` at any OTLP/HTTP collector (Tempo, Jaeger, Honeycomb, the OTel Collector, etc.):

```env
# .env
OTEL_EXPORTER_OTLP_ENDPOINT=http://localhost:4318
OTEL_EXPORTER_OTLP_PROTOCOL=http/protobuf
OTEL_SERVICE_NAME=fastapi-template
```

The SDK reads every other `OTEL_*` variable natively (sampler, headers, batch size, etc.); see the OTel SDK environment variables reference for the full list.
```
├── app/
│   ├── alembic/            # Alembic env + versions/ (generated migration scripts)
│   ├── api/                # API Layer: routers, deps.py, exception handlers, decorators
│   │   └── routes/
│   │       ├── auth.py, users.py, health.py
│   │       └── admin/      # Admin surface (gated by CurrentSuperUser)
│   ├── core/               # config, db, security, redis, rate_limit, email, messages/
│   ├── models/             # Domain Layer: SQLAlchemy ORM (User, UserActivity, …)
│   ├── repositories/       # Data Layer: async DB queries (no business rules)
│   ├── schemas/            # Pydantic v2 DTOs (Create / Update / Response per domain)
│   ├── services/           # Business Logic Layer: pure async functions, take AsyncSession
│   ├── use_cases/          # Cross-domain orchestration (e.g. activity logging)
│   ├── worker/             # arq worker: settings + cron jobs (e.g. account deletion)
│   ├── utils/              # Helper functions (datetime, email templates)
│   ├── tests/              # pytest suite (in-memory SQLite, fakeredis, mocked SMTP)
│   └── main.py             # FastAPI app, lifespan, CORS + origin middleware
├── alembic.ini             # Alembic settings
├── docker-compose.yaml     # Compose stack: backend + worker + db + redis
├── dockerfile              # Backend image (uv-based multi-stage build)
├── pyproject.toml          # Project dependencies and tool configurations
├── pytest.ini              # Pytest settings
└── uv.lock                 # Dependency lock file (commit alongside dep changes)
```

Contributions are what make the open source community such an amazing place to learn, inspire, and create. Any contributions you make are greatly appreciated.
This project follows Conventional Commits.
- Fork the Project
- Create your Feature Branch (`git checkout -b feature/AmazingFeature`)
- Commit your Changes (`git commit -m 'feat: Add some AmazingFeature'`)
- Push to the Branch (`git push origin feature/AmazingFeature`)
- Open a Pull Request
Distributed under the MIT License. See the LICENSE file at the root of the workspace for more information.