
Commit 615da8b

Added service for auto remove expired otps (#1)

* Added service for auto remove expired otps
* fix isort issue

1 parent 8ae2347 · commit 615da8b

File tree: 10 files changed, +354 −0 lines changed

.github/workflows/lint.yml

Lines changed: 34 additions & 0 deletions

```yaml
name: Python Lint

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  python-lint:
    name: Lint Python Code
    runs-on: ubuntu-latest

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: '3.13'

      - name: Install linters
        run: |
          python -m pip install --upgrade pip
          pip install flake8 black isort

      - name: Run flake8
        run: |
          flake8 . --count --select=E9,F63,F7,F82 --show-source --statistics

      - name: Check import sorting with isort
        run: |
          isort . --check-only
```

.github/workflows/test.yml

Lines changed: 43 additions & 0 deletions

```yaml
name: Python Tests with Coverage

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  test:
    if: false # temporarily disables the job
    runs-on: ubuntu-latest

    strategy:
      matrix:
        python-version: ['3.11', '3.12', '3.13']

    steps:
      - name: Checkout code
        uses: actions/checkout@v3

      - name: Set up Python
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install -r requirements.txt
          pip install pytest pytest-cov

      - name: Run tests with coverage
        run: |
          pytest --cov=your_package_name --cov-report=term --cov-report=xml --cov-fail-under=80

      # Optional: Upload coverage report to Codecov (for public repos or with CODECOV_TOKEN)
      - name: Upload to Codecov
        uses: codecov/codecov-action@v3
        with:
          files: coverage.xml
        env:
          CODECOV_TOKEN: ${{ secrets.CODECOV_TOKEN }} # Only needed for private repos
```

.github/workflows/trivy.yml

Lines changed: 24 additions & 0 deletions

```yaml
name: Trivy Security Scan

on:
  push:
    branches: [main]
  pull_request:
    branches: [main]

jobs:
  trivy-scan:
    runs-on: ubuntu-latest
    name: Trivy FS Scan

    steps:
      - name: Checkout repo
        uses: actions/checkout@v3

      - name: Run Trivy vulnerability scanner on file system
        uses: aquasecurity/trivy-action@master
        with:
          scan-type: 'fs'
          scan-ref: '.'
          scanners: 'vuln,secret,config'
          ignore-unfixed: true
```

Dockerfile

Lines changed: 17 additions & 0 deletions

```dockerfile
FROM python:3.13-alpine

# Set working directory
WORKDIR /app

# Install system dependencies (use Alpine package manager)
# 'apk add --no-cache' keeps the image small and matches the alpine base
RUN apk add --no-cache postgresql-client

# Copy requirements first for better caching
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

# Copy the rest of the application
COPY . .

CMD ["python", "cleanup.py"]
```

README.md

Lines changed: 50 additions & 0 deletions

````markdown
# OTP Cleanup Service

This service handles the automatic cleanup of expired OTPs in the JamAndFlow database.

## Setup

1. Create a `.env` file:

   ```env
   POSTGRES_USER=postgres
   POSTGRES_PASSWORD=PassWord
   POSTGRES_DB=JamAndFlow
   POSTGRES_HOST=db
   POSTGRES_PORT=5432
   CLEANUP_INTERVAL_SECONDS=300  # 5 minutes
   ```

2. Build and run with Docker Compose:

   ```bash
   docker compose up --build
   ```

## Configuration

- `POSTGRES_USER`: PostgreSQL username
- `POSTGRES_PASSWORD`: PostgreSQL password
- `POSTGRES_DB`: PostgreSQL database name
- `POSTGRES_HOST`: PostgreSQL host (default: `db` for Docker Compose)
- `POSTGRES_PORT`: PostgreSQL port (default: `5432`)
- `CLEANUP_INTERVAL_SECONDS`: Interval between cleanup runs (default: 300 seconds / 5 minutes)

## Docker Network

This service needs to be on the same network as your main JamAndFlow API:

```bash
# Create the network if it doesn't exist
docker network create jamandflows-network
```

## Development

1. Install dependencies:

   ```bash
   pip install -r requirements.txt
   ```

2. Run locally:

   ```bash
   python cleanup.py
   ```
````

cleanup.py

Lines changed: 109 additions & 0 deletions

```python
import asyncio
import logging
import time
from datetime import datetime, timezone

from sqlalchemy import create_engine, text
from sqlalchemy.orm import sessionmaker

from config import settings
from models import OTP, Base

# Set up logging with explicit handler control
logger = logging.getLogger(__name__)
# Remove any existing handlers to prevent duplicates
for handler in logger.handlers[:]:
    logger.removeHandler(handler)
# Remove root logger handlers
for handler in logging.getLogger().handlers[:]:
    logging.getLogger().removeHandler(handler)

# Add single stream handler
handler = logging.StreamHandler()
handler.setFormatter(logging.Formatter('%(asctime)s - %(name)s - %(levelname)s - %(message)s'))
logger.addHandler(handler)
logger.setLevel(logging.DEBUG)

# Database setup
engine = create_engine(settings.database_url)
SessionLocal = sessionmaker(autocommit=False, autoflush=False, bind=engine)


def wait_for_db(max_retries=5, retry_interval=5):
    """Wait for the database to become available."""
    retry_count = 0
    while retry_count < max_retries:
        try:
            db = SessionLocal()
            try:
                db.execute(text("SELECT 1"))
                logger.info("Database connection successful")
                return True
            finally:
                db.close()
        except Exception as e:
            retry_count += 1
            if retry_count < max_retries:
                logger.warning(f"Database connection attempt {retry_count} failed: {e}")
                logger.info(f"Retrying in {retry_interval} seconds...")
                # This is a synchronous function, so block with time.sleep;
                # asyncio.sleep here would only create an un-awaited coroutine
                # and never actually pause.
                time.sleep(retry_interval)
            else:
                logger.error(f"Failed to connect to database after {max_retries} attempts: {e}")
                return False
    return False


def cleanup_expired_otps():
    """Delete expired OTPs from the database."""
    try:
        db = SessionLocal()
        try:
            # Use timezone-aware UTC for comparison to match models
            now = datetime.now(timezone.utc)
            logger.debug(f"Running cleanup check at {now}")

            result = db.query(OTP).filter(OTP.expires_at < now).delete()
            db.commit()

            # Always log the check, even if no deletions
            if result > 0:
                logger.info(f"Deleted {result} expired OTPs")
            else:
                logger.debug("No expired OTPs found to delete")
        finally:
            db.close()
    except Exception as e:
        logger.error(f"Error cleaning up expired OTPs: {e}")


async def run_cleanup_loop():
    """Run the cleanup task periodically."""
    logger.info(f"Starting cleanup loop with interval: {settings.CLEANUP_INTERVAL_SECONDS} seconds")

    while True:
        logger.debug("Running cleanup cycle...")
        cleanup_expired_otps()
        logger.debug(f"Sleeping for {settings.CLEANUP_INTERVAL_SECONDS} seconds...")
        await asyncio.sleep(settings.CLEANUP_INTERVAL_SECONDS)


def main():
    # Only show startup banner once
    logger.info("Starting OTP cleanup service...")
    logger.info(f"Database URL: {settings.database_url.replace(settings.POSTGRES_PASSWORD, '****')}")
    logger.info(f"Cleanup interval: {settings.CLEANUP_INTERVAL_SECONDS} seconds")

    # Wait for database with retries
    if not wait_for_db():
        logger.error("Failed to connect to database after retries. Exiting.")
        return

    try:
        # Run the cleanup loop
        asyncio.run(run_cleanup_loop())
    except KeyboardInterrupt:
        logger.info("Shutting down OTP cleanup service...")
    except Exception as e:
        logger.error(f"Error in cleanup service: {e}")
        raise


if __name__ == "__main__":
    main()
```
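The retry logic in `wait_for_db` can be exercised without a running PostgreSQL instance. The sketch below is a hypothetical standalone version of that pattern, with the database probe swapped for an injectable callable; `wait_for` and `flaky_probe` are illustration-only names, not part of the service.

```python
import time

def wait_for(probe, max_retries=5, retry_interval=0):
    """Return True once probe() succeeds, False after max_retries failures."""
    for attempt in range(1, max_retries + 1):
        try:
            probe()
            return True
        except Exception:
            if attempt < max_retries:
                # Synchronous code must block with time.sleep; calling
                # asyncio.sleep here would just create an un-awaited coroutine.
                time.sleep(retry_interval)
    return False

# A probe that fails twice and then succeeds, simulating a database still booting.
attempts = []
def flaky_probe():
    attempts.append(1)
    if len(attempts) < 3:
        raise ConnectionError("db not ready")

ok = wait_for(flaky_probe, max_retries=5, retry_interval=0)
```

Injecting the probe keeps the loop testable; the real `wait_for_db` simply uses a `SELECT 1` round-trip as its probe.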

config.py

Lines changed: 28 additions & 0 deletions

```python
from pydantic_settings import BaseSettings


class Settings(BaseSettings):
    # Database settings
    POSTGRES_USER: str
    POSTGRES_PASSWORD: str
    POSTGRES_DB: str
    POSTGRES_HOST: str = "localhost"
    POSTGRES_PORT: str = "5432"

    # Cleanup settings
    CLEANUP_INTERVAL_SECONDS: int = 300  # 5 minutes

    @property
    def database_url(self) -> str:
        """Construct the database URL from the settings."""
        return (
            f"postgresql://{self.POSTGRES_USER}:{self.POSTGRES_PASSWORD}"
            f"@{self.POSTGRES_HOST}:{self.POSTGRES_PORT}/{self.POSTGRES_DB}"
        )

    class Config:
        env_file = ".env"
        case_sensitive = True


settings = Settings()
```
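The `database_url` property can be sanity-checked in isolation by mirroring its f-string in a plain function. This is a standalone sketch using the README's sample values; `build_database_url` is a hypothetical helper, not part of the service.

```python
def build_database_url(user: str, password: str, host: str, port: str, db: str) -> str:
    # Mirrors the f-string in Settings.database_url (sketch only).
    return f"postgresql://{user}:{password}@{host}:{port}/{db}"

url = build_database_url("postgres", "PassWord", "db", "5432", "JamAndFlow")
# Mask the password the same way cleanup.py's main() does before logging the URL.
masked = url.replace("PassWord", "****")
```

The simple `str.replace` masking works here because the password appears only once in the URL; a password that collides with another URL component would need a more careful approach.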

docker-compose.yml

Lines changed: 18 additions & 0 deletions

```yaml
version: '3.8'

services:
  cleanup:
    build: .
    container_name: otp_cleanup_service
    env_file:
      - .env
    networks:
      - jamandflows-network
    restart: "no" # Don't auto-restart on failure

volumes:
  postgres_data:

networks:
  jamandflows-network:
    external: true
```

models.py

Lines changed: 26 additions & 0 deletions

```python
from datetime import datetime, timezone

from sqlalchemy import Column, DateTime, Integer, String
from sqlalchemy.ext.declarative import declarative_base

Base = declarative_base()


class OTP(Base):
    __tablename__ = "otps"

    id = Column(Integer, primary_key=True, index=True)
    email = Column(String, index=True, nullable=False)
    otp_code = Column(String, nullable=False)
    name = Column(String, nullable=False)
    password = Column(String, nullable=True)
    is_active = Column(Integer, default=1)
    # Use timezone-aware DateTime columns and defaults
    created_at = Column(DateTime(timezone=True), default=lambda: datetime.now(timezone.utc))
    expires_at = Column(DateTime(timezone=True), nullable=False)

    def is_expired(self):
        # Safely compare timezone-aware datetimes. If expires_at is naive, treat it as UTC.
        now = datetime.now(timezone.utc)
        expires = self.expires_at
        if expires is not None and expires.tzinfo is None:
            expires = expires.replace(tzinfo=timezone.utc)
        return now > expires
```
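The timezone-normalization logic in `OTP.is_expired` matters because comparing a naive and an aware datetime raises `TypeError` in Python. The sketch below is a hypothetical standalone version of that method (no SQLAlchemy model involved) showing that naive timestamps are treated as UTC.

```python
from datetime import datetime, timedelta, timezone

def is_expired(expires_at, now=None):
    # Standalone sketch of OTP.is_expired: naive datetimes are assumed to be UTC.
    if now is None:
        now = datetime.now(timezone.utc)
    if expires_at is not None and expires_at.tzinfo is None:
        expires_at = expires_at.replace(tzinfo=timezone.utc)
    return now > expires_at

now = datetime.now(timezone.utc)
aware_past = now - timedelta(minutes=5)
naive_past = aware_past.replace(tzinfo=None)  # naive, treated as UTC
future = now + timedelta(minutes=5)
```

Both the aware and the naive past timestamps read as expired, while the future one does not; without the `tzinfo is None` guard the naive comparison would raise instead.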

requirements.txt

Lines changed: 5 additions & 0 deletions

```text
sqlalchemy>=2.0.0
pydantic>=2.0.0
pydantic-settings>=2.0.0
psycopg2-binary>=2.9.0
python-dotenv>=1.0.0
```
