File: scripts/run_local_backend.sh
- ✅ Activates Python virtual environment
- ✅ Checks Qdrant connection (Docker)
- ✅ Checks Ollama models (Mixtral + Qwen)
- ✅ Starts FastAPI on 0.0.0.0:8000
- ✅ Verifies server is reachable
- ✅ Shows helpful error messages with fixes
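The checks above can be sketched roughly as follows — a minimal illustration, not the script's actual contents; the `require` helper, the probed endpoints, and the uvicorn entry point are assumptions:

```shell
#!/usr/bin/env bash
# Illustrative sketch of run_local_backend.sh's startup checks.
set -u

# Activate the virtual environment first (path assumed):
#   source .venv/bin/activate

require() {
  # Probe a service URL; on failure, print a suggested fix.
  local name="$1" url="$2" fix="$3"
  if curl -sf --max-time 3 "$url" > /dev/null; then
    echo "✅ $name is up"
  else
    echo "❌ $name is down. Fix: $fix" >&2
    return 1
  fi
}

require "Qdrant" "http://localhost:6333/collections" "docker start qdrant" &&
  require "Ollama" "http://localhost:11434/api/tags" "ollama serve" &&
  uvicorn backend.app:app --host 0.0.0.0 --port 8000 ||
  echo "Fix the failing service and re-run." >&2
```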
File: scripts/setup_tunnel.sh
- ✅ Supports Ngrok (quick testing)
- ✅ Supports Cloudflare Tunnel (stable, free)
- ✅ Interactive menu to choose provider
- ✅ Automatic URL detection
- ✅ Clear instructions for Netlify setup
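The URL auto-detection can work along these lines — a sketch in which the parsing helper is hypothetical, though the local inspection API on port 4040 is Ngrok's standard endpoint:

```shell
# Sketch of tunnel URL auto-detection. Ngrok serves tunnel metadata as
# JSON on a local inspection API; the parsing uses only grep and sed.
extract_ngrok_url() {
  # Reads /api/tunnels JSON on stdin, prints the first https public_url.
  grep -o '"public_url":"https://[^"]*"' | head -n 1 |
    sed -e 's/.*"public_url":"//' -e 's/"$//'
}

# Usage with a running ngrok (default inspection port 4040):
#   curl -s http://127.0.0.1:4040/api/tunnels | extract_ngrok_url
```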
File: scripts/start_everything.sh
- ✅ Starts/checks Qdrant
- ✅ Checks Ollama + models
- ✅ Starts backend
- ✅ Creates tunnel
- ✅ Shows all URLs and PIDs
- ✅ Complete automation
File: scripts/test_services.sh
- ✅ Tests Qdrant connectivity
- ✅ Tests Ollama + model availability
- ✅ Tests backend health endpoint
- ✅ Color-coded status output
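A color-coded check of the kind test_services.sh performs can be sketched like this (the `check` helper and the exact endpoints are illustrative):

```shell
GREEN='\033[0;32m'; RED='\033[0;31m'; NC='\033[0m'

# Prints a green or red status line depending on whether the URL responds.
check() {
  local name="$1" url="$2"
  if curl -sf --max-time 3 "$url" > /dev/null; then
    printf "${GREEN}✅ %-8s OK${NC}\n" "$name"
  else
    printf "${RED}❌ %-8s unreachable (%s)${NC}\n" "$name" "$url"
  fi
}

check "Qdrant"  "http://localhost:6333/collections"
check "Ollama"  "http://localhost:11434/api/tags"
check "Backend" "http://localhost:8000/health"
```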
File: backend/app.py
- ✅ Restricted to specific domains only
- ✅ Supports Netlify production
- ✅ Supports localhost development
- ✅ Supports Ngrok tunnels
- ✅ Supports Cloudflare tunnels
- ✅ NO wildcard CORS in production
File: .env.local.example
- ✅ Example configuration provided
- ✅ NEXT_PUBLIC_API_URL properly used
- ✅ Fallback to localhost:8000
- ✅ Clear instructions for tunnel URLs
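The env file might look like this (values are placeholders mirroring the example above):

```shell
# .env.local — frontend configuration (placeholder values)
# Point the Netlify/Next.js frontend at your tunnel:
NEXT_PUBLIC_API_URL=https://abc123.ngrok-free.app

# For purely local development, the frontend falls back to the local
# backend when this is unset:
# NEXT_PUBLIC_API_URL=http://localhost:8000
```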
Files:
- docs/local_hosting.md - 500+ line comprehensive guide
- docs/QUICKSTART.md - Quick reference cheat sheet
- README.md - Updated with local hosting info
./scripts/start_everything.sh

Or start each service manually, in separate terminals:

# Terminal 1: Qdrant
docker start qdrant

# Terminal 2: Ollama
ollama serve

# Terminal 3: Backend
./scripts/run_local_backend.sh

# Terminal 4: Tunnel
./scripts/setup_tunnel.sh

Then verify everything:

./scripts/test_services.sh

When everything is running, you should see:
- ✅ Qdrant: http://localhost:6333
- ✅ Ollama: http://localhost:11434
- ✅ Backend: http://localhost:8000
- ✅ Tunnel: https://your-url.ngrok-free.app (or trycloudflare.com)
./scripts/start_everything.sh

From the script output, copy the HTTPS URL:
https://abc123.ngrok-free.app
- Go to: https://app.netlify.com/sites/assistantstudy
- Site settings → Environment variables
- Add/Update:
  - Key: NEXT_PUBLIC_API_URL
  - Value: https://abc123.ngrok-free.app
- Trigger deploy
- Open: https://assistantstudy.netlify.app
- Check JARVIS status indicator
- Should show: "AI ONLINE" 🟢
- Try asking a question
- Backend runs on your laptop (full AI power)
- Qdrant stores your materials locally
- Ollama provides free AI inference
- Public HTTPS URL for Netlify frontend
- No cloud costs for backend
- Full control over your data
- CORS restricted to known domains
- No public wildcard access
- Tunnel provides HTTPS encryption
- Local services not exposed directly
- Choose Ngrok (quick testing)
- Or Cloudflare (stable, permanent)
- Switch between tunnel providers
- No code changes needed
- One-command setup
- Auto-detection of services
- Helpful error messages
- No manual configuration
docs/local_hosting.md includes:
- Prerequisites setup
- Detailed service configuration
- Tunnel comparison table
- CORS explanation
- Troubleshooting guide
- Performance optimization
- Advanced configuration
- Security best practices
docs/QUICKSTART.md includes:
- Command cheat sheet
- Service ports table
- Common troubleshooting
- Pro tips
- Architecture diagram
- How it works section
- Quick start guide
- Links to documentation
ALLOWED_ORIGINS = [
"http://localhost:3000",
"https://assistantstudy.netlify.app",
"https://*.ngrok-free.app",
"https://*.trycloudflare.com",
]

- ✅ Only specific domains allowed
- ✅ Regex for tunnel subdomains
- ✅ Credentials enabled for cookies
- ✅ All HTTP methods allowed (within CORS)
- ✅ Custom frontend URL support via env var
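Plain wildcard strings like `https://*.ngrok-free.app` are not matched literally; since every tunnel gets a random subdomain, tunnel origins are matched by pattern. The idea can be sketched in shell — the regex below is an assumption for illustration, not the one in backend/app.py:

```shell
# Tunnel subdomains are random, so the allow-list uses a pattern rather
# than exact strings. Sketch only; the actual regex lives in backend/app.py.
TUNNEL_REGEX='^https://[a-z0-9-]+\.(ngrok-free\.app|trycloudflare\.com)$'

origin_allowed() {
  [[ "$1" =~ $TUNNEL_REGEX ]]
}

origin_allowed "https://abc123.ngrok-free.app" && echo "allowed"
origin_allowed "https://evil.example"          || echo "blocked"
```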
- ❌ Manual service startup
- ❌ No public access
- ❌ Hardcoded URLs
- ❌ No CORS configuration
- ❌ Complex setup process
- ✅ Automated service checks
- ✅ Public HTTPS tunnel
- ✅ Environment-based URLs
- ✅ Secure CORS setup
- ✅ One-command deployment
- Run ./scripts/start_everything.sh
- Copy the tunnel URL
- Set in Netlify environment variables
- Deploy and test!
- Add systemd service for auto-start on boot
- Create Docker Compose for all services
- Add monitoring/alerting
- Set up backup for Qdrant data
- Configure static Ngrok domain (paid)
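For the systemd idea, a minimal unit might look like this — all paths, user names, and the unit name are placeholders to adjust before use:

```ini
# /etc/systemd/system/study-backend.service — hypothetical unit
[Unit]
Description=Local study assistant backend
After=network-online.target docker.service

[Service]
User=youruser
WorkingDirectory=/home/youruser/your-repo
ExecStart=/home/youruser/your-repo/scripts/run_local_backend.sh
Restart=on-failure

[Install]
WantedBy=multi-user.target
```

Enable it with `sudo systemctl enable --now study-backend`.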
- Keep terminals organized: Use tmux or terminal tabs
- Monitor resources: Ollama uses GPU if available
- Backup Qdrant: Data persisted in Docker volume
- Use Cloudflare for long sessions: More stable than Ngrok
- Check logs: Each service shows real-time logs in terminal
If you encounter issues:

- Run diagnostics: ./scripts/test_services.sh
- Check service logs:
  - Backend: Terminal output
  - Qdrant: docker logs qdrant
  - Ollama: Terminal output
- Review documentation: docs/local_hosting.md and docs/QUICKSTART.md
- Common fixes:
  - Port in use: lsof -ti:8000 | xargs kill -9
  - Qdrant not running: docker start qdrant
  - Ollama not responding: restart ollama serve
What you got:
- 🎯 4 automated scripts
- 📚 500+ lines of documentation
- 🔐 Secure CORS configuration
- 🌐 Two tunnel options
- ✅ Complete testing suite
- 🚀 One-command deployment
Time to deploy: ~5 minutes
Result: Full AI-powered study assistant running on your laptop, accessible via secure HTTPS tunnel, connected to your live Netlify frontend! 🎉
All files committed and pushed to GitHub!
Ready to start? Run:
./scripts/start_everything.sh