An automated pull request reviewer that runs entirely on your local machine. When a PR is opened on GitHub, PRobot fetches the diff, sends it to a local LLM via Ollama, and posts the AI-generated review back as a PR comment.
```
GitHub PR opened
      │
      ▼
GitHub Webhook (HTTP POST)
      │
      ▼
ngrok (tunnels to localhost)
      │
      ▼
FastAPI server (/webhook)
      │
      ├── Fetch PR diff via GitHub API
      │
      ├── Send diff to Ollama (local LLM)
      │
      └── Post AI review as PR comment
```
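The webhook step boils down to filtering GitHub's event deliveries for newly opened pull requests. A minimal sketch of that filtering logic — the payload field names follow GitHub's `pull_request` event schema, but the helper function itself is illustrative, not PRobot's actual code:

```python
# Hypothetical helper: decide whether a webhook delivery should trigger
# a review, and pull out the repo and PR number if so.
def parse_pr_event(event: str, payload: dict):
    """Return (repo_full_name, pr_number) for a newly opened PR, else None."""
    if event != "pull_request" or payload.get("action") != "opened":
        return None
    return (payload["repository"]["full_name"],
            payload["pull_request"]["number"])
```

GitHub sends the event name in the `X-GitHub-Event` header and the action inside the JSON body, so both need to be checked before kicking off a review.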
| Component | Tool |
|---|---|
| Web server | FastAPI + Uvicorn |
| GitHub integration | PyGitHub |
| Local LLM | Ollama (qwen2.5-coder:7b) |
| Tunnel (dev) | ngrok |
| Config | python-dotenv |
```
probot/
├── app/
│   ├── main.py              # FastAPI app & webhook handler
│   ├── github_handler.py    # Fetch PR diff, post PR comment
│   ├── ollama_reviewer.py   # Send diff to Ollama, return review
│   └── prompts.py           # LLM prompt template
├── .env                     # Secrets (not committed)
├── requirements.txt
└── README.md
```
```shell
git clone https://github.com/Kush05Bhardwaj/PRobot.git
cd PRobot/probot
python -m venv venv
venv\Scripts\activate       # Windows
source venv/bin/activate    # macOS/Linux
pip install -r requirements.txt
```

Create a `.env` file in the `probot/` directory:
```
GITHUB_TOKEN=your_github_pat_here
WEBHOOK_SECRET=your_webhook_secret
OLLAMA_MODEL=qwen2.5-coder:7b
```

- `GITHUB_TOKEN`: a GitHub Personal Access Token with the `repo` scope.
- `WEBHOOK_SECRET`: a secret string; set the same value in your GitHub webhook config.
- `OLLAMA_MODEL`: any code-capable model pulled in Ollama.
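python-dotenv reads these values into the environment at startup. As a rough illustration of what it does under the hood (the real library also handles quoting and variable interpolation), a minimal parser might look like:

```python
def parse_env(text: str) -> dict:
    """Parse simple KEY=VALUE lines, skipping blanks and # comments.

    Illustrative sketch only -- use python-dotenv's load_dotenv() in practice.
    """
    env = {}
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#") or "=" not in line:
            continue
        key, _, value = line.partition("=")
        env[key.strip()] = value.strip()
    return env
```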
Pull the model, start the server, and open a tunnel:

```shell
ollama pull qwen2.5-coder:7b
uvicorn app.main:app --reload --port 8000
ngrok http 8000
```

Copy the HTTPS URL (e.g. `https://xxxx.ngrok-free.app`).
In your GitHub repo → Settings → Webhooks → Add webhook:
| Field | Value |
|---|---|
| Payload URL | https://xxxx.ngrok-free.app/webhook |
| Content type | application/json |
| Secret | value from WEBHOOK_SECRET in .env |
| Events | Pull requests |
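GitHub signs every delivery with the webhook secret and puts the result in the `X-Hub-Signature-256` header. The handler can verify it with the standard library alone — a sketch, with an illustrative helper name:

```python
import hashlib
import hmac

def verify_signature(secret: str, body: bytes, signature_header: str) -> bool:
    """Check GitHub's X-Hub-Signature-256 header against our own HMAC."""
    expected = "sha256=" + hmac.new(
        secret.encode(), body, hashlib.sha256
    ).hexdigest()
    # compare_digest avoids leaking information through timing differences
    return hmac.compare_digest(expected, signature_header)
```

Rejecting requests that fail this check ensures only GitHub — not anyone who discovers the ngrok URL — can trigger a review.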
Open a pull request in your GitHub repo. PRobot will automatically:
- Receive the `pull_request` opened event
- Fetch the changed files and diffs
- Run the diff through the local LLM
- Post a structured review comment on the PR
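The LLM step talks to Ollama's local HTTP API (`/api/generate` on port 11434 by default). A sketch of assembling that request — the function name and prompt text are illustrative, not PRobot's actual code:

```python
import json

OLLAMA_URL = "http://localhost:11434/api/generate"  # Ollama's default port

def build_ollama_request(model: str, prompt: str) -> bytes:
    """JSON body for a single, non-streaming /api/generate call."""
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode()

# Sending it (requires a running Ollama daemon):
#   import urllib.request
#   req = urllib.request.Request(
#       OLLAMA_URL,
#       data=build_ollama_request("qwen2.5-coder:7b", "Review this diff: ..."),
#       headers={"Content-Type": "application/json"},
#   )
#   review = json.loads(urllib.request.urlopen(req).read())["response"]
```

With `"stream": False`, Ollama returns one JSON object whose `response` field holds the full review text, which PRobot can post directly as the PR comment.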
Example review comment:

```
🤖 PRobot Review

📋 Summary:
Adds basic Express authentication middleware and a login endpoint.

🐛 Bugs:
- Hardcoded credentials ('admin'/'secret') should never be used in production.

⚡ Performance:
- No issues for this scope.

🔒 Security:
- Plaintext password comparison is a critical vulnerability; use bcrypt or similar.
- JWT token is hardcoded ('fake-jwt-token'); replace with a real signing library.

📝 PR Description Feedback:
Consider documenting the expected request/response schema for the login route.
```
- Python 3.10+
- Ollama running locally
- ngrok (for local development tunneling)