"Virtual Intelligence Response Unit" — A Self-Hosted, Agentic Operating System.
A free alternative to ChatGPT and GitHub Copilot, VIRU is a privacy-first, self-hosted AI coding assistant that runs entirely locally. Unlike cloud-based tools, your code never leaves your machine, and there are no monthly subscription fees.
ChatGPT alternative · GitHub Copilot alternative · free AI coding assistant · local LLM · self-hosted AI · privacy-first AI · open-source coding assistant · autonomous AI agent · Ollama · local AI developer tool
Landing Page: viru.vercel.app (Deploy your own from landing-page/)
GitHub: github.com/rajpratham1/VIRU
License: MIT (100% Free & Open Source)
This repository contains:
- /client: React frontend (main VIRU interface)
- /server: Node.js backend (AI engine)
- /admin: Admin dashboard
- /landing-page: Marketing website (not needed for VIRU to run)
To download VIRU: Use the "Download ZIP" button, which excludes the landing page automatically.
To deploy landing page: See landing-page/README.md
"See it, Build it."
- Drag and drop any UI screenshot into the terminal.
- VIRU analyzes the image using Multimodal AI (LLaVA/GPT-4V) and writes the complete React/Tailwind code to replicate it.
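The image-to-code flow boils down to a single multimodal request. The sketch below is illustrative only (the model name and prompt are assumptions, not VIRU's actual internals); it builds the JSON payload that Ollama's `/api/generate` endpoint accepts for LLaVA-style models, with the screenshot attached as base64.

```typescript
// Illustrative sketch: build an Ollama /api/generate payload for a
// LLaVA-style multimodal model. Model name and prompt text are
// assumptions for demonstration, not VIRU's real implementation.
interface VisionRequest {
  model: string;
  prompt: string;
  images: string[]; // base64-encoded screenshots
  stream: boolean;
}

function buildImageToCodeRequest(screenshotBase64: string): VisionRequest {
  return {
    model: "llava",
    prompt:
      "Replicate this UI as a React component styled with Tailwind CSS. " +
      "Return only the component code.",
    images: [screenshotBase64],
    stream: false,
  };
}

// The payload would then be POSTed to the local Ollama server, e.g.:
//   fetch("http://localhost:11434/api/generate", {
//     method: "POST",
//     body: JSON.stringify(buildImageToCodeRequest(b64)),
//   })
```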
"Hands-free Coding."
- Command Mode: Click the Mic for single instructions.
- God Mode (∞): Continuous conversations. VIRU listens, thinks, executes, and speaks back.
"Total Recall."
- VIRU maintains a persistent Vector Database of your project.
- It remembers context from previous conversations and documents.
- Visualizer: View your project's memory as a 3D interactive starfield.
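Under the hood, this kind of recall is typically a nearest-neighbour search over stored embeddings. A minimal sketch, assuming an in-memory store and cosine similarity (the function names and store shape are illustrative, not VIRU's actual schema):

```typescript
// Illustrative sketch of vector recall: rank stored document embeddings
// by cosine similarity to a query embedding. The in-memory store is an
// assumption for demonstration; VIRU persists its vectors in a database.
interface MemoryEntry {
  text: string;
  embedding: number[];
}

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

function recall(query: number[], store: MemoryEntry[], topK = 3): MemoryEntry[] {
  // Sort a copy so the underlying store is left untouched.
  return [...store]
    .sort(
      (x, y) =>
        cosineSimilarity(query, y.embedding) - cosineSimilarity(query, x.embedding)
    )
    .slice(0, topK);
}
```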
"Code that fixes itself."
- Command: /autopilot <goal>. VIRU writes code, runs tests, reads the errors, and fixes itself in a loop.
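The autopilot cycle is essentially write → test → read errors → fix, repeated until the tests pass or a retry budget runs out. A hedged sketch with the concrete steps injected as callbacks (the callback shapes and retry limit are assumptions, not VIRU's actual interface):

```typescript
// Illustrative autopilot loop: generate code, run the tests, and feed any
// failure output back into the next generation attempt. The callback
// shapes are assumptions for illustration, not VIRU's real interface.
interface AutopilotSteps {
  writeCode: (goal: string, lastError: string | null) => Promise<void>;
  runTests: () => Promise<{ ok: boolean; errors: string }>;
}

async function autopilot(
  goal: string,
  steps: AutopilotSteps,
  maxAttempts = 5
): Promise<boolean> {
  let lastError: string | null = null;
  for (let attempt = 1; attempt <= maxAttempts; attempt++) {
    await steps.writeCode(goal, lastError); // LLM writes or patches code
    const result = await steps.runTests();  // e.g. spawn `npm test`
    if (result.ok) return true;             // goal reached
    lastError = result.errors;              // feed errors into next attempt
  }
  return false; // retry budget exhausted
}
```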
"Control Everything."
- A standalone SaaS Dashboard running on port 5174.
- Manage users, broadcast system-wide alerts, and manage subscription tiers.
- Frontend: React 19, Vite, TailwindCSS
- Backend: Node.js, Express, TypeScript
- Database: SQLite + Prisma ORM
- AI Engine: Ollama (Local) or OpenAI (Cloud)
- Infrastructure: Docker, LocalTunnel
Prerequisites:
- Docker Desktop
- Ollama → Run:
ollama pull mistral
Installation:
git clone https://github.com/rajpratham1/VIRU.git
cd VIRU
docker-compose up -d

Access:
- Main Console: http://localhost:5173
- Admin Dashboard: http://localhost:5174
- API: http://localhost:5000
Stop: docker-compose down
📘 See DOCKER.md for publishing to Docker Hub
Prerequisites: Node.js v18+, Ollama
# Clone & Install
git clone https://github.com/rajpratham1/VIRU.git
cd VIRU
npm install
cd client && npm install && cd ..
cd server && npm install && npx prisma generate && npx prisma db push && cd ..
cd admin && npm install && cd ..
# Run
npm run dev

VIRU runs locally but can be accessed from anywhere via:
- Backend: Runs on your machine with an auto-tunnel: https://viru-rajpratham-gen1.loca.lt
- Frontend: Deployed to Vercel
- Connection: Set the Vercel env variable VITE_API_URL to your tunnel URL
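On the client side, this usually amounts to resolving the API base URL from the build-time environment and falling back to the local backend. A sketch, assuming the fallback is the API port listed above (the exact resolution logic is an assumption, not VIRU's actual client code):

```typescript
// Illustrative sketch: resolve the backend base URL from Vite's build-time
// env (VITE_API_URL), falling back to the local dev server. In the real
// client this would read import.meta.env.VITE_API_URL.
function resolveApiBase(env: Record<string, string | undefined>): string {
  const url = env.VITE_API_URL?.trim();
  // Strip any trailing slashes so paths can be appended cleanly.
  return url && url.length > 0 ? url.replace(/\/+$/, "") : "http://localhost:5000";
}
```

With this shape, a Vercel deployment only needs `VITE_API_URL` set to the tunnel URL, while local development works with no configuration at all.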
CORS/Network Error:
- Ensure backend is running
- Visit tunnel URL once to bypass IP check
500 Error:
- Check server/error.log
- Verify Ollama is running: ollama list
Docker Issues:
- Can't connect to Ollama? Use AI_MODEL_URL=http://host.docker.internal:11434
- Port conflicts? Change the ports in docker-compose.yml
- DOCKER.md - Docker distribution guide
- documentation.md - Full technical docs
- learn.md - Developer glossary
MIT License - See LICENSE
Architect: Raj Pratham
System: VIRU (Virtual Intelligence Response Unit)
"Engineered for the Future."