Gravity Omega is the sovereign operator's terminal — a desktop AI platform that unifies local model inference, agent orchestration, browser automation, and multi-channel communication under one VERITAS-governed interface. It does not depend on cloud APIs by default; every inference runs locally through Ollama, every action is logged, and every decision passes through the Omega agent loop before execution.
Gravity Omega is the command surface of the VERITAS & Sovereign Ecosystem — the place where operator intent becomes deterministic action. It hosts the Edge Gallery (skill launcher), the Hermes bridge (external agent tool-call protocol), the Ollama inference router, and the browser automation engine inside a single Electron shell. Other ecosystem nodes — veritas-vault (memory), omega-brain-mcp (governance), Aegis (security), Ollama-Omega (inference transport) — are all accessible from within the Gravity Omega operator interface. The platform enforces a single invariant: no action executes before the operator approves or the agent loop validates it.
Gravity Omega v2 is an Electron desktop application combining four operational layers:
| Layer | Technology | Role |
|---|---|---|
| Main Process | Electron + Node.js | Window management, IPC hub, system tray, auto-updater |
| Renderer | HTML/CSS/JS + Electron APIs | UI surface: chat, settings, gallery, browser view |
| Backend | Python (Flask/FastAPI) | Model orchestration, agent loop, tool execution |
| Bridge | Hermes ACP Adapter | External agent tool-call protocol (67-tool arsenal when unleashed) |
The operator interacts through a chat-style interface that routes to the Omega agent loop. The loop determines whether to use local Ollama models, cloud inference (optional), browser automation, or filesystem tools — with every execution logged and every tool call passing through the omega-brain-mcp governance layer when integrated.
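The routing decision described above can be sketched as a pure function. This is an illustrative model only, assuming a simplified request shape: the `Request` fields, flag names, and path labels are hypothetical, and the real logic lives in omega/omega_agent.js.

```python
from dataclasses import dataclass

@dataclass
class Request:
    text: str
    wants_browser: bool = False   # hypothetical flag: route to browser automation
    wants_files: bool = False     # hypothetical flag: route to filesystem tools

def route(req: Request, local_daemon_up: bool, cloud_enabled: bool) -> str:
    """Pick an execution path the way the loop is described:
    tools first, then local inference, then opt-in cloud fallback."""
    if req.wants_browser:
        return "browser_automation"
    if req.wants_files:
        return "filesystem_tools"
    if local_daemon_up:
        return "ollama_local"        # local-first default
    if cloud_enabled:
        return "cloud_inference"     # opt-in fallback
    raise RuntimeError("no inference path available")
```

The key property the sketch encodes is the local-first invariant: cloud inference is only reachable when the local daemon is down and the operator has opted in.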
| Capability | Detail |
|---|---|
| Multi-Model Chat | Switch between Ollama local models, Mistral API, and Gemini without restart |
| Agent Loop | Omega agent evaluates context, selects tools, and executes with operator confirmation |
| Browser Automation | Puppeteer-driven browser control for research, scraping, and UI interaction |
| Edge Gallery | Launch ecosystem skills as standalone windows within the platform |
| Hermes Bridge | Connect external Hermes agents with full 67-tool access when OMEGA_UNLEASH is set |
| File System Tools | Read, write, search, and analyze files across the operator's machine |
| Vault Integration | Session persistence through the Veritas Vault capture server |
| Omega Brain Gate | Optional governance layer — every tool call validated by omega-brain-mcp |
| System Tray | Minimize to tray; hotkey-activate for instant access |
| Dark Theme | VERITAS gold-and-obsidian aesthetic; no light mode |
+---------------------------------------------------------------+
| ELECTRON SHELL |
| main.js - Window/tray lifecycle, IPC hub |
| preload.js - Secure contextBridge API surface |
| renderer/ - Chat UI, Settings, Gallery, Browser View |
+-----------------------+----------------------+----------------+
| |
v v
+---------------------------------------------------------------+
| OMEGA AGENT LOOP |
| omega/omega_agent.js - Intent evaluation, tool selection |
| backend/web_server.py - Model routing, inference dispatch |
| main.js _ollamaChat() - Electron main fallback layer |
+-----------------------+----------------------+----------------+
| |
+------------+----------+ +-------+---------------+
| | | |
v v v v
+-------------------+ +------------------+ +----------------+
| LOCAL INFERENCE | | CLOUD BRIDGE | | TOOL EXECUTOR |
| Ollama daemon | | Mistral, Gemini | | Filesystem |
| (port 11434) | | (optional) | | Browser |
+-------------------+ +------------------+ | Shell |
+----------------+
| Hermes ACP |
+----------------+
Gravity Omega has three distinct inference layers — all must be aligned for correct routing:
- Python Backend (backend/web_server.py): REST API for model listing, chat completion, and tool execution
- JS Agent Loop (omega/omega_agent.js): Frontend-side agent orchestration with tool-call dispatch
- Electron Main Fallback (main.js _ollamaChat()): Direct Ollama Cloud API fallback when the local daemon is unavailable

Critical: When modifying Ollama Cloud routing, all three layers must be updated simultaneously. Updating only the Python backend while leaving main.js on local Ollama will produce silent failures.
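One way to catch the drift this note warns about is a consistency check across the three layers' configured endpoints. A minimal sketch, assuming a plain dict keyed by layer name; the URLs (including the cloud placeholder) are illustrative, not the project's real config schema:

```python
LOCAL = "http://localhost:11434"
CLOUD = "https://example-ollama-cloud.invalid"  # placeholder, not a real endpoint

def routing_consistent(layers: dict) -> bool:
    """True only when all inference layers target the same Ollama endpoint."""
    return len(set(layers.values())) == 1

layers = {
    "backend/web_server.py": LOCAL,
    "omega/omega_agent.js": LOCAL,
    "main.js _ollamaChat()": LOCAL,
}
assert routing_consistent(layers)

layers["main.js _ollamaChat()"] = CLOUD  # one drifted layer -> silent failures
assert not routing_consistent(layers)
```

Running such a check at startup would turn the "silent failure" mode into a loud one.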
- Node.js 20+
- Python 3.11+
- Ollama running locally (for local inference)
- Windows (primary target with WSL support)
# 1. Clone the repository
git clone https://github.com/VrtxOmega/Gravity-Omega.git
cd Gravity-Omega
# 2. Install Node dependencies
npm install
# 3. Install Python dependencies
pip install -r requirements.txt
# 4. Start the application
npm start

# Electron dev mode with live reload
npm run dev
# Python backend standalone
python backend/web_server.py

npm run dist
# Output: dist/Gravity-Omega-Setup.exe

| Path | Content |
|---|---|
| backend/.env | OLLAMA_API_KEY, model defaults, inference timeout |
| omega/config.js | Agent loop parameters, tool timeout, retry policy |
| renderer/app.js | UI preferences, theme, keyboard shortcuts |
| ~/.omega-brain/ | Session logs, RAG store, vault artifacts |
Ollama: The local inference layer expects the daemon at http://localhost:11434. Pull models before use:
ollama pull qwen2.5:7b
ollama pull mistral:7b

Environment Variables:
- OMEGA_UNLEASH=1: enables the full 67-tool Hermes bridge arsenal (use with discretion)
- OLLAMA_API_KEY: required for Ollama Cloud routing in the main.js fallback layer
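The OMEGA_UNLEASH gate can be illustrated with a short sketch. The 30-safe/67-total split comes from this document; the tool names, the exact environment check, and the function shape are assumptions, not the bridge's actual implementation:

```python
import os

# Hypothetical tool registries; only the counts (30 safe, 67 total) are documented.
SAFE_TOOLS = [f"safe_tool_{i}" for i in range(30)]         # always available
PRIVILEGED_TOOLS = [f"priv_tool_{i}" for i in range(37)]   # unleashed only

def available_tools(env=None) -> list:
    """Return the tool set the Hermes bridge may dispatch."""
    env = os.environ if env is None else env
    if env.get("OMEGA_UNLEASH") == "1":
        return SAFE_TOOLS + PRIVILEGED_TOOLS   # full 67-tool arsenal
    return SAFE_TOOLS                          # restricted bridge (default)

assert len(available_tools({})) == 30
assert len(available_tools({"OMEGA_UNLEASH": "1"})) == 67
```

Keeping the restricted set as the default path means forgetting to set the flag fails safe rather than open.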
| Layer | Technology | Path | Description |
|---|---|---|---|
| Session Logs | SQLite | ~/.omega-brain/logs.db | Chat history, tool calls, agent decisions |
| RAG Store | SQLite+FTS5 | ~/.omega-brain/omega_brain.db | Knowledge fragments, semantic search |
| Vault Artifacts | Filesystem | ~/.gemini/antigravity/ | Session captures, clipboard snapshots |
| Settings | JSON | ~/.omega/settings.json | UI preferences, model selection, API keys |
- Local-first: All inference defaults to local Ollama. Cloud APIs are opt-in.
- No telemetry: No analytics, no crash reporting, no update checks.
- Hermes bridge isolation: When OMEGA_UNLEASH is not set, the bridge is restricted to 30 safe tools.
- IPC sanitization: All renderer-to-main process messages pass through contextBridge with type validation.
- WAL journaling: SQLite databases use Write-Ahead Logging for crash resilience.
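The WAL setting can be verified on any SQLite database with two PRAGMA statements. A throwaway-file sketch, using an illustrative schema rather than the real logs.db layout:

```python
import os
import sqlite3
import tempfile

# WAL requires a file-backed database, so use a temp file rather than :memory:.
path = os.path.join(tempfile.mkdtemp(), "logs.db")
conn = sqlite3.connect(path)

conn.execute("PRAGMA journal_mode=WAL")  # enable Write-Ahead Logging
mode = conn.execute("PRAGMA journal_mode").fetchone()[0]
assert mode == "wal"

# Illustrative schema only; writes under WAL do not block concurrent readers.
conn.execute("CREATE TABLE log (ts TEXT, event TEXT)")
conn.execute("INSERT INTO log VALUES ('2025-01-01T00:00:00Z', 'tool_call')")
conn.commit()
conn.close()
```

WAL trades slightly larger on-disk state (the `-wal` and `-shm` sidecar files) for crash resilience and reader/writer concurrency, which suits an always-on session log.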
The Android companion is maintained as a mobile/ subdirectory within this repository:
cd mobile
# See mobile/README.md for React Native build instructions

The original standalone OmegaMobile repository is archived and superseded by this integrated location.
| Repository | Role |
|---|---|
| veritas-vault | Session capture, knowledge retention, RAG chat |
| omega-brain-mcp | Governance, audit ledger, 10-gate pipeline |
| Ollama-Omega | Local Ollama inference bridge for IDEs |
| Aegis | Security posture and threat hunting |
| aegis-rewrite | Next-gen security scanner with AI remediation |
| hermes-sentinel | Secret execution without exposing credentials |
| drift | 3D visualization of GitHub development universe |
📖 Read the master narrative: Why Sovereign AI?
This project is part of the VERITAS Omega Universe — a sovereign AI infrastructure stack.
- VERITAS-Omega-CODE — Deterministic verification spec (10-gate pipeline)
- omega-brain-mcp — Governance MCP server (Triple-A rated on Glama)
- Gravity-Omega — Desktop AI operator platform
- Ollama-Omega — Ollama MCP bridge for any IDE
- OmegaWallet — Desktop Ethereum wallet (renderer-cannot-sign)
- veritas-vault — Local-first AI knowledge engine
- sovereign-arcade — 8-game arcade with VERITAS design system
- SSWP — Deterministic build attestation protocol
Released under the MIT License.