### Description
While configuring Ollama (Local) as the provider in the Superpowers AI browser extension, the Test Connection action consistently fails with the following result:
Test Result: `403 status code (no body)`
This happens even though the Ollama server is running correctly and is reachable outside the extension.
The issue occurs:
- When Ollama is running on the same machine
- When Ollama is running on another server in the same network
- When Ollama is running on an external/remote server
The behavior is identical in all cases.
### Steps to Reproduce
1. Install the Superpowers AI extension
2. Open the side panel → Settings
3. Select Provider: Ollama (Local)
4. Enter a valid Ollama base URL (for example, `http://127.0.0.1:11434`)
5. Click Test Connection
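As a point of comparison for "reachable outside the extension", the Ollama server can be checked with a direct API call. A minimal sketch, assuming the default port; `/api/tags` is Ollama's endpoint that lists locally installed models:

```shell
# Query the Ollama REST API directly, bypassing the extension.
# A healthy server returns HTTP 200 and a JSON list of installed models.
curl -i --max-time 5 http://127.0.0.1:11434/api/tags
```

This request succeeds from the shell while the extension's Test Connection returns 403.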
### Expected Behavior
The connection test should succeed when the Ollama server is running and reachable.
### Actual Behavior
The test always fails with:
`403 status code (no body)`
No additional error details are shown.
### Environment
- Extension: Superpowers AI
- Provider: Ollama (Local)
- Model selected: `llama3.2:latest`
- Ollama URLs tested: `http://127.0.0.1:11434`, `http://<LAN-IP>:11434`, `http://<remote-server-IP>:11434`
- OS: Linux
- Browser: Chrome (latest)
### Additional Notes
- Ollama is confirmed to be running and responding to requests outside the extension
- Same 403 response occurs across multiple servers and machines
- No authentication or reverse proxy is configured in front of Ollama
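One server-side behavior that may be worth ruling out: Ollama returns a bodyless 403 when a request carries an `Origin` header that is not on its allow-list, and browser extensions send a `chrome-extension://` origin. As a temporary diagnostic only (not a recommended production setting), the server can be restarted with a widened allow-list before retrying Test Connection:

```shell
# Diagnostic only: accept any Origin header (including chrome-extension://),
# then retry Test Connection from the extension.
# Revert to a narrower OLLAMA_ORIGINS value afterwards.
OLLAMA_ORIGINS="*" ollama serve
```

If the test passes with this setting, the 403 is an origin-check rejection rather than a network or extension bug.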
Please let me know if additional logs or debugging information would help.
