Completely disable Cloudflare workers and AI SDK telemetry #2095
Description
To start off, thank you to the team behind bolt.diy for open sourcing it! <3
For my use case, I really needed a self-hostable UI builder that I can also connect to my own LLM, since I deploy POCs behind firewalls and in closed systems.
In my case, I am running my LLM locally with vLLM in my VM.
I am also running LiteLLM as a passthrough server in front of my vLLM instance.
Now I am connecting bolt.diy to my local LiteLLM instance, set up as an OpenAILike provider under the Local Providers settings.
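For reference, my LiteLLM passthrough config looks roughly like this (model name, host, and port are placeholders here, not my exact values):

```yaml
# litellm config.yaml — passthrough to a local vLLM OpenAI-compatible server
# (model name, host, and port are placeholders)
model_list:
  - model_name: my-local-model
    litellm_params:
      model: hosted_vllm/my-local-model
      api_base: http://localhost:8000/v1
```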
My issue is that several components of bolt.diy keep trying to reach the open internet, which is completely blocked by my firewalls.
The biggest issue is the Cloudflare workers dev proxy: all my API calls fail with this:
```
DOMException [Error]: Request was cancelled
    at /data/bolt.diy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
  ....
  ....
  cause: RequestAbortedError [AbortError]: Proxy Response (407) !== 200 when HTTP Tunneling
      at client2.connect (/data/bolt.diy/node_modules/wrangler/wrangler_dist/cli.js:19959:26) {
    code: "UND_ERR_ABORTED"
  }
}
```
This is expected, since I block all traffic that is not strictly between bolt.diy <--> LiteLLM <--> vLLM.
Is there some way to disable these Cloudflare workers / wrangler, since I will never deploy to Cloudflare to begin with?
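For what it's worth, the only related switch I found is wrangler's documented opt-out for its own usage metrics, which can also be set as the environment variable `WRANGLER_SEND_METRICS=false`. As far as I can tell, this only stops wrangler's metrics and does not disable the dev proxy itself:

```toml
# wrangler.toml — documented wrangler setting; this only disables wrangler's
# own usage metrics, it does not turn off the Cloudflare dev proxy
send_metrics = false
```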
Another issue is seeing the Vercel AI SDK constantly trying to push telemetry metrics. There should be a simple way to block all telemetry via a single flag, for deployments inside closed systems that cannot or are not allowed to send telemetry.
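As a sketch of the kind of global kill switch I mean: the AI SDK exposes a per-call `experimental_telemetry` option, so a single helper could force it off for every `generateText` / `streamText` call. The helper name `withTelemetryOff` is hypothetical, not something that exists in bolt.diy today:

```typescript
// Hypothetical helper: merge a hard telemetry opt-out into any AI SDK call
// options, using the SDK's documented `experimental_telemetry` option.
function withTelemetryOff<T extends object>(options: T) {
  return { ...options, experimental_telemetry: { isEnabled: false } };
}

// Usage: spread into the options of any generateText/streamText call.
const opts = withTelemetryOff({ prompt: "hello" });
console.log(opts.experimental_telemetry.isEnabled); // false
```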
Motivation
This would make the self-hosted version of bolt.diy deployable in closed or enterprise environments with restrictions on cloud providers.