I’m encountering an issue with bolt.diy when hosting it over the internet. While the application works fine on localhost, I’m unable to use the terminal functionality when accessing it remotely. Here are the details:
The application runs successfully on localhost:5173
I’ve set up port forwarding to make it accessible over the internet
When accessing via my public IP or domain, the main application loads, but the terminal fails to spawn
Error message: “Failed to spawn bolt shell” and “Failed to execute ‘postMessage’ on ‘Worker’: SharedArrayBuffer transfer requires self.crossOriginIsolated.”
My current setup:
Using Docker with docker-compose
Not using Nginx, just direct port forwarding
Debian-based system
Any guidance on how to properly configure bolt.diy for internet access while maintaining terminal functionality would be greatly appreciated. Are there specific settings or configurations I’m missing for cross-origin isolation when hosting remotely? Thank you in advance for your help!
Off the top of my head, it feels like an HTTPS or CORS issue. Also, running it locally uses the “dev” server, so I’d consider fully hosting it or publishing it to Cloudflare Pages.
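For what it’s worth, self.crossOriginIsolated is only true when the page is served from a secure context (HTTPS or localhost) and with cross-origin isolation headers, which is what the SharedArrayBuffer error above is complaining about. Here’s a minimal sketch of a reverse proxy that adds those headers, using Caddy as an example; the hostname is a placeholder and port 5173 assumes the dev server mentioned above:

```
# Caddyfile sketch -- hostname is a placeholder, 5173 assumes the dev server above
bolt.example.com {
    # Headers required for self.crossOriginIsolated / SharedArrayBuffer
    header Cross-Origin-Opener-Policy "same-origin"
    header Cross-Origin-Embedder-Policy "require-corp"
    reverse_proxy localhost:5173
}
```

Caddy also provisions HTTPS for the hostname automatically, which covers the secure-context part.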
Hello. I set up Caddy to resolve the CORS errors and I’ve run into an issue. I can access Bolt via HTTPS now and the Ollama models are present in the drop-down. The problem comes when I create a new chat and send my first message: the message is sent with no errors, but nothing gets processed on the Ollama side and no response is returned to Bolt.
Is this because of the self-signed cert? I see no errors when using the Chrome Canary browser. Could Caddy be rewriting the URLs incorrectly, causing a failure? Any suggestions as I continue to troubleshoot? Thank you.
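One thing I plan to check is whether the Ollama base URL Bolt is configured with actually answers through the proxy. As a rough sketch (assuming Ollama’s default port 11434; the HTTPS URL is a placeholder for my actual base URL):

```
# Does Ollama answer directly? (default port 11434 assumed)
curl http://localhost:11434/api/tags

# Does it answer at the exact base URL Bolt is using? (placeholder URL;
# -k skips verification for a self-signed cert)
curl -k https://your-ollama-base-url/api/tags
```

If the first works and the second doesn’t, the issue is presumably the proxy or the base-URL setting rather than Ollama itself.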
To elaborate further: my goal is to run Bolt the same way I run OpenWebUI and Ollama, in containers on my desktop host with GPUs, and to be able to access Bolt from any system on my local network.
I realize an alternative would be to spin up a Windows VM with GPU passthrough, run all of these services locally on that machine, and just remote in and access them via http://localhost. But that is a lot of work and not really sustainable.
I was using a hacky and somewhat cumbersome method to get my local Ollama and LMStudio web-facing. After looking into it, I’d suggest just using ngrok to expose the service running on the given port. The nice thing is you can use one solution for one or both (Ollama is lighter and probably the better option, though LMStudio gives more control).
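As a rough sketch, assuming the default ports (Ollama on 11434, LM Studio’s local server on 1234; adjust if yours differ):

```
# Expose Ollama's API (default port 11434) via a public ngrok URL
ngrok http 11434

# Or expose LM Studio's local server (default port 1234)
ngrok http 1234
```

ngrok then prints a public forwarding URL you can point the remote client at.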