Trying to Connect Cloudflare Pages to Local Model

I wanted to do a quick and dirty test to try to get a Cloudflare Pages instance of Bolt.diy working with a local model. I set up port forwarding, and it even detects my models in LM Studio running locally (which is pretty cool right off the bat). Super easy to get set up; I just added the LMSTUDIO_API_BASE_URL variable to the site for Cloudflare Pages: https://bolt.cyopsys.com/
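For anyone trying the same thing, this is roughly the shape of the variable (placeholder address — LM Studio's built-in server exposes an OpenAI-compatible API on port 1234 by default; depending on how the provider builds the URL you may or may not need a `/v1` suffix):

```shell
# Placeholder — substitute the public IP/port you forwarded to LM Studio.
LMSTUDIO_API_BASE_URL=http://<your-public-ip>:1234
```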

But the problem is routing isn’t working (or something lol):

But locally it works fine:

I tried modifying a bunch of firewall settings and what have you, but couldn't immediately get it working. I think it's doable, just not sure which steps to try.


Currently the only limitation I'm hitting is HTTPS/TLS: "This request has been blocked; the content must be served over HTTPS." I did try putting it behind Cloudflare's DNS proxy to take care of the HTTPS, and changed the port to 8080 because port 1234 isn't one that's supported. But I couldn't get it to return responses with either my direct external IP or the one LM Studio provides (also external).
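A quick sanity check for this setup (hypothetical hostname — Cloudflare's proxy only forwards a fixed list of ports, which is why 1234 has to be remapped to something on that list, like 8080):

```shell
# Hypothetical proxied hostname — replace with your own domain.
# If the proxy and origin port line up, this should return LM Studio's model list as JSON.
curl -i https://lm.example.com/v1/models
```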

I might try Ollama next, and I know this is a niche within a niche kind of thing, but I was wondering if anyone else had tried something crazy like this. The next thing for me to try would be getting a self-signed certificate and HTTPS working on my local machine, or spinning up a little Linux server running the LLM.

LM Studio settings:

And I only tried this on my PC. I have another machine intended for running LLMs as a service, but that's also a bit more involved.

Promising, but I'll need to figure out a simple way to set up localhost with a self-signed certificate to do HTTPS.
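One way to mint a self-signed cert for localhost, in case it helps anyone (assumes OpenSSL 1.1.1+, which added the `-addext` flag; you'd still need to point a reverse proxy or the server itself at these files):

```shell
# Generate a self-signed cert + key for localhost, valid for one year.
# The subjectAltName extension is what modern browsers actually check.
openssl req -x509 -newkey rsa:2048 -nodes \
  -keyout localhost.key -out localhost.crt \
  -days 365 -subj "/CN=localhost" \
  -addext "subjectAltName=DNS:localhost,IP:127.0.0.1"
```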


Nice,

HTTPS => Maybe just use Caddy with self-signed certs. Easy and fast:
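Something like this minimal Caddyfile sketch (assumes LM Studio is listening on 127.0.0.1:1234; `tls internal` makes Caddy issue its own locally-trusted cert, and the port/hostname here are just examples):

```
# Hypothetical Caddyfile — terminate HTTPS locally and forward to LM Studio.
https://localhost:8443 {
	tls internal
	reverse_proxy 127.0.0.1:1234
}
```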


Thanks, I’ll take a look.