Hi, I’m new to Bolt.diy.
I’ve managed to deploy Bolt.diy on Railway using the main GitHub fork and could view it live, but unfortunately nothing is functional: I can’t input a prompt or see any models in the field on the right.
I’ve tried setting the environment variables for my OpenAI and Ollama keys within Railway, but still nothing.
Ran into the same thing deploying to Cloudflare Pages. The problem (I believe) is that Bolt doesn’t read encrypted API keys. And it’s kind of a catch-22, because the default behavior in Cloudflare is to store them as encrypted “Secrets”, which can’t be added as readable variables this way.
My workaround was publishing twice: once directly to Cloudflare Pages (so it keeps the ability to run builds on commit and update automatically when changes are pushed), and again using the Wrangler method (wrangler.toml), which pushes the keys “correctly” as clear text.
Maybe just try adding them as “variables” instead of “Secrets”? Although Cloudflare basically prevents you from doing this directly.
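For reference, the Wrangler method mentioned above amounts to something like this. This is only a sketch: the project name, build output directory, and variable names are assumptions, so match them to whatever your Bolt.diy build actually expects.

```toml
# wrangler.toml -- sketch of the clear-text "variables" approach.
# Names and paths below are assumptions, not Bolt.diy's canonical config.
name = "bolt-diy"
compatibility_date = "2024-12-01"
pages_build_output_dir = "build/client"

[vars]
# Plain-text variables the app can read at runtime,
# unlike encrypted Secrets set in the dashboard.
OPENAI_API_KEY = "sk-..."
OLLAMA_API_BASE_URL = "http://127.0.0.1:11434"
```

Then deploying with `wrangler pages deploy` pushes these alongside the site.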
edit: let me check the ctx value. I only have a 3080 with 10 GB of VRAM, so I may have set the value incorrectly in the .env file. I am using DEFAULT_NUM_CTX=1536.
edit2: tried canary and removing the CTX value, and still no joy. I see my models populate but can’t get them to return anything. This is probably a me issue.
edit3: got it working, thanks. Just make sure to save the .env.local file before starting the Docker image; I guess I just missed that step.
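For anyone hitting the same wall, here’s a minimal sketch of what my .env.local ended up looking like. The exact values are examples, and the Ollama URL in particular is an assumption about reaching a host-side Ollama from inside Docker:

```
# .env.local -- save this file BEFORE starting the Docker image
# Lower context size for ~10 GB VRAM cards (example value from this thread)
DEFAULT_NUM_CTX=1536
# Assumption: Ollama runs on the host, reachable from the container
OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```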
Yeah, that’s what I’m doing with Cloudflare Pages: just hosting the front-end and using AI through API endpoints. It runs lightweight (client-side) and is completely free (no crazy infrastructure, and you don’t have to “host” it in the traditional sense).
And you can automate the build process. I can literally make changes locally, commit/push, and the build/tests happen through Actions on GitHub automatically (I can choose whether to build locally or not). I don’t even have to install the node modules locally (maybe not best practice for development, but very cool).
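Roughly, the workflow file for that looks like the sketch below. Action versions, the secret name, the output directory, and the project name are all assumptions; adjust to your setup:

```yaml
# .github/workflows/deploy.yml -- sketch; names/versions are assumptions
name: Deploy to Cloudflare Pages
on:
  push:
    branches: [main]
jobs:
  deploy:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-node@v4
        with:
          node-version: 20
      # Build happens on the runner, so no local node_modules needed
      - run: npm ci && npm run build
      - uses: cloudflare/wrangler-action@v3
        with:
          apiToken: ${{ secrets.CLOUDFLARE_API_TOKEN }}
          command: pages deploy build/client --project-name=bolt-diy
```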
This could be an amazing option if, say, you developed on your phone (assuming the interface were usable enough on mobile).
Someone told me the only way to get this running online was with Cloudflare Workers. I’m not sure; I’m pretty green when it comes to this stuff. I tried for weeks to install this on AWS but never got it to work, and someone on here told me it was designed to run only on Cloudflare Workers without some hefty changes. But again, I don’t know for sure because I’m new to all this.
How easy was it for you to install? If you got it working, maybe you can point me in the right direction.
What are the cost and overhead of running that, though? Cloudflare Pages has neither.
Edit: I’m not saying it’s a bad way to run Bolt.diy or that it doesn’t offer some benefits; I just don’t see a more compelling way to deploy Bolt.diy than Cloudflare or GitHub Pages. Just saying.
I’m currently on their “Hobby Plan” ($5/month), which includes up to 8 GB of RAM, 8 vCPUs, and 100 GB of shared disk, running both n8n and now Bolt.diy + an OpenRouter free LLM, so we’ll see how much it consumes monthly.
Unfortunately I can’t get any model to create a simple To-Do app for testing purposes. They do generate the code but fail to launch a preview… any ideas?
There was an error processing your request: No details were returned
And an error in the console:
chunk-RMBP4JJR.js?v=c630a162:7399 Warning: Maximum update depth exceeded. This can happen when a component calls setState inside useEffect, but useEffect either doesn’t have a dependency array, or one of the dependencies changes on every render.
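For context, that warning usually means a component’s effect unconditionally triggers a state update, so every render schedules another render until React bails out. Here’s a framework-free sketch of that feedback loop (an illustration of the generic pattern, not Bolt.diy’s actual component code):

```javascript
// Framework-free sketch of the render/effect loop React is warning about.
// (Assumption: this models the generic setState-inside-useEffect pattern,
// not the real Bolt.diy component.)
function simulateRenders(effect, maxDepth = 50) {
  let state = 0;
  let depth = 0;
  let needsRender = true;
  while (needsRender && depth < maxDepth) {
    needsRender = false;
    depth += 1; // one "render"
    // The effect runs after each render; calling setState with a new
    // value schedules another render.
    effect(state, (next) => {
      if (next !== state) {
        state = next;
        needsRender = true;
      }
    });
  }
  return depth;
}

// An effect with no dependency guard sets new state on every render,
// so it never settles and hits the depth cap -- the same condition that
// makes React log "Maximum update depth exceeded".
const depth = simulateRenders((s, setState) => setState(s + 1));
console.log(depth === 50); // true: capped at maxDepth
```

A well-behaved effect (one that stops updating once state stabilizes, or has a correct dependency array) settles after a few renders instead of hitting the cap.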
Tried to run a simple weather app with HTML and it works.
Also, loading my previous Bolt project folder works, but I can’t enter any prompts.