I’ve installed the latest version of bolt.diy, but only the Google models are working. I have API keys for Anthropic, OpenAI, OpenRouter, and Google. I’ve done the following:
Copied a known working .env from a different version of bolt into this new version
Deleted the .env file, created a brand new one, and pasted the API keys into it (quick sanity check sketched below)
Added the API keys in the UI
Created and tested a brand new OpenAI API key
Restarted the server, the browser, and eventually my PC
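For reference, here’s the kind of quick sanity check I mean, just a minimal TypeScript sketch. The variable names (OPENAI_API_KEY, ANTHROPIC_API_KEY, OPEN_ROUTER_API_KEY, GOOGLE_GENERATIVE_AI_API_KEY) are my guess at what bolt.diy’s .env.example uses, so adjust them to match your copy. Plain Node won’t read .env/.env.local on its own, so export the variables (or load them with dotenv) before running it:

```ts
// check-env.ts – run with: npx tsx check-env.ts
// Prints which provider keys the Node process can actually see.
// Variable names assumed from bolt.diy's .env.example – adjust if yours differ.
const providerKeys = [
  'OPENAI_API_KEY',
  'ANTHROPIC_API_KEY',
  'OPEN_ROUTER_API_KEY',
  'GOOGLE_GENERATIVE_AI_API_KEY',
];

for (const name of providerKeys) {
  const value = process.env[name];
  if (!value) {
    console.log(`${name}: MISSING`);
  } else {
    // Show only a prefix so the key never ends up in logs or screenshots.
    console.log(`${name}: present (${value.slice(0, 8)}…)`);
  }
}
```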
The error is always the same: “ERROR Chat Request failed. Error: An error occurred.” No other info.
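For what it’s worth, the new OpenAI key passes a direct test outside of bolt, so the key itself seems fine. Something along these lines (a minimal sketch hitting OpenAI’s /v1/models list endpoint; the other providers have similar list endpoints but different auth headers) comes back with HTTP 200 for me:

```ts
// test-openai-key.ts – run with: OPENAI_API_KEY=sk-... npx tsx test-openai-key.ts
// Hits OpenAI's model-list endpoint directly, bypassing bolt.diy entirely.
// A 200 means the key is fine and the problem is in the app/config layer.
const key = process.env.OPENAI_API_KEY;
if (!key) {
  throw new Error('Set OPENAI_API_KEY before running this check.');
}

const res = await fetch('https://api.openai.com/v1/models', {
  headers: { Authorization: `Bearer ${key}` },
});

console.log(`OpenAI responded with HTTP ${res.status}`);
if (!res.ok) {
  console.log(await res.text()); // usually explains invalid key, quota, etc.
}
```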
I did try adding all the API keys in the UI, but they didn’t work. I just renamed .env to .env.local and restarted the server, but it’s exactly the same with one exception: now the Google LLMs don’t work either.
I closed my browser (Canary), killed the process in the Windows terminal, and restarted everything, then tested Anthropic, OpenAI, OpenRouter, and Google. Only Google works, same as before.
Is there a workaround in the meantime? Is there another version of bolt I could use? Gemini 2.0 is not the best for building apps. Very, very frustrating. Like trying to ask a monkey to write code.
Hmmm. I was using 0.0.3 and ran into this issue, which is why I updated to 0.0.5. I’ll go back to 0.0.3 and check. I’ll also check 0.0.4, just because I haven’t used it yet.
I think the 4o model just isn’t capable of handling your project size. Try Gemini 2.0 Flash.
Also, you can send the project to me if you want, and I’ll try to verify it on my system. You can send it via PM if you don’t want to post it publicly here.
I’ve been using Gemini for days. I’ve moved from Canary to Brave, tested OpenRouter, and found Sao10k: Llama 3.1 Euryale 70B v2.2. Works perfectly. Super cheap.