About the [bolt.diy] Issues and Troubleshooting category

Use this category if you run into any issues running oTToDev yourself, if you've found a bug, or if you need any troubleshooting assistance!


The biggest problem is getting Ollama to run in Docker. I've tried everything and nothing works at all.
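For context, a setup that often works when bolt.diy runs in a container alongside Ollama is pointing the app at the Docker host rather than at localhost. A minimal sketch, assuming the OLLAMA_API_BASE_URL variable from .env.example and Docker Desktop's host.docker.internal alias (on plain Linux you would need --add-host=host.docker.internal:host-gateway or the host's IP instead):

```shell
# Start Ollama in its own container, publishing the default API port
docker run -d --name ollama -p 11434:11434 ollama/ollama

# Inside the bolt.diy container, "localhost" is the container itself,
# so point the app at the Docker host instead (e.g. in your env file):
# OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```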

I get the usual “There was an error processing your request: An error occurred.”

Here are some more details.


In my cmd I get this error: No routes matched location “/meta.json”


In the web console I get: ERROR Chat Request failed
Error: An error occurred.


In Inspect → Network I get: meta.json 404 fetch background.js:514 4.4 kB 367 ms

Which is linked to this part of background.js

async function getStoreMeta(link) {
  // Skip known exception URLs; otherwise fetch the store's meta.json
  if (exceptionURL.some(el => link.includes(el))) return false;
  else return await fetch(`${link}/meta.json`)
    .then(res => res.json())
    .catch(() => false);
}



From what I understand, it seems like it can't find meta.json.
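That matches the snippet's error handling. A minimal sketch (a stub, not the actual background.js) showing why the 404 is swallowed silently: fetch resolves even on a 404 response, res.json() then throws on the non-JSON error body, and the .catch turns the failure into a plain false:

```javascript
// Simplified version of the pattern, with fetch injected so we can
// simulate a 404 without any network access.
async function getStoreMeta(link, fetchImpl) {
  return await fetchImpl(`${link}/meta.json`)
    .then(res => res.json())   // throws if the body isn't valid JSON
    .catch(() => false);       // any failure (404 page, network) => false
}

// Stub mimicking a 404 whose body is an HTML error page, not JSON.
const fake404 = async () => ({
  ok: false,
  status: 404,
  json: async () => { throw new SyntaxError("Unexpected token <"); },
});

getStoreMeta("https://example.invalid", fake404).then(meta => {
  console.log(meta); // false — the 404 never surfaces as an error
});
```

So a missing meta.json doesn't crash anything by itself; the function just quietly returns false, which is why the only visible symptom is the 404 row in the Network tab.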

Here are the steps I’ve taken so far :

  • I deleted and re-downloaded bolt.diy
  • I've tried using different LLMs (OpenAI, OpenRouter, and DeepSeek)
  • I've tried hosting bolt.diy on Cloudflare, but I still get the error

I am able to import a project and create a new one, but at some point I start getting the error perpetually.

If anyone knows what it could be, please let me know 🙂

@simonfortinpro please open a separate topic to investigate this. I will answer there.