Help with an error

I get the usual “There was an error processing your request: An error occurred.”

Here are some more details.


In my cmd I get this error: No routes matched location “/meta.json”


In the web console I get: ERROR Chat Request failed
Error: An error occurred.


In Inspect → Network I get: meta.json 404 fetch background.js:514 4.4 kB 367 ms

Which is linked to this part of background.js:

async function getStoreMeta(link) {
  if (exceptionURL.some(el => link.includes(el))) return false;
  return await fetch(`${link}/meta.json`)
    .then(res => res.json())
    .catch(() => false);
}
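One detail worth knowing when reading that snippet: `fetch` does not reject on a 404, it resolves with a response whose `ok` flag is false, so `res.json()` ends up trying to parse the 404 error page and throws, which the `.catch` then swallows. A minimal defensive sketch of the same idea (the function name and the injectable `fetchFn` parameter are mine, for illustration, not bolt.diy's API):

```javascript
// Hedged sketch: a variant of getStoreMeta that treats any non-2xx
// response (e.g. the 404 on meta.json) as "no metadata" without
// trying to parse the error page body as JSON.
async function getStoreMetaSafe(link, fetchFn = fetch) {
  try {
    const res = await fetchFn(`${link}/meta.json`);
    if (!res.ok) return false;   // 404, 500, … → no store metadata
    return await res.json();
  } catch {
    return false;                // network failure or invalid JSON
  }
}
```

Either way, the function returning `false` is expected behavior when the file is missing; the 404 itself is the real symptom.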



From what I understand it seems like it can’t find meta.json

Here are the steps I’ve taken so far:

  • I deleted and re-downloaded bolt.diy
  • I’ve tried using different LLMs (OpenAI LLMs, OpenRouter LLMs & DeepSeek LLMs)
  • I’ve tried hosting bolt.diy on Cloudflare, but I still get the error

I am able to import a project and create a new one, but at some point I start perpetually getting the error.

If anyone knows what it can be, please let me know :slight_smile:

Did you confirm manually that the meta.json file actually exists, and that you didn’t accidentally use an extra forward slash or something? I don’t think this really has anything to do with bolt. And if you expected the LLM to create that file, it probably just failed to do so (common with many LLMs).
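To check manually, you can run something like this from the browser DevTools console (or Node 18+, which also ships `fetch`). The helper name and the base URL are hypothetical; substitute whatever link the extension is actually building:

```javascript
// Hypothetical helper: logs the HTTP status for <baseUrl>/meta.json so
// you can see whether the file exists at that exact path.
async function checkMetaJson(baseUrl, fetchFn = fetch) {
  const res = await fetchFn(`${baseUrl}/meta.json`);
  console.log(res.status, res.ok ? 'exists' : 'missing');
  return res.ok;
}

// e.g. checkMetaJson('https://example.com')
```

A 404 here confirms the file was simply never created; a 200 would point at a different problem (wrong path, extra slash, etc.).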

I looked into the original files and there’s no meta.json. I don’t know where this comes from.

Please provide more screenshots of your system. I think that would make it easier to see what’s going wrong / what you’re doing.

This is not a bolt.diy issue, this is an error being generated from the project you are trying to create in Bolt.diy. There is code that is looking for the meta.json file and it does not exist because the AI failed to create it. Maybe try again with a model like DeepSeek-V3, Gemini Flash 2.0, or ChatGPT o1.
