Failed creating file

First of all, I'm sorry if I'm reposting this issue, but can anyone help me solve this problem?

I'm using a MacBook Pro M2 Pro. This is a fresh, clean install of bolt.diy, run with pnpm run dev (no Docker). I can access localhost:5173, and the LLM is connected to OpenAI GPT-4o Mini; I have tried Llama 3.2 as well. I prompted "Create me a simple webapp", and the result always looks like this.
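For reference, the fresh install was roughly these steps (the repository URL is left as a placeholder; follow the README for the exact commands):

git clone <bolt.diy repository URL>
cd bolt.diy
pnpm install
pnpm run dev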

Can anyone help?

Hey,

at first => You should specify your prompt a bit more: what app it should implement and for what purpose :smiley: => "Create a simple webapp to manage my daily todos".

Do you have any errors in the DevTools console and/or the shell/terminal?

Are you on the stable branch? => Please post the Debug Tab info


(enable it under "Features" first, if you don't see it)

Debug info:

{
  "System": {
    "os": "macOS",
    "browser": "Chrome 131.0.0.0",
    "screen": "1512x982",
    "language": "en-US",
    "timezone": "Asia/Jakarta",
    "memory": "4 GB (Used: 103.14 MB)",
    "cores": 10,
    "deviceType": "Desktop",
    "colorDepth": "30-bit",
    "pixelRatio": 2,
    "online": true,
    "cookiesEnabled": true,
    "doNotTrack": false
  },
  "Providers": [
    {
      "name": "Ollama",
      "enabled": true,
      "isLocal": true,
      "running": true,
      "lastChecked": "2024-12-19T09:01:45.921Z",
      "responseTime": 14.939999997615814,
      "url": "http://localhost:11434"
    },
    {
      "name": "OpenAILike",
      "enabled": true,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2024-12-19T09:01:45.906Z",
      "url": null
    },
    {
      "name": "LMStudio",
      "enabled": true,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2024-12-19T09:01:45.906Z",
      "url": null
    }
  ],
  "Version": {
    "hash": "50e6778",
    "branch": "main"
  },
  "Timestamp": "2024-12-19T09:01:46.799Z"
}

Here's what it looks like in Chrome:

And here's what's in my terminal log when I ran the prompt:

Thanks for the info. My guess is that the model you are using is too small (just 3B) and bolt doesn't work well with such small models yet.
I can't test it myself at the moment, because I don't have the hardware/setup to run it properly.

Maybe @aliasfox / @thecodacus can test/help.

Did you test with an external provider? I think that makes more sense if you also don't have the hardware to run bigger models.


I already filled in the .env.local API keys for OpenAI, Anthropic and Gemini; none of them works. Same result, it fails to create the file.
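Roughly like this, with placeholder values (the exact variable names come from bolt.diy's .env.example and may differ between versions, so double-check that file):

OPENAI_API_KEY=sk-...
ANTHROPIC_API_KEY=sk-ant-...
GOOGLE_GENERATIVE_AI_API_KEY=...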

Please try to work with the stable branch for now, to be sure you're working on a well-tested state.

Is it really the same error, though? In your screenshot above a package.json was created and it tried to install it within the WebContainer.


Sorry, but how do I get the stable branch?

Forgive me for my lack of knowledge :sweat_smile:

No problem, we are all here to learn and grow together.

In the terminal, inside the directory where you cloned the repository:

git checkout stable
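After switching branches it usually helps to reinstall dependencies and restart the dev server (a rough sketch, assuming the default pnpm setup used above):

git fetch origin
git checkout stable
pnpm install
pnpm run dev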

Thanks, I will try it later and update you.

Hi, I see you are using Llama 3.2 3B, which is a really small model and might not be able to create an entire project. Try an external provider; there are free ones like Google, TogetherAI and Mistral. These will give you better results.

Tried with Anthropic and it still fails.

I tried it with the stable version, and it still fails. I still have no clue; I tried on my PC (Windows) and it gives the same problem. It couldn't create/write any file.

On the Mac I already gave Chrome and Terminal permission to access the full storage, but it still does not work.

I tried the "bolt.new" GitHub repo and ran it locally using Anthropic, and it works. It was able to create files on my PC.

OK everyone, it is all my fault. It turns out I missed one step from the GitHub instructions, shown in this picture:

Thank you so much for your time, and sorry it turned out to be just this one step I missed.

I think that's not necessary anymore. It works fine for me in the current Chrome version.


Which Chrome version are you currently using?

Version 131.0.6778.139 (Official Build) (64-bit)