See this! I'm getting an error.

What error are you getting? Is it a WSL problem?

"WARN Constants Failed to get Ollama models: fetch failed" is what I get when I try to use LM Studio. I guess I'll just use other APIs, since I can't figure it out lol

I’m having the exact same issue, version is stable, Terminal error is the same: “WARN Constants Failed to get Ollama models: fetch failed”.

I have disabled Ollama in the settings, and I'm trying with both OpenAI and Deepseek (I have credits available for both), but I keep getting the generic error with any prompt, plus the Ollama error in the terminal. Does anyone have any ideas how to solve this? I haven't been able to get bolt.diy to work.

{
  "System": {
    "os": "macOS",
    "browser": "Chrome 131.0.0.0",
    "screen": "1512x982",
    "language": "en-US",
    "timezone": "America/Mazatlan",
    "memory": "4 GB (Used: 134.87 MB)",
    "cores": 11,
    "deviceType": "Desktop",
    "colorDepth": "30-bit",
    "pixelRatio": 2,
    "online": true,
    "cookiesEnabled": true,
    "doNotTrack": false
  },
  "Providers": [
    {
      "name": "Ollama",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2024-12-27T23:37:44.197Z",
      "url": null
    },
    {
      "name": "OpenAILike",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2024-12-27T23:37:44.197Z",
      "url": null
    },
    {
      "name": "LMStudio",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2024-12-27T23:37:44.197Z",
      "url": null
    }
  ],
  "Version": {
    "hash": "eb6d435",
    "branch": "stable"
  },
  "Timestamp": "2024-12-27T23:37:45.237Z"
}

As written in the other topic you just commented on, please open a separate topic and put all the info in there; then we can investigate it.

“WARN Constants Failed to get Ollama models: fetch failed”

=> This is just a warning and can be ignored if you don't want to use Ollama. It has nothing to do with the other providers not working.
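If you do want local providers like Ollama or LM Studio to work, their base URLs need to be configured (the debug dump above shows "No URL configured" for all three). As a rough sketch only: the variable names below follow the pattern of bolt.diy's `.env.example` and may differ in your checkout, so check that file for the exact names.

```shell
# Hypothetical .env.local sketch; variable names are assumptions based on
# bolt.diy's .env.example and may differ in your version. Only needed if
# you actually want local providers.

# Ollama's default API endpoint
OLLAMA_API_BASE_URL=http://127.0.0.1:11434

# LM Studio's local server (start it from LM Studio's server tab first)
LMSTUDIO_API_BASE_URL=http://127.0.0.1:1234
```

If you only use cloud providers (OpenAI, Deepseek, etc.), you can leave these unset and ignore the warning, as noted above.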

Here it is: any prompt gives me "There was an error processing your request: An error occurred".

Thank you @leex279


I answered this in the other thread, but in your case you don't have enough system memory to run the bolt.diy development server.