New bolt.diy ollama not working (Errors within)

`WARN Constants Failed to get Ollama models: fetch failed`
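In case it helps with debugging: a quick way to check whether Ollama is actually reachable from the machine running bolt.diy is to hit its model-list endpoint directly (assuming the default port 11434; adjust if you changed it):

```shell
# /api/tags lists the models installed locally in Ollama.
# If this hangs or errors, bolt.diy's model fetch will fail the same way.
curl http://127.0.0.1:11434/api/tags
```

If that returns JSON with a `models` array, the server itself is fine and the problem is between bolt.diy and Ollama (base URL, proxy, firewall) rather than the Ollama install.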

After a few seconds the models are fetched, but it still doesn't function. I also tried the Mistral and LM Studio providers, but neither of them works for me either. They always return an error saying they're not responding, and I'm also sometimes getting this:

https://i.imgur.com/S4ydJDm.png

It acts like it's thinking but NEVER moves forward.

https://i.imgur.com/ctxwJY5.png

For testing, I disabled EVERYTHING in .env.local and in Settings except Ollama: https://i.imgur.com/ojzLb96.png
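For reference, after disabling everything else, the only provider line left in my .env.local looks roughly like this (variable name taken from bolt.diy's .env.example; double-check yours matches):

```
# .env.local — all other provider keys commented out for testing
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```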

https://i.imgur.com/g7vKRmE.png

The biggest error I could find when sending a prompt to Ollama: https://i.imgur.com/r5VZGwh.png

Mistral just gives a basic "couldn't respond" error.

Side note: refreshing the localhost:5173 tab is EXTREMELY slow, so I'm wondering if something else is going on…

Here is my debug info:
```json
{
  "OS": "Win32",
  "Browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/131.0.0.0 Safari/537.36",
  "ActiveFeatures": [],
  "BaseURLs": {},
  "Version": "8f3b4cd08249d26b14397e66241b9d099d3eb205"
}
```

Hmmm… it almost seems like something is off with your Ollama installation. Maybe try a reinstall? I know that's a pretty generic suggestion, haha, but it doesn't seem like an issue in bolt.diy!

But it won't return results.

I reinstalled Ollama, but I've never had an issue with it before; I've even used it with IDE tools like Cline with zero issues.


Gotcha! I think there was actually a small bug introduced in a PR yesterday. I would try again if you could!
