I have bolt.diy successfully installed on an Ubuntu system through WSL 2. I added what I believe to be the base API URL for Ollama to the .env.local file — I guess it's http://localhost:11434, but I'm not sure. I installed Ollama on the same Ubuntu system and ran it with `ollama serve`; it generated an API key, which I in turn added to my Ollama keys on ollama.com. However, bolt.diy doesn't show any of the models when I choose Ollama as the provider, and it gives an error when I try to run a prompt.
I did not use Docker Desktop.
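For reference, this is roughly what my .env.local looks like. I'm assuming the variable name `OLLAMA_API_BASE_URL` based on bolt.diy's .env.example — please correct me if that's the wrong name:

```shell
# .env.local sketch for bolt.diy (assumption: the variable is named
# OLLAMA_API_BASE_URL as in bolt.diy's .env.example).
# A local Ollama instance normally needs no API key.
OLLAMA_API_BASE_URL=http://localhost:11434
```

To rule out Ollama itself, I can hit `curl http://localhost:11434/api/tags` from the same Ubuntu shell, which should list the locally installed models if the server is reachable.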