I have bolt.diy successfully installed on an Ubuntu system through WSL 2. I have what I believe to be the base API URL for Ollama, which I guess is http://localhost:11434 (though I'm not sure), and I added it to the .env.local file. I installed Ollama on the Ubuntu system and ran it using `ollama serve`. I also generated an API key that I in turn added to my Ollama keys on ollama.com. But bolt.diy doesn't show any of the models when I choose Ollama as the provider, and it gives an error when I try to run.
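One thing worth checking first (a minimal sketch, assuming Ollama is on its default port): the `/api/tags` endpoint is what lists your installed models, so if this check fails from wherever bolt.diy is running, bolt.diy won't see the models either.

```shell
# Probe Ollama's model-list endpoint; /api/tags returns the installed models.
# If this fails, bolt.diy has no way to discover your models either.
if curl -s --max-time 5 http://localhost:11434/api/tags > /dev/null; then
  echo "Ollama is reachable"
else
  echo "Ollama is NOT reachable at http://localhost:11434"
fi
```

If it prints "NOT reachable", the problem is networking (e.g. WSL vs. Windows localhost forwarding) rather than bolt.diy's configuration. Note also that `ollama serve` itself doesn't need an API key for local use; the ollama.com key is a separate thing, so the key shouldn't be what's blocking model discovery.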
I would probably just suggest running it on straight Windows using the `npm install pnpm && pnpm install` method. I can only imagine the headache of getting it to run within WSL.
Hello there, sorry to bump this topic, but no matter what I try I cannot get Ollama to show my models in bolt.diy. Everything else seems to work; I just want to work with local models.
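In case it helps anyone hitting the same wall, here is a minimal `.env.local` sketch. I'm assuming `OLLAMA_API_BASE_URL` is the variable name bolt.diy reads (check your repo's `.env.example` for the exact name, since it may differ across versions):

```
# Hypothetical .env.local fragment -- verify the variable name against .env.example
OLLAMA_API_BASE_URL=http://localhost:11434
```

Note the value should be just the base URL with no trailing slash and no `/api` suffix, and no API key is needed for a local Ollama instance.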