Trouble understanding Ollama base API URL and API key

I have bolt.diy successfully installed on an Ubuntu system through WSL 2. I have what I believe to be the base API URL for Ollama, which I guess is http://localhost:11434, but I'm not sure; I added it to the .env.local file. I installed Ollama on the Ubuntu system and ran it with ollama serve, which generated an API key that I then added to my Ollama keys on ollama.com. But bolt.diy doesn't show any of the models when I choose Ollama, and it gives an error when I try to run it.
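In case it helps, here's roughly how I'm checking whether that base URL is even reachable from Node: a quick TypeScript sketch using the built-in fetch against Ollama's /api/tags endpoint (which lists locally pulled models). I'm not certain OLLAMA_API_BASE_URL is the exact variable name bolt.diy expects, so treat that part as my guess.

```typescript
// Minimal connectivity check for a local Ollama server (my own sketch, not bolt.diy code).
// GET /api/tags is the Ollama endpoint that lists locally pulled models.
const baseUrl = (process.env.OLLAMA_API_BASE_URL ?? 'http://localhost:11434').trim();

async function listOllamaModels(): Promise<void> {
  const res = await fetch(`${baseUrl}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with HTTP ${res.status} at ${baseUrl}/api/tags`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  if (data.models.length === 0) {
    console.log('Ollama is reachable, but no models are pulled yet.');
  } else {
    console.log('Available models:', data.models.map((m) => m.name).join(', '));
  }
}

listOllamaModels().catch((err) => console.error('Could not reach Ollama:', err));
```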

I did not use Docker Desktop.

I would probably just suggest running it on straight Windows using the npm install pnpm and pnpm install method. I can only imagine the headache of getting it to run within WSL.

Have you entered the host for Ollama in Bolt.diy via settings…?


Make sure there are no trailing spaces after the URL. I did this and had a hell of a time.

I added this cleanup to the code itself for newer releases. :sweat_smile:
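Something along these lines, for anyone curious (a rough sketch of the idea, not the exact code in the repo): just trim whitespace from the configured URL before it's used, and drop a trailing slash so later path concatenation stays consistent.

```typescript
// Sketch of the cleanup idea, not the actual bolt.diy implementation:
// strip stray whitespace (including trailing spaces) from the configured base URL.
function normalizeBaseUrl(raw: string | undefined, fallback = 'http://localhost:11434'): string {
  const cleaned = (raw ?? fallback).trim();
  // Drop any trailing slash so `${baseUrl}/api/...` doesn't end up with `//`.
  return cleaned.replace(/\/+$/, '');
}

// e.g. ' http://localhost:11434/ '  ->  'http://localhost:11434'
const ollamaBaseUrl = normalizeBaseUrl(process.env.OLLAMA_API_BASE_URL);
console.log(ollamaBaseUrl);
```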
