Already tried almost everything
I've watched all the videos and read all the documentation,
but my Ollama models still don't show up.
I have Ollama installed locally and use it via OpenWebUI.
I tried setting the Ollama API base URL to:
http://host.docker.internal:3000
http://127.0.0.1:3000
both in the web UI and in the .env.local file, but none of them work.
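For reference, this is roughly what my .env.local looks like. I'm assuming the variable name is OLLAMA_API_BASE_URL, since that's what I see in bolt.diy's .env.example; correct me if it should be something else:

```
# .env.local — variable name taken from bolt.diy's .env.example
# (an assumption on my part; the name may differ in other versions)
OLLAMA_API_BASE_URL=http://127.0.0.1:3000
```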
I also added the variables in the System…
I get bolt.diy running locally with the pnpm run dev command, and it works fine with the Google API, but my local LLM models from Ollama don't show up.
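In case it helps with debugging, this is how I start it, plus a quick sanity check that Ollama itself is answering — a sketch assuming Ollama is on its stock port 11434 (my URLs above point at OpenWebUI on 3000); /api/tags is Ollama's endpoint for listing installed models:

```
# start bolt.diy locally
pnpm run dev

# sanity check: ask Ollama directly which models it has installed;
# 11434 is Ollama's default port
curl http://127.0.0.1:11434/api/tags
```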