Unable to get Ollama to work on bolt.diy

I've already tried almost everything.
I've watched all of the videos and read all of the documentation.

But my Ollama models do not show up.

I have Ollama installed locally and use it via OpenWebUI.

I tried setting the Ollama API base URL to:
http://host.docker.internal:3000
http://127.0.0.1:3000
both in the web UI and in the .env.local file.

None of them work.

I also added the variables in the System…
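
For reference, this is roughly what the entry in .env.local looks like (OLLAMA_API_BASE_URL is the variable name as I understand it from bolt.diy's .env.example; I tried the two base URLs one at a time):

```bash
# .env.local in the bolt.diy project root
# (variable name assumed from bolt.diy's .env.example)
OLLAMA_API_BASE_URL=http://host.docker.internal:3000
# also tried:
# OLLAMA_API_BASE_URL=http://127.0.0.1:3000
```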

I get bolt.diy running locally with the pnpm run dev command and it works fine — I tried it with the Google API — but my local LLM models from Ollama do not show up.

If you say all videos, I assume that includes my Ollama video on YouTube?

You can also try my Docker stack. At least the NVIDIA version should work fine (AMD is still a WIP, as the GPU is not working at the moment, so it runs on CPU only).
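
Independent of that, it may be worth double-checking that the Ollama API itself is reachable at the URL you give bolt.diy: by default Ollama listens on port 11434 (port 3000 is usually OpenWebUI, which sits in front of Ollama), and host.docker.internal only resolves from inside a container. A quick sanity check from the machine running bolt.diy, assuming the default port:

```bash
# Ask the Ollama API to list installed models;
# a JSON "models" array means the API is reachable at this address.
curl http://127.0.0.1:11434/api/tags
```

If that returns your models, that address (with whichever host name applies to your setup) is what belongs in the Ollama base URL setting.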