Local Ollama not working

The Settings part of the UI can see the local Ollama,

but I still can't select any of the models.

My .env file has

OLLAMA_API_BASE_URL=http://127.0.0.1:11434

(that URL is where I get "Ollama is running")
and
DEFAULT_NUM_CTX=4096
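
You can verify the service itself with curl; the root URL answers "Ollama is running" and /api/tags should list your pulled models:

curl http://127.0.0.1:11434
curl http://127.0.0.1:11434/api/tags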

I use Docker to run this.


Here is my main screen.
Other info: I use Docker on Linux, but I guess this is not the issue.


Hi,
maybe the base URL is the problem. I think for Docker it should be host.docker.internal:11434.
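
Note that on Linux, host.docker.internal does not exist by default; as far as I know you have to map it to the host gateway yourself, e.g. in your compose file (the service name "bolt" here is just an example):

services:
  bolt:
    extra_hosts:
      - "host.docker.internal:host-gateway"

or with plain docker run: --add-host=host.docker.internal:host-gateway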

You can also look at my docker-stack here:

But isn't that only for the case where you run Ollama from a container as well?
Here I have Ollama running natively as a Linux service …

Yes, within this stack everything is running in Docker.

Is there a reason why you don't want to run it in the Docker environment as well?

So first of all, thanks for helping me!!
Regarding Ollama: I already have it installed natively, up and running.
I have other applications like open-webui (also running in Docker) using this native Ollama service.

In my view, running bolt.diy from a Docker image shouldn't require the local Ollama to also run from Docker. At least, the documentation doesn't mention anything like that.

Open-webui running from Docker also works this way.

What’s strange is that the custom settings section detects it but still doesn’t display it in the menu. This could even be a bug?!


Correct, and I'm pretty sure it isn't needed. I just asked because I know that setup works.

I'm pretty sure I had this running like you at some point. I need to test it again and will come back to you here.

Maybe it's an issue with the main branch then. If you like, you could give it a try with the stable branch. If that works, then it's not a bug I guess, because this was working before. Then it's just a configuration thing.

I just tested it with the bolt.diy from my docker-stack (only started this container) + my local Ollama. It works without problems.

As shown in my YouTube video, make sure you allow your Ollama to accept connections from outside by setting the correct environment variables (host + origins).
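
For reference, when Ollama runs natively as a Linux systemd service, that means roughly this (a sketch via sudo systemctl edit ollama.service; 0.0.0.0 and * are deliberately permissive example values, restrict them if you need to):

[Service]
Environment="OLLAMA_HOST=0.0.0.0"
Environment="OLLAMA_ORIGINS=*"

then sudo systemctl daemon-reload && sudo systemctl restart ollama. The default port stays 11434.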

Hey, I am facing the same problem as the other user. I am running bolt.diy from a Docker image and I have the same Ollama issue. I have pulled some Ollama models but they are not displaying. Can you send me your YouTube video on how to allow Ollama to accept outside connections? Btw, I am using Windows.

Hey @dime-git, it's linked here:

3rd Video under my name.
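
On Windows the short version is (as far as I know) to set the same variables as user environment variables and restart the Ollama app, e.g. from a terminal:

setx OLLAMA_HOST 0.0.0.0
setx OLLAMA_ORIGINS *

setx only affects newly started processes, so quit Ollama from the tray and start it again afterwards.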

So I did some digging and I succeeded. Here is my roadmap (in case somebody else has this issue).
First of all, I did what leex279 suggested:
go back to the fixed version tag v0.0.6
git fetch
git tag
git checkout v0.0.6

This version is bundled with node:20.18.0.
You can either downgrade your local Node to match, or
change it to the latest LTS, since the documentation at GitHub - stackblitz-labs/bolt.diy: Prompt, run, edit, and deploy full-stack web applications using any LLM you want! says:
2. Download the "LTS" (Long Term Support) version for your operating system
(for the maintainer: maybe it's better to specify a concrete LTS version)
So for that you need to change your Dockerfile:

ARG BASE=node:22.14.0
# was node:20.18.0

After that:
npm run dockerbuild

Start your latest image (pass the same port 5173 as a parameter) and it will start the container.
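
For completeness, my run command looked roughly like this (the image tag depends on what your dockerbuild produced, so treat it as a sketch):

docker run -p 5173:5173 --env-file .env bolt-ai:production

If the container cannot reach 127.0.0.1 on the host, add --network=host or the host.docker.internal mapping mentioned earlier in this thread.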

After that you need to enable in the Settings:
Experimental Providers (button)
"Enable experimental providers such as Ollama, LMStudio, and OpenAILike."

and set the Ollama Base URL (http://127.0.0.1:11434) here in the settings as well (Providers part). Done, you can now select your local models.

I guess the devs are working on reading this URL from the .env file as well,
but that will only land in the next version: I have it included in my .env, but the value was not picked up from the file yet…


Yes, it was by mistake that these got lost in the current main branch. I think it's already fixed in a PR and will come to the main branch soon :slight_smile:


Somehow the two buttons at the bottom were also missing (in the latest version), or are they placed on the right side? (The Settings button was there.) Anyway, nice work!

I can see them (I just pulled the newest changes):