Hosting Ollama on a different machine/server

Hey guys, so I currently have a dedicated AI server running Ollama, but bolt.diy is running on my local machine.

Ollama is accessible over the network, but I need to set the Ollama Base URL, and it seems like I can only do that for LM Studio?

Is there maybe a file I can set it in?

I've tried creating a .env file in the project root with
OLLAMA_API_BASE_URL="MY.SERVER.IP:11434"
but that didn't work either. Am I missing something?

Hi @neyloxog,

Yes, you can set it in the .env file:
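For example, a minimal .env entry could look like this (the IP is a placeholder; include the `http://` scheme so the URL parses correctly):

```shell
# .env in the bolt.diy project root; replace the IP with your Ollama server's address
OLLAMA_API_BASE_URL=http://192.168.1.10:11434
```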

That's correct, as far as I can see. What errors do you get? I'd guess you have cross-origin problems and/or your Ollama is not configured to allow connections from different hosts.
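For reference, these are the two Ollama environment variables usually involved here. This is just a sketch, set on the Ollama server (not the bolt.diy machine) before starting Ollama; the exact place to put them depends on how you run Ollama (systemd service vs. manual start):

```shell
# Listen on all interfaces instead of only 127.0.0.1,
# so other machines on the network can reach Ollama:
export OLLAMA_HOST=0.0.0.0:11434

# Allow cross-origin requests; "*" allows any origin:
export OLLAMA_ORIGINS="*"

# Then restart Ollama, e.g.:
ollama serve
```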

You can check out my video, if you haven't already:

If you still get problems, let me know here.

The console tells me this:
[screenshot]

But open-webui, for example, accepts this URL and works with my server…

In your video you install bolt.diy and Ollama on the same machine, right?

And thanks for your help!


You're missing a 4 in your port :wink:
It's 11434, not 1143, if you didn't change it :wink:
[screenshot]
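As an aside, a tiny sanity check like this would catch both mistakes from this thread (a missing scheme and a wrong port). `check_ollama_url` is just a hypothetical helper for illustration, not part of bolt.diy:

```python
from urllib.parse import urlparse

def check_ollama_url(url: str) -> list[str]:
    """Return a list of likely problems with an Ollama base URL."""
    problems = []
    if "://" not in url:
        problems.append("missing scheme (e.g. http://)")
        url = "http://" + url  # urlparse needs a scheme to find the port
    port = urlparse(url).port
    if port != 11434:
        problems.append(f"port is {port}; Ollama's default is 11434")
    return problems

print(check_ollama_url("http://192.168.1.10:1143"))   # flags the missing digit
print(check_ollama_url("http://192.168.1.10:11434"))  # → []
```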

Yes, I do everything locally in this case, but I show the variables you need to set for the origin and a different host, as well as the Ollama docs where you can find more info.


Oh man, you're absolutely right! I didn't even question it since I copied it from the open-webui .env, but I missed the last digit :grimacing: Thank you!


@leex279 Sorry to bother you again. The connection to Ollama is working now and I can chat with it, and it answers, BUT when I ask it to code something I get this error.

As you can see, it also tells me I don't have an API key set, but I guess that's because I'm running it locally? Or do I need to set the BASE_URL and a key? If so, how would I get an API key from my Ollama server?

You can ignore the API key; it's not needed. It's only in the UI because all providers have that field and it was never removed for Ollama.
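For what it's worth, you can verify that Ollama's own REST API takes no key at all. A quick sketch against your server (IP and model name are placeholders, adjust to your setup):

```shell
# No Authorization header or key of any kind is needed:
curl http://192.168.1.10:11434/api/generate \
  -d '{"model": "llama3", "prompt": "Hello", "stream": false}'
```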

The error is most likely because it's not working without HTTPS, see also:

I still haven't investigated this deeper, but maybe it helps you get it fixed :slight_smile:
