n8n / Coolify / Ollama

Hi, I have an existing VPS running Coolify to manage some apps/containers.
I have an n8n container. I added an Ollama container (on the same Docker network), but I am not seeing the LLMs I pulled into Ollama in n8n. Does anyone with experience running Coolify have a clue?

TIA

Bob


Hi,

my guess: Ollama is not bound to the correct address.

Ollama binds to 127.0.0.1 on port 11434 by default. You can change the bind address with the OLLAMA_HOST environment variable.

So bind it to 0.0.0.0, or reference the container by name ("ollama:11434"), or whatever you have.
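
As a rough sketch of what that could look like in a compose file (the service name, volume name, and port mapping here are just placeholders, not your Coolify config):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    environment:
      - OLLAMA_HOST=0.0.0.0:11434   # listen on all interfaces instead of 127.0.0.1
    ports:
      - "11434:11434"               # optional, only needed for access from the host
    volumes:
      - ollama_data:/root/.ollama   # keep pulled models across container restarts

volumes:
  ollama_data:
```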


Thanks, I will give it a try. What about having Ollama pull the models I want when it starts up? I have to do it manually every time.

Bolt does list all models available within Ollama ("ollama list"). When they are not running, it will spin them up, which can then take some time, so that does not have to be done manually.

You just need to pull the models you need once, so they are available.
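
If the Ollama data directory (/root/.ollama) is on a volume, a one-time pull survives restarts. A minimal example from the host, assuming the container is named "ollama" and using "llama3.2" purely as an example model:

```bash
# Pull a model once into the running Ollama container.
# Models land under /root/.ollama, so if that path is on a volume
# they remain available after the container restarts.
docker exec -it ollama ollama pull llama3.2

# Check which models are available afterwards.
docker exec -it ollama ollama list
```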


I tried that but it didn't work.

@aliasfox / @ColeMedin maybe you can help better, as I am not that deep into n8n yet

Checking again…
Any help would be appreciated.

@bobm67 Sorry I forgot to follow up here! What is the URL that you have set for Ollama right now? If your Ollama container is called “ollama” and is within the same Docker network, then your URL should work if it’s:

http://ollama:11434

Assuming your Ollama instance is running on port 11434 like it does by default.
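
If you want to verify the connection before touching the n8n credentials, a quick check from inside the n8n container (container names "n8n" and "ollama" are assumed here) could look like this:

```bash
# Ask Ollama for its model list from inside the n8n container.
# If this returns JSON listing your pulled models, http://ollama:11434
# should also work as the base URL in the n8n Ollama credentials.
# (busybox wget is used since the n8n image may not ship curl.)
docker exec -it n8n wget -qO- http://ollama:11434/api/tags
```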


Thanks Cole,

It is named ollama (I am setting that in the docker-config.yml) and it is up and running on the same network (both in the yml, and the Coolify network is checked for both containers).

Still, my n8n and WebUI containers don't see any of the models (they were downloaded successfully).

When you say the same Docker network, do you mean they are in the same Docker Compose stack? Both containers need to be defined in the same docker-compose.yml file for the container name (like referencing "ollama") to work.


They are not… in Coolify I created separate containers for each program. Is that a requirement? I figured I could update them separately, etc.

It's a requirement if you want to reference the container name directly (like "ollama"). If they are running separately, you'll have to reference the IP of whatever is hosting the container. I don't have experience with Coolify specifically, so I don't know exactly how that is set up, but I hope this points you in the right direction.

Otherwise, to simplify things, I would run everything together in one docker-compose.yml file!
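
For reference, a minimal combined stack might look roughly like this (images, ports, data paths, and volume names are illustrative defaults, not a Coolify-specific setup):

```yaml
# Sketch of running n8n and Ollama in one compose stack so that
# n8n can reach Ollama at http://ollama:11434 by service name.
services:
  n8n:
    image: n8nio/n8n:latest
    ports:
      - "5678:5678"
    volumes:
      - n8n_data:/home/node/.n8n
    depends_on:
      - ollama

  ollama:
    image: ollama/ollama:latest
    environment:
      - OLLAMA_HOST=0.0.0.0:11434
    volumes:
      - ollama_data:/root/.ollama

volumes:
  n8n_data:
  ollama_data:
```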


Was your issue resolved? Being a non-techie, I was also inclined to use Coolify and self-hosting. There is a YouTube channel called 'Syntax', and they recently posted an hour-and-a-half-long video on Coolify… It seems straightforward, but your issue would educate a non-techie like me.