n8n / Coolify / Ollama

Hi, I have an existing VPS running Coolify to manage some apps/containers.
I have an n8n container. I added an Ollama container (on the same Docker network), but I am not seeing the LLMs I pulled into Ollama in n8n. Does anyone with Coolify experience have a clue?

TIA

Bob

Hi,

My guess: Ollama is not bound to the correct address.

Ollama binds to 127.0.0.1 on port 11434 by default. Change the bind address with the OLLAMA_HOST environment variable.

So set OLLAMA_HOST=0.0.0.0 so Ollama listens on all interfaces, then point n8n at the container name, e.g. http://ollama:11434 (or whatever yours is called).
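
A minimal sketch of how that could look in Compose, assuming the service is named `ollama` and the shared network is called `coolify` (both names are placeholders — use whatever your Coolify setup actually shows):

```yaml
services:
  ollama:
    image: ollama/ollama:latest
    environment:
      # Listen on all interfaces so other containers can reach the API
      - OLLAMA_HOST=0.0.0.0
    volumes:
      # Persist pulled models across container restarts
      - ollama-data:/root/.ollama
    networks:
      - coolify

volumes:
  ollama-data:

networks:
  coolify:
    external: true
```

In the n8n Ollama credentials, the base URL would then be `http://ollama:11434` (the service name on the Docker network, not localhost).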

Thanks, I will give it a try. What about having Ollama pull the models I want when it starts up? I have to do it manually every time.

Bolt lists all models available in Ollama (same as `ollama list`). When they are not running, it will spin them up, which can take some time, so that does not need to be done manually.

You just need to pull the models you need once, so they stay available.
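
If the models vanish after a restart, that usually means the container has no persistent volume for /root/.ollama (see the Compose sketch above). If you still want Ollama to pull models automatically on startup, here is a rough sketch of an entrypoint override — the model names are just examples:

```sh
#!/bin/sh
# Start the Ollama server in the background, wait briefly for the API,
# pull any missing models, then keep the server in the foreground.
ollama serve &
sleep 5  # crude wait for the API to come up
for model in llama3.2 nomic-embed-text; do
  ollama pull "$model"
done
wait
```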

I tried that, but it didn't work.

@aliasfox, maybe you can help here, as I am not that deep into n8n yet.