Hi, I have an existing VPS running Coolify to manage some apps/containers.
I have an n8n container. I added an Ollama container (on the same Docker network), but I am not seeing the LLMs I pulled into Ollama in n8n. Does anyone have experience running Coolify who could give me a clue?
Bolt does list all available models within Ollama (“ollama list”). When they are not running, it will spin them up, which can then take some time.
So that does not need to be done manually.
You just need to pull the models you need once, so they are available.
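For reference, a minimal sketch of that one-time pull, assuming the Ollama container is named “ollama” (the model name here is just an example, swap in whatever you need):

```sh
# Assumes the Ollama container is named "ollama"; "llama3" is only an example model
docker exec -it ollama ollama pull llama3   # one-time download
docker exec -it ollama ollama list          # confirm the model is available
```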
@bobm67 Sorry I forgot to follow up here! What is the URL that you have set for Ollama right now? If your Ollama container is called “ollama” and is on the same Docker network, then your URL should work if it’s http://ollama:11434 (Ollama’s default port).
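One quick way to sanity-check that URL is from inside the n8n container itself. A sketch with assumptions: the n8n container is named “n8n”, and its image includes busybox wget (the official Alpine-based image should):

```sh
# GET /api/tags returns the models Ollama has pulled locally;
# if this prints JSON, n8n can reach Ollama over the shared network
docker exec -it n8n wget -qO- http://ollama:11434/api/tags
```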
It is named ollama (I set that in the docker-config.yml) and it is up and running on the same network (both in the yml, and the Coolify network is checked for both containers).
Still, my n8n and WebUI containers don’t see any of the models (they were successfully downloaded).
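One way to check whether the containers really share a network (hedged: “coolify” is only a guess at the network name and “n8n” at the container name; substitute your own):

```sh
docker network ls                 # find the network both containers should share
docker network inspect coolify    # both containers should be listed under "Containers"
# If one of them is missing, you can attach it manually:
docker network connect coolify n8n
```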
When you say the same Docker network, do you mean they are in the same Docker Compose stack? Both containers need to be defined in the same docker-compose.yml file for the container name (like referencing “ollama”) to resolve.
It’s a requirement if you want to reference the container name directly (like “ollama”). If they are running separately, you’ll have to reference the IP of whatever is hosting the container. I don’t have experience with Coolify specifically, so I don’t know exactly how that is set up, but I hope this points you in the right direction.
Otherwise, to simplify things, I would run everything together in one docker-compose.yml file!
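A minimal sketch of what that combined file could look like (assumptions on my part: official images, default ports, and a named volume for model storage; Compose puts both services on one default network, so n8n can reach Ollama at http://ollama:11434 by service name):

```yaml
services:
  ollama:
    image: ollama/ollama            # serves the API on port 11434 inside the network
    volumes:
      - ollama_data:/root/.ollama   # persist pulled models across restarts
  n8n:
    image: n8nio/n8n
    ports:
      - "5678:5678"                 # n8n UI exposed on the host
    depends_on:
      - ollama
volumes:
  ollama_data:
```

Inside n8n you would then point the Ollama credential’s base URL at http://ollama:11434.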
Was your issue resolved? Being a non-techie, I was also inclined to use Coolify and self-hosting. There is a YouTube channel, ‘Syntax’, and they recently posted an hour-and-a-half-long video on Coolify… it seems straightforward, but your issue would educate non-techies like me.