Hi, I have an existing VPS running Coolify to manage some apps/containers.
I have an n8n container. I added an Ollama container (on the same Docker network), but I am not seeing the LLMs I pulled into Ollama from within n8n. Anyone have experience running Coolify that might give me a clue?
Bolt does list all the models available in Ollama (`ollama list`). When they're not running, it will spin them up, which can take some time.
So nothing needs to be done manually.
You just need to pull the models you need once, so they are available.
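A sketch of what that one-time pull can look like, assuming the Ollama container is named `ollama` (adjust the name and model to your setup):

```shell
# Pull a model once inside the Ollama container; it persists in the container's data volume.
docker exec -it ollama ollama pull llama3

# Verify the model now shows up.
docker exec -it ollama ollama list
```

Also worth checking: since both containers share a Docker network, n8n should reach Ollama via the container name, not localhost — e.g. a base URL like `http://ollama:11434` in the n8n Ollama credentials. If n8n is pointed at `localhost`, it will look inside its own container and find no models.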