How to set up using an external supabase install and external open-webui/ollama

Hello,

I'm trying to install the local-ai-packaged repo while avoiding another Supabase, Open WebUI and Ollama install, because I already have those running on different machines. So I commented out the lines for those services in the .env, docker-compose.yml and start_services.py and tried to get it set up. But the n8n-import container is failing:

Running: docker compose -p localai -f docker-compose.yml up -d
[+] Running 8/8
 ✔ Network localai_default  Created                                                                                                                   0.0s
 ✔ Container flowise        Started                                                                                                                   0.4s
 ✔ Container qdrant         Started                                                                                                                   0.4s
 ✘ Container n8n-import     service "n8n-import" didn't complete successfully: exit 1                                                                 4.5s
 ✔ Container caddy          Started                                                                                                                   0.3s
 ✔ Container redis          Started                                                                                                                   0.5s
 ✔ Container searxng        Started                                                                                                                   0.4s
 ✔ Container n8n            Created                                                                                                                   0.0s
service "n8n-import" didn't complete successfully: exit 1
Traceback (most recent call last):
  File "/root/local-ai-packaged/start_services.py", line 242, in <module>
    main()
  File "/root/local-ai-packaged/start_services.py", line 239, in main
    start_local_ai(args.profile)
  File "/root/local-ai-packaged/start_services.py", line 74, in start_local_ai
    run_command(cmd)
  File "/root/local-ai-packaged/start_services.py", line 21, in run_command
    subprocess.run(cmd, cwd=cwd, check=True)
  File "/root/.pyenv/versions/3.12.9/lib/python3.12/subprocess.py", line 573, in run
    raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['docker', 'compose', '-p', 'localai', '-f', 'docker-compose.yml', 'up', '-d']' returned non-zero exit status 1.

I narrowed it down to the Postgres (Supabase) connection: if I remove the environment variables from docker-compose.yml, the script does run, but the n8n flows don't get imported. So how can I tell n8n to use my external DB? I tried “POSTGRES_HOST=192.168.10.252” in the .env and also “POSTGRES_HOST=http://192.168.10.252” to point it at my external Supabase, but neither is working. Any tip on how to set this up?
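For reference, this is the kind of .env I'm trying. I'm assuming the compose file passes these POSTGRES_* values through to n8n's DB_POSTGRESDB_* variables; everything besides POSTGRES_HOST and POSTGRES_PORT is a placeholder name/value on my part:

# .env – pointing n8n at the external Supabase Postgres (placeholder values)
POSTGRES_HOST=192.168.10.252    # bare IP or hostname, no http:// scheme
POSTGRES_PORT=5432
POSTGRES_DB=postgres
POSTGRES_USER=postgres
POSTGRES_PASSWORD=change-me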

Thank you very much!


Take a look at my video at about 9:45. I use a local Ollama, and I think you may need the same “host.docker.internal…” in the configuration instead of the IP. I also think you're missing the port for your host:
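Roughly something like this is what I mean. It's an untested sketch that only applies if the database runs on the Docker host itself, and the extra_hosts entry plus the variable names are my assumptions rather than the repo's exact file:

# .env – use the special Docker host name instead of the LAN IP, and include the port
POSTGRES_HOST=host.docker.internal
POSTGRES_PORT=5432

# docker-compose.yml – make sure the n8n container can resolve that name
services:
  n8n:
    image: n8nio/n8n:latest
    extra_hosts:
      - "host.docker.internal:host-gateway"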


Thank you very much!
I thought the port was added automatically because of the “POSTGRES_PORT=5432” variable in the .env, which I did set. In the meantime I just commented out the whole bunch of Postgres stuff inside the x-n8n part of docker-compose.yml and added “- N8N_SECURE_COOKIE=false” so I can access it in the browser without setting up DNS/certificates.
The thing is, n8n-import now does nothing, so I imported the JSON files by hand and set the credentials, which I had to adjust to the external IPs anyway. It's all up and running now!
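Roughly, the x-n8n block now looks like this; the surrounding keys are from memory rather than a verbatim copy of the repo's compose file:

# docker-compose.yml – x-n8n shared config with the Postgres wiring disabled
x-n8n: &service-n8n
  image: n8nio/n8n:latest
  environment:
#    - DB_TYPE=postgresdb
#    - DB_POSTGRESDB_HOST=${POSTGRES_HOST}
#    - DB_POSTGRESDB_PORT=${POSTGRES_PORT}
#    - DB_POSTGRESDB_USER=${POSTGRES_USER}
#    - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
    - N8N_SECURE_COOKIE=false   # plain-HTTP browser access without DNS/certificates
# with the DB_* variables gone, n8n falls back to its default SQLite database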
