Hello,
I'm trying to install local-ai-packaged from git, and I want to avoid installing another Supabase, Open WebUI, and Ollama because I already have those running on different machines. So I commented out the lines for those services in the .env, docker-compose.yml, and start_services.py and tried to run the setup. But the n8n-import container is failing:
Running: docker compose -p localai -f docker-compose.yml up -d
[+] Running 8/8
✔ Network localai_default Created 0.0s
✔ Container flowise Started 0.4s
✔ Container qdrant Started 0.4s
✘ Container n8n-import service "n8n-import" didn't complete successfully: exit 1 4.5s
✔ Container caddy Started 0.3s
✔ Container redis Started 0.5s
✔ Container searxng Started 0.4s
✔ Container n8n Created 0.0s
service "n8n-import" didn't complete successfully: exit 1
Traceback (most recent call last):
File "/root/local-ai-packaged/start_services.py", line 242, in <module>
main()
File "/root/local-ai-packaged/start_services.py", line 239, in main
start_local_ai(args.profile)
File "/root/local-ai-packaged/start_services.py", line 74, in start_local_ai
run_command(cmd)
File "/root/local-ai-packaged/start_services.py", line 21, in run_command
subprocess.run(cmd, cwd=cwd, check=True)
File "/root/.pyenv/versions/3.12.9/lib/python3.12/subprocess.py", line 573, in run
raise CalledProcessError(retcode, process.args,
subprocess.CalledProcessError: Command '['docker', 'compose', '-p', 'localai', '-f', 'docker-compose.yml', 'up', '-d']' returned non-zero exit status 1.
I narrowed it down to the Postgres (Supabase) connection: if I remove the environment variables from docker-compose.yml, the script runs, but the n8n flows don't get imported. So how can I tell n8n to use my external database? I tried "POSTGRES_HOST=192.168.10.252" in the .env, and also "POSTGRES_HOST=http://192.168.10.252", to point it at my external Supabase, but neither works. Any tip on how to set this up?
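For reference, this is roughly what I tried adding to the n8n service in docker-compose.yml, based on n8n's documented `DB_POSTGRESDB_*` environment variables (the database name, user, and port here are placeholders for my setup, not values from the repo):

```yaml
# docker-compose.yml — n8n service (same vars would go on n8n-import)
  n8n:
    environment:
      - DB_TYPE=postgresdb
      - DB_POSTGRESDB_HOST=192.168.10.252   # bare IP or hostname only, no http:// prefix
      - DB_POSTGRESDB_PORT=5432
      - DB_POSTGRESDB_DATABASE=postgres
      - DB_POSTGRESDB_USER=postgres
      - DB_POSTGRESDB_PASSWORD=${POSTGRES_PASSWORD}
```

One thing I did realize while testing: Postgres speaks its own wire protocol, not HTTP, so the host value should never include an `http://` scheme, which probably explains why my second attempt failed.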
Thank you very much!