I have set everything up, and when I open the OpenAI node I change the base URL as mentioned. But it also needs an API key, and whatever I put in, it says "not found". When I type in the chat I get a "model not found" error. What am I doing wrong?
I am using Linux Mint 22.1 and I followed the steps below:
-Installed Docker (also Desktop)
-Set up Supabase
-Cloned the local AI repo
-Ran the start services script
-Went to the n8n template and tried testing it
I am also facing some problems with Docker Desktop: sudo docker ps returns all active containers, but the Desktop app shows nothing. I tried switching contexts but that didn't help.
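In case it helps anyone hitting the same mismatch: Docker Desktop on Linux runs its own daemon in a VM, separate from the system daemon that `sudo docker ps` talks to, so the two views can show different containers. A sketch of inspecting and switching contexts (the context names below are the usual defaults; verify them with the list command first):

```shell
# List available contexts; Docker Desktop usually registers one named
# "desktop-linux", while the system daemon is the "default" context.
docker context ls

# Point the CLI at the system daemon (where the compose stack is running)
docker context use default

# Or point it at Docker Desktop's embedded daemon instead
docker context use desktop-linux
```

Note that containers started under one context will not appear under the other, so the Desktop app showing nothing is expected if the stack was started against the system daemon.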
I did a fresh install again on Mint 22.1, and if someone has managed to run the local AI stack on Mint I would greatly appreciate the exact steps they followed.
So firstly, I installed Docker (not Desktop) because I am on Linux and I find the app a little buggy. From what I found online you can do everything perfectly fine from the CLI.
I am running Linux Mint 22.1 XFCE and I followed the instructions in your README.md file.
After the containers are up and running, I go to my n8n flow (v3 local agentic RAG) at localhost:5678:
-I add my Postgres credential (db, postgres_password).
-I run the 2 nodes and the 2 tables are created successfully.
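For anyone who wants to double-check the same thing from the CLI: you can confirm the tables actually landed in Postgres by opening psql inside the database container. A sketch, assuming the container is named supabase-db and the user/database are the stack defaults (adjust to whatever is in your .env):

```shell
# Open an interactive psql session inside the Supabase Postgres container
sudo docker exec -it supabase-db psql -U postgres -d postgres

# Then, at the psql prompt, list the tables in the public schema:
#   \dt public.*
```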
Now I go to the OpenAI node:
-I change the base URL to ollama:11434, put in a random API key, and click save. (I get a "not found" error.)
-I also change the embedding node's URL to ollama:11434.
(I can't post the full links, "http + localhost + port", because the forum doesn't allow multiple links from new members.)
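Since the n8n nodes reach Ollama over the internal Docker network, a quick way to rule out the base URL itself is to hit the endpoint both from the host and from inside the n8n container. A sketch, using the container names from my docker ps output further down; /api/tags is Ollama's standard model-listing route:

```shell
# From the host: check that Ollama answers on the published port
curl http://localhost:11434/api/tags

# From inside the n8n container: check that the "ollama" service name
# resolves on the compose network (the n8n image is Alpine-based, so
# busybox wget is available where curl may not be)
sudo docker exec -it n8n wget -qO- http://ollama:11434/api/tags
```

If the first command works but the second fails, the two containers are not on the same Docker network, which would also explain a "not found" error when saving the credential.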
After the credentials setup, I put a file in the shared directory while the flow is running and it does not trigger at all.
So I just send a chat to test it.
I get an error from Ollama and something from Postgres.
So this is my setup and these are my problems.
Because I can't attach my pictures, I compiled them in this Imgur album:
Thanks for the reply, Cole! It solved my error. But now I have a simpler problem: it says qwen2.5:14b-8k not found, try pulling it. I guess I need to go into the Ollama container and run a command there. I only have the Docker CLI, not Desktop.
I have 2 containers related to ollama:
ollama
ollama-pull-llama
(Sorry for the dumb question, but I'm afraid of breaking my installation.)
Edit 1: Also, I opened the OpenAI node again and somehow the selected model is qwen2.5:7b-instruct-q4_K_M instead of qwen2.5:14b-8k. I haven't executed any command yet. I am testing the node with that model and it hasn't thrown an error yet. It has been loading for some time, which I guess is a good sign.
Edit 2: I got "Request timed out."
I will try to execute the command @leex279 suggested, but I first need to find out which container is the correct one.
@anicitos007 "ollama" is the correct one; the other one just spins up to load the initial model Cole configured. You could also change the pull command for that one in docker-compose.yml and re-run it. Both work, but my command is faster if you do this often and don't want to edit the file every time.
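For reference, the exec form of the pull looks like this (a sketch: the tag you pull must actually exist on the Ollama registry, so I'm using the plain qwen2.5:14b tag here; if the -8k suffix in the error is a custom context-window alias, it has to be created locally with `ollama create` from a Modelfile rather than pulled):

```shell
# Run the pull inside the already-running "ollama" container
sudo docker exec -it ollama ollama pull qwen2.5:14b

# List the models the container now has available
sudo docker exec -it ollama ollama list
```

This won't break the installation: pulling only downloads model weights into the container's volume and leaves everything else untouched.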
Edit: for debugging purposes, here is my docker ps output.
nik@nik-pc:~/local-ai-packaged$ sudo docker ps
CONTAINER ID IMAGE COMMAND CREATED STATUS PORTS NAMES
dbd1b0d7096b n8nio/n8n:latest "tini -- /docker-ent…" About an hour ago Up About an hour 0.0.0.0:5678->5678/tcp, [::]:5678->5678/tcp n8n
f3080b5ae4e0 flowiseai/flowise "/bin/sh -c 'sleep 3…" About an hour ago Up About an hour 0.0.0.0:3001->3001/tcp, [::]:3001->3001/tcp flowise
e187e8321428 caddy:2-alpine "caddy run --config …" About an hour ago Up About an hour caddy
8f395054925c valkey/valkey:8-alpine "docker-entrypoint.s…" About an hour ago Up About an hour 6379/tcp redis
6f9c04b558c9 searxng/searxng:latest "/sbin/tini -- /usr/…" About an hour ago Restarting (1) 45 seconds ago searxng
dd86928ade17 qdrant/qdrant "./entrypoint.sh" About an hour ago Up About an hour 0.0.0.0:6333->6333/tcp, [::]:6333->6333/tcp, 6334/tcp qdrant
3b58a4ed58db ghcr.io/open-webui/open-webui:main "bash start.sh" About an hour ago Up About an hour (healthy) 0.0.0.0:3000->8080/tcp, [::]:3000->8080/tcp open-webui
e9b953ac8456 supabase/storage-api:v1.19.3 "docker-entrypoint.s…" About an hour ago Up About an hour (healthy) 5000/tcp supabase-storage
3bd9b24506bd supabase/edge-runtime:v1.67.2 "edge-runtime start …" About an hour ago Up About an hour supabase-edge-functions
682fe0a214b1 supabase/supavisor:2.4.12 "/usr/bin/tini -s -g…" About an hour ago Restarting (1) 7 seconds ago supabase-pooler
2800ba86da71 supabase/realtime:v2.34.40 "/usr/bin/tini -s -g…" About an hour ago Up About an hour (healthy) realtime-dev.supabase-realtime
f0f38bb08cea postgrest/postgrest:v12.2.8 "postgrest" About an hour ago Up About an hour 3000/tcp supabase-rest
0cea0ffcbfae supabase/studio:20250224-d10db0f "docker-entrypoint.s…" About an hour ago Up About an hour (healthy) 3000/tcp supabase-studio
c215e7c08a4c supabase/gotrue:v2.170.0 "auth" About an hour ago Up About an hour (healthy) supabase-auth
299f79f59d9b kong:2.8.1 "bash -c 'eval \"echo…" About an hour ago Up About an hour (healthy) 0.0.0.0:8000->8000/tcp, [::]:8000->8000/tcp, 8001/tcp, 0.0.0.0:8443->8443/tcp, [::]:8443->8443/tcp, 8444/tcp supabase-kong
bda7e0008d3f supabase/postgres-meta:v0.86.1 "docker-entrypoint.s…" About an hour ago Up About an hour (healthy) 8080/tcp supabase-meta
f5d05aa99fe2 supabase/logflare:1.12.5 "sh run.sh" About an hour ago Up About an hour (healthy) 0.0.0.0:4000->4000/tcp, [::]:4000->4000/tcp supabase-analytics
894c31ae0418 supabase/postgres:15.8.1.049 "docker-entrypoint.s…" About an hour ago Up About an hour (healthy) 5432/tcp supabase-db
649513213de6 darthsim/imgproxy:v3.8.0 "imgproxy" About an hour ago Up About an hour (healthy) 8080/tcp supabase-imgproxy
9f214bc40b96 timberio/vector:0.28.1-alpine "/usr/local/bin/vect…" About an hour ago Up About an hour (healthy) supabase-vector
38600e174752 ollama/ollama:latest "/bin/ollama serve" 2 days ago Up About an hour 0.0.0.0:11434->11434/tcp, [::]:11434->11434/tcp ollama
It seems that my laptop can't run the model. I used the most lightweight model, qwen2.5:0.5b, and I get a response when I chat, but I still have some problems with the knowledge base.
I will install Linux on a drive on my more powerful computer, and if I come across any problems I will open a new post. Thanks for all the help @ColeMedin @leex279. You can close this post now.