I have set everything up, and when I open the OpenAI node I change the base URL as mentioned. But it also needs an API key, and whatever I put in, it ends up saying "not found". When I type in the chat I get a "model not found" error. What am I doing wrong?
I am using Linux Mint 22.1 and I followed the steps below:
- Installed Docker (and also Docker Desktop)
- Set up Supabase
- Cloned the local AI repo
- Ran the start services script (I verify the containers came up with the quick check below this list)
- Went to the n8n template and tried testing it
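To confirm the stack actually came up after the start script, I check it roughly like this (container names depend on the compose project, so treat them as examples):

```bash
# List running containers with their status and published ports
sudo docker ps --format 'table {{.Names}}\t{{.Status}}\t{{.Ports}}'

# n8n should answer on port 5678 once its container is healthy
curl -I http://localhost:5678
```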
I am also facing some problems with Docker Desktop: sudo docker ps returns all the active containers, but the Desktop app shows nothing. I tried changing contexts, but that did not help.
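For context on the Desktop issue: as far as I understand, Docker Desktop on Linux runs its own engine in a VM under a separate context (desktop-linux), while sudo docker talks to the system engine, which would explain why the two show different container lists. What I tried when switching contexts was roughly:

```bash
# Show which contexts exist and which one the CLI currently uses
docker context ls

# Point the CLI at the Desktop engine (context name assumed to be "desktop-linux")
docker context use desktop-linux
docker ps

# Switch back to the system engine
docker context use default
sudo docker ps
```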
I did a fresh install of Mint 22.1 again, and if someone has managed to run the local AI stack on Mint, I would greatly appreciate the exact steps they followed.
What base URL are you using for Ollama? Or which ones have you tried?
Thanks for the reply!
So first, I installed Docker Engine (not Docker Desktop), because I am on Linux and I find the Desktop app a little buggy. From what I found online, you can do everything perfectly fine from the CLI.
I am running Linux Mint 22.1 XFCE and I followed the instructions in your README.md file.
After the containers are up and running, I go to my n8n flow (V3 local agentic RAG) at localhost:5678:
- I add my Postgres credential (db, postgres_password).
- I run the two nodes, and the two tables are created successfully.
Now I go to the OpenAI node:
- I change the base URL to ollama:11434, put in a random API key, and click Save. (I get a "not found" error; see the checks below this list.)
- I also change the URL on the Ollama embeddings node to ollama:11434.
- (I can't write out the full URLs, "http" + localhost + port, because the forum doesn't allow new members to post multiple links.)
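To sanity-check the base URL, I have been hitting Ollama directly. This assumes the container is named ollama and listens on the default port 11434, and that Ollama's OpenAI-compatible endpoints live under /v1 (so the credential URL may need the http:// scheme and the /v1 suffix rather than just ollama:11434):

```bash
# From the host: is Ollama reachable, and which models are actually pulled?
curl http://localhost:11434/api/tags

# Same check through the OpenAI-compatible API that the OpenAI node talks to
curl http://localhost:11434/v1/models

# Inside the container (name assumed to be "ollama"): list the pulled models
docker exec -it ollama ollama list
```

If the model the template expects does not show up in that list, I assume that would also explain the "model not found" error in the chat.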
After setting up the credentials, I put a file in the shared directory while the flow is running, and it does not trigger at all.
So I just send a chat message to test it.
I get an error from Ollama and another one from Postgres.
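I pull the full error text from the container logs roughly like this (service names assumed from the compose file, so they may differ):

```bash
# Tail the recent logs of the services involved to see the complete error messages
docker logs --tail 50 n8n
docker logs --tail 50 ollama
docker logs --tail 50 postgres
```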
So that is my setup and the problems I am hitting.
Because I can't attach my screenshots, I compiled them into this Imgur album:
Thanks in advance,
Nik