I’m pretty new to this. I’m trying to develop an app, using VSCode on Windows 10, and I want to see if bolt.diy can prototype it for me. It will involve n8n in local-ai-packaged, which I have working successfully (with web search added). My question is: should I install bolt.diy (and Node.js) in a new Docker container, in an existing container (I expect not), or on Windows 10 itself, outside of any containers? Does it need adding to the docker-compose of local-ai-packaged, as another Docker image alongside the 26 already used, so that it appears inside my local-ai-packaged project? I just need pointing in the right direction.
Hi, it doesn’t really matter, as it doesn’t have to interact with the other stuff in local-ai-packaged.
You can also just use the new desktop app that came with the latest release.
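If you go the plain-Windows route, something like this should get you running (a rough sketch; it assumes you already have Node.js and pnpm installed, and that the repo URL and commands still match bolt.diy’s README, so double-check there):

```sh
# Clone bolt.diy and run it directly on Windows, outside any container
git clone https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy
pnpm install    # install dependencies
pnpm run dev    # start the dev server (http://localhost:5173 by default)
```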
Thank you Lexx, your reply is appreciated. Also, thanks for your indispensable videos. I have almost everything working now. bolt.diy is installed, I’ve successfully got a few API keys, and I can populate the model lists for Google, Hugging Face, and OpenRouter, but no matter what values I put in .env (or .env.local), I can’t get the Ollama menu to populate. Ollama is running in Docker as part of local-ai-packaged. I’ve tried localhost, ollama (as set in local-ai-packaged), 127.0.0.1, 0.0.0.0, 192.168.0.119 (my local IP), and 172.18.0.1 (found in Docker’s logs), all with port 11434. All of them display “Ollama is running”.
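In case it helps anyone debugging the same thing, curl turned out to be a quick way to check what the endpoint actually returns (assuming the stock Ollama API here; as I understand it, the bare root URL only confirms the server is up, while /api/tags is the route that returns the installed models):

```sh
# The root URL only confirms the server is alive:
curl http://127.0.0.1:11434
# -> "Ollama is running"

# /api/tags returns the JSON list of installed models,
# which is what the model dropdown needs:
curl http://127.0.0.1:11434/api/tags
```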
No menus would populate at all until I renamed .env.local to .env and then changed the name in a couple of files that reference it to match. I wouldn’t be surprised if I’ve screwed things up by doing that and will have to reinstall it entirely, lol.
I’ll keep at it. I’m tenacious if nothing else.
Hi Lexx,
In case anyone else has the same issue, I finally found the settings that work.
In my .env (.env.local) file I set the endpoint to 127.0.0.1:11434, and ALSO set the same value in the bolt.diy settings inside the running app. I also made sure there were no trailing slashes. It finally sees all of my local LLMs. Wahoo!
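For reference, the line that did it for me (hedging a little: OLLAMA_API_BASE_URL is the variable name shipped in bolt.diy’s .env.example as of my install, so double-check yours; note there’s no trailing slash):

```sh
# .env (renamed from .env.local in my install)
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```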