Solution for Ollama Models Not Showing Up in Bolt.DIY
I ran into an issue where Ollama models weren't visible in Bolt.DIY on my Ubuntu 24.10 Server VM (172.16.1.19) running Docker. I was able to resolve it by adding an environment variable to the Ollama container.
Setup Details:
- Ubuntu 24.10 Server with Docker
- Ollama installed via Open WebUI (ghcr.io/open-webui/open-webui:ollama)
- Bolt.DIY set up on the same server following the Docker setup instructions, pointed at the Ollama endpoint (see the sketch below)
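For context, pointing Bolt.DIY at Ollama looked roughly like this in my case. This is only a sketch based on the OLLAMA_API_BASE_URL variable in bolt.diy's .env.example; the exact variable name may differ between versions, and 172.16.1.19 is just my host's address:

```sh
# .env.local for bolt.diy (values from my setup; adjust to yours).
# Base URL of the Ollama API as seen from the Bolt.DIY container.
OLLAMA_API_BASE_URL=http://172.16.1.19:11434
```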
Troubleshooting Steps:
- When accessing Bolt.DIY from another host on the LAN, no models appeared in the dropdown menu for the Ollama provider.
- A successful curl request from the Bolt.DIY container to the Ollama container confirmed that the Ollama API was reachable (see the example check below).
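For reference, this is roughly the kind of check I mean. It assumes Ollama is listening on its default port 11434 on the Docker host (172.16.1.19 in my case) and that the Bolt.DIY container is named bolt-ai; adjust the name to whatever `docker ps` shows on your system:

```sh
# List the models Ollama knows about, from inside the Bolt.DIY container.
# A JSON response with a "models" array confirms Ollama is reachable;
# a connection error points to a networking problem instead.
docker exec -it bolt-ai curl -s http://172.16.1.19:11434/api/tags
```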
Fix:
To resolve this issue, stop the Ollama container and add an environment variable:
- Environment="OLLAMA_ORIGINS=*"
Then restart the Ollama container and refresh the Bolt.DIY browser window; the Ollama models should now appear in the dropdown menu.
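For reference, here is roughly what that looks like if the bundled Open WebUI + Ollama container was started with docker run. Treat it as a sketch: the container name, volume names, and the extra 11434 port mapping are assumptions based on the stock Open WebUI instructions, so adapt them to how your container was originally created:

```sh
# Stop and remove the existing container (assumed name: open-webui).
docker stop open-webui && docker rm open-webui

# Recreate it with OLLAMA_ORIGINS=* so the bundled Ollama accepts
# requests from other origins, and publish Ollama's default port
# so Bolt.DIY can reach it at http://<host>:11434.
docker run -d \
  --name open-webui \
  -p 3000:8080 \
  -p 11434:11434 \
  -e OLLAMA_ORIGINS="*" \
  -v ollama:/root/.ollama \
  -v open-webui:/app/backend/data \
  --restart always \
  ghcr.io/open-webui/open-webui:ollama
```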
Let me know if you have any further questions or if this solution helps resolve your issue!