Hello,
I am on a Windows machine. I first installed a Linux OS and set up Docker and Ollama with WebUI there. Today, I installed Docker Desktop for Windows and set up Bolt.DIY by following the installation documentation. The DeepSeek API key works fine, but the OpenAI API key is not being detected through the .env file; it does work when I enter the key manually through the UI. My main issue is that Ollama is not working: Bolt.DIY doesn't list any of its LLMs, even though the WebUI works without problems.
Here is the message:
“ERROR LLMManager Error getting dynamic models Ollama : Error: No baseUrl found for OLLAMA provider”
Thanks for posting your question! Make sure you go into the settings and set the base URL for Ollama!
Typically it is http://localhost:11434.
The settings can be accessed in the bottom left, where the conversation history is listed. If you scroll down in the provider tab you'll see a place to enter the base URL for Ollama!
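If you'd rather keep it in the .env file instead of the UI, a minimal sketch of the relevant entries might look like this (the variable names follow bolt.diy's .env.example; double-check against the .env.example in your own checkout, since names can differ between versions):

```env
# Sketch of a .env.local in the bolt.diy project root
# (copy from .env.example and fill in; names assumed from that file)

# OpenAI key (the one you said isn't being picked up)
OPENAI_API_KEY=your-key-here

# Base URL bolt.diy should use to reach Ollama
OLLAMA_API_BASE_URL=http://localhost:11434
```

If the key still isn't detected from the file, restarting the bolt.diy container/dev server after editing is worth trying, since env files are usually only read at startup.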
@ikhaipi make sure you set up Ollama correctly. If bolt.diy is installed locally on your system and you only run Ollama and WebUI within Docker, you need to set OLLAMA_HOST, and maybe also OLLAMA_ORIGINS, as environment variables for Ollama. See my Ollama install video on YouTube (the website link in my profile goes to my channel). It covers a fully local install, but I show the variables there, and they are also on my linked task list. A sketch of how you could pass them is below.
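For example, if Ollama runs in its own container, you could pass those variables when starting it. This is only a sketch under the assumption of a plain docker run setup; the port mapping, volume name, and image tag depend on how you actually deploy it:

```sh
# Sketch: run Ollama in Docker with the env vars mentioned above.
# OLLAMA_HOST=0.0.0.0 makes it listen on all interfaces inside the
# container (not just loopback); OLLAMA_ORIGINS allows cross-origin
# requests from other apps such as bolt.diy. Adjust to your setup.
docker run -d --name ollama \
  -p 11434:11434 \
  -e OLLAMA_HOST=0.0.0.0 \
  -e OLLAMA_ORIGINS="*" \
  -v ollama:/root/.ollama \
  ollama/ollama
```

With bolt.diy running on the host, the base URL from the earlier reply (http://localhost:11434) should then reach this container through the -p port mapping.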
This instruction solved the problem. Thanks a lot for your help!
Awesome, I'm glad. You're welcome!