Communication issues - local host install of Ollama - Docker install of bolt.diy

I have an issue after just installing the bolt.diy tool.

Ollama is installed natively on the host (Windows 11).
bolt.diy is installed in Docker (per the instructions).

I see several topics that come close to this but don't directly cover the issue.

I had a coworker who is much more knowledgeable than I am with these kinds of tools run the same install, and he is getting the same results.

Any advice would be helpful.


app-dev-1  |  INFO   LLMManager  Getting dynamic models for Ollama
app-dev-1  |  ERROR   LLMManager  Error getting dynamic models Ollama : TypeError: fetch failed
app-dev-1  |  ERROR   api.chat  Error: No models found for provider Ollama
app-dev-1  |     at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
app-dev-1  |     at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1  |     at chatAction (/app/app/routes/api.chat.ts:116:20)
app-dev-1  |     at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
app-dev-1  |     at /app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4899:19
app-dev-1  |     at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4963:16)
app-dev-1  |     at async Promise.all (index 0)
app-dev-1  |     at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4772:17)
app-dev-1  |     at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4835:17)
app-dev-1  |     at callDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3992:19)
app-dev-1  |     at submit (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3755:21)
app-dev-1  |     at queryImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3684:22)
app-dev-1  |     at Object.queryRoute (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3629:18)
app-dev-1  |     at handleResourceRequest (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
app-dev-1  |     at requestHandler (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
app-dev-1  |     at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run+react@2.15.0_react-dom@18.3.1_react@18.3.1__react@18.3.1_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25
Gracefully stopping... (press Ctrl+C again to force)

Make sure you set your ports correctly; Docker exposes its own URLs, so it's not the Ollama default. You'll need to modify the Base URL to match. Another option would be to run Ollama without Docker and just use the defaults.
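For example (a sketch — the OLLAMA_API_BASE_URL variable name is an assumption based on the project's .env.example): since localhost inside a container refers to the container itself, you can point bolt.diy at the Windows host through Docker Desktop's host.docker.internal alias:

# .env.local (sketch — variable name assumed from .env.example)
# Inside the container, localhost is the container, not Windows,
# so use Docker Desktop's alias for the host machine:
OLLAMA_API_BASE_URL=http://host.docker.internal:11434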

See/Watch also I can't make bolt.diy working using local Ollama - #4 by MaLaH
=> more or less the same question


I am running Ollama natively on Windows (outside Docker), so I'm using the default URL.


According to this, it appears I have the Ollama side working.

But when I issue a chat, it gives me this notice in the browser:
There was an error processing your request: No details were returned
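One way to narrow this down is to check whether Ollama answers both from the Windows host and from inside the container. A sketch, assuming the container is named app-dev-1 (as in the logs above) and has curl available:

# From the Windows host — should return the installed models as JSON:
curl http://localhost:11434/api/tags

# From inside the bolt.diy container — if this fails while the host call
# works, the container cannot reach the host's port 11434 via localhost:
docker exec app-dev-1 curl http://localhost:11434/api/tags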

The issue is that you have an extra slash at the end of your Base URL:
http://localhost:11434/ should just be http://localhost:11434. The Base URL should not end in a slash.

And unless it has changed, you can just leave it blank and bolt.diy will assume the correct localhost address and port. Either should work.
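Roughly why the trailing slash can matter: endpoint URLs are often built by plain string concatenation, so the extra slash doubles up. A hypothetical sketch, not bolt.diy's actual code:

// Hypothetical illustration of naive URL joining:
const baseUrl = 'http://localhost:11434/'; // note the trailing slash
const endpoint = `${baseUrl}/api/tags`;
console.log(endpoint); // "http://localhost:11434//api/tags" — some servers 404 on the double slash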


I will try that when I get home. I thought I had tested it both ways, but I'm certainly willing to try it again…

Yep, tried with and without the trailing slash; it does not work either way.

Downloaded and reinstalled .05 this morning.
Still getting "There was an error processing your request: No details were returned" when I put a prompt in the chat.
Here are the logs:
app-dev-1 | ERROR LLMManager Error getting dynamic models Ollama : TypeError: fetch failed
app-dev-1 | ERROR api.chat Error: No models found for provider Ollama
app-dev-1 | at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
app-dev-1 | at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1 | at chatAction (/app/app/routes/api.chat.ts:116:20)
app-dev-1 | at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
app-dev-1 | at /app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4899:19
app-dev-1 | at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4963:16)
app-dev-1 | at async Promise.all (index 0)
app-dev-1 | at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4772:17)
app-dev-1 | at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4835:17)
app-dev-1 | at callDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3992:19)
app-dev-1 | at submit (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3755:21)
app-dev-1 | at queryImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3684:22)
app-dev-1 | at Object.queryRoute (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3629:18)
app-dev-1 | at handleResourceRequest (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
app-dev-1 | at requestHandler (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
app-dev-1 | at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run+react@2.15.0_react-dom@18.3.1_react@18.3.1__react@18.3.1_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25


Did you resolve this? I'm experiencing the same issue :frowning:

I just fixed the issue. Make sure to update the Ollama URL in .env.local and also in the bolt.diy GUI settings; once done, you should see your models being cached, as shown below!
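One more thing that may matter, depending on how the compose file loads env vars: the container likely only reads .env.local at startup, so restart (or rebuild, to be safe) after editing it:

# Pick up the edited .env.local — the rebuild may be unnecessary,
# but it is the safe route:
docker compose down
docker compose up --build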

FYI, for some reason it does connect, but it took a loooot of time for the message from bolt to be sent to the Ollama API, and it also took a while to get a response; it might be my crappy computer…


See the models being cached in the bolt.diy Docker terminal:
(screenshot of the cached model list)


You should only need one of them, not both the UI and the .env file.

Anyway, I just saw that this is still buggy. It doesn't matter what you put in; it always uses the default.
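If the setting really is ignored, one possible workaround (an assumption about the setup, not a confirmed fix) is to force the value at the container level with a compose override:

# docker-compose.override.yml — hypothetical; assumes the service is named
# app-dev (the logs show a container app-dev-1) and that the app reads
# OLLAMA_API_BASE_URL from its environment
services:
  app-dev:
    environment:
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434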