Make sure you set your ports correctly. When bolt.diy runs in Docker, `localhost` inside the container refers to the container itself, not your host machine, so the default Ollama URL won't resolve. You'll need to modify the Base URL to match your setup. Another option would be to run Ollama and bolt.diy without Docker and just use the defaults.
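For example, if bolt.diy runs in a container and Ollama runs directly on the host, a Base URL along these lines is usually what's needed (the variable name and hostname here are assumptions — check your own env file and Docker setup):

```shell
# Hypothetical .env.local entry: point bolt.diy (inside Docker) at
# Ollama running on the Docker host, rather than at the container's
# own localhost. On Docker Desktop, host.docker.internal resolves to
# the host; on plain Linux Docker you may need to add
#   --add-host=host.docker.internal:host-gateway
# to your run/compose configuration.
OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```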
The issue is that you have an extra slash at the end of your Base URL: http://localhost:11434/ should just be http://localhost:11434. The Base URL should not end in a slash.
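If you're setting the Base URL from a script, one way to guard against a stray trailing slash is shell parameter expansion (a minimal sketch; the variable name is just an example):

```shell
# Strip a single trailing slash, if present, before using the URL.
BASE_URL="http://localhost:11434/"
BASE_URL="${BASE_URL%/}"
echo "$BASE_URL"
```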
And unless it has changed, you can also leave the field blank and Bolt.diy will assume the default localhost address and port. Either should work.
Downloaded and reinstalled .05 this morning.
Still getting “There was an error processing your request: No details were returned” when I put a prompt in the chat.
Here are the logs:
```
app-dev-1 | ERROR LLMManager Error getting dynamic models Ollama : TypeError: fetch failed
app-dev-1 | ERROR api.chat Error: No models found for provider Ollama
app-dev-1 | at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:132:13)
app-dev-1 | at processTicksAndRejections (node:internal/process/task_queues:95:5)
app-dev-1 | at chatAction (/app/app/routes/api.chat.ts:116:20)
app-dev-1 | at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
app-dev-1 | at /app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4899:19
app-dev-1 | at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4963:16)
app-dev-1 | at async Promise.all (index 0)
app-dev-1 | at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4772:17)
app-dev-1 | at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4835:17)
app-dev-1 | at callDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3992:19)
app-dev-1 | at submit (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3755:21)
app-dev-1 | at queryImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3684:22)
app-dev-1 | at Object.queryRoute (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3629:18)
app-dev-1 | at handleResourceRequest (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
app-dev-1 | at requestHandler (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
app-dev-1 | at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run+react@2.15.0_react-dom@18.3.1_react@18.3.1__react@18.3.1_typ_3djlhh3t6jbfog2cydlrvgreoy/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25
```
I just fixed the issue. Make sure to update the Ollama URL in `.env.local` and also in the bolt.diy GUI Settings. Once that's done, you should see your models being cached!
FYI, for some reason it does connect, but it took a very long time for the message from bolt to be sent to the Ollama API, and it also took a while to get a response. Might be my crappy computer…