Docker installation broken? API key not set

Hi,

I came across oTToDev on YouTube and was fascinated. I tried to install it, but the API key is not loading.

I followed the guide and installed the latest commit in Docker using npm helper scripts. I have Ollama installed on another server, and entered the URL as OLLAMA_API_BASE_URL in .env.example, and renamed it to .env.

After the build, I got the following error when running the image with docker compose:
env file /installpath/bolt.new-any-llm/.env.local not found: stat /installpath/bolt.new-any-llm/.env.local: no such file or directory

I renamed .env to .env.local and got the system up and running. But the API key is not set in the app.

I took the image down in Docker, changed docker-compose.yaml to point to .env instead of .env.local, and renamed the file back to .env. I ran the image once more, but still no access to the API.

Iā€™ve built both development and production, with no luck. (Am I wrong to assume that dev and prod points to .env in different ways? Looks like it in docker-compose.yaml where the file is not mentioned in the dev segment.)

The warning messages from the YouTube video (WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.) do not appear during my install, even though many of my environment variables are unset. I should have seen those warnings when running docker compose, but they do not show up.

Ollama is up and running. It’s already in use by Open WebUI, which connects from another server by pointing at the Ollama URL.

Iā€™ve seen similar posts, and with one suggesting to install an earlier commit. I know only how to install the latest commit, and need pointers if Iā€™m to install an earlier commit.

Would love to get this up and running and be a part of this community. Please help.

2 Likes

I have the same issue with my installation on Coolify. The env is set, but oTToDev does not recognize the env or the API key.

./bindings.sh: line 12: .env.local: No such file or directory
āœ˜ ;31m[;97mERROR;31m] Error getting Ollama models: Error: Network connection lost.

      at async Object.getOllamaModels [as getDynamicModels] (file:///app/build/server/index.js:394:22)
      at async Promise.all (index 0)
      at async initializeModelList (file:///app/build/server/index.js:465:9)
      at async handleRequest (file:///app/build/server/index.js:476:3)
      at async handleDocumentRequest (file:///app/node_modules/.pnpm/@remix-run+server-runtime@2.10.2_typescript@5.5.2/node_modules/@remix-run/server-runtime/dist/server.js:349:12)
      at async requestHandler (file:///app/node_modules/.pnpm/@remix-run+server-runtime@2.10.2_typescript@5.5.2/node_modules/@remix-run/server-runtime/dist/server.js:160:18)
      at async handleFetch (file:///app/node_modules/.pnpm/@remix-run+cloudflare-pages@2.10.2_@cloudflare+workers-types@4.20240620.0_typescript@5.5.2/node_modules/@remix-run/cloudflare-pages/dist/esm/worker.js:75:18)
      at null.<anonymous> (async file:///app/.wrangler/tmp/dev-QSzMaV/functionsWorker-0.8576367561975613.js:7619:14)
      at async next (file:///app/node_modules/.pnpm/wrangler@3.63.2_@cloudflare+workers-types@4.20240620.0/node_modules/wrangler/templates/pages-template-worker.ts:161:22)
      at async Object.fetch (file:///app/node_modules/.pnpm/wrangler@3.63.2_@cloudflare+workers-types@4.20240620.0/node_modules/wrangler/templates/pages-template-worker.ts:180:11) {
    retryable: true
2 Likes

Hopefully you get this figured out because I too am having similar issues. I will check back tomorrow when I have my laptop up and going

1 Like

Had a similar issue when first pulling down Bolt oTToDev and trying to run it with Ollama locally on the same MacBook Pro (M4 Max). Hopefully the following can help you guys and others.

Note that I have Ollama running first before running Bolt oTToDev.

When running the command:

docker-compose --profile development up

I was seeing these two errors pop up in the terminal:

WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.

./bindings.sh: line 12: .env.local: No such file or directory

Basically, even after renaming the .env.example file to .env.local, it wasn’t being picked up by Docker Compose.

A bit of investigation revealed the main culprit: the .dockerignore file.

# Ignore environment examples and sensitive info
.env
*.local
*.example

This is what stops Docker from consuming the .env.local file.
Adding the lines below after those entries will allow Docker to see the .env.local file and use it.

# Allow .env.local
!.env.local

Now, open the Docker Dashboard app and delete pretty much everything in Containers, Images and Builds.

With that done, you should be able to run Docker Compose once again and it will pick up the .env.local file.

docker-compose --profile development up

After that I was up and running, Ollama works in Bolt oTToDev.

NOTE 1: Running the production version results in a 500 Internal Server Error and “There was an error processing your request: No details were returned”. No idea why; I haven’t looked into the documentation enough yet. development seems to be enough to begin work.

NOTE 2: I modified the .env.local file for Ollama with the following values, in case anyone wants to replicate my steps exactly.

OLLAMA_API_BASE_URL=http://127.0.0.1:11434

DEFAULT_NUM_CTX=32768
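One extra sanity check I’d suggest before blaming the app: confirm that the URL you put in OLLAMA_API_BASE_URL actually answers. Ollama’s /api/tags endpoint lists installed models and needs no API key. The helper function below is my own sketch, not part of the project:

```shell
# check_ollama: succeeds if an Ollama-style API answers at the given base URL.
# /api/tags is Ollama's model-list endpoint; no key required.
check_ollama() {
  curl -sf --max-time 5 "$1/api/tags" > /dev/null
}

# Use the same value you set in OLLAMA_API_BASE_URL:
if check_ollama "http://127.0.0.1:11434"; then
  echo "Ollama reachable"
else
  echo "Ollama NOT reachable"
fi
```

If this prints "NOT reachable" from the machine (or container) running bolt, the problem is networking, not the env file.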

Running on the latest main branch commit a0eb0a0771f69b84ba56c6ce5473e5107aa7b075 at the time of posting this.

4 Likes

I was going to say this, but not as well lol. I’ve run into this a bit with different environments. Sometimes they want .env and sometimes .env.local. Thanks for the great explanation!

I deleted everything and started over with the change in .dockerignore you proposed. This finally gets me these warnings when running docker-compose:

WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string.
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.

But I still get the warning that "OLLAMA_API_BASE_URL" is not set, even though I’ve set that value in .env.local.

My app-window looks like this:

When I try to enter text in the chat, I get these lines in my Docker log:

Error getting Ollama models: TypeError: fetch failed
    at processTicksAndRejections (node:internal/process/task_queues:95:5)
    at async Promise.all (index 0)
    at Object.getOllamaModels [as getDynamicModels] (/app/app/utils/constants.ts:367:22)
    at Module.getModelList (/app/app/utils/constants.ts:312:9)
    at Module.streamText (/app/app/lib/.server/llm/stream-text.ts:43:22)
    at chatAction (/app/app/routes/api.chat.ts:63:20)
    at Object.callRouteAction (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/data.js:36:16)
    at /app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4899:19
    at callLoaderOrAction (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4963:16)
    at defaultDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4772:17)
    at callDataStrategyImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:4835:17)
    at callDataStrategy (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3992:19)
    at submit (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3755:21)
    at queryImpl (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3684:22)
    at Object.queryRoute (/app/node_modules/.pnpm/@remix-run+router@1.21.0/node_modules/@remix-run/router/router.ts:3629:18)
    at handleResourceRequest (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:402:20)
    at requestHandler (/app/node_modules/.pnpm/@remix-run+server-runtime@2.15.0_typescript@5.7.2/node_modules/@remix-run/server-runtime/dist/server.js:156:18)
    at /app/node_modules/.pnpm/@remix-run+dev@2.15.0_@remix-run+react@2.15.0_react-dom@18.3.1_react@18.3.1__react@18.3.1_typ_zyxju6yjkqxopc2lqyhhptpywm/node_modules/@remix-run/dev/dist/vite/cloudflare-proxy-plugin.js:70:25 {
  [cause]: Error: connect ECONNREFUSED 172.17.0.1:11434
      at TCPConnectWrap.afterConnect [as oncomplete] (node:net:1607:16)
      at TCPConnectWrap.callbackTrampoline (node:internal/async_hooks:130:17) {
    errno: -111,
    code: 'ECONNREFUSED',
    syscall: 'connect',
    address: '172.17.0.1',
    port: 11434
  }
}

This probably makes sense since the app is not looking at the right place for my Ollama instance.

You mentioned running another server. I don’t fully know your setup, but is Ollama running on the same machine you’re running the bolt oTToDev docker compose on?

If itā€™s not, then there maybe something additional that needs to be done on the Docker side to map the Ollama URL you have of 172.17.0.1:11434. Docker may not be able to communicate externally unless you map that external connection internally to Docker.

Iā€™m no Docker whizz, so this is just a hunch.

Usually when running Ollama locally it will default to 127.0.0.1:11434 which is essentially localhost.

Iā€™d advise you to install and run Ollama locally on the same machine bolt oTToDev is running on first and updating the URL to the localhost URL to confirm if it works at all. (Assuming this is not already the case)

I have the same setup on my Open WebUI instance, which is deployed in docker on one server, with access to Ollama on another server. The IP for Ollama is set in the environment variables for Open WebUI in docker-compose.yaml

For oTToDev, I’m setting the IP address in the environment variables, but the app is totally oblivious to that location, as shown in the log.

Since the Docker install isnā€™t working, I used pnpm run dev and that worked.


I am having the same issue.

Iā€™ve edited the .dockerignore file and deleted everything in Containers, Images and Builds and am still running into the same issue:

bolt.diy git:(main) āœ— docker-compose --profile development up
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GROQ_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "HuggingFace_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPENAI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "ANTHROPIC_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OPEN_ROUTER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "GOOGLE_GENERATIVE_AI_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "OLLAMA_API_BASE_URL" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_KEY" variable is not set. Defaulting to a blank string. 
WARN[0000] The "TOGETHER_API_BASE_URL" variable is not set. Defaulting to a blank string.

Any suggestions on how to fix? Iā€™m running this on an M2 MacBook Pro.

If Ollama isnā€™t working for you guys, you could try LM Studio.

Iā€™ve found that itā€™s less of a hassle compared to Ollama. No need to make any modelfiles, setup context length or the sort.

After installing LM Studio and downloading a model or two to test, thereā€™s some additional setup recommended by a user on GitHub.

Do take note of the steps provided by jh-younicorns here

This is also a better alternative to Ollama if you’re running on a Mac, as you’ll get access to the MLX versions of the models. Significant speed boosts for M-series chips.

2 Likes

Thanks, Iā€™ll definitely have to take a look into this. Itā€™s been a while since Iā€™ve used LM Studio.

1 Like

For local URLs, try 127.0.0.1 instead of localhost. You can also set the URLs in the settings menu → provider tab.
That should work even if the env file does not have the values set.

1 Like

Thanks, great piece of info. I didn’t even think about that. Thank you so much for posting it here.