Unable to use models despite API keys being set in .env.local

I’m encountering an issue with Bolt.new where the application doesn’t recognize my API keys, even though they’re properly set in the .env.local file:

Environment:

  • macOS on M3
  • Chrome Canary (latest version)
  • Using local installation of Bolt.new

Setup:

  1. Successfully cloned and set up Bolt.new
  2. Added all required API keys in .env.local
  3. Successfully running Ollama models locally:
    • Qwen 2.5 Coder 32B with extended context (256k)
    • Mistral with extended context
    • Confirmed working through curl http://localhost:11434/api/tags (quick check and .env.local excerpt below)
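For reference, here is roughly what that setup looks like on my machine. The key names are the ones I copied from the repo’s .env.example (double-check them against your copy), and the values are redacted:

# excerpt of .env.local (values redacted; key names copied from .env.example)
GROQ_API_KEY=xxxx
OPENAI_API_KEY=xxxx
ANTHROPIC_API_KEY=xxxx
OLLAMA_API_BASE_URL=http://localhost:11434

# quick check that Ollama is reachable and the models are installed
# (should return a JSON list of the local models)
curl http://localhost:11434/api/tags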

Issue:
Despite having configured the API keys in .env.local, the application:

  • Keeps asking for API keys that are already set
  • Doesn’t recognize existing API key configurations
  • Shows API key input prompts repeatedly

Expected behavior:

  • Application should recognize the API keys set in .env.local
  • Should allow immediate use of configured models without requesting keys again

Would appreciate any guidance on resolving this issue.

Can you try going back a few commits to find out when it was last working? Then give us that working commit hash.


Thank you for the suggestion about checking previous commits. However, I should clarify that this has been happening since my very first launch of the application. Some additional context:

  1. First-time Setup Details:
    • This was my first installation of Bolt.new
    • I followed the README instructions step by step with assistance from Claude AI
    • We did not use the Docker installation method (went with direct installation)
    • The .env.local file was kept as-is (I didn’t rename it to .env as mentioned in the file’s comment)
  2. Installation Process:
    • Successfully set up all prerequisites
    • Installed and configured Ollama
    • Set up the models (Qwen 2.5 Coder 32B and Mistral with extended contexts; see the Modelfile sketch after this list)
    • Added all API keys to .env.local
  3. Current Status:
    • Ollama models are running correctly (verified through the API)
    • All API keys are properly set in .env.local
    • The issue has been consistent since the first launch
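For completeness, the extended-context variants were created the standard Ollama way, with a Modelfile that raises num_ctx. A sketch of what I ran (the base model tag, the 256k context value, and the name I gave the new model are from my setup; adjust them to yours):

# build an extended-context variant of Qwen 2.5 Coder from a Modelfile
cat > Modelfile <<'EOF'
FROM qwen2.5-coder:32b
PARAMETER num_ctx 262144
EOF
ollama create qwen2.5-coder-extended -f Modelfile

# confirm the new model is listed
ollama list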

I’m new to development and followed every command precisely. Not sure if using Docker would make a difference, or if keeping the file as .env.local instead of renaming it to .env could be related to the issue.

Any guidance would be greatly appreciated!

I’ve had some inconsistencies with the naming of that file. Try renaming it to .env.
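For example (run from the project root; copying rather than renaming keeps the original name around in case something else still reads it):

# make the keys visible under both file names
cp .env.local .env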


Renaming the file to .env does not work, because the bindings.sh script sets the environment variables by reading them from .env.local (the file is redirected into the script). I’ve tried copying the file to ./app, but that does not work either.
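For reference, bindings.sh essentially loops over .env.local and turns each KEY=VALUE line into a --binding flag for the local dev server, at least as I read it. A simplified sketch of that idea (not the exact file contents):

# simplified sketch of what bindings.sh does (not the real script verbatim)
bindings=""
while IFS= read -r line || [ -n "$line" ]; do
  # skip comments and blank lines
  [[ "$line" =~ ^# || -z "$line" ]] && continue
  name="${line%%=*}"
  value="${line#*=}"
  bindings+="--binding ${name}=${value} "
done < .env.local
echo "$bindings"

So a plain rename to .env leaves that loop with nothing to read.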

I’ve recloned the repo several times and cleaned out my Docker containers and images, and it still does not work in either production or development.

I’m kind of at the end of my rope on this; I’m not sure how this is supposed to work.

I opened a console window into the running container, and it looks like the Docker build will not include the .env.local file in the image, but it will include it if it’s in the ./app folder. I then had to modify bindings.sh to read the file from ./app/.env.local instead of ./.env.local.

Either .env.local is somehow excluded, or the build simply does not copy it.
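A quick way to double-check this from the host (the container name is a placeholder, and /app is just my guess at the image’s working directory):

# find the running container, then look for the env file inside it
docker ps
docker exec -it <container-name> ls -la /app/.env.local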

I’m on Windows. When I was using Docker I could see that the image was updated if I renamed the .env file or edited it. On Windows, that file needs to be inside the bolt.new-any-llm folder. For what it’s worth, with .env.local I could not get the models to work; renaming it to .env made everything work.

Here is a fix: Ollama not respecting model selection by dustinwloring1988 · Pull Request #460 · coleam00/bolt.new-any-llm · GitHub


Thanks! Should we run the following for a clean install to try it?
  • pnpm store prune
  • pnpm install

Since I think a git pull would only update the modified code files? Or how would you suggest removing everything and then reinstalling from scratch, following the oTToDev README from the beginning?
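In case it helps, the full sequence I had in mind is roughly this (the dev command is the one from the README; skip it if you run through Docker):

# pull the fix, then reinstall dependencies from scratch
git pull
rm -rf node_modules     # drop the old dependency tree
pnpm store prune        # clear the pnpm store cache
pnpm install            # reinstall from the lockfile
pnpm run dev            # start the dev server again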

FYI: The .dockerignore file excludes *.local files. Might that be the issue? (Excerpt below, plus a couple of ideas after it.)

# Ignore environment examples and sensitive info
.env
*.local
*.example
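If that exclusion is the cause, two things that might be worth trying (both are guesses on my part; the image name and port below are placeholders):

# option 1: re-include the file in the build context
# (.dockerignore rules apply top to bottom, so a later negation wins)
printf '\n!.env.local\n' >> .dockerignore

# option 2: leave the image as-is and inject the keys at run time
docker run --env-file .env.local -p 5173:5173 <your-bolt-image>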