Issue while running oTToDev via docker-compose

Hi all, hope you're doing well. I was wondering about an error I get while running oTToDev via docker-compose. Are any other members getting the same issue? I'll wait for responses before I try a fix and raise a PR.

Following is the error:
boltnew-any-llm-bolt-ai-1 | ./bindings.sh: line 12: .env.local: No such file or directory
P.S.: this happens only via docker-compose. If bindings.sh is executed separately it works, and it also works if the environment variables are set in compose.yaml.
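For reference, this is the sort of compose.yaml environment section that works for me (a sketch only — the service name is taken from the error output above, and the variable names are assumptions in the style of .env.example; adjust for your providers):

```yaml
services:
  bolt-ai:
    environment:
      # Setting the variables inline here sidesteps the .env.local lookup
      # entirely (values below are placeholders, not real defaults)
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434
      - OPENAI_API_KEY=${OPENAI_API_KEY}
```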

Hey there,

You’ll want to duplicate the .env.example file to .env.local. This is the environment file used by the docker compose flow to include environment variables, i.e. the base URL and API key entries.
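Roughly like this (a sketch — a scratch directory stands in for the repo root here, and the file contents are demo placeholders, not the real .env.example):

```shell
# Duplicate the example env file so bindings.sh / docker compose can read it.
# /tmp/ottodev-demo stands in for the repo root in this sketch.
mkdir -p /tmp/ottodev-demo && cd /tmp/ottodev-demo
printf 'OLLAMA_API_BASE_URL=http://localhost:11434\n' > .env.example  # demo content only
cp .env.example .env.local
```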

Thanks @mahoney for taking the time to respond.
Yes, this is after renaming .env.example to .env.local.

As mentioned earlier, executing bindings.sh on its own works, and bindings.sh internally refers to .env.local.

I am experiencing the same issue.

I had the same issue and resolved it by doing the following: feat: docker - Update bindings.sh to Use Environment Variables Directly by turbra · Pull Request #222 · coleam00/bolt.new-any-llm · GitHub

Alternatively, this works as well: 🐞 Docker compose up failing · Issue #68 · coleam00/bolt.new-any-llm · GitHub
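For context, the gist of the #222 approach is something like this (a hypothetical sketch, not the actual patch — the variable names mirror .env.example and the demo default is made up):

```shell
#!/bin/bash
# Sketch: build the wrangler --binding flags from the environment directly,
# treating .env.local as an optional override rather than a required file.
[ -f .env.local ] && { set -a; . ./.env.local; set +a; }

# Demo default so the sketch produces output; drop this in real use.
OLLAMA_API_BASE_URL="${OLLAMA_API_BASE_URL:-http://host.docker.internal:11434}"

bindings=""
for var in GROQ_API_KEY OPENAI_API_KEY OLLAMA_API_BASE_URL; do
  value="${!var}"                                   # bash indirect expansion
  [ -n "$value" ] && bindings="$bindings --binding ${var}=${value}"
done
echo "bindings:$bindings"
```

Because unset variables are simply skipped, the script no longer dies when .env.local is absent, which is exactly the docker-compose failure mode above.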

Thanks for sharing your fix via those PRs, @turbra. A question for this thread ahead of merging one of them: in terms of best practices, is #222 or #68 the preferable solution? I'd prefer to avoid making bindings.sh yet another spot to maintain the environment variables; any thoughts on that would be helpful in terms of Compose usage.

The more I think about it, I don’t really like either approach.

It might be better to implement an Admin menu where we can configure these integrations post deployment similar to Open WebUI. That would allow for a much cleaner and easier approach to maintaining the integrations as a user.

Also, eliminate the hard-coded localhost defaults and let the user configure them post-deployment. Many of us have these deployed on a separate host or hosts.

I think it’s a great time to focus on:

  1. Standardize the image and make it easy to deploy. Adoption will be even greater if the barrier to entry is lower.

  2. Administrative features specific to configuration post-deployment.

Open WebUI is a great example of making things easy to deploy and integrate with external services.

Sorry for the rant :saluting_face:

Thanks, that is partly why I asked; I also come from a background in orchestrating Compose stacks. Open WebUI is a fantastic example of an AI-enabled frontend in terms of its auth, admin/user settings, and all of the configurable bits being set up within an easy-to-use modal UI. I'm not advocating for an integration of their interface; rather, it is a great model to start from.

We did have an earlier PR with the beginnings of a settings UI before the inline API key field was merged in. It's likely time to start some discussion around a consolidated settings system, and I think that should definitely be a community chat. I'm sure @ColeMedin and the core team have thoughts as well. Thanks for joining us to improve the dev experience!

Yeah @mahoney and @turbra I certainly agree it’s time to come together and discuss standardization and a place to manage configuration post-deployment.

I love the organization of all the configuration options in Open WebUI, so I appreciate you guys referencing it! I’ll be thinking about what that can look like for oTToDev.

@mahoney, @turbra, @ColeMedin,
Agreed, we should focus on externalizing the configuration as much as possible as a best practice. Please let me know if any PR is planned; I will also try to add my bit in the meantime.

NEED HELP ASAP!!!

No code in the IDE within the Bolt webapp

I cloned the repo
I created the .env file
I created the modelfile
I created the LLM using that modelfile
I created the docker image
I started the container
Installed Google Chrome Canary
On http://localhost:5173/, I'm not getting any code in the IDE after any prompt

I already have 2 qwen2.5 models

Can somebody please tell me what I'm missing here?

@don121004, this question is better asked in the Issues section of the GitHub repo, as it is unclear and unrelated to this thread's discussion.
