Can't get Ollama and bolt.new-any-llm to work

To establish a baseline: I have a local machine with a couple of 4070s running a fresh install of Ubuntu. I'm able to successfully install Ollama (just with the one-line install, not Docker) and run it from the CLI and with Open WebUI as well as other similar interfaces. So to my knowledge, Ollama and all the models run fine.

I've tried to install bolt.new-any-llm in every flavor there is, from Docker to pnpm. I've even looked at a few issues on GitHub that seemed similar and tried anything that looked relevant. I'm not super code/dev savvy, but I've managed with other things so far, so I'm not sure why I can't get this one to run.

But whatever I do, no Ollama models ever show up when I launch the website locally.

Can anybody give me any kind of direction as to what I may be missing?

What IP address are you giving it for Ollama in the .env.local file?

Tried everything (not all at once, of course):

localhost:11434
127.0.0.1:11434
host.docker.internal:11434

Not sure what else I could even attempt.
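For reference, here's roughly what .env.local looked like on each attempt. I'm assuming the variable name OLLAMA_API_BASE_URL from the project's .env.example; if your copy uses a different name, adjust accordingly.

```
# .env.local -- each value tried one at a time (variable name assumed from .env.example)
OLLAMA_API_BASE_URL=http://localhost:11434
# OLLAMA_API_BASE_URL=http://127.0.0.1:11434
# OLLAMA_API_BASE_URL=http://host.docker.internal:11434
```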

So, a small update. I read another post that used the machine's local IP, so I replaced the IP in .env.local with the network IP of the machine and the models loaded up. But when I asked it to create something, I now get:

Failed to spawn bolt shell

Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.

So, back to being lost lol
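In case it helps anyone else, the entry that got the models to show up looked roughly like this (the 192.168.x.x address is a placeholder for my machine's LAN IP, and the variable name is again the one assumed from .env.example):

```
# .env.local -- models load with the machine's LAN IP instead of localhost
OLLAMA_API_BASE_URL=http://192.168.1.50:11434
```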

I was able to replicate your setup and see the issue you're talking about. I'll look into this a little more tomorrow.


OK, thank you. Let me know if you need any other info from me.

@dustinwloring1988 - I'm hitting the same error. I have bolt.diy running on a VM on AWS/GCP, but when I try to create something I get the same error @cylver has mentioned.

@wahi80, I was able to self-host and run this on RunPod, as they have an Ollama template that might help. I will post a video on how to set them up. An A40 (48 GB VRAM) is only $0.39 an hour to run.

This was with the latest copy.


That would be awesome! I'm able to set up the VM and get the app up and running, but I'm unable to get past:

Failed to execute ‘postMessage’ on ‘Worker’: SharedArrayBuffer transfer requires self.crossOriginIsolated.

The VM has an external IP, which I'm trying to access from my local machine. Maybe some CORS setting is messing it up. When I run it on my local machine, there are no issues.
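From what I've read, crossOriginIsolated only becomes true when the page is served with the COOP/COEP headers and from a secure context (HTTPS or localhost), which might explain why it works locally but not over the VM's plain-HTTP external IP. A rough reverse-proxy sketch of what I mean, not a confirmed fix (hostname, port, and cert paths are placeholders):

```nginx
# Rough sketch: serve bolt.diy over HTTPS and add the two
# cross-origin-isolation headers the browser needs before it
# will set self.crossOriginIsolated to true.
server {
    listen 443 ssl;
    server_name bolt.example.com;                      # placeholder hostname

    ssl_certificate     /etc/ssl/certs/bolt.crt;       # placeholder cert
    ssl_certificate_key /etc/ssl/private/bolt.key;     # placeholder key

    location / {
        proxy_pass http://127.0.0.1:5173;              # assumes bolt listens on 5173
        add_header Cross-Origin-Opener-Policy same-origin always;
        add_header Cross-Origin-Embedder-Policy require-corp always;
    }
}
```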

Anybody got any fixes for this? I tried a clean install and still get the same error.

@dustinwloring1988 can you help out?

Still praying for a fix, but haven’t had any luck with this.

@cylver try my stack: GitHub - leex279/bolt-diy-full-stack

Yesterday I fixed some stuff with the Docker containers in bolt.diy and modified my stack.

Let me know if it works. I also wanted to showcase this in a short YouTube video.

I already have Open WebUI and Ollama running separately; this one includes those and hits errors when installing, so it might not be the solution for me.

@cylver, what errors do you get? Also, you don't need the Ollama and Open WebUI containers. Just remove them from the file, or copy the bolt containers into your own docker-compose file (see the sketch below).
It should work fine. If not, let me know, because this will maybe go into the official docker-compose file soon, so it would be good to get feedback if something is not working for you.
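Something like this is what I mean by copying just the bolt container into your own compose file. It's only a sketch: the build path, port, and environment variable name are assumptions based on the repo's Dockerfile and .env.example, so double-check them against your checkout.

```yaml
# docker-compose.yaml -- minimal sketch with only the bolt.diy service
services:
  bolt:
    build:
      context: ./bolt.diy            # path to your local clone (placeholder)
      dockerfile: Dockerfile
    ports:
      - "5173:5173"                  # assumes the app listens on 5173
    environment:
      # point the container at the Ollama instance running on the host
      - OLLAMA_API_BASE_URL=http://host.docker.internal:11434
    extra_hosts:
      - "host.docker.internal:host-gateway"   # lets the container reach the host on Linux
```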

OK, I removed the other two from the compose file and I'm trying to get it to use my Ollama, but it asks for an API key? My Ollama is hosted on the same machine and it just goes to the :11434 URL, so I'm not sure what to do there. I entered the URL into .env.local as well, so not sure.

You can ignore the API key. This is also nothing Docker-specific. If you are on the main branch, it is already gone.

I would recommend trying the main branch, because there were also a lot of fixes for Docker. At the latest in a few weeks, when stable gets a new release, it will be good to go again.

Maybe I'll try the main branch again to see if anything has changed, because I'm usually stuck at this point: bolt can't communicate with my Ollama, and when it does, the only workaround is telling it to use my local IP instead of localhost or 127.0.0.1. Either way, bolt just can't find my Ollama.
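From what I understand (so take this with a grain of salt), when bolt runs inside a container, localhost/127.0.0.1 point at the container itself rather than at the host, and Ollama by default only listens on 127.0.0.1, so neither side can see the other unless Ollama is told to listen on all interfaces. On a systemd install that would look roughly like this (following Ollama's FAQ; adjust if your install differs):

```bash
# Make the Ollama service listen on all interfaces instead of just 127.0.0.1
sudo systemctl edit ollama.service
# in the override file that opens, add:
#   [Service]
#   Environment="OLLAMA_HOST=0.0.0.0"
sudo systemctl restart ollama
```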

I wanted to do a YouTube video on the stack anyway.

If you tell me your concrete setup, I can test it out and maybe cover it in the video as well.

Very simple rig: just a Linux computer with an AMD GPU, and Open WebUI and Ollama already installed and working. Nothing special, really. If you want anything specific, I can add more detail.