Can't get Ollama and bolt.new-any-llm to work

To establish a baseline: I have a local machine with some 4070s running a fresh install of Ubuntu. I can successfully install Ollama (just with the one-line install, not Docker) and run it from the CLI and with Open WebUI as well as other similar interfaces. So to my knowledge my Ollama and all the models run fine.
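
(For reference, the one-line install I mean is the standard Ollama script, which as far as I know is just this:)

```bash
# Official one-line Ollama install for Linux; downloads and runs the install script
curl -fsSL https://ollama.com/install.sh | sh
```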

I've tried to install bolt.new-any-llm in every flavor there is, from Docker to pnpm. I've even looked at a few issues on GitHub that seemed similar and tried anything that looked relevant. I'm not super code/dev savvy, but I've managed with other things so far, so I'm not sure why I can't get this to run.

But whatever I do, no Ollama models ever show up when I launch the site locally.

Can anybody give me any kind of direction on what I may be missing?

What IP address are you giving it for Ollama in the .env.local file?

Tried everything:
(not all at once, of course)

localhost:11434
127.0.0.1:11434
host.docker.internal:11434

Not sure what else I could even attempt.
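
One sanity check (assuming the default Ollama port): hitting the tags endpoint from the machine that runs bolt should return the installed models, which at least rules out connectivity.

```bash
# Should return a JSON list of installed models if Ollama is reachable from here
curl http://127.0.0.1:11434/api/tags
```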

So, small update. I read another post that used the machine's local IP, so I replaced the IP in .env.local with the machine's network IP (rough snippet at the end of this post) and the models loaded up. But when I asked it to create something, I now get:

Failed to spawn bolt shell

Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.

So, back to being lost lol
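
For anyone following along, the change that made the models show up was roughly this in .env.local (I believe the variable is OLLAMA_API_BASE_URL, per the project's .env.example; the IP below is just a placeholder for your machine's LAN address):

```bash
# .env.local - point bolt at Ollama via the machine's network IP instead of localhost
OLLAMA_API_BASE_URL=http://192.168.1.50:11434   # placeholder; use your own machine's IP

# If Ollama only listens on 127.0.0.1, it may also need to be started with
# OLLAMA_HOST=0.0.0.0 so it accepts connections on the network interface.
```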

I was able to replicate your setup and see the issue you are talking about. I will look into this a little more tomorrow.

OK, thank you. Let me know if you need any other info from me.

@dustinwloring1988 - I'm hitting the same error. I have bolt.diy running on a VM on AWS/GCP, but when I try to create something I get the same error @cylver has mentioned.

@wahi80, I was able to self-host and run this on Runpod, as they have an Ollama template that might help. I will post a video on how to set them up. An A40 (48GB VRAM) is only $0.39 an hour to run.

This was with the latest copy.

That would be awesome! I'm able to set up the VM and get the app up and running, but unable to get past

Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.

The VM has an external IP, which I'm trying to access from my local machine. Maybe some CORS setting is messing it up. When I run it on my local machine there are no issues.
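
In case it helps with debugging: as I understand it, crossOriginIsolated only becomes true when the page is served from a secure context (HTTPS, or localhost, which is probably why it works fine locally) and with COOP/COEP headers. If there's a reverse proxy in front of the VM, something like this might be needed (a sketch for nginx; adjust for whatever is actually serving the app):

```nginx
# Headers required for cross-origin isolation (SharedArrayBuffer use in the worker)
add_header Cross-Origin-Opener-Policy "same-origin" always;
add_header Cross-Origin-Embedder-Policy "require-corp" always;
```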