To establish a baseline: I have a local machine with some 4070s running a fresh install of Ubuntu. I'm able to successfully install Ollama (with the one-line install script, not Docker) and run it from the CLI and with Open WebUI as well as other similar interfaces. So to my knowledge, Ollama and all the models run fine.
I've tried to install bolt.new-any-llm in every flavor there is, from Docker to pnpm. I've even looked at a few issues on GitHub that seemed similar and tried anything that looked relevant. I'm not super code/dev savvy, but I've managed with other things so far, so I'm not sure why I can't get this to run.
But whatever I do, no Ollama models ever show up when I open the site locally.
Can anybody give me any kind of direction as to what I may be missing?
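A quick sanity check is hitting Ollama's `/api/tags` endpoint, which lists the installed models; if that call fails from wherever the web UI runs, the UI won't see any models either. Below is a minimal diagnostic sketch (the host, port, and the `OLLAMA_API_BASE_URL` variable name are assumptions; adjust to your setup):

```ts
// check-ollama.ts — minimal diagnostic sketch, not part of bolt.new-any-llm.
// GET /api/tags is Ollama's endpoint for listing locally installed models.
async function listOllamaModels(): Promise<void> {
  const base = process.env.OLLAMA_API_BASE_URL ?? "http://127.0.0.1:11434";
  const res = await fetch(`${base}/api/tags`);
  if (!res.ok) {
    throw new Error(`Ollama responded with ${res.status} ${res.statusText}`);
  }
  const data = (await res.json()) as { models: { name: string }[] };
  console.log(data.models.map((m) => m.name));
}

listOllamaModels().catch((err) => {
  // A connection error here usually means Ollama is only listening on
  // 127.0.0.1, or the URL/IP the UI uses is not reachable from this machine.
  console.error("Could not reach Ollama:", err);
});
```

If this works against `127.0.0.1` but not against the machine's LAN IP, Ollama is likely only listening locally; it can be told to listen on all interfaces via its `OLLAMA_HOST` environment variable.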
So, a small update. I read another post that used the machine's local network IP, so I replaced the URL in .env.local with the machine's network IP (the exact line I changed is sketched after the error below) and the models loaded up. But now, when I ask it to create something, I get:
Failed to spawn bolt shell
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.
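For reference, the .env.local change described above looks roughly like this (a sketch: the variable name is taken from the project's .env.example, and the IP is a placeholder for your machine's LAN address):

```
# .env.local — replace the placeholder IP with your machine's LAN address
OLLAMA_API_BASE_URL=http://192.168.1.50:11434
```

Using the LAN IP instead of localhost matters especially when the app runs in Docker, since localhost inside the container resolves to the container itself rather than the host that is running Ollama.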
@dustinwloring1988 - I'm hitting the same error. I have bolt.diy running on a VM on AWS/GCP, but when I try to create something I get the same error @cylver mentioned.
@wahi80, I was able to self-host and run this on RunPod, as they have an Ollama template that might help. I will post a video on how to set it up. An A40 (48GB VRAM) is only $0.39 an hour to run.
That would be awesome! I'm able to set up the VM and get the app up and running, but I can't get past:
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.
The VM has an external IP, which I'm trying to access from my local machine. Maybe some CORS setting is messing it up. When I run it locally, there are no issues.
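From what I can tell, this is less a CORS problem than a cross-origin isolation one: SharedArrayBuffer (which the bolt shell's worker relies on) is only available when `self.crossOriginIsolated` is true, and that in turn requires both a secure context (https:// or localhost) and the COOP/COEP response headers. Accessing the VM over plain `http://<external-ip>` fails the secure-context requirement even if the headers are there, which would explain why it works locally but not remotely. Here is a minimal sketch of setting those headers on a Vite dev server (assuming the app is served through Vite; the project's actual config may already differ):

```ts
// vite.config.ts — a minimal sketch, not the project's actual config.
// These two headers are what turn self.crossOriginIsolated on; without them
// (and without https:// or localhost), SharedArrayBuffer cannot be transferred
// to the worker and the bolt shell fails to spawn.
import { defineConfig } from "vite";

const isolationHeaders = {
  "Cross-Origin-Opener-Policy": "same-origin",
  "Cross-Origin-Embedder-Policy": "require-corp",
};

export default defineConfig({
  server: { headers: isolationHeaders },   // dev server
  preview: { headers: isolationHeaders },  // vite preview
});
```

If a reverse proxy or load balancer sits in front of the VM, it has to pass these headers through (or add them itself), and the site needs to be served over HTTPS. A quick workaround is an SSH tunnel, e.g. `ssh -L 5173:localhost:5173 user@vm` (port is an assumption; use whatever the app listens on), so the browser loads the app from localhost, which counts as a secure context. You can confirm the state by typing `self.crossOriginIsolated` in the browser console.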