To establish a base, I have a local machine with some 4070s running Ubuntu, fresh install. I was able to successfully install Ollama (just the one-line install, not Docker) and run it from the CLI and with my Open WebUI, as well as other similar interfaces. So to my knowledge, my Ollama and all the models run fine.
I've tried to install bolt.new-any-llm in every flavor there is, from Docker to pnpm. I've even looked at a few issues on GitHub that seemed similar and tried anything that looked relevant. I'm not super code/dev savvy, but I've managed with other things so far, so I'm not sure why I can't get this one to run.
But whatever I do, Ollama never shows any models when I launch the site locally.
Can anybody give me any kind of direction on what I may be missing?
So, small update. I read another post that used the machine's local IP, so I replaced the IP in .env.local with the network IP of the machine and the models loaded up. But when I ask it to create something, I now get:
Failed to spawn bolt shell
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.
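For anyone else stuck on the empty model list: the fix above boils down to pointing bolt's Ollama base URL at the machine's LAN address instead of localhost. A minimal sketch of the .env.local line, assuming the variable is named OLLAMA_API_BASE_URL (check the repo's .env.example) and using 192.168.1.50 as a placeholder LAN IP:

```
# .env.local sketch only; replace 192.168.1.50 with your machine's actual LAN IP.
# Using the LAN IP instead of localhost/127.0.0.1 lets the app (and any Docker
# container it runs in) reach the Ollama server running on the host.
OLLAMA_API_BASE_URL=http://192.168.1.50:11434
```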
@dustinwloring1988 - I'm hitting the same error. I have bolt.diy running on a VM on AWS/GCP, but when I try to create something I get the same error @cylver mentioned.
@wahi80, I was able to self-host and run this on Runpod, as they have an Ollama template that might help. I will post a video on how to set them up. An A40 (48 GB VRAM) is only $0.39 an hour to run.
That would be awesome! I'm able to set up the VM and get the app up and running, but I'm unable to get past:
Failed to execute 'postMessage' on 'Worker': SharedArrayBuffer transfer requires self.crossOriginIsolated.
The VM has an external IP, which I'm trying to access from my local machine. Maybe some CORS setting is messing it up. When I run it locally, there are no issues.
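A note on that error: the WebContainer that bolt spawns needs the page to be cross-origin isolated, which browsers only grant when the page comes from a secure context (HTTPS, or localhost) and is served with `Cross-Origin-Opener-Policy: same-origin` and `Cross-Origin-Embedder-Policy: require-corp` headers. That matches the symptom here: localhost counts as a secure context, while the VM's plain-HTTP external IP does not. A rough sketch of putting an nginx reverse proxy with TLS in front of the app and adding those headers (the server name, certificate paths, and upstream port are placeholders; if the app already sets the headers itself, the proxy mainly needs to provide the HTTPS part):

```
# Sketch only: server_name, certificate paths, and the upstream port are placeholders.
server {
    listen 443 ssl;
    server_name bolt.example.com;
    ssl_certificate     /etc/ssl/bolt.crt;
    ssl_certificate_key /etc/ssl/bolt.key;

    location / {
        # Headers required for crossOriginIsolated / SharedArrayBuffer
        add_header Cross-Origin-Opener-Policy same-origin always;
        add_header Cross-Origin-Embedder-Policy require-corp always;
        proxy_pass http://127.0.0.1:5173;
    }
}
```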
@cylver what errors do you get? Also, you don't need the Ollama and Open WebUI containers. Just remove them from the file, or copy the bolt containers into your own docker-compose.
It should work fine. If not, let me know, because this will maybe go into the official docker-compose file soon, so it would be good to get feedback if something isn't working for you.
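For reference, a minimal sketch of what the trimmed docker-compose file could look like with only the bolt service left (service name, port, and env file are assumptions, not the repo's actual file):

```yaml
# Sketch only: adjust service name, port mapping, and env file to match the repo.
services:
  bolt:
    build: .            # build from the repo's Dockerfile
    ports:
      - "5173:5173"     # expose the app on the host
    env_file:
      - .env.local      # carries OLLAMA_API_BASE_URL and friends
```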
OK, I removed the other two from the Docker setup and I'm trying to get it to use my Ollama, but it asks for an API key? My Ollama is hosted on the same machine and it just uses the :11434 URL, so I'm not sure what to do there. I entered the URL into .env.local as well, so I'm not sure.
You can ignore the API key. This also isn't anything Docker-specific. If you are on the main branch, it is already gone.
I would recommend trying the main branch, because there were also a lot of fixes for Docker there. At the latest, in a few weeks when stable gets a new release, it will be good to go again.
Maybe I'll try the main branch again to see if anything has changed, because this is usually where I get stopped: bolt can't communicate with my Ollama, and when it does work, the only workaround is telling it to use my local IP instead of localhost or 127.0.0.1. Either way, bolt just can't find my Ollama.
Very simple rig, just a Linux computer with an AMD GPU. Open WebUI and Ollama are already installed and working, nothing special really. If you want anything specific, I can add more detail.
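One likely reason bolt inside Docker can't see Ollama on localhost: inside a container, localhost/127.0.0.1 is the container itself, not the host, and a default Ollama install only listens on 127.0.0.1. Here is a sketch of making Ollama listen on all interfaces, assuming it was installed with the one-line script and runs as the `ollama` systemd service; after this, the machine's LAN IP (or host.docker.internal on Docker Desktop) should be reachable from the container:

```bash
# Sketch only: makes the ollama systemd service bind to all interfaces.
sudo mkdir -p /etc/systemd/system/ollama.service.d
printf '[Service]\nEnvironment="OLLAMA_HOST=0.0.0.0"\n' | \
  sudo tee /etc/systemd/system/ollama.service.d/override.conf
sudo systemctl daemon-reload
sudo systemctl restart ollama
```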