I have tried bolt.new and it's working, but now I need a solution for the LLM. Paying for the API with Claude or OpenAI is not an option, it's too expensive. I tried OpenRouter, same problem: tokens just get burned like a wildfire.
So I tried Colab + Ollama with the ngrok module, but the link I got works in the terminal and not in bolt.new.
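For reference, a minimal sketch of that kind of Colab setup (pyngrok and the host-header rewrite are assumptions on my part, not necessarily what was actually run):

```python
# Sketch of the Colab side: serve Ollama locally and expose it via ngrok.
# Assumes Ollama is already installed in the Colab VM and that pyngrok
# is pip-installed; both are assumptions, not details from this thread.
import subprocess
from pyngrok import ngrok

# Start the Ollama server in the background on its default port (11434).
subprocess.Popen(["ollama", "serve"])

# Authenticate the ngrok agent (token from the ngrok dashboard).
ngrok.set_auth_token("YOUR_NGROK_AUTH_TOKEN")

# Tunnel port 11434. Rewriting the Host header to localhost is a common
# workaround, since Ollama tends to reject requests addressed to an
# unexpected hostname.
tunnel = ngrok.connect(11434, host_header="localhost:11434")
print("Public Ollama URL:", tunnel.public_url)
```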
The .env file contains the OpenAI and OpenRouter API keys and the Ollama ngrok URL. OpenAI and OpenRouter work, but Ollama shows an error.
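The relevant entries look something like this (variable names here are the ones from bolt.diy's .env.example as I remember them, so double-check against your copy):

```
OPENAI_API_KEY=sk-...
OPEN_ROUTER_API_KEY=sk-or-...
# The Ollama entry is the public ngrok URL, no trailing slash:
OLLAMA_API_BASE_URL=https://xxxx-xx-xx.ngrok-free.app
```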
Any idea if some modification is needed? Or maybe you know of a ready-made guide you'd be kind enough to share?
Try setting the Ollama URL in the settings tab.
Tried that. The request is sent to Colab and I can see it arrive there, but bolt doesn't seem able to process the response.
I'm running bolt in Docker.
Never tried this kind of setup; I'd need to replicate the scenario to understand it.
I don't think it matters how you run bolt, unless maybe you are running it in the Colab itself, in which case it should basically work the same (that might be the easiest option).
Otherwise, you will need to modify your .env.local (or configure it in the settings tab) to point to a public IP address and port. The Colab instance would need to be accessible from the web, and you would need to get its address. Then confirm the Ollama endpoint is actually working.
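A quick way to confirm that is to hit Ollama's model-listing route on the public URL. A minimal check using only the Python standard library (the URL is a placeholder):

```python
import json
import urllib.request

# Placeholder: replace with the public ngrok URL from the tunnel.
OLLAMA_URL = "https://xxxx-xx-xx.ngrok-free.app"

# /api/tags lists the models the Ollama server has pulled;
# a valid JSON response means the endpoint is reachable.
with urllib.request.urlopen(f"{OLLAMA_URL}/api/tags", timeout=10) as resp:
    models = json.load(resp)

print([m["name"] for m in models.get("models", [])])
```

If that works from your host but bolt still fails inside Docker, run the same check from inside the container to rule out a networking issue.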
I would need to test this out to be more helpful.