How to combine this Colab Ollama with Bolt?

Running Ollama on Google Colab (GitHub)

So I tested it and it works great in the terminal, but connecting it to Bolt doesn't work: the request from Bolt is sent to Colab, but Bolt shows an error.

Also, how do I increase the input/output token limits for Ollama on Colab?


If I'm understanding correctly (I looked at the notebook), you would need to either install Ollama locally, or point the `.env.local` at a public IP address and port where Ollama is running.
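Since Colab doesn't expose ports to the public internet directly, one common approach is to tunnel the Ollama port out and point bolt.diy's `.env.local` at the tunnel URL. A minimal sketch (the tunnel tool and model setup are assumptions; `OLLAMA_HOST`/`OLLAMA_ORIGINS` are standard Ollama environment variables, and `OLLAMA_API_BASE_URL` is the variable bolt.diy's `.env.local` uses for the Ollama provider):

```shell
# In the Colab notebook: start Ollama listening on all interfaces
# and allow cross-origin requests from the browser-based Bolt UI.
OLLAMA_HOST=0.0.0.0 OLLAMA_ORIGINS='*' ollama serve &

# Expose Ollama's default port (11434) publicly, e.g. with cloudflared
# (prints a https://<something>.trycloudflare.com URL):
cloudflared tunnel --url http://localhost:11434

# Then, in bolt.diy's .env.local on your machine, point the Ollama
# provider at that tunnel URL (no trailing slash):
# OLLAMA_API_BASE_URL=https://<your-tunnel-host>
```

If Bolt still errors while the terminal works, the usual culprits are a missing `OLLAMA_ORIGINS` setting (CORS rejection in the browser) or a trailing slash in the base URL.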

And tokens default to 8000 in bolt.diy, which seems to be required for reliable output and interaction with files.
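For the context-window side of the question: Ollama's context length is a per-model parameter (`num_ctx`), which you can raise by building a model variant from a Modelfile in the Colab session. A sketch, assuming a model that actually supports the larger window (the model and variant names here are just examples; bolt.diy versions that support it may also read a `DEFAULT_NUM_CTX` value from `.env.local`, so check your copy of `.env.example`):

```shell
# Write a Modelfile that bumps the context window.
# num_ctx is a real Ollama PARAMETER; 32768 must not exceed what the
# base model was trained for, and larger values use more Colab RAM/VRAM.
cat > Modelfile <<'EOF'
FROM qwen2.5-coder:7b
PARAMETER num_ctx 32768
EOF

# Build the variant and verify it exists:
ollama create qwen2.5-coder-32k -f Modelfile
ollama list

# Then pick "qwen2.5-coder-32k" as the model inside bolt.diy.
```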