Ollama integration, if not used yet!

BTW, the bolt.diy team should use Ollama for their LLM integration, if they haven't already. And it's open-source too - GitHub - ollama/ollama: Get up and running with Llama 3.3, Mistral, Gemma 2, and other large language models. :thinking: :grinning: :wink:

- and this isn't even half of the model list

Hi @parthgajera320,

Not sure what you mean. Ollama has been one of the many providers you can use with bolt.diy for a long time.

You just have to configure the base URL in the settings.
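For anyone else setting this up, a minimal sketch of the environment-file route, assuming your bolt.diy version reads an `OLLAMA_API_BASE_URL` variable from `.env.local` (check the `.env.example` shipped with your copy for the exact variable name):

```shell
# .env.local - point bolt.diy at a locally running Ollama server.
# Ollama listens on port 11434 by default.
OLLAMA_API_BASE_URL=http://127.0.0.1:11434
```

With Ollama running (`ollama serve`) and at least one model pulled (e.g. `ollama pull llama3.3`), the local models should then show up under the Ollama provider in the model picker.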

Ohh!! Sorry, I didn't know that.
