Ollama model with error

Hello guys, I’m having a problem when I run Ollama models: it always shows a message in the bottom-right corner saying there was an error processing my request.

I honestly don’t know what else to do.

(Screenshot from 2024-11-19 15-18-25)

  1. Where are you running oTToDev, locally or in Docker?
  2. Same question for Ollama: locally or in Docker?

I would check the Chrome Dev Tools → Network tab for any indication that the requests are failing; that usually means Ollama is not accessible from the frontend.
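A common cause when mixing Docker and a host-installed Ollama is that `localhost` inside the container refers to the container itself. A minimal sketch of the fix, assuming the app reads its Ollama base URL from an env variable (the name `OLLAMA_API_BASE_URL` is taken from the project's `.env.example`; check yours):

```bash
# In the oTToDev .env file: point the app at the Docker host,
# not at localhost, if Ollama runs on the host machine.
OLLAMA_API_BASE_URL=http://host.docker.internal:11434

# And make sure Ollama itself accepts connections from other
# hosts/containers instead of binding only to 127.0.0.1:
OLLAMA_HOST=0.0.0.0 ollama serve
```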

I ran into the same issue with Ollama. It turned out I didn’t have enough memory for the model I was trying to use; the error went away when I switched to a smaller model. It’s also helpful to verify that the Ollama API is working by calling it with curl on the command line, for example:
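Two quick checks against Ollama’s standard HTTP API (the model name below is just an example; swap in one you have actually pulled):

```bash
# Confirm the server is up and list the models installed locally:
curl http://localhost:11434/api/tags

# Send a minimal generation request; if this fails or the process
# gets killed, the model may be too large for your available memory.
curl http://localhost:11434/api/generate \
  -d '{"model": "llama3.2", "prompt": "Say hello", "stream": false}'
```

If the curl calls succeed but the app still errors out, the problem is between the frontend and Ollama (URL, CORS, or Docker networking) rather than Ollama itself.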