Hello guys, I’m having a problem when I run Ollama models: a message keeps appearing in the corner at the bottom right saying there was an error processing my request.
I don’t know what else to try at this point.
I would check the Network tab in Chrome DevTools for any indication that requests are failing; this usually means that Ollama is not accessible from the frontend.
I ran into the same issue with Ollama. It turned out I didn’t have enough memory for the model I was using; the error went away when I switched to a smaller model. It’s also helpful to verify that the Ollama API is working by using curl on the command line.
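To expand on the curl suggestion, here is a quick sketch of how you can check each layer, assuming Ollama is running on its default port 11434; the model name `llama3` is just an example, substitute one from your own `/api/tags` output:

```shell
# 1. Check that the Ollama server is reachable at all.
#    A healthy server replies with the text "Ollama is running".
curl http://localhost:11434/

# 2. List the models that are actually installed locally.
#    If the model your frontend requests is not in this list,
#    that alone will cause request errors.
curl http://localhost:11434/api/tags

# 3. Send a minimal generation request directly to the API.
#    If this fails or hangs while step 1 succeeded, the problem is
#    often the model itself (e.g. not enough RAM/VRAM to load it).
curl http://localhost:11434/api/generate -d '{
  "model": "llama3",
  "prompt": "Say hello",
  "stream": false
}'
```

If step 3 works from the command line but the frontend still shows the error, the issue is between the frontend and Ollama (URL, port, or CORS) rather than the model.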