Which Ollama models can we use?

Hi,
Only Llama or Qwen?

Thanks

Hi @algowifi,

You just need to make sure the model supports "tools". Other than that, I guess you can try any model that meets this requirement.
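One way to check for "tools" support is to query a locally running Ollama server's `/api/show` endpoint, whose response includes a capabilities list in recent Ollama versions. A minimal sketch (the model name `llama3.1` is just an example, and this assumes Ollama is serving on the default port):

```shell
# Ask the local Ollama server for a model's metadata and pull out the
# "capabilities" field; a tool-capable model lists "tools" there.
curl -s http://localhost:11434/api/show -d '{"model": "llama3.1"}' \
  | grep -o '"capabilities":\[[^]]*\]'
```

Recent versions of the `ollama` CLI also print a "Capabilities" section in the output of `ollama show <model>`.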

I quickly showed it here:


I have installed LocalAI on my Mac M4 and it works perfectly. 🙂
