This wasn’t built with Bolt but was mostly done using AI tools (Cline and Sonnet 3.5). I originally made this for my own purposes but decided to share it in case anyone else can make use of it.
The main features as it stands are:
Connect to multiple Ollama endpoints simultaneously (see the sketch after this list)
Web-based interface for model management
Support for both local and remote Ollama instances
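To give an idea of what "multiple endpoints" means in practice, here is a minimal sketch (not the project's actual code) of polling several Ollama instances for their installed models via the standard `/api/tags` route. The endpoint URLs are placeholder assumptions; a local and a remote instance are just examples.

```python
import requests

# Placeholder endpoints: one local instance and one remote instance on the LAN.
ENDPOINTS = [
    "http://localhost:11434",
    "http://192.168.1.50:11434",
]

def list_models(base_url: str) -> list[str]:
    """Return the model names an Ollama endpoint reports via /api/tags."""
    resp = requests.get(f"{base_url}/api/tags", timeout=5)
    resp.raise_for_status()
    return [m["name"] for m in resp.json().get("models", [])]

for url in ENDPOINTS:
    try:
        print(url, "->", list_models(url))
    except requests.RequestException as exc:
        print(url, "-> unreachable:", exc)
```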
That's pretty nice. Can't wait to do a video on this as soon as we've got it ready.
We should test some models and provide configurations/prompts that work best with these models, because if we push such features it increasingly gives the impression that local models work well, which at the moment is not the case.
Maybe we then highlight this / add an info box there with models that work well and the settings/prompts that work for them.