Hey @ColeMedin,
Can you please check out this site? I hope it's an interesting site for LLMs, and it would be great if you could add it to your otto dev project.
Thanks for sharing this @thinkassist2! Have you tried this platform out yourself yet? I’m curious what you think!
I visited the site and tried out some of the models there. It seems promising to me, and I think it would be a great addition to your current bolt.new-any-llm project.
It would be especially helpful for people like me who don't have a powerful GPU to run large language models locally.
I've also shared a video where I first learned about this feature.
I'm new to this field, so I may lack some technical knowledge, but I hope you'll consider this suggestion.
Thank you for sharing!