The Code LLM of Choice?

I used Qwen 2.5 32B.
As you can imagine, that didn't go very far. It wanted to do something, but it was painfully slow. So now I'm on the hunt.
I'm about to test out SmolLM2 and see how well it can do anything. I've honestly spent too much money on credits across multiple LLMs, so now I'm looking into what's out there that is open source, because my bank account will appreciate it.

Have you had good experience with any open-source LLMs for our use cases in oTToDev? I'm aware some APIs are available to use for free, but I'd really prefer to run locally, though.

Look forward to hearing what you’ve found. Thanks in advance!


I have not yet. The actual Bolt.new has outperformed everything I have thrown at this by a long shot so far.
But for smaller-scope items, it could probably be pretty effective.
You can try Mixtral, CodeLlama, or Qwen.


I have had a TON of luck using Qwen 2.5 Coder. The 7B-parameter model has been around for a while, and the 32B-parameter model was just released yesterday!


Hey Cole, I sent you a message on your new website. Please go check it; I hope to hear from you soon. There's something I've been working on that needs to be executed in a way that no one will owe on it, and after seeing you live the other night, I think you're the man with the heart to pull this off. Let's get to work!

Hey @jaystonez, did you send me an email? I don’t have a new site up unless you are referring to oTTomator!

Contact Us – Classes and Education – oTTomator. Shoot me an email and I can send contact information there.

Oh, reply with an email and I'll send one back. 🚀🚀🚀

@jaystonez Feel free to send an email to cole@dynamous.ai!

Hey Cole, I sent you a message marked important.

Which exact version are you using? I have noticed that with coding you really need to use at least a q8 quantization; anything with higher quantization loss is going to yield nothing but bad results. And to answer your question, qwen2.5-coder:32b-instruct-q8_0 seems to be the best right now, by a considerable margin.
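
If it helps anyone sanity-check that exact tag locally, here's a minimal sketch of querying it through a local Ollama server's /api/generate endpoint from Python. It assumes Ollama is running on its default port (11434) and that you've already pulled qwen2.5-coder:32b-instruct-q8_0; the prompt and function name are just for illustration.

```python
# Minimal sketch: query a locally running Ollama instance.
# Assumes the default Ollama port (11434) and that
# qwen2.5-coder:32b-instruct-q8_0 has already been pulled.
import json
import urllib.request


def ask_local_qwen(prompt: str) -> str:
    """Send a single non-streaming generation request to local Ollama."""
    payload = json.dumps({
        "model": "qwen2.5-coder:32b-instruct-q8_0",
        "prompt": prompt,
        "stream": False,  # return one JSON object instead of a token stream
    }).encode("utf-8")
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        # Non-streaming responses carry the full completion in "response".
        return json.loads(resp.read())["response"]


if __name__ == "__main__":
    print(ask_local_qwen("Write a Python function that reverses a string."))
```

Setting "stream" to False keeps the example simple; in practice you'd likely stream tokens for a model this large, since q8 at 32B is slow on most consumer hardware.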