Not strictly hardware intensive, per se. It depends on what you are asking it to do, because the actual work is still processed on your machine. What branch are you using (stable)? And have you updated your deployment recently? There have been a lot of updates.
Can you test one of my instances (it’s up to date and uses the stable branch)?
Update: I’m actually seeing the same behavior, even after turning off all the extra providers (including local) in the settings and clearing the chat history. And all of my token usage is returning NaN.
Update #2: Not sure what the issue was, but I ran it in incognito mode, disabled all the extra options I didn’t want, and it worked fine.
Still seems to be a problem, and I’m not sure what the issue really is. I wonder if the 8K context window is being reached or the response is being cut off abruptly. I do have a lot of console errors going on (the usual ones about local providers not being found, even though they are “disabled” in settings).
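If it helps anyone else debug, here’s a rough sketch of how I’d sanity-check whether the chat history is anywhere near an 8K window before sending. The 4-characters-per-token ratio is just a crude approximation, not the real tokenizer, and the 8192 limit is an assumption:

```ts
// Crude check for whether a chat history is close to an 8K-token context window.
// Assumes ~4 characters per token, which is only a ballpark, not a real tokenizer.
interface ChatMessage {
  role: 'system' | 'user' | 'assistant';
  content: string;
}

const CONTEXT_LIMIT = 8192;   // assumed window size
const CHARS_PER_TOKEN = 4;    // rough average for English text

function estimateTokens(messages: ChatMessage[]): number {
  const chars = messages.reduce((sum, m) => sum + m.content.length, 0);
  return Math.ceil(chars / CHARS_PER_TOKEN);
}

function checkContextBudget(messages: ChatMessage[], maxResponseTokens = 1024): void {
  const promptTokens = estimateTokens(messages);
  const remaining = CONTEXT_LIMIT - promptTokens - maxResponseTokens;
  console.log(`~${promptTokens} prompt tokens, ~${remaining} tokens left for the response`);
  if (remaining <= 0) {
    console.warn('Likely over the context window; the response may get cut off.');
  }
}
```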
It works fine at first, but if you keep prompting the AI it eventually craps out and tells me some limit or something is being reached. Idk.
I will play with this some more later and try to figure out at least why it is happening.
In my case, the chat would just stop, or if it was on a step it would just keep spinning. I couldn’t tell whether it was still updating files, since scrolling basically didn’t work in the preview. I was mostly using DeepSeek-V3 through OpenRouter.
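To rule out OpenRouter itself, this is a rough sketch of how I’d hit the API directly and check the finish reason (assuming the standard OpenAI-compatible endpoint and the `deepseek/deepseek-chat` model id; adjust to whatever your setup actually uses):

```ts
// Standalone check: call OpenRouter directly and inspect finish_reason.
// 'stop' means the model finished normally; 'length' means it hit a token limit.
// The model id is an assumption for DeepSeek-V3 on OpenRouter.
async function testOpenRouter(apiKey: string): Promise<void> {
  const res = await fetch('https://openrouter.ai/api/v1/chat/completions', {
    method: 'POST',
    headers: {
      'Authorization': `Bearer ${apiKey}`,
      'Content-Type': 'application/json',
    },
    body: JSON.stringify({
      model: 'deepseek/deepseek-chat',
      messages: [{ role: 'user', content: 'Write a long README for a todo app.' }],
    }),
  });

  const data = await res.json();
  const choice = data.choices?.[0];
  console.log('finish_reason:', choice?.finish_reason); // 'length' means the reply was truncated
  console.log('usage:', data.usage);                    // should be real numbers, not NaN
}

testOpenRouter(process.env.OPENROUTER_API_KEY ?? '');
```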
I’ve been watching the console while it happens. If I clear everything it seems good for a little while, but then when it gets “stuck” it logs `Token usage: {completionTokens: NaN, promptTokens: NaN, totalTokens: NaN}`. Basically, it appears to simply close the connection abruptly.
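For what it’s worth, NaN totals like that usually mean the usage fields were missing or undefined when the numbers were added up. This is just an illustration of the JavaScript behavior, not Bolt.diy’s actual code:

```ts
// Illustration only: how token totals can become NaN when a stream ends abruptly.
// If the connection closes before the provider sends its usage data, the final
// usage object is empty, and adding undefined to a number in JS yields NaN.
type Usage = { completionTokens?: number; promptTokens?: number; totalTokens?: number };

function accumulateUsage(totals: Required<Usage>, final: Usage): Required<Usage> {
  return {
    completionTokens: totals.completionTokens + (final.completionTokens as number), // NaN if missing
    promptTokens: totals.promptTokens + (final.promptTokens as number),
    totalTokens: totals.totalTokens + (final.totalTokens as number),
  };
}

// Connection closed before the provider reported usage, so the final chunk is empty:
console.log('Token usage:', accumulateUsage({ completionTokens: 0, promptTokens: 0, totalTokens: 0 }, {}));
// -> Token usage: { completionTokens: NaN, promptTokens: NaN, totalTokens: NaN }
```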
@aliasfox Definitely still an issue! Extremely frustrating!
I see similar errors being reported on GitHub too! Their description is better than mine: it seems to be overwriting the existing files from scratch every time!
I have tried multiple different models and seem to be experiencing the same issue.
Running on the Stable branch, with the latest updates.
I’ve given up using Bolt.diy until a fix can be found.
Yesterday I pulled the latest main branch, which included PR #1006. It seemed promising but didn’t seem to do much; the issue still persists. As of today I haven’t tested yet, but I noticed the main branch was a bit behind, so it looks like some new PRs were merged (stable is the same). I don’t imagine it will be any better, though.