Hey everybody, I hope somebody knows how to deal with this.
I installed it and ran it on Windows 11 with Docker. Entering the web interface worked, and with specific prompts it even generated the project structure and code. But there's one thing failing and I don't know how to fix it.
Create package.json
Create src/index.js
Start Application (this is the part that's not working; it gets a red "X")
The preview won't load, and commands like npm run dev don't work.
If you get an error saying "vite" isn't installed, that means the LLM hallucinated and didn't install Vite (npm install vite) before running a Vite command.
The WebContainer is a different environment than your computer, so Vite isn't found there even though it's installed in your own terminal.
Try running "npm install vite" in the "bolt terminal" in the oTToDev interface. Then the "npm run dev" command will work.
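For reference, the underlying issue is usually that the generated package.json doesn't declare Vite at all. A minimal sketch of what it should contain (the name and version here are illustrative, not from the generated project) so that a plain `npm install` pulls Vite in before `npm run dev` ever runs:

```json
{
  "name": "my-app",
  "scripts": {
    "dev": "vite"
  },
  "devDependencies": {
    "vite": "^5.0.0"
  }
}
```

If the LLM's package.json has the `dev` script but no `vite` entry under `devDependencies`, adding one like this avoids having to install it manually each time.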
Is there a way to have Vite permanently installed in the WebContainer environment, so I don't need to run these commands manually?
For this test I used Ollama deepseek-coder-v2:bolt (:latest is the original name), with the Modelfile modified as described in the wiki "Super Important Note on Running Ollama Models".
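For anyone who hasn't done the wiki step: the modification is essentially creating a custom Modelfile that raises the model's context window so bolt's large prompts fit. A sketch of what mine looks like (the num_ctx value is what I used; adjust for your hardware):

```
FROM deepseek-coder-v2:latest
PARAMETER num_ctx 32768
```

Then build the tagged model with `ollama create deepseek-coder-v2:bolt -f Modelfile` and select it in the oTToDev model dropdown.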
Which Ollama model do you like the most, or which has achieved the best compatibility with bolt?
The next weird thing I noticed is that the generated code no longer seems to be placed into the project files, but only appears in the prompt answer itself.
No, that isn't possible, but generally the LLM will install Vite as part of the build process. It's only in rare cases that it forgets to do that and you have to do it yourself.
The best local LLM I have found is Qwen 2.5 Coder 32B. I would try that! It's also available through HuggingFace or OpenRouter if it's too big to run on your machine.
Larger models are also less apt to do weird things like putting the code in the chat, as you saw there.