bolt.diy is a revolutionary open-source AI coding assistant that enables full-stack web development directly in your browser. It’s the community version of bolt.new that you can run yourself (simple instructions below!) and allows you to choose from multiple LLM providers for each prompt, including:
OpenAI
Anthropic
Ollama
OpenRouter
Gemini
LMStudio
Mistral
xAI
HuggingFace
DeepSeek
Groq
And more!
What makes bolt.diy and bolt.new special is their ability to give the AI complete control over the entire development environment. Getting started locally takes just a few commands:
# Clone the repository
git clone https://github.com/stackblitz-labs/bolt.diy
# Navigate to the project directory
cd bolt.diy
# Install pnpm globally
npm install -g pnpm
# Install dependencies
pnpm install
# Start the development server
pnpm run dev
That’s it! Your bolt.diy instance will be up and running locally, and you can set your LLM API keys within the interface.
bolt.diy can also be installed with Docker; instructions for that are in the repository.
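For the Docker route, a minimal sketch is below. The build target and compose profile names here are assumptions based on typical bolt.diy setups; check the repository's README for the exact, current invocation:

```shell
# Build the development image (target name may differ in newer versions)
docker build . --target bolt-ai-development

# Start the dev container via the corresponding compose profile
docker compose --profile development up
```

The compose profile keeps the development and production configurations in one file, so switching to a production build is just a different profile name.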
Get Involved
The bolt.diy community is growing rapidly and it’s the perfect time to get involved!
Join the Community
Join our official community here at the oTTomator Think Tank! Don’t be a lurker!
Participate in discussions around new features, get help with troubleshooting, share what you’ve built with bolt.diy, and get inspired with the incredible things people are building with it!
bolt.diy is actively evolving with several exciting goals on the horizon:
Current Priorities
Implementing file locking and diffs to prevent unnecessary file rewrites
Enhancing prompting capabilities for smaller LLMs
Backend agents instead of single model calls (optional feature)
Future Roadmap
Have the LLM plan the project in MD files for better transparency
VSCode integration with git-like confirmations
Document upload for knowledge enhancement
Integration with more AI providers
bolt.diy started as a project by me but has evolved into a thriving community effort to build the best open-source AI coding assistant. Join us in shaping the future of AI-powered development!
Hello, I have a question. I usually use a cloud machine to run models with Ollama, so I'm wondering how I can use this tool with it. When I install Ollama on the cloud machine and set the URL to "127.0.0.1:11434", the models don't load.
Hi ColeMerlin, I can't install bolt.diy on my Win11 PC. I've tried again and again but haven't managed a successful install, even though I followed all the steps one by one.
Can you suggest anything?
PS C:\Users\Admin\bolt.diy> pnpm run dev
bolt@0.0.3 dev C:\Users\Admin\bolt.diy
node pre-start.cjs && remix vite:dev
★═══════════════════════════════════════★
B O L T . D I Y
Welcome
★═══════════════════════════════════════★
Current Commit Version: eb6d4353565be31c6e20bfca2c5aea29e4f45b6d
★═══════════════════════════════════════★
warn Data fetching is changing to a single fetch in React Router v7
┃ You can use the v3_singleFetch future flag to opt-in early.
┃ → Future Flags (v2.13.1) | Remix
┗
Error: write EOF
at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:87:19) {
errno: -4095,
code: 'EOF',
syscall: ‘write’
}
ELIFECYCLE Command failed with exit code 1.
PS C:\Users\Admin\bolt.diy>
You need to replace the localhost/127.0.0.1 with the actual public IP Address of the server/VPS, and update the value in your .env.local file (no slash at the end).
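Two things usually need to change, sketched below. The IP address is a documentation placeholder (203.0.113.10) — substitute your server's real public IP — and OLLAMA_API_BASE_URL is the variable name bolt.diy reads from .env.local:

```shell
# On the cloud machine: make Ollama listen on all interfaces instead of
# only loopback, otherwise remote clients can never reach it
OLLAMA_HOST=0.0.0.0 ollama serve

# In .env.local on the machine running bolt.diy (no trailing slash!)
OLLAMA_API_BASE_URL=http://203.0.113.10:11434
```

You'll likely also need to open port 11434 in your cloud provider's firewall or security group.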
I did it, but it seems there’s an error in the code, I tried with docker and without, it’s the same.
3:14:17 PM [vite] Internal server error: Not Found
at /bolt.diy/node_modules/.pnpm/@ai-sdk+provider-utils@1.0.20_zod@3.23.8/node_modules/@ai-sdk/provider-utils/src/response-handler.ts:72:16
at processTicksAndRejections (node:internal/process/task_queues:95:5)
at postToApi (/bolt.diy/node_modules/.pnpm/@ai-sdk+provider-utils@1.0.20_zod@3.23.8/node_modules/@ai-sdk/provider-utils/src/post-to-api.ts:81:28)
at OllamaChatLanguageModel.doStream (/bolt.diy/node_modules/.pnpm/ollama-ai-provider@0.15.2_zod@3.23.8/node_modules/ollama-ai-provider/src/ollama-chat-language-model.ts:230:50)
I got the same error as @spacecodee. I tried disabling the firewall, which didn't work, and gave permissions to Ollama and Docker, which still didn't work. I tried the above with all possible combinations.
Hi @avi00728,
please open a separate topic for it and describe your complete setup.
If Ollama is running in the cloud but bolt.diy is not (e.g. bolt is on the machine you're working from), it can be very hard to get them talking. There are a lot of problems with CORS, missing HTTPS, certificate mismatches, etc.
This error is not related to bolt.diy; PowerShell is looking for a script called npm.ps1. Nothing after that step will work because npm never ran. Make sure Node is installed correctly and its path is in your system environment variables, and maybe try running in a normal command prompt instead.
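To confirm Node and npm actually resolve before retrying, here's a quick check from PowerShell or cmd (these are standard Node.js and Windows commands, nothing bolt.diy-specific):

```shell
# Print the installed versions; an error here means the PATH is wrong
node --version
npm --version

# On Windows, show which executables are actually being picked up
where.exe node
where.exe npm
```

If `where.exe node` prints nothing, reinstall Node.js, make sure the installer's "Add to PATH" option is checked, and then open a fresh terminal so the updated PATH is loaded.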
P.S. Also, is your username really called user or did you modify the graphic?