Everything You Need to Get Started with bolt.diy

What is bolt.diy?

Visit bolt.diy to check it out!

bolt.diy is a revolutionary open-source AI coding assistant that enables full-stack web development directly in your browser. It’s the community version of bolt.new that you can run yourself (simple instructions below!), and it lets you choose from multiple LLM providers for each prompt, including:

  • OpenAI
  • Anthropic
  • Ollama
  • OpenRouter
  • Gemini
  • LMStudio
  • Mistral
  • xAI
  • HuggingFace
  • DeepSeek
  • Groq
  • And more!

What makes bolt.diy and bolt.new special is their ability to give the AI complete control over the development environment, allowing it to:

  • Install and manage packages
  • Run backend servers
  • Interact with APIs
  • Deploy applications
  • Edit code in real-time

Quick Setup

Getting started with bolt.diy is super simple:

  1. Prerequisites: Git and Node.js (the commands below use npm, which ships with Node, to install pnpm)

  2. Installation

    # Clone the repository
    git clone https://github.com/stackblitz-labs/bolt.diy
    
    # Navigate to the project directory
    cd bolt.diy
    
    # Install pnpm globally
    npm install -g pnpm
    
    # Install dependencies
    pnpm install
    
    # Start the development server
    pnpm run dev
    

That’s it! Your bolt.diy instance will be up and running locally, and you can set your LLM API keys within the interface.
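If you prefer to keep keys out of the UI, you can also put them in a .env.local file at the project root (the repository ships an example env file to copy from). A minimal sketch, assuming the variable names below still match the repository’s example file:

    # .env.local, copied from the repository's example env file
    # Variable names here are illustrative; check the repo's .env.example for the exact keys
    OPENAI_API_KEY=sk-...
    ANTHROPIC_API_KEY=sk-ant-...
    GROQ_API_KEY=gsk_...
    
    # Local providers take a base URL instead of an API key
    OLLAMA_API_BASE_URL=http://127.0.0.1:11434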

bolt.diy can also be installed with Docker; instructions for that are in the repository.
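As a rough sketch of what the Docker route looks like (the build target and compose profile names here are assumptions, so treat the repository README as authoritative):

    # Build the development image and start it via Docker Compose
    # (target/profile names are assumptions; see the repo's Dockerfile and compose file)
    docker build . --target bolt-ai-development
    docker compose --profile development up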

Get Involved

The bolt.diy community is growing rapidly and it’s the perfect time to get involved!

  1. Join the Community

    • Join our official community here at the oTTomator Think Tank! Don’t be a lurker! :slight_smile:
    • Participate in discussions around new features, get help with troubleshooting, share what you’ve built with bolt.diy, and get inspired by the incredible things people are building with it!
  2. Contribute

    • Fork the repository
    • Pick up open issues or create new features
    • Submit pull requests
    • Help improve documentation
    • Check out our Contribution Guide

Goals and Future

bolt.diy is actively evolving with several exciting goals on the horizon:

Current Priorities

  • Implementing file locking and diffs to prevent unnecessary file rewrites
  • Enhancing prompting capabilities for smaller LLMs
  • Adding optional backend agents in place of single model calls

Future Roadmap

  • Have the LLM plan the project in Markdown files for better transparency
  • VSCode integration with git-like confirmations
  • Document upload for knowledge enhancement
  • Integration with more AI providers

bolt.diy started as a project by me but has evolved into a thriving community effort to build the best open-source AI coding assistant. Join us in shaping the future of AI-powered development!

7 Likes

Hey sir I need your help. When I run bolt.diy locally through VS Code, I get an error that says:

‘There was an error processing your request: An error occurred.’

Please help me!

2 Likes

@MotionNinja Please open your own topic for your case in [bolt.diy] Issues and Troubleshooting

Provide detailed information in it:

  • How do you run bolt (Docker, without Docker)?
  • System (Windows, Mac, …)
  • Which branch are you on (main or stable)?
  • Which provider are you using?
  • What was your prompt?
  • Are there errors in the browser dev tools console and/or the terminal/shell where you run bolt itself?
  • Provide the Debug tab output:

@ColeMedin maybe add this (an enhanced version) to your original post :slight_smile:

    {
      "System": {
        "os": "Windows",
        "browser": "Edge",
        "screen": "1536x864",
        "language": "en-US",
        "timezone": "Asia/Calcutta",
        "memory": "2.01 GB (Used: 77.37 MB)",
        "cores": 4,
        "deviceType": "Desktop",
        "colorDepth": "24-bit",
        "pixelRatio": 1.25,
        "online": true,
        "cookiesEnabled": true,
        "doNotTrack": false
      },
      "Providers": [
        {
          "name": "Ollama",
          "enabled": true,
          "isLocal": true,
          "running": false,
          "error": "No URL configured",
          "lastChecked": "2024-12-19T08:28:13.127Z",
          "url": null
        },
        {
          "name": "OpenAILike",
          "enabled": true,
          "isLocal": true,
          "running": false,
          "error": "No URL configured",
          "lastChecked": "2024-12-19T08:28:13.127Z",
          "url": null
        },
        {
          "name": "LMStudio",
          "enabled": true,
          "isLocal": true,
          "running": false,
          "lastChecked": "2024-12-19T08:28:14.181Z",
          "responseTime": 1053.785000000149,
          "url": "http://172.31.240.1:8080"
        }
      ],
      "Version": {
        "hash": "e42f407",
        "branch": "main"
      },
      "Timestamp": "2024-12-19T08:28:28.504Z"
    }

What is the error you get in the terminal where you started bolt.diy?

Yeah I agree, I’ll add this to my todo list to make a short guide on getting the debug info for troubleshooting posts!

2 Likes

Hello, I have a question. I usually use a cloud machine to run models with Ollama, so I wonder how I can use this tool with it, because when I install Ollama on the cloud machine and set the URL to something like “127.0.0.1:11434”, the models don’t load.

1 Like

Unexpected Server Error

    TypeError: Cannot read properties of null (reading 'useEffect')

I get this error; I am using the Docker version.

Hi @ColeMedin, I can’t install bolt.diy on my Win11 PC. I’ve tried again and again but haven’t gotten a successful install. I followed all the steps one by one,
but it still fails :frowning:

Can you tell me anything that might help?

    PS C:\Users\Admin\bolt.diy> pnpm run dev
    
    bolt@0.0.3 dev C:\Users\Admin\bolt.diy
    node pre-start.cjs && remix vite:dev
    
    ★═══════════════════════════════════════★
              B O L T . D I Y
             ⚡  Welcome  ⚡
    ★═══════════════════════════════════════★
    
    📍 Current Commit Version: eb6d4353565be31c6e20bfca2c5aea29e4f45b6d
    ★═══════════════════════════════════════★
    warn  Data fetching is changing to a single fetch in React Router v7
    ┃ You can use the v3_singleFetch future flag to opt-in early.
    ┃ → Future Flags (v2.13.1) | Remix
    
    Error: write EOF
        at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:87:19) {
      errno: -4095,
      code: 'EOF',
      syscall: 'write'
    }
    ELIFECYCLE  Command failed with exit code 1.
    PS C:\Users\Admin\bolt.diy>

@MegaLord
Found this: 🐛 BUG: Error: write EOF at WriteWrap.onWriteComplete [as oncomplete] (node:internal/stream_base_commons:94:16) · Issue #3698 · cloudflare/workers-sdk · GitHub

Sounds like it might be related to another issue involving your version of the Visual C++ Redistributable. It seems specific to Windows 11.

Try installing the Latest supported Visual C++ Redistributable downloads from Microsoft.

And can you please respond to confirm whether it fixed your issue or not? Thanks.

2 Likes

You need to replace localhost/127.0.0.1 with the actual public IP address of the server/VPS, and update the value in your .env.local file (no slash at the end).
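For example, if the VPS’s public IP were 203.0.113.10 (a placeholder address), the entry would look roughly like this, assuming the OLLAMA_API_BASE_URL variable name from the example env file:

    # .env.local on the machine running bolt.diy
    # 203.0.113.10 is a placeholder; use your server's real public IP, no trailing slash
    OLLAMA_API_BASE_URL=http://203.0.113.10:11434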

I did that, but it seems there’s an error in the code. I tried with Docker and without; it’s the same:

    3:14:17 PM [vite] Internal server error: Not Found
        at /bolt.diy/node_modules/.pnpm/@ai-sdk+provider-utils@1.0.20_zod@3.23.8/node_modules/@ai-sdk/provider-utils/src/response-handler.ts:72:16
        at processTicksAndRejections (node:internal/process/task_queues:95:5)
        at postToApi (/bolt.diy/node_modules/.pnpm/@ai-sdk+provider-utils@1.0.20_zod@3.23.8/node_modules/@ai-sdk/provider-utils/src/post-to-api.ts:81:28)
        at OllamaChatLanguageModel.doStream (/bolt.diy/node_modules/.pnpm/ollama-ai-provider@0.15.2_zod@3.23.8/node_modules/ollama-ai-provider/src/ollama-chat-language-model.ts:230:50)

I’m guessing the connection to your machine hosting Ollama is getting blocked. Have you done any configuring of the firewall?
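For reference, Ollama binds to 127.0.0.1 by default, so a remote bolt.diy instance can’t reach it even if the firewall allows the port. A minimal sketch of exposing it on a Linux server (assumes the standard systemd service and ufw; adapt to your setup):

    # Make Ollama listen on all interfaces instead of only localhost
    sudo systemctl edit ollama      # add: Environment="OLLAMA_HOST=0.0.0.0"
    sudo systemctl restart ollama
    
    # Open the Ollama port in the firewall (ufw shown as an example)
    sudo ufw allow 11434/tcp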

I got the same error as @spacecodee. Disabling the firewall didn’t work, and giving permissions to Ollama and Docker still didn’t work. I tried the above with all possible combinations.

Hi @avi00728,
please open a separate topic for it and describe your complete setup.
If you are running Ollama in the cloud while bolt.diy runs on a different machine than the one you use it from, it can be very hard to get working. There are a lot of problems with CORS, missing HTTPS, certificate mismatches, etc.
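On the CORS side specifically, one thing worth checking is Ollama’s allowed-origins setting; a hedged sketch (the wildcard is for testing only, tighten it for real use):

    # Allow cross-origin requests to Ollama (wildcard for testing only)
    # If Ollama runs as a systemd service, set this via `systemctl edit ollama` instead
    export OLLAMA_ORIGINS="*"
    ollama serve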

Well, Ollama works fine when I use it from any other web app or application, like AnythingLLM, on my local machine.
I think it’s a problem with the package.

@spacecodee If you want to investigate further, please open a separate topic, summarizing your case and describe your setup within [bolt.diy] Issues and Troubleshooting


Hi, I got an error when running the script from PowerShell on Win 10. I’d like to know how to solve it.

This error is not related to bolt.diy; PowerShell is looking for a script called npm.ps1. Nothing after that step will work because npm was never run. Make sure Node is installed correctly and its path is in your system environment variables. And maybe try running it in a plain command prompt instead.
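A quick way to check, as a sketch (run in PowerShell):

    # Verify Node.js and npm are installed and resolvable from PATH
    node -v
    npm -v
    Get-Command npm    # shows which npm PowerShell is actually resolving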

P.S. Also, is your username really “user”, or did you modify the graphic?

1 Like