Preview not working. LLM is asking for WebContainer URL

Hey guys,

I’m new to Bolt.new and Bolt.diy and need help displaying my app in the Preview page. I’m trying to get the LLM to show the website in the Preview page, but it is asking for my WebContainer URL and I’m not sure where to get this info. I also ran the npm install && npm run dev commands and still couldn’t get the Preview page to work.

From the LLM:
"The WebContainer’s preview mechanism is different from a typical npm run dev setup. A standard npm run dev starts a development server on your local machine, making it accessible via localhost. The WebContainer, however, runs in a sandboxed environment, and its preview functionality handles the serving and accessibility of the application differently.

The instructions you quoted assume a standard Node.js development environment where you’d run commands on your local machine to start a development server. The WebContainer, however, manages the server and preview internally. My previous responses incorrectly implied that a simple npm run dev would update the preview. That’s not how the WebContainer’s preview works.

To see your application, you must obtain the correct URL from the WebContainer interface (as explained in previous responses) and access it through your browser. There’s no direct “preview update” mechanism controlled by running npm run dev or similar commands within the WebContainer in this specific context."

Hi @SupaAgentAI,
welcome :slight_smile:

What provider and model are you running?
It sounds like it’s not working well with Bolt, as this should all work automatically.
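For background, the preview URL isn’t something you fetch by hand. Roughly speaking, it comes from the WebContainer itself: when the dev server inside the container starts listening, the container emits a `server-ready` event carrying the preview URL, and the UI points its preview iframe at it. Here’s a sketch of that flow using the public `@webcontainer/api` from StackBlitz — this is illustrative, not bolt.diy’s actual source:

```typescript
import { WebContainer } from '@webcontainer/api';

// Boot the in-browser Node.js sandbox (only one instance per page).
const webcontainer = await WebContainer.boot();

// (In a real app, project files would be mounted here via webcontainer.mount(...).)

// Install dependencies and start the dev server *inside* the container.
const install = await webcontainer.spawn('npm', ['install']);
await install.exit;
await webcontainer.spawn('npm', ['run', 'dev']);

// When the dev server starts listening, the container announces the
// preview URL itself; the app wires it into the preview iframe.
webcontainer.on('server-ready', (port, url) => {
  const iframe = document.querySelector('iframe')!;
  iframe.src = url; // a sandboxed container origin, not localhost
});
```

So if the preview stays blank, the likely culprit is that the dev server never came up inside the container (e.g. the generated code crashes on start), not a missing URL on your end.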

I was using Gemini exp 1206 and just switched to OpenAI 4o mini. Anthropic 3.5 Sonnet and OpenAI 4o are both throwing errors.

I haven’t been able to display anything in the Preview page. Here is a screenshot of my instance of bolt.diy

I may have come up with a fix, at least for Linux Mint users.
For Windows or other OSes you will have to find the equivalent of this idea.

Install a web server (choose either Apache or Nginx):

  • For Apache:
sudo apt update
sudo apt install apache2
  • For Nginx:
sudo apt update
sudo apt install nginx

This did the trick for me. I can explicitly ask the model to obtain image URLs from a site like Pexels by describing what I want, or I can go get them myself, paste the image URLs into the query window, and tell the LLM to implement them in the script.

There are two ongoing PRs that detect terminal errors and browser UI errors and feed them back to the LLM to fix; these might help you solve this issue.

There could be hundreds of different reasons for the preview not showing up, depending on what code the LLM has written.

So it’s better that the LLM solves the mess it creates :wink:

1 Like