Bolt.diy installed and tested OK, but after importing my project from bolt.new the chat no longer works in the imported project

Hi there,
I am trying to figure out what's wrong here:
I installed bolt.diy (no Docker) and tested it by creating example projects; it works great!
Now I built a project on bolt.new which I imported into bolt.diy, and after loading the project the chat does not work anymore.
The project loads correctly and the app is working.
But no chat…

When I go back and create a new chat, it works…

What could be wrong in my project that breaks bolt.diy? I tried on 2 different PCs, same issue.
Has anybody come across this? Could you help me?
Thank you in advance


Welcome @fabien,

the model you chose to run your prompts may not be able to handle your project, because it is too big.

You may also be running into an error with the newest bolt.diy version. Try using v0.0.3.

Also try Google Gemini 2.0, as it has a very large context window and strong capabilities.

You can also provide your chat (Export Chat) and/or your project export if you want. That makes it easier to verify/test what is going wrong here.
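
If you want a rough sense of whether the imported project blows past a model's context window, a quick back-of-the-envelope check outside bolt.diy is to count the characters in your project files and divide by about 4 (a common chars-per-token heuristic). This is only a sketch; the ratio, the skipped directories, and the example path are assumptions, not how bolt.diy itself measures context:

import { readdirSync, readFileSync, statSync } from 'node:fs';
import { join } from 'node:path';

// Directories that should not count toward the prompt context.
const SKIP = new Set(['node_modules', '.git', 'dist', 'build']);

// Recursively sum the character count of all files under `dir`.
function countChars(dir: string): number {
  let total = 0;
  for (const entry of readdirSync(dir)) {
    if (SKIP.has(entry)) continue;
    const path = join(dir, entry);
    if (statSync(path).isDirectory()) {
      total += countChars(path);
    } else {
      total += readFileSync(path, 'utf8').length;
    }
  }
  return total;
}

const chars = countChars('./my-imported-project'); // example path
// ~4 characters per token is a rough heuristic, not an exact tokenizer.
console.log(`~${Math.round(chars / 4)} tokens (compare against your model's context window)`);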

Hi @leex279,
I am using Anthropic Claude 3.5 Sonnet with my paid API key.
I tried with Gemini 2.0 and the paid OpenAI API too.
I will try later to get a copy of v0.0.3 to test.
For now I am stuck.
Thank you very much for further ideas and help.


Running now on 0.0.3.
Getting the same symptoms, and:
14:20:57 [vite] Internal server error: Cannot read properties of undefined (reading 'toolCalls')
at Object.flush (file:///C:/appdev/bolt.new.clone/bolt.diy/node_modules/.pnpm/ai@4.0.26_react@18.3.1_zod@3.24.1/node_modules/ai/core/generate-text/stream-text.ts:675:33)
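
(For anyone hitting the same trace: the crash means that whatever object the ai package reads .toolCalls from is undefined, which typically happens when the upstream provider request has already failed. A purely illustrative guard for that access pattern, not the actual ai-package code and not a fix for the failed request itself, would be:)

// Hypothetical shape of the failing access: `step` is undefined when the
// provider request errors out, so `step.toolCalls` throws a TypeError.
interface Step {
  toolCalls?: { name: string; args: unknown }[];
}

function collectToolCalls(step: Step | undefined) {
  // Optional chaining avoids "Cannot read properties of undefined".
  return step?.toolCalls ?? [];
}

console.log(collectToolCalls(undefined)); // []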

How did you install bolt.diy? Can you please post all your commands / terminal output?

As per the bolt.diy docs instructions:
git clone -b stable https://github.com/stackblitz-labs/bolt.diy
pnpm install
git pull origin main
pnpm install

Just to be sure, I created a new directory and did it all again. Same result.
In the browser console:
ERROR Chat Request failed (logger.ts:85)

logger.ts:
export type DebugLevel = 'trace' | 'debug' | 'info' | 'warn' | 'error';

import { Chalk } from 'chalk';

const chalk = new Chalk({ level: 3 });

type LoggerFunction = (...messages: any[]) => void;

interface Logger {
  trace: LoggerFunction;
  debug: LoggerFunction;
  info: LoggerFunction;
  warn: LoggerFunction;
  error: LoggerFunction;
  setLevel: (level: DebugLevel) => void;
}

let currentLevel: DebugLevel = (import.meta.env.VITE_LOG_LEVEL ?? import.meta.env.DEV) ? 'debug' : 'info';

export const logger: Logger = {
  trace: (...messages: any[]) => log('trace', undefined, messages),
  debug: (...messages: any[]) => log('debug', undefined, messages),
  info: (...messages: any[]) => log('info', undefined, messages),
  warn: (...messages: any[]) => log('warn', undefined, messages),
  error: (...messages: any[]) => log('error', undefined, messages),
  setLevel,
};

export function createScopedLogger(scope: string): Logger {
  return {
    trace: (...messages: any[]) => log('trace', scope, messages),
    debug: (...messages: any[]) => log('debug', scope, messages),
    info: (...messages: any[]) => log('info', scope, messages),
    warn: (...messages: any[]) => log('warn', scope, messages),
    error: (...messages: any[]) => log('error', scope, messages),
    setLevel,
  };
}

function setLevel(level: DebugLevel) {
  if ((level === 'trace' || level === 'debug') && import.meta.env.PROD) {
    return;
  }

  currentLevel = level;
}

function log(level: DebugLevel, scope: string | undefined, messages: any[]) {
  const levelOrder: DebugLevel[] = ['trace', 'debug', 'info', 'warn', 'error'];

  if (levelOrder.indexOf(level) < levelOrder.indexOf(currentLevel)) {
    return;
  }

  const allMessages = messages.reduce((acc, current) => {
    if (acc.endsWith('\n')) {
      return acc + current;
    }

    if (!acc) {
      return current;
    }

    return `${acc} ${current}`;
  }, '');

  const labelBackgroundColor = getColorForLevel(level);
  const labelTextColor = level === 'warn' ? '#000000' : '#FFFFFF';

  const labelStyles = getLabelStyles(labelBackgroundColor, labelTextColor);
  const scopeStyles = getLabelStyles('#77828D', 'white');

  const styles = [labelStyles];

  if (typeof scope === 'string') {
    styles.push('', scopeStyles);
  }

  let labelText = formatText(` ${level.toUpperCase()} `, labelTextColor, labelBackgroundColor);

  if (scope) {
    labelText = `${labelText} ${formatText(` ${scope} `, '#FFFFFF', '77828D')}`;
  }

  if (typeof window !== 'undefined') {
    console.log(`%c${level.toUpperCase()}${scope ? `%c %c${scope}` : ''}`, ...styles, allMessages);
  } else {
    console.log(`${labelText}`, allMessages);
  }
}

function formatText(text: string, color: string, bg: string) {
  return chalk.bgHex(bg)(chalk.hex(color)(text));
}

function getLabelStyles(color: string, textColor: string) {
  return `background-color: ${color}; color: white; border: 4px solid ${color}; color: ${textColor};`;
}

function getColorForLevel(level: DebugLevel): string {
  switch (level) {
    case 'trace':
    case 'debug': {
      return '#77828D';
    }
    case 'info': {
      return '#1389FD';
    }
    case 'warn': {
      return '#FFDB6C';
    }
    case 'error': {
      return '#EE4744';
    }
    default: {
      return '#000000';
    }
  }
}

export const renderLogger = createScopedLogger('Render');
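
(Side note on the trace: logger.ts:85 is only where the failure gets printed, i.e. the error method above, not where it originates. The chat client presumably calls it roughly like the sketch below; the relative import path and the 'Chat' scope name are assumptions, not the actual bolt.diy call site:)

import { createScopedLogger } from './logger'; // path depends on project layout

const logger = createScopedLogger('Chat'); // scope name is an assumption

async function sendChatRequest() {
  try {
    // ... POST the chat request to the bolt.diy backend ...
    throw new Error('simulated provider failure');
  } catch (error) {
    // This kind of call is what shows up as "ERROR Chat Request failed"
    // in the browser console; the real cause is the failed request itself.
    logger.error('Request failed:', error);
  }
}

sendChatRequest();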

Those steps seem a little off. Just try the following: remove the node_modules directory, drop the duplicate pnpm install and pull origin main:

# Clone the repository
git clone --branch stable https://github.com/stackblitz-labs/bolt.diy.git
cd bolt.diy

# Install pnpm globally (if not already installed)
npm install -g pnpm

# Install dependencies using pnpm
pnpm install

# Start the development server
pnpm run dev

All done. Same as before:
[Violation] Forced reflow while executing JavaScript took 32ms
logger.ts:85 ERROR Chat Request failed

ERROR Chat Request failed is likely an API response issue. Did you confirm that it works and you have available credits?

What provider and model are you trying to use?
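
One quick way to rule out a key/credit problem, independent of bolt.diy, is to call the provider's API directly. A minimal sketch for Anthropic's Messages API (the model id and token limit are just example values; run it with something like npx tsx after setting ANTHROPIC_API_KEY):

// Standalone check of an Anthropic API key outside bolt.diy.
const response = await fetch('https://api.anthropic.com/v1/messages', {
  method: 'POST',
  headers: {
    'x-api-key': process.env.ANTHROPIC_API_KEY ?? '',
    'anthropic-version': '2023-06-01',
    'content-type': 'application/json',
  },
  body: JSON.stringify({
    model: 'claude-3-5-sonnet-20241022', // example model id
    max_tokens: 32,
    messages: [{ role: 'user', content: 'Say hi' }],
  }),
});

// A 200 status means the key works and credits are available;
// 401/403 or a quota error points at the key or billing instead of bolt.diy.
console.log(response.status, await response.text());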


When I start a new chat, it works.
It is only when I import my project downloaded from bolt.new that the chat stops working.

Sounds like the same as here: There was an error processing your request - An error occurred - #11 by ushic

I guess the project may also be too big. If you want to share your project, I can try to verify. You can PM me if you don't want to share it publicly here.

I have run into exactly the same problem. Is there a solution for it now? Thank you.


Nice, but please write in English within this community :wink:

I also encountered the exact same problem. Is there a solution to this problem now? Thank you.

The solution at the moment is just to find a model/provider which can handle big projects, if that is really the problem.
In future releases of bolt, when diffs etc. can be handled, it should get easier, I guess.

Thank you very much for seeing our needs. I am also looking forward to that day. After all, my project still needs the help of a large model. Thank you again.
