Error [vite] Internal server error: Cannot read properties of undefined (reading 'toolCalls')

I’m using bolt and this is my debugging situation.
When I try to run “Build a to-do app” using Ollama, I get this error:
14:18:54 [vite] Internal server error: Cannot read properties of undefined (reading 'toolCalls')
at Object.flush (file:///C:/bolt.diy/node_modules/ai/core/generate-text/stream-text.ts:569:33)
at invokePromiseCallback (node:internal/webstreams/util:162:10)
at Object.<anonymous> (node:internal/webstreams/util:167:23)
at transformStreamDefaultSinkCloseAlgorithm (node:internal/webstreams/transformstream:621:43)
at node:internal/webstreams/transformstream:379:11
at writableStreamDefaultControllerProcessClose (node:internal/webstreams/writablestream:1162:28)
at writableStreamDefaultControllerAdvanceQueueIfNeeded (node:internal/webstreams/writablestream:1242:5)
at writableStreamDefaultControllerClose (node:internal/webstreams/writablestream:1209:3)
at writableStreamClose (node:internal/webstreams/writablestream:722:3)
at writableStreamDefaultWriterClose (node:internal/webstreams/writablestream:1091:10)
at writableStreamDefaultWriterCloseWithErrorPropagation (node:internal/webstreams/writablestream:1083:10)
at node:internal/webstreams/readablestream:1558:15
at complete (node:internal/webstreams/readablestream:1437:9)
at processTicksAndRejections (node:internal/process/task_queues:105:5) (x4)
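
For what it's worth, the failure mode looks like a plain unguarded property read. A minimal repro sketch (my own illustration, not the actual ai SDK source):

// Illustrative only: if the provider stream ends without recording a
// final step, reading .toolCalls off the missing step throws exactly
// this TypeError.
const steps: Array<{ toolCalls: unknown[] }> = [];
const lastStep = steps[steps.length - 1]; // undefined: no step was recorded
console.log(lastStep.toolCalls); // TypeError: Cannot read properties of undefined (reading 'toolCalls')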

Any suggestions?

One of the packages was updated to a newer version, so please do a pnpm install
and then try running the server again.
If the issue persists, delete the node_modules folder and then run pnpm install again.
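
Something like this (Windows cmd; assumes the default C:\bolt.diy checkout and that pnpm is already installed):

C:\bolt.diy> rmdir /s /q node_modules
C:\bolt.diy> pnpm install
C:\bolt.diy> pnpm run dev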

I deleted the entire node_modules directory of the project and reinstalled.
These messages appeared during the installation:
C:\bolt.diy>npm install
npm warn EBADENGINE Unsupported engine {
npm warn EBADENGINE package: '@blitz/eslint-plugin@0.1.3',
npm warn EBADENGINE required: { node: '^18.0.0 || ^20.0.0' },
npm warn EBADENGINE current: { node: 'v22.12.0', npm: '10.9.0' }
npm warn EBADENGINE }
npm warn deprecated inflight@1.0.6: This module is not supported, and leaks memory. Don't use it. Check out lru-cache if you want a good and tested way to coalesce async requests by a key value, which is much more comprehensive and powerful.
npm warn deprecated @humanwhocodes/config-array@0.13.0: Use @eslint/config-array instead
npm warn deprecated rimraf@3.0.2: Rimraf versions prior to v4 are no longer supported
npm warn deprecated rollup-plugin-inject@3.0.2: This package has been deprecated and is no longer maintained. Please use @rollup/plugin-inject.
npm warn deprecated glob@7.2.3: Glob versions prior to v9 are no longer supported
npm warn deprecated @humanwhocodes/object-schema@2.0.3: Use @eslint/object-schema instead
npm warn deprecated sourcemap-codec@1.4.8: Please use @jridgewell/sourcemap-codec instead
npm warn deprecated eslint@8.57.1: This version is no longer supported. Please see https://eslint.org/version-support for other options.

At runtime, using Ollama as before, the error is still the same.

Can you try running npm run dev now?

Unfortunately, as mentioned, the error persists and is always the same.
Checking the first line of the error (file:///C:/bolt.diy/node_modules/ai/core/generate-text/stream-text.ts:569:33), I noticed that the file is not there. The subdirectory C:/bolt.diy/node_modules/ai/core is completely missing.

Which version are you using?

I installed it today. The version is v0.0.3
Thanks for the help


I actually faced this issue before but forgot how I resolved it. It's not an issue with bolt itself but some cached installation that is messing with the app.
Trying a fresh clone in another directory may help; see the commands sketched below.
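
For example (the repo URL is my assumption; adjust it to wherever you originally cloned from):

C:\> git clone https://github.com/stackblitz-labs/bolt.diy.git bolt.diy-fresh
C:\> cd bolt.diy-fresh
C:\bolt.diy-fresh> pnpm install
C:\bolt.diy-fresh> pnpm run dev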

I will try what you suggest, but unfortunately I have already uninstalled and reinstalled and the problem is still the same.
I'll try again!
Thanks for the support

I deleted the entire directory and did a new installation, but unfortunately the problem is still the same.

Just remembered now… for me it was an API key issue; try double-checking your API key :sweat_smile:

But I use it with Ollama, so I have no API key to enter.
I just set OLLAMA_API_BASE_URL to http://localhost:11434:

# You only need this environment variable set if you want to use oLLAMA models
# EXAMPLE http://localhost:11434
OLLAMA_API_BASE_URL=http://localhost:11434

In my case, it was not working with the OpenAI key; it only worked with the Anthropic key.

In my opinion, the most interesting part of the project is using the application locally with Ollama so that you don't have to pay for any LLM. Too bad it doesn't work, for me at least. I will try to update the files.

Please use the settings window to add the URL, and use 127.0.0.1 in place of localhost.
Also, what model are you trying to use, and how much VRAM do you have?
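
For example, in the settings window or in .env.local (default Ollama port assumed):

OLLAMA_API_BASE_URL=http://127.0.0.1:11434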

Did you set the context size?

# Example Context Values for qwen2.5-coder:32b
# 
# DEFAULT_NUM_CTX=32768 # Consumes 36GB of VRAM
# DEFAULT_NUM_CTX=24576 # Consumes 32GB of VRAM
# DEFAULT_NUM_CTX=12288 # Consumes 26GB of VRAM
# DEFAULT_NUM_CTX=6144 # Consumes 24GB of VRAM
DEFAULT_NUM_CTX=
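
As I understand it, DEFAULT_NUM_CTX ends up as Ollama's num_ctx option, which sets the context window and is what drives the VRAM numbers above. A minimal sketch of what that means at the Ollama REST API level (the model name and prompt are placeholders):

// Illustrative only: num_ctx is a standard option on Ollama's
// /api/generate endpoint; larger values need more VRAM.
const response = await fetch("http://127.0.0.1:11434/api/generate", {
  method: "POST",
  headers: { "Content-Type": "application/json" },
  body: JSON.stringify({
    model: "qwen2.5-coder:32b",
    prompt: "Say hello",
    stream: false,
    options: { num_ctx: 12288 },
  }),
});
console.log((await response.json()).response);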

What's the error you are getting?

Thanks. This morning I downloaded the files again and did a fresh installation.
Now the code generation works! There is another small problem, though:
the preview page remains white.

Sounds good, so far.

Do you get any errors in the bolt-terminal and/or the dev-tools?

What model are you using?

There are no errors in the terminal, while in the dev tools I have the error: Failed to load resource: the server responded with a status of 404 (Not Found)

Which model do you use? The white preview page is usually a sign that the model is not capable of creating a working setup (the LLM is too small).

If you are on the main branch, you can try to start with a starter template (currently beta).
I quickly showcased it here: https://youtube.com/shorts/J2CNudRybxA