**Describe the bug**
I've been using Chrome Canary as my primary browser. I tried uploading files from a project I originally started with Bolt to create a new chat, hoping to avoid any issues with AI hallucinations. However, when I load the project files to start the new chat, it doesn't work. Interestingly, if I start a chat without loading any files, everything works perfectly. I'm not sure what's causing the problem.
**To reproduce the issue**
- Start a chat and create a project.
- Download the project to your desktop.
- Start a new chat by loading the downloaded project.
- Ask the LLM a question (in my case, with OpenAI and Deepseek).
- It will not process the question, and I then get an error.
*I do want to note this is not the case with Gemini 2.0 Flash.*
My API calls do work if it is a new chat without uploaded files. I also can't embed a photo, or I would provide a screenshot.
**Expected behavior**
The LLM should process questions just as it does in a new chat.
**Models used**
- gpt-4o
- deepseek-coder
```
INFO stream-text Sending llm call to Deepseek with model deepseek-coder
DEBUG api.chat usage {"promptTokens":null,"completionTokens":null,"totalTokens":null}
DEBUG api.chat usage {"promptTokens":null,"completionTokens":null,"totalTokens":null}
DEBUG api.chat usage {"promptTokens":null,"completionTokens":null,"totalTokens":null}
INFO stream-text Sending llm call to OpenAI with model gpt-4o
INFO stream-text Sending llm call to OpenAI with model gpt-4
INFO stream-text Sending llm call to OpenAI with model gpt-4o
INFO stream-text Sending llm call to OpenAI with model gpt-4o
DEBUG api.chat usage {"promptTokens":null,"completionTokens":null,"totalTokens":null}
INFO stream-text Sending llm call to OpenAI with model gpt-4o
```
Here is the debug information:
```json
{
  "System": {
    "os": "Windows",
    "browser": "Chrome 133.0.0.0",
    "screen": "2560x1440",
    "language": "en-US",
    "timezone": "America/New_York",
    "memory": "4 GB (Used: 69.45 MB)",
    "cores": 24,
    "deviceType": "Desktop",
    "colorDepth": "24-bit",
    "pixelRatio": 1,
    "online": true,
    "cookiesEnabled": true,
    "doNotTrack": false
  },
  "Providers": [
    {
      "name": "LMStudio",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-05T16:52:57.280Z",
      "url": null
    },
    {
      "name": "Ollama",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-05T16:52:57.280Z",
      "url": null
    },
    {
      "name": "OpenAILike",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-05T16:52:57.280Z",
      "url": null
    }
  ],
  "Version": {
    "hash": "be7a754",
    "branch": "stable"
  },
  "Timestamp": "2025-01-05T16:52:59.567Z"
}
```