Context Limit Issue

Hi,

This is probably a well-known issue, but let me quickly present the problem I'm running into with an application I'm developing.

When I use Bolt.diy with most of the LLMs and submit a prompt, I get the following error message: "This endpoint's maximum context length is 131072 tokens. However, you requested about 243776 tokens (235776 of text input, 8000 in the output)", which prevents me from using the best LLM models for coding.

I don't have this issue in Bolt.new, and I'm wondering why.

Is there any plan to solve this issue with Bolt.diy?

Thanks!

Hi @walkirie2,

This is already solved; you just have to enable context optimization in the settings.
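For anyone wondering what that setting actually does: context optimization trims the chat history and file context sent with each prompt so the request fits the model's window. Here's a minimal sketch of the idea in TypeScript. The `estimateTokens` heuristic and the oldest-first trimming strategy are illustrative assumptions, not bolt.diy's actual implementation; only the 131072/8000 figures come from the error message above.

```ts
// Sketch of context trimming: drop the oldest non-system messages until the
// prompt fits the model's context window, leaving room for the reply.
// estimateTokens is a rough stand-in, not bolt.diy's actual tokenizer.

interface Message {
  role: "system" | "user" | "assistant";
  content: string;
}

const MAX_CONTEXT = 131072;  // endpoint's maximum context length (from the error)
const MAX_OUTPUT = 8000;     // tokens reserved for the model's answer
const INPUT_BUDGET = MAX_CONTEXT - MAX_OUTPUT;

// Crude heuristic: roughly 4 characters per token for English text and code.
function estimateTokens(text: string): number {
  return Math.ceil(text.length / 4);
}

function trimToBudget(messages: Message[]): Message[] {
  const trimmed = [...messages];
  let total = trimmed.reduce((sum, m) => sum + estimateTokens(m.content), 0);

  // Drop the oldest non-system messages first; keep the system prompt
  // and the most recent message intact.
  while (total > INPUT_BUDGET && trimmed.length > 2) {
    const i = trimmed.findIndex((m) => m.role !== "system");
    if (i === -1 || i === trimmed.length - 1) break;
    total -= estimateTokens(trimmed[i].content);
    trimmed.splice(i, 1);
  }
  return trimmed;
}
```

Without something like this, the full chat history plus project files gets resent on every turn, which is how a request balloons to ~243k tokens against a 131k window.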

Also see this to understand bolt.diy vs. bolt.new:


As simple as that! I'll try it. Thanks @leex279, you're opening up a new world of possibilities for my developments :)
