I’m getting the following error when running Bolt with LM Studio:
```
ERROR api.chat AI_TypeValidationError: Type validation failed: Value: "Trying to keep the first 6844 tokens when context the overflows. However, the model is loaded with context length of only 4096 tokens, which is not enough. Try to load the model with a larger context length, or provide a shorter input".
```
I have set the context length in the .env file to DEFAULT_NUM_CTX=24576, but Bolt doesn't seem to be picking it up — the error still reports a 4096-token context.
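For reference, this is the relevant line in my .env (I'm assuming the variable name and placement match the sample env file that ships with Bolt):

```
# Requested context window size in tokens
DEFAULT_NUM_CTX=24576
```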
Can someone please explain how I can resolve this?