Most of the time on lengthy projects, the LLM ignores previous statements given by the user, and hallucination occurs even with models like 3.5 Sonnet. So it would be great if there were functionality where the LLM maintains a txt file that becomes context for the prompts that follow.
For example, I'm using the Windforce editor with Sonnet: (1) there are dependency conflicts; if those get solved, then (2) a missing module error appears; or sometimes it just (3) deletes important functions to make all the errors go away, and situations 1, 2, and 3 repeat in a loop. I wish LLM usage came with memory so situations like this wouldn't happen. A rough sketch of the idea is below.
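A minimal sketch of what this could look like, assuming a plain-text memory file whose contents get prepended to every prompt. The file name, function names, and example notes are all illustrative, not anything the editor actually provides:

```python
"""Persist key project decisions to a plain-text memory file and prepend
them to every prompt so the model keeps earlier constraints in view."""
from pathlib import Path

MEMORY_FILE = Path("project_memory.txt")  # illustrative file name


def remember(note: str) -> None:
    """Append a user decision or constraint to the memory file."""
    with MEMORY_FILE.open("a", encoding="utf-8") as f:
        f.write(note.strip() + "\n")


def build_prompt(user_prompt: str) -> str:
    """Prepend accumulated memory so the model sees prior instructions."""
    memory = MEMORY_FILE.read_text(encoding="utf-8") if MEMORY_FILE.exists() else ""
    return (
        "Project memory (earlier decisions, do not violate):\n"
        f"{memory}\n"
        "Current request:\n"
        f"{user_prompt}"
    )


if __name__ == "__main__":
    remember("Use package X at version 1.2; do not upgrade it.")
    remember("Never delete existing functions to silence errors.")
    # The assembled prompt would then be sent to whatever model or editor you use.
    print(build_prompt("Fix the missing module error in main.py"))
```

The point is just that constraints the user has already stated survive across turns instead of falling out of the context window.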
This is a challenging implementation but I agree it would be a game changer if done right!
This has been on my radar for quite a while. It also includes Vercel AI integration.