Hi guys, is there an updated tutorial on how to correctly add a new LLM with a custom endpoint and model selection? I've added one, but it isn't working; I keep getting an error as soon as I start chatting with the model.
I tried searching the existing threads, but they seem to describe a folder structure quite different from what I have in the latest version.
Thanks Leex! So basically, to add a new provider, I just need to add a new .ts file at app/lib/modules/llm/providers/myprovidername.ts and then register it in app/lib/modules/llm/registry.ts? Something like the sketch below?
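Here's roughly what I'm trying, in case anyone can sanity-check it. This is just a sketch that mirrors the pattern the bundled providers in that folder seem to follow (I'm assuming an OpenAI-compatible endpoint; `MyProviderName`, `MYPROVIDER_API_BASE_URL`, `MYPROVIDER_API_KEY`, and the model names are all placeholders, so compare against an existing provider file before copying):

```ts
// app/lib/modules/llm/providers/myprovidername.ts
// Sketch only: the class shape should match the existing providers in this folder.
import { BaseProvider, getOpenAILikeModel } from '~/lib/modules/llm/base-provider';
import type { ModelInfo } from '~/lib/modules/llm/types';
import type { IProviderSetting } from '~/types/model';
import type { LanguageModelV1 } from 'ai';

export default class MyProviderNameProvider extends BaseProvider {
  name = 'MyProviderName'; // placeholder name shown in the provider dropdown
  getApiKeyLink = 'https://example.com/api-keys'; // placeholder URL

  config = {
    baseUrlKey: 'MYPROVIDER_API_BASE_URL', // env var holding the endpoint root
    apiTokenKey: 'MYPROVIDER_API_KEY',     // env var holding the API key
  };

  // Models offered in the model selection dropdown
  staticModels: ModelInfo[] = [
    {
      name: 'my-model-v1', // placeholder model id
      label: 'My Model v1',
      provider: 'MyProviderName',
      maxTokenAllowed: 8000,
    },
  ];

  getModelInstance(options: {
    model: string;
    serverEnv: Env;
    apiKeys?: Record<string, string>;
    providerSettings?: Record<string, IProviderSetting>;
  }): LanguageModelV1 {
    const { model, serverEnv, apiKeys, providerSettings } = options;

    // Resolve base URL and key from UI settings or env vars
    const { baseUrl, apiKey } = this.getProviderBaseUrlAndKey({
      apiKeys,
      providerSettings: providerSettings?.[this.name],
      serverEnv: serverEnv as any,
      defaultBaseUrlKey: 'MYPROVIDER_API_BASE_URL',
      defaultApiTokenKey: 'MYPROVIDER_API_KEY',
    });

    if (!baseUrl || !apiKey) {
      throw new Error(`Missing base URL or API key for ${this.name}`);
    }

    // Works for any OpenAI-compatible chat completions endpoint
    return getOpenAILikeModel(baseUrl, apiKey, model);
  }
}
```

And then register it so the app picks it up (again, just matching the existing export list):

```ts
// app/lib/modules/llm/registry.ts
import MyProviderNameProvider from './providers/myprovidername';

export {
  // ...existing providers...
  MyProviderNameProvider,
};
```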
@thecodacus it works now after removing /chat/completions from the base URL. But it only works for some of the provider's models and fails for others. Any idea why?
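For anyone else hitting the same error: the base URL has to be just the API root, because the OpenAI-compatible client appends /chat/completions itself. A minimal sketch of what I mean, assuming the Vercel AI SDK is doing the request underneath (`api.example.com` and the env var are placeholders):

```ts
import { createOpenAI } from '@ai-sdk/openai';

// Correct: root URL only; the SDK appends /chat/completions to this path itself
const provider = createOpenAI({
  baseURL: 'https://api.example.com/v1', // placeholder endpoint
  apiKey: process.env.MYPROVIDER_API_KEY,
});

// Wrong: 'https://api.example.com/v1/chat/completions' as baseURL would make
// the SDK request .../chat/completions/chat/completions, which 404s.

const model = provider('my-model-v1'); // placeholder model id
```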
However, I noticed that even though a few of the LLM models do respond, no code changes are made at all (the changes aren't reflected in the code).
I get the same issue when I test the official DeepSeek API as well.