Adding a new LLM not working, keep getting errors

Hi guys, is there an updated tutorial on how to correctly add a new LLM with a custom endpoint and model selection? I have added one, but it's not working; I keep getting an error as soon as I start chatting with the LLM.

I tried searching the existing threads, but they describe a folder structure quite different from what I have in the latest version.

Thanks 🙂


Hi Adam.
I don't think you can just add a new provider and LLM on the fly; they have to be configured and added in the codebase before you can select them.

@adamjonesdym GitHub has just been added. Take a look at this PR and its commits:


Thanks Leex. So basically, to add a new provider, I just need to add a new .ts file under app/lib/modules/llm/providers/myprovidername.ts and then edit app/lib/modules/llm/registry.ts?


I don't know exactly, but @thecodacus could help here for sure 🙂

Yes, that's correct. You need to extend the BaseProvider class and export it. A rough sketch is below.
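
Something like this; it's a sketch from memory, so treat every name in it as an assumption and copy the real structure from one of the shipped files under app/lib/modules/llm/providers/, since the helper names and signatures can differ between versions. MyProvider, the env variable names, and the model entry are all placeholders:

```ts
// app/lib/modules/llm/providers/myprovidername.ts (sketch, not verified)
import { BaseProvider, getOpenAILikeModel } from '~/lib/modules/llm/base-provider';
import type { ModelInfo } from '~/lib/modules/llm/types';
import type { IProviderSetting } from '~/types/model';
import type { LanguageModelV1 } from 'ai';

export default class MyProvider extends BaseProvider {
  name = 'MyProvider'; // label shown in the provider dropdown
  getApiKeyLink = 'https://example.com/api-keys'; // placeholder URL

  config = {
    apiTokenKey: 'MYPROVIDER_API_KEY',     // env var for the key (placeholder name)
    baseUrlKey: 'MYPROVIDER_API_BASE_URL', // env var for the base URL (placeholder name)
  };

  // Models listed in the model dropdown for this provider.
  staticModels: ModelInfo[] = [
    { name: 'my-model-v1', label: 'My Model v1', provider: 'MyProvider', maxTokenAllowed: 8000 },
  ];

  getModelInstance(options: {
    model: string;
    serverEnv: Env;
    apiKeys?: Record<string, string>;
    providerSettings?: Record<string, IProviderSetting>;
  }): LanguageModelV1 {
    // Resolve the base URL and key from UI settings or the env vars above.
    const { baseUrl, apiKey } = this.getProviderBaseUrlAndKey({
      apiKeys: options.apiKeys,
      providerSettings: options.providerSettings?.[this.name],
      serverEnv: options.serverEnv as any,
      defaultBaseUrlKey: this.config.baseUrlKey,
      defaultApiTokenKey: this.config.apiTokenKey,
    });

    if (!baseUrl || !apiKey) {
      throw new Error(`Missing base URL or API key for ${this.name}`);
    }

    // For an OpenAI-compatible endpoint, reuse the shared helper.
    return getOpenAILikeModel(baseUrl, apiKey, options.model);
  }
}
```

Then register it in app/lib/modules/llm/registry.ts the same way the existing providers are exported there, i.e. an import plus adding MyProvider to the exported list.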

Thanks guys. By the way, do I need to copy .env.example to .env once I change anything in it, like the API key and base URL for my new LLM provider?
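
(If I remember the setup docs right, recent versions read from .env.local, so you copy .env.example to .env.local rather than to .env; check the README for your version. Whichever file it is, the variable names have to match the apiTokenKey / baseUrlKey strings in your provider's config. The names below are the placeholders from the sketch above:)

```
# .env.local (or .env, per your version's README)
MYPROVIDER_API_KEY=sk-your-key-here
MYPROVIDER_API_BASE_URL=https://api.example.com/v1
```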

Getting error:

(two screenshots of the error omitted)

Debug Info:
{
  "System": {
    "os": "Windows",
    "browser": "Chrome 134.0.0.0",
    "screen": "1536x960",
    "language": "en-US",
    "timezone": "Asia/Kuala_Lumpur",
    "memory": "4 GB (Used: 80.4 MB)",
    "cores": 32,
    "deviceType": "Desktop",
    "colorDepth": "24-bit",
    "pixelRatio": 1.25,
    "online": true,
    "cookiesEnabled": true,
    "doNotTrack": false
  },
  "Providers": [
    {
      "name": "LMStudio",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-18T03:31:54.506Z",
      "url": null
    },
    {
      "name": "Ollama",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-18T03:31:54.506Z",
      "url": null
    },
    {
      "name": "OpenAILike",
      "enabled": false,
      "isLocal": true,
      "running": false,
      "error": "No URL configured",
      "lastChecked": "2025-01-18T03:31:54.506Z",
      "url": null
    }
  ],
  "Version": {
    "hash": "no-git-",
    "branch": "main"
  },
  "Timestamp": "2025-01-18T03:31:57.487Z"
}

@thecodacus it works now after changing the base URL by removing /chat/completions. But it only works for some of the provider's models and fails for others; any idea why?
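
(For anyone hitting the same thing: the OpenAI-compatible client appends the /chat/completions route itself, so the configured base URL has to stop at the API root. A minimal sketch with the Vercel AI SDK that bolt.diy builds on; the URL, env variable, and model id are placeholders:)

```ts
import { createOpenAI } from '@ai-sdk/openai';
import { generateText } from 'ai';

// The SDK appends /chat/completions itself, so baseURL must be the
// API root, not the full endpoint URL.
const provider = createOpenAI({
  baseURL: 'https://api.example.com/v1', // not .../v1/chat/completions
  apiKey: process.env.MYPROVIDER_API_KEY, // placeholder env var
});

const { text } = await generateText({
  model: provider('my-model-v1'), // placeholder model id
  prompt: 'Say hello',
});
console.log(text);
```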

However, I noticed that even though the few working LLM models do respond, no code changes are made at all (the changes are not reflected in the code).

(screenshot omitted)

I see the same issue when I test the official DeepSeek API as well.

Note: this is happening in an imported bolt folder.