Using your own LLM APIs in Visual Studio Code

Hi,

Is it possible to use your own LLM APIs in Visual Studio Code (regular or Insiders edition)?
I guess there should be some sort of extension that allows it.

Or please suggest a better solution.

Currently, I’m using Windsurf.

PS. Please don’t suggest running locally, as I don’t have sufficient hardware for that yet.

Regards.


If I understand your question correctly, yes.

You can get several extensions that allow this, such as Aider (the Aider CLI, or Aider Composer for VS Code integration), Cline, or Roo Code. My preference is Cline with MCPs, and I like using VS Code because I actually have all three installed — but I’m sure there are several other options.

You can run an LLM of your choice through an OpenAI-compatible API, which is the de facto standard and works with pretty much everything. The main ways I use it are with vLLM (local), OpenRouter (cheap, all-in-one, and reliable), or GroqCloud (speed). But it also supports Anthropic, Google AI Studio, Hugging Face, etc.
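For reference, “OpenAI-compatible” just means the provider exposes the same `/chat/completions` request shape, so switching backends is mostly a matter of changing the base URL and model name. Here’s a minimal sketch using only the Python standard library — the base URL, key, and model name below are illustrative placeholders, not recommendations:

```python
import json
import urllib.request

# Any OpenAI-compatible backend works here; only the base URL,
# API key, and model name change between providers.
BASE_URL = "https://openrouter.ai/api/v1"    # e.g. OpenRouter; swap for vLLM, GroqCloud, etc.
API_KEY = "sk-..."                           # placeholder key
MODEL = "meta-llama/llama-3.1-8b-instruct"   # illustrative model name

def build_chat_request(prompt: str) -> urllib.request.Request:
    """Build a /chat/completions request in the standard OpenAI shape."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {API_KEY}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

req = build_chat_request("Explain this function.")
# urllib.request.urlopen(req) would actually send it; extensions like
# Cline do the equivalent for you once you enter a base URL and key.
```

In practice you never write this by hand — the extension’s settings just ask for the base URL, key, and model — but it shows why one configuration screen can cover so many providers.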

And technically you could just run a CLI tool and get semi-IDE integration, since both can talk to the same folder and files. So you can even kind of roll your own setup without creating a VS Code extension per se.
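As a sketch of that CLI route (using Aider as the example; the endpoint and key are placeholders), pointing a tool at an OpenAI-compatible endpoint is usually just a couple of environment variables:

```shell
# Point Aider at any OpenAI-compatible endpoint (values are placeholders).
export OPENAI_API_BASE="https://openrouter.ai/api/v1"
export OPENAI_API_KEY="sk-..."

# Run it inside your project folder; it can read and edit the files
# there, which is the semi-IDE integration mentioned above.
aider src/main.py
```

Running it from VS Code’s integrated terminal gets you most of the way to an in-editor workflow without any extension at all.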

There are a lot of options, each with its own tradeoffs.

Good luck!


Thanks a lot!
I hope your reply will be helpful not only to me but to others as well.