Hey
Can you make it so Archon uses the LLM calls from Windsurf/Cursor instead of having to give it its own separate API keys?
And if not, can you add Gemini to the API options?
I replied to your YouTube comment as well, but unfortunately we wouldn't be able to use the LLM built into the AI IDE. I do want to add Gemini support in the future, though, and you can already use Gemini through OpenRouter!
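For anyone curious what the OpenRouter route looks like, here's a rough sketch of calling Gemini through OpenRouter's OpenAI-compatible chat completions endpoint. This isn't Archon's actual code, and the model slug is an assumption (check openrouter.ai/models for current Gemini IDs):

```python
import json
import os
import urllib.request

# OpenRouter exposes Gemini (and many other models) behind an
# OpenAI-compatible chat completions endpoint.
OPENROUTER_URL = "https://openrouter.ai/api/v1/chat/completions"

# Assumed model slug for illustration; see openrouter.ai/models for real IDs.
MODEL = "google/gemini-flash-1.5"

def build_request(prompt: str, api_key: str) -> urllib.request.Request:
    """Build an OpenAI-style chat completion request aimed at OpenRouter."""
    payload = {
        "model": MODEL,
        "messages": [{"role": "user", "content": prompt}],
    }
    return urllib.request.Request(
        OPENROUTER_URL,
        data=json.dumps(payload).encode("utf-8"),
        headers={
            "Authorization": f"Bearer {api_key}",
            "Content-Type": "application/json",
        },
        method="POST",
    )

def ask(prompt: str) -> str:
    """Send the request; needs a real OPENROUTER_API_KEY in the environment."""
    req = build_request(prompt, os.environ["OPENROUTER_API_KEY"])
    with urllib.request.urlopen(req) as resp:
        body = json.load(resp)
    return body["choices"][0]["message"]["content"]
```

Because the endpoint speaks the OpenAI chat format, any tool that lets you override the base URL and model name can point at OpenRouter the same way.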