How to use OpenRouter with LangChain - Python Tutorial

There were a few pitfalls: finding the correct import, the environment variables, and parameters that have "openai" in their names although they are not OpenAI-specific. So I thought I'd share my little helper function. Before that I used the OpenAI API directly, until I received my free OpenRouter credits for the hackathon.

from langchain_openai import ChatOpenAI
from os import getenv

# https://openrouter.ai/docs/frameworks#using-langchain
def get_openrouter(model: str = "openai/gpt-4o") -> ChatOpenAI:
    # alternatively you can use OpenAI directly via ChatOpenAI(model="gpt-4o")
    return ChatOpenAI(
        model=model,
        openai_api_key=getenv("OPENROUTER_API_KEY"),
        openai_api_base=getenv("OPENROUTER_BASE_URL"),
    )

How to use it (note that OpenRouter model IDs include the provider prefix, e.g. "openai/gpt-4o", not just "gpt-4o"):
llm = get_openrouter(model="openai/gpt-4o")
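One pitfall worth noting with the helper above: if OPENROUTER_BASE_URL is unset, getenv returns None and the client silently falls back to the default OpenAI endpoint. A minimal sketch of a fallback (the hardcoded URL is OpenRouter's documented API base):

```python
from os import environ, getenv

# Fall back to OpenRouter's public base URL when the variable is missing.
def openrouter_base_url() -> str:
    return getenv("OPENROUTER_BASE_URL", "https://openrouter.ai/api/v1")

environ.pop("OPENROUTER_BASE_URL", None)  # simulate a missing variable
print(openrouter_base_url())  # https://openrouter.ai/api/v1
```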

For DeepSeek you might need to install the langchain_community package.

Also, in your application you might need to load the variables from .env, e.g. in main.py via:

from dotenv import load_dotenv
load_dotenv()
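For completeness, the .env file would look something like this (the key value is a placeholder; the base URL is OpenRouter's documented endpoint):

```
# .env (key is a placeholder)
OPENROUTER_API_KEY=sk-or-your-key-here
OPENROUTER_BASE_URL=https://openrouter.ai/api/v1
```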

PS: I also tried DeepSeek V3, but my use case needs vision. In the API response you can check what a model supports (Models | OpenRouter); for DeepSeek the modality is just text->text.
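To illustrate the modality check: the model list endpoint (GET https://openrouter.ai/api/v1/models) returns entries with a modality string like "text->text" or "text+image->text". A small offline sketch of filtering on it — the sample entries and field names below are my reading of the response shape, not output from a live call:

```python
# Hypothetical shape of entries from GET https://openrouter.ai/api/v1/models
def supports_vision(model_entry: dict) -> bool:
    """Return True if the model accepts image input (needed for vision use cases)."""
    modality = model_entry.get("architecture", {}).get("modality", "")
    input_side = modality.split("->")[0]  # e.g. "text+image"
    return "image" in input_side

# Sample entries mimicking the API response (values illustrative):
models = [
    {"id": "deepseek/deepseek-chat", "architecture": {"modality": "text->text"}},
    {"id": "openai/gpt-4o", "architecture": {"modality": "text+image->text"}},
]
vision_models = [m["id"] for m in models if supports_vision(m)]
print(vision_models)  # ['openai/gpt-4o']
```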


This is super helpful, thank you for sharing @JohannesM!!