# OpenRouter

LiteLLM supports all text, chat, and vision models from [OpenRouter](https://openrouter.ai/).
## Usage

```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""
os.environ["OPENROUTER_API_BASE"] = ""  # [OPTIONAL] defaults to https://openrouter.ai/api/v1
os.environ["OR_SITE_URL"] = ""  # [OPTIONAL]
os.environ["OR_APP_NAME"] = ""  # [OPTIONAL]

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
)
```
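The `completion()` call returns an OpenAI-style response object (`choices` → `message` → `content`). A minimal sketch of pulling the reply text out; the `fake` stand-in object below is purely for illustration, so no API call is needed:

```python
from types import SimpleNamespace

def reply_text(response) -> str:
    # LiteLLM responses follow the OpenAI shape: choices -> message -> content
    return response.choices[0].message.content

# Stand-in object mimicking that shape, for illustration only:
fake = SimpleNamespace(
    choices=[SimpleNamespace(message=SimpleNamespace(content="Hi there!"))]
)
print(reply_text(fake))  # → Hi there!
```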
## Configuration with Environment Variables

For production environments, you can configure the base URL dynamically through environment variables:

```python
import os
from litellm import completion

# Read configuration from the environment
OPENROUTER_API_KEY = os.getenv("OPENROUTER_API_KEY")
OPENROUTER_BASE_URL = os.getenv("OPENROUTER_API_BASE", "https://openrouter.ai/api/v1")

# Set environment for LiteLLM
os.environ["OPENROUTER_API_KEY"] = OPENROUTER_API_KEY
os.environ["OPENROUTER_API_BASE"] = OPENROUTER_BASE_URL

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
    base_url=OPENROUTER_BASE_URL,  # explicitly pass base_url for clarity
)
```
This approach provides better flexibility for managing configurations across different environments (dev, staging, production) and makes it easier to switch between self-hosted and cloud endpoints.
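One way to sketch that per-environment switching is a small helper that maps the deployment environment to a base URL. The `APP_ENV` variable, the per-environment variable names, and the self-hosted dev URL here are assumptions for illustration, not LiteLLM or OpenRouter conventions:

```python
import os
from typing import Optional

def resolve_base_url(env: Optional[str] = None) -> str:
    """Pick an OpenRouter base URL for the current deployment environment.

    Hypothetical scheme: APP_ENV selects among per-environment variables,
    falling back to the hosted OpenRouter endpoint.
    """
    env = env or os.getenv("APP_ENV", "production")
    urls = {
        "dev": os.getenv("OPENROUTER_API_BASE_DEV", "http://localhost:8080/api/v1"),
        "staging": os.getenv("OPENROUTER_API_BASE_STAGING", "https://openrouter.ai/api/v1"),
        "production": os.getenv("OPENROUTER_API_BASE", "https://openrouter.ai/api/v1"),
    }
    # Unknown environments fall back to the production endpoint
    return urls.get(env, urls["production"])
```

The resolved URL can then be passed as `base_url` to `completion()` exactly as in the example above.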
## OpenRouter Completion Models

🚨 LiteLLM supports **all** OpenRouter models. Pass `model=openrouter/<your-openrouter-model>` to route the request to OpenRouter. See all OpenRouter models [here](https://openrouter.ai/models).
| Model Name | Function Call |
|---|---|
| openrouter/openai/gpt-3.5-turbo | `completion('openrouter/openai/gpt-3.5-turbo', messages)` |
| openrouter/openai/gpt-3.5-turbo-16k | `completion('openrouter/openai/gpt-3.5-turbo-16k', messages)` |
| openrouter/openai/gpt-4 | `completion('openrouter/openai/gpt-4', messages)` |
| openrouter/openai/gpt-4-32k | `completion('openrouter/openai/gpt-4-32k', messages)` |
| openrouter/anthropic/claude-2 | `completion('openrouter/anthropic/claude-2', messages)` |
| openrouter/anthropic/claude-instant-v1 | `completion('openrouter/anthropic/claude-instant-v1', messages)` |
| openrouter/google/palm-2-chat-bison | `completion('openrouter/google/palm-2-chat-bison', messages)` |
| openrouter/google/palm-2-codechat-bison | `completion('openrouter/google/palm-2-codechat-bison', messages)` |
| openrouter/meta-llama/llama-2-13b-chat | `completion('openrouter/meta-llama/llama-2-13b-chat', messages)` |
| openrouter/meta-llama/llama-2-70b-chat | `completion('openrouter/meta-llama/llama-2-70b-chat', messages)` |
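Since every row in the table is just an OpenRouter catalogue ID with the `openrouter/` prefix, a small helper (hypothetical, shown here for illustration) can build the LiteLLM model string from any ID on the OpenRouter models page:

```python
def openrouter_model(model_id: str) -> str:
    """Prefix an OpenRouter model ID so LiteLLM routes it to OpenRouter.

    `model_id` is any ID from the OpenRouter catalogue, e.g. "openai/gpt-4".
    Already-prefixed IDs are returned unchanged.
    """
    if model_id.startswith("openrouter/"):
        return model_id
    return f"openrouter/{model_id}"

print(openrouter_model("openai/gpt-4"))  # → openrouter/openai/gpt-4
```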
## Passing OpenRouter Params - transforms, models, route

Pass `transforms`, `models`, and `route` as arguments to `litellm.completion()`:
```python
import os
from litellm import completion

os.environ["OPENROUTER_API_KEY"] = ""

messages = [{"role": "user", "content": "Hello, how are you?"}]

response = completion(
    model="openrouter/google/palm-2-chat-bison",
    messages=messages,
    transforms=[""],
    route="",
)
```
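Since these parameters are optional, one way to keep call sites tidy is to collect only the ones actually set and splat them into `completion()`. The helper below is a hypothetical sketch, not part of LiteLLM's API:

```python
from typing import Optional

def openrouter_params(
    transforms: Optional[list] = None,
    models: Optional[list] = None,
    route: Optional[str] = None,
) -> dict:
    """Collect only the OpenRouter extras that were actually provided."""
    extras = {"transforms": transforms, "models": models, "route": route}
    return {k: v for k, v in extras.items() if v is not None}

# Example: only `route` was set, so only `route` is forwarded
print(openrouter_params(route="fallback"))  # → {'route': 'fallback'}
```

Usage would then look like `completion(model=..., messages=messages, **openrouter_params(route="fallback"))`.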