Hermes Agent + OpenRouter: One Key for 200+ AI Models

OpenRouter is Hermes Agent's default fallback aggregator — one API key that routes to Claude, GPT, Gemini, DeepSeek, Grok, Kimi, GLM, and dozens more without managing separate credentials for each provider.

What Is OpenRouter?

OpenRouter is a unified AI model gateway. It exposes an OpenAI-compatible API and routes requests to the underlying provider — Anthropic, OpenAI, Google, DeepSeek, xAI, Moonshot, and many others — based on the model ID you specify. You maintain one billing relationship and one API key regardless of how many providers you use.
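Because the API is OpenAI-compatible, every model behind the gateway is reached with the same request shape; only the `model` field changes. A minimal sketch in Python (stdlib only) — the endpoint shown is OpenRouter's standard chat completions URL, and the model IDs are the ones listed below:

```python
import json
import os
import urllib.request

# OpenRouter's OpenAI-compatible chat completions endpoint.
API_URL = "https://openrouter.ai/api/v1/chat/completions"

def build_request(model: str, prompt: str) -> urllib.request.Request:
    """Build a chat completion request for any OpenRouter model ID."""
    payload = {
        "model": model,  # e.g. "anthropic/claude-sonnet-4.6"
        "messages": [{"role": "user", "content": prompt}],
    }
    headers = {
        "Authorization": f"Bearer {os.environ.get('OPENROUTER_API_KEY', '')}",
        "Content-Type": "application/json",
    }
    return urllib.request.Request(
        API_URL, data=json.dumps(payload).encode(), headers=headers
    )

# Switching providers is just a different model string -- same key, same URL:
req = build_request("deepseek/deepseek-v4-pro", "Say hello")
print(json.loads(req.data)["model"])  # deepseek/deepseek-v4-pro
```

Sending the request (e.g. with `urllib.request.urlopen(req)`) requires a valid `OPENROUTER_API_KEY`; the point here is that the payload is identical across providers.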

Hermes Agent is built with OpenRouter as its default fallback aggregator. If no other provider is configured, Hermes reads OPENROUTER_API_KEY and routes inference there automatically. This makes OpenRouter the easiest zero-friction path to getting a wide range of models working with Hermes.

Popular Models Available via OpenRouter with Hermes

| Model ID | Provider | Best For |
|---|---|---|
| anthropic/claude-sonnet-4.6 | Anthropic | Best all-round agent model |
| anthropic/claude-opus-4.6 | Anthropic | Deep reasoning, 1M context |
| openai/gpt-5.5 | OpenAI | Latest GPT flagship |
| google/gemini-3.1-pro-preview | Google | Long context, multimodal |
| deepseek/deepseek-v4-pro | DeepSeek | Strong coding, very low cost |
| x-ai/grok-4.20 | xAI | Real-time knowledge, reasoning |
| moonshotai/kimi-k2.6 | Moonshot | Long context, research tasks |

OpenRouter supports 200+ models. The full list is at openrouter.ai/models.

Option 1: Hermes Agent on OpenClaw Launch (Easiest)

OpenClaw Launch uses OpenRouter automatically for all managed Hermes deployments. You don't need to touch any API keys unless you want BYOK billing.

  1. Go to openclawlaunch.com/hermes-hosting and start a Hermes deploy.
  2. Pick any model from the dropdown — Claude, GPT, Gemini, DeepSeek, Grok, and others all route through OpenRouter automatically.
  3. Connect your channel and click Deploy. Done.
Tip: switch between any OpenRouter-available model at runtime using the /model command in chat — no redeploy needed.

Option 2: OpenRouter with Self-Hosted Hermes

Set the OPENROUTER_API_KEY env var and point Hermes at the openrouter provider. Since OpenRouter is the default fallback, this is the simplest self-hosted path:

export OPENROUTER_API_KEY=sk-or-...

# Tell Hermes to use OpenRouter and set a default model
hermes inference set openrouter
hermes model set anthropic/claude-sonnet-4.6

# Or edit /opt/data/config.yaml:
# inference:
#   provider: openrouter
# model:
#   default: anthropic/claude-sonnet-4.6

Get your OpenRouter API key from openrouter.ai/keys. No minimum spend — you pay per token used.
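Pay-per-token billing means a request's cost is just token counts times the model's per-token rate. A quick sketch of the arithmetic — the prices below are hypothetical placeholders for illustration, not real rates; check openrouter.ai/models for the live pricing of each model:

```python
# Hypothetical per-million-token prices (USD), for illustration only.
PRICES = {
    "deepseek/deepseek-v4-pro": {"input": 0.30, "output": 1.20},
    "anthropic/claude-sonnet-4.6": {"input": 3.00, "output": 15.00},
}

def estimate_cost(model: str, input_tokens: int, output_tokens: int) -> float:
    """Pay-per-token estimate: tokens / 1M * price per 1M tokens."""
    p = PRICES[model]
    return (input_tokens * p["input"] + output_tokens * p["output"]) / 1_000_000

# A 10k-input / 2k-output request at the hypothetical rates above:
cost = estimate_cost("deepseek/deepseek-v4-pro", 10_000, 2_000)
print(f"${cost:.4f}")  # $0.0054
```

The same arithmetic applies to every model behind OpenRouter; only the rate table differs.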

Model ID Format

When using OpenRouter as the inference provider, model IDs follow the format provider/model-name. Examples:

anthropic/claude-sonnet-4.6
anthropic/claude-opus-4.6
openai/gpt-5.5
openai/gpt-5.4-mini
google/gemini-3.1-pro-preview
deepseek/deepseek-v4-pro
x-ai/grok-4.20
moonshotai/kimi-k2.6
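Since the provider prefix ends at the first slash, the provider can be recovered by splitting a model ID once. A small illustrative helper (`split_model_id` is a sketch for this guide, not part of Hermes or OpenRouter):

```python
def split_model_id(model_id: str) -> tuple[str, str]:
    """Split an OpenRouter model ID into (provider, model-name).

    The first "/" is the separator; model names may themselves
    contain dots or dashes (e.g. "claude-sonnet-4.6").
    """
    provider, _, name = model_id.partition("/")
    if not name:
        raise ValueError(f"not a provider/model-name ID: {model_id!r}")
    return provider, name

print(split_model_id("anthropic/claude-sonnet-4.6"))  # ('anthropic', 'claude-sonnet-4.6')
print(split_model_id("x-ai/grok-4.20"))               # ('x-ai', 'grok-4.20')
```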

You can switch to any of these at runtime from any connected channel:

/model deepseek/deepseek-v4-pro
/model anthropic/claude-opus-4.6
/model x-ai/grok-4.20

OpenRouter vs Provider-Direct for Hermes

| Feature | OpenRouter | Provider Direct |
|---|---|---|
| API keys needed | One key for everything | One key per provider |
| Models accessible | 200+ across all major providers | One provider's models only |
| Latency | One extra proxy hop | Direct, lowest latency |
| Billing | Unified OpenRouter dashboard | Per-provider billing pages |
| Model switching | Any model, same key | Different key per provider |

OpenRouter is the best default for Hermes users who want flexibility. Use a provider-direct key only if you're committed to one provider and want the lowest possible latency on every request.

BYOK on OpenClaw Launch

If you have your own OpenRouter key with credits, you can bring it to OpenClaw Launch Hermes deployments. In the configurator, choose BYOK and paste your OPENROUTER_API_KEY. All inference routes through your OpenRouter account with full usage visibility on your OpenRouter dashboard.

What's Next?

Deploy Hermes with OpenRouter

Get a Hermes Agent running in 10 seconds with access to 200+ AI models.

Deploy Hermes