Integration Guide

OpenClaw Codex — Use GPT Codex Models with Your AI Agent

Connect OpenClaw to Codex models like GPT-5.3-Codex for code generation, debugging, refactoring, and development assistance. Run Codex through OpenRouter or a direct API endpoint and turn your AI agent into a powerful coding companion accessible from Telegram, Discord, WhatsApp, or any supported channel.

What Is Codex?

Codex refers to OpenAI's family of code-specialized language models. The latest generation, GPT-5.3-Codex, is optimized for understanding and generating code across dozens of programming languages. Unlike general-purpose chat models, Codex models are fine-tuned on code repositories, documentation, and developer workflows — making them exceptionally good at tasks like writing functions, explaining code, debugging errors, and generating tests.

OpenClaw added native support for Codex models starting with v2026.2.6, letting you route any conversation to a Codex model through OpenRouter or a direct OpenAI API key.

Why Use Codex with OpenClaw?

Running Codex through OpenClaw gives you capabilities that a standalone coding assistant doesn't offer:

  • Multi-channel access — Ask coding questions from Telegram, Discord, WhatsApp, Slack, or web chat. No need to keep a terminal open.
  • Persistent memory — OpenClaw remembers your project context, coding preferences, and past conversations across sessions.
  • Skills and tools — Combine Codex with OpenClaw skills like web search, file access, or MCP servers for richer development workflows.
  • Model switching — Use Codex for code tasks and switch to Claude or GPT for general conversations, all from the same agent.
  • Team sharing — Deploy one OpenClaw instance with Codex and share it with your team via a group chat or channel.

Setting Up Codex via OpenRouter

The easiest way to use Codex with OpenClaw is through OpenRouter, which provides access to Codex models alongside hundreds of other models with a single API key.

  1. Sign up at openrouter.ai and create an API key
  2. In your OpenClaw configuration, add OpenRouter as a provider under models.providers.openrouter
  3. Set the model ID to openrouter/openai/gpt-5.3-codex in your agent configuration
  4. Deploy or restart your instance to apply the changes
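The steps above might look like this in a config file. The field names here are illustrative — check your OpenClaw version's configuration schema for the exact keys; only the `models.providers.openrouter` path and the model ID come from this guide:

```json5
{
  models: {
    providers: {
      openrouter: {
        apiKey: "sk-or-...", // your OpenRouter API key
      },
    },
  },
  agents: {
    defaults: {
      // Model IDs are prefixed with the provider name
      model: "openrouter/openai/gpt-5.3-codex",
    },
  },
}
```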

If you're using OpenClaw Launch, you can configure this directly from the visual configurator — select the Codex model from the model dropdown and paste your OpenRouter API key.

Setting Up Codex via Direct OpenAI API

If you prefer to use your OpenAI API key directly (without OpenRouter), you can configure a custom endpoint:

  1. Get an API key from platform.openai.com
  2. Add a custom provider (e.g. named openai-direct) in your OpenClaw config pointing to https://api.openai.com/v1
  3. Set the model ID to openai-direct/gpt-5.3-codex — OpenClaw requires model IDs to be prefixed with the provider name you chose in step 2
  4. Deploy or restart your instance
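A sketch of what this direct-provider setup could look like — again, the key names are assumptions to be checked against your OpenClaw config schema; the provider name openai-direct, the base URL, and the model ID are from the steps above:

```json5
{
  models: {
    providers: {
      "openai-direct": {
        baseUrl: "https://api.openai.com/v1", // OpenAI's API endpoint
        apiKey: "sk-...",                     // your OpenAI API key
      },
    },
  },
  agents: {
    defaults: {
      // Prefixed with the provider name chosen above
      model: "openai-direct/gpt-5.3-codex",
    },
  },
}
```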

Direct API access gives you the lowest latency and avoids OpenRouter's routing overhead, but you lose the ability to easily switch between providers.

Best Use Cases for OpenClaw + Codex

Here are practical ways to use your Codex-powered OpenClaw agent:

Code Generation on the Go

Message your agent from your phone: “Write a Python function that parses CSV files and returns a list of dictionaries.” Codex generates clean, well-documented code that you can copy straight into your project.
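For reference, here is the kind of function that prompt might produce — a plain-stdlib sketch written by hand for this guide, not actual Codex output:

```python
import csv
from pathlib import Path


def parse_csv(path: str) -> list[dict[str, str]]:
    """Parse a CSV file and return its rows as a list of dictionaries.

    The first row is treated as the header; each subsequent row becomes
    a dict mapping header names to cell values (all values as strings).
    """
    with Path(path).open(newline="", encoding="utf-8") as f:
        return list(csv.DictReader(f))
```

A follow-up message like "now make it skip blank lines" or "add type coercion for numeric columns" lets you iterate on the result in the same chat.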

Debugging via Chat

Paste an error traceback into Telegram or Discord and ask your agent to explain the issue and suggest a fix. Codex excels at reading stack traces and identifying root causes.

Code Review and Refactoring

Share a code snippet and ask: “Review this for performance issues” or “Refactor this to use async/await.” Codex provides specific, actionable suggestions.

Learning and Explanation

Ask “Explain how React hooks work” or “What does this regex do?” and get detailed, code-aware explanations tailored to your skill level (especially powerful when combined with OpenClaw's memory feature, which remembers your experience level).

Test Generation

Share a function and ask the agent to write unit tests for it. Codex generates tests covering edge cases, error conditions, and expected outputs.
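To illustrate the shape of the output: sharing a small helper like the hypothetical slugify below and asking for tests might come back with pytest-style cases covering normal input and edge cases — both the function and the tests here are illustrative, not Codex output:

```python
import re


def slugify(text: str) -> str:
    """Lowercase text and collapse runs of non-alphanumerics into hyphens."""
    return re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")


# Generated-style unit tests: normal input, collapsed punctuation, empty edge case
def test_basic():
    assert slugify("Hello World") == "hello-world"


def test_punctuation_collapsed():
    assert slugify("a -- b!!") == "a-b"


def test_empty_string():
    assert slugify("") == ""
```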

Combining Codex with OpenClaw Skills

Codex becomes even more powerful when paired with OpenClaw skills:

  • Web search — Ask “How do I use the new React 19 compiler?” and the agent searches the web for current docs before generating code
  • MCP tools — Connect to your GitHub repos, databases, or local files so Codex can read your actual codebase
  • Memory — The agent remembers your tech stack, coding conventions, and project architecture across conversations

See the MCP guide and skills installation guide for setup instructions.

Tips for Best Results

  • Be specific about language and framework — “Write a TypeScript Express middleware” gives better results than “write a middleware”
  • Include context — Paste relevant code, error messages, or config files alongside your question
  • Use multi-model setups — Configure Codex as your default model for code-heavy conversations and Claude or GPT for general chat
  • Leverage memory — Tell the agent about your project once (“We use Next.js 15 with Drizzle ORM”) and it remembers for future conversations

Frequently Asked Questions

Which Codex model should I use?

GPT-5.3-Codex is the latest and most capable. It handles complex multi-file tasks, understands project structure, and generates production-quality code. For simpler tasks like quick snippets or syntax questions, older Codex variants work fine and cost less.

How much does Codex cost through OpenRouter?

Pricing varies by model variant and usage. Check OpenRouter's model page for current per-token pricing. OpenClaw Launch plans include OpenRouter credits — see pricing for details.

Can I use Codex and other models in the same agent?

Yes. OpenClaw supports multiple model configurations. You can set Codex as the primary model and add fallback models, or switch models per conversation. Configure this in the models section of your OpenClaw config.
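A multi-model setup might be sketched like this — the fallbackModels key is a hypothetical name for illustration, and the Claude model ID is an example of OpenRouter's provider/model naming; verify both against your OpenClaw config schema and OpenRouter's model list:

```json5
{
  agents: {
    defaults: {
      // Primary model for code-heavy conversations
      model: "openrouter/openai/gpt-5.3-codex",
      // Hypothetical key: models to try if the primary is unavailable
      fallbackModels: ["openrouter/anthropic/claude-sonnet-4"],
    },
  },
}
```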

Does Codex work with all OpenClaw channels?

Yes. Codex works with every channel OpenClaw supports — Telegram, Discord, WhatsApp, Slack, web chat, and more. The model processes your message regardless of which channel it arrives from.

What's Next?

  • OpenRouter Guide — Set up OpenRouter for access to Codex and hundreds of other models
  • MCP Guide — Connect external tools and repos to your Codex-powered agent
  • Memory Guide — Give your agent persistent context about your projects
  • Agent Guide — Learn how to configure and customize your AI agent
  • See pricing — Deploy your Codex-powered AI agent starting at $3/month

Code Smarter with Codex

Deploy your AI agent with Codex models. Generate code, debug errors, and review pull requests from any messaging app.

Deploy with OpenClaw Launch