You ask your AI bot a question, and it gives a great answer. Five minutes later, you reference the same conversation, and it has no idea what you're talking about. Sound familiar?
This is one of the most common frustrations with AI assistants, and the good news is: it's fixable. This guide explains exactly why AI bots forget and how to configure OpenClaw so yours remembers what matters.
Why AI Bots Forget
Every AI model has a context window — a maximum number of tokens (a token is roughly three-quarters of a word) it can process at once. When a conversation exceeds that limit, older messages get dropped to make room for new ones.
This isn't a bug. It's a fundamental limitation of how large language models work. Think of it like a desk that can only hold so many papers — when it's full, the oldest papers fall off the edge.
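That "papers falling off the desk" behavior can be sketched in a few lines. This is a simplified illustration, not how any particular runtime implements it — real systems count tokens with the model's own tokenizer, but a word-based estimate captures the idea:

```python
def trim_to_window(messages, max_tokens):
    """Drop the oldest messages until the rough token count fits the window.

    Tokens are approximated as words / 0.75 (the same ratio as the
    model list above); real systems use the model's tokenizer.
    """
    def rough_tokens(text):
        return int(len(text.split()) / 0.75)

    kept = list(messages)
    while kept and sum(rough_tokens(m) for m in kept) > max_tokens:
        kept.pop(0)  # the oldest message falls off the edge first
    return kept
```

Notice that trimming always removes from the front: the bot never "chooses" what to forget, it simply loses whatever is oldest.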
Here's what the context windows look like for popular models:
- Claude — 200K tokens (~150,000 words)
- Gemini — 1M tokens (~750,000 words)
- GPT-5 — 128K tokens (~96,000 words)
- DeepSeek — 128K tokens (~96,000 words)
For most conversations, these limits are more than enough. But if you use your bot heavily throughout the day, or across multiple days, you'll eventually hit the wall. That's where memory configuration comes in.
Session Memory vs Cross-Session Memory
There are two types of memory to understand:
Session memory is what happens within a single conversation. The bot remembers everything you've said in the current chat thread. This works automatically — no configuration needed. But when the session ends (or the context window fills up), it's gone.
Cross-session memory is the ability to remember across separate conversations — days, weeks, even months apart. This is what most users actually want when they say "my bot keeps forgetting." They want the bot to remember their name, preferences, and past conversations.
Most users want cross-session memory but don't know it exists as a separate feature.
How to Enable Cross-Session Memory on OpenClaw
OpenClaw has an experimental sessionMemory feature that stores key facts, preferences, and context across sessions. When enabled, your bot will automatically extract and remember important information from conversations.
Here's how to enable it:
- Open your OpenClaw configuration (from your dashboard)
- Enable the sessionMemory experimental feature
- Configure the embedding model (Qwen3 via OpenRouter works well)
- Redeploy your instance
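The steps above boil down to a small configuration change. The snippet below is a sketch only — the key names, nesting, and the embedding model identifier are illustrative assumptions, so check the memory configuration guide for the exact schema your version uses:

```json
{
  "experimental": {
    "sessionMemory": true
  },
  "memory": {
    "embeddingModel": "openrouter/qwen3-embedding"
  }
}
```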
Once enabled, the bot will automatically remember key facts like your name, preferences, projects you're working on, and other context — even across completely separate conversations.
For a detailed walkthrough, see our complete memory configuration guide.
Choosing the Right Context Window
If your main issue is forgetting within a single conversation, the fix is simple: choose a model with a larger context window.
Models with larger context windows can hold more of your conversation history before dropping older messages. If you have long, complex conversations, consider switching to a model like Gemini (1M tokens) or Claude (200K tokens).
You can compare all available models and their context windows on our models page.
Using AGENTS.md for Persistent Instructions
There's another powerful tool for making sure your bot never forgets critical information: the system prompt (also called AGENTS.md in OpenClaw).
Anything you put in the system prompt is always included at the start of every conversation. It never gets dropped, no matter how long the conversation gets. This makes it perfect for:
- Your name and how you want to be addressed
- Your role, company, or industry context
- Specific instructions for how the bot should behave
- Key facts the bot should always know
- Output format preferences
Think of the system prompt as the bot's "permanent memory" — it's the one place that's guaranteed to persist.
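A minimal AGENTS.md might look like this — every detail here is a placeholder, so substitute your own name, context, and preferences:

```markdown
# AGENTS.md

- Address me as Sam (she/her).
- I run DevOps at a mid-size fintech; assume Kubernetes and Terraform context.
- Keep answers concise; prefer bullet points over long paragraphs.
- Always put shell commands in a fenced code block.
```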
Session Isolation Settings
If multiple people use your bot (for example, a shared Telegram bot), you need session isolation to prevent messages from different users from bleeding into each other.
Set session.dmScope to "per-channel-peer" in your OpenClaw config. This ensures each user gets their own separate memory and conversation history. Without this setting, User A might see responses that reference User B's messages — which is both confusing and a privacy concern.
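In the config file, that setting looks something like the fragment below — a sketch in which the nesting simply follows the dotted name session.dmScope from above:

```json
{
  "session": {
    "dmScope": "per-channel-peer"
  }
}
```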
7 Tips to Reduce Forgetting
- Keep conversations focused — Don't mix unrelated topics in one thread. Start a new conversation for new topics.
- Enable cross-session memory — Use OpenClaw's sessionMemory feature for facts that should persist across conversations.
- Put critical context in the system prompt — Anything the bot must always know belongs in AGENTS.md.
- Choose models with larger context windows — Gemini and Claude offer the largest context windows available today.
- Summarize long conversations — Periodically ask the bot to summarize the key points, then start a new conversation with that summary.
- Use session isolation — Set per-channel-peer scope if multiple users share the bot.
- Don't repeat yourself — If the bot already knows something from the system prompt, you don't need to re-state it in every message.
Set Up Memory on OpenClaw Launch
Cross-session memory is available on all plans on OpenClaw Launch. You can configure it directly from your dashboard — no coding or server management required.
For a full walkthrough, check out our memory configuration guide.
Ready to deploy an AI agent that actually remembers? Deploy on OpenClaw Launch and configure cross-session memory in under a minute.