Skill Guide
OpenClaw + Graphify
Graphify is an open-source skill that builds a queryable knowledge graph from any codebase. Wire it up to your OpenClaw agent and a question like “where does billing deduct credits?” returns the right files in about 1.7k tokens instead of the 123k a naive grep-and-read would burn.
What Is Graphify?
Graphify is an open-source (MIT-licensed) skill that turns a repository — code, docs, papers, and diagrams — into a queryable knowledge graph. It uses Tree-sitter for static analysis, Leiden clustering for community structure, and an LLM pass for semantic labels. The result is a graph where nodes are functions, files, and concepts, and edges are the calls, imports, and semantic relationships between them.
Launched in April 2026, Graphify crossed 22,000 GitHub stars in under ten days and got native integrations across ten AI coding platforms. Its headline number is the token saving: roughly 1.7k tokens per query vs. ~123k with a naive read-everything approach. That's a 71x reduction, which adds up fast if you're running a coding agent over a real project.
Why Pair It with OpenClaw?
OpenClaw agents are great at chat-first coding help — “explain this bug”, “write a migration for X”, “review this diff”. But they hit a wall on whole-repo questions: they can only see what you paste, and loading a whole service into context is expensive and noisy.
Graphify solves the retrieval side cleanly. Your OpenClaw agent gets a single “ask the graph” tool, the graph returns the few chunks that actually matter, and the agent reasons over those. Three concrete wins:
- Smaller context, sharper answers — the agent sees only the relevant slice of the codebase
- No amnesia between turns — the graph is persistent, so follow-ups work without re-pasting
- Code stays local — graph construction runs on your machine; only the query result travels to the model
How the Integration Works
Graphify ships as an MCP-compatible skill, which is the same protocol OpenClaw uses internally. The flow is:
- Run Graphify locally once to build the graph for your repo.
- Expose the graph query tool over MCP (Graphify ships helpers for this).
- Register the MCP endpoint in your OpenClaw config (see the MCP guide).
- Ask your agent a repo question from any channel. It calls the graph tool, gets back a curated chunk, and answers.
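The flow above can be sketched as a tiny agent-side loop. Everything here is illustrative: the `graph_query` tool name comes from the doc, but the response shape (a summary plus a list of file chunks) and the file paths are assumptions, not Graphify's actual API.

```python
# Illustrative sketch of the agent-side flow. The response shape
# (summary + chunks) and the file names are assumptions for illustration,
# not Graphify's documented API.

def graph_query(question: str) -> dict:
    """Stand-in for the MCP tool call; a real agent would go over the wire."""
    # Pretend the graph returned two relevant chunks and a short summary.
    return {
        "summary": "Credit deduction happens in the billing service.",
        "chunks": [
            {"file": "billing/renewal.py", "text": "def deduct_credits(sub): ..."},
            {"file": "billing/ledger.py", "text": "def record_charge(entry): ..."},
        ],
    }

def answer(question: str) -> str:
    result = graph_query(question)
    files = ", ".join(chunk["file"] for chunk in result["chunks"])
    # The agent reasons only over the returned slice, not the whole repo.
    return f"{result['summary']} See: {files}"

print(answer("Where do we deduct credits when a subscription renews?"))
```

The point of the shape: the model never sees the repository, only the curated slice the graph hands back.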
Setup
Step 1: Build the graph
From your repo root:
```shell
pip install graphify
graphify build .
```

Graphify walks the tree, runs Tree-sitter on code files, clusters with Leiden, and writes a graph artefact under `.graphify/`. The build happens on your machine — no source code is sent anywhere external.
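As a rough mental model of what a build step like this does (this is not Graphify's implementation — no Tree-sitter or Leiden here, just the walk-extract-link-persist skeleton), a toy version looks like:

```python
# Toy illustration of what a graph build does conceptually: walk the repo,
# extract function definitions, record which file defines what, and write
# a JSON artefact. NOT Graphify's actual implementation.
import json
import re
from pathlib import Path

def build_graph(repo: str) -> dict:
    nodes, edges = [], []
    for path in Path(repo).rglob("*.py"):
        rel = str(path.relative_to(repo))
        nodes.append({"id": rel, "kind": "file"})
        # Crude symbol extraction; a real tool would use a parser here.
        for fn in re.findall(r"^def (\w+)", path.read_text(), re.M):
            nodes.append({"id": fn, "kind": "function"})
            edges.append({"from": rel, "to": fn, "rel": "defines"})
    return {"nodes": nodes, "edges": edges}

def write_artefact(repo: str) -> Path:
    out = Path(repo) / ".graphify"
    out.mkdir(exist_ok=True)
    artefact = out / "graph.json"
    artefact.write_text(json.dumps(build_graph(repo), indent=2))
    return artefact
```

The real build adds semantic labels and community structure on top, but the artefact-on-disk idea is the same.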
Step 2: Expose the MCP endpoint
```shell
graphify mcp serve --port 7812
```

This starts a local MCP server your OpenClaw agent can call. Put it behind whatever networking you use for local tools — Tailscale, a reverse tunnel, or a sidecar, depending on where your agent is running.
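MCP tool invocations are JSON-RPC 2.0 messages. The envelope below follows the MCP `tools/call` shape; the specific arguments Graphify's tool expects are an assumption here, kept to a single `question` string to match the doc:

```python
import json

# JSON-RPC 2.0 envelope for an MCP tools/call request. The "question"
# argument is an assumption for illustration.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "graph_query",
        "arguments": {"question": "Where do we deduct credits on renewal?"},
    },
}

# This is the body an MCP client would send to the local server on port 7812.
print(json.dumps(request, indent=2))
```

You normally never write this by hand — OpenClaw's MCP client builds it for you — but it is useful to know what travels over the wire.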
Step 3: Wire it into OpenClaw
In your agent config on the dashboard, add the Graphify MCP endpoint. The generic pattern is covered in the OpenClaw MCP guide. Once connected, your agent sees a `graph_query(question)` tool.
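The exact schema is whatever the OpenClaw MCP guide specifies; as a purely hypothetical sketch (field names here are illustrative, not OpenClaw's documented config), an MCP server entry tends to look something like:

```json
{
  "mcpServers": {
    "graphify": {
      "url": "http://localhost:7812"
    }
  }
}
```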
Step 4: Ask repo questions
From any chat channel the agent is on (Telegram, Discord, web chat, whatever you configured), try:
- “Where do we deduct credits when a subscription renews?”
- “Which files implement the warm pool, and how are they connected?”
- “Show me every place we call the OpenRouter API.”
- “Walk me through the auth flow from login to session cookie.”
The agent calls the graph, gets back the relevant nodes (plus a short summary), and answers. Follow-ups stay in context because the graph handle persists across turns.
Keeping the Graph Fresh
The graph is a snapshot. Rebuild it when the code changes:
- Local workflow: rerun `graphify build .` as part of your save or pre-commit hook.
- Team workflow: rebuild nightly in CI, publish the graph artefact to an object store, and point the MCP server at the latest copy.
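If you want the hook to rebuild only when it has to, a small staleness check works: compare the artefact's modification time against the newest source file. A sketch — the `.graphify/graph.json` artefact path is an assumption:

```python
# Decide whether the graph needs a rebuild: is any source file newer than
# the artefact? The artefact path .graphify/graph.json is an assumption.
from pathlib import Path

def graph_is_stale(repo: str) -> bool:
    artefact = Path(repo) / ".graphify" / "graph.json"
    if not artefact.exists():
        return True  # never built
    built_at = artefact.stat().st_mtime
    sources = (p for p in Path(repo).rglob("*.py") if ".graphify" not in p.parts)
    return any(p.stat().st_mtime > built_at for p in sources)

# In a pre-commit hook, shell out to `graphify build .` when this returns True.
```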
When Graphify Is Overkill
Not every OpenClaw use case needs a knowledge graph:
- Single-file or tiny-repo work — just paste the file. A graph adds latency without much payoff.
- Chat-first assistants (personal bots, customer support) — unrelated to repos.
- Docs search — Firecrawl + a vector store is usually simpler.
Reach for Graphify when the question is “how does this codebase work?” and the codebase is big enough that the answer lives across many files.
Bottom Line
Graphify and OpenClaw compose well: Graphify does the retrieval, OpenClaw does the conversation. For teams running a coding-oriented agent in Slack, Discord, or Telegram, this is one of the cleanest ways to make it genuinely helpful on a real codebase — cheap per query, no code leaves the box, and the agent stops hallucinating files that don't exist.
Next Steps
- OpenClaw MCP guide — how to register any MCP tool
- OpenClaw memory — persistent memory for the agent itself
- Best MCP servers — curated list for coding agents
- Deploy on OpenClaw Launch — chat agent + your graph in one place