# Dhee — Self-Evolving Memory & Context Router for AI Agents
Dhee is an open-source memory and context-router layer for LLM agents. It cuts token usage by 90%+ on
Claude Code, Cursor, Codex, Gemini CLI, Aider, Cline, and any MCP-compatible client, and leads the
LongMemEval benchmark with Recall@1 of 94.8% and Recall@5 of 99.4% on the full 500-question set.
## What Dhee does
- Self-evolving memory: chunks your CLAUDE.md, AGENTS.md, and skills library into decay-aware vector memory; injects only the ~300 relevant tokens per turn.
- Context router: digests Read, Bash, and subagent output at source — a 10 MB git log becomes a 40-token summary with a pointer the model can expand if needed.
- Perfect recall, years in: Ebbinghaus decay, auto-promotion, and insight synthesis keep the per-turn cost flat even as memory crosses 50,000 entries.
- Self-tuning policy: retrieval depth is learned per tool and per intent from observed expansion patterns, rather than hand-configured.
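The decay-aware scoring behind the first and third bullets can be sketched as follows. This is an illustrative model, not Dhee's actual API: the function names and the stability parameter are assumptions. The idea is an Ebbinghaus-style forgetting curve, retention = e^(-t/S), where t is the time since an entry was last accessed and S is a stability term that grows as the entry proves useful; retrieval blends this retention with vector similarity so stale entries rank lower without being deleted.

```python
import math

# Hypothetical sketch of decay-aware memory scoring; names are illustrative,
# not Dhee's real interface.

def retention(age_seconds: float, stability: float) -> float:
    """Ebbinghaus forgetting curve: retention decays exponentially with age."""
    return math.exp(-age_seconds / stability)

def score(similarity: float, age_seconds: float, stability: float) -> float:
    """Blend semantic similarity with decay so stale entries rank lower."""
    return similarity * retention(age_seconds, stability)

WEEK = 7 * 24 * 3600  # assumed stability horizon for this example

# Two entries with identical similarity: one touched an hour ago,
# one untouched for ~90 days.
fresh = score(0.8, age_seconds=3600, stability=WEEK)
stale = score(0.8, age_seconds=90 * 24 * 3600, stability=WEEK)
```

Under this model, "auto-promotion" would amount to increasing S for entries that keep getting retrieved, which is what keeps per-turn cost flat as the store grows: old-but-useful entries stay cheap to find, and old-and-unused ones fade out of the candidate set.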
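The context-router behavior (a 10 MB git log becoming a ~40-token summary plus an expandable pointer) can be sketched like this. The function and its return shape are assumptions for illustration, not Dhee's real interface: oversized tool output is replaced by a short digest and a content-addressed handle the model can ask to expand later.

```python
import hashlib

# Illustrative digest step (assumed names, not Dhee's actual API):
# small outputs pass through inline; large outputs are summarized
# and stored behind a content-addressed pointer.

def digest_tool_output(output: str, limit: int = 2000) -> dict:
    if len(output) <= limit:
        return {"inline": output}
    pointer = hashlib.sha256(output.encode()).hexdigest()[:12]
    first_line = output[:200].splitlines()[0] if output else ""
    return {
        "summary": f"{len(output)} chars, {output.count(chr(10)) + 1} lines; "
                   f"starts: {first_line[:80]!r}",
        "pointer": pointer,  # the model can request the full content via this handle
    }
```

The design point is that the digest happens at the source (the tool boundary), so the full payload never enters the context window unless the model explicitly dereferences the pointer.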
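The self-tuning policy can be understood as a simple feedback loop; the class below is a minimal sketch under assumed names, not Dhee's implementation. Per (tool, intent) pair, the retrieval depth is nudged up when the model expands a pointer (evidence the digest was too thin) and down when it never does.

```python
from collections import defaultdict

# Hypothetical self-tuning retrieval policy (illustrative only):
# depth per (tool, intent) adapts to observed expansion behavior.

class RetrievalPolicy:
    def __init__(self, default_depth: int = 3, min_depth: int = 1, max_depth: int = 10):
        self.depth = defaultdict(lambda: default_depth)
        self.min_depth, self.max_depth = min_depth, max_depth

    def get(self, tool: str, intent: str) -> int:
        return self.depth[(tool, intent)]

    def feedback(self, tool: str, intent: str, expanded: bool) -> None:
        # Expansion means the summary was too shallow: retrieve more next time.
        step = 1 if expanded else -1
        key = (tool, intent)
        self.depth[key] = max(self.min_depth, min(self.max_depth, self.depth[key] + step))
```

A real policy would presumably decay these adjustments and separate exploration from exploitation, but the core loop, depth following expansion signals rather than hand-set config, is the same.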
## Works with
Claude Code, Cursor, Codex, Gemini CLI, Aider, Cline, Goose, Claude Desktop, Python SDK, CLI, and Docker.