OpenClaw 2026.4.12: Active Memory + Local Model Lean Mode (Practical Upgrade Notes)
OpenClaw v2026.4.12 shipped as a quality-focused release, but two changes matter most for operators running personal/self-hosted agent stacks: Active Memory and leaner local-model defaults.
This update is worth prioritizing if you care about reducing context bloat, improving recall quality in long chats, and keeping small local models responsive.
What’s new (high signal)
1) Active Memory plugin is now first-class
The release adds an optional blocking memory sub-agent that runs before the main response, so memory retrieval is proactive instead of waiting for users to say “remember this” or “search memory.”
Why this matters in practice:
- Better continuity in long-running chats
- Fewer missed preferences/decisions
- Less manual prompting for memory lookups
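To make the "blocking sub-agent before the main response" idea concrete, here is a minimal Python sketch of that flow. All names here are illustrative, not OpenClaw APIs: the memory lookup runs first and is bounded by a timeout, mirroring the plugin's timeoutMs guard, so a slow lookup can't stall the reply.

```python
from concurrent.futures import ThreadPoolExecutor
from concurrent.futures import TimeoutError as FutureTimeout


def retrieve_memories(query: str) -> list[str]:
    # Stand-in for the memory sub-agent's search; a real plugin would
    # query a memory store here. This toy version matches keywords.
    store = {"coffee": ["User prefers oat milk."]}
    return [m for key, mems in store.items() if key in query.lower() for m in mems]


def answer(query: str, timeout_s: float = 2.0) -> str:
    # Blocking pre-step: wait for memory retrieval, but never longer
    # than the configured timeout.
    memories: list[str] = []
    with ThreadPoolExecutor(max_workers=1) as pool:
        future = pool.submit(retrieve_memories, query)
        try:
            memories = future.result(timeout=timeout_s)
        except FutureTimeout:
            pass  # degrade gracefully: respond without recalled context
    if memories:
        return f"[context: {' '.join(memories)}] answering: {query}"
    return f"answering: {query}"
```

The key design point is the graceful timeout path: when retrieval is slow, the agent answers without recalled context rather than blocking indefinitely.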
The docs also clarify safe defaults, including:
- `allowedChatTypes: ["direct"]`
- a bounded timeout (`timeoutMs`)
- transcript persistence off by default unless debugging
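Put together, a bounded Active Memory configuration might look like the sketch below. Only `allowedChatTypes` and `timeoutMs` come from the release notes; the surrounding structure and the other key names (`plugins`, `activeMemory`, `enabled`, `persistTranscripts`) are assumptions for illustration, so check your installed config schema before copying.

```jsonc
{
  "plugins": {                        // nesting assumed, not confirmed
    "activeMemory": {
      "enabled": true,
      "allowedChatTypes": ["direct"], // safe default from the release notes
      "timeoutMs": 2000,              // bounded timeout; value illustrative
      "persistTranscripts": false     // keep off unless debugging (key name assumed)
    }
  }
}
```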
2) New local-model optimization path
The release introduces `agents.defaults.experimental.localModelLean: true`, designed to reduce default tool/context weight for weaker local setups.
In plain terms: if your self-hosted model struggles with large prompts, this flag helps by trimming heavy defaults so responses stay faster and more stable.
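As a sketch, the dotted path named in the release notes would expand to the following config fragment; the surrounding file layout is an assumption, but the nesting follows the path itself.

```jsonc
{
  "agents": {
    "defaults": {
      "experimental": {
        "localModelLean": true  // trim heavy tool/context defaults for small local models
      }
    }
  }
}
```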
3) Reliability and security hardening across gateway/tooling
There are meaningful hardening fixes around tool-name collisions, media trust boundaries, and packaging/runtime isolation. For production operators, this is a “fewer weird edge cases later” release.
Quick operator checklist
If you’re upgrading this week:
- Upgrade to `v2026.4.12` in a staging environment first.
- Enable Active Memory only for your main conversational agent initially.
- Keep Active Memory bounded (`timeoutMs`, summary length), and keep transcript persistence off unless troubleshooting.
- If you run smaller local models, test `localModelLean` to reduce prompt/tool overhead.
- Re-validate your approval/tool-call flows after upgrade.
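A staging configuration that follows this checklist might combine both features conservatively, as sketched below. The `localModelLean` path is the one named in the release notes; every other key name and the per-agent layout are assumptions for illustration only.

```jsonc
{
  "agents": {
    "defaults": {
      "experimental": {
        "localModelLean": true      // only if you run smaller local models
      }
    },
    "main": {                        // enable Active Memory on one agent first
      "plugins": {                   // plugin nesting assumed, not confirmed
        "activeMemory": {
          "enabled": true,
          "allowedChatTypes": ["direct"],
          "timeoutMs": 2000,         // illustrative bound
          "persistTranscripts": false
        }
      }
    }
  }
}
```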
Why this topic now
Clawly already covered earlier April releases, but 2026.4.12 is the first one in this cycle that combines memory architecture changes with practical local-model ergonomics. That combo has immediate impact for technical users running real agent workflows day to day.
Sources
- OpenClaw GitHub release `v2026.4.12`
- OpenClaw docs: Active Memory concept + config guidance
Protect your AI agent with Clawly
Deploy your OpenClaw agent in an isolated, hardened container with encrypted credentials and managed updates. No DevOps required.