Observational memory changes the AI governance equation
AI agents are moving from retrieving data to building memories about users. Most privacy frameworks weren't designed for that shift — and the gap is widening fast.
AI governance isn't abstract—it's decisions under constraints. Foundations covers what matters: tech concepts vital to governance (yes, we geek out here), how obligations work in practice, what privacy means for product design, and why the frameworks taking shape now determine what you can build next.
Engineers call this context management. Lawyers should call it something else: selective deletion with no retention policy.
AI agents with memory aren't just smarter — they're harder to govern. Each memory layer creates distinct privacy and retention obligations product counsel needs to address at the architecture stage.
MCP servers let AI agents access your APIs without custom code. Most weren't built for production security. That gap between "works in demo" and "safe at scale" is where the liability lives.
When long-running AI agents summarize their own context to stay within token limits, they're deciding what to forget. That's not an engineering problem — it's a governance one.
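The mechanism is easy to see in miniature. Below is a minimal sketch of a rolling-summary context manager: when the history exceeds a token budget, the oldest messages are collapsed into a lossy summary and the originals are discarded. All names are hypothetical, and the tokenizer and summarizer are crude stand-ins for real model calls.

```python
def count_tokens(text: str) -> int:
    # Crude stand-in for a real tokenizer: roughly one token per word.
    return len(text.split())

def summarize(messages: list[str]) -> str:
    # Stand-in for an LLM call that compresses older context.
    return "SUMMARY(" + "; ".join(m[:20] for m in messages) + ")"

class RollingContext:
    def __init__(self, budget: int):
        self.budget = budget
        self.messages: list[str] = []

    def add(self, message: str) -> None:
        self.messages.append(message)
        # Compress oldest messages until we fit the budget; the
        # originals are gone after this, with no retention record.
        while (sum(count_tokens(m) for m in self.messages) > self.budget
               and len(self.messages) > 1):
            half = max(1, len(self.messages) // 2)
            summary = summarize(self.messages[:half])
            self.messages = [summary] + self.messages[half:]

ctx = RollingContext(budget=12)
for note in ["user lives in Berlin", "user asked about visas",
             "user mentioned a health condition", "user booked a flight"]:
    ctx.add(note)
# The oldest notes now survive only inside a lossy summary.
```

Nothing in this loop records what was dropped or why, which is exactly the gap between "context management" and a retention policy.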
SaiKrishna Koorapati's piece in VentureBeat makes the case that observable AI isn't about adding monitoring dashboards. It's about audit trails that connect every AI decision back to its prompt, policy, and outcome.
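One minimal shape such a trail could take, sketched under my own assumptions (these names and fields are illustrative, not from the article): an append-only record that ties each decision to a prompt hash, the policy version in force, and the outcome.

```python
import hashlib
import json
import time
from dataclasses import dataclass, asdict

@dataclass(frozen=True)
class AuditEntry:
    decision_id: str
    prompt_sha256: str    # hash, so the trail need not store raw prompts
    policy_version: str   # which policy was in force at decision time
    outcome: str
    timestamp: float

def record_decision(decision_id: str, prompt: str,
                    policy_version: str, outcome: str) -> AuditEntry:
    """Build one immutable audit record linking decision to prompt,
    policy, and outcome."""
    return AuditEntry(
        decision_id=decision_id,
        prompt_sha256=hashlib.sha256(prompt.encode()).hexdigest(),
        policy_version=policy_version,
        outcome=outcome,
        timestamp=time.time(),
    )

entry = record_decision("dec-001", "Approve refund?", "policy-v12", "approved")
log_line = json.dumps(asdict(entry))  # append to a write-once log
```

Hashing the prompt is one design choice among several: it keeps the trail verifiable without turning the audit log itself into a new store of personal data.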
The accountability gap doesn't just create compliance risk. It creates operational security risk. When model developers point to deployers and deployers point to model developers, the space between them becomes the attack surface.
A new research paper from Stanford, Harvard, UC Berkeley, and Caltech — "Adaptation of Agentic AI" — provides the clearest framework I've seen for diagnosing what goes wrong when agentic AI systems move from controlled demonstrations to real-world deployment.