PSR Field Report- Privacy governance as product enabler, not department of no
In October I spent two days at IAPP Privacy. Security. Risk. 2025 in San Diego, watching 500+ practitioners try to solve problems that didn't exis…
In October I spent two days at IAPP Privacy. Security. Risk. 2025 in San Diego, watching 500+ practitioners try to solve problems that didn't exis…
Backends are retreating to governance roles while AI agents become the execution layer. InfoQ's analysis shows this architectural shift is already happening in production at banks, healthcare systems, and call centers—with major implications for legal teams.
Organizations are building AI compliance functions like they built human compliance departments—but without the foundational work of defining what compliance means for autonomous systems that operate in unanticipated contexts.
Comet can't distinguish user commands from malicious instructions hidden in websites, treating poisoned blog posts and social media content as legitimate orders while operating with full access to emails and authenticated sessions.
Ant Group's Online RLHF eliminates reward models from training, cutting costs 50% while processing 1 trillion tokens on eight GPUs—proof that removing complexity beats adding scale.
Mission owners coordinate humans and AI agents, but when your agent makes the wrong autonomous decision, who bears legal liability? Before redesigning workflows, build the accountability architecture underneath.
AI agents promise to be indispensable by remembering everything about you. But if those memories can't transfer between platforms, you're locked in. Control over memory is control over identity—and companies are writing the rules
In October, I spent two days at IAPP Privacy Security & Risk 2025 in San Diego, watching 500+ practitioners try to solve problems that didn't…