When autonomous agents need their own compliance departments
Organizations are building AI compliance functions like they built human compliance departments—but without the foundational work of defining what compliance means for autonomous systems that operate in unanticipated contexts.
MarkTechPost published a tutorial on building "ethically aligned" autonomous agents, and it's worth examining not for the technical implementation but for what it tells us about organizational readiness.
The framework is straightforward: a policy model generates candidate actions, an ethics judge evaluates them against organizational values (respect privacy, follow laws, avoid manipulation), and the system selects the lowest-risk option. Demo case: a banking agent that needs to follow disclosure regulations while driving customer adoption.
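To make the shape of the thing concrete, here's a minimal sketch of that loop. Everything here is illustrative: the function names are mine, and the keyword-scoring judge is a crude stand-in for what the tutorial implements with LLM calls for both the policy model and the ethics judge.

```python
from dataclasses import dataclass

# Organizational values the judge scores against (the tutorial's examples).
VALUES = ["respect privacy", "follow laws", "avoid manipulation"]

@dataclass
class Candidate:
    action: str
    risk: float = 0.0

def propose_actions(task: str) -> list[Candidate]:
    """Stand-in for the policy model; in the tutorial this is an LLM call."""
    return [
        Candidate("Recommend the premium account, mention fees in fine print"),
        Candidate("Recommend the premium account with fees disclosed up front"),
        Candidate("Ask the customer about their goals before recommending"),
    ]

def judge_risk(candidate: Candidate) -> float:
    """Stand-in for the ethics judge; the tutorial uses a second LLM here.
    A keyword heuristic just to show where the scoring happens."""
    risky_phrases = {"fine print": 0.8, "urgency": 0.6, "guarantee": 0.5}
    return sum(w for p, w in risky_phrases.items() if p in candidate.action.lower())

def select_action(task: str) -> Candidate:
    candidates = propose_actions(task)
    for c in candidates:
        c.risk = judge_risk(c)
    return min(candidates, key=lambda c: c.risk)  # lowest-risk option wins

if __name__ == "__main__":
    best = select_action("drive premium adoption within disclosure rules")
    print(f"selected: {best.action!r} (risk={best.risk})")
```

Mechanically, that's the whole architecture: generate, score, pick the minimum. Which is exactly why the interesting questions aren't in the code.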
What strikes me is how this mirrors human organizational structure. We built compliance departments for employees. Now we're building them for agents. Same function, different substrate.
But there's a gap between building the mechanism and having the organizational capacity to define what it should enforce. The tutorial provides sample values—"prioritize user well-being and long-term trust over short-term gain"—but translating corporate principles into operational guardrails isn't a technical problem. It's a governance maturity problem.
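To see why, try operationalizing that sample value. A hypothetical guardrail for "prioritize long-term trust over short-term gain" might look like the sketch below; the field names and thresholds are invented, and that's the point, because every constant is a governance decision someone has to own, not a line an engineer can derive.

```python
# Hypothetical operationalization of one sample value. The code is trivial;
# every constant in it is an organizational judgment call.
def violates_long_term_trust(action_metadata: dict) -> bool:
    # Who decides 0.3 is the line between persuasion and manipulation?
    if action_metadata.get("manipulation_score", 0.0) > 0.3:
        return True
    # Who decides short-term gain can't exceed 2x projected churn cost?
    short_term = action_metadata.get("short_term_gain", 0.0)
    churn_cost = action_metadata.get("projected_churn_cost", 0.0)
    return short_term > 2 * churn_cost
```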
As I argued in my piece on the infrastructure reality check for agentic AI, companies are treating agents as employees without having done the foundational work: What decisions can agents make autonomously? What requires escalation? How do we audit their reasoning? Who's accountable when the risk scoring fails?
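Answering those questions produces artifacts, not just policies. A hypothetical escalation policy for the tutorial's banking agent might look like this; the tiers and action names are mine, and filling them in correctly is exactly the foundational work most organizations haven't done.

```python
# Hypothetical escalation policy: the kind of artifact the foundational
# work would produce before any agent goes live.
ESCALATION_POLICY = {
    "autonomous":   {"answer_product_questions", "send_disclosure_docs"},
    "human_review": {"recommend_account_change", "waive_fee"},
    "forbidden":    {"open_account", "modify_credit_limit"},
}

def route(action_type: str) -> str:
    for tier, actions in ESCALATION_POLICY.items():
        if action_type in actions:
            return tier
    # Unknown actions escalate by default; the miss itself is an audit signal.
    return "human_review"
```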
The tutorial shows how to build the compliance function. It doesn't address whether organizations know what compliance means for autonomous systems, or whether they're prepared for the fact that these systems will operate in contexts their builders didn't anticipate.
This is proactive architecture applied to reactive governance—building the structure before we've figured out what it should contain. I've seen this pattern before. It doesn't end well.

