Switzerland proves responsible AI development works
Compliance as a feature, not an afterthought: What Switzerland's Apertus model means for AI procurement
Switzerland has proved something I've been telling AI teams: you can build competitive models while respecting copyright from the outset. Its new Apertus model, a 70-billion-parameter system trained entirely on publicly available data with proper opt-out mechanisms, performs comparably to systems built on questionable legal foundations.
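For readers wondering what an opt-out mechanism means in practice, here is a minimal sketch of the general idea, not Apertus's actual pipeline: a crawler consults a site's robots.txt before a page is allowed into a training corpus. The user agent string and helper names below are hypothetical.

```python
# Illustrative sketch of opt-out-aware data collection: before a page enters
# a training corpus, check whether the site's robots.txt disallows crawling
# for our (hypothetical) user agent. This shows the general opt-out idea,
# not the actual Apertus data pipeline.
from urllib.parse import urlparse
from urllib.robotparser import RobotFileParser

USER_AGENT = "example-ai-crawler"  # hypothetical crawler name


def allowed_by_robots(url: str) -> bool:
    """Return True if the site's robots.txt permits fetching this URL."""
    parts = urlparse(url)
    robots_url = f"{parts.scheme}://{parts.netloc}/robots.txt"
    parser = RobotFileParser()
    parser.set_url(robots_url)
    try:
        parser.read()  # fetch and parse the site's robots.txt
    except OSError:
        return False  # if robots.txt cannot be read, err on the side of exclusion
    return parser.can_fetch(USER_AGENT, url)


def filter_corpus(urls: list[str]) -> list[str]:
    """Keep only URLs whose sites have not opted out via robots.txt."""
    return [u for u in urls if allowed_by_robots(u)]


if __name__ == "__main__":
    sample = ["https://example.com/article", "https://example.org/blog/post"]
    print(filter_corpus(sample))
```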
The design approach matters here. While American companies lobby for copyright exemptions, Switzerland worked within existing law. Transparency is central to the release: full training documentation, source code, everything public. Product teams evaluating AI partnerships get something rare: regulatory certainty without sacrificing performance.
This changes how compliance conversations work. Switzerland treated respect for copyright as a feature that European companies and institutions actually want when they're procuring AI systems. Fewer liability questions later. Cleaner due diligence. The kind of certainty that makes procurement decisions easier.
Organizations building AI strategies face a straightforward choice. Models trained on legally questionable data create risks that alternatives like Apertus avoid entirely. You either explain to stakeholders why standard legal principles shouldn't apply to your AI vendor, or you pick the option that doesn't raise the question.