WeTransfer's AI terms disaster reveals the trust gap in legal strategy

WeTransfer's swift reversal on AI training rights shows how standard legal approaches to AI can instantly undermine brand trust, even at privacy-conscious companies that should know better.


WeTransfer's recent terms of service debacle demonstrates how not to introduce AI policies. The company tried to quietly add expansive AI training rights to its terms, granting itself a perpetual, worldwide, sub-licensable license to user content, only to reverse course within days after creative professionals revolted.

WeTransfer slipped in language that would have let it commercialize AI models trained on user files, sublicense content to third parties, and develop new technologies, all without notifying or compensating users. For a platform whose business is built on the trust of creative professionals, this was a complete reversal of its core promise.

The explanation made things worse. WeTransfer claimed the language was meant to cover potential future AI content moderation tools, not commercial training. But if that was the intent, why grant such sweeping rights? The explanation revealed either sloppy legal drafting or after-the-fact rationalization.

The pattern is familiar. Companies are retrofitting AI policies onto existing platforms without considering how those policies interact with their brand promises. WeTransfer joins Adobe, Zoom, and others who've had to walk back overreaching AI terms after user backlash.

AI governance can't be treated as just another contract update. Users now spot contradictions between company values and legal frameworks in real time, and the cost of getting this wrong, in user trust and brand damage, often exceeds whatever flexibility the broad terms were meant to preserve. Product counsel working on AI policies need to test new terms against existing brand commitments before users do it for them.
