The creation and consumption of content have fundamentally changed in the post-GPT world. On the face of it, the US and UK are taking dramatically different approaches to one of the most contentious issues of our time: whether AI companies should be able to use copyrighted material without permission to train their models.
But here's the uncomfortable truth that politicians and lawyers seem reluctant to acknowledge: the horses have already bolted from the stable. Large language models are already trained, deployed, and generating billions of responses daily using whatever data they ingested during their development. Whether you view this as the greatest theft in history or legitimate fair use, Pandora's box is wide open, and no amount of parliamentary procedure or copyright office reorganisation is going to close it.
The debate has reached a fever pitch on both sides of the Atlantic, but while the UK engages in thorough parliamentary debate and the United States cuts through complexity with rapier-like swiftness, the fundamental reality remains unchanged. As George Bernard Shaw observed, "The reasonable man adapts himself to the world; the unreasonable man persists in trying to adapt the world to himself. Therefore all progress depends on the unreasonable man." The AI pioneers were decidedly unreasonable—they moved fast, broke things, and built systems that are now integral to how millions of people work and create.
In Britain, the government's proposal to allow AI companies to use copyright-protected work without permission—unless creators actively opt out—has sparked one of the most significant cultural policy debates in recent memory. The plan would fundamentally shift the burden from AI companies seeking permission to creators trying to protect their work.
The response has been swift and unified. More than 400 leading figures from across the UK's creative industries have signed an open letter to Prime Minister Keir Starmer, including household names like Paul McCartney, Dua Lipa, Coldplay, Ian McKellen, and organisations like the Royal Shakespeare Company. Their message is unequivocal: "We will lose an immense growth opportunity if we give our work away at the behest of a handful of powerful overseas tech companies."
The creative community describes copyright as the "lifeblood" of their professions, warning that the proposed changes threaten Britain's status as a creative powerhouse. Music producer Giles Martin, son of Beatles producer George Martin, captured the practical challenge perfectly: "When Paul McCartney wrote Yesterday his first thought was 'how do I record this' and not 'how do I stop someone stealing this.'"
This sentiment, while understandable, reflects the old paradigm. In today's reality, creators need to think about protection from the moment of creation, because the "unreasonable" technologists have already demonstrated they'll use whatever they can access. It's like owning a house with no lock on the door, or indeed no front door at all.
The House of Lords has twice voted to require AI companies to reveal which copyrighted material they've used in their models, with crossbench peer Beeban Kidron leading the charge. However, the government has pushed back using parliamentary procedure, including invoking financial privilege to strip transparency amendments from the data bill.
Technology Secretary Peter Kyle initially championed the opt-out system but has since signalled it's no longer his preferred option. The government now says it's considering four possibilities: leaving things unchanged, requiring AI companies to seek licences, allowing the proposed opt-out system, or permitting AI firms to use copyrighted work with no restrictions at all.
This democratic process, while admirable in its deliberation, may be fundamentally addressing yesterday's problem. The question isn't whether AI companies should be allowed to use copyrighted material—they already have. The question is how creators get paid for that use, both retroactively and going forward.
Across the Atlantic, the Trump administration appears to have chosen a more direct approach reminiscent of Alexander the Great cutting through the Gordian knot. Rather than untangling the complex legal and ethical questions surrounding AI and copyright, the administration has simply removed key players from the equation.
In May 2025, Trump fired Shira Perlmutter, the head of the US Copyright Office, just days after she published a report questioning AI companies' growing data needs and casting doubt on their expressed need to circumvent existing copyright laws. This followed the dismissal of Carla Hayden, the Librarian of Congress who oversees the copyright office—the first woman and first Black person to serve in that role.
The timing was particularly striking. Perlmutter's report, while not highly critical of AI development, stated that "government intervention would be premature at this time" regarding copyright changes. For an administration influenced by our dear friend Elon, such measured resistance may have been unwelcome.
Democratic Representative Joe Morelle called the firing "a brazen, unprecedented power grab," specifically pointing to Musk's influence: "It is surely no coincidence he acted less than a day after she refused to rubber-stamp Elon Musk's efforts to mine troves of copyrighted works to train AI models."
While this approach may, to some, lack democratic legitimacy, it does acknowledge the reality that traditional copyright enforcement mechanisms are inadequate for the AI age. The administration seems to be clearing the deck for a new reality rather than fighting to preserve an old one that was already breached.
Both countries are grappling with the same fundamental tension, but they're both missing the most practical solution. The UK's creative industries are worth £120 billion to the economy, theoretically representing massive collective economic power. Instead of fighting endless legal battles that will primarily enrich lawyers regardless of outcome, creators need to recognise their strength lies in unity, not litigation.
Let's be honest: copyright law was already struggling to keep pace with the internet age. Musicians watched their revenue streams collapse as file sharing evolved into streaming services that pay fractions of pennies per play. Publishers saw their content aggregated and summarised by search engines. Photographers discovered their images scattered across the web without attribution. The legal frameworks designed for physical media and clear-cut reproduction were inadequate for digital distribution, let alone the wholesale ingestion of content by machine learning systems.
Now, in the LLM age, those same inadequate legal structures are being asked to govern technologies that can absorb and remix the entirety of human creative output. It's like trying to regulate space travel with maritime law.
The practical implications extend far beyond legal theory. As Lady Kidron argued in the House of Lords: "Creators do not deny the creative and economic value of AI, but we do deny the assertion that we should have to build AI for free with our work, and then rent it back from those who stole it."
This is exactly right, but the solution isn't more lawsuits or parliamentary procedures. The solution is collective bargaining power. Copyright holders need to organise into something entirely new—a creative coalition that makes the Empire's Death Star look like a modest engineering project. Not to block AI development, but to negotiate fair payment for the use of their work, both past and future.
Think about it practically: individual artists trying to opt out of AI training datasets are playing whack-a-mole with companies that have already scraped most of the internet. But the collective owners of copyrighted material—publishers, record labels, film studios, news organisations, and artist collectives—represent the content that makes AI models valuable. Together, they have negotiating power that no individual creator could match.
This isn't about existing rights organisations like ASCAP or BMI, useful as they are. This is about something bigger: a rebel alliance of content creators striking back against the dark side of the force that assumes human creativity should be free raw material for corporate profit.
The opt-out system proposed in the UK presents particular challenges for emerging artists who lack the resources or knowledge to effectively protect their work. But collective action could level that playing field, ensuring that even new creators benefit from the negotiating power of established organisations.
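To make that burden concrete: in practice, "opting out" today largely means maintaining a robots.txt file that names each AI crawler individually and asks it to stay away. A rough sketch is below; the user-agent tokens shown (GPTBot, Google-Extended, CCBot) are published by OpenAI, Google, and Common Crawl respectively, but the full list is illustrative only and grows as new crawlers appear.

    # robots.txt: ask known AI crawlers not to ingest this site's content.
    # Compliance is voluntary, and it does nothing about data already scraped.

    User-agent: GPTBot
    Disallow: /

    User-agent: Google-Extended
    Disallow: /

    User-agent: CCBot
    Disallow: /

Even a perfectly maintained file like this only affects future crawls by cooperative bots. It is precisely the whack-a-mole described above, which is why collective bargaining is the more plausible lever than asking every individual creator to police every crawler on every site.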
The contrasting approaches between the UK and US reveal different philosophies, but both are essentially rearranging deck chairs while the ship of AI development sails on. The "unreasonable" technologists have already built systems that are transforming entire industries. Fighting to return to a pre-AI copyright regime is like trying to uninvent the printing press.
Instead of endless consultation periods and impact assessments, creators need to focus on practical power. This means organising collectively to demand payment for AI training data use—not as charity or goodwill, but as the price of access to the content that makes AI systems valuable.
The debate also highlights the importance of thinking beyond traditional copyright frameworks. Generative AI models require vast amounts of training data sourced primarily from online content, including Wikipedia, YouTube, newspaper articles, and digital book archives. Rather than fighting over whether this use should be permitted, the focus should be on ensuring it is compensated.
Some licensing deals between AI companies and publishers are already emerging, proving that negotiated solutions are possible. These early agreements point toward a future where collective bargaining could replace individual lawsuits as the primary mechanism for protecting creator interests.
For creators watching these developments, the lesson is clear: strength comes from organisation, not litigation. The unified response from UK artists has influenced government thinking, demonstrating the power of collective action. But that same energy would be far more effective focused on direct negotiations with AI companies rather than parliamentary lobbying. Yes, lobbying wins artists airtime and the halo effect of championing the 'little man', but it doesn't achieve a lot.
The international nature of both AI development and creative industries means that effective collective action must also be international. A creator cartel that operates only in one jurisdiction will find itself negotiating with companies that can simply base their operations elsewhere.
The question isn't whether AI companies will use copyrighted material—they already have, and will continue to do so. The question is whether creators will organise effectively enough to get paid for that use.
The stakes are too high for either nostalgic appeals to traditional copyright law—which was already broken for the digital age—or naive faith that technology companies will voluntarily do the right thing. Copyright laws do need to catch up, but waiting for that glacial process while companies profit from existing content is a losing strategy.
What's needed is practical collective action that treats AI companies as business partners to be negotiated with, not enemies to be defeated in court. The "unreasonable" technologists have already won—now it's time for creators to be equally unreasonable in demanding their fair share of the value they've helped create.
Here's a call to arms for all content owners who are tired of watching the LLM Empire strip-mine their creativity for corporate profit: if you're interested in striking back and joining forces to build something that actually has the negotiating power to demand fair compensation, drop me an email. The Force may be strong with Big Tech, but it's stronger when creators unite. And I like a fight.
That said, I rationally thought the people of Britain would realise we were stronger in Europe than out of it, and I was spectacularly wrong about that. So I'm not exactly holding my breath waiting for creators to embrace collective action over individual grievance either. But hope springs eternal, and the alternative—watching AI companies harvest decades of human creativity for free while creators squabble over parliamentary procedures—is too depressing to accept without a fight.
Lawyers will get rich regardless of which approach wins. The question is whether creators will organise effectively enough to ensure they do too.