UK Government’s Stance on AI and Copyright Ignites Creator Fury Amidst Deepening Economic Threats
The United Kingdom’s vibrant creative sector is facing an unprecedented crisis, with revelations that government ministers have consistently failed to engage with leading artists’ organizations on the critical issue of Artificial Intelligence and copyright. This lack of dialogue comes as new data highlights the severe economic repercussions AI is already inflicting on creators, painting a grim picture for the future of artistic livelihoods.
The latest insights emerged from a Westminster policy conference, where Natasha Mangal, Legal and Policy Advisor for Creator Relations at CISAC (the International Confederation of Societies of Authors and Composers), presented stark figures illustrating the financial devastation AI poses. CISAC, a global network representing over five million creators across music, audiovisual, drama, literature, and visual arts, has been at the forefront of advocating for artists’ rights in the face of AI’s transformative, and often exploitative, capabilities.
AI’s Looming Shadow: A Ten-Billion-Euro Threat to Music
Mangal unveiled findings from a CISAC global economic study, which projected that within the music sector alone, a staggering 24% of creators’ revenues could be at risk by 2028. This translates to a cumulative loss exceeding €10 billion (approximately $11.7 billion USD) in the coming years. This projection, made two years ago, is now considered a conservative estimate given the accelerating proliferation of AI-generated music flooding streaming platforms.
"In 2024, CISAC released a global economic study on the impacts of AI on the creative industries," Mangal stated at the Westminster Media Forum conference on Music Policy. "It found that, in the music sector alone, 24% of creators’ revenues would be at risk by 2028 due to the substitutive effect of AI outputs on the marketplace. This represents a cumulative loss of over €10 billion [$11.7 billion] over the next few years."
The urgency of the situation, characterized by Mangal as "very high economic stakes," necessitates immediate policy interventions. She outlined four key priorities for governments worldwide:
Transparency as the Cornerstone of Licensing
The foremost priority identified is the imperative for absolute transparency from AI model providers. This means holding these entities fully accountable for disclosing the specific sources of protected works used in their training data and for detailing how that content is utilized. Crucially, rightsholders must be granted the ability to audit these disclosures to ensure compliance and accuracy.
Mangal pointed to ongoing struggles within the European Union, citing the recent Voss Report. This report underscored that the transparency requirements embedded within the EU AI Act are not being adequately met, with current disclosure templates proving insufficient. This highlights a global challenge in ensuring that the foundational principles of AI regulation translate into meaningful protections for creators.
Developer Accountability for Data Sourcing
Secondly, Mangal emphasized that AI system developers, not end-users, should bear full responsibility for their data sourcing decisions. This principle dictates that AI services should prioritize the use of legally obtained and fully licensed copyrighted content. The UK’s proposed Creative Content Exchange could play a role in facilitating this, though its effectiveness remains to be seen.
This stance is gaining traction, evidenced by recent legal victories against AI vendors. In the autumn, the generative AI music vendor Udio was compelled by Universal Music Group to operate as a fully licensed platform. Similar legal actions are underway against other entities like Suno. In the literary sphere, a class-action lawsuit by US authors against Anthropic resulted in a $1.5 billion settlement last year, acknowledging the unauthorized scraping of copyrighted books for AI training.
Cultivating Licensing Markets for Sustainable Creation
The third critical policy area focuses on the development of robust licensing solutions, including collective licensing models. These frameworks are essential to enable lawful, large-scale access to training data while simultaneously safeguarding creators’ incentives to produce new work. Mangal highlighted the role of music societies in providing diverse and culturally rich repertoires, which can enhance the quality and utility of AI services.
Furthermore, Mangal urged the UK government to abandon its previously favored approach of a commercial Text and Data Mining (TDM) exception. Instead, she advocated for creating an environment conducive to developing licensing markets that strengthen rights enforcement and prevent the circumvention of transparency obligations through territorial restrictions.
While the British government did signal a shift away from its preferred TDM exception on March 18th, following widespread opposition, the exact nature of this pivot remains unclear, raising concerns that the exception could be reintroduced in a modified form. This indecision reflects a broader pattern in which the government, despite repeated opportunities, appears to be inadvertently undermining its own creative industries. The government’s forthcoming response to the House of Lords Communications and Digital Committee’s report on AI and copyright, due by May 6th, is anticipated to offer further clarity, or perhaps further obfuscation, on Downing Street’s intentions. That report, much like the Committee’s earlier report on Large Language Models, strongly advocated government support for the creative sector, robust copyright enforcement, and the exploration of paid, opt-in licensing models.
Global Principled Stances: Lessons for the UK
Mangal underscored that nations can and must adopt principled positions in this evolving landscape. She pointed to Australia’s firm stance against the introduction of a commercial TDM exception with opt-outs. Similarly, the European Parliament’s recent report signals a concession that the opt-out system, as initially conceived, is not a viable mechanism for rightsholders to control and regulate online digital content.
The House of Lords Committee’s report this year echoed these sentiments, urging ministers to emulate Australia’s proactive approach. Mangal concluded her points by stressing the necessity of further legislative intervention to ensure the EU AI Act’s promises are realized. Most importantly, she advised the UK to learn from these experiences and actively promote voluntary and other licensing models.
The Bleak Midwinter of British Musicians
Following Mangal’s presentation, Deborah Annetts, CEO of the Independent Society of Musicians (ISM) and Chair of the Creators’ Rights Alliance, provided a sobering account of the direct impact on musicians. Annetts, who was the source of the earlier revelations about the government’s lack of engagement, detailed how US AI vendors allegedly promised politicians significant economic growth in exchange for unfettered access to artists’ intellectual property, a proposition that appears to have been met with alarming naiveté by some in government.
Annetts described interactions with civil servants who expressed an optimistic, yet detached, view of AI’s future benefits for musicians, envisioning "better jobs." She countered this by asserting that for a musician, the ideal "better job" is simply to continue being a musician.
Drawing on the ISM’s extensive membership of 11,000 professional musicians across all genres, Annetts painted a picture of widespread pre-existing financial precarity within the industry. For every globally recognized star, she noted, there are tens of thousands of talented individuals struggling to survive on the financial margins.
The Precarious Lives of Freelance Musicians
"Ninety-three percent of our members are freelance," Annetts explained. "That means it’s already difficult for them to get mortgages, it’s difficult for them to manage debt, and it’s incredibly difficult for them to manage the new provisions coming through HMRC in relation to tax. We’re hearing lots of terrible stories happening there."
She was referring to the UK’s mandatory Making Tax Digital (MTD) scheme, which imposes additional financial and administrative burdens on the self-employed through requirements for paid cloud services and continuous digital record-keeping. This scheme, implemented without direct benefit to freelancers, exacerbates existing challenges.
Annetts elaborated on the inherent instability of a freelance musician’s life: constant uncertainty about future work, the ability to cover expenses, and even maintaining basic necessities like housing. Compounding this are issues like promoters defaulting on payments, with many musicians too fearful to demand their dues lest they jeopardize future engagements. The cumulative impact of COVID-19, Brexit, and now AI has created a deeply challenging environment. Annetts speculated that the enduring dedication of musicians can only be attributed to their profound love for their craft.
Brexit’s Economic Blow and AI’s Escalating Threat
The economic fallout from Brexit alone has been catastrophic for British musicians, with research indicating that incomes have fallen by approximately 50%. Yet, Annetts reported, discussions with civil servants typically yield little more than an acknowledgement of the issue’s complexity and a reminder that the EU demands reciprocal action, a response that feels inadequate given the escalating crisis posed by AI.
The ISM’s report, "Brave New World," details the tangible impact of generative AI. A stark 73% of musicians surveyed believe unregulated AI threatens their ability to earn a living. Alarmingly, 53% have already experienced job losses due to AI, and 17% report being compelled to undertake AI-related work, such as training AI systems. Many session musicians and composers have found their work replaced by AI-generated alternatives, with some forced to record sounds for AI training, knowing this will ultimately lead to their own displacement.
Investigations have uncovered lost commissions valued at £10,000 or more. Crucially, only 7% of musicians have been approached to license their work for AI training, and fewer than one in five of those have received any payment, meaning only around 1% of musicians are compensated by AI vendors. The threat extends beyond musicians: 65% of other performers and 93% of voice artists view AI as a significant risk. The issue of personality rights is also a major concern, with the actors’ union Equity reporting similar negative impacts on creators’ unique identities and performances.
The Erosion of Personality Rights and the Value Chain Gap
This phenomenon involves AI generating content in the distinct style, sound, and voice of human artists without consent or compensation, thereby exploiting their talents, diluting their artistic output, and diverting income to artificial creations. Actors face comparable threats, with AI capable of mimicking their likenesses, personalities, and performances without authorization. The emergence of companies offering "synthespian" performances trained on human actors’ work, and virtual fashion models, exemplifies this growing concern.
Annetts stressed the urgent need for legislative intervention in this domain. She articulated a critical gap in the value chain: "Whatever is going on with tech firms in terms of licensing at the top level is certainly not finding its way down to ISM members. One of the questions I keep asking, in relation to the value chain, is, ‘Is the ISM member ever going to get paid in relation to their work, which has been scraped by tech companies without permission?’ So far, given the structure of our legislation and the way collective management organisations work, the answer is no. So, even though my members’ work has been stolen, they are never going to see any payment. And that is frankly just wrong."
She concluded by highlighting a pattern of US tech companies entering markets, disrupting them, and departing without adequate compensation. This, she argued, stands in stark contrast to the immense value of the UK’s creative industries, which contribute £125 billion ($169 billion USD) annually and employ 2.4 million workers, dwarfing the UK’s own AI companies in both economic output and employment figures. Even UK AI startups have voiced dissent against the government’s copyright proposals, labeling the opt-out system for creators as "misguided," "damaging," and "divisive."
Author’s Analysis: A Government Out of Touch
Natasha Mangal’s concise and impactful presentation was a masterclass in delivering critical information, a sharp contrast to the UK government’s own recent report on AI and copyright. That 125-page document, released following a lengthy public consultation, offered little beyond platitudes about listening and assessing the situation, raising serious questions about the depth of the government’s engagement.
The resilience of musicians, who continue to create out of sheer passion despite overwhelming odds, is a testament to their dedication. As a part-time professional musician myself, I can attest to this. However, the existing landscape of the music industry, where listeners are conditioned to expect music to be free and where platforms like Spotify pay negligible royalties for the vast majority of streams, already makes earning a sustainable income a Herculean task. Even successful artists like Gary Numan have reported receiving minuscule sums for millions of streams, and that is before considering that artists only see their share of profits once advances have been recouped. My own band has experienced similarly disheartening returns from major streaming platforms. The advent of AI further exacerbates this already precarious financial situation for creators.
The core issue, as highlighted by Annetts, appears to be a skewed, US vendor-influenced perspective on AI’s impact, which has prevented British ministers from initiating substantive dialogue with the very communities they are supposed to represent. The continued failure to meet with creative leaders, despite repeated requests, remains a deeply concerning and perplexing aspect of the government’s approach.
Adding another layer of complexity, Professor Mykaell Riley of the University of Westminster’s Black Music Research Unit reportedly presented a grim outlook at the same Westminster Forum conference, detailing how Black artists are disproportionately bearing the financial brunt of AI while grappling with widespread cultural appropriation. This critical issue warrants separate and immediate attention.