UK consults on opt-out model for training AIs on copyrighted content
The U.K. government is consulting on an opt-out copyright regime for AI training that would require rights holders to take active steps if they don’t want their intellectual property to become free AI training fodder.
The rise of generative AI models that are trained on vast quantities of data has brought intellectual property concerns to the forefront, with many creatives up in arms that their work is being processed without permission (or compensation) to train AI technologies that can churn out competing output — whether text, visuals or audio, or a combination of all three.
The visual arts, music, film production, and video games all look to be viable targets for GenAI, which replaces traditional (skilled human) production processes with highly scalable AI tools that rely on a system of prompting to trigger models to instantly generate output that’s based on statistical analysis of information patterns in their training data.
With global attention fixed on large language models (LLMs) such as OpenAI’s GPT, which underpins the popular ChatGPT chatbot, the days of AI startups quietly scraping the web to grab free training data for model development without anyone noticing or caring are over.
Deals are being struck by AI companies to license certain types of content to use as training data. At the same time, a growing number of lawsuits are challenging unlicensed use of IP for AI training.
The situation demands legal clarity, at the least, and that’s what the U.K. government says it hopes this consultation will help deliver as lawmakers consider how they may shape policy in this (fraught) area. Future U.K. policymaking could include legislation “to provide legal certainty,” although the government says it has yet to decide on that.
For now, ministers are seeking to tread a line between support for the U.K.’s creative sector and a stated desire to boost AI investment and uptake. But this framing looks like an attempt to fudge a position that favors the AI industry.
“Both our creative industries and our AI sector are UK strengths. They are vital to our national mission to grow the economy. This consultation sets out our plan to deliver a copyright and AI framework that rewards human creativity, incentivises innovation and provides the legal certainty required for long-term growth in both sectors,” the government wrote in a ministerial foreword to the consultation.
There’s no doubt that setting up an opt-out regime for use of IP for AI training would put the burden on creatives to act to protect their works — a situation that could disproportionately disadvantage smaller creatives compared to larger rights holders. So the approach is unlikely to be universally, or even widely, popular with the creative sector.
AI companies, meanwhile, have been actively lobbying for such an opt-out regime.
“The proposals include a mechanism for right holders to reserve their rights, enabling them to license and be paid for the use of their work in AI training. Alongside this, we propose an exception to support use at scale of a wide range of material by AI developers where rights have not been reserved,” the government continued. “This approach would balance right holders’ ability to seek remuneration while providing a clear legal basis for AI training with copyright material, so that developers can train leading models in the UK while respecting the rights of right holders.”
The government goes on to state that its “key objectives” for both the creative and AI industries include “promoting greater trust and transparency between the sectors”.
And its stated goals of supporting rights holders’ control of their content and ability to be remunerated for its use, alongside the development of “world-leading AI models in the UK by ensuring wide and lawful access to high-quality data”, will clearly require some fancy footwork if the end result is not to downgrade the interests of one sector in favor of the other.
As it stands, the AI industry appears to be getting the better deal from the Labour government so far.
That said, ministers stress that whatever “package of interventions” the government ends up presenting must tackle the AI industry’s lack of transparency. So while it frames the proposed opt-out regime as “balanced”, it also states explicitly that “greater transparency from AI developers is a prerequisite” for the approach to work.
Specifically, the government says this means “transparency about the material they use to train models, how they acquire it, and about the content generated by their models”, adding: “This is vital to strengthen trust, and we are seeking views on how best to deliver it.”
Another component it emphasizes as necessary for an opt-out regime to work is the development of “simple technical means for creators to exercise their rights, either individually or collectively.”
“This will require both the AI companies and creative industries to come together to create new technical systems to deliver the desired outcome of greater control and licensing of IP,” it also suggested.
“This approach aims to protect the interests of our creative industries and AI sectors. But successfully delivering it is not straightforward. It will require practical and technical solutions as well as good policy. We are open-eyed about this, but optimistic that we can succeed by working together — across our departments and both sectors,” the government added.
The consultation runs for 10 weeks — closing on February 25, 2025. Web submissions can be made via an online survey.
“As AI evolves rapidly, the UK’s response must adapt,” the government also wrote, couching the consultation as “an opportunity for anyone with an interest in these issues to share their views and provide evidence regarding the economic impact of these proposals,” and committing to run a program of “wider engagement activity” over the consultation period to “ensure that the full range of views is heard”.