Copyright and Artificial Intelligence: Impact Assessment
Published 18 March 2026
Report on Copyright and Artificial Intelligence
Presented to Parliament pursuant to Section 136 of the Data (Use and Access) Act 2025
March 2026
© Crown copyright 2026
ISBN 978-1-5286-6308-3
E03546782 03/26
Executive summary
This Impact Assessment evaluates the potential economic effects of the policy options set out in the government’s consultation on Copyright and Artificial Intelligence[footnote 1]. It has been prepared pursuant to section 135 of the Data (Use and Access) Act 2025. It presents an illustration of the likely costs and benefits across options and identifies key areas for further evidence gathering amidst a rapidly evolving evidence base and international context.
We will not introduce reforms to copyright law until we are confident that they will meet our objectives for the economy and UK citizens. This means protecting the UK’s position as a creative powerhouse, while unlocking the extraordinary potential of AI to grow the economy and improve lives. Any reform must ensure that right holders can be fairly rewarded for the economic value their work creates, and that they are protected against unlawful and unfair use of their work. It must also ensure that AI developers can access high-quality content. It is clear from the consultation and our subsequent engagement that there is no consensus on how these objectives should be achieved. We must take the time needed to get this right.
AI is transforming every part of the economy. It offers extraordinary promise to grow the economy, create new, more rewarding jobs, and improve living standards. The OECD estimates that adoption of AI into the whole UK economy could add 0.4 to 1.3 percentage points to the UK’s productivity growth. These estimates are highly uncertain, but this would be the equivalent of adding £55-140 billion to UK Gross Value Added (GVA) in 2030.[footnote 2]
The UK’s Creative Industries (CIs) are world-leading and one of the growth-driving sectors in the government’s Industrial Strategy. Together, they are worth £146 billion GVA – nearly 6% of the UK’s total GVA in 2024. The IT, software and computer services sub-sector, which includes AI services and developers, accounts for £62 billion (42%) of this GVA. The CIs are growing at more than twice the rate of the wider economy and in 2023 accounted for 13% of all UK services exports.[footnote 3]
The UK is ranked as having the third largest AI sector globally and the largest in Europe.[footnote 4] The AI sector is growing 23 times faster than the economy as a whole and contributed approximately £12 billion GVA in 2024.[footnote 5] It could see year-on-year growth of 10%-40%, if it grows in line with previous general-purpose technologies.[footnote 6] Based on these trajectories, the UK AI sector could contribute between £20-90 billion in GVA to the UK economy in 2030.[footnote 7]
The success of the AI sector and the CIs are intertwined. The CIs generate high-quality content that is needed to train the best AI models. Meanwhile, AI has the potential to transform creators’ workflows, amplifying their productivity and giving them powerful new tools.
Copyright law is central to sustaining the success of the CIs and the wider economy. It enables creators to earn recognition and financial benefit from their work, while attracting investment for the future. However, right holders say they face challenges in controlling use of and access to their works. AI developers say they struggle to access large quantities of data to train foundation models domestically, and that UK copyright laws may also constrain the development and adoption of AI tools that are built on top of foundation models. The impact of existing copyright law in relation to Text and Data Mining (TDM) on the UK economy is therefore uncertain, with the current regime potentially limiting growth.
The government’s consultation on Copyright and Artificial Intelligence set out 4 policy options:
- do nothing
- strengthen copyright
- introduce a broad exception for Text and Data Mining (TDM)
- introduce a TDM copyright exception with a rights reservation mechanism
The government’s original assessment, as set out in the Summary Assessment of Options,[footnote 8] was that the status quo did not achieve its objectives of control, transparency and access. The final option, a TDM copyright exception with a rights reservation mechanism, was presented as the government’s preferred option in the consultation.
The evidence base and the international context are evolving rapidly. Therefore, this Impact Assessment does not present monetised costs and benefits for the options, and it does not establish a preferred option. It does, however, provide an illustration of the likely costs and benefits across options and identifies key areas for further evidence gathering. This is proportionate in this fast-developing policy area. Any future legislation would be accompanied by a full final-stage Impact Assessment under the Better Regulation Framework. This Impact Assessment presents the government’s current best available evidence and emerging conclusions, subject to uncertainty.
As noted in the accompanying Report on Copyright and Artificial Intelligence, there are also other potential options for change in this area, some of which were proposed by respondents to the government’s consultation. These are discussed in Section C of the Report. As highlighted in the Report, alternative options would need to be developed further before their impacts could be assessed and additional evidence would need to be gathered. As such, this Impact Assessment only considers the 4 options set out in the government’s consultation.
Under a ‘do nothing’ scenario (Option 0), permission would usually – absent the applicability of an existing exception – be needed to copy protected works at different stages of AI training and development that take place in the UK. The market for licensing UK copyright works for use in AI systems is in the early stages of development and is expected to expand under a ‘do nothing’ scenario. How this market develops will be highly sensitive to the copyright regimes in other jurisdictions, and the use and effectiveness of technical measures that restrict access to works. Meanwhile, under the status quo, UK copyright law would continue to act as a significant constraint on competitive general-purpose model training in the UK and could inhibit wider AI development and adoption.
Under Option 1, licences would be required to access copyright works for TDM in all cases, alongside transparency and market access measures. This means that AI models imported to the UK would need to have complied with UK copyright law during their development. This may increase licensing and remuneration for right holders and grow the UK’s CIs. However, if the costs of licensing are high relative to the economic returns from accessing the UK market, some overseas AI developers may delay access to leading AI models and services in the UK or withdraw from the UK market entirely. This could dampen UK development and adoption of AI, reduce the productivity benefits of AI and hold back economic growth across the whole economy, including the CIs.
Under Option 2, there would be a broad exception without a mechanism for rights reservation. This could decrease the incentive to invest in creative outputs, which may have a negative impact on the growth of the UK’s creative industries. Under this option, the UK AI sector would be more internationally competitive and could attract more investment in AI and bolster wider AI development and deployment. This option is therefore the most enabling for AI sector growth. However, the impact is uncertain and dependent on other barriers.
Under Option 3 – a broad exception with rights reservation – the impact on the licensing market would depend on the effectiveness and level of take-up by right holders. Where copyright works continue to be accessed for training in other jurisdictions, the legal effect of the rights reservation is unlikely to be significant. If, however, AI developers increasingly license copyright works of value to them because of rights reservations, then this could increase payment for and investment in creative content. Taken together, the overall impacts on creative industry growth of Option 3 are uncertain. Under Option 3, the UK would be less competitive internationally relative to Option 2, but would still be more attractive for AI development and deployment than it is today. The size of any impacts would depend on the nature and effectiveness of any rights reservation mechanism.
There remains substantial uncertainty in this assessment. We need better evidence on the evolving licensing market, including in other countries, and the impacts of potential copyright interventions on the development and use of AI across the economy. To gain greater clarity, the government is monitoring developments in copyright case law, legislation, technical standards and practices, and developer responses to overseas regulation.
We have commissioned research to gather evidence on CI businesses’ interaction with AI licensing, and their development and adoption of AI. The AI Sector Study will collect evidence on how copyright law is affecting specific areas of UK strength, supported by targeted stakeholder engagement. We will also provide analysis of AI’s impact on the economy and labour market, including through the AI Adoption Survey, to inform our understanding of how copyright affects AI adoption across the whole economy, including in the CIs.
Introduction and policy context
Under the Data (Use and Access) Act 2025, the government is committed to publishing:
- An assessment of the economic impact in the United Kingdom of each of the 4 policy options described in section B.4 of the Consultation on Copyright and Artificial Intelligence. These options concern copyright law and the training of AI models using copyright works.
- The assessment may also include an assessment of alternatives to these options.
- It must, among other things, include an assessment of the economic impact of each option on copyright owners and on persons who develop or use AI systems (i.e. developers and users), including the impact on individuals and on micro, small and medium-sized businesses (“SMEs”).
In compliance with the Better Regulation Framework, the government conducted an Options Assessment (published as Summary Assessment of Options on 17 December 2024[footnote 9]) prior to announcing a preferred option at consultation, which was reviewed by the Regulatory Policy Committee (RPC). The requirements under the Data (Use and Access) Act 2025 do not require a further review from the RPC at this stage but, were legislation to be introduced in this area, the government would need to conduct a ‘final-stage’ Regulatory Impact Assessment that would be published alongside legislation. This may include a further review from the RPC subject to conditions under the Better Regulation Framework.
Market context
This section outlines the current state of both the UK’s CIs and AI sectors: their scope, their dynamics, and their place within the wider UK economy.
Creative Industries and copyright owners
In 2024 the UK’s CIs generated £146 billion in Gross Value Added (GVA), representing around 6% of the UK economy, growing at 2.5x the rate of the UK economy between 2010 and 2024 (60.3% vs 24.3%).[footnote 10]
The Creative Industries Sector Plan demonstrates the sector’s potential future growth, with estimates – based on past trends – indicating annual GVA could increase by £60 billion in real terms by 2035.[footnote 11]
The top 3 sub-sectors driving CI growth were “IT, software and computer services”, which represents around £62 billion of 2024 GVA, followed by “Advertising and marketing” (£24 billion) and “Film, TV, radio and photography” (£24 billion).[footnote 12] A significant part of the AI sector forms part of the IT, software and computer services sub-sector, demonstrating how the AI and CI sectors are inherently intertwined.
The UK CIs draw in significant levels of inward investment, accounting for 10% of all UK Foreign Direct Investment (FDI) between 2010 and 2023, also representing over 9% of global CIs FDI.[footnote 13] Copyright protects certain types of intangible assets including artistic originals and software. Investment in artistic originals was £7 billion in 2023, whilst investment in computer software and databases was £57 billion in 2023.[footnote 14]
The UK has a comparative advantage in creative services, and the CIs accounted for 13% of all UK services exports in 2023.[footnote 15] Global demand for creative industries imports has grown by 76% over the last 10 years, to £600 billion,[footnote 16] driven by a rapid expansion of digital consumption and young populations in emerging markets.
AI has the potential to do work that was previously done by humans, such as generating music and video content. However, it also has the potential to support the creative process when used as a tool, such as for video editing.
Developers of AI and the wider AI sector
The UK AI sector is the third largest AI sector globally, behind the USA and China.[footnote 17] According to the AI Sector Study, in 2024 there were just under 6,000 AI firms operating in the UK, employing an estimated 86,000 employees and contributing £12 billion in GVA.[footnote 18] These are defined as firms who provide AI goods and services.
The top 3 sectors providing AI goods and services in the UK economy are “Information Technology” (£7.7 billion GVA in 2024), “Professional Services” (£1.7 billion), and “Telecommunications” (£0.46 billion).[footnote 19]
International AI firms and developers make a significant contribution to the UK AI sector, with just over half (52%) of revenues in 2024 generated by firms not headquartered in the UK. We expect the ongoing growth of the UK AI sector to depend strongly on foreign investment.
If the UK AI sector grows in line with previous general-purpose technologies (GPTs), it could see year-on-year growth of 10%-40%.[footnote 20] This would see the AI sector adding between £20 and £90 billion to UK GVA by 2030.[footnote 21]
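The £20-90 billion range is broadly consistent with compounding the sector’s 2024 baseline of roughly £12 billion GVA at these growth rates over the six years to 2030. The sketch below illustrates that arithmetic only; the baseline, rates and horizon are taken from this assessment, and the calculation is a simplification rather than the underlying modelling:

```python
# Illustrative compounding of the UK AI sector's 2024 GVA baseline
# (~GBP 12bn) at the 10%-40% year-on-year growth rates cited for
# previous general-purpose technologies, over the six years to 2030.
BASELINE_GVA_BN = 12  # UK AI sector GVA in 2024, GBP billions
YEARS = 6             # 2024 -> 2030

for rate in (0.10, 0.40):
    projected = BASELINE_GVA_BN * (1 + rate) ** YEARS
    print(f"{rate:.0%} annual growth -> ~GBP {projected:.0f}bn GVA in 2030")
```

At 10% annual growth this gives roughly £21 billion, and at 40% roughly £90 billion, matching the £20-90 billion range quoted above.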
We are not aware of any frontier foundation model training that takes place in UK data centres, but there are UK-based AI firms and employees that develop frontier models. This distinction matters: training is likely to engage copyright-relevant acts (for example, pretraining may involve scraping large volumes of potentially copyright-protected text), while model development can focus on other, non-data-centric activities such as system design and development, which are less likely to infringe copyright. This development layer contributed approximately £2 billion GVA to the UK economy in 2024. The wider AI sector contributed around £8 billion, which includes the development of narrow models and applications, all of which will also be affected by UK copyright law to some extent.[footnote 22]
AI Adoption
Within this Impact Assessment, adoption into the wider economy is taken to mean firms and individuals who have adopted AI products and services. This corresponds to persons who use AI systems under Section 135(3) of the Data (Use and Access) Act 2025. There are several measures of AI adoption; we use the ONS’s Business Insights and Conditions Survey (BICS), which shows that in January 2026, 25% of UK firms had adopted some form of AI, up from 16% a year earlier.[footnote 23]
The OECD estimates that adoption of AI into the whole economy could add 0.4 to 1.3 percentage points to the UK’s productivity growth. These estimates are highly uncertain, but this would be the equivalent of adding £55-140 billion to UK GVA in 2030.[footnote 24]
Strategic rationale
Problem under consideration
Copyright law in the UK balances the interests of creators with the needs of the wider public. Creative businesses invest large sums to produce creative works, but these can be easy to copy. Copyright allows creators to control the use of their works and earn a return on their investment, thereby encouraging further investment in new content.
The status quo presents challenges to AI developers, right holders and users of AI systems. These include, but are not limited to:
- Access to training data – AI models require large quantities of data for training. High-performing foundation models, which can be applied to multiple applications, require billions or trillions of datapoints from a diverse range of sources. It is easier for AI developers to access and use some data than others. Many foundation models have been trained using copyright works that have been made available online without restrictions (such as paywalls or other barriers). In this Impact Assessment we refer to these as works that are “publicly available online”. Other works are less easily accessible to AI developers. These include privately held datasets, as well as datasets made available online but with access restrictions such as paywalls. In this Impact Assessment we refer to these as “privately held” or “restricted access” works.[footnote 25]
- Overseas training – The use of online material is treated differently under the copyright laws of different countries. Most frontier model training occurs in jurisdictions such as the USA, where AI developers argue that the fair use doctrine permits use of online and other copyright works without the need for licences. Both right holders and AI companies acknowledge the use of unlicensed material in training. For example, 99% of creators surveyed in a report by the Independent Society of Musicians say their work has been scraped without their consent,[footnote 26] and AI developers have stated that it would be “impossible to train today’s leading AI models without using copyrighted materials”.[footnote 27]
- Legal risks and costs – Copyright law in the UK and ongoing litigation have been cited by AI developers as presenting legal risks and costs that are a barrier to investment and UK-based frontier model training. Similar legal risks exist for firms adapting existing models, and these risks may differ depending on the activity and jurisdiction. Legal risks may also be present for firms implementing AI tools in their business models, although evidence indicates that these risks are not limited to copyright. For right holders, it is costly to navigate and manage the enforcement of their rights across jurisdictions, especially for SMEs, which may lack the resources to litigate.
- Fragmented standards – There is no single standard or technical tool that enables the identification of copyright works published online, their owner, and the permissions or licences required to use that work. This means copyright owners find it difficult to control the use of their works in AI models, while AI developers have no agreed framework to check permissions and ensure content is only obtained from lawfully accessed sources. While protection measures and tools have developed, these are not always effective. Some web crawlers do not respect existing standards, and it is not always possible to distinguish between, and selectively block, crawlers operating for different purposes. For example, if some web crawlers must be allowed for a website to appear in search engine results, blocking them could have a negative impact on sales. The lack of universally recognised standards holds back the market’s development and keeps the transaction costs associated with licensing high.
- Transparency – In parallel, AI developers provide varying levels of information on what data they have used to train their models and how it has been sourced. This makes it difficult for right holders to identify whether specific works were used in AI training and whether those works were lawfully accessed, increasing the cost and complexity of acquiring evidence for enforcement or of seeking licensing agreements.
- Competition with human creativity – AI has the potential to do work previously done by humans, meaning it can both compete with human creators and enhance their productivity and creativity. It therefore has the potential to capture a growing share of the value traditionally earned by creators and right holders.
Market failures
While the market continues to develop and the needs of AI training evolve, structural market failures have been identified based on our current understanding.
Public goods – Without technical protection tools and suitable enforcement, publicly available copyright works online may be copied without remuneration to the right holder. This is because when it is shared online, content is typically available for anybody to access (non-excludable) and access by one individual does not prevent or reduce access by others (non-rivalrous). This reduces the economic incentives to create new works. However, overly constraining access to copyright works can limit important positive externalities and spillovers from knowledge sharing.
Information asymmetry – Unless AI developers disclose the source of their training data or actively seek to license works, right holders face difficulty determining whether their works have been used. This information asymmetry can make it harder for right holders to enforce their rights or to understand the potential value of their works. One survey by a Collective Management Organisation for artists found that only 22% of respondents knew their work had been used for AI training, with 63% unsure.[footnote 28]
Coordination failure – The potential market for licensing copyright works may be dampened by a lack of coordinated action. For example, approaches to transparency may not be adopted in the absence of agreed standards, while that same lack of coordination disincentivises investment in those standards. Legal disputes may also prevent coordinated action, as they create risks for developers investing in a market model that could become obsolete as legal cases are decided.
Market concentration and high barriers to entry – There are high barriers to entry to becoming a developer of AI models that require high volumes of data, including foundation models. Large established firms are more likely to have a competitive advantage in building AI models – especially where they have control over, or agreements with, social media platforms that allow them to train on user-generated content. Small and micro firms, and individual developers, are unlikely to have the same access to this type of content. Where markets are dominated by a small number of large AI developers, this can weaken the negotiation power of right holders. Currently, most reported licensing deals are between large right holders and large AI developers.
Objectives and intended outcomes
This government’s objective is to protect the UK’s position as a creative powerhouse, while unlocking the extraordinary potential of AI to grow the economy and improve lives. The approach to copyright and AI must support prosperity for all UK citizens, and drive innovation and growth for sectors across the economy, including the CIs. This means keeping the UK at the cutting edge of science and technology so UK citizens can benefit from major breakthroughs, transformative innovation and greater prosperity. It also means continuing to support our CIs, which make a significant economic contribution, shape our national identity and give us a unique position on the world stage.
The government’s consultation on Copyright and Artificial Intelligence set out 3 objectives for its approach to the use of copyright works in the development of AI models, including Text and Data Mining (TDM). These were:
- Increasing transparency about the use of copyright works to train AI models and AI-generated content
- Giving right holders of creative works greater control over use of their material to train AI models and supporting their ability to be remunerated where it is used
- Enhancing the ability of AI developers to access high-quality content
Options analysis
In light of these objectives, the consultation set out 4 policy options:
- Option 0 – Do nothing / counterfactual
- Option 1 – Strengthen copyright, so that licences are needed for uses of copyright material in AI training, with transparency requirements and market access measures
- Option 2 – Introduce a broad copyright exception for Text and Data Mining (TDM)
- Option 3 – Introduce a TDM copyright exception with a rights reservation mechanism and transparency requirements
In line with the commitments under the DUA Act, this assessment considers the economic impacts of each option on the following market actors:
- Copyright owners
- Developers of AI systems
- Users of AI systems
It includes the impact on copyright owners, developers and users who are individuals, micro businesses, small businesses or medium-sized businesses.
It should be noted that, for the purposes of this assessment, there is a strong focus on core copyright sectors[footnote 29] which would be most affected by regulatory change. The core copyright sectors largely overlap with the CIs, and so the impact on these industries is described, which will also reflect copyright owners as a class. However, copyright owners and right holders in general are also referred to throughout.
The analysis of options in this paper has several limitations. As this is a new and quickly developing area, parts of the evidence base are incomplete or not yet developed. Throughout this assessment, we have sought to include reliable evidence from sources including academic and government publications. We have also included information from statements by stakeholders (including those shared in responses to the consultation) where relevant to illustrate our understanding of the status quo and potential future outcomes.
There are also several uncertainties which affect how we compare each of the consultation options to the status quo. These uncertainties include but are not limited to:
- Transaction costs – Licensing negotiation costs could fall if AI-related collective licensing develops more quickly across the CIs. They could also fall if technical standards and transparency ease the identification of copyright works and their owners, and if protection tools increase right holders’ ability to control access to their works.
- New technological controls – Methods are being developed to restrict access to data by web crawlers, including improving the effectiveness of opt-outs, the ability of website owners to distinguish between types of crawlers, and more advanced capabilities to block crawlers that lack permission.[footnote 30] However, as market solutions develop, so too does the ability of AI systems to circumvent these protections.
- Copyright regimes may change – Regardless of regulatory intervention by the UK government, case law may change both domestically and abroad, as could copyright policy and legislation in other countries. For example, if courts in the USA were to rule in favour of right holders and against the unlicensed training practices of AI companies, there could be a more fundamental global shift in training practices towards more licensing.
- The future trajectory of AI investment and adoption – There is significant uncertainty about how the UK AI sector will develop over the coming years, how widely AI may come to be adopted, and what the resulting productivity gains might be across the UK economy.
- The future role of copyright data in AI model development – AI is a fast-moving technology, and new model architectures or training paradigms could change the role of copyright data in training AI models. For example, synthetic data could complicate the role of copyright works in training the next generation of AI models. There is also uncertainty over how copyright affects different parts of the AI sector, with subsectors employing fine-tuning potentially playing a larger role in the UK than primary model developers.
- How AI is adopted and how this interacts with copyright – The extent to which copyright is inhibiting AI adoption is unclear. Internal AI solutions and off-the-shelf model adoption may be affected differently, and the law may apply differently to different applications.
- The future labour market impacts of AI – There is significant disagreement among economists and forecasters about the potential labour-market impacts of AI over the coming years. Whether AI will be labour-augmenting or labour-displacing for the UK economy in general, and for the creative sector in particular, is currently unclear.
- International transparency – Transparency rules are being introduced in the EU and California and, given their locations and market access requirements, could cover a significant proportion of multinational AI companies. This may lead to transparency measures being implemented more widely around the world and could have spillover benefits for UK right holders.
Option 0 – “Do Nothing” (Status quo)
Description, assumptions, and key uncertainties
The current TDM exception, applicable to non-commercial research using copyright works and performances, would remain, alongside other existing exceptions such as the exception for making temporary copies. Unless an exception applies (which will be in limited and fact-specific circumstances), permission would usually be needed to copy protected works at different stages of AI training and development that take place in the UK.
Some aspects of UK copyright law and its application to AI training are disputed, including the scope and application of existing copyright exceptions and the status of models trained outside the UK’s jurisdiction. Since the consultation was launched there has been a significant court judgment in Getty Images v Stability AI,[footnote 31] which ruled on some aspects of the law, notably the status of models trained outside, and imported into, the UK. The judgment is subject to appeal.
This option would mean any remaining lack of clarity in the law will persist until clarified by the courts. However, many right holders who responded to the consultation suggested that there is little or no ambiguity in UK law at present and their main concerns related to enforcement of rights. Without greater transparency measures, right holders may continue to find it difficult to identify how their works are being used, making it difficult to enforce their rights.
Bearing in mind the uncertainty described above, there are different potential outcomes under the evolving status quo (and therefore the “do nothing” option). We assess the options against 2 counterfactual scenarios:
- A limited licensing market scenario. Model training continues to take place in more permissive jurisdictions, which creates only limited licensing of publicly available copyright works in the UK.
- An expanding licensing market scenario. The licensing market for publicly available copyright works in the UK increases due to international policy developments, litigation or technological protection measures.
Creative Industries and copyright owners
AI licensing market for copyright works
Licensing allows AI developers to access a range of high-quality content and is the mechanism through which creators are rewarded for and encouraged to invest in new content. The licensing market under the status quo is explored in more detail in Section G of the accompanying Report.
Data from right holders is generally acquired through licensing agreements, which is more common with privately held or specialist content, or through unlicensed scraping of publicly available online material. The latter is commonly done in other countries, where local copyright exceptions and defences are relied on. As set out in Section G of the Report, the AI licensing market for copyright works is in the early stages of development and we expect it to continue to grow and evolve without government intervention.
Based on the limited evidence of publicly reported licensing deals, the market for licensing is growing. However, the majority of deals appear to be with large right holders and predominantly for restricted or privately held material, although publicly available content could be included too.[footnote 32] There is evidence that AI companies are willing to enter into agreements for publicly accessible but technologically restricted data which is of high enough value to them and where equivalent data cannot be obtained elsewhere. For example, Reddit changed its policy from allowing web crawlers to scrape content to requiring authorisation,[footnote 33] and has struck major licensing deals with OpenAI[footnote 34] and Google.[footnote 35] Reddit continues to take action against AI developers and web scrapers that have not agreed licensing deals.[footnote 36]
There is also a growing market for AI licensing through intermediaries,[footnote 37] permission-based crawling,[footnote 38] or CI-owned AI models that compensate creators.[footnote 39] Further licensing deals or strategic agreements are emerging, including some which have been struck following litigation around unlawful access of copyright works. One significant court ruling was against Anthropic’s use of online pirated books.[footnote 40]
Under Option 0, the licensing market will continue to grow and evolve without government intervention. The existing evidence is limited given the full extent and details of current licensing deals are rarely in the public domain. However, from reports of the publicly announced deals it is possible to draw some preliminary conclusions.
Licensing deals represent a net inflow to the UK economy. According to reports referenced in the CREATe research,[footnote 41] there were approximately 14 major deals reported in 2024 where the content provider parent company is UK-based.[footnote 42] This will likely increase as more licensing deals are reached. These 2024 UK deals came from just 4 AI companies which are all based in the USA. While licences have typically been agreed among large right holders and large developers, it is often not clear whether these include any means of remunerating individual creators who operate on a freelance basis, or smaller SME creative businesses contracted to these large right holders.
Whilst some of these deals could include permission to scrape publicly available content, many of the deals so far appear to be primarily for restricted or privately held copyright works. The market for restricted or privately held copyright works is likely to continue to grow. Where licensing deals cover privately held works that are partly accessible to web-crawlers online, these agreements could provide legal certainty for AI developers’ use of publicly available data owned by those copyright holders.
AI developers are not able to train or fine-tune AI models and systems in the UK on unlicensed copyright content unless an existing exception applies. If other jurisdictions remain permissive, and creators do not effectively restrict the use of their content for AI training (for example through technological protection measures explored in more detail below), AI developers are likely to continue to use copyright works for TDM elsewhere and the licensing market for this content is unlikely to grow. However, if other jurisdictions move towards more licensing models, or creators can effectively restrict the use of their content for AI training, then the licensing market for publicly available content could grow under the status quo.
Technical tools, standards, and enforcement
The current state of the market for technical tools and standards to control access to, and use of, copyright works with AI is explored in Section F of the Report, while enforcement more generally is explored in Section H.
Technical tools and standards for right holders to protect and control their content online are growing in availability, use, scope and efficacy. Standards such as the Robots Exclusion Protocol, and tools such as those available to websites hosted with Cloudflare, are increasingly used to restrict AI web crawlers.[footnote 43] However, these kinds of tools do not currently support all right holders’ needs, with right holders facing challenges in effectively monitoring and controlling how their content is accessed and reused. Content on right holders’ own websites, or sites with which they have a contractual relationship, is within their control to protect using existing tools. However, content uploaded to other sites, such as social media platforms, may have less protection, for example where those sites remove metadata from uploaded files.[footnote 44]
Under Option 0, adoption and use of technical tools and standards will continue to grow without government intervention, including for publicly available content. Protection tools include provenance systems, such as those developed by the Coalition for Content Provenance and Authenticity, which support content traceability and the implementation of rights.[footnote 45] Tools such as Glaze and Nightshade, which allow creators to technically disrupt style extraction and corrupt training signals, are also gaining traction.[footnote 46] Platforms hosting content have also adapted their contractual frameworks to restrict third-party AI training: for example, YouTube[footnote 47] and the BBC[footnote 48] are taking action to prevent unauthorised scraping. Cloudflare, which hosts websites and manages part of the internet’s infrastructure, has started blocking AI web crawlers by default on content creators’ work unless AI companies pay for access.[footnote 49]
While only a minority of sites are fully protected, restrictions are becoming more common across the core publicly available datasets used for AI training. News media is already at the forefront of this shift: estimates vary, but one suggests 79% of the top news sites already deploy some form of protection from AI scraping.[footnote 50] Uptake and development of these technical tools may be stimulated by regulatory requirements in other jurisdictions such as the EU. However, some right holders may choose not to adopt these tools or standards, and, where they do, some AI developers may choose not to use or comply with them.
The EU’s AI Act[footnote 51] is in place under the status quo and includes transparency requirements for frontier models and commitments to put in place technology to recognise AI training opt-out measures. This may support some right holders in the UK in identifying when their works have been trained on and in controlling where their works are used in training, especially where training takes place in the EU.
Existing enforcement remains practically difficult for some right holders, especially for individual right holders and for SMEs. Evidencing use of protected works is challenging given limited transparency, and monitoring can be costly. Litigation routes are expensive relative to likely recovery, particularly for SMEs. Without transparency, right holders are unable to identify if their protected works are being ingested, making it difficult to seek remuneration. Under Option 0, enforcement costs, along with information asymmetries and difficulties evidencing infringement, mean that right holders may face barriers to pursuing litigation. Enforcement may remain inaccessible for smaller right holders without collective action, effective transparency measures or significantly reduced litigation costs.
Creative Industries growth and investment
Under the status quo (Option 0), the scale of CIs growth is uncertain. The CIs have shown strong growth, increasing GVA at 2.5 times the rate of the UK economy between 2010 and 2024 (60.3% vs 24.3%). The sector’s future economic contribution is expected to grow: based on past trends, annual GVA could increase by £60 billion in real terms by 2035.[footnote 52] These estimates do not account for the impact of AI on the sector, where there are both opportunities from productivity gains and challenges associated with enforcing and licensing rights and with AI outputs that compete with creative products and services.
Current UK copyright law does not permit AI training in the UK on copyright works – including publicly available online works – except where licences exist or existing exceptions apply. While there is evidence of revenues to some UK right holders from licensing works for AI training, the current limitations identified above could inhibit the future growth prospects of the UK’s CIs. Current copyright protection also supports recognition of a firm’s value and can provide a route to accessing finance for the UK’s CIs. Specialist lenders like Coutts & Co. accept copyright-protected scripts as loan security, while firms like Inspiretec have successfully leveraged copyright-protected software code to access millions in growth finance.[footnote 53]
In a scenario where training on publicly available copyright works continues to take place in other jurisdictions and creators are unable to restrict access to their content, remuneration for the use of such works to develop AI would likely remain limited. However, in a scenario where there are developments in litigation, technological protection measures or other factors, the licensing market for copyright works (including those “publicly available” online) would grow more quickly.
Irrespective of the policy approach to copyright, AI presents both new opportunities and challenges for the CIs. It may offer productivity benefits where AI is successfully integrated into business models. It could also disrupt value chains to the extent that revenues move away from human creators to AI firms; this could particularly affect freelancers and SME businesses.
AI developers
Compliance costs and access to data
With no change to UK copyright law, there continues to be legal risk of infringement by firms utilising TDM techniques to develop AI models. This can deter innovation and investment. For SMEs, these risks are especially acute, as the costs of litigation or compliance may be unsustainable.
There is limited evidence on the impact of copyright laws on AI developers. However, research by ACT, a tech SME trade body, found that nearly 60% of EU and UK developers report launch delays, and more than one-third are forced to strip or downgrade features to comply with regulations. It is unclear whether, and how far, copyright laws are the reason for these effects.[footnote 54]
In recent years, the proportion of “notable models” developed by UK-based companies has declined from around 11% in 2010–14 to around 3% in 2020–24, and to 0% in 2025.[footnote 55] It is likely that the current copyright regime represents a barrier for UK-based firms seeking to train and develop next-generation frontier AI foundation models. However, this decline may also be driven by other significant factors, including compute availability and energy costs.
Growth and investment
The UK AI sector operates within a competitive global economic context. This is emphasised by survey data suggesting around one in three UK AI start-ups are reportedly considering relocating. The main reasons given by these firms are better funding availability and access to a larger market (over 25%), while the regulatory environment is also cited by some (over 10%).[footnote 56] Further, research commissioned by the Computer and Communications Industry Association, surveying AI developers and investors, suggests retaining the status quo would limit future investment.[footnote 57]
As the world’s third-largest destination for AI investment,[footnote 58] the UK has potential for growth across the AI ecosystem. The UK AI sector contributed £12 billion to the UK economy in 2024, over double its 2023 output.[footnote 59] This growth has occurred regardless of copyright, though retaining the status quo may present a constraint to the development of certain layers within the AI value-chain. Existing evidence suggests that the status quo effectively prohibits AI foundation model training from taking place in the UK and may be a constraint for other layers of the AI value-chain.
More evidence is needed to understand the impact on activities such as fine-tuning and narrow model development. UK developers may conduct elements of fine-tuning on existing models, integrating them into new products and services. According to the ONS BICS, on average around 22% of AI adopters are currently using AI developed in-house, compared to 44% of firms using external, ready-to-use software.[footnote 60] It may be more difficult for UK firms that do this, especially small and micro businesses, if it requires locating their training activity outside the UK. If that is the case, they may be more affected by UK copyright law than larger, often US-based, developers.
According to recent evidence from Public First, commissioned by Microsoft, around a fifth of UK firms surveyed (19%) already carry out some form of TDM operation. Of these, 74% said access to external data was essential for their business, and more important than internal data for 59%.[footnote 61]
AI adoption
AI technology is being widely deployed across the economy in a range of sectors and by firms of all sizes.[footnote 62] The ONS Business Insight and Conditions Survey (BICS) asks businesses about their use of specific AI technologies such as text generation using Large Language Models and Visual content creation. As of December 2025, 25% of UK businesses are using at least one AI technology surveyed, up from 16% in January 2025. Large businesses have a significantly higher rate of adoption at around 44%, compared to micro businesses at 24%. The UK sectors with the highest level of AI adoption are ‘Information and communication’ and ‘Education’.
Adoption is high in the CIs; in December 2025, 43% of businesses said they use AI, compared to 25% of all businesses.[footnote 63] CI businesses are also more likely than UK businesses overall to use AI for visual content creation (24% vs 12%), Large Language Model text generation (28% vs 13%) and Machine Learning data processing (14% vs 7%).[footnote 64]
Currently, the cost of AI to the end user does not appear to be significantly determined by the cost of training data. The global revenue of the AI training dataset market (£2.2 billion)[footnote 65] was around 1% of the total global investment in AI in 2024 (£197 billion)[footnote 66] suggesting it is not the most significant cost of production, with the main capital costs for AI training typically being R&D staff, hardware and energy.[footnote 67]
AI adoption is affected by both trust in and availability of models. UK businesses may not want to develop or use AI products that are perceived to infringe copyright works. Simultaneously, the copyright landscape may be limiting access to useful data and training methods within the UK, and therefore access to and development of certain models – see comments on fine-tuning above. Both factors could ultimately limit adoption by UK firms. ONS survey evidence indicates that around 1% of businesses reported that exposure to copyright-related legal risks was preventing or delaying their adoption of AI technologies.[footnote 68] Further, according to Public First research, 39% of existing UK TDM users cited legal risks as a barrier to further AI adoption, though the extent to which this is driven by copyright is not clear.[footnote 69]
Separate DSIT research found that the main barriers to AI adoption were a lack of identified business need and limited AI skills or knowledge. In the same survey, regulation was reported as preventing AI adoption by 28% of businesses, though this includes factors such as data protection, with the significance of copyright uncertain. Businesses were also asked to rate the significance of each barrier in preventing them from adopting AI, with the most significant being ethical concerns (80%) followed by regulation (72%).[footnote 70] Another survey found that regulation is perceived to be the greatest barrier to technology adoption overall (despite being a net enabler), though for AI specifically it ranks second to data privacy and security.[footnote 71] There is also evidence that regulation has delayed AI adoption across the UK and EU, particularly among SMEs and start-ups, with 6 in 10 EU and UK tech start-ups and SMEs facing delayed access to frontier AI models.[footnote 72]
AI may be used by businesses in a wide range of applications. For example, businesses may use “off the shelf” models and services, often developed abroad by large AI developers, or more specialist systems, which may be trained or deployed on local datasets. As discussed in the Report accompanying this assessment, the effect of copyright law will be different depending on the specific application and use of copyright material, and its location. The wide range of applications of AI means there is a large degree of uncertainty about the effect of copyright on AI adoption, and effects are likely to differ between sectors and applications.
Under the status quo, it is likely that adoption trends will continue, with a range of issues potentially slowing the rate of adoption. Copyright could therefore be one of several factors in the UK underperforming its high growth potential, particularly given that AI adoption is set to be the biggest driver of AI-related productivity growth, meaning even a small change could have a significant impact.
Summary: Option 0 Assessment
- Under ‘do nothing’ (Option 0), the impact on the licensing market for AI will be highly sensitive to the regimes in other jurisdictions and to the use and effectiveness of technical measures that restrict access to works. The licensing market for restricted or privately held copyright works will continue to grow and evolve without government intervention, but the highest value agreements may be limited to larger right holders and AI developers. It is unclear how far reform to copyright would affect this growth.
- In a scenario where training on publicly available copyright works continues to take place in other jurisdictions and creators are unable to restrict access to their content, remuneration for the use of such works would likely remain limited.
- However, in a scenario where there are developments in litigation or changes in policy in other jurisdictions, or creators can restrict access through technological protection measures, the licensing market for copyright works (including those “publicly available” online) would grow more quickly.
- For right holders, practical barriers to enforcement may continue, including litigation costs and difficulties evidencing infringement. The market for tools and standards will continue to develop but may not support all right holders’ needs, and challenges with adoption and compliance may continue.
- For AI developers, current UK copyright law is likely to restrict competitive general-purpose model training, due to the amount of data required, which may be more easily accessed in other countries. It may also restrict development in other elements of the AI value chain, such as fine-tuning and narrow model development. However, existing evidence indicates copyright law is one of several challenges to the UK AI sector.
- AI adoption is set to be the biggest driver of AI-related productivity growth. Evidence suggests copyright law may restrict AI adoption, though likely for only a small proportion of businesses. There is also uncertainty about how these effects vary by adoption and application type, with use of off-the-shelf models potentially presenting different copyright risks compared to adoption of internal TDM solutions.
Option 1: Strengthen copyright requiring licensing in all cases
Description, assumptions, and key uncertainties
This option would mean that permission is required through licences for the use of copyright works in AI training. As UK copyright law already provides extensive rights, with limited exceptions, copyright law under this option would remain largely the same as the status quo.
Robust transparency measures would be implemented, making it easier for right holders to see whether their works have been included in AI training datasets and helping right holders to enforce their rights. AI foundation model providers would be required to disclose detailed information on how they have trained their models in a manner that complies with UK copyright law. AI providers would be required to publish detailed summaries of how their models are trained, such as the types of data used (text, video, images etc.), the size of the data, a list of public datasets used, and the dates of acquisition. Information on the web crawlers used to collect the data would also be required, such as their names, dates of activity and purposes. Under this option, more detailed disclosure could also be required.
To make sure that AI providers are meeting their transparency obligations, AI providers could be required to provide additional information on request from the government or a public body. The government would likely need to introduce new regulatory duties or powers to undertake this activity, which may include establishing a new regulatory body along with appropriate resources and powers to deliver these transparency measures. It is not clear at this stage how these regulatory duties would be funded.
Output transparency measures could also be included. These could seek to make it easier to distinguish between human-created works and AI-generated content. Potential measures could include requiring AI-generated content to be clearly labelled or requiring information about its provenance to be included in its metadata.
Alongside these changes, the status of models trained in other jurisdictions would be addressed. This could take place through revisions to copyright law that relate to importation of infringing copies, or through an EU-style regulatory approach, based on market access, which would apply the UK’s transparency and copyright rules to models placed on the UK market. The effect would be that AI models trained under other countries’ copyright laws and then imported to the UK must be licensed as if they had been trained in the UK. This could create a level playing field in the UK between AI models trained in the UK and those trained outside the UK.
Creative Industries and copyright owners
Licensing market for copyright works
The effect of any intervention on the licensing market will be dependent on the value of UK content accessed by AI model developers. It is considered that online publicly available content will be more influenced by legislative changes where there are lower technical barriers to access. Effects will also hinge on the transaction costs of negotiating licences and on how important the UK market is for AI developers.
Option 1 would have a limited direct impact on transaction costs, on a per-deal basis, though if more data is licensed the number of transactions may increase. Those wishing to license restricted access data in the UK would face similar transaction barriers on a per-deal basis as the counterfactual. Larger firms may be better placed to meet these transaction costs than SMEs and individuals.
Option 1 may grant right holders more negotiating power when attempting to enter into agreements. Larger right holders are likely to benefit more than SMEs and individuals, because of factors including the breadth and volume of their protected works, and their legal resources. Right holders with specialist and difficult to substitute content are also likely to have more bargaining power than those whose content can be substituted more easily.
Under market access measures in Option 1, we assume UK copyright law would apply to the training of any model using copyright works that is made available in the UK, regardless of where that training had taken place. Transparency rules would also apply to models made available in the UK. In order to access the UK market, multinational AI developers who commonly train their models in other jurisdictions would need to comply with UK copyright law and transparency measures. This could lead to an increase in licensing and payment for UK copyright holders where their work is used. However, this effect could be limited where the costs of obtaining licences outweigh the commercial benefits of selling access to AI models in the UK, or where there are changes to training dataset composition in response. It is not clear how AI developers would approach acquiring licences for copyright content under this option, nor the extent to which any licensed datasets would include UK copyright content.
The Option 1 requirements, based on market access, could result in international AI developers withdrawing or delaying models being released in the UK. For example, in 2024 Meta cancelled the release of its frontier Llama model into the EU market citing the unpredictable nature of the European regulatory environment.[footnote 73] A developer might take this decision if the overall costs of compliance with UK law, including licensing, outweighed the benefits of providing a model or service on the UK market. Such decisions are likely to depend on the individual model or service.
There is limited evidence on the scale of potential licensing costs compared to UK revenues to assess the likelihood of this behaviour. Where AI developers withdraw their models from the UK market, payments for UK copyright works, in relation to those models, would not increase compared to the ‘do nothing’ option. Against a counterfactual where there is more licensing in other jurisdictions (including works publicly available online), the marginal impact of Option 1 on the behaviour of AI companies may also be less significant.
Enforcement and transparency
Option 1 is more likely to support right holders’ ability to enforce their rights. Under Option 1 the UK would introduce transparency obligations similar to or going beyond those required by the EU’s AI Act. AI foundation model providers would be required to disclose (potentially detailed) information on training inputs.
Stakeholders have suggested that in legal disputes between AI developers and right holders, huge and prohibitive sums of money are being spent on legal discovery. Some of these costs could be reduced under Option 1, depending on how successful the transparency measures are in helping to identify non-compliance. However, the inherent cost and time of litigation would remain substantial, so smaller right holders and individuals may still be deterred from pursuing cases. On the other hand, greater transparency may enable smaller right holders, or groups representing them (such as Collective Management Organisations), to pursue action that would otherwise present too significant a legal risk or cost.
Input transparency may allow certain technical measures to operate more effectively online, by granting right holders knowledge of which web crawlers are accessing their content and for which purpose, allowing them to decide which crawlers they would like to block.
The benefits of transparency measures will depend on how the measures differ from those in the EU and other jurisdictions. If they have similar scope to EU measures, large right holders already operating in both the EU and UK may not see a significant difference from the status quo. For SMEs that do not operate in the EU, this could be a more substantial policy change, with the benefits dependent on the ability to enforce, as discussed above.
Output transparency measures could make it easier to distinguish between human-created works and AI-generated content. This may allow right holders to better identify where their work has been used to train or develop AI models which produce outputs that infringe copyright, though this depends on whether output transparency includes provision to identify the model used. Output transparency could have wider impacts on trust in information and brand security.
There are examples of AI being used to alter information to present a different narrative to true events. This could lead to reduced trust in information as AI makes it harder to distinguish original imagery from AI-generated content. This can impact brands, such as news services, where trust in their reporting is eroded. Improving information on AI-generated outputs could therefore improve trust in these companies and in reliable information.
There is a risk that any transparency obligations we impose are too burdensome and cause AI developers to remove their products from the UK market and continue to pursue less transparent practices in other countries, using UK right holders’ works.
Compliance costs
Under Option 1, there may be limited additional compliance costs for the CIs and copyright owners. As UK copyright law under this option would remain largely the same as the status quo, with changes to how UK copyright applies to models trained abroad but imported here, familiarisation costs would be limited. Other costs incurred by right holders would be indirect. If there is increased licensing, copyright owners may choose to enter into more transactions, and incur the associated transaction costs, to receive additional revenue. As AI training taking place overseas would require licences for copyright training data for models made available in the UK, right holders may face a reduced need for technical measures to restrict access to their works. However, for any models which are not provided in the UK or do not comply with these requirements, the need for these measures could persist. The change in the associated set-up and ongoing implementation costs for technical measures may be small to neutral.
Creative industry growth and investment
Requiring licences in all cases could have a net positive effect on the rate of GVA and employment growth in the CIs as it could maximise potential licensing revenue for the sector. In a scenario where copyright owners are able to restrict access to their works in the counterfactual (Option 0) the increase in licensing under this option may not be as significant compared to the scenario where protection measures are not in place. Changes in international policy and the outcomes of litigation would also affect the scale of the licensing market, and therefore the impact of this option.
Strengthening existing copyright law may provide stronger incentives for firms to create British content. Goldberg and Lam’s 2025 research suggests an overly permissive copyright regime may weaken these incentives, ultimately reducing the supply of human-created content which will underpin future AI model training.[footnote 74] The CIs may also have increased confidence that productivity-boosting AI tools are compliant with copyright rules, potentially leading to greater adoption of AI than under the status quo. It could also support their own development of AI models on their content, offering further opportunities for productivity benefits, as well as the sale of these models to third parties.
However, if market access-based copyright and transparency restrictions result in the withdrawal or delayed access to overseas AI models, potential licensing opportunities will be reduced. Changes to productivity gains or potential relative international competitiveness of the CIs are uncertain but may be reduced if the scale of overseas developer withdrawal is significant.
For SME and individual copyright holders, as this option does not directly reduce transaction costs associated with licensing, benefits from increased licensing may be proportionately lower than for large right holders. Equally the growth impacts of transparency measures depend on the ability to enforce these obligations, with these expected to be constrained by the lower resources SMEs have at their disposal.
AI developers
Compliance costs and access to data
Requiring AI companies that train or make their models available in the UK to have licences for all uses of copyright works in AI development would likely result in significant compliance costs, especially if SMEs are required to comply. Under such a regime, every AI developer or business wishing to operate in or with the UK market would need to secure licences for all relevant copyright materials used in model training, development, and deployment.
This process could involve negotiating with multiple right holders, managing complex legal agreements, and potentially paying substantial licensing fees. Where datasets are obtained through third parties or via web crawling, compliance costs may be compounded by the difficulty of checking whether those datasets are usable, especially where data is unlabelled; checking every item in a dataset would likely be excessively costly and not commercially viable.
For SMEs, these administrative and financial burdens could be prohibitive. In a highly restrictive scenario with stringent transparency requirements, this increased cost and complexity may deter new entrants and potentially incentivise businesses to relocate development activities, and model release, to jurisdictions with more streamlined or permissive legal frameworks.
Strict transparency requirements for developers to disclose detailed information about how they have trained their models may lead them not to develop or deploy those models in the UK. If these requirements were set in a way that reveals proprietary datasets, model weights, or other valued information, they could plausibly push some large developers of closed-source models to delay or avoid UK deployment. This depends on several factors, such as how costly the disclosures are, whether similar rules exist in other major markets, and whether there are workable compliance paths that preserve sensitive information.
This stringency also affects technical feasibility, which is highly dependent on the level of granularity of information AI developers are required to disclose. Measures such as providing high-level public summaries of training data and implementing output labelling (for example via watermarking) are likely to be more feasible with existing technologies and practices. However, detailed, auditable reports on training data composition and curation would be more complex, and granular, work-level disclosure of every copyright work used in training may be infeasible.
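The feasibility distinction drawn above can be illustrated with a toy sketch of one well-known output-labelling idea: ‘green list’ statistical watermarking, in which a generator is biased towards a hash-selected subset of tokens and a detector measures how often output tokens fall in that subset. Everything here (the vocabulary, the uniform ‘generator’, the 50% green fraction) is an illustrative assumption, not any developer’s actual scheme:

```python
import hashlib
import random

# Toy vocabulary standing in for a real model's token set.
VOCAB = sorted(["the", "a", "model", "data", "uses", "work",
                "output", "text", "new", "some", "many", "can"])

def green_list(prev_token):
    # Deterministically partition the vocabulary, seeded by the previous
    # token, into a watermark-preferred ('green') half.
    seed = int(hashlib.sha256(prev_token.encode()).hexdigest(), 16) % (2 ** 32)
    rng = random.Random(seed)
    shuffled = VOCAB[:]
    rng.shuffle(shuffled)
    return set(shuffled[: len(shuffled) // 2])

def generate(length, watermark, seed=0):
    # Stand-in generator: samples uniformly, restricted to the green
    # list when the watermark is switched on.
    rng = random.Random(seed)
    tokens = ["the"]
    for _ in range(length):
        pool = sorted(green_list(tokens[-1])) if watermark else VOCAB
        tokens.append(rng.choice(pool))
    return tokens

def detect(tokens):
    # Fraction of tokens that fall in their predecessor's green list:
    # near 1.0 suggests watermarked output, near 0.5 suggests not.
    hits = sum(tok in green_list(prev) for prev, tok in zip(tokens, tokens[1:]))
    return hits / (len(tokens) - 1)
```

Even this toy version shows why output labelling is comparatively tractable: detection needs only the hashing scheme, not access to the model or its training data, whereas work-level disclosure of training inputs has no analogous shortcut.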
Growth and investment
If the UK were to implement this option, it is likely that there would continue to be no frontier AI model training taking place domestically, while the existing UK AI sector would face additional costs where it licenses copyright works. The compliance costs above could deter firms and be especially prohibitive for SMEs. This, combined with administrative complexity, could limit investment and growth in the sector. That said, many existing non-frontier AI companies already license where required, or use internal, copyright-compliant data.
Furthermore, licensing would have an uneven impact on the market: the most capitalised firms can afford to license and absorb compliance costs, disadvantaging domestic AI start-ups and scale-ups, which are less likely to be able to afford them. As a result, both domestic and international AI developers may be incentivised to relocate model development and training activities to jurisdictions with more permissive or streamlined copyright frameworks. Relocation of UK frontier model development activity abroad would affect the approximately £2 billion of GVA related to frontier AI development activity in the UK, with a potential negative impact on domestic AI fine-tuning and narrow model development.[footnote 75]
Again, depending on the stringency of transparency requirements and the application of UK copyright law to models trained abroad but made available on the UK market, it is possible that current models would be withdrawn from the UK where they do not comply. Leading frontier AI firms may judge that compliance is not worth the commercial gain from deploying in the UK and withdraw their models, limiting access to top global models. However, this has not been seen in the EU in response to its associated transparency requirements, and is highly dependent on the requirements imposed.
Were this to happen, it would decrease the commercial viability of early-stage AI firms, including UK-based SMEs, as they would have restricted access to frontier models on which to build applications. UK AI activities may build on potentially infringing models, for example by fine-tuning them for a specific purpose, and may therefore face pass-through compliance costs or the removal of their base model entirely. This could reduce domestic model capability and growth.[footnote 76] A full licensing requirement could reduce the value of the AI sector to the UK economy, which is currently projected to be between £20 billion and £90 billion per year by 2030.[footnote 77]
The extent of this competitive impact on developers would however be lessened in a scenario where litigation or access controls in other jurisdictions leads to an increase in licensing. Where firms do remain, increased policy certainty would allow for longer term planning and investment.
AI adoption
Option 1 could affect the use of AI by influencing either trust in AI outputs or the availability of AI models. Both depend on how developers react to these measures, the impact on firms adopting internal TDM solutions, and consumer preferences.
Where any model that continues to be made available in the UK needs to be compliant with the requirement to license copyright material in all cases and provide detailed transparency information, there could be increased trust in the use of models. Users may be assured that the input data going into models did not result in a breach of copyright law. This may also reassure users that outputs they generate with AI are compliant where licensing agreements include permission to use the training data in outputs (for example, as seen in the licensing deal between OpenAI and Disney).[footnote 78]
A potential outcome for compliant models is an increase in training costs. As discussed under Option 0, dataset costs currently appear to be a relatively low proportion of the overall costs of model training and development, but if firms were to license the entirety of copyright works within their training corpus, these costs could be passed through to users. This could in turn decrease adoption of paid AI tools. This impact is uncertain as it depends on factors such as how sensitive user demand is to changes in price. Where models that continue to be made available in the UK reduce the level of non-licensed content in their training data, this could influence model quality, potentially reducing the benefits of adoption.
For models that do not meet the requirements under Option 1, UK-based users of these AI models trained overseas would need to find an alternative model. As discussed above, model developers may choose to delay or withhold models being released in the UK to avoid legal risk. If this were to occur, both the potential growth of the UK’s AI sector, as well as the UK’s ability to harness the productivity benefits of AI, could be diminished.
A range of viewpoints exist on the impact of licensing requirements on the adoption of AI. Independent think tanks exemplify this – the Centre for British Progress predicts large negative impacts of requiring full licensing on AI adoption in the UK,[footnote 79] while the Social Market Foundation expects that AI adoption is “likely to continue apace regardless of the position the UK takes on copyright”.[footnote 80]
The OECD estimates that adoption of AI into the whole economy could add 0.4 to 1.3 percentage points to the UK’s productivity growth.[footnote 81] If models provided to the UK market are required to be trained in accordance with UK copyright law, it is plausible that users will have fewer models to adopt. This could potentially leave UK firms and users reliant on non-frontier models with lower capabilities. If this were the case, resulting opportunity costs could be significant.
Summary: Option 1 Assessment
- Under this option, we assume that licences would be needed for uses of copyright material in AI training, including market-access measures requiring that AI models imported to the UK must have complied with UK copyright law during their development and new UK transparency measures.
- The licensing and remuneration for right holders may increase and grow the UK’s CIs. This impact depends on the market reaction of AI developers, as well as the counterfactual (Option 0). The level of additional licensing could be limited if AI developers withdraw or delay access to leading AI models in the UK but continue to train in other jurisdictions. Against a counterfactual where there is more licensing in other jurisdictions, the marginal impact of Option 1 may also be less significant.
- For AI developers, there would likely be significant compliance costs associated with this option. Under this option, there would likely continue to be no frontier AI model training taking place domestically, and the growth of the UK AI sector (projected to generate £20-90 billion GVA per year by 2030 without intervention) may be dampened. In a scenario where litigation or access controls in other jurisdictions leads to an increase in licensing, this potential negative effect on AI sector growth may be smaller.
- AI adoption may also be affected by this option. Users may face higher costs and a restricted choice of AI models and services. If leading AI models were withdrawn from the UK entirely, Option 1 would limit adoption of AI, which would have significant impacts, as the overall economic benefits of AI adoption are estimated to be £55-140 billion in 2030.
- This option may benefit larger right holders to a greater extent than startups, individuals, and SME creators. Distribution effects will depend on the nature of licensing deals, including the use of collective licensing.
Option 2: A broad data mining exception
Description, assumptions, key uncertainties
Option 2 would introduce a broad text and data mining exception. This would allow data mining of copyright works for any purpose, including for AI training, and would be subject to relatively fewer restrictions than the status quo.
The Copyright and AI consultation did not define this broad data mining exception in detail, but it cited exceptions in other countries including the USA, Japan, and Singapore. While these exceptions are often presented as ‘broad’, it should be noted that they are not unrestricted. For example, Japan’s data mining exception does not apply to uses that would unreasonably prejudice the interests of the copyright owner. In the USA, the fair use principle applies and there is ongoing debate about the extent to which fair use allows unlicensed AI training. The scope of fair use and these exceptions is therefore subject to litigation in those countries, and their exact scope and effect is difficult to assess.
Given the difficulty in assessing the scope of similar international comparators, for this Impact Assessment, we consider Option 2 as a broad data mining exception without constraints such as ‘fair use’. This is akin to the exception available in Singapore.
The exception would not be entirely without constraint, as data mining activity would only be allowed where the user has lawful access. However, right holders would not be able to reserve their rights and ‘opt-out’ of the exception. Furthermore, any exception would also need to pass the ‘3 step test’, as defined in Section B of the Report.
Creative Industries and copyright owners
Licensing market for copyright works
The impact of a broad exception (Option 2) on the licensing market is uncertain. In a scenario where training on publicly available copyright works continues to take place in other jurisdictions and creators have a limited ability to restrict access to their content through technological protection tools, the impact of the UK introducing a broad exception may be more limited. However, in a scenario where other jurisdictions see increased propensity for licensing due to legal developments, Option 2 would more significantly reduce opportunities for remuneration.
By allowing lawfully accessed copyright content to be used for training without permission in the UK, a broad exception may reduce incentives for licensing publicly available content. Licensing of privately held or restricted access content in the UK may also reduce in value if AI developers choose to use publicly available data instead. As noted under Option 0, the AI licensing market for copyright works is in the early stages of development but faces several barriers and it is uncertain how these will develop under the status quo. If smaller right holders see fewer benefits from licensing deals, then Option 2 could continue to limit the revenues of SME and individual copyright owners.
For any current or future licensing agreements which rely on UK copyright protection under the status quo, especially for publicly available works, a broad exception could reduce the income of copyright owners.
Right holders already have incentives to use technological protection tools, and a broad exception could accelerate their use, as well as the development of new tools. This would place constraints on the availability and use of works by AI developers. SME and individual right holders would have fewer resources available to implement these measures. Where content is protected by tools and remains valuable for AI training, there may be increases in the level and value of licensing for this content. Equally, for content that is protected but not valuable for training, the level of licensing is not expected to change compared to the status quo. If the exception allows access to content behind paywalls, this would reduce the value of licences for this form of restricted online content.
Reduced licensing revenue under a broad exception would largely be captured by non-UK firms, representing a transfer of value from UK CI firms to overseas AI developers. If the broad exception incentivises domestic TDM by UK AI companies, and reduces licensing agreements between UK licensees and UK right holders, the change in payment is considered a transfer cost at the UK economy-wide level, whilst any fall in transaction costs will be a net benefit.
Compliance costs
A broad exception (Option 2) is unlikely to create significant additional compliance costs for the CIs compared to the status quo. There will be some familiarisation costs for copyright owners who need to understand how changes to TDM exceptions would affect their business. In addition, some businesses who provide notices or permissions on their website based on the current non-commercial research exception would need to update this information.
However, there will be no ongoing compliance costs required by copyright owners as part of this option. For example, there will be no need to implement any rights reservation mechanism under Option 2, though some businesses may choose to continue to use these tools as they do under the status quo.
Enforcement and transparency
Under Option 2, the challenges for effective enforcement under the status quo are likely to remain. There are likely to be minimal input or output transparency obligations placed on AI developers, although voluntary guidance or standards may be developed, and other countries’ transparency regulations will have some spillover effect in the UK. For copyright owners this would largely mean a continuation of the status quo where it is difficult to prove if copyright material has been used to train or develop an AI model in the UK, and discovery and litigation costs are likely to be high.
Creative Industries growth and investment
Option 2 is likely to have some negative – albeit uncertain – impact on the growth of UK CIs and their incentives to invest in creative outputs compared to the status quo. This impact would be more significant if there are developments that mean creators can effectively restrict the use of their content for AI training in other, more permissive jurisdictions.
Any reduction in revenues received through licensing would be greater against a counterfactual scenario of an expanded licensing market. In a limited licensing market scenario, the broad exception may still reduce revenues for both publicly available and privately held works. The level would be influenced by factors including copyright law in other jurisdictions and the extent to which AI developers license publicly accessible works to reduce litigation costs.
Even in a scenario where reduced licensing revenues are limited, the incentive to invest in and produce high-quality creative content could weaken over time. Creators and businesses rely on returns from their work to justify investment in skills, production, and risk-taking. If the loss of income from this option were significant, investment decisions may be scaled back, putting at risk a globally competitive sector for which the UK has comparative advantage. If there is also a reduction in investor confidence, there could be a similar effect on wider investment (including FDI inflows) and growth. Countries with high levels of intellectual property protections can enhance their economic growth through higher FDI inflows.[footnote 82]
Yang and Zhang’s (2025) theoretical model found that more lenient copyright protections can improve AI model quality without reducing incentives to create when pre-existing data is abundant, but impede future development when data is scarce by weakening creators’ incentives to produce new content.[footnote 83] This would also impact AI firms, which depend on a continued supply of high-quality, professionally produced content to train and improve their models.
Option 2 may reduce creators’ ability to control the use of their works in the UK. However, if technological protection tools are used at increased scale under Option 2 and are effective, this may increase the bargaining power of right holders, potentially increasing the value of licences where data is valuable for AI training.
AI developers
Compliance costs and access to data
In a scenario where training on publicly available copyright works continues to take place in other jurisdictions, a broad exception (Option 2) is likely to only impact UK developers, as international firms can already operate in more permissive regimes. However, in a scenario where there is more successful litigation in other jurisdictions, the impact of a broad exception could be more significant, as the UK may be seen as more permissive. Introducing a broad exception could reduce transaction costs for AI developers using copyright works for AI training, relative to the status quo, if those developers are training or developing their models in the UK.
For publicly available content online, a broad exception (Option 2) would largely remove transaction costs. However, as noted above, foundation model developers appear to already avoid (or seek to avoid) transaction costs in relation to these works by training their models in more permissive jurisdictions. In a scenario where other jurisdictions converge towards licensing of public works, cost reductions would therefore be more significant.
There are also many developers of smaller AI models in the UK whose activities may include commercial TDM. A collation of known AI models by Epoch AI[footnote 84] shows that there are many models being developed by UK organisations such as universities, healthcare and scientific research companies. Were any of these organisations to train or develop models using copyright works in the UK, a broad exception could change how they seek permission from copyright owners, removing some legal processes for accessing copyright data and increasing their access to a wider range of data.
If the UK were to implement a broad TDM exception, compliance costs for AI developers and businesses developing models in the UK are expected to be reduced, though this is dependent on their current licensing arrangements. While a broad TDM exception would still require AI developers to strike licensing deals for privately held works, they would not need to seek permission for using copyright works that are publicly available and lawfully accessible online. To an extent this would simplify legal processes for developers and lower administrative burdens.
For micro, small and medium-sized enterprises, as well as individual developers, the reduction in legal risk and transaction costs could make it significantly easier to innovate and compete internationally, though other barriers remain. The effects could be greater in relation to these firms as they are less likely to have access to other sources of data (such as data obtained through terms of service by online platforms) and may find it more difficult to locate training activities outside the UK. However, some compliance costs may remain, especially if transparency, reporting or other requirements are attached to the exception.
Growth and investment
Responses to the consultation from AI developers suggested that the introduction of a broad TDM exception would position the UK as a more attractive environment for AI research, development, and investment. Removal of copyright as a constraint on AI activity could make the UK more likely to attract both domestic and international AI firms to develop models in the UK, though it is one of many factors in this decision. Again, this effect could be greater if licensing of publicly available works were to grow internationally, as the UK could be seen as more permissive.
By reducing legal risk, a broad TDM exception could foster a more competitive domestic AI ecosystem, allowing all AI firms and businesses deploying AI technology, not just large capital rich firms, to access the data required to compete in the market. This would enable firms who cannot afford to conduct TDM activities abroad under the status quo to access the same data as larger, international firms.
A broad exception could in theory lead to the development of a UK-based frontier general-purpose model, as it would reduce a key cost and legal risk to development. This could potentially drive the benefits associated with a model training market, such as increased investment and job creation; however, it would also require the removal of other significant barriers, such as energy costs. Regardless, this option should boost business confidence and certainty in the current AI sector, as the status quo remains uncertain and dependent on ongoing legal cases.
As policy certainty typically boosts investment, Option 2 may support existing UK model development in layers such as fine-tuning and narrow-purpose model development. Firstly, greater clarity about which activities are permitted in the UK reduces risk and so increases the likelihood of firms conducting AI development in the UK; a larger number of firms in the AI sector in turn boosts competition, and both stimulate growth. Secondly, increased access to data and models may lead to greater capabilities, increasing productivity and therefore growth.
Research undertaken by American academics indicates that fair use copyright regimes are associated with a 5.25% higher level of R&D spending by technology and hardware firms than in countries with more restrictive copyright regimes.[footnote 85] Some research has suggested that countries with comparatively broader copyright exceptions are associated with higher rates of AI patent filings and formations of new AI organisations.[footnote 86] Research presented by the CCIA, an information technology trade association, indicates that AI investment intensity is higher in countries with flexible copyright regimes than in similarly developed economies with more restrictive rules in place, though this does not imply causation.[footnote 87]
These factors could provide a boost to the existing strengths of the UK AI market. As a result, the UK AI sector could see growth, particularly in the frontier model development layer and the wider AI sector, with these layers contributing £2 billion and £8 billion GVA respectively in 2024.[footnote 88]
AI adoption
The extent to which a broad exception would affect users of AI is highly uncertain. It may have a more significant influence on users of models which are trained or developed in the UK, as users may have more assurances that models were complying with copyright law. Users’ ability to trust AI is a key factor in achieving widespread AI adoption. However, as the UK has not trained any leading foundation models domestically,[footnote 89] there would likely be both a marginal impact on trust and a limited scope of models affected.
If the majority of foundation models were still trained abroad, there would likely be little impact from Option 2 on the availability, quality, and cost of most AI models for UK users. However, if UK developers were able to access more data (or the same data at a lower cost) for building and fine-tuning models, there could be an increase in their usage. If this leads to developers designing models to address issues distinct to the UK economy, it could potentially increase the level and benefits of adoption. Furthermore, access to cheaper data may feed through into lower prices for AI services, increasing adoption by end users, though this mechanism is uncertain, and training costs are not typically the primary driver.
As set out in Section B of the Report accompanying this assessment, there is uncertainty about the extent to which copyright constrains the operation of AI services that access copyright works as part of their operation – for example AI agents and Retrieval-Augmented Generation (RAG).
As discussed above, copyright concerns can weigh on some firms’ decisions to adopt TDM techniques or AI products. If a broad exception eases those concerns, even a modest increase in adoption could translate into significant economic gains, given the magnitude of estimated productivity growth from AI.[footnote 90]
Summary: Option 2 assessment
- Option 2 is likely to have negative – albeit uncertain – impacts on the licensing revenue of the UK CIs. This may affect incentives to invest in creative outputs and therefore growth compared to the status quo.
- Each of these impacts is highly dependent on other factors and how they change under the counterfactual (Option 0), especially on how far training on publicly available works continues to take place in other jurisdictions and the extent to which creators can restrict access to their content.
- There are unlikely to be significant compliance or enforcement costs compared to the status quo. The negative effects on licensing income, as well as on incentives for investment in creative works, would have a negative influence on the growth of CIs, more so than other options.
- A broad exception would strengthen UK competitiveness for AI development and deployment and remove a significant constraint to competitive general-purpose model training. However, the impact is uncertain and dependent on other barriers. It may also address a constraint to growth in other elements of the AI value chain, such as fine-tuning and narrow model development. Taken together, this option may support the UK AI sector, and the boost could be larger depending on international licensing: in a scenario where there is more successful litigation in other jurisdictions, the UK would be seen as more internationally permissive.
- AI adoption may also grow, through increased certainty in the use of models and TDM. There may also be some second-order impacts from the growth of the UK AI sector. Option 2 has the potential to be the most enabling for the AI sector, which plays an increasingly important role for growth both as a sector in itself and via adoption across the wider economy.
Option 3: A data mining exception with a rights reservation mechanism
Description, assumptions, key uncertainties
Option 3 would introduce a text and data mining exception for any purpose, including commercial purposes and training AI models.
The exception would apply only if a right holder had not expressly reserved their rights. This reservation would allow right holders to “switch off” the exception on selected works. For content publicly available online, this reservation would need to be by machine readable means.
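The consultation does not prescribe a technical standard for machine-readable reservation, but existing web conventions illustrate what it could look like in practice. One is the robots.txt convention (GPTBot and Google-Extended are published crawler tokens; their use here as a rights-reservation signal is illustrative only):

```text
# robots.txt at the site root – disallowing known AI training crawlers
User-agent: GPTBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```

Another is the W3C TDM Reservation Protocol community draft, under which a site publishes a well-known JSON file. The sketch below follows the shape of that draft (with an illustrative policy URL); a UK mechanism would not necessarily adopt either convention:

```json
[
  {
    "location": "/",
    "tdm-reservation": 1,
    "tdm-policy": "https://example.com/policies/tdm-licence.json"
  }
]
```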
The exception would only allow data mining to take place on works to which a user has lawful access. This arrangement would, in principle, allow right holders to factor data mining into any access conditions applied to works, and what is charged for that access.
To support the effective introduction of a new data mining exception and rights reservation mechanism, new transparency measures would also be introduced, broadly aligned with comparable measures in the EU. This would include transparency measures on AI inputs and AI outputs. The government may need to introduce new regulatory duties to undertake this activity, which may include establishing a new regulatory body with appropriate resources and powers to deliver these transparency measures. It is not clear at this stage how these regulatory duties would be funded.
There would be no new copyright requirements on models trained abroad under this option, meaning that AI models imported to the UK would not be required to have complied with UK copyright law during their development. However, transparency rules may apply to AI systems developed outside the UK.
A key uncertainty under this option is the extent to which opt-outs are used by right holders. This will depend on factors including the efficacy and ease of use of rights reservation tools.
Respondents to the consultation took different views on the extent to which works would be opted out. Many right holders thought opt-outs would be difficult to implement, so a large volume of works would be available to AI developers. This may imply similar impacts to the broad exception, Option 2. However, many AI developers, including SMEs, felt that insufficient works would be available under this option, particularly when compared to the situation in other countries. In a scenario where a large number of works are opted out, the impacts of the exception would be reduced and tend towards the status quo, Option 0. Other respondents felt that this option would achieve a balance where certain (especially high value) works were licensed but sufficient data for training was still available. At this stage, it is difficult to predict the extent to which rights will be reserved, and the impact of this on access to data. We will continue to monitor the market in the EU, which has a similar exception with opt-out.
Creative Industries and copyright owners
Licensing market for copyright works
Under Option 3, the impact on the size and value of the licensing market, including for publicly available works, is highly uncertain. It would depend on the extent to which rights reservation is implemented by copyright owners, including the awareness and skills of right holders to implement it, and the extent to which these controls work effectively in practice. For any right holders that reserve their rights, AI developers would need explicit permission to carry out TDM activities in the UK.
With a legally recognised rights reservation mechanism, the changes to the licensing market may still depend on whether the technical measures included are effective in protecting the use of works in training by overseas AI developers. Where technical measures act as an effective rights reservation mechanism, this could see increased use of licensing where protected content remains valuable for AI training. If the reservation mechanisms do not prevent overseas AI developers using reserved works, they could continue to train their models on publicly available UK copyright works without paying licensing fees. This may limit the demand for the licensing of works where rights have been reserved in the UK.
If a copyright exception with rights reservation in the UK does not stop certain AI developers from training on content which is publicly available in other jurisdictions without a licence, there would be a limited impact on the extent of licensing, data access to AI developers or the levels of control and remuneration for right holders for this type of AI activity.
Under Option 3, AI developers would still need to negotiate licences for restricted or privately held copyright works, though the effects on the value of licensing could be affected by any substitution to publicly available content.
Although measures such as transparency may help to support a licensing market, enabling copyright owners to gain remuneration from their works, the demand for works whose rights have been reserved is unclear. This demand may be lessened by the substitutability of reserved and unreserved content. Where this takes place, the value of privately held or restricted access works could decrease. Generic images or unstructured text may not be of great value if alternatives can be accessed elsewhere for free. The evidence of frontier models making greater use of internal data[footnote 91] and synthetic data[footnote 92] suggests that companies are using broader methods for expanding training and fine-tuning datasets.
Enforcement and transparency
The effectiveness of Option 3 would depend on the extent to which AI developers adhere to rights reservation mechanisms. Transparency measures could help right holders assert their rights, including that AI developers are adhering to rights reservations. We assume for the purpose of this assessment that transparency measures may apply to models developed abroad, as well as those developed in the UK.
As previously noted, high legal discovery costs remain a significant barrier for stakeholders in AI-related disputes. While transparency measures could lower these costs by making it easier to detect non-compliance, the fundamental time and financial burdens of litigation would persist, especially for smaller businesses. Consequently, while enhanced transparency may empower smaller right holders or Collective Management Organisations (CMOs) to pursue previously high-risk cases, many individuals may still find the legal process prohibitive.
If transparency obligations are too burdensome, AI developers may remove their products from the UK market. This could harm the productivity of sectors across the economy, including the CIs. Transparency measures may also apply to CI firms developing their own in-house models utilising their IP, increasing their administrative burden.
The impact of output transparency measures is expected to be the same as those outlined under Option 1. There is uncertainty as to whether output transparency would help right holders enforce their rights as this would depend on how the measure is implemented.
Technical tools and standards
Under Option 3, introducing an opt-out may provide UK right holders with a clearer route to signal how they wish their works to be used, and thereby strengthen control over use and licensing of their works. However, the impact is uncertain and will depend on the extent to which rights are reserved. This in turn would depend on whether a rights reservation can be made technically operable, standardised, and easy to apply across different types of content and distribution channels, and whether AI developers reliably detect and respect it in practice.
If workable, it could reduce reliance on bespoke technical measures and improve right holders’ negotiating position in the UK and abroad. If not, it may have limited practical effect while still creating implementation and monitoring costs, which are likely to be more burdensome for smaller right holders.
We anticipate that any rights reservation mechanism would involve the following costs:
a. Time and familiarisation costs, including for right holders who do not reserve their rights.
b. Set-up and implementation costs.
c. Transaction costs.
d. Monitoring and maintenance costs.
e. Enforcement costs.
f. Opportunity costs to right holders who do not reserve their rights.
The cost of implementing and maintaining a rights reservation mechanism will depend on the mechanism used, for example:
a. Tagging/Metadata on each existing and future digital work. The implementation and ongoing cost will depend on how much copyright content a right holder has already uploaded and will upload online in the future. As metadata and tags can be stripped by third parties, there will be monitoring costs to ensure rights reservation is being respected.
b. Robots Exclusion Protocol. Creating or adapting a robots.txt file to implement rights reservation standards may take time and effort. If new REP standards are not agreed, files may need to be continuously updated by the website owner to block new bots and adapt to changing web crawler behaviour. As the Robots Exclusion Protocol can be circumvented, there will be costs associated with monitoring the activity of bots to ensure rights reservation is being respected.
c. An alternative rights reservation mechanism – for example a register. There will likely be an implementation cost and ongoing monitoring and maintenance cost to the right holder under any other rights reservation mechanism that is introduced or enforced under this option.
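To illustrate the second mechanism, the Robots Exclusion Protocol is already machine-readable: a site owner can block a named crawler in a robots.txt file, and the result can be checked with Python's standard library. The crawler name and URLs below are hypothetical; this is a sketch of how REP signals work today, not a proposed rights reservation standard.

```python
# Illustrative sketch only: "ExampleAIBot" and example.com are hypothetical.
import urllib.robotparser

# A minimal robots.txt that blocks a hypothetical AI-training crawler
# while leaving all other crawlers unrestricted.
ROBOTS_TXT = """\
User-agent: ExampleAIBot
Disallow: /

User-agent: *
Allow: /
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(ROBOTS_TXT.splitlines())

# The named AI-training crawler is disallowed site-wide...
print(parser.can_fetch("ExampleAIBot", "https://example.com/articles/1"))  # False
# ...but other crawlers, such as search indexers, remain permitted.
print(parser.can_fetch("OtherBot", "https://example.com/articles/1"))      # True
```

Because the protocol is advisory, a non-compliant crawler can simply ignore the file; this is the source of the monitoring costs described above.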
Creative Industries growth and investment
Under Option 3, the overall impact on right holders is uncertain. As described above, the effects of Option 3 on the licensing market could be positive or negative, depending on the level of rights reservation and the effectiveness of these controls. These impacts also depend on the counterfactual scenario (Option 0). If there is a higher propensity for licensing copyright works in other jurisdictions, then Option 3 could have a limiting effect on licensing if the UK copyright regime is considered more permissive. Option 3 would place the UK in a similar position to that adopted by the EU. The compliance costs faced by copyright owners under Option 3 could also be detrimental to growth if they are overly burdensome.
If there is a reduction in copyright owners’ income, Option 3 may reduce the incentives for creators to invest in new creative works. If rights reservation is adopted and complied with at scale, alongside transparency measures, then AI developers may increasingly license publicly available works of value to them. The impact on privately held or restricted access works also remains unclear: if the option leads to continued development and growth of the licensing market, this may retain the value of these works. Conversely, if unreserved works can be accessed for free, this may reduce the demand for, and value of, licences for privately held or restricted access works.
The rights reservation mechanism also presents several risks. If opt-out technology does not allow fine-grained distinctions to be made between different uses of works, then this could lead to over-blocking of web crawlers, for example limiting search crawlers which may in turn limit creator discoverability. The net effect on the CIs will depend on the specific rights reservation mechanism, enforcement powers, and any transparency obligations introduced.
There could be a disproportionate effect on SMEs, as the time and resources needed to implement a rights reservation mechanism would make up a greater share of an SME's total costs compared to a large business with more resources.
AI developers
Compliance costs and access to data
Developers’ access to data in the UK may increase under this option, as more works will be available for AI development in the UK without a licence. In particular, it will be easier to use works which have been made available online, without restriction, to train and develop AI models. However, the extent to which this enables access to sufficient volumes of works to encourage AI training and development in the UK is unclear. Some AI developers that responded to the consultation, including SMEs, said that it would not, as large numbers of works would be opted out, but others disagreed. As noted above, in a scenario where limited numbers of works are opted out, benefits to AI developers may be similar to those under Option 2. However, where opt-outs are extensive, benefits will be reduced and closer to the status quo.
Option 3 would require AI companies training models in the UK to implement systems for detecting and respecting opt-outs. Depending on its implementation, this could require developers to track and respect reservations on an ongoing basis. This could mean re-validating datasets before each training run to ensure that no new rights reservations have been registered since data was first acquired, with associated time and cost burdens.[footnote 93]
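As a purely hypothetical sketch of what such re-validation might involve, the snippet below filters a small dataset against a registry of reserved source identifiers before a training run. The field names and data structures are illustrative assumptions, not any prescribed mechanism.

```python
# Hypothetical pre-training-run compliance check, assuming a developer holds
# (a) acquired records tagged with their source and (b) a current list of
# reserved identifiers. All names and URLs here are illustrative.

def filter_reserved(dataset, reserved_ids):
    """Drop any record whose source appears in the current reservation list."""
    return [record for record in dataset if record["source"] not in reserved_ids]

dataset = [
    {"source": "https://example.com/a", "text": "..."},
    {"source": "https://example.com/b", "text": "..."},
]
# A reservation registered after the data was first acquired.
reserved = {"https://example.com/b"}

cleaned = filter_reserved(dataset, reserved)
# Only the unreserved record remains available for the training run.
print(len(cleaned))  # 1
```

Even in this simplified form, the check must be repeated whenever the reservation list changes, which is the source of the recurring time and cost burdens noted above.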
These impacts are more likely to disadvantage SME AI developers, who are likely to have fewer of these systems already in place, as well as experiencing higher relative costs to implement them. Where multinational AI companies already comply with similar obligations under the EU AI Act, the additional cost of meeting these requirements in the UK would likely be minimal. Again, the stringency of transparency requirements will determine the impact on business and therefore the UK, as there are competitive limits on what a firm will be willing to share publicly, as well as feasibility limits.
Growth and investment
Similar to Option 2, in a scenario where training on publicly available copyright works continues to take place in other jurisdictions, the growth impacts may be minimal; however, in a scenario where there is more litigation or access controls in other jurisdictions, stronger growth may be seen. By reducing legal risk, this policy could encourage greater AI development and adoption in the UK. However, this will depend on the extent to which copyright owners reserve their rights, which depends on a range of factors including how effectively the opt-out system is implemented. There is a worst-case scenario where this option is too burdensome and costly for SME AI developers to adhere to and procure sufficient data in the UK, thus putting them at a competitive disadvantage, whilst large global AI firms train and develop models abroad.[footnote 94] However, in a scenario where licensing is more common abroad, this incentive to train abroad may be lessened, boosting the growth potential of this option.
AI adoption
As Option 3 would not include any restrictions on AI models trained in other jurisdictions, the UK economy would still have access to global frontier models. Therefore, the impact would be on models which are trained or developed in the UK. It may become easier to develop and provide these models, and users may find it easier to manage legal risks when using them, for example when using AI tools to analyse lawfully-acquired datasets. As noted above, the extent of these effects will depend on the extent to which works are opted out. Input and output transparency measures introduced under this option could also increase trust in, and usage of, AI.
However, there is a risk that if transparency requirements become overly burdensome, they could cause firms to delay the release of products and services and thereby impede AI adoption and deployment. According to a survey of 500 members of the UK AI ecosystem, 83% of respondents indicated that implementing a transparency requirement would be likely to result in delays to the deployment of AI services or features in the UK.[footnote 95] It should be noted that this research was conducted by the CCIA, with its members largely in support of a TDM exception.
Under this option, AI adoption may grow, though the overall impact of Option 3 is uncertain and would depend on the detail of how the opt-out is implemented in law, the technical measures supporting it, and the extent to which it is used by right holders in practice.
Summary: Option 3 Assessment
-
Under a broad exception with a mechanism for rights reservation (Option 3) the effects on the licensing market could be positive or negative, depending on the level of take-up of the rights reservation by copyright owners, and its effectiveness in controlling access to works. Where copyright works continue to be accessed for training in other jurisdictions, the legal effect of the rights reservation will be less significant. If there is a higher propensity for licensing copyright works in other jurisdictions, then Option 3 could have a limiting effect on licensing compared to the status quo, if the UK copyright regime is considered more permissive by comparison.
-
Right holders will face set-up, monitoring and enforcement costs for the reservation mechanism, which are likely to be more burdensome for SMEs and individuals.
-
If there is a reduction in copyright owners’ income, Option 3 may reduce the incentives for creators to create. If AI developers increasingly license copyright works of value to them as a result of widely-implemented rights reservations, then this could increase CI income. The overall impacts on creative industry growth of Option 3 are therefore uncertain.
-
Under a broad exception with a mechanism for rights reservation (Option 3), the UK could be less competitive internationally relative to Option 2 but would still be more attractive for AI development and deployment than it is today. The extent of this would depend on the extent to which rights reservations are used by right holders, and whether other market barriers change which currently restrict development and deployment.
-
The cost of licensing works, when rights are reserved, and transparency requirements may deter frontier model developers from training their models in the UK, which could mean there remains no frontier model training in the UK. It is uncertain whether this option would address constraints to growth in other elements of the AI value chain, such as fine-tuning and narrow model development. The extent to which a boost is provided will depend on the extent to which rights reservations are implemented.
-
AI adoption may also grow, through reduced legal risk over the use of models and TDM more widely, though other barriers could remain. Second order effects on productivity benefits of adoption depend on whether the reservation increases or decreases access to quality data.
Summary of impact on SMEs
Creative Industries and copyright owners
In March 2025, 77.8% of businesses in the CIs had a turnover of less than £250,000, a higher proportion than that for UK registered businesses overall (65.3%). Of firms within the CIs, 93% are micro (0 to 9 employees), 5% are small (10 to 49) and 1% are medium (50 to 249) businesses.[footnote 96] The status quo is less favourable to SMEs than larger businesses in the CIs. Smaller businesses have fewer resources to monitor and enforce their rights, including less ability to afford the latest software which prevents web-scraping, or to take legal action against actors they believe to have unlawfully used their works.
Some large UK-based creative organisations have licensing agreements with AI developers. While there is some remuneration for smaller creators within these, the arrangements for most agreements are unclear. The HarperCollins deal with Microsoft was reported to pay authors $2,500 per book that is included in AI training, where the terms of the deal were agreed to by hundreds of authors.[footnote 97] However, given the scale of modern LLMs, per-author licensing payments across multi-billion-parameter models are likely to be difficult in practice.
To reduce transaction costs, licensing agreements for individual creatives could be organised through CMOs or large right holders; however, some creative industry bodies have urged caution over how this should be implemented. The Council of Music Makers said the government should ensure that consent is required from primary creators before their works can be used with AI, and that such consent should be “explicit, specific and meaningful”, not assumed through “generic terms in old agreements”.
Any compliance costs from the consultation options would likely have a disproportionate impact on smaller businesses. Option 3 (exception with rights reservation) was identified as having the greatest compliance costs for right holders. Smaller creative businesses, and other SME owners of copyright, would have fewer staff and resources to divert to ensuring the rights for their works are reserved. Any further transaction costs that arise, which were identified as likely to be greatest under Option 1, would also be more significant for smaller right holders that directly negotiate these deals. These costs would be lower where negotiations are carried out by larger organisations on their behalf, but smaller right holders could also receive a lower share of revenue in these cases.
The relative effect of the consultation options on the growth and wider income of smaller right holders compared to larger right holders is less clear. Sole traders or freelancers could be at greater risk of losing work where creative services are replaced by AI, though larger organisations, such as advertising bodies and digital publications, could also be subject to loss of revenue. However, the marginal effect of changes in UK copyright law on the ability of frontier AI models to replace human labour is highly uncertain. This could be particularly limited for options which have no market access conditions in place (Options 2 and 3) and therefore have less effect on current frontier model training practices.
AI developers
Of identified AI companies in the AI Sector Study, 70% are micro, 15% are small and 9% are medium businesses.[footnote 98] The status quo is unfavourable for UK AI SMEs, who need to comply with stronger copyright protections compared to SMEs operating in other jurisdictions. Whilst training of frontier general-purpose models does not occur in the UK, some UK SMEs training narrower models or machine learning algorithms may be subject to these requirements. That said, AI development is already a large market in the UK and has been able to grow under the current copyright framework.
Option 1 could have the most significant cost implications for UK SMEs. A requirement to license all copyright works used in model training could impose significant costs. Current UK developers already need to comply with copyright law, which prohibits TDM of publicly available works, suggesting that firms that already develop models in the UK are compliant. However, for those that build on top of models, a requirement under Option 1 for models supplied to the UK to be trained in compliance with UK copyright law would mean restricted access to frontier models to build from. This is more likely to impact SMEs, who may not be able to build their own model without this building block. This option could also have greater implications for any SMEs that would have been looking to relocate training outside the UK.
Compliance with transparency measures, under Options 1 and 3, would be disproportionately burdensome on UK AI SMEs that fall in scope, as smaller AI companies may have fewer resources or processes in place to meet the additional requirements.
Options 2 and 3 could be of net benefit to UK AI SMEs, as firms may be able to legally access more copyright data versus the status quo. This could enable greater certainty in being able to establish business operations for AI development and training in the UK.
AI adoption
Small and micro businesses are less likely to be users of AI. As of December 2025, 24% of micro and 27% of small businesses indicated they were using AI for at least one business function surveyed. This is followed by medium (37%) and large (44%) businesses in turn. However, this is a significant increase in adoption from the figures reported in January 2025, where 16% of micro and 17% of small businesses indicated they were using AI for at least one business function surveyed.
SME users of AI products could face disproportionately higher compliance and familiarisation costs compared to larger businesses under the options considered.
Under all options, SMEs will have fewer resources to determine whether the models they use are compliant with changes to copyright law and/or transparency measures. However, the onus will be on AI service providers to ensure that they comply with the jurisdiction they are operating in. Despite this, concern over adoption of infringing outputs is a potential barrier in the status quo. Under Option 2, there is unlikely to be additional legal challenge for use of copyright works compared to the status quo, and legal risk for UK SME users may be perceived as lower.
The impact of the various options on AI adoption is uncertain, despite having the potential for the largest impact. In our analysis above, we discussed adoption being affected by several factors including trust, availability, quality and cost. Some of these factors may be more pertinent to SME users of AI. For example, those without the ability to hire legal resource may rely on trust when deciding whether to use AI products. Cost would also be a greater issue for smaller companies, though many SMEs and individuals will use free versions of AI products.
Next Steps
-
Although we have made an initial assessment of impacts of these options, there remain significant gaps in evidence and uncertainties about how the market will develop. We will continue to build our evidence base on how copyright impacts economic growth in the UK.
-
We will continue to monitor the important developments in the wider market that have taken place since the close of the consultation. The licensing market continues to grow, to the benefit of right holders and AI developers. Meanwhile, the volume of litigation has increased, particularly in the US where most AI models are trained. New transparency rules have come into effect in the EU and parts of the USA, and new technical standards continue to be developed. It is important that these developments inform our final policy position.
-
We will use commissioned research to understand the impact of copyright reform on the CIs. It will build the evidence base on the supply and demand of creative content for use in training and will also be used to build a framework for quantifying the impacts of copyright reform on the CIs and AI sectors. We will also explore evidence gathering on the effectiveness of protection tools and standards and how these support CI competitiveness and access to data.
-
We will use the DSIT AI Sector Study to collect evidence on developers across the tech stack, and the extent to which copyright affects business activity. It will focus on areas of UK strength such as fine-tuning and application-layer development. This will be supported by stakeholder engagement, including with sector and industry bodies, to understand how developers source training data and deploy text and data mining (TDM) solutions in practice, and how they may respond to different policy options.
-
We will use the AI and Future of Work Unit to provide the best analysis and evidence on AI’s impact on the economy and labour market. Along with DSIT’s UK AI Adoption Survey and other surveys such as ONS’s BICS surveys, DSIT will track the pace and pattern of AI adoption across sectors, as well as survey the extent to which copyright affects AI adoption.
-
DSIT analysis based on OECD: Macroeconomic productivity gains from Artificial Intelligence in G7 economies (EN) ↩
-
This is illustrative analysis based on a range of previous general-purpose technologies growth, from railways (8%) to mobile telecoms (40%). This is an uncertain estimate, however according to the UK AI Sector Study, between 2022 and 2024, the UK AI sector grew at a compound annual rate of 79%. ↩
-
DSIT analysis applying the estimated sectoral growth rate to the AI Sector Study’s estimate of sectoral GVA of £11.8 billion in 2024. ↩
-
DCMS Sector Economic Estimates. Projections for Creative Industries GVA based on OBR forecasts and Creative Industries compound average growth rate for 2010–2023, in real terms 2022 prices. This figure is updated from the Creative Industries Sector Plan following updates to GVA estimates from 2023 to 2024. Using the same method this previously indicated an additional £40bn in GVA to 2035. ↩
-
Creative PEC (2024) Over the same period FDI is concentrated at the sub-sectoral level, with the IT, software and computer services sub-sector accounting for over 70% of projects. ↩
-
United Nations Trade and Development – global import data, 2013-2023 (latest available years), mapped to industry sectors using sector definitions used in: Department for Business and Trade and Department for International Trade (2023) – Global Trade Outlook - February 2023 report. USD converted to GBP using year average exchange rates, drawn from the ONS. Aggregate growth was calculated keeping USD/GBP exchange rates constant ↩
-
This is illustrative analysis based on a range of previous general-purpose technologies growth, from railways (8%) to mobile telecoms (40%). This is an uncertain estimate, however according to the UK AI Sector Study, between 2022 and 2024, the UK AI sector grew at a compound annual rate of 79%. ↩
-
AI Sector Study 2024. 2030 GVA is calculated by applying year-on-year growth rates (10-40%) to 2024 UK Dedicated and Diversified AI Sector GVA of £11.8 billion ↩
-
In this report we consider impacts in light of current evidence of unlicensed use of publicly available online works by AI developers, but do not comment on the legal status of this activity, which usually takes place in jurisdictions outside the UK. Copyright works made available online remain subject to copyright protection. ↩
-
House of Lords Communications and Digital Select Committee inquiry: Large language models. December 2023 ↩
-
The World Intellectual Property Organization (WIPO) identify 9 sectors of any global economy that they classify as core copyright sectors ↩
-
Centre for Regulation of the Creative Economy (CREATe) (2025) ↩
-
Cloudflare is introducing a pay-per-crawl system to charge crawlers at the point at which they access a site for scraping – this allows publishers to define a flat rate for access or more directly control the access of certain crawlers ↩
-
Andrea Bartz, Charles Graeber, and Kirk Wallace Johnson v Anthropic PBC: United States District Court, Northern District of California, No. C 24-05417 WHA ↩
-
For the purpose of this analysis, we have included the deals connected with the Financial Times who have their head office in the UK but are owned by Japanese-based Nikkei. ↩
-
The European Union AI Act: Regulation - EU - 2024/1689 - EN - EUR-Lex ↩
-
DCMS Sector Economic Estimates. Projections for Creative Industries GVA based on OBR forecasts and Creative Industries compound average growth rate for 2010–2023, in real terms 2022 prices. This figure is updated from the Creative Industries Sector Plan following updates to GVA estimates from 2023 to 2024. Using the same method this previously indicated an additional £40 billion in GVA to 2035. ↩
-
Stanford University Human Centered Artificial Intelligence, 2025 ↩
-
ONS Business Insights and impact on the UK economy (wave 147) ↩
-
The IT, software and computer services sub-sector is a significant driver of the overall CI picture, with adoption reported by 56% of businesses, compared to 47% in design and designer fashion, 39% in film, tv, video and photography, and 12% in music, performing and visual arts. ↩
-
Stanford University Human Centered Artificial Intelligence, 2025 ↩
-
Ben Cottier, Robi Rahman, Loredana Fattorini, Nestor Maslej, and David Owen. ‘The rising costs of training frontier AI models’, 2024 ↩
-
37% of the firms who found regulation to be a barrier to adoption (3% of all firms) stated copyright risk as a regulatory barrier. ↩
-
UK innovation diffusion and adoption survey – 28% of businesses stated regulation is an enabler compared to 19% who said it was a barrier. ↩
-
See previous explanation ↩
-
EPOCH.AI, 2025. Some UK based firms such as Google Deepmind have produced frontier models whilst the training took place internationally. ↩