It is becoming increasingly clear that artificial intelligence (AI) will be the defining technological advance of this decade – if not century. In a new White Paper the Government has said it wants to make it easier for tech businesses to grow, innovate and create jobs in the UK – yet its approach to intellectual property risks making this country one of the least attractive places in the developed world for many sectors of the AI industry.
AI and IP are intrinsically linked
Intellectual property, and in particular copyright and database rights, matters greatly for AI. Whether it is machine learning on materials scraped from the internet (as with ChatGPT or medical research), data mining articles sourced from scientific journals or, as Google did, digitising some 10m in-copyright books to improve its algorithms across multiple products, AI frequently involves the intellectual property rights of others. How we regulate IP therefore really matters for British businesses.
What may surprise many is that most AI techniques that involve use of third party information without permission from the copyright owner are illegal in the UK if done for commercial gain. That’s an awful lot of website owners, publishers and others to seek permissions from in an era of Big Data. By contrast, companies based in the US, Israel, Japan, Singapore, Taiwan, South Korea and elsewhere can develop AI using third party materials without needing to pay for additional licences, safe in the knowledge that, as long as they have legitimate access in the first place, much of what they are doing is enabled by their innovation-friendly copyright laws.
As it stands, however, undertaking AI in the UK for non-commercial purposes is lawful. This is because in 2014 the UK government became the second in the world to introduce specific copyright laws allowing machine learning if done on a not-for-profit basis. Realising that seeking licences from potentially hundreds of thousands, if not millions, of website owners – or renegotiating licences with database providers such as scientific publishers, to which research organisations were already paying significant sums – was going to put data-driven research and innovation in the deep freeze, it did away with the requirement to seek or reseek permission.
The reason businesses were not included in the 2014 amendment was EU law, which imposes an artificial and increasingly criticised distinction between commercial and non-commercial research. When it was announced in 2022 that the Government would extend the existing AI exception to cover UK businesses, it therefore felt not like hyperbole, but like a real and genuine Brexit benefit. Since then, however, we have seen climbdown after climbdown, until in February 2023 the Government announced it was scrapping its plans entirely, justifying the decision with reference to concerns about ‘creators’.
A recipe for irrelevance
The result of this volte-face has been the disappearance of a commitment to updating the Copyright Act. In the Government’s response to the ‘Pro-innovation Regulation of Technologies Review’ (the Vallance Report) there is reference to a ‘code of practice’, which is hardly likely to excite investors in search of legal certainty. There is also very little reassurance in the assertion that companies committing to it can ‘expect to…have a reasonable licence offered by a rights holder in return’.
Further contradictions and questions arising from this response abound. Where a reasonable licence is not forthcoming, implicitly the Government suggests it will intervene in companies’ licensing practices to ensure one becomes available. But this is neither feasible nor credible, given the huge range of sources of data – including, of course, from around the world – used for machine learning.
Where does this leave AI companies and others who make their money from analysing the internet, where every page contains terms and conditions? Must they seek permission? What about the universities, charities and bioscience companies that already pay millions year on year to access scientific databases – will they have to renegotiate and pay more to do machine learning if that work, today or tomorrow, could have commercial relevance?
In short, it appears that, unlike our competitors in the US, Israel and East Asia, if you are a UK-based company you will from now on have to seek permission to use copyright works you already have lawful access to – even when your AI and its outputs are entirely non-competing, are deployed in a different market or in no way resemble the works on which they were trained.
Anyone who has experience as a licensee can tell you that if the UK government continues on this path, the transactional overheads and innovation penalty on both British businesses and scientific research will be immense. No doubt it will force some businesses wishing to maximise access to training data while minimising their liabilities to look to friendlier AI jurisdictions.
From an IP perspective not only will the UK’s AI exports be uncompetitive, but exports to the EU will potentially be particularly problematic. The EU’s AI Act will require access by the competent authorities to the training data of products and services deemed ‘high-risk’. How does a UK company comply with this if the licence, as it no doubt will, prevents sharing data with a third party, or requires that training data is destroyed immediately or when out of licence?
The wrong end of the telescope
Worryingly, this change of heart appears to be the direct result of lobbying from the music industry. Indeed, in the Government’s response to the Vallance Report, the section dealing with IP made no reference to the huge economic and social benefits that AI will bring. Instead there was a preoccupation with the needs of the creative industries. It seems that the needs of startups, researchers, technology companies, universities, the environment and the NHS are of little concern compared to those of the entertainment industry. This cannot be right.
Much of the fuss has focused on the assertion that AI outputs could too closely imitate and so compete with original creations. Yet copyright law for centuries has dealt with infringements where the work of one artist is similar to that of another. Generative AI does not change this – there is nothing new here. Requiring licensing or relicensing of materials that companies already have access to (with all the transaction costs that come with this) makes little economic sense – especially when our international competitors won’t have the same requirements.
If the Government wants to help the entertainment industries, it should do so by supporting them to build and sell access to data trusts. Given the high cost of data cleansing, there is real value in building such tools, meaning that businesses will be ready to pay, thereby returning value to creators. Once a company has paid to access the data for training its AI models, it should be free to develop products and services as it wishes. The one caveat is where it develops generative art that is substantially similar to the work of another creator. This is the defined space where (re)licensing for the creation of generative art has a legitimate role, and it is what copyright law is there to protect.
Make or break time for UK AI
In the AI White Paper, the Government says it wants ‘to build the UK’s capabilities in foundation models, including large language models’. This won’t be possible at any scale without amending copyright law in a way that makes sense.
What’s more, the White Paper states a desire ‘to build public trust in cutting-edge technologies’. Yet everyone knows that less data to train an AI model (owing for example to licensing overheads) means poorly trained algorithms which can feed into poor algorithmic decision making and all that implies. From a purely economic perspective, it will mean inferior products and services, and lower competitiveness for homegrown UK AI players.
AI is a general purpose technology that can bring with it huge economic and societal benefits. Yet, the UK government appears to be looking at our AI IP strategy through the narrow lens of one comparatively small and tangential industry. It must stop and concentrate on the bigger picture. We need a vibrant AI ecosystem where UK-based tech companies, scientists and universities are left to innovate and compete on a level playing field with our competitors abroad. To make it easier to innovate and grow, businesses need less regulation and more flexibility in our IP system. Without an urgent rethink, the Government risks delivering the opposite.