UK music industry needs urgent guidance on AI training rules

Partner Simon Goodbody and Associate Jonathan Coote have written for The Times, calling for clarity in the government’s policy on training AI models on copyrighted works.

Simon and Jonathan’s article was published in The Times on 28 March 2024 and can be found here.

On 13 March, the world’s first major AI regulation was passed (resoundingly) in the EU. The UK, however, is stuck between a pro-innovation and pro-rightsholder approach, leaving it seemingly unable to implement effective regulatory control. In limbo, the music industry, which generated £6.7bn for the UK economy in 2023, desperately needs clarity on the law relating to the training of AI tools on copyright works.

Although untested, the current statutory framework in the UK suggests that the training of AI tools on copyright works for commercial purposes is prohibited. However, in 2022, the government controversially announced plans to provide a complete exception for training. This about-turn would have created one of the world’s most lenient regimes. Despite aligning with the UK’s pro-tech National AI Strategy, the proposed exception was soon withdrawn after being described as “music laundering” by rightsholders.

The government then established a working group to develop a system for AI companies to obtain a “fair licence”. Perhaps unsurprisingly, given the diametrically opposed interests of tech and rightsholders, these talks failed early this year. So, it seems, we are back at square one.

In the absence of statutory clarification, tech companies and rightsholders are heading to court on both sides of the Atlantic. Most notably in England, stock-image site Getty Images has sued Stability AI for training its models on Getty’s massive image library. Whilst the UK’s prohibition on the training of AI tools on copyright works seems clear, establishing that infringing activity has taken place is complex. For example, proving that infringing acts occurred in the UK is challenging and, as Stability AI’s training data is not public, Getty had to reverse-engineer prompts to demonstrate that its works were used (sometimes with comical results). Stability AI has failed to strike out the case, but the strength of Getty’s arguments is uncertain.

We are now in a race for legislation to arrive before judgment is handed down. Without a change in the law, the case could set an unhelpful precedent based not on the balance of creative and economic interests, but on its particular facts and on law written before the advent of large language models and generative AI.

The music industry is especially concerned. Contrary to expectation, there is no specific right in the UK to prevent the use of voice clones of artists, and AI-generated music is already undercutting human-made music as it is mass-uploaded to streaming platforms. The most effective way to tackle these issues is for rightsholders to control the training of AI tools on their works and to get paid for it.

There is progress, with an All-Party Parliamentary Group investigating and a private member’s bill in the Lords, but the government urgently needs to act to inject some clarity into the rights position.
