The UK’s legislative back-and-forth over AI and copyright reached a milestone in June 2025, when the Data (Use and Access) Bill was finally waved through, though without the AI copyright transparency provisions so loudly championed by the House of Lords and the creative industries.
The Data (Use and Access) Bill had been caught in a legislative “ping-pong” between the House of Lords and the House of Commons for months. Peers, in particular Baroness Kidron, had consistently argued for amendments that would force AI developers to disclose the copyrighted works used to train their models.
This agenda was supported by some big names, with high-profile creatives such as Sir Paul McCartney and Sir Elton John warning that their work could be taken and used by generative AI systems without permission or due compensation.
Despite major victories for the Lords in earlier votes, the Commons rejected the transparency measures time and time again. The government’s position was that it wanted to complete its ongoing consultation on AI and copyright, which closed in February 2025, before introducing significant legislative reform.
Tech industry voices, including Google and techUK, argued that broad transparency requirements could stifle AI innovation in the UK and even force companies to disclose trade secrets.
On 11 June 2025, the House of Lords reluctantly accepted this, allowing the Data (Use and Access) Bill to proceed to Royal Assent. As a concession, the government pledged to publish a report on its plans for copyright and AI within nine months of the Bill coming into force.
The result has left many in the creative community feeling that their voices were heard, but that more concrete protections around transparency and compensation remain, for now at least, some way off.
In the meantime, a legal fight is playing out outside Parliament. One of the High Court’s first Big Tech trials of the summer kicked off in early June, as Getty Images squared off against artificial-intelligence company Stability AI over what Getty called the “brazen infringement” of its copyrighted photography to train the Stable Diffusion image generator.
The high-profile case will test how existing copyright law applies in the AI era, and could have a far-reaching impact on future commercial negotiations and content licensing deals. Together, these debates and legal actions are a stark reminder of the shared challenge: developing a clear, balanced set of standards that protects creators while remaining fair to AI innovators.