Artificial Intelligence (AI) is, at its simplest, the capacity of a digital system to perform tasks that would otherwise require human intelligence. It can be as simple as the software that monitors the products and social media posts you engage with online and recommends similar content.
More recently, complex AI software capable of imitating human writing and reasoning, and of producing photorealistic images, has become widely available to the public. This has sparked intense interest in what it means for content producers, publishers, and the many employees whose roles the technology may affect.
For AI to develop to its most advanced stages, it requires machine learning. Much like a human, AI has to identify patterns and themes in examples before it can reproduce them. This requires human input, meaning that while AI may be impressive at combining or imitating what it has learned from humans, there is a limit to how truly creative it can be. AI is only as good as the information it is trained on: the old computing adage “rubbish in, rubbish out” applies.
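To illustrate that adage concretely, here is a minimal sketch, assuming the scikit-learn and NumPy libraries (the dataset, model, and 40% noise level are arbitrary choices for demonstration), that trains the same simple model twice: once on clean labels and once on deliberately corrupted ones.

```python
# Minimal "rubbish in, rubbish out" demonstration: the same model,
# trained on clean vs. deliberately corrupted labels.
import numpy as np
from sklearn.datasets import make_classification
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

# A synthetic, clearly patterned dataset stands in for "good information".
X, y = make_classification(n_samples=2000, random_state=0)
X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Model trained on clean labels.
clean_acc = LogisticRegression(max_iter=1000).fit(X_train, y_train).score(X_test, y_test)

# "Rubbish in": randomly flip 40% of the training labels.
rng = np.random.default_rng(0)
flip = rng.random(len(y_train)) < 0.4
noisy = np.where(flip, 1 - y_train, y_train)
noisy_acc = LogisticRegression(max_iter=1000).fit(X_train, noisy).score(X_test, y_test)

print(f"accuracy with clean labels: {clean_acc:.2f}")  # typically around 0.9
print(f"accuracy with noisy labels: {noisy_acc:.2f}")  # noticeably lower
```

The model itself is unchanged between the two runs; only the quality of its training data differs, and the quality of its output degrades accordingly.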
Developers are already using online content to train AI, regardless of whether the content sits behind a paywall or is covered by copyright protections. In March 2023, the computing website Tom’s Hardware reported that Google’s Bard AI was reproducing its content. Several PPA members have reported similar adverse effects on the use and value of their content. Beyond existing copyright law, there is little protection for publishers, and the PPA is calling on the Government to strengthen it.
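Publishers can ask crawlers to stay away via a robots.txt file, but such directives are advisory rather than legally enforceable, which is part of the gap described above. As a minimal sketch using Python’s standard library: “CCBot” is Common Crawl’s real crawler (Common Crawl’s corpus is widely used for AI training), while the site and paths here are hypothetical.

```python
# How a well-behaved crawler reads a publisher's robots.txt opt-out.
# Nothing technically or legally forces a crawler to honour it.
from urllib import robotparser

ROBOTS_TXT = """\
User-agent: CCBot
Disallow: /

User-agent: *
Disallow: /subscribers/
"""

rp = robotparser.RobotFileParser()
rp.parse(ROBOTS_TXT.splitlines())

# Common Crawl's bot is asked to stay out of the whole site...
print(rp.can_fetch("CCBot", "https://example-magazine.co.uk/features/ai"))
# -> False

# ...while other crawlers are only asked to avoid the paywalled area.
print(rp.can_fetch("OtherBot", "https://example-magazine.co.uk/features/ai"))
# -> True
print(rp.can_fetch("OtherBot", "https://example-magazine.co.uk/subscribers/issue-42"))
# -> False
```

Compliance is voluntary: a crawler that ignores the file faces no technical barrier, which is why publishers are looking to copyright law and transparency requirements instead.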
Some news publishers have used AI to generate basic content for almost a decade. The Associated Press began using AI to generate financial reports as early as 2014. Since then, other outlets like the Washington Post and Reuters have developed AI writing technology.
Until recently, this technology was proprietary. Now, tools like ChatGPT have made AI content-generation software widely available. Online news publishing already features many content farms, and there is a growing risk that AI-generated images and content will spread misinformation.
In April 2023, German entrepreneurs Arian Okhovat and Jörg Salamon launched Panta Rhai: a 136-page magazine whose text and illustrations were generated by AI in five days. Other publishers remain more sceptical of publicly available AI content-generation software. Neil Clarke, editor of US-based speculative fiction magazine Clarkesworld, suspended reader submissions in early 2023 after a deluge of AI-generated content. Clarke told CBC: “ChatGPT-3 was producing some of the worst stories we’ve ever seen in 17 years.”
AI is also changing advertising. According to Forbes, combining AI with Big Data (the vast quantities of data generated by the growth of digital technology) could automate almost 80% of all physical work, 70% of data-processing tasks, and 64% of data-collection tasks.
HubSpot’s State of AI survey found that 72% of marketing and business professionals agree AI tools can help them surface insights in data that they could not otherwise find. Larger brands like Coca-Cola have used data gathering and AI analysis to support their marketing for several years.
Publishers may soon find that AI-driven analysis makes advertisers quicker to place, or withdraw, advertisements. However, it may be a long while before AI can imitate the creativity required to create distinctive, memorable print adverts.
There is no one body of UK law governing AI. The Government is seeking to develop a more comprehensive framework: in 2023 alone it has published a policy paper for consultation, begun to assemble a £100m Foundation Model Taskforce, and announced that Britain will host a global summit on AI safety.
The PPA is disappointed by the relative lack of detail on publishers’ rights in the latest Government AI policy paper. The framework explicitly states that it “does not seek to address all of the wider societal and global challenges that may relate to the development or use of AI. This includes issues relating to access to data, […] as well as the balancing of the rights of content producers and AI developers.” The framework rightly falls back on the existing legal framework for copyright, which we believe can be applied in the context of AI.
However, there is currently no clear regulatory pathway for enforcing transparency provisions that would give publishers the information they need about the extent and nature of AI data scraping in order to enforce their copyrights. Specialist publishers are already suffering from AI data scraping. This simply isn’t good enough.
The framework identifies risks in the current confused regulatory landscape. It also recognises that AI regulators must have transparency from AI systems. It proposes guidance and technical standards as a means of moving towards transparency.
The Intellectual Property Office (IPO), the official UK Government body for IP, has been developing a code of practice on copyright, in consultation with the PPA, which addresses the issue of transparency. The PPA believes this code of practice should be put on a statutory footing, and that the Government must give the IPO additional enforcement powers to ensure the companies that own AI systems adhere to it. If the Government does not fill this regulatory gap, cases will have to be settled by private litigation, which is slow and burdensome, especially for smaller publishers.
The PPA has also been calling on the Government to put the Digital Markets Unit (DMU) within the Competition and Markets Authority (CMA) on a clear statutory footing to help tackle abuses of AI. We believe the framework errs in failing to name the DMU as a key regulator; at the same time, the current Digital Markets, Competition and Consumers Bill fails to name AI among the DMU’s responsibilities.
The Digital Markets, Competition and Consumers Bill does, however, allow the CMA, through the DMU, to require tech companies to “provide clear, relevant, accurate and accessible information about the relevant digital activity to users or potential users”, ensuring a degree of transparency. It also allows the DMU to require platforms to explain planned “changes in relation to the relevant digital activity where those changes are likely to have a material impact on the users or potential users”.
This could affect changes in, for example, search or feed prominence. These requirements may help publishers find out how AI systems are using their content. They could also help publishers access large tech companies’ data at a level granular enough to match their own user data. The CMA can launch an inquiry into a tech company’s actions under these terms, which may result in an enforcement order. We support moves to ensure that the CMA, through the DMU, best reflects citizens’ interests.
Speaking in June 2023, Labour leader Sir Keir Starmer promised “an overarching regulatory framework” that would be “stronger” than the Government’s proposals, though Labour is yet to provide detail on how this would be achieved. On 11 July 2023, the Shadow Secretary of State for Work and Pensions, Jonathan Ashworth, highlighted AI’s potential to help jobseekers, while the then Shadow Secretary of State for Digital, Culture, Media and Sport, Lucy Powell, warned that workers could end up “on the wrong side of biased algorithms and robot firing”. In an interview with the Guardian the previous month, Powell had called for a licensing regime for AI systems, citing nuclear power and medicine as sectors with exemplary regulatory frameworks.
Other countries are yet to make major changes in regulation but are consulting on options.
Without government intervention, the unlawful use of content by AI systems will threaten the very existence of publishers. Through our response to the House of Lords Communications and Digital Committee’s inquiry into large language models, and through our drumbeat of lobbying, we are calling on the Government to stand up for the specialist publishing sector.
If you have any questions or wish to discuss this further with our public affairs team, please contact our Policy and Public Affairs Manager at Eilidh.wilson@ppa.co.uk.