ARTificial Intelligence II

I’ve just been watching ‘Sunday with Laura Kuenssberg’ and one of her guests was Baroness Beeban Kidron, a former film director who was appointed as a crossbench peer. She specialises in protecting children’s rights in the digital world, and is an authority on digital regulation and accountability.

The topic of conversation was Labour’s plan to change copyright law so that tech companies can scrape copyrighted work from the internet to train their generative AI tools free of charge, unless individual creatives decide to opt out. They are proposing what is, in essence, legalised theft.

Labour launched their AI Opportunities Action Plan two weeks ago. They intend the UK to become a world leader in AI, increasing the amount of computing power under public control twentyfold by 2030. They will achieve this through huge investments (£14bn provided by tech companies, of course!) in the infrastructure needed to create AI growth zones, a ‘supercomputer’ and the enormous, energy-intensive data centres necessary to support it. Clearly this will have a significant environmental impact, and it is at odds with Labour’s election promise to create a clean power system by the same date of 2030, a green target which some experts already thought would be difficult to meet.

Apparently, this will boost the UK’s economy by £470bn over the next 10 years. This may well turn out to be Starmer’s figure on the side of the bus. Kidron commented that the small print reveals the figure was sourced from a tech lobbying group paid for by Google, that it was arrived at by asking generative AI, and that, in any event, it reflects the global, not the national, uplift to the economy.

Kidron is a vocal supporter of an opt-in system rather than one which puts the onus on the individual creative to contact each of the AI companies using their work in order to opt out. In fact, opting out isn’t even technically possible at the moment. To this end, she has put forward amendments to the Data (Use and Access) Bill, which will be debated in the House of Lords this week. She has also previously commented in the press that she can’t think of another situation in which someone who is protected by law must actively wrap it around themselves on an individual basis. I think she makes a good point, and I agree with her view that the solution is to review the copyright laws and make them fit for purpose in an AI age. The creative sector, which includes artists, photographers, musicians, writers, journalists and anyone else who creates original content, is made up of about 2.4 million people and is hugely important to the country’s economy, generating £126bn. That money should be kept within the economy, not siphoned off to Silicon Valley.

Not surprisingly, there has been a great deal of backlash from creatives, including actors and musicians such as Kate Bush and Sir Paul McCartney, since Labour announced their plans. As part of the segment, Kuenssberg interviewed McCartney, who is very concerned about the effect this will have, especially on young, up-and-coming artists. He commented that art is not just about the ‘muse’; it is also about earning an income which allows the artist to keep on creating. He fears that people will simply stop creating because they won’t own what they create, and someone else will profit from it. He acknowledged that AI can also be a positive thing: the Beatles used it to clean up John Lennon’s voice from a scratchy cassette recording, making it sound as though it had been recorded only yesterday. Nevertheless, he is concerned by its ability to ‘rip off’ artists. He mentioned that there is a recording of him singing ‘God Only Knows’ by the Beach Boys. He never recorded the track; it was created by AI. He can tell it doesn’t quite sound like him, but a casual listener wouldn’t be able to tell the difference, and in a year’s time even he won’t be able to.

There is a petition which has been signed by over 40,000 creatives, and the Government is running a consultation which you can respond to with your comments online here. The consultation closes on 25th February.

So, what can we do in the meantime?

Short of going offline, which isn’t really an option, there is nothing that will guarantee our work is not used to train generative AI. Some cursory research (helpfully summarised, ironically, by Google’s AI Overview) turns up a few options: a visible watermark to protect images (not so good for promoting work) or one invisibly embedded in the image; digital signatures; or a ‘cloaking’ app such as Glaze, developed at the University of Chicago, which alters an image’s pixels to confuse the way AI sees it. Another of their apps, Nightshade, goes further and disrupts the match between an image and its descriptive text. The rough sketch below gives a flavour of what an invisible watermark involves.
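To make the idea of an invisible watermark a little more concrete, here is a minimal sketch in Python. It is not Glaze or Nightshade, just a simple least-significant-bit watermark that hides a short text message in an image’s pixels; the file names are hypothetical and it assumes the Pillow and NumPy libraries are installed. A watermark like this only survives lossless formats such as PNG, and a determined scraper could strip it out, so it is at best a way of marking ownership, not a cloak.

```python
# Illustrative sketch only - NOT Glaze or Nightshade.
# Hides a short text message in the least significant bits of the red channel.
from PIL import Image
import numpy as np

def embed_watermark(image_path: str, message: str, out_path: str) -> None:
    """Embed `message` invisibly in the image at `image_path`."""
    img = np.array(Image.open(image_path).convert("RGB"))
    bits = "".join(f"{byte:08b}" for byte in message.encode("utf-8"))
    red = img[:, :, 0].flatten()
    if len(bits) > red.size:
        raise ValueError("Message too long for this image")
    for i, bit in enumerate(bits):
        red[i] = (red[i] & 0xFE) | int(bit)   # overwrite the lowest bit only
    img[:, :, 0] = red.reshape(img[:, :, 0].shape)
    Image.fromarray(img).save(out_path, format="PNG")  # lossless, so the bits survive

def read_watermark(image_path: str, num_bytes: int) -> str:
    """Recover a message of `num_bytes` bytes hidden by embed_watermark."""
    red = np.array(Image.open(image_path).convert("RGB"))[:, :, 0].flatten()
    bits = "".join(str(red[i] & 1) for i in range(num_bytes * 8))
    return bytes(int(bits[i:i + 8], 2) for i in range(0, len(bits), 8)).decode("utf-8")

# Example usage (hypothetical file names):
# embed_watermark("original.png", "© 2025 My Name", "watermarked.png")
# print(read_watermark("watermarked.png", len("© 2025 My Name".encode("utf-8"))))
```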

For now, all I can do is change the privacy settings on my Facebook and Instagram accounts to prevent Meta from using data from my posts to train its own AI tools. I had to fill in a form explaining why I objected to them using the data, and I received email confirmation that they would honour my objection, but who’s to know whether they do or not? Apparently, there is a website, Have I Been Trained?, which lets you search for your work in LAION-5B, one of the most widely used AI image training datasets.

What’s possibly just as disturbing is the Government’s plan to allow big tech access to one of the biggest and most comprehensive datasets in the world: the NHS. It’s all in one place, and we each have an NHS number which links to a lifetime’s history of personal and health data. Access would supposedly be anonymised, but experts say that with enough data it is easy to re-identify people. No-one is doubting the incredible possibilities AI offers for delivering healthcare, but proper safeguards are needed.

Anyway, I asked the WordPress AI to generate a header image based on its own prompt:

“Create a high-resolution, highly detailed image illustrating the theme of digital rights and AI regulation. Feature Baroness Beeban Kidron in a thoughtful pose, surrounded by symbols of creativity such as art supplies, musical instruments, and books. The backdrop should convey a digital landscape, with elements representing technology and copyright, like binary code and padlocks. Use soft, natural light to evoke a sense of seriousness, yet hopefulness. The image should be in a documentary style, capturing the urgency of the conversation about protecting creatives’ rights in the age of AI. Ensure sharp focus to highlight the intricate details in each element.”

Sorry, Jonathan – I will switch my mobile phone off for the rest of the day so I don’t have to recharge the battery, but, in the meantime, do we have much to worry about? It doesn’t even look like Beeban Kidron.
