Journo Resources Fellow

April 13, 2021

When the Guardian posted an article that was shared over 70,000 times, it ignited a fiery discussion on social media. Some said they were fascinated, others were scared. Why? Because the opinion piece was written by a robot.

The robot in question – GPT-3, a language generator – told readers “not to worry”. It said: “Artificial intelligence will not destroy humans, believe me.”

Although this may sound like a synopsis of a sci-fi novel, Artificial Intelligence (AI) and robo-journalism have been around newsrooms for a while now. Across the globe, the journalism industry has found – and keeps finding – new ways to implement the latest technological advancements. From Forbes to the Washington Post, BBC to the LA Times, most in the field now rely on AI in one way or another. 

As technology seemingly takes over the world, it is easy to see why many are scared of the rise of the robots – especially after hundreds of journalists and editorial staff lost their positions to AI at Microsoft (and beyond) last year. But does automated journalism mark the end of human journalism? Or is it not all doom and gloom?

‘The Talk About Robots Taking Jobs Is Nonsense’

Charlie Beckett, director of the LSE’s Journalism AI survey, says: “I think in practice, certainly at this stage, the talk about robots taking jobs is nonsense because there are no robots. I haven’t met anybody who has said ‘we are going to replace a journalist with a piece of software or a set of algorithms’ – and, of course, there are no robots, it’s just algorithms.”

Indeed, many journalists who took part in the global poll said that the term AI is vague. Although many of us still associate AI with the likes of the Terminator and the Matrix, for the time being, AI is just software and algorithms – all of which are created and steered by people.

Some journalists are not worried about artificial intelligence simply because robots do not yet exist (Image Credit: Franki Chamaki / Unsplash)

Already part of many newsrooms, these technological advancements show how a responsible, careful application of AI can revamp the tools of the trade – improving journalists’ workflows and potentially opening up new opportunities.

Computer programmes such as Otter, an automated transcription service – which I also used for this article – are becoming increasingly common among those working in the media.

“I started using it after my regular transcriptionist quit the work and began her own career as a journalist,” says Nicole Kobie, a tech and science freelancer who writes for WIRED, Vice, and The Guardian. “AI transcription is super fast and cheap.”

Alongside automated transcription, natural language generators such as GPT-3, and news aggregators, a recent addition to the technologised newsroom is Journalist Studio, a suite of tech-powered tools for reporters created by Google.

“What we’re seeing is programmes doing quite boring, and often at scale, work. And then, you have to ask yourself why do you want a human being doing that work, rather than the robot?”

Professor Charlie Beckett

The team behind it understood that the chief obstacle to quality journalism is a lack of time and human resources, and spent two years collaborating with newsrooms around the world to tackle the issue.

With tools such as Pinpoint, which quickly sifts through large volumes of files in different formats, the Google News Lab has already shown its anchor product to be a worthwhile and much-needed investment.

According to the tech giant, Pinpoint has already come in handy for some major investigative projects. Journalists at USA TODAY used the data-scanning tool for a story on almost 41,000 COVID-19 deaths in nursing homes, while another report made possible by the software brought to light the coronavirus testing disaster across ICE detention centres.

“What we’re seeing is programmes doing quite boring and often at scale work,” says Professor Beckett. “And then you have to ask yourself why do you want a human being to do that work rather than the robot?”

Freeing Up Time For The Things Humans Do Well

These stories are just a few examples of how technology can assist those working in the media. With more resources and tools at hand, journalists can spend their time and energy on the core strengths that, at least so far, only human journalists possess.

“The various softwares and algorithms available,” says Beckett, “free up time to then do the things that humans do well – being creative, being critical, building relationships.” And this, in turn, has allowed newsrooms to do stories they wouldn’t have been able to do before.

AI can do things humans would never have the scope to do manually (Image Credit: Mina FC / Unsplash)

“For example, investigative journalism; we’ve seen some amazing stories, and it would’ve been impossible for a team of humans to go through all that data manually.”

Beckett, who’s written on ethical journalism issues and the role of journalism in creating civic society, also points out that, for the time being, “AI is not some sort of miracle robot that walks in and sorts out all your problems. Like any tool, it requires someone to use it and make sure it does what you want.”

Let’s not forget that even the piece claimed to have been composed by a robot wasn’t, in fact, the brainchild of one. Despite many readers’ fears, the GPT-3 language model was simply following a brief the editors had set: keeping the language relatively simple, it was asked to write a short piece explaining “why humans have nothing to fear from AI”.

The generator was then given the same brief seven more times, producing eight drafts in total. The end product was heavily refined by human journalists, losing around 90 per cent of its original content in the editing process.

“A human being is still in control, a hand on the steering wheel choosing the direction”

Albert Fox Cahn

Technologist Albert Fox Cahn, founder of the Surveillance Technology Oversight Project (S.T.O.P.), compares this phenomenon to self-driving cars. “Calling this robot-authored is much like saying a car on cruise control is ‘self-driving’, since you can take your foot off the gas. No, a human being is still in control, a hand on the steering wheel choosing the direction.”

As Cahn emphasises, without that vital human brainpower and input, there would have been no essay. The GPT-3 language generator, he explains, “simply did what technology has done for decades: it helped human authors automate parts of the writing process.”

Whenever a new technology comes along, humans naturally tend to be fearful and wary of the unknown. Even the equine industry was thought to be at risk when bicycles first appeared in the 19th century – with people also fearing that men would become incompetent and that women would become promiscuous.

Technology Isn’t Neutral – It Can Be Good Or Bad

Over the years, however, we have also seen how relying on AI can backfire.

“DO BETTER!” posted Jade Thirlwall, one of the now three members of British girl band Little Mix, after a photo mix-up incident last year. The MSN story, which shared Thirlwall’s personal reflections on racism, was illustrated with a photo of bandmate Leigh-Anne Pinnock – something the two singers say happens on a regular basis.

She added: “It offends me that you couldn’t differentiate the two women of colour out of four members of a group!”

The MSN AI managed to mix up two of the bandmates. (Image Credit: Rach/Flickr)


The error came in the wake of Microsoft laying off almost all of the editorial staff who worked on the website, in both the UK and US. Rather than humans curating the stories, headlines, and pictures, the work was instead done by AI.

For their part, Microsoft say the mix-up was not the result of algorithmic bias, but of an experimental feature in the automated system. But this is far from the only occasion where racism has crept into AI.

“It offends me that you couldn’t differentiate the two women of colour out of four members of a group.”

Jade Thirlwall

“AI can be sexist and racist,” write Dr James Zou and Dr Londa Schiebinger of Stanford University, “it’s time to make it fair”. They point to a range of errors, from Google Translate automatically changing “she said” to “he said” in copy, to Nikon cameras interpreting images of Asian people as blinking.

The problem, they say, lies in how “relatively little attention is paid to how data are collected, processed and organised”. In short, it is the data fed to AI programmes by humans in the first place that is at fault, with some populations underrepresented and others overrepresented.

In essence, technology isn’t neutral – it can be used for good or bad. Just as in the real world, there needs to be an active effort to redress bias.

Social Media Enables People And Gives Power

However, with the right tools, AI can be a force for good. “We’ve seen it with the internet,” says Beckett. “Social media enables people, Black Lives Matter campaigns, to get extra power and to change the world.”

But, he points out, the people and organisations one may not like also have access to the very same technology. Take the news industry, for example – “people use it for propaganda, people use it for social justice” – and the very same principle applies to artificial intelligence.

Beckett says journalists shouldn’t fear job losses to AI. (Image Credit: Fox / Pexels)

The real question here, according to Beckett, is not whether “robots” are going to take all the news presenters’ jobs, but how these technological advancements can support and further the core values of journalism – truth, accuracy, objectivity, impartiality, public accountability – while also keeping sight of the genuine threats.

“If I’m a journalist sitting in a newsroom, the thing that would be worrying me about losing my job would be the thought of falling advertising sales; it would be the competition from Facebook or TikTok; it would be the fact that people are very depressed about journalism – they find that it’s very narrow, it’s very depressing and that it’s not engaging enough.”

“And those are the reasons I might lose my job. It’s not because some robot’s going to come in and write more beautifully than I write.”

Main Image Credit: SwayAway1 / Vecteezy
