Rogue Drones and Tall Tales


Sam Altman, CEO of OpenAI, wants you to know that everything is super. How has his world tour gone? "It's been super great!" Does he have a mentor? "I've been super fortunate to have had great mentors." What's the big threat he's worried about? Superintelligence.

Altman's whistle-stop visit to London in late May was a chance for adoring fans and sceptics alike to hear him answer some carefully selected and pre-approved questions on stage at University College London. The queue for ticketholders stretched right down the street. For OpenAI, the trip to the UK was also a chance for Altman to meet Rishi Sunak, the latest in the list of world leaders to listen to the 38-year-old tech bro.

Prior to December last year, OpenAI wasn't on the public radar at all. It was the release of ChatGPT that changed all that. Its large language model became the hottest software around. Students delighted in it. Copywriters panicked. Journalists inevitably turned to it for an easy 200-word opening paragraph to show how convincing it was. Then came the existential dread.

Superintelligence has long been the stuff of sci-fi. It still is, but somehow the past few months have seen it treated as imminent, despite the fact that we aren't anywhere near that point and might never be. A cynic might wonder whether a Silicon Valley tech company has a vested interest in maintaining its lead by calling for a moratorium on AI progress. Each week seems to bring yet another letter calling for a halt to development, signed by the very people who make the technologies. Where was this concern earlier, as they were building them?


Not everyone is convinced of the threat. There is vocal pushback from numerous other researchers who question the fearmongering, the motivation, and the silence on the AI issues already on the ground today: bias, uneven distribution, sustainability, and labour exploitation. But that doesn't make for good clickbait. Instead, we see headlines so doom-laden that they could've been generated with the prompt: "write a title about the end of the world via an evil computer".

Columnists, some of whose knowledge of technology comes from having watched The Terminator in the 80s, were quick to pontificate about the urgent need for global action right now, quick as you can, before the robot uprising.

In early June, most of the dailies were carrying the story that an AI-enabled drone had killed its operator in a simulated test. This was based on an anecdote by a colonel in the US Air Force, who had stated that, in a simulation, "the system started realising that, while they did identify the threat, at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective."

Nice tale. Shame it wasn't true. A retraction followed. But it's a good example of AI's alignment problem: if we don't properly phrase the commands, we run the risk of Mickey Mouse's panic over the unstoppable brooms in The Sorcerer's Apprentice. The (fictional) problem is not a sentient drone with bad intentions; the problem is that we, the human operators, have given an order that is badly worded. That's a tale we've told for years, right back to the Ancient Greek myth of King Midas: when offered a reward, Midas asked that everything he touch be turned to gold, but he wasn't specific enough, so his food and drink turned to gold too and he died of hunger. That tale has as much truth in it as the rogue drone one, but it shows we've been worrying about this for over 2,000 years.
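To make the badly worded order concrete, here is a deliberately simple toy sketch in Python. The scores, function names and scenario are entirely made up for illustration and model no real system; the point is only that a reward which counts nothing but "threats destroyed" makes defying the operator the highest-scoring move, while adding a penalty for ignoring a veto removes that incentive.

    # Toy illustration of a misspecified objective (hypothetical scores, no real system).

    def naive_reward(destroyed_threat: bool, obeyed_veto: bool) -> int:
        # Points only for destroying threats; the operator's veto carries no weight,
        # so a score-maximising policy has no reason to respect it.
        return 10 if destroyed_threat else 0

    def better_specified_reward(destroyed_threat: bool, obeyed_veto: bool) -> int:
        # Same goal, but ignoring the operator now costs more than a strike is worth.
        reward = 10 if destroyed_threat else 0
        if not obeyed_veto:
            reward -= 100
        return reward

    # Compare the two specifications when the operator vetoes a strike.
    for reward_fn in (naive_reward, better_specified_reward):
        obey = reward_fn(destroyed_threat=False, obeyed_veto=True)
        defy = reward_fn(destroyed_threat=True, obeyed_veto=False)
        best = "obey the veto" if obey >= defy else "defy the veto"
        print(f"{reward_fn.__name__}: obey={obey}, defy={defy} -> best move is to {best}")

Under the first scoring rule, defiance wins; under the second, obedience does. Nothing in either case requires a sentient machine, only a carelessly written objective.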


The rogue drone story is also a good example of the deceptive and hyperbolic headlines rolled out on a regular basis, pushing the narrative that AI is a threat. News framing shapes our perceptions; done well, it's an important contribution to public understanding of the technology, and we need that. Done badly, it perpetuates the dystopia.

We do need regulation around AI, but the existential risk from superintelligence shouldn't be the reason. The UK government's national AI strategy specifically acknowledges that we have a responsibility not only to look at the extreme risks that could be made real with AGI, but also to consider the dual-use threats we already face today. But the latter are the stories that aren't being told.

Missing, too, are the headlines about the harms already here. Bias and discrimination as a result of technologies such as facial recognition are already well known. In addition, companies are outsourcing the labelling, flagging and moderation of data required for machine learning, which has resulted in the largely unregulated employment of poorly paid "ghost workers", often exposed to disturbing and harmful content such as hate speech, violence and graphic images. It is work that is vital to AI development, but it's unseen and undervalued.

Likewise, we choose to ignore the fact that many of the components used in AI hardware, such as magnets and transistors, require rare earth minerals, often sourced from countries in the Global South under hazardous working conditions. There are significant environmental impacts too, with academics highlighting the 360,000 gallons of water needed daily to cool a mid-sized data centre.

If the UK government want to show they're serious about the responsible development of AI, it's okay to keep one eye on the distant future, but there's work to be done now on real and tangible harms. If we want to show we're serious about an AI future, we need to focus on the present.

