Can the planet really afford the exorbitant power demands of machine learning? – The Guardian

There is, alas, no such thing as a free lunch. This simple and obvious truth is invariably forgotten whenever irrational exuberance teams up with digital technology in the latest quest to change the world. A case in point was the bitcoin frenzy, where one could apparently become insanely rich by mining for the elusive coins. All you needed was to get a computer to solve a complicated mathematical puzzle and lo! you could earn one bitcoin, which at the height of the frenzy was worth $19,783.06. All you had to do was buy a mining kit (or three) from Amazon, plug it in and become part of the crypto future.

The only problem was that mining became progressively more difficult the closer we got to the maximum number of bitcoins set by the scheme, and so more and more computing power was required. Which meant that increasing amounts of electrical power were needed to drive the kit. Exactly how much is difficult to calculate, but one estimate published in July by the Judge Business School at the University of Cambridge suggested that the global bitcoin network was then consuming more than seven gigawatts of electricity. Over a year, that's equal to around 64 terawatt-hours (TWh), which is 8 TWh more than Switzerland uses annually. So each of those magical virtual coins turns out to have a heavy environmental footprint.
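For readers who like to check such figures, the conversion from a continuous power draw to annual energy use is simple arithmetic. A minimal sketch, assuming a steady draw of about 7.3 GW (a figure consistent with the Cambridge estimate, not quoted directly in it):

```python
# Back-of-the-envelope check: a continuous draw in gigawatts,
# sustained for a year, expressed in terawatt-hours.
HOURS_PER_YEAR = 365 * 24  # 8,760 hours

draw_gw = 7.3  # assumed steady draw of the bitcoin network
annual_twh = draw_gw * HOURS_PER_YEAR / 1000  # GWh -> TWh

print(f"{annual_twh:.0f} TWh per year")  # roughly 64 TWh
```

Which is how "more than seven gigawatts" becomes "around 64 terawatt-hours" over a year.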

At the moment, much of the tech world is caught up in a new bout of irrational exuberance. This time, it's about machine learning, another one of those magical technologies that "change the world", in this case by transforming data (often obtained by spying on humans) into, depending on whom you talk to, information, knowledge and/or massive revenues.

As is customary in these frenzies, some inconvenient truths are overlooked, for example, warnings by leaders in the field such as Ali Rahimi and James Mickens that the technology bears some resemblance to an older speciality called alchemy. But that's par for the course: when you've embarked on changing the world (and making a fortune in the process), why let pedantic reservations get in the way?

Recently, though, a newer fly has arrived in the machine-learning ointment. In a way, it's the bitcoin problem redux. OpenAI, the San Francisco-based AI research lab, has been trying to track the amount of computing power required for machine learning ever since the field could be said to have started in 1959. What it's found is that the history divides into two eras. From the earliest days to 2012, the amount of computing power required by the technology doubled every two years; in other words, it tracked Moore's law of growth in processor power. But from 2012 onwards, the curve rockets upwards: the computing power required for today's most-vaunted machine-learning systems has been doubling every 3.4 months.
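The difference between those two doubling periods is easy to underestimate. A small illustrative calculation (the doubling times come from the article; the two-year window is chosen just for comparison):

```python
# Compound growth over the same 24-month window under the two regimes
# reported by OpenAI: doubling every 2 years versus every 3.4 months.
months = 24  # a two-year window

moores_law_growth = 2 ** (months / 24)   # doubles once: 2x
post_2012_growth = 2 ** (months / 3.4)   # doubles ~7 times: ~133x

print(f"Moore's law era: {moores_law_growth:.0f}x")
print(f"Post-2012 era:   {post_2012_growth:.0f}x")
```

So over a period in which the old regime would have doubled compute requirements once, the new one multiplies them more than a hundredfold.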


This hasn't been noticed because the outfits paying the bills are huge tech companies. But the planet will notice, because the correspondingly enormous growth in electricity consumption has environmental consequences.

To put that in context, researchers at Nvidia, the company that makes the specialised GPU processors now used in most machine-learning systems, came up with a massive natural-language model that was 24 times bigger than its predecessor and yet was only 34% better at its learning task. But here's the really interesting bit. Training the final model took 512 V100 GPUs running continuously for 9.2 days. "Given the power requirements per card," wrote one expert, "a back-of-the-envelope estimate put the amount of energy used to train this model at over 3x the yearly energy consumption of the average American."
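That estimate can be reconstructed in a few lines. A hedged sketch: the per-card draw of roughly 300 W and the figure of about 10.6 MWh for an average American's annual household electricity use are assumptions supplied here for illustration, not figures from the article:

```python
# Back-of-the-envelope reconstruction of the training-energy estimate:
# 512 V100 GPUs running continuously for 9.2 days.
cards = 512
days = 9.2
watts_per_card = 300  # assumed V100 draw under sustained load

# watts -> megawatt-hours over the full run
energy_mwh = cards * watts_per_card * days * 24 / 1e6

household_mwh = 10.6  # assumed average annual US household electricity use
print(f"{energy_mwh:.1f} MWh, about {energy_mwh / household_mwh:.1f}x "
      f"a household's annual electricity use")
```

Under those assumptions the run comes out at roughly 34 MWh, a little over three times the annual figure, which is consistent with the expert's "over 3x" claim.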

You don't have to be Einstein to realise that machine learning can't continue on its present path, especially given the industry's frenetic assurances that tech giants are heading for an "AI everywhere" future. Brute-force cloud computing won't achieve that goal. Of course smarter algorithms will make machine learning more resource-efficient (and perhaps also less environmentally damaging).

Companies will learn to make trade-offs between accuracy and computational efficiency, though that will have unintended, and antisocial, consequences too. And, in the end, if machine learning is going to be deployed at a global scale, most of the computation will have to be done in users' hands, ie in their smartphones.

This is not as far-fetched as it sounds. The new iPhone 11, for example, includes Apple's A13 chip, which incorporates a unit running the kind of neural network software behind recent advances in processing language and interpreting images. No doubt other manufacturers have equivalent kit.

In preparation for the great day of AI Everywhere, I just asked Siri: "Is there such a thing as a free lunch?" She replied: "I can help you find a restaurant if you turn on location services." Clearly, the news that there is no such thing hasn't yet reached Silicon Valley. They'll get it eventually, though, when Palo Alto is underwater.

Capital idea
The Museum of Neoliberalism has just opened in Lewisham, London. It's a wonderful project and website; my only complaint is that neoliberalism isn't dead yet.

Who needs humans?
This Marketing Blog Does Not Exist is a blog entirely created by AI. Could you tell the difference between it and a human-created one? Not sure I could.

All the right notes
There's a lovely post about Handel by Ellen T Harris on the Bank of England's blog, Bank Underground. The German composer was a shrewd investor, but it was The Messiah that made him rich.
