The Future of Computing: Hype, Hope, and Reality

Bill Reichert, Partner, Pegasus Tech Ventures

For roughly 75 years, the fundamental architecture of computers has not changed much. Certainly, the hardware has changed tremendously, and software has evolved accordingly. But the basic idea of storing instructions and data in binary code, and using on/off digital hardware to execute mathematical and logical operations, has remained roughly the same for decades.

All that is changing.

The same advances in semiconductor fabrication technology that powered Moore's Law, the exponential increase in the power of computers over the last several decades, have enabled hardware engineers to develop new architectures that promise to transform the computing landscape over the coming decades.

At the same time, software engineering is also progressing. Marc Andreessen has famously said, "Software is eating the world." What he did not make clear, though, is that virtually all the progress in computing over the past 30 years has been thanks to hardware, not software.

Heterogeneous Computing

New architectures, however, require that both software engineers and hardware engineers work together. A new class of hardware is emerging that takes advantage of what is called heterogeneous computing: multi-core chips that incorporate multiple different co-processors, each optimized for a specialized task. Writing software that takes full advantage of these new chips is extremely challenging, and so companies like SambaNova Systems are developing operating systems and software compilers that optimize application code automatically and allocate resources to compute tasks dynamically, in real time, as computing demands change.
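
To make the idea concrete, here is a minimal sketch in Python of how a runtime might route tasks to specialized co-processors. The device names, task kinds, and size threshold are hypothetical, chosen only for illustration; real systems such as SambaNova's make these decisions inside the compiler and runtime, not in application code.

# Minimal sketch of dynamic task dispatch in a heterogeneous system.
# All device names and numbers below are hypothetical.
from dataclasses import dataclass

@dataclass
class Task:
    name: str
    kind: str    # e.g. "matmul", "signal", "control"
    size: int    # rough amount of work, in arbitrary units

# Hypothetical co-processors and the task kinds they accelerate.
ACCELERATORS = {
    "matmul": "tensor_core",   # dense linear algebra unit
    "signal": "dsp_core",      # digital signal processor
}

def dispatch(task: Task) -> str:
    """Pick a compute unit for a task; tiny tasks stay on the CPU."""
    if task.size < 100:                      # not worth the offload overhead
        return "cpu_core"
    return ACCELERATORS.get(task.kind, "cpu_core")

workload = [
    Task("layer1_forward", "matmul", 10_000),
    Task("audio_filter", "signal", 2_000),
    Task("bookkeeping", "control", 10),
]
for t in workload:
    print(f"{t.name:>15} -> {dispatch(t)}")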

AI Chips

With the emergence of deep neural network software, engineers realized that Graphics Processing Units, an architecture commercialized by Nvidia, were well suited to the massive matrix calculations required by neural network models. But GPUs are not exactly optimized for AI, and so there has been an explosion of startups seeking to develop chips that offer 10x or 100x the performance and power efficiency of GPUs. On the server side, companies like Cerebras Systems and Graphcore, and more recently SambaNova, are promising order-of-magnitude improvements. And on the edge, companies like Gyrfalcon Technology, Syntiant, and Blaize are promising even greater improvements in performance and power efficiency.
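
The matrix-heavy nature of the workload is easy to see in code. The Python sketch below, with arbitrary layer sizes chosen only for illustration, shows that a single neural-network layer boils down to one large matrix multiplication, which is exactly the operation GPUs and the newer AI chips compete to accelerate.

# One neural-network layer is essentially one large matrix multiplication.
# The shapes are arbitrary, chosen only for illustration.
import numpy as np

batch, in_features, out_features = 64, 1024, 4096

x = np.random.randn(batch, in_features).astype(np.float32)         # input activations
w = np.random.randn(in_features, out_features).astype(np.float32)  # layer weights
b = np.zeros(out_features, dtype=np.float32)                       # biases

y = np.maximum(x @ w + b, 0.0)   # matrix multiply + bias + ReLU

print(y.shape)   # (64, 4096)
# Roughly 2 * batch * in_features * out_features, about 537 million
# multiply-accumulate operations, for this one layer alone.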

Edge Computing

The second half of the 20th century was all about moving computing from centralized mainframe computers to distributed desktop and laptop computers. With the development of the high-speed Internet, the thinking shifted: an application could sit in the cloud and support thousands, even millions, of users. But as the Internet of Things took off and enabled data collection from literally billions of devices, moving all that data up to the cloud in order to crunch it has become a challenge. Now companies are looking to process data at the edge, at the point of collection, rather than sending it up to the cloud, thereby reducing latency and cutting bandwidth and storage costs. At its simplest level, edge computing filters out unimportant data and sends only the most important data to the cloud. For more complex tasks, such as autonomous driving, edge computing requires processing massive AI models and making very accurate judgments in milliseconds. For these tasks, the new special-purpose chips discussed above and below are fighting for design wins.
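
As a concrete illustration of that simplest level, here is a small Python sketch of an edge device filtering sensor readings locally and uploading only the ones that matter. The threshold and the send_to_cloud() stub are hypothetical stand-ins for a real telemetry pipeline.

# Minimal sketch of edge-side filtering: keep unimportant readings local,
# send only the interesting ones upstream. Values are hypothetical.
import random

THRESHOLD = 75.0   # e.g. degrees Celsius; anything below is "uninteresting"

def send_to_cloud(reading: float) -> None:
    # Stand-in for an HTTPS or MQTT upload in a real deployment.
    print(f"uploading anomalous reading: {reading:.1f}")

def process_locally(readings):
    sent = 0
    for r in readings:
        if r > THRESHOLD:          # keep only what matters
            send_to_cloud(r)
            sent += 1
    print(f"sent {sent} of {len(readings)} readings upstream")

process_locally([random.uniform(20, 90) for _ in range(1000)])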

Analog Computing

As brilliant as binary code is for enabling absolutely precise calculations, the real world is analog, not digital, and many compute tasks could be done more efficiently if we could operate on analog values rather than having to digitize them. But analog computing is imprecise, and most computing problems require exact values, not approximate values. (How much money do you have in your bank account?) Some problems, though, like AI inference and monitoring sensor data, do not need six-sigma precision to get the right answer or make the right decision. Companies like Mythic, Analog Inference, and Aspinity are incorporating analog computing architectures into their chips to make them up to 100x more efficient at solving problems involving data from our analog world.
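
A rough way to see why inference tolerates imprecision: perturb a set of class scores with noise on the scale a coarse analog multiply might introduce, and check how often the decision (the largest score) actually changes. The numbers in this Python sketch are synthetic, purely for illustration.

# Add noise comparable to a coarse analog multiply and see whether the
# decision (argmax) changes. Scores and noise level are synthetic.
import numpy as np

rng = np.random.default_rng(0)
scores = np.array([2.3, 0.4, 5.1, 1.7])        # exact "digital" class scores

flips = 0
trials = 10_000
for _ in range(trials):
    noisy = scores + rng.normal(scale=0.2, size=scores.shape)  # analog-style error
    if np.argmax(noisy) != np.argmax(scores):
        flips += 1

print(f"decision changed in {flips} of {trials} trials")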

Photonic Computing

Light has been used for digital communications and computer networks for decades, but using photons to do the math and putting photonic processors on a chip are extremely challenging. That is what several startups are trying to do. Spinning technologies out of MIT and Princeton, three companies, Lightelligence, Lightmatter, and Luminous Computing, are racing to commercialize the first photonic chip for doing AI inference at the edge.

Neuromorphic Computing

In spite of what the media portrays as the imminent cyber-apocalypse where robots rebel against their human masters and take over the world, we are a long way away from the science fiction world imagined in popular culture. The fact is that the human brain is still massively more powerful and efficient than the most powerful supercomputers on earth. But computer scientists think there is a path to create an artificial brain. The branch of artificial intelligence that uses neural network mathematical frameworks to compute information in a manner similar to the human brain is sometimes referred to as neuromorphic, because it mimics human neuro-biology. But researchers have been working on models that even more closely mimic the human brain in its design and efficiency. The brain sends signals as electrochemical spikes, not digital bytes, and the brain's roughly 86 billion neurons are interconnected in a way that is very different from transistors on a chip. Researchers at Stanford, Intel, and IBM, along with several startup companies such as Rain Neuromorphics and BrainChip, are trying to develop hardware and software that use neuromorphic principles to deliver very powerful computing on very small semiconductor chips.
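
The spiking idea can be illustrated in a few lines of Python. A leaky integrate-and-fire neuron, the simplest model in the family that neuromorphic chips are built around, accumulates input current, leaks charge over time, and emits a discrete spike only when it crosses a threshold. The parameters below are arbitrary and for illustration only.

# Minimal leaky integrate-and-fire neuron. Parameters are arbitrary.
import random

def lif_neuron(inputs, leak=0.9, threshold=1.0):
    v = 0.0                 # membrane potential
    spikes = []
    for current in inputs:
        v = leak * v + current      # integrate the input with a leak
        if v >= threshold:          # fire a spike and reset
            spikes.append(1)
            v = 0.0
        else:
            spikes.append(0)
    return spikes

drive = [random.uniform(0.0, 0.4) for _ in range(50)]
train = lif_neuron(drive)
print("".join("|" if s else "." for s in train))   # spike train, e.g. "...|....|.."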

Quantum Computing

Almost certainly the most radical initiative in computing is the attempt to harness the potential of quantum computing. At the subatomic level, particles of matter behave in extraordinary and wonderful ways: they can exist in more than one state simultaneously, and they can entangle with one another across a distance without any apparent physical connection. It turns out that electronic devices like transistors and diodes wouldn't even work if the universe were strictly Newtonian. If we can figure out how to control the quantum properties of light and matter the way we figured out how to use gears to make adding machines and transistors to make computers, we will be able to make quantum computers that are as superior to current supercomputers as supercomputers are to adding machines.
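
Those two effects, superposition and entanglement, are on paper just linear algebra, and a classical machine can simulate them for a couple of qubits. The Python sketch below builds the standard Bell state: a Hadamard gate puts one qubit into superposition, a CNOT gate entangles it with a second, and the only outcomes ever observed are 00 and 11. This simulates the math for illustration; it is not a quantum program, and it does not scale the way real quantum hardware would.

# Two-qubit Bell state simulated with plain linear algebra.
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)   # Hadamard gate: creates superposition
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                # entangles control and target qubits

state = np.zeros(4)
state[0] = 1.0                                 # start in |00>
state = CNOT @ np.kron(H, np.eye(2)) @ state   # H on qubit 0, then CNOT

probs = state ** 2                             # measurement probabilities
rng = np.random.default_rng(1)
samples = rng.choice(["00", "01", "10", "11"], size=10, p=probs)
print(probs)     # [0.5, 0, 0, 0.5]: half the time 00, half the time 11
print(samples)   # the two qubits always agree, never 01 or 10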

Some people say we are still a long way away from quantum supremacy, when quantum computers can solve problems that no classical computer can solve. But recent advances indicate that we may not be that far away from quantum advantage, when quantum computers can solve certain specialized problems faster than classical computers.

Already, big players like IBM, Google, Intel, Honeywell, and Microsoft are demonstrating machines that can execute quantum algorithms, and startups like Rigetti Computing, IonQ, and PsiQuantum are joining the race, along with quantum software companies like QC Ware, Cambridge Quantum Computing, and Zapata Computing. Big corporations and governments are investing in projects that will take advantage of the power of quantum computing in chemistry, pharmaceuticals, finance, logistics, failure analysis, and artificial intelligence.

Each of these emerging technologies promises to significantly advance computing, and with these advances will come new technology leaders. The evolution of computing has given rise to multiple generations of spectacular success stories like IBM, Intel, Microsoft, Nvidia, Google, and Amazon Web Services. Most of these companies are trying to reinvent themselves to catch the next wave of computing technology, but certainly new companies will emerge in these new sectors, and some famous names will founder and go the way of the dinosaurs, like Univac, Digital Equipment, MIPS, and Silicon Graphics. Meanwhile, corporate CIOs will have to decide where to place their bets and start investing in these new technologies, if they haven't already.
