Since 1965, the computer industry has relied on Moore's Law to accelerate innovation, pushing more transistors into integrated circuits to improve computation performance. Making transistors smaller lifted all boats for the entire industry and enabled new applications. At some point we will reach a physical limit, that is, a limit stemming from physics itself. Even as that limit approaches, improvements have kept pace thanks to increased parallelism of computation and the consolidation of specialized functions into single chip packages (such as systems on chip).
In recent years, we have been nearing another plateau. This article proposes to improve computation performance not only by building better hardware, but by changing how we use existing hardware, focusing specifically on how we use existing processor types. I call this approach Compute Orchestration: automatic optimization of machine code to best use modern datacenter hardware (again, with special emphasis on different processor types).
So what is compute orchestration? It is embracing hardware diversity to support software.
There are many types of processors: microprocessors in small devices, general-purpose CPUs in computers and servers, GPUs for graphics and compute, and programmable hardware like FPGAs. In recent years, specialized processors such as TPUs and neuromorphic processors for machine learning have been rapidly entering the datacenter.
There is potential in this variety: instead of statically utilizing each processor for pre-defined functions, we can use the existing processors as a swarm, each processor working on the workloads most suitable for it. Doing that, we can potentially deliver more computation bandwidth with less power, lower latency, and lower total cost of ownership.
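To make the swarm idea concrete, here is a toy sketch (the processor names, workload classes, and suitability scores are all invented for illustration, not drawn from any real scheduler): each processor type advertises how well it handles a class of workload, and an orchestrator places each workload on the most suitable available processor.

```python
# Toy model of swarm-style allocation: each processor type advertises a
# suitability score per workload class, and the orchestrator greedily
# places each workload on the best available processor. All scores are
# hypothetical, for illustration only.

SUITABILITY = {
    "cpu":  {"branchy_logic": 0.9, "matrix_math": 0.3, "stream_filter": 0.4},
    "gpu":  {"branchy_logic": 0.2, "matrix_math": 0.9, "stream_filter": 0.6},
    "fpga": {"branchy_logic": 0.3, "matrix_math": 0.5, "stream_filter": 0.9},
}

def place(workloads, available):
    """Map each workload class to the most suitable available processor."""
    placement = {}
    for wl in workloads:
        placement[wl] = max(available, key=lambda p: SUITABILITY[p][wl])
    return placement

placement = place(["branchy_logic", "matrix_math", "stream_filter"],
                  available=["cpu", "gpu", "fpga"])
print(placement)
# {'branchy_logic': 'cpu', 'matrix_math': 'gpu', 'stream_filter': 'fpga'}
```

A real orchestrator would of course also weigh load, power, and data movement costs, but the core mechanism is this kind of workload-to-processor matching.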
Non-standard utilization of existing processors is already happening: GPUs, for example, have been adapted from dedicated graphics processors into a core enterprise component, and are now used for machine learning and cryptocurrency mining, among other things.
I call the technology that utilizes the processors as a swarm Compute Orchestration. In short, it is the automatic adaptation of binary code and automatic allocation to the most suitable processor types available. I split the evolution of compute orchestration into four generations:
Compute Orchestration Gen 1: Static Allocation To Specialized Co-Processors
This type of compute orchestration is everywhere. Most devices today include co-processors to offload some specialized work from the CPU. Usually, the toolchain or runtime environment takes care of assigning workloads to the co-processor. This is seamless to the developer, but also limited in functionality.
The best-known example is the use of cryptographic co-processors for relevant functions. Being liberal in our definition of co-processor, Memory Management Units (MMUs), which handle virtual memory address translation, can also be considered an example.
Compute Orchestration Gen 2: Static Allocation, Heterogeneous Hardware
This is where we are now. In the second generation, software relies on libraries, dedicated runtime environments, and VMs to best use the available hardware. Let's call the collection of components that help better use the hardware "frameworks". Current frameworks implement specific code to better use specific processors. Most prevalent are frameworks that know how to utilize GPUs in the cloud. Usually, allocation to bare-metal hosts remains the responsibility of the developer: for example, the developer or DevOps engineer needs to make sure a machine with a GPU is available for the relevant microservice. This phenomenon is what brought me to think of Compute Orchestration in the first place, as it proves there is slack in our current hardware.
Common frameworks like OpenCL allow programming compute kernels to run on different processors. TensorFlow allows assigning nodes in a computation graph to different processors ("devices").
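The flavor of this second generation can be sketched without any real framework: the developer statically pins each node of a small computation graph to a device ahead of time, in the spirit of TensorFlow's `tf.device` annotation. The snippet below is a plain-Python toy, not TensorFlow's API; the device names are illustrative.

```python
# Toy computation graph with developer-chosen (static) device placement,
# in the spirit of Gen 2 frameworks. Nothing here is a real framework API.

class Node:
    def __init__(self, name, fn, device):
        self.name, self.fn, self.device = name, fn, device

    def run(self, *args):
        # A real framework would dispatch to the device; this toy just
        # records which device the developer picked.
        return self.fn(*args), self.device

# The developer decides, ahead of time, which device runs which node.
load = Node("load", lambda: [1.0, 2.0, 3.0], device="/CPU:0")
scale = Node("scale", lambda xs: [2.0 * x for x in xs], device="/GPU:0")

data, dev1 = load.run()
result, dev2 = scale.run(data)
print(result, dev1, dev2)  # [2.0, 4.0, 6.0] /CPU:0 /GPU:0
```

The key limitation is visible right in the code: the device choice is hard-wired by the developer, so the same program cannot adapt when the hardware underneath it changes.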
This better use of hardware through existing frameworks is great, but I believe there is a bigger opportunity. Existing frameworks still require effort from the developer to be optimal; they rely on the developer. Also, no legacy code from, say, 2016 is ever going to utilize a modern datacenter GPU cluster on its own. My view is that by developing automated and dynamic frameworks that adapt to the hardware and workload, we can achieve another leap.
Compute Orchestration Gen 3: Dynamic Allocation To Heterogeneous Hardware
Computation can take a cue from the storage industry: products for better utilization and reliability of storage hardware have been innovating for years. Storage startups develop abstraction layers and special filesystems that improve the efficiency and reliability of existing storage hardware. Computation, on the other hand, remains a naive allocation of hardware resources. Smart allocation of computation workloads to hardware could result in better performance and efficiency for big datacenters (for example, hyperscalers like cloud providers). The infrastructure for such allocation is here, with current datacenter designs pushing toward more resource disaggregation, the introduction of diverse accelerators, and increased work on automatic acceleration (for example: Workload-aware Automatic Parallelization for Multi-GPU DNN Training).
For high-level resource management, we already have automatic allocation: for example, project Mesos (see the paper), which focuses on fine-grained resource sharing; Slurm for cluster management; and several extensions using Kubernetes operators.
Advancing further from here requires two steps: automatic mapping of the available processors (which we call the compute environment) and workload adaptation. Imagine a situation where the developer doesn't have to optimize her code to the hardware. Rather, the runtime environment identifies the available processing hardware and automatically optimizes the code. Cloud environments are heterogeneous and changing, and the code should change accordingly (in fact it's not the source code that changes, but the execution of the machine code by the runtime environment).
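The first of those two steps, mapping the compute environment, could start as simply as probing the host and building a registry the runtime can later allocate against. Below is a minimal sketch under loose assumptions: the GPU probe (checking for an nvidia-smi binary on the PATH) is a crude, hypothetical stand-in for real device discovery.

```python
# Sketch of compute-environment mapping: probe the host for processors
# and build a registry that a runtime could later allocate workloads
# against. The GPU probe is a crude stand-in for real device discovery.
import os
import shutil

def map_compute_environment():
    env = {"cpu": {"cores": os.cpu_count() or 1}}
    # Hypothetical heuristic: treat nvidia-smi on the PATH as evidence
    # of an NVIDIA GPU. Real discovery would query drivers or the OS.
    if shutil.which("nvidia-smi"):
        env["gpu"] = {"vendor": "nvidia"}
    return env

env = map_compute_environment()
print(env)  # e.g. {'cpu': {'cores': 8}} on a GPU-less host
```

The second step, workload adaptation, would then consult this registry at run time to pick (or generate) the best execution path for each workload, instead of baking the choice into the code.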
Compute Orchestration Gen 4: Automatic Allocation To Dynamic Hardware
"A thought, even a possibility, can shatter and transform us." - Friedrich Wilhelm Nietzsche
The quote above is to say that we are far from a practical implementation of the concept described here (as far as I know). We can, however, imagine a technology that dynamically re-designs a datacenter to serve the needs of its running applications. This change in the way whole datacenters meet computation needs has already started. FPGAs are used more often and appear in new places (FPGAs in hosts, FPGA machines in AWS, SmartNICs), providing the foundation for constant reconfiguration of hardware.
To illustrate the idea, consider an example: Microsoft initiated Project Catapult, augmenting CPUs with an interconnected and configurable compute layer composed of programmable silicon. The timeline on the project's website is fascinating. The project started in 2010, aiming to improve search queries by using FPGAs. Quickly, it proposed the use of FPGAs as "bumps in the wire", adding computation in new areas of the data path. Project Catapult also designed an architecture for using FPGAs as a distributed resource pool serving the entire datacenter. Then the project spun off Project Brainwave, utilizing FPGAs to accelerate AI/ML workloads.
This was just one example of innovation in how we compute. A quick online search will bring up several academic works on the topic. All we need to reach the fourth generation is some idea synthesis, combining a few concepts together, such as:
Low-effort HDL generation (for example, the Merlin compiler and BORPH)
In essence, what I am proposing is to optimize computation by adding an abstraction layer between workloads and the diverse, reconfigurable hardware beneath them. Automatic allocation on agile hardware is the recipe for best utilizing existing resources: faster, greener, cheaper.
The trends and ideas mentioned in this article can lead to many places. It is very unlikely that we are already working with existing hardware in the optimal way. It is my belief that we are in the midst of the improvement curve. In recent years we have had increased innovation in basic hardware building blocks (new processors, for example), but we still have room to improve in overall allocation and utilization. The more new processors we deploy in the field, the more slack we have in our hardware stack. New concepts, like edge computing and resource disaggregation, bring new opportunities for optimizing legacy code through smarter execution. To achieve that, legacy code can't be expected to be refactored, and developers and DevOps engineers can't be expected to optimize for every cloud configuration. We just need to execute code in a smarter way, and that is the essence of compute orchestration.
The conceptual framework described in this article should be further explored. We first need to find the killer app (which type of software we optimize for which type of hardware). From there, we can generalize. I was recently asked in a round table: what is the next generation of computation? Quantum computing? Tensor Processing Units? I responded: all of the above, but what we really need is better usage of the existing generation.
Guy Harpak is the head of technology at Mercedes-Benz Research & Development in its Tel Aviv, Israel facility. Please feel free to contact him with any thoughts on the topics above at harpakguy@gmail.com. Harpak notes that this contributed article reflects his personal opinion and is in no way related to people or companies that he works with or for.
Related Reading: If you find this article interesting, I would recommend some interesting articles on similar topics:
Return Of The Runtimes: Rethinking The Language Runtime System For The Cloud 3.0 Era
The Deep Learning Revolution And Its Implications For Computer Architecture And Chip Design (by Jeffrey Dean from Google Research)
Beyond SmartNICs: Towards A Fully Programmable Cloud
Hyperscale Cloud: Reimagining Datacenters From Hardware To Applications
More here:
Disrupt The Datacenter With Orchestration - The Next Platform