Longer coherence: How the quantum computing industry is maturing – DatacenterDynamics

Quantum computing theory dates back to the 1980s, but it's only in the last five to ten years that we've seen the field advance to the point where it could realistically become a commercial enterprise.

Most quantum computing companies have been academic-led science ventures; companies founded by PhDs leading teams of PhDs. But, as the industry matures and companies look towards a future of manufacturing and operating quantum computers at production scale, the employee demographics are changing.

While R&D will always play a core part in every technology company, making quantum computers viable in the real world means these startups are thinking about how to build, maintain, and operate SLA-bound systems in production environments.

This new phase in the industry requires companies to change mindset, technology, and staff.


20 Dec 2023

At quantum computing firm Atom Computing, around 40 of the company's 70 employees have PhDs, many joining straight out of academia. This kind of academic-heavy employee demographic is commonplace across the quantum industry.

"I'd venture that over half of our company doesn't have experience working at a company previously," says Rob Hays, CEO of Atom. "So there's an interesting bridge between the academic culture versus the Silicon Valley tech startup; those are two different worlds, and trying to bridge people from one world to the other is challenging. And it's something you have to focus and work on openly and actively."

Maturing from small startups into large companies with demanding customers and shareholders is a well-trodden path for hundreds of technology companies in Silicon Valley and across the world.

And quantum computing companies are getting there: the likes of IonQ, Rigetti, and D-Wave are already listed on the Nasdaq and New York Stock Exchange, although the latter two have at various times had to deal with the prospect of being delisted due to low stock prices.

Most of the quantum companies DCD spoke to for this piece are undergoing a transition from pure R&D mode to a more operational and engineering phase.

"When I first joined four years ago, the company was entirely PhDs," says Peter Chapman, IonQ CEO. "We're now in the middle of a cultural change from an academic organization and moving to an engineering organization. We've stopped hiring PhDs; most of the people we're hiring nowadays are software, mechanical, and hardware engineers. And the next phase is to a customer-focused product company."

Chapman points to hires such as Pat Tan and Dean Kassmann (previously at Amazon's hardware-focused Lab126 and rocket firm Blue Origin, respectively) as evidence of the company moving to a more product- and engineering-focused workforce.

2023 also saw Chris Monroe, IonQ co-founder and chief scientist, leave the company to return to academia at North Carolina's Duke University.

During the earnings call announcing Monroe's departure, Chapman said: "Chris would be the first one to tell you that the physics behind what IonQ is doing is now solved. It's [now] largely an engineering problem."

Atom's Hays notes that much of the engineering work the company is doing to get ready for cloud services and applications is software-based, meaning the company is looking for software engineers.

"We are mostly looking for people who have worked at cloud service providers or large software companies and have an interest in learning, or who already have, some foundational knowledge of the underlying physics and science," he says. "But we're kind of fortunate that those people self-select and find us. We have a pretty high number of software engineers who have physics undergrads and an extreme interest in quantum mechanics, even though by trade and experience they're software engineers."

On-premises quantum computers are currently rarities, largely reserved for national computing labs and academic institutions. Most quantum processing unit (QPU) providers offer access to their systems via their own web portals and through public cloud providers.

But today's systems are rarely expected (or contracted) to run with the five-nines resiliency and redundancy we might expect from tried and tested silicon hardware.
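Five-nines is a concrete budget, not just a slogan. A quick back-of-the-envelope calculation (illustrative only, not a figure from any vendor quoted here) shows how little downtime it actually allows:

```python
# Downtime allowed per year at a given availability level (illustrative).
MINUTES_PER_YEAR = 365.25 * 24 * 60

def downtime_minutes(availability: float) -> float:
    """Minutes of allowed downtime per year at the given availability."""
    return (1 - availability) * MINUTES_PER_YEAR

for label, avail in [("three nines", 0.999),
                     ("four nines", 0.9999),
                     ("five nines", 0.99999)]:
    print(f"{label}: {downtime_minutes(avail):.1f} min/year")

# Five nines leaves roughly 5.3 minutes of downtime per year --
# far less than a single multi-day fridge warm-up/cool-down cycle.
```

This is why the queue-based, best-effort access model described below is such a long way from conventional cloud SLAs.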

"Right now, quantum systems are more like supercomputers, and they're managed with a queue; they're probably not online 24 hours a day. Users enter jobs into a queue and get answers back as the queue executes," says Atom's Hays.

"We are approaching how we get closer to 24/7, and how we build in redundancy and failover so that if one system comes offline for maintenance, there's another one available at all times. How do we build a system, architecturally and engineering-wise, where we can do hot swaps or upgrades or changes with as little downtime as possible?"
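The failover pattern Hays describes can be sketched in a few lines: jobs sit in a queue, and a dispatcher routes each one to whichever QPU is online. The class and method names below are hypothetical, purely for illustration; they are not Atom Computing's API.

```python
from dataclasses import dataclass
from collections import deque

@dataclass
class Qpu:
    """A quantum processor endpoint that may be taken offline for maintenance."""
    name: str
    online: bool = True

class Dispatcher:
    """Routes queued jobs to any available QPU (hypothetical sketch)."""

    def __init__(self, qpus):
        self.qpus = qpus
        self.queue = deque()

    def submit(self, job):
        self.queue.append(job)

    def run_next(self):
        """Run the oldest queued job on the first online QPU, if any."""
        for qpu in self.qpus:
            if qpu.online and self.queue:
                return qpu.name, self.queue.popleft()
        return None  # all QPUs down or queue empty: the job waits

dispatcher = Dispatcher([Qpu("qpu-a"), Qpu("qpu-b")])
dispatcher.submit("job-1")
dispatcher.qpus[0].online = False        # qpu-a goes down for maintenance
print(dispatcher.run_next())             # ('qpu-b', 'job-1') -- job fails over
```

A real service would also need health checks, retries, and calibration-aware routing, but the core idea (no single system needs to be online 24/7 if another can absorb the queue) is just this.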

Other providers are going through a similar teething phase, working out how to make their systems (currently sensitive, temperamental, and complicated) enterprise-ready for the data centers of the world.

"I already have a firm SLA with the cloud guys around the amount of time that we do jobs on a daily basis, and the timeframes to be able to do that," says Chapman. "We are moving that SLA to 24/7, and being able to do that without having an operator present. It's not perfect, but it's getting better. Three or four years from now, you'll only need an on-call when a component dies."

Rigetti CTO David Rivas says his company is also working towards higher uptimes.

"The systems themselves are becoming more and more lights-out every quarter," he says, "as we outfit them for that kind of remote operation and ensure that the production facilities can be outfitted for that kind of operation."


The manufacturing and repair of these systems is also maturing beyond the first PhD-built generations of quantum computers. These machines will never be mass-produced, but the industry needs to move away from one-off artisanal builds to a more production-line-like approach.

"A lot of the hardware does get built with the assistance of electronics engineers and mechanical engineers," says Atom's Hays, but much is still built by experimental physicists.

IonQ's Chapman adds: "In our first-generation systems, you needed a physicist with a screwdriver to tune the machine to be able to run your application. But every generation of hardware puts more under software control."

"Everywhere a screwdriver could be turned, there's now a stepper motor under software control, and the operating system is now doing the tuning."

Simon Phillips, CTO of the UK's Oxford Quantum Circuits (OQC), says the company is focused on how it hires staff and works with partners to roll out QPUs into colocation data centers.

"And the first part of that starts with: if we put 10 QPUs in 10 locations around the world, how do we do that without having an army of 100 quantum engineers on each installation?"

"That starts with having a separate deployment team and a site reliability engineering team that can then run the SLA on that machine."

He adds: "Not all problems are quantum problems. It can't just be quantum engineers; it's not scalable if it's the same people doing everything."

"It's about training and understanding where the first and second lines of support sit, having a cascading system, and utilizing any smart hands so we can train people who already exist in data centers."


While the quantum startups are undergoing their own maturing process, their suppliers are also being forced to learn about the needs of commercial operators and what it means to deploy in a production data center.

For years, the supply chain (including for the dilution refrigerators that keep many quantum computers supercooled) has dealt with largely self-reliant academic customers in lab spaces.

Richard Moulds, general manager of Amazon Braket at AWS, told DCD the dilution refrigerator market is a "cottage industry" with few suppliers.

One of the main fridge suppliers is Oxford Instruments, an Oxford University spin-out from the late 1950s that released the first commercial dilution unit back in 1966. The other large incumbent, Bluefors, was spun out of what is now the Low Temperature Laboratory at Aalto University in Finland 15 years ago.

Prior to the quantum computing rush, the biggest change in recent years was the introduction of pulse tube technology. Instead of a cryostat inserted into a bath of liquid helium-4, quantum computers could now use a closed-loop system (aka a dry fridge or dry cryostat).

This meant the systems could become smaller, more efficient, more software-controlled, and more user-friendly.

"With the wet dilution fridge (or wet cryostat), you need two-floor rooms for ceiling height. You need technicians to top up helium and run liquefiers, and you need to buy helium to keep topping up," says Harriet van der Vliet, product segment manager for quantum technologies at Oxford Instruments.

"It was quite a manual process, and it would take maybe a week just to pre-cool, and that would not even be getting to base temperature."

For years, the fridges were the preserve of academics doing materials science; they were more likely to win a Nobel prize than be part of a computing contract.

"Historically, it's been a lab product. Our customers were ultra-low temperature (ULT) experts; if anything went wrong, they would fix it themselves," says van der Vliet. "Now our customers have moved from being simply academics to being commercial players who need user-friendly systems that are push-button."

While the company declined to break out numbers, Oxford Instruments said it has seen a noticeable shift in its customer demographic towards commercial quantum computing customers in recent years, as well as a change in buying trends: QPU companies are more likely to buy multiple fridges at once, rather than a single unit every few years for an academic research lab.

"The commercial part is growing for sure," adds David Gunnarsson, CTO at Bluefors. The company has expanded its factory to almost double production capacity to meet growing demand.

"There have been more and more attempts to create revenue on quantum computing technology. They are buying our systems to actually deploy, or have an application that they think they can create money from. We welcome discussion with data centers so they can understand our technology from the cryogenics perspective."

And while the industry is working towards minimizing form factors, for the foreseeable future it has settled on essentially brute-force supercooling with bigger fridges. Both companies have released new dilution fridges designed for quantum computers.

Smaller fridges (and lower-qubit-count systems) may be able to fit into racks, but most larger supercooled systems require a much bigger footprint than traditional racks. Bluefors' largest system, Kide, can cool around 1,000 qubits; it stands just under three meters tall and 2.5 meters in diameter, and the floor beneath it needs to be able to take about 7,000 kilograms of weight.
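Those figures translate into a floor-loading number data center operators can actually plan around. A rough calculation, assuming the 7,000 kg spreads evenly over the fridge's circular footprint (real installations use frames and point loads, so treat this as a sketch):

```python
import math

# Rough floor-loading estimate from the Kide figures quoted above.
# Assumes the weight spreads evenly over a circular footprint;
# actual loading depends on the support frame and feet.
mass_kg = 7_000
diameter_m = 2.5

footprint_m2 = math.pi * (diameter_m / 2) ** 2   # ~4.9 m^2
loading = mass_kg / footprint_m2                 # ~1,430 kg/m^2
print(f"{footprint_m2:.1f} m^2, {loading:.0f} kg/m^2")
```

That is a demanding load for many raised-floor environments, which is one reason these systems often need slab floors or reinforced areas rather than a standard rack position.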

"It has changed the way we do our product," says Gunnarsson. "They were lab tools before; uptime wasn't discussed much. Now we are making a lot of changes to our product line to ensure that you can be more certain about what the uptime of your system will be."

Part of the uptime challenge suppliers face around fridges (an area where Gunnarsson notes there is still something of a mismatch) is the warm-up/cool-down cycle of the machines.

While previously the wet bath systems could take a week to get to the required temperatures, the new dry systems might only take a day or two each way. That is important, because cool-down and warm-up cycles are effectively downtime: a dirty word when talking about service availability.

"The speed with which you can get to temperature is almost as important as the size of the chip that you can actually chill," says AWS' Moulds. "Today, if you want to change the device's physical silicon, you have got to warm this device up and then chill it back down again; that's a four-day cycle. That's a problem; it means machines are offline for a long time for relatively minor changes."
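It is easy to see why a four-day thermal cycle collides with service availability. The arithmetic below is illustrative only; the once-per-quarter chip-swap cadence is an assumption for the sake of the example, not a figure from AWS or any fridge vendor.

```python
# Illustrative: how a four-day warm-up/cool-down cycle erodes availability.
# The swap cadence below is a made-up assumption, not a vendor figure.
cycle_days = 4
swaps_per_year = 4             # hypothetical: one chip change per quarter

offline_days = cycle_days * swaps_per_year
availability = 1 - offline_days / 365
print(f"{availability:.3%}")   # ~95.6% -- nowhere near five nines
```

Even a modest maintenance cadence burns through orders of magnitude more downtime than a conventional cloud SLA would permit, which is why cycle time matters so much to the suppliers.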

While this might not be an issue for in-operation machines (Rigetti CTO Rivas says its machines can be in service for months at a time, while Oxford Instruments says an OQC system was in operation non-stop for more than a year), the long warm-up/cool-down cycle is a barrier to rapid testing.

"From a production perspective, the systems remain cold for a relatively long time," says Rivas. "But we're constantly running chips through test systems as we innovate and grow capacity, and 48 hours to cool a chip down is a long time in an overall development cycle."

Oxford Instruments and Bluefors might be the incumbents, but there is a growing number of new players entering the fridge space, some focusing specifically on quantum computing.

"The market has grown for dilution fridges, so there are lots more startups in the space as well, making different cooling systems," says van der Vliet. "There are many more players, but the market is growing."

"I think it's really healthy that there's loads of players in the field, particularly new players who are doing things a little bit differently to how we've always done it."

The incumbents are well-placed to continue their lead in the market, but QPU operators are hopeful that competition will result in better products.

"There will be genuine intellectual property that will emerge in this area, and you'll definitely start to see custom designs and proprietary systems that can maintain temperature in the face of increasing power."

Atom's Hays notes that, for laser-based quantum systems, the lasers themselves are probably the biggest constraint in the supply chain. Like the dilution fridges, these are still largely scientific instruments made by a handful of suppliers.

"We need relatively high-powered lasers that need to be very quiet and very precise," he says. "Ours are off the shelf, but they're semi-custom, and the manufacturer builds to order. That means that there are long lead times; in some cases up to a year."

He adds that many of the photonic integrated circuits are still relatively small (the size of nickels and dimes) but hopes they can shrink down to semiconductor size in future to help reduce the footprint.

For now, the quantum industry is still enjoying what might be the autumn of its happy-go-lucky academic days. The next era may well bring quantum supremacy and a new phase in high-performance computing, but it will likely also mean a less open industry.

"I think it's nice that the industry is still sort of in that mode," says AWS' Moulds. "The industry is still taking a relatively open approach to the development. We're not yet in the mode of everybody working in their secret bunkers, building secret machines. But history shows that once there's a clear opportunity, there's a risk of the shutters coming down, and it becoming a more cut-throat industry."

"In the end, that's good for customers; it drives down costs and drives up reliability and performance. But it might feel a little bit brutal for some of the academics that are in the industry now."
