Category Archives: Quantum Computer

New faculty add to Yale’s strength in applied mathematics – Yale News

At the heart of almost any scientific or technological challenge, you're likely to find a mathematician or two trying to work the problem.

This is the realm of applied mathematics. It covers a wide spectrum of research disciplines, from medicine and engineering to computer science and quantum physics.

Yale's Department of Mathematics has a long tradition of applied math research that combines innovative theory and practical expertise, and the recent arrival of faculty members Anna C. Gilbert and John C. Schotland adds to that tradition.

"Anna's work is at the interface between mathematics, data science, and computer science, and John is a mathematical physicist with a deep interest in imaging techniques that often have medical applications," said Yair Minsky, the Einar Hille Professor and chair of mathematics. "Anna and John will enrich our department, both from the research perspective and in terms of our graduate and undergraduate programs."

Anna C. Gilbert

Gilbert, the John C. Malone Professor of Mathematics, has a dual appointment in the Department of Statistics & Data Science. She has degrees from the University of Chicago and Princeton, worked for seven years at AT&T Labs-Research, and spent 16 years on the faculty of the University of Michigan before rejoining Yale where she spent a year as a postdoctoral fellow in the late 1990s.

"My mentor was [Phillips Professor of Mathematics] Raphy Coifman," Gilbert said. "He has a unique way of looking at applied math, computation, data, and interactions with more traditional forms of mathematics. He had a big impact on how I think about aspects of the research I do."

Gilbert studies various aspects of compression and sparse approximation: the notion of taking a complicated signal or data set and being able to represent it accurately with a much smaller amount of information. Conversely, she works on techniques for taking a compressed version of a dataset and converting it into an approximation of the larger, overall signal.

The work has a wide range of applications, from image compression to group testing for high-throughput biological screening.

"My work right now covers algorithms, geometry, and data," she said. "Often, real-world data are noisy. They're missing values or are corrupted in some way, and these corruptions distort the geometry of the data. You don't get a very good picture of how the data are organized or related to one another. I'm working on algorithms to correct those pictures."

A concept that comes up regularly in Gilbert's work is harmonic analysis. It's a way of viewing complex datasets as a combination of building blocks like small waves. Gilbert likened it to a written piece of music. "A musical score is a harmonic analysis of a piece of music," she said. "It tells you what will happen if you play this note, at this time, for this long."
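To make the wave-building-blocks picture concrete, here is a minimal sketch of sparse approximation in a Fourier basis. It is purely illustrative and assumes nothing about Gilbert's actual methods or code; the toy signal, the choice of k, and the use of NumPy are all assumptions for demonstration.

```python
# Minimal sketch: express a toy signal in a Fourier ("wave") basis and keep
# only the k largest coefficients -- the essence of sparse approximation.
# Illustrative only; not code from Gilbert's research.
import numpy as np

rng = np.random.default_rng(0)
n = 1024
t = np.arange(n) / n

# A signal built from a few "notes" plus a little noise.
signal = (np.sin(2 * np.pi * 5 * t)
          + 0.5 * np.sin(2 * np.pi * 40 * t)
          + 0.05 * rng.standard_normal(n))

# Harmonic analysis: write the signal as a combination of waves.
coeffs = np.fft.rfft(signal)

# Sparse approximation: keep only the k largest coefficients.
k = 8
keep = np.argsort(np.abs(coeffs))[-k:]
sparse = np.zeros_like(coeffs)
sparse[keep] = coeffs[keep]

# Reconstruct from the compressed representation and measure the error.
approx = np.fft.irfft(sparse, n)
rel_error = np.linalg.norm(signal - approx) / np.linalg.norm(signal)
print(f"kept {k} of {coeffs.size} coefficients, relative error {rel_error:.3f}")
```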

As she continues to settle in and forge new research collaborations at Yale, Gilbert plans to expand her work with biological data. She's also interested in doing more research on data involving how people make social decisions.

"I like to do beautiful mathematics and I like to make things work. At Yale I can do both," she said.

John C. Schotland

Like Gilbert, professor of mathematics Schotland comes to Yale from the University of Michigan, where he was a professor of mathematics and physics and founding director of the Michigan Center for Applied and Interdisciplinary Mathematics. He holds M.D. and Ph.D. degrees from the University of Pennsylvania.

Schotland comes from a family of scientists and academics, including his parents, their siblings, and his grandparents. "It was in the air when I was growing up," he said of his interest in science and math. "As a kid, it was tremendously exciting. In a way, I didn't choose science. It chose me."

A key aspect of Schotland's background, his medical training, has informed both his mathematics research and his awareness of how math and physics can be applied to medical challenges.

For instance, as a medical student, Schotland performed epilepsy surgeries in which a patient's electroencephalogram (EEG) was measured on the surface of the brain as a way to localize seizures. It was a two-dimensional approach to gathering information on a three-dimensional problem. In the years since, Schotland has worked on developing mathematical tools to recover medical information that is inherently three dimensional.

"Medical training has helped me understand science in a broader context," he said. "It motivates me to think of things I might not have otherwise."

One of Schotland's main research areas is quantum optics, which applies quantum mechanics to light, specifically the many-body problem of how intrinsically quantum states of light, such as entangled photons, interact with large collections of atoms. Schotland said Yale's groundbreaking work in quantum science was a major factor in his desire to come to New Haven.

"Much of my previous work was about the propagation of light in highly scattering media such as milk, paint, or biological tissue," he said. "All of that work was predicated on the assumption that light behaves according to the laws of classical physics. But what happens if entangled pairs of photons travel through that same medium?"

Schotland said it is a line of study with implications for the transmission of quantum information and for improving the resolution of imaging devices.

Indeed, Schotland's investigations of the relationship between light and matter have taken him into a variety of research areas, including partial differential equations, probability, and differential geometry. Some of that work has led to practical applications in optical imaging.

"Understanding light scattering in complex media, such as biological tissue, plays a key role in developing new optical imaging modalities," he said.

Minsky noted that Schotland is a leading innovator in combining optics with mathematical techniques to develop algorithms for improving biomedical imaging.

Schotland and Gilbert each stressed an eagerness to enhance their ongoing research and draw inspiration from their new colleagues.

"The people here are outstanding," Gilbert said. "For some time, Yale has had an outsized impact on applied math. I'm thrilled to be a part of it."

See the article here:
New faculty add to Yale's strength in applied mathematics - Yale News

IBM Just Committed to Having a Functioning 1,000 Qubit Quantum Computer by 2023 – ScienceAlert

We're still a long way from realising the full potential of quantum computing, but scientists are making progress all the time, and as a sign of what might be coming, IBM now says it expects to have a 1,000-qubit machine up and running by 2023.

Qubits are the quantum equivalents of classical computing bits, able to be set not just as a 1 or a 0, but as a superposition state that can represent both 1 and 0 at the same time. This deceptively simple property has the potential to revolutionise the amount of computing power at our disposal.

With the IBM Quantum Condor planned for 2023 (running 1,121 qubits, to be exact), we should start to see quantum computers tackle a substantial number of genuine real-world calculations, rather than being restricted to laboratory experiments.

IBM's quantum computing lab. (Connie Zhou for IBM)

"We think of Condor as an inflection point, a milestone that marks our ability to implement error correction and scale up our devices, while simultaneously complex enough to explore potential Quantum Advantages problems that we can solve more efficiently on a quantum computer than on the world's best supercomputers," writes physicist Jay Gambetta, IBM Fellow and Vice President of IBM Quantum.

It's a bold target to set, considering IBM's biggest quantum computer to date holds just 65 qubits. The company says it plans to have a 127-qubit machine ready in 2021, a 433-qubit one available in 2022, and a computer holding a million qubits at... some unspecified point in the future.

Today's quantum computers require very delicate, ultra-cold setups and are easily knocked off course by almost any kind of atmospheric interference or noise, which is not ideal if you're trying to crunch some numbers on the quantum level.

What having more qubits does is provide better error correction, a crucial process in any computer that makes sure calculations are accurate and reliable, and reduces the impact of interference.

The complex nature of quantum computing means error correction is more of a challenge than normal. Unfortunately, getting qubits to play nice together is incredibly difficult, which is why we're only seeing quantum computers with qubits in the tens right now.

Around 1,000 qubits in total still wouldn't be enough to take on full-scale quantum computing challenges, but it would be enough to maintain a small number of stable, logical qubit systems that could then interact with each other.
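As a rough intuition for why redundancy helps, the sketch below uses a purely classical analogy, a repetition code decoded by majority vote, rather than real quantum error correction; the physical error rate and copy counts are assumed, illustrative values.

```python
# Classical toy analogy: a repetition code with majority-vote decoding shows
# how the "logical" error rate drops as more noisy physical copies are added.
# Real quantum error correction (e.g. surface codes) is far more involved.
from math import comb

def logical_error_rate(p, n):
    """Probability that a majority of n copies flip, if each flips independently with probability p."""
    return sum(comb(n, k) * p**k * (1 - p)**(n - k) for k in range(n // 2 + 1, n + 1))

p = 0.01  # assumed per-copy (physical) error rate, illustrative only
for n in (1, 3, 5, 7, 9):
    print(f"{n} physical copies -> logical error rate {logical_error_rate(p, n):.2e}")
```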

And while it would take more like a million qubits to truly realise the potential of quantum computing, we're seeing steady progress each year, from achieving quantum teleportation between computer chips to simulating chemical reactions.

IBM hopes that by committing itself to these targets, it can better focus its quantum computing efforts, and that other companies working in the same space will know what to expect over the coming years, adding a little bit of certainty to an unpredictable field.

"We've gotten to the point where there is enough aggregate investment going on, that it is really important to start having coordination mechanisms and signaling mechanisms so that we're not grossly misallocating resources and we allow everybody to do their piece," technologist Dario Gil, senior executive at IBM, told TechCrunch.

Read more here:
IBM Just Committed to Having a Functioning 1,000 Qubit Quantum Computer by 2023 - ScienceAlert

NU receives $115 million federal grant to research and develop beyond state-of-the-art quantum computer – Daily Northwestern

Courtesy of James Sauls

SQMS director Anna Grassellino and deputy director James Sauls hold a superconducting radio-frequency cavity at Fermilab. These cavities play a pivotal role in developing quantum technologies.

Grace Wu, Assistant Video Editor | September 23, 2020

The U.S. Department of Energy recently announced that the Superconducting Quantum Materials and Systems Center, a partnership between NU and Fermi National Accelerator Laboratory, was selected as one of five national quantum information science centers.

The grant, as part of the National Quantum Initiative, consists of $115 million to be used over five years. Quantum information science has huge implications for multiple fields, ranging from traditional sciences such as physics and chemistry to industries like finance to national security, due to quantum computers' ability to process computations at never-before-seen speeds.

Quantum computers are able to perform computations at such high speeds because of the nature of quantum bits, or qubits. In classical, or traditional, computers, data is encoded in binary bits, a one or a zero, and the data can only exist in one of these states at a time. Qubits are unique because their atomic systems allow them to exist in both states simultaneously, thus creating exponential computing potential.

Google and IBM have already each built their own quantum computer. At its unveiling in October 2019, Google's Sycamore quantum computer could process 53 qubits, meaning it could represent two to the 53rd power states, or pieces of information, at once. For context, the Sycamore processor performed a computation in 200 seconds that would take the fastest classical computers about 10,000 years.

Out of the five NQI centers, SQMS is the only center that has proposed to build a next-generation, state-of-the-art quantum computer, according to SQMS deputy director and McCormick Prof. James Sauls. The computer is intended to process over 100 qubits, and NU has the technology to engineer it, Sauls added.

"There will be some synergy between these five different centers as we grow, and we figure out what each other are doing," Sauls said. "Another mission that the Department of Energy has is to make the five national centers something bigger than just the sum of each part."

As for NU's partnership with Fermilab in SQMS, research in quantum information science was already underway before the proposal for the DOE grant was submitted, according to McCormick Prof. Peter Voorhees, one of the six materials science and engineering faculty working in the center. Fermilab has some of the world's best superconducting electromagnetic cavities, and Northwestern has already established strength and knowledge in the field of superconductivity and materials science, Voorhees said.

"We've been working on it before, and we were waiting for people to agree with us that it's an important thing to do," Voorhees said. "Between (Fermilab) and Northwestern, this is the place where you want to put your investment."

SQMS has four components, dedicated to developing the technology, building the devices, elaborating on the physics of the sensing, and recruiting young scientists and providing them with research experience and learning in quantum information science, Sauls said.

There are currently 35 people on staff from NU, a number that will easily grow to 50, Sauls said. Faculty from the physics and astronomy, materials science and engineering, and electrical engineering departments will lead research and engineering initiatives. SQMS will also be working in conjunction with Rigetti Computing, National Institute of Standards and Technology and other universities.

In addition to engineering a powerful computer, SQMS will also be engaging in fundamental science research, as the same devices that are used for computing can also be used in detecting particles such as dark matter, Sauls said.

Research funded by the DOE has not yet commenced because the grant hasn't arrived yet at NU, Voorhees said. However, he said he thinks it's going to happen in record time due to the technology's important implications.

"I think it's really fun and exciting to be part of an effort that's truly interdisciplinary," Voorhees said. "(It) involves a national lab (and) people from other disciplines that is in an area that's so exciting and promising for the future."

Email: [emailprotected] | Twitter: @gracewu_10

Related Stories: Northwestern's Center for Molecular Quantum Transduction receives $12.4 million in research funding

Read more:
NU receives $115 million federal grant to research and develop beyond state-of-the-art quantum computer - Daily Northwestern

Extending the life of the qubit | Temple Now – Temple University News

More reliable weather forecasts, new medicines, improved cyber security and data encryption. These are just some of the innovations that may be accomplished through quantum computing.

Now, Temple's Department of Physics has joined a national effort, funded by the U.S. Department of Energy, to build and deploy an advanced quantum computer.

"A quantum computer can solve problems that traditional computers cannot," said Professor of Physics Maria Iavarone, who is leading the department's effort. "It can process an enormous amount of data in a much shorter time, which means that it is possible to solve much more complex problems and handle vastly larger data sets."

One of the biggest barriers to the construction of a quantum computer is the short life span of the information that lives in a qubit, the quantum analog to the traditional computer bit. Today's highest-performing qubits maintain information for, at most, only 100 microseconds, not long enough for some calculations to be performed.
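As a back-of-the-envelope illustration of why 100 microseconds is limiting, the sketch below divides that coherence window by an assumed gate time; the 50-nanosecond figure is a hypothetical value chosen for illustration, not a number from the article.

```python
# Rough arithmetic: how many sequential operations fit inside the coherence window?
coherence_time_s = 100e-6   # ~100 microseconds, as stated above
gate_time_s = 50e-9         # assumed ~50 ns per gate (illustrative)

max_sequential_gates = coherence_time_s / gate_time_s
print(f"Roughly {max_sequential_gates:.0f} sequential gates fit before coherence is lost")
# Deep algorithms need far more operations than this, which is why
# extending qubit lifetimes matters.
```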

"What contributes to a qubit's extremely short life span, or decoherence, in a superconducting quantum system are material defects and imperfections at surfaces and interfaces," explained Iavarone.

Her lab will be using low temperature scanning tunneling microscopy, which enables scientists to understand the electronic properties of materials down to the atomic scale (essentially allowing the human eye to see a single atom) to understand the basic mechanisms of decoherence.

"Professor Iavarone's group has a proven record of success applying scanning tunneling microscopy to superconductors and to other materials," says Jim Napolitano, professor and chair of the Department of Physics. "That expertise will be critical to this national initiative and to developing the next generation of quantum computers."

Read the full story.

Greg Fornia

Read more from the original source:
Extending the life of the qubit | Temple Now - Temple University News

IBM plans to build a 1121 qubit system. What does this technology mean? – The Hindu


Last week, IBM said it will build Quantum Condor, a 1,121-qubit quantum computer, by the end of 2023. The company claims the system can control the behaviour of atoms to run applications and generate world-changing materials to transform industries. IBM says its full-stack quantum computer can be deployed via the cloud, and that it can be programmed from any part of the world.

The technology company is developing a super-fridge, internally codenamed Goldeneye, to house the computer. The 10-foot-tall and 6-foot-wide refrigerator is being designed for a million-qubit system.

What are qubits and quantum computers?

Quantum computers can, for certain classes of problems, process data exponentially faster than personal computers do. They deploy non-intuitive methods, coupled with lots of computing, to solve intractable problems. These machines operate using qubits, similar to bits in personal computers.

The similarity ends there. The way quantum machines solve a problem is very different from how a traditional machine does.

A classical computer works through a problem by brute force: given a command, it attempts every possible move, one after another, turning back at dead ends, until it finds a solution.

Quantum computers deploy superposition to solve problems. This allows them to exist in multiple states, and test all possible ways at once. And qubits, the fundamental units of data in quantum computing, enable these machines to compute this way.

In regular computers, a bit holds either a 0 or a 1, and a pair of bits can take one of four possible combinations: 00, 01, 10, or 11. Only one combination can exist at a single point in time, which limits processing speed.

But in quantum machines, two qubits can represent the same four combinations, and all four can exist at the same time. This helps these systems run faster.

This means that n qubits can represent 2^n states. So, 2 qubits represent 4 states, 3 qubits 8 states, 4 qubits 16 states, and so on. Now imagine the number of states IBM's 1,121-qubit system can represent.
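A quick sketch of that exponential growth; the specific qubit counts below are chosen only for illustration.

```python
# n qubits span 2**n basis states; Python's arbitrary-precision integers
# make the exponential growth easy to see directly.
for n in (2, 3, 4, 53, 1121):
    states = 2 ** n
    print(f"{n:>4} qubits -> {states} basis states ({len(str(states))} decimal digits)")
```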

An ordinary 64-bit computer would take hundreds of years to cycle through these combinations. And that's exactly why quantum computers are being built: to solve intractable problems and break down theories that are practically impossible for classical computers.

To make such large and difficult calculations happen, the qubits need to be linked together through quantum entanglement. This enables qubits at any end of the universe to connect and be manipulated in such a way that no one qubit can be described without referencing the others.
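To give a feel for what "no one qubit can be described without referencing the others" means, here is a minimal, purely illustrative state-vector simulation of a two-qubit Bell pair: sampling joint measurements always yields perfectly correlated outcomes.

```python
# Toy state-vector sketch of two entangled qubits (a Bell pair). Illustrative only.
import numpy as np

rng = np.random.default_rng(1)

# Two-qubit basis order: |00>, |01>, |10>, |11>
bell = np.array([1, 0, 0, 1], dtype=complex) / np.sqrt(2)

probs = np.abs(bell) ** 2                    # measurement probabilities
outcomes = rng.choice(4, size=10, p=probs)   # sample joint measurements
for o in outcomes:
    a, b = divmod(int(o), 2)
    print(f"qubit A = {a}, qubit B = {b}")   # always equal: perfectly correlated
```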

Why are qubits difficult?

One of the key challenges for processing in qubits is the possibility of losing data during transition. Additionally, assembling qubits, writing information to them, and reading it back are difficult tasks.

The fundamental units demand special attention, including near-perfect isolation and a temperature about one hundredth of a degree above absolute zero. Despite strict monitoring, their highly sensitive nature means they can lose superposition from even the slightest variation. This makes programming very tricky.

Since quantum computers are programmed using a sequence of logic gates of various kinds, programmes need to run quickly before qubits lose coherence. The combination of superposition and entanglement makes this process a whole lot harder.

Other companies building quantum computers

There has been a lot of interest in quantum computing in recent times. In 2016, IBM put the first quantum computer in the cloud. Google launched its Sycamore quantum computer last year, and said it was close to achieving quantum supremacy.

This month, IBM released its 65-qubit IBM Quantum Hummingbird processor to IBM Q Network members, and the company is planning to surpass the 100-qubit milestone with its 127-qubit IBM Quantum Eagle processor next year. It is also planning to roll out a 433-qubit IBM Quantum Osprey system in 2022.

D-Wave systems, a Canada-based quantum computing company, launched its cloud service in India and Australia this year. It gives researchers and developers in these two countries real-time access to its quantum computers.

Honeywell recently outlined its quantum system, and other technology companies like Microsoft and Intel are also chasing commercialisation.

The ongoing experiments and analysis speak volumes on how tech companies are viewing quantum computers as the next big breakthrough in computing.

Quantum computers will likely deliver tremendous speed, and will help in solving problems related to optimisation in defence, finance, and other industries.

IBM views the 1000-qubit mark as the point from where the commercialisation of quantum computers can take off.

View original post here:
IBM plans to build a 1121 qubit system. What does this technology mean? - The Hindu

OSTP, NSF, DoE, and IBM make major push to strengthen research in AI and quantum – BlackEngineer.com

Almost a month after the White House Office of Science and Technology Policy, the National Science Foundation, and the Department of Energy announced over $1 billion for the establishment of 12 new artificial intelligence (AI) and quantum information science (QIS) research institutes nationwide, IBM announced its first IBM Quantum education and research initiative for Historically Black Colleges and Universities (HBCUs).

Led by Howard University and 12 additional HBCUs, the IBM-HBCU Quantum Center will offer access to IBM's quantum computers, as well as collaboration on academic, education, and community outreach programs, the statement said.

In addition, as part of the company's continued efforts around diversity and inclusion, IBM will make a $100M investment in technology, assets, resources, and skills development through partnerships with additional HBCUs through the IBM Skills Academy Academic Initiative.

"We believe that in order to expand opportunity for diverse populations, we need a diverse talent pipeline of the next generation of tech leaders from HBCUs. Diversity and inclusion is what fuels innovation, and students from HBCUs will be positioned to play a significant part of what will drive innovations for the future like quantum computing, cloud, and artificial intelligence," said Carla Grant Pickens, Chief Global Diversity & Inclusion Officer, IBM.

The $1 billion announced by the White House Office of Science and Technology Policy, the National Science Foundation (NSF), and the U.S. Department of Energy will go to National Science Foundation-led AI Research Institutes hosted by universities across the country, including at the University of Oklahoma, Norman, University of Texas, Austin, University of Colorado, Boulder, the University of Illinois at Urbana-Champaign, University of California, Davis, and the Massachusetts Institute of Technology.

The 13 HBCUs intending to participate in the Quantum Center were prioritized based on their research and education focus in physics, engineering, mathematics, computer science, and other STEM fields. They include:

Albany State University, Clark Atlanta University, Coppin State University, Hampton University, Howard University, Morehouse College, Morgan State University, North Carolina Agricultural and Technical State University, Southern University, Texas Southern University, University of the Virgin Islands, Virginia Union University, and Xavier University of Louisiana.

"Howard University has prioritized our efforts to support our students' pathway to STEM fields for many years, with exciting results as we witness more and more graduates becoming researchers, scientists, and engineers with renowned national companies. Our faculty and students look forward to collaborating with our peer institutions through the IBM-HBCU Quantum Center. We're excited to share best practices and work together to prepare students to participate in a quantum-ready workforce," said President Wayne A. I. Frederick.

The HBCUs that are part of the Skills Academy Academic Initiative include Clark Atlanta University, Fayetteville State University, Grambling State University, Hampton University, Howard University, Johnson C. Smith University, Norfolk State University, North Carolina A&T State University, North Carolina Central University, Southern University System, Stillman College, Virginia State, and West Virginia State University.

Read more here:
OSTP, NSF, DoE, and IBM make major push to strengthen research in AI and quantum - BlackEngineer.com

Here's why quantum computing is a cat among the pigeons – BusinessLine

Much is made of the virtues of this new technology called quantum computing, which earned itself a special mention and money allocation in Finance Minister Nirmala Sitharaman's budget speech of February.

Quantum computers, fundamentally (and not incrementally) different from conventional computers in the core of their working, will be far, far faster than the most complex (exascale) conventional computers of today.

Good, right? Not quite.

The reason is, quantum computers can pose a grave threat to data security, which has implications not just in privacy and business, but also in defence and national security. Data security today is ensured by encrypting data (sort of jumbling everything up) in a way that can be reversed only at the other end.

But even computers, while encrypting data, follow certain patterns; only, these patterns are so complex that to copy them would take much more computing power than is available today. That is why data is safe today, if at all.

However, when quantum computers enter the fray (which is likely to be soon), they can decode the encryption, no matter how complex. And, there goes your data security.

Many experts have agonised over this problem in the last few years and now, solutions are coming up. How do you tackle a problem posed by quantum technology? By going back to the same quantum technology for a solution.

IIT-Madras-incubated, Bengaluru-based start-up QNu Labs is among the few companies that have developed a solution. "We are both a product and solutions company," says Sunil Gupta, Co-founder & CEO of the company.

The company offers two products for encryption. The quantum random number generator gives complete randomness in encryption keys. Basically, if encryption can be thought of as assigning a number to every data point, then when the assignment of numbers is not random enough and is based on a formula, a fast computer can potentially work out the formula and guess the data. But the quantum random number generator produces truly random numbers, making the data secure.
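The contrast the article draws, keys derived from a formula versus keys drawn from a genuinely unpredictable source, can be sketched in a few lines. This is purely illustrative and is not QNu Labs' product code; the operating system's entropy pool stands in here for what a quantum random number generator would provide in hardware.

```python
# Formula-based vs. entropy-based keys (illustrative sketch only).
import random
import secrets

# Formula-based: anyone who learns the seed can reproduce every "random" key.
seeded = random.Random(42)
predictable_key = seeded.randbytes(16)

# Entropy-based: drawn from an unpredictable source, not reproducible from a seed.
# A hardware quantum RNG plays this role in a QRNG product.
unpredictable_key = secrets.token_bytes(16)

print("formula-based key :", predictable_key.hex())
print("entropy-based key :", unpredictable_key.hex())
```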

Another product is the quantum key distributor, which exchanges cryptographic keys over networks. Essentially, these products make eavesdropping impossible.
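Quantum key distribution schemes in the BB84 family are one common way such a key distributor can work. The toy classical simulation below only illustrates the basis-sifting step and is an assumption-laden sketch, not QNu Labs' protocol or code.

```python
# Toy BB84-style sifting: keep only the positions where the two parties'
# randomly chosen measurement bases happened to match. Classical simulation only.
import secrets

n = 32
alice_bits  = [secrets.randbelow(2) for _ in range(n)]
alice_bases = [secrets.randbelow(2) for _ in range(n)]   # 0 = rectilinear, 1 = diagonal
bob_bases   = [secrets.randbelow(2) for _ in range(n)]

# If Bob measures in Alice's basis he recovers her bit; otherwise his result is random.
bob_bits = [a if ab == bb else secrets.randbelow(2)
            for a, ab, bb in zip(alice_bits, alice_bases, bob_bases)]

# Sifting: both publicly compare bases (never the bits) and keep matching positions.
sift = [i for i in range(n) if alice_bases[i] == bob_bases[i]]
alice_key = [alice_bits[i] for i in sift]
bob_key = [bob_bits[i] for i in sift]

assert alice_key == bob_key   # no noise or eavesdropper in this toy model
print(f"kept {len(sift)} of {n} bits:", alice_key)
```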

Gupta says these two products enable the company to offer complete data encryption services to customers. These find application in transfer of data between data centers, securing access to data in the cloud, securing virtual private networks (VPNs) and securing blockchain transactions. Businesses such as banks, healthcare providers, pharma companies and, importantly, Defence, are potential customers of QNu's services, Gupta told BusinessLine.

For example, the company has provided some details of its service to the defence PSU Bharat Electronics Ltd (BEL). BEL used to courier encryption keys manually to every defence field unit for safety, but this also meant that the keys could not be refreshed frequently enough.

QNu solved this problem by transporting the encryption keys in a secure way over a public network in real time. It estimates a potential saving of ₹100 crore for BEL.

Gupta says the market for quantum encryption is about to boom. One estimate puts it at $25 billion by 2025. The start-up is about to scale up and is in the market to raise funds. "About $5-7 million," Gupta said.

Visit link:
Here's why quantum computing is a cat among the pigeons - BusinessLine

The Hyperion-insideHPC Interviews: ORNL Distinguished Scientist Travis Humble on Coupling Classical and Quantum Computing – insideHPC

Oak Ridge National Lab's Travis Humble has worked at the headwaters of quantum computing research for years. In this interview, he talks about his particular areas of interest, including integration of quantum computing with classical HPC systems. "We've already recognized that we can accelerate solving scientific applications using quantum computers," he says. "These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery."

In This Update. From the HPC User Forum Steering Committee

By Steve Conway and Thomas Gerard

After the global pandemic forced Hyperion Research to cancel the April 2020 HPC User Forum planned for Princeton, New Jersey, we decided to reach out to the HPC community in another way by publishing a series of interviews with leading members of the worldwide HPC community. Our hope is that these seasoned leaders' perspectives on HPC's past, present and future will be interesting and beneficial to others. To conduct the interviews, Hyperion Research engaged insideHPC Media. We welcome comments and questions addressed to Steve Conway, sconway@hyperionres.com or Earl Joseph, ejoseph@hyperionres.com.

This interview is with Travis Humble, Deputy Director at the Department of Energy's Quantum Science Center, a Distinguished Scientist at Oak Ridge National Laboratory, and director of the lab's Quantum Computing Institute. Travis is leading the development of new quantum technologies and infrastructure to impact the DOE mission of scientific discovery through quantum computing. He is editor-in-chief for ACM Transactions on Quantum Computing, Associate Editor for Quantum Information Processing, and co-chair of the IEEE Quantum Initiative. Travis also holds a joint faculty appointment with the University of Tennessee Bredesen Center for Interdisciplinary Research and Graduate Education, where he works with students in developing energy-efficient computing solutions. He received his doctorate in theoretical chemistry from the University of Oregon before joining ORNL in 2005.

The HPC User Forum was established in 1999 to promote the health of the global HPC industry and address issues of common concern to users. More than 75 HPC User Forum meetings have been held in the Americas, Europe and the Asia-Pacific region since the organization's founding in 2000.

Doug Black: Hi, everyone. I'm Doug Black. I'm editor-in-chief at InsideHPC and today we are talking with Dr. Travis Humble. He is a distinguished scientist at Oak Ridge National Lab, where he is director of the lab's Quantum Computing Institute. Dr. Humble, welcome. Thanks for joining us today.

Travis Humble: Thanks for having me on, Doug.

Black: Travis, tell us, if you would, the area of quantum computing that you're working in and the research that you're doing that you're most excited about, that has what you would regard as the greatest potential.

Humble: Quantum computing is a really exciting area, so it's really hard to narrow it down to just one example. This is the intersection of quantum information (quantum mechanics) with computer science.

We've already recognized that we can accelerate solving scientific applications using quantum computers. At Oak Ridge, for example, we have already demonstrated examples in chemistry, material science and high-energy physics, where we can use quantum computers to solve problems in those areas. These demonstrations are just early examples of how we expect quantum computers can take us to the most challenging problems for scientific discovery.

My own research is actually focused on how we could integrate quantum computers with high-performance computing systems. Of course, we are adopting an accelerator model at Oak Ridge, where we are thinking about using quantum processors to offload the most challenging computational tasks. Now, this seems like an obvious approach; the best of both worlds. But the truth is that there are a lot of challenges in bringing those two systems together.

Black: It sounds like sort of a hybrid approach, almost a CPU/GPU, only we're talking about systems writ large. Tell us about DOE's and Oak Ridge's overall quantum strategy and how the Quantum Computing Institute works with vendors and academic institutions on quantum technology development.

Humble: The Oak Ridge National Laboratory has played an important role within the DOE's national laboratory system, a leading role in both research and infrastructure. In 2018, the President announced the National Quantum Initiative, which is intended to accelerate the development of quantum science and technology in the United States. Oak Ridge has taken the lead in the development of research, especially software applications and hardware, for how quantum computing can address scientific discovery.

At the same time, we've helped DOE establish a quantum computing user program; something we call QCUP. This is administered through the Oak Ridge Leadership Computing Facility and it looks for the best of the best in terms of approaches to how quantum computers could be used for scientific discovery. We provide access to the users through the user program in order for them to test and evaluate how quantum computers might be used to solve problems in basic energy science, nuclear physics, and other areas.

Black: Okay, great. So how far would you say we are from practical quantum computing and from what is referred to as quantum advantage, where quantum systems can run workloads faster than conventional or classical supercomputers?

Humble: This is such a great question. Quantum advantage, of course, is the idea that a quantum computer would be able to outperform any other conventional computing system on the planet. Very early in this fiscal year, back in October, there was an announcement from Google where they actually demonstrated an example of quantum advantage using their quantum computing hardware processor. Oak Ridge was part of that announcement, because we used our Summit supercomputer system as the baseline to compare that calculation.

But here's the rub: the Google demonstration was primarily a diagnostic check that their processor was behaving as expected, and the Summit supercomputer actually couldn't keep up with that type of diagnostic check. But when we look at the practical applications of quantum computing, still focusing on problems in chemistry, material science and other scientific disciplines, we appear to still be a few years away from demonstrating a quantum advantage for those applications. This is one of the hottest topics in the field at the moment, though. Once somebody can identify that, we expect to see a great breakthrough in how quantum computers can be used in these practical areas.

Black: Okay. So, how did you become involved in quantum in the first place? Tell us a little bit about your background in technology.

Humble: I started early on studying quantum mechanics through chemistry. My focus, early on in research, was on theoretical chemistry and understanding how molecules behave quantum mechanically. What has turned out to be one of the greatest ironies of my career is that quantum computers are actually significant opportunities to solve chemistry problems using quantum mechanics.

So I got involved in quantum computing relatively early. Certainly, the last 15 years or so have been a roller coaster ride, mainly going uphill in terms of developing quantum computers and looking at the question of how they can intersect with high-performance computing. Being at Oak Ridge, that's just a natural question for me to come across. I work every day with people who are using some of the world's fastest supercomputers in order to solve the same types of problems that we think quantum computers would be best at. So for me, the intersection between those two areas just seems like a natural path to go down.

Black: I see. Are there any other topics related to all this that you'd like to add?

Humble: I think that quantum computing has a certain mystique around it. It's an exciting area and it relies on a field of physics that many people don't yet know about, but I certainly anticipate that in the future that's going to change. This is a topic that is probably going to impact everyone's lives. Maybe it's 10 years from now, maybe it's 20 years, but it's certainly something that I think we should start preparing for in the long term, and Oak Ridge is really happy to be one of the places that is helping to lead that change.

Black: Thanks so much for your time. It was great to talk with you.

Humble: Thanks so much, Doug. It was great to talk with you, too.

More here:
The Hyperion-insideHPC Interviews: ORNL Distinguished Scientist Travis Humble on Coupling Classical and Quantum Computing - insideHPC

Oxford Instruments Partners With The £10 Million Consortium, To Launch The First Commercial Quantum Computer In UK – AZoNano

Oxford Instruments NanoScience announced today it will partner with a £10 million consortium to accelerate the commercialisation of quantum computing in the UK, led by Rigetti Computing. The three-year programme will build and operate the first quantum computer in the UK, make it available over the cloud, and pursue practical applications in machine learning, materials simulation, and finance. The consortium is joined by the University of Edinburgh, quantum software start-up Phasecraft, and Standard Chartered.

Oxford Instruments is delighted to be part of such a strong consortium and play a key role in the development of a quantum computer by providing the latest state-of-the-art Proteox dilution refrigerator for this prestigious project. The company works closely with customers and partners to develop the next generation superconducting and cryogenic solutions for quantum and nanotechnology applications. For quantum technologies, Oxford Instruments offers a wide range of solutions, both for research and commercial exploitation, from the fabrication of qubits to measurement and characterisation, no matter what the customer's qubit strategy might be.

"Our ambition is to be the world's first quantum economy, which could provide UK businesses and industries with billions of pounds worth of opportunities. Therefore, I am delighted that companies across the country will have access to our first commercial quantum computer, to be based in Abingdon. This is a key part of our plan to build back better using the latest technology, attract the brightest and best talent to the UK and encourage world-leading companies to invest here."

Science Minister Amanda Solloway

Many industries central to the UK economy are poised to benefit from quantum computing, including finance, energy, and pharmaceuticals. A recent BCG report projected the global quantum industry to reach £4 billion by 2024.

Rigetti will build the superconducting quantum computer in a Proteox dilution refrigerator provided by Oxford Instruments. The University of Edinburgh will develop new ways of testing quantum hardware and verifying the performance of quantum programs, and will work with Standard Chartered Bank to advance quantum machine learning applications for finance. In addition, Phasecraft will use its deep knowledge of quantum algorithms and high-efficiency quantum software to harness this hardware for near-term applications in materials design, energy, and pharmaceuticals.

In addition to delivering a practical quantum computer in the UK, a key goal of the initiative is to further develop the country's quantum computing talent, infrastructure, and national supply chain, and to advance the high-performance computing industry.

Oxford Instruments' new Proteox dilution refrigerator will be used as the cryogenic platform. This next generation cryo-fridge offers enhanced wiring capacity, multi-experiment capability and improved system modularity. A secondary insert allows pre-testing of the quantum device before fitting it to the main fridge. These features will benefit the collaboration by increasing the number of signal lines and speeding up future upgrades, allowing offline component assembly and characterisation.

"As we are presently facing a second quantum revolution, quantum technologies are triggering a broad range of diverse applications to address significant global challenges. At this disruptive time in the quantum space, this is a great project to be part of and will help in developing innovative solutions to meet these challenges. I am sure this collaboration will open a new future for many more innovative applications, and these applications will require an ecosystem where skills development, design & engineering excellence, and technology partners all combine to enable new discoveries and solutions."

Simon Holden, Managing Director Oxford Instruments NanoScience.

"We are excited to deliver the UK's first quantum computer and help accelerate the development of practical algorithms and applications," said Chad Rigetti, CEO of Rigetti Computing. "By providing access to quantum hardware, the collaboration aims to unlock new capabilities within the thriving UK ecosystem of quantum information science researchers, start-ups, and enterprises who have already begun to explore the potential impact of quantum computing."

The consortium is backed by £10 million of government and industry investment, including funding from the government's Quantum Technologies Challenge, led by UK Research & Innovation.

See the original post:
Oxford Instruments Partners With The £10 Million Consortium, To Launch The First Commercial Quantum Computer In UK - AZoNano

Combinations of new technologies will upend finance – The Australian Financial Review

Banks have sat near the technological frontier for many decades but the maturing of artificial intelligence, cloud computing, distributed ledger technology, the internet of things, virtual reality, 5G networks and quantum computing at similar times will create unprecedented challenges for institutions and their regulators.

The report points to NAB's work to migrate applications into Amazon Web Services, illustrating a broader trend that will see US cloud giants play a more fundamental role in the Australian financial services sector. Bloomberg

The Swiss-based forum, famous for organising the annual Davos shindig, is urging industry leaders and regulators to imagine the outcomes when all of these technologies are combined rather than thinking about them individually. The message is it's the combined impact that matters.

This will undoubtedly be immense; examples already proliferate.

Take Barclays' work with IBM. In a recent trial, the British bank used IBM's cloud-based, seven-qubit quantum computer to speed up transaction settlements during a batch window. Germany's Commerzbank is testing sensors attached to construction cranes to tailor repayments to production levels and help borrowers manage liquidity. Citibank is using Microsoft's augmented reality gear to help its analysts visualise data.

It's becoming clear that the most cutting-edge developments are not coming from consumer-facing fintech applications but back-end processes. IT infrastructure and financial system plumbing can appear boring but the forum suggests the most transformative changes are happening behind the scenes, in the B2B world.

"It's not so much about the threat of competition from Google, Apple and Amazon, but about these companies embedding themselves as enablers for financial institutions to increase value, and this points to the need to create alliances," says Arthur Calipo, who leads the financial services practice in Australia for Deloitte, which worked with the World Economic Forum on the report.

Google and Deutsche Bank signed a 10-year partnership in July for cloud services that includes a co-investment and revenue-sharing deal for new investments. AP

Given their cloud infrastructure, IBM and Microsoft are the other US tech giants that will play a fundamental role in banking as they help link disparate data sources together to create new insights.

These cloud giants will create new tensions. They will be accessible to new competitors, both fintechs and players in other sectors, unencumbered with bureaucracy and legacy bank systems. Traditional industry lines blur.

Incumbents are starting to understand that it's not so much speed and efficiency that will be the source of comparative advantage in the future, but their ability to assemble, to execute and to maintain healthy relationships with these powerful third-party vendors and, of course, with customers.

Banks will have to be fluent in all of these new and emerging technologies to play in the new economy. The ones that will flourish will need to understand what coordinated deployment looks like and develop a powerful innovation strategy around these interactions.

Artificial intelligence and cloud should be the critical anchors in any investment strategy, with other technologies specifically enabled by these, the forum suggests.

Banks all over the world are carefully assessing and developing their relationships with cloud providers. Half of all global banking IT spending is going towards cloud projects, the report says, and deals are getting more sophisticated.

IBM's head of cloud services is briefing local banks and insurers on Friday on how it can help them manage the confluence of technologies. AP

For example, Google and Deutsche Bank announced in July a 10-year cloud partnership that also includes a co-investment strategy for new banking-related technologies and joint product development under a revenue-sharing agreement.

Locally, the forum calls out National Australia Bank's work with Amazon Web Services as a leading example of the migration of applications from legacy systems to the cloud, including NAB's entire foreign exchange platform and data lake, part of its large-scale IT transformation project.

Howard Boville, IBM's new global head of cloud computing, will brief about 20 banking and insurance sector executives on Friday on topics including secured cloud access, advanced automation, artificial intelligence and blockchain, at a Trans-Tasman Business Circle event.

IBM, Amazon, Google and Microsoft are battling it out to be trusted partners for major institutions. The forum's report shows cloud is about far more than moving legacy systems and processes to an external provider, to reduce costs or bolster security.

It's also about access to much more powerful computing services. Think "artificial intelligence-as-a-service" and "quantum-computing-as-a-service", where banks can hire the latest systems to perform whatever functions needed.

Cloud providers could also drive banks' "know-your-client" (KYC) tools, analytics for assessing credit risk, and cyber-security services. "As AI continuously learns from data provided by multiple financial clients, it becomes more powerful than what a single institution could develop," the report says. "Defending against new vulnerabilities will require solutions that are at ecosystem scale."

The World Economic Forum is trying to bring clarity to banks grappling with the impact of new technology. This is the 8th report in its Future of Financial Services series. Bloomberg

While COVID-19 has sucked up the banks' bandwidth this year, the forum suggests banks use it to accelerate the pace of digitisation, pointing to cautious signs of regulatory flexibility towards progressive innovation agendas.

"Conditions for action have never been stronger, giving institutions the licence to pursue these innovation pathways at a pace and sophistication seldom seen before," it says.

The report provides many other pointers to where financial services is heading over the course of this decade. For example, banks will be forced to play a broader role in the digital "ecosystem" beyond finance, such as becoming a "trusted data steward" under open banking in Australia and similar regimes, and earning new revenue streams by confirming customers' digital identity.

White-labelling of products will also become more common, as non-financial players seek to embed financial services in products. An example could be a gig worker application providing short-term loans in the app, based on data generated by the worker. The lender providing the loan might lose a direct customer relationship but could get access to data as a quid pro quo.

The use of sensor technology, real-time distributed ledgers and open data regimes will also see shifts towards continuous assessment of customers, including "just-in-time" lending where business borrowers can tap capital based on a dynamic assessment of their cash flow.

The report points to the birth of outcomes-based investment products, where institutions are paid for delivering a future experience; dynamic life and health insurance, with pricing linked to biometrics; and embedding payments into augmented reality.

Of course, the technological tsunami introduces many new risks, along with plenty of questions on environmental, social and corporate governance. For one, using blockchain and quantum computing consumes a lot of energy.

Deploying AI in a heavily regulated industry like financial services will also inevitably raise many issues, including the need to explain decision making and create "responsible AI" systems, the subject of a report by the forum last year.

The arrival of these new technologies will throw up many regulatory challenges: emerging risks will no longer sit neatly inside a supervised institution but instead could be dispersed across an interconnected set of players, including multinational technology companies and specialised fintechs.

While the consumer-facing fintechs get most of the attention, many start-ups are building applications to facilitate new market entrants enabled by "application programming interfaces" (APIs). For example, among local start-ups, Tic:Toc offers responsible lending-as-a-service; Kyckr provides KYC checks; Modul8 can handle payment card issuing.

It's a world where a new entrant can plug in what they need to service the customer without having to build everything themselves.

As Deloitte's Calipo points out, this creates new, strategic questions for banks. As they turn to big cloud providers, will they also be willing to use a variety of specialist service providers for work traditionally done in-house? Will they be willing to let other companies control customer relationships and supply product in the background, taking a clip of the revenue? And how will they manage the risk of all this?

The forum said its next report in the series will help to answer the third question, by examining how all the new emerging technologies create new sources of risk - but also how they will be able to be used to mitigate them.

Originally posted here:
Combinations of new technologies will upend finance - The Australian Financial Review