Category Archives: Quantum Computer

Physicists Captured The Moment That An Atom Enters Quantum Measurement – Somag News

A team of physicists from Sweden, Germany and Spain managed to document the moments of transition of an electron by taking a series of images of a strontium ion held in an electric field. The scientists' research has attracted a lot of attention, along with comments that the universe we experience in our daily lives is not like what we see when we try to look closely.

Quantum objects are an extraordinary product of physics: they can only be described using sets of probabilities. They all look alike until they are probed with light to determine their specific size and nature.

In the 1940s, the Hungarian-American mathematician John von Neumann proposed that measuring one part of a quantum system, for example the position of an orbiting electron, would be enough to make every property of the system give up its probabilistic nature at the moment of measurement.

Years later, a German theoretical physicist named Gerhart Lüders disagreed with von Neumann's assumption, pointing out that some of a particle's undetermined properties can persist even while others are being pinned down. Although physicists have agreed with Lüders in theory, it is not easy to demonstrate this experimentally, because it requires measuring several naturally occurring properties in such a way that the measurements do not interfere with each other.

The same quantum computer system

The researchers trapped a strontium ion, an atom with one electron removed, and used light to clarify which orbit the remaining outer electron occupied, allowing both possibilities to be observed.

It is actually the same setup that is used in many quantum computers. Quantum computers calculate based on the probability of an object's state before measurement, which means they have an exponentially higher data-processing potential than conventional computers.

Research sheds light on the inner workings of nature

"Every time we measure the orbit of the electron, the answer will be that the electron is in either a lower or a higher orbit, never anything in between," said physicist Fabian Pokorny from the University of Stockholm. "These findings shed new light on the inner workings of nature and are consistent with the predictions of modern quantum physics," said colleague Markus Hennrich, a physics researcher at the University of Stockholm.

The research is not the first experiment to show that an electron's quantum leap is a gradual process, more like the eruption of a volcano than the flick of a switch. However, the way the change was observed here adds some interesting details and allows for unusually clean measurements. Scientists' experiments on the subject continue at full speed.

View original post here:
Physicists Captured The Moment That An Atom Enters Quantum Measurement - Somag News

Top 10 Strategic Technology Breakthroughs That Will Transform Our Lives – Analytics Insight

The world is surrounded by technology: technology that makes our jobs easy, technology that makes our commute easy, technology that makes our communication easy, and so on. Such advancements have become a boon to our lives, easing countless tasks that would conventionally take a long time to complete. Looking back, so many new technologies have taken over the world that it's nearly impossible to list them all at once, and further advancements will impact our lives in ways we cannot yet imagine.

MIT has drafted a list of top 10 strategic technology breakthroughs that will revolutionize our lives in the coming years.

An internet based on quantum physics will soon enable inherently secure communication. A team led by Stephanie Wehner, at Delft University of Technology, is building a network connecting four cities in the Netherlands entirely by means of quantum technology. Messages sent over this network will be unhackable.

The Delft network will be the first to transmit information between cities using quantum techniques from end to end. The technology relies on a quantum behavior of atomic particles called entanglement. Entangled photons can't be covertly read without disrupting their content.

Here's a definition of a hopeless case: a child with a fatal disease so exceedingly rare that not only is there no treatment, there isn't even anyone in a lab coat studying it. Too rare to care, goes the saying.

That's about to change, thanks to new classes of drugs that can be tailored to a person's genes. If an extremely rare disease is caused by a specific DNA mistake, as several thousand are, there's now at least a fighting chance for a genetic fix through hyper-personalized medicine. One such case is that of Mila Makovec, a little girl suffering from a devastating illness caused by a unique genetic mutation, who got a drug manufactured just for her. Her case made the New England Journal of Medicine in October, after doctors moved from a readout of her genetic error to treatment in just a year. They called the drug milasen, after her. The treatment hasn't cured Mila. But it seems to have stabilized her condition: it has reduced her seizures, and she has begun to stand and walk with assistance.

Mila's treatment was possible because creating a gene medicine has never been faster or had a better chance of working. The new medicines might take the form of gene replacement, gene editing, or antisense (the type Mila received), a sort of molecular eraser that erases or fixes erroneous genetic messages. What the treatments have in common is that they can be programmed, in digital fashion and with digital speed, to correct or compensate for inherited diseases, letter for DNA letter.

Last June, Facebook unveiled a global digital currency called Libra. The idea triggered a backlash, and Libra may never launch, at least not in the way it was originally envisioned. But it's still made a difference: just days after Facebook's announcement, an official from the People's Bank of China implied that it would speed the development of its own digital currency in response. Now China is poised to become the first major economy to issue a digital version of its money, which it intends as a replacement for physical cash.

The first wave of a new class of anti-aging drugs has begun human testing. These drugs won't let you live longer (yet) but aim to treat specific ailments by slowing or reversing a fundamental process of aging.

The drugs are called senolytics: they work by removing certain cells that accumulate as we age. Known as senescent cells, they can create low-level inflammation that suppresses normal mechanisms of cellular repair and creates a toxic environment for neighboring cells.

The universe of molecules that could be turned into potentially life-saving drugs is mind-boggling in size: researchers estimate the number at around 10^60. That's more than all the atoms in the solar system, offering virtually unlimited chemical possibilities, if only chemists could find the worthwhile ones.

Now machine-learning tools can explore large databases of existing molecules and their properties, using the information to generate new possibilities. This AI-enabled technology could make it faster and cheaper to discover new drug candidates.

Satellite mega-constellations will beam broadband connections to internet terminals. As long as these terminals have a clear view of the sky, they can deliver the internet to any nearby devices. SpaceX alone wants to send more than 4.5 times as many satellites into orbit this decade as humans have launched since Sputnik.

These mega-constellations are feasible because we have learned how to build smaller satellites and launch them more cheaply. During the space shuttle era, launching a satellite into space cost roughly US$24,800 per pound. A small communications satellite that weighed four tons cost nearly $200 million to fly up.
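As a rough sanity check on those figures, the two numbers quoted above are consistent with each other (a back-of-the-envelope sketch using the article's own values):

```python
# Back-of-the-envelope check of the shuttle-era launch cost quoted above.
cost_per_pound = 24_800          # US$ per pound, shuttle era (figure from the article)
satellite_weight_tons = 4        # small communications satellite
pounds_per_ton = 2_000           # US (short) tons

launch_cost = cost_per_pound * satellite_weight_tons * pounds_per_ton
print(f"Estimated launch cost: ${launch_cost:,.0f}")   # -> $198,400,000, i.e. nearly $200 million
```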

Quantum computers store and process data in a way completely different from the ones we're all used to. In theory, they could tackle certain classes of problems that even the most powerful classical supercomputer imaginable would take millennia to solve, like breaking today's cryptographic codes or simulating the precise behavior of molecules to help discover new drugs and materials.

There have been working quantum computers for several years, but it's only under certain conditions that they outperform classical ones, and in October Google claimed the first such demonstration of quantum supremacy. A computer with 53 qubits (the basic unit of quantum computation) did a calculation in a little over three minutes that, by Google's reckoning, would have taken the world's biggest supercomputer 10,000 years, or 1.5 billion times as long. IBM challenged Google's claim, saying the speedup would be a thousandfold at best; even so, it was a milestone, and each additional qubit will make the computer twice as fast.
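The "1.5 billion times as long" figure follows from simple arithmetic (a sketch; "a little over three minutes" is taken here as roughly 200 seconds, which is an assumption rather than a figure from the article):

```python
# Ratio of the claimed classical runtime (10,000 years) to the quantum runtime (~200 seconds).
seconds_per_year = 365.25 * 24 * 3600
classical_seconds = 10_000 * seconds_per_year
quantum_seconds = 200                      # "a little over three minutes"

speedup = classical_seconds / quantum_seconds
print(f"Speedup factor: {speedup:.2e}")    # ~1.6e9, i.e. on the order of 1.5 billion
```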

AI has a problem: in the quest to build more powerful algorithms, researchers are using ever greater amounts of data and computing power and relying on centralized cloud services. This not only generates alarming amounts of carbon emissions but also limits the speed and privacy of AI applications.

But a countertrend of tiny AI is changing that. Tech giants and academic researchers are working on new algorithms to shrink existing deep-learning models without losing their capabilities. Meanwhile, an emerging generation of specialized AI chips promises to pack more computational power into tighter physical spaces, and train and run AI on far less energy.

In 2020, the US government has a big task: collect data on the country's 330 million residents while keeping their identities private. The data is released in statistical tables that policymakers and academics analyze when writing legislation or conducting research. By law, the Census Bureau must make sure that the data can't lead back to any individuals.

But there are tricks to de-anonymize individuals, especially if the census data is combined with other public statistics.

So the Census Bureau injects inaccuracies, or noise, into the data. It might make some people younger and others older, or label some white people as black and vice versa while keeping the totals of each age or ethnic group the same. The more noise you inject, the harder the de-anonymization becomes.

Differential privacy is a mathematical technique that makes this process rigorous by measuring how much privacy increases when noise is added. The method is already used by Apple and Facebook to collect aggregate data without identifying particular users.
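A minimal sketch of the noise-injection idea, using the standard Laplace mechanism of differential privacy (the counts and the epsilon value below are hypothetical illustrations, not the Census Bureau's actual parameters):

```python
import numpy as np

def laplace_count(true_count, epsilon, sensitivity=1.0):
    """Return a differentially private count: the true count plus Laplace noise.

    Smaller epsilon means more noise and therefore stronger privacy.
    """
    scale = sensitivity / epsilon
    return true_count + np.random.laplace(loc=0.0, scale=scale)

# Hypothetical published table: residents per age bracket in one census block.
true_counts = {"0-17": 412, "18-64": 1530, "65+": 287}
epsilon = 0.5  # illustrative privacy budget

noisy_counts = {k: round(laplace_count(v, epsilon)) for k, v in true_counts.items()}
print(noisy_counts)  # e.g. {'0-17': 409, '18-64': 1534, '65+': 291}
```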

Ten days after Tropical Storm Imelda began flooding neighborhoods across the Houston area last September, a rapid-response research team announced that climate change almost certainly played a role.

The group, World Weather Attribution, had compared high-resolution computer simulations of worlds where climate change did and didn't occur. In the former, the world we live in, the severe storm was as much as 2.6 times more likely, and up to 28% more intense.

Earlier this decade, scientists were reluctant to link any specific event to climate change. But many more extreme-weather attribution studies have been done in the last few years, and rapidly improving tools and techniques have made them more reliable and convincing.

This has been made possible by a combination of advances. For one, the lengthening record of detailed satellite data is helping us understand natural systems. Also, increased computing power means scientists can create higher-resolution simulations and conduct many more virtual experiments.

These and other improvements have allowed scientists to state with increasing statistical certainty that yes, global warming is often fueling more dangerous weather events.

By disentangling the role of climate change from other factors, the studies are telling us what kinds of risks we need to prepare for, including how much flooding to expect and how severe heatwaves will get as global warming becomes worse. If we choose to listen, they can help us understand how to rebuild our cities and infrastructure for a climate-changed world.

Read the original:
Top 10 Strategic Technology Breakthroughs That Will Transform Our Lives - Analytics Insight

Particle accelerator technology could solve one of the most vexing problems in building quantum computers – Fermi National Accelerator Laboratory

Last year, researchers at Fermilab received over $3.5 million for projects that delve into the burgeoning field of quantum information science. Research funded by the grant runs the gamut, from building and modeling devices for possible use in the development of quantum computers to using ultracold atoms to look for dark matter.

For their quantum computer project, Fermilab particle physicist Adam Lyon and computer scientist Jim Kowalkowski are collaborating with researchers at Argonne National Laboratory, where they'll be running simulations on high-performance computers. Their work will help determine whether instruments called superconducting radio-frequency cavities, also used in particle accelerators, can solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits.

"Fermilab has pioneered making superconducting cavities that can accelerate particles to an extremely high degree in a short amount of space," said Lyon, one of the lead scientists on the project. "It turns out this is directly applicable to a qubit."

Researchers in the field have worked on developing successful quantum computing devices for the last several decades; so far, it's been difficult. This is primarily because quantum computers have to maintain very stable conditions to keep qubits in a quantum state called superposition.

Superconducting radio-frequency cavities, such as the one seen here, are used in particle accelerators. They can also solve one of the biggest problems facing the successful development of a quantum computer: the decoherence of qubits. Photo: Reidar Hahn, Fermilab

Superposition

Classical computers use a binary system of 0s and 1s called bits to store and analyze data. Eight bits combined make one byte of data, which can be strung together to encode even more information. (There are about 31.8 million bytes in the average three-minute digital song.) In contrast, quantum computers aren't constrained by a strict binary system. Rather, they operate on a system of qubits, each of which can take on a continuous range of states during computation. Just as an electron orbiting an atomic nucleus doesn't have a discrete location but rather occupies all positions in its orbit at once in an electron cloud, a qubit can be maintained in a superposition of both 0 and 1.

Since there are two possible states for any given qubit, a pair doubles the amount of information that can be manipulated: 2^2 = 4. Use four qubits, and that amount of information grows to 2^4 = 16. With this exponential increase, it would take only 300 entangled qubits to encode more information than there is matter in the universe.
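The doubling described above is easy to verify: n qubits span 2^n basis states, so the count grows exponentially (a small illustrative calculation, not tied to any particular quantum device):

```python
# Number of basis states (amplitudes) spanned by n qubits: 2**n.
for n in (1, 2, 4, 10, 50, 300):
    states = 2 ** n
    print(f"{n:>3} qubits -> 2^{n} = {states:.3e} basis states")
# 300 qubits already give ~2e90 basis states, far more than common estimates
# of the number of atoms in the observable universe.
```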

Qubits can be in a superposition of 0 and 1, while classical bits can be only one or the other. Image: Jerald Pinson

Parallel positions

Qubits don't represent data in the same way as bits. Because qubits in superposition are both 0 and 1 at the same time, they can similarly represent all possible answers to a given problem simultaneously. This is called quantum parallelism, and it's one of the properties that makes quantum computers so much faster than classical systems.

The difference between classical computers and their quantum counterparts could be compared to a situation in which there is a book with some pages randomly printed in blue ink instead of black. The two computers are given the task of determining how many pages were printed in each color.

"A classical computer would go through every page," Lyon said. Each page would be marked, one at a time, as either being printed in black or in blue. A quantum computer, instead of going through the pages sequentially, would go through them all at once.

Once the computation was complete, a classical computer would give you a definite, discrete answer. If the book had three pages printed in blue, that's the answer you'd get.

"But a quantum computer is inherently probabilistic," Kowalkowski said.

This means the data you get back isn't definite. In a book with 100 pages, the data from a quantum computer wouldn't be just three. It also could give you, for example, a 1 percent chance of having three blue pages or a 1 percent chance of 50 blue pages.

An obvious problem arises when trying to interpret this data. A quantum computer can perform incredibly fast calculations using parallel qubits, but it spits out only probabilities, which, of course, isn't very helpful, unless, that is, the right answer could somehow be given a higher probability.

Interference

Consider two water waves that approach each other. As they meet, they may constructively interfere, producing one wave with a higher crest. Or they may destructively interfere, canceling each other so that there's no longer any wave to speak of. Qubit states can also act as waves, exhibiting the same patterns of interference, a property researchers can exploit to identify the most likely answer to the problem they're given.

"If you can set up interference between the right answers and the wrong answers, you can increase the likelihood that the right answers pop up more than the wrong answers," Lyon said. "You're trying to find a quantum way to make the correct answers constructively interfere and the wrong answers destructively interfere."

When a calculation is run on a quantum computer, the same calculation is run multiple times, and the qubits are allowed to interfere with one another. The result is a distribution curve in which the correct answer is the most frequent response.
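A toy illustration of that idea is Grover-style amplitude amplification, where repeated interference steps boost the probability of the marked ("correct") answer. The sketch below simulates the amplitudes directly with numpy; it is not code for real quantum hardware, and the search space size and marked index are arbitrary choices:

```python
import numpy as np

n_items = 16                 # search space of 16 possible answers (4 qubits)
marked = 11                  # index of the "correct" answer

# Start in a uniform superposition: every answer equally likely.
state = np.full(n_items, 1 / np.sqrt(n_items))

iterations = int(round(np.pi / 4 * np.sqrt(n_items)))  # ~3 iterations for 16 items
for _ in range(iterations):
    state[marked] *= -1                      # oracle: flip the sign of the marked amplitude
    state = 2 * state.mean() - state         # diffusion: reflect about the mean (interference)

probabilities = state ** 2
print(f"P(correct answer) after {iterations} iterations: {probabilities[marked]:.3f}")
# The marked answer's probability rises from 1/16 to roughly 0.96.
```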

When waves meet, they may constructively interfere, producing one wave with a higher crest. Image: Jerald Pinson

Waves may also destructively interfere, canceling each other so that theres no longer any wave to speak of. Image: Jerald Pinson

Listening for signals above the noise

In the last five years, researchers at universities, government facilities and large companies have made encouraging advancements toward the development of a useful quantum computer. Last year, Google announced that it had performed calculations on its quantum processor, called Sycamore, in a fraction of the time it would have taken the world's largest supercomputer to complete the same task.

Yet the quantum devices that we have today are still prototypes, akin to the first large vacuum tube computers of the 1940s.

"The machines we have now don't scale up much at all," Lyon said.

There are still a few hurdles researchers have to overcome before quantum computers become viable and competitive. One of the largest is finding a way to keep delicate qubit states isolated long enough for them to perform calculations.

If a stray photon (a particle of light) from outside the system were to interact with a qubit, its wave would interfere with the qubit's superposition, essentially turning the calculations into a jumbled mess, a process called decoherence. While the refrigerators that house these devices do a moderately good job at keeping unwanted interactions to a minimum, they can do so only for a fraction of a second.

"Quantum systems like to be isolated," Lyon said, "and there's just no easy way to do that."

When a quantum computer is operating, it needs to be placed in a large refrigerator, like the one pictured here, to cool the device to less than a degree above absolute zero. This is done to keep energy from the surrounding environment from entering the machine. Photo: Reidar Hahn, Fermilab

Which is where Lyon and Kowalkowski's simulation work comes in. If the qubits can't be kept cold enough to maintain an entangled superposition of states, perhaps the devices themselves can be constructed in a way that makes them less susceptible to noise.

It turns out that superconducting cavities made of niobium, normally used to propel particle beams in accelerators, could be the solution. These cavities need to be constructed very precisely and operate at very low temperatures to efficiently propagate the radio waves that accelerate particle beams. Researchers theorize that by placing quantum processors in these cavities, the qubits will be able to interact undisturbed for seconds rather than the current record of milliseconds, giving them enough time to perform complex calculations.

Qubits come in several different varieties. They can be created by trapping ions within a magnetic field or by using nitrogen atoms surrounded by the carbon lattice formed naturally in crystals. The research at Fermilab and Argonne will be focused on qubits made from photons.

Lyon and his team have taken on the job of simulating how well radio-frequency cavities are expected to perform. By carrying out their simulations on high-performance computers, known as HPCs, at Argonne National Laboratory, they can predict how long photon qubits can interact in this ultralow-noise environment and account for any unexpected interactions.

Researchers around the world have used open-source software for desktop computers to simulate different applications of quantum mechanics, providing developers with blueprints for how to incorporate the results into technology. The scope of these programs, however, is limited by the amount of memory available on personal computers. In order to simulate the exponential scaling of multiple qubits, researchers have to use HPCs.
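The memory wall mentioned above comes from having to store one complex amplitude per basis state, i.e. 2^n amplitudes for n qubits. A rough sketch, assuming 16-byte complex numbers (real simulators vary in their storage formats):

```python
def statevector_memory_gib(n_qubits, bytes_per_amplitude=16):
    """Memory needed to hold a full n-qubit state vector, in GiB."""
    return (2 ** n_qubits) * bytes_per_amplitude / 2**30

for n in (20, 30, 40, 50):
    print(f"{n} qubits: {statevector_memory_gib(n):,.3f} GiB")
# 20 qubits fit easily on a laptop (~0.016 GiB); 50 qubits need about
# 16 million GiB, which is why full simulations move to HPC systems.
```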

"Going from one desktop to an HPC, you might be 10,000 times faster," said Matthew Otten, a fellow at Argonne National Laboratory and collaborator on the project.

Once the team has completed their simulations, the results will be used by Fermilab researchers to help improve and test the cavities for acting as computational devices.

"If we set up a simulation framework, we can ask very targeted questions on the best way to store quantum information and the best way to manipulate it," said Eric Holland, the deputy head of quantum technology at Fermilab. "We can use that to guide what we develop for quantum technologies."

This work is supported by the Department of Energy Office of Science.

Original post:
Particle accelerator technology could solve one of the most vexing problems in building quantum computers - Fermi National Accelerator Laboratory

New Intel chip could accelerate the advent of quantum computing – RedShark News

The marathon to achieve the promise of quantum computers has edged a few steps forward as Intel unveils a new chip capable, it believes, of accelerating the process.

Called Horse Ridge, and named after one of the coldest places in Oregon, the system-on-chip can control a total of 128 qubits (quantum bits), which is more than double the number of qubits Intel heralded in its Tangle Lake test chip in early 2018.

While companies like IBM and Microsoft have been leapfrogging each other with systems capable of handling ever greater numbers of qubits, the breakthrough in this case appears to be the ability to build more efficient quantum computers by allowing one chip to handle more control tasks. It is therefore a step toward moving quantum computing out of the lab and into real commercial viability.

Applying quantum computing to practical problems hinges on the ability to scale, and control, thousands of qubits at the same time with high levels of fidelity. Intel suggests Horse Ridge greatly simplifies current complex electronics required to operate a quantum system.

To recap why this is important, let's take it as read that quantum computing has the potential to tackle problems conventional computers can't by leveraging a phenomenon of quantum physics: qubits can exist in multiple states simultaneously. As a result, they are able to conduct a large number of calculations at the same time.

This can dramatically speed up complex problem-solving from years to a matter of minutes. But in order for these qubits to do their jobs, hundreds of connective wires have to be strung into and out of the cryogenic refrigerator where quantum computing occurs (at temperatures colder than deep space).

The extensive control cabling for each qubit drastically hinders the ability to control the hundreds or thousands of qubits that will be required to demonstrate quantum practicality in the lab, not to mention the millions of qubits that will be required for a commercially viable quantum solution in the real world.

Researchers outlined the capability of Horse Ridge in a paper presented at the 2020 International Solid-State Circuits Conference in San Francisco and co-written by collaborators at Dutch institute QuTech.

The integrated SoC design is described as being implemented using Intel's 22nm FFL (FinFET Low Power) CMOS technology and integrates four radio-frequency channels into a single device. Each channel is able to control up to 32 qubits by leveraging frequency multiplexing, a technique that divides the total available bandwidth into a series of non-overlapping frequency bands, each of which is used to carry a separate signal.

With these four channels, Horse Ridge can potentially control up to 128 qubits with a single device, substantially reducing the number of cables and rack instrumentations previously required.
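A sketch of the frequency-multiplexing idea: a single channel's bandwidth is split into non-overlapping bands, one per qubit. The band edges, counts, and 4-8 GHz range below are illustrative assumptions only, not Horse Ridge's actual specifications:

```python
def allocate_bands(low_ghz, high_ghz, n_qubits):
    """Split [low_ghz, high_ghz] into n_qubits equal, non-overlapping bands."""
    width = (high_ghz - low_ghz) / n_qubits
    return [(low_ghz + i * width, low_ghz + (i + 1) * width) for i in range(n_qubits)]

channels = 4
qubits_per_channel = 32
bands = allocate_bands(4.0, 8.0, qubits_per_channel)   # hypothetical 4-8 GHz channel

print(f"Qubits controlled by one device: {channels * qubits_per_channel}")  # 128
print(f"Band for qubit 0: {bands[0][0]:.3f}-{bands[0][1]:.3f} GHz")
```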

The paper goes on to argue that increases in qubit count trigger other issues that challenge the capacity and operation of the quantum system. One such potential impact is a decline in qubit fidelity and performance. In developing Horse Ridge, Intel optimised the multiplexing technology that enables the system to scale and reduce errors from crosstalk among qubits.

"While developing control systems isn't, evidently, as hype-worthy as the increase in qubit count has been, it is a necessity," says Jim Clarke, director of quantum hardware, Intel Labs. "Horse Ridge could take quantum practicality to the finish line much faster than is currently possible. By systematically working to scale to the thousands of qubits required for quantum practicality, we're continuing to make steady progress toward making commercially viable quantum computing a reality in our future."

Intels own research suggests it will most likely take at least thousands of qubits working reliably together before the first practical problems can be solved via quantum computing. Other estimates suggest it will require at least one million qubits.

Intel is exploring silicon spin qubits, which have the potential to operate at temperatures as high as 1 kelvin. This research paves the way for integrating silicon spin qubit devices and the cryogenic controls of Horse Ridge to create a solution that delivers the qubits and controls in one package.

Quantum computer applications are thought to include drug development (high on the world's list of priorities just now), logistics optimisation (that is, finding the most efficient way from any number of possible travel routes) and natural disaster prediction.

Follow this link:
New Intel chip could accelerate the advent of quantum computing - RedShark News

Top 10 breakthrough technologies of 2020 – TechRepublic

Between tiny AI and unhackable internet, this decade's tech trends will revolutionize the business world.

MIT Technology Review unveiled its top 10 breakthrough technology predictions on Wednesday. The trends--which include hype-inducing tech like quantum computing and unhackable internet--are expected to become realities in the next decade, changing the enterprise and world.

SEE: Internet of Things: Progress, risks, and opportunities (free PDF) (TechRepublic)

While many of the trends have a more scientific background, most can also apply to business, said David Rotman, editor at MIT Technology Review.

"Even though some of these sound science-y or research-y, all really do have important implications and business impacts. [For example], unhackable internet," Rotman said. "It's early, but we can all see why that would be a big deal.

"Digital money will change how we do commerce; satellite mega constellations will potentially change how we do communications and the price of communications," Rotman added.The methodology behind determining the breakthrough technologies focused on what writers, editors, and journalists have been reporting on in the past year. All of the technologies are still being developed and improved in labs, Rotman said.

The MIT Technology Review outlined the following 10 most exciting technologies being created and deployed in the next 10 years.

One of the most exciting technologies of the bunch, according to Rotman, quantum supremacy indicates that quantum computers are not only becoming a reality, but the functionality is becoming even more advanced. Murmurs of quantum computer development have floated around the enterprise. The technology is able to process massive computational solutions faster than any supercomputer.

While this form of computing hasn't been widely used yet, it will not only be usable by 2030, but possibly reach quantum supremacy, MIT found.

"Quantum supremacy is the point where a quantum computer can do something that a classical conventional computer cannot do or take hundreds of years for a classical computer to do," Rotman said.

The technology is now getting to the point where people can test them in their businesses and try different applications, and will become more popular in the coming years, Rotman said.

Quantum computers are especially useful for massive scheduling or logistical problems, which can be particularly useful in large corporations with many moving parts, he added.

"Satellites have become so small and relatively cheap that people are sending up whole clusters of these satellites," Rotman said. "It's going to have an enormous impact on communication and all the things that we rely on satellites for."

These satellites could be able to cover the entire globe with high-speed internet. Applications of satellite mega-constellation use are currently being tested by companies including SpaceX, OneWeb, Amazon, and Telesat, according to the report.

Another interesting, and surprising, technology in the study concerned tiny AI. The surprising nature of this comes with how quickly AI is growing, Rotman said.

Starting in the present day, AI will become even more functional, independently running on phones and wearables. This ability would prevent devices from needing the cloud to use AI-driven features, Rotman said.

"It's not just a first step, but it would be an important step in speeding up the search for new drugs," Rotman said.

Scientists have used AI to find drug-like compounds with specific desirable characteristics. In the next three to five years, new drugs might be able to be commercialized for far less than the roughly $2.5 billion it currently takes to commercialize a new drug, the report found.

Researchers are now able to detect climate change's role in extreme weather conditions. With this discovery, scientists can help people better prepare for severe weather, according to the report.

In less than five years, researchers will find drugs that treat ailments based on the body's natural aging process, the report found. Potentially, diseases including cancer, heart disease and dementia could be treated by slowing age.

Within five years, the internet could be unhackable, the report found.

Researchers are using quantum encryption to try and make an unhackable internet, which is particularly important as data privacy concerns heighten, Rotman said.

Digital money, also known as cryptocurrency, will become more widely used in 2020. However, the rise of this money will also have major impacts on financial privacy, as the need for an intermediary becomes less necessary, according to the report.

Occupying three trends on the list, medicine is proving to potentially be a huge area for innovation. Currently, doctors and researchers are designing novel drugs to treat unique genetic mutations. These specialized drugs could cure some ailments that were previously incurable, the report found.

Differential privacy is a technique currently being used by the US government as it collects data for the 2020 census. The US Census Bureau faces challenges keeping the data it collects private, but this tactic helps to anonymize the data, and other countries may also adopt it, according to the report.

For more, check out Forget quantum supremacy: This quantum-computing milestone could be just as important on ZDNet.

Read the original post:
Top 10 breakthrough technologies of 2020 - TechRepublic

21st ISQED Conference to Commence With Focus on Quantum Computing, Security, and AI/ML & Electronic Design – PRNewswire

SANTA CLARA, Calif., Feb. 25, 2020 /PRNewswire/ -- The 21st annual International Symposium on Quality Electronic Design (ISQED'20) will commence on March 25 with a special focus on Quantum Computing, Security, and AI/ML & Electronic Design. The premier electronic design quality conference recently announced its 2020 program, consisting of talks by experts that cover multiple topics related to electronic design and semiconductor technology.

"The industry has slowly started to realize the importance of the concept of quality in electronic design and the hypercritical role it plays in the creation of secure, reliable, manufacturable, and user-friendly circuits and systems,"said Dr. Ali Iranmanesh, the conference founder and president of the International Society for Quality in Electronic Design. "Prior to the inception of ISQED in 1998, the lexicon of technical terminologies hardly contained any combination of 'Quality,' 'Design,' and 'Electronic' words. Terms such as 'Quality Electronic Design,' 'Quality in Electronic Design,' etc. could not be found in any Internet search. Then, the concept of 'Quality' in the design of integrated circuits and systems was a foreign concept that confused even ardent industry practitioners."

ISQED convenes Wednesday, March 25, through Thursday, March 26, 2020, at Santa Clara Convention Center, Santa Clara, CA. The event includes free admission to keynote presentations. For information and registration visit http://www.isqed.org.

Conference Highlights

ISQED features twenty technical sessions with nearly 100 peer-reviewed papers, as well as keynotes, invited speeches, and embedded tutorials, all with a focus on the latest innovations and developments in electronic design and semiconductor technology. A few conference highlights are as follows:

Keynote Speaker

Security as the Enabler of Quality Electronics

Dr. Chi-Foon Chan, President and co-CEO, Synopsys

Re-Engineering Computing with Neuro-Inspired Learning: Devices, Circuits, and Systems

Prof. Kaushik Roy - Edward G. Tiedemann Jr. Distinguished Professor, Purdue University

Semiconductors for and by AI

Anwar Awad - Vice President, Infrastructure and Platform Solutions Group, General Manager, Mixed-Signal IP Solution Group, Intel

Active Learning for Fast, Comprehensive SPICE Verification

Jeff Dyck, Director of Engineering - Mentor, a Siemens Business

Spintronic Devices for Memory, Logic, and Neuromorphic Computing

Joseph S. Friedman - Assistant Professor, Director of the NeuroSpinCompute Laboratory, Department of Electrical & Computer Engineering, The University of Texas at Dallas

Panel Discussion

Driving forward: Is autonomous vehicle development heading towards a crash?

Panelists: Nirmal R. Saxena - NVIDIA

Jan-Philipp Gehrmann - NXP

Burkhard Huhnke - Synopsys

Vaibhav Garg - Texas Instruments

Lee Harrison - Mentor, A Siemens Business

Embedded Tutorials

Abundant-Data Computing: The N3XT 1,000X

Prof. Subhasish Mitra, Stanford University

Energy-efficient Secure Circuits for Entropy Generation & Cryptography

Dr. Sanu Mathew, Intel

EDA for Quantum Computing

Dr. Leon Stok, IBM Corp., Poughkeepsie, NY

Bitcoin Demystified: Disrupting Technology or Mafia Haven?

Dr. Eric Peeters, Texas Instruments

"We are pleased to see an increase in the number of papers submitted to the conference this year,"said Steven Heinrich-Barna, ISQED'20 General Chair. "The two-day technical program with four parallel sessions packs over 80 peer-reviewed papers, highlighting the latest trends in electronic circuit and system design & automation, testing, verification, sensors, security, semiconductor technologies, cyber-physical systems, etc."

About ISQED

The 21st International Symposium on Quality Electronic Design (ISQED'20) is the premier interdisciplinary and multidisciplinary Electronic Design conference. ISQED'20 is held with the technical sponsorship of IEEE CASS, IEEE EDS, and IEEE Reliability Society. ISQED Corporate sponsors are Synopsys and Mentor, a Siemens Business. Additional technical support has been provided by Innovotek and Silicon Valley Polytechnic Institute.

Editorial Contact:

Lana Dunn lanad@calpt.com

All trademarks and tradenames are the property of their respective owners.

SOURCE ISQED

https://www.isqed.org

Read the original here:
21st ISQED Conference to Commence With Focus on Quantum Computing, Security, and AI/ML & Electronic Design - PRNewswire

NTT Research to Collaborate with UCLA and Georgetown on Cryptography and Blockchain – Yahoo Finance

NTT Research CIS Lab Focused on Information Security Announces Two Joint Research Agreements

NTT Research, Inc., a division of NTT (TYO:9432), today announced that its Cryptography and Information Security (CIS) Lab has reached joint research agreements with the University of California, Los Angeles (UCLA) and Georgetown University. The five-year agreement with UCLA covers research on the theoretical aspects of cryptography, and the three-year agreement with Georgetown University will take advantage of a global scale testbed for research into blockchain. Both involve the use of mathematical theory to prove security levels and enable greater system reliability.

One of three divisions at NTT Research, the CIS Lab is engaged in basic research of cryptography with the potential for long-term impact. Directed by NTT Fellow Tatsuaki Okamoto, the CIS Lab is focused on foundational research problems in cryptography and blockchain. Dr. Okamoto, a renowned expert in cryptography, will supervise the research. NTT Research Distinguished Scientist Brent Waters, who heads the CIS Lab's cryptography research group, will be involved in the collaboration with UCLA. The principal investigator at UCLA is Dr. Amit Sahai, professor of computer science at the Samueli School of Engineering. Dr. Shinichiro Matsuo, a recognized global blockchain expert and research professor at Georgetown University who heads the blockchain research group at Cyber SMART at Georgetown and the CIS Lab's blockchain group at NTT Research, will be involved in the joint research on blockchain. Cyber SMART is the new cross-discipline cyber research center at Georgetown that adheres to the standards and requirements of the National Science Foundation's Industry-University Cooperative Research Centers (IUCRC) Program.

"These agreements reflect our commitment to engage and work with the strongest and most dedicated researchers, as well as our focus on foundational research problems," said CIS Lab Director Okamoto. "Our collaboration with UCLA will complement the important basic research that Brent Waters has undertaken, and our planned work with Georgetown is a good example of our openness to exploring security in relatively new use case and test scenarios."

The scope of work for the five-year agreement with UCLA covers advanced secure cryptosystems, secure protocols, new sources of hardness, and mathematical foundations of cryptography. In addition to his role as professor of computer science, Dr. Sahai is also director of the Center for Encrypted Functionalities at UCLA. Professor Sahai has published more than 100 original technical research papers and has contributed in the areas of obfuscation, functional encryption, zero-knowledge proofs and secure multi-party computation.

"NTT's commitment to fundamental research is evident in their generous support, and we are very grateful that they share our vision," said Professor Sahai. "This grant will enable our explorations of the boundary between the possible and the impossible with regards to cryptography. Well be able to answer difficult questions, and then turn that new knowledge into innovative applications in information security."

NTT Research actively explores opportunities to work with experts in its three fields of study. Last fall it entered an Industrial Partnership between its CIS Lab and the Simons Institute for the Theory of Computing at UC Berkeley; set up joint research agreements between its Physics and Informatics (PHI) Lab and six universities (CalTech, Cornell, Michigan, MIT, Stanford and Swinburne), one US federal agency (NASA's Ames Research Center) and one private quantum computing software company (1QBit); and reached a joint research agreement between its Medical and Informatics (MEI) Lab and the Technical University of Munich (TUM).

About NTT Research

NTT Research opened its Palo Alto offices in July 2019 as a new Silicon Valley startup to conduct basic research and advance technologies that promote positive change for humankind. Currently, three labs are housed at NTT Research: the Physics and Informatics (PHI) Lab, the Cryptography and Information Security (CIS) Lab, and the Medical and Health Informatics (MEI) Lab. The organization aims to upgrade reality in three areas: 1) quantum information, neuro-science and photonics; 2) cryptographic and information security; and 3) medical and health informatics. NTT Research is part of NTT, a global technology and business solutions provider with an annual R&D budget of $3.6 billion.

Go here to read the rest:
NTT Research to Collaborate with UCLA and Georgetown on Cryptography and Blockchain - Yahoo Finance

Should decision makers be concerned by the threat of quantum? – Information Age

Executives are wary about the possible threat of quantum, but how concerned should we be, and when should we start getting worried?

There is still room for quantum computing to advance.

The IT decision makers that we interviewed for our 2019 Post Quantum Crypto survey registered concern. Over half (55%) of them said that today it was a somewhat to extremely large threat, while others looked towards a darker horizon, with 71% saying that if it isn't a threat today, it will definitely become one in the future.

When will tomorrow come, and how far away do quantum threats loom? Estimates differ as to when quantum computing will be available: some say 5 years, some say 10, and others say 25. The estimates go up and up. One thing is clear: the timer has started.

In late 2018, researchers at the University of Munich proved that quantum computers have an edge over classical computers by developing a quantum circuit that can solve problems that were comparatively unattainable for a classical computer. 2019 was a banner year for quantum. IBM kicked it off by revealing Q System One, the world's first commercial quantum computer.

Later that year, Google announced that its current quantum project had begun to solve problems which were impossible for classical computers and reached quantum supremacy. So, the race to fulfil the potential of quantum computing has begun in earnest, and for all of the bountiful goods that it can provide the world, it can also pose considerable threats.

Quantum computing will be one of the defining technologies that will emerge over the next five to ten years, according to Chris Lloyd-Jones from Avanade. Read here

Quantum will break much of the encryption that underpins the modern internet. That's at least what the US National Institute of Standards and Technology says.

While classical computing speaks in bits, a language composed of 1s and 0s, quantum computing speaks in qubits. Like a normal bit, a qubit can be a 1 or a 0, or it can be in an indeterminate superposition of both. It's that seemingly small difference which makes quantum, well, the quantum leap that it is.

This brings us to the 2048-bit RSA key, the minimum recommended key length used to protect computer systems. Using classical computing, DigiCert has predicted that it would take several quadrillion years to defeat such a key. By comparison, the right quantum computer could break one in a matter of months.

The computer that can beat RSA or elliptic-curve cryptography, the algorithms on which internet security relies, has not yet been built. We are still on the first generation of quantum computers.
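To see why the key length matters, here is a toy RSA example with deliberately tiny primes; the security of a real 2048-bit key rests on the fact that factoring its modulus is infeasible classically, which is exactly the step a large quantum computer running Shor's algorithm would shortcut. This is a sketch for illustration only, never for real use:

```python
# Toy RSA with tiny primes (insecure by design, purely illustrative).
p, q = 61, 53
n = p * q                      # public modulus (3233); real keys use ~2048-bit n
phi = (p - 1) * (q - 1)
e = 17                         # public exponent
d = pow(e, -1, phi)            # private exponent (modular inverse, Python 3.8+)

message = 1234
ciphertext = pow(message, e, n)
recovered = pow(ciphertext, d, n)
assert recovered == message

# An attacker who can factor n recovers p and q, and with them the private key d.
# Classically that is hopeless for a 2048-bit n; Shor's algorithm on a large
# quantum computer would make the factoring step feasible.
print(ciphertext, recovered)
```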

Quantum Computing will render much of todays encryption unsafe, says Dr. Andrew Shields, Cambridge Research Laboratory of Toshiba Research Europe, but Quantum Cryptography could be the solution. Read here

In January last year, the US National Academy of Sciences released a report entitled Quantum Computing: Progress and Prospects, which said that the computer that can do this must be five orders of magnitude larger and requires technological advances that have not yet been made.

However, quantum computing, and thus quantum threats, have been proved possible in the last few years, and everyone is betting big on it. According to Gartner, 20% of all companies will be investing in quantum in the next five years. So, when do organisations need to start preparing?

Michele Mosca, co-founder of the Institute for Quantum Computing, devised a formula for organisations to determine when they have to start transitioning to quantum-safe algorithms:

D + T > Qc

D represents how long a piece of data needs to remain secret; T represents how long it will take for all systems to become quantum-safe, and Qc is how long before a quantum threat arrives.

If Qc turns out to be less than the sum of D and T, then an organisation is vulnerable. Establishing the values of D and T will be a more difficult task, but it sets out a useful frame of reference for quantum preparation.
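A minimal sketch of Mosca's check, with purely hypothetical values for D, T and Qc (every organisation would have to plug in its own estimates):

```python
def quantum_vulnerable(data_secrecy_years, migration_years, years_until_quantum_threat):
    """Mosca's rule of thumb: vulnerable if D + T > Qc."""
    return data_secrecy_years + migration_years > years_until_quantum_threat

# Hypothetical example: records must stay secret for 10 years, migration takes 8 years,
# and a cryptographically relevant quantum computer is assumed to be 15 years away.
D, T, Qc = 10, 8, 15
print(quantum_vulnerable(D, T, Qc))  # True -> start migrating to quantum-safe algorithms now
```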

Commercially available quantum computing might not be here yet, and that gives us time to prepare. Unfortunately, wide cryptographic changes often take a long time to take effect, and there are often decades between the call to update and the actual update.

There are still organisations around today who cling to long outdated cryptographic protocols. By the time quantum becomes an imminent threat, there will still be plenty of computers that are using obsolete cryptography. Whether quantum arrives in 25 years, 15 years, 5 years or tomorrow, the clock is ticking, and organisations should start preparing now.

See original here:
Should decision makers be concerned by the threat of quantum? - Information Age

Keeping classified information secret in a world of quantum computing – Bulletin of the Atomic Scientists

By the end of 1943, the US Navy had installed 120 electromechanical Bombe machines like the one above, which were used to decipher secret messages encrypted by German Enigma machines, including messages from German U-boats. Built for the Navy by the Dayton company National Cash Register, the US Bombe was an improved version of the British Bombe, which was itself based on a Polish design. Credit: National Security Agency

Quantum computing is a technology that promises to revolutionize computing by speeding up key computing tasks in areas such as machine learning and solving otherwise intractable problems. Some influential American policy makers, scholars, and analysts are extremely concerned about the effects quantum computing will have on national security. Similar to the way space technology was viewed in the context of the US-Soviet rivalry during the Cold War, scientific advancement in quantum computing is seen as a race with significant national security consequences, particularly in the emerging US-China rivalry. Analysts such as Elsa Kania have written that the winner of this race will be able to overcome all cryptographic efforts and gain access to the state secrets of the losing government. Additionally, the winner will be able to protect its own secrets with a higher level of security than contemporary cryptography guarantees.

These claims are considerably overstated. Instead of worrying about winning the quantum supremacy race against China, policy makers and scholars should shift their focus to a more urgent national security problem: How to maintain the long-term security of secret information secured by existing cryptographic protections, which will fail against an attack by a future quantum computer.

The race for quantum supremacy. Quantum supremacy is an artificial scientific goal (one that Google claims to have recently achieved) that marks the moment a quantum computer computes an answer to a well-defined problem more efficiently than a classical computer. Quantum supremacy is possible because quantum computers replace classical bits, representing either a 0 or a 1, with qubits that use the quantum principles of superposition and entanglement to do some types of computations an order of magnitude more efficiently than a classical computer. While quantum supremacy is largely meant as a scientific benchmark, some analysts have co-opted the term and set it as a national-security goal for the United States.

These analysts draw a parallel between achieving quantum supremacy and the historical competition for supremacy in space and missile technology between the United States and the Soviet Union. As with the widely shared assessment in the 1950s and 1960s that the United States was playing catchup, Foreign Policy has reported on a quantum gap between the United States and China that gives China a first mover advantage. US policy experts such as Kania, John Costello, and Congressman Will Hurd (R-TX) fear that if China achieves quantum supremacy first, that will have a direct negative impact on US national security.

Some analysts who have reviewed the technical literature have found that quantum computers will be able to run algorithms that allow for the decryption of encrypted messages without access to a decryption key. If encryption schemes can be broken, message senders will be exposed to significant strategic and security risks, and adversaries may be able to read US military communications, diplomatic cables, and other sensitive information. Some of the policy discussion around this issue is influenced by suggestions that the United States could itself become the victim of a fait accompli in code-breaking after quantum supremacy is achieved by an adversary such as China. Such an advantage would be similar to the Allies' advantage in World War II, when they were able to decrypt German radio traffic in near-real time using US and British Bombe machines (see photo above).

The analysts who have reviewed the technical literature have also found that quantum technologies will enable the use of cryptographic schemes that do not rely on mathematical assumptions, specifically a scheme called quantum key distribution. This has led to the notion in the policy community that quantum communications will be significantly more secure than classical cryptography. Computer scientist James Kurose of the National Science Foundation has presented this view before the US Congress, for example.

Inconsistencies between policy concerns and technical realities. It is true that quantum computing threatens the viability of current encryption systems, but that does not mean quantum computing will make the concept of encryption obsolete. There are solutions to this impending problem. In fact, there is an entire movement in the field to investigate post-quantum cryptography. The aims of this movement are to find efficient encryption schemes to replace current methods with new, quantum-secure encryption.

The National Institute of Standards and Technology is currently in the process of standardizing a quantum-safe public key encryption system that is expected to be completed by 2024 at the latest. The National Security Agency has followed suit by announcing its Commercial National Security Algorithm Encryption Suite. These new algorithms can run on a classical computera computer found in any home or office today. In the future, there will be encryption schemes that provide the same level of security against both quantum and classical computers as the level provided by current encryption schemes against classical computers only.

Because quantum key distribution enables senders and receivers to detect eavesdroppers, analysts have claimed that the ability of the recipient and sender [to] determine if the message has been intercepted is a major advantage over classical cryptography. While eavesdropper detection is an advancement in technology, it does not actually provide any significant advantage over classical cryptography, because eavesdropper detection is not a problem in secure communications in the first place.

When communicating parties use quantum key distribution, an eavesdropper cannot get ciphertext (encrypted text) and therefore cannot get any corresponding plaintext (unencrypted text). When the communicating parties use classical cryptography, the eavesdropper can get ciphertext but cannot decrypt it, so the level of security provided to the communicating parties is indistinguishable from quantum key distribution.
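A simplified BB84-style simulation illustrates the eavesdropper-detection property being described: an intercept-and-resend attacker introduces errors in roughly 25% of the sifted key bits, which the communicating parties can spot by comparing a sample. This is a sketch under idealised, noise-free assumptions, not a model of any deployed QKD system:

```python
import random

def bb84_error_rate(n_bits=20_000, eavesdropper=True):
    errors = matches = 0
    for _ in range(n_bits):
        alice_bit = random.randint(0, 1)
        alice_basis = random.randint(0, 1)          # 0 = rectilinear, 1 = diagonal

        bit_in_flight, basis_in_flight = alice_bit, alice_basis
        if eavesdropper:
            eve_basis = random.randint(0, 1)
            # Measuring in the wrong basis gives a random result and disturbs the photon.
            if eve_basis != alice_basis:
                bit_in_flight = random.randint(0, 1)
            basis_in_flight = eve_basis

        bob_basis = random.randint(0, 1)
        bob_bit = bit_in_flight if bob_basis == basis_in_flight else random.randint(0, 1)

        if bob_basis == alice_basis:                # sifting: keep only matching-basis rounds
            matches += 1
            errors += (bob_bit != alice_bit)
    return errors / matches

print(f"Error rate with eavesdropper:    {bb84_error_rate(eavesdropper=True):.2%}")   # ~25%
print(f"Error rate without eavesdropper: {bb84_error_rate(eavesdropper=False):.2%}")  # ~0%
```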

The more pressing national security problem. While the technical realities of quantum computing demonstrate that there are no permanent security implications of quantum computing, there is a notable longer-term national security problem: Classified information with long-term intelligence value that is secured by contemporary encryption schemes can be compromised in the future by a quantum computer.

The most important aspect of the executive order that gives the US government the power to classify information, as it relates to the discussion of quantum computing and cryptography, is that this order allows for the classification of all types of information for as long as 25 years. Similarly, the National Security Agency provides guidelines to its contractors that classified information has a potential intelligence life of up to 30 years. This means that classified information currently being secured by contemporary encryption schemes could be relevant to national security through at least 2049and will not be secure in the future against cryptanalysis enabled by a quantum computer.

In the past, the United States has intercepted and stored encrypted information for later cryptanalysis. Toward the end of World War II, for example, the United States became suspicious of Soviet intentions and began to intercept encrypted Soviet messages. Because of operator error, some of the messages were partially decryptable. When the United States realized this, the government began a program called the Venona Project to decrypt these messages.

It is likely that both the United States and its adversaries will have Venona-style projects in the future. A few scholars and individuals in the policy community have recognized this problem. Security experts Richard Clarke and Robert Knake have stated that governments have been rumored for years to be collecting and storing other nations' encrypted messages that they now cannot crack, with the hope of cracking them in the future with a quantum computer.

As long as the United States continues to use encryption algorithms that are not quantum-resistant, sensitive information will be exposed to this long-term risk. The National Institute of Standards and Technology's quantum-resistant algorithm might not be completed, and reflected in the National Security Agency's own standard, until 2024. The National Security Agency has stated that algorithms often require 20 years to be fully deployed on NSS [National Security Systems]. Because of this, some parts of the US national security apparatus may be using encryption algorithms that are not quantum-resistant as late as 2044. Any information secured by these algorithms is at risk of long-term decryption by US adversaries.

Recommendations for securing information. While the United States cannot take back any encrypted data already in the possession of adversaries, short-term reforms can reduce the security impacts of this reality. Taking 20 years to fully deploy any cryptographic algorithm should be considered unacceptable in light of the threat to long-lived classified information. The amount of time to fully deploy a cryptographic algorithm should be lowered to the smallest time frame feasible. Even if this time period cannot be significantly reduced, the National Security Agency should take steps to triage modernization efforts and ensure that the most sensitive systems and information are updated first.

Luckily for the defenders of classified information, existing encryption isn't completely defenseless against quantum computing. While attackers with quantum computers could break a significant number of classical encryption schemes, it still may take an extremely large amount of time and resources to carry out such attacks. While the encryption schemes being used today can eventually be broken, risk mitigation efforts can increase the time it takes to decrypt information.

This can be done by setting up honeypots (systems disguised as vulnerable classified networks that contain useless encrypted data) and allowing them to be attacked by US adversaries. This would force adversaries to waste substantial amounts of time and valuable computing resources decrypting useless information. Such an operation is known as defense by deception, a well-proven strategy to stymie hackers looking to steal sensitive information. This strategy is simply an application of an old risk mitigation strategy to deal with a new problem.

Quantum computing will have an impact on national security, just not in the way that some of the policy community claims that it will. Quantum computing will not significantly reduce or enhance the inherent utility of cryptography, and the outcome of the race for quantum supremacy will not fundamentally change the distribution of military and intelligence advantages between the great powers.

Still, the United States needs to be wary of long-term threats to the secrecy of sensitive information. These threats can be mitigated by reducing the deployment timeline for new encryption schemes to something significantly less than 20 years, triaging cryptographic updates to systems that communicate and store sensitive and classified information, and taking countermeasures that significantly increase the amount of time and resources it takes for adversaries to exploit stolen encrypted information. The threats of quantum computing are manageable, as long as the US government implements these common-sense reforms.

Editor's Note: The author wrote a longer version of this essay under a Lawrence Livermore National Laboratory contract with the US Energy Department. Lawrence Livermore National Laboratory is operated by Lawrence Livermore National Security, LLC, for the US Department of Energy, National Nuclear Security Administration under Contract DE-AC52-07NA27344. The views and opinions of the author expressed herein do not necessarily state or reflect those of the United States government or Lawrence Livermore National Security, LLC. LLNL-JRNL-799938.

Read more:
Keeping classified information secret in a world of quantum computing - Bulletin of the Atomic Scientists

A neural network that learned to predict the behavior of a quantum system – Tech Explorist

A wide range of problems in modern science are tackled through quantum mechanical calculations. The quantum nature of the problems involved makes quantum calculations better suited to them.

Building quantum computers is expensive and time-consuming, and the resulting devices are not guaranteed to exhibit any quantum advantage, that is, to operate faster than a conventional computer. So researchers need tools for predicting whether a given quantum device will have a quantum advantage.

One approach to executing quantum computations is quantum walks. In simple terms, the technique can be pictured as a particle traveling through a certain network, which underlies a quantum circuit. If a particle's quantum walk from one network node to another happens faster than its classical analog, a device based on that circuit will have a quantum advantage. The search for such superior networks is an important task tackled by quantum walk experts.
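A minimal sketch of what "faster spread" means for a continuous-time quantum walk versus its classical counterpart on a small line graph. The graph, evolution time, and endpoints below are chosen purely for illustration; this is not the set of networks or the neural-network model used in the study:

```python
import numpy as np
from scipy.linalg import expm

# Line graph with 6 nodes: adjacency matrix A and graph Laplacian L.
n = 6
A = np.zeros((n, n))
for i in range(n - 1):
    A[i, i + 1] = A[i + 1, i] = 1
L = np.diag(A.sum(axis=1)) - A

start, target, t = 0, n - 1, 3.0

# Classical continuous-time random walk: probabilities evolve as p(t) = exp(-L t) p(0).
p0 = np.zeros(n)
p0[start] = 1.0
p_classical = expm(-L * t) @ p0

# Continuous-time quantum walk: amplitudes evolve as psi(t) = exp(-i A t) psi(0).
psi0 = np.zeros(n, dtype=complex)
psi0[start] = 1.0
psi = expm(-1j * A * t) @ psi0
p_quantum = np.abs(psi) ** 2

print(f"P(at node {target}) classical: {p_classical[target]:.4f}")
print(f"P(at node {target}) quantum:   {p_quantum[target]:.4f}")
# The quantum walk spreads ballistically rather than diffusively, so the far end
# of the line is typically reached with noticeably higher probability at the same time t.
```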

In a new study, Russian scientists from the Moscow Institute of Physics and Technology, Valiev Institute of Physics and Technology, and ITMO University have replaced quantum walk experts with artificial intelligence. They trained a machine to recognize networks and tell whether a given network will deliver a quantum advantage. This pinpoints the networks that are good candidates for building a quantum computer.

The scientists used a neural network geared toward image recognition. An adjacency matrix served as the input data, along with the numbers of the input and output nodes. The neural network returned a prediction of whether the classical or the quantum walk between the given nodes would be faster.

Associate Professor Leonid Fedichkin of the theoretical physics department at MIPT said, "It was not obvious this approach would work, but it did. We have been quite successful in training the computer to make free predictions of whether a complex network has a quantum advantage. The line between quantum and classical behaviors is often blurred. The distinctive feature of our study is the resulting special-purpose computer vision, capable of discerning this fine line in the network space."

Scientists also created a tool to simplify the development of computational circuits based on quantum algorithms. The resulting devices will be of interest in biophotonics research and materials science.

The scientists noted, "Solving a problem that formally involves finding the quantum walk time from one node to another may reveal what happens to an electron at a particular position in a molecule, where it will move, and what kind of excitation it will cause."

Compared with architectures based on qubits and gates, quantum walks are expected to offer an easier way to implement the quantum calculation of natural phenomena. The reason for this is that the walks themselves are a natural physical process.

The findings are reported in the New Journal of Physics.

Read the original:
A neural network that learned to predict the behavior of a quantum system - Tech Explorist