Category Archives: Quantum Computer

Team develops first blockchain that can’t be hacked by quantum computer – Siliconrepublic.com

It was a busy week for IoT technologies, with researchers in Russia working on a blockchain designed to withstand attacks from quantum computers.

Earlier this week, Irish forestry organisation Coillte revealed its latest effort to get into the internet of trees space following a 1.2m deal with the European Space Agency to roll out a tree growth analytics system, including a unique tree sensor device.

When operational, the sensors will create a kind of mesh network that maps out a digital forest. The resulting data will be transmitted via satellite to provide real-time analytics for forest managers.

Meanwhile, across the Atlantic Ocean, Android co-founder Andy Rubin was discussing the prevalence of smart home assistants on the market, and how they are creating a disjointed ecosystem.

Rubin was speaking as part of the launch of the new Essential Home device and a new open-source smart assistant operating system called Ambient OS.

"All of these [companies] have ecosystem envy and want to create their own ecosystem," Rubin said.

"But consumers don't want just Samsung stuff in their house. They want diversity."

The fields of blockchain and quantum computing are fascinating and complex in their own regard, but new research from Russia claims that a merger between the two could be very interesting.

According to the International Business Times, a team from the Russian Quantum Center in Moscow has developed quantum blockchain technology that would prevent any hacker from accessing connections, despite the computing technology still being experimental.

As has been explained before, blockchain is the technology that makes a transaction of currency or information traceable and transparent to both parties and, by its nature, is supposed to be incredibly secure.

However, when quantum computers enter the mainstream, this might not be the case as their incredible processing power would be able to crack any encryption.

"Parties that communicate via a quantum channel can be completely sure that they are talking to each other, not anybody else," said Alexander Lvovsky, group lead of the research.

"This is the main idea. Then we had to reinvent the entire blockchain architecture to fit our new authentication technology, thereby making this architecture immune to quantum computer attacks."

Earlier this week (29 May), Enterprise Ireland held a trade mission in Canada with a focus on IoT, led by Minister Sean Canney, TD.

The biggest success at the event was with Clare-based Tekelek, which signed a $1.4m deal with PayGo, a company that provides sensors for businesses to monitor fuel consumption remotely and make changes where necessary.

As part of the deal, Tekelek will begin developing an intrinsically safe sensor to facilitate the expansion of PayGo's service offering in the US and Canadian markets.

Oliver McCarthy, general manager of Tekelek, said: "We're very excited to apply this thinking and our technology to the industrial fuels marketplace, and we're similarly pleased to partner with an organisation of PayGo's calibre to bring our technologies to the North American market."

Dublin City Council's (DCC) Smart Dublin initiative has announced a partnership with the Connect Centre and Intel to deploy low-cost sensors across the capital to monitor rainfall, weather conditions and river levels.

The new sensors will communicate data wirelessly to DCC's operations team, which will analyse water levels and use Connect's Pervasive Nation IoT network to provide city authorities with an early warning of potential flooding.

Jamie Cudden, DCC's Smart City programme manager, said: "Dublin is emerging as a leading location for smart city and IoT innovations."

Intel's Dublin Living Lab programme has already carried out some initial flood monitoring activity across the city, which has led to the prototyping of a set of river and rainfall sensors.

Autonomous cars are gradually heading onto our open roads, albeit with a driver behind the wheel to make sure everything goes OK during the trials.

Now, a new method of testing these cars, developed by the University of Michigan, may drastically cut the amount of time it could take to make them road-legal.

Using data from more than 40m km of real-world driving, the team of researchers believes its system can save 99.9pc of testing time and costs.

The evaluation process breaks down typical driving situations into components that can be tested or simulated over and over again, exposing autonomous vehicles to a condensed set of the most challenging driving situations.

This, the researchers argue, means that 1,600km of testing would equate to 1.6m km in real-world testing, but, in order to make the public feel safe being in these vehicles, as much as 20bn km of testing will need to be done.

The team's white paper is published here.

Read more from the original source:
Team develops first blockchain that can't be hacked by quantum computer - Siliconrepublic.com

D-Wave partners with U of T to move quantum computing along – Financial Post

Not even the greatest geniuses in the world could explain quantum computing.

In the early 1930s, in fact, Einstein called quantum mechanics, the basis for quantum computing, "spooky action at a distance".

Then there's a famous phrase from the late Nobel laureate in physics Richard Feynman: "If you think you understand quantum mechanics, then you don't understand quantum mechanics."

That may be so, but the mystery behind quantum has not stopped D-Wave Systems Inc. from making its mark in the field. "In the 1980s it was thought maybe quantum mechanics could be used to build a computer. So people started coming up with ideas on how to build one," says Bo Ewald, president of D-Wave in Burnaby, B.C.

Two of those people were UBC PhD physics grads Eric Ladizinsky and Geordie Rose, who had happened to take an entrepreneur course before founding D-Wave in 1999. "Since there weren't a lot of businesses in the field, they created and collected patents around quantum," Ewald says.

While most who were exploring the concept were looking in the direction of what is called the universal gate model, D-Wave decided to work on a different architecture, called annealing. The two do not necessarily compete, but perform different functions.

In quantum annealing, algorithms quickly search over a space to find a minimum (or solution). The technology is best suited to speeding up research, modelling or traffic optimization, for example.
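To make the annealing picture concrete, the sketch below is a purely classical simulated-annealing loop over a QUBO (quadratic unconstrained binary optimisation) problem, the standard way such minimisation tasks are phrased. It illustrates the kind of search an annealer performs; it is not D-Wave's hardware or its actual programming interface, and the matrix Q is a made-up toy problem.

```python
import math
import random

def anneal_qubo(Q, steps=20000, t0=2.0, t1=0.01):
    """Toy simulated annealing for a QUBO: minimise x^T Q x over binary x.

    A classical stand-in for the minimisation a quantum annealer performs
    physically; illustrative only."""
    n = len(Q)
    x = [random.randint(0, 1) for _ in range(n)]
    energy = lambda v: sum(Q[i][j] * v[i] * v[j] for i in range(n) for j in range(n))
    e = energy(x)
    for step in range(steps):
        t = t0 * (t1 / t0) ** (step / steps)   # geometric cooling schedule
        i = random.randrange(n)
        x[i] ^= 1                              # propose flipping one variable
        e_new = energy(x)
        # Accept downhill moves always, uphill moves with Boltzmann probability.
        if e_new <= e or random.random() < math.exp((e - e_new) / t):
            e = e_new
        else:
            x[i] ^= 1                          # reject: undo the flip
    return x, e

# Toy problem: reward x0 != x1 and lightly reward x2 = 1.
Q = [[-1.0,  2.0,  0.0],
     [ 0.0, -1.0,  0.0],
     [ 0.0,  0.0, -0.5]]
print(anneal_qubo(Q))   # expect ([1, 0, 1], -1.5) or ([0, 1, 1], -1.5)
```

On real annealers there is an extra step this toy skips: the logical problem must first be mapped onto the hardware's limited qubit-connectivity graph.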

Universal gate quantum computing can put basic quantum circuit operations together to create any sequence to run increasingly complex algorithms. (There's a third model, called topological quantum computing, but it could be decades before it can be commercialized.)

When D-Wave sold its first commercial product to Lockheed Martin about six years ago, it marked the first commercial sale of a quantum computer, Ewald says. Google was the second to partner with D-Wave, for a system that is also being run by NASA Ames Research Center. "Each gets half of the machine," Ewald says. "They believed quantum computing had an important future in machine learning."

Most recently D-Wave has been working with Volkswagen to study traffic congestion in Beijing. They wanted to see if quantum computing would have applicability to their business, where there are lots of optimization problems. Another recent coup is a deal with the Los Alamos National Laboratory.

There's no question that any quantum computing investment is a long-term prospect, but that has not hindered the company's funding efforts. To date, it has raised more than 10 rounds of funding from the likes of PSP, Goldman Sachs, Bezos Expeditions, DFJ, In-Q-Tel, BDC Capital, GrowthWorks, Harris & Harris Group, International Investment and Underwriting, and Kensington Partners Ltd.

"What we have with D-Wave is the mother of all ships: that is, the hardware capability to unlock the future of AI," says Jérôme Nycz, executive vice-president, BDC Capital. "We believe D-Wave's quantum capabilities have put Canada on the map."

Now, Ewald says, the key for the company moving forward is getting more smart people working on apps and on software tools in the areas of AI, machine learning and deep learning.

To that end, D-Wave recently not only open-sourced its Qbsolv software tool, it launched an initiative with the Creative Destruction Lab at the University of Toronto's Rotman School of Management to create a new track focused on quantum machine learning. The intensive one-year program begins with an introductory boot camp led by Dr. Peter Wittek, author of Quantum Machine Learning: What Quantum Computing Means to Data Mining, with instruction and technical support from D-Wave experts, and access to D-Wave technology.

While it is still early days in terms of deployment for quantum computing, Ewald believes D-Wave's early start gives it a leg up if and when quantum hits the mainstream. So far, customers tend to be government and/or research related; Google is the notable exception. But once apps come along that are applicable for other industries, it will all make sense.

The early start has given D-Wave the experience to be able to adopt other architectures as they evolve. It may be a decade before a universal gate model machine becomes a marketable product. "If that turns out to be true, we will have a 10-year lead in getting actual machines into the field and having customers working on and developing apps."

Ewald is the first to admit that as an early entrant, D-Wave faces criticism around its architecture. "There are a lot of spears and things that we tend to get in the chest. But we see them coming and can deal with it. If we can survive all that, we will have a better view of the market, real customers and relationships with accelerators like Creative Destruction Lab. At the end of the day, we will have the ability to adapt when we need to."

Continue reading here:
D-Wave partners with U of T to move quantum computing along - Financial Post

Telstra just wants a quantum computer to offer as-a-service – ZDNet

Due to the changes needed to algorithms and computational thinking, Telstra chief scientist Hugh Bradlow believes the first commercial users of quantum computers will need some help adjusting -- and the Australian incumbent telco will be there to offer that help at a price.

"I can assure you they are not going to walk in on day one and know how to use these things," Bradlow said on Wednesday evening.

"We want to be able to offer it as-a-service to them ... they will need a lot of hand holding, and they are not going to run the equipment themselves, it's complicated."

Telstra and the Commonwealth Bank of Australia (CBA) are two of the companies backing the work of a team at the University of New South Wales (UNSW) that is looking to develop quantum computing in silicon.

At the end of 2015, both companies contributed AU$10 million over five years to UNSW.

Despite racing against far greater funded rivals, head of UNSW's quantum effort professor Michelle Simmons said she is happy with the funding the Centre of Excellence for Quantum Computation and Communication Technology has received.

"At the moment, you have to prove you have the best hardware of anything out there to know whether you are going to go further or not," Simmons said. "I guess one of the things we've been very much driven by is milestone-based research.

"Can we actually develop the qubits, qubit by qubit, and prove that they are better than other qubits that out there? And so if you have lots of money in the beginning, and you are not doing that systematic thorough approach, it's actually not that helpful to you. You have to do it, proving it along the way."

Simmons said her team is currently looking at producing a 10-qubit system by the end of the decade, and, if successful, will be looking to move up to 100 qubits.

In October last year, the UNSW team announced that they had created a new qubit that remains in a stable superposition for 10 times longer than previously achieved.

A year earlier, the team built the first 2-qubit logic gate in silicon.

"The prototype chip we want to make within five years is a pretty shrinkable manufacturing process, and it will be able to perform a variety of calculations; we hope it will be able to potentially solve the problem that currently can't be solved on an existing computer," Andrew Dzurak, scientia professor at the university, said at the time.

"That particular type of problem may not be the sort of problem that is going to excite many commercial people in the first instance, but it will be an important principal."

Even though UNSW is at the frontier of quantum computing, however, Bradlow said Telstra just wants to get its hands on one.

"We are agnostic at the end of the day; we just want a quantum computer," he said. "We do hope Michelle's team wins ... we've gone and put our money on it because we think it's got the best odds, so it's not just a random bet, but we are obviously keeping across anything that is out there.

"Over the last year and a half, I've probably visited every major group in the world, and they all have very different views and by seeing multiple views you get a much better perspective.

"So it's important to keep across everything."

For its part, CBA is preparing for a quantum future by using a quantum computing simulator from QxBranch.

"The difference between the emulator of a quantum computer and the real hardware is that we run the simulator on classical computers, so we don't get the benefit of the speed up that you get from quantum, but we can simulate its behaviour and some of the broad characteristics of what the eventual hardware will do," QxBranch CEO Michael Brett told ZDNet in April.

"What we provide is the ability for people to explore and validate the applications of quantum computing so that as soon as the hardware is ready, they'll be able to apply those applications and get the benefit immediately of the unique advantages of quantum computing."

Read the rest here:
Telstra just wants a quantum computer to offer as-a-service - ZDNet

MIT Just Unveiled A Technique to Mass Produce Quantum Computers – Futurism

In Brief: Researchers have found a way to make the creation of qubits simpler and more precise. The team hopes that this new technique could, one day, allow for the mass production of quantum computers.

Commercial Quantum Computing

Quantum computing is, if you are not already familiar, simply put, a type of computation that uses qubits to encode data instead of traditional bits (1s and 0s). In short, it allows for the superposition of states, which is where data can be in more than one state at a given time.

So, while traditional computing is limited to information belonging to only one or another state, quantum computing widens those limitations. As a result, more information can be encoded into a much smaller type of bit, allowing for much larger computing capacity. And, while it is still in relatively early development, many believe that quantum computing will be the basis of future technologies, advancing our computational speed beyond what we can currently imagine.

It was extremely exciting, then, when researchers from MIT, Harvard University, and Sandia National Laboratories unveiled a simpler way of using atomic-scale defects in diamond materials to build quantum computers, in a way that could possibly allow them to be mass produced.

For this process, defects are the key. They are precisely and perfectly placed to function as qubits and hold information. Previous processes were difficult, complex, and not precise enough. This new method creates targeted defects in a much simpler manner. Experimentally, defects created were, on average, at or under 50 nanometers from the ideal locations.

The significance of this cannot be overstated. "The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it," says Dirk Englund, an associate professor of electrical engineering and computer science, in an interview with MIT. "We're almost there with this. These emitters are almost perfect."

While the reality of quantum computers, let alone mass produced quantum computers, is still a bit of a ways off, this research is promising. One of the main remaining hurdles is how these computers will read the qubits. But these diamond defects aim to solve that problem because they naturally emit light, and since the light particles emitted can retain superposition, they could help to transmit information.

The research goes on to detail how the completion of these diamond materials better allowed for the amplification of the qubit information. By the end, the researchers found that the light emitted was approximately 80-90 percent as bright as possible.

If this work eventually leads to the full creation of a quantum computer, life as we know it would change irrevocably. From completely upending modern encryption methods to allowing us to solve previously unsolvable problems, our technology and infrastructure would never be the same. Moreover, the limitations that currently exist in how we store and transmit information would shatter, opening new opportunities for as-yet-unimaginable exploration.

Visit link:
MIT Just Unveiled A Technique to Mass Produce Quantum Computers - Futurism

Research collaborative pursues advanced quantum computing – Phys.Org

May 31, 2017, by Steve Tally

Image caption: Purdue University and Microsoft Corp. have signed a five-year agreement to develop a useable quantum computer. Purdue is one of four international universities in the collaboration. Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, professor of materials engineering and professor of electrical and computer engineering, will lead the effort at Purdue to build a robust and scalable quantum computer by producing what scientists call a "topological qubit." Credit: Purdue University photo/Rebecca Wilcox

"If this project is successful it will cause a revolution in computing."

That's the forecast of Michael Manfra, Purdue University's Bill and Dee O'Brien Chair Professor of Physics and Astronomy, Professor of Materials Engineering and Professor of Electrical and Computer Engineering, on a new long-term enhanced collaboration between Purdue and Microsoft Corp. to build a robust and scalable quantum computer by producing what scientists call a "topological qubit."

Purdue President Mitch Daniels noted that Purdue was home to the first computer science department in the United States, and said this partnership and Manfra's work place the university at the forefront of quantum computing.

"Someday quantum computing will move from the laboratory to actual daily use, and when it does, it will signal another explosion of computing power like that brought about by the silicon chip," Daniels says. "It's thrilling to imagine Purdue at the center of this next leap forward."

In the computers that we currently use every day, information is encoded in an either/or binary system of bits, what are commonly thought of as 1s and 0s. These computers are based on silicon transistors, which, like a light switch, can only be in either an on or off position.

With quantum computers, information is encoded in qubits, which are quantum units of information. With a qubit, however, this physical state isn't just 0 or 1, but can also be a linear combination of 0 and 1. Because of a strange phenomenon of quantum mechanics called "superposition," a qubit can be in both states at the same time.
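In standard notation, that "linear combination" is written as a state vector; this is textbook quantum-computing bookkeeping rather than anything specific to the Purdue-Microsoft project:

$$
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle, \qquad \alpha, \beta \in \mathbb{C}, \qquad |\alpha|^2 + |\beta|^2 = 1,
$$

where measuring the qubit yields 0 with probability $|\alpha|^2$ and 1 with probability $|\beta|^2$.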

This characteristic is essential to quantum computation's potential power, allowing for solutions to problems that are intractable using classical architectures.

Advocates of quantum computing believe this never-before-seen technology will create a new global "quantum economy."

The team assembled by Microsoft will work on a type of quantum computer that is expected to be especially robust against interference from its surroundings, a situation known in quantum computing as "decoherence." The "scalable topological quantum computer" is theoretically more stable and less error-prone.

"One of the challenges in quantum computing is that the qubits interact with their environment and lose their quantum information before computations can be completed," Manfra says. "Topological quantum computing utilizes qubits that store information "non-locally" and the outside noise sources have less effect on the qubit, so we expect it to be more robust."

Manfra says that the most exciting challenge associated with building a topological quantum computer is that the Microsoft team must simultaneously solve problems of materials science, condensed matter physics, electrical engineering and computer architecture.

"This is why Microsoft has assembled such a diverse set of talented people to tackle this large-scale problem," Manfra says. "No one person or group can be expert in all aspects."

Purdue and Microsoft entered into an agreement in April 2016 that extends their collaboration on quantum computing research, effectively establishing "Station Q Purdue," one of the "Station Q" experimental research sites that work closely with two "Station Q" theory sites.

The new, multi-year agreement extends that collaboration, and includes Microsoft employees being embedded in Manfra's research team at Purdue.

Manfra's group at Station Q Purdue will collaborate with Redmond, Washington-based Microsoft team members, as well as a global experimental group established by Microsoft including experimental groups at the Niels Bohr Institute at the University of Copenhagen in Denmark, TU Delft in the Netherlands, and the University of Sydney, Australia. They are also coupled to the theorists at Microsoft Station Q in Santa Barbara. All groups are working together to solve quantum computing's biggest challenges.

"What's exciting is that we're doing the science and engineering hand-in-hand, at the same time," Manfra says. "We are lucky to be part of this truly amazing global team."

Mathematician and Fields Medal recipient Michael Freedman leads Microsoft's Station Q in Santa Barbara working on quantum computing.

"There is another computing planet out there, and we, collectively, are going to land on it. It really is like the old days of physical exploration and much more interesting than locking oneself in a bottle and traveling through space. We will find an amazing unseen world once we have general purpose programmable quantum computers," Freedman says. "Michael Manfra and Purdue University will be a key collaborator on this journey. I'm not interested in factoring numbers, but solving chemistry and materials science problems, and most ambitiously machine intelligence. Curiously, we need great materials science and transport physics Mike Manfra's work to build the systems we will use to do quantum computing and, thus, to usher in the next era of materials science."

Purdue's role in the project will be to grow and study ultra-pure semiconductors and hybrid systems of semiconductors and superconductors that may form the physical platform upon which a quantum computer is built. Manfra's group has expertise in a technique called molecular beam epitaxy, and this technique will be used to build low-dimensional electron systems that form the basis for quantum bits, or qubits.

The work at Purdue will be done in the Birck Nanotechnology Center in the university's Discovery Park, as well as in the Department of Physics and Astronomy. The Birck facility houses the multi-chamber molecular beam epitaxy system, in which three fabrication chambers are connected under ultra-high vacuum. It also contains clean-room fabrication and the necessary materials characterization tools. Low-temperature measurements of the materials' electronic properties will be conducted in the Department of Physics and Astronomy.

Suresh Garimella, executive vice president for research and partnerships, and Purdue's Goodson Distinguished Professor of Mechanical Engineering, says the tools and laboratories found in Discovery Park have enabled Purdue to become a world leader in several areas.

"Combining these world-leading facilities with our incredibly talented and knowledgeable faculty, such as Professor Manfra, has placed Purdue at the forefront of research and development of nanotechnology, nanoelectronics, next-generation silicon transistor-based electronics, and quantum computing. To have Purdue contribute to the construction of the world's first quantum computer is be a dream come true for us," he says.

Here is the original post:
Research collaborative pursues advanced quantum computing - Phys.Org

Microsoft, Purdue Tackle Topological Quantum Computer – HPCwire – HPCwire (blog)

Topological qubits are among the more baffling, and, if practical, more promising ways to approach scalable quantum computing. At least that's what Microsoft, Purdue University, and three other universities are hoping after having recently signed a five-year agreement to develop a topological-qubit-based quantum computer.

Qubits are strange no matter what form they take. The basic idea is that, through superposition, a qubit can be in two states at once (0 and 1), and hence a quantum computer's capacity scales exponentially with the number of qubits, whereas classical computers scale linearly with the number of bits. Most quantum computing efforts rely on producing superposition in some material (IBM uses superconducting devices) and there have been many qubit schemes proposed.
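That exponential-versus-linear contrast can be stated precisely; again this is standard quantum-information notation, not anything particular to Microsoft's topological approach. An n-qubit register is, in general, a superposition over all 2^n bit strings:

$$
|\psi\rangle = \sum_{x \in \{0,1\}^n} \alpha_x\,|x\rangle, \qquad \sum_{x} |\alpha_x|^2 = 1,
$$

so describing its state takes 2^n complex amplitudes, while n classical bits are described by just n values; adding one qubit doubles the description, adding one bit adds one value.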

Topological qubits are among the more mysterious. They rely on quasiparticles called non-abelian anyons, which have not definitively been proven to exist. Using these topological qubits, information is encoded by braiding the paths of these quasiparticles. The benefit, say researchers, is that topological qubits resist decoherence much better than other qubit types and should require far less error correction. At least that's the theory.

Microsoft has been dabbling in topological qubit theory for several years. Last fall, Microsoft quantum researcher Alex Boharov was interviewed by Nature ("Inside Microsoft's quest for a topological quantum computer", October 21, 2016) about why the company is pursuing such an exotic path.

"Our qubits are not even material things. But then again, the elementary particles that physicists run in their colliders are not really solid material objects. Here we have non-abelian anyons, which are even fuzzier than normal particles. They are quasiparticles. The most studied kinds of anyon emerge from chains of very cold electrons that are confined at the edge of a 2D surface. These anyons act like both an electron and its antimatter counterpart at the same time, and appear as dense peaks of conductance at each end of the chain. You can measure them with high-precision devices, but not see them under any microscope," said Boharov.

As explained by Boharov: "Noise from the environment and other parts of the computer is inevitable, and that might cause the intensity and location of the quasiparticle to fluctuate. But that's OK, because we do not encode information into the quasiparticle itself, but in the order in which we swap positions of the anyons. We call that braiding, because if you draw out a sequence of swaps between neighbouring pairs of anyons in space and time, the lines that they trace look braided. The information is encoded in a topological property, that is, a collective property of the system that only changes with macroscopic movements, not small fluctuations."

The upside is tantalizing, he said: "So far, we've had an amazing ride in terms of creating more-efficient algorithms, reducing the number of qubit interactions, known as gates, that you need to run certain computations that are impossible on classical computers. In the early 2000s, for example, people thought it would take about 24 billion years to calculate on a quantum computer the energy levels of ferredoxin, which plants use in photosynthesis. Now, through a combination of theory, practice, engineering and simulation, the most optimistic estimates suggest that it may take around an hour. We are continuing to work on these problems, and gradually switching towards more applied work, looking towards quantum chemistry, quantum genomics and things that might be done on a small-to-medium-sized quantum computer."

Now, Microsoft is further ramping up its quantum efforts with a collaboration that includes Purdue as well as a global experimental group established by Microsoft at the Niels Bohr Institute at the University of Copenhagen in Denmark, TU Delft in the Netherlands, and the University of Sydney, Australia. For Purdue, this is an extension of joint work on quantum computing with Microsoft begun roughly one year ago. Michael Freedman of Microsoft's Station Q in Santa Barbara leads the effort.

"What's exciting is that we're doing the science and engineering hand-in-hand, at the same time," says Purdue researcher Michael Manfra in an article on the project posted on the Purdue website yesterday.

Purdue's role in the project will be to grow and study ultra-pure semiconductors and hybrid systems of semiconductors and superconductors that may form the physical platform upon which a quantum computer is built. Manfra's group has expertise in a technique called molecular beam epitaxy, and this technique will be used to build low-dimensional electron systems that form the basis for quantum bits, or qubits, according to the article.

Purdue President Mitch Daniels noted in the article that Purdue was home to the first computer science department in the United States, and said the partnership and Manfra's work place the university at the forefront of quantum computing. "Someday quantum computing will move from the laboratory to actual daily use, and when it does, it will signal another explosion of computing power like that brought about by the silicon chip," Daniels said.

Link to Purdue article written by Steve Tally: http://www.purdue.edu/newsroom/releases/2017/Q2/microsoft,-purdue-collaborate-to-advance-quantum-computing-.html

Link Nature interview: https://www.nature.com/news/inside-microsoft-s-quest-for-a-topological-quantum-computer-1.20774

Caption for feature image: Michael Freedman (left), Microsoft Corp. quantum computing researcher, and Suresh Garimella, executive vice president for research and partnerships, and Purdue's Goodson Distinguished Professor of Mechanical Engineering, sign a new five-year enhanced collaboration between Purdue and Microsoft to build a robust and scalable quantum computer by producing what scientists call a topological qubit. (Purdue University photo/Charles Jischke)

Go here to read the rest:
Microsoft, Purdue Tackle Topological Quantum Computer - HPCwire - HPCwire (blog)

Here’s How We Can Achieve Mass-Produced Quantum Computers – ScienceAlert

Still waiting patiently for quantum computing to bring about the next revolution in digital processing power? We might now be a little closer, with a discovery that could help us build quantum computers at mass scale.

Scientists have refined a technique using diamond defects to store information, adding silicon to make the readouts more accurate and suitable for use in the quantum computers of the future.

To understand how the new process works, you need to go back to the basics of the quantum computing vision: small particles kept in a state of superposition, where they can represent 1 and 0, or any combination of the two, at the same time.

These quantum bits, or qubits, can process calculations on a much grander scale than the bits in today's computer chips, which are stuck representing either 1 or 0 at any one time.

Getting particles in a state of superposition long enough for us to actually make use of them has proved to be a real challenge for scientists, but one potential solution is through the use of diamond as a base material.

The idea is to use tiny atomic defects inside diamonds to store qubits, and then pass around data at high speeds using light optical circuits rather than electrical circuits.

Diamond-defect qubits rely on a missing carbon atom inside the diamond lattice which is then replaced by an atom of some other element, like nitrogen. The free electrons created by this defect have a magnetic orientation that can be used as a qubit.

So far so good, but our best efforts so far haven't been accurate enough to be useful, because of the broad spectrum of frequencies in the light emitted, and that's where the new research comes in.

Scientists added silicon to the qubit creation process, which emits a much narrower band of light, and supplies the precision that quantum computing requires.

At the moment, these silicon qubits don't keep their superposition as well, but the researchers are hopeful this can be overcome by reducing their temperature to a fraction of a degree above absolute zero.

"The dream scenario in quantum information processing is to make an optical circuit to shuttle photonic qubits and then position a quantum memory wherever you need it," says one of the team, Dirk Englund from MIT. "We're almost there with this. These emitters are almost perfect."

In fact, the researchers produced defects within 50 nanometres of their ideal locations on average, which is about one thousandth the size of a human hair.

Being able to etch defects with this kind of precision means the process of building optical circuits for quantum computers then becomes more straightforward and feasible.

If the team can improve on the promising results so far, diamonds could be the answer to our quantum computing needs: they also naturally emit light in a way that means qubits can be read without having to alter their states.

You still won't be powering up a quantum laptop anytime soon, but we're seeing real progress in the study of the materials and techniques that might one day bring this next-generation processing power to the masses.

The research has been published in Nature Communications.

Read more:
Here's How We Can Achieve Mass-Produced Quantum Computers - ScienceAlert

AI and Quantum Computers Are Our Best Weapons Against Cyber Criminals – Futurism

In Brief: Major companies like IBM are turning to artificial intelligence and quantum computing to protect against cyber attacks. While these technologies aren't silver bullets, they are essential tools for cyber security in the age of the Internet of Things.

Weapons of Cyber Warfare

Cyber security has become a key issue in our national and international discussions. No longer do cyber attacks concern only email companies and individuals who are unwilling to update their tech. Now, cyber crime has had a major impact on both U.S. mainstream political parties, and almost any organization, even hospitals, should have some concern about the possibility of an attack through a computer network.

In their struggle to fight cyber crime, major companies like IBM are turning to two of the world's most powerful technologies: artificial intelligence (AI) and quantum computing.

IBM's AI, Watson, helps human analysts sift through the 200,000 or so security events the company has to deal with on a day-to-day basis. It helps determine which events don't require special attention, such as instances when an employee forgets their password, and which should receive more scrutiny.

"Before artificial intelligence, we'd have to assume that a lot of the data, say 90 percent, is fine. We only would have bandwidth to analyze this 10 percent," Daniel Driver from Chemring Technology Solutions, an electronic warfare provider, said in an interview with the Financial Times. "The AI mimics what an analyst would do, how they look at data. It's doing a huge amount of legwork upfront, which means we can focus our analysts' time."

Watson is about 60 times faster than its human counterparts, and speed is key for defending against cyber attacks. But even Watson's impressive rates pale in comparison to those that can be attained with quantum computers.

"The analogy we like to use is that of a needle in a haystack," Driver said in the interview. "A machine can be specially made to look for a needle in a haystack, but it still has to look under every piece of hay. Quantum computing means, 'I'm going to look under every piece of hay simultaneously and find the needle immediately.'"
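Driver's haystack analogy is looser than the underlying result: the canonical quantum search algorithm, Grover's, does not inspect every straw at once but amplifies the marked answer over roughly the square root of N steps, a quadratic rather than instantaneous speedup. The sketch below is a plain classical simulation of that amplitude amplification, for illustration only and unrelated to Chemring's or IBM's actual tooling.

```python
import numpy as np

def grover(n_qubits, marked):
    """Statevector simulation of Grover search for one marked item."""
    N = 2 ** n_qubits
    state = np.full(N, 1.0 / np.sqrt(N))             # uniform superposition
    iterations = int(np.floor(np.pi / 4 * np.sqrt(N)))
    for _ in range(iterations):
        state[marked] *= -1.0                         # oracle: flip marked amplitude
        state = 2.0 * state.mean() - state            # diffusion: reflect about the mean
    probabilities = state ** 2
    return int(np.argmax(probabilities)), iterations, float(probabilities[marked])

print(grover(10, marked=731))
# (731, 25, ~0.999): 25 oracle calls versus ~512 expected classical checks
```

For a database of a million items this works out to on the order of 800 oracle calls instead of an expected half a million classical checks.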

While these technologies are not silver bullets against cyber attacks, they are becoming vital tools in the cyber security industry, which is projected to grow from $74 billion last year to $100 billion in 2020. Part of this growth may be attributed to society's increasing reliance on the Internet of Things (IoT). As everything from light bulbs to our jackets becomes digitally accessible, every person should be more concerned about cyber security.

As we continue to see advancements in both AI and quantum computing technologies, more businesses and households will have access to these protective tools. AIs are already finding their places in different sectors of our society including healthcare. Perhaps in addition to diagnosing medical images, AIs can also protect hospitals from future cyber attacks.

Go here to see the original:
AI and Quantum Computers Are Our Best Weapons Against Cyber Criminals - Futurism

Scientists claim to have invented the world’s first quantum-proof … – ScienceAlert

Researchers in Russia say they've developed and tested the world's first blockchain that won't be vulnerable to encryption-breaking attacks from future quantum computers.

If the claims are verified, the technique could be a means of protecting the vast amounts of wealth invested in fast-growing cryptocurrencies like Bitcoin and Ethereum, which are safe from today's code-breaking methods but could be exposed by tomorrow's vastly more powerful quantum machines.

A team from the Russian Quantum Centre in Moscow says its quantum blockchain technology has been successfully tested with one of Russia's largest banks, Gazprombank, and could be used as a proof of concept to underpin secure data encryption and storage methods in the future.

To backtrack a little, a blockchain is a publicly accessible, decentralised ledger of recorded information, spread across multiple computers on the internet.

This kind of distributed database is the underlying technology that makes Bitcoin possible, where it maintains a list of timestamped digital transactions that can be viewed by anyone on the platform.

The idea is that the blockchain frees users on the network from needing any kind of middleman or central authority to regulate transactions or exchanges of information.

Because all interactions are recorded in the distributed ledger, the blockchain makes everything a matter of public record, which, when it comes to Bitcoin, is what ensures that transactions are legitimate, and that units of the currency aren't duplicated.
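To make the tamper-evident ledger idea concrete, here is a toy hash-linked chain in Python. It is a sketch of the general structure only; real Bitcoin blocks (and the Russian team's quantum-secured design) carry far more, such as Merkle trees of transactions and proof-of-work fields, plus, in the conventional case, the digital signatures discussed next.

```python
import hashlib
import json
import time

def block_hash(blk):
    """Hash the block's payload, timestamp and pointer to the previous block."""
    body = {k: blk[k] for k in ("data", "time", "prev")}
    return hashlib.sha256(json.dumps(body, sort_keys=True).encode()).hexdigest()

def make_block(data, prev_hash):
    blk = {"data": data, "time": time.time(), "prev": prev_hash}
    blk["hash"] = block_hash(blk)
    return blk

def verify(chain):
    # Every block's hash must match its contents, and each block must point at
    # the hash of the block before it; editing anything breaks verification.
    return all(b["hash"] == block_hash(b) for b in chain) and \
           all(b["prev"] == a["hash"] for a, b in zip(chain, chain[1:]))

chain = [make_block("genesis", "0")]
chain.append(make_block("Alice pays Bob 5", chain[-1]["hash"]))
chain.append(make_block("Bob pays Carol 2", chain[-1]["hash"]))
print(verify(chain))                    # True
chain[1]["data"] = "Alice pays Bob 500"
print(verify(chain))                    # False: tampering is evident
```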

The problem with this is that when someone's computer conducts transactions, the system uses digital signatures for authentication purposes. But while that protection layer may offer strong enough encryption to secure those exchanges today, it won't be able to withstand quantum computers.

Quantum computers are a technology that's still in development, but once they mature, they're set to offer computational power and speed far in excess of what today's computers can achieve.

While that means quantum computers are poised to do great things for us in tomorrow's world, it's a double-edged sword because that massive increase in performance also means these machines could pose a huge security risk in the world of IT, breaking through comparatively weak encryption walls that currently protect the world of banking, defence, email, social media, you name it.

"If quantum computing takes three decades to truly arrive, there's no reason to panic," as Nicole Kobie reported for Wired last year.

"If it lands in 10 years, our data is in serious trouble. But it's impossible to predict with certainty when it will happen."

Because of this, today's security researchers are busy trying to invent secure systems that can defend us from the unbelievably fast supercomputers of tomorrow, a pretty tall order, considering these awesome systems haven't even really been invented yet.

That's what the Russian team's quantum-proof blockchain is: another attempt to devise a digital fortress that won't be crushed by quantum computers. And the key, the researchers say, is abandoning part of what currently helps protect blockchain transactions.

"In our quantum-secure blockchain setup, we get rid of digital signatures altogether," one of the researchers, Alexander Lvovsky, told Mary-Ann Russon at IBTimes UK.

"Instead, we utilise quantum cryptography for authentication."

Quantum cryptography depends on entangled particles to work, and the researchers' system used what's called quantum key distribution, which the researchers say makes it possible to make sure nobody's eavesdropping on private communications.
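The article does not spell out how quantum key distribution works. The best-known protocol is BB84, in which each bit is sent as a photon prepared in one of two randomly chosen bases. The toy sketch below models only the classical bookkeeping of the sifting step, assuming an ideal channel and no eavesdropper; the point of the real protocol is that an eavesdropper measuring in the wrong basis disturbs the photons, producing errors that Alice and Bob can detect by comparing a sample of the sifted key.

```python
import secrets

def bb84_sift(n_photons):
    """Toy BB84 sifting step: ideal channel, no eavesdropper, no error checking."""
    alice_bits  = [secrets.randbelow(2) for _ in range(n_photons)]
    alice_bases = [secrets.randbelow(2) for _ in range(n_photons)]  # 0 = rectilinear, 1 = diagonal
    bob_bases   = [secrets.randbelow(2) for _ in range(n_photons)]
    # When Bob happens to measure in Alice's basis he recovers her bit exactly;
    # otherwise his outcome is random, and that position is discarded after the
    # public comparison of bases.
    key = [bit for bit, a, b in zip(alice_bits, alice_bases, bob_bases) if a == b]
    return key

print(len(bb84_sift(1000)))   # roughly 500 key bits survive sifting
```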

"Parties that communicate via a quantum channel can be completely sure that they are talking to each other, not anybody else. This is the main idea," Lvovsky said.

"Then we had to re-invent the entire blockchain architecture to 'fit' our new authentication technology, thereby making this architecture immune to quantum computer attacks."

The system they've experimented with was tested on a 3-node (computer) network, but it's worth pointing out that while the team is claiming victory so far, this kind of research remains hypothetical at this point, and the study has yet to undergo peer-review.

But given the looming technological avalanche that quantum computers represent for digital security, all we'll say is we're glad scientists are working on this while there's still time.

Because, make no mistake, the future is headed this way fast.

The study has been published on pre-print website arXiv.org.

Read the rest here:
Scientists claim to have invented the world's first quantum-proof ... - ScienceAlert

Google Plans to Demonstrate the Supremacy of Quantum … – IEEE Spectrum

Photo: Erik Lucero. Put Chip Here: Google will put its superconducting quantum computer chip in this 10-millikelvin dilution refrigerator.

Quantum computers have long held the promise of performing certain calculations that are impossible, or at least entirely impractical, for even the most powerful conventional computers to perform. Now, researchers at a Google laboratory in Goleta, Calif., may finally be on the cusp of proving it, using the same kinds of quantum bits, or qubits, that one day could make up large-scale quantum machines.

By the end of this year, the team aims to increase the number of superconducting qubits it builds on integrated circuits to create a 7-by-7 array. With this quantum IC, the Google researchers aim to perform operations at the edge of what's possible with even the best supercomputers, and so demonstrate "quantum supremacy."

"We've been talking about, for many years now, how a quantum processor could be powerful because of the way that quantum mechanics works, but we want to specifically demonstrate it," says team member John Martinis, a professor at the University of California, Santa Barbara, who joined Google in 2014.

A system size of 49 superconducting qubits is still far away from what physicists think will be needed to perform the sorts of computations that have long motivated quantum computing research. One of those is Shor's algorithm, a computational scheme that would enable a quantum computer to quickly factor very large numbers and thus crack one of the foundational components of modern cryptography. In a recent commentary in Nature, Martinis and colleagues estimated that a 100-million-qubit system would be needed to factor a 2,000-bit number, a not-uncommon public key length, in one day. Most of those qubits would be used to create the special quantum states that would be needed to perform the computation and to correct errors, creating a mere thousand or so stable logical qubits from thousands of less stable physical components, Martinis says.
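Some context on why factoring drives those estimates: Shor's algorithm uses the quantum processor only to find the period r of a^x mod N; turning that period into factors is ordinary classical arithmetic. Below is a sketch of that classical post-processing step, using toy numbers (a 2,000-bit N is what would demand the machine Martinis describes).

```python
from math import gcd

def factors_from_period(N, a, r):
    """Classical tail end of Shor's algorithm: given the period r of a^x mod N
    (the part the quantum computer would find), recover non-trivial factors."""
    if r % 2 == 1 or pow(a, r // 2, N) == N - 1:
        return None                     # unlucky choice of base a: retry with another
    half = pow(a, r // 2, N)
    return gcd(half - 1, N), gcd(half + 1, N)

# Toy example: N = 15, a = 7. The powers 7, 4, 13, 1 repeat with period r = 4.
print(factors_from_period(15, 7, 4))    # (3, 5)
```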

There will be no such extra infrastructure in this 49-qubit system, which means a different computation must be performed to establish supremacy. To demonstrate the chip's superiority over conventional computers, the Google team will execute operations on the array that will cause it to evolve chaotically and produce what looks like a random output. Classical machines can simulate this output for smaller systems. In April, for example, Lawrence Berkeley National Laboratory reported that its 29-petaflop supercomputer, Cori, had simulated the output of 45 qubits. But 49 qubits would push, if not exceed, the limits of conventional supercomputers.
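The 45-versus-49-qubit distinction largely comes down to memory: simulating an n-qubit state naively means storing 2^n complex amplitudes, so each extra qubit doubles the requirement. A back-of-the-envelope check, assuming 16 bytes per double-precision complex amplitude (real simulations such as the Cori run use cleverer encodings and distribution across many nodes):

```python
def statevector_bytes(n_qubits, bytes_per_amplitude=16):
    """Memory for a naive full statevector of n qubits (complex doubles)."""
    return (2 ** n_qubits) * bytes_per_amplitude

for n in (30, 45, 49):
    print(f"{n} qubits: {statevector_bytes(n) / 2**40:10,.3f} TiB")
# 30 qubits:      0.016 TiB  (~16 GiB, laptop territory)
# 45 qubits:    512.000 TiB
# 49 qubits:  8,192.000 TiB  (~8 PiB, beyond any single machine's RAM)
```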

This computation does not as yet have a clear practical application. But Martinis says there are reasons beyond demonstrating quantum supremacy to pursue this approach. The qubits used to make the 49-qubit array can also be used to make larger universal quantum systems with error correction, the sort that could do things like decryption, so the chip should provide useful validation data.

Photo: Erik Lucero. Steps to Supremacy: Google's quantum computing chip is a 2-by-3 array of qubits. The company hopes to make a 7-by-7 array later this year.

There may also be, the team suspects, untapped computational potential in systems with little or no error correction. "It would be wonderful if this were true, because then we could have useful products right away instead of waiting for a long time," says Martinis. One potential application, the team suggests, could be in the simulation of chemical reactions and materials.

Google recently performed a dry run of the approach on a 9-by-1 array of qubits and tested out some fabrication technology on a 2-by-3 array. Scaling up the number of qubits will happen in stages. "This is a challenging system engineering problem," Martinis says. "We have to scale it up, but the qubits still have to work well. We can't have any loss in fidelity, any increase in error rates, and I would say error rates and scaling tend to kind of compete against each other." Still, he says, the team thinks there could be a way to scale up systems well past 50 qubits even without error correction.

Google is not the only company working on building larger quantum systems without error correction. In March, IBM unveiled a plan to create such a superconducting qubit system in the next few years, also with roughly 50 qubits, and to make it accessible on the cloud. "Fifty is a magic number," says Bob Sutor, IBM's vice president for this area, because that's around the point where quantum computers will start to outstrip classical computers for certain tasks.

The quality of superconducting qubits has advanced a lot over the years since D-Wave Systems began offering commercial quantum computers, says Scott Aaronson, a professor of computer science at the University of Texas at Austin. D-Wave, based in Burnaby, B.C., Canada, has claimed that its systems offer a speedup over conventional machines, but Aaronson says there has been no convincing demonstration of that. Google, he says, is clearly aiming for a demonstration of quantum supremacy that is "not something you'll have to squint and argue about."

It's still unclear whether there are useful tasks a 50-or-so-qubit chip could perform, Aaronson says. Nor is it certain whether systems can be made bigger without error correction. But he says quantum supremacy will be an important milestone nonetheless, one that is a natural offshoot of the effort to make large-scale, universal quantum machines: "I think that it is absolutely worth just establishing as clearly as we can that the world does work this way. Certainly, if we can do it as a spin-off of technology that will be useful eventually in its own right, then why the hell not?"

This article appears in the June 2017 print issue as "Google Aims for Quantum Computing Supremacy."

See the article here:
Google Plans to Demonstrate the Supremacy of Quantum ... - IEEE Spectrum