Category Archives: Quantum Computer

I confess, I’m scared of the next generation of supercomputers – TechRadar

Earlier this year, a Japanese supercomputer built on Arm-based Fujitsu A64FX processors snatched the crown of world's fastest machine, blowing incumbent leader IBM Summit out of the water.

Fugaku, as the machine is known, achieved 415.5 petaFLOPS on the popular High Performance Linpack (HPL) benchmark, almost three times the score of the IBM machine (148.5 petaFLOPS).

It also topped the rankings for Graph 500, HPL-AI and HPCG workloads - a feat never before achieved in the world of high performance computing (HPC).

Modern supercomputers are now edging ever-closer to the landmark figure of one exaFLOPS (equal to 1,000 petaFLOPS), commonly described as the exascale barrier. In fact, Fugaku itself can already achieve one exaFLOPS, but only in lower precision modes.

The consensus among the experts we spoke to is that a single machine will breach the exascale barrier within the next 6 - 24 months, unlocking a wealth of possibilities in the fields of medical research, climate forecasting, cybersecurity and more.

But what is an exaFLOPS? And what will it mean to break the exascale milestone, pursued doggedly for more than a decade?

To understand what it means to achieve exascale computing, it's important to first understand what is meant by FLOPS, which stands for floating point operations per second.

A floating point operation is any arithmetic calculation (addition, subtraction, multiplication or division) that involves a number containing a decimal point (e.g. 3.0, a floating point number), as opposed to a whole number without one (e.g. 3, an integer). Calculations involving decimals are typically more complex and therefore take longer to solve.

An exascale computer can perform 10^18 (one quintillion, or 1,000,000,000,000,000,000) of these mathematical calculations every second.

For context, to equal the number of calculations an exascale computer can process in a single second, an individual would have to perform one sum every second for 31,688,765,000 years.
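
The arithmetic behind that comparison is easy to check. Here is a quick back-of-the-envelope sketch in Python (our own illustration, using only the figures quoted above):

    EXAFLOPS = 10**18                       # operations per second at exascale
    SECONDS_PER_YEAR = 365.25 * 24 * 3600   # roughly 31.6 million seconds

    # One person doing one calculation per second would need about
    # 31.7 billion years to match one second of exascale work.
    print(EXAFLOPS / SECONDS_PER_YEAR)      # ~3.17e10 years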

The PC I'm using right now, meanwhile, is able to reach 147 billion FLOPS (or 0.00000014723 exaFLOPS), outperforming the fastest supercomputer of 1993, the Intel Paragon (143.4 billion FLOPS).

This both underscores how far computing has come in the last three decades and puts into perspective the extreme performance levels attained by the leading supercomputers today.

The key to building a machine capable of reaching one exaFLOPS is optimization at the processing, storage and software layers.

The hardware must be small and powerful enough to pack together and reach the necessary speeds, the storage capacious and fast enough to serve up the data and the software scalable and programmable enough to make full use of the hardware.

For example, there comes a point at which adding more processors to a supercomputer will no longer affect its speed, because the application is not sufficiently optimized. The only way governments and private businesses will realize a full return on HPC hardware investment is through an equivalent investment in software.
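
One standard way to picture this limit (our own illustration; the article does not spell it out) is Amdahl's law: if part of an application cannot be parallelized, adding processors eventually stops helping. A minimal Python sketch:

    def amdahl_speedup(parallel_fraction: float, processors: int) -> float:
        # Speedup when only parallel_fraction of the work can be spread
        # across processors and the rest runs serially.
        serial_fraction = 1.0 - parallel_fraction
        return 1.0 / (serial_fraction + parallel_fraction / processors)

    # With 95% of the application parallelized, speedup stalls near 20x,
    # no matter how many processors are added.
    for n in (10, 100, 1_000, 1_000_000):
        print(n, round(amdahl_speedup(0.95, n), 1))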

Organizations such as the Exascale Computing Project (ECP) and the ExCALIBUR programme are interested in solving precisely this problem. Those involved claim a renewed focus on algorithm and application development is required in order to harness the full power and scope of exascale.

Achieving the delicate balance between software and hardware, in an energy-efficient manner and without an impractically low mean time between failures (MTBF) score (the average time a system runs before breaking down), is the challenge facing the HPC industry.

"15 years ago, as we started the discussion on exascale, we hypothesized that it would need to be done in 20 megawatts (MW); later that was changed to 40 MW. With Fugaku, we see that we are about halfway to a 64-bit exaFLOPS at the 40 MW power envelope, which shows that an exaFLOPS is in reach today," explained Brent Gorda, Senior Director of HPC at UK-based chip designer Arm.

"We could hit an exaFLOPS now with sufficient funding to build and run a system. [But] the size of the system is likely to be such that MTBF is measured in single-digit numbers of days, based on today's technologies and the number of components necessary to reach these levels of performance."

When it comes to building a machine capable of breaching the exascale barrier, there are a number of other factors at play, beyond technological feasibility. An exascale computer can only come into being once an equilibrium has been reached at the intersection of technology, economics and politics.

"One could in theory build an exascale system today by packing in enough CPUs, GPUs and DPUs. But what about economic viability?" said Gilad Shainer of NVIDIA Mellanox, the firm behind the InfiniBand technology (the fabric that links the various hardware components) found in seven of the ten fastest supercomputers.

Improvements in computing technologies, silicon processing, more efficient use of power and so on all help to increase efficiency and make exascale computing an economic objective as opposed to a sort of sporting achievement.

According to Paul Calleja, who heads up computing research at the University of Cambridge and is working with Dell on the Open Exascale Lab, Fugaku is an excellent example of what is theoretically possible today, but is also impractical by virtually any other metric.

"If you look back at Japanese supercomputers, historically there's only ever been one of them made. They have beautifully exquisite architectures, but they're so stupidly expensive and proprietary that no one else could afford one," he told TechRadar Pro.

"[Japanese organizations] like these really large technology demonstrators, which are very useful in industry because they show the direction of travel and push advancements, but those kinds of advancements are very expensive and not sustainable, scalable or replicable."

So, in this sense, there are two separate exascale landmarks: the theoretical barrier, which will likely be met first by a machine of Fugaku's ilk (a technology demonstrator), and the practical barrier, which will see exascale computing deployed en masse.

Geopolitical factors will also play a role in how quickly the exascale barrier is breached. Researchers and engineers might focus exclusively on the technological feat, but the institutions and governments funding HPC research are likely motivated by different considerations.

"Exascale computing is not just about reaching theoretical targets; it is about creating the ability to tackle problems that have previously been intractable," said Andy Grant, Vice President of HPC & Big Data at IT services firm Atos, a firm influential in the fields of HPC and quantum computing.

"Those that are developing exascale technologies are not doing it merely to have the fastest supercomputer in the world, but to maintain international competitiveness, security and defence."

"In Japan, their new machine is roughly 2.8x more powerful than the now-second-place system. In broad terms, that will enable Japanese researchers to address problems that are 2.8x more complex. In the context of international competitiveness, that creates a significant advantage."

In years gone by, rival nations fought it out in the trenches or competed to see who could place the first human on the moon. But computing may well become the frontier at which the next arms race takes place; supremacy in the field of HPC might prove just as politically important as military strength.

Once exascale computers become an established resource - available for businesses, scientists and academics to draw upon - a wealth of possibilities will be unlocked across a wide variety of sectors.

HPC could prove revelatory in the fields of clinical medicine and genomics, for example, which require vast amounts of compute power to conduct molecular modelling, simulate interactions between compounds and sequence genomes.

In fact, IBM Summit and a host of other modern supercomputers are being used to identify chemical compounds that could contribute to the fight against coronavirus. The Covid-19 High Performance Computing Consortium assembled 16 supercomputers, accounting for an aggregate of 330 petaFLOPS - but imagine how much more quickly research could be conducted using a fleet of machines capable of reaching 1,000 petaFLOPS on their own.

Artificial intelligence, meanwhile, is another cross-disciplinary domain that will be transformed with the arrival of exascale computing. The ability to analyze ever-larger datasets will improve the ability of AI models to make accurate forecasts (contingent on the quality of data fed into the system) that could be applied to virtually any industry, from cybersecurity to e-commerce, manufacturing, logistics, banking, education and many more.

As explained by Rashid Mansoor, CTO at UK supercomputing startup Hadean, the value of supercomputing lies in the ability to make an accurate projection (of any variety).

"The primary purpose of a supercomputer is to compute some real-world phenomenon to provide a prediction. The prediction could be the way proteins interact, the way a disease spreads through the population, how air moves over an aerofoil or how electromagnetic fields interact with a spacecraft during re-entry," he told TechRadar Pro.

"Raw performance, such as the HPL benchmark, simply indicates that we can model bigger and more complex systems to a greater degree of accuracy. One thing that the history of computing has shown us is that the demand for computing power is insatiable."

Other commonly cited areas that will benefit significantly from the arrival of exascale include brain mapping, weather and climate forecasting, product design and astronomy, but it's also likely that brand new use cases will emerge as well.

"The desired workloads and the technology to perform them form a virtuous circle. The faster and more performant the computers, the more complex the problems we can solve and the faster the discovery of new problems," explained Shainer.

"What we can be sure of is that we will see continuous need and ever-growing demand for more performance capabilities in order to solve the unsolvable. Once this is solved, we will find the new unsolvable."

By all accounts, the exascale barrier will likely fall within the next two years, but the HPC industry will then turn its attention to the next objective, because the work is never done.

Some might point to quantum computers, which approach problem solving in an entirely different way to classical machines (exploiting quantum effects such as superposition and entanglement to speed up processing), allowing for far greater scale. However, there are also problems to which quantum computing cannot be applied.

"Mid-term (10-year) prospects for quantum computing are starting to shape up, as are other technologies. These will be more specialized, where a quantum computer will very likely show up as an application accelerator, for problems that relate to logistics first. They won't completely replace the need for current architectures for IT/data processing," explained Gorda.

As Mansoor puts it, on certain problems even a small quantum computer can be exponentially faster than all of the classical computing power on Earth combined, yet on other problems a quantum computer could be slower than a pocket calculator.

The next logical landmark for traditional computing, then, would be one zettaFLOPS, equal to 1,000 exaFLOPS or 1,000,000 petaFLOPS.

Chinese researchers predicted in 2018 that the first zettascale system will come online in 2035, paving the way for new computing paradigms. The paper itself reads like science fiction, at least for the layman:

"To realize these metrics, micro-architectures will evolve to consist of more diverse and heterogeneous components. Many forms of specialized accelerators are likely to co-exist to boost HPC in a joint effort. Enabled by new interconnect materials such as photonic crystals, fully optical interconnecting systems may come into use."

Assuming one exaFLOPS is reached by 2022, 14 years will have elapsed between the creation of the first petascale and first exascale systems. The first terascale machine, meanwhile, was constructed in 1996, 12 years before the petascale barrier was breached.
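
A purely illustrative extrapolation of that cadence (a sketch of our own, assuming the historical pattern simply continues) shows why a mid-2030s estimate is at least arithmetically plausible:

    # Milestones cited above: terascale 1996, petascale 2008, exascale ~2022.
    years = [1996, 2008, 2022]
    gaps = [b - a for a, b in zip(years, years[1:])]   # [12, 14]
    avg_gap = sum(gaps) / len(gaps)                    # 13.0 years per 1,000x jump

    # If the next 1,000x jump takes about as long as the previous two...
    print("projected zettascale year:", round(2022 + avg_gap))   # 2035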

If this pattern were to continue, the Chinese researchers' estimate would look relatively sensible, but there are firm question marks over the validity of zettascale projections.

While experts are confident in their predicted exascale timelines, none would venture a guess at when zettascale might arrive without prefacing their estimate with a long list of caveats.

"Is that an interesting subject? Because, to be honest with you, it's so not obtainable. To imagine how we could go 1,000x beyond [one exaFLOPS] is not a conversation anyone could have, unless they're just making it up," said Calleja, when asked about the concept of zettascale.

Others were more willing to theorize, but equally reticent to guess at a specific timeline. According to Grant, the way zettascale machines process information will be unlike any supercomputer in existence today.

"[Zettascale systems] will be data-centric, meaning components will move to the data rather than the other way around, as data volumes are likely to be so large that moving data will be too expensive. Regardless, predicting what they might look like is all guesswork for now," he said.

It is also possible that the decentralized model might be the fastest route to achieving zettascale, with millions of less powerful devices working in unison to form a collective supercomputer more powerful than any single machine (as put into practice by the SETI@home project).

As noted by Saurabh Vij, CEO of distributed supercomputing firm Q Blocks, decentralized systems address a number of problems facing the HPC industry today, namely surrounding building and maintenance costs. They are also accessible to a much wider range of users and therefore democratize access to supercomputing resources in a way that is not otherwise possible.

"There are benefits to a centralized architecture, but the cost and maintenance barrier overshadows them. [Centralized systems] also alienate a large base of customer groups that could benefit," he said.

"We think a better way is to connect distributed nodes together in a reliable and secure manner. It wouldn't be too aggressive to say that, five years from now, your smartphone could be part of a giant distributed supercomputer, making money for you while you sleep by solving computational problems for industry," he added.

However, incentivizing network nodes to remain active for a long period is challenging and a high rate of turnover can lead to reliability issues. Network latency and capacity problems would also need to be addressed before distributed supercomputing can rise to prominence.

Ultimately, the difficulty in making firm predictions about zettascale lies in the massive chasm that separates present-day workloads and HPC architectures from those that might exist in the future. From a contemporary perspective, it's fruitless to imagine what might be made possible by a computer so powerful.

We might imagine zettascale machines will be used to process workloads similar to those tackled by modern supercomputers, only more quickly. But it's possible - even likely - that the arrival of zettascale computing will open doors that do not and cannot exist today, so extraordinary is the leap.

In a future in which computers are 2,000+ times as fast as the most powerful machine today, philosophical and ethical debates surrounding the intelligence of man versus machine are bound to be played out in greater detail - and with greater consequence.

It is impossible to directly compare the workings of a human brain with those of a computer - i.e. to assign a FLOPS value to the human mind. However, it is not unreasonable to ask how many FLOPS must be achieved before a machine reaches a level of performance that might be loosely comparable to the brain.

Back in 2013, scientists used the K supercomputer to conduct a neuronal network simulation using open source simulation software NEST. The team simulated a network made up of 1.73 billion nerve cells connected by 10.4 trillion synapses.

While enormous, the simulation represented only 1% of the human brain's neuronal network and took 40 minutes to replicate one second's worth of neuronal network activity.

However, the K computer reached a maximum computational power of only 10 petaFLOPS. A basic extrapolation (ignoring inevitable complexities), then, would suggest Fugaku could simulate circa 40% of the human brain, while a zettascale computer would be capable of performing a full simulation many times over.
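
Written out, that "basic extrapolation" is just linear scaling from the K computer's run - a deliberately naive sketch that assumes simulation capacity grows in direct proportion to FLOPS, which in practice it almost certainly would not:

    K_FLOPS = 10e15          # K computer: ~10 petaFLOPS
    K_BRAIN_FRACTION = 0.01  # its NEST run covered ~1% of the brain's network

    def brain_fraction(flops):
        # Naive linear scaling from the 2013 K computer simulation.
        return K_BRAIN_FRACTION * flops / K_FLOPS

    print(f"Fugaku (415.5 PFLOPS): ~{brain_fraction(415.5e15):.0%} of the brain")
    print(f"Zettascale (1e21 FLOPS): ~{brain_fraction(1e21):,.0f}x a full brain")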

Digital neuromorphic hardware (supercomputers created specifically to simulate the human brain) like SpiNNaker 1 and 2 will also continue to develop in the post-exascale future. Instead of sending information from point A to B, these machines will be designed to replicate the parallel communication architecture of the brain, sending information simultaneously to many different locations.

Modern iterations are already used to help neuroscientists better understand the mysteries of the brain and future versions, aided by advances in artificial intelligence, will inevitably be used to construct a faithful and fully-functional replica.

The ethical debates that will arise with the arrival of such a machine - surrounding the perception of consciousness, the definition of thought and what an artificial uber-brain could or should be used for - are manifold and could take generations to unpick.

The inability to foresee what a zettascale computer might be capable of is also an inability to plan for the moral quandaries that might come hand-in-hand.

Whether a future supercomputer might be powerful enough to simulate human-like thought is not in question, but whether researchers should aspire to bringing an artificial brain into existence is a subject worthy of discussion.

View original post here:
I confess, I'm scared of the next generation of supercomputers - TechRadar

Q-NEXT collaboration awarded National Quantum Initiative funding – University of Wisconsin-Madison

The University of Wisconsin–Madison solidified its standing as a leader in the field of quantum information science when the U.S. Department of Energy (DOE) and the White House announced the Q-NEXT collaboration as a funded Quantum Information Science Research Center through the National Quantum Initiative Act. The five-year, $115 million collaboration was one of five centers announced today.

Q-NEXT, a next-generation quantum science and engineering collaboration led by the DOE's Argonne National Laboratory, brings together nearly 100 world-class researchers from three national laboratories, 10 universities including UW–Madison, and 10 leading U.S. technology companies to develop the science and technology to control and distribute quantum information.

"The main goals for Q-NEXT are, first, to deliver quantum interconnects, to find ways to quantum mechanically connect distant objects," says Mark Eriksson, the John Bardeen Professor of Physics at UW–Madison and a Q-NEXT thrust lead. "And next, to establish a national resource to both develop and provide pristine materials for quantum science and technology."

Q-NEXT will focus on three core quantum technologies:

Eriksson is leading the Materials and Integration thrust, one of six Q-NEXT focus areas that feature researchers from across the collaboration. This thrust aims to: develop high-coherence materials, including for silicon and superconducting qubits, which are essential to preserving entanglement; develop a silicon-based optical quantum memory, which is important in developing a quantum repeater; and improve color-center quantum bits, which are used in both communication and sensing.

"One of the key goals in Materials and Integration is to not just improve the materials but also to improve how you integrate those materials together so that, in the end, quantum devices maintain coherence and preserve entanglement," Eriksson says. "The integration part of the name is really important. You may have a material that on its own is really good at preserving coherence, yet you only make something useful when you integrate materials together."

Six other UW–Madison and Wisconsin Quantum Institute faculty members are Q-NEXT investigators: physics professors Victor Brar, Shimon Kolkowitz, Robert McDermott, and Mark Saffman; electrical and computer engineering professor Mikhail Kats; and chemistry professor Randall Goldsmith. UW–Madison researchers are involved in five of the six research thrusts.

"I'm excited about Q-NEXT because of the connections and collaborations it provides to national labs, other universities, and industry partners," Eriksson says. "When you're talking about research, it's those connections that often lead to the breakthroughs."

The potential impacts of Q-NEXT research include the creation of a first-ever National Quantum Devices Database that will promote the development and fabrication of next-generation quantum devices, as well as the development of the components and systems that enable quantum communications across distances ranging from microns to kilometers.

"This funding helps ensure that the Q-NEXT collaboration will lead the way in future developments in quantum science and engineering," says Steve Ackerman, UW–Madison vice chancellor for research and graduate education. "Q-NEXT is the epitome of the Wisconsin Idea as we work together to transfer new quantum technologies to the marketplace and support U.S. economic competitiveness in this growing field."

Read more from the original source:
Q-NEXT collaboration awarded National Quantum Initiative funding - University of Wisconsin-Madison

UArizona Scientists to Build What Einstein Wrote off as Science Fiction – UANews

By Daniel Stolte, University Communications

Arizona Gov. Doug Ducey today joined University of Arizona President Robert C. Robbins and leading scientists from the new University of Arizona-based Center for Quantum Networks to talk about how the center will help develop the "internet of the future."

The National Science Foundation has awarded UArizona a five-year, $26 million grant, with an additional $24 million, five-year option, to lead the Center for Quantum Networks, or CQN, a National Science Foundation Engineering Research Center. The award has placed Arizona at the forefront of quantum networking technologies, which are expected to transform areas such as medicine, finance, data security, artificial intelligence, autonomous systems and smart devices - which together are often referred to as "the internet of things."

"Arizona continues to lead the nation in innovation. Establishing the Center for Quantum Networks will position the state as a global leader in advancing this technology and developing the workforce of the future," Gov. Doug Ducey said. "We're proud of the work the University of Arizona has done to secure this grant and look forward to the scientific achievements that will result from it."

The CQN will take center stage in a burgeoning field. Companies like IBM, Microsoft and Google are racing to build reliable quantum computers, and China has invested billions of dollars in quantum technology research. The U.S. has begun a serious push to exceed China's investment and to "win" the global race to harness quantum technologies.

"Less than a year ago, a quantum computer for the first time performed certain calculations that are no longer feasible for even the largest conventional supercomputers," said Saikat Guha, CQN director and principal investigator and associate professor in the UArizona James C. Wyant College of Optical Sciences, who joined Ducey and Robbins for the virtual event. "The quantum internet will allow for applications that will never be possible on the internet as we know it."

Unlike the existing internet, in which computers around the globe exchange data encoded in the familiar 0s and 1s, the quantum internet will rely on a global network of quantum processors speaking to one another via "quantum bits," or qubits.

Qubits offer dramatic increases in processing capacity over conventional bits because they can exist in not just one state, but two at the same time. Known as superposition, this difficult-to-grasp principle was first popularized by "Schrödinger's Cat," the famous thought experiment in which an imaginative cat inside a box is neither dead nor alive until an equally imaginative observer opens the box and checks.

The key new resource that the quantum network enables, by being able to communicate qubits from one point to another, is to create "entanglement" across various distant users of the network. Entanglement, another hallmark of quantum mechanics so strange that even Einstein was reluctant to accept it at first, allows a pair of particles, including qubits, to stay strongly correlated despite being separated by large physical distances. Entanglement enables communication among parties that is impossible to hack.

One of the center's goals is to develop technologies that will put the entanglement principle to use in real-world applications - for example, to stitch together far-apart sensors, such as the radio telescopes that glimpsed the first image of a black hole in space, into one giant instrument that is far more capable than the sum of the individual sensors. Similar far-reaching implications are expected in the autonomous vehicles industry and in medicine.

"Who knows, 50 years from now, your internet service provider may send a technician to your house to install a CQN-patented quantum-enabled router that does everything your current router does, but more," Guha said. "It lets you hook up your quantum gadgets to what we are beginning to build today the new internet of the future."

A first-of-its-kind campuswide quantum networking testbed will be built at the University of Arizona, connecting laboratories across the UArizona campus, initially spanning the College of Optical Sciences, Department of Electrical and Computer Engineering, Department of Materials Science and Engineering and the BIO5 Institute.

"The next few years will be very exciting, as we are at a time when the community puts emerging quantum computers, processors, sensors and other gadgets to real use," Guha said. "We are just beginning to connect small quantum computers, sensors and other gadgets into quantum networks that transmit quantum bits."

According to Guha, quantum-enabled sensors will be more sensitive than classical ones, and will dramatically improve technologies such as microscopes used in biomedical research to look for cancer cells, sensors on low-Earth-orbit satellites, and magnetic field sensors used for positioning and navigation.

Guha says today's internet is a playground for hackers, due to insecure communication links to inadequately guarded data in the cloud. Quantum systems will provide a level of privacy, security and computational clout that is impossible to achieve with today's internet.

"The Center for Quantum Networking stands as an example for the core priorities of our university-wide strategic plan," said UArizona President Robert C. Robbins. "As a leading international research university bringing the Fourth Industrial Revolution to life, we are deeply committed to (our strategic plan to) advance amazing new information technologies like quantum networking to benefit humankind. And we are equally committed to examining the complex, social, legal, economic and policy questions raised by these new technologies.

"In addition to bringing researchers together from intellectually and culturally diverse disciplines, the CQN will provide future quantum engineers and social scientists with incredible learning opportunities and the chance to work side by side with the world's leading experts."

The center will bring together scientists, engineers and social scientists working on quantum information science and engineering and its societal impacts. UArizona has teamed up with core partners Harvard University, the Massachusetts Institute of Technology and Yale University to work on the core hardware technologies for quantum networks and create an entrepreneurial ecosystem for quantum network technology transfer.

In addition to creating a diverse quantum engineering workforce, the center will develop a roadmap with industry partners to help prioritize CQN's research investments in response to new application concepts generated by the center.

Jane Bambauer, CQN co-deputy director and professor in the James E. Rogers College of Law, who also spoke about the center, said that "the classical internet changed our relationship to computers and each other."

"While we build the technical foundations for the quantum internet, we are also building the foundation for a socially responsible rollout of the new technology," Bambauer said. "We are embedding policy and social science expertise into our center's core research activities. We're also creating effective and inclusive education programs to make sure that the opportunities for jobs and for invention are shared broadly."

This is the third National Science Foundation Engineering Research Center led by the University of Arizona. The other two are the ERC for Environmentally Benign Semiconductor Manufacturing, led by the College of Engineering, and the Center for Integrated Access Networks, led by the Wyant College of Optical Sciences. CQN will be bolstered by the Wyant College's recent endowments including the largest faculty endowment gift in the history of the University of Arizona and the planned construction of the new Grand Challenges Research Building, supported by the state of Arizona.

Additional speakers at today's event included:

Original post:
UArizona Scientists to Build What Einstein Wrote off as Science Fiction - UANews

Quantum leap? US plans for unhackable internet may not fructify within a decade, but India is far behind – The Financial Express

Last week, the US Department of Energy released a blueprint for a national quantum internet. The project, if successful, the government claimed, will ensure a safer and nearly unhackable internet within the next decade. Earlier this year, the University of Chicago, one of the participants in DoE's project, had created a 52-mile quantum loop to transfer subatomic particles.

The US, however, is not the only country to be working on quantum internet.

Dutch researchers are expected to establish a link between Delft and The Hague later this year. In February, Chinese researchers, in a paper published in Nature, claimed to have demonstrated some success in this regard over a 50 km range.

Last year, when Google announced that its quantum computer had inched closer towards quantum supremacy - the point at which a computer can perform tasks that are out of reach for even the most advanced supercomputer - blockchain experts had expressed concerns about the technology, as it would make it easy to break encryption. Recently, the CTO of currency exchange and payments platform Ripple said that quantum computers could pose a threat to blockchain technology and bitcoin. Given that the development of quantum computers is happening at an accelerated pace, there is also a need for a quantum internet to transfer information without danger of hacking.

While the US experiments are laudable, all eyes will now be on the Dutch researchers, as they would also demonstrate a quantum repeater, which will help transport a single photon over long distances. The problem, till now, in establishing such a network has been transferring a photon. A repeater, if successful, can make this a reality.

The success, in this case, is dependenton a property of quantum particles called entanglement.

Entanglement allows two photons, when observed at a considerable distance from each other, to take opposite values from each other. Albert Einstein, in a 1935 EPR paper, had discussed entanglement but termed it a spooky phenomenon, as it appeared to defy his theory of relativity. Experiments building on John Bell's work later confirmed that the phenomenon is real. But using photons has been difficult largely because a photon cannot be cloned. Any interference destroys the information. And this property is what makes them safe as well.
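
To make the "opposite values" idea concrete, here is a toy simulation in Python (our own illustration, not the researchers' model; it ignores measurement bases and everything else that makes real entanglement experiments hard):

    import random

    def measure_entangled_pair():
        alice = random.choice([0, 1])   # each result on its own looks random...
        bob = 1 - alice                 # ...but the partner always gives the opposite value
        return alice, bob

    pairs = [measure_entangled_pair() for _ in range(5)]
    print(pairs)                           # e.g. [(0, 1), (1, 0), (1, 0), (0, 1), (1, 0)]
    print(all(a != b for a, b in pairs))   # True: perfectly anti-correlated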

But developing a point-to-point connection is quite different from creating a network. Dr Apoorva D Patel, one of India's leading quantum researchers and a professor at the Centre for High Energy Physics, Indian Institute of Science, Bengaluru, says that we are still a long way from establishing a network. "We would need hubs, relays, standard storage and error correction, and all this is still years away." He also debunks the claims of a quantum internet within a decade: "The 10-year timeline is the government's hypothesis, not the scientists' hypothesis." Even if it is developed, the speeds, Dr Patel says, will be much slower than the regular internet, probably by a factor of a thousand. And this will not improve, so it will have only limited application.

Besides, the system is not entirely unhackable. "The quantum transmission is unhackable, but end-points are still hackable. The sending and receiving stations are vulnerable, as at those points you are trying to convert the classical signal into the quantum signal or vice versa," Dr Patel says.

Even worse, India may only be a spectator in this quantum race. "We are far behind in terms of the quantum internet, and much more behind when it comes to quantum computers. We are starting from scratch, and the government will need to do more," Dr Patel highlights.

While the government announced a plan in the Budget to invest Rs 8,000 crore over the next five years in the National Mission on Quantum Technology, no disbursement has been made till now.

Read the rest here:
Quantum leap? US plans for unhackable internet may not fructify within a decade, but India is far behind - The Financial Express

Google distinguished scientist Hartmut Neven is one of Fast Company’s – Fast Company

A pioneer in AI-powered computer vision, whose research helps power Google Maps, YouTube, and more, Hartmut Neven now leads Google's efforts in quantum computing - the still-emerging science that aims to transform fields as diverse as pharmaceutical discovery and financial services by manipulating subatomic particles. In October 2019, his team published a paper stating that its processor, Sycamore, had achieved quantum supremacy by performing calculations in 200 seconds that would take 10,000 years with a conventional computer. The feat wasn't without controversy - quantum rival IBM contended that its fastest supercomputer would take only two and a half days to do the same, downgrading the magnitude of Google's claim - but it was still widely hailed as a landmark.

The quantum project has posed new challenges, such as how to cool a processor to near absolute-zero temperatures using liquid helium. But Neven has always thrived on the edge. He was among the first to connect the AI revolution currently in full swing with the quantum advances yet to come. "I'm guilty of having popularized the terms quantum machine learning and quantum AI," he says. "Now there are whole sections of departments at universities that do quantum machine learning."

When people ask Neven how soon quantum computers will be ready for commercial deployment in areas where their speed might be transformative, his standard estimate is 10 years. He appreciates Google giving him the runway. "There's this nerdy pleasure of pushing into disciplines that are intrinsically fascinating," he says. "Not, 'When can we see a return on our investment?' But, 'Hey, this sounds interesting. Sooner or later, this technology will play an important role. Why don't you explore it?'"

Read more:
Google distinguished scientist Hartmut Neven is one of Fast Company's - Fast Company

Quantum physicists say time travelers don’t have to worry about the butterfly effect – The Next Web

What if I told you all your favorite time-travel films and books were actually created by big tech in order to wrest control of the time-travel industry from the proletariat?

Think about it. Back to the Future, The Terminator, The Time Machine, all of these stories share a central theme where traveling through time is a dangerous proposition that could destroy the very fabric of our reality.

It's called the butterfly effect. The big idea is that you'd step out of your time travel machine and accidentally step on a bug. Because this bug doesn't exist, maybe a frog goes hungry and dies. And maybe that frog was supposed to hop on a sabre-tooth tiger's face at exactly the right moment so the cave person from whom our greatest leader will descend can escape death.

But you just had to time travel, didn't you? Now, because that bug is dead, the cave man didn't live and our planet is a nuclear wasteland when you return to the present.

Speaking of nuclear wastelands, a team of scientists from the Los Alamos National Laboratory recently conducted a time travel simulation on IBM's quantum computer. And what they determined is that all those Hollywood fear mongers are full of it.

Per a press release from the lab, one of the study's coauthors, theoretical physicist Nikolai Sinitsyn, said:

"On a quantum computer, there is no problem simulating opposite-in-time evolution, or simulating running a process backwards into the past. So we can actually see what happens with a complex quantum world if we travel back in time, add small damage, and return. We found that our world survives, which means there's no butterfly effect in quantum mechanics."

We can't actually travel back in time, but what we can do is simulate how systems react to perturbations with the benefit of hindsight. And the reason we can do this is that quantum computers can simulate running a process backwards in time.

What makes quantum computers so special is that they're capable of producing all outcomes simultaneously. With a classical computer, for example, we use binary bits to process data by switching transistors on and off. Quantum computers use qubits, though. And qubits can be on, off, both, or neither, all at the same time. So, if we want to solve a really complex problem, we can run it through a quantum computer and get all the answers at once, rather than running it through a classical computer multiple times with different parameters to achieve diverse predictions when an outcome is uncertain.
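
The "both at once" behaviour of a qubit can be sketched with a two-element state vector. The snippet below is a plain classical simulation in Python with NumPy (an illustration of the idea, not how a real quantum processor is programmed):

    import numpy as np

    # A qubit is described by two complex amplitudes with |a|^2 + |b|^2 = 1.
    # An equal superposition of 0 and 1:
    state = np.array([1, 1], dtype=complex) / np.sqrt(2)

    # Measurement collapses the superposition: 0 or 1, each with probability 0.5.
    probs = np.abs(state) ** 2
    print(probs)                                        # [0.5 0.5]
    print(np.random.choice([0, 1], size=10, p=probs))   # a random mix of 0s and 1s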

That's a long-winded way of saying quantum computers can reverse-engineer the past to determine exactly how things in a given system would have unfolded had something else happened.

This doesn't mean we can finally solve the JFK assassination; that version of the past is a closed system that we currently don't have access to. But we can create an open system and give the quantum computer access to it via simulation, so that it can determine all the different ways things could play out over time.

Quick take: What's most interesting here is that the simulation itself works as a bit of a quantum mechanics detector.

We know that classical systems suffer from the butterfly effect. Don't believe me? Go back about 10 code commits, start randomly changing things, then generate new code from the flawed version and see how that works out for your next software build.

Sinitsyn and his coauthor Bin Yan tested their quantum mechanics hypothesis with help from IBM's cloud-based Q system. Per the Los Alamos press release:

"To test the butterfly effect in quantum systems, Yan and Sinitsyn used theory and simulations with the IBM-Q quantum processor to show how a circuit could evolve a complex system by applying quantum gates, with forwards and backwards cause and effect."

According to the researchers, the butterfly effect doesn't affect the quantum world, so its existence can effectively determine whether a system is classical or quantum in nature.

We can certainly assume that any form of time travel involving human temporal displacement will rely on quantum mechanics, unless, of course, we turn out to be binary constructs trapped within a simulation ourselves.

And that means, unless we're in the Matrix, Marty McFly and John Connor were just propaganda meant to scare us regular folk away from using time machines at our leisure. Even Ashton Kutcher lied to us. Butterfly effect, schmutterfly effect.

You can read the full study here.

H/t: ScienceAlert

Read the rest here:
Quantum physicists say time travelers don't have to worry about the butterfly effect - The Next Web

Week in review: BootHole, RCEs in industrial VPNs, the cybersecurity profession crisis – Help Net Security

Here's an overview of some of last week's most interesting news, articles, interviews and reviews:

Attackers are exploiting Cisco ASA/FTD flaw in search for sensitive data
An unauthenticated file read vulnerability (CVE-2020-3452) affecting Cisco Adaptive Security Appliance (ASA) and Firepower Threat Defense (FTD) software is being exploited by attackers in the wild.

Researchers find critical RCE vulnerabilities in industrial VPN solutions
Critical vulnerabilities in several industrial VPN implementations for remotely accessing operational technology (OT) networks could allow attackers to overwrite data, execute malicious code or commands, cause a DoS condition, and more.

Twitter employees were spear-phished over the phone
Twitter has finally shared more details about how the perpetrators of the recent hijacking of high-profile accounts to push a Bitcoin scam managed to pull it off. Also, three alleged perpetrators have been identified.

Review: Cyber Warfare - Truth, Tactics, and Strategies
Many future battles will be fought with cyber weapons, narrowing the resources and capabilities gap that long existed between rich and poor nations. All of them can now effectively bring their enemy down.

Public cloud environments leave numerous paths open for exploitation
Cloud estates are being breached through their weakest links of neglected internet-facing workloads, widespread authentication issues, discoverable secrets and credentials, and misconfigured storage buckets.

62,000 QNAP NAS devices infected with persistent QSnatch malware
There are approximately 62,000 malware-infested QNAP NAS (Network Attached Storage) devices located across the globe spilling all the secrets they contain to unknown cyber actors, the US CISA and the UK NCSC have warned.

What are script-based attacks and what can be done to prevent them?
In today's threat landscape, scripts provide initial access, enable evasion, and facilitate lateral movements post-infection.

How do I select an endpoint protection solution for my business?
To select an appropriate endpoint protection solution for your business, you need to think about a variety of factors. We've talked to several cybersecurity professionals to get their insight on the topic.

Lack of training, career development, and planning fuel the cybersecurity profession crisis
The cybersecurity skills crisis continues to worsen for the fourth year in a row and has impacted 70 percent of organizations, as revealed in a global study of cybersecurity professionals by ISSA and ESG.

Bug in widely used bootloader opens Windows, Linux devices to persistent compromise
A vulnerability (CVE-2020-10713) in the widely used GRUB2 bootloader opens most Linux and Windows systems in use today to persistent compromise.

Delivering and maintaining security at the speed of digital transformation
Dustin Rigg Hillard, CTO at eSentire, talks about modern digital threats, the challenges cybersecurity teams face, cloud-native security platforms, and more.

Security teams increasingly stressed due to lack of proper tools, executive support
93% of security professionals lack the tools to detect known security threats, and 92% state they are still in need of the appropriate preventative solutions to close current security gaps, according to LogRhythm.

How well do face recognition algorithms identify people wearing masks?
The answer, according to a preliminary study by the National Institute of Standards and Technology (NIST), is with great difficulty.

NIST selects algorithms to form a post-quantum cryptography standard
After spending more than three years examining new approaches to encryption and data protection that could defeat an assault from a quantum computer, the National Institute of Standards and Technology (NIST) has winnowed the 69 submissions it initially received down to a final group of 15.

It's time to tap the next generation of cyber defenders
As college graduates of the Class of 2020 enter the workforce, we welcome a new generation of cyber professionals.

Attackers have created a specialized economy around email account takeover

Things to consider when selecting enterprise SSDs for critical workloads
We sat down with Scott Hamilton, Senior Director, Product Management, Data Center Systems at Western Digital, to learn more about SSDs and how they fit into current business environments and data centers.

Offensive Security acquires security training project VulnHub
Offensive Security has acquired open source security training resource hub VulnHub. The acquisition is part of OffSec's ongoing mission to provide practical training content to aspiring cybersecurity professionals.

The distinction between human and bot behavior is becoming increasingly blurred
As consumers change their online habits, the distinction between human and bot behavior is becoming increasingly blurred, presenting cybersecurity teams with an even bigger challenge than before when it comes to differentiating humans from bots, and good bot behavior from bad.

What is privacy and why does it matter?
Privacy is a basic right and a necessary protection in the digital age to avoid victimization and manipulation.

DeimosC2: Open source tool to manage post-exploitation issues
TEAMARES launched DeimosC2, addressing the market need for a cross-compatible, open source Command and Control (C2) tool for managing compromised machines that includes mobile support.

Qualys unveils Multi-Vector EDR, a new approach to endpoint detection and response
Taking a new multi-vector approach to Endpoint Detection and Response (EDR), Qualys now brings the unifying power of its highly scalable cloud platform to EDR.

New infosec products of the week: July 31, 2020
A rundown of infosec products released last week.

Read the original:
Week in review: BootHole, RCEs in industrial VPNs, the cybersecurity profession crisis - Help Net Security

New UC-led institute awarded $25M to explore potential of quantum computing and train a future workforce – University of California

In the curious world of quantum mechanics, a single atom or subatomic particle can exist simultaneously in multiple conditions. A new UC-led, multiuniversity institute will explore the realities of this emerging field as it focuses on advancing quantum science and engineering, with an additional goal of training a future workforce to build and use quantum computers.

The National Science Foundation (NSF) has awarded $25 million over five years to establish the NSF Quantum Leap Challenge Institute (QLCI) for Present and Future Quantum Computation as part of the federal government's effort to speed the development of quantum computers. The institute will work to overcome scientific challenges to achieving quantum computing and will design advanced, large-scale quantum computers that employ state-of-the-art scientific algorithms developed by the researchers.

"There is a sense that we are on the precipice of a really big move toward quantum computing," said Dan Stamper-Kurn, UC Berkeley professor of physics and director of the institute. "We think that the development of the quantum computer will be a real scientific revolution, the defining scientific challenge of the moment, especially if you think about the fact that the computer plays a central role in just about everything society does. If you have a chance to revolutionize what a computer is, then you revolutionize just about everything else."

Unlike conventional computers, quantum computers seek to harness the mysterious behavior of particles at the subatomic level to boost computing power. Once fully developed, they could be capable of solving large, extremely complex problems far beyond the capacity of today's most powerful supercomputers. Quantum systems are expected to have a wide variety of applications in many fields, including medicine, national security and science.

Theoretical work has shown that quantum computers are the best way to do some important tasks: factoring large numbers, encrypting or decrypting data, searching databases or finding optimal solutions for problems. Using quantum mechanical principles to process information offers an enormous speedup over the time it takes to solve many computational problems on current digital computers.
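
The database-search case gives a concrete feel for the speedup: a classical search over N unsorted entries needs on the order of N lookups, while Grover's quantum algorithm needs roughly the square root of N queries. A small illustrative calculation (our own, not from the institute):

    import math

    # Unstructured search: ~N classical lookups vs ~sqrt(N) Grover iterations.
    for n in (10**6, 10**12, 10**18):
        print(f"N = {n:.0e}: ~{n:.0e} classical lookups vs ~{math.isqrt(n):.0e} quantum queries")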

"Scientific problems that would take the age of the universe to solve on a standard computer potentially could take only a few minutes on a quantum computer," said Eric Hudson, a UCLA professor of physics and co-director of the new institute. "We may get the ability to design new pharmaceuticals to fight diseases on a quantum computer, instead of in a laboratory. Learning the structure of molecules and designing effective drugs, each of which has thousands of atoms, are inherently quantum challenges. A quantum computer potentially could calculate the structure of molecules and how molecules react and behave."

The project came to fruition, in part, thanks to a UC-wide consortium, the California Institute for Quantum Entanglement, funded by UC's Multicampus Research Programs and Initiatives (MRPI). The MRPI funding opportunity incentivizes just this kind of multicampus collaboration in emerging fields that can position UC as a national leader.

"This new NSF institute is founded on the outstanding research contributions in theoretical and experimental quantum information science achieved by investigators from across the UC system through our initiative to foster multicampus collaborations," said Theresa Maldonado, Ph.D., vice president for Research and Innovation of the University of California. "The award recognizes the team's vision of how advances in computational quantum science can reveal new fundamental understanding of phenomena at the tiniest length scales that can benefit innovations in artificial intelligence, medicine, engineering, and more. We are proud to lead the nation in engaging excellent students from diverse backgrounds in this field of study."

The QLCI for Present and Future Quantum Computation connects UC Berkeley, UCLA and UC Santa Barbara with five other universities around the nation and in California. The institute will draw on a wealth of knowledge from experimental and theoretical quantum scientists to improve and determine how best to use todays rudimentary quantum computers, most of them built by private industry or government labs. The goal, ultimately, is to make quantum computers as common as mobile phones, which are, after all, pocket-sized digital computers.

The institute will be multidisciplinary, spanning physics, chemistry, mathematics, computer science, and optical and electrical engineering, among other fields, and will include scientists and engineers with expertise in quantum algorithms, mechanics and chemistry. They will partner with outside institutions, including in the emerging quantum industry, and will host symposia, workshops and other programs. Research challenges will be addressed jointly through a process that incorporates both theory and experiment.

Situated near the heart of today's computer industry, Silicon Valley and Silicon Beach, and at major California universities and national labs, the institute will train a future workforce akin to the way computer science training at universities fueled Silicon Valley's rise to become a tech giant. UCLA will pilot a master's degree program in quantum science and technology to train a quantum-smart workforce, while massive open online courses, or MOOCs, will help spread knowledge and understanding of quantum computers even to high school students.

"This center establishes California as a leader nationally and globally in quantum computing," Stamper-Kurn said.

The institutes initial members are all senior faculty from UC Berkeley, UCLA, UC Santa Barbara, the California Institute of Technology, the Massachusetts Institute of Technology, the University of Southern California, the University of Washington and the University of Texas at Austin.

"We still do not know fully what quantum computers do well," Stamper-Kurn said, "and we face deep challenges that arise in scaling up quantum devices. The mission of this institute is to address fundamental challenges in the development of the quantum computer."

More information on NSF-supported research on quantum information science and engineering is available at nsf.gov/quantum.

See the article here:
New UC-led institute awarded $25M to explore potential of quantum computing and train a future workforce - University of California

IBM and University of Tokyo team up for Quantum Innovation Initiative Consortium – SmartPlanet.com

Big Blue and the University of Tokyo launched the Quantum Innovation Initiative Consortium (QIIC) on Thursday, in an effort to bring together industry, academics, and government to push forward quantum computing in Japan.

QIIC will be housed at the University of Tokyo and have access to the IBM Quantum Computation Center, which has 20 of Big Blue's "most advanced" quantum computers, according to IBM.

Joining the consortium, as well as IBM's Q Network, will be Toshiba, Hitachi, Mizuho, MUFG, JSR, DIC, Toyota, Mitsubishi Chemicals, and Keio University.

"I believe that Japan will play an important role in implementing quantum computing technology to society ahead of rest of the world, and that industry-academia-government collaboration is necessary for this," president of the University of Tokyo professor Makoto Gonokami said.

IBM and the University of Tokyo signed an agreement at the end of 2019 that would see a Q System One, owned and operated by IBM, installed in an IBM facility in Japan. At the time, it was said to be the third in the world after installations in the US and Germany. IBM said on Thursday the installation is planned for next year.

On Wednesday, Toshiba said it would lead a Global Quantum Cryptography Communications Network research project that was commissioned by the Japanese Ministry of Internal Affairs and Communications. Alongside Toshiba, NEC, Mitsubishi Electric, Furukawa Electric, Hamamatsu Photonics, University of Tokyo, Hokkaido University, Yokohama National University, Gakushuin University, the National Institute of Information and Communications Technology, the National Institute of Advanced Industrial Technology, and the National Institute of Materials and Technology will be involved in the project.

The project is set to run until the end of the 2024 financial year and will look to create a network of 100 quantum cryptographic devices and 10,000 users. Four areas of research have been identified: Quantum communication link technology; trusted node technology to ensure cryptographic keys are tamper resistant; quantum relay technology; and WAN construction and operation.

Toshiba said the project has a planned budget of 1.44 billion yen for its first year.

Elsewhere, the European arm of Japanese giant Fujitsu said it has signed up BBVA, Spain's second largest bank, for a proof of concept involving its digital annealer technology.

The annealer will be used to optimise asset portfolios and minimise risk.

"Finding the optimal selection from just 20 stocks generates the equivalent of over one quintillion (10 18) permutations," Fujitsu said. "Because of this complexity, portfolio optimisation has traditionally been a manual task, guided more frequently by guesswork than empirical data -- simply because the convoluted calculations far exceed the capabilities of regular computers."

"However, Fujitsu's digital annealer has been designed to process exactly this sort of complex combinatorial problem in just minutes."

The bank also intends to use the annealer to determine when is the best time to buy or sell assets, Fujitsu added.

"While true quantum computing as a technology is still in the laboratory testing phase, digital annealer represents a bridge to this future technology, thanks to its ability to evaluate multiple different combinations extremely rapidly," CTO for Fujitsu in Span Carlos Cordero said.

Beyond trying to define the stock market, the Japanese giant also said its annealer had previously been used to help optimise seams for welding robots in car making and find the best routes for delivery trucks. It has also been involved in helping pharmaceutical companies with discovering new substances.

Read more:
IBM and University of Tokyo team up for Quantum Innovation Initiative Consortium - SmartPlanet.com

The future of encryption: Getting ready for the quantum computer attack – TechRepublic

PQShield, a spin-out from the UK's Oxford University, is developing advanced cryptographic solutions for hardware, software and communications to protect businesses' data from the quantum threat.

The development of quantum computers poses a cybersecurity problem such as the IT industry has never seen before. All stored data currently deemed secure by modern standards - whether that's health records, financial data, customer databases or even critical government infrastructure - could, in theory, be cracked by quantum computers, which are capable of effectively short-circuiting the encryption we've used to protect that data until now.

Efforts to protect our data from the quantum threat are underway, though whether the issue is being looked at with the urgency it deserves is up for debate. PQShield, a post-quantum cryptography startup spun out of Oxford University, perceives a disconnect between the scale of the threat and the current cyber-readiness of most businesses in 2020, which it is now trying to address.

SEE: Quantum computing: Myths v. Realities (TechRepublic)

"The scale of the quantum attack is just too big to imagine," Dr. Ali Kaafarani, research fellow at Oxford's Mathematical Institute and founder of PQShield, tells TechRepublic.

"The most important part of what we're doing is to educate the market."

Kaafarani is a former engineer at Hewlett-Packard Labs and leads a team of 10 full-time quantum cryptographers, drawn from what he estimates to be a worldwide pool of just a hundred or so. The company is busy working on quantum-secure cryptography solutions for hardware, software and communications that will secure information against future risk, yet can be implemented using today's technology.

This comprises a system on chip (SoC) and software development kit that allow companies to create secure messaging applications, protected by a "post-quantum" variant of the Signal cryptographic protocol. Central to PQShield's technology is that it is designed to work with both legacy systems as well as those expected in the years to come, meaning it could offer protection for everything from keyless cars and other connected devices, to data moving to and from cloud servers.

This, Kaafarani explains, is important because post-quantum cryptography cannot be retrospectively implemented; meanwhile, data encrypted to today's standards remains open to post-quantum threats. "What we're using right now as end-to-end encryption...is secure now, but people can intercept them and steal encrypted data," he says.

"Once they have access to a quantum computer, they can decrypt them, so confidentiality is threatened in retrospect, because whatever is considered confidential now can be decrypted later on."

Kaafarani also perceives an issue with current attitudes to remediating cyberattacks, which he likens to applying a band-aid to a recurring problem.

SEE: SSL Certificate Best Practices Policy (TechRepublic Premium)

"That's why we started PQShield to fill in this gap, to lead the way to a smooth and secure transition to the quantum era. There is a real opportunity here to get things right from the beginning."

The startup recently completed a £5.5m funding round led by VC firm Kindred Capital and has now secured German engineering company Bosch as its first OEM customer. While the exact details of the deal are still under wraps, Kaafarani says it is indicative of the threats businesses are beginning to identify as the age of quantum computing approaches.

"Their hardware may be built to last, but right now, their security isn't," he says.

"If you're designing a car that's going to go on the roads in the next three years, if you're doing security by design, you should be thinking of the next security standards: not the standards that are valid now, but the standards that will be valid in the next five, 10, 15 years," he says.

"Future-proofing is an imperative, just as it is for the banks and agencies that hold so much of our sensitive data."

Read this article:
The future of encryption: Getting ready for the quantum computer attack - TechRepublic