Category Archives: Quantum Computer

Physicists hunt for room-temperature superconductors that could revolutionize the world’s energy system – The Conversation US

Waste heat is all around you. On a small scale, if your phone or laptop feels warm, that's because some of the energy powering the device is being transformed into unwanted heat.

On a larger scale, electric grids, such as high-power transmission lines, lose over 5% of their energy in the process of transmission. In an electric power industry that generated more than US$400 billion in 2018, that's a tremendous amount of wasted money.

Globally, the computer systems of Google, Microsoft, Facebook and others require enormous amounts of energy to power massive cloud servers and data centers. Even more energy, to power water and air cooling systems, is required to offset the heat generated by these computers.

Where does this wasted heat come from? Electrons. These elementary particles of an atom move around and interact with other electrons and atoms. Because they carry an electric charge, as they move through a material that conducts electricity easily, such as a metal, they scatter off other atoms and generate heat.

Superconductors are materials that address this problem by allowing energy to flow efficiently through them without generating unwanted heat. They have great potential and many cost-effective applications. They operate magnetically levitated trains, generate magnetic fields for MRI machines and recently have been used to build quantum computers, though a fully operating one does not yet exist.

But superconductors have an essential problem when it comes to other practical applications: They operate at ultra-low temperatures. There are no room-temperature superconductors. That room-temperature part is what scientists have been working on for more than a century. Billions of dollars have funded research to solve this problem. Scientists around the world, including me, are trying to understand the physics of superconductors and how they can be enhanced.

A superconductor is a material, such as a pure metal like aluminum or lead, that when cooled to ultra-low temperatures allows electricity to move through it with absolutely zero resistance. How a material becomes a superconductor at the microscopic level is not a simple question. It took the scientific community 45 years to understand and formulate a successful theory of superconductivity in 1956.

While physicists researched an understanding of the mechanisms of superconductivity, chemists mixed different elements, such as the rare metal niobium and tin, and tried recipes guided by other experiments to discover new and stronger superconductors. There was progress, but mostly incremental.

Simply put, superconductivity occurs when two electrons bind together at low temperatures. They form the building block of superconductors, the Cooper pair. Elementary physics and chemistry tell us that electrons repel each other. This holds true even for a potential superconductor like lead when it is above a certain temperature.

When the temperature falls to a certain point, though, the electrons become more amenable to pairing up. Instead of one electron opposing the other, a kind of glue emerges to hold them together.

Discovered in 1911, the first superconductor was mercury (Hg), the basic element of old-fashioned thermometers. In order for mercury to become a superconductor, it had to be cooled to ultra-low temperatures. Kamerlingh Onnes was the first scientist who figured out exactly how to do that by compressing and liquefying helium gas. During the process, once helium gas becomes a liquid, the temperature drops to -452 degrees Fahrenheit.

When Onnes was experimenting with mercury, he discovered that when it was placed inside a liquid helium container and cooled to very low temperatures, its electric resistance, the opposition to electric current in the material, suddenly dropped to zero ohms, the unit of measurement that describes resistance. Not close to zero, but exactly zero. No resistance, no heat waste.

This meant that an electric current, once generated, would flow continuously with nothing to stop it, at least in the lab. Many superconducting materials were soon discovered, but practical applications were another matter.

These superconductors shared one problem: they needed to be cooled down. The amount of energy needed to cool a material to its superconducting state was too expensive for daily applications. By the early 1980s, research on superconductors had nearly reached its conclusion.

In a dramatic turn of events, a new kind of superconducting material was discovered in 1986 at IBM in Zurich, Switzerland. Within months, superconductors operating at less extreme temperatures were being synthesized globally. The material was a kind of ceramic.

These new ceramic superconductors were made of copper and oxygen mixed with other elements such as lanthanum, barium and bismuth. They contradicted everything physicists thought they knew about making superconductors. Researchers had been looking for very good conductors, yet these ceramics were nearly insulators, meaning that very little electrical current can flow through. Magnetism destroyed conventional superconductors, yet these were themselves magnets.

Scientists were seeking materials where electrons were free to move around, yet in these materials, the electrons were locked in and confined. The scientists at IBM, Alex Müller and Georg Bednorz, had actually discovered a new kind of superconductor. These were the high-temperature superconductors. And they played by their own rules.

Scientists now have a new challenge. Three decades after the high-temperature superconductors were discovered, we are still struggling to understand how they work at the microscopic level. Creative experiments are being conducted every day in universities and research labs around the world.

In my laboratory, we have built a microscope known as a scanning tunneling microscope that helps our research team see the electrons at the surface of the material. This allows us to understand how electrons bind and form superconductivity at an atomic scale.

We have come a long way in our research and now know that electrons also pair up in these high-temperature superconductors. There is great value and utility in answering how high-temperature superconductors work because that may be the route to room-temperature superconductivity. If we succeed in making a room-temperature superconductor, then we can address the billions of dollars that it costs in wasted heat to transmit energy from power plants to cities.

More remarkably, solar energy harvested in the vast empty deserts around the world could be stored and transmitted without any loss of energy, which could power cities and dramatically reduce greenhouse gas emissions. The potential is hard to imagine. Finding the glue for room-temperature superconductors is the next million-dollar question.

See the article here:
Physicists hunt for room-temperature superconductors that could revolutionize the world's energy system - The Conversation US

Covid 19 Pandemic: Quantum Computing Technologies Market 2020, Share, Growth, Trends And Forecast To 2025 – 3rd Watch News

Research report on global Quantum Computing Technologies market 2020 with industry primary research, secondary research, product research, size, trends and Forecast.

The report presents a highly comprehensive and accurate research study on the global Quantum Computing Technologies market. It offers PESTLE analysis, qualitative and quantitative analysis, Porter's Five Forces analysis, and absolute dollar opportunity analysis to help players improve their business strategies. It also sheds light on critical Quantum Computing Technologies market dynamics such as trends and opportunities, drivers, restraints, and challenges to help market participants stay informed and cement a strong position in the industry. With competitive landscape analysis, the authors of the report have made a brilliant attempt to help readers understand important business tactics that leading companies use to maintain Quantum Computing Technologies market sustainability.

The global Quantum Computing Technologies market, valued at approximately USD 75.0 million in 2018, is anticipated to grow at a healthy rate of more than 24.0% over the forecast period 2019-2026. The Quantum Computing Technologies market is growing continuously at a significant pace in the global scenario. Quantum computing is a computer technology based on the principles of quantum theory, which explains the nature and behavior of energy and matter at the quantum level. A quantum computer follows the laws of quantum physics, through which it can gain enormous power, exist in multiple states, and perform tasks using all possible permutations simultaneously. Surging implementation of machine learning on quantum computers, escalating application in cryptography, and the capability to simulate intricate systems are the substantial driving factors of the market during the forecast period. Moreover, rising adoption and utility in cyber security are factors likely to create numerous opportunities in the near future. However, the lack of skilled professionals is one of the major factors restraining the growth of the market during the forecast period.

The regional analysis of the global Quantum Computing Technologies market covers key regions such as Asia Pacific, North America, Europe, Latin America and the Rest of the World. North America is the leading region worldwide in terms of market share due to increasing usage of quantum computers by government agencies and the aerospace & defense sector for machine learning in the region. Europe is estimated to be the second-largest region in the global Quantum Computing Technologies market over the upcoming years. Further, Asia-Pacific is anticipated to exhibit the highest growth rate (CAGR) over the forecast period 2019-2026 due to rising adoption of quantum computers by the BFSI sector in the region.

The major market players included in this report are:

D-Wave Systems Inc.

IBM Corporation

Lockheed Martin Corporation

Intel Corporation

Anyon Systems Inc.

Cambridge Quantum Computing Limited

The objective of the study is to define market sizes of different segments and countries in recent years and to forecast the values for the coming eight years. The report is designed to incorporate both qualitative and quantitative aspects of the industry within each of the regions and countries involved in the study. Furthermore, the report provides detailed information about crucial aspects such as driving factors and challenges that will define the future growth of the market. Additionally, the report incorporates available opportunities in micro markets for stakeholders to invest in, along with a detailed analysis of the competitive landscape and product offerings of key players. The detailed segments and sub-segments of the market are explained below:

By Application:

Optimization

Machine Learning

Simulation

By Vertical:

BFSI

IT and Telecommunication

Healthcare

Transportation

Government

Aerospace & Defense

Others

By Regions:

North America

U.S.

Canada

Europe

UK

Germany

Asia Pacific

China

India

Japan

Latin America

Brazil

Mexico

Rest of the World

Furthermore, years considered for the study are as follows:

Historical years: 2016, 2017

Base year: 2018

Forecast period: 2019 to 2026

Target Audience of the Global Quantum Computing Technologies Market Study:

Key Consulting Companies & Advisors

Large, medium-sized, and small enterprises

Venture capitalists

Value-Added Resellers (VARs)

Third-party knowledge providers

Investment bankers

Investors

Table of Contents:

Study Coverage: It includes study objectives, years considered for the research study, growth rate and Quantum Computing Technologies market size of type and application segments, key manufacturers covered, product scope, and highlights of segmental analysis.

Executive Summary: In this section, the report focuses on analysis of macroscopic indicators, market issues, drivers, and trends, the competitive landscape, the CAGR of the global Quantum Computing Technologies market, and global production. Under the global production chapter, the authors of the report have included market pricing and trends, global capacity, global production, and global revenue forecasts.

Quantum Computing Technologies Market Size by Manufacturer: Here, the report concentrates on revenue and production shares of manufacturers for all the years of the forecast period. It also focuses on price by manufacturer and expansion plans and mergers and acquisitions of companies.

Production by Region: It shows how the revenue and production in the global market are distributed among different regions. Each regional market is extensively studied here on the basis of import and export, key players, revenue, and production.

See the original post:
Covid 19 Pandemic: Quantum Computing Technologies Market 2020, Share, Growth, Trends And Forecast To 2025 - 3rd Watch News

Molecular dynamics used to simulate 100 million atoms | Opinion – Chemistry World

The TV series Devs took as its premise the idea that a quantum computer of sufficient power could simulate the world so completely that it could project events accurately back into the distant past (the Crucifixion or prehistory) and predict the future. At face value somewhat absurd, the scenario supplied a framework on which to hang questions about determinism and free will (and less happily, the Many Worlds interpretation of quantum mechanics).

Quite what quantum computers will do for molecular simulations remains to be seen, but the excitement about them shouldn't eclipse the staggering advances still being made in classical simulation. Full ab initio quantum-chemical calculations are very computationally expensive even with the inevitable approximations they entail, so it has been challenging to bring this degree of precision to traditional molecular dynamics, where molecular interactions are still typically described by classical potentials. Even simulating pure water, where accurate modelling of hydrogen bonding and the ionic dissociation of molecules involves quantum effects, has been tough.

Now a team that includes Linfeng Zhang and Roberto Car of Princeton University, US, has conducted ab initio molecular dynamics simulations for up to 100 million atoms, probing timescales up to a few nanoseconds.1 Sure, it's a long way from the Devs fantasy of an exact replica of reality. But it suggests that simulations with quantum precision are reaching the stage where we can talk not in terms of handfuls of molecules but of bulk matter.

How do they do it? The trick, which researchers have been exploring for several years now, is to replace quantum-chemical calculations with machine learning (ML). The general strategy of ML is that an algorithm learns to solve a complex problem by being trained with many examples for which the answers are already known, from which it deduces the general shape of solutions in some high-dimensional space. It then uses that shape to interpolate for examples that it hasnt seen before. The familiar example is image interpretation: the ML system works out what to look for in photos of cats, so that it can then spot which new images have cats in them. It can work remarkably well so long as it is not presented with cases that lie far outside the bounds of the training set.

The approach is being widely used in molecular and materials science, for example to predict crystal structures from elemental composition,2-3 or electronic structure from crystal structure.4-5 In the latter case, bulk electronic properties such as band gaps have traditionally been calculated using density functional theory (DFT), an approximate way to solve the quantum-mechanical equations of many-body systems. Here the spatial distribution of electron density is computationally iterated from some initial guess until it fits the equations in a self-consistent way. But it's computationally intensive, and ML circumvents the calculations by figuring out from known cases what kind of electron distribution a given configuration of atoms will have.
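
The "iterate until self-consistent" step can be pictured with a generic fixed-point loop. This is a toy sketch only; the update rule below is an arbitrary stand-in, not a Kohn-Sham solver:

```python
def self_consistent(update, guess, tol=1e-8, max_iter=200, mixing=0.5):
    """Generic self-consistency loop: feed the current estimate into the
    update rule until input and output agree to within a tolerance."""
    x = guess
    for _ in range(max_iter):
        new_x = update(x)
        if abs(new_x - x) < tol:
            return new_x
        # simple linear mixing of old and new estimates, a common trick
        # for stabilising self-consistent-field style iterations
        x = (1 - mixing) * x + mixing * new_x
    raise RuntimeError("did not converge")

# Toy 'update' standing in for re-solving the equations with the current
# density estimate; this particular one just converges to sqrt(2).
print(self_consistent(lambda d: 0.5 * (d + 2.0 / d), guess=1.0))
```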

The approach can in principle be used for molecular dynamics by recalculating the electron densities at each time step. Zhang and colleagues have now shown how far this idea can be pushed using supercomputing technology, clever algorithms, and state-of-the-art artificial intelligence.6 They present results for simulations of up to 113 million atoms for the test case of a block of copper atoms, enabling something approaching a prediction of bulk-like mechanical behaviour from quantum chemistry. Their simulations of liquid water, meanwhile, contain up to 12.6 million atoms.
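
That per-time-step idea can be sketched with a toy velocity-Verlet loop. This is a minimal sketch only: the pair potential below is an arbitrary stand-in for a trained machine-learning potential fitted to quantum-mechanical data, and none of the real work's HPC machinery is shown.

```python
import numpy as np

def forces(positions):
    """Stand-in for the learned potential: a crude short-range pair repulsion.
    In the real scheme this call would evaluate a neural-network potential
    trained on DFT data for the current atomic configuration."""
    n = len(positions)
    f = np.zeros_like(positions)
    for i in range(n):
        for j in range(i + 1, n):
            r = positions[i] - positions[j]
            d = np.linalg.norm(r) + 1e-12
            f_pair = 24.0 * r / d**8      # toy repulsive force, magnitude ~ 1/d^7
            f[i] += f_pair
            f[j] -= f_pair
    return f

def velocity_verlet(positions, velocities, dt=1e-3, steps=500, mass=1.0):
    """Integrate Newton's equations; the potential is re-evaluated at every
    step, which is exactly where the ML model replaces an explicit
    electronic-structure calculation."""
    f = forces(positions)
    for _ in range(steps):
        positions += velocities * dt + 0.5 * (f / mass) * dt**2
        f_new = forces(positions)
        velocities += 0.5 * ((f + f_new) / mass) * dt
        f = f_new
    return positions, velocities

rng = np.random.default_rng(0)
pos = rng.random((8, 3)) * 3.0      # eight toy atoms in a 3 x 3 x 3 box
vel = np.zeros_like(pos)
pos, vel = velocity_verlet(pos, vel)
print(pos.round(3))
```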

For small systems where the comparison to full quantum DFT calculations can be made, the researchers find electron distributions essentially indistinguishable from the full calculations, while gaining 4-5 orders of magnitude in speed. Their system can capture the full phase diagram of water over a wide range of temperature and pressure, and can simulate processes such as ice nucleation. In some situations water can be coarse-grained such that hydrogen bonding can still be modelled without including the hydrogen atoms explicitly.7 The researchers say it should be possible soon to follow such processes on timescales approaching microseconds for about a million water molecules, enabling them to look at processes such as droplet and ice formation in the atmosphere.

Both of these test cases are helped by being relatively homogeneous, involving largely identical atoms or molecules. Still, the prospects of this deep-learning approach look good for studying much more heterogeneous systems such as complex alloys.8 One very attractive goal is, of course, biomolecular systems, where the ability to model fully solvated proteins, membranes and other cell components could help us understand complex mesoscale cell processes and predict the behaviour of drug candidates. One challenge here is how to include long-range interactions such as electrostatic forces.

It's a long way from Devs-style simulations of minds and histories, which will perhaps only ever be fantasies. But one scene in that series showed what might be a more tractable goal: the simulation of a growing snowflake. What a wonderful way that would be to advertise the simulator's art.

1. Jia et al., arXiv, 2020, http://www.arxiv.org/abs/2005.00223 (submitted, ACM, New York, 2020)

2. C C Fischer et al., Nat. Mater., 2006, 5, 641 (DOI: 10.1038/nmat1691)

3. N Mounet et al., Nat. Nanotechnol., 2018, 13, 246 (DOI: 10.1038/s41565-017-0035-5)

4. Y Dong et al., npj Comput. Mater., 2019, 5, 26 (DOI: 10.1038/s41524-019-0165-4)

5. A Chandrasekaran et al., npj Comput. Mater., 2019, 5, 22 (DOI: 10.1038/s41524-019-0162-7)

6. L Zhang et al., Phys. Rev. Lett., 2018, 120, 143001 (DOI: 10.1103/PhysRevLett.120.143001)

7. L Zhang et al., J. Chem. Phys., 2018, 149, 034101 (DOI: 10.1063/1.5027645)

8. F-Z Dai et al., J. Mater. Sci. Technol., 2020, 43, 168 (DOI: 10.1016/j.jmst.2020.01.005)

Go here to see the original:
Molecular dynamics used to simulate 100 million atoms | Opinion - Chemistry World

Highest-performing quantum simulator IN THE WORLD delivered to Japan – TechGeek

Atos, a global leader in digital transformation, introduced the world's first commercially available quantum simulator capable of simulating up to 40 quantum bits, or qubits, which translates to very fucking fast.

The simulator, named Atos Quantum Learning Machine, is powered by an ultra-compact supercomputer and a universal programming language.

Quantum computing is a key priority for Japan. It launched a dedicated ten-year, 30 billion yen (about US$280 million / AUD$433 million) quantum research program in 2017, followed by a 100 billion yen (about US$900 million / AUD$1 billion) investment into its Moonshot R&D Program, one focus of which will be to create a fault-tolerant universal quantum computer to revolutionise the economy, industry, and security sectors by 2050.

"We're delighted to have sold our first QLM in Japan, thanks to our strong working partnership with Intelligent Wave Inc. We are proud to be part of this growing momentum as the country plans to boost innovation through quantum."

Combining a high-powered, ultra-compact machine with a universal programming language, the Atos Quantum Learning Machine enables researchers and engineers to develop and experiment with quantum software. It is the world's only quantum software development and simulation appliance for the coming quantum computer era.

It simulates the laws of physics, which are at the very heart of quantum computing, to compute the exact execution of a quantum program with double-digit precision.

Continued here:
Highest-performing quantum simulator IN THE WORLD delivered to Japan - TechGeek

Light, fantastic: the path ahead for faster, smaller computer processors – News – The University of Sydney

Research team: (from left) Associate Professor Stefano Palomba, Dr Alessandro Tuniz, Professor Martijn de Sterke. Photo: Louise Cooper

Light is emerging as the leading vehicle for information processing in computers and telecommunications as our need for energy efficiency and bandwidth increases.

Already the gold standard for intercontinental communication through fibre-optics, photons are replacing electrons as the main carriers of information throughout optical networks and into the very heart of computers themselves.

However, there remain substantial engineering barriers to completing this transformation. Industry-standard silicon circuits that support light are more than an order of magnitude larger than modern electronic transistors. One solution is to compress light using metallic waveguides; however, this would not only require a new manufacturing infrastructure, but the way light interacts with metals on chips also means that photonic information is easily lost.

Now scientists in Australia and Germany have developed a modular method to design nanoscale devices to help overcome these problems, combining the best of traditional chip design with photonic architecture in a hybrid structure. Their research is published today in Nature Communications.

We have built a bridge between industry-standard silicon photonic systems and the metal-based waveguides that can be made 100 times smaller while retaining efficiency, said lead author Dr Alessandro Tuniz from the University of Sydney Nano Institute and School of Physics.

This hybrid approach allows the manipulation of light at the nanoscale, measured in billionths of a metre. The scientists have shown that they can achieve data manipulation at 100 times smaller than the wavelength of light carrying the information.

This sort of efficiency and miniaturisation will be essential in transforming computer processing to be based on light. It will also be very useful in the development of quantum-optical information systems, a promising platform for future quantum computers, said Associate Professor Stefano Palomba, a co-author from the University of Sydney and Nanophotonics Leader at Sydney Nano.

Eventually we expect photonic information will migrate to the CPU, the heart of any modern computer. Such a vision has already been mapped out by IBM.

On-chip nanometre-scale devices that use metals (known as plasmonic devices) allow for functionality that no conventional photonic device allows. Most notably, they efficiently compress light down to a few billionths of a metre and thus achieve hugely enhanced, interference-free, light-to-matter interactions.

As well as revolutionising general processing, this is very useful for specialised scientific processes such as nano-spectroscopy, atomic-scale sensing and nanoscale detectors, said Dr Tuniz also from the Sydney Institute of Photonics and Optical Science.

However, their universal functionality was hampered by a reliance on ad hoc designs.

We have shown that two separate designs can be joined together to enhance a run-of-the-mill chip that previously did nothing special, Dr Tuniz said.

This modular approach allows for rapid rotation of light polarisation in the chip and, because of that rotation, quickly permits nano-focusing down to about 100 times less than the wavelength.

Professor Martijn de Sterke is Director of the Institute of Photonics and Optical Science at the University of Sydney. He said: The future of information processing is likely to involve photons using metals that allow us to compress light to the nanoscale and integrate these designs into conventional silicon photonics.

This research was supported by the University of Sydney Fellowship Scheme and the German Research Foundation (DFG) under Germany's Excellence Strategy EXC-2123/1. This work was performed in part at the NSW node of the Australian National Fabrication Facility (ANFF).

Read this article:
Light, fantastic: the path ahead for faster, smaller computer processors - News - The University of Sydney

Wiring the quantum computer of the future – Space Daily

Quantum computing is increasingly becoming the focus of scientists in fields such as physics and chemistry, and industrialists in the pharmaceutical, airplane, and automobile industries. Globally, research labs at companies like Google and IBM are spending extensive resources on improving quantum computers, and with good reason.

Quantum computers use the fundamentals of quantum mechanics to process significantly greater amounts of information much faster than classical computers. It is expected that when error-corrected and fault-tolerant quantum computation is achieved, scientific and technological advancement will occur at an unprecedented scale.

But, building quantum computers for large-scale computation is proving to be a challenge in terms of their architecture. The basic units of a quantum computer are the "quantum bits" or "qubits." These are typically atoms, ions, photons, subatomic particles such as electrons, or even larger elements that simultaneously exist in multiple states, making it possible to obtain several potential outcomes rapidly for large volumes of data. The theoretical requirement for quantum computers is that these are arranged in two-dimensional (2D) arrays, where each qubit is both coupled with its nearest neighbor and connected to the necessary external control lines and devices.

When the number of qubits in an array is increased, it becomes difficult to reach qubits in the interior of the array from the edge. The need to solve this problem has so far resulted in complex three-dimensional (3D) wiring systems across multiple planes in which many wires intersect, making their construction a significant engineering challenge.

A group of scientists from Tokyo University of Science, Japan, RIKEN Centre for Emergent Matter Science, Japan, and the University of Technology, Sydney, led by Prof Jaw-Shen Tsai, proposes a unique solution to this qubit accessibility problem by modifying the architecture of the qubit array. "Here, we solve this problem and present a modified superconducting micro-architecture that does not require any 3D external line technology and reverts to a completely planar design," they say. This study has been published in the New Journal of Physics.

The scientists began with a qubit square lattice array and stretched out each column in the 2D plane. They then folded each successive column on top of each other, forming a dual one-dimensional array called a "bi-linear" array. This put all qubits on the edge and simplified the arrangement of the required wiring system. The system is also completely in 2D.

In this new architecture, some of the inter-qubit wiring (each qubit is also connected to all adjacent qubits in the array) does overlap, but because these are the only overlaps in the wiring, simple local 3D structures such as airbridges at the points of overlap are enough, and the system overall remains 2D. As you can imagine, this simplifies its construction considerably.
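
One way to picture the fold described above is with a small index-mapping sketch. This is an illustrative interpretation of my own, not the construction from the paper: each column of an n x n lattice is sent alternately to one of two rows, with every second column reversed so former neighbours stay close, and every qubit ends up on an outer row where planar wiring can reach it.

```python
def fold_to_bilinear(n):
    """Map qubit (row, col) of an n x n square lattice to a slot in a
    two-row ("bi-linear") layout. Illustrative only: the paper's exact
    mapping may differ, but the point is that all qubits land on an edge."""
    layout = {}
    for c in range(n):
        for r in range(n):
            target_row = c % 2                                  # alternate rows
            # reverse every second column so the fold keeps former
            # column neighbours next to each other along the row
            slot = (c // 2) * n + (r if c % 2 == 0 else n - 1 - r)
            layout[(r, c)] = (target_row, slot)
    return layout

for qubit, spot in sorted(fold_to_bilinear(4).items()):
    print(f"lattice qubit {qubit} -> bi-linear position {spot}")
```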

The scientists evaluated the feasibility of this new arrangement through numerical and experimental evaluation in which they tested how much of a signal was retained before and after it passed through an airbridge. Results of both evaluations showed that it is possible to build and run this system using existing technology and without any 3D arrangement.

The scientists' experiments also showed them that their architecture solves several problems that plague the 3D structures: they are difficult to construct, there is crosstalk or signal interference between waves transmitted across two wires, and the fragile quantum states of the qubits can degrade. The novel pseudo-2D design reduces the number of times wires cross each other, thereby reducing the crosstalk and consequently increasing the efficiency of the system.

At a time when large labs worldwide are attempting to find ways to build large-scale fault-tolerant quantum computers, the findings of this exciting new study indicate that such computers can be built using existing 2D integrated circuit technology. "The quantum computer is an information device expected to far exceed the capabilities of modern computers," Prof Tsai states. The research journey in this direction has only begun with this study, and Prof Tsai concludes by saying, "We are planning to construct a small-scale circuit to further examine and explore the possibility."

Research paper

The rest is here:
Wiring the quantum computer of the future - Space Daily

Technologies That You Can Explore Other Than Data Science During Lockdown – Analytics India Magazine

Data science has become the hottest job of the century, but that does not mean it is for everyone. Not everyone can become a data scientist, as it requires continuous learning. Every other day we witness innovation in the domain, so keeping up with the pace is demanding. Consequently, you can aspire to other rising technologies that will democratise rapidly in the coming years. Today, due to the lockdown, one has ample time to explore innovative technologies that will become popular as we move ahead.

Here is the list of technologies that you can explore:-

A combination of maths, physics and computer science, quantum computing has started seeing the light of day. Due to the rise of the data science landscape, the need for processing a colossal amount of data is increasing as organisations rapidly collect data. However, due to the dearth of required computational power, firms are struggling to process the gathered information, thereby creating data silos. And since Moore's law is starting to fail, we have to move away from dated hardware methodologies.

In 2019, Google claimed supremacy in quantum computing by benchmarking a computation at 200 seconds that would have taken the current world's fastest computer 10,000 years. The technology, proposed in the 1980s by Paul Benioff, took decades for researchers to make a breakthrough. Now that everyone sees value in quantum computing, it is poised to rise in the coming years. Getting your hands on such technology during its nascent stage can open up new opportunities in the future.

Start exploring by enrolling in this course for free.

Blockchain has been at the forefront of rising technologies for quite some time. Earlier equated with bitcoin and other cryptocurrencies, today it is used in a wide range of use cases to simplify authentication while keeping security in check. It will undoubtedly revolutionise the finance industry, and state-of-the-art blockchain will be widely used in digital identity, supply chain management, healthcare, among others. According to a report, the global blockchain technology market will expand to reach $176 billion by 2025 and exceed $3.1 trillion by 2030. Blockchain in the future might become the go-to technology for almost all enterprise operations. Consequently, one should consider pursuing a career in blockchain and innovating to streamline business processes.

Blockchain course here.

Augmented reality and virtual reality have the potential to revolutionise the whole world by providing immersive experiences in almost every use case. As remote working becomes the new normal, the demand for VR and AR has doubled in the last two months, as per Hemanth Satyanarayan, chief executive of Imaginate. Among other technologies, mixed reality could be the next big thing, and according to Digi-Capital, the AR/VR market will reach around $65 billion in revenue by 2024. The COVID-19 crisis has made us realise how important these technologies are; however, to democratise them, companies will have to overcome numerous barriers associated with the AR/VR landscape. The requirement for high-performance computation, intuitive 3D interface design, and more has been the reason why VR and AR have witnessed slower growth. But companies like Facebook are showing promise in making breakthroughs in the field to enable mass adoption of the technology.

Get started with these AR and VR courses by Google.

In the data-driven world, people are concerned about their privacy, which has led governments to make stringent rules and regulations for privacy protection. The introduction of various data protection rules such as GDPR and PDP across the globe has forced organisations to rethink their cybersecurity practices. Although PDP is yet to be enforced on organisations, firms are already changing their cybersecurity operations to immediately comply once the bill gets final approval. Currently, the Indian cybersecurity industry is valued at $6.7 billion, and with the rising digital growth, we are witnessing a renewed interest in cybersecurity.

Learn more about cybersecurity with this free course.

More here:
Technologies That You Can Explore Other Than Data Science During Lockdown - Analytics India Magazine

Will Quantum Computing Really Change The World? Facts And Myths – Analytics India Magazine

In recent years, some big tech companies like IBM, Microsoft, Intel, or Google have been working in relative silence on something that sounds great: quantum computing. The main problem with this is that it is difficult to know what exactly it is and what it can be useful for.

There are some questions that can be easily answered. For example, quantum computing is not going to give you more FPS on your graphics card at the moment. Nor will it be as easy as swapping the CPU of your computer for a quantum one to make it hyperfast. Quantum computing is fundamentally different from the computing we are used to, but how?

At the beginning of the 20th century, Planck and Einstein proposed that light is not a continuous wave (like the waves in a pond) but that it is divided into small packages or quanta. This apparently simple idea served to solve a problem called the ultraviolet catastrophe. But over the years other physicists developed it and came to surprising conclusions about matter, two of which will interest us here: the superposition of states and entanglement.

To understand why we are interested, let's take a short break and think about how a classic computer works. The basic unit of information is the bit, which can have two possible states (1 or 0) and with which we can perform various logical operations (AND, NOT, OR). Putting together n bits we can represent numbers and operate on those numbers, but with limitations: we can only represent up to 2^n different states, and if we want to change x bits we have to perform at least x operations on them: there is no way to magically change them without touching them.

Well, superposition and entanglement allow us to reduce these limitations: with superposition, we can store many more than just 2^n states with n quantum bits (qubits), and entanglement maintains certain relations between qubits in such a way that operations on one qubit necessarily affect the rest.

Superposition, while looking like a blessing at first glance, is also a problem. As Alexander Holevo showed in 1973, even though n qubits can hold many more states than n classical bits, in practice we can only read out 2^n different ones. As we saw in an article in Genbeta about the foundations of quantum computing: a qubit is not just worth 1 or 0 like a normal bit; it can be 1 with 80% probability and 0 with 20%. The problem is that when we read it we can only obtain either 1 or 0, and the probabilities that each value had are lost, because measuring the qubit modifies it.
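
As a rough illustration of that last point (a minimal sketch in plain Python, not tied to any quantum SDK), a qubit can be represented by two amplitudes whose squared magnitudes give the read-out probabilities, and measuring it collapses the state:

```python
import numpy as np

rng = np.random.default_rng(0)

# Amplitudes (alpha, beta) for |0> and |1>, with |alpha|^2 + |beta|^2 = 1.
# Chosen to match the example above: 1 comes out 80% of the time, 0 20%.
alpha, beta = np.sqrt(0.2), np.sqrt(0.8)

def measure(alpha, beta):
    """One projective measurement: returns the observed bit and the
    post-measurement amplitudes (the superposition is destroyed)."""
    if rng.random() < abs(alpha) ** 2:
        return 0, (1.0, 0.0)
    return 1, (0.0, 1.0)

outcomes = [measure(alpha, beta)[0] for _ in range(10_000)]
print(sum(outcomes) / len(outcomes))   # close to 0.8, but each read is just 0 or 1
```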

This discrepancy between the information kept by the qubits and what we can read out led Benioff and Feynman to demonstrate that a classical computer would not be able to simulate a quantum system without a disproportionate amount of resources, and to propose models for a quantum computer that would be able to do that simulation.

Those quantum computers would probably be nothing more than a scientific curiosity without the second concept, entanglement, which allows two quite relevant algorithms to be developed: quantum annealing in 1989 and Shor's algorithm in 1994. The first allows finding minimum values of functions, which, put that way, does not sound very interesting, but it has applications in artificial intelligence and machine learning, as we discussed in another article. For example, if we manage to encode the error rate of a neural network as a function to which we can apply quantum annealing, the minimum value will tell us how to configure the neural network to be as efficient as possible.
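
Quantum annealing itself needs dedicated hardware, but its classical cousin, simulated annealing, gives a feel for the idea of minimising a cost function by gradually reducing the willingness to accept worse solutions. This is a toy sketch only; the one-dimensional cost function below is an arbitrary stand-in for something like a network's error rate, and it says nothing about how a real annealer is programmed.

```python
import math
import random

random.seed(1)

def cost(x):
    # Toy objective with several local minima.
    return x ** 2 + 3 * math.sin(5 * x)

x = random.uniform(-3, 3)
temperature = 2.0
while temperature > 1e-3:
    candidate = x + random.gauss(0, 0.3)
    delta = cost(candidate) - cost(x)
    # Always accept better moves; accept worse moves with a probability
    # that shrinks as the "temperature" is lowered.
    if delta < 0 or random.random() < math.exp(-delta / temperature):
        x = candidate
    temperature *= 0.995

print(f"approximate minimiser: x = {x:.3f}, cost = {cost(x):.3f}")
```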

The second algorithm, Shor's algorithm, helps us to decompose a number into its prime factors much more efficiently than we can achieve on a normal computer. Put that way, again, it doesn't sound at all interesting. But if I tell you that RSA, one of the most used algorithms to protect and encrypt data on the Internet, is based on the fact that factoring numbers is exponentially slow (adding a bit to the key implies doubling the time a brute-force attack takes), then the thing changes. A quantum computer with enough qubits would render many encryption systems completely obsolete.
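To see why key size matters classically, here is a naive trial-division factoriser (illustrative only; serious classical attacks use far better algorithms, and Shor's algorithm requires an actual quantum computer). Its running time grows roughly with the square root of n, which is exponential in the number of bits of the key:

```python
def trial_division(n):
    """Factor n by trying every candidate divisor up to sqrt(n)."""
    factors = []
    d = 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)
    return factors

# A 20-bit example, around the limit mentioned for a 20-qubit machine below.
print(trial_division(2**20 - 1))   # [3, 5, 5, 11, 31, 41]
```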

Until now, quantum computing is a field that hasn't been applied much in the real world. To give us an idea, with the twenty qubits of the commercial quantum computer announced by IBM, we could apply Shor's factorization algorithm only to numbers less than 1048576, which as you can imagine is not very impressive.

Still, the field has evolved promisingly. In 1998 the first working quantum computer was demonstrated: it had only two qubits and needed a nuclear magnetic resonance machine to solve a toy problem (the so-called Deutsch-Jozsa problem). In 2001 Shor's algorithm was run for the first time. Only six years later, in 2007, D-Wave presented its first computer capable of performing quantum annealing with 16 qubits. This year, the same company announced a 2000-qubit quantum annealing computer. The new IBM computers, on the other hand, have fewer qubits but are able to implement generic algorithms, not only quantum annealing. In short, it seems that the push is strong and that quantum computing will be increasingly applicable to real problems.

What can those applications be? As we mentioned before, the quantum annealing algorithm is very appropriate for machine learning problems, which makes the computers that implement it extremely useful, even though the only thing they can do is run that single algorithm. If systems can be developed that, for example, are capable of transcribing conversations or identifying objects in images, and training them can be translated into a problem for quantum computers, the results could be orders of magnitude better than those that already exist. The same algorithm could also be used to find solutions to problems in medicine or chemistry, such as finding the optimal treatment methods for a patient or studying the possible structures of complex molecules.

Generic quantum computers, which have fewer qubits right now, could run more algorithms. For example, they could be used to break much of the cryptography used right now, as we discussed earlier (which explains why the NSA wanted to have a quantum computer). They would also serve as super-fast search engines if Grover's search algorithm can be implemented, and for physics and chemistry they can be very useful as efficient simulators of quantum systems.

Unfortunately, algorithms and code for classic computers can't be used on quantum computers to magically get an improvement in speed: you need to develop a quantum algorithm (not a trivial thing) and implement it in order to get that improvement. That, at first, greatly restricts the applications of quantum computers and will be a problem to overcome when those systems are more developed.

However, the main problem facing quantum computing is building the computers themselves. Compared to a normal computer, a quantum computer is an extremely complex machine: it operates at a temperature close to absolute zero (-273°C), the hardware supporting the qubits is superconducting, and the components needed to read and manipulate the qubits are not simple either.

What can a non-quantum quantum computer be like? As we have explained before, the two relevant concepts of a quantum computer are superposition and entanglement, and without them there cannot be the speed improvements that quantum algorithms promise. If disturbances in the computer quickly push superposed qubits into classical states, or if they break the entanglement between several qubits, what we have is not a quantum computer but only an extremely expensive machine that runs a handful of algorithms no better than a normal computer (and will probably give erroneous results).

Of the two properties, entanglement is the most difficult to maintain and to prove exists. The more qubits there are, the easier it is for one of them to become disentangled (which explains why increasing the number of qubits is not a trivial task). And it is not enough to build the computer and see that correct results come out to say that there are entangled qubits: looking for evidence of entanglement is a task in itself, and in fact the lack of such evidence was one of the main criticisms of D-Wave's systems in their early days.

A priori, with the materials quantum computers are currently being built from, miniaturization does not seem very feasible. But there is already research on new materials that could be used to create more accessible quantum computers. Who knows whether, fifty years from now, we will be able to buy quantum CPUs to improve the speed of our computers.

View original post here:
Will Quantum Computing Really Change The World? Facts And Myths - Analytics India Magazine

Google’s top quantum computing brain may or may not have quit – Fudzilla

We will know when someone opens his office door

John Martinis, who had established Google's quantum hardware group in 2014, has cleaned out his office, put the cats out and left the building.

Martinis says that a few months after he got Google's now legendary quantum computing experiment to go, he was reassigned from a leadership position to an advisory one.

Martinis told Wired that the change led to disagreements with Hartmut Neven, the longtime leader of Google's quantum project.

Martinis said he had to go because his professional goal is for someone to build a quantum computer.

Google has not disputed this account, and says that the company is grateful for Martinis' contributions and that Neven continues to head the company's quantum project.

Martinis retains his position as a professor at UC Santa Barbara, which he held throughout his tenure at Google, and says he will continue to work on quantum computing.

To be fair, Google's quantum computing project was founded by Neven, who pioneered Google's image search technology, and got enough cats together.

The project took on greater scale and ambition when Martinis joined in 2014 to establish Googles quantum hardware lab in Santa Barbara, bringing along several members of his university research group. His nearby lab at UC Santa Barbara had produced some of the most prominent work in the field over the past 20 years, helping to demonstrate the potential of using superconducting circuits to build qubits, the building blocks of quantum computers.

Google's ground-breaking supremacy experiment used 53 qubits working together. They took minutes to crunch through a carefully chosen math problem the company calculated would take a supercomputer 10,000 years to work out. It still does not have a practical use, and the cats were said to be bored with the whole thing.

Continue reading here:
Google's top quantum computing brain may or may not have quit - Fudzilla

On the Heels of a Light Beam – Scientific American

As a 16-year-old boy, Albert Einstein imagined chasing after a beam of light in the vacuum of space. He mused on that vision for years, turning it over in his mind, asking questions about the relation between himself and the beam. Those mental investigations eventually led him to his special theory of relativity. Such thought experiments, which Einstein referred to by the German term gedankenexperiment, continue to nourish the heart of physics today, especially in the field of quantum mechanics, which he helped to establish.

"In quantum mechanics, things don't happen," theoretical physicist Stephen L. Adler tells our reporter Tim Folger, referring to the probabilistic nature of quantum reality.

Philosophically, this may be true, but it hasn't stopped researchers from testing quantum concepts. Using lasers to excite electrons into emitting photons, a group at Delft University of Technology in the Netherlands ruled out the existence of hidden variables, which Einstein believed were controlling so-called entangled particles, one of the main tenets of quantum theory. Without these mysterious forces, bizarre dynamics could indeed be at work in the quantum world, defying our notions of space and time. Physicist Lee Smolin argues that the fabric of the cosmos is a vast collection of atomic interactions within an evolving network of relations where causality among events is complex and irrespective of distance.

Despite the theoretical mysteries of quantum theory, its real-world applications are growing. Researchers are cooling atomic systems to near absolute zero for use as quantum simulators to study applications in superconductors and superfluids. Others are using tabletop experiments to monitor the gravitational fields around entangled objects (minuscule gold or diamond spheres, for example), looking for signs that gravity itself is quantized into discrete bits. At a larger scale, tools such as the Event Horizon Telescope, which recently took the first picture of a black hole, and gravitational-wave detectors could help resolve long-standing, vexing contradictions between quantum mechanics and general relativity.

These quantum insights are fueling tremendous innovation. A team of researchers in China successfully demonstrated entanglement over a distance of 1,200 kilometers, paving the way for an unhackable quantum-communications network. Computer scientists are using quantum algorithms to enhance traditional systems, ratcheting up progress toward the heralded quantum computing era. Such applications are still immature, as Elizabeth Gibney reports, yet it's not stopping investors from pouring money into quantum start-ups.

Science historians have argued about whether Einstein accepted the elements of quantum theory that conflicted with his own theories. Who knows whether he could have imagined the applications his ideas engendered. In any case, the thought experiment continues.

See original here:
On the Heels of a Light Beam - Scientific American