What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be new materials for shampoos, or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could gain better insight into how proteins fold, thereby understanding their function, and possibly create custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits by Robert Sutor. In this quantum computing textbook, Sutor helps you understand how quantum computing works and delves into the math behind it.

Let's start with C8H10N4O2: 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, which translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
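A quick sanity check of that count, using caffeine's molar mass (about 194.19 g/mol for C8H10N4O2) and Avogadro's number:

```python
# Estimate the number of caffeine molecules in 95 mg of caffeine.
AVOGADRO = 6.022e23            # molecules per mole
MOLAR_MASS_CAFFEINE = 194.19   # grams per mole for C8H10N4O2

grams = 0.095                  # 95 mg expressed in grams
moles = grams / MOLAR_MASS_CAFFEINE
molecules = moles * AVOGADRO

print(f"{molecules:.2e}")      # about 2.95e+20 molecules
```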

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule, made up of protons, neutrons, and electrons. Yet if we just look at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information needed to describe this is staggering. The number of bits, the 0s and 1s, required is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.
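A back-of-the-envelope calculation shows how far beyond classical storage this is, taking the 10^48-bit estimate at face value:

```python
# Express 10**48 bits in (decimal) terabytes of classical storage.
bits = 10 ** 48
bytes_needed = bits // 8                # 8 bits per byte
terabytes = bytes_needed // 10 ** 12    # 1 TB = 10**12 bytes

print(terabytes)                        # 125 followed by 33 zeros
```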

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information, from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant when he said, "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
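You can check the arithmetic directly; 2^160 comfortably exceeds the 10^48 bits estimated above:

```python
# Number of classical bit patterns representable by 160 qubits.
n_qubits = 160
states = 2 ** n_qubits

print(states > 10 ** 48)    # True
print(f"{states:.3e}")      # about 1.46e+48
```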

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other: 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think "random." This is called a fair flip: it is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By "secret coin flip," I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
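Here is a minimal classical sketch of R in Python, using the standard library's random module as a stand-in for the fair flip (the seed is fixed purely to make the demo reproducible):

```python
import random

def R(bit):
    """Fair flip: ignore the input and return 0 or 1, each with probability 1/2."""
    return random.randint(0, 1)

random.seed(0)       # fixed seed for a reproducible demo only
trials = 100_000

# Applying R twice, R(R(1)), is just as random as applying it once,
# and the result tells us nothing about the starting value.
ones = sum(R(R(1)) for _ in range(trials))
print(ones / trials)  # close to 0.5
```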

On a quantum computer, there is an analogous routine, which I'll call H, that looks like R when we examine its output, yet is reversible: applying H twice in a row returns the value we started with. There is a catch, though. You are not allowed to look at the result of what H does if you want to reverse its effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.
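As a toy model (not from the book), H can be simulated classically as the Hadamard transform acting on a pair of amplitudes; as long as we do not measure in between, applying it twice restores the starting state:

```python
import math

def H(state):
    """Hadamard-like transform on a 2-amplitude state [a, b]."""
    a, b = state
    s = 1 / math.sqrt(2)
    return [s * (a + b), s * (a - b)]

start = [1.0, 0.0]   # the state representing 0
once = H(start)      # both amplitudes ~0.707: measuring now would look 50/50
twice = H(once)      # with no measurement in between, we are back at the start

print(twice)         # close to [1.0, 0.0], up to float rounding
```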

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits, by contrast, can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
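A minimal sketch of the bookkeeping, assuming the standard state-vector picture of n qubits as a vector of 2^n amplitudes:

```python
import math

# Putting each of 8 qubits into an equal superposition gives all 256
# amplitudes the same value, (1/sqrt(2))**8 == 1/16, and the squared
# amplitudes (the measurement probabilities) sum to 1.
n = 8
amp = (1 / math.sqrt(2)) ** n
state = [amp] * (2 ** n)

print(len(state))                           # 256
print(round(sum(a * a for a in state), 6))  # 1.0
```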

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. There are, in some sense, small, medium, and large ways in which quantum computing might complement classical techniques:

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and how it might someday apply to artificial intelligence.

Get the quantum computing book Dancing with Qubits by Robert Sutor today, in which he explores the inner workings of quantum computing. The book contains some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.

See the original post:

Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub
