What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?

For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.

If we could do this at scale, we might be able to compute the molecules we want and need.

These might be new materials for shampoos, or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get better insight into how proteins fold, thereby understanding their function, and possibly create custom enzymes to positively change our body chemistry.

Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?

This article is an excerpt from the book Dancing with Qubits by Robert Sutor, a quantum computing textbook that helps you understand how quantum computing works and delves into the math behind it.

Let's start with C8H10N4O2, also known as 1,3,7-trimethylxanthine.

This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, which translates to roughly 2.95 × 10^20 molecules. Written out, this is

295,000,000,000,000,000,000 molecules.
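This figure follows directly from Avogadro's number and caffeine's molar mass (about 194.19 g/mol for C8H10N4O2), a rough check we can sketch in a few lines of Python:

```python
# Rough check of the molecule count in 95 mg of caffeine,
# using Avogadro's number and caffeine's molar mass.
avogadro = 6.022e23        # molecules per mole
molar_mass = 194.19        # grams per mole for C8H10N4O2
grams = 0.095              # 95 mg of caffeine

moles = grams / molar_mass
molecules = moles * avogadro
print(f"{molecules:.2e}")  # about 2.95e+20
```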

A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.

These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.

To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.

Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?

Caffeine is a small molecule, made of protons, neutrons, and electrons. Yet if we look at just the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information needed to describe this is staggering. The number of bits, the 0s and 1s, required is approximately 10^48:

1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.

And this is just one molecule! Yet somehow nature manages to deal quite effectively with all this information. It handles everything from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.

How does it do this? We don't know! Of course, there are theories, and these live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.

We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."

However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
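That estimate is easy to verify, since Python integers have arbitrary precision:

```python
# 2**160 computed exactly, then shown in scientific notation.
capacity = 2 ** 160
print(capacity)           # a 49-digit number
print(f"{capacity:.2e}")  # about 1.46e+48
```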

In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.

On a classical computer, such as my phone or laptop, I can write a little app that simulates a coin flip.

Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other. That is, 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.

When you see R, think random. This is called a fair flip: it is not weighted to slightly prefer one result over the other. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.

If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.

If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By secret coin flip, I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.

If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?

I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
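A minimal Python sketch of R (the function name follows the text; the rest is my own illustration) makes the irreversibility concrete: the output never depends on the input, so no number of applications lets you recover the starting value.

```python
import random

def R(bit):
    """A fair flip: returns 0 or 1 with equal probability,
    ignoring the input entirely."""
    return random.choice([0, 1])

# Applying R once or twice gives the same 50/50 behavior.
flips = [R(R(1)) for _ in range(10_000)]
ones = sum(flips)
# ones will be close to 5000, and nothing about the results
# reveals that we started with 1 rather than 0.
```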

In the quantum case, there is an analog of the fair flip, an operation called H, that is reversible: applying H twice returns the value you started with. There is a catch, though. You are not allowed to look at the intermediate result if you want to reverse H's effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.

To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again will yield the heads or tails with which you started. If you do look, you get classical randomness.
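Underneath, H is the Hadamard operation, a 2×2 matrix acting on a qubit's two amplitudes. A small, self-contained sketch in pure Python (no quantum library) shows that applying H twice really does return the starting state, as long as nothing is measured in between:

```python
import math

# The Hadamard operation H as a 2x2 matrix acting on a qubit
# state [amplitude_of_0, amplitude_of_1].
s = 1 / math.sqrt(2)
H = [[s, s], [s, -s]]

def apply(gate, state):
    """Multiply a 2x2 gate by a 2-element state vector."""
    return [gate[0][0] * state[0] + gate[0][1] * state[1],
            gate[1][0] * state[0] + gate[1][1] * state[1]]

zero = [1.0, 0.0]       # the qubit state |0>
once = apply(H, zero)   # equal superposition: both amplitudes 1/sqrt(2)
twice = apply(H, once)  # back to |0>, up to floating-point rounding
```

Measuring after one application of H gives 0 or 1 with equal probability, which is exactly the classical R; not measuring and applying H again undoes the flip.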

A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. Thats where we get phrases like megabyte, which means one million bytes of information.

A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can only hold one value at a time. Eight qubits can represent all 256 values at the same time.

This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. This is what gives us the (literally) exponential growth in the amount of working memory.
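As a sketch of the bookkeeping involved: the state of n qubits is described by 2^n amplitudes whose squared magnitudes sum to 1. For eight qubits in an equal superposition, that is 256 amplitudes of 1/16 each:

```python
import math

n = 8
dim = 2 ** n               # 256 basis states for 8 qubits
amp = 1 / math.sqrt(dim)   # equal-superposition amplitude, 1/16
state = [amp] * dim        # 256 amplitudes held simultaneously

total = sum(a * a for a in state)  # probabilities must sum to 1
```

Each additional qubit doubles `dim`, which is the literally exponential growth in working memory described above.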

Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to help find patterns in information, learn from the information, and automatically perform more intelligently. They also give humans help and insight that might have been difficult to get otherwise.

Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. There are, in some sense, small, medium, and large ways quantum computing might complement classical techniques.

As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.

In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.

To summarize, we explored how quantum computing works and how it might complement artificial intelligence.

Get the quantum computing book Dancing with Qubits by Robert Sutor today to explore the inner workings of quantum computing. The book contains some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.


See the original post:

Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub
