What if we could do chemistry inside a computer instead of in a test tube or beaker in the laboratory? What if running a new experiment was as simple as running an app and having it completed in a few seconds?
For this to really work, we would want it to happen with complete fidelity. The atoms and molecules as modeled in the computer should behave exactly like they do in the test tube. The chemical reactions that happen in the physical world would have precise computational analogs. We would need a completely accurate simulation.
If we could do this at scale, we might be able to compute the molecules we want and need.
These might be for new materials for shampoos or even alloys for cars and airplanes. Perhaps we could more efficiently discover medicines that are customized to your exact physiology. Maybe we could get a better insight into how proteins fold, thereby understanding their function, and possibly creating custom enzymes to positively change our body chemistry.
Is this plausible? We have massive supercomputers that can run all kinds of simulations. Can we model molecules in the above ways today?
This article is an excerpt from the book Dancing with Qubits, written by Robert Sutor. In this quantum computing textbook, Robert helps you understand how quantum computing works and delves into the math behind it.
Let's start with C8H10N4O2: 1,3,7-trimethylxanthine.
This is a very fancy name for a molecule that millions of people around the world enjoy every day: caffeine. An 8-ounce cup of coffee contains approximately 95 mg of caffeine, which translates to roughly 2.95 × 10^20 molecules. Written out, this is
295,000,000,000,000,000,000 molecules.
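The molecule count above follows directly from Avogadro's number and caffeine's molar mass, which is about 194.19 g/mol for C8H10N4O2. Here is a minimal sketch of that arithmetic:

```python
# Estimate the number of caffeine molecules in 95 mg of caffeine.
AVOGADRO = 6.02214076e23      # molecules per mole
MOLAR_MASS_CAFFEINE = 194.19  # grams per mole for C8H10N4O2

grams = 0.095                 # 95 mg expressed in grams
moles = grams / MOLAR_MASS_CAFFEINE
molecules = moles * AVOGADRO
print(f"{molecules:.2e}")     # roughly 2.95e+20
```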
A 12-ounce can of a popular cola drink has 32 mg of caffeine, the diet version has 42 mg, and energy drinks often have about 77 mg.
These numbers are large because we are counting physical objects in our universe, which we know is very big. Scientists estimate, for example, that there are between 10^49 and 10^50 atoms in our planet alone.
To put these values in context, one thousand = 10^3, one million = 10^6, one billion = 10^9, and so on. A gigabyte of storage is one billion bytes, and a terabyte is 10^12 bytes.
Getting back to the question I posed at the beginning of this section, can we model caffeine exactly on a computer? We don't have to model the huge number of caffeine molecules in a cup of coffee, but can we fully represent a single molecule at a single instant?
Caffeine is a small molecule, made up of protons, neutrons, and electrons. Yet if we look just at the energy configuration that determines the structure of the molecule and the bonds that hold it all together, the amount of information needed to describe it is staggering. The number of bits, the 0s and 1s, required is approximately 10^48:
1,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000,000.
And this is just one molecule! Yet somehow nature manages to deal with all this information quite effectively. It handles everything from the single caffeine molecule, to all those in your coffee, tea, or soft drink, to every other molecule that makes up you and the world around you.
How does it do this? We don't know! There are theories, of course, and they live at the intersection of physics and philosophy. However, we do not need to understand it fully to try to harness its capabilities.
We have no hope of providing enough traditional storage to hold this much information. Our dream of exact representation appears to be dashed. This is what Richard Feynman meant in his quote: "Nature isn't classical."
However, 160 qubits (quantum bits) could hold 2^160 ≈ 1.46 × 10^48 bits while the qubits were involved in a computation. To be clear, I'm not saying how we would get all the data into those qubits, and I'm also not saying how many more we would need to do something interesting with the information. It does give us hope, however.
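You can check the 2^160 figure against the roughly 10^48 bits needed for caffeine with a one-line calculation; Python's arbitrary-precision integers make this exact:

```python
# Compare the number of values 160 qubits can span during a computation
# with the ~10^48 bits needed to describe caffeine's configuration.
states = 2 ** 160
print(f"{states:.3e}")  # about 1.462e+48
```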
In the classical case, we will never fully represent the caffeine molecule. In the future, with enough very high-quality qubits in a powerful quantum computing system, we may be able to perform chemistry on a computer.
I can write a little app on a classical computer that can simulate a coin flip. This might be for my phone or laptop.
Instead of heads or tails, let's use 1 and 0. The routine, which I call R, starts with one of those values and randomly returns one or the other: 50% of the time it returns 1 and 50% of the time it returns 0. We have no knowledge whatsoever of how R does what it does.
When you see R, think "random." This is called a fair flip: it is not weighted to prefer one result over the other, even slightly. Whether we can produce a truly random result on a classical computer is another question. Let's assume our app is fair.
If I apply R to 1, half the time I expect 1 and the other half 0. The same is true if I apply R to 0. I'll call these applications R(1) and R(0), respectively.
If I look at the result of R(1) or R(0), there is no way to tell if I started with 1 or 0. This is just like a secret coin flip, where I can't tell whether I began with heads or tails just by looking at how the coin has landed. By "secret coin flip," I mean that someone else has flipped it and I can see the result, but I have no knowledge of the mechanics of the flip itself or the starting state of the coin.
If R(1) and R(0) are randomly 1 and 0, what happens when I apply R twice?
I write this as R(R(1)) and R(R(0)). It's the same answer: a random result with an equal split. The same thing happens no matter how many times we apply R. The result is random, and we can't reverse things to learn the initial value.
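A minimal sketch of R on a classical computer makes the point concrete: applying it once or twice (or any number of times) gives the same 50/50 behavior, and nothing about the output reveals the input. The function name R and the trial count are my own choices for illustration:

```python
import random

def R(bit):
    """A fair flip: ignore the input and return 0 or 1 with equal probability."""
    return random.choice([0, 1])

# Applying R once or twice makes no difference: the result is still
# an even split, and it carries no trace of the starting value.
trials = 100_000
ones_single = sum(R(1) for _ in range(trials))
ones_double = sum(R(R(1)) for _ in range(trials))
print(ones_single / trials)  # close to 0.5
print(ones_double / trials)  # close to 0.5
```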
Quantum computing offers a different kind of flip, which I'll call H. Like R, a single application of H to 0 or 1 yields a result that looks random. Unlike R, however, H is reversible: applying it twice in a row returns the value you started with. There is a catch, though. You are not allowed to look at the intermediate result if you want to reverse H's effect. If you apply H to 0 or 1, peek at the result, and apply H again to that, it is the same as if you had used R. If you observe what is going on in the quantum case at the wrong time, you are right back at strictly classical behavior.
To summarize using the coin language: if you flip a quantum coin and then don't look at it, flipping it again returns the heads or tails with which you started. If you do look, you get classical randomness.
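The reversibility of H without measurement can be sketched with the standard Hadamard matrix, which is the usual mathematical model of this quantum flip (the book introduces it formally later; this numerical check is my own illustration):

```python
import numpy as np

# H is the Hadamard transform. Applied once to the state for 0, it
# produces an equal superposition; applied twice with no measurement
# in between, it restores the original state, because H @ H = I.
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)

zero = np.array([1.0, 0.0])     # the state representing 0
superposed = H @ zero           # equal mix of 0 and 1
restored = H @ superposed       # back to 0: no randomness at all

print(np.round(superposed, 3))  # [0.707 0.707]
print(np.round(restored, 3))    # [1. 0.]
```

Peeking in between corresponds to a measurement, which collapses the superposition to 0 or 1 at random, and the second H then acts on that collapsed value instead, reproducing R's behavior.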
A second area where quantum is different is in how we can work with simultaneous values. Your phone or laptop uses bytes as individual units of memory or storage. That's where we get phrases like megabyte, which means one million bytes of information.
A byte is further broken down into eight bits, which we've seen before. Each bit can be a 0 or 1. Doing the math, each byte can represent 2^8 = 256 different numbers composed of eight 0s or 1s, but it can hold only one value at a time. Eight qubits, by contrast, can represent all 256 values at the same time.
This is through superposition, but also through entanglement, the way we can tightly tie together the behavior of two or more qubits. Together, these give us the (literally) exponential growth in the amount of working memory.
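One common way to see the exponential growth is to note that an n-qubit register is described by 2^n complex amplitudes, so eight qubits already span all 256 byte values at once. A small sketch, using a plain array as the state vector:

```python
import numpy as np

# A register of n qubits is described by 2**n complex amplitudes.
# For n = 8 that is one amplitude per possible byte value, whereas a
# classical byte stores exactly one of those 256 values at a time.
n = 8
uniform = np.full(2 ** n, 1 / np.sqrt(2 ** n), dtype=complex)

print(len(uniform))                                   # 256 amplitudes
print(round(float(np.sum(np.abs(uniform) ** 2)), 6))  # probabilities sum to 1.0
```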
Artificial intelligence and one of its subsets, machine learning, are extremely broad collections of data-driven techniques and models. They are used to find patterns in information, learn from that information, and help systems behave more intelligently. They also give humans help and insight that might have been difficult to get otherwise.
Here is a way to start thinking about how quantum computing might be applicable to large, complicated, computation-intensive systems of processes such as those found in AI and elsewhere. These three cases are in some sense the small, medium, and large ways quantum computing might complement classical techniques:
As I write this, quantum computers are not big data machines. This means you cannot take millions of records of information and provide them as input to a quantum calculation. Instead, quantum may be able to help where the number of inputs is modest but the computations blow up as you start examining relationships or dependencies in the data.
In the future, however, quantum computers may be able to input, output, and process much more data. Even if it is just theoretical now, it makes sense to ask if there are quantum algorithms that can be useful in AI someday.
To summarize, we explored how quantum computing works and how it might someday be applied to artificial intelligence.
Get the quantum computing book Dancing with Qubits by Robert Sutor today, in which he explores the inner workings of quantum computing. The book includes some sophisticated mathematical exposition and is therefore best suited for those with a healthy interest in mathematics, physics, engineering, and computer science.
See the original post:
Quantum expert Robert Sutor explains the basics of Quantum Computing - Packt Hub