DeepMind, the AI unit of Google that invented the chess-champion neural network AlphaZero a few years back, shocked the world again in November with a program that cracked the decades-old problem of predicting how proteins fold. The program handily beat all competitors, in what one researcher called a "watershed moment" that promises to revolutionize biology.
AlphaFold 2, as it's called, was described at the time only in brief terms, in a blog post by DeepMind and in a paper abstract the company provided for the contest in which it entered the program, the biennial Critical Assessment of Techniques for Protein Structure Prediction (CASP) competition.
Last week, DeepMind finally revealed just how it's done, offering up not only a blog post but also a 16-page summary paper written by DeepMind's John Jumper and colleagues in the journal Nature, a 62-page collection of supplementary material, and a code library on GitHub. A story on the new details by Nature's Ewen Callaway characterizes the data dump as "protein structure coming to the masses."
So, what have we learned? A few things. As the name suggests, this neural net is the successor to the first AlphaFold, which had also trounced competitors in the prior competition in 2018. The most immediate revelation of AlphaFold 2 is that making progress in artificial intelligence can require what's called an architecture change.
The architecture of a software program is the particular set of operations used and the way they are combined. The first AlphaFold was made up of a convolutional neural network, or "CNN," a classic neural network design that has been the workhorse of many AI breakthroughs in the past decade, such as triumphs in the ImageNet computer vision contest.
But convolutions are out, and graphs are in. Or, more specifically, the combination of graph networks with what's called attention.
A graph network represents some collection of things in terms of how they're related to one another -- such as people in a social network, connected by friendships. In this case, AlphaFold uses information about proteins to construct a graph of how near different amino acids are to one another.
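A graph of amino acid proximity can be sketched in a few lines of numpy. This is an illustrative toy, not DeepMind's code: the coordinates and the 8-angstrom contact cutoff are made-up stand-ins for what the real program learns from data.

```python
import numpy as np

# Hypothetical C-alpha coordinates for a 5-residue peptide (angstroms).
coords = np.array([
    [0.0, 0.0, 0.0],
    [3.8, 0.0, 0.0],
    [7.6, 0.0, 0.0],
    [7.6, 3.8, 0.0],
    [3.8, 3.8, 0.0],
])

# Pairwise distance matrix: entry (i, j) is how near residue i is to residue j.
dist = np.linalg.norm(coords[:, None, :] - coords[None, :, :], axis=-1)

# Turn distances into a graph: connect residues closer than a cutoff,
# excluding self-edges on the diagonal.
CUTOFF = 8.0  # assumed contact threshold, in angstroms
adjacency = (dist < CUTOFF) & ~np.eye(len(coords), dtype=bool)

print(adjacency.sum())  # number of directed edges in the graph
```

Each residue becomes a node, and each sufficiently close pair becomes an edge the network can reason over.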
These graphs are manipulated by the attention mechanism that has been gaining in popularity in many quarters of AI. Broadly speaking, attention is the practice of devoting extra computing power to some pieces of input data. Programs that exploit attention have led to breakthroughs in a variety of areas, but especially natural language processing, as in the case of Google's Transformer.
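The core attention operation is compact enough to sketch. This is the standard scaled dot-product form popularized by the Transformer, with made-up random inputs -- a minimal illustration, not AlphaFold's own implementation:

```python
import numpy as np

def attention(queries, keys, values):
    """Scaled dot-product attention: each query spends more 'compute'
    (weight) on the inputs most relevant to it."""
    d = queries.shape[-1]
    scores = queries @ keys.T / np.sqrt(d)           # relevance of each key to each query
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)   # softmax: weights sum to 1 per query
    return weights @ values                          # weighted mix of the values

rng = np.random.default_rng(0)
q = rng.normal(size=(4, 8))   # 4 queries of dimension 8
k = rng.normal(size=(6, 8))   # 6 keys
v = rng.normal(size=(6, 8))   # 6 values, one per key
out = attention(q, k, v)
print(out.shape)
```

Because the weights are computed from the data itself, the mechanism can route information between any two positions, no matter how far apart they sit in the input.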
The part that used convolutions in the first AlphaFold has been dropped in AlphaFold 2, replaced by a whole slew of attention mechanisms.
Use of attention runs throughout AlphaFold 2. The first part of AlphaFold 2 is what's called the EvoFormer, and it uses attention to focus processing on computing the graph of how each amino acid relates to every other amino acid. Because of the geometric forms created in the graph, Jumper and colleagues refer to this operation of estimating the graph as "triangle self-attention."
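The triangle idea can be illustrated with a toy version. This is a heavy simplification of the paper's mechanism (which also uses learned projections and gating): to update the edge between residues i and j, attend over the edges from i to every other residue k, with the score biased by the third edge of the triangle, (j, k). All shapes and values here are made up for illustration.

```python
import numpy as np

def softmax(x, axis=-1):
    e = np.exp(x - x.max(axis=axis, keepdims=True))
    return e / e.sum(axis=axis, keepdims=True)

def triangle_attention_rows(pair, bias):
    """Toy triangle attention around starting node i: edge (i, j)
    attends over edges (i, k), biased by the closing edge (j, k)."""
    n, _, d = pair.shape
    out = np.zeros_like(pair)
    for i in range(n):
        # scores[j, k]: how much edge (i, j) attends to edge (i, k),
        # nudged by bias[j, k], which comes from the third edge (j, k).
        scores = pair[i] @ pair[i].T / np.sqrt(d) + bias
        out[i] = softmax(scores, axis=-1) @ pair[i]
    return out

rng = np.random.default_rng(1)
n, d = 5, 8
pair = rng.normal(size=(n, n, d))    # pair[i, j]: features of edge (i, j)
bias = pair @ rng.normal(size=d)     # scalar bias per edge, standing in for a learned projection
updated = triangle_attention_rows(pair, bias)
print(updated.shape)
```

The point of the triangle constraint is geometric consistency: the distances among any three residues must be able to close into a real triangle, so each edge update consults the other two edges.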
Echoing natural language programs, the EvoFormer allows the triangle attention to send information backward to the groups of related amino acid sequences, known as "multiple sequence alignments," or "MSAs" -- a standard tool in bioinformatics, in which related sequences are compared position by position.
The authors consider the MSAs and the graphs to be in a kind of conversation thanks to attention -- what they refer to as a "joint embedding." Hence, attention is leading to communication between parts of the program.
The second part of AlphaFold 2, following the EvoFormer, is what's called the Structure Module, which takes the graphs the EvoFormer has built and turns them into specifications of the 3-D structure of the protein -- the output that is judged in the CASP competition.
Here, the authors have introduced an attention mechanism that calculates parts of a protein in isolation, called an "invariant point attention" mechanism. They describe it as "a geometry-aware attention operation."
The Structure Module initializes particles at a kind of origin point in space, which you can think of as a 3-D reference field, called a "residue gas," and then proceeds to rotate and shift the particles to produce the final 3-D configuration. Again, the important thing is that the particles are transformed independently of one another, using the attention mechanism.
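The rotate-and-shift step can be sketched as follows. In the real program, the rotations and translations are predicted by the network per residue; here they are made-up values, and the single local point per residue stands in for a residue's atoms.

```python
import numpy as np

def rotation_z(theta):
    """Rotation matrix about the z-axis by angle theta (radians)."""
    c, s = np.cos(theta), np.sin(theta)
    return np.array([[c, -s, 0.0], [s, c, 0.0], [0.0, 0.0, 1.0]])

# "Residue gas": every residue starts at the same reference frame,
# here a single local point one unit along its own x-axis.
n_res = 4
local_points = np.tile([1.0, 0.0, 0.0], (n_res, 1))

# Stand-ins for the network's per-residue predictions.
rotations = [rotation_z(t) for t in np.linspace(0.0, np.pi / 2, n_res)]
translations = np.array([[i * 3.8, 0.0, 0.0] for i in range(n_res)])

# Each residue is moved independently of the others:
# global position = rotation @ local point + translation.
placed = np.array([R @ p + t for R, p, t in zip(rotations, local_points, translations)])
print(placed.round(3))
```

Because each residue's transform is applied on its own, the module can reposition the whole chain in parallel rather than threading through it sequentially.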
Why is it important that graphs, and attention, have replaced convolutions? In the original abstract offered for the research last year, Jumper and colleagues pointed out a need to move beyond a fixation on what are called "local" structures.
Going back to AlphaFold 1, the convolutional neural network functioned by measuring the distance between amino acids, and then summarizing those measurements for all pairs of amino acids as a 2-D picture, known as a distance histogram, or "distogram." The CNN then operated by poring over that picture, the way CNNs do, to find local motifs that build into broader and broader motifs spanning the range of distances.
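The distogram the first AlphaFold's CNN consumed is easy to construct. This sketch uses random coordinates and an assumed 2-to-22-angstrom binning; the real system predicted the distogram rather than computing it from known coordinates.

```python
import numpy as np

# Hypothetical residue coordinates (e.g. one atom per residue), angstroms.
rng = np.random.default_rng(2)
coords = rng.uniform(0.0, 20.0, size=(10, 3))

# Pairwise distances between all residues.
dist = np.linalg.norm(coords[:, None] - coords[None, :], axis=-1)

# Distogram: bin each pairwise distance into fixed ranges, turning the
# distances into a 2-D "picture" a CNN can pore over for local motifs.
bins = np.linspace(2.0, 22.0, 11)     # 10 distance bins spanning 2-22 angstroms
distogram = np.digitize(dist, bins)   # bin index for every residue pair
print(distogram.shape)
```

The catch the article describes follows directly from this picture: a convolution only sees a small neighborhood of the image at a time, so relationships between far-apart rows and columns must percolate through many layers.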
But that orderly progression from local motifs can ignore long-range dependencies, which are one of the important elements that attention supposedly captures. For example, the attention mechanism in the EvoFormer can connect what is learned in the triangle attention mechanism to what is learned in the search of the MSA -- not just one section of the MSA, but the entire universe of related amino acid sequences.
Hence, attention allows for making leaps that are more "global" in nature.
Another thing we see in AlphaFold is the end-to-end goal. In the original AlphaFold, the final assembly of the physical structure was simply driven by the convolutions, and what they came up with.
In AlphaFold 2, Jumper and colleagues have emphasized training the neural network from "end to end." As they say:
"Both within the Structure Module and throughout the whole network, we reinforce the notion of iterative refinement by repeatedly applying the final loss to outputs then feeding the outputs recursively to the same modules. The iterative refinement using the whole network (that we term 'recycling' and is related to approaches in computer vision) contributes significantly to accuracy with minor extra training time."
Hence, another big takeaway from AlphaFold 2 is the notion that a neural network really needs to be constantly revamping its predictions. That is true not only of the recycling operation but also in other respects. For example, the EvoFormer, the thing that makes the graphs of amino acids, revises those graphs at each of its multiple stages, what are called "blocks." Jumper and team refer to these constant updates as "constant communication" throughout the network.
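The recycling loop has a simple shape, shown below in a deliberately toy form: the "network" here is a stand-in function that merely nudges the current guess toward a pretend target, and training (the repeated loss the authors describe) is omitted entirely.

```python
import numpy as np

def refine(structure, target):
    """Stand-in for one pass of the whole network: nudge the current
    structure estimate partway toward a (hypothetical) better one."""
    return structure + 0.5 * (target - structure)

target = np.array([1.0, 2.0, 3.0])   # pretend ground-truth coordinates
structure = np.zeros(3)              # initial rough guess

# "Recycling": feed the network's own output back in as its next input,
# so each pass starts from the previous pass's prediction.
for _ in range(8):
    structure = refine(structure, target)

print(structure.round(3))
```

Each cycle reuses the same module with the same weights, which is why the authors note the accuracy gain comes with only minor extra training time.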
As the authors note, through constant revision, the Structure Module seems to "smoothly" refine its models of the proteins. "AlphaFold makes constant incremental improvements to the structure until it can no longer improve," they write. Sometimes, that process is "greedy," meaning the Structure Module hits on a good solution early in its layers of processing; sometimes, it takes longer.
In any event, the benefits of training a neural network -- or a combination of networks -- end to end seem certain to be a point of emphasis for many researchers.
Alongside that big lesson, there is an important mystery that remains at the center of AlphaFold 2: Why?
Why is it that proteins fold in the ways they do? AlphaFold 2 has unlocked the prospect of every protein in the universe having its structure revealed, which is, again, an achievement decades in the making. But AlphaFold 2 doesn't explain why proteins assume the shape that they do.
Proteins are chains of amino acids, and the forces that make them curl up into a given shape are fairly straightforward -- things like certain amino acids being attracted or repelled by positive or negative charges, and some amino acids being "hydrophobic," meaning they stay away from water molecules.
What is still lacking is an explanation of why it should be that certain amino acids take on shapes that are so hard to predict.
AlphaFold 2 is a stunning achievement in terms of building a machine to transform sequence data into protein models, but we may have to wait for further study of the program itself to know what it is telling us about the big picture of protein behavior.
See the article here:
DeepMind's AlphaFold 2 reveal: Convolutions are out, attention is in - ZDNet