Why reductionism fails at higher levels of complexity

One of the greatest ideas of all time is reductionism, the notion that every system, no matter how complex, can be understood in terms of the behavior of its basic constituents. Reductionism has its roots in ancient Greece, where Leucippus and Democritus, in about 400 BC, proposed that everything is composed of atoms, a Greek word meaning "that which cannot be cut." So, atoms came to signify the smallest constituents of matter, even though what we understand by "smallest" has changed drastically over time.

The focus is on the bottom layer of the material chain: matter is made of molecules; molecules are made of atoms; atoms are made of electrons, protons, and neutrons; protons and neutrons are made of up and down quarks; and so on, presumably to further levels of smallness unknown to us at present. At the biological level, organisms are composed of organs; organs of cells; cells of organic macromolecules; macromolecules of many atoms, etc.

The more radical view of reductionism claims that all behaviors, from elementary particles to the human brain, spring from bits of matter with interactions described by a few fundamental physical laws. The corollary is that if we uncover these laws at the most basic level, we will be able to extrapolate to higher and higher levels of organizational complexity.

Of course, most reductionists know, or should know, that this kind of statement is more faith-based than scientific. In practice, this extrapolation is impossible: studying how quarks and electrons behave won't help us understand how a uranium nucleus behaves, much less genetic reproduction or how the brain works. Hard-core reductionists would stake their position as a matter of principle, a statement of what they believe is the final goal of fundamental science: namely, the discovery of the symmetries and laws that dictate (I would say "describe, to the best of our ability") the behavior of matter at the subatomic level. But to believe that something is possible in principle is quite useless in the practice of science. The expression "fundamental science" is loaded and should be used with care.

There is no question that we should celebrate the triumphs of reductionism during the first 400 years of science. Many of the technological innovations of the past four centuries derive from it, as does our ever-deepening understanding of how nature works. In particular, our digital revolution is a byproduct of quantum mechanics, the branch of physics that studies atoms and subatomic particles. The problem is not so much with how efficient reductionism is at describing the behavior of the basic constituents of matter. The problems arise as we try to go bottom-up, from the lowest level of material organization to higher ones.

We know how to describe with great precision the behavior of the simplest chemical element: the hydrogen atom, with its single proton and electron. However, even here, trouble lurks as we attempt to include subtle corrections, for example accounting for the fact that the electron orbits the proton at relativistic speeds (i.e., a non-negligible fraction of the speed of light) or that its intrinsic rotation (or spin) gives rise to a magnetic force that interacts with a similar magnetic force of the proton. Physicists take these effects into account using perturbation theory, an approximation scheme that adds small changes to the allowed energies of the atom.
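To make this concrete, here is a rough numerical sketch (mine, not the article's) of what such perturbative corrections look like. It uses the textbook first-order fine-structure formula, which bundles the relativistic and spin-orbit effects into a small shift on top of the Bohr energies; the constants are standard values, not taken from the article.

```python
# Sketch: hydrogen energy levels with the first-order fine-structure
# correction (relativistic kinetic energy + spin-orbit coupling treated
# as a small perturbation of the Bohr energies).

ALPHA = 1 / 137.035999   # fine-structure constant
RYDBERG_EV = 13.605693   # hydrogen ground-state binding energy, in eV

def bohr_energy(n):
    """Unperturbed Bohr energy of level n, in eV."""
    return -RYDBERG_EV / n**2

def fine_structure_energy(n, j):
    """Bohr energy plus the textbook first-order fine-structure shift:
    E_nj = E_n * [1 + (alpha^2 / n^2) * (n / (j + 1/2) - 3/4)]."""
    return bohr_energy(n) * (1 + (ALPHA**2 / n**2) * (n / (j + 0.5) - 0.75))

for n, j in [(1, 0.5), (2, 0.5), (2, 1.5)]:
    shift = fine_structure_energy(n, j) - bohr_energy(n)
    print(f"n={n}, j={j}: Bohr energy {bohr_energy(n):+.5f} eV, "
          f"fine-structure shift {shift:+.2e} eV")
```

The shifts come out of order 10^-4 eV, tens of thousands of times smaller than the energies they correct, which is exactly why a perturbative, approximate treatment works here.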

Physicists can also describe the next atom of the periodic table, helium, with considerable success due to its high degree of symmetry. But life gets complicated very quickly as we go up in complexity. More drastic and less efficient approximation schemes are required to make progress. And these don't include the interactions between protons and neutrons in the nucleus (which call for a different force, the strong nuclear force), much less the fact that protons and neutrons are made of quarks and gluons, the particles responsible for the strong interactions.

Physics is the art of approximation. We dress down complex systems to their bare essentials and model them in as simple terms as possible without compromising the goal of understanding the complicated system we started from. This process works well until the complexity is such that a new set of laws and approaches is necessary.

At the next level of complexity are the molecules, assemblies of atoms. In a very rough way, all chemical reactions are attempts to minimize electric charge disparities. How many molecules can exist?

Let's jump to biochemistry for an illustration. Proteins are chains of amino acids. Since there are 20 different amino acids and a typical protein has some 200 of them, the number of possible proteins is around 20^200. Increasing the length of the protein, and hence the possible choices of amino acids, leads to a combinatorial explosion. Physicist Walter Elsasser coined the term "immense" to describe numbers larger than 10^100, a googol (that is, a one followed by 100 zeroes). The number of possible proteins is certainly immense. We see only a small subset realized in living creatures.
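A quick back-of-envelope check (my sketch, using the article's figures of 20 amino acids and a 200-residue chain) shows just how far past a googol this count lands:

```python
# Back-of-envelope check of the combinatorial explosion described above.
n_amino_acids = 20   # distinct amino acids
chain_length = 200   # residues in a "typical" protein, per the article

possible_proteins = n_amino_acids ** chain_length   # 20^200
googol = 10 ** 100

print(f"20^200 has {len(str(possible_proteins))} digits")   # 261 digits
print(f"that is roughly 10^{len(str(possible_proteins)) - 1}, "
      f"or about 10^{len(str(possible_proteins // googol)) - 1} googols")
```

So 20^200 is about 10^260, roughly 10^160 times larger than a googol.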

The number 10^100 is not arbitrary. Elsasser showed that a list containing 10^100 molecules would require a computer memory containing more than all the matter in the universe. Worse, to analyze the contents of the list, we would need more time than the age of the universe, 13.8 billion years. There is an immense number of new molecules with unknown properties to be explored. The same goes for the number of genetic combinations, cell types, and mental states.
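To put some rough numbers behind that argument (standard order-of-magnitude estimates, not figures from the article): the observable universe holds something like 10^80 atoms, and even an exascale computer checking 10^18 entries per second since the Big Bang barely scratches a googol-sized list.

```python
# Rough illustration of Elsasser's "immense" numbers. The estimates below
# (atoms in the observable universe, exascale computing speed) are common
# order-of-magnitude assumptions, not figures from the article.

immense = 10 ** 100                       # Elsasser's threshold: one googol
atoms_in_observable_universe = 10 ** 80   # standard rough estimate
age_of_universe_s = 13.8e9 * 3.156e7      # 13.8 billion years, in seconds
ops_per_second = 10 ** 18                 # an exascale machine, roughly

# Storage: even one list entry per atom leaves us short by a huge factor.
print(f"entries needed per atom: 10^{100 - 80}")

# Time: one entry checked per machine cycle since the Big Bang.
entries_checked = ops_per_second * age_of_universe_s
print(f"entries checkable since the Big Bang: about {entries_checked:.1e}")
print(f"fraction of the googol-sized list covered: {entries_checked / immense:.1e}")
```

The covered fraction comes out around 10^-65, which is Elsasser's point: such lists cannot be stored or searched, even in principle, with the resources of the observable universe.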

It is thus impossible to predict the behavior of complex biomolecules from a bottom-up approach based on fundamental physical laws. Quarks do not explain the behavior of neurons. The passage from one level of material organization to the next is not continuous. New laws are required for different layers of material organization, as described in the fast-growing field of complex systems theory. There are many texts on the subject, some of them fairly technical. The exciting aspect of this new field is that it calls for new ways of thinking about natural systems, ways that are by nature more holistic: network theory, nonlinear dynamics, chaos theory and fractals, and information theory. Climate science is another clear example.
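As a tiny, self-contained illustration of the sort of behavior those tools deal with (my example, not the article's), consider the logistic map from nonlinear dynamics: a one-line rule whose long-term behavior cannot be predicted by inspecting the rule itself, because nearby starting points diverge exponentially.

```python
# The logistic map x_{n+1} = r * x_n * (1 - x_n), a standard toy model of
# chaos. At r = 4, two trajectories that start almost identically soon
# disagree completely, so knowing the underlying law exactly does not
# let you extrapolate the system's long-term behavior.

def logistic_trajectory(x0, r=4.0, steps=50):
    xs = [x0]
    for _ in range(steps):
        xs.append(r * xs[-1] * (1 - xs[-1]))
    return xs

a = logistic_trajectory(0.200000)
b = logistic_trajectory(0.200001)   # differs only in the sixth decimal place

for n in (0, 10, 20, 30, 40, 50):
    print(f"step {n:2d}: |difference| = {abs(a[n] - b[n]):.6f}")
```

By around step 30 the two runs have nothing to do with each other, even though both obey the same exact deterministic law.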

In his prescient 1972 essay "More Is Different," Nobel laureate physicist Philip Anderson argued for this layering of physical laws, each irreducible to the ones below: we cannot deduce the laws of a higher layer by starting at a lower level of complexity. The reductionist program meets a brick wall; progress must be carved out at each specific level of complexity. There are theories of things, not a theory of everything.
