The End of Theoretical Physics As We Know It

Theoretical physics has a reputation for being complicated. I beg to differ. That we are able to write down natural laws in mathematical form at all means that the laws we deal with are simple — much simpler than those of other scientific disciplines.

Unfortunately, actually solving those equations is often not so simple. For example, we have a perfectly fine theory that describes the elementary particles called quarks and gluons, but no one can calculate how they come together to make a proton. The equations just can’t be solved by any known methods. Similarly, a merger of black holes or even the flow of a mountain stream can be described in deceptively simple terms, but it’s hideously difficult to say what’s going to happen in any particular case.

Of course, we are relentlessly pushing the limits, searching for new mathematical strategies. But in recent years much of the pushing has come not from more sophisticated math but from more computing power.

This article first appeared on QuantaMagazine.org on August 27, 2018, by Contributing Columnist Sabine Hossenfelder.

Quantized Columns: a regular column in which top researchers explore the process of discovery. This month’s columnist, Sabine Hossenfelder, is a theoretical physicist based at the Frankfurt Institute for Advanced Studies in Frankfurt, Germany. She is the author of Lost in Math: How Beauty Leads Physics Astray.

When the first math software became available in the 1980s, it didn’t do much more than save someone a search through enormous printed lists of solved integrals. But once physicists had computers at their fingertips, they realized they no longer had to solve the integrals in the first place; they could just plot the solution.
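As a minimal sketch of what “just plot it” means in practice (the particular function and tools here are only illustrative): the sine integral Si(x), the integral of sin(t)/t from 0 up to x, has no elementary closed form and once had to be looked up in printed tables, yet a few lines of Python evaluate and plot it directly.

```python
# Illustrative sketch: evaluate and plot an integral numerically instead of
# solving it in closed form. Uses NumPy, SciPy and Matplotlib.
import numpy as np
import matplotlib.pyplot as plt
from scipy.integrate import quad

def integrand(t):
    # sin(t)/t, written via np.sinc so the t = 0 point is handled cleanly
    # (np.sinc(x) = sin(pi*x)/(pi*x)).
    return np.sinc(t / np.pi)

xs = np.linspace(0.0, 20.0, 400)
si = [quad(integrand, 0.0, x)[0] for x in xs]  # quad returns (value, error estimate)

plt.plot(xs, si)
plt.xlabel("x")
plt.ylabel("Si(x)")
plt.title("Sine integral, plotted without ever 'solving' it")
plt.show()
```

The point is not this particular function but the workflow: the computer replaces the table of integrals.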

In the 1990s, many physicists opposed this “just plot it” approach. Many were not trained in computer analysis, and sometimes they couldn’t tell physical effects from coding artifacts. Maybe this is why I recall many seminars in which a result was dismissed as “merely numerical.” But over the past two decades, this attitude has markedly shifted, not least thanks to a new generation of physicists for whom coding is a natural extension of their mathematical skill.

Accordingly, theoretical physics now has many subdisciplines dedicated to computer simulations of real-world systems, studies that would just not be possible any other way. Computer simulations are what we now use to study the formation of galaxies and supergalactic structures, to calculate the masses of particles that are composed of several quarks, to find out what goes on in the collision of large atomic nuclei, and to understand solar cycles, to name but a few areas of research that are mainly computer based.

The next step of this shift away from purely mathematical modeling is already on the way: Physicists now custom design laboratory systems that stand in for other systems which they want to better understand. They observe the simulated system in the lab to draw conclusions about, and make predictions for, the system it represents.

The best example may be the research area that goes by the name “quantum simulations.” These are systems composed of interacting, composite objects, like clouds of atoms. Physicists manipulate the interactions among these objects so the system resembles an interaction among more fundamental particles. For example, in circuit quantum electrodynamics, researchers use tiny superconducting circuits to simulate atoms, and then study how these artificial atoms interact with photons. Or in a lab in Munich, physicists use a superfluid of ultra-cold atoms to settle the debate over whether Higgs-like particles can exist in two dimensions of space (the answer is yes).
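To give one such mapping a concrete face (the formula below is the standard textbook model, not a description of any particular experiment mentioned above): an “artificial atom” exchanging energy with photons in a circuit resonator is commonly described by the Jaynes-Cummings Hamiltonian,

$$
H = \hbar\omega_c\, a^{\dagger}a \;+\; \tfrac{1}{2}\hbar\omega_a\,\sigma_z \;+\; \hbar g\left(a^{\dagger}\sigma_- + a\,\sigma_+\right),
$$

where the first term counts photons in the resonator at frequency ω_c, the second describes the two-level artificial atom at frequency ω_a, and g sets how strongly the two trade quanta of energy. Tuning g and the frequencies is what lets the circuit stand in for a real atom interacting with light.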

[SCIET Dynamics Note: Rather than using quantum rules to simulate interactions, the rules of the SCIET will be used to generate ongoing, evolving particles, giving the simulation a full feature set that includes postulates for space and time related to the formation of all particles.]

These simulations are not only useful to overcome mathematical hurdles in theories we already know. We can also use them to explore consequences of new theories that haven’t been studied before and whose relevance we don’t yet know.

This is particularly interesting when it comes to the quantum behavior of space and time itself — an area where we still don’t have a good theory. In a recent experiment, for example, Raymond Laflamme, a physicist at the Institute for Quantum Computing at the University of Waterloo in Ontario, Canada, and his group used a quantum simulation to study so-called spin networks, structures that, in some theories, constitute the fundamental fabric of space-time. And Gia Dvali, a physicist at the University of Munich, has proposed a way to simulate the information processing of black holes with ultracold atom gases.

A similar idea is being pursued in the field of analogue gravity, where physicists use fluids to mimic the behavior of particles in gravitational fields. Black hole space-times have attracted the bulk of attention, as with Jeff Steinhauer’s (still somewhat controversial) claim of having measured Hawking radiation in a black-hole analogue. But researchers have also studied the rapid expansion of the early universe, called “inflation,” with fluid analogues for gravity.
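The trick behind these fluid analogues can be stated compactly. Long-wavelength sound in a moving fluid propagates as if on a curved space-time whose “acoustic metric” has, schematically, the form

$$
ds^2 \propto \frac{\rho}{c_s}\Big[ -\left(c_s^2 - v^2\right)dt^2 \;-\; 2\,\vec{v}\cdot d\vec{x}\,dt \;+\; d\vec{x}\cdot d\vec{x} \Big],
$$

where ρ is the fluid density, c_s the local speed of sound and v the flow velocity. Wherever the flow outruns the sound speed, sound waves can no longer escape upstream and an acoustic horizon forms, the analogue of a black hole’s event horizon probed in experiments like Steinhauer’s.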

In addition, physicists have studied hypothetical fundamental particles by observing stand-ins called quasiparticles. These quasiparticles behave like fundamental particles, but they emerge from the collective movement of many other particles. Understanding their properties allows us to learn more about their behavior, and thereby might also help us find ways of observing the real thing.

This line of research raises some big questions. First of all, if we can simulate what we now believe to be fundamental by using composite quasiparticles, then maybe what we currently think of as fundamental — space and time and the 25 particles that make up the Standard Model of particle physics — is made up of an underlying structure, too. Quantum simulations also make us wonder what it means to explain the behavior of a system to begin with. Does observing, measuring, and making a prediction by use of a simplified version of a system amount to an explanation?

But for me, the most interesting aspect of this development is that it ultimately changes how we do physics. With quantum simulations, the mathematical model is of secondary relevance. We currently use the math to identify a suitable system because the math tells us what properties we should look for. But that’s not, strictly speaking, necessary. Maybe, over the course of time, experimentalists will just learn which system maps to which other system, as they have learned which system maps to which math. Perhaps one day, rather than doing calculations, we will just use observations of simplified systems to make predictions.

At present, I am sure, most of my colleagues would be appalled by this future vision. But in my mind, building a simplified model of a system in the laboratory is conceptually not so different from what physicists have been doing for centuries: writing down simplified models of physical systems in the language of mathematics.

 
