Month: September 2018

A New Test for the Leading Big Bang Theory

Cosmologists have predicted the existence of an oscillating signal that could distinguish
between cosmic inflation and alternative theories of the universe’s birth.

The leading hypothesis about the universe’s birth — that a quantum speck of space became energized and inflated in a split second, creating a baby cosmos — solves many puzzles and fits all observations to date. Yet this “cosmic inflation” hypothesis lacks definitive proof. Telltale ripples that should have formed in the inflating spatial fabric, known as primordial gravitational waves, haven’t been detected in the geometry of the universe by the world’s most sensitive telescopes. Their absence has fueled underdog theories of cosmogenesis in recent years. And yet cosmic inflation is wriggly. In many variants of the idea, the sought-after ripples would simply be too weak to observe.

“The question is whether one can test the entire [inflation] scenario, not just specific models,” said Avi Loeb, an astrophysicist and cosmologist at Harvard University. “If there is no guillotine that can kill off some theories, then what’s the point?”

[Note from SCIET Dynamics: The “Big Bang” observations can be accounted for differently. Rather than a speck of matter, the origin was an intrusion of consciousness that expressed an energy from the center to the edge, which then diffused throughout the area defined by the radius of the expression. The SCIET algorithm requires specific conditions to do this, but these conditions existed and continue to exist. This concept is not about inflation, but about consolidation within a limited space defined by the original expression. It is also necessary to dispel the idea that all the matter in the universe existed before the Big Bang, and instead to embrace the idea that matter is created after the expression through the resonance of nonmaterial points with one another. This resonance continues at the heart of all matter today.]

In a new paper that appeared on the physics preprint site, arxiv.org, on Sunday, Loeb and two Harvard colleagues, Xingang Chen and Zhong-Zhi Xianyu, suggested such a guillotine. The researchers predicted an oscillatory pattern in the distribution of matter throughout the cosmos that, if detected, could distinguish between inflation and alternative scenarios — particularly the hypothesis that the Big Bang was actually a bounce preceded by a long period of contraction.

The paper has yet to be peer-reviewed, but Will Kinney, an inflationary cosmologist at the University at Buffalo and a visiting professor at Stockholm University, said “the analysis seems correct to me.” He called the proposal “a very elegant idea.”

“If the signal is real and observable, it would be very interesting,” Sean Carroll of the California Institute of Technology said in an email.

Any potential hints about the Big Bang are worth looking for, but the main question, according to experts, is whether the putative oscillatory pattern will be strong enough to detect. It might not be a clear-cut guillotine as advertised.

If it does exist, the signal would appear in density variations across the universe. Imagine taking a giant ice cream scoop to the sky and counting how many galaxies wind up inside. Do this many times all over the cosmos, and you’ll find that the number of scooped-up galaxies will vary above or below some average. Now increase the size of your scoop. When scooping larger volumes of universe, you might find that the number of captured galaxies now varies more extremely than before. As you use progressively larger scoops, according to Chen, Loeb and Xianyu’s calculations, the amplitude of matter density variations should oscillate between more and less extreme as you move up the scales. “What we showed,” Loeb explained, is that from the form of these oscillations, “you can tell if the universe was expanding or contracting when the density perturbations were produced” — reflecting an inflationary or bounce cosmology, respectively.
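To make the scoop analogy concrete, here is a minimal Python sketch (my own illustration, not from the paper) that counts points in windows of increasing size and reports how strongly the counts fluctuate at each scale; the random positions, scoop sizes and units are invented for this toy. The oscillation Chen, Loeb and Xianyu predict would appear as a modulation of this fluctuation amplitude as the scoop grows.

```python
# Toy version of the "ice cream scoop" measurement: tally hypothetical galaxy
# positions in bins (scoops) of increasing size and track the fractional
# scatter of the counts. Real analyses work with the matter power spectrum;
# this only mimics the counting exercise described above.
import numpy as np

rng = np.random.default_rng(0)
positions = rng.uniform(0.0, 1000.0, size=20_000)   # made-up 1-D "galaxy" positions

for scoop in (10, 25, 50, 100):                      # scoop sizes, arbitrary units
    edges = np.arange(0.0, 1000.0 + scoop, scoop)
    counts, _ = np.histogram(positions, bins=edges)
    scatter = counts.std() / counts.mean()           # fractional variation at this scale
    print(f"scoop size {scoop:4d}: fractional variation {scatter:.3f}")
```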

Regardless of which theory of cosmogenesis is correct, cosmologists believe that the density variations observed throughout the cosmos today were almost certainly seeded by random ripples in quantum fields that existed long ago.

Because of quantum uncertainty, any quantum field that filled the primordial universe would have fluctuated with ripples of all different wavelengths. Periodically, waves of a certain wavelength would have constructively interfered, forming peaks — or equivalently, concentrations of particles. These concentrations later grew into the matter density variations seen on different scales in the cosmos today.

But what caused the peaks at a particular wavelength to get frozen into the universe when they did? According to the new paper, the timing depended on whether the peaks formed while the universe was exponentially expanding, as in inflation models, or while it was slowly contracting, as in bounce models.

If the universe contracted in the lead-up to a bounce, ripples in the quantum fields would have been squeezed. At some point the observable universe would have contracted to a size smaller than ripples of a certain wavelength, like a violin whose resonant cavity is too small to produce the sounds of a cello. When the too-large ripples disappeared, whatever peaks, or concentrations of particles, existed at that scale at that moment would have been “frozen” into the universe. As the observable universe shrank further, ripples at progressively smaller and smaller scales would have vanished, freezing in as density variations. Ripples of some sizes might have been constructively interfering at the critical moment, producing peak density variations on that scale, whereas slightly shorter ripples that disappeared a moment later might have frozen out of phase. These are the oscillations between high and low density variations that Chen, Loeb and Xianyu argue should theoretically show up as you change the size of your galaxy ice cream scoop.

These oscillations would also arise if instead the universe experienced a period of rapid inflation. In that case, as it grew bigger and bigger, it would have been able to fit quantum ripples with ever larger wavelengths. Density variations would have been imprinted on the universe at each scale at the moment that ripples of that size were able to form.

The authors argue that a qualitative difference between the forms of oscillations in the two scenarios will reveal which one occurred. In both cases, it was as if the quantum field put tick marks on a piece of tape as it rushed past — representing the expanding or contracting universe. If space were expanding exponentially, as in inflation, the tick marks imprinted on the universe by the field would have grown farther and farther apart. If the universe contracted, the tick marks should have become closer and closer together as a function of scale. Thus Chen, Loeb and Xianyu argue that the changing separation between the peaks in density variations as a function of scale should reveal the universe’s evolutionary history. “We can finally see whether the primordial universe was actually expanding or contracting, and whether it did it inflationarily fast or extremely slowly,” Chen said.

Video: David Kaplan explores the leading cosmological explanation for the origin of the universe.

Filming by Petr Stepanek. Editing and motion graphics by MK12. Music by Pete Calandra and Scott P. Schreer.

Exactly what the oscillatory signal might look like, and how strong it might be, depend on the unknown nature of the quantum fields that might have created it. Discovering such a signal would tell us about those primordial cosmic ingredients. As for whether the putative signal will show up at all in future galaxy surveys, “the good news,” according to Kinney, is that the signal is probably “much, much easier to detect” than other searched-for signals called “non-gaussianities”: triangles and other geometric arrangements of matter in the sky that would also verify and reveal details of inflation. The bad news, though, “is that the strength and the form of the signal depend on a lot of things you don’t know,” Kinney said, such as constants whose values might be zero, and it’s entirely possible that “there will be no detectable signal.”

Posted by Sc13t4 in Astrophysics, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
A Short History of the Missing Universe

The cosmos plays hide-and-seek. Sometimes, though, even when astronomers have a hunch for where their prey might hide, it can take them decades of searching to confirm it. The case of the universe’s missing matter — a case that appears to now be closed, as I reported earlier this month — is one such instance. To me, it is a fascinating tale in which clever cosmological models drew a treasure map that took 20 years to explore.

The concept of matter in SCIET Dynamics is related to the formatting of space at the time of the FIRST ACTION, a moment when a massive burst of energy was distributed throughout space. In fact, this burst defined SPACE, and its definition was made of the energy of the original burst. Matter was created from this, and so the remaining energy is the missing matter. SPACETIME has Mass.

Scientists knew back in the 1980s that they could observe only a fraction of the atomic matter — or baryons — in the universe. (Today we know that all baryons taken together are thought to make up about 5 percent of the universe — the rest is dark energy and dark matter.) They knew that if they counted up all the stuff they could see in the universe — stars and galaxies, for the most part — the bulk of the baryons would be missing.

But exactly how much missing matter there was, and where it might be hiding, were questions that started to sharpen in the 1990s. Around that time, astronomer David Tytler of the University of California, San Diego, came up with a way to measure the amount of deuterium in the light of distant quasars — the bright cores of galaxies with active black holes at their center — using the new spectrograph at the Keck telescope in Hawaii. Tytler’s data helped researchers understand just how many baryons were missing in today’s universe once all the visible stars and gas were accounted for: a whopping 90 percent.

These results set off a firestorm of controversy, fanned in part by Tytler’s personality. “He [insisted] he was right in spite of, at the time, a lot of seemingly contradictory evidence, and basically said everyone else was a bunch of idiots who didn’t know what they were doing,” said Romeel Dave, an astronomer at the University of Edinburgh. “Turns out, of course, he was right.”

Then in 1998, Jeremiah Ostriker and Renyue Cen, Princeton University astrophysicists, released a seminal cosmological model that tracked the history of the universe from its beginnings. The model suggested that the missing baryons were likely wafting about in the form of diffuse (and at the time undetectable) gas between galaxies.

As it happens, Dave could have been the first to tell the world where the baryons were, beating Ostriker and Cen. Months before their paper came out, Dave had finished his own set of cosmological simulations, which were part of his Ph.D. work at the University of California, Santa Cruz. His thesis on the distribution of baryons suggested that they might be lurking in the warm plasma between galaxies. “I didn’t really appreciate the result for what it was,” said Dave. “Oh well, win some, lose some.”

Dave continued to work on the problem in the years to follow. He envisioned the missing matter as hiding in ghostly threads of extremely hot and very diffuse gas that connect galaxy pairs. In astro-speak, this became the “warm-hot intergalactic medium,” or WHIM, a term that Dave coined.

Many astronomers continued to suspect that there might be some very faint stars in the outskirts of galaxies that could account for a significant chunk of the missing matter. But after many decades of searching, the number of baryons in stars, even the faintest ones that could be seen, amounted to no more than 20 percent.

More and more sophisticated instruments came online. In 2003, the Wilkinson Microwave Anisotropy Probe measured the universe’s baryon density as it stood some 380,000 years after the Big Bang. It turned out to be the same density as indicated by the cosmological models. A decade later, the Planck satellite confirmed the number.

With the eventual failure to find hidden stars and galaxies that might be holding the missing matter, “attention turned toward gas in between the galaxies — the intergalactic medium distributed over billions of light years of low-density intergalactic space,” said Michael Shull, an astrophysicist at University of Colorado, Boulder. He and his team began searching for the WHIM by studying its effects on the light from distant quasars. Atoms of hydrogen, helium and heavier elements such as oxygen absorb the ultraviolet and X-ray radiation from these quasar lighthouses. The gas “steals a portion of light from the beam,” said Shull, leaving a deficit of light — an absorption line. Find the lines, and you’ll find the gas.

The most prominent absorption lines of hydrogen and ionized oxygen are at very short wavelengths, in the ultraviolet and X-ray portions of the spectrum. Unfortunately for astronomers (but fortunately for the rest of life on Earth), our atmosphere blocks these rays. In part to solve the missing matter problem, astronomers launched X-ray satellites to map this light. With the absorption line method, Shull said, scientists eventually “accounted for most, if not all, of the predicted baryons that were cooked up in the hot Big Bang.”

Other teams took different approaches, looking for the missing baryons indirectly. As my story from last week shows, three teams, including Shull’s, are now saying that all the baryons are accounted for.

But the WHIM is so faint, and the matter so diffuse, that it’s hard to definitely close the case. “Over the years, there have been many exchanges among researchers arguing for or against possible detections of the warm-hot intergalactic medium,” said Kenneth Sembach, director of the Space Telescope Science Institute in Baltimore. “I suspect there will be many more. The recent papers appear to be another piece in this complex and interesting cosmic puzzle. I’m sure there will be more pieces to come, and associated debates about how best to fit these pieces together.”

https://www.quantamagazine.org/a-short-history-of-the-missing-universe-20180919/

Posted by Sc13t4 in Astrophysics, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
The Puzzle of the First Black Holes

IN BRIEF

  • In the very distant, ancient universe, astronomers can see quasars—extremely bright objects powered by enormous black holes. Yet it is unclear how black holes this large could have formed so quickly after the big bang.
  • To solve the mystery, scientists proposed a novel mechanism for black hole formation. Rather than being born in the deaths of massive stars, the seeds of the most ancient supermassive black holes might have collapsed directly from gas clouds.
  • Astronomers may be able to find evidence for direct-collapse black holes using the James Webb Space Telescope, due to launch in 2019, which should see farther back in space and time than any instrument before it.

[SCIET Dynamics Note] SCIET regards Black Holes as openings to the original Void, revealed by the energy of vortex motion from the spinning disk of matter. Space is “sticky”; it adheres to itself because it consists of layers of energetic interactions between equidistant polarized regions, which exist at all units of distance. At the same time, all of these units descend from the original first action (the “big bang”), meaning that they are restrained from changing faster than the space around them, or faster than the original first action (the first change).

Image Credit: Mark Ross. The illustration shows the black hole as understood today; the idea that the black hole is the source of gravity that attracts all the matter around it may be mistaken. In SCIET Dynamics it is viewed as a “portal”: the black hole is actually an opening in the fabric of space, created by the mass swirling around it, that is the basis of all the physical effects associated with it. Could we tell the difference? If it is a vortex of matter, then it would indeed create a “hole”, just as a whirlpool or tornado creates a hole, and the power it generates is concentrated in the matter at the edge of the hole.

By Priyamvada Natarajan on February 1, 2018 from Scientific American

Imagine the universe in its infancy. Most scientists think space and time originated with the big bang. From that hot and dense start the cosmos expanded and cooled, but it took a while for stars and galaxies to start dotting the sky. It was not until about 380,000 years after the big bang that atoms could hold together and fill the universe with mostly hydrogen gas. When the cosmos was a few hundred million years old, this gas coalesced into the earliest stars, which formed in clusters that clumped together into galaxies, the oldest of which appears 400 million years after the universe was born. To their surprise, scientists have found that another class of astronomical objects begins to appear at this point, too: quasars.

Quasars are extremely bright objects powered by gas falling onto supermassive black holes. They are some of the most luminous things in the universe, visible out to the farthest reaches of space. The most distant quasars are also the most ancient, and the oldest among them pose a mystery.

To be visible at such incredible distances, these quasars must be fueled by black holes containing about a billion times the mass of the sun. Yet conventional theories of black hole formation and growth suggest that a black hole big enough to power these quasars could not have formed in less than a billion years. In 2001, however, with the Sloan Digital Sky Survey, astronomers began finding quasars that dated back earlier. The oldest and most distant quasar known, which was reported last December, existed just 690 million years after the big bang. In other words, it does not seem that there had been enough time in the history of the universe for quasars like this one to form.

Many astronomers think that the first black holes—seed black holes—are the remnants of the first stars, corpses left behind after the stars exploded into supernovae. Yet these stellar remnants should contain no more than a few hundred solar masses. It is difficult to imagine a scenario in which the black holes powering the first quasars grew from seeds this small.

To solve this quandary, a decade ago some colleagues and I proposed a way that seed black holes massive enough to explain the first quasars could have formed without the birth and death of stars. Instead these black hole seeds would have formed directly from gas. We call them direct-collapse black holes (DCBHs). In the right environments, direct-collapse black holes could have been born at 10^4 or 10^5 solar masses within a few hundred million years after the big bang. With this head start, they could have easily grown to 10^9 or 10^10 solar masses, thereby producing the ancient quasars that have puzzled astronomers for nearly two decades.

The question is whether this scenario actually happened. Luckily, when the James Webb Space Telescope (JWST) launches in 2019, we should be able to find out.

THE FIRST SEEDS

Black holes are enigmatic astronomical objects, areas where the gravity is so immense that it has warped spacetime so that not even light can escape. It was not until the detection of quasars, which allow astronomers to see the light emitted by matter falling into black holes, that we had evidence that they were real objects and not just mathematical curiosities predicted by Einstein’s general theory of relativity.

Most black holes are thought to form when very massive stars—those with more than about 10 times the mass of the sun—exhaust their nuclear fuel and begin to cool and therefore contract. Eventually gravity wins, and the star collapses, igniting a cataclysmic supernova explosion and leaving behind a black hole. Astronomers have traditionally assumed that most of the black holes powering the first quasars formed this way, too. They could have been born from the demise of the universe’s first stars (Population III stars), which we think formed when primordial gas cooled and fragmented about 200 million years after the big bang. Population III stars were probably more massive than stars born in the later universe, which means they could have left behind black holes as hefty as several hundred solar masses. These stars also probably formed in dense clusters, so it is likely that the black holes created on their deaths would have merged, giving rise to black holes of several thousand solar masses. Even black holes this large, however, are far smaller than the masses needed to power the ancient quasars.

Theories also suggest that so-called primordial black holes could have arisen even earlier in cosmic history, when spacetime may have been expanding exponentially in a process called inflation. Primordial black holes could have coalesced from tiny fluctuations in the density of the universe and then grown as the universe expanded. Yet these seeds would weigh only between 10 and 100 solar masses, presenting the same problem as Population III remnants.

As an explanation for the first quasars, each of these pathways for the formation of black hole seeds has the same problem: the seeds would have to grow extraordinarily quickly within the first billion years of cosmic history to create the earliest quasars. And what we know about the growth of black holes tells us that this scenario is highly unlikely.

The SCIET approach is much simpler. The original Black Holes are portals that are now “receivers” for newer Black Holes, and the matter spewing out of them is simply being “portaled” there from the newer ones. The concept of portals in SCIET Dynamics requires that an opening in the fabric of space cannot accept matter into a different frequency, since the rule that like interacts with like forces it to interact with a matching frequency regardless of physical proximity. Thus the event horizon of a black hole matches the event horizon of another black hole, and older ones exist at a slightly lower frequency.

FEEDING A BLACK HOLE

Our current understanding of physics suggests that there is an optimal feeding rate, known as the Eddington rate, at which black holes gain mass most efficiently. A black hole feeding at the Eddington rate would grow exponentially, doubling in mass every 10^7 years or so. To grow to 10^9 solar masses, a black hole seed of 10 solar masses would have to gobble stars and gas unimpeded at the Eddington rate for a billion years. It is hard to explain how an entire population of black holes could continuously feed so efficiently.
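As a back-of-the-envelope check of this timing argument, the sketch below (my own illustration, not a calculation from the article) counts how many mass doublings a seed needs to reach 10^9 solar masses and multiplies by an assumed doubling time. Both the seed masses and the doubling times are placeholder values; the true figure depends on the assumed radiative efficiency of accretion.

```python
# Rough growth-time estimate for Eddington-limited feeding: time = number of
# mass doublings x assumed doubling time. Seed masses and doubling times here
# are illustrative assumptions, not values taken from the article.
import math

def growth_time_yr(seed_msun: float, target_msun: float, t_double_yr: float) -> float:
    """Years of uninterrupted Eddington-rate feeding needed to reach target mass."""
    doublings = math.log2(target_msun / seed_msun)
    return doublings * t_double_yr

for t_double in (1e7, 3e7):                  # assumed doubling times in years
    for seed in (1e1, 1e5):                  # stellar-remnant seed vs. direct-collapse seed
        t = growth_time_yr(seed, 1e9, t_double)
        print(f"doubling {t_double:.0e} yr, seed {seed:.0e} Msun "
              f"-> 1e9 Msun in ~{t / 1e6:.0f} Myr")
```

Whatever doubling time one assumes, the much larger direct-collapse seed roughly halves the number of doublings required, which is the "head start" the article describes.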

In effect, if the first quasars grew from Population III black hole seeds, they would have had to eat faster than the Eddington rate. Surpassing that rate is theoretically possible under special circumstances in dense, gas-rich environments, and these conditions may have been available in the early universe, but they would not have been common, and they would have been short-lived. Furthermore, exceptionally fast growth can actually cause “choking,” where the radiation emitted during these super-Eddington episodes could disrupt and even stop the flow of mass onto the black hole, halting its growth. Given these restrictions, it seems that extreme feasting could account for a few freak quasars, but it cannot explain the existence of the entire detected population unless our current understanding of the Eddington rate and black hole feeding process is wrong.

Thus, we must wonder whether the first black hole seeds could have formed through other channels. Building on the work of several other research groups, my collaborator Giuseppe Lodato and I published a set of papers in 2006 and 2007 in which we proposed a novel mechanism that could have produced more massive black hole seeds from the get-go. We started with large, pristine gas disks that might otherwise have cooled and fragmented to give rise to stars and become galaxies. We showed that it is possible for these disks to circumvent this conventional process and instead collapse into dense clumps that form seed black holes weighing 10^4 to 10^6 solar masses. This outcome can occur if something interferes with the normal cooling process that leads to star formation and instead drives the entire disk to become unstable, rapidly funneling matter to the center, much like water flowing down a bathtub drain when you pull the plug.

Disks cool down more efficiently if their gas includes some molecular hydrogen—two hydrogen atoms bonded together—rather than atomic hydrogen, which consists of only one atom. But if radiation from stars in a neighboring galaxy strikes the disk, it can destroy molecular hydrogen and turn it into atomic hydrogen, which suppresses cooling, keeping the gas too hot to form stars. Without stars, this massive irradiated disk could become dynamically unstable, and matter would quickly drain into its center, rapidly driving the production of a massive, direct-collapse black hole. Because this scenario depends on the presence of nearby stars, we expect DCBHs to typically form in satellite galaxies that orbit around larger parent galaxies where Population III stars have already formed.

Simulations of gas flows on large scales, as well as the physics of small-scale processes, support this model for DCBH formation. Thus, the idea of very large initial seeds appears feasible in the early universe. And starting with seeds in this range alleviates the timing problem for the production of the supermassive black holes that power the brightest, most distant quasars.

LOOKING FOR PROOF

But just because DCBH seeds are feasible does not mean they actually exist. To find out, we must search for observational evidence. These objects would appear as bright, miniature quasars shining through the early universe. They should be detectable during a special phase when the seed merges with the parent galaxy—and this process should be common, given that DCBHs probably form in satellites orbiting larger galaxies. A merger would give the black hole seed a copious new source of gas to eat, so the black hole should start growing rapidly. In fact, it would briefly turn into a special kind of quasar that outshines all the stars in the galaxy.

Credit: Amanda Montañez

These black holes will not only be brighter than their surrounding stars, they will also be heavier—a reversal of the usual order of things. In general, the stars in a galaxy outweigh the central black holes by about a factor of 1,000. After the galaxy hosting the DCBH merges with its parent galaxy, however, the mass of the growing black hole will briefly exceed that of the stars. Such an object, called an obese black hole galaxy (OBG), should have a very special spectral signature, particularly in the infrared wavelengths between one and 30 microns where the JWST’s Mid-Infrared Instrument (MIRI) and Near-Infrared Camera (NIRCam) will operate. This telescope will be the most powerful tool astronomers have ever had for peering into the earliest stages of cosmic history. If the telescope detects these obese black hole galaxies, it will provide strong evidence for our DCBH theory. Traditional black hole seeds, on the other hand, which derive from dead stars, are likely to be too faint for the JWST or other telescopes to see.

It is also possible that we might find other evidence for our theory. In the rare case that the parent galaxy that merges with the DCBH also hosts a central black hole, the two holes will collide and release powerful gravitational waves. These waves could be detectable by the Laser Interferometer Space Antenna (LISA), a European Space Agency/NASA mission expected to fly in the 2030s.

A FULLER PICTURE

It is entirely possible that the DCBH scenario and small seeds feeding at super-Eddington rates both occurred in the early universe. In fact, the initial black hole seeds probably formed via both these pathways. The question is, Which channel created the bulk of the bright ancient quasars that astronomers see? Solving this mystery could do more than just clear up the timeline of the early cosmos. Astronomers also want to understand more broadly how supermassive black holes affect the larger galaxies around them.

Data suggest that central black holes might play an important role in adjusting how many stars form in the galaxies they inhabit. For one thing, the energy produced when matter falls into the black hole may heat up the surrounding gas at the center of the galaxy, thus preventing cooling and halting star formation. This energy may even have far-reaching effects outside the galactic center by driving energetic jets of radiation outward. These jets, which astronomers can detect in radio wavelengths, could also heat up gas in outer regions and shut down star formation there. These effects are complex, however, and astronomers want to understand the details more clearly. Finding the first seed black holes could help reveal how the relation between black holes and their host galaxies evolved over time.

These insights fit into a larger revolution in our ability to study and understand all masses of black holes. When the Laser Interferometer Gravitational-Wave Observatory (LIGO) made the first detection of gravitational waves in 2015, for instance, scientists were able to trace them back to two colliding black holes weighing 36 and 29 solar masses, the lightweight cousins of the supermassive black holes that power quasars. The project continues to detect waves from similar events, offering new and incredible details about what happens when these black holes crash and warp the spacetime around them. Meanwhile a project called the Event Horizon Telescope aims to use radio observatories scattered around Earth to image the supermassive black hole at the center of the Milky Way. Scientists hope to spot a ringlike shadow around the black hole’s boundary that general relativity predicts will occur as the hole’s strong gravity deflects light. Any deviations the Event Horizon Telescope measures from the predictions of general relativity have the potential to challenge our understanding of black hole physics. In addition, experiments looking at pulsing stars called pulsar timing arrays could also detect tremors in spacetime caused by an accumulated signal of many collisions of black holes. And very soon the JWST will open up an entirely new window on the very first black holes to light up the universe.

Many revelations are in store in the very near future, and our understanding of black holes stands to be transformed.

This article was originally published with the title “The First Monster Black Holes”

Posted by Sc13t4 in Astrophysics, Consciousness, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
Holofractal

Introduction to Holofractal.net by Billy Carson

In this blog I explore the various philosophical, spiritual and theoretical implications of a holofractographic worldview, with a focus on its relation to consciousness, biology, mysticism, physics and cosmology. Combined, these topics may be ascribed to the field of Cosmometry – the art of measuring the universe. Cosmometry is the study of the geometry of nature, and of the basic processes, patterns, structures and principles that dictate all creation.

Introduction from Holofractal.net

Nature’s guiding principle

The fractal-holographic universe is a geometric understanding of reality and thus represents a divergence from the assumption of a universe composed of subatomic particles towards a recognition of nature’s underlying patterns. Both the inner and the outer world can thus be described as pattern-based systems through geometric shape, proportion or principle.

Geometry is the purest expression of mathematics and it communicates to both our emotional and logical mind – our right and left brain hemispheres. In this sense, geometry is an effective tool that allows us to comprehend both intellectually and emotionally the deeper universal principles fundamental to reality and our communion with Nature. Simultaneously, geometry is a concrete, mathematical way to formulate functional physics applicable through all forms of science and technology.

A basic premise in our explorations is that reality is a fractal-holographic phenomenon that arises in the synergetic interplay between dynamical and absolute energy. As such, dynamical energy is equivalent to conventional physics and consciousness – our familiar everyday reality, including everything between heaven and earth. Absolute energy, however, is pure potential, perfect balance, unbounded existence, and appertains to metaphysics – a transdimensional reality beyond space and time. Through a fractal-holographic model we describe the synergetic interactions between these and how cosmological and existential principles result in a continuous creation-process at all levels of being. This blog covers all of these aspects of nature and how they together conduct the cosmic symphony!

The Elements of a Fractal-Holographic Universe

What does “fractal-holographic” mean?

The romanesco cauliflower exhibits a distinct fractal geometry

Fractal simply means that the same basic pattern is repeated on all scales. Fractals are commonplace in nature and they’re particularly visible in organic growth and crystalline forms. Here’s an example:
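To complement that example, here is a tiny Python sketch (added for illustration, not part of the original post) that prints a Sierpinski-style triangle, a textbook fractal in which the same triangular motif recurs at every scale.

```python
# Print a right-angled Sierpinski triangle: cell (x, y) is filled exactly when
# the binary representations of x and y share no set bits, which reproduces
# Pascal's triangle modulo 2 and hence the fractal's self-similar pattern.
size = 32
for y in range(size):
    print("".join("*" if (x & y) == 0 else " " for x in range(size)))
```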

Holographic means that the whole is represented at every point within a certain system. In holographic photography, information about the whole object is stored at every point of the holographic plate by means of light-based interference patterns. Several emerging models of contemporary cosmology attempt to describe our universe holographically. This animation depicts a three-dimensional figure stored on a two-dimensional surface:

A holographic plate contains the totality of the picture in every point.

Consciousness – Energy – Information

It wasn’t until the advent of quantum mechanics and Heisenberg’s principle of uncertainty that we began to consider consciousness and energy as mutually intertwined. Currently there’s an increasing recognition across the sciences of the need to define the phenomenon of consciousness within the framework of an astrophysical model; the observer is an integral point of reference for any theory seeking to formulate meaningful physics. For the planet’s many wisdom traditions, however, this has been a basic premise for thousands of years: consciousness and energy as one. We’ll explore this field in more detail by analyzing the various principles fundamental to both energy and consciousness in their various forms. For all intents and purposes, within this blog the two are regarded as synonyms, yet we distinguish between two aspects of consciousness/energy: its absolute and its dynamical form. From the holographic perspective we may also understand both energy and consciousness as information. In its most elemental design, reality can be considered abstract information, and from this perspective the human form and consciousness can be understood as boundlessly intertwined with space-time itself.

Man is a pattern in space-time rather than a separate particle.

Dynamic Energy

Dynamic energy is the reality of our everyday life: we have a physical body and a life in this world. Everything we sense, think and feel is fluidly dynamic energy, and this energy is perpetually in a state of expansion or contraction – on the way to or away from a point of stillness and balance. We can understand the geometry of this flow as a torus wherein information is circulated and recycled, absorbed and radiated in a continuous feedback loop. In a fractal-holographic universe this self-reflective dynamic is the foundation of all information processing, perception and conscious experience, and thus the premise of all forms of systemic evolution. The torus is visible in all independently organized energy systems in the universe, from atoms to galaxies to humans, and represents energy in its dynamic aspect.

The Toroidal Universe

“The self in a toroidal Universe can be both separate and connected with everything else.” – Arthur M. Young

The torus is unique in the sense that:
– It is centered on a point of stillness (singularity)
– It has a vertical axis of rotation at its center
– It receives and transmits energy
– It is self-recycling and self-powered, made of the same medium it acts within
– It distributes and shares energy through itself and its surroundings
– It exists both as an independent entity and as an integrated part of a larger whole

Absolute Energy

Fundamental to all matter and conscious experience is a unified field of absolute energy. In astrophysics this field is called the vacuum, the zero-point field or simply the singularity. Mathematical calculations indicate that it possesses an inexhaustible energy potential. We may understand the singularity in a metaphysical sense as Infinity itself in a state of total balance, or as undifferentiated consciousness. Although metaphysical in concept, it nonetheless makes itself apparent as the guiding principle behind all tangible form. The zero-point field rests in a state of complete equilibrium and may possess enormous energy density without making itself visible through thermal radiation or distortion of space-time. Even though it’s everywhere, it remains invisible to us, much like water to the fish in the ocean. We may understand this equilibrium geometrically as the “Vector Equilibrium” – a stable geometry of omnidirectional balance. The Vector Equilibrium, or cuboctahedron, is the only geometric shape that exhibits perfect structural balance through identical vector and angular relationships. In this flawless equilibrium all gravity, radiation, temperature, pressure, thoughts and feelings cancel out, leaving behind stillness and a perfect vacuum – pure metaphysical potential.

“The vector equilibrium is the zero starting point for all happenings or nonhappenings; it is the empty theater and empty circus and empty Universe, ready to accommodate any act and any audience” – Buckminster Fuller
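The structural claim above can be checked numerically: a cuboctahedron’s twelve vertices all sit at the same distance from the center, and that distance equals the edge length. The short sketch below verifies just this radius-equals-edge property (it does not prove the shape is the only one with it); the coordinates used, the permutations of (±1, ±1, 0), are an assumption of this illustration.

```python
# Check that the cuboctahedron (vector equilibrium) has circumradius equal to
# its edge length, using vertices at all permutations of (+-1, +-1, 0).
from itertools import permutations
import math

signs = ((1, 1), (1, -1), (-1, 1), (-1, -1))
verts = {p for s in signs for p in permutations((s[0], s[1], 0))}
assert len(verts) == 12                     # the cuboctahedron has 12 vertices

radii = {round(math.dist(v, (0, 0, 0)), 9) for v in verts}
edge = min(round(math.dist(a, b), 9) for a in verts for b in verts if a != b)

print("distinct center-to-vertex distances:", radii)   # a single value: sqrt(2)
print("edge length:", edge)                             # also sqrt(2)
```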

Field Geometry

The interplay between energy in its absolute and dynamical form is the foundation of the Cosmic Symphony; all principles of creation arise in this meeting. These principles can again be understood geometrically as Phi spirals, golden proportions, fractals and holographic interference patterns. Based on these ideals, Nature creates optimal economic efficiency of energy distribution, dynamical flow, organization, individuation and natural scaling of the infinite vacuum energy, resulting in the myriad of forms we see all around us. We’ll explore all of these aspects in detail elsewhere in this blog.

The seeds of a sunflower grow in a double phi-spiral. This same pattern is also apparent in the field geometry of space-time.

Synergy and Resonance

Synergy is the mutual interaction between multiple elements in a system that produces an effect greater than the sum of their individual parts. Resonance is the ability of interacting systems to influence and reinforce each other’s natural frequency through their synchronised vibration and impulse.

We’ve now been introduced to the following three synergistic components of the fractal-holographic universe:

  • Torus dynamics – the primary form of energy in motion (swirls, spirals, vortex)
  • Vector matrix – underlying geometric vacuum structure and patterns based on perfect symmetry (vector equilibrium, cubeoctahedrons, isotropic metric, crystalline forms)
  • Field Geometry – wave interactions, resonance and holographic interference patterns created through the interaction of the vector matrix and toroidal dynamics (imagine overlapping rings in water)

Although we can differentiate these components as unique, independently arising forms, they are nevertheless aspects of a single, unified whole. Below are the geometric principles of the fractal-holographic model illustrated in its most idealized and balanced form. This form is fundamental to how absolute energy unfolds into defined shape and three-dimensional space: it is both a symbol and a concrete depiction of infinity in balance and of astrophysical principles fundamental to our universe.

Resources

The Cosmometry Project
The Cosmometry Project is an initiative of Marshall Lefferts, and its overall vision is to bring the various aspects of cosmic geometry together into an overall whole that supports our understanding and application of this knowledge. Many of the core ideas on this page originate from his work. Visit the site for more exciting information about basic cosmometric principles.

The Resonance Project
The Resonance Project, based in Hawaii, is an organization founded by Nassim Haramein and serves as a center for education and scientific research on unified field physics (Unified Field Theory) and cosmometry. Haramein’s work on the Holofractographic Universe is our main frame of reference within this blog.

Posted by Sc13t4 in Biology, Consciousness, Cosmology, Design, Theoretical Physics, 0 comments
Scientists Find Fractal Patterns and Golden Ratio Pulses in Stars

A recent article in Scientific American has reported that fractal patterns and the golden ratio have been found in outer space for the very first time. Researchers from the University of Hawaiʻi at Mānoa have been studying a specific class of stars called RR Lyrae variables using the Kepler Space Telescope. Unlike normal stars, they expand and contract, causing their brightness to change dramatically and, in so doing, creating pulsations.

But the pulsations aren’t random or arbitrary. The stars pulse in accordance with the golden mean. We have seen the golden ratio turn up in nature all the time, but this is the first time it has been identified in space.

“Unlike our Sun, RR Lyrae stars shrink and swell, causing their temperatures and brightness to rhythmically change like the frequencies or notes in a song,” Dr Lindner, the lead Researcher, explained. It’s the ratio between this swelling and shrinking that is so important.

They have been studying the pulsations of these stars, and several of them pulsate at two frequencies whose ratio is nearly identical to the Golden Ratio. These specific stars are called “Golden RR Lyrae Variables.”

“We call these stars ‘golden’ because the ratio of two of their frequency components is near the golden mean, which is an irrational number famous in art, architecture, and mathematics,” Dr Lindner said.

The Golden Mean

The Golden Mean or Ratio (1.61803398875…) is a pattern that is absolutely essential to the understanding of nature, as it’s found in everything from sunflowers, to succulents, to sea shells, and is commonly referred to in the study of Sacred Geometry.

The Golden Ratio was essential to Da Vinci’s Vitruvian Man, can be found in studying the Pyramids of Egypt and the Parthenon, and several researchers believe they have correlated it to the understanding of the human genome and unlocking the codes in our DNA.

The Golden Ratio or Divine Proportion, when plotted numerically, creates a sequence that gives rise to what we can see as a Fractal Pattern. Metaphysicians and modern physicists have, for the last 15 years, been avidly suggesting that the study of fractal patterns can lead us to a greater understanding of the Universe, and of a Unified Field within it that very likely may be at play in structuring the Universe.
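The numerical sequence usually associated with the golden ratio is the Fibonacci sequence, whose ratios of consecutive terms converge on phi; here is a minimal sketch of that convergence, added purely for illustration.

```python
# Ratios of consecutive Fibonacci numbers approach the golden ratio.
a, b = 1, 1
for _ in range(12):
    a, b = b, a + b
    print(f"{b}/{a} = {b / a:.8f}")
print("golden ratio ~ 1.61803399")
```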

“The golden stars are actually the first examples outside of a laboratory of what’s called ‘strange nonchaotic dynamics.’ The ‘strange’ here refers to a fractal pattern, and nonchaotic means the pattern is orderly, rather than random. Most fractal patterns in nature, such as weather, are chaotic, so this aspect of the variable stars came as a surprise,” reported an article in Scientific American.

These RR Lyrae variable stars are, at their youngest, over 10 billion years old, and their brightness can vary by 200 percent over half a day. This makes them a bit challenging to study from Earth due to our day-night cycle. It’s the variation itself that causes this mathematical phenomenon.

Plato had theorized that the Universe as a whole is simply a resonance of the “Music or Harmony of the Spheres.” This new study may provide deeper insight into pairing the Philosophies & Spiritual Sciences offered throughout the ages with modern Astronomy, and into how we may understand the underlying elegance of nature as a whole.

While some of these stars pulsate with a single frequency, observations confirm that others pulsate with multiple frequencies.

“Just as flamboyant rock stars deliver pulsating rhythmic beats under their song melodies, so, too, do these variable stars,” said Dr Lindner.

Posted by Sc13t4 in Astrophysics, Cosmology, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
Beyond Quantum Physics: The Next Giant Leap for Science is Approaching Fast

The quote below is a great example that lets the reader know one thing: new information and evidence that challenge long-held beliefs about our world are always met with harsh criticism. Remember when we found out that the Earth wasn’t flat? Human history shows the same pattern, especially if we look at the history of science.

“Despite the unrivalled empirical success of quantum theory, the very suggestion that it may be literally true as a description of nature is still greeted with cynicism, incomprehension and even anger.”
(T. Folger, “Quantum Shmantum”; Discover 22:37-43, 2001)

Take, for example, prominent physicist Lord Kelvin, who stated in the year 1900 that, “There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.” 

It wasn’t long after this statement when Einstein published his paper on special relativity. Einstein’s theories challenged the accepted framework of knowledge at the time, and forced the scientific community to open up to an alternate view of reality.

It serves as a great example of how concepts that are taken to be absolute truth are susceptible to change.

Today, something special in science is happening. It’s the recognition that what we perceive to be our physical material world is not the only world, and non-material factors like consciousness, for example, may play a vital role in the make-up of our physical material world.

In the scientific community, it’s referred to as non-material science.

Other areas of study in this field include telepathy, clairvoyance, ESP, and more. These are topics that have been studied within black budget programs and at the highest levels of government for decades, yet at the same time ridiculed by mainstream science, despite extremely significant statistical results.

I definitely resonate with the words below, found in this document. Intelligence agencies have a long history of keeping tabs on what goes on with this stuff. It’s what inspired the title I chose for this article: quantum physics leaks into this type of phenomenon, and a quantum perspective is what’s needed to understand it.

This area is usually referred to as “psi” phenomena, or parapsychological phenomenon.

It’s interesting because as far back as 1999, statistics professor Jessica Utts at UC Irvine published a paper showing that parapsychological experiments have produced much stronger results than the studies showing that a daily dose of aspirin helps prevent heart attacks. Utts also showed that these results are much stronger than the research behind various drugs, such as antiplatelets.

This is precisely why Nikola Tesla told the world that,

“The day science begins to study non-physical phenomena, it will make more progress in one decade than in all the previous centuries of its existence”

Hundreds of scientists are gathering to emphasize this, and are not really getting the attention they deserve. All of our academia and real-world applications come from material science. This is great, but it’s time to take the next leap. How can we continue to ignore facts and results simply because they defy the belief systems of so many people?

A group of internationally recognized scientists have come together to stress the fact that matter (protons, electrons, photons and other particles) is not the only reality. We wish to understand the nature of our reality, but how can we do so if we are continually examining only physical systems? What about the role of non-physical systems such as consciousness, or their interaction with physical systems (matter)?

Expanding Reality, A Ground Breaking Trilogy Film Series

You can purchase the film here.

“Expanding Reality is about the emerging postmaterialist paradigm and the next great scientific revolution. Why is it important? Because this paradigm has far-reaching implications. For instance, it re-enchants the world and profoundly alters the vision we have of ourselves, giving us back our dignity and power as human beings. The postmaterialist paradigm also fosters positive values such as compassion, respect, care, love, and peace, because it makes us realize that the boundaries between self and others are permeable. In doing so, this paradigm promotes an awareness of the deep interconnection between ourselves and Nature at large. In that sense, the model of reality associated with the postmaterialist paradigm may help humanity to create a sustainable civilization and to blossom.” – Mario Beauregard, PhD, from the University of Arizona

These people have exhausted their own resources in order to make Expanding Reality for the world; show your support by purchasing the movie HERE. You won’t be disappointed.

Important Points

Here is a list of points that were co-authored by Dr. Gary Schwartz, professor of psychology, medicine, neurology, psychiatry, and surgery at the University of Arizona; Mario Beauregard, PhD, from the University of Arizona; and Lisa Miller, PhD, from Columbia University. The list was presented at an international summit on post-materialist science, spirituality, and society.

The Summary Report of the International Summit on Post-Materialist Science, Spirituality and Society can be downloaded here: International Summit on Post-Materialist Science: Summary Report (PDF).

“Get over it, and accept the inarguable conclusion. The universe is immaterial-mental and spiritual.” (“The Mental Universe” ; Nature 436:29,2005)

 First seen: http://www.collective-evolution.com/2018/03/04/beyond-quantum-physics-the-next-giant-leap-for-science-is-approaching-really-fast/
Posted by Sc13t4 in Atomic, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
Physicists Want to Rebuild Quantum Theory from Scratch

SCIENTISTS HAVE BEEN using quantum theory for almost a century now, but embarrassingly they still don’t know what it means. An informal poll taken at a 2011 conference on Quantum Physics and the Nature of Reality showed that there’s still no consensus on what quantum theory says about reality—the participants remained deeply divided about how the theory should be interpreted.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Some physicists just shrug and say we have to live with the fact that quantum mechanics is weird. So particles can be in two places at once, or communicate instantaneously over vast distances? Get over it. After all, the theory works fine. If you want to calculate what experiments will reveal about subatomic particles, atoms, molecules and light, then quantum mechanics succeeds brilliantly.

But some researchers want to dig deeper. They want to know why quantum mechanics has the form it does, and they are engaged in an ambitious program to find out. It is called quantum reconstruction, and it amounts to trying to rebuild the theory from scratch based on a few simple principles.

If these efforts succeed, it’s possible that all the apparent oddness and confusion of quantum mechanics will melt away, and we will finally grasp what the theory has been trying to tell us. “For me, the ultimate goal is to prove that quantum theory is the only theory where our imperfect experiences allow us to build an ideal picture of the world,” said Giulio Chiribella, a theoretical physicist at the University of Hong Kong.

There’s no guarantee of success—no assurance that quantum mechanics really does have something plain and simple at its heart, rather than the abstruse collection of mathematical concepts used today. But even if quantum reconstruction efforts don’t pan out, they might point the way to an equally tantalizing goal: getting beyond quantum mechanics itself to a still deeper theory. “I think it might help us move towards a theory of quantum gravity,” said Lucien Hardy, a theoretical physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

The Flimsy Foundations of Quantum Mechanics

The basic premise of the quantum reconstruction game is summed up by the joke about the driver who, lost in rural Ireland, asks a passer-by how to get to Dublin. “I wouldn’t start from here,” comes the reply.

Where, in quantum mechanics, is “here”? The theory arose out of attempts to understand how atoms and molecules interact with light and other radiation, phenomena that classical physics couldn’t explain. Quantum theory was empirically motivated, and its rules were simply ones that seemed to fit what was observed. It uses mathematical formulas that, while tried and trusted, were essentially pulled out of a hat by the pioneers of the theory in the early 20th century.

Take Erwin Schrödinger’s equation for calculating the probabilistic properties of quantum particles. The particle is described by a “wave function” that encodes all we can know about it. It’s basically a wavelike mathematical expression, reflecting the well-known fact that quantum particles can sometimes seem to behave like waves. Want to know the probability that the particle will be observed in a particular place? Just calculate the square of the wave function (or, to be exact, a slightly more complicated mathematical term), and from that you can deduce how likely you are to detect the particle there. The probability of measuring some of its other observable properties can be found by, crudely speaking, applying a mathematical function called an operator to the wave function.
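For reference, the rules sketched in the paragraph above can be written compactly in standard notation: the time-dependent Schrödinger equation, the Born rule giving the probability density as the squared modulus of the wave function (the “slightly more complicated” term), and the expectation value of an observable obtained by applying its operator. This notation is standard textbook material, added here for clarity rather than drawn from the article.

```latex
\[
  i\hbar\,\frac{\partial \psi(x,t)}{\partial t} = \hat{H}\,\psi(x,t),
  \qquad
  P(x,t) = \lvert \psi(x,t) \rvert^{2},
  \qquad
  \langle \hat{A} \rangle = \int \psi^{*}(x,t)\,\hat{A}\,\psi(x,t)\,dx .
\]
```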

“I think quantum theory as we know it will not stand.” – Alexei Grinbaum

But this so-called rule for calculating probabilities was really just an intuitive guess by the German physicist Max Born. So was Schrödinger’s equation itself. Neither was supported by rigorous derivation. Quantum mechanics seems largely built of arbitrary rules like this, some of them—such as the mathematical properties of operators that correspond to observable properties of the system—rather arcane. It’s a complex framework, but it’s also an ad hoc patchwork, lacking any obvious physical interpretation or justification.

Compare this with the ground rules, or axioms, of Einstein’s theory of special relativity, which was as revolutionary in its way as quantum mechanics. (Einstein launched them both, rather miraculously, in 1905.) Before Einstein, there was an untidy collection of equations to describe how light behaves from the point of view of a moving observer. Einstein dispelled the mathematical fog with two simple and intuitive principles: that the speed of light is constant, and that the laws of physics are the same for two observers moving at constant speed relative to one another. Grant these basic principles, and the rest of the theory follows. Not only are the axioms simple, but we can see at once what they mean in physical terms.

What are the analogous statements for quantum mechanics? The eminent physicist John Wheeler once asserted that if we really understood the central point of quantum theory, we would be able to state it in one simple sentence that anyone could understand. If such a statement exists, some quantum reconstructionists suspect that we’ll find it only by rebuilding quantum theory from scratch: by tearing up the work of Bohr, Heisenberg and Schrödinger and starting again.

Quantum Roulette

One of the first efforts at quantum reconstruction was made in 2001 by Hardy, then at the University of Oxford. He ignored everything that we typically associate with quantum mechanics, such as quantum jumps, wave-particle duality and uncertainty. Instead, Hardy focused on probability: specifically, the probabilities that relate the possible states of a system with the chance of observing each state in a measurement. Hardy found that these bare bones were enough to get all that familiar quantum stuff back again.

Lucien Hardy, a physicist at the Perimeter Institute, was one of the first to derive the rules of quantum mechanics from simple principles.
GABRIELA SECARA, PERIMETER INSTITUTE FOR THEORETICAL PHYSICS

A classical bit, like a tossed coin, can only ever be found in one of two distinct states: heads or tails, 0 or 1. In quantum mechanics, however, a particle can exist not just in distinct states like these but in a so-called superposition—roughly speaking, a combination of those states. In other words, a quantum bit, or qubit, can be not just in the binary state of 0 or 1, but in a superposition of the two.

But if you make a measurement of that qubit, you’ll only ever get a result of 1 or 0. That is the mystery of quantum mechanics, often referred to as the collapse of the wave function: Measurements elicit only one of the possible outcomes. To put it another way, a quantum object commonly has more options for measurements encoded in the wave function than can be seen in practice.
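A small simulation makes the same point; the amplitudes below are arbitrary illustrative values. The state assigns probabilities to 0 and 1, yet each individual measurement returns only one of them.

```python
import numpy as np

rng = np.random.default_rng(seed=1)

# An illustrative qubit state alpha|0> + beta|1>; only |amplitude|^2
# matters for the measurement statistics.
alpha = np.sqrt(0.7)
beta = np.sqrt(0.3) * 1j
probs = [abs(alpha) ** 2, abs(beta) ** 2]

# Each simulated measurement "collapses" to a single definite outcome.
outcomes = rng.choice([0, 1], size=10_000, p=probs)
print("fraction of 1 outcomes:", outcomes.mean())   # close to 0.3
```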

Hardy’s rules governing possible states and their relationship to measurement outcomes acknowledged this property of quantum bits. In essence the rules were (probabilistic) ones about how systems can carry information and how they can be combined and interconverted.

Hardy then showed that the simplest possible theory to describe such systems is quantum mechanics, with all its characteristic phenomena such as wavelike interference and entanglement, in which the properties of different objects become interdependent. “Hardy’s 2001 paper was the ‘Yes, we can!’ moment of the reconstruction program,” Chiribella said. “It told us that in some way or another we can get to a reconstruction of quantum theory.”

More specifically, it implied that the core trait of quantum theory is that it is inherently probabilistic. “Quantum theory can be seen as a generalized probability theory, an abstract thing that can be studied detached from its application to physics,” Chiribella said. This approach doesn’t address any underlying physics at all, but just considers how outputs are related to inputs: what we can measure given how a state is prepared (a so-called operational perspective). “What the physical system is is not specified and plays no role in the results,” Chiribella said. These generalized probability theories are “pure syntax,” he added — they relate states and measurements, just as linguistic syntax relates categories of words, without regard to what the words mean. In other words, Chiribella explained, generalized probability theories “are the syntax of physical theories, once we strip them of the semantics.”

The general idea for all approaches in quantum reconstruction, then, is to start by listing the probabilities that a user of the theory assigns to each of the possible outcomes of all the measurements the user can perform on a system. That list is the “state of the system.” The only other ingredients are the ways in which states can be transformed into one another, and the probability of the outputs given certain inputs. This operational approach to reconstruction “doesn’t assume space-time or causality or anything, only a distinction between these two types of data,” said Alexei Grinbaum, a philosopher of physics at the CEA Saclay in France.

To distinguish quantum theory from a generalized probability theory, you need specific kinds of constraints on the probabilities and possible outcomes of measurement. But those constraints aren’t unique. So lots of possible theories of probability look quantum-like. How then do you pick out the right one?

“We can look for probabilistic theories that are similar to quantum theory but differ in specific aspects,” said Matthias Kleinmann, a theoretical physicist at the University of the Basque Country in Bilbao, Spain. If you can then find postulates that select quantum mechanics specifically, he explained, you can “drop or weaken some of them and work out mathematically what other theories appear as solutions.” Such exploration of what lies beyond quantum mechanics is not just academic doodling, for it’s possible—indeed, likely—that quantum mechanics is itself just an approximation of a deeper theory. That theory might emerge, as quantum theory did from classical physics, from violations in quantum theory that appear if we push it hard enough.

Bits and Pieces

Some researchers suspect that ultimately the axioms of a quantum reconstruction will be about information: what can and can’t be done with it. One such derivation of quantum theory based on axioms about information was proposed in 2010 by Chiribella, then working at the Perimeter Institute, and his collaborators Giacomo Mauro D’Ariano and Paolo Perinotti of the University of Pavia in Italy. “Loosely speaking,” explained Jacques Pienaar, a theoretical physicist at the University of Vienna, “their principles state that information should be localized in space and time, that systems should be able to encode information about each other, and that every process should in principle be reversible, so that information is conserved.” (In irreversible processes, by contrast, information is typically lost—just as it is when you erase a file on your hard drive.)

What’s more, said Pienaar, these axioms can all be explained using ordinary language. “They all pertain directly to the elements of human experience, namely, what real experimenters ought to be able to do with the systems in their laboratories,” he said. “And they all seem quite reasonable, so that it is easy to accept their truth.” Chiribella and his colleagues showed that a system governed by these rules shows all the familiar quantum behaviors, such as superposition and entanglement.

Giulio Chiribella, a physicist at the University of Hong Kong, reconstructed quantum theory from ideas in information theory.
COURTESY OF CIFAR

One challenge is to decide what should be designated an axiom and what physicists should try to derive from the axioms. Take the quantum no-cloning rule, which is another of the principles that naturally arises from Chiribella’s reconstruction. One of the deep findings of modern quantum theory, this principle states that it is impossible to make a duplicate of an arbitrary, unknown quantum state.
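The reason no such duplicator can exist is captured by a short textbook linearity argument, sketched below; it is generic reasoning, not something specific to Chiribella's axioms.

```latex
% A sketch of the standard linearity argument. Suppose a single unitary U
% could copy arbitrary states onto a blank register |e>:
\[
  U\big( |0\rangle \otimes |e\rangle \big) = |0\rangle \otimes |0\rangle,
  \qquad
  U\big( |1\rangle \otimes |e\rangle \big) = |1\rangle \otimes |1\rangle .
\]
% Linearity then fixes its action on the superposition
% |psi> = (|0> + |1>)/sqrt(2):
\[
  U\big( |\psi\rangle \otimes |e\rangle \big)
    = \frac{1}{\sqrt{2}} \big( |0\rangle \otimes |0\rangle
      + |1\rangle \otimes |1\rangle \big),
\]
% whereas a genuine copy would be
\[
  |\psi\rangle \otimes |\psi\rangle
    = \frac{1}{2} \big( |00\rangle + |01\rangle
      + |10\rangle + |11\rangle \big).
\]
% The two expressions differ, so no single operation can duplicate an
% arbitrary unknown state.
```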

It sounds like a technicality (albeit a highly inconvenient one for scientists and mathematicians seeking to design quantum computers). But in an effort in 2002 to derive quantum mechanics from rules about what is permitted with quantum information, Jeffrey Bub of the University of Maryland and his colleagues Rob Clifton of the University of Pittsburgh and Hans Halvorson of Princeton University made no-cloning one of three fundamental axioms. One of the others was a straightforward consequence of special relativity: You can’t transmit information between two objects more quickly than the speed of light by making a measurement on one of the objects. The third axiom was harder to state, but it also crops up as a constraint on quantum information technology. In essence, it limits how securely a bit of information can be exchanged without being tampered with: The rule is a prohibition on what is called “unconditionally secure bit commitment.”

These axioms seem to relate to the practicalities of managing quantum information. But if we consider them instead to be fundamental, and if we additionally assume that the algebra of quantum theory has a property called non-commutation, meaning that the order in which you do calculations matters (in contrast to the multiplication of two numbers, which can be done in any order), Clifton, Bub and Halvorson have shown that these rules too give rise to superposition, entanglement, uncertainty, nonlocality and so on: the core phenomena of quantum theory.
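The non-commutation assumption is easy to see with two standard 2×2 matrices, the Pauli X and Z operators; this is a generic illustration rather than anything taken from the Clifton–Bub–Halvorson paper.

```python
import numpy as np

# Two standard quantum operators (Pauli matrices).
X = np.array([[0, 1], [1, 0]])
Z = np.array([[1, 0], [0, -1]])

# The order of multiplication matters: XZ and ZX are different matrices,
# unlike the multiplication of ordinary numbers.
print(X @ Z)   # [[ 0 -1], [ 1  0]]
print(Z @ X)   # [[ 0  1], [-1  0]]
```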

Another information-focused reconstruction was suggested in 2009 by Borivoje Dakić and Časlav Brukner, physicists at the University of Vienna. They proposed three “reasonable axioms” having to do with information capacity: that the most elementary component of all systems can carry no more than one bit of information, that the state of a composite system made up of subsystems is completely determined by measurements on its subsystems, and that you can convert any “pure” state to another and back again (like flipping a coin between heads and tails).

Dakić and Brukner showed that these assumptions lead inevitably to classical and quantum-style probability, and to no other kinds. What’s more, if you modify axiom three to say that states get converted continuously—little by little, rather than in one big jump—you get only quantum theory, not classical. (Yes, it really is that way round, contrary to what the “quantum jump” idea would have you expect—you can interconvert states of quantum spins by rotating their orientation smoothly, but you can’t gradually convert a classical heads to a tails.) “If we don’t have continuity, then we don’t have quantum theory,” Grinbaum said.
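Here is a rough sketch of that contrast; the rotation axis and the step sizes are arbitrary choices for illustration. A qubit can be steered smoothly from 0 to 1 through a continuum of valid intermediate pure states, while a classical bit can only jump.

```python
import numpy as np

# Sketch of the "continuity" distinction in Dakic and Brukner's third axiom.
ket0 = np.array([1.0, 0.0], dtype=complex)

def rotate(state, theta):
    """Rotate a qubit state by angle theta about the y-axis of the Bloch sphere."""
    u = np.array([[np.cos(theta / 2), -np.sin(theta / 2)],
                  [np.sin(theta / 2),  np.cos(theta / 2)]], dtype=complex)
    return u @ state

for theta in np.linspace(0.0, np.pi, 5):
    s = rotate(ket0, theta)
    print(f"theta = {theta:4.2f}  ->  P(0) = {abs(s[0])**2:.2f}, P(1) = {abs(s[1])**2:.2f}")
# The probabilities shift gradually from (1, 0) to (0, 1), and every
# intermediate state is itself a valid pure state. A classical coin, by
# contrast, can only jump from heads to tails.
```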

Quantum physicist Chris Fuchs stands for a portrait inside the Integrated Sciences building on the campus of UMass Boston.
PHOTO BY KATHERINE TAYLOR FOR QUANTA

A further approach in the spirit of quantum reconstruction is called quantum Bayesianism, or QBism. Devised by Carlton Caves, Christopher Fuchs and Rüdiger Schack in the early 2000s, it takes the provocative position that the mathematical machinery of quantum mechanics has nothing to do with the way the world really is; rather, it is just the appropriate framework that lets us develop expectations and beliefs about the outcomes of our interventions. It takes its cue from the Bayesian approach to classical probability developed in the 18th century, in which probabilities stem from personal beliefs rather than observed frequencies. In QBism, quantum probabilities calculated by the Born rule don’t tell us what we’ll measure, but only what we should rationally expect to measure.

In this view, the world isn’t bound by rules—or at least, not by quantum rules. Indeed, there may be no fundamental laws governing the way particles interact; instead, laws emerge at the scale of our observations. This possibility was considered by John Wheeler, who dubbed the scenario Law Without Law. It would mean that “quantum theory is merely a tool to make comprehensible a lawless slicing-up of nature,” said Adán Cabello, a physicist at the University of Seville. Can we derive quantum theory from these premises alone?

“At first sight, it seems impossible,” Cabello admitted—the ingredients seem far too thin, not to mention arbitrary and alien to the usual assumptions of science. “But what if we manage to do it?” he asked. “Shouldn’t this shock anyone who thinks of quantum theory as an expression of properties of nature?”

Making Space for Gravity

In Hardy’s view, quantum reconstructions have been almost too successful, in one sense: Various sets of axioms all give rise to the basic structure of quantum mechanics. “We have these different sets of axioms, but when you look at them, you can see the connections between them,” he said. “They all seem reasonably good and are in a formal sense equivalent because they all give you quantum theory.” And that’s not quite what he’d hoped for. “When I started on this, what I wanted to see was two or so obvious, compelling axioms that would give you quantum theory and which no one would argue with.”

So how do we choose between the options available? “My suspicion now is that there is still a deeper level to go to in understanding quantum theory,” Hardy said. And he hopes that this deeper level will point beyond quantum theory, to the elusive goal of a quantum theory of gravity. “That’s the next step,” he said. Several researchers working on reconstructions now hope that this axiomatic approach will help us see how to pose quantum theory in a way that forges a connection with the modern theory of gravitation—Einstein’s general relativity.

Look at the Schrödinger equation and you will find no clues about how to take that step. But quantum reconstructions with an “informational” flavor speak about how information-carrying systems can affect one another, a framework of causation that hints at a link to the space-time picture of general relativity. Causation imposes chronological ordering: An effect can’t precede its cause. But Hardy suspects that the axioms we need to build quantum theory will be ones that embrace a lack of definite causal structure—no unique time-ordering of events—which he says is what we should expect when quantum theory is combined with general relativity. “I’d like to see axioms that are as causally neutral as possible, because they’d be better candidates as axioms that come from quantum gravity,” he said.

Hardy first suggested that quantum-gravitational systems might show indefinite causal structure in 2007. And in fact only quantum mechanics can display that. While working on quantum reconstructions, Chiribella was inspired to propose an experiment to create causal superpositions of quantum systems, in which there is no definite series of cause-and-effect events. This experiment has now been carried out by Philip Walther’s lab at the University of Vienna—and it might incidentally point to a way of making quantum computing more efficient.

“I find this a striking illustration of the usefulness of the reconstruction approach,” Chiribella said. “Capturing quantum theory with axioms is not just an intellectual exercise. We want the axioms to do something useful for us—to help us reason about quantum theory, invent new communication protocols and new algorithms for quantum computers, and to be a guide for the formulation of new physics.”

But can quantum reconstructions also help us understand the “meaning” of quantum mechanics? Hardy doubts that these efforts can resolve arguments about interpretation—whether we need many worlds or just one, for example. After all, precisely because the reconstructionist program is inherently “operational,” meaning that it focuses on the “user experience”—probabilities about what we measure—it may never speak about the “underlying reality” that creates those probabilities.

“When I went into this approach, I hoped it would help to resolve these interpretational problems,” Hardy admitted. “But I would say it hasn’t.” Cabello agrees. “One can argue that previous reconstructions failed to make quantum theory less puzzling or to explain where quantum theory comes from,” he said. “All of them seem to miss the mark for an ultimate understanding of the theory.” But he remains optimistic: “I still think that the right approach will dissolve the problems and we will understand the theory.”

Maybe, Hardy said, these challenges stem from the fact that the more fundamental description of reality is rooted in that still undiscovered theory of quantum gravity. “Perhaps when we finally get our hands on quantum gravity, the interpretation will suggest itself,” he said. “Or it might be worse!”

Right now, quantum reconstruction has few adherents—which pleases Hardy, as it means that it’s still a relatively tranquil field. But if it makes serious inroads into quantum gravity, that will surely change. In the 2011 poll, about a quarter of the respondents felt that quantum reconstructions will lead to a new, deeper theory. A one-in-four chance certainly seems worth a shot.

Grinbaum thinks that the task of building the whole of quantum theory from scratch with a handful of axioms may ultimately be unsuccessful. “I’m now very pessimistic about complete reconstructions,” he said. But, he suggested, why not try to do it piece by piece instead—to just reconstruct particular aspects, such as nonlocality or causality? “Why would one try to reconstruct the entire edifice of quantum theory if we know that it’s made of different bricks?” he asked. “Reconstruct the bricks first. Maybe remove some and look at what kind of new theory may emerge.”

“I think quantum theory as we know it will not stand,” Grinbaum said. “Which of its feet of clay will break first is what reconstructions are trying to explore.” He thinks that, as this daunting task proceeds, some of the most vexing and vague issues in standard quantum theory—such as the process of measurement and the role of the observer—will disappear, and we’ll see that the real challenges are elsewhere. “What is needed is new mathematics that will render these notions scientific,” he said. Then, perhaps, we’ll understand what we’ve been arguing about for so long.

 

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Posted by Sc13t4 in Atomic, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
Proof Claimed for Deep Connection between Prime Numbers

Proof Claimed for Deep Connection between Prime Numbers

If true, a solution to the “abc” conjecture about whole numbers would be “one of the most astounding achievements of mathematics of the 21st century”

The usually quiet world of mathematics is abuzz with a claim that one of the most important problems in number theory has been solved.

Mathematician Shinichi Mochizuki of Kyoto University in Japan has released a 500-page proof of the abc conjecture, which proposes a relationship between whole numbers — a ‘Diophantine’ problem.

The abc conjecture, proposed independently by David Masser and Joseph Oesterle in 1985, might not be as familiar to the wider world as Fermat’s Last Theorem, but in some ways it is more significant. “The abc conjecture, if proved true, at one stroke solves many famous Diophantine problems, including Fermat’s Last Theorem,” says Dorian Goldfeld, a mathematician at Columbia University in New York. “If Mochizuki’s proof is correct, it will be one of the most astounding achievements of mathematics of the twenty-first century.”

By Philip Ball, from Nature magazine. Credit: Flickr/Center for Image in Science and Art _ UL

Like Fermat’s theorem, the abc conjecture refers to equations of the form a+b=c. It involves the concept of a square-free number: one that cannot be divided by the square of any number. Fifteen and 17 are square-free numbers, but 16 and 18 — being divisible by 4^2 and 3^2, respectively — are not.

The ‘square-free’ part of a number n, sqp(n), is the largest square-free number that can be formed by multiplying the factors of n that are prime numbers. For instance, sqp(18)=2×3=6.

If you’ve got that, then you should get the abc conjecture. It concerns a property of the product of the three integers a×b×c, or abc — or more specifically, of the square-free part of this product, which involves their distinct prime factors. It states that for integers a+b=c, the ratio sqp(abc)^r/c always has some minimum value greater than zero for any value of r greater than 1. For example, if a=3 and b=125, so that c=128, then sqp(abc)=30 and sqp(abc)^2/c = 900/128. In this case, in which r=2, sqp(abc)^r/c is nearly always greater than 1, and always greater than zero.
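A few lines of Python reproduce these numbers. The helper below, with naming and a simple trial-division approach of my own choosing, computes the square-free part sqp(n) and then checks the worked example above.

```python
def sqp(n):
    """Square-free part (radical) of n: the product of its distinct prime factors."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            result *= p
            while n % p == 0:
                n //= p
        p += 1
    if n > 1:            # whatever is left is a prime factor
        result *= n
    return result

# The article's examples.
print(sqp(18))                  # 2 * 3 = 6
a, b = 3, 125
c = a + b                       # 128
r = 2
print(sqp(a * b * c))           # 30
print(sqp(a * b * c)**r / c)    # 900 / 128 = 7.03125
```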

Deep connection
It turns out that this conjecture encapsulates many other Diophantine problems, including Fermat’s Last Theorem (which states that a^n + b^n = c^n has no integer solutions if n > 2). Like many Diophantine problems, it is all about the relationships between prime numbers. According to Brian Conrad of Stanford University in California, “it encodes a deep connection between the prime factors of a, b and a+b”.

Many mathematicians have expended a great deal of effort trying to prove the conjecture. In 2007, French mathematician Lucien Szpiro, whose work in 1978 led to the abc conjecture in the first place, claimed to have a proof of it, but it was soon found to be flawed.

Like Szpiro, and also like British mathematician Andrew Wiles, who proved Fermat’s Last Theorem in 1994, Mochizuki has attacked the problem using the theory of elliptic curves — the smooth curves generated by algebraic relationships of the sort y^2 = x^3 + ax + b.

There, however, the relationship of Mochizuki’s work to previous efforts stops. He has developed techniques that very few other mathematicians fully understand and that invoke new mathematical ‘objects’ — abstract entities analogous to more familiar examples such as geometric objects, sets, permutations, topologies and matrices. “At this point, he is probably the only one that knows it all,” says Goldfeld.

Conrad says that the work “uses a huge number of insights that are going to take a long time to be digested by the community”. The proof is spread across four long papers, each of which rests on earlier long papers. “It can require a huge investment of time to understand a long and sophisticated proof, so the willingness by others to do this rests not only on the importance of the announcement but also on the track record of the authors,” Conrad explains.

Mochizuki’s track record certainly makes the effort worthwhile. “He has proved extremely deep theorems in the past, and is very thorough in his writing, so that provides a lot of confidence,” says Conrad. And he adds that the pay-off would be more than a matter of simply verifying the claim. “The exciting aspect is not just that the conjecture may have now been solved, but that the techniques and insights he must have had to introduce should be very powerful tools for solving future problems in number theory.”

This article is reproduced with permission from the magazine Nature. The article was first published on September 10, 2012.

Posted by Sc13t4 in Mathematics, Space/Time, Theoretical Physics, 0 comments
What is SpaceTime?

What is SpaceTime?

Physicists believe that at the tiniest scales, space emerges from quanta.
What might these building blocks look like?

People have always taken space for granted. It is just emptiness, after all—a backdrop to everything else. Time, likewise, simply ticks on incessantly. But if physicists have learned anything from the long slog to unify their theories, it is that space and time form a system of such staggering complexity that it may defy our most ardent efforts to understand.

Albert Einstein saw what was coming as early as November 1916. A year earlier he had formulated his general theory of relativity, which postulates that gravity is not a force that propagates through space but a feature of spacetime itself. When you throw a ball high into the air, it arcs back to the ground because Earth distorts the spacetime around it, so that the paths of the ball and the ground intersect again. In a letter to a friend, Einstein contemplated the challenge of merging general relativity with his other brainchild, the nascent theory of quantum mechanics. That would not merely distort space but dismantle it. Mathematically, he hardly knew where to begin. “How much have I already plagued myself in this way!” he wrote.

Einstein never got very far. Even today there are almost as many contending ideas for a quantum theory of gravity as scientists working on the topic. The disputes obscure an important truth: the competing approaches all say space is derived from something deeper—an idea that breaks with 2,500 years of scientific and philosophical understanding.

[SCIET Dynamics Note] This article is posted here because it beautifully presents some core issues in the controversy over competing attempts to describe reality at the scale of very small changes in space. We need to find a General Theory of Spacetime.

SCIET Dynamics seeks to unite the components of SpaceTime into an interdependent set that grows in complexity as it develops. It views the Void (Awareness), Space, Matter and Consciousness as sequences of creation built one upon the other. The Void, called “Awareness” in SD, exists as a sea of extremely small and fast fluctuations, which then gives rise to a burst of energy, labeled the “First Action,” which converts the burst into ever smaller increments, or “points of Awareness,” that have the effect of “formatting” the area defined by the original burst of energy. The “formatting” is the byproduct of a self-measuring algorithm which reduces uniformly within the original radius of the burst. When the increments reach the size of the original center point, they begin to interact, or resonate, with that value. The resonance gives rise to a new quality that allows information about the change created by movement to bounce off the center point and be stored in the area around the “point of Awareness,” a phenomenon that is responsible for the formation of spheres surrounding every “point of Awareness.” All of the basic particles of matter (protons, neutrons and electrons) are created by this effect. The same effect is responsible for spherical forms in space of all sizes.

DOWN THE BLACK HOLE

A kitchen magnet neatly demonstrates the problem that physicists face. It can grip a paper clip against the gravity of the entire Earth. Gravity is weaker than magnetism or than electric or nuclear forces. Whatever quantum effects it has are weaker still. The only tangible evidence that these processes occur at all is the mottled pattern of matter in the very early universe—thought to be caused, in part, by quantum fluctuations of the gravitational field.

Black holes are the best test case for quantum gravity. “It’s the closest thing we have to experiments,” says Ted Jacobson of the University of Maryland, College Park. He and other theorists study black holes as theoretical fulcrums. What happens when you take equations that work perfectly well under laboratory conditions and extrapolate them to the most extreme conceivable situation? Will some subtle flaw manifest itself?

General relativity predicts that matter falling into a black hole becomes compressed without limit as it approaches the center—a mathematical cul-de-sac called a singularity. Theorists cannot extrapolate the trajectory of an object beyond the singularity; its time line ends there. Even to speak of “there” is problematic because the very spacetime that would define the location of the singularity ceases to exist. Researchers hope that quantum theory could focus a microscope on that point and track what becomes of the material that falls in.

Out at the boundary of the hole, matter is not so compressed, gravity is weaker and, by all rights, the known laws of physics should still hold. Thus, it is all the more perplexing that they do not. The black hole is demarcated by an event horizon, a point of no return: matter that falls in cannot get back out. The descent is irreversible. That is a problem because all known laws of fundamental physics, including those of quantum mechanics as generally understood, are reversible. At least in principle, you should be able to reverse the motion of all the particles and recover what you had.

A very similar conundrum confronted physicists in the late 1800s, when they contemplated the mathematics of a “black body,” idealized as a cavity full of electromagnetic radiation. James Clerk Maxwell’s theory of electromagnetism predicted that such an object would absorb all the radiation that impinges on it and that it could never come to equilibrium with surrounding matter. “It would absorb an infinite amount of heat from a reservoir maintained at a fixed temperature,” explains Rafael Sorkin of the Perimeter Institute for Theoretical Physics in Ontario. In thermal terms, it would effectively have a temperature of absolute zero. This conclusion contradicted observations of real-life black bodies (such as an oven). Following up on work by Max Planck, Einstein showed that a black body can reach thermal equilibrium if radiative energy comes in discrete units, or quanta.

Theoretical physicists have been trying for nearly half a century to achieve an equivalent resolution for black holes. The late Stephen Hawking of the University of Cambridge took a huge step in the mid-1970s, when he applied quantum theory to the radiation field around black holes and showed they have a nonzero temperature. As such, they can not only absorb but also emit energy. Although his analysis brought black holes within the fold of thermodynamics, it deepened the problem of irreversibility. The outgoing radiation emerges from just outside the boundary of the hole and carries no information about the interior. It is random heat energy. If you reversed the process and fed the energy back in, the stuff that had fallen in would not pop out; you would just get more heat. And you cannot imagine that the original stuff is still there, merely trapped inside the hole, because as the hole emits radiation, it shrinks and, according to Hawking’s analysis, ultimately disappears.

This problem is called the information paradox because the black hole destroys the information about the infalling particles that would let you rewind their motion. If black hole physics really is reversible, something must carry information back out, and our conception of spacetime may need to change to allow for that.

ATOMS OF SPACETIME

Heat is the random motion of microscopic parts, such as the molecules of a gas. Because black holes can warm up and cool down, it stands to reason that they have parts—or, more generally, a microscopic structure. And because a black hole is just empty space (according to general relativity, infalling matter passes through the horizon but cannot linger), the parts of the black hole must be the parts of space itself. As plain as an expanse of empty space may look, it has enormous latent complexity.

Even theories that set out to preserve a conventional notion of spacetime end up concluding that something lurks behind the featureless facade. For instance, in the late 1970s Steven Weinberg, now at the University of Texas at Austin, sought to describe gravity in much the same way as the other forces of nature. He still found that spacetime is radically modified on its finest scales.

Physicists initially visualized microscopic space as a mosaic of little chunks of space. If you zoomed in to the Planck scale, an almost inconceivably small size of 10^-35 meter, they thought you would see something like a chessboard. But that cannot be quite right. For one thing, the grid lines of a chessboard space would privilege some directions over others, creating asymmetries that contradict the special theory of relativity. For example, light of different colors might travel at different speeds—just as in a glass prism, which refracts light into its constituent colors. Whereas effects on small scales are usually hard to see, violations of relativity would actually be fairly obvious.

In SCIET Dynamics, the “atoms” of spacetime are perceived to be quantum-scale fluctuations that leave tetrahedral tracks as they appear and disappear. The tracks are related to the Event Horizons of Black Holes because they bound the area between the void and space. In this sense, the tiny “tetrons” are an artifact of the creation of space.

The thermodynamics of black holes casts further doubt on picturing space as a simple mosaic. By measuring the thermal behavior of any system, you can count its parts, at least in principle. Dump in energy and watch the thermometer. If it shoots up, that energy must be spread out over comparatively few molecules. In effect, you are measuring the entropy of the system, which represents its microscopic complexity.

If you go through this exercise for an ordinary substance, the number of molecules increases with the volume of material. That is as it should be: If you increase the radius of a beach ball by a factor of 10, you will have 1,000 times as many molecules inside it. But if you increase the radius of a black hole by a factor of 10, the inferred number of molecules goes up by only a factor of 100. The number of “molecules” that it is made up of must be proportional not to its volume but to its surface area. The black hole may look three-dimensional, but it behaves as if it were two-dimensional.
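Stated as a formula, this is the Bekenstein–Hawking area law, a standard result quoted here for orientation rather than derived in the article: the entropy, and hence the inferred number of microscopic parts, is proportional to the horizon area.

```latex
% Bekenstein-Hawking entropy: the "part count" of a black hole tracks the
% horizon area A, not the enclosed volume.
\[
  S_{\mathrm{BH}} = \frac{k_B c^3 A}{4 G \hbar}, \qquad A = 4\pi r^2 ,
\]
\[
  r \to 10\,r \quad\Longrightarrow\quad A \to 100\,A
  \quad\Longrightarrow\quad S_{\mathrm{BH}} \to 100\,S_{\mathrm{BH}} ,
\]
% whereas the entropy of an ordinary ball of matter grows roughly with its
% volume, i.e., with r^3.
```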

This weird effect goes under the name of the holographic principle because it is reminiscent of a hologram, which presents itself to us as a three-dimensional object. On closer examination, however, it turns out to be an image produced by a two-dimensional sheet of film. If the holographic principle counts the microscopic constituents of space and its contents—as physicists widely, though not universally, accept—it must take more to build space than splicing together little pieces of it.

The relation of part to whole is seldom so straightforward, anyway. An H2O molecule is not just a little piece of water. Consider what liquid water does: it flows, forms droplets, carries ripples and waves, and freezes and boils. An individual H2O molecule does none of that: those are collective behaviors. Likewise, the building blocks of space need not be spatial. “The atoms of space are not the smallest portions of space,” says Daniele Oriti of the Max Planck Institute for Gravitational Physics in Potsdam, Germany. “They are the constituents of space. The geometric properties of space are new, collective, approximate properties of a system made of many such atoms.”

What exactly those building blocks are depends on the theory. In loop quantum gravity, they are quanta of volume aggregated by applying quantum principles. In string theory, they are fields akin to those of electromagnetism that live on the surface traced out by a moving strand or loop of energy—the namesake string. In M-theory, which is related to string theory and may underlie it, they are a special type of particle: a membrane shrunk to a point. In causal set theory, they are events related by a web of cause and effect. In the amplituhedron theory and some other approaches, there are no building blocks at all—at least not in any conventional sense.

Although the organizing principles of these theories vary, all strive to uphold some version of the so-called relationalism of 17th- and 18th-century German philosopher Gottfried Leibniz. In broad terms, relationalism holds that space arises from a certain pattern of correlations among objects. In this view, space is a jigsaw puzzle. You start with a big pile of pieces, see how they connect and place them accordingly. If two pieces have similar properties, such as color, they are likely to be nearby; if they differ strongly, you tentatively put them far apart. Physicists commonly express these relations as a network with a certain pattern of connectivity. The relations are dictated by quantum theory or other principles, and the spatial arrangement follows.

Phase transitions are another common theme. If space is assembled, it might be disassembled, too; then its building blocks could organize into something that looks nothing like space. “Just like you have different phases of matter, like ice, water and water vapor, the atoms of space can also reconfigure themselves in different phases,” says Thanu Padmanabhan of the Inter-University Center for Astronomy and Astrophysics in India. In this view, black holes may be places where space melts. Known theories break down, but a more general theory would describe what happens in the new phase. Even when space reaches its end, physics carries on.

ENTANGLED WEBS

The big realization of recent years—and one that has crossed old disciplinary boundaries—is that the relevant relations involve quantum entanglement. An extrapowerful type of correlation, intrinsic to quantum mechanics, entanglement seems to be more primitive than space. For instance, an experimentalist might create two particles that fly off in opposing directions. If they are entangled, they remain coordinated no matter how far apart they may be.

Traditionally when people talked about “quantum” gravity, they were referring to quantum discreteness, quantum fluctuations and almost every other quantum effect in the book—but never quantum entanglement. That changed when black holes forced the issue. Over the lifetime of a black hole, entangled particles fall in, but after the hole evaporates fully, their partners on the outside are left entangled with—nothing. “Hawking should have called it the entanglement problem,” says Samir Mathur of Ohio State University.

Even in a vacuum, with no particles around, the electromagnetic and other fields are internally entangled. If you measure a field at two different spots, your readings will jiggle in a random but coordinated way. And if you divide a region in two, the pieces will be correlated, with the degree of correlation depending on the only geometric quantity they have in common: the area of their interface. In 1995 Jacobson argued that entanglement provides a link between the presence of matter and the geometry of spacetime—which is to say, it might explain the law of gravity. “More entanglement implies weaker gravity—that is, stiffer spacetime,” he says.

Several approaches to quantum gravity—most of all, string theory—now see entanglement as crucial. String theory applies the holographic principle not just to black holes but also to the universe at large, providing a recipe for how to create space—or at least some of it. For instance, a two-dimensional space could be threaded by fields that, when structured in the right way, generate an additional dimension of space. The original two-dimensional space would serve as the boundary of a more expansive realm, known as the bulk space. And entanglement is what knits the bulk space into a contiguous whole.

In 2009 Mark Van Raamsdonk of the University of British Columbia gave an elegant argument for this process. Suppose the fields at the boundary are not entangled—they form a pair of uncorrelated systems. They correspond to two separate universes, with no way to travel between them. When the systems become entangled, it is as if a tunnel, or wormhole, opens up between those universes, and a spaceship can go from one to the other. As the degree of entanglement increases, the wormhole shrinks in length, drawing the universes together until you would not even speak of them as two universes anymore. “The emergence of a big spacetime is directly tied into the entangling of these field theory degrees of freedom,” Van Raamsdonk says. When we observe correlations in the electromagnetic and other fields, they are a residue of the entanglement that binds space together.

Many other features of space, besides its contiguity, may also reflect entanglement. Van Raamsdonk and Brian Swingle, now at the University of Maryland, College Park, argue that the ubiquity of entanglement explains the universality of gravity—that it affects all objects and cannot be screened out. As for black holes, Leonard Susskind of Stanford University and Juan Maldacena of the Institute for Advanced Study in Princeton, N.J., suggest that entanglement between a black hole and the radiation it has emitted creates a wormhole—a back-door entrance into the hole. That may help preserve information and ensure that black hole physics is reversible.

Whereas these string theory ideas work only for specific geometries and reconstruct only a single dimension of space, some researchers have sought to explain how all of space can emerge from scratch. For instance, ChunJun Cao, Spyridon Michalakis and Sean M. Carroll, all at the California Institute of Technology, begin with a minimalist quantum description of a system, formulated with no direct reference to spacetime or even to matter. If it has the right pattern of correlations, the system can be cleaved into component parts that can be identified as different regions of spacetime. In this model, the degree of entanglement defines a notion of spatial distance.
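As a toy illustration of that last idea (a construction for this post, not the authors' actual model), one can take a matrix of correlation strengths between abstract subsystems and read off a distance that shrinks as the correlation grows.

```python
import numpy as np

# Toy illustration: given pairwise correlation strengths between abstract
# subsystems, define a "distance" that shrinks as the correlation grows.
labels = ["A", "B", "C", "D"]
correlation = np.array([
    [1.00, 0.60, 0.30, 0.10],
    [0.60, 1.00, 0.60, 0.30],
    [0.30, 0.60, 1.00, 0.60],
    [0.10, 0.30, 0.60, 1.00],
])

distance = -np.log(correlation)   # perfectly correlated -> distance 0

for i, row_label in enumerate(labels):
    row = "  ".join(f"{distance[i, j]:.2f}" for j in range(len(labels)))
    print(row_label, row)
# Strongly correlated neighbours (A-B, B-C, C-D) come out close together,
# while weakly correlated pairs (A-D) come out far apart, sketching how a
# spatial arrangement can be read off from a pattern of correlations alone.
```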

In physics and, more generally, in the natural sciences, space and time are the foundation of all theories. Yet we never see spacetime directly. Rather we infer its existence from our everyday experience. We assume that the most economical account of the phenomena we see is some mechanism that operates within spacetime. But the bottom-line lesson of quantum gravity is that not all phenomena neatly fit within spacetime. Physicists will need to find some new foundational structure, and when they do, they will have completed the revolution that began just more than a century ago with Einstein.

This article was originally published with the title “What Is Spacetime?”
Rights & Permissions
Posted by Sc13t4 in Astrophysics, Cosmology, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
The End of Theoretical Physics As We Know It

The End of Theoretical Physics As We Know It

Theoretical physics has a reputation for being complicated. I beg to differ. That we are able to write down natural laws in mathematical form at all means that the laws we deal with are simple — much simpler than those of other scientific disciplines.

Unfortunately, actually solving those equations is often not so simple. For example, we have a perfectly fine theory that describes the elementary particles called quarks and gluons, but no one can calculate how they come together to make a proton. The equations just can’t be solved by any known methods. Similarly, a merger of black holes or even the flow of a mountain stream can be described in deceptively simple terms, but it’s hideously difficult to say what’s going to happen in any particular case.

By Sabine Hossenfelder, Quanta Magazine Contributing Columnist

Of course, we are relentlessly pushing the limits, searching for new mathematical strategies. But in recent years much of the pushing has come not from more sophisticated math but from more computing power.

This article first appeared on QuantaMagazine.org by Contributing Columnist Sabine Hossenfelder
August 27, 2018
Quantized Columns: A regular column in which top researchers explore the process of discovery. This month’s columnist, Sabine Hossenfelder, is a theoretical physicist based at the Frankfurt Institute for Advanced Studies in Frankfurt, Germany. She is the author of Lost in Math: How Beauty Leads Physics Astray.

When the first math software became available in the 1980s, it didn’t do much more than save someone a search through enormous printed lists of solved integrals. But once physicists had computers at their fingertips, they realized they no longer had to solve the integrals in the first place; they could just plot the solution.
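As a tiny example of that shift, not drawn from the column itself: an integral with no elementary antiderivative can simply be evaluated numerically, and the result plotted or printed.

```python
import numpy as np

# Instead of hunting through printed tables for a closed-form answer,
# evaluate the integral numerically and plot or print the result.
x = np.linspace(0.0, 3.0, 10_001)
dx = x[1] - x[0]
integrand = np.exp(-x**2)          # has no elementary antiderivative
value = np.sum(integrand) * dx     # simple numerical quadrature
print(f"integral of exp(-x^2) from 0 to 3 ~ {value:.4f}")   # ~ 0.886
```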

In the 1990s, many physicists opposed this “just plot it” approach. Many were not trained in computer analysis, and sometimes they couldn’t tell physical effects from coding artifacts. Maybe this is why I recall many seminars in which a result was degraded as “merely numerical.” But over the past two decades, this attitude has markedly shifted, not least thanks to a new generation of physicists for whom coding is a natural extension of their mathematical skill.

Accordingly, theoretical physics now has many subdisciplines dedicated to computer simulations of real-world systems, studies that would just not be possible any other way. Computer simulations are what we now use to study the formation of galaxies and supergalactic structures, to calculate the masses of particles that are composed of several quarks, to find out what goes on in the collision of large atomic nuclei, and to understand solar cycles, to name but a few areas of research that are mainly computer based.

The next step of this shift away from purely mathematical modeling is already on the way: Physicists now custom design laboratory systems that stand in for other systems which they want to better understand. They observe the simulated system in the lab to draw conclusions about, and make predictions for, the system it represents.

The best example may be the research area that goes by the name “quantum simulations.” These are systems composed of interacting, composite objects, like clouds of atoms. Physicists manipulate the interactions among these objects so the system resembles an interaction among more fundamental particles. For example, in circuit quantum electrodynamics, researchers use tiny superconducting circuits to simulate atoms, and then study how these artificial atoms interact with photons. Or in a lab in Munich, physicists use a superfluid of ultra-cold atoms to settle the debate over whether Higgs-like particles can exist in two dimensions of space (the answer is yes).

[SCIET Dynamics Note- Rather than using quantum rules to simulate interactions, the rules of the SCIET will be used to generate ongoing, evolving particles, giving the simulation a full feature set that includes postulates for space and time related to the formation of all particles.]

These simulations are not only useful to overcome mathematical hurdles in theories we already know. We can also use them to explore consequences of new theories that haven’t been studied before and whose relevance we don’t yet know.

This is particularly interesting when it comes to the quantum behavior of space and time itself — an area where we still don’t have a good theory. In a recent experiment, for example, Raymond Laflamme, a physicist at the Institute for Quantum Computing at the University of Waterloo in Ontario, Canada, and his group used a quantum simulation to study so-called spin networks, structures that, in some theories, constitute the fundamental fabric of space-time. And Gia Dvali, a physicist at the University of Munich, has proposed a way to simulate the information processing of black holes with ultracold atom gases.

A similar idea is being pursued in the field of analogue gravity, where physicists use fluids to mimic the behavior of particles in gravitational fields. Black hole space-times have attracted the bulk of attention, as with Jeff Steinhauer’s (still somewhat controversial) claim of having measured Hawking radiation in a black-hole analogue. But researchers have also studied the rapid expansion of the early universe, called “inflation,” with fluid analogues for gravity.

In addition, physicists have studied hypothetical fundamental particles by observing stand-ins called quasiparticles. These quasiparticles behave like fundamental particles, but they emerge from the collective movement of many other particles. Understanding their properties allows us to learn more about their behavior, and might thereby also help us find ways of observing the real thing.

This line of research raises some big questions. First of all, if we can simulate what we now believe to be fundamental by using composite quasiparticles, then maybe what we currently think of as fundamental — space and time and the 25 particles that make up the Standard Model of particle physics — is made up of an underlying structure, too. Quantum simulations also make us wonder what it means to explain the behavior of a system to begin with. Does observing, measuring, and making a prediction by use of a simplified version of a system amount to an explanation?

But for me, the most interesting aspect of this development is that it ultimately changes how we do physics. With quantum simulations, the mathematical model is of secondary relevance. We currently use the math to identify a suitable system because the math tells us what properties we should look for. But that’s not, strictly speaking, necessary. Maybe, over the course of time, experimentalists will just learn which system maps to which other system, as they have learned which system maps to which math. Perhaps one day, rather than doing calculations, we will just use observations of simplified systems to make predictions.

At present, I am sure, most of my colleagues would be appalled by this future vision. But in my mind, building a simplified model of a system in the laboratory is conceptually not so different from what physicists have been doing for centuries: writing down simplified models of physical systems in the language of mathematics.

 

Posted by Sc13t4 in Design, Mathematics, Theoretical Physics, 1 comment