Articles that seek to explain Space/Time and offer insights about the SCIET

What Quantum Theory Reveals To Us About The Nature of Reality


  • The Facts: Quantum physics has revealed astonishing discoveries, many of which challenge long-held belief systems. It opens up discussions of metaphysical realities, which are thus labelled as mere interpretations due to the vastness of their implications.
  • Reflect On: For a long time, authorities have suppressed ideas that are different, even when backed by evidence. Who decides what information gets out and is confirmed in the public domain? Who decides what gets established as ‘fact’ within the mainstream?

“I regard consciousness as fundamental. I regard matter as derivative from consciousness. We cannot get behind consciousness. Everything that we talk about, everything that we regard as existing, postulates consciousness.”  – Max Planck, the originator of Quantum theory.

Basically, it means each photon individually goes through both slits at the same time and interferes with itself; yet it also goes through one slit, through the other, and through neither of them. The single piece of matter becomes a “wave” of potentials, expressing itself as multiple possibilities, which is why we get the interference pattern. How can a single piece of matter exist and express itself in multiple states, with no definite physical properties, until it is measured or observed?
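For readers curious where the pattern on the screen comes from mathematically, the standard wave-optics result is that the two-slit intensity goes as cos²(πd·sinθ/λ). Here is a minimal sketch of that textbook formula; the wavelength, slit separation, and screen distance are illustrative values, not taken from any particular experiment:

```python
import math

def double_slit_intensity(x, wavelength=500e-9, slit_sep=50e-6, screen_dist=1.0):
    """Normalized far-field intensity of two coherent slits (single-slit envelope ignored)."""
    # Angle from the slits to a point x metres from the screen's center
    theta = math.atan2(x, screen_dist)
    # Path difference between the two slits toward that point
    delta = slit_sep * math.sin(theta)
    # Half the phase difference; intensity oscillates as cos^2
    phase = math.pi * delta / wavelength
    return math.cos(phase) ** 2

bright = double_slit_intensity(0.0)    # central bright fringe, intensity 1
dark = double_slit_intensity(0.005)    # first dark fringe for these parameters, near 0
```

With these parameters the first null lands where the path difference equals half a wavelength, i.e. sinθ = λ/2d = 0.005.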

The article in Scientific American states, “some have even used it (the double slit experiment) to argue that the quantum world is influenced by human consciousness, giving our minds an agency and a place in the ontology of the universe. But does this simple experiment really make such a case?”

I stopped reading there, for the simple fact that it’s not only this experiment but hundreds, if not thousands, of other studies within the realms of quantum physics and parapsychology that clearly show that, to some degree, our physical material reality is influenced by consciousness in more ways than one. This is not trivial, and it is not a mere interpretation…

This is emphasized by a number of researchers who have conducted the experiment, as well as by the founders of quantum theory. A paper published in Physics Essays, for example, explains how the experiment has been used a number of times to explore the role of consciousness in shaping the nature of physical reality. It concluded that factors associated with consciousness “significantly” correlated in predicted ways with perturbations in the double-slit interference pattern. Again, scientists affected the results of the experiment simply by observing it.

“Observations not only disturb what has to be measured, they produce it. We compel the electron to assume a definite position. We ourselves produce the results of the measurement.”

The paper showed that meditators were able to collapse quantum systems at a distance through intention alone. The lead author of the study points out that a “5 sigma” result was enough for CERN to announce the discovery of the Higgs boson, a finding recognized with the Nobel Prize in 2013. In this study, the researchers also obtained a 5 sigma result when testing meditators against non-meditators in collapsing the quantum wave function. This means that mental activity, the human mind, attention, and intention, which are a few labels under the umbrella of consciousness, compelled physical matter to act in a certain way.
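For context on what “5 sigma” means statistically: it is shorthand for a Gaussian tail probability of roughly one in 3.5 million. A quick sketch using only the standard library (the one-sided convention used below is an assumption for illustration; conventions vary by field):

```python
import math

def one_sided_p_value(sigma):
    """One-sided Gaussian tail probability of a result at least `sigma` standard deviations out."""
    return 0.5 * math.erfc(sigma / math.sqrt(2))

p5 = one_sided_p_value(5.0)  # roughly 2.87e-7, about 1 in 3.5 million
p2 = one_sided_p_value(2.0)  # roughly 0.023, far weaker evidence
```

The steep drop between 2 sigma and 5 sigma is why particle physics treats the latter as a discovery threshold.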

Perhaps the strongest point illustrating that consciousness and our physical material reality are intertwined comes from black budget special access program studies that have been conducted by multiple governments worldwide for decades. In these programs, various phenomena within the realms of quantum physics and parapsychology have been studied, confirmed, tested and used in the field. We’re talking about telepathy, remote viewing and much more. There are even classified documents pertaining to human beings with special abilities, who are able to alter physical matter using their minds, as well as peer-reviewed research; here’s one example. There is also the health connection, the placebo effect, and the mind-body connection, which further suggest that consciousness and physical material reality are intertwined. When it comes to parapsychology, the science behind it is stronger than the science used to approve some of our medications… (source)

This is precisely why the American Institutes for Research concluded:

The statistical results of the studies examined are far beyond what is expected by chance. Arguments that these results could be due to methodological flaws in the experiments are soundly refuted. Effects of similar magnitude to those found in government-sponsored research at SRI and SAIC have been replicated at a number of laboratories across the world. Such consistency cannot be readily explained by claims of flaws or fraud.

It was not possible to formulate the laws of quantum mechanics in a fully consistent way without reference to consciousness. – Eugene Wigner, theoretical physicist and mathematician

There is also distant healing, and there are studies showing what human attention can do not just to a piece of matter, but to another human body. If you want to learn more about this kind of thing, a great place to start is The Institute of Noetic Sciences. I recently wrote about a study that found healing energy could be stored and used to treat cancer cells; you can read more about that here.

There is no doubt about it: consciousness does have an effect on our physical material world. What type of effect is not as well understood, but we know there is one, and it shouldn’t really be called into question, especially in an article published in Scientific American.

At the end of the nineteenth century, physicists discovered empirical phenomena that could not be explained by classical physics. This led to the development, during the 1920s and early 1930s, of a revolutionary new branch of physics called quantum mechanics (QM). QM has questioned the material foundations of the world by showing that atoms and subatomic particles are not really solid objects—they do not exist with certainty at definite spatial locations and definite times. Most importantly, QM explicitly introduced the mind into its basic conceptual structure since it was found that particles being observed and the observer—the physicist and the method used for observation—are linked. According to one interpretation of QM, this phenomenon implies that the consciousness of the observer is vital to the existence of the physical events being observed, and that mental events can affect the physical world. The results of recent experiments support this interpretation. These results suggest that the physical world is no longer the primary or sole component of reality, and that it cannot be fully understood without making reference to the mind. – Dr. Gary Schwartz, professor of psychology, medicine, neurology, psychiatry, and surgery at the University of Arizona

How Does This Apply To Our Lives & Our World In General?

This kind of thing has moved beyond simple interpretation. We also have examples from the black budget, like the STARGATE program, and real-world examples showing that what is discovered at the quantum scale is indeed important and relevant, and does apply in many cases to larger scales. People with ‘special abilities,’ as mentioned above, are one example, and technology that utilizes quantum physics is another, like the ones this ex-Lockheed executive describes, or this one. This type of stuff moved out of the theoretical realm a long time ago, yet again, it’s not really acknowledged. Another example would be over-unity energy, which utilizes the non-physical properties of physical matter. You can read more and find out more information about that machine here and here.

What we have today is scientific dogma.

The modern scientific worldview is predominantly predicated on assumptions that are closely associated with classical physics. Materialism—the idea that matter is the only reality—is one of these assumptions. A related assumption is reductionism, the notion that complex things can be understood by reducing them to the interactions of their parts, or to simpler or more fundamental things such as tiny material particles. During the 19th century, these assumptions narrowed, turned into dogmas, and coalesced into an ideological belief system that came to be known as “scientific materialism.” This belief system implies that the mind is nothing but the physical activity of the brain and that our thoughts cannot have any effect on our brains and bodies, our actions, and the physical world. – Lisa Miller, Ph.D., Columbia University.

So, why is this not acknowledged or established, that consciousness clearly has an effect on our physical material reality? Because, simply, we’re going against belief systems here. This and other such discoveries bring into play and confirm a metaphysical reality, one that has been ridiculed by many, especially authoritarian figures, for years.

When something questions our collectively established beliefs, no matter how repeatable the results, it will always be greeted with false claims and harsh reactions. We’re simply going through that transition now.

Despite the unrivalled empirical success of quantum theory, the very suggestion that it may be literally true as a description of nature is still greeted with cynicism, incomprehension and even anger. (T. Folger, “Quantum Shmantum”; Discover 22:37-43, 2001)

Today, it’s best to keep an open mind, as new findings are destroying what we previously thought to be true. The next step for science is taking a spiritual leap, because that’s what quantum physics is showing us, and it’s clearly far from a mere interpretation. Our thoughts, feelings, emotions, perceptions and more all influence physical reality. This is why it’s so important to focus on our own state of being, feeling good, and in the simplest form, just being a nice person.

The very fact that these findings carry metaphysical and spiritual implications is exactly what prompts some to instantly throw the ‘pseudoscience’ label at them. Instead of examining and addressing the evidence, the skeptic uses ridicule to debunk something they do not believe in, rather like what mainstream media tends to do quite a bit these days.

So why is this significant? Well, it’s significant because planet Earth is made up of a huge collection of billions of minds. If consciousness does have an effect on our physical material reality, that means in some sense, we are all co-creating our human experience here. We are responsible for the human experience and for what happens on the planet, because we are all, collectively, creating it.

That doesn’t mean that if we all collectively have a thought, it will manifest into existence right away. It simply means mind influences matter in various ways that we don’t quite understand yet. If everybody thought the Earth was flat, would it actually be flat? These are the questions we are approaching as we move forward.

When our perception of reality changes, our reality begins to change. When we become aware of something, when we observe what is going on, and when we have paradigm-shifting revelations, these mental shifts bring about a physical change in our human experience. Even in our own individual lives, our emotional state, physical state, state of well-being and the way we perceive the reality around us can influence what type of human experience we create for ourselves. The experience can also change depending on how you look at it.

Change the way you look at things, and the things you look at will change.

There is a very spiritual message that comes from quantum physics, and it’s not really an interpretation.

“Broadly speaking, although there are some differences, I think Buddhist philosophy and Quantum Mechanics can shake hands on their view of the world. We can see in these great examples the fruits of human thinking. Regardless of the admiration we feel for these great thinkers, we should not lose sight of the fact that they were human beings just as we are.”

– The Dalai Lama (source)

As many of you reading this will know, 99.99 percent of an atom is empty space. But we’ve recently discovered that it’s actually not empty at all; it is full of energy. This is not debatable or trivial, and we can now effectively use and harness that energy and turn it into electrical energy, which is exactly what is discussed in this article; if you go through the whole thing, it presents sufficient evidence. Read carefully.

Change Starts Within

When I think about this stuff, it really hits home that change truly does start within, that we as human beings are co-creators, and that together we can change this world any time we choose to do so. Metaphysics and spirituality represent the next scientific revolution, and it all boils down to humanity, as a collective and as individuals, finding our inner peace, losing our buttons so they can’t be pushed, and overall just being good people.

Science can only take us so far; intuition, gut feelings, emotions and more will all be used to decipher truth more accurately in the future.

Today, we’ve lost our connection to spirituality. That connection was replaced long ago with belief systems handed to us in the form of religion, to the point where society is extremely divided over ‘what is.’ If we all believe something different, constantly arguing and conflicting instead of coming together and focusing on what we have in common to create a better world, then we have a problem… especially if you consider that we are all collectively co-creating.

We’ve been programmed to see the world differently from the way it actually is. We are living an illusion, and quantum physics is one of many areas that can snap us out of that illusion, if it is not restricted and its conclusions are not labelled as mere interpretations.

I fail to see how the spirituality emerging from quantum physics is a mere interpretation, and I see this framing as a tactic used by the elite simply to keep us in the same old world paradigm. I believe this is done deliberately.

Once we wake up and realize the power of human consciousness, we will be much more cautious with our thoughts, much more focused on growing ourselves spiritually, and we will realize that greed, ego, fear and separation are completely useless and unnecessary.

Service to others is key, and it’s important that our planet and our ‘leadership’ here be focused solely on serving all of humanity. Right now, that’s not the case, but we are in the midst of a great change, one that has been taking place over a number of years, though on a cosmic scale it’s happening in an instant. It’s interesting because the world is waking up to the illusions that have guided our actions, to the brainwashing, and to the false information. Our collective consciousness is shifting, and we are creating a new human experience.

We interviewed Franco DeNicola about what is happening with the shift in consciousness. It turned out to be some of the deepest and most important information we’ve ever drawn out in an interview.

By Arjun Walia
From Collective Evolution

Posted by Sc13t4 in Consciousness, Mathematics, Space/Time, Theoretical Physics, 0 comments
A New Test for the Leading Big Bang Theory


Cosmologists have predicted the existence of an oscillating signal that could distinguish between cosmic inflation and alternative theories of the universe’s birth.

The leading hypothesis about the universe’s birth — that a quantum speck of space became energized and inflated in a split second, creating a baby cosmos — solves many puzzles and fits all observations to date. Yet this “cosmic inflation” hypothesis lacks definitive proof. Telltale ripples that should have formed in the inflating spatial fabric, known as primordial gravitational waves, haven’t been detected in the geometry of the universe by the world’s most sensitive telescopes. Their absence has fueled underdog theories of cosmogenesis in recent years. And yet cosmic inflation is wriggly. In many variants of the idea, the sought-after ripples would simply be too weak to observe.

“The question is whether one can test the entire [inflation] scenario, not just specific models,” said Avi Loeb, an astrophysicist and cosmologist at Harvard University. “If there is no guillotine that can kill off some theories, then what’s the point?”

[Note from SCIET Dynamics- The “Big Bang” observations can be accounted for differently. Rather than a speck of matter, the origin was an intrusion of consciousness that expressed an energy from the center to the edge, which then diffused throughout the area defined by the radius of the expression. The SCIET algorithm requires specific conditions to do this, but these conditions existed and continue to exist. This concept is not about inflation, but about consolidation within a limited space defined by the original expression. It is also necessary to dispel the idea that all the matter in the universe existed before the Big Bang, and instead embrace the idea that matter is created after the expression through the resonance of nonmaterial points with one another. This resonance continues at the heart of all matter today.]

In a new paper that appeared on the physics preprint site arxiv.org on Sunday, Loeb and two Harvard colleagues, Xingang Chen and Zhong-Zhi Xianyu, suggested such a guillotine. The researchers predicted an oscillatory pattern in the distribution of matter throughout the cosmos that, if detected, could distinguish between inflation and alternative scenarios — particularly the hypothesis that the Big Bang was actually a bounce preceded by a long period of contraction.

The paper has yet to be peer-reviewed, but Will Kinney, an inflationary cosmologist at the University at Buffalo and a visiting professor at Stockholm University, said “the analysis seems correct to me.” He called the proposal “a very elegant idea.”

“If the signal is real and observable, it would be very interesting,” Sean Carroll of the California Institute of Technology said in an email.

Any potential hints about the Big Bang are worth looking for, but the main question, according to experts, is whether the putative oscillatory pattern will be strong enough to detect. It might not be as clear-cut a guillotine as advertised.

If it does exist, the signal would appear in density variations across the universe. Imagine taking a giant ice cream scoop to the sky and counting how many galaxies wind up inside. Do this many times all over the cosmos, and you’ll find that the number of scooped-up galaxies will vary above or below some average. Now increase the size of your scoop. When scooping larger volumes of universe, you might find that the number of captured galaxies now varies more extremely than before. As you use progressively larger scoops, according to Chen, Loeb and Xianyu’s calculations, the amplitude of matter density variations should oscillate between more and less extreme as you move up the scales. “What we showed,” Loeb explained, is that from the form of these oscillations, “you can tell if the universe was expanding or contracting when the density perturbations were produced” — reflecting an inflationary or bounce cosmology, respectively.
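The scoop analogy describes what cosmologists call counts-in-cells statistics. Here is a toy sketch of the idea, with uniformly random “galaxies” standing in for a real survey catalog; the positions, box size and cell counts are invented for illustration only:

```python
import random
from collections import Counter

def counts_in_cells_variance(positions, box=1.0, cells_per_side=4):
    """Variance of galaxy counts when the box is divided into cubic cells ("scoops")."""
    size = box / cells_per_side
    counts = Counter()
    for x, y, z in positions:
        counts[(int(x // size), int(y // size), int(z // size))] += 1
    n_cells = cells_per_side ** 3
    all_counts = [counts.get((i, j, k), 0)
                  for i in range(cells_per_side)
                  for j in range(cells_per_side)
                  for k in range(cells_per_side)]
    mean = sum(all_counts) / n_cells
    return sum((c - mean) ** 2 for c in all_counts) / n_cells

random.seed(0)
galaxies = [(random.random(), random.random(), random.random()) for _ in range(5000)]

# Repeating this over a range of scoop sizes traces the variance as a function of
# scale; the predicted signal is an oscillation superposed on that curve.
var_small_scoops = counts_in_cells_variance(galaxies, cells_per_side=8)
var_large_scoops = counts_in_cells_variance(galaxies, cells_per_side=2)
```

For a purely random catalog like this one the count variance simply tracks the mean count per cell; a real survey's departure from that baseline, measured scoop size by scoop size, is what carries the cosmological information.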

Regardless of which theory of cosmogenesis is correct, cosmologists believe that the density variations observed throughout the cosmos today were almost certainly seeded by random ripples in quantum fields that existed long ago.

Because of quantum uncertainty, any quantum field that filled the primordial universe would have fluctuated with ripples of all different wavelengths. Periodically, waves of a certain wavelength would have constructively interfered, forming peaks — or equivalently, concentrations of particles. These concentrations later grew into the matter density variations seen on different scales in the cosmos today.
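The constructive-interference picture above can be sketched numerically: superpose ripples of several wavelengths with random phases and look for the points where they happen to align. The mode amplitudes and wavelengths below are arbitrary illustrative choices:

```python
import math
import random

def field_value(x, modes):
    """Superpose ripples of different wavelengths and random phases at position x."""
    return sum(amp * math.cos(2 * math.pi * x / wavelength + phase)
               for amp, wavelength, phase in modes)

random.seed(1)
modes = [(1.0, w, random.uniform(0, 2 * math.pi)) for w in (0.10, 0.13, 0.17, 0.23, 0.31)]

# Sample the combined field along a line; the highest peaks mark the places where
# several ripples happen to be in phase ("concentrations of particles").
samples = [field_value(i / 1000, modes) for i in range(1000)]
peak = max(samples)
```

The peaks move or vanish as modes are added or removed, which is the qualitative point: which concentrations get frozen in depends on which wavelengths still fit at a given moment.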

But what caused the peaks at a particular wavelength to get frozen into the universe when they did? According to the new paper, the timing depended on whether the peaks formed while the universe was exponentially expanding, as in inflation models, or while it was slowly contracting, as in bounce models.

If the universe contracted in the lead-up to a bounce, ripples in the quantum fields would have been squeezed. At some point the observable universe would have contracted to a size smaller than ripples of a certain wavelength, like a violin whose resonant cavity is too small to produce the sounds of a cello. When the too-large ripples disappeared, whatever peaks, or concentrations of particles, existed at that scale at that moment would have been “frozen” into the universe. As the observable universe shrank further, ripples at progressively smaller and smaller scales would have vanished, freezing in as density variations. Ripples of some sizes might have been constructively interfering at the critical moment, producing peak density variations on that scale, whereas slightly shorter ripples that disappeared a moment later might have frozen out of phase. These are the oscillations between high and low density variations that Chen, Loeb and Xianyu argue should theoretically show up as you change the size of your galaxy ice cream scoop.

These oscillations would also arise if instead the universe experienced a period of rapid inflation. In that case, as it grew bigger and bigger, it would have been able to fit quantum ripples with ever larger wavelengths. Density variations would have been imprinted on the universe at each scale at the moment that ripples of that size were able to form.

The authors argue that a qualitative difference between the forms of oscillations in the two scenarios will reveal which one occurred. In both cases, it was as if the quantum field put tick marks on a piece of tape as it rushed past — representing the expanding or contracting universe. If space were expanding exponentially, as in inflation, the tick marks imprinted on the universe by the field would have grown farther and farther apart. If the universe contracted, the tick marks should have become closer and closer together as a function of scale. Thus Chen, Loeb and Xianyu argue that the changing separation between the peaks in density variations as a function of scale should reveal the universe’s evolutionary history. “We can finally see whether the primordial universe was actually expanding or contracting, and whether it did it inflationarily fast or extremely slowly,” Chen said.


Video: David Kaplan explores the leading cosmological explanation for the origin of the universe.

Filming by Petr Stepanek. Editing and motion graphics by MK12. Music by Pete Calandra and Scott P. Schreer.

Exactly what the oscillatory signal might look like, and how strong it might be, depend on the unknown nature of the quantum fields that might have created it. Discovering such a signal would tell us about those primordial cosmic ingredients. As for whether the putative signal will show up at all in future galaxy surveys, “the good news,” according to Kinney, is that the signal is probably “much, much easier to detect” than other searched-for signals called “non-gaussianities”: triangles and other geometric arrangements of matter in the sky that would also verify and reveal details of inflation. The bad news, though, “is that the strength and the form of the signal depend on a lot of things you don’t know,” Kinney said, such as constants whose values might be zero, and it’s entirely possible that “there will be no detectable signal.”

Posted by Sc13t4 in Astrophysics, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
A Short History of the Missing Universe


The cosmos plays hide-and-seek. Sometimes, though, even when astronomers have a hunch for where their prey might hide, it can take them decades of searching to confirm it. The case of the universe’s missing matter — a case that appears to now be closed, as I reported earlier this month — is one such instance. To me, it is a fascinating tale in which clever cosmological models drew a treasure map that took 20 years to explore.

The concept of matter in SCIET Dynamics is related to the formatting of space at the time of the FIRST ACTION, the moment when a massive burst of energy was distributed throughout space. In fact, this burst defined SPACE, and its definition was made of the energy of the original burst. Matter was created from this, and the remaining energy is the missing matter. SPACETIME has Mass.

Scientists knew back in the 1980s that they could observe only a fraction of the atomic matter — or baryons — in the universe. (Today we know that all baryons taken together are thought to make up about 5 percent of the universe — the rest is dark energy and dark matter.) They knew that if they counted up all the stuff they could see in the universe — stars and galaxies, for the most part — the bulk of the baryons would be missing.

But exactly how much missing matter there was, and where it might be hiding, were questions that started to sharpen in the 1990s. Around that time, astronomer David Tytler of the University of California, San Diego, came up with a way to measure the amount of deuterium in the light of distant quasars — the bright cores of galaxies with active black holes at their center — using the new spectrograph at the Keck telescope in Hawaii. Tytler’s data helped researchers understand just how many baryons were missing in today’s universe once all the visible stars and gas were accounted for: a whopping 90 percent.

These results set off a firestorm of controversy, fanned in part by Tytler’s personality. “He [insisted] he was right in spite of, at the time, a lot of seemingly contradictory evidence, and basically said everyone else was a bunch of idiots who didn’t know what they were doing,” said Romeel Dave, an astronomer at the University of Edinburgh. “Turns out, of course, he was right.”

Then in 1998, Jeremiah Ostriker and Renyue Cen, Princeton University astrophysicists, released a seminal cosmological model that tracked the history of the universe from its beginnings. The model suggested that the missing baryons were likely wafting about in the form of diffuse (and at the time undetectable) gas between galaxies.

As it happens, Dave could have been the first to tell the world where the baryons were, beating Ostriker and Cen. Months before their paper came out, Dave had finished his own set of cosmological simulations, which were part of his Ph.D. work at the University of California, Santa Cruz. His thesis on the distribution of baryons suggested that they might be lurking in the warm plasma between galaxies. “I didn’t really appreciate the result for what it was,” said Dave. “Oh well, win some, lose some.”

Dave continued to work on the problem in the years to follow. He envisioned the missing matter as hiding in ghostly threads of extremely hot and very diffuse gas that connect galaxy pairs. In astro-speak, this became the “warm-hot intergalactic medium,” or WHIM, a term that Dave coined.

Many astronomers continued to suspect that there might be some very faint stars in the outskirts of galaxies that could account for a significant chunk of the missing matter. But after many decades of searching, the number of baryons in stars, even the faintest ones that could be seen, amounted to no more than 20 percent.

More and more sophisticated instruments came online. In 2003, the Wilkinson Microwave Anisotropy Probe measured the universe’s baryon density as it stood some 380,000 years after the Big Bang. It turned out to be the same density as indicated by the cosmological models. A decade later, the Planck satellite confirmed the number.

With the eventual failure to find hidden stars and galaxies that might be holding the missing matter, “attention turned toward gas in between the galaxies — the intergalactic medium distributed over billions of light years of low-density intergalactic space,” said Michael Shull, an astrophysicist at University of Colorado, Boulder. He and his team began searching for the WHIM by studying its effects on the light from distant quasars. Atoms of hydrogen, helium and heavier elements such as oxygen absorb the ultraviolet and X-ray radiation from these quasar lighthouses. The gas “steals a portion of light from the beam,” said Shull, leaving a deficit of light — an absorption line. Find the lines, and you’ll find the gas.

The most prominent absorption lines of hydrogen and ionized oxygen are at very short wavelengths, in the ultraviolet and X-ray portions of the spectrum. Unfortunately for astronomers (but fortunately for the rest of life on Earth), our atmosphere blocks these rays. In part to solve the missing matter problem, astronomers launched X-ray satellites to map this light. With the absorption line method, Shull said, scientists eventually “accounted for most, if not all, of the predicted baryons that were cooked up in the hot Big Bang.”
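The “stolen light” description is, in essence, exponential attenuation: transmitted flux falls as exp(−τ), where the optical depth τ is the column density of absorbers along the line of sight times their absorption cross-section at the line’s wavelength. A minimal sketch with invented toy numbers:

```python
import math

def transmitted_flux(incident_flux, column_density, cross_section):
    """Beer-Lambert attenuation: intervening gas dims a beam by exp(-tau),
    with tau = column density (absorbers per cm^2) times cross-section (cm^2)."""
    tau = column_density * cross_section
    return incident_flux * math.exp(-tau)

# Toy numbers: a weak line removes a fraction of a percent of the quasar's light
weak_line = transmitted_flux(1.0, 1e15, 1e-18)       # tau = 0.001
# A much larger effective cross-section produces a saturated, nearly black line
saturated_line = transmitted_flux(1.0, 1e15, 1e-14)  # tau = 10
```

Measuring how much light is missing at each line therefore translates directly into how much gas sits between us and the quasar, which is how the WHIM searches proceed.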

Other teams took different approaches, looking for the missing baryons indirectly. As my story from last week shows, three teams, including Shull’s, are now saying that all the baryons are accounted for.

But the WHIM is so faint, and the matter so diffuse, that it’s hard to definitively close the case. “Over the years, there have been many exchanges among researchers arguing for or against possible detections of the warm-hot intergalactic medium,” said Kenneth Sembach, director of the Space Telescope Science Institute in Baltimore. “I suspect there will be many more. The recent papers appear to be another piece in this complex and interesting cosmic puzzle. I’m sure there will be more pieces to come, and associated debates about how best to fit these pieces together.”

Posted by Sc13t4 in Astrophysics, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
The Puzzle of the First Black Holes



  • In the very distant, ancient universe, astronomers can see quasars—extremely bright objects powered by enormous black holes. Yet it is unclear how black holes this large could have formed so quickly after the big bang.
  • To solve the mystery, scientists proposed a novel mechanism for black hole formation. Rather than being born in the deaths of massive stars, the seeds of the most ancient supermassive black holes might have collapsed directly from gas clouds.
  • Astronomers may be able to find evidence for direct-collapse black holes using the James Webb Space Telescope, due to launch in 2019, which should see farther back in space and time than any instrument before it.

[SCIET Dynamics Note] SCIET regards Black Holes as openings to the original Void, revealed by the energy of vortex motion from the spinning disk of matter. Space is “sticky”; it adheres to itself because it consists of layers of energetic interactions between equidistant polarized regions, which exist at all units of distance. At the same time, all of these units descend from the original first action (the “big bang”), meaning that they are restrained from changing faster than the space around them, or faster than the original first action (the first change).
Image Credit: Mark Ross. As illustrated and understood today, the idea that the black hole is the source of gravity that attracts all the matter around it may be mistaken. In SCIET Dynamics it is viewed as a “portal”: the black hole is actually an opening in the fabric of space, created by the mass swirling around it, that is the basis of all the physical effects associated with it. Could we tell the difference? If it is a vortex of matter, then it would indeed create a “hole,” just as a whirlpool or tornado creates a hole, and the power it generates is concentrated in the matter at the edge of the hole.

By Priyamvada Natarajan on February 1, 2018 from Scientific American

Imagine the universe in its infancy. Most scientists think space and time originated with the big bang. From that hot and dense start the cosmos expanded and cooled, but it took a while for stars and galaxies to start dotting the sky. It was not until about 380,000 years after the big bang that atoms could hold together and fill the universe with mostly hydrogen gas. When the cosmos was a few hundred million years old, this gas coalesced into the earliest stars, which formed in clusters that clumped together into galaxies, the oldest of which appears 400 million years after the universe was born. To their surprise, scientists have found that another class of astronomical objects begins to appear at this point, too: quasars.

Quasars are extremely bright objects powered by gas falling onto supermassive black holes. They are some of the most luminous things in the universe, visible out to the farthest reaches of space. The most distant quasars are also the most ancient, and the oldest among them pose a mystery.

To be visible at such incredible distances, these quasars must be fueled by black holes containing about a billion times the mass of the sun. Yet conventional theories of black hole formation and growth suggest that a black hole big enough to power these quasars could not have formed in less than a billion years. In 2001, however, with the Sloan Digital Sky Survey, astronomers began finding quasars that dated back earlier. The oldest and most distant quasar known, which was reported last December, existed just 690 million years after the big bang. In other words, it does not seem that there had been enough time in the history of the universe for quasars like this one to form.

Many astronomers think that the first black holes—seed black holes—are the remnants of the first stars, corpses left behind after the stars exploded into supernovae. Yet these stellar remnants should contain no more than a few hundred solar masses. It is difficult to imagine a scenario in which the black holes powering the first quasars grew from seeds this small.

To solve this quandary, a decade ago some colleagues and I proposed a way that seed black holes massive enough to explain the first quasars could have formed without the birth and death of stars. Instead these black hole seeds would have formed directly from gas. We call them direct-collapse black holes (DCBHs). In the right environments, direct-collapse black holes could have been born at 10^4 or 10^5 solar masses within a few hundred million years after the big bang. With this head start, they could have easily grown to 10^9 or 10^10 solar masses, thereby producing the ancient quasars that have puzzled astronomers for nearly two decades.

The question is whether this scenario actually happened. Luckily, when the James Webb Space Telescope (JWST) launches in 2019, we should be able to find out.


Black holes are enigmatic astronomical objects, areas where the gravity is so immense that it has warped spacetime so that not even light can escape. It was not until the detection of quasars, which allow astronomers to see the light emitted by matter falling into black holes, that we had evidence that they were real objects and not just mathematical curiosities predicted by Einstein’s general theory of relativity.

Most black holes are thought to form when very massive stars—those with more than about 10 times the mass of the sun—exhaust their nuclear fuel and begin to cool and therefore contract. Eventually gravity wins, and the star collapses, igniting a cataclysmic supernova explosion and leaving behind a black hole. Astronomers have traditionally assumed that most of the black holes powering the first quasars formed this way, too. They could have been born from the demise of the universe’s first stars (Population III stars), which we think formed when primordial gas cooled and fragmented about 200 million years after the big bang. Population III stars were probably more massive than stars born in the later universe, which means they could have left behind black holes as hefty as several hundred solar masses. These stars also probably formed in dense clusters, so it is likely that the black holes created on their deaths would have merged, giving rise to black holes of several thousand solar masses. Even black holes this large, however, are far smaller than the masses needed to power the ancient quasars.

Theories also suggest that so-called primordial black holes could have arisen even earlier in cosmic history, when spacetime may have been expanding exponentially in a process called inflation. Primordial black holes could have coalesced from tiny fluctuations in the density of the universe and then grown as the universe expanded. Yet these seeds would weigh only between 10 and 100 solar masses, presenting the same problem as Population III remnants.

As an explanation for the first quasars, each of these pathways for the formation of black hole seeds has the same problem: the seeds would have to grow extraordinarily quickly within the first billion years of cosmic history to create the earliest quasars. And what we know about the growth of black holes tells us that this scenario is highly unlikely.

The SCIET approach is much simpler. The original Black Holes are portals that are now “receivers” for newer Black Holes, and the matter spewing out of them is simply being “portaled” there from the newer ones. The concept of portals in SCIET Dynamics requires that an opening in the fabric of space cannot accept matter at a different frequency, since the rule that like interacts with like forces it to interact with a matching frequency regardless of physical proximity. Thus the event horizon of a black hole matches the event horizon of another black hole, and older ones exist at a slightly lower frequency.


Our current understanding of physics suggests that there is an optimal feeding rate, known as the Eddington rate, at which black holes gain mass most efficiently. A black hole feeding at the Eddington rate would grow exponentially, doubling in mass every 10^7 years or so. To grow to 10^9 solar masses, a black hole seed of 10 solar masses would have to gobble stars and gas unimpeded at the Eddington rate for a billion years. It is hard to explain how an entire population of black holes could continuously feed so efficiently.
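To see why the timing is so tight, it helps to put rough numbers on this growth. The sketch below assumes pure exponential, Eddington-limited growth; the two doubling times are order-of-magnitude illustrations (the shorter is the "10^7 years or so" quoted above, the longer is closer to the commonly used Salpeter timescale):

```python
import math

M_SEED = 10.0    # stellar-remnant seed, in solar masses
M_QUASAR = 1e9   # mass needed to power the brightest ancient quasars

# Number of mass doublings needed to grow the seed to quasar scale.
doublings = math.log2(M_QUASAR / M_SEED)  # about 26.6

# Total growth time at two plausible Eddington-limited doubling times.
for t_double in (1e7, 3e7):  # years; order-of-magnitude assumptions
    t_total = doublings * t_double
    print(f"doubling every {t_double:.0e} yr -> {t_total:.1e} yr total")
```

With the longer doubling time the total comes out near a billion years, which is essentially the entire age of the universe at the epoch of the oldest known quasars; there is no slack in the schedule.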

In effect, if the first quasars grew from Population III black hole seeds, they would have had to eat faster than the Eddington rate. Surpassing that rate is theoretically possible under special circumstances in dense, gas-rich environments, and these conditions may have been available in the early universe, but they would not have been common, and they would have been short-lived. Furthermore, exceptionally fast growth can actually cause “choking,” where the radiation emitted during these super-Eddington episodes could disrupt and even stop the flow of mass onto the black hole, halting its growth. Given these restrictions, it seems that extreme feasting could account for a few freak quasars, but it cannot explain the existence of the entire detected population unless our current understanding of the Eddington rate and black hole feeding process is wrong.

Thus, we must wonder whether the first black hole seeds could have formed through other channels. Building on the work of several other research groups, my collaborator Giuseppe Lodato and I published a set of papers in 2006 and 2007 in which we proposed a novel mechanism that could have produced more massive black hole seeds from the get-go. We started with large, pristine gas disks that might otherwise have cooled and fragmented to give rise to stars and become galaxies. We showed that it is possible for these disks to circumvent this conventional process and instead collapse into dense clumps that form seed black holes weighing 10^4 to 10^6 solar masses. This outcome can occur if something interferes with the normal cooling process that leads to star formation and instead drives the entire disk to become unstable, rapidly funneling matter to the center, much like water flowing down a bathtub drain when you pull the plug.

Disks cool down more efficiently if their gas includes some molecular hydrogen—two hydrogen atoms bonded together—rather than atomic hydrogen, which consists of only one atom. But if radiation from stars in a neighboring galaxy strikes the disk, it can destroy molecular hydrogen and turn it into atomic hydrogen, which suppresses cooling, keeping the gas too hot to form stars. Without stars, this massive irradiated disk could become dynamically unstable, and matter would quickly drain into its center, rapidly driving the production of a massive, direct-collapse black hole. Because this scenario depends on the presence of nearby stars, we expect DCBHs to typically form in satellite galaxies that orbit around larger parent galaxies where Population III stars have already formed.

Simulations of gas flows on large scales, as well as the physics of small-scale processes, support this model for DCBH formation. Thus, the idea of very large initial seeds appears feasible in the early universe. And starting with seeds in this range alleviates the timing problem for the production of the supermassive black holes that power the brightest, most distant quasars.


But just because DCBH seeds are feasible does not mean they actually exist. To find out, we must search for observational evidence. These objects would appear as bright, miniature quasars shining through the early universe. They should be detectable during a special phase when the seed merges with the parent galaxy—and this process should be common, given that DCBHs probably form in satellites orbiting larger galaxies. A merger would give the black hole seed a copious new source of gas to eat, so the black hole should start growing rapidly. In fact, it would briefly turn into a special kind of quasar that outshines all the stars in the galaxy.

Credit: Amanda Montañez

These black holes will not only be brighter than their surrounding stars, they will also be heavier—a reversal of the usual order of things. In general, the stars in a galaxy outweigh the central black holes by about a factor of 1,000. After the galaxy hosting the DCBH merges with its parent galaxy, however, the mass of the growing black hole will briefly exceed that of the stars. Such an object, called an obese black hole galaxy (OBG), should have a very special spectral signature, particularly in the infrared wavelengths between one and 30 microns where the JWST’s Mid-Infrared Instrument (MIRI) and Near-Infrared Camera (NIRCam) will operate. This telescope will be the most powerful tool astronomers have ever had for peering into the earliest stages of cosmic history. If the telescope detects these obese black hole galaxies, it will provide strong evidence for our DCBH theory. Traditional black hole seeds, on the other hand, which derive from dead stars, are likely to be too faint for the JWST or other telescopes to see.

It is also possible that we might find other evidence for our theory. In the rare case that the parent galaxy that merges with the DCBH also hosts a central black hole, the two holes will collide and release powerful gravitational waves. These waves could be detectable by the Laser Interferometer Space Antenna (LISA), a European Space Agency/NASA mission expected to fly in the 2030s.


It is entirely possible that the DCBH scenario and small seeds feeding at super-Eddington rates both occurred in the early universe. In fact, the initial black hole seeds probably formed via both these pathways. The question is, Which channel created the bulk of the bright ancient quasars that astronomers see? Solving this mystery could do more than just clear up the timeline of the early cosmos. Astronomers also want to understand more broadly how supermassive black holes affect the larger galaxies around them.

Data suggest that central black holes might play an important role in adjusting how many stars form in the galaxies they inhabit. For one thing, the energy produced when matter falls into the black hole may heat up the surrounding gas at the center of the galaxy, thus preventing cooling and halting star formation. This energy may even have far-reaching effects outside the galactic center by driving energetic jets of radiation outward. These jets, which astronomers can detect in radio wavelengths, could also heat up gas in outer regions and shut down star formation there. These effects are complex, however, and astronomers want to understand the details more clearly. Finding the first seed black holes could help reveal how the relation between black holes and their host galaxies evolved over time.

These insights fit into a larger revolution in our ability to study and understand all masses of black holes. When the Laser Interferometer Gravitational-Wave Observatory (LIGO) made the first detection of gravitational waves in 2015, for instance, scientists were able to trace them back to two colliding black holes weighing 36 and 29 solar masses, the lightweight cousins of the supermassive black holes that power quasars. The project continues to detect waves from similar events, offering new and incredible details about what happens when these black holes crash and warp the spacetime around them. Meanwhile a project called the Event Horizon Telescope aims to use radio observatories scattered around Earth to image the supermassive black hole at the center of the Milky Way. Scientists hope to spot a ringlike shadow around the black hole’s boundary that general relativity predicts will occur as the hole’s strong gravity deflects light. Any deviations the Event Horizon Telescope measures from the predictions of general relativity have the potential to challenge our understanding of black hole physics. In addition, experiments looking at pulsing stars called pulsar timing arrays could also detect tremors in spacetime caused by an accumulated signal of many collisions of black holes. And very soon the JWST will open up an entirely new window on the very first black holes to light up the universe.
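The scale of that first LIGO detection is easy to appreciate with a back-of-the-envelope calculation. The 36 and 29 solar-mass figures come from the paragraph above; the roughly 62 solar-mass remnant is from the published LIGO discovery announcement, not this article, so treat the exact numbers as approximate:

```python
# GW150914: black holes of about 36 and 29 solar masses merged into a
# remnant of roughly 62 solar masses (remnant figure from the LIGO
# announcement). The missing mass was radiated as gravitational waves.
M_SUN = 1.989e30  # kg
C = 2.998e8       # speed of light, m/s

m_radiated = (36 + 29 - 62) * M_SUN  # ~3 solar masses lost to the waves
energy = m_radiated * C ** 2         # E = mc^2, in joules
print(f"{energy:.2e} J")
```

Roughly three solar masses converted directly into ripples in spacetime, in a fraction of a second.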

Many revelations are in store in the very near future, and our understanding of black holes stands to be transformed.

This article was originally published with the title “The First Monster Black Holes”



New Observational Constraints on the Growth of the First Supermassive Black Holes. E. Treister, K. Schawinski, M. Volonteri and P. Natarajan in Astrophysical Journal, Vol. 778, No. 2, Article No. 130; December 1, 2013.

Seeds to Monsters: Tracing the Growth of Black Holes in the Universe. Priyamvada Natarajan in General Relativity and Gravitation, Vol. 46, No. 5, Article No. 1702; May 2014.

Mapping the Heavens: The Radical Scientific Ideas That Reveal the Cosmos. Priyamvada Natarajan. Yale University Press, 2016.

Unveiling the First Black Holes with JWST: Multi-wavelength Spectral Predictions. Priyamvada Natarajan et al. in Astrophysical Journal, Vol. 838, No. 2, Article No. 117; April 1, 2017.

Posted by Sc13t4 in Astrophysics, Consciousness, Cosmology, Space/Time, The Void, Theoretical Physics, 0 comments
Scientists Find Fractal Patterns and Golden Ratio Pulses in Stars


A recent article in Scientific American reported the discovery of fractal patterns and the golden ratio in outer space for the very first time. Researchers from the University of Hawaiʻi at Mānoa have been studying a specific kind of star, called RR Lyrae variables, using the Kepler Space Telescope. Unlike normal stars, they expand and contract, causing their brightness to change dramatically and, in so doing, creating pulsations.

But the pulsations aren’t random or arbitrary. They follow the golden mean. We have seen the golden ratio turn up in nature all the time, but this is the first time it has been identified in space.

“Unlike our Sun, RR Lyrae stars shrink and swell, causing their temperatures and brightness to rhythmically change like the frequencies or notes in a song,” explained Dr. Lindner, the lead researcher. It’s the ratio between this swelling and shrinking that is so important.

They have been studying the pulsations of these stars, and several of them pulsate at frequencies whose ratio is nearly identical to the golden ratio. These specific stars are called “Golden RR Lyrae Variables.”

“We call these stars ‘golden’ because the ratio of two of their frequency components is near the golden mean, which is an irrational number famous in art, architecture, and mathematics,” Dr Lindner said.

The Golden Mean

The Golden Mean, or Ratio (1.61803398875…), is a pattern that is essential to the understanding of nature, as it’s found in everything from sunflowers to succulents to seashells, and is commonly referred to in the study of Sacred Geometry.
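For the curious, the number itself is easy to compute. The snippet below uses the closed form and checks it against the classic Fibonacci-ratio limit, which is one reason the same number keeps surfacing in spiral growth patterns:

```python
import math

# Closed form for the golden ratio.
phi = (1 + math.sqrt(5)) / 2
print(phi)  # 1.618033988749895

# Ratios of consecutive Fibonacci numbers converge to phi.
a, b = 1, 1
for _ in range(40):
    a, b = b, a + b
print(b / a)  # agrees with phi to machine precision after 40 steps
```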

The Golden Ratio was essential to Da Vinci’s Vitruvian Man, can be found in the Pyramids of Egypt and the Parthenon, and several researchers believe they have correlated it with the structure of the human genome and the codes in our DNA.

The Golden Ratio, or Divine Proportion, when plotted numerically, creates a sequence that gives rise to what we can see as a fractal pattern. Metaphysicians and modern physicists have, for the last 15 years, been avidly suggesting that the study of fractal patterns can lead us to a greater understanding of the Universe, and of a Unified Field within it that may well play a role in structuring the Universe.

“The golden stars are actually the first examples outside of a laboratory of what’s called ‘strange nonchaotic dynamics.’ The ‘strange’ here refers to a fractal pattern, and nonchaotic means the pattern is orderly rather than random. Most fractal patterns in nature, such as weather, are chaotic, so this aspect of the variable stars came as a surprise,” reported an article in Scientific American.

These RR Lyrae variable stars are, at their youngest, over 10 billion years old, and their brightness can vary by 200 percent over half a day. This makes them a bit challenging to study from Earth because of our day-night cycle. It’s the variation itself that causes this mathematical phenomenon.

Plato theorized that the Universe as a whole is simply a resonance of the “Music or Harmony of the Spheres.” This new study may provide deeper insight into pairing the Philosophies & Spiritual Sciences offered throughout the ages with modern Astronomy, and into how we may understand the underlying elegance of nature as a whole.

While some of these stars pulsate with a single frequency, observations confirm that others pulsate with multiple frequencies.

“Just as flamboyant rock stars deliver pulsating rhythmic beats under their song melodies, so, too, do these variable stars,” said Dr Lindner.

Posted by Sc13t4 in Astrophysics, Cosmology, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
Beyond Quantum Physics: The Next Giant Leap for Science is Approaching Fast


The quote below is a great example that lets the reader know one thing: new information and evidence that challenge long-held beliefs about our world are always met with harsh criticism. Remember when we found out that the Earth wasn’t flat? Human history shows the same pattern, especially if we look at the history of science.

“Despite the unrivalled empirical success of quantum theory, the very suggestion that it may be literally true as a description of nature is still greeted with cynicism, incomprehension and even anger.”
(T. Folger, “Quantum Shmantum”; Discover 22:37-43, 2001)

Take, for example, prominent physicist Lord Kelvin, who stated in the year 1900 that, “There is nothing new to be discovered in physics now. All that remains is more and more precise measurement.” 

It wasn’t long after this statement when Einstein published his paper on special relativity. Einstein’s theories challenged the accepted framework of knowledge at the time, and forced the scientific community to open up to an alternate view of reality.

It serves as a great example of how concepts that are taken to be absolute truth are susceptible to change.

Today, something special in science is happening. It’s the recognition that what we perceive to be our physical material world is not the only world, and non-material factors like consciousness, for example, may play a vital role in the make-up of our physical material world.

In the scientific community, it’s referred to as non-material science.

Other areas of study in this field include telepathy, clairvoyance, ESP, and more. These are topics that have been studied within black budget programs and at the highest levels of government for decades, yet at the same time ridiculed by mainstream science, despite extremely significant statistical results.

I definitely resonate with the words below, found in this document. Intelligence agencies have a long history of keeping tabs on what goes on with this stuff. It’s what inspired the title of this article, because quantum physics leaks into this type of phenomenon, and a quantum perspective is what’s needed to understand it.

This area is usually referred to as “psi” phenomena, or parapsychological phenomena.

It’s interesting because as far back as 1999, statistics professor Jessica Utts at UC Irvine published a paper showing that parapsychological experiments have produced much stronger results than those showing that a daily dose of aspirin helps to prevent heart attacks. Utts also showed that these results are much stronger than the research behind various drugs such as antiplatelets.

This is precisely why Nikola Tesla told the world that,

“The day science begins to study non-physical phenomena, it will make more progress in one decade than in all the previous centuries of its existence”

Hundreds of scientists are gathering to emphasize this, and are not really getting the attention they deserve. All of our academia and real-world applications come from material science. This is great, but it’s time to take the next leap. How can we continue to ignore facts and results simply because they defy the belief systems of so many people?

A group of internationally recognized scientists have come together to stress the fact that matter (protons, electrons, photons, anything that has a mass) is not the only reality. We wish to understand the nature of our reality, but how can we do so if we are continually examining only physical systems? What about the role of non-physical systems such as consciousness, or their interaction with physical systems (matter)?

Expanding Reality, A Ground Breaking Trilogy Film Series

You can purchase the film here.

“Expanding Reality is about the emerging postmaterialist paradigm and the next great scientific revolution. Why is it important? Because this paradigm has far-reaching implications. For instance, it re-enchants the world and profoundly alters the vision we have of ourselves, giving us back our dignity and power as human beings. The postmaterialist paradigm also fosters positive values such as compassion, respect, care, love, and peace, because it makes us realize that the boundaries between self and others are permeable. In doing so, this paradigm promotes an awareness of the deep interconnection between ourselves and Nature at large. In that sense, the model of reality associated with the postmaterialist paradigm may help humanity to create a sustainable civilization and to blossom.” – Mario Beauregard, PhD, from the University of Arizona

These people have exhausted their own resources in order to make Expanding Reality for the world, show your support by purchasing the movie HERE. You won’t be disappointed.

Important Points

The following points were co-authored by Dr. Gary Schwartz, professor of psychology, medicine, neurology, psychiatry, and surgery at the University of Arizona; Mario Beauregard, PhD, from the University of Arizona; and Lisa Miller, PhD, from Columbia University. They were presented at an international summit on post-materialist science, spirituality, and society.

The Summary Report of the International Summit on Post-Materialist Science, Spirituality and Society can be downloaded here: International Summit on Post-Materialist Science: Summary Report (PDF).

“Get over it, and accept the inarguable conclusion. The universe is immaterial-mental and spiritual.” (“The Mental Universe”; Nature 436:29, 2005)

Posted by Sc13t4 in Atomic, Design, Mathematics, Space/Time, Theoretical Physics, 0 comments
Physicists Want to Rebuild Quantum Theory from Scratch


SCIENTISTS HAVE BEEN using quantum theory for almost a century now, but embarrassingly they still don’t know what it means. An informal poll taken at a 2011 conference on Quantum Physics and the Nature of Reality showed that there’s still no consensus on what quantum theory says about reality—the participants remained deeply divided about how the theory should be interpreted.

Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Some physicists just shrug and say we have to live with the fact that quantum mechanics is weird. So particles can be in two places at once, or communicate instantaneously over vast distances? Get over it. After all, the theory works fine. If you want to calculate what experiments will reveal about subatomic particles, atoms, molecules and light, then quantum mechanics succeeds brilliantly.

But some researchers want to dig deeper. They want to know why quantum mechanics has the form it does, and they are engaged in an ambitious program to find out. It is called quantum reconstruction, and it amounts to trying to rebuild the theory from scratch based on a few simple principles.

If these efforts succeed, it’s possible that all the apparent oddness and confusion of quantum mechanics will melt away, and we will finally grasp what the theory has been trying to tell us. “For me, the ultimate goal is to prove that quantum theory is the only theory where our imperfect experiences allow us to build an ideal picture of the world,” said Giulio Chiribella, a theoretical physicist at the University of Hong Kong.

There’s no guarantee of success—no assurance that quantum mechanics really does have something plain and simple at its heart, rather than the abstruse collection of mathematical concepts used today. But even if quantum reconstruction efforts don’t pan out, they might point the way to an equally tantalizing goal: getting beyond quantum mechanics itself to a still deeper theory. “I think it might help us move towards a theory of quantum gravity,” said Lucien Hardy, a theoretical physicist at the Perimeter Institute for Theoretical Physics in Waterloo, Canada.

The Flimsy Foundations of Quantum Mechanics

The basic premise of the quantum reconstruction game is summed up by the joke about the driver who, lost in rural Ireland, asks a passer-by how to get to Dublin. “I wouldn’t start from here,” comes the reply.

Where, in quantum mechanics, is “here”? The theory arose out of attempts to understand how atoms and molecules interact with light and other radiation, phenomena that classical physics couldn’t explain. Quantum theory was empirically motivated, and its rules were simply ones that seemed to fit what was observed. It uses mathematical formulas that, while tried and trusted, were essentially pulled out of a hat by the pioneers of the theory in the early 20th century.

Take Erwin Schrödinger’s equation for calculating the probabilistic properties of quantum particles. The particle is described by a “wave function” that encodes all we can know about it. It’s basically a wavelike mathematical expression, reflecting the well-known fact that quantum particles can sometimes seem to behave like waves. Want to know the probability that the particle will be observed in a particular place? Just calculate the square of the wave function (or, to be exact, a slightly more complicated mathematical term), and from that you can deduce how likely you are to detect the particle there. The probability of measuring some of its other observable properties can be found by, crudely speaking, applying a mathematical function called an operator to the wave function.
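That "square of the wave function" recipe, the Born rule, can be sketched in a few lines. The amplitudes below are arbitrary illustrative numbers (not from any real system), chosen so the probabilities sum to 1:

```python
# A toy wave function sampled at three positions, as complex amplitudes.
# The numbers are illustrative only; they are chosen to be normalized.
psi = [0.6 + 0.0j, 0.0 + 0.8j, 0.0 + 0.0j]

# Born rule: the probability of detecting the particle at each position
# is the squared magnitude of the amplitude there.
probs = [abs(a) ** 2 for a in psi]

print([round(p, 12) for p in probs])  # [0.36, 0.64, 0.0]
print(round(sum(probs), 12))          # 1.0
```

Note that the phases of the complex amplitudes vanish in the probabilities; they only matter when amplitudes are added before squaring, which is what produces interference.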

I think quantum theory as we know it will not stand. Alexei Grinbaum

But this so-called rule for calculating probabilities was really just an intuitive guess by the German physicist Max Born. So was Schrödinger’s equation itself. Neither was supported by rigorous derivation. Quantum mechanics seems largely built of arbitrary rules like this, some of them—such as the mathematical properties of operators that correspond to observable properties of the system—rather arcane. It’s a complex framework, but it’s also an ad hoc patchwork, lacking any obvious physical interpretation or justification.

Compare this with the ground rules, or axioms, of Einstein’s theory of special relativity, which was as revolutionary in its way as quantum mechanics. (Einstein launched them both, rather miraculously, in 1905.) Before Einstein, there was an untidy collection of equations to describe how light behaves from the point of view of a moving observer. Einstein dispelled the mathematical fog with two simple and intuitive principles: that the speed of light is constant, and that the laws of physics are the same for two observers moving at constant speed relative to one another. Grant these basic principles, and the rest of the theory follows. Not only are the axioms simple, but we can see at once what they mean in physical terms.

What are the analogous statements for quantum mechanics? The eminent physicist John Wheeler once asserted that if we really understood the central point of quantum theory, we would be able to state it in one simple sentence that anyone could understand. If such a statement exists, some quantum reconstructionists suspect that we’ll find it only by rebuilding quantum theory from scratch: by tearing up the work of Bohr, Heisenberg and Schrödinger and starting again.

Quantum Roulette

One of the first efforts at quantum reconstruction was made in 2001 by Hardy, then at the University of Oxford. He ignored everything that we typically associate with quantum mechanics, such as quantum jumps, wave-particle duality and uncertainty. Instead, Hardy focused on probability: specifically, the probabilities that relate the possible states of a system with the chance of observing each state in a measurement. Hardy found that these bare bones were enough to get all that familiar quantum stuff back again.

Lucien Hardy, a physicist at the Perimeter Institute, was one of the first to derive the rules of quantum mechanics from simple principles.


In quantum mechanics, however, a particle can exist not just in distinct states, like the heads and tails of a coin, but in a so-called superposition—roughly speaking, a combination of those states. In other words, a quantum bit, or qubit, can be not just in the binary state of 0 or 1, but in a superposition of the two.

But if you make a measurement of that qubit, you’ll only ever get a result of 1 or 0. That is the mystery of quantum mechanics, often referred to as the collapse of the wave function: Measurements elicit only one of the possible outcomes. To put it another way, a quantum object commonly has more options for measurements encoded in the wave function than can be seen in practice.
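That collapse behavior can be simulated in a few lines. The equal-superposition amplitudes below are a hypothetical choice for illustration; any normalized pair would work the same way:

```python
import random

# A qubit in an equal superposition alpha|0> + beta|1>.
# These amplitudes are illustrative; any normalized pair would do.
alpha = 1 / 2 ** 0.5
beta = 1 / 2 ** 0.5

p0 = abs(alpha) ** 2  # Born-rule probability of reading out 0

def measure():
    """Each measurement yields a definite 0 or 1, never the superposition."""
    return 0 if random.random() < p0 else 1

random.seed(0)  # fixed seed so the tally is reproducible
counts = {0: 0, 1: 0}
for _ in range(10_000):
    counts[measure()] += 1
print(counts)  # close to an even 5000/5000 split
```

A single run of `measure()` tells you almost nothing about the underlying amplitudes; only the statistics over many identically prepared qubits reveal the superposition, which is exactly the gap between the wave function and what any one measurement can show.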

Hardy’s rules governing possible states and their relationship to measurement outcomes acknowledged this property of quantum bits. In essence the rules were (probabilistic) ones about how systems can carry information and how they can be combined and interconverted.

Hardy then showed that the simplest possible theory to describe such systems is quantum mechanics, with all its characteristic phenomena such as wavelike interference and entanglement, in which the properties of different objects become interdependent. “Hardy’s 2001 paper was the ‘Yes, we can!’ moment of the reconstruction program,” Chiribella said. “It told us that in some way or another we can get to a reconstruction of quantum theory.”

More specifically, it implied that the core trait of quantum theory is that it is inherently probabilistic. “Quantum theory can be seen as a generalized probability theory, an abstract thing that can be studied detached from its application to physics,” Chiribella said. This approach doesn’t address any underlying physics at all, but just considers how outputs are related to inputs: what we can measure given how a state is prepared (a so-called operational perspective). “What the physical system is is not specified and plays no role in the results,” Chiribella said. These generalized probability theories are “pure syntax,” he added — they relate states and measurements, just as linguistic syntax relates categories of words, without regard to what the words mean. In other words, Chiribella explained, generalized probability theories “are the syntax of physical theories, once we strip them of the semantics.”

The general idea for all approaches in quantum reconstruction, then, is to start by listing the probabilities that a user of the theory assigns to each of the possible outcomes of all the measurements the user can perform on a system. That list is the “state of the system.” The only other ingredients are the ways in which states can be transformed into one another, and the probability of the outputs given certain inputs. This operational approach to reconstruction “doesn’t assume space-time or causality or anything, only a distinction between these two types of data,” said Alexei Grinbaum, a philosopher of physics at the CEA Saclay in France.

To distinguish quantum theory from a generalized probability theory, you need specific kinds of constraints on the probabilities and possible outcomes of measurement. But those constraints aren’t unique. So lots of possible theories of probability look quantum-like. How then do you pick out the right one?

“We can look for probabilistic theories that are similar to quantum theory but differ in specific aspects,” said Matthias Kleinmann, a theoretical physicist at the University of the Basque Country in Bilbao, Spain. If you can then find postulates that select quantum mechanics specifically, he explained, you can “drop or weaken some of them and work out mathematically what other theories appear as solutions.” Such exploration of what lies beyond quantum mechanics is not just academic doodling, for it’s possible—indeed, likely—that quantum mechanics is itself just an approximation of a deeper theory. That theory might emerge, as quantum theory did from classical physics, from violations in quantum theory that appear if we push it hard enough.

Bits and Pieces

Some researchers suspect that ultimately the axioms of a quantum reconstruction will be about information: what can and can’t be done with it. One such derivation of quantum theory based on axioms about information was proposed in 2010 by Chiribella, then working at the Perimeter Institute, and his collaborators Giacomo Mauro D’Ariano and Paolo Perinotti of the University of Pavia in Italy. “Loosely speaking,” explained Jacques Pienaar, a theoretical physicist at the University of Vienna, “their principles state that information should be localized in space and time, that systems should be able to encode information about each other, and that every process should in principle be reversible, so that information is conserved.” (In irreversible processes, by contrast, information is typically lost—just as it is when you erase a file on your hard drive.)

What’s more, said Pienaar, these axioms can all be explained using ordinary language. “They all pertain directly to the elements of human experience, namely, what real experimenters ought to be able to do with the systems in their laboratories,” he said. “And they all seem quite reasonable, so that it is easy to accept their truth.” Chiribella and his colleagues showed that a system governed by these rules shows all the familiar quantum behaviors, such as superposition and entanglement.

Giulio Chiribella, a physicist at the University of Hong Kong, reconstructed quantum theory from ideas in information theory.

One challenge is to decide what should be designated an axiom and what physicists should try to derive from the axioms. Take the quantum no-cloning rule, which is another of the principles that naturally arises from Chiribella’s reconstruction. One of the deep findings of modern quantum theory, this principle states that it is impossible to make a duplicate of an arbitrary, unknown quantum state.

It sounds like a technicality (albeit a highly inconvenient one for scientists and mathematicians seeking to design quantum computers). But in an effort in 2002 to derive quantum mechanics from rules about what is permitted with quantum information, Jeffrey Bub of the University of Maryland and his colleagues Rob Clifton of the University of Pittsburgh and Hans Halvorson of Princeton University made no-cloning one of three fundamental axioms. One of the others was a straightforward consequence of special relativity: You can’t transmit information between two objects more quickly than the speed of light by making a measurement on one of the objects. The third axiom was harder to state, but it also crops up as a constraint on quantum information technology. In essence, it limits how securely a bit of information can be exchanged without being tampered with: The rule is a prohibition on what is called “unconditionally secure bit commitment.”

These axioms seem to relate to the practicalities of managing quantum information. But if we consider them instead to be fundamental, and if we additionally assume that the algebra of quantum theory has a property called non-commutation, meaning that the order in which you do calculations matters (in contrast to the multiplication of two numbers, which can be done in any order), Clifton, Bub and Halvorson have shown that these rules too give rise to superposition, entanglement, uncertainty, nonlocality and so on: the core phenomena of quantum theory.
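Non-commutation is easy to see with small matrices. The sketch below is my own illustration (the Pauli matrices here stand in for quantum observables; it is not part of the Clifton-Bub-Halvorson derivation):

```python
# Non-commutation with 2x2 matrices: for the Pauli matrices X and Z,
# the product X*Z differs from Z*X, unlike ordinary multiplication
# of two numbers, which can be done in any order.
def matmul(m, n):
    """Multiply two 2x2 matrices given as nested lists."""
    return [[sum(m[i][k] * n[k][j] for k in range(2)) for j in range(2)]
            for i in range(2)]

X = [[0, 1], [1, 0]]   # Pauli X (bit flip)
Z = [[1, 0], [0, -1]]  # Pauli Z (sign flip)

print(matmul(X, Z))  # [[0, -1], [1, 0]]
print(matmul(Z, X))  # [[0, 1], [-1, 0]] -- the order matters: XZ = -ZX
```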

Another information-focused reconstruction was suggested in 2009 by Borivoje Dakić and Časlav Brukner, physicists at the University of Vienna. They proposed three “reasonable axioms” having to do with information capacity: that the most elementary component of all systems can carry no more than one bit of information, that the state of a composite system made up of subsystems is completely determined by measurements on its subsystems, and that you can convert any “pure” state to another and back again (like flipping a coin between heads and tails).

Dakić and Brukner showed that these assumptions lead inevitably to classical and quantum-style probability, and to no other kinds. What’s more, if you modify axiom three to say that states get converted continuously—little by little, rather than in one big jump—you get only quantum theory, not classical. (Yes, it really is that way round, contrary to what the “quantum jump” idea would have you expect—you can interconvert states of quantum spins by rotating their orientation smoothly, but you can’t gradually convert a classical heads to a tails.) “If we don’t have continuity, then we don’t have quantum theory,” Grinbaum said.
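The coin-versus-spin contrast can be sketched in a few lines of Python (my own illustration, not Dakić and Brukner's formalism): a qubit can be carried from one pure state to another through a continuum of intermediate states, each itself a legal pure state.

```python
import math

# Continuous reversibility: steer a qubit from |0> to |1> through a
# continuum of valid pure states, something a classical coin cannot do
# (there is nothing "between" heads and tails).
def rotated_state(theta: float) -> tuple:
    """Pure qubit state (cos(theta/2), sin(theta/2)) on the path from |0> to |1>."""
    return (math.cos(theta / 2), math.sin(theta / 2))

# Sweep the rotation angle from 0 to pi in small steps.
steps = 10
for k in range(steps + 1):
    a, b = rotated_state(math.pi * k / steps)
    # Every intermediate state is normalized: a^2 + b^2 = 1.
    assert abs(a * a + b * b - 1.0) < 1e-12

print(rotated_state(0))        # (1.0, 0.0): the state |0>
print(rotated_state(math.pi))  # ~(0.0, 1.0): the state |1>
```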

Quantum physicist Christopher Fuchs, photographed inside the Integrated Sciences building at UMass Boston. Photo by Katherine Taylor for Quanta Magazine.

A further approach in the spirit of quantum reconstruction is called quantum Bayesianism, or QBism. Devised by Carlton Caves, Christopher Fuchs and Rüdiger Schack in the early 2000s, it takes the provocative position that the mathematical machinery of quantum mechanics has nothing to do with the way the world really is; rather, it is just the appropriate framework that lets us develop expectations and beliefs about the outcomes of our interventions. It takes its cue from the Bayesian approach to classical probability developed in the 18th century, in which probabilities stem from personal beliefs rather than observed frequencies. In QBism, quantum probabilities calculated by the Born rule don’t tell us what we’ll measure, but only what we should rationally expect to measure.

In this view, the world isn’t bound by rules—or at least, not by quantum rules. Indeed, there may be no fundamental laws governing the way particles interact; instead, laws emerge at the scale of our observations. This possibility was considered by John Wheeler, who dubbed the scenario Law Without Law. It would mean that “quantum theory is merely a tool to make comprehensible a lawless slicing-up of nature,” said Adán Cabello, a physicist at the University of Seville. Can we derive quantum theory from these premises alone?

“At first sight, it seems impossible,” Cabello admitted—the ingredients seem far too thin, not to mention arbitrary and alien to the usual assumptions of science. “But what if we manage to do it?” he asked. “Shouldn’t this shock anyone who thinks of quantum theory as an expression of properties of nature?”

Making Space for Gravity

In Hardy’s view, quantum reconstructions have been almost too successful, in one sense: Various sets of axioms all give rise to the basic structure of quantum mechanics. “We have these different sets of axioms, but when you look at them, you can see the connections between them,” he said. “They all seem reasonably good and are in a formal sense equivalent because they all give you quantum theory.” And that’s not quite what he’d hoped for. “When I started on this, what I wanted to see was two or so obvious, compelling axioms that would give you quantum theory and which no one would argue with.”

So how do we choose between the options available? “My suspicion now is that there is still a deeper level to go to in understanding quantum theory,” Hardy said. And he hopes that this deeper level will point beyond quantum theory, to the elusive goal of a quantum theory of gravity. “That’s the next step,” he said. Several researchers working on reconstructions now hope that its axiomatic approach will help us see how to pose quantum theory in a way that forges a connection with the modern theory of gravitation—Einstein’s general relativity.

Look at the Schrödinger equation and you will find no clues about how to take that step. But quantum reconstructions with an “informational” flavor speak about how information-carrying systems can affect one another, a framework of causation that hints at a link to the space-time picture of general relativity. Causation imposes chronological ordering: An effect can’t precede its cause. But Hardy suspects that the axioms we need to build quantum theory will be ones that embrace a lack of definite causal structure—no unique time-ordering of events—which he says is what we should expect when quantum theory is combined with general relativity. “I’d like to see axioms that are as causally neutral as possible, because they’d be better candidates as axioms that come from quantum gravity,” he said.

Hardy first suggested that quantum-gravitational systems might show indefinite causal structure in 2007. And in fact only quantum mechanics can display that. While working on quantum reconstructions, Chiribella was inspired to propose an experiment to create causal superpositions of quantum systems, in which there is no definite series of cause-and-effect events. This experiment has now been carried out by Philip Walther’s lab at the University of Vienna—and it might incidentally point to a way of making quantum computing more efficient.

“I find this a striking illustration of the usefulness of the reconstruction approach,” Chiribella said. “Capturing quantum theory with axioms is not just an intellectual exercise. We want the axioms to do something useful for us—to help us reason about quantum theory, invent new communication protocols and new algorithms for quantum computers, and to be a guide for the formulation of new physics.”

But can quantum reconstructions also help us understand the “meaning” of quantum mechanics? Hardy doubts that these efforts can resolve arguments about interpretation—whether we need many worlds or just one, for example. After all, precisely because the reconstructionist program is inherently “operational,” meaning that it focuses on the “user experience”—probabilities about what we measure—it may never speak about the “underlying reality” that creates those probabilities.

“When I went into this approach, I hoped it would help to resolve these interpretational problems,” Hardy admitted. “But I would say it hasn’t.” Cabello agrees. “One can argue that previous reconstructions failed to make quantum theory less puzzling or to explain where quantum theory comes from,” he said. “All of them seem to miss the mark for an ultimate understanding of the theory.” But he remains optimistic: “I still think that the right approach will dissolve the problems and we will understand the theory.”

Maybe, Hardy said, these challenges stem from the fact that the more fundamental description of reality is rooted in that still undiscovered theory of quantum gravity. “Perhaps when we finally get our hands on quantum gravity, the interpretation will suggest itself,” he said. “Or it might be worse!”


Right now, quantum reconstruction has few adherents—which pleases Hardy, as it means that it’s still a relatively tranquil field. But if it makes serious inroads into quantum gravity, that will surely change. In the 2011 poll, about a quarter of the respondents felt that quantum reconstructions will lead to a new, deeper theory. A one-in-four chance certainly seems worth a shot.

Grinbaum thinks that the task of building the whole of quantum theory from scratch with a handful of axioms may ultimately be unsuccessful. “I’m now very pessimistic about complete reconstructions,” he said. But, he suggested, why not try to do it piece by piece instead—to just reconstruct particular aspects, such as nonlocality or causality? “Why would one try to reconstruct the entire edifice of quantum theory if we know that it’s made of different bricks?” he asked. “Reconstruct the bricks first. Maybe remove some and look at what kind of new theory may emerge.”

“I think quantum theory as we know it will not stand,” Grinbaum said. “Which of its feet of clay will break first is what reconstructions are trying to explore.” He thinks that, as this daunting task proceeds, some of the most vexing and vague issues in standard quantum theory—such as the process of measurement and the role of the observer—will disappear, and we’ll see that the real challenges are elsewhere. “What is needed is new mathematics that will render these notions scientific,” he said. Then, perhaps, we’ll understand what we’ve been arguing about for so long.


Original story reprinted with permission from Quanta Magazine, an editorially independent publication of the Simons Foundation whose mission is to enhance public understanding of science by covering research developments and trends in mathematics and the physical and life sciences.

Posted by Sc13t4 in Atomic, Design, Mathematics, Space/Time, Theoretical Physics
Proof Claimed for Deep Connection between Prime Numbers

If true, a solution to the “abc” conjecture about whole numbers would be “one of the most astounding achievements of mathematics of the 21st century”

The usually quiet world of mathematics is abuzz with a claim that one of the most important problems in number theory has been solved.

Mathematician Shinichi Mochizuki of Kyoto University in Japan has released a 500-page proof of the abc conjecture, which proposes a relationship between whole numbers — a ‘Diophantine’ problem.

The abc conjecture, proposed independently by David Masser and Joseph Oesterlé in 1985, might not be as familiar to the wider world as Fermat’s Last Theorem, but in some ways it is more significant. “The abc conjecture, if proved true, at one stroke solves many famous Diophantine problems, including Fermat’s Last Theorem,” says Dorian Goldfeld, a mathematician at Columbia University in New York. “If Mochizuki’s proof is correct, it will be one of the most astounding achievements of mathematics of the twenty-first century.”

By Philip Ball, from Nature magazine. Credit: Flickr/Center for Image in Science and Art _ UL

Like Fermat’s theorem, the abc conjecture refers to equations of the form a+b=c. It involves the concept of a square-free number: one that cannot be divided by the square of any whole number greater than 1. Fifteen and 17 are square-free numbers, but 16 and 18 — being divisible by 4^2 and 3^2, respectively — are not.

The ‘square-free’ part of a number n, sqp(n), is the largest square-free number that can be formed by multiplying the factors of n that are prime numbers. For instance, sqp(18)=2×3=6.

If you’ve got that, then you should get the abc conjecture. It concerns a property of the product of the three integers a×b×c, or abc — or more specifically, of the square-free part of this product, which involves their distinct prime factors. It states that for integers with a+b=c, the ratio sqp(abc)^r/c always has some minimum value greater than zero for any value of r greater than 1. For example, if a=3 and b=125, so that c=128, then sqp(abc)=30 and sqp(abc)^2/c = 900/128. In this case, in which r=2, sqp(abc)^r/c is nearly always greater than 1, and always greater than zero.
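As a concrete check on these numbers, here is a short Python sketch (my own illustration, not part of the article) that computes sqp(n) by trial division and reproduces the example above:

```python
# sqp(n): the "square-free part" (radical) of n, i.e. the product of
# the distinct prime factors of n, each taken once.
def sqp(n: int) -> int:
    """Return the product of the distinct prime factors of n."""
    result, p = 1, 2
    while p * p <= n:
        if n % p == 0:
            result *= p          # count this prime once
            while n % p == 0:
                n //= p          # strip all its repetitions
        p += 1
    if n > 1:                    # leftover factor is itself prime
        result *= n
    return result

print(sqp(18))                   # 2 * 3 = 6, matching the article
a, b = 3, 125
c = a + b                        # 128
print(sqp(a * b * c))            # 30, since abc = 48000 = 2^7 * 3 * 5^3
print(sqp(a * b * c) ** 2 / c)   # 900/128, which is greater than 1
```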

Deep connection
It turns out that this conjecture encapsulates many other Diophantine problems, including Fermat’s Last Theorem (which states that a^n + b^n = c^n has no integer solutions if n>2). Like many Diophantine problems, it is all about the relationships between prime numbers. According to Brian Conrad of Stanford University in California, “it encodes a deep connection between the prime factors of a, b and a+b”.

Many mathematicians have expended a great deal of effort trying to prove the conjecture. In 2007, French mathematician Lucien Szpiro, whose work in 1978 led to the abc conjecture in the first place, claimed to have a proof of it, but it was soon found to be flawed.

Like Szpiro, and also like British mathematician Andrew Wiles, who proved Fermat’s Last Theorem in 1994, Mochizuki has attacked the problem using the theory of elliptic curves — the smooth curves generated by algebraic relationships of the sort y^2 = x^3 + ax + b.

There, however, the relationship of Mochizuki’s work to previous efforts stops. He has developed techniques that very few other mathematicians fully understand and that invoke new mathematical ‘objects’ — abstract entities analogous to more familiar examples such as geometric objects, sets, permutations, topologies and matrices. “At this point, he is probably the only one that knows it all,” says Goldfeld.

Conrad says that the work “uses a huge number of insights that are going to take a long time to be digested by the community”. The proof is spread across four long papers, each of which rests on earlier long papers. “It can require a huge investment of time to understand a long and sophisticated proof, so the willingness by others to do this rests not only on the importance of the announcement but also on the track record of the authors,” Conrad explains.

Mochizuki’s track record certainly makes the effort worthwhile. “He has proved extremely deep theorems in the past, and is very thorough in his writing, so that provides a lot of confidence,” says Conrad. And he adds that the pay-off would be more than a matter of simply verifying the claim. “The exciting aspect is not just that the conjecture may have now been solved, but that the techniques and insights he must have had to introduce should be very powerful tools for solving future problems in number theory.”

This article is reproduced with permission from the magazine Nature. The article was first published on September 10, 2012.

Posted by Sc13t4 in Mathematics, Space/Time, Theoretical Physics
What is SpaceTime?

Physicists believe that at the tiniest scales, space emerges from quanta.
What might these building blocks look like?

People have always taken space for granted. It is just emptiness, after all—a backdrop to everything else. Time, likewise, simply ticks on incessantly. But if physicists have learned anything from the long slog to unify their theories, it is that space and time form a system of such staggering complexity that it may defy our most ardent efforts to understand.

Albert Einstein saw what was coming as early as November 1916. A year earlier he had formulated his general theory of relativity, which postulates that gravity is not a force that propagates through space but a feature of spacetime itself. When you throw a ball high into the air, it arcs back to the ground because Earth distorts the spacetime around it, so that the paths of the ball and the ground intersect again. In a letter to a friend, Einstein contemplated the challenge of merging general relativity with his other brainchild, the nascent theory of quantum mechanics. That would not merely distort space but dismantle it. Mathematically, he hardly knew where to begin. “How much have I already plagued myself in this way!” he wrote.

Einstein never got very far. Even today there are almost as many contending ideas for a quantum theory of gravity as scientists working on the topic. The disputes obscure an important truth: the competing approaches all say space is derived from something deeper—an idea that breaks with 2,500 years of scientific and philosophical understanding.

[SCIET Dynamics’ Note] This article is posted here because it beautifully presents some core issues in the controversy over competing descriptions of reality at the scale of very small changes in space. We need to find a General Theory of Spacetime.

SCIET Dynamics seeks to unite the components of SpaceTime into an interdependent set that grows in complexity as it develops. It views the Void (Awareness), Space, Matter and Consciousness as sequences of creation built one upon the other.

The Void, called “Awareness” in SD, exists as a sea of extremely small and fast fluctuations, which gives rise to a burst of energy labeled the “First Action.” The First Action converts the burst into ever smaller increments, or “points of Awareness,” which have the effect of “formatting” the area defined by the original burst of energy. The “formatting” is the byproduct of a self-measuring algorithm that reduces uniformly within the original radius of the burst.

When the increments reach the size of the original center point, they begin to interact, or resonate, with that value. The resonance gives rise to a new quality that allows information about the change created by movement to bounce off the center point and be stored in the area around the “point of Awareness,” a phenomenon responsible for the formation of the spheres that surround every “point of Awareness.” In this view, protons, neutrons and electrons are all created by this effect, and the same effect produces spherical forms in space at every size.


A kitchen magnet neatly demonstrates the problem that physicists face. It can grip a paper clip against the gravity of the entire Earth. Gravity is weaker than magnetism or than electric or nuclear forces. Whatever quantum effects it has are weaker still. The only tangible evidence that these processes occur at all is the mottled pattern of matter in the very early universe—thought to be caused, in part, by quantum fluctuations of the gravitational field.

Black holes are the best test case for quantum gravity. “It’s the closest thing we have to experiments,” says Ted Jacobson of the University of Maryland, College Park. He and other theorists study black holes as theoretical fulcrums. What happens when you take equations that work perfectly well under laboratory conditions and extrapolate them to the most extreme conceivable situation? Will some subtle flaw manifest itself?

General relativity predicts that matter falling into a black hole becomes compressed without limit as it approaches the center—a mathematical cul-de-sac called a singularity. Theorists cannot extrapolate the trajectory of an object beyond the singularity; its time line ends there. Even to speak of “there” is problematic because the very spacetime that would define the location of the singularity ceases to exist. Researchers hope that quantum theory could focus a microscope on that point and track what becomes of the material that falls in.

Out at the boundary of the hole, matter is not so compressed, gravity is weaker and, by all rights, the known laws of physics should still hold. Thus, it is all the more perplexing that they do not. The black hole is demarcated by an event horizon, a point of no return: matter that falls in cannot get back out. The descent is irreversible. That is a problem because all known laws of fundamental physics, including those of quantum mechanics as generally understood, are reversible. At least in principle, you should be able to reverse the motion of all the particles and recover what you had.

A very similar conundrum confronted physicists in the late 1800s, when they contemplated the mathematics of a “black body,” idealized as a cavity full of electromagnetic radiation. James Clerk Maxwell’s theory of electromagnetism predicted that such an object would absorb all the radiation that impinges on it and that it could never come to equilibrium with surrounding matter. “It would absorb an infinite amount of heat from a reservoir maintained at a fixed temperature,” explains Rafael Sorkin of the Perimeter Institute for Theoretical Physics in Ontario. In thermal terms, it would effectively have a temperature of absolute zero. This conclusion contradicted observations of real-life black bodies (such as an oven). Following up on work by Max Planck, Einstein showed that a black body can reach thermal equilibrium if radiative energy comes in discrete units, or quanta.

Theoretical physicists have been trying for nearly half a century to achieve an equivalent resolution for black holes. The late Stephen Hawking of the University of Cambridge took a huge step in the mid-1970s, when he applied quantum theory to the radiation field around black holes and showed they have a nonzero temperature. As such, they can not only absorb but also emit energy. Although his analysis brought black holes within the fold of thermodynamics, it deepened the problem of irreversibility. The outgoing radiation emerges from just outside the boundary of the hole and carries no information about the interior. It is random heat energy. If you reversed the process and fed the energy back in, the stuff that had fallen in would not pop out; you would just get more heat. And you cannot imagine that the original stuff is still there, merely trapped inside the hole, because as the hole emits radiation, it shrinks and, according to Hawking’s analysis, ultimately disappears.

This problem is called the information paradox because the black hole destroys the information about the infalling particles that would let you rewind their motion. If black hole physics really is reversible, something must carry information back out, and our conception of spacetime may need to change to allow for that.


Heat is the random motion of microscopic parts, such as the molecules of a gas. Because black holes can warm up and cool down, it stands to reason that they have parts—or, more generally, a microscopic structure. And because a black hole is just empty space (according to general relativity, infalling matter passes through the horizon but cannot linger), the parts of the black hole must be the parts of space itself. As plain as an expanse of empty space may look, it has enormous latent complexity.

Even theories that set out to preserve a conventional notion of spacetime end up concluding that something lurks behind the featureless facade. For instance, in the late 1970s Steven Weinberg, now at the University of Texas at Austin, sought to describe gravity in much the same way as the other forces of nature. He still found that spacetime is radically modified on its finest scales.

Physicists initially visualized microscopic space as a mosaic of little chunks of space. If you zoomed in to the Planck scale, an almost inconceivably small size of 10^-35 meter, they thought you would see something like a chessboard. But that cannot be quite right. For one thing, the grid lines of a chessboard space would privilege some directions over others, creating asymmetries that contradict the special theory of relativity. For example, light of different colors might travel at different speeds—just as in a glass prism, which refracts light into its constituent colors. Whereas effects on small scales are usually hard to see, violations of relativity would actually be fairly obvious.

In SCIET Dynamics, the “atoms” of spacetime are conceived as quantum-scale fluctuations that leave tetrahedral tracks as they appear and disappear. The tracks are related to the event horizons of black holes because they bound the area between the void and space. In this sense, the tiny “tetrons” are an artifact of the creation of space.

The thermodynamics of black holes casts further doubt on picturing space as a simple mosaic. By measuring the thermal behavior of any system, you can count its parts, at least in principle. Dump in energy and watch the thermometer. If it shoots up, that energy must be spread out over comparatively few molecules. In effect, you are measuring the entropy of the system, which represents its microscopic complexity.

If you go through this exercise for an ordinary substance, the number of molecules increases with the volume of material. That is as it should be: If you increase the radius of a beach ball by a factor of 10, you will have 1,000 times as many molecules inside it. But if you increase the radius of a black hole by a factor of 10, the inferred number of molecules goes up by only a factor of 100. The number of “molecules” that it is made up of must be proportional not to its volume but to its surface area. The black hole may look three-dimensional, but it behaves as if it were two-dimensional.
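The scaling argument is simple arithmetic, which the following Python sketch (illustration only) verifies: scale the radius by 10 and compare how a volume-law count and an area-law count of constituents grow.

```python
import math

# Compare volume-proportional and area-proportional scaling when the
# radius of a sphere is multiplied by 10.
def volume(r: float) -> float:
    """Volume of a sphere of radius r."""
    return 4 / 3 * math.pi * r ** 3

def area(r: float) -> float:
    """Surface area of a sphere of radius r."""
    return 4 * math.pi * r ** 2

r = 1.0
print(volume(10 * r) / volume(r))  # ~1000: beach ball, molecules scale with volume
print(area(10 * r) / area(r))      # ~100: black hole, "molecules" scale with area
```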

This weird effect goes under the name of the holographic principle because it is reminiscent of a hologram, which presents itself to us as a three-dimensional object. On closer examination, however, it turns out to be an image produced by a two-dimensional sheet of film. If the holographic principle counts the microscopic constituents of space and its contents—as physicists widely, though not universally, accept—it must take more to build space than splicing together little pieces of it.

The relation of part to whole is seldom so straightforward, anyway. An H2O molecule is not just a little piece of water. Consider what liquid water does: it flows, forms droplets, carries ripples and waves, and freezes and boils. An individual H2O molecule does none of that: those are collective behaviors. Likewise, the building blocks of space need not be spatial. “The atoms of space are not the smallest portions of space,” says Daniele Oriti of the Max Planck Institute for Gravitational Physics in Potsdam, Germany. “They are the constituents of space. The geometric properties of space are new, collective, approximate properties of a system made of many such atoms.”

What exactly those building blocks are depends on the theory. In loop quantum gravity, they are quanta of volume aggregated by applying quantum principles. In string theory, they are fields akin to those of electromagnetism that live on the surface traced out by a moving strand or loop of energy—the namesake string. In M-theory, which is related to string theory and may underlie it, they are a special type of particle: a membrane shrunk to a point. In causal set theory, they are events related by a web of cause and effect. In the amplituhedron theory and some other approaches, there are no building blocks at all—at least not in any conventional sense.

Although the organizing principles of these theories vary, all strive to uphold some version of the so-called relationalism of 17th- and 18th-century German philosopher Gottfried Leibniz. In broad terms, relationalism holds that space arises from a certain pattern of correlations among objects. In this view, space is a jigsaw puzzle. You start with a big pile of pieces, see how they connect and place them accordingly. If two pieces have similar properties, such as color, they are likely to be nearby; if they differ strongly, you tentatively put them far apart. Physicists commonly express these relations as a network with a certain pattern of connectivity. The relations are dictated by quantum theory or other principles, and the spatial arrangement follows.
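
The jigsaw picture can be mocked up in a few lines: treat each piece as a bundle of properties, link pieces whose properties nearly match, and read off "distance" as the number of links separating them. Everything here—the property vectors, the tolerance, the names—is invented purely for illustration.

```python
from itertools import combinations

# Each "piece" carries a property vector (hypothetical colour values).
pieces = {
    "a": (1.0, 0.0),
    "b": (0.9, 0.1),
    "m": (0.5, 0.5),
    "c": (0.1, 0.9),
    "d": (0.0, 1.0),
}

def similar(p, q, tol=0.5):
    """Two pieces are neighbours if their properties nearly match."""
    return all(abs(x - y) <= tol for x, y in zip(p, q))

# Build the connectivity network from pairwise similarity.
links = {name: set() for name in pieces}
for m, n in combinations(pieces, 2):
    if similar(pieces[m], pieces[n]):
        links[m].add(n)
        links[n].add(m)

def hops(start, goal):
    """Breadth-first search: 'distance' is the minimum number of links."""
    frontier, seen, d = {start}, {start}, 0
    while frontier:
        if goal in frontier:
            return d
        frontier = {n for f in frontier for n in links[f]} - seen
        seen |= frontier
        d += 1
    return None  # disconnected: the pieces belong to separate regions

print(hops("a", "b"))  # 1 -- similar pieces end up adjacent
print(hops("a", "c"))  # 2 -- dissimilar ones sit farther apart
```

Spatial nearness is not put in by hand; it falls out of the pattern of connections, which is the relationalist point in caricature.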

Phase transitions are another common theme. If space is assembled, it might be disassembled, too; then its building blocks could organize into something that looks nothing like space. “Just like you have different phases of matter, like ice, water and water vapor, the atoms of space can also reconfigure themselves in different phases,” says Thanu Padmanabhan of the Inter-University Center for Astronomy and Astrophysics in India. In this view, black holes may be places where space melts. Known theories break down, but a more general theory would describe what happens in the new phase. Even when space reaches its end, physics carries on.


The big realization of recent years—and one that has crossed old disciplinary boundaries—is that the relevant relations involve quantum entanglement. An extra-powerful type of correlation, intrinsic to quantum mechanics, entanglement seems to be more primitive than space. For instance, an experimentalist might create two particles that fly off in opposite directions. If they are entangled, they remain coordinated no matter how far apart they may be.

Traditionally when people talked about “quantum” gravity, they were referring to quantum discreteness, quantum fluctuations and almost every other quantum effect in the book—but never quantum entanglement. That changed when black holes forced the issue. Over the lifetime of a black hole, entangled particles fall in, but after the hole evaporates fully, their partners on the outside are left entangled with—nothing. “Hawking should have called it the entanglement problem,” says Samir Mathur of Ohio State University.

Even in a vacuum, with no particles around, the electromagnetic and other fields are internally entangled. If you measure a field at two different spots, your readings will jiggle in a random but coordinated way. And if you divide a region in two, the pieces will be correlated, with the degree of correlation depending on the only geometric quantity they have in common: the area of their interface. In 1995 Jacobson argued that entanglement provides a link between the presence of matter and the geometry of spacetime—which is to say, it might explain the law of gravity. “More entanglement implies weaker gravity—that is, stiffer spacetime,” he says.
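
Schematically, the two halves of Jacobson's argument can be written down in compact form. With $A$ the area of the interface and $\epsilon$ a short-distance cutoff, vacuum entanglement entropy obeys an area law, and demanding that the thermodynamic Clausius relation hold on every local horizon yields Einstein's field equations as an equation of state:

```latex
S_{\text{ent}} \;\propto\; \frac{A}{\epsilon^{2}},
\qquad
\delta Q = T\,\delta S
\;\;\Longrightarrow\;\;
G_{\mu\nu} = 8\pi G\, T_{\mu\nu}
```

On this reading, the law of gravity is not fundamental but thermodynamic: a statement about the entanglement bookkeeping of the underlying degrees of freedom.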

Several approaches to quantum gravity—most of all, string theory—now see entanglement as crucial. String theory applies the holographic principle not just to black holes but also to the universe at large, providing a recipe for how to create space—or at least some of it. For instance, a two-dimensional space could be threaded by fields that, when structured in the right way, generate an additional dimension of space. The original two-dimensional space would serve as the boundary of a more expansive realm, known as the bulk space. And entanglement is what knits the bulk space into a contiguous whole.

In 2009 Mark Van Raamsdonk of the University of British Columbia gave an elegant argument for this process. Suppose the fields at the boundary are not entangled—they form a pair of uncorrelated systems. They correspond to two separate universes, with no way to travel between them. When the systems become entangled, it is as if a tunnel, or wormhole, opens up between those universes, and a spaceship can go from one to the other. As the degree of entanglement increases, the wormhole shrinks in length, drawing the universes together until you would not even speak of them as two universes anymore. “The emergence of a big spacetime is directly tied into the entangling of these field theory degrees of freedom,” Van Raamsdonk says. When we observe correlations in the electromagnetic and other fields, they are a residue of the entanglement that binds space together.

Many other features of space, besides its contiguity, may also reflect entanglement. Van Raamsdonk and Brian Swingle, now at the University of Maryland, College Park, argue that the ubiquity of entanglement explains the universality of gravity—that it affects all objects and cannot be screened out. As for black holes, Leonard Susskind of Stanford University and Juan Maldacena of the Institute for Advanced Study in Princeton, N.J., suggest that entanglement between a black hole and the radiation it has emitted creates a wormhole—a back-door entrance into the hole. That may help preserve information and ensure that black hole physics is reversible.

Whereas these string theory ideas work only for specific geometries and reconstruct only a single dimension of space, some researchers have sought to explain how all of space can emerge from scratch. For instance, ChunJun Cao, Spyridon Michalakis and Sean M. Carroll, all at the California Institute of Technology, begin with a minimalist quantum description of a system, formulated with no direct reference to spacetime or even to matter. If it has the right pattern of correlations, the system can be cleaved into component parts that can be identified as different regions of spacetime. In this model, the degree of entanglement defines a notion of spatial distance.
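
A toy version of that last idea can be worked out for a single pair of qubits in the state |ψ⟩ = cos θ|00⟩ + sin θ|11⟩. The mapping from mutual information to distance used below, d = −log(I/I_max), is an illustrative assumption, not the construction in the paper itself.

```python
import math

def binary_entropy(p: float) -> float:
    """Von Neumann entropy (in bits) of a qubit whose reduced density
    matrix has eigenvalues p and 1 - p."""
    if p in (0.0, 1.0):
        return 0.0
    return -p * math.log2(p) - (1 - p) * math.log2(1 - p)

def mutual_information(theta: float) -> float:
    """Mutual information I(A:B) for |psi> = cos(t)|00> + sin(t)|11>.
    The global state is pure, so I = 2 * S(rho_A)."""
    p = math.cos(theta) ** 2
    return 2 * binary_entropy(p)

def toy_distance(theta: float) -> float:
    """Illustrative distance: more entanglement -> smaller separation.
    d = -log(I / I_max), with I_max = 2 bits for maximal entanglement."""
    return -math.log(mutual_information(theta) / 2.0)

print(toy_distance(math.pi / 4))  # ~0: maximally entangled, "adjacent"
print(toy_distance(0.1))          # weakly entangled, much farther apart
```

Maximally entangled regions sit at zero separation; as the entanglement is diluted, the inferred distance grows, echoing Van Raamsdonk's wormhole picture one paragraph earlier.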

In physics and, more generally, in the natural sciences, space and time are the foundation of all theories. Yet we never see spacetime directly. Rather we infer its existence from our everyday experience. We assume that the most economical account of the phenomena we see is some mechanism that operates within spacetime. But the bottom-line lesson of quantum gravity is that not all phenomena neatly fit within spacetime. Physicists will need to find some new foundational structure, and when they do, they will have completed the revolution that began just more than a century ago with Einstein.

This article was originally published with the title “What Is Spacetime?”
Posted by Sc13t4 in Astrophysics, Cosmology, Design, Mathematics, Space/Time, Theoretical Physics