
Ask A Genius 1501: Rick Rosner on Falsifiability and Predictions in Informational Cosmology

2025-11-08

Author(s): Scott Douglas Jacobsen

Publication (Outlet/Website): Ask A Genius

Publication Date (yyyy/mm/dd): 2025/08/21

Rick Rosner frames falsifiability as the ability to find evidence that definitively disproves a theory. For his Informational Cosmology, two key falsifiers would be proving no objects older than 13.8 billion years exist, or confirming dark matter is exotic particles rather than stellar remnants. He predicts: (1) the structure of consciousness mirrors universal physics, (2) objects older than the Big Bang exist, and (3) black holes never collapse to singularities. Possible tests include unusual gravitational lensing, gravitational wave patterns from halo collisions, or variations in constants. He concedes Informational Cosmology currently lacks parsimony but aims to eventually unify constants and structure.

Scott Douglas Jacobsen: First opener: When I say the term falsifiability—or even testability—what does that mean to you in the context of digital physics?

Rick Rosner: In the context of any experimental science, it means this: can you find something in the real world that shows your theory is wrong? If you say your theory predicts certain things and you have done the math to confirm it, then you check.

If the observations agree with your predictions, then your theory survives—at least for now. However, if the evidence is definitively not the way your theory predicts—not just small details off, but fundamentally wrong—then your theory has been falsified. That is how you do science: a theory has to make predictions, and those predictions must be testable. Otherwise, it is not a scientific theory.

So what would make me abandon informational cosmology? If we could show that nothing in the universe is older than the apparent Big Bang age of 13.8 billion years, that would be a serious problem. Informational cosmology says the universe is far older than that.

You would want to find ancient stellar remnants—white dwarfs, brown dwarfs, neutron stars—that predate the standard Big Bang age. According to stellar evolution, high-mass stars end as neutron stars or black holes, and medium-mass stars become white dwarfs, while brown dwarfs—substellar objects that never sustained hydrogen fusion—simply cool from the moment they form. These objects cool slowly over billions of years, and their temperatures can, in principle, tell you how long they have been around.

The challenge is detection. White dwarfs can be faint, and brown dwarfs radiate very little, so they are hard to see, especially from far away. However, if you could measure the temperatures of these ancient stellar remnants precisely enough, you could estimate their ages. If none were found older than 13.8 billion years, that would directly contradict the claim that the universe itself is older.
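The temperature-to-age idea above can be sketched with the classic Mestel cooling law for white dwarfs. This is an illustrative scaling only—modern models add crystallization and envelope physics—and the coefficient, default mass, and atomic weight here are textbook round numbers, not values from the interview:

```python
# Rough white-dwarf cooling-age sketch using the classic Mestel scaling
# (order-of-magnitude only; real models include crystallization etc.):
#   t_cool ~ 8.8 Myr * (A/12)^(-1) * (M/M_sun)^(5/7) * (L/L_sun)^(-5/7)

def mestel_cooling_age_gyr(luminosity_lsun, mass_msun=0.6, atomic_weight=12.0):
    """Estimate a white dwarf's cooling age in Gyr from its luminosity.

    Fainter means older: the age grows as luminosity drops, which is why
    precise temperatures (hence luminosities) of faint remnants matter.
    """
    t_myr = (8.8 * (atomic_weight / 12.0) ** -1
             * mass_msun ** (5 / 7)
             * luminosity_lsun ** (-5 / 7))
    return t_myr / 1000.0  # Myr -> Gyr

# A very faint 0.6 M_sun carbon white dwarf at L = 1e-5 L_sun:
print(f"{mestel_cooling_age_gyr(1e-5):.1f} Gyr")
```

The point of the sketch is the steep inverse dependence on luminosity: under this scaling, a remnant only ten times fainter is roughly five times older, so finding (or failing to find) sufficiently cool, faint remnants is what would date objects past 13.8 billion years.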

Another possible falsifier is dark matter. In Informational Cosmology, dark matter is primarily old, burned-out stellar remnants. However, suppose future experiments prove that dark matter is instead dominated by exotic non-baryonic particles—something like WIMPs or axions—making up the vast majority of the universe’s mass. In that case, I would have to rethink. My model could allow for some exotic matter, but if observations confirm that exotic dark matter accounts for essentially all of it, that would seriously undermine informational cosmology.

So those are a couple of big potential falsifiers. Conversely, there is no such thing as a truthifier. You can only gather evidence in favour of a theory; you cannot ever prove a theory 100% accurate. That is not how falsifiability works.

However, you could say that “truthifiers” would be things like finding ancient objects—older than 13.8 billion years—or finding more heavy elements, like gold, than could plausibly have formed in the time since the Big Bang. There are also other possible numerical thresholds: background signal strengths, ratios of different types of matter and energy, or variations in measured constants. That kind of quantitative detail requires expertise I do not fully have.

Now, both the standard Big Bang theory and Informational Cosmology share the idea that the universe could be embedded in curved space, like three-dimensional space being the surface of a four-dimensional hypersphere shaped by gravity. However, I would argue that in Informational Cosmology, if you look back to the early universe, you would see a more convoluted space because there were collapsed remnants from previous cycles—black-hole-like regions connected along filaments.

If that is the case, what would it look like? You would expect a tremendous amount of gravitational lensing. Then again, the Big Bang universe also produces a great deal of lensing, so the distinction might be subtle. Still, I would expect that in extreme environments—say, near the event horizon of a supermassive black hole—some physical constants might shift slightly, like the electron–proton mass ratio or even the speed of light. General relativity might already account for that, but I doubt it fully.

Another possible test: if you could drop a probe into a black hole and show that there is not a region where the escape velocity exceeds the speed of light, that would be huge. Hawking radiation already shows that black holes are not perfectly black, but that is still consistent with relativity. If instead we discovered that the structure of space-time prevents infinite gravitational collapse—that black holes are never truly singular—then that would be a strong confirmation of Informational Cosmology. However, realistically, testing that directly is far beyond our current capability.

Jacobsen: So then, what are your three biggest predictions for Informational Cosmology?

Rosner: The first and biggest one is that the mathematics of the information in our consciousness—as a self-consistent, semi-contained whole—is deeply analogous to the physics of the universe. That is the core claim.

The second is that there are objects in the universe that are older than the apparent Big Bang age of 13.8 billion years.

The third is that black holes are never completely black. The supposed infinite gravitational pressure is tempered because space itself is shaped by information, and you can never set up an information distribution that produces actual infinite collapse.

What else? Let us call this number four. If you could somehow wait around for another five or even fifteen billion years, and the apparent age of the universe had not advanced in step—had not gone from 13.8 to 18.8 billion years in that span—that would be consistent with Informational Cosmology. The point is that the universe’s “apparent age” does not necessarily increase in lockstep with the passage of years. Five billion years from now, the universe might not look 18.8 billion years old; it might still appear closer to 13.8 billion years old.

Number five: most particle interactions are not time-reversible. Think about photons. Once a photon escapes from a star into interstellar space, it keeps going forever. There is not much for it to run into, and it does not reverse course. That is not time-reversible in any practical sense in an expanding, redshifted universe.

Now, someone might argue that if the universe eventually stopped expanding and collapsed—a “Big Crunch”—then maybe photons would come back, regaining the energy they lost as they blue-shifted. Frank Tipler, for example, has suggested a scenario where everything runs in reverse and even imagined that the resurrection of individuals could follow from that. However, I do not buy it.

Now, about galaxy structure. If the brain is an information processor that works by associations—neurons firing in patterns, dendrites efficiently encoding combinatorial signals, Bayesian networks pulling up the “most probable” associations—then physics must allow for that kind of structure. Moreover, the universe looks like an information processor itself: it has a filamentary network of galaxies, very much like the associative network of neurons. So, if the universe is an information-processing system, you would expect it to have that large-scale filamentary structure—which, observationally, it does.

On to nucleosynthesis. Heavy elements like gold form in extreme astrophysical events. Fusion inside stars can only build elements up to iron, because producing heavier nuclei consumes energy. To get beyond iron—to gold, platinum, uranium—you need cataclysmic events like supernovae or neutron star mergers. Those are among the only known processes that provide enough energy and density to create the heavy elements we see today.

If someone did the math, surveyed the universe, and showed that these standard processes make far more gold than we thought—say, three times as much—that could count against Informational Cosmology. Under IC, the universe is vastly older, which means it has had extra time for rare gold-making events to accumulate. If standard cosmology alone can explain the observed amounts of heavy elements, then IC loses one of its arguments.

Now, back to dark matter. Another way to falsify IC is by finding the wrong kind of dark matter. If you rule out compact-object dark matter—the idea that dark matter is made mainly of stellar remnants—that would be a direct hit.

Here is the issue: you cannot just say dark matter is “weird stuff that’s hard to detect.” To test it, you need a theory of what kind of particle it might be. Then you can design experiments. That is how we found the Higgs boson: theorists predicted it, and then CERN’s Large Hadron Collider generated conditions with enough energy to produce it. They accelerated protons to nearly light speed, smashed them together, and out popped Higgs bosons.

The same goes for dark matter. If a theory proposes a viable particle—say, a WIMP, axion, or something new—you can build experiments to try to detect it directly or indirectly. If experiments succeed, that is evidence for standard particle dark matter, and a blow to the idea that dark matter is primarily old stellar cinders.

Dark energy is trickier, but again you would test it through its gravitational effects—how it curves space, shows up in gravitational lensing, or influences cosmic expansion. If someone came up with a modified theory of gravity that, for example, changes the inverse-square law slightly—like 1/r² becoming 1/r^1.98 under certain conditions—you could test that too.

If one of these alternative theories succeeds—if particle dark matter is confirmed, or a modified-gravity model works better than IC—that is a significant point in favour of standard cosmology. Right now, all we see are the large-scale effects: galaxies rotating too fast, gravitational lensing stronger than visible matter allows. If particle physics provides a solid candidate that matches those effects, then the exotic but testable framework of traditional cosmology wins.

Jacobsen: Would you expect weird patterns in the cosmic microwave background, or maybe a distinctive signal in the gravitational-wave background? 

Rosner: Possibly. If every galaxy has a halo of old collapsed matter, then stable galaxies that last tens of billions of years might leave stable orbital structures in their halos. However, when galaxies collide, their halos would crash into each other, and that could create distinctive signals.

That is maybe the most falsifiable angle: if halos are full of compact remnants—white dwarfs, neutron stars, stellar cinders—then galaxy collisions should produce detectable events. The question is whether those collisions would generate enough energy, gravitational waves, or neutrino bursts to be observed. 

People already look at interacting galaxies. If two halos full of compact objects collide, you might expect bursts of gravitational radiation or other signatures. However, you would need to run the math to see how frequent and intense such events would be.

Neutron stars and black holes are tiny, so direct collisions are rare. More often, two compact objects capture each other and orbit for centuries, gradually losing energy through gravitational radiation until they merge. That is what LIGO and Virgo detect: the inspiral and merger of compact objects. If Informational Cosmology is correct, there should be more such signals associated with halo interactions.
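The inspiral timescale Rosner gestures at can be estimated with the standard Peters (1964) quadrupole formula for a circular binary. The masses and separation below are illustrative round numbers, not values from the interview:

```python
# Sketch: gravitational-wave inspiral time for a circular compact binary,
# from the standard Peters (1964) formula:
#   t_merge = (5/256) * c^5 * a^4 / (G^3 * m1 * m2 * (m1 + m2))

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # solar mass, kg
YEAR = 3.156e7     # seconds per year

def merger_time_years(a_metres, m1_msun, m2_msun):
    """Time for two point masses on a circular orbit of radius a to merge."""
    m1, m2 = m1_msun * M_SUN, m2_msun * M_SUN
    t_sec = (5.0 / 256.0) * C**5 * a_metres**4 / (G**3 * m1 * m2 * (m1 + m2))
    return t_sec / YEAR

# Two 1.4 M_sun neutron stars separated by roughly a solar radius (~7e8 m):
print(f"{merger_time_years(7e8, 1.4, 1.4):.2e} years")
```

The a⁴ dependence is the key design point: inspiral times swing from longer than the age of the universe at wide separations to minutes in the final orbits, which is why LIGO and Virgo only see the very end of the process.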

Jacobsen: Then there is parsimony. If Informational Cosmology ends up needing more free parameters than standard Big Bang cosmology, then by Occam’s razor, it is less efficient. That counts against it. And what about the universe’s information limits? The speed of light sets a hard cap on how fast information can move. 

Rosner: You could ask whether the universe processes information at that limit, or whether it is constrained differently. Astronomers already compare the recession speed of distant galaxies to the speed of light. Once something recedes faster than light due to cosmic expansion, it slips permanently beyond our observable universe. That is a built-in information horizon. From an informational perspective, you should think about the speed of light not just as a physical constraint, but as the rate at which the universe computes its evolution.
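The information-horizon point above follows directly from Hubble's law, v = H₀·d: recession speed formally reaches the speed of light at the Hubble distance d = c/H₀. A minimal sketch, assuming a round H₀ = 70 km/s/Mpc (not a value quoted in the interview):

```python
# Sketch: the distance at which Hubble-law recession reaches light speed.
# Hubble's law: v = H0 * d, so v = c at the Hubble distance d_H = c / H0.

C_KM_S = 299_792.458   # speed of light, km/s
H0 = 70.0              # Hubble constant, km/s per Mpc (assumed round value)

def recession_speed_km_s(distance_mpc):
    """Hubble-law recession speed for a given comoving distance."""
    return H0 * distance_mpc

hubble_distance_mpc = C_KM_S / H0
print(f"Hubble distance ~ {hubble_distance_mpc:.0f} Mpc "
      f"(~{hubble_distance_mpc * 3.262 / 1000:.1f} billion light-years)")
```

Beyond roughly this distance, expansion carries galaxies away faster than light can close the gap—the built-in information horizon Rosner describes. (In full general-relativistic cosmology the event horizon differs somewhat from this naive Hubble distance, but the order of magnitude is the same.)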

Alternatively, it could go the other way. Instead of the speed of light determining how fast things can move, maybe the speed of light itself is determined by the relative motion of everything in the universe.

Proper Informational Cosmology should be more parsimonious, not less. A worked-out theory should explain a lot of the physical constants and ratios—like the electron-to-proton mass ratio—that Big Bang cosmology cannot. Right now, IC is half-formed and amorphous, so it does not pass that test yet. You got me there. But not forever.

Jacobsen: Anyway, that is a long-term problem. Any final thoughts? 

Rosner: No. Thanks, however. 

Jacobsen: We can do another one tomorrow—maybe focus on testability.

Last updated May 3, 2025. These terms govern all In-Sight Publishing content—past, present, and future—and supersede any prior notices. In-Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons BY‑NC‑ND 4.0; © In-Sight Publishing by Scott Douglas Jacobsen 2012–Present. All trademarks, performances, databases & branding are owned by their rights holders; no use without permission. Unauthorized copying, modification, framing or public communication is prohibited. External links are not endorsed. Cookies & tracking require consent, and data processing complies with PIPEDA & GDPR; no data from children < 13 (COPPA). Content meets WCAG 2.1 AA under the Accessible Canada Act & is preserved in open archival formats with backups. Excerpts & links require full credit & hyperlink; limited quoting under fair-dealing & fair-use. All content is informational; no liability for errors or omissions: Feedback welcome, and verified errors corrected promptly. For permissions or DMCA notices, email: scott.jacobsen2025@gmail.com. Site use is governed by BC laws; content is “as‑is,” liability limited, users indemnify us; moral, performers’ & database sui generis rights reserved.
