Ask A Genius 1392: Rethinking the Cosmos: AI, Lambda-CDM, and the Crisis in Big Bang Cosmology

2025-06-13

Author(s): Rick Rosner and Scott Douglas Jacobsen

Publication (Outlet/Website): Ask A Genius

Publication Date (yyyy/mm/dd): 2025/05/28

Rick Rosner is an accomplished television writer with credits on shows like Jimmy Kimmel Live!, Crank Yankers, and The Man Show. Over his career, he has earned multiple Writers Guild Award nominations—winning one—and an Emmy nomination. Rosner holds a broad academic background, graduating with the equivalent of eight majors. Based in Los Angeles, he continues to write and develop ideas while spending time with his wife, daughter, and two dogs.

Scott Douglas Jacobsen is the publisher of In-Sight Publishing (ISBN: 978-1-0692343) and Editor-in-Chief of In-Sight: Interviews (ISSN: 2369-6885). He writes for The Good Men Project, International Policy Digest (ISSN: 2332–9416), The Humanist (Print: ISSN 0018-7399; Online: ISSN 2163-3576), Basic Income Earth Network (UK Registered Charity 1177066), A Further Inquiry, and other media. He is a member in good standing of numerous media organizations.

Scott Douglas Jacobsen and Rick Rosner delve into rising tensions in modern cosmology, including the Hubble constant discrepancy, S₈ tension, cold dark matter, early galaxy formation, and cosmic anisotropies. They propose that the universe may be older and governed by information, hinting at a future paradigm shift fueled by AI.

Scott Douglas Jacobsen: So we’ve got a few things going on in Big Bang cosmology. Standard Big Bang cosmology is based on Lambda-CDM—Lambda being dark energy and CDM being cold dark matter—as fundamental components. The expansion of the universe is modeled using the Hubble constant, H₀.

Early-universe measurements using the cosmic microwave background—like those from the Planck satellite—yield H₀ ≈ 67.4 kilometers per second per megaparsec. But late-universe measurements, based on Type Ia supernovae and Cepheid variable stars, yield a significantly higher value—around 73.2 kilometers per second per megaparsec.

That discrepancy is statistically significant—on the order of five to six sigma. It suggests we may be missing something in our understanding of cosmological evolution. What we’re talking about is the Hubble expansion coefficient—the rate at which galaxies appear to recede from us based on their distance.

Our benchmarks are the cosmic microwave background for the early universe, and Type Ia supernovae and Cepheids for the late universe. At roughly five to six sigma, the two values disagree by far more than measurement error alone can plausibly explain.
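The size of the tension can be sketched with a quick back-of-the-envelope calculation. The uncertainties below are illustrative round numbers, not the exact published error bars, which vary by analysis:

```python
import math

# Illustrative central values and 1-sigma uncertainties (km/s/Mpc);
# the exact published figures vary by analysis.
h0_early, sigma_early = 67.4, 0.5   # Planck CMB (early universe)
h0_late, sigma_late = 73.2, 1.0     # Cepheid + Type Ia ladder (late universe)

# Express the disagreement in units of the combined uncertainty
# (independent errors added in quadrature).
combined_sigma = math.sqrt(sigma_early**2 + sigma_late**2)
tension = (h0_late - h0_early) / combined_sigma
print(f"{tension:.1f} sigma")  # roughly 5 sigma with these inputs
```

Tightening or loosening the assumed error bars moves the result between roughly four and six sigma, which is the range usually quoted.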

Rick Rosner: I think it reflects a broader trend in science: the Big Bang theory accumulates anomalies until a better theory comes along—one that explains those anomalies more satisfactorily. 

Jacobsen: That’s the historical pattern of scientific progress. 

Rosner: But in this particular case, there’s a deeper issue. We increasingly understand the universe as being made of information. And in an information-based universe, the Big Bang model as it currently stands doesn’t quite work.

Jacobsen: Why not?

Rosner: Because the standard Big Bang model assumes a universe that is homogeneous in space but radically heterogeneous in time. Every moment of the universe’s history is dramatically different in size, scale, and energy. That doesn’t make much sense if the universe is fundamentally informational. For that model to hold, we would expect a kind of informational constancy across time—not just space.

The mismatch in the Hubble constant could be an artifact of a deeper issue in how we model time, space, and information in the universe. Especially since looking deeper into space is also looking further back in time. Comparing those measurements to what’s happening “now” introduces complex variables that might be glossed over in our current frameworks.

So that’s another reason the discrepancy might make sense—it could point us toward a more refined or radically different cosmological model.

Jacobsen: All right, let’s move to the next point—S₈ tension. That’s the S sub 8 parameter, which measures the amplitude of matter fluctuations in the universe.

Rosner: And what do those measurements show?

Jacobsen: They show tension. Specifically, predictions from the cosmic microwave background—like those from the Planck satellite—suggest a higher value of S₈ than what we see in large-scale structure observations, like weak gravitational lensing and galaxy clustering surveys. 

Rosner: So what exactly do they mean by “tension” here?

Jacobsen: It means the values don’t match. The discrepancy implies that there might be missing physics in the post-recombination growth of cosmic structure.
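For concreteness, S₈ is conventionally defined as σ₈·√(Ω_m/0.3), the clustering amplitude rescaled by the matter density. A minimal sketch of the mismatch, using representative (not exact) values for the CMB prediction and a typical weak-lensing survey:

```python
import math

def s8(sigma8, omega_m):
    """S8 = sigma8 * sqrt(Omega_m / 0.3): the amplitude of matter
    fluctuations, rescaled by the matter density parameter."""
    return sigma8 * math.sqrt(omega_m / 0.3)

# Representative values, assumed here for illustration only.
s8_cmb = s8(sigma8=0.81, omega_m=0.315)  # Planck-like early-universe prediction
s8_lensing = 0.76                        # typical weak-lensing survey result

print(round(s8_cmb, 3), "vs", s8_lensing)  # CMB predicts more clumping
```

The CMB-anchored value comes out around 0.83, noticeably above the roughly 0.76 that lensing surveys tend to measure. That gap is the tension.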

Rosner: Recombination—that’s the phase when the universe becomes transparent, right? Or have I got that wrong?

Jacobsen: No, you’re mostly right. Recombination occurred around 380,000 years after the Big Bang, at a redshift of approximately z ≈ 1100.

Rosner: It’s the point at which the universe cooled enough for protons and electrons to combine into neutral hydrogen atoms. That made the universe transparent because neutral atoms don’t scatter photons the way free electrons do. So, prior to that, photons were constantly interacting with charged particles, and after recombination, light could travel freely—hence the cosmic microwave background.
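The transparency threshold can be checked from the CMB temperature today: photon temperature scales as T(z) = T₀(1 + z), so at z ≈ 1100 the universe sat near the ~3,000 K below which neutral hydrogen survives. A rough check, taking T₀ = 2.725 K:

```python
T0 = 2.725        # CMB temperature today, in kelvin
z_recomb = 1100   # approximate redshift of recombination

# Photon temperature scales with redshift as T = T0 * (1 + z),
# so at recombination the radiation bath was about a thousand times hotter.
T_recomb = T0 * (1 + z_recomb)
print(f"{T_recomb:.0f} K")  # about 3000 K, cool enough for neutral hydrogen
```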

Jacobsen: Problem solved on that front. But back to the clumping—what’s the issue? 

Rosner: The problem is with the amount of clumping—structure formation—in the universe. If the Big Bang model is incomplete or inaccurate, and if the universe is fundamentally informational rather than purely material, that might affect how structure forms over time.

Jacobsen: So are they saying the universe isn’t as clumpy as it should be, based on our current physics?

Rosner: Yes, that’s essentially it. The observed level of matter clustering is lower than predicted by early-universe models. That’s the tension. So again, this might point to deeper flaws in the standard cosmological model—or, at the very least, suggest that we need to refine how we model structure formation over cosmic time. If the universe is made of information—as we’ve discussed—that would have implications for how matter clumps together and evolves after recombination.

Jacobsen: There’s still no direct detection of dark matter, which remains a cornerstone of the Lambda Cold Dark Matter model. So what do people currently think cold dark matter actually is—how it behaves, what it’s made of? 

Rosner: Honestly, I haven’t looked at dark matter research in a while. I’ve had my head buried in other things. From what I understand, the prevailing model still leans toward exotic particles—WIMPs, axions, or other beyond-the-Standard-Model candidates. But detection efforts haven’t turned anything up.

And I hope I’ve at least persuaded you that some portion of what we call cold dark matter could just be regular matter—collapsed into old stellar remnants: neutron stars, brown dwarfs, black holes. Essentially, very old, very dim, very cold stuff on the outskirts of galaxies. Hard to detect because it’s… black. Or nearly black—meaning it emits little to no electromagnetic radiation.

Jacobsen: You might expect to see such objects through gravitational lensing, since even “invisible” mass bends light. So, hypothetically, if cold dark matter were primarily composed of these stellar remnants, would we see increased gravitational lensing beyond the visible bounds of galaxies—say, in the form of light distortion or blurring?

Rosner: That’s a good question. You might—but it depends on how much of this collapsed matter exists and how it’s distributed. My guess is that the objects themselves are so small, relatively speaking, that the total amount of lensing wouldn’t be easy to detect. Because their mass is concentrated into tiny volumes, right? A neutron star is, what, about 10 miles across?

Compare that to the Sun, which is about 800,000 miles in diameter—neutron stars are roughly a thousandth of one percent of the Sun’s width. Squared, that’s about one part in six billion in cross-sectional area, so visually speaking, these collapsed remnants occupy an infinitesimal portion of the sky.

So even if you had a lot of them, the total area of space subject to significant gravitational lensing would still be small—unless light passes very close to them. Everything causes lensing in theory, but to get appreciable lensing—something we can observe and measure—light has to pass very near the object.
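The size comparison above is easy to check numerically, using the rounded figures from the conversation (a ~10-mile neutron star against an ~800,000-mile Sun):

```python
# Rough size comparison between a neutron star and the Sun.
# Figures are the rounded values used in the discussion, in miles.
ns_diameter = 10
sun_diameter = 800_000

width_ratio = ns_diameter / sun_diameter  # ~1.25e-5 of the Sun's width
area_ratio = width_ratio ** 2             # ~1.6e-10 of the Sun's disk area

print(f"width ratio: {width_ratio:.2e}")
print(f"area: one part in {1 / area_ratio:.1e}")  # one part in ~6 billion
```

The angular cross-section, not the mass, is what sets how often a background light ray passes close enough for measurable lensing, which is why a sky full of compact remnants could still produce very few detectable lensing events.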

Jacobsen: That brings us to a more direct and speculative point: if we can’t see it, our critiques remain speculative. It’s the classic problem—absence of evidence is not evidence of absence. But at the same time, it isn’t positive support either. It’s a weird duality. The lack of direct detection doesn’t disprove cold dark matter, but it also doesn’t bolster it. 

Rosner: The longer that persists, the more it invites alternative explanations—including the possibility that some dark matter is just ordinary matter in an unlit, collapsed state.

Jacobsen: The issue is, if you’re critiquing the model without providing a rigorous alternative, it comes off as hand-waving. It’s a strange tension. Okay, next on the list, one of the big challenges is the so-called small-scale crisis in Lambda Cold Dark Matter (ΛCDM). The model predicts more small-scale structures—dwarf galaxies, satellite galaxies—than we actually observe. Apparently, that includes the “Too Big to Fail” problem and the “Missing Satellites” problem.

Rosner: These refer to the mismatch between predicted galaxy formation on small scales and observed structures. Either our understanding of galaxy feedback processes and small-scale dark matter behavior is incomplete—or the model is just wrong. If the universe did not form 13.8 billion years ago—as current cosmology holds—but instead formed much earlier, and merely appears to be 13.8 billion years old, then yes, you’d expect a wide range of anomalies. That’s the essence of the critique. An older universe has more time to evolve, more time to form large galaxies and black holes—things that shouldn’t appear as early as they do under the current model.

Jacobsen: Which brings us to the early massive galaxies and early supermassive black holes. Observations are showing mature, bright, and massive galaxies far earlier in cosmic history than ΛCDM can comfortably explain. It’s “too much, too soon.” These structures are forming at redshifts that indicate they existed only a few hundred million years after the Big Bang. If the universe is actually older, that problem disappears—they just had more time to form.

Rosner: When I was pretending to be a high school student, I had way more chest hair than I should have. I probably should have waxed—but instead, I shaved, which looked… suspicious. But the point is, I had about nine extra years to grow it compared to most high schoolers. The same applies under an informational cosmology—if the universe is much older than it appears, it’s had more time to grow a bunch of stuff.

Which leads us to the final issue: anisotropic anomalies and cosmic voids—these large-scale cold spots and dipole asymmetries in the cosmic microwave background. These suggest the universe may not be as isotropic and homogeneous as the ΛCDM model predicts.

If the universe is older and governed by informational structure rather than material structure alone, you’d expect more heterogeneity over time—more irregularities, more large-scale voids, and structural anomalies.

Jacobsen: Basically, the universe has had more time to “scramble” itself—producing more large-scale structure, irregularity, and deviations from the assumed cosmic smoothness. If it’s older than we think, it would be “holier”—as in full of more voids—than it appears. Which brings us to the natural question… How many popes did it take? How many popes did they go through to become this holy?

Rosner: That landed. That was a shocker-room moment.

Jacobsen: Sorry, sorry.

Rosner: Anyway, gravity may function as an informational equalizer. It might operate in such a way that the scale of space adjusts to keep information evenly distributed—or at least to ensure that gravitational vectors are balanced across all directions.

That reminds me—there’s an old concept in black hole theory from the 1970s called “black holes have no hair.” It was about the idea that black holes are defined only by a few key properties—mass, charge, and spin—and nothing else.

But the broader point is this: the universe, to function consistently, has to behave like a global system. That likely includes no net spin, no net charge—certain boundary conditions that constrain how asymmetric it can get.

Jacobsen: So gravity acts to smooth things out across cosmic scales—to impose a kind of global isotropy.

Rosner: So yes, the universe may be “holier” than expected if it’s older—but gravitational dynamics still enforce a degree of uniformity so the whole system doesn’t go off balance. I’m doing a terrible job of explaining—or even understanding—this, but I get the general shape of the idea. 

Jacobsen: It’s like when leaves fall in a small town and the wind gathers them in the corners of cul-de-sacs or against the curves of curbs and fences.

Rosner: Yes, exactly. That’s the most straightforward analogy: the leaves accumulate in natural crevices. That’s how clumping and anisotropy could emerge. 

Jacobsen: But there’s a limit. Eventually, the corners fill, and excess leaves blow out. 

Rosner: The universe can only tolerate so much structural asymmetry before conservation laws push back. So the universe can’t be permanently lopsided. It needs to have net zero angular momentum—otherwise, the physics wouldn’t hold together. Those kinds of constraints—like zero net spin—help limit how weird the large-scale structure of the universe can be.

