Ask A Genius 1263: More Enshittification
Author(s): Rick Rosner and Scott Douglas Jacobsen
Publication (Outlet/Website): Ask A Genius
Publication Date (yyyy/mm/dd): 2025/02/14
Scott Douglas Jacobsen: Let’s examine the concept of enshittification, a term popularized by Cory Doctorow and applied to recent technological developments, particularly platforms like Facebook, Twitter, and Google. These companies start by attracting a large user base with generous deals and free services, burning through venture capital during their initial, money-losing phase. They quickly amass millions, or even billions, of users, and their market value skyrockets. However, once shareholders and owners shift their focus toward profitability, these companies begin selling ads, charging for services, and generally squeezing users and employees. The quality of service often degrades over time, with Twitter serving as a prime example of this phenomenon.
Rick Rosner: This discussion leads us to consider a broader idea: the unification of consciousness and the reality that genuine thought is an expensive resource. Modern brain science suggests that the primary function of your brain is to predict what will happen next so that you can position yourself to maximize benefits and minimize risks. With a limited energy budget, your brain assembles thoughts that are half-formed best guesses. Moreover, your brain is not always an honest mediator; it sometimes manipulates your priorities and perceptions to favour survival and reproduction over objective reasoning. Even though AI may seem crude or “shitty” in its current information processing, it might eventually begin to approximate aspects of unconscious thought.
In summary, it is easy to overestimate AI’s sophistication, attributing true thought to it when it is merely processing information, and just as easy to overestimate our own mental abilities.
Can I ask: have you been tricked into overestimating the power of AI the way I have?
Jacobsen: I remain extremely skeptical. I listen to prominent AI figures: Eric Schmidt, Elon Musk, Sam Altman, Ray Kurzweil, Geoffrey Hinton, Andrew Ng, Andrej Karpathy, Daphne Koller, Ian Goodfellow, Jürgen Schmidhuber, Joy Buolamwini, Yoshua Bengio, Demis Hassabis, Fei-Fei Li, Yann LeCun, and so on. Their prominence doesn’t guarantee a correct opinion; it may indicate they’re better informed, though still fallible. Perhaps they’ll pick up on something I’m not considering.
If you ask one of these systems a simple math problem, say, “23 plus 17,” it might erroneously answer “8” instead of “40.” Even when you correct it, the mistake may persist. Essentially, you’re dealing with a system prone to constant confabulation, a tendency shaped by its training on vast and sometimes flawed human discourse. Left to its own devices, as human language evolves, its output could slowly diverge further from accurate human thought. Ultimately, it remains just a tool.
