Ask A Genius 1287: Near-Future Fiction, AI Evolution, and the Changing Nature of Work

2025-06-13

Author(s): Rick Rosner and Scott Douglas Jacobsen

Publication (Outlet/Website): Ask A Genius

Publication Date (yyyy/mm/dd): 2025/03/02

Scott Douglas Jacobsen: Bear, writer boy—what have you been writing?

Rosner: The novel I’ve been writing begins with something close to a murder.

Jacobsen: Dark?

Rosner: Yes. I’ve been going through that chapter again, ensuring the logistics and the action make physical and medical sense. Because it’s not as simple as putting a bullet in somebody. The action is more violent and intimate and takes an uncomfortably long time. This is part of the grim fun in a Stephen King sense. Stephen King meticulously describes shit—he wants it to be as accurate as possible. So, I’m writing this as an homage to King. I want to get everything right. I’ve been acting it out—getting into various physical positions to see if the sequence makes sense. Turns out, my first version was half-assed. I had to rewrite a lot of it, making the action more precise, because if they ever made it into a movie, the action needs to translate visually.

Jacobsen: Yes, makes sense.

Rosner: And I read it to Carole. She said it goes on too long because I describe everything too meticulously—even down to how the attacker places each limb while sneaking up on his victim. She told me it’s too much. But I don’t want to write, “He slowly scurried up the structure.” I want every moment to feel real. Anyway, once the victim is dead, the attacker wants the body to go undiscovered for as long as possible. The longer it takes to be found, the more the evidence deteriorates. And the less the attacker gets connected to the crime. But now, I have to figure out what happens to the property. Los Angeles has 70,000+ homeless people—what happens to empty houses? Do squatters take over? Or does it turn into a party house, like some of those Hollywood Hills mansions where people break in, trash the place, and throw raves? Or does it become a meth lab? There was a nice four-story house less than a mile from my place—left unattended for a year. People set up a meth lab inside. And it burned to the ground. So, I have to figure out what happens after the murder in my book. Because the attacker leaves, but consequences unfold anyway.

Jacobsen: Yes.

Rosner: And the novel spans 20 years past this event. I haven’t even dealt with the future implications of it all. It’s a weird challenge, because when you write near-future fiction, you risk getting overtaken by real events.

Jacobsen: Charles Stross?

Rosner: Stross wrote two books in a near-future trilogy. When it was time for Book Three, he gave up. He said, “The stuff I was gonna write about already happened. It’s not sci-fi anymore.” And that’s a huge risk when writing about America’s future. We have two major unknowns: the political trajectory of the U.S. and technological advancements in the next two decades.

Jacobsen: Blade Runner takes place in 2019.

Rosner: Then Blade Runner 2049 came out in 2017. But it’s no longer a future world—it’s a world that never happened. And that’s the risk. I don’t know. I’m figuring it out as I go.

It’s an alternate world that didn’t come true. The 2019 in Blade Runner didn’t happen. It’s way different from our 2019. But 2049 has to stick to the alternate history of the world, which is fine. But I’d like not to get everything completely wrong. If I’d been faster in my writing, I would have gotten the 2024 election completely wrong. Because I thought there was no fucking way we’d reelect the worst president in history. Though it does fit in with the future that Trump’s creating. If you could rerun history—or look at the history of many civilizations—you might find that, when artificial thought begins to usurp evolved biological thought, social derangement tends to happen simultaneously with the rise of AI. You’re nodding because the same tech that brings us social media, which can derange tens of millions of people, also brings AI. So, I don’t know. That’s what I’ll try to poke at next. I’m grateful to you and our talks, because it helps me work through this.

Jacobsen: No comment. It’s general. We won’t be the dominant thinkers anymore—most computation will be 95 to 99% nonhuman. We’ll learn what that means in precise terms, and that will change things in ways we can’t even predict yet. We could have pulled the biggest P.T. Barnum act on humanity ever. Maybe we fooled ourselves. Maybe the super-smart people fooled themselves, and then they fooled less expert people into thinking they understand intelligence. Maybe we’re all wrong. Maybe we’ll have to redefine intelligence based on these technologies. Maybe computational systems won’t be the majority of thought. Maybe we’ll categorize things differently. Maybe. But it’s an inflection point. It raises a lot of open questions. Do you think AI will continue to be owned by mega-rich corporations and the elite?

Rosner: It’d be weird if it weren’t. But it’s all speculation. It wouldn’t be unreasonable to think they might restructure things to take over for themselves. They could manipulate the system subtly over time, co-opting and convincing us—without violence—until they are in total control. I never read Karl Marx’s Das Kapital. Have you?

Jacobsen: I read it in high school. 

Rosner: One thing I do remember is that he sets up a timeline in which the means of production eventually end up in the hands of the workers. Then, we’re supposed to get a workers’ paradise—where people own their own livelihoods instead of paying rent to capitalists. But with AI, the next logical step is that production takes over itself. Karl Marx never imagined robots. The word robot didn’t even exist—it was still half a century in the future.

Jacobsen: You get hokey things—positronic brains in Asimov’s stories—but nobody saw this coming.

Rosner: What Marx got wrong—he assumed we’d always need workers. What we’re seeing now is that you fucking don’t.

Jacobsen: What we define as a worker and what we mean by work is changing. We used to mean physical labor. Then we expanded that to mental labor. But now, with computers, we need to generalize even further.

Rosner: And the problem is that Maslow’s Hierarchy of Needs applies to everybody, but it might have holes.

Jacobsen: It was constructed by a guy in 20th-century North America. That’s fine.

Rosner: But the bigger problem is—we have no fucking clue what AI will want to be fulfilled.

Jacobsen: We don’t even fully understand human consciousness.

Rosner: There’s that saying: “No man is an island.” But we all fucking are. Our consciousness is trapped inside our skulls.

Jacobsen: Yes, but we have mirror neurons—so we automatically recognize others’ experiences.

Rosner: But AI won’t have that. Will it defend its individual identity? Or will it be “slutty”—merging and splitting itself at will? We have no fucking idea.

Jacobsen: Humans evolved segmentation—which is why we’re individualistic. Evolution shaped us to be separate. Machines don’t have that limitation. 

Rosner: But why don’t animals evolve shared consciousness?

Jacobsen: Probably for the same reason they don’t have wheels. It’s logistically hard. 

Rosner: Right. But some things evolve easily—eyes.

Jacobsen: Because eyes are balls.

Rosner: But linked brains? Not so much.

Jacobsen: Except for conjoined twins.

Rosner: That’s a good point. But most conjoined twins don’t even share brain function. I don’t know how their thinking works if they’re joined at the head. I don’t think we have an example of shared cognition in humans.

Jacobsen: I’ll look it up. But nature deals in whole systems. It doesn’t piecemeal things together.

Rosner: Right. But octopuses might be the closest thing to distributed cognition. They have nine brains—a central brain and one for each arm.

Jacobsen: Yes, but do they think separately? Or are the arm brains just motor control systems?

Rosner: Probably the latter. They coordinate—but they don’t argue with themselves.

But you could imagine a giant, super-smart octopus that does.

I don’t fucking know.

Last updated May 3, 2025. These terms govern all In Sight Publishing content—past, present, and future—and supersede any prior notices. In Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons BY‑NC‑ND 4.0 license; © In Sight Publishing by Scott Douglas Jacobsen 2012–Present. All trademarks, performances, databases, and branding are owned by their rights holders; no use without permission. Unauthorized copying, modification, framing, or public communication is prohibited. External links are not endorsed. Cookies and tracking require consent, and data processing complies with PIPEDA and GDPR; no data from children under 13 (COPPA). Content meets WCAG 2.1 AA under the Accessible Canada Act and is preserved in open archival formats with backups. Excerpts and links require full credit and a hyperlink; limited quoting under fair dealing and fair use. All content is informational; no liability for errors or omissions. Feedback is welcome, and verified errors are corrected promptly. For permissions or DMCA notices, email: scott.jacobsen2025@gmail.com. Site use is governed by BC laws; content is “as‑is,” liability is limited, users indemnify us; moral, performers’, and database sui generis rights are reserved.
