Ask A Genius 1529: Is “Alien: Earth”’s Wendy a Mary Sue?
Author(s): Scott Douglas Jacobsen and Rick Rosner
Publication (Outlet/Website): Ask A Genius
Publication Date (yyyy/mm/dd): 2025/09/29
Does Alien: Earth fall into science fiction tropes like the Mary Sue and the “idiot ball,” or does it build meaningfully on the Alien franchise?
Rick Rosner critiques Alien: Earth through the lens of classic science fiction tropes. He sees Wendy, the hybrid lead portrayed by Sydney Chandler, as fitting the “Mary Sue” archetype: overly competent, with few visible flaws, much like Ripley in Alien (1979) but with heightened powers. He contrasts this with films like The Long Kiss Goodnight, which justify character abilities within the story. Rosner also highlights the “idiot ball” trope—characters making foolish choices to advance the plot—common in Alien films. His larger point: science fiction demands knowledge of its tropes to avoid lazy storytelling, as with time travel clichés.
Rick Rosner: I was thinking about Alien: Earth. A lot of people reached the same conclusion. Wendy fits the science-fiction trope called a “Mary Sue.” Are you familiar with that? A Mary Sue is an overly competent character—usually a young woman—portrayed as free of meaningful flaws.
The term comes from Paula Smith’s 1973 Star Trek parody “A Trekkie’s Tale.” Like Ripley in the original Alien (1979): everyone else made mistakes, and she survived. Casting helped—Sigourney Weaver is tall and physically imposing—but Ripley was written as a working crew member on a commercial ship, not royalty.
Sigourney Weaver and the Alien underwear scene: there’s a long-circulating anecdote that producers wanted her shaved and that pubic hair was retouched out of shots. I haven’t found a primary source confirming the airbrushing story; treat it as unverified lore.
It showed Weaver was a great choice to lead an action-horror story. She looked formidable—big movie-star jaw, strong cheekbones—and she’s tall. But the Mary Sue archetype is usually an ordinary person who, when the crisis hits, suddenly performs with near-unrealistic mastery. There’s a movie called The Long Kiss Goodnight—not “Last”—one of my favorites. Geena Davis plays a small-town schoolteacher who, under pressure, reveals she was once a highly trained assassin. That’s not really a Mary Sue, because the film gives an in-world reason for her abilities (amnesia; her prior life as Charly Baltimore). It’s a common character type. Some critics say Wendy’s Mary-Sue-ness in Alien: Earth is overblown: she’s a hybrid who can interface with xenomorphs and—with access—exert control over facility systems, which can lower perceived stakes. The series is a prequel set two years before Alien (1979), and Wendy is portrayed by Sydney Chandler.
I don’t think they’re wrong. It’s still enjoyable, but she is way too powerful a character. We don’t know the extent of her abilities. When she first takes on a xenomorph—spoiler alert—she kills it, and they don’t even show it on camera. They just show her stepping away from the body. So they make her super powerful.

Another trope I was reading about in Forbes is called the “idiot ball.” In improv exercises you take turns passing an imaginary ball, sometimes in games like “zip zap zop.” It’s an exercise in mental quickness. The “idiot ball” in science fiction, especially in the Alien movies, means whoever catches it does something unforgivably stupid that gets themselves or others killed. The Forbes review said there was a big idiot ball in Alien: Earth, which is true—and in all the Alien movies. They’re often driven forward by characters making dumb decisions. That’s a trope in both science fiction and horror.

If you’re going to write anything—books, TV, movies—get a sense of what your genre is, or whether it crosses genres, and be well read in those genres. Know the tropes. There’s a website called TV Tropes. If you’re not already familiar with the conventions of, say, a time travel story, educate yourself: watch several time travel films, read books with time travel plots. Time travel movies are notorious for falling into the same ruts. There’s a good one starring Jake Gyllenhaal, directed by Duncan Jones, with Michelle Monaghan—Source Code. Duncan Jones clearly knows the tropes, because he tells a story that doesn’t fall into the usual traps. It’s suspenseful and exciting. But so many other time travel stories fall into clichés. For example: no matter what you do, fate blocks you, and the Titanic sinks anyway. Or like Back to the Future, where a change to the past must be fixed to restore the timeline or everything will be destroyed.
With Back to the Future it works, because it’s popular-level entertainment—meant to be fun—and it pushes boundaries in playful ways, like the subplot where the teenage mom develops a crush on her time-traveling son. But many other time travel movies recycle the same tropes. Some low-budget ones avoid them but are irritating for other reasons. The point is: if you’re going to write in a genre, be familiar with it. That’s a big problem with TV science fiction. Too often the people producing it aren’t steeped in the tropes, or they just get lazy. I complain about that a lot.
Altered Carbon is a lazily imagined future, 300 years from now. It feels incomplete. They should have had a writers’ room with futurists to flesh it out more. I don’t know how Westworld did it, but for at least its first two or three seasons, it managed to tell a pretty involving story. It took stabs at imagining aspects of the future that were both plausible and unsettling. If you’re going to write near-future science fiction, you need a strong writers’ room that includes good near-future science fiction writers—people like Neal Stephenson or Charles Stross.
Noah Hawley, I think, did all the writing for Alien: Earth. But he was working within a well-established future world and guided by that. Lack of familiarity wasn’t an issue. He and the production team were clearly familiar with every aspect of how the Alien movies were made. They even had original blueprints and worked from those.
But if you’re creating an original story and you’re not steeped in science fiction—if you’re just some Hollywood slickster who’s written a couple of decent screenplays, but not in near-future sci-fi—get help from people who know the field.
It’s like what you see in $200 million superhero movies. Because they’re spending so much money, they bring in people who know the entire canonical history of the characters. James Gunn, now in charge of the DC Universe, knows and loves the history of every DC superhero. He loves the characters, and he also loves the weirdness.
So his Superman movie is straightforward but with twists. Superman still stands for truth, justice, and the American way, though he gets made fun of for it. He tries to defend himself, insisting he’s a cool guy—“punk rock”—but that’s the joke: he isn’t. The movie has the Fortress of Solitude—traditionally at the North Pole, but in this version it’s at the South Pole. He also has virtual parents who left him a message: be a good boy, protect Earth, be its savior. Part of the message is scrambled, and that creates a twist.
So Superman is a fairly upstanding movie. But Gunn also created Peacemaker, about D-level superheroes whose personal dysfunction keeps them from being as good or as effective as Superman. A bunch of messed-up stuff happens with them, but it’s still in the same universe. The latest episode even had a Lex Luthor crossover. Gunn knows the canon backwards and forwards, and that lets him make both a solid Superman film and a twisted, darker show that still feels consistent. So the point is: know your material.
I’ve got a question for you. You and I agree that AI is advancing at a rate that suggests it will be able to do a lot at some point.
Scott Douglas Jacobsen: The limiting factor is power. Compute isn’t a casual resource; it’s massive. It’s a question of when each gigawatt compute centre comes online.
Rosner: My question is this: Cory Doctorow—who has a huge amount of technical knowledge, probably more than either of us about how tech actually works—thinks AI is never going to achieve anything like human competence. Why does Doctorow think this?
Jacobsen: If he’s stating that in absolute terms, it’s clearly wrong. As I was noting before, chess, white-collar jobs, text production, generating ideas, writing abstracts—AI already does those well, faster and better than most people. So I don’t buy the blanket argument. But I do accept the other half: there are areas where it still hasn’t reached human-level competence.
Rosner: Why is he saying this? It’s not necessarily pessimism because…
Jacobsen: Douglas Rushkoff started making the same argument a few years ago.
Rosner: Who did?
Jacobsen: Douglas Rushkoff. He is an anarchist, left-wing writer in the vein of Robert Anton Wilson and Timothy Leary. Then he pivoted with Team Human, which was about keeping human sensibilities and values in the mix. Maybe Cory Doctorow is going through a similar sentiment.
Rosner: I don’t think so. I think Doctorow likes to be realistic. One of his arguments is that there will be a huge crash in AI because it has no way of recouping the tens of billions spent on it.
Jacobsen: The only way is through defense contractors. With half a trillion dollars in projected spending, that’s where they’ll go to recover losses. Few other sectors can provide that scale.
Rosner: What will happen is a big crash. Stock values could drop by 75–80%. Some companies may go bankrupt, though probably not the largest ones. Afterward we won’t be starting from zero, but from a place where the money has been lost while the products—LLMs, models, and other AI systems—still exist. There will still be useful tools after the crash.

Doctorow argues those tools will be too expensive to use, because compute costs rise as models consume more data. But that’s not entirely true. You can prune models or build smaller, efficient ones. Humans themselves think effectively with far less data than these systems hold. Maybe not one-millionth, but vastly less. We still manage. What he’s saying could even be cause for optimism: if AI never achieves human-level competence, it will be less capable of destroying the world, whether by intention or accident. But it doesn’t seem realistic to me.
Last updated May 3, 2025. These terms govern all In-Sight Publishing content—past, present, and future—and supersede any prior notices. In-Sight Publishing by Scott Douglas Jacobsen is licensed under a Creative Commons BY‑NC‑ND 4.0; © In-Sight Publishing by Scott Douglas Jacobsen 2012–Present. All trademarks, performances, databases & branding are owned by their rights holders; no use without permission. Unauthorized copying, modification, framing or public communication is prohibited. External links are not endorsed. Cookies & tracking require consent, and data processing complies with PIPEDA & GDPR; no data from children < 13 (COPPA). Content meets WCAG 2.1 AA under the Accessible Canada Act & is preserved in open archival formats with backups. Excerpts & links require full credit & hyperlink; limited quoting under fair-dealing & fair-use. All content is informational; no liability for errors or omissions: Feedback welcome, and verified errors corrected promptly. For permissions or DMCA notices, email: scott.jacobsen2025@gmail.com. Site use is governed by BC laws; content is “as‑is,” liability limited, users indemnify us; moral, performers’ & database sui generis rights reserved.
