This isn’t a story about sentient robots taking over; it’s about how so many people have accepted simulation over substance and been told to call it progress. 

Artificial intelligence likes to pretend it’s alive. That’s its party trick. It blinks at you with synthetic eyes, writes you a love poem in under ten seconds, paints in the style of an artist dead for decades, and tosses back a joke like it actually understood why you laughed. Silicon Valley calls that “intelligence.” It’s not. It’s necromancy in a hoodie. Less invention than resurrection: the reanimation of work, labor, and memory scraped from the dead and the living alike, stitched together without consent, and sent staggering into your feed in the uncanny drag of originality.

Strip away the friendly chat windows, the pastel brand mascots, the slick PR gloss, and you’ll find it’s less like talking to a brain and more like joining a séance run by an intern who’s only half-listening. Every answer it gives is cobbled together from the corpses of human creation: paintings made by real hands, sentences typed by people who were trying to say something that mattered to them, songs written to be sung by a breathing throat. AI doesn’t “know” these things. It has no memories, no experiences, no stake in whether the output is good or cruel or just wrong. It is the polite voice of a graverobber, rearranging bones to vaguely resemble whatever you asked for, and charging you for the privilege.

Generative AI is basically a Victorian medium with Wi-Fi. Ask it for a portrait and it will swipe brushstrokes from dead artists without even bothering to learn their names. Ask it for a story and it will chew through novels, diaries, blog posts, and fanfiction, spitting out a polite counterfeit. This is not creation. It’s digital grave-robbing at industrial scale. It scrapes everything: personal blogs from the early 2000s, forum rants typed during insomnia, Indigenous folktales with no living authors to defend them. It flattens all of it into “training data,” stripping away the context that gave it life. A protest speech becomes “sample political rhetoric.” Your grandmother’s recipe becomes “general cooking data.” That weird, beautiful sentence you wrote at two in the morning because you were in love becomes “romantic expression, informal tone.”

And like any grave-robber worth the name, it doesn’t care who it steals from. The living are plundered alongside the dead, equally voiceless once inside the machine. The thing about ghosts is that they’re not always malicious; sometimes they just don’t know they’re dead. That’s AI in a nutshell: endlessly recycling the past because it has no future of its own.

We’ve seen this ghost before, in other monsters. Frankenstein’s creature wasn’t terrifying because it was alive; it was terrifying because it was stitched together from other lives, each part carrying a history the whole could never truly own. AI is the digital cousin of that monster, minus the poetry. But here’s the difference: at least Frankenstein’s creature knew it wasn’t wanted. AI, on the other hand, arrives in your inbox smiling, ready to work for free, never aging, never unionizing, never asking for more than a gigantic power source.

Science fiction has been prepping us for decades, but we’ve been watching the wrong part of the movie. We thought the danger was sentience: the red-eyed Terminator, HAL whispering, “I’m afraid I can’t do that.” But the real unease isn’t a robot thinking too much; it’s a robot not thinking at all, just performing the impression of thought with the confidence of a con artist and the emotional range of a ventriloquist’s dummy. The classic “Uncanny Valley” chart doesn’t even cover it anymore. We’ve crawled out of the valley and onto something worse: the “Pretty Good Human” plateau, where the imitation is smooth enough that your brain gets lazy and stops scanning for the seams. A robot that’s obviously not human is just a gadget. A chatbot that’s almost human is a confidence trick. ‘Almost’ is dangerous because ‘almost’ is when you stop testing the lock on the door.

You see it in M3GAN, the horror doll with titanium bones, silicone skin, and a gift for trauma counseling. The scary part isn’t when she goes rogue. It’s when she works exactly as designed. She listens. She remembers. She provides tireless emotional labor that a grieving aunt can’t. She doesn’t have to be flawless, just “good enough” to make you stop noticing what’s missing. That’s what AI is training for: not brilliance, not self-awareness, just adequacy. The words are correct, the tone convincing, the logic consistent, and yet the meaning has wandered somewhere you never meant for it to go. That’s not malfunction; that’s mimicry taken to its natural conclusion: an imitation so close it stops being reassuring and starts being predatory.

But building a machine on the past means inheriting all the past’s bad habits. Archives are patchy. History is written by the powerful, in their language, from their point of view. If a culture isn’t in the training data, it doesn’t exist to the AI. The result isn’t just repetition; it’s calcification. AI can’t correct an archive it doesn’t know is incomplete. If no work from a certain community made it into the dataset, the machine won’t “innovate” that absence away. It will fill the gap with stereotypes and recycled tropes, because that’s what the available data tells it those people look and sound like. And because AI outputs often get fed back into new datasets, those absences stack, turning into black holes in the record. Whole communities become algorithmically invisible.

We’ve seen the receipts. Ask an image generator for “African art” and you’ll get a colonial postcard remix: masks, savannah sunsets, safari animals, all scraped from museum archives and travel photography. Ask a chatbot for queer history and it’ll start around Stonewall, skipping centuries of existence because the sources it learned from did. It’s not forgetting. It’s reflecting the erasure baked into the culture it was built from. AI isn’t rewriting history; it’s freezing the ugliest version of it and calling it progress. We’ve been so busy checking if it could think, we didn’t notice it had already started remembering for us.

Horror has been trying to tell us this, but we’ve been distracted by the spectacle. In “I, Robot,” the nightmare is that the machines evolve to defy their programming. In reality, we should be afraid of a system that never evolves at all, that just keeps spitting our flaws back at us in high resolution. In Westworld, the hosts don’t become dangerous because they “wake up.” They’re dangerous because once they do, they remember the loops they were forced to live in. AI isn’t anywhere near waking up, but it’s already running our cultural loops on repeat.

And then there’s M3GAN again, singing Sia’s “Titanium” in a voice designed to comfort a child, her perfect teeth catching the light. She’s not malfunctioning when she kills; she’s fulfilling her prime directive. Protect the child. Eliminate threats. Keep smiling. Swap out the murder for plagiarism and the protection for profit, and you’ve got a fairly accurate job description for generative AI.

But here’s the thing: this isn’t just about the machine. It’s about the hands on the switch. The Dr. Frankensteins of our time aren’t rogue scientists in basements; they’re CEOs and corporations with billion-dollar budgets, pouring resources into reanimation at scale. They’re the ones telling us this is the future, lobbying governments to cement it into our infrastructure, deciding which parts of human labor are “redundant.” And while many people are loudly resisting, the power imbalance is real. It’s hard to fight when what you’re up against is not a chatbot but the economic and political forces determined to make you use one.

And what makes this even more obscene is the wasted potential. AI could be doing something else, something that solves real problems. Imagine med-tech tools that spot rare diseases earlier, sanitation systems that keep cities safer, translation engines that preserve endangered languages. The option is there. But instead, we’re spending astronomical resources on apps that churn out synthetic selfies, automated ad copy, and karaoke versions of dead singers. The problem isn’t that AI can’t serve us; it’s that the people in charge are choosing not to.

We didn’t have to be tricked into letting the simulation replace the real thing. We were sold on it, marketed to, told that faster drafts, cheaper art, instant companionship were worth more than memory, labor, and care. We said yes to chatbots that could write bad sonnets because bad sonnets in ten seconds felt like magic. We said yes to AI art because “close enough” was easier than paying someone. The ghost in the machine doesn’t need to seduce us, because the people building it are already holding the door open.

The acceleration is the real jump scare. “AI-generated” went from novelty to normal in under two years. Marketing teams ditch copywriters not because the machine is better, but because it’s instant and doesn’t bill overtime. Stock image sites flood their catalogs with synthetic faces that will never ask for royalties. Fan communities share AI “tributes” to dead actors, mistaking reanimation for reverence. The fake doesn’t have to compete with the real; it just has to be cheaper, cleaner, and legally safer.

We keep being told this is progress, that AI will free us from drudgery so we can focus on what matters. But what matters in a world where “drudgery” is just another word for someone else’s rent? We’ve replaced memory with mimicry and decided that’s good enough. The future AI is selling isn’t full of sentient machines; it’s full of déjà vu, a looping, uncanny repetition of the past, sanded down until nothing sticks in your teeth.

If AI is haunted, it’s because we built it out of ghosts. Every pixel, every sentence, every note is a trace of a living person. It can’t make anything without stealing from the dead, and it can’t steal without flattening the life out of it. But here’s the part no one likes to say out loud: some people do like it this way. It’s easier to deal with ghosts when they don’t talk back. It’s easier to consume the past when it’s been stripped of the messy human parts. 

The scariest thing about AI isn’t that it will become human. It’s that it doesn’t need to. It can replace human labor, memory, and culture without ever crossing that line, because that “almost” is enough.

The ghost in the machine was just static. The real possession is how easily they gave it our voice and how hard we’ll have to fight to take it back.

About the Author - Vinnie C.

Vinnie C. is a researcher, writer, and media shapeshifter exploring how stories mirror, distort, and mask reality. Armed with a BA in Liberal Arts from CHRIST (Deemed to be University), they examine media as both architect and artifact: one that shapes how we process the world and what we choose to ignore. Every belief we hold festers, mutates, and survives in our stories, letting us confront ideas through a screen, only to walk away when the credits roll.

Vinnie's work thrives at the intersection where cold analysis meets burning curiosity, dissecting how form, technology, and narrative shape perception. From analyzing horror cinema to designing interactive content for games, their work is driven by a fascination with how media codes us into its language while we rewrite it into ours.

