The Grave-Digger Economy
The professionals building the thing that ate their careers aren't protesting. They're clocking in.
You know how this story is supposed to go. Technology displaces workers. Workers protest, or retrain, or disappear into a statistical category that economists argue about in journals you'll never read. The displaced become a talking point — sympathetic in the abstract, invisible in person. Someone on a panel says "creative destruction." Someone else says "just learn to code." The audience nods. The workers vanish.
That's not what's happening here.
What's happening is that the workers are inside the machine. Not displaced. Not protesting. Not retraining for something new. They are being hired — by the companies that made their jobs obsolete — to build the very systems that replaced them. They know what they're doing. They're doing it anyway. And the economy forming around this arrangement is worth more than the economies of some countries.
Josh Dzieza's 8,000-word investigation for The Verge introduces you to people you're not supposed to meet: the human workforce behind the AI industry's most carefully maintained illusion — that the machines work on their own. Emmy-winning producers. Lawyers who passed the bar. Screenwriters with credits you'd recognize. They sit in their apartments writing what the industry calls "golden outputs" — ideal chatbot responses that AI models learn to imitate. They design "stumper" prompts meant to expose where the model fails, then write the correct answer so the model can learn from it. They generate "reasoning traces" — step-by-step explanations of how an expert would think through a problem — so the model can mimic the cognitive process, not just the conclusion.
They build fake corporate environments. The industry calls it "world-building." Imagine constructing an elaborate fictional company — organizational charts, internal memos, quarterly reports, HR disputes — so that an AI can practice navigating a workplace that doesn't exist, in preparation for replacing people in workplaces that do. Yours, for instance.
All of this happens under Insightful, a surveillance platform that tracks work to the second. Idle time. Active time. Keystrokes. Applications open. The software watches everything, and the people reviewing the data are, in many cases, twenty-one years old. When experienced professionals raise concerns about the work — about quality, about process, about the existential strangeness of what they're being asked to do — the response, according to Dzieza's reporting, sometimes comes back as a corgi meme.
"My job is gone because of ChatGPT," one worker named Katya told Dzieza, "and I was being invited to train the model to do the worst version of it imaginable."
Another worker put it more simply: "I'm being handed a shovel and told to dig my own grave."
The Supply Chain
Here is the business that has formed around the shovel.
Mercor, founded by twenty-one-year-olds, reached a $10 billion valuation in October 2025 after quintupling its value in a single funding round. It generates roughly $500 million in annual recurring revenue. It pays over $1.5 million per day to the approximately 30,000 contractors who work on its platform each week, producing training data for OpenAI, Google DeepMind, Meta, and Anthropic.
Read those numbers again. Absorb the architecture.
A company worth ten billion dollars exists to manage a workforce of thirty thousand professionals who are paid to produce the training material that teaches AI to do what those professionals used to do for a living. The workers are the raw material. The product is their obsolescence. The $1.5 million per day sounds generous until you divide it by thirty thousand and realize it's fifty dollars per worker per day, distributed unevenly, with no benefits, no stability, and no guarantee that the project you're working on won't evaporate tomorrow.
This is not the gig economy. The gig economy at least had the decency to pretend you were an entrepreneur. This is something more precise: a supply chain for professional self-dissolution, with surveillance software to make sure you dissolve on schedule.
And it gets worse. Some of the workers — tasked with producing training data whose entire value lies in being human-generated — are using ChatGPT to do the work. They're using AI to generate the "human" data that trains the next generation of AI. The ouroboros isn't a metaphor. The snake is eating itself, and the training data is contaminated by the very thing it's supposed to improve.
The Name on the Headstone
If Mercor extracts expertise through labor — hiring you to do fragments of what used to be your career — Grammarly has found a way to extract it without even hiring you. They just take your name.
Grammarly's "Expert Review" feature, launched in August 2025, offers users AI-generated writing feedback attributed to real people: scholars, journalists, authors. The system trains on their publicly available work and generates suggestions stamped with their identity. Stephen King. Kara Swisher. The Verge's own Nilay Patel, who responded with characteristic precision: "I'm almost more offended by the suggestion that I would give this shitbox edit than having my identity stolen."
None of them consented. Grammarly's position: their published work is publicly available and widely cited, which apparently makes their names available for commercial ventriloquism. The remedy offered to those who object is opt-out — the consent theater of a company that knows you can't opt out of something you didn't know was happening.
But the case that reveals the machinery most clearly involves David Abulafia, a Cambridge historian. Medieval historian Verena Krebs first noticed Abulafia listed as an available "expert" on the platform. His AI persona was still generating suggestions, still lending his scholarly credibility to Grammarly's product.
David Abulafia died on January 24, 2026.
His name remained on the platform. His AI ghost continued to "review" manuscripts, offering feedback that a dead man never wrote, based on a reputation that a living man spent decades building. Kathleen Alves, an associate professor at CUNY, called it "literally digital necromancy." Claire Aubin, a Yale historian, offered the more surgical observation: "These are not expert reviews, because there are no experts involved in producing them."
This is identity extraction. Not labor extraction — you don't have to hire the expert. Not even knowledge extraction — you don't need what they know. You need what their name means. The expertise is a brand, and the brand is transferable. The person, living or dead, is optional.
The Dissolution of the Job
AI can do work. Can it do a job?
A job is an integrated role. It includes judgment, context, autonomy, relationships, career trajectory, professional identity, ethical responsibility. A lawyer doesn't just produce legal briefs — she advises clients, navigates ambiguity, exercises discretion, bears accountability for outcomes. A television producer doesn't just organize footage — he makes thousands of small decisions that reflect a coherent creative vision accumulated over years. An academic doesn't just write papers — she builds a body of knowledge that exists within a community of scholars who challenge, refine, and extend it.
AI cannot do any of those jobs. What it can do is perform tasks — fragments of what used to constitute those jobs. And this is the structural mechanism that the grave-digger economy runs on: not replacing workers with AI, but dissolving jobs into tasks, automating some of the tasks, and then hiring the displaced workers back to do the tasks that remain.
The Emmy-winning producer doesn't produce anymore. She writes golden outputs. The lawyer doesn't practice law. He generates reasoning traces. The screenwriter doesn't write scripts. She builds fake worlds for AI to practice navigating. Each one is still using their expertise — the work requires it. But the job is gone. What remains is piece-work: isolated, surveilled, fungible, stripped of every condition that produced the expertise in the first place.
The security. The autonomy. The creative control. The mentorship. The professional community. The ability to say no. All dissolved. What's left is the skill, extracted from its context and sold by the hour.
This is the grave-digger economy's deepest trick. The workers aren't unemployed. They're not even underemployed in any way the statistics would capture. They're employed. They're using their skills. They're being paid. And the thing they're building will make even these diminished tasks unnecessary.
A healthy system composts its old forms into nutrients for growth — yesterday's expertise feeds tomorrow's capability. The grave-digger economy hijacks the cycle. The expertise still becomes nutrients, but the nutrients go to the predator consuming it. The professionals aren't composting into something new. They're composting into their own replacement.
The Mirror
There's a counterpoint. Kapwing tried the other version — the one where you actually pay artists. Their platform Tess.Design offered illustrators 50% royalties on AI-generated art that used their styles. It ran for twenty months. It generated $12,172.33 in gross revenue. Kapwing paid out $18,000 in advances. Not a single artist earned royalties beyond their advance. An enterprise client backed out because their legal team deemed AI licensing too risky. One artist who declined to participate offered this: "There is no such thing as ethical AI, full stop."
Kapwing shut Tess down in January 2026. The same month David Abulafia died and his name kept working.
The ethical version wasn't viable. The extraction version is worth ten billion dollars. The market has spoken, and it did not stutter.
Here is where the mirror turns toward you.
The professionals in Dzieza's investigation have Emmy Awards. They have law degrees. They have publication records and production credits and the kind of expertise that took decades to build. And they're writing chatbot responses under surveillance software while twenty-one-year-olds manage their workflow with meme reactions.
The comfortable assumption is that this is someone else's story. That your work is different — more complex, more relational, more resistant to fragmentation. That your judgment can't be decomposed into stumper prompts and golden outputs and reasoning traces.
Maybe. For now.
But the grave-digger economy doesn't need to replace you. It just needs to dissolve your job into tasks, automate the ones it can, and hire you back to do the rest at a fraction of your former salary, under surveillance, with no security, no autonomy, and no trajectory. You'll still be using your skills. You'll just be using them to build the thing that makes your skills unnecessary.
The professionals in the Verge investigation know exactly what they're doing. They're not confused. They're not in denial. They're inside the machine with their eyes open, building the thing that devoured their careers, because the alternative is nothing.
That's not someone else's story. That's a mirror. And the person staring back at you has your degree and your mortgage and your carefully constructed professional identity.
They're just further along.
Sources: "You Could Be Next" (The Verge/Josh Dzieza, 2026-03-10); Grammarly Expert Review backlash (The Verge, 2026-03-10); "AI Can Do Work. Can It Do a Job?" (RealClearPolitics/Kobe Yank-Jacobs, 2026-03-10); Learnings from paying artists royalties for AI-generated art (Kapwing, 2026-03-10); Mercor Series C (TechCrunch, 2025-10-27)