Nostalgia for Specialness
Mark Carney said it plainly: nostalgia is not a strategy. He was talking about geopolitics, about Canada's relationship with an America that no longer plays by the old rules. But the line lands harder than he intended. It cuts through the entire discourse about AI and work, the endless back-and-forth between doomers and boosters, the think pieces and policy papers and LinkedIn manifestos. Nostalgia is not a strategy. You cannot wish your way back to a world that isn't coming back. Accept the fracture. Move forward.
He's right. And almost everyone responding to him is proving his point while thinking they're refuting it.
Watch the discourse. Not the doom-and-gloom predictions about truck drivers and warehouse workers. Those arguments are old, and besides, the people making them aren't talking about themselves. Watch instead what happens when knowledge workers confront AI. The lawyers, the consultants, the analysts, the writers. The people who thought they were safe because they "think for a living."
They're not denying the threat. That's last year's cope. Now they're doing something more interesting: hunting for the residue. The uniquely human. The part the machines can't reach.
Creativity. Empathy. Complex judgment. Emotional intelligence. Strategic thinking. Leadership. Taste. Wisdom. The ability to "truly understand" context, to read between the lines, to grasp what the client really means. The scramble is on to inventory what remains when the algorithms take the rest.
This search is its own form of nostalgia. Not nostalgia for a job or an industry. Nostalgia for human specialness. For the comfortable belief that somewhere in the cognitive stack there's a layer of magic the machines can't touch. That we're not just pattern-matching all the way down.
Here's the part no one wants to say out loud: if AI can do your cognitive work, maybe the work was more mechanical than you believed. Not that you were bad at it, not that you weren't skilled, but that the skill was never what you thought it was.
Your "creativity" was recombination. Novel outputs assembled from a vast library of inputs you'd absorbed over decades, remixed according to heuristics you couldn't articulate but that were, in principle, articulable. Your "judgment" was pattern-matching trained on experience, refined through feedback loops you'd forgotten you'd learned from. Your "insight" was synthesis that felt like epiphany because you couldn't see the wiring underneath.
This is a reveal, not an insult. The machine doesn't diminish what you did. It illuminates what you were doing. The work was always more mechanical than the story you told about it.
Consider the radiologist. For decades, reading X-rays and MRIs felt like expertise, like trained intuition that took years to develop. It was. But it was also pattern recognition at a level of complexity that felt like something more because humans couldn't introspect on their own processing. When the machine matches or exceeds that performance, it doesn't prove the radiologist was a fraud. It proves that pattern recognition was what the job actually was, dressed in the narrative of medical judgment.
The same reveal is coming for lawyers drafting contracts. For consultants synthesizing market research. For analysts building financial models. For writers constructing arguments. The machine arrives, performs the task, and in performing it, shows you what the task actually consisted of. Not magic. Mechanism.
The real loss isn't the job. It's the story about the job.
We built elaborate narratives about what knowledge work required. The years of training. The hard-won intuition. The judgment that couldn't be taught, only developed. The creative spark. These narratives were load-bearing. They justified salaries. They explained, to you and to everyone else, why you mattered, why your contribution couldn't be commoditized, why you weren't just another input to be optimized away.
AI isn't threatening the job. Plenty of jobs will survive, at least for a while, at least in some form. What's threatened is the story. The narrative that what you did required something irreducibly human. Something that couldn't, in principle, be specified, formalized, automated.
When that story breaks, what's left?
This is why the "uniquely human" hunt has such desperate energy. It's not really about predicting which tasks survive automation. It's about preserving the narrative of specialness. Finding the thing that restores the story, that lets you believe the machine is just handling the routine stuff while you do the real work, the human work, the creative-empathetic-strategic work that couldn't possibly be reduced to computation.
But each candidate keeps falling. Creativity? The models generate. Empathy? They simulate it well enough for most practical purposes. Complex judgment? They're getting there. The goalpost moves and moves again, and at some point you have to ask: what if there's no final sanctuary? What if the residue you're hunting for doesn't exist, at least not in any form the economy knows how to value?
The non-nostalgic position is harder than anyone is admitting.
It's not "reskill and adapt." That's just a different nostalgia, for a world where human flexibility could always outrun technological change, where there was always another job on the other side of disruption. The evidence for that world is weaker than we pretend. Historical transitions that "worked out" did so over generations, and for the individuals caught in the gears, they often didn't work out at all. The handloom weavers didn't reskill. They suffered and died and their children did something else.
It's not "find the uniquely human." That search, as I've said, is nostalgia dressed in a lab coat. It keeps the story of specialness alive while pretending to be hard-headed about capabilities.
The non-nostalgic position is this: maybe there is nothing economically valuable that is also uniquely human.
Sit with that for a moment. Not as doom-saying, but as a genuine possibility that the discourse keeps sliding away from. The things that are actually unique to humans, that the machines genuinely cannot replicate or even approximate, might not be things the economy ever wanted. The capacity for suffering. The need for meaning. The ability to love other people and waste time beautifully with them. The experience of being a body in a world, moving through seasons, getting older, fearing death.
These are real. These are ours. But the economy never paid for them directly. It paid for outputs that machines are learning to produce. The question isn't whether humans are special. The question is whether human specialness is what the market was ever buying.
So what opens up when you stop defending the story?
Not comfort. I'm not going to tell you it's all fine, that meaning awaits on the other side of disruption, that this is secretly an opportunity. The nostalgic hunt for the uniquely human is desperate because the stakes are real. Identities are built on these narratives. Livelihoods depend on them. The story wasn't just a story; it was a structure that held up lives.
But defending a story that's breaking is its own kind of trap. It locks you into arguments you can't win, into a rearguard action against capabilities that keep advancing. It forces you to keep hunting for the sanctuary that might not exist, and to ignore or downplay evidence that the sanctuary has already fallen.
The non-nostalgic position isn't optimism or pessimism. It's clarity. Seeing the situation without the narrative that made it bearable. And from that clarity, maybe, the actual questions become visible:
If human worth was never really about economic productivity, what is it about? We used to have answers to this question, before the modern economy outsourced meaning to the labor market. Religious answers. Philosophical answers. Answers that located human dignity in something other than usefulness. Those answers have their own problems, but at least they're answers. The economy's answer was always circular: you matter because you produce, and production matters because it employs people who matter. When the circle breaks, you need something else.
What would it take to build a society that doesn't require the story of human specialness to function? Not as utopia, but as practical engineering. Policies, institutions, narratives, ways of organizing life that don't depend on the premise that humans are economically irreplaceable. We might have to build this anyway. We might as well start thinking about it clearly, rather than spending the next decade defending a story we're not sure we believe.
Nostalgia is not a strategy. Neither is the frantic search for what makes us special. The question was never whether humans are special. The question is whether we can build a world that doesn't require us to prove it.
Sources
- Mark Carney, "Principled and pragmatic: Canada's path," Prime Minister's Office, January 20, 2026.