
Walking and Flying

There's a particular kind of knowing that comes from walking—from months spent in neighborhoods where you learn which doors open easily and which remain forever closed, from conversations that drift and return like tides, from the smell of cooking that tells you more about a place than any survey could capture. Ethnographers have walked like this for decades: slowly, attentively, building trust one cup of coffee at a time, noticing the things people don't say as carefully as the things they do. It's intimate work, this ground-level knowing; embodied, reciprocal, achingly slow.

Now consider another scene: eighteen days, a million conversations clustered by machine, patterns emerging from the computational ether that no human could have spotted walking. An algorithm discovers that twenty-three percent of DoorDash merchants fail to update their inventory in real-time, causing cascading cancellations that ripple through the system like stones thrown in digital water. The fix is obvious, the implementation swift, the impact measurable in percentage points and quarterly reports. What took ethnographers eighteen months now takes eighteen days—and the acceleration, we're told, is just beginning.

The bifurcation feels absolute: walking versus flying, intimacy versus altitude, the human versus the mechanical. Yet what looks like a simple choice from one angle reveals itself, from another, as something more troubling—not a choice at all, but a drift we're pretending not to notice, an inevitability we dress in the language of innovation while the ground beneath our feet quietly disappears.

The Nature of the Transition

Digital ethnography, it should be said, isn't the issue here. Researchers have been walking through Discord servers and Reddit threads for years, doing the same patient work of observation and participation, just in digital spaces rather than physical ones. That's still walking, still building relationships over time, still present in the unfolding of community life. User experience research, too, has its established methods—the careful interviews, the usability tests, the surveys with their small samples and closed questions. Both valuable; neither new.

Computational ethnography is something else entirely—a fundamental epistemological shift dressed in familiar clothing. Here, large language models digest millions of conversations in the time it takes to brew coffee; algorithms cluster themes across populations rather than samples; patterns emerge not from years of patient observation but from the brutal efficiency of machine learning applied to human discourse. The relationship changes: no longer reciprocal but extractive, no longer participatory but post-hoc, no longer chosen but captured through terms of service nobody reads.

Anthropic analyzes Claude conversations to surface the desire for Socratic questioning (implemented within weeks as a product feature); DoorDash discovers those inventory gaps that plague their merchants (fixed through better integration, no merchant training required); Uber identifies driver shortages before they become crises (addressed through algorithmic incentive adjustments); restaurants find customers asking in free text about items that never appear in structured data (servers prompted to suggest high-margin alternatives, revenue climbing). This is genuine power—the ability to see patterns invisible from the ground, to fix systemic problems at scale, to ship and measure and iterate faster than any competitor still walking.

The seduction is obvious; the trade-offs less so.

The Drone's Perspective

Think of it as the difference between walking and flying—though even metaphors, as Borges knew, can mislead us. Walking means ground-level engagement: embodied, intimate, necessarily slow. You smell the cooking, feel the tension in rooms, learn the names of children, understand why certain corners feel safe and others don't. Flying means altitude: mediated, pattern-seeking, breathtakingly fast. From above, you see traffic patterns invisible to drivers, demographic shifts imperceptible to residents, connections that only distance can reveal.

But here's where the metaphor stumbles: drones can zoom in with extraordinary precision. Thermal imaging pierces walls; LiDAR maps topography to the millimeter; high-resolution cameras capture individual faces from a thousand feet. Similarly, large language models can perform remarkably close readings of individual conversations, zooming from population-level patterns to particular instances with a fluidity that seems almost magical. Altitude, we're learning, doesn't necessarily mean shallow.

The real limitation isn't resolution but presence—or rather, its absence. Even examining a single conversation with exquisite computational care, you weren't there when it happened; you can't see the pause before someone speaks, the glance that changes meaning, the child crying in the background that explains the abrupt ending. You observe post-hoc, not in-the-moment. You analyze what was said, not what was carefully not said. The distinction isn't about how closely you can see, but about the nature of seeing itself.

What Walking Reveals (And What We're Losing)

Jane Jacobs spent years walking Greenwich Village before she understood what made neighborhoods work. She noticed how shopkeepers became "eyes on the street," their casual presence creating safety without surveillance; she saw which corners attracted lingering and which repelled it, how the ballet of the sidewalk—that unconscious choreography of urban life—emerged from the interplay of mixed uses, of commerce and residence creating life together. No dataset could have captured why certain plazas felt welcoming while others felt hostile; no algorithm could have identified the precise alchemy that transforms a space into a place.

Arlie Hochschild drove the roads of rural Louisiana for years, drinking coffee in living rooms where Fox News played constantly, sitting through church services where politics and faith intertwined in ways that confounded coastal assumptions. She discovered what she called "The Deep Story"—an emotional narrative that explained political views not through policy positions but through feelings of betrayal, of watching others "cut in line" while you waited patiently for your turn at the American Dream. The vulnerability required to hear that story, to have it shared with you rather than extracted from you, demanded time, presence, reciprocity. You can't cluster your way to that kind of understanding.

Sherry Turkle sat with families as they navigated life with devices, watching parents and children avoid eye contact through the mediation of screens. She documented the "seven-minute rule"—how long people could sustain dinner conversation before reaching for phones—and the phantom vibrations that haunted pockets even when devices were elsewhere. She caught the particular quality of being alone together, that modern predicament of physical presence paired with emotional absence. Usage data would show frequency of phone checking but not the texture of the loneliness it both caused and claimed to solve.

This kind of knowing—embodied, slow, particular—is becoming the province of the elite. (Who else can afford eighteen months for a single study? What institution will fund presence when patterns can be had in days?) Like artisanal bread or analog photography, walking ethnography is drifting toward craft status: admired, expensive, increasingly irrelevant to how the actual work gets done.

What Altitude Delivers

Yet we should be honest about altitude's genuine achievements, which require no qualifiers or apologies. When Anthropic analyzed those million Claude conversations in eighteen days, they discovered something no amount of walking could have revealed: a consistent user desire for more Socratic dialogue, for AI that questions rather than merely answers. The pattern cut across demographics, use cases, and contexts in ways that would have taken years to surface through traditional methods—if it could have been surfaced at all.

DoorDash's discovery about merchant inventory wasn't just operational intelligence; it was systemic insight. Twenty-three percent of merchants failing to update inventory in real-time created cascading effects throughout the network: customer frustration, wasted driver time, damaged restaurant reputations, eroded platform trust. One fix—better technical integration—solved what thousands of individual merchant trainings could never have addressed. This is the power of altitude: seeing the system as a system, identifying leverage points that ground-level observation would miss.

The restaurant finding that customers ask about unlisted items in fifteen percent of orders reveals altitude's ability to surface the invisible. Those queries, buried in free text, never appeared in structured data; they were literally invisible to traditional analytics. Yet they represented latent demand, unrealized revenue, customer needs going unmet. The fix was simple—prompt servers to offer specific high-margin items—but finding the pattern required computational power that no amount of walking could replicate.

These aren't marginal improvements; they're categorical leaps in our ability to understand and respond to human behavior at scale. Altitude reveals unknown unknowns—patterns we didn't know to look for; enables comparison across contexts—languages, markets, cultures—that walking could never span; tracks temporal evolution—how patterns shift and change—in real-time rather than retrospectively; identifies statistical rarities—long-tail events, unusual combinations—that might never surface in small samples.

The power is real. The question is what kind of power it is, and what it costs us to wield it.

The Political Economy of Drift

But individual consciousness—however acute—cannot explain the wholesale transformation we're witnessing. Sartre was right about bad faith, about our tendency to pretend we have no choice when we're actively choosing not to choose. Yet even if every ethnographer suddenly chose radical authenticity, acknowledged their freedom, felt the full weight of their decisions, the drift toward altitude would continue unabated. The forces driving this transformation aren't psychological but structural; they operate at the level of capital flows, institutional incentives, network effects. They're the kind of forces that shape civilizations whether individuals notice them or not.

Consider the brutal logic of capital. Traditional ethnography costs months of researcher time per study, produces insights that resist easy quantification, generates understanding that may not translate into product decisions, and—perhaps most damningly—doesn't scale: ten times more users doesn't mean ten times more insight without ten times more researchers. Computational ethnography, by contrast, analyzes millions of conversations for roughly the same cost as thousands; delivers actionable insights in days, not months; produces clear metrics ("pattern X leads to improvement Y"); scales elegantly with growth—more users means more data means better patterns.

In competitive markets, this isn't a choice; it's an evolutionary pressure. Company A adopts computational methods, ships features faster, captures market share. Company B maintains traditional ethnography, develops deeper understanding, moves thoughtfully—and loses. The market doesn't care about depth if speed captures value first. Capital flows to what works, and what works is what's measurable, scalable, fast.

The institutional logic follows similar patterns. Universities reward publication in top venues—computational ethnography enables faster publication cycles; grant agencies fund scalable research—"we'll analyze millions" beats "we'll walk neighborhoods"; tenure committees count papers—quantity matters when the clock is six years. Junior faculty, understanding these dynamics perfectly, adopt computational methods not from conviction but from necessity. Methods courses, responding to both faculty research and job market demands, teach LLM pipelines and clustering before—or instead of—building trust and noticing silence. The next generation learns to fly because walking, increasingly, leads nowhere.

Labor markets codify the transformation. Tech company job postings seek "User Researchers" with experience in "large-scale log analysis, LLM prompt engineering, clustering methods"—not "deep fieldwork, participant observation, relationship building." The salary differential is stark: computational ethnographers at tech companies earn $150,000 to $250,000; traditional ethnographers at nonprofits or academia earn $60,000 to $90,000. Smart, ambitious people with student loans and families follow the money—wouldn't you?

Network effects compound these pressures exponentially. Each company that successfully uses computational ethnography creates best practices that make adoption easier for others; each success story becomes a conference talk, a Medium post, a Harvard Business Review article; each tool that gets built lowers barriers for the next adopter. Meanwhile, traditional ethnography grows more isolated: fewer practitioners means less innovation; smaller community means less support; declining relevance means less funding. The gap widens with each iteration.

The Machine's Machine

Sartre—who understood both individual consciousness and structural force—offers a framework that captures our particular predicament. His concept of the "practico-inert" describes systems that begin as human creations but solidify into structures that then shape their creators. We build machines to serve us; the machines become infrastructure; the infrastructure imposes its logic on us; we reshape ourselves to fit what we've built. We become, in his memorable phrase, "the machine's machine."

The pattern is seductively gradual. We create large language models to scale qualitative analysis (a tool to serve our needs); the tool proves so effective that research questions reshape around what LLMs can process (the tool shapes the work); everything must become computationally tractable to be studied at all (the logic of the tool becomes the logic of the field); researchers optimize themselves for computational methods (we become what the system needs). Each step seems reasonable; together they constitute a transformation so complete we forget things were ever different.

Watch how it manifests in practice. "I have to cluster at scale," we say, as if physics demanded it rather than economics. But we're choosing—choosing speed over depth, scale over intimacy, measurement over meaning. The discomfort we feel saying this aloud, the way we dress it up in inevitability, reveals the bad faith. "The algorithm surfaced this pattern," we announce, displacing agency onto code. But we chose the algorithm, the parameters, the data, which patterns to investigate and which to ignore. The algorithm is instrument, not actor—though calling it actor absolves us of responsibility.

"This is just how research is done now," we declare, treating a decade's drift as natural law. But nothing about this is natural or necessary; it's chosen, constructed, contingent. Other arrangements are possible—economically unlikely perhaps, but philosophically available. The fact that we can barely imagine alternatives reveals how thoroughly we've internalized the machine's logic.

Yet even here, Sartre insists, consciousness overflows every system that attempts to contain it. The daydream that drifts through your mind while validating cluster assignments; the moment of doubt—"is this actually meaningful?"—that no amount of optimization can eliminate; the part of you that notices you're being shaped, that recognizes the reshaping even as it happens. This overflow, irreducible and unquantifiable, is what remains human in us despite the mechanization. The question is whether we cultivate it or numb it with bad faith.

How to Do It Consciously

If we're going to fly—and let's be honest, we are—we might at least do it with our eyes open. The CLIO framework (cluster, label, interpret, operationalize) that computational ethnography employs becomes different when performed consciously, when each step acknowledges what it's choosing and what it's trading.

When you summarize millions of conversations into manageable chunks, you're choosing what counts as "key"—whose voices matter, which themes surface, what gets compressed into silence. Ask yourself: what did I just delete as noise that might have been signal? Which humans did I just reduce to data points? I'm choosing compression over completeness, accepting that trade for scale—but I notice what I'm trading.
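
To see where that choosing happens, consider a minimal sketch of the summarization step, in Python, with a hypothetical call_llm() helper standing in for whatever model endpoint you actually use. The deletion happens quietly, inside the prompt.

```python
# A sketch of the summarization step. `call_llm` is a hypothetical
# placeholder, not a real library call; the prompt is where "what counts
# as key" gets decided, silently and at scale.

def call_llm(prompt: str) -> str:
    """Hypothetical wrapper: wire this to the model of your choice."""
    raise NotImplementedError

def summarize_conversation(transcript: str, max_words: int = 75) -> str:
    prompt = (
        f"Summarize the user's core request in under {max_words} words. "
        "Ignore greetings, small talk, and anything off-topic.\n\n"
        f"{transcript}"
    )
    # Whatever the prompt calls "off-topic" is deleted here, permanently.
    return call_llm(prompt)

def summarize_all(transcripts: list[str]) -> list[str]:
    return [summarize_conversation(t) for t in transcripts]
```

Change the phrase "off-topic" in that prompt and you change which humans become noise.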

When you enrich with metadata—sentiment scores, urgency flags, user segments—you're imposing categories that don't exist in nature. These taxonomies are your creation, your imposition of order on irreducible complexity. The humans you're categorizing exceed every category you create. You know this. The question is whether you remember it when the dashboards look so clean.
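
A sketch of what that imposition looks like in practice; the field names, thresholds, and labels below are illustrative assumptions, not anyone's actual schema.

```python
# A sketch of the enrichment step. "sentiment", "urgency", and "segment"
# do not exist in the conversation; they exist in this schema. The rules
# below are deliberately crude placeholders; real pipelines swap in
# classifiers, which impose the same hand-built taxonomy with more polish.

from dataclasses import dataclass

@dataclass
class EnrichedRecord:
    summary: str
    sentiment: str   # our bins: "positive" / "neutral" / "negative"
    urgent: bool     # a yes/no forced onto a continuum
    segment: str     # "power user", "casual": our labels, not theirs

def enrich(summary: str, metadata: dict) -> EnrichedRecord:
    text = summary.lower()
    sentiment = "negative" if "refund" in text else "neutral"
    urgent = any(word in text for word in ("urgent", "asap", "immediately"))
    segment = "power user" if metadata.get("sessions", 0) > 50 else "casual"
    return EnrichedRecord(summary, sentiment, urgent, segment)
```

Every field in that record is a decision; none of them was in the conversation.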

When algorithms cluster conversations into patterns, those patterns are artifacts of your choices—which embedding model, how many clusters, what counts as similarity. Different choices would surface different patterns; other realities would emerge from other parameters. You're not discovering; you're constructing. The patterns are real, but they're not inevitable. You could have found others.
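
A sketch makes the contingency visible, assuming for illustration an off-the-shelf sentence-embedding model and k-means from scikit-learn; any other embedding model or cluster count would serve, which is exactly the point.

```python
# A sketch of the clustering step. The embedding model and the value of k
# are choices, not discoveries: rerun with a different model name or a
# different k and different "patterns" emerge from the same conversations.

from sentence_transformers import SentenceTransformer  # assumed dependency
from sklearn.cluster import KMeans                      # assumed dependency

def cluster_summaries(summaries: list[str],
                      k: int = 40,
                      model_name: str = "all-MiniLM-L6-v2"):
    embeddings = SentenceTransformer(model_name).encode(summaries)
    return KMeans(n_clusters=k, random_state=0).fit_predict(embeddings)

# Same data, two different constructions:
#   labels_a = cluster_summaries(summaries, k=20)
#   labels_b = cluster_summaries(summaries, k=200)
```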

When you validate the output, are you actually reading with attention—noticing not just whether the categories are correct but what they assume, what they exclude, who they serve? Or are you rubber-stamping, providing human legitimacy for mechanical judgment? If you're approving ninety-five percent of what the machine produces, the machine is doing the thinking. You're the human in the loop, but the loop is what matters, not the human.
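
One way to keep yourself honest is to make the loop measure its own human: sample from each cluster, record how often you actually override the machine, and look at the number. A rough sketch, with the review done at a terminal prompt for illustration.

```python
# A sketch of a validation pass that keeps score on itself: sample a few
# items per cluster, record whether the reviewer overrides the machine's
# grouping, and report the approval rate at the end.

import random

def review_sample(labels, summaries, per_cluster: int = 5) -> float:
    by_cluster: dict = {}
    for label, summary in zip(labels, summaries):
        by_cluster.setdefault(label, []).append(summary)

    approved = overridden = 0
    for label, items in by_cluster.items():
        for summary in random.sample(items, min(per_cluster, len(items))):
            print(f"[cluster {label}] {summary}")
            if input("keep this grouping? (y/n) ").strip().lower() == "y":
                approved += 1
            else:
                overridden += 1

    rate = approved / max(approved + overridden, 1)
    print(f"approval rate: {rate:.0%}")  # near 100%? the machine is thinking
    return rate
```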

The Practices of Preservation

Given that the drift is economically inevitable—and clarity demands we admit this—what remains possible? Not resistance, which would be romanticism, but consciousness: the deliberate cultivation of what won't optimize, what can't scale, what remains stubbornly, irrationally human.

Practice radical acknowledgment. Replace "I have to" with "I choose to because..."—and feel the weight of the because. "I choose computational ethnography because it's faster and I need tenure." "I choose altitude because my company demands quarterly results." "I choose efficiency because the alternative is unemployment." Fine. But own it. The discomfort you feel is consciousness refusing to be optimized away.

Maintain awareness of what's being lost—not through nostalgia, which is just another form of bad faith, but through deliberate attention. Keep asking questions logs can't answer: Why did they choose us in the first place? What alternatives did they consider but not pursue? What are they not saying, and why? These questions have no computational solution. That's precisely their value.

Preserve inefficient acts—small rebellions against the logic of optimization. Walk a neighborhood without purpose. Have a conversation without agenda. Write something that will never be clustered, analyzed, or operationalized. These acts accomplish nothing measurable. They assert freedom against mechanism, humanity against efficiency. They're how you remain human while using the machine.

Name your complicity without self-flagellation. "I benefit from computational methods." "My career is built on altitude." "This piece provides intellectual cover for a practice I'm ambivalent about." Not confession but acknowledgment—the difference between bad faith and authentic participation in structures you didn't choose but can't escape.

The Ethics We Can't Escape

The ethical dimension demands particular honesty. Traditional ethnography, for all its limitations, maintained certain principles: informed consent (participants knew they were being studied); reciprocity (the relationship benefited both parties); the right to withdraw (ongoing consent could be revoked); institutional review (IRB oversight, however imperfect). These weren't perfect safeguards, but they acknowledged research as a moral relationship between humans.

Computational ethnography operates on different terms entirely. Terms of Service substitute for informed consent—legal coverage, not ethical agreement. Users don't know their conversations are being clustered, analyzed, transformed into product features. The relationship is extractive: companies benefit through insights and revenue; users receive "improved products" that they didn't ask for and can't refuse. Data, once logged, becomes corporate property—no withdrawal, no recourse, no reciprocity.

Let's not pretend these are equivalent ethical frameworks. Computational ethnography is closer to surveillance than research—the scale, the lack of awareness, the impossibility of opting out except by not participating at all. The "ethnography" label provides humanistic cover for what is essentially population monitoring. We can acknowledge this while still doing it—honesty doesn't require abstinence—but we shouldn't confuse what we're doing with what ethnographers used to do.

When to Walk, When to Fly

The practical question—after all this analysis—remains stubbornly simple: when should you walk and when should you fly? The answer depends on what you're trying to understand, though even asking the question this way assumes more choice than most of us have.

Fly when population patterns matter more than individual depth; when genuine urgency justifies speed (actual crises, not manufactured deadlines); when you're looking for unknown unknowns that only emerge at scale; when systemic fixes are possible—pattern identification leading to infrastructure changes; when comparison across contexts would be impossible any other way.

Walk when texture and meaning matter more than patterns; when building trust is prerequisite to understanding; when what's not said carries as much weight as what is; when you need to participate, not just observe; when one case, deeply understood, matters more than millions, shallowly analyzed.

But let's be honest: you'll mostly fly. It's faster, cheaper, what employers expect, what competitors are doing, what the tools enable. That's fine—hypocrisy is pretending otherwise, not doing what economics demands while wishing it were different. The question isn't whether you'll fly but how consciously you'll do it.

What This Means for Practice

For companies, the implications are straightforward if uncomfortable. Use computational ethnography for what it does well: operational intelligence, system optimization, pattern recognition at scale. These are genuine capabilities that create real value. But know what you're not learning: why people choose you rather than just that they do; what alternatives they considered; what meaning they attach to using your product; what your invisible users—those who leave no trace—actually need. These questions require walking, and walking requires investment that quarterly capitalism rarely supports.

For researchers, the credential game is essentially over. Computational methods get hired, published, funded, tenured. Traditional ethnography becomes a luxury you might maintain on the side, subsidized by your computational work—a kind of methodological hobby that you pursue for love rather than career advancement. The choice isn't between computational and traditional but between computational with awareness and computational with bad faith.

For the field itself, we're witnessing a speciation event. "Ethnography" is splitting into two distinct practices that share a name but little else: computational ethnography (pattern recognition at scale) and traditional ethnography (meaning-making through participation). As with portrait painters and photographers after the camera's invention, both will persist, but in different niches, serving different needs, valued differently by society.

The Weight of Freedom

Sartre's most challenging insight—challenging because it offers no escape—is that we remain free even when freedom feels impossible. Every constraint we face, every pressure we feel, every "necessity" we invoke is real but not determining. We choose within constraints, but we choose. The company demands quarterly results, tenure requires publications, mortgages must be paid—all true. Yet within these pressures, consciousness persists, capable of recognizing what it's doing even when it can't do otherwise.

This is what Sartre meant by being "condemned to be free"—not that we can do whatever we want, but that we remain responsible for what we do within the constraints we face. The discomfort you feel reading this, the mixture of recognition and resistance, the "yes, but..." forming in your mind—that's consciousness refusing to disappear into mechanism. That refusal, however small, however practically irrelevant, is what keeps us human.

The question isn't whether we'll become the machine's machine—that transformation is already underway, accelerating with each iteration. The question is whether we'll become it consciously, maintaining awareness of what we're choosing even when choice feels impossible, preserving some irreducible core that notices, questions, resists, even while participating in what it questions.

Final Thoughts

We stand at a particular moment—though every generation thinks its moment is particular—when the tools we've created to understand human behavior are reshaping both the behavior and the understanding. The flywheel of platform capitalism taught us that extraction has limits; eventually you run out of trust to violate, commons to enclose, goodwill to monetize. Computational ethnography may teach us a parallel lesson about knowledge itself: that speed and scale, pursued without limit, eventually exhaust meaning, that patterns without presence become empty forms, that understanding stripped of relationship becomes mere surveillance.

Yet even knowing this, we'll continue flying—the economics are too compelling, the advantages too real, the alternatives too costly. The drift from walking to altitude isn't a choice we're making but a transformation we're undergoing, as inexorable as any technological shift in human history. Like the movement from memory to writing, from craft to industry, from presence to mediation, this change will produce genuine losses and genuine gains, new capacities and new blindnesses.

The task—if we can call it that—isn't to stop this transformation or even to slow it. The task is to remain human while it happens: to maintain awareness of what we're trading, to preserve what won't optimize, to cultivate consciousness that overflows every system that attempts to contain it. This is harder than resistance would be; resistance at least offers the comfort of opposition. What we're asked to do instead is participate consciously in something we're ambivalent about, to fly while remembering how to walk, to become the machine's machine while retaining some irreducible core that knows what it's becoming.

That knowledge—uncomfortable, practically useless, impossible to optimize away—is what remains of our humanity in the age of altitude. It's not much, perhaps, but it's not nothing. And in a world increasingly organized around what can be measured, clustered, and scaled, the immeasurable becomes precious precisely because it has no value—economic value, that is. Its value is existential: it's what makes us more than the patterns we produce, more than the data we generate, more than the insights we can be mined for.

Protect it. Not because it will change anything—it won't—but because it's what remains human in us after everything else has been optimized. In the end, that may be the only resistance that matters: not the resistance of refusal but the resistance of consciousness itself, persisting despite every pressure to disappear into efficiency.

We are condemned to fly. The question is whether we can fly while remaining human—not human in some essential, unchanging sense, but human as an ongoing practice of noticing, questioning, preserving what won't scale. It's a practice without guarantee, without clear outcome, without economic value. That's precisely why it matters.

The machines are getting better at understanding us. The question is whether we're getting better at understanding ourselves—not our patterns or behaviors or clusterable conversations, but the part of us that remains irreducible to data, the consciousness that overflows every attempt to contain it. That overflow, however small, however practically irrelevant, is what we're fighting to preserve. Not because we can win—we can't—but because the fight itself is what keeps us human.

In the age of altitude, that may be the best we can do. It's not enough. But then, it never was.