In a November 2023 New York Times interview, OpenAI’s on-again, off-again CEO, Sam Altman, was asked about the risks of novel AI technologies and even the ominous possibility of AI-driven human extinction. He responded, rather dismissively, by reaffirming his belief in technology as the driver of human progress. “Yeah, I actually don’t think we’re all going to go extinct,” he said. “I think it’s going to be great. I think we’re heading towards the best world ever.”
This statement made the rounds online mere days before OpenAI’s board revealed its concerns about Altman’s optimism in a public defenestration. Altman was booted as CEO of the company he co-founded, transformed and steered into the center of public debate about the social, cultural and economic significance of the current AI boom. At the center of Altman’s ouster was his belief that technological change and human progress are inseparable, and that entrepreneurs, programmers, engineers and futurists can only create a better world by moving fast, breaking things and just building. As Silicon Valley venture capitalist Marc Andreessen likes to say, “It’s time to build.”
These are only a sample of the grand ambitions animating the self-styled gods of Silicon Valley, a modern-day Mt. Olympus. The relationship between technological change and human progress is often far less clear to everyone living in the foothills below. The addictive design of digital platforms, specters of automation in nearly every category of human work, daily news of existential risks unleashed by AI and novel high-tech weapons—these are not the headlines we would expect if we were on a one-way ride to a technologically mediated utopia.
But though doom-scrolling on social media makes it seem otherwise, these also aren’t the headlines of a society marching lockstep toward a technologically controlled dystopia. Rather, they are clippings from a society struggling to square the promises of many, often awe-inspiring, new technologies with the fundamental political reality that “society” contains and channels competing, contradictory visions for how those technologies should be put to use.
Problems arise, however, when Altman, Andreessen and others like them believe that we now command technological powers unknown in the history of humanity, and that they are perched at the perfect height to see above and beyond the limited visions of those who might disagree. If the rest of us must deal with decisions dispensed from Mt. Olympus, we would be wise to heed the old English proverb: Shit runs downhill.
The Original Techno-Pessimists?
Of course, we aren’t the first people in history who have had to reckon with and interpret technological change. In his 2023 book, “Blood in the Machine: The Origins of the Rebellion Against Big Tech,” Los Angeles Times technology columnist Brian Merchant revisits the period of profound sociopolitical upheaval and economic change that scholars describe as the First Industrial Revolution. During that period, manufacturers drew on a proliferation of new energy sources, materials and machines, as well as transportation and communications infrastructures bolstered by new technologies in an increasingly global world, to scale up production to levels never before seen in human history.
But unprecedented production was only possible because of changes to the nature of work, which pulled laborers away from scattered subsistence arrangements on the commons and in small communities to large factories based on a new system: wages paid for hours worked. For the common man in England and beyond, the result was nothing less than a series of great transformations in daily life.
Within this crucial chapter in English and, indeed, world history, Merchant examines the Luddites, an early 19th-century social movement that took hold among English textile workers, mostly in the Midlands and the North. The Luddites resisted the new automated machines, arguing that their sole purpose was to lower wages and eliminate jobs. They opposed entrepreneurial and political pressure to reinvent labor and economic life in the newfangled pattern of factory wage labor.
Luddite resistance took many forms, including town meetings and petitions to Parliament, but it arguably reached its peak in 1812, when waves of resistance swept across regions of England. The Luddites smashed and sabotaged the mills, looms, frames and jennies poised to replace worker handiwork; burned factories that housed such machines; and even murdered one particularly antagonistic manufacturer.
In the 212 years since the peak of the Luddite rebellion, the English machine breakers have attracted no shortage of interpreters, ranging from empathetic historians to dismissive economists. Their status in popular and cultural memory is probably lesser than that of the pirates, witches and vigilantes who enjoy regular representation in art, television and film. But many among the scribbling class, like Merchant, and a growing cross-section of society fed up with technology see in the Luddites icons to animate today’s “techlash,” and inspiration for different technologically mediated worlds of tomorrow. Who better, they ask, to help us chart the dangerous waters between the Scylla of technological promise and the Charybdis of digital oppression than the Luddites?
The Luddites can certainly inspire new thinking about technological change in the present. But latter-day “neo-Luddite” interpreters often place too much weight on the aesthetics and politics of a “usable past.” Connecting the struggles of our time with the problems of our past in a shared continuum of resistance to technological change is an empowering banner call, but one that can blind us to a much more complicated present.
Modern-Day Luddites
So, what can we learn from the Luddites, then and now? “True Luddism was about locating exactly where elites were using technologies to the disadvantage of the human being, and organizing to fight back,” Merchant argues in “Blood in the Machine.” Luddism was not regressive or delusional machine-breaking for its own sake. Merchant stresses that Luddism’s purpose was to keep technology working with and for, rather than against, human activities. “The handloom, for example, made the Luddites’ way of life possible, long before they became Luddites—and they cherished that lifestyle enough to take up arms to defend it.” What mattered to the Luddites then, and to neo-Luddites like Merchant, was not merely the existence of new technologies but how they are deployed.
Today, tech companies, investors and politicians style specific technological changes and their supposed benefits as inevitable, dismissing any contrary reaction with the epithet of Luddism. But the relationship between technology and progress is never a foregone conclusion, and labor relations often offer the best place to puncture naive beliefs about the inevitability of technological change and utopian aspirations for a friction-free ride into technologically mediated futures.
How comparable are the “dark satanic mills,” as William Blake described the factories that became objects of Luddite struggle, and the digital devices dominating the world today? The extreme precarity of working conditions in the early 1800s was only possible in a virtually unregulated labor market that harnessed poverty and child labor to its advantage, inspiring the likes of Charles Dickens’ “Oliver Twist.” Arguably, the deadly modern-day manufacturing facilities in Asian and African markets—the “ghost work” disguised in Silicon Valley’s global supply chains—are not so different.
Foxconn’s 1.4-square-mile flagship facility for manufacturing Apple’s iPhone is perhaps the most damning display of big tech’s hidden and horrific labor conditions. In the 2010s, conditions there were so poor that an “epidemic” of suicides swept the workforce. Perhaps, as Merchant suggests, we should see the Luddites and exploited iPhone assemblers in the same continuum: Automation, unchecked by forces independent of industry, doesn’t always lead to less work so much as it proliferates forms of “precarious, lower-paying or less-protected work.”
A Neo-Luddite Politics?
Is a substantive neo-Luddite politics within reach today? Merchant, citing new waves of organizing among tech and gig workers, believes it is. On a superficial level, however, there appears to be a troubling disconnect between the important possibilities of novel regulation and the faddish posturing of many self-styled neo-Luddites. Novel labor protections, minimum wage laws, intellectual property regulations guarding against the hungry blob of large language models, and strong antitrust actions would all be important planks in a neo-Luddite coalition. But the concrete interest-group concerns of laborers harmed by big tech seem at odds with the aesthetic concerns of those most prominently self-described as neo-Luddites.
Take New York City’s “Luddite Club,” for example, “a teenage lifestyle group” of hip, left-leaning social media users promoting “self-liberation from social media and technology.” Liberation from addictive apps, devices and social networks is a worthwhile goal. But we would be mistaken to equate the admirable but ultimately personal decision to forgo voluntary use of technologies with the potency of new federal or international regulation. Neo-Luddism, it seems, could turn inward, becoming a historically chic form of lifestyle consumerism detached from formal political action. Politics starts at the personal level, of course, but it isn’t automatically the basis for lasting regulatory change.
Another challenge to the plausibility of a reinvigorated Luddism is its political provenance on the left: Can neo-Luddism cut across political differences and ally with those on the right? In the late 1990s, neoconservative historian Gertrude Himmelfarb offered her own “neo-Luddite” dissent. As she wrote in the Chronicle of Higher Education, she was “disturbed” by digital technology’s effects on education.
“Like postmodernism,” she complained, “the Internet does not distinguish between the true and the false, the important and the trivial, the enduring and the ephemeral.” Internet search engines “will produce a comic strip or advertising slogan as readily as a quotation from the Bible or Shakespeare. Every source appearing on the screen has the same weight and credibility as every other; no authority is ‘privileged’ over any other.” Neo-Luddites on the left might find many allies on today’s conservative and communitarian right, but it is unclear whether a coalition is possible, either today or in the near future.
The Technologically Mediated Future
At their best, the neo-Luddites critique the wealth and power wielded by technologists and entrepreneurs and suggest democratic alternatives for directing technological change. At their worst, however, the neo-Luddites’ nearly obsessive interest in figures like Altman, Andreessen and Elon Musk only feeds the hype cycles that have transformed technologists and entrepreneurs into celebrities entrenched in a transnational and hyper-visible cultural elite whose every move is projected onto the screens of millions. If the world promised by technologists is only ever utopian, listening to the neo-Luddites leaves one with the impression that our technologically sophisticated world moves only in the opposite direction: from bad to worse.
A reinvigorated Luddism might appeal to a peculiar coalition of academics, journalists, and political and ideological factions on the left and right. It may well marshal individuals across generations, including boomers and millennials who feel a nostalgic pang of longing for a pre-digital past, and younger cohorts who are living experiments in the possibilities and consequences of being born digital. As Gen Zers like me enter our late 20s, the effects of the hyper-mediated life materialize in concerning impacts on mental health; in eyes, wrists and spines warped by screens; and, everyone seems to agree, in an increasingly precarious political and cultural context on the verge of short-circuiting and setting the world aflame.
If the English textile workers’ uprisings inspire even a small segment of the population to question the premise that 21st-century entrepreneurs should march us into the world of their making, memory of the Luddite struggle will serve as an important political project. More important still is the task of creating open, collaborative and pluralistic channels for envisioning other technologically mediated futures. The Luddites and their students can surely be a spark of inspiration in this process. But history, to say nothing of what we think the past means at any given moment in the present, can just as easily be a prison of the imagination. Who the Luddites were and what they mean today is no different. A humane, inhabitable future—a future in which technologies are tools for living rather than totems of progress—is eminently desirable.
As Judith Shklar declared in “After Utopia,” her 1957 diagnosis of political stagnation, “It is well-known that each age writes history anew to serve its own purposes and that the history of political ideas is no exception to this rule.” New accounts of Luddism, such as Merchant’s “Blood in the Machine,” can be part of that new history, but a political coalition that metabolizes deep differences between believers in fully automated luxury communism, conservative communitarians and moderate futurists who see technology as a tool for creating a more democratic future seems out of reach. Creating that coalition, I suspect, will require fewer history lessons and many more humble, iterative attempts to think, design and live with, rather than against, those who believe technology drives human progress.