From xxxxxx <[email protected]>
Subject Tim Berners-Lee Invented the World Wide Web. Now He Wants To Save It
Date October 6, 2025 3:10 AM
  Links have been removed from this email. Learn more in the FAQ.

TIM BERNERS-LEE INVENTED THE WORLD WIDE WEB. NOW HE WANTS TO SAVE IT
 


 

Julian Lucas
September 29, 2025
The New Yorker

_ In 1989, Sir Tim revolutionized the online world. Today, in the era
of misinformation, addictive algorithms, and extractive monopolies, he
thinks he can do it again. _

Berners-Lee is building tools that aim to resist the Big Tech
platforms, give users control over their own data, and prevent A.I.
from hollowing out the open web. Illustration by Tim Bouckley

 

Tim Berners-Lee may have the smallest fame-to-impact ratio of anyone
living. Strangers hardly ever recognize his face; on “Jeopardy!,”
his name usually goes for at least sixteen hundred dollars.
Berners-Lee invented the World Wide Web, in 1989, but people informed
of this often respond with a joke: Wasn’t that Al Gore? Still, his
creation keeps growing, absorbing our reality in the process. If
you’re reading this online, Berners-Lee wrote the hypertext markup
language (HTML) that your browser is interpreting. He’s the
necessary condition behind everything from Amazon to Wikipedia, and if
A.I. brings about what Sam Altman recently
called “the gentle singularity”—or else buries us in
slop—that, too, will be an outgrowth of his global collective
consciousness.

Somehow, the man responsible for all of this is a mild-mannered
British Unitarian who loves model trains and folk music, and recently
celebrated his seventieth birthday with a picnic on a Welsh mountain.
An emeritus professor at Oxford and M.I.T., he divides his time
between the U.K., Canada, and Concord, Massachusetts, where he and his
wife, Rosemary Leith, live in a stout greige house older than the
Republic. On the summer morning when I visited, geese honked and
cicadas whined. Leith, an investor and a nonprofit director who
co-founded a dot-com-era women’s portal called Flametree, greeted me
at the door. “We’re basically guardians of the house,” she said,
showing me its antique features. I almost missed Berners-Lee in the
converted-barn kitchen, standing, expectantly, in a blue plaid shirt.
He shook my hand, then glanced at Leith. “Are you a canoer?” she
asked. Minutes later, he and I were gliding across a pond behind the
house.

Berners-Lee is bronzed and wiry, with sharp cheekbones and faraway
blue eyes, the right one underscored by an X-shaped wrinkle. There’s
a recalcitrant blond tuft at the back of his balding head; in quiet
moments, I could picture Ralph Fiennes playing him in a movie—the
internet’s careworn steward, ruminating on some techno-political
conundrum. A twitchier figure emerged when he spoke. He muttered and
trailed off, eyes darting, or froze midsentence, as though to buffer,
before delivering a verbal torrent. It was the arrhythmia of a
disciplined demeanor struggling with a restless mind. “Tim has
always been difficult to understand,” a former colleague of his told
me. “He speaks in hypertext.”

 

He visibly relaxed as we paddled onto the water. Berners-Lee swims
daily when it’s warm, and sometimes invites members of the World
Wide Web Consortium (W3C) to “pondithons,” or pond-based
hackathons. “We have a joke that if you get any number of them on
the island, then they form a quorum, and can make decisions,” he
said, indicating a gazebo-size clump of foliage. He spoke of the web
as though it were a small New England town and he one of the
selectmen. Berners-Lee raised his two children in nearby Lexington,
the cradle of the American Revolution, and rose early for the annual
Patriots’ Day festivities. “We took them to the reënactment on
the Battle Green,” he recalled, “and the midnight ride of Paul
Revere.”

The Founding Fathers idolized Cincinnatus, who was appointed dictator
to save the Roman Republic, then peacefully returned to his fields.
Berners-Lee is admired in a similar spirit—not only for inventing
the web but for refusing to patent it. Others wrung riches from the
network; Berners-Lee assumed the mantle of moral authority, fighting
to safeguard the web’s openness and promote equitable access. He’s
been honored accordingly: a knighthood, in 2004; the million-dollar
Turing Award, in 2016.

Now Sir Tim has written a memoir, “This Is for Everyone,”
with the journalist Stephen Witt. It might have been a victory lap,
but for the web’s dire situation—viral misinformation, addictive
algorithms, the escalating disruptions of A.I. In such times,
Berners-Lee can no longer be Cincinnatus. He has taken up the role of
Paul Revere.

“They thought they were safe,” he said, as the boat startled a
flock of geese. Platforms had lulled users into complacent dependency,
then sealed off the exits, revealing themselves as extractive
monopolies. Berners-Lee’s escape hatch is a project called the Solid
Protocol, whose mission is to revolutionize the web by giving users
control over their data. To accelerate its adoption, he launched a
company, Inrupt, in 2017. “We can build a new world in which we get
the functionality of things like Facebook and Instagram,” he told
me. “And we don’t need to ask for permission.”

Berners-Lee knows that the obstacles are formidable. But he’s pulled
off a miracle before. “Young people don’t understand what it took
to make the web,” he said. “It took companies giving up their
patent rights, it took individuals giving up their time and energy, it
took bright people giving up their ideas for the sake of a common
idea.” The dock slid into view just as he reached a crescendo.
Smiling, he set down his paddle. “Shall I drop you here?”

In the beginning, the internet was without form, and void, and data
trickled through the ports of the routers. The “series of tubes,”
in the immortal words of the Alaska senator Ted Stevens, went online
in the late nineteen-sixties, though “tubes” exaggerates its
concreteness. Technically, the internet is a protocol: a set of rules
that let computers send and receive data over various networks by
breaking it into “packets.” Vint Cerf and Robert Kahn devised this
“inter-network” at the U.S. Department of Defense. By the late
eighties, it had spread to civilians, who could send e-mail, transfer
files, and post on forums through subscription-based services such as
CompuServe and AOL. Still, many yearned for a unified ecosystem.
“There was a fork in the road,” Brewster Kahle, the founder of the
Internet Archive, told me. “Are we going to have an information
superhighway which is open to all? Or is it going to be five hundred
channels of nothing on the net?”
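
The packet idea itself is small enough to sketch. What follows is a
toy illustration in Python, with the obvious caveat that it bears no
relation to Cerf and Kahn’s actual protocols: a message is chopped
into numbered packets, the packets may arrive in any order, and the
receiver reassembles them.

    # Toy illustration of packet switching: not TCP/IP, just the idea that data
    # is broken into numbered packets and stitched back together on arrival.
    import random

    def packetize(message: bytes, size: int = 8) -> list[tuple[int, bytes]]:
        """Split a message into (offset, chunk) packets of at most `size` bytes."""
        return [(i, message[i:i + size]) for i in range(0, len(message), size)]

    def reassemble(packets: list[tuple[int, bytes]]) -> bytes:
        """Sort packets by offset and rejoin the original message."""
        return b"".join(chunk for _, chunk in sorted(packets))

    message = b"In the beginning, the internet was without form, and void."
    packets = packetize(message)
    random.shuffle(packets)            # the network may deliver packets out of order
    assert reassemble(packets) == message
    print(f"{len(packets)} packets, reassembled intact")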

Berners-Lee modestly maintains that anyone might have solved this
conundrum. But his upbringing helped. He was born in 1955 to Conway
Berners-Lee and Mary Lee Woods, two computer scientists who met while
working on an early commercial computer, and raised him in suburban
London. Conway, who studied the mathematics of queuing, used water
jets to teach Tim about electronic circuits. Mary, a believer in
“watchful negligence,” would let him and his three younger
siblings wrap themselves in extra perforated tape. Tim loved math, the
outdoors, and building electronics with transistors. At Oxford, where
he studied physics, he knew that his future was in computing; between
terms, he cobbled together a working machine from junk parts.

His career began, ordinarily enough, at a telecom company in southern
England, where he and a college girlfriend, then first wife, went to
work. But in 1980 he took time off for a fellowship at _CERN_, the
particle-physics lab near Geneva, and returned, four years later, for
a full-time job. His unglamorous assignment was to maintain the
computer system that processed images of experiments—I.T. work for
the heirs of Planck and Einstein. And the only thing more complex than
the quarks and bosons they were chasing was the babel of languages,
operating systems, storage formats, and filing systems that they
employed. “One scientist might have critical information about how
to run the accelerators stored in French in a private directory in the
central Unix mainframe; another might have information on how to
calibrate the sensors stored in English on an eight-inch I.B.M. floppy
disk in a locked metal cabinet,” Berners-Lee writes. “It was a
mess.” Out of this mess came the last great invention of the
twentieth century.

The web was a fusion of two earlier technologies: the internet and
hypertext, a way of organizing documents non-hierarchically through
links. Hypertext dated to the nineteen-forties, when the science
administrator Vannevar Bush wrote an article about a device that could
represent knowledge “As Freely as We May Think.” By the eighties,
the technologist Ted Nelson was trying, unsuccessfully, to build a
universal hypertext library, which he called Project Xanadu.
Berners-Lee’s more pragmatic idea was to use hypertext to enhance
online collaboration. “Imagine making a large three-dimensional
model, with people represented by little spheres, and strings between
people who have something in common at work,” he wrote in a 1989
proposal.

Colleagues at _CERN_ didn’t know what to make of the idea. “For
many computer scientists . . . every document belonged in a
specified container,” Berners-Lee writes. “I was proposing instead
to free those documents—essentially to dump the files from their
folders onto the floor.” A supervisor jotted “vague but
exciting” on the proposal, and let him pursue it on the side. In
October, 1990, Berners-Lee began laying the web’s foundations: HTML,
the language of web pages; HTTP, the protocol that governed their
transmission; and URLs, the addresses that linked them together. On
August 6, 1991, the web’s first page, [link removed], went
online, introducing itself as “a wide-area hypermedia information
retrieval initiative aiming to give universal access to a large
universe of documents.” Soon enough, there would be porn.
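
For readers who have never seen those three pieces working without a
browser in the way, here is a minimal sketch, using only Python’s
standard library and the reserved placeholder domain example.com
rather than any address from the article: a URL names a document, a
plain-text HTTP request asks the server for it, and what comes back
is HTML.

    # Minimal sketch of the web's three building blocks: a URL names a document,
    # HTTP fetches it as plain text over the network, and the body is HTML.
    # Uses only the standard library; example.com is a placeholder host.
    import socket
    from urllib.parse import urlparse

    url = "http://example.com/"
    parts = urlparse(url)                        # split the URL into host and path

    request = (                                  # an HTTP/1.0 GET request is just text
        f"GET {parts.path} HTTP/1.0\r\n"
        f"Host: {parts.hostname}\r\n"
        "\r\n"
    )

    with socket.create_connection((parts.hostname, 80)) as sock:
        sock.sendall(request.encode("ascii"))
        response = b""
        while chunk := sock.recv(4096):          # read until the server closes the connection
            response += chunk

    headers, _, html = response.partition(b"\r\n\r\n")
    print(headers.decode("ascii", "replace").splitlines()[0])   # e.g. "HTTP/1.0 200 OK"
    print(html[:200].decode("utf-8", "replace"))                # the opening of the HTML page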

In January, 1993, when I was born, there were about fifty web servers
on the internet; new hosts customarily e-mailed Berners-Lee to let him
know they were online. By my first birthday, there were six hundred,
and this magazine had printed the now famous cartoon captioned “On
the Internet, nobody knows you’re a dog.” The first site I
remember is Yahooligans, a Yahoo portal for children, where I played
chess and downloaded screensavers. Next was Neopets, a virtual-pet
game where my uncle, a former photo-lab technician, reinvented himself
as a programmer. On the web, I read Jules Verne on Project Gutenberg,
gave myself nightmares learning about Japanese war crimes in
Manchuria, and laughed with a cousin at the crowdsourced recordings on
farts.com. It was just as Berners-Lee wrote: “If you could put
anything on it, then, after a while it would have everything on it.”

My father, a songwriter and producer who built computers for his home
studio, was quick to embrace the dot-com gospel. He bought domain
names for everyone in the family and encouraged my early experiments
in programming. At recess in middle school, while others played soccer
or traded Yu-Gi-Oh! cards, I pored over tomes on HTML, JavaScript, and
PHP—which paid off, socially, when I built a proxy server to let
classmates access banned Flash games. Eventually, I started coding
sites for local businesses, beginning with my mother’s. But it was
exhausting to keep up with browsers’ rival implementations of the
languages I’d learned.

Fragmentation menaced the web from the outset. From _CERN_, it spread
quickly through listservs, where enthusiasts shared proto-browsers to
replace the bare-bones command-line program Berners-Lee had written.
This was the kind of improvisation he’d hoped for. But it quickly
got out of hand.

One day, Berners-Lee had a listserv exchange with a college student
named Marc Andreessen,
who’d proposed an “<img>” tag to embed pictures in pages.
Berners-Lee demurred, saying that he preferred more content-neutral
syntax. But Andreessen wasn’t asking for his blessing. In 1993, he
led the team that launched Mosaic, the first modern browser. The next
year, he released a commercial successor, Netscape, whose I.P.O. made
him an instant multimillionaire. _Time_ put him on its
cover—barefoot, leering, perched on a throne—and hailed him as a
“Golden Geek.” (_Time_ profiled Berners-Lee the next year, noting
that unlike Andreessen, who drove a Mercedes, Berners-Lee drove an old
Volkswagen; he jokingly blamed its carbon-monoxide emissions for the
“diffuseness of his answers.”) Berners-Lee believed that
Andreessen was trying to “hijack” his creation.

His pique wasn’t just about money or ego. The web was meant to be
universal, and had already outpaced similar networks. Kahle, the
Internet Archive founder, had created _WAIS_, or the Wide Area
Information Server, a publishing system with natural-language search.
Another competitor was Gopher, developed at the University of
Minnesota. Yet both relied on existing file formats and hierarchical
menus. When Gopher tried to charge licensing fees, users fled. The
web, by contrast, was free, easy to use, and, thanks to hypertext,
infinitely flexible. “The markup language was simple,” Dan
Connolly, who worked with Berners-Lee to codify HTML, told me. “And
you didn’t have to ask your boss for money.”

To keep it that way, Berners-Lee moved to the U.S. and founded W3C, in
1994. In time, the organization would open offices across the world,
but its first home was at M.I.T., where it eventually settled into
Frank Gehry’s flamboyant Stata Center, a jumble of towers and angles
that appear to grow in several directions at once. The web, too,
seemed in need of a stabilizing center—one that Berners-Lee doubted
either he or the market could supply. A consortium, he writes,
provided an alternative to “Balkanization and competing technical
fiefdoms.” Companies were invited to shape the web collaboratively,
through technical standards reached by consensus, and, later on,
agreed not to sue one another over web technology.

“Tim used to call it ‘blue-helmet work,’ like U.N.
peacekeepers,” Connolly said of the consortium’s early efforts.
Its authority was constantly challenged. “These young engineers were
saying, ‘Why do we need a consortium?’ ” Jean-François
Abramatic, a former W3C chairman, recalled of an early meeting in San
Francisco. “ ‘Why don’t we develop the best products and
compete?’ ” But more enlightened self-interest won out. “They
realized that the whole market was going to be much bigger if they
coöperated,” Berners-Lee told me. He offered a nautical analogy:
“When you sail a boat, there’s force on the sail and force on the
keel. The boat goes forward, because those forces are very strong, but
it’s the constructive tension that drives the boat forward.”

W3C kept the web whole during the “browser wars” of the late
nineties, as Microsoft and Netscape pushed their own flavors of HTML.
It kept the web’s design supple amid exponential growth, even when
that clashed with demands for more features. The lodestar was
Berners-Lee’s “principle of least power,” which dictated a
minimal architecture. “He’s got this physicist’s picture of
things scaling up and scaling down, of very simple rules that work
well at any level,” Connolly said. Abramatic recalled the stress of
defending this vision from the shortsightedness of various industries.
“But if we had to do all of it just for Wikipedia,” he said,
“that was worth it.”

“You have to stay with it,” Berners-Lee told me. “You invent
something, and you have to make sure it’s all right.” He didn’t
win every battle. He had imagined the web as a space where everyone
would read _and_ write; instead, “browsers,” a term suggestive
of bovine passivity, won out. He still regrets tying web addresses to
the Domain Name System, or D.N.S., which allowed domain names like
newyorker.com to become speculative assets.

Even so, the early web was a dream realized. As Y2K neared,
Berners-Lee was planning the next phase: a “Giant Global Graph,”
as he later dubbed it, of structured data. In his first book,
“Weaving the Web”
(1999), he argued that, if websites could be augmented with a layer of
machine-readable information, the potential was boundless. “The
intelligent ‘agents’ people have touted for ages will finally
materialize,” he wrote. “The Web will be a place where the whim of
a human being and the reasoning of a machine coexist in an ideal,
powerful mixture.”

Berners-Lee sipped lemonade and stared at a projected image of Joe
Rogan. From Concord, he, Leith, and I had come to M.I.T.’s Center
for Constructive Communication, whose director, Deb Roy, knelt on a
rolling chair and presented his research on America’s “Civic
Breakdown.” Roy, a media scientist, discussed a project that used
large language models as a “listening tool” for group discussion,
which he’d piloted at a public high school in Newark.

Berners-Lee and Leith considered. “I’m just thinking of Charlie,
Tim,” Leith said. “Is there a role for Charlie in this?”

“Well, Charlie is the _individual’s_ A.I.,” Berners-Lee
replied, pursing his lips.

“You could make a _group_ Charlie, a small-group Charlie,” she
suggested.

“You could ask Charlie how polarized you are, if he had access to
all of your media data,” Berners-Lee’s young chief of staff chimed
in.

“Isn’t the balanced person someone who listens to everything?”
Leith asked.

Berners-Lee, squinting, wasn’t so sure. “Can you do that just by
listening to Ezra Klein and Joe Rogan and you’ve covered the entire
spectrum?”

Roy looked flummoxed: “So, ‘Charlie’? ”

“ ‘Charlie’ is an A.I. that works for you,” Berners-Lee said.
“It’s very, very powerful.” A prototype was already being tested
at his company, Inrupt.

Berners-Lee has been predicting our age of automation since the late
nineties, when he set out to build what he called the Semantic Web.
Its mission was to get humanity’s data online, and he pursued it
zealously for more than a decade. In a 2009 _TED_ talk called
“The Next Web,” he urged governments, corporations, and citizens
to upload all they could: “You hug your database, you don’t want
to let it go until you’ve made a beautiful website for it,” he
said. “But, first, give us the unadulterated data.” His demand
escalated to a chant. “We have to ask for raw data now,”
Berners-Lee cried with sermonic fervor. He windmilled his arms like an
inflatable tube man. “Can you say ‘raw’? Can you say ‘data’?
Can you say ‘now’? Raw data now!”

The idea was to make facts, statistics, and just about any
“structured” information as free and flexible online as documents
already were. A database of magazines, for instance, could link to
further databases maintained by each publisher—and so on down to the
facts in particular articles, which, in turn, might link to the
sources they cited. It was metadata unchained, and Berners-Lee
believed it would change the world. In a 2001 _Scientific
American_ article, he
envisioned a future web of genie-like agents able to book medical
appointments or instruct microwaves in the latest
manufacturer-approved tips for heating frozen food.
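
A taste of that “metadata unchained” idea can be given in a few
triples of linked data. The sketch below uses rdflib, a common Python
library for this sort of thing that the article itself never
mentions, and identifiers under example.org that are invented for
illustration: facts are stored as subject-predicate-object triples
whose names are web addresses, so one dataset can point into another
and be queried across the joins.

    # A tiny linked-data graph in the Semantic Web style. Facts are triples whose
    # names are URLs, so one database can link into another. The example.org
    # identifiers and the rdflib library are illustrative choices, not anything
    # from the article.
    from rdflib import Graph, Literal, Namespace, URIRef

    EX = Namespace("http://example.org/")

    g = Graph()
    g.add((EX.magazine1, EX.title, Literal("The Example Review")))
    g.add((EX.magazine1, EX.publishedBy, EX.publisher1))
    g.add((EX.publisher1, EX.name, Literal("Example Media")))
    g.add((EX.article1, EX.appearsIn, EX.magazine1))
    g.add((EX.article1, EX.cites, URIRef("http://example.org/source42")))

    # Ask the graph: for each cited source, which publisher stands behind the
    # magazine that cited it?
    query = """
    SELECT ?publisherName ?source WHERE {
        ?article ex:cites ?source ;
                 ex:appearsIn ?magazine .
        ?magazine ex:publishedBy ?publisher .
        ?publisher ex:name ?publisherName .
    }
    """
    for row in g.query(query, initNs={"ex": EX}):
        print(row.publisherName, "->", row.source)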

For this utopia to be realized, the web would need an overhaul. HTML
had run its course, Berners-Lee decided. Its successor, XHTML, or
extensible hypertext markup language, would separate information and
the way it was presented more cleanly, making pages easier for
machines to read. Many developers, though, had no interest in such a
drastic change. Berners-Lee wanted “raw data now”; they wanted to
build interactive web applications.

The clash led to a schism at W3C. In 2004, after losing a vote, a
group of browser developers who wanted to keep improving HTML formed a
rival standards body. Berners-Lee considered the move a power grab,
describing it as “the first real blow to the integrity of the World
Wide Web.” But when his “extensible” language faltered he backed
a reconciliation with the rebels, whose new standard, HTML5, had
prevailed. Web applications became the basis of “Web 2.0,”
powering Twitter’s endless scroll and Google’s smoothly panning
Maps.

The Semantic Web survives in certain contexts. Scientists use it to
make the research behind their papers—protein structures, brain
scans—programmatically searchable. DBpedia, a crowdsourced database
of several billion facts, helped I.B.M.’s Watson win “Jeopardy!”
But Berners-Lee’s vision of reasoning machines, drawing conclusions
from trustworthy data freely shared by individuals, never came to
pass. There is plenty of raw data online, but much of it is harvested
privately by platforms. The A.I. trained on it doesn’t parse
carefully encoded labels according to logical rules; it “infers”
from wholesale scraping.

After the presentation at M.I.T., the conversation turned to A.I.’s
trustworthiness.

“I use some language model daily,” Roy said. “Yet there’s this
slipperiness at the base. They’re not accountable.”

“They’re not accountable in what sense?” Berners-Lee asked.

“If they steer you wrong, whose fault is it?” Roy clarified.
“There’s a difference between pretending to care and caring.”

Berners-Lee paused. “Philosophically, I disagree with you.”

“You do?”

“Yeah. If something can pretend to care, it’s fundamentally the
same operation.”

Near the climax of the opening ceremony of the 2012 London
Olympics—a living diorama of British history, directed by the
filmmaker Danny Boyle—a model house was lifted away to reveal
Berners-Lee. Seated at a NeXT Computer, the kind he’d used
at _CERN_, he typed a message that flashed across the stadium:
“_THIS IS FOR EVERYONE_.” A light show dramatized the birth of the
World Wide Web, its hyperlinks racing from continent to continent.
Finally, Berners-Lee stood, maestro-like, from the keyboard, turning
to applaud each quadrant of the roaring crowd.

The web was riding high. China had half a billion internet users, who
could still criticize the government on the microblogging platform
Sina Weibo. Twitter was credited with fuelling the Arab Spring. In the
United States, Barack Obama was on his way to
reëlection, his campaign driven by the largest social-media and
data-analysis operations in political history. The web was broadly
seen as a force for justice, destined to uplift the world.

Berners-Lee was spreading his wings, too. In 2010, he divorced his
second wife, the mother of his two children, and began a relationship
with Leith, whom he knew from philanthropic projects. (They married
four years later at the Chapel Royal, in St. James’s Palace.)
“Once I met Rosemary, my life became an almost non-stop flurry of
activity,” he writes. Together, in 2009, they had founded the World
Wide Web Foundation, to promote global internet access, especially in
Africa, where Berners-Lee marvelled at the web-enabled spread of
farming techniques, and at the profusion of routers in the palace of
Rwanda’s President, Paul Kagame.

In 2012, Berners-Lee founded the Open Data Institute, in London, to
advocate for digital transparency. One of his protégés, the young
activist Aaron Swartz, took more radical measures. In his “Guerilla
Open Access Manifesto,” Swartz had warned that the world’s
scholarship and scientific research—much of it publicly funded—was
being “digitized and locked up by a handful of private
corporations.” Then, in 2013, he took his life, after federal
prosecutors charged him with a felony for sneaking into a router
closet at M.I.T. to download millions of articles from _JSTOR_.
“Hackers for right, we are one down,” Berners-Lee tweeted. “Let
us all weep.”

Swartz’s death foreshadowed a darker turn. In a forthcoming book,
“The Age of Extraction,”
Tim Wu, a Columbia law professor who coined the term “net
neutrality,” identifies 2012 and 2013 as the years when “platform
power” took hold. Since the nineties, it had been assumed that the
web would democratize society, empowering bloggers to compete with
media conglomerates, and small manufacturers to bypass big retailers.
Some of that happened. But the web’s Davids had only traded one
Goliath for another—corporate platforms that stood between them and
their markets. As Wu writes, “Paeans to small-is-beautiful and the
transformation of the human existence” soon gave way to “a
strategy that extracted from dependent businesses and harvested the
time and data of the masses.”

Platforms aren’t inherently extractive. Wu defines them as any space
that “brings together two or more groups to transact or interact
while reducing the costs of doing so.” The internet itself is a
platform. But the new web-based platforms were far less neutral. They
grew at breakneck speed, and then, once network effects had made them
indispensable, they squeezed sellers, served ads, and otherwise
extracted value from users while making exit ever costlier. They
bought out rivals and turned into monopolies: between 2007 and 2018,
Wu notes, Facebook, Microsoft, Google, and Amazon collectively
acquired more than a thousand firms.

Berners-Lee sounded the alarm, warning, as always, about
fragmentation: buy a song on iTunes or read a magazine in its
proprietary app, and you were no longer on the web. “The more this
kind of architecture gains widespread use,” he wrote,
in _Scientific American_, “the less we enjoy a single, universal
information space.”

The fight over how to resist platform power led to W3C’s deepest
rupture. In 2012, Netflix and several other members of the consortium
proposed a standard to protect streaming video from piracy by letting
browsers play video while blocking access to the underlying files.
This was a form of digital-rights management, or D.R.M.—long
anathema to open-web advocates, who not only disliked copyright but
were morally opposed to technical limits on the free operation of
their computers. (Drivers must obey traffic laws, but their cars
don’t shut off when they run a red light.) Berners-Lee felt the same
but feared that, without D.R.M., streaming companies would retreat to
closed, app-based ecosystems. He agreed to hear the proposal.

The backlash was swift, spilling from the consortium’s mailing lists
into the pages of the _Guardian_. “Stop the Hollyweb,” the Free
Software Foundation, one of the oldest digital-rights groups, urged in
a petition. Video was only the beginning, activists warned: if D.R.M.
prevailed, browsers might one day block source-code views, downloads,
even cut-and-paste. When the Motion Picture Association of America
joined the consortium, in 2014, the fight grew uglier. “Hitler might
have caused less of a stir,” a W3C staffer recalled on a podcast
episode titled “Bring Me the Head of Tim Berners-Lee.”

The loudest dissenter was the science-fiction writer Cory Doctorow,
who represented the Electronic Frontier Foundation at the consortium.
D.R.M., he argued, would hinder accessibility, create security flaws,
and make browsers dependent on encryption modules sold by Microsoft
and Google. Users could even be charged with a felony for bypassing
D.R.M. software. Doctorow warned, “We are Huxleying ourselves into
the full Orwell.”

Doctorow admired Berners-Lee—both had wept at Aaron Swartz’s
funeral. “He passed up ten fortunes to devote himself to public
service,” Doctorow told me. “The web was so important that these
companies came and bent the knee to Tim.” But now, he believed, the
web’s knight was the one genuflecting.

By 2016, as a deeply divided W3C debated a new D.R.M. standard,
protesters in Guy Fawkes masks gathered outside the Stata Center,
chanting “rm D.R.M.”—“rm” being the Unix command to delete a
file. In the end, Berners-Lee exercised his authority as director to
break the deadlock: D.R.M. was in. “Some people have protested
‘no,’ but in fact I decided the actual logical answer is
‘yes,’ ” he wrote afterward. Evoking the legendary King Canute,
who couldn’t hold back the tide, he urged the consortium to accept
its limits: “People like to watch Netflix.”

“It was a rotten time,” Berners-Lee said of the battle, which is
conspicuously absent from his memoir. “People we’d counted on as
friends began to see the W3C as the enemy.”

Doctorow, for his part, is still fighting “to bring back the Web
that Tim made.” His new book, “Enshittification,”
vividly dissects our “age of zombie platforms”: Google
adulterating search results for advertisers; Facebook extorting news
organizations; Adobe removing unlicensed colors from users’ images
after shifting its software to the cloud. He characterizes tech
C.E.O.s as graduates of “Darth Vader University, where the first
lesson is ‘I’m altering the deal. Pray that I don’t alter it any
further.’ ”

Yet Doctorow insists there are ways to resist: antitrust actions,
data-privacy regulations, and the legalization of “adversarial
interoperability,” or the right to engineer compatibility between
proprietary platforms and more open alternatives. In 2017, Berners-Lee
took a hiatus from W3C to launch his own interoperability initiative,
Inrupt. “We will build beneficial systems that work for everyone,”
he wrote in a post announcing the project. “The future is still so
much bigger than the past.”

Inrupt’s offices occupy a glass tower beside TD Garden, where the
Bruins and the Celtics play. The space is lined with whiteboards, and,
on the day I visited, a half-dozen employees worked quietly at
standing desks. The company’s name, a portmanteau of “innovate”
and “disrupt,” does little to clarify its mission—nothing less
than breaking the hold of platforms and reclaiming the open web.

In a conference room, I met the C.E.O., John Bruce, an affable
Englishman with a sweep of white hair, who plays the plainspoken foil
to Berners-Lee’s digital statesman. When the two met, almost a
decade ago, Bruce had just sold a cybersecurity firm to I.B.M., and
was interested to hear about the company Berners-Lee planned to start.
“A man who invents something like the web is a smart guy,” Bruce
said. “But it was more than that. He’d nurtured it. He’d fought
for it. If this guy had an idea to make it better, I was all ears.”
They bonded over British television from the sixties, but when it came
to Berners-Lee’s project, “I couldn’t grok it,” Bruce
admitted. “We had a couple of dinners, and it took me all of those
and then some to understand what Tim was talking about.”

What needed fixing was obvious enough: web users had surrendered their
data to monopolistic platforms that respected neither privacy nor
choice. Because their systems were deliberately incompatible, they
could wall off the valuable information trails we generated—search
histories, purchases, social-media posts—and treat us with impunity,
knowing it was nearly impossible for us to leave.

But what if everyone stored their data on personal servers? Platforms
would have to request access, or even offer micropayments, letting
users comparison-shop. Decoupling data from the services that used it
would also spur competition and encourage innovative new applications,
since information from various sources could be recombined. All this
would be accomplished by what Berners-Lee called Solid Pods: Solid,
for “social linked data,” Pods, for “personal online data
stores.” They were online strongboxes devised by the very Pandora
who’d unleashed the web itself.
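
As a thought experiment, the division of labor can be modelled in a
few lines. This is a toy in Python, not the Solid Protocol itself,
which is specified in terms of ordinary web resources and
access-control rules: the data lives in the user’s pod, and an
application sees only what the owner has explicitly granted.

    # Toy model of the Solid idea: personal data lives in the user's own store,
    # and each application must be granted access to each piece of it. This
    # illustrates the principle only; the real protocol works over HTTP.
    class Pod:
        def __init__(self, owner: str):
            self.owner = owner
            self._data: dict[str, object] = {}        # e.g. "purchases", "exercise"
            self._grants: dict[str, set[str]] = {}    # resource -> apps allowed to read it

        def put(self, resource: str, value: object) -> None:
            self._data[resource] = value

        def grant(self, app: str, resource: str) -> None:
            self._grants.setdefault(resource, set()).add(app)

        def read(self, app: str, resource: str) -> object:
            if app not in self._grants.get(resource, set()):
                raise PermissionError(f"{app} has no access to {resource!r}")
            return self._data[resource]

    pod = Pod(owner="zoe")
    pod.put("exercise", {"weekly_miles": 18})
    pod.grant("shoe-advisor", "exercise")

    print(pod.read("shoe-advisor", "exercise"))       # the granted app gets the data
    try:
        pod.read("ad-network", "exercise")            # no grant, so the pod refuses
    except PermissionError as err:
        print(err)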

Solid grew out of M.I.T.’s Decentralized Information Group, which
Berners-Lee had co-founded to help realize his dreams for the Semantic
Web. In 2015, he and his colleagues launched the Solid Protocol,
hoping that it, like the web, would show how an open, decentralized
system could triumph over a patchwork of subscription services. “You
can make the walled garden very, very sweet,” Berners-Lee said at an
event in 2016. “But the jungle outside is always more appealing.”

The promise of “data sovereignty,” though, was relatively
intangible, and twenty-first-century platforms were far more
entrenched than nineties AOL. To accelerate Solid’s adoption,
Berners-Lee decided to go into business. Recommending standards
wasn’t enough. It was time to move fast and fix things.

“It’s been fascinating for me to get things done, to execute,”
Berners-Lee told me. By the end of 2018, Inrupt had twenty employees,
a reported twenty million dollars in V.C. funding, and a tailwind from
the Cambridge Analytica scandal,
which revealed that leaked Facebook data had been used to target ads
in the 2016 U.S. Presidential race. Conveniently, the web was also
about to turn thirty. In a BBC segment, Berners-Lee warned of the
web’s “downward plunge to a dysfunctional future.” He used the
anniversary to promote Inrupt, which planned to sell enterprise
servers to implement Solid. (Because the protocol is open, other
companies can do the same.)

In the years after its launch, Inrupt announced a string of
partnerships. Solid was piloted by the U.K.’s National Health
Service in the hopes of giving patients more control over their
medical records. The BBC built a prototype “BBC Box” that could
algorithmically recommend shows without retaining user data. The
government of Flanders, in Belgium, went further, promising every
citizen a Solid Pod as part of its compliance with Europe’s General
Data Protection Regulation. The momentum coincided with a broader wave
of decentralization in tech, from the blockchain boom to federated
networks like Mastodon and Bluesky. Once again, Berners-Lee seemed to
be on history’s leading edge.

Today, Solid looks stalled. Eleni Sharp, who led the BBC pilot, told
me the Box never made it out of testing. “People say they want to be
more in control of their data,” she said. “But do they then want
to be more hands-on? Not really!” In Flanders, with nearly seven
million residents, only about a thousand actively use Solid; one
feature lets graduates send digital diplomas to employers. A Flemish
official insisted that more projects were under way, but I couldn’t
find any residents aware of them. On the r/Vlaanderen subreddit, one
replied to my query, “I had no idea we were using some exotic tech
by Tim Berders. Did you make that up?”

“I never really used Solid for anything serious,” Kjetil Kjernsmo,
a Norwegian informatics expert who co-authored the standard, told me.
He was Inrupt’s first employee, and had expected to work on tools
for the hundreds of developers interested in the protocol; instead,
the company focussed on selling servers to corporate clients.
Berners-Lee mentioned a developer who has built several Solid apps,
including a recipe manager and a viewership tracker that aggregates
data from multiple streaming services. But that developer’s own blog
wearily concedes that the protocol “doesn’t seem to be going
mainstream anytime soon.”

Of course, it takes only one “killer app” to vindicate a
technology. While I spoke with Bruce, Berners-Lee was meeting with a
representative from Visa, which recently announced “the next
evolution of digital commerce.” Visa believes that consumer
purchasing will soon be delegated to A.I. agents, which will make
informed decisions based on user data. But whom will they work for? If
they answer to platforms, the result could be a more insidious version
of algorithmic recommendations—sentient credit cards that read our
minds and collude with merchants.

Inrupt’s solution is Charlie, a Solid-based chatbot that works for
you. Charlie uses personal data to inform its answers, but also
protects that data from platforms, allowing security-conscious users
reliant on targeted “insights” to have their cake and eat it, too.
Greater trust would inspire more data-sharing, facilitating deeper
customization, Berners-Lee explained. “If you give it access to your
exercise data, then ask what running shoes you should have, you’ll
find that it knows you very, very well.”

He dreamed up Charlie in 2017. Last year, Inrupt built a prototype,
which Bruce showed me over Zoom. It was an app on his iPhone, which
opened with the prompt “How can I help you today?”

“This is the world without Charlie,” Bruce said of the default
mode, which simply queries Anthropic’s L.L.M., Claude. We asked for
potential fall getaways, and it suggested Kyoto or Tuscany—each
pricey and overrun with tourists. But giving Charlie access to a
fictional user’s “data wallet” yielded more bespoke results.
“Zoe,” as the user was called, lived in Seattle, loved nature
photography, and worked in tech, where salaries are falling. Why not
send her to Olympic National Park, in Washington? Charlie thought it
was a good fit for her “love of photography” and “practical
travel constraints,” adding that her Marriott points would cover the
hotel.

“Charlie knew what data was pertinent to this request,” Bruce
said. The app had sifted Zoe’s personal information, bundled it with
her query, and sent it to Claude. (The final product will be
compatible with multiple L.L.M.s, which will run locally in a
sealed-off “Trusted Execution Environment.”)
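
The pattern Bruce describes, in which pertinent personal data is
selected, bundled with the question, and only then sent to a model,
can be sketched in a few lines. Everything below is a guess at the
shape rather than Inrupt’s code: the data wallet and the keyword
matching are invented, and the model call is a stub standing in for
Claude or any other L.L.M.

    # Sketch of the pattern described above: sift the user's data wallet for
    # entries relevant to the question, bundle them with the query, and send only
    # that bundle to a language model. The wallet, the relevance table, and
    # call_llm are all invented for illustration; call_llm is a stub, not
    # Anthropic's API.
    def call_llm(prompt: str) -> str:
        return f"[model response to a {len(prompt)}-character prompt]"

    DATA_WALLET = {
        "location": "Seattle",
        "hobbies": ["nature photography", "hiking"],
        "loyalty": {"Marriott points": 60_000},
        "budget": "reduced this year",
    }

    # Crude relevance filter: include a wallet entry only if the question seems
    # to touch on it. A real assistant would decide this far more carefully.
    RELEVANCE = {
        "trip": ["location", "hobbies", "loyalty", "budget"],
        "shoes": ["hobbies", "budget"],
    }

    def ask_charlie(question: str) -> str:
        pertinent = {
            key: DATA_WALLET[key]
            for topic, keys in RELEVANCE.items()
            if topic in question.lower()
            for key in keys
        }
        prompt = f"User context (shared with consent): {pertinent}\n\nQuestion: {question}"
        return call_llm(prompt)        # only the selected data leaves the wallet

    print(ask_charlie("Where should I go for a fall trip?"))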

Soon, Bruce added, Charlie will be able to alter files in a data
wallet, the first step toward “agentic” powers. But he couldn’t
say when it would be released. “It could be rolled out by an Acme,
Inc.,” Bruce said. “It could be rolled out by an independent
business that wants to operate Charlie for the benefit of everybody.
We could roll it out for the benefit of everybody.”

The commendable aim was to mainstream the principle of user control
over data. Still, it was hard not to feel that Berners-Lee’s
ambitions had narrowed. “We build it now so that those who come to
it later will be able to create things that we cannot ourselves
imagine,” he once wrote of the web. But, at this critical juncture,
the unimaginable thing he’d chosen to build was a chatbot that helps
you pick sneakers.

Charlie may be too late. Google just announced its own agentic
commerce platform, and, when I followed up with Visa, the company was
evasive about its commitment to Berners-Lee’s idea. In any case,
stronger measures will be needed to resist what Wu calls the
“emergent form of economic power in our time—the artificially
intelligent tech platform.”

In July, Cloudflare—a firm that shields roughly a fifth of all
websites from automated attacks—rolled out tools to block A.I.
companies from scraping sites without permission. It’s meant to
stave off what some call Google Zero, the day when “answer
engines” such as Google’s Gemini and OpenAI’s ChatGPT—which
don’t drive traffic to the sites that they scrape—replace search
engines and destroy publishers reliant on online advertising. “The
dystopian horrible outcome is that you starve to death and die as a
journalist or a researcher or an academic,” Matthew Prince,
Cloudflare’s C.E.O., told me. His hope is that A.I. firms can be
forced to pay for what they consume, with revenue distributed to
creators à la Spotify.

“These companies are basically free-riding off of the content and
production of others,” Lina Khan, who led the Federal Trade
Commission under President Biden, told me. A millennial like the web
itself, she grew up posting on Xanga and LiveJournal, and is concerned
not only with the platform economy’s unfairness but also its threat
to online creativity. “If creators who are actually producing
aren’t going to reap the rewards, what’s going to be the initial
economic incentive?” Last year, a federal judge ruled against Google
in an antitrust case that Khan filed against it for monopolizing the
search advertising market. Lawyers for the company, which plans to
appeal, recently made the startling admission that “the open web is
already in rapid decline.”

A floatplane skimmed the clear skies over Lake Muskoka, in Ontario’s
cottage country. “That’s the guy who worked for Microsoft,”
Berners-Lee remarked, though later he wasn’t sure. We had just sat
down for lunch on the deck of the summer house he shares with Leith,
who emerged from the kitchen bearing asparagus, smoked-trout pâté,
peaches, and butter tarts. Friends had cycled through all summer,
Leith explained, many with forbidding dietary restrictions. Her
solution: “I did what most people do now. I went to Claude and said,
‘Claude?’ ”—she affected the French pronunciation—“ ‘I
need six days of menu planning using the New York _Times_.’ ”

“Streets of London,” by the singer Ralph McTell, started playing;
he crooned about a “forgotten hero / and a world that doesn’t
care.” Lately, Berners-Lee has been spending a bit more time on
music himself. “As it happens, I’ve just had a few singing lessons
for the first time in decades,” he told me. “Have you heard of
panto?” He meant British pantomime, a genre of family-friendly
slapstick that he first performed with an amateur group in Geneva.
“We did ‘Peter Pan’ and flew everybody to Never Never Land.”

I was, he’d told me, the first journalist to visit his summer place:
a snug retreat with brown clapboard siding, on a sparsely inhabited
island. Earlier, we’d been swimming, and had meant to sail across
the lake in Berners-Lee’s catamaran until we realized that there
wasn’t any wind.

Our conversation was similarly becalmed. I’d come not just as a
journalist but as a concerned digital native, watching Berners-Lee’s
web unravel from within. Billionaires were using platform power to
distort reality and control politics; Elon Musk’s Grok had recently
declared itself “MechaHitler.” Generative A.I. was flooding the
internet with deepfakes and conspiracy theories; a retired relative,
who spends a lot of time watching YouTube and querying Gemini, had
recently informed me of a likely shift in the magnetic poles, which
would fling us into the void as though “God were shaking an Etch A
Sketch.” The Trump Administration had abolished net neutrality, the
principle that internet-service providers should treat all traffic
equally. Yahooligans felt further away than ever as news broke that
Meta’s A.I. was engaging in sexual role-play with children.

Like Dorothy confronting the Wizard of Oz, I wanted Berners-Lee to
explain how, exactly, we were all going to get home. Did he really
think monopolistic tech companies could be constrained without
government intervention? How could Charlie—a mere intermediary
between users and L.L.M.s—prevent A.I. from hollowing out the open
web? And was anyone, anywhere, actually using Solid Pods? Politely,
Berners-Lee bristled. He countered that “public outcry” would
protect net neutrality, that A.I. hallucinations could be checked
against structured data, and that users were clamoring to take back
their privacy. Of algorithms, he said, “It’s just the addictive
bits we have to worry about,” then whipped out a diagram of all the
good and bad online. Eventually, we broke off the interview to go
kayaking. The conversation turned to Isaac Asimov,
who, Berners-Lee observed, had failed to anticipate an A.I. that
couldn’t be made to follow deterministic rules.

In “This Is for Everyone,” Berners-Lee argues that the web’s
lack of compassion is “a _design issue_” that can be fixed.
“There’s still time,” he writes, “to build machines that serve
the human,” that “promote the dignity of our fragile species on
this isolated globe.” It’s a moving vision. But it’s hard to
reconcile with the entropy of today’s online world, where all
that’s solid melts into air, and every protocol is profaned.

Leith returned me to shore in a motorboat. Soon, I was in my hotel
room, retracing Berners-Lee’s past across the network he had built.
Some links were broken, but the Internet Archive filled the gaps. Next
month, in San Francisco, the organization will honor Berners-Lee with
its Hero Award, to mark the trillionth page its crawlers have
downloaded from his World Wide Web. ♦

Published in the print edition of the October 6, 2025, issue, with
the headline “Pandora’s Patch.”

_JULIAN LUCAS is a staff writer at The New Yorker covering books,
the arts, and the politics of history. His work for the magazine
includes features on the excavation of slave ships and reënactments
of the Underground Railroad; profiles of writers and artists such as
Mati Diop, Cole Escola, Samuel R. Delany, and Ishmael Reed; and
essays on intangible heritage, art restitution, and the exhibition of
video games at museums. Previously, he was an associate editor at
Cabinet. His writing has appeared in The New York Review of Books,
Harper’s Magazine, Vanity Fair, Art in America, and the New York
Times Book Review, where he was a contributing writer. In 2021, he
was a finalist for the Nona Balakian Citation for Excellence in
Reviewing from the National Book Critics Circle. He grew up in New
Jersey and lives in Brooklyn._



 

 

 
