This past May, Farhad Manjoo, the technology columnist for The New York Times, posted an interactive feature that asked readers to rank what he calls the “frightful five”: Amazon, Facebook, Apple, Microsoft, and Alphabet (Google’s parent company). Readers were asked to order the companies by how easily they could live without them, forsaking the products and services each provides.
For me, it was easy to give up Facebook and Microsoft; neither is essential to my life. It would be harder to drop Apple, as I’m locked into their iPhones and Macs and all my photos and music are stored on Apple platforms. Still, alternatives exist. But I’m an avid user of Amazon Prime, and can’t fathom having to find an online alternative for my various purchases, let alone navigate real stores. And Google? That might be hardest; it’s tantamount to abandoning the Internet altogether.
The Times feature was premised on Manjoo’s realization that these companies are, in fact, impossible to live without in the modern day. We may daydream about ditching one or two, but we could never forsake the lot. They are, he believes, “essentially inescapable for any consumer or business that wants to participate in the modern world.”
Manjoo would get no argument from Franklin Foer, whose new book, A World Without Mind, contrasts with Manjoo’s lighthearted tone by contending that the control we have ceded to these companies is doing nothing less than threatening American democracy and desiccating the capacity of all humans to think, interact, and act with free will.
Foer focuses his argument on Amazon, Apple, Facebook, and Google, apparently less concerned about the perniciousness of Microsoft, I imagine because it has less of a consumer presence—most Microsoft customers are companies, not individuals. Of the remaining four, he’s most tolerant of Apple, which is surprising since it arguably does more damage to the spread of ideas through its closed-web, app-centric view of the internet, not to mention the horrendous curation of iTunes, which on editorial grounds alone excels at pushing popularity and burying the novelty and originality Foer elsewhere champions.
He believes there is nothing benign or accidental about these tech powerhouses. Rather, they are the inevitable result of a perverse and particularly Silicon Valley version of 1960s counterculture, that bygone faith that connecting people can lead to a better, freer society. Foer traces, as others have done previously, a straight line from the ideology of ’60s radicals to the corporate campuses of Facebook and Google. The touchstone beliefs of the 1960s—decentralized power, communitarian production, a faith that technology can liberate ideas—have been automated, scaled, and distributed through mega-billion-dollar corporations, unchecked by meaningful regulation. The great reach and power of these companies, Foer warns, allows them to pursue a dangerous mission: these companies, he argues, believe they know what’s best for us, what we want to read and watch and even how we want to feel, and the intoxicating power of their services is allowing them to reach deep into our brains and control broad areas of our lives.
“They are monopolies operating without restraint, regulatory or otherwise,” Foer argues. “The companies preach the gospel of efficiency, as they engage in the most extensive surveillance in human history. They are rent-seekers with little regard for the independent producers whose goods they sell. They shape the minds of citizens, filtering the information by which citizens arrive at their political opinions.”
The book is unapologetically a polemic—or what, in my days as an editor at Wired, we would call a “manifesto.” It turns out that many of Foer’s villains are my former colleagues and writers: Kevin Kelly, Chris Anderson, Stewart Brand. I offer that disclosure, but would quickly add that I started the book eager to dive into Foer’s critique. My own faith in technology is more pragmatic than ideological, and I am as suspicious about the deep reach of these companies as Manjoo or Foer. There’s nothing altruistic about Amazon’s dogged pursuit of efficiency or Apple’s hyper-optimized product development cycles; their benefits indeed come with costs.
What’s more, Foer’s angsty book is exceptionally well-timed. In June 2017, Amazon announced a $13.7 billion purchase of Whole Foods, a step into the analog world that unnerved not just its retail rivals but also legions of Whole Foods customers, like me, who aren’t sure we want our produce optimized by algorithms. That same month, Facebook piously unveiled a new mission statement—“Bring the world closer together”—that starts to sound sinister after reading Foer’s skeptical critique that these giants believe that “by stitching the world together, they can cure its ills.”
But even a manifesto needs the weight of argument to be convincing. Foer’s critique is so damning—“The more they can insinuate themselves into our lives the better. There is no limit.”—that it demands some evidence, something more than mere insults and insinuation.
Foer is a veteran of scrappy arguments and controversies. Twice the editor of The New Republic, he previously wrote How Soccer Explains the World, a whip-smart book that analyzed geopolitics via sport. Lately, he’s been writing smart stuff on Washington for The Atlantic.
We can certainly agree with Foer that these companies have accrued tremendous power over the culture, power that is practically unfettered by regulation (at least in the United States) or physical constraint. But when he claims that Google and Facebook are “the most spectacularly successful firms in history,” it only takes a moment to consider that maybe that honorific belongs to any number of other companies: ExxonMobil, for instance, has made $310 billion in profits over the past 10 years, three times what Google has made over the same span.
It’s delightful to read erudite analogies such as “Like nineteenth century European powers, each company does little to impinge on the other’s sphere of influence, competing only on the fringes of empire.” But unfortunately it’s just not true. Google competes doggedly with Apple on mobile hardware and operating systems, and with Facebook for the advertising market. Amazon competes with Apple on media downloads, and with Google and Apple on artificial intelligence hardware, and so on. Hyperbole is part of Foer’s style here, but it does occasionally lead him to make assertions that come across as uninformed.
If Foer lacks precision in his more sweeping outrages, he is on stronger ground in the heart of his argument, which is essentially about journalism, and the role of writing. Amazon, despite the fact that less than 10 percent of its revenues are from books, according to a 2014 New Yorker estimate (the company is notoriously opaque about its financials), is essentially still a bookseller to Foer. Google’s worst sins are Google News and Google Books, and Facebook is a malevolent parasite of news and writing. Each of these companies, Foer argues, shows staggering contempt for intellectual property—the hard-won written word—and they command something close to monopoly power, putting the publications, and more importantly, their writers, in jeopardy. He compares these companies to the food industry in the 1950s, pushing Twinkies and Wonder Bread down our throats: “Intellectuals, freelance writers, investigative journalists, midlist novelists are the analog to family farmers, who have always struggled but simply can’t compete in this transformed economy.”
Midlist novelists might be unlikely heroes, but Foer is their champion, having bloodied his nose in his own battles. Foer is forthright in admitting that part of his aversion to the ideology of Technology Triumphant stems from getting burned at the New Republic when founding Facebooker Chris Hughes bought it and coaxed Foer into taking a second run as editor in chief of the magazine. Less than 18 months into the project, Hughes got frustrated with the magazine’s paltry online growth and profit (he really should’ve known better) and forced a clicks-and-views acolyte CEO upon Foer. Not long after, Foer was gone.
It really is a tragic story, and Foer is absolutely correct that writers face an uncertain future in the current landscape. As an erstwhile journalist myself, I’m glad for the empathy. But it’s unfortunate that Foer is so eager to castigate the technology giants for the threat they pose to democracy—a threat that, in his telling, is really mostly about writers—that he doesn’t offer a richer argument. And it’s telling that some of the best arguments for the role of newspapers and journalism in the current reality come in essays (2009 and 2014) by Clay Shirky, whom Foer mentions only to scoff at his belief that by connecting distant communities, the Internet might liberate some new creativity and progress. If only Foer had wrestled with those ideas forthrightly—they’re only a Google search away, after all.
My biggest lament with A World Without Mind, though, is how the book misses the mark on Stewart Brand, a Merry Prankster turned Whole Earth Catalog founder turned nuclear energy advocate (I have met Brand, though I would be surprised if he remembers the occasions). Brand is a worthy foil and a controversial figure; he’s never shirked his role as a cultural agitator. There’s a lot to challenge him on. Foer starts with a sketch of Brand’s life—which has fused 1960s faith in communities and transparency with the power of technological dissemination—framing this as a dire coupling of two dangerous ideologies. But alas, Foer doesn’t dig deeper; instead he reverts, as so many other Silicon Valley takedowns have, to basing his damnation on Brand’s most famous aphorism: “Information wants to be free.”
On the face of it, this remark, which has informed decades of Silicon Valley disruptiveness, seems like blatant contempt for the work of creatives—writers included—in an atoms-versus-bits universe. If writing is mere information, and if information should circulate fluidly without regard to its essential value, then this is indeed the end of the culture we have built. Foer is right: For 30 years, Brand’s koan has been invoked in the Valley as permission to open the Pandora’s box of digitization, unleashing technology to destroy music and movies and journalism and other creative fields.
But here is where Foer’s argument is built from straw when it might have been made of bricks. For such an essential block in Foer’s argument, and considering how lucidly Foer dives into history elsewhere (Descartes, Alan Turing, Gottfried Leibniz), it’s surprising he didn’t think to provide the actual context and full quote behind Brand’s riff. In fact, it was in 1984, at the very first “Hackers” conference, and Brand was responding to Steve Wozniak, co-founder of Apple (Foer goes into the history and etymology of “hackers” elsewhere, so this might’ve crossed his radar). Up on the conference stage, Woz wondered why big tech companies shouldn’t hand over intellectual property to their employees if the company didn’t decide to make use of it. Brand, who was in the audience, stood up to reply:
On the one hand information wants to be expensive, because it’s so valuable. The right information in the right place just changes your life. On the other hand, information wants to be free, because the cost of getting it out is getting lower and lower all the time. So you have these two fighting against each other.
“Information wants to be expensive.” Brand wasn’t, in fact, calling to storm the gates of intellectual property; he was noting an essential tension, a tension that demands that both creators and disseminators engage with each other—and a tension that increasingly unleashes a new opportunity: Your ideas and intellectual effort have the potential to reach exponentially more people than they might have previously. But to do so, they must be available, and transferable. They must be liberated.
Indeed, though Brand refers to cost, what Wozniak was referring to—and what technologists often mean by “free”—is less about price, and more about access. In the pre-digital age, information was locked up among experts and institutions and even libraries, where it was difficult to access and circulate. When ideas are let loose, however, they have profoundly more impact—more value—than they might when they’re pent up inside, say, a print magazine that reaches 60,000 East Coast elites.
The technologists would argue, and I would concur, that there’s in fact something essentially democratic about this liberation of information, about the spreading of ideas and the connection of communities. If you believe that everyone deserves access to ideas, access on balance creates more benefit than the evils it might spread. That’s a tension I’d like to have seen Foer explore. What is the essential trade-off we’re faced with? If Manjoo is correct, and these companies aren’t going anywhere, then how do we maximize the upsides and mitigate the downsides?
But Foer doesn’t go there. It’s too bad, because a deeper engagement with Brand’s tension, by someone as intelligent as Foer, would be welcome; indeed, Foer may be one of the few writers capable of such an engagement, given his experience on both sides of the divide. But he never digs in, preferring to let demons spew from the cauldron of artificial intelligence, algorithms, and automation that these companies excel at.
Or do they? As someone who has spent the past few years in the trenches of analyzing personal data for analytic power (all anonymized, natch), I can admit that it is much harder than people think to plunder reams of data for real insights. Foer is absolutely right that these four companies are exceptionally good at it—but that doesn’t mean they get it right. Case in point: A few months ago, Google spliced my identity with that of a 64-year-old Bavarian. Anyone who Googled my name was offered a Frankenstein “Thomas Goetz” who was half me and half this other dude. It was a creepy mistake. Once aware of the error, I clicked on “feedback” and scolded Google—or its algorithms—that I was not this other person, without offering my real DOB. The mistake vanished within a couple of days. That may be grist for Foer’s argument (the algorithms are a work in progress, and I just made them a teensy bit better). But it also suggests that the singularity—the creation of an AI superintelligence—isn’t so near, after all. There are, in fact, practical alternatives to the hegemony that Foer decries. Instead of Google, you are free to use Bing; if you’d rather avoid Microsoft, try DuckDuckGo, a search engine that doesn’t track or store personal information. The results aren’t as spot-on as Google’s, but they’re solid enough for most queries.
Instead of Apple, there is Samsung, or Spotify, or whatever cocktail of hardware and software you require. And instead of Facebook, there’s always Snap (it’s where the kids are, anyway). And instead of Amazon, well, there is Wal-Mart. Remember when Wal-Mart was the most evil manifestation of corporate perfection imaginable? That’s so 1998.
If these don’t sound like much of an alternative, given the benefits these companies and their products bring us, Foer never makes it entirely clear what his ideal world would look like. Clearly, it’s one where writers are well paid to write stories that may or may not prove popular. It’s one where writers don’t have to be driven by clicks and traffic counters, a quest that’s not only soul-sucking but also, he insists, potentially nation-destroying. One of Foer’s more brilliant riffs traces how the clickerati (Vox, Buzzfeed, Upworthy) all chased the story of a Minnesota dentist who killed Cecil the Lion, to universal outrage, especially online. It was a great story, and every outlet had to find its angle to get clicks. This frenzy for traffic seems benign . . . until it doesn’t. “Trump began as Cecil the Lion, and then ended up president of the United States,” Foer quips. It’s a great line with more than some truth to it.
He then continues: “This profusion of data has changed the character of journalism. It has turned it into a commodity, something to be marketed, tested, and calibrated. . . . Magazines and newspapers used to think of themselves as something coherent—an issue, an edition, an institution. Not as the publishers of dozens of discrete pieces to be trafficked each day on Facebook, Twitter, and Google. The audience for journalism may be larger now, but the mindset is smaller.”
This isn’t, in the end, much of a polemic, in the sense of arming readers with a battery of arguments to better understand the reach of these powerful companies. Rather, it’s more a reminiscence, a nostalgic glance back to a time when words were paid for by the dollar, not the impression, when people had to move through a physical world to buy physical objects to gain knowledge. I myself lived those days; I loved those days; I was well paid in those days. But there is—there must be—a way for media, not just writing but all the creative media through which we seek to persuade and inform and entertain, to survive and thrive in the current environment. Jonah Peretti, the algorithm-obsessed founder of Buzzfeed who gets some mention here, is a terrific exemplar of threading this needle. If only Foer had put his polemics aside to explore the kind of journalism Peretti champions as a model for what might come next.
And here’s the ultimate irony, which I know isn’t lost on Foer: His argument against Google, Facebook, Apple, and Amazon is contained inside a book, available in digital or paper form. This book will be discussed fervidly on Facebook, as it deserves to be. And it is on sale now—in electronic form at Google Play and Apple iTunes—and at Amazon. The list price there for a hardcover you can hold in your hands is $27, but as of this writing, you can get it for 31 percent off, at $18.56.