Last July, a paper by Mark Aguiar et al. made waves by attributing 23-46 percent of the 12 percent decline in work hours among predominantly low-skill males aged 21-30 to improvements in video-gaming technology. That hypothesis offended the sensibilities of anyone who believes that all those who are able to work should do so. But what if it misses a larger point about the changing patterns of work and leisure?
Start with the story of recreational gaming in recent years, which is also a story about the commodification of free time. Where once video gaming offered individual or cooperative escapes from the workaday world (after the initial cost of purchase), it has now increasingly been pressed into the service of the market. As a result, gaming has come to privilege haves over have-nots, work and passive consumption over leisure, and the economic over the social. It represents a cautionary tale for what can happen to any social and leisure activity, particularly in the digital economy.
The market capture of gaming has taken at least three forms. First, gaming has, in many ways, been transformed from an ordinary leisure activity into a consumer luxury good. According to the Entertainment Software Association, “software, including in-game purchases and subscriptions,” accounted for $29.1 billion of the industry’s revenue in 2017, whereas “hardware, including peripherals,” accounted for a relatively meager $6.9 billion. And in 2016, mobile-gaming apps made up 75 percent of all Apple App Store revenues and 90 percent of Google Play Store revenues.
More to the point, the makeup of the gaming industry’s revenue—which grew by almost 20 percent in 2016 alone—suggests that it is catering not just to those who are working less, but also to a leisure class with higher levels of disposable income. Much of the industry’s growth is the product of a widely adopted “freemium” model, whereby users get a game for free, but are pressured to make credit-card purchases in order to experience it in full (hence the portmanteau of “free” and “premium”).
Accordingly, many of the highest-grossing games create a sunk-cost dilemma: users must keep making new purchases to keep pace with each new version of the software. In what becomes a vicious cycle, a player who has spent $50 to get a leg up over other players easily succumbs to the temptation to spend just $5 more to keep things that way. Similarly, many games use addictive “gacha” mechanics or “loot boxes” that deliver randomly generated, exclusive rewards. This increasingly common ploy functions so much like a slot machine that it has prompted new anti-gambling legislation in six U.S. states. Far from being just an inexpensive escape for idlers, then, modern gaming has become an outlet for people with money to burn—or, worse, for those gambling on credit.
Meanwhile, this trend has been accompanied by the rise of a precariat of semi-professional “content creators” who publish videos on YouTube and stream their gameplay on Twitch. This growing informal labor force may account for a healthy share of the 4 percent of young men who report spending six or more hours per day on computers while neither working nor looking for work.
In fact, Twitch boasts 1.5 million “broadcasters” worldwide, some of whom log 60-hour weeks and make six figures from viewer donations, ad revenues, and sponsorships, notes Taylor Clark of The New Yorker. Still, most grind away in Twitch’s online gig economy without ever achieving that level of success. And all the while, the entire experience is saturated with advertising and direct solicitations to the site’s 100 million monthly visitors.
A third form of market mediation is not far off. Many machine-learning applications rely on vast stores of human-produced data, and digital gaming is an especially data-rich activity. Already, the “Grand Theft Auto” series is being used to teach autonomous vehicles to recognize street signs and other obstacles. Presumably, it is only a matter of time before developers start designing games with the goal of collecting even more human behavioral data. Such commercial incentives will inevitably shape the content of games, which will be optimized for machine learning alongside—or instead of—quality of experience.
Although the commodification of leisure is not new, it is also not something to be complacent about. In industrial and post-industrial societies, work is necessarily hierarchical. But leisure has always held out the promise of equality. Under ideal conditions, one need not belong to the same socioeconomic class to belong to the same book club, or, for that matter, to the same “Clash of Clans” clan (a private in-game community formed by players who set the qualifications for membership). Video games, at their best, offer everyone an equal chance to overcome the same challenges on a level playing field. “Pac-Man,” after all, didn’t let you add extra quarters to purchase immunity from the ghosts. And, like leisure generally, games provide a space for the formation of social relations that can stand apart from economic ones.
But as Harvard University philosopher Michael Sandel has demonstrated in great detail, the introduction of market forces can shatter the ideal of equality in a variety of spheres. Markets bring extraneous competition, even envy, which, as Bertrand Russell once observed, “consists in seeing things never in themselves but only in their relations.”
Thus, in the “freemium” economy, one’s disposable income really does determine whether one can join certain “Clash” clans, because many accept only members who have advanced to a level that can be achieved only through the in-app purchase of “gems.” On Twitch, income divides social communities into haves and have-nots, with the latter constantly hustling for the former’s patronage. And in an AI-driven setting – as on social media – one can never be too sure where the fun stops and the exploitation begins. In any case, a potential realm of equal opportunity has been replaced by a congeries of unequal, transactional power relations.
To be sure, any discussion of recreational gaming, and the community it has created, is largely a discussion about a narrow cohort of men and boys born after 1980. But the market’s wholesale colonization of this domain is of a piece with broader trends across health care, education, media, and even public spaces such as U.S. national parks—areas where a strict economic logic is often incompatible with the public good. What’s more, this process has been accompanied by a larger cultural shift toward market prerogatives, one that is reflected in most gamers’ passive acceptance of the full-bore commodification of their chosen free-time activity. (Now that the Supreme Court has given state governments its imprimatur to legalize sports betting, the culture of athletic “fantasy” leagues – and fandom more generally – will likely move even further down this path.)
Reversing the cultural acceptance of market infiltrations into leisure will require nothing less than a return to traditional twentieth-century social democracy. Under today’s state-sanctioned system of per capita GDP-anchored utilitarianism (or “neoliberalism”), the social and environmental costs of unfettered GDP growth are considered not just incidental, but justified for the greater good. Within this prevailing economic model, the social-democratic impulse is to redress the externalities of growth through labor-market and social-insurance policies.
But such policies are merely the “apps” of social democracy. The operating system is something larger. It might best be understood as a political dispensation for preserving that which should be separate from markets, be it the environment, organic social networks, or public goods such as education and basic research.
This is important because the labor/leisure trends that are so visible in the world of gaming pose a challenge to the social-democratic project itself. For starters, sources of economic value are changing in such a way as to make market encroachments into social and civic life more likely. If one accepts the truism that “data is the new oil”—meaning the era’s predominant form of wealth—then those sitting at home gaming for hours on end are, in theory, already more productive than the millions of people employed in purely extractive, rent-seeking occupations such as advertising and speculative finance.
In light of these trends, how can the market exploitation of civic, social, and leisure spaces possibly be averted? With respect to personal data, the virtual-reality developer Jaron Lanier, who has gradually become a kind of self-appointed ombudsman for Silicon Valley, has proposed a system of universal micropayments, whereby everyone with a digital presence would be entitled to royalties in perpetuity whenever their data is used. For example, if you meet your spouse through a dating app and that app uses the “correlations between you and your spouse to [match] other prospective couples,” Lanier explains in Who Owns the Future?, then you would be entitled by law to a small payment from the app. Likewise, anyone whose gameplay is used for AI training or behavioral research would get a cut of any future profits stemming from that enterprise.
More broadly, Nicholas Agar of Victoria University of Wellington suspects that the kinds of goods that only people can provide could form the basis of a “social economy” running parallel to the digital economy. Whereas the latter will continue to prize efficiency above all else, “the principal value of the social economy,” Agar writes in his forthcoming How to Be Human in the Digital Economy, “is humanness … founded on a preference for beings with minds like ours, a preference to interact with beings with feelings like ours.”
Simply put, Agar believes that we will always prefer a human waiter, nurse, or teacher to an AI—regardless of how authentically “human” robots become. He envisages a labor market where a large share of workers will contribute purely social goods, such as companionship for the elderly, caretaking for the young and convalescent, and a sense of community for the marginalized—which happens to be what many Twitch stars aspire to provide. To that end, he urges policymakers to preserve and expand such occupations, rather than disparage all forms of employment that do not bear directly on productivity.
Both Lanier’s and Agar’s visions for the future could militate against poverty or “digital feudalism,” in which a Silicon Valley oligarchy would rule over a massive underclass whose sole purpose would be to furnish data for AIs. But each also introduces a dilemma for social democracy, particularly its dispensation for civic, social, and temporal spaces outside of the market. Indeed, under Lanier’s system, one’s entire lived experience would be commodified, and everyone would have a strong incentive to become a digital hustler, dumping new data into the ether around the clock to juice their royalties.
Agar’s approach shows more promise, particularly as a response to “one of the defining ills of our time—social isolation.” And yet, it also implies the widespread commodification of sociality itself. As one can see in gaming, directing social interactions according to the dictates of the market risks preserving or exacerbating existing economic inequities and political power imbalances. Agar is right to say that we “value the human interaction that occurs” when we receive that “café latte” we ordered. But that doesn’t mean the barista wouldn’t rather be somewhere else.
More often than not, discussions about AI and future work arrangements give rise to proposals for a universal basic income. A UBI would be effective in eliminating absolute poverty, and it may very well be making its way from “radical” to inevitable, given the prevalence of below-poverty-wage jobs. Taken in isolation, though, a UBI could serve as an alibi for a broader status quo of rising housing, health care, and tuition costs, along with ravenous advertising-driven consumption (which now accounts for almost 70 percent of U.S. GDP).
Moreover, too much focus on a UBI risks crowding out efforts to vastly expand pre-tax, upstream forms of redistribution, such as investments in education, public health, infrastructure, and other public goods that form the basis for a civil society that is separate from—and constitutive of—the market. Such outlays could be funded, perhaps, by a tax on data (and other) rentiers. And, taken together, they would represent an expanded social-democratic state, and thus a new operating system.
Over time, with that operating system running in the background, the cultural acceptance of market encroachments into leisure and other domains would likely ebb as more people came to appreciate alternatives to the commodified life. And this could be accelerated with publicly provisioned options for leisure, recreation, and personal development.
With this in mind, the social-democratic project—and the future of both work and leisure—might best be served by a National Investment Employment Corps (NIEC) like that proposed by Mark Paul, William Darity Jr., and Darrick Hamilton. A government program to employ all willing workers would not just be in a position to raise wages across the entire economy over time. It could also put downward pressure on standard work hours, which was the original goal of organized labor. Eventually, it might also prod policymakers away from GDP-centrism, and toward a more humanistic policy metric that treats leisure time—and the capacity to find satisfaction in it—as a measure of well-being, rather than as a Beckerian loss to productivity.
A public option for employment could also start funneling people into genuinely fulfilling social work of the kind Agar envisions, but in such a way as to avoid exploitation. Such work could include the restoration of public and natural spaces, or caring for, entertaining, and communing with the elderly, the disabled, and others outside of the labor force. Because jobs would be created to serve social purposes rather than profit-driven growth for its own sake, one could even imagine government-employed Twitch stars, tasked with building online communities for the marginalized and estranged. They would no longer be at the mercy of strangers’ largesse. And, ideally, they’d only have to put in a few hours a day before they could begin gaming just for the fun of it.