Book Reviews

Chip Shots

The United States wants to stop China’s semiconductor industry in its tracks. Here’s how it could backfire.

By Arthur Goldhammer

Tagged: China, CHIPS Act, technology policy

Chip War: The Fight for the World’s Most Critical Technology by Chris Miller • Scribner • 2022 • 464 pages • $30

In left-liberal circles there is a rough consensus about what has gone wrong with our politics over the past 40 years. The critique can be summed up in two words: neoliberalism and globalization. Although these capacious ideological generalizations cover a multitude of sins, the gravamen of the charge against both is that, in the name of economic efficiency and growth, globalizing neoliberals of both the right and the left justified depriving national governments of the power to reduce inequalities of wealth and income, promote equal opportunity, and protect the health and welfare of the citizenry. Neoliberals prioritized property rights over social and political rights and protected markets from political meddling. They removed regulatory fetters on the movement of capital and sought the cheapest labor they could find to put their money to work. As a result, from the late 1970s on, governments across the developed world retreated from the social democratic reforms credited with fostering the harmonious prosperity of the three decades following World War II—the period the French have dubbed les Trente Glorieuses—thereby triggering a populist and xenophobic backlash while polarizing previously consensual political systems and weakening resistance to authoritarian demagogues.

This account of political change across the Western world since the 1980s has much to recommend it, not least the implication that the globalized neoliberal regime has sown the seeds of its own impending demise. This is the view espoused in one form or another by a number of excellent recent books, among them Gary Gerstle’s The Rise and Fall of the Neoliberal Order, Michael Tomasky’s The Middle Out, and Bradford DeLong’s Slouching Towards Utopia. Yet each of these estimable authors embraces the notion that the novel feature of the period was superstructural, to borrow a term of art from the Marxist lexicon: All believe that ideology was in the driver’s seat and that it was the readiness of left-liberals to accede to the tenets of market-first ideology that established neoliberalism as the unsurpassable political horizon of the age (to borrow a phrase from philosopher Jean-Paul Sartre).

But what if this superstructural interpretation is incomplete? What if it blinds us to a deeper transformation of the means of production themselves? What if the key innovation of the 1970s and ’80s was the advent not of neoliberal ideology but of the microprocessor, which simultaneously created new markets, dramatically altered trade flows, and shifted both the economic and military balance of power among nations? And what if this crucial technological innovation can trace its roots all the way back to the aforementioned Trente Glorieuses? What if the glory years of social democracy saw the benefits of higher education spread much more widely than ever before, disseminating technological skills throughout the world and making it possible to tap far more of humanity’s collective brainpower, while creating a web of interdependent corporations spanning both the developed and less developed worlds? The microprocessor not only became the flagship product of the neoliberal era’s dominant industry but also served as its indispensable instrument, without which it would have been impossible to tame the torrents of information necessary to manage far-flung supply chains and global capital flows.

Chris Miller’s Chip War deserves credit precisely for redirecting our attention from superstructure to base, from the high political drama of the past four decades to the more prosaic business of manufacturing microchips. At its most basic level, the book offers a masterful history of the semiconductor industry, from the invention of the first transistor in 1947 to the incredibly complex machinery required to deposit tens of billions of nearly atom-sized switches on a silicon chip no larger than a fingernail. Miller, who teaches international history at Tufts University’s Fletcher School, emphasizes the national security implications of a global supply chain in which components crucial to U.S. defense must pass through choke points, such as Taiwan, that are subject to intervention by commercial and strategic rivals. But the history he recounts in vivid detail also tells a more hopeful story, illustrating the way in which globalization has made it possible to mobilize humanity’s collective brainpower to achieve progress that no single country could have achieved on its own.

World War II had demonstrated the usefulness of computing power for purposes as different as breaking codes, building weapons, and organizing industrial processes. The first general-purpose computers were just coming into being as the war was ending. As impressive as they were, they suffered from a major weakness: They relied on failure-prone vacuum tubes. Although the advent of the transistor promised a new era in computing, it was easy to miss its significance. The New York Times buried this important step toward a more reliable and less power-hungry “electronic brain” on page 46. “Time magazine did better,” Miller reports, by dubbing the new device a “little brain cell.”

But it would take nearly a decade and many further innovations for the full implications of the solid-state revolution to become apparent. The next major step came in 1958, when Texas Instruments hired electrical engineer Jack Kilby to work on potential military applications of semiconductor technology. Kilby had the idea of etching several transistors onto a single “chip” of germanium (later supplanted by silicon). Independently, Bob Noyce, a co-founder of Fairchild Semiconductor, hit on the same idea but also realized that the components on the chip could be connected without wires. The potential of these early “integrated circuits” was immediately clear, but one problem remained: The first samples cost 50 times as much to manufacture as comparable circuits built out of traditional components.

As luck would have it, however, the Soviet Union launched its Sputnik three days after the founding of Fairchild. For the U.S. government, cost was no object when it came to beating the Russians: “The first big order for Noyce’s chips came from NASA, which in the 1960s had a vast budget to send astronauts to the moon,” Miller writes. Thus, the nascent chip industry got its first major boost not from the market but from a government determined to refurbish its tarnished prestige, assert U.S. technological superiority, and bolster the nation’s security. Within two years Fairchild grew from annual sales of $500,000 to $21 million, largely on the strength of sales to the Apollo program. Later, semiconductors would figure in the smart bombs and missile guidance systems developed during the Vietnam War, ensuring that the defense budget would continue to subsidize the industry’s growth.

Another major step came with the realization that circuits could be laid out on their semiconductor substrates by using light and lenses to project “masks” onto a layer of “photoresist” chemicals, which react when exposed to light. The impact of this crucial innovation, called photolithography, may be compared to that of the printing press. As ever shorter wavelengths of light were brought into use, circuit “features” could be made smaller and smaller, allowing more and more transistors to be packed onto a single chip. Fairchild co-founder Gordon Moore became famous for the 1965 prediction now known as Moore’s Law: that the computing power packed onto each chip would “double every couple of years.” Time has proved him right.
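
A rough back-of-the-envelope illustration of that compounding (the arithmetic here is mine, not Miller’s): if the transistor count on a chip doubles every two years, then starting from a count N_0 in year t_0 it grows roughly as

$$N(t) \approx N_0 \cdot 2^{(t - t_0)/2}.$$

Starting from the roughly 2,300 transistors on Intel’s first microprocessor in 1971, twenty-five doublings over the following half century multiply that figure by about 34 million, which lands squarely in the tens of billions of transistors found on today’s most advanced chips.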

Selling advanced technology to the government at premium prices was not the only formula for success, however. Akio Morita, the co-founder of Japan’s Sony Corporation, recognized the transistor’s potential to revolutionize the consumer market. Unlike American firms bent on pushing the underlying technology forward, Sony was content to license U.S. patents for use in mass-market products such as transistor radios, music players, and digital cameras. After receiving one such radio as a gift from Japanese Prime Minister Hayato Ikeda in the early 1960s, French President Charles de Gaulle sniffed derisively to an aide about a statesman who would stoop to acting as a “transistor salesman.” But the Japanese leader had grasped what the French president had not: “Made in Japan” might still signal products that were cheap and poorly made, but consumer electronics would soon rival oil as a source of wealth, jobs, and international clout.

Consumer spending made it possible for industries in developing countries to bootstrap themselves on the sales they generated, without infusions of capital in search of cheap labor (as the standard neoliberal story would have it). Yet money did also flow from the United States to Asia in search of lower production costs. Despite the immense labor savings made possible by photolithographic techniques, chips still had to be attached to their carriers and connected to other chips—labor-intensive tasks that stood in the way of reaping the windfall implicit in Moore’s Law. At first, U.S. industry cut labor costs by hiring women and Native Americans, among other measures, but Asia offered the prospect of even lower wages.

A prime mover in offshoring chip production to the Far East was one Charlie Sporck, the kind of manager whose anti-union animus and relentless focus on cutting costs drove workers at General Electric to burn him in effigy. Fairchild hired him to trim labor costs to the bone, which he did by opening a factory in Hong Kong in 1963. (Incidentally, Miller’s sharp-penned portraits of industry figures like Sporck do much to enliven what might otherwise read as dry industrial history. The men—and they are all men—who people these pages demonstrate that the capitalist market, for all its vices, has a remarkable knack for enlisting a wide variety of character types in its mission.)

Asia’s rise to a top place in the semiconductor industry was not due solely to its ability to produce cheap consumer goods and supply cheap labor, however. As Miller pertinently notes, “The semiconductor industry was globalizing decades before anyone had heard of the word, laying the grounds for the Asia-centric supply chains we know today.” Strategic considerations were as fundamental as market forces. Take Taiwan, for example. Although economy minister K. T. Li, who had studied nuclear physics at Cambridge, initially offended a delegation from Texas Instruments in 1968 by declaring that intellectual property was something “imperialists used to bully less-advanced countries,” he soon realized that partnering with American firms not only promised substantial economic rewards but might also shore up American security guarantees to Taiwan and other anti-Communist governments in Asia, which were looking wobbly in the wake of the U.S. debacle in Vietnam. Leaders in Singapore, Malaysia, and South Korea saw similar benefits in pursuing U.S. partnerships. Capital to build chip fabrication plants came from strategic investments by governments and domestic banks rather than from global capital markets. By the end of the 1980s, Asian firms were outcompeting their American rivals in key market segments.

While knowledge of the physics that underlies solid-state devices is widely shared, the actual fabrication of chips at minimal cost and maximal yield depends on a great deal of tacit knowledge that can only be gained through experience. Firms guard this knowledge closely. Asian chip foundries thus gained a comparative advantage, which was further enhanced as chip design increasingly became a business separate from chip manufacturing. The advent of so-called “fabless” chip shops, which sold circuit designs but lacked the facilities to manufacture them, lowered the barriers to entry. Highly specialized chip fabricators such as the Taiwan Semiconductor Manufacturing Company (TSMC) could then take these designs and, with their extensive in-house knowledge, earn handsome profits while adding to the store of private knowledge that gave them their competitive edge.

The ratcheting effects of ever more advanced technology and ever greater economies of scale left even major firms wondering whether they would be the next to be consigned to the dustbin of history. Only the Paranoid Survive, for example, was the title Intel co-founder Andy Grove gave to one of his books. Although his company has retained its dominance over the microprocessor market to this day, it has plenty of reasons for paranoia. Apple has dumped Intel chips in favor of processors of its own design, built on an architecture licensed from the fabless firm ARM, which is based in the UK and owned by the Japanese conglomerate SoftBank. Fabrication of Apple’s chips is outsourced to TSMC in Taiwan, which, despite frosty relations between Taiwan and mainland China, employs workers in the People’s Republic. But cheap labor is only a minor reason for TSMC’s dominant position among chip foundries. It is also the sole master of the technologies needed to manufacture Apple’s latest chips. “A foundry like TSMC could fabricate chips for many chip designers, wringing out efficiencies from its massive production volumes that other companies would find difficult to replicate,” writes Miller. And the company has learned to operate the most sophisticated photolithography machines essential to the task, for which there is only one source in the entire world: the Netherlands’ ASML.

Therein hangs a tale. The latest generation of chips employs features so small that they cannot be etched by ordinary visible light; extreme ultraviolet (EUV) light is required. To produce these short wavelengths, tiny balls of tin must be vaporized by high-powered lasers comprising 457,329 component parts. Aiming the ultraviolet rays requires special mirrors made of a hundred alternating layers of molybdenum and silicon. These and countless other esoteric technologies are embedded in the EUV lithography machines that ASML alone manufactures. In turn, each of these specialized systems embodies the tacit knowledge acquired over many years by ASML’s many subcontractors and sources of scientific inspiration, spread across countries as diverse as the United States, Germany, the Netherlands, Japan, Slovenia, and Greece, which supply the hundreds of thousands of individual parts that go into each machine. A top ASML executive told Miller that the company itself manufactures only about 15 percent of those parts. For the rest, it engineers “like a machine” the network of several thousand business relationships that constitutes its supply chain.

Here we come back to the point I alluded to at the outset. Viewing global capitalism as economists do, in terms of abstract pools of capital and labor, obscures the crucial importance of these social relations of production, to borrow another term of art from the Marxist lexicon. Advanced technological societies depend on knowledge of many kinds, some of it public and widely disseminated, some of it acquired through long experience and stored up in the institutional knowledge warehouses of specific firms. Chip War shows how the diffusion of public knowledge has accelerated the development of several highly useful technologies while simultaneously creating “choke points” of tacit knowledge in the underlying supply chains. Firms like ASML and TSMC became single sources of inputs essential to the operation of vast webs of global production. When national defense capabilities depend on those inputs, the issue ceases to be solely economic: If key choke points are vulnerable to occupation or destruction by a potential enemy, it becomes existential.

Or so national security analysts and China hawks would have us believe. If most of Chip War recounts the long and winding history of the semiconductor industry, the final chapters zero in on the increasingly heated tensions between China and the West. “War” might seem a hyperbolic metaphor for commercial rivalries between Samsung and Micron or Intel and AMD, but when it comes to China’s alleged ambition to supplant the United States as the leading global power, the word takes on a more ominous coloration.

Miller is clear about the importance of semiconductors to China’s global ambitions. In 2017, China imported $260 billion worth of chips, a sum “far larger than Saudi Arabia’s export of oil or Germany’s export of cars. China spends more money buying chips each year than the entire global trade in aircraft.” Chinese leader Xi Jinping exhorted a group of industrialists to take a martial approach to developing the country’s domestic semiconductor industry: “We must assault the fortifications of core technology research and development.” The United States has accused the Chinese telecom firm Huawei of using its technology to spy on communications around the world. And of course, China’s territorial claims on Taiwan raise questions about the vulnerability of TSMC, a crucial node in the production of today’s most advanced chips.

In assessing the national security risks posed by China’s semiconductor ambitions, some analysts seem to have taken Andy Grove’s adage that “only the paranoid survive” at face value. While one former UK intelligence official argued that “we should accept that China will be a global tech power in the future and start managing the risk,” the United States, taking a darker view of China’s aims, has set out to stop China in its tracks by pressuring allies to reject Huawei equipment and by banning the export of certain U.S.-developed technologies to China, most notably through the CHIPS Act of 2022 and the sweeping export controls that followed it.

Such aggressive policies could backfire, however. Miller quotes China tech policy analyst Dan Wang, who argues that American restrictions have “boosted Beijing’s quest for tech dominance” by catalyzing new Chinese government policies in support of the domestic chip industry, including the training of tens of thousands of electrical engineers and condensed-matter physicists. There are good reasons to worry about China’s military ambitions, but it is probably futile to try to halt the spread of technology as though it were a bulk good susceptible to blockade. There are also less aggressive ways to mitigate Chinese threats to the global supply chain: U.S. incentives, for instance, have encouraged TSMC to move some of its operations from Taiwan to Arizona.

Finally, history shows that trying to stymie competitors by impeding the flow of technical information is unlikely to work against an adversary like China, with its large pool of educated workers and substantial capacity to invest in research and development. Remember that Britain tried to monopolize its textile technology in the late eighteenth century, but Samuel Slater, the “father of the American Industrial Revolution,” used his knowledge of British machine designs to develop better technology in his adopted country. The way to compete effectively with China is not to ratchet up bellicose rhetoric about defending Taiwan or to attempt to halt the spread of technical know-how by drafting new CHIPS Acts, but to educate American workers and foster closer cooperation with other countries that have taken the lead in developing key aspects of the semiconductor manufacturing process. The history that Miller recounts demonstrates that what matters most in achieving technological leadership is the free movement of people and ideas, not tariffs, export controls, or paranoid levels of fear. The best counterweight to Chinese military and commercial ambitions is the collective brainpower of the democratic world, not chip embargoes and saber-rattling.

By delving deeply into the technologies, personalities, and firms that have shaped this immense market, Miller obliges us to think anew about the proposed remedies for the ills of global neoliberalism. Yes, some firms, like Intel, Nvidia, ASML, and TSMC, dominate certain pieces of the global chip supply chain, but their hold is tenuous. Andy Grove was right that firms at one moment thriving on the technological frontier can quickly find themselves relegated to irrelevance when the frontier suddenly advances. Nor is the sheer size of these firms, which are often denounced as monopolies, an insuperable barrier to the entry of disruptive competitors. As noted, plentiful sources of unregulated capital emerge whenever huge new markets beckon. Governments, too, are sources of new investment, whether in pursuit of their own national interests or because the strategic situation seems to demand it.

Increased regulation of the movement of capital—an evergreen remedy among neoliberalism’s critics—will therefore not resolve the problem, because innovative firms can tap state aid (as chip firms in Korea, Taiwan, and China have done) or divert revenue streams from popular consumer products (as Sony and Samsung did). Depriving China of advanced semiconductor technology will only hasten its efforts to develop its domestic fabrication capacity. And threatening military action will only heighten its already rampant fears that the West is determined to prevent it from advancing to the next phase of economic development.

Over the past 75 years, the semiconductor industry has demonstrated that humankind makes its greatest strides when it enlists the full diversity of the planet’s talents: people as different as Hungarian refugee Andy Grove, China-born and U.S.-educated TSMC founder Morris Chang, Japanese engineer and son of sake merchants Akio Morita, and Lee Byung-Chul, who started out in Korea exporting dried fish and ended up creating a semiconductor superpower in the form of Samsung—to name just a few of the many individuals who contributed to the semiconductor revolution. The transistor may have been invented in the United States, but we cannot hope to harvest the bulk of its fruits forever. The best way to prevent the chip war from turning into a hot war is to recognize what a boon to humanity the industry’s intensely competitive ethos has been. Progress cannot be thwarted, and it’s a mistake to try to keep the chips flowing by erecting a Maginot Line around TSMC rather than developing second sources in countries around the world. To add the protection of TSMC to the list of reasons for promoting a second Cold War with China would be to fundamentally misjudge the power of transformative technologies to restructure economies and societies and create new world orders.

Arthur Goldhammer is a writer and translator. A senior affiliate of the Center for European Studies at Harvard, he has translated more than 120 books from French, writes widely on culture and politics, and is the author of the novel Shooting War.
