We’re living in a new era. The post-Cold War period “is definitively over,” President Joe Biden’s latest national security strategy declares, and the world’s major powers are competing to “shape what comes next.” The thing, though, is that the latest geopolitical installment seems to have borrowed plot points from an earlier one: the Cold War itself. Nuclear saber rattling? Check. Communists? Check. Proxy war? Check.
And, increasingly, another dynamic with echoes of the Cold War is playing out. The unfolding sequel includes a full-throttle battle for technological supremacy, in which officials in the United States and China are each racing to undercut the other country’s cutting-edge technical development. The competition, the thinking goes, will influence which country markets new technologies, creates new military capabilities, and defines the regulatory and ethical standards for emerging fields like artificial intelligence (AI).
It’s a “tech war,” or so the media is saying. “Biden is Now All-In on Taking Out China,” Foreign Policy proclaimed. The “US goes on Offensive in Its China Tech War,” Bloomberg wrote. There are many other examples.
The “war” framing underscores how governments see technology research and development (R&D) as a key national security issue. In an area where, by some measures, the United States has been losing ground, the Biden administration is now taking an aggressive stance, investing tens of billions of dollars in domestic high-tech industries while working to thwart China’s technological rise. War is a troubling lens through which to view the US-China competition, though. It risks militarizing economic, scientific, and technological competition—bolstering rationales for “expanded military budgets and brinksmanship,” as one analyst puts it. “War” rhetoric also runs the risk of fostering nationalism, discrimination, and oppression. Given the transformational nature of some emerging technologies, like AI or biotechnology, however, as well as the clashing trajectories of the United States and China, there are arguments for the sense of urgency that a metaphorical war can provide.
Losing ground. By many metrics, the United States appears to be slipping in R&D. Take funding. In the years following World War II, the US government funded about 65 percent of research and development in the United States; today that figure is 24 percent, according to Tech Wars: Transforming U.S. Technology Development, a new book by Daniel Gerstein, a RAND Corporation researcher and former science and technology official in the Department of Homeland Security.
Not only has US government investment in R&D fallen, Gerstein’s book says, but other countries have leapt up in the ranks of R&D funders. In 1960 the United States held a 70 percent share of global research and development. By 2018, the US share stood at 26 percent while China accounted for 21 percent.
Gerstein, a contributor to the Bulletin, believes the US government needs to pay much greater attention to R&D. The risks, he wrote in his book, “for not getting it right continue to grow for the United States and humanity.” There are two massive initiatives that underscore China’s R&D ambitions, Gerstein argues: the Belt and Road Initiative and Made in China 2025. The first is an effort to build infrastructure like railways, ports, and power plants in countries around the world, using Chinese-made supplies and Chinese contractors (and paid for using loans from China). Policymakers in the United States, Japan, and elsewhere, though, have long viewed the plan to bring dozens of countries into a Chinese-run economic sphere with suspicion.
Made in China 2025, meanwhile, is an effort by the Chinese government to have the country dominate several technology sectors, including AI, semiconductors, and biotechnology. In some areas such as communications infrastructure, China has already taken the lead, becoming a powerhouse in the realm of telecom equipment that makes up the framework for the coming generation of handsets.
“We always have competitions; competitions are part of the human experience. This feels a little different,” Gerstein said. “The Chinese, for example, have laid down the gauntlet and talked about the Belt and Road Initiative, which could encompass two-thirds of the world’s population, if they were to achieve what they set out to do. And likewise, [with] Made in China 2025, they’re really sort of established that they’ve selected a number of different technologies that they want to lead in many of which [the United States] is currently the global leader.”
In recent months, the Biden administration seems to have gotten more aggressive in its efforts to preserve the US lead in R&D.
A few days before declaring the post-Cold War era over earlier this month, the administration unveiled what analysts say are debilitating rules aimed at suffocating the Chinese semiconductor industry—the sector that makes the computer chips that could enable the latest advances in AI and supercomputing.
Saying that semiconductor development could aid the Chinese military, the administration issued rules to prevent the export to China of advanced chips, the equipment to make them, and even foreign-made chips that used US technology during the production process. The restrictions also cut off US citizens and permanent residents from participating in the Chinese semiconductor industry, forcing the Chinese government to “reinvent the wheel” in this all-important technical area.
A recently passed bill will invest $52 billion in domestic semiconductor manufacturing and research and development—“one of the largest industrial development programs the federal government has ever administered.” The measure has attracted private sector interest in semiconductors, as well. Gerstein sees the Biden administration sending other such signals through an initiative to invest in biotechnology, including by earmarking $2 billion for “bioeconomy research, development, and infrastructure.”
More than an economic rivalry? For years US officials have worried about Chinese government espionage—and about how US companies were aiding Chinese ambitions, if inadvertently.
While the Chinese telecommunications giant Huawei was pursuing deals to build 5G telecom networks in the United Kingdom and elsewhere, all while striving to become a dominant cellphone company, US presidential administrations responded by barring the company from US networks, preventing US companies like Google from working with Huawei, and pressuring allies to keep Huawei out of their network infrastructure. US officials argued that Huawei, founded by former Chinese army technologist Ren Zhengfei, was an espionage risk, alleging that its equipment had surveillance backdoors. (The US government might know a thing or two about this alleged ruse, considering the CIA reportedly operated Crypto AG in secret—a communications encryption company that sold “weakened” products in Pakistan, India, and a host of other countries for decades.)
More recently, a Washington Post investigation revealed how Chinese military research groups working on hypersonic missile programs have been acquiring US-made software important for hypersonic missile development despite being on an export “blacklist.”
Chinese leaders want China to “become the world’s leading power,” the Biden administration said in its 2022 national security strategy. The country is “using its technological capacity and increasing influence over international institutions to create more permissive conditions for its own authoritarian model,” the document, which calls China “America’s most consequential geopolitical challenge,” said.
Just as the confrontational tenor of the “post-Cold War era” carries risks—like the possibility of actual war between bellicose competitors—so too does technological competition, in ways similar to those of the old Cold War days. “The dilemmas of engaging with a strategic competitor on matters of science—from concerns regarding dual-use technologies and research to industrial espionage, academic exchange, and visas—are continuations of debates that went dormant in 1989,” Brendan Thomas-Noone wrote in a piece for Brookings.
Throughout the decades of the Cold War, the scientific and technological relationships connecting the Soviet Union and the United States waxed and waned. In 1959, for example, a partnership between the US National Academy of Sciences and the Soviet Academy of Sciences led to joint work on energy and aspects of arms control. A decade later, US policy “actively encouraged dual-use technological trade with the Soviet bloc,” according to Thomas-Noone. But by the 1980s, the pendulum had swung in the opposite direction. The Soviets were catching up, Washington officials feared, by exploiting the student exchanges, scientific conferences, and unclassified reports of the previous period of openness. Through these, the Soviets saved “a considerable amount of time and money by pointing out the fruitful avenues of research and development,” a 1981 Pentagon report said. US export policies tightened to the point where officials were scrutinizing not just physical technologies but know-how and relationships among researchers.
More recently, the same type of Cold War-era suspicion has guided US policies toward China. In the Trump administration, the Justice Department’s National Security Division initiated an investigation into economic espionage called the China Initiative that sought to probe academics’ ties to China. Human rights groups called the initiative racial profiling, and the charges brought in the investigations often involved grant fraud, such as researchers failing to disclose funding from a Chinese institution. The Justice Department eventually closed the initiative, acknowledging the problematic racial dimensions of the investigation while pledging that “we will be relentless in defending our country from China.”
During the Cold War, as well, geopolitical tensions impeded scientific partnerships.
Thomas-Noone notes how academics, including the presidents of MIT, Stanford, and several other major universities, pushed back against government restrictions intended to keep certain computer chip technology classified. The restrictions, the presidents wrote, prevented necessary scholarly communication on certain technologies to the point where “faculty could not conduct classroom lectures when foreign students were present.” The government policies, meant to prevent the diffusion of militarily applicable technologies, could also have a “chilling effect” on researchers whose work had a “much broader utility in such other areas as medical systems and communication equipment.”
American suspicions of China involve the country’s military aims—such as the development of hypersonic missiles—as well as its authoritarian use of technology to preserve the power of the Chinese Communist Party. The Chinese government is building a turbo-charged, tech-enabled surveillance system so all-encompassing that for Uyghur minorities in Xinjiang, for instance, growing a beard or using a back door instead of the front entrance can trigger police suspicion. China comes in at the bottom of the nonprofit Freedom House’s ranking of internet freedom. An army of censors scans a closed-off web (walled in by the Great Firewall) for any whiff of discontent, quickly blocking and removing content. “I just wonder, what would it look like if a not-so-open nation had led the development of the internet, and then the standards and the regulations and the policies that govern the internet,” Gerstein said.
And the Chinese government is exporting authoritarian-enabling tech around the world. A 2019 piece in The New York Times reported that 18 countries were using Chinese-made “intelligence monitoring” systems and even more were receiving training in “public opinion guidance,” a.k.a. censorship. In Russia, where criticizing the war in Ukraine can lead to jail time, officials are working to engineer more Chinese-style government control into an already restricted web.
But as Yangyang Cheng, a writer, physicist, and fellow at Yale Law School, wrote in a piece for Wired, the national-security-driven focus on China stealing information or misusing technology—the kind of focus that led to the China Initiative—can obscure ethical questions people should be asking in America. Before US-China relations sank to their current depths, American institutions and scientists were “eager to partner with China,” regardless of the murky ways that officials there made use of scientific and technical advances. A US geneticist, after all, contributed to a police system for genetically tracking Uyghurs. And, of course, police have used controversial technologies like facial recognition in the United States. “Instead of reckoning with global systems of injustice and one’s complicity in them, it’s politically expedient and self-absolving to fixate on alleged threats from a foreign other,” Cheng wrote.
Indeed, there are plenty of skeptics who question the framing of global tech competition as a “war.”
In 2017, Russian President Vladimir Putin told school children that whoever excelled in AI would be the “ruler of the world,” while touting Russia’s technological advances. A few years later, however, his huge and supposedly advanced military would be bogged down, if not in retreat, in next-door Ukraine. The ominous-seeming Belt and Road Initiative also may not be going as planned. Countries aren’t able to pay back their loans, and Chinese officials are looking for ways to scale back.
So is the media rhetoric on tech war just hyperbole? Robert Daly, who analyzes US-China relations and is the director of the Kissinger Institute on China and the United States, has come around to the Cold War II framing, after initially rejecting it as simplistic and dangerous. “The world’s two most powerful nations have embarked, wittingly or otherwise, on a comprehensive competition, including a contest to shape global order,” he wrote in the Bulletin. “Their rivalry includes a burgeoning arms race and expanding nuclear capability. … We are one crisis away from cementing rivalry as the basis of bilateral relations for decades to come.”
There are no “bright spots” in the relationship, Daly wrote, and the two sides can’t seem to work together on climate change, pandemic prevention, and other transnational issues: “If this isn’t a Cold War, the term has no meaning.”
Likewise, Gerstein, the author, grappled with whether to call the global technological race a war, a conflict, or a competition. “The intended use of the term war is to signify the magnitude and urgency of what we have to do,” he said. “Just to treat it like some sort of normal competition seems like it would be undervaluing the importance of doing well in this and ultimately prevailing.”
Debates about whether we are or aren’t in a tech war or even a new Cold War may just be semantics. Either way, we’re in a risky place.