Trade War and The Grand Sino-US Race for the World’s Fastest Supercomputer
In a recent effort to reanimate their YouTube channel, Bridgewater Associates, the world’s largest hedge fund, posted a half-hour interview with their founder and superstar investor Ray Dalio. When asked about the possible perils of investing in China, Dalio determinedly argued that China is as risky a market as any other, saying, “I think Europe is very risky when monetary policy is almost out of gas and we have the political fragmentation and they’re not participating in the technology revolution.”
All other factors aside, it is the technology revolution part that caught our attention, as something that might be at the very core of the current onslaught of incessant sanctions, tariffs and what have you directed at China as part of the United States’ plan to hinder Beijing’s global ambitions. The technology revolution is indeed in full force, with the US and China leading the pack. However, with the latter currently more focused on implementing innovations domestically rather than disseminating them globally, there is arguably but one field where the two behemoth powers inevitably and very openly clash – supercomputers.
As of 2019, China and the US own 43.8% and 23.2% of the world’s 500 fastest supercomputers respectively. Running abreast, both countries are chasing the same ultimate goal of producing the first exascale computer, a machine capable of processing one quintillion – a billion billion – calculations a second, as opposed to the roughly 149 quadrillion (200 quadrillion at peak) – a million billion – calculations per second that the current best supercomputer, IBM’s Summit, can handle.
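For readers who like to see the arithmetic, here is a quick back-of-the-envelope check on those figures. The 149 and 200 quadrillion numbers are the ones cited above; the rest is plain unit conversion:

```python
# Unit arithmetic on the numbers quoted above: one exaflop versus Summit's
# sustained (~149 petaflops) and peak (200 petaflops) performance.

PETA = 10 ** 15   # a quadrillion, i.e. a million billion
EXA = 10 ** 18    # a quintillion, i.e. a billion billion

summit_sustained = 149 * PETA
summit_peak = 200 * PETA
exascale_target = 1 * EXA

print(f"exascale vs Summit sustained: {exascale_target / summit_sustained:.1f}x")  # ~6.7x
print(f"exascale vs Summit peak:      {exascale_target / summit_peak:.1f}x")       # 5.0x
```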
China claims that its newest runner in the race, the prototype exascale computer Shuguang, housed at the Computer Network Information Center of the Chinese Academy of Sciences (CAS) in Beijing, can effortlessly operate at a capacity of 200 petaflops, or 200 quadrillion calculations per second, easily beating the IBM titlist. However, in order to take the edge off the trade war tensions, the country decided to abstain from entering its best supercomputer in the official competition to top the list of the 500 fastest machines in 2019.
Paradoxically, even after this benign gesture, the US Department of Commerce stuck to its guns and imposed new restrictions on tech exports that prevent five Chinese supercomputing entities from using US-made components: Sugon of Beijing, three of its affiliates, and the Wuxi Jiangnan Institute of Computing Technology. The restrictions now bar American chip producers such as Intel, AMD, and Nvidia from selling their products to Chinese manufacturers, many of whom are heavily dependent on American semiconductors.
History
From 1942 to 2009, for nearly 70 years, the US almost exclusively dominated the global supercomputing race with occasional intrusions from the UK and Japan. China was not regarded as a viable competitor.
– In 1958, with the help of Soviet engineers, scientists at the Institute of Military Engineering in Harbin created China’s first vacuum-tube computer, kick-starting the erratic development of the local tech industry.
– In 1960, China broke relations with the Soviet Union, prompting Moscow to withdraw its experts from the country. Around the same time, both the United States and the Soviet Union imposed a technical blockade on China.
– In 1965, the Beijing Institute of Computing Technology independently built the 109C, a computer that could perform at 115 kiloflops, and developed the country’s first integrated circuit, just five years after the US. The best American computer at that time, Control Data Corporation’s 6600, could wring out three megaflops – 3,000 kiloflops. For comparison, the iPhone 7 now operates at roughly 400 gigaflops, or 400 million kiloflops. While insignificant both by current global standards and by US standards of the 1960s, China’s success in building a computer without any outside help laid the foundation for the country’s future technological development.
– In 1968, with the Cultural Revolution raging in the streets and the general crowd slowly moving from melting steel in their backyard furnaces to hounding counterrevolutionaries, Chinese factories started manufacturing integrated circuits.
– In 1970, the Beijing Institute of Computing Technology produced a new computer, the 111, which could operate at 180 kiloflops. The same year, a window handle factory in Shanghai was repurposed to accommodate the Shanghai Computing Research Institute’s needs for the production of an integrated circuit digital computer.
– In 1972, several months after President Nixon’s well-documented visit, a group of U.S. computer scientists was also allowed entry to China. By that time the PRC had already nurtured a domestic industry capable of manufacturing third-generation computers, compact machines running on integrated circuits rather than individual transistors.
– In 1978, after China initiated economic reforms, the domestic computer industry tumbled. Because Sino-US relations were hastily restored, foreign technologies flooded into China. With relatively easy access to overseas innovations, the Chinese lost interest in domestic inventions. At that time, there was a popular saying: “zao buru mai, mai buru zu,” meaning “buying is better than making, renting is better than buying.”
– In 1989, after the US government began to restrict the export of supercomputers to China, dramatically raising their prices, the State Planning Commission, the State Science and Technology Commission and the World Bank launched a joint project to develop supercomputing facilities in the country. The project brought about the construction of network facilities and three supercomputer centers.
– In 2002, China’s best supercomputer placed 43rd in the world; in 2003 it climbed to the 11th spot, and in 2004 it rose even higher, to 10th place.
– In 2010, China’s Tianhe-1 became the fastest supercomputer in the world.
– In June 2011, China lost the lead to Japan and then trailed behind the US until June 2013, when the Guangzhou-based Tianhe-2 set a new record.
– At the end of 2014, the United States announced that it would ban Intel from selling its Xeon chips to China, on the grounds that it presumably had evidence the chips were being used for nuclear simulation calculations. The Intel Xeon was the main processor used in the Tianhe-2.
– In 2015, after the Obama administration imposed sales restrictions on chip producers including Intel, Nvidia, and AMD, China started developing its own supercomputers without US semiconductors.
– In 2016, Sunway TaihuLight, a Chinese supercomputer running on Chinese-designed SW26010 many-core 64-bit RISC processors, led the race.
– As of 2018, Sunway TaihuLight had been outpaced by IBM’s Summit, but China remained dominant in the field with 229 machines in the global Top 500, exceeding the 108 housed by the United States.
– In 2019, China developed the prototype exascale computer Shuguang, unofficially beating IBM’s Summit, but abstained from entering the machine in the official worldwide competition.
Why are supercomputers important?
While an indiscernible presence in our day-to-day lives, supercomputing, or high performance computing (HPC) to put it more scientifically, is the backbone of so many operations that if someone decided to obliterate all supercomputers on Earth at once, our tech-dependent civilization would be in shambles. These gargantuan machines can be applied in pretty much any sphere requiring analysis of massive blocks of data or simulations based on complex and bulky calculations. The following are some of the possible use cases.
Climate prediction and astrophysics
Climate prediction broadly refers to tracking airflows and ocean currents and monitoring trends that could hint at possible natural disasters like earthquakes, tsunamis and others. The basic principle of weather forecasting is to capture cloud and airflow trajectories on the map through meteorological satellites and derive future trends by means of a large number of intricate calculations.
Sadly, even with the current level of technological development and the astounding progress in high performance computing, the accuracy of weather forecasts rarely reaches 80%. Nonetheless, the precision of climate prediction has undeniably improved over the years. Just a few decades ago the best meteorological calculations could only yield approximations covering swathes of land as large as 100 kilometers across. Nowadays meteorological models have reached a granularity of 1-3 km.
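To get a feel for why sharper granularity demands supercomputer-class hardware, here is a rough back-of-the-envelope sketch. It assumes a simple uniform grid over the Earth’s surface and a very crude cost model; only the 100 km and 1 km figures echo the text above, everything else is an illustrative assumption:

```python
# Back-of-the-envelope sketch: why finer forecast grids need far more compute.
# Assumes a uniform grid over the Earth's surface (~510 million km^2); the
# 100 km and 1 km resolutions echo the figures above, the rest is a crude
# illustration rather than a real numerical weather prediction cost model.

EARTH_SURFACE_KM2 = 510_000_000

def grid_cells(resolution_km: float) -> float:
    """Approximate number of surface grid cells at a given horizontal resolution."""
    return EARTH_SURFACE_KM2 / (resolution_km ** 2)

coarse = grid_cells(100)   # roughly the old ~100 km granularity
fine = grid_cells(1)       # roughly today's ~1 km granularity

cell_ratio = fine / coarse   # 10,000x more cells to update
timestep_ratio = 100 / 1     # finer grids also need roughly proportionally shorter time steps

print(f"grid cells: {coarse:,.0f} -> {fine:,.0f} (x{cell_ratio:,.0f})")
print(f"rough overall compute factor: x{cell_ratio * timestep_ratio:,.0f}")
```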
Supercomputers also propel our understanding of space and astronomical objects, providing the capacity needed for simulating events as tremendous as galaxy formation and stellar evolution. Naturally, the potential for scientific breakthroughs has significant value in terms of reputation and strategic competitiveness that neither China nor the US wants to miss out on.
Nuclear explosion simulation and weapon testing
Since live tests of nuclear bombs are out of the question today, countries go to great lengths to make sure their arsenals are fully operational. Supercomputer simulations are used not only to develop new weapons but also to inspect old warheads, some of which have been piling up in stockpiles since World War II.
The nuclear reaction that causes what we call a nuclear explosion is in essence a chain reaction: fission in one atom affects the surrounding atoms, those atoms affect the atoms around them, and so on. Simulating this process requires a ton of computing power, with no upper bound on the capacity a “perfect” simulation could use: the more computational power available, the more accuracy will be attained and the deeper the patterns that will be discovered.
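As a very loose illustration of why the computation balloons, here is a toy branching-process sketch of a chain reaction. The neutron yield and fission probability below are made-up numbers, and real weapons codes model neutron transport, hydrodynamics and materials in three dimensions; none of that is attempted here:

```python
import random

# Toy branching-process model of a fission chain reaction. Each fission releases
# a few neutrons, and each neutron has some chance of triggering another fission,
# so the number of events to simulate can grow explosively generation by generation.
# Both constants below are illustrative, not physical data.

NEUTRONS_PER_FISSION = 2.5   # average neutrons released per fission (illustrative)
P_CAUSE_FISSION = 0.45       # chance a neutron triggers another fission (made up)

def simulate_generations(n_generations: int, seed_fissions: int = 1) -> list:
    """Return the number of fission events in each generation of the chain."""
    counts = [seed_fissions]
    for _ in range(n_generations):
        neutrons = int(counts[-1] * NEUTRONS_PER_FISSION)
        next_fissions = sum(1 for _ in range(neutrons) if random.random() < P_CAUSE_FISSION)
        counts.append(next_fissions)
        if next_fissions == 0:   # the chain fizzled out
            break
    return counts

print(simulate_generations(20))  # event counts per generation: they grow, or die out
```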
It is this macabre side of high performance computing that is arguably one of the cornerstones of the US-China rivalry in the field. While the US, with 6,185 warheads, has only the second biggest nuclear stockpile in the world, behind Russia’s 6,490, China lags in fourth place with a relatively tiny arsenal of 290 warheads. Advanced supercomputing capabilities could bridge that gap by allowing the Middle Kingdom to ensure the maximum efficiency of its nuclear weapons.
Genome sequencing
Decoding an organism’s full DNA sequence is integral to evolutionary biology and is a powerful tool that constantly pushes the whole of medical science forward. However, it is not all microscopes, flasks and colorful potions: a genome is a complex cluster of data that could hardly be analyzed without a supercomputer.
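Some rough arithmetic hints at the scale involved. The genome length and 30x sequencing coverage below are typical textbook figures; the bytes-per-base overhead is a loose assumption for illustration only:

```python
# Rough arithmetic on the data volume behind a single sequenced genome.
# The genome length and 30x coverage are typical figures; bytes-per-base
# (a base call plus a quality score) is a loose assumption for illustration.

GENOME_BASES = 3_200_000_000   # approximate length of the human genome
COVERAGE = 30                  # typical whole-genome sequencing depth
BYTES_PER_BASE = 2             # base call plus quality score, roughly

raw_bytes = GENOME_BASES * COVERAGE * BYTES_PER_BASE
print(f"~{raw_bytes / 1e9:.0f} GB of raw reads for one genome")  # ~192 GB

# Multiply by the thousands of genomes in a population study, plus alignment and
# variant-calling passes that scan this data repeatedly, and cluster-scale
# computing quickly stops being optional.
```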
“With the advancement of technology in the past decade, scientists are challenged to manage this overwhelming amount of unstructured genomics data being generated by a whole universe of academic, clinical, and pharmaceutical research. For example, many organizations are now requiring more advanced data analysis and management for cases such as drug development, identifying the root causes of diseases, and creating personalized treatments in clinical applications,” wrote Gabriel Broner, current director of Amazon Web Services and former VP of the high-performance computing company SGI, in an article for Genetic Engineering & Biotechnology News (GEN).
Unsurprisingly, the two major players in the field of genome sequencers – supercomputers designed for genome sequencing – are yet again China and the US. The most powerful American machine, NovaSeq, was built by the San Diego-based genomics giant Illumina. China’s response, unveiled in 2018, came from BGI, a Shenzhen-based genome sequencing services company and the current leader in the field. The Chinese apparatus is said to be capable of sequencing a whole human genome in just under 24 hours.
Video rendering
Technological advances contribute not only to monumental scientific innovations, but also to very earthly matters like entertainment. Visual effects (VFX) technology has already reached heights where it is barely possible to tell the real from the drawn. Take the latest Lion King remake, which looked more like a National Geographic documentary gone beautifully wrong than a Hollywood film.
Needless to say, progress comes at a cost, and in the case of movie production the cost lies in the terabytes of VFX data that need to be rendered to finish off the work. To tackle the challenge, some of the top production studios have committed to running their own data centers. According to Wired, in 2013 Pixar ran a facility housing 2,000 servers powered by 24,000 cores, which by some estimates put it among the world’s 25 most powerful computers, while Weta Digital, the visual effects house behind Avatar, The Lord of the Rings, The Avengers and other VFX-heavy films, had its various computers certified at Nos. 193-197 on the Top 500 list.
The Chinese film market is deeply hooked on VFX, with epic sci-fi action films as the country’s latest cinematic infatuation. But supercomputers made their way into the Chinese film scene well before that. In fact, the 2016 animated film Little Door Gods was rendered with the help of 2,000 machines belonging to a supercomputing cluster operated by Alibaba Cloud.
High-frequency trading
Theoretically, if a supercomputer could record everyone’s thoughts and behavioral patterns, it could do a good job of simulating future stock trends. But while this dream (or nightmare?) of a Wall Street trader is effectively unattainable, supercomputing has other applications in stock markets.
Supercomputers are fairly often used to scrutinize financial assets for unsustainable price movements, which might point to a growing financial bubble. Furthermore, HPC is also essential for the high-speed trading that now dominates American stock exchanges, where buy and sell orders are processed within the blink of an eye. Supercomputers track trading operations across several markets and send warnings when signs of a possible crash are discovered.
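As a purely illustrative sketch of that kind of monitoring, the toy code below watches a stream of price changes and flags moves that deviate sharply from recent behavior. Real surveillance and trading systems operate at microsecond latencies on specialized hardware; the window size, threshold and market name here are arbitrary assumptions:

```python
from collections import deque
import statistics

# Toy cross-market monitor: track recent price changes per market and flag a
# move that deviates sharply from recent behavior. Window size and threshold
# are arbitrary; real systems work at far higher speeds on dedicated hardware.

WINDOW = 50        # number of recent ticks to remember per market
THRESHOLD = 4.0    # deviation (in standard deviations) that triggers a warning

class MarketMonitor:
    def __init__(self):
        self.history = {}  # market name -> deque of recent percentage changes

    def on_tick(self, market, pct_change):
        """Record a price change; return True if it looks anomalous."""
        window = self.history.setdefault(market, deque(maxlen=WINDOW))
        anomalous = False
        if len(window) >= 10:  # wait for a minimal history before judging
            mean = statistics.fmean(window)
            stdev = statistics.pstdev(window) or 1e-9
            anomalous = abs(pct_change - mean) / stdev > THRESHOLD
        window.append(pct_change)
        return anomalous

monitor = MarketMonitor()
for change in [0.1, -0.2, 0.05, 0.1, -0.1, 0.0, 0.2, -0.05, 0.1, 0.0, -3.5]:
    if monitor.on_tick("EXAMPLE_EXCHANGE", change):
        print(f"warning: unusual move of {change}% detected")
```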
Other
HPC is also actively used in calculations regarding aero-engine materials to ensure that alloys can withstand massive thrust loads and comply with other safety and efficiency requirements. Supercomputers are employed in battery and semiconductor production and in building materials research. They are also an essential component of cloud computing, especially with the advent of streaming, bike sharing, car sharing, apartment sharing and all the other sorts of sharing that require massive amounts of data to be stored on host servers operated by cloud computing vendors. Countless supercomputing centers, and the vast underlying computing power they possess, might as well be the strongest weapons that two major powers like the US and China could realistically put to use in a world where wars are more often fought with data rather than arms.