The Department of Energy's Titan supercomputer. Image credit: Oak Ridge National Laboratory
Gregory Scott Jones is a writer specializing in the field of supercomputing. He contributed this article to LiveScience's Expert Voices: Op-Ed & Insights.
Next week, a torch will be passed.
On June 17, the TOP500, a twice-yearly ranking of the world's fastest computers, will be announced in Leipzig, Germany. In all likelihood, the United States will not be No. 1, as it has been for the past year.
The Cray XK7, known as Titan, will likely fall victim to a back-and-forth "arms race" that is becoming all too familiar in supercomputing circles. Titan took over from Sequoia, an American system that earlier usurped Fujitsu's "K" computer in Japan. It is widely speculated that this year China will steal Titan's crown.
The quest among countries to stand up the world's fastest computing machine can be interpreted in different ways. Among high-performance computing's most common criticisms are the cost to build these huge machines and their rather hefty power requirements (often in the megawatts).
But one thing is for sure: The priority investments that developed nations are making in supercomputers are a testament to the machines' increasing relevance in research and development, and in fundamental scientific discovery.
Large-scale simulations are critical to understanding climate change; they bring us closer every day to a better understanding of the beginning and evolution of the universe, and of ourselves; they are instrumental in designing novel materials, the key to breaking many technological bottlenecks; and they are shedding much-needed light on the basic building blocks of matter — just to name a few. Science has the most to gain from the race for these magnificent machines, as does the United States.
"The nation that leads the world in high-performance computing will have an enormous competitive advantage across a broad range of sectors, including national defense, science and medicine, energy production, transmission and distribution, storm weather and climate prediction, finance, commercial product development, and manufacturing," said former Energy Secretary Steven Chu when announcing Titan's top spot in November last year.
And while the U.S. might not be No. 1 as of June 17, its status as the world leader in high-performance computing is still beyond debate, at least for the time being. (The competition is widening, though: Russia, a newcomer to the supercomputing game, recently announced plans to build a 10-petaflop system, potentially the most powerful computer in Europe.) Six months ago, the U.S. had three of the top five systems and 251 of the total 500. But things are changing fast.
Just five years ago, the petascale, the point at which a computer sustains a thousand trillion calculations per second, was the next big thing. Today's systems top out at 20 to 50 times that, and scientists and engineers already have their eye on the exascale, a lofty term for sustained performance a full thousandfold, three orders of magnitude, beyond the petascale.
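The gulf between these scales is easier to grasp with a quick back-of-the-envelope calculation. This sketch uses Titan's publicly reported Linpack figure of roughly 17.59 petaflops; the numbers are illustrative, not official benchmark results.

```python
# Back-of-the-envelope comparison of supercomputing performance scales.
# Titan's ~17.59-petaflop figure is its reported Linpack result (Nov. 2012).

PETAFLOP = 10**15  # a thousand trillion floating-point operations per second
EXAFLOP = 10**18   # a quintillion operations per second

titan_flops = 17.59 * PETAFLOP

# An exascale machine is 1,000x a 1-petaflop system ...
print(EXAFLOP // PETAFLOP)           # -> 1000

# ... and still dozens of times faster than Titan.
print(round(EXAFLOP / titan_flops))  # -> 57
```

In other words, even the fastest machines of 2013 would need to speed up by a factor of roughly fifty to reach the exascale, which is why it remains a long-term target rather than the next milestone.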
The reasons for this expansion are many. Supercomputing is now recognized as the "third pillar" of scientific inquiry, alongside theory and experiment, revolutionizing the way researchers go about asking and answering the great questions. Whereas experiments can be dangerous, expensive or impossible, simulation carries almost no risk and is relatively cost-effective. And with today's top computers approaching 20 to 50 petaflops, with the right models in place, simulation can be extremely precise — so much so that game-changing discoveries are regularly taking place not in laboratories, but in giant, cooled rooms lined with rows of cabinets cranking out data at lightning speed.
In the race for the world's fastest machine, countries are challenging engineers to push the computational envelope with ever more innovation. Today's top machines have advanced computing ecosystems and accelerated architectures that allow for greater peak performance than previous systems with only marginal increases in power consumption.
That innovation, in turn, brings us climate models on local scales, giving policymakers unprecedented tools with which to craft regulation; an ability to build nanodevices atom-by-atom, thus allowing engineers to arrive at optimal design configurations faster than ever; and three-dimensional details of the violent explosions of core-collapse supernovas, the elemental fountains responsible for life as we know it. And again, this is just the tip of the iceberg.
While the large-scale computing and simulation "arms race" has been criticized by some in the past, the increased precision and efficiency of today's most powerful computers should be welcomed by all in the scientific community.
Just as competition drove European innovation in the Renaissance and colonial age, so does today's computing "arms race" drive the art of simulation and the science of, well, everything. The U.S. would be wise to stay in the hunt.
The views expressed are those of the author and do not necessarily reflect the views of the publisher. This article was originally published on LiveScience.com .