http://blog.wired.com/wiredscience/2008/11/supercomputers.html
Supercomputers Break Petaflop Barrier, Transforming Science
By Betsy Mason
November 18, 2008 | 8:24:15 PM
Categories: Science
A new crop of supercomputers is breaking down the petaflop speed barrier, pushing high-performance computing into a new realm that could change science more profoundly than at any time since Galileo, leading researchers say.
When the Top 500 list of the world's fastest supercomputers was announced at the international supercomputing conference in Austin, Texas, on Monday, IBM had barely managed to cling to the top spot, fending off a challenge from Cray. But both competitors broke petaflop speeds, performing 1.105 and 1.059 quadrillion floating-point calculations per second, making them the first two computers ever to do so.
These computers aren't just faster than the machines they pushed further down the list; they will enable a new class of science that wasn't possible before.
As recently described in Wired magazine, these massive number crunchers will push simulation to the forefront of science.
Scientists will be able to run new and vastly more accurate models of complex phenomena: Climate models will have dramatically higher resolution and accuracy, new materials for efficient energy transmission will be developed and simulations of scramjet engines will reach a new level of complexity.
"The scientific method has changed for the first time since Galileo invented the telescope (in 1509)," said computer scientist Mark Seager of
Lawrence Livermore National Laboratory.
Supercomputing has made huge advances over the last decade or so, gradually packing on the ability to handle more and more data points in increasingly complex ways. It has enabled scientists to test theories, design experiments and predict outcomes as never before. But now, the new class of petaflop-scale machines is poised to bring about major qualitative changes in the way science is done.
"The new capability allows you to do fundamentally new physics and tackle new problems," said Thomas Zacharia, who heads up computer science at
Oak Ridge National Laboratory in Tennessee, home of the second place
Cray XT5 Jaguar supercomputer. "And it will accelerate the transition from basic research to applied technology."
Breaking the petaflop barrier, a feat that seemed astronomical just two years ago, won't just allow faster computations. This new generation of petascale machines will move scientific simulation beyond merely supporting the two established branches of science, theory and experimentation, and into the foreground. Instead of hypotheses simply being tested with experiments and observations, large-scale extrapolation and prediction of things we can't observe, or that would be impractical to test experimentally, will become central to many scientific endeavors.
"It's getting to the point where simulation is actually the third branch of science," Seager said. "We say that nature is always the arbiter of truth, but it turns out our ability to observe nature is fundamentally limited."
Climate modeling is one area that is ripe for a boost. In the past couple of years, the general public has come around to the idea that climate change is real, and scientists are moving on to its potential impacts, how we might adapt, and the technology that will help us cope. To do any of this in a meaningful way, the predictive models need much higher resolution and precision.
"These kinds of questions require much higher fidelity than we had before," Zacharia said. "Very important decisions are going to be made by policy makers based on this science."
Currently, the state of Tennessee, which is more than 400 miles long, is represented by only two pixels in most global climate models. The new computers will drastically increase resolution, in both space and time, and improve accuracy.
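To make the resolution point concrete, here is a rough back-of-envelope sketch. The scaling assumptions are mine, not the article's: a coarse model resolves 400-mile-long Tennessee with two roughly 200-mile cells, and halving the cell size costs about 8x, since both horizontal dimensions refine by 2x and the timestep must also shrink by 2x for numerical stability (the CFL condition).

```python
# Back-of-envelope sketch (assumptions mine, not the article's): why
# finer climate grids demand petaflop machines. Cost is modeled as
# (refinement factor)^3: 2x in each horizontal direction plus a 2x
# shorter timestep required for numerical stability.

BASE_CELL_MILES = 200   # ~two cells across 400-mile-long Tennessee
BASE_COST = 1.0         # one run at 200-mile resolution, arbitrary units

for target_miles in (100, 50, 25):
    refinement = BASE_CELL_MILES / target_miles
    cost = BASE_COST * refinement ** 3
    print(f"{target_miles:>3}-mile cells: ~{cost:,.0f}x the compute")
# Prints: 100-mile -> ~8x, 50-mile -> ~64x, 25-mile -> ~512x
```

Under those assumptions, even a modest jump to 25-mile cells eats a 500-fold speed increase, which is why petaflops buy resolution rather than luxury.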
In the race to achieve this promise, Oak Ridge had made a push to top the speed list this year with its Cray XT5 Jaguar, but Los Alamos National Laboratory in New Mexico tweaked its IBM Roadrunner to squeeze out just enough extra performance to keep the crown. Both machines more than doubled the performance of Livermore Lab's IBM BlueGene/L, which led the pack a year ago.
Though there may be disappointment in Oak Ridge over losing by a nose, the lab also has the eighth-fastest computer, a smaller version of the Jaguar. When the two are combined in the next few weeks, Jaguar will boost the lab's total capability to around 1.6 petaflops. The same one-acre room houses the 15th-fastest computer, and Oak Ridge is in the process of assembling yet another supercomputer for the National Science Foundation. All told, the lab could reach 2.5 petaflops.
Speeds and Feeds: Oak Ridge's Jaguar Supercomputer
- Type: The Cray XT is a distributed-memory, massively parallel MIMD supercomputer.
- What Is It: A petaflop computer can process one quadrillion floating-point calculations per second. That's 1,000,000,000,000,000 calculations every second.
- Processors: 182,000 processing cores (quad-core AMD Opterons), running at 2.3 gigahertz.
- Memory Capacity: 362 terabytes of memory (with 578 terabytes per second of memory bandwidth).
- OS: Cray XTs run UNICOS/lc, a flavor of Unix with networking and file-system enhancements from BSD.
- Recent History: The peak performance of a supercomputer has more than doubled in the last year, from 0.5 petaflops to the current high of 1.1.
- The Future: Raymond Kurzweil estimates the human brain's computing power at about 10 petaflops. By his reckoning, we should equal the brain's raw calculating power in less than 7 years (a quick arithmetic check follows this list).
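The listed figures hang together arithmetically. Below is a quick sanity check; the 4-flops-per-cycle figure for this Opteron generation and the two-year doubling period are my assumptions, not claims from the article.

```python
import math

# Rough sanity checks on the figures above. Two assumptions (mine, not
# the article's): each Opteron core retires 4 double-precision flops per
# clock cycle, and top machine speed keeps doubling about every 2 years.

cores = 182_000
clock_hz = 2.3e9          # 2.3 GHz
flops_per_cycle = 4       # assumed for this Opteron generation

peak = cores * clock_hz * flops_per_cycle
print(f"Theoretical peak: {peak / 1e15:.2f} petaflops")   # ~1.67 PF

measured = 1.059e15       # Jaguar's measured Linpack speed, from above
print(f"Linpack efficiency: {measured / peak:.0%}")       # ~63%

# Kurzweil's brain estimate is ~10 petaflops; today's high is ~1.1.
years = 2 * math.log2(10 / 1.1)   # doublings needed, times 2 years each
print(f"Years to 10 petaflops at that pace: {years:.1f}")  # ~6.4, under 7
```

The gap between the ~1.67-petaflop theoretical peak and the 1.059-petaflop measured speed is normal: Linpack never achieves the hardware's paper maximum, and the "less than 7 years" projection falls out directly from a two-year doubling period.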
But it's not just about the speed.
"This is not an Olympic sprint where somebody gets a medal at the end," Zacharia said. "That's not the point."
The Jaguar was designed to be optimal for science. Oak Ridge surveyed scientists in many fields including energy, climate and combustion, and built the computer to suit their needs. It has three times the memory capacity of any other computer, Zacharia said -- 362 terabytes of memory.
The designers paid special attention to making the transition to Jaguar as easy as possible for scientists, allowing them to use applications they have already developed instead of spending years coding new ones to suit the computer.
"I believe we have the best, most capable computer in the world for science," he said.
Only fully assembled in early September, nine months ahead of schedule, Jaguar has already helped scientists who have been eagerly anticipating the petaflop capability.
"The past six weeks we have already run many of the scientific applications people have been waiting for for a long time," Zacharia said.
Jaguar and its peers, which will undoubtedly be multiplying in the coming years -- Livermore Lab is currently assembling a petaflop computer that will join the club in 2011 -- promise to take some scientific fields to the next level by enabling far more complex simulations. This in turn will inspire scientists to imagine new questions, which will in turn need even bigger supercomputers to answer.
"That's exactly how science thrives on these big facilities," Zacharia said. "Any fundamentally new science facility captivates and drives the imaginations of scientists worldwide."
Jaguar's power will be unleashed on scientific problems including drug discovery, photovoltaics and new materials. A single simulation will be able to handle every aspect of a complex problem, such as the performance of a scramjet engine: the airflow around it, its internal combustion, the strength of its materials, and the effects of intense heat and aerodynamic forces.
"With the advent of petaflop computing, it's possible to do this simulation," said Seager, who is collaborating with scientists at other labs and universities to do just that.
Today's computer scientists can barely contain their excitement as they imagine what is now possible.
"It's very exciting to be alive today and doing computer science," Seager said. "Now we can do some spectacular things."
----------------------------------------------
My 2 Cents:
It's really not about speed any more. If we could do convincing AI, for instance, it'd still be convincing even if it took hours to process a single request. The trouble is that we can't model a lot of things at all, not even very slowly, because we have no idea how to even approach those problems.
This sort of thing is really only good for solving a handful of easily parallelized O(c^n) problems and pushing lots of pixels around. Yeah, you can factor big numbers too, but who gives a shit? This is pretty neat, but it's just not that interesting anymore. Until we can come up with an entirely new way to describe complex problems, Moore's law will ensure that (good) developer time is always more expensive than computer time, and Kurzweil's singularity will remain a pipe dream. It doesn't matter if we can calculate as much as the brain if we can't calculate as well as the brain.
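For what it's worth, Amdahl's law puts a number on that "easily parallelized" caveat. A quick sketch, borrowing the core count from Jaguar's spec list above:

```python
# Amdahl's law: even a tiny serial fraction caps the achievable speedup,
# no matter how many cores you throw at the problem.

def amdahl_speedup(serial_fraction: float, cores: int) -> float:
    return 1.0 / (serial_fraction + (1.0 - serial_fraction) / cores)

for serial in (0.0001, 0.01, 0.1):
    s = amdahl_speedup(serial, 182_000)
    print(f"serial fraction {serial:>6}: max speedup ~{s:,.0f}x")
# 0.0001 -> ~9,500x; 0.01 -> ~100x; 0.1 -> ~10x on 182,000 cores
```

A workload that's just 1 percent serial tops out around 100x on 182,000 cores, which is why these machines only shine on problems that decompose almost perfectly.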