Saturday, March 2, 2013

IEEE Members on Supercomputing Predictions, Challenges, thanks to IEEE "The Institute"


A recent conference focused on the future of supercomputing

18 February 2013

Photo: Baran Ozdemir/iStockphoto
What do increased energy efficiency, improved weather forecasting, a better understanding of the universe, and faster discovery of new drugs have in common? They are all advances expected to become commonplace over the next two decades with the help of supercomputing.
“Advances in supercomputing in the coming years will impact all areas that use any computing and simulation,” says Rajeev Thakur, technical program chair of SC12, the annual supercomputing conference cosponsored by the IEEE Computer Society. Last year’s event, held in November in Salt Lake City, featured five days of technical sessions and presentations on all aspects of supercomputers and their use, including architecture, networking, system software, programming environments, performance analysis, algorithms, data storage, visualization, and applications.
The conference tried to answer two key questions: What does the future of supercomputing look like, and what challenges must be overcome to get there?
MYRIAD APPLICATIONS
One area expected to see big changes is energy storage and distribution. “Advances in supercomputing in such areas as materials science will result in batteries with higher capacities at lower cost,” says Thakur, deputy director of the Mathematics and Computer Science Division at Argonne National Laboratory, in Lemont, Ill.
And IEEE Member Bronis de Supinski, who coauthored four papers that were presented at the conference, predicts headway in many other areas related to power. “With supercomputers, we expect to manage the power grid better and significantly improve our ability to predict how much electricity needs to be produced,” says de Supinski. “That can in turn reduce the production of electricity that simply ends up being wasted.”
Other experts have gone as far as to say that by 2027, supercomputing will play a role in the development of nearly limitless clean energy by enabling clean nuclear fusion via thermonuclear reactors that could produce three to four times as much energy as a conventional nuclear power plant without generating radioactive waste.
Supercomputing is also expected to play a big role in better weather prediction. “High-fidelity and higher-resolution simulations will result in more accurate weather predictions and tracking of storms and hurricanes,” says Thakur. That could go a long way toward helping areas in a storm’s path prepare.
Supercomputing’s potential isn’t limited to Earth. “Large-scale cosmological simulations will give us a better understanding of the building blocks of the universe,” Thakur adds. That includes dark matter, dark energy, the geometry of the universe, and why the universe’s expansion rate is accelerating.
Health care is another area expected to see major effects. Currently, the discovery of new medications is a long process, requiring extensive screening of many compounds against isolated biological targets to learn their effects. Faster, more powerful computers will change that. “Advanced simulations in molecular dynamics will greatly accelerate new drug discovery,” Thakur says.
In addition, supercomputers are being used to study the potential of regenerative medicine—the process of regenerating human cells—which could eliminate some diseases, undo the effects of aging, and possibly even extend lives.
CHALLENGES AHEAD
But all these advances rely on the ability to overcome several major technical hurdles.
“Opinions vary on which are the biggest challenges,” says de Supinski. “But many feel that power—and its dissipation—will be the biggest constraint. Power considerations are already changing the kinds of processors available. We are building computer systems today that have a peak power of about 10 megawatts. Simple scaling would lead to 500 MW for an exascale system if built from today’s components. The projected power consumption for an exascale system in 2022 is 100 to 200 MW, based on historical trends for improvements in the energy efficiency of components. And since 1 MW roughly equates to US $1 million in operating costs, that level of power is untenable.”
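To make the arithmetic behind those figures concrete, here is a minimal back-of-the-envelope sketch in Python. The roughly 20-petaflops baseline is an assumption (a round number for a 10-MW-class machine of the SC12 era, chosen so that a 50-fold jump to one exaflops reproduces the quoted 500 MW), and reading the $1 million-per-megawatt figure as an annual cost is likewise an assumption:

    # Back-of-the-envelope check of the quoted power numbers.
    # The 20-PFLOPS baseline and the per-year cost reading are assumptions.

    TODAY_PEAK_PFLOPS = 20        # assumed peak of a ~10 MW system of that era
    TODAY_POWER_MW = 10           # "peak power of about 10 megawatts"
    EXASCALE_PFLOPS = 1000        # 1 exaflops = 1000 petaflops
    COST_PER_MW_YEAR = 1_000_000  # "1 MW roughly equates to US $1 million" (per year assumed)

    scale = EXASCALE_PFLOPS / TODAY_PEAK_PFLOPS   # 50x more compute
    naive_power_mw = scale * TODAY_POWER_MW       # 500 MW, matching the quote
    print(f"Naive component scaling: {naive_power_mw:.0f} MW")

    # Even the projected 100-200 MW range implies an enormous utility bill.
    for projected_mw in (100, 200):
        yearly_cost = projected_mw * COST_PER_MW_YEAR
        print(f"{projected_mw} MW -> about ${yearly_cost / 1e6:.0f} million per year")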
Thakur agrees that power consumption will be a major issue, along with several others. “One of the biggest technical challenges will be minimizing power consumption—how to provide 1000 times higher performance while consuming roughly the same amount of power as today’s supercomputers,” he says. Thakur notes other important issues, which include developing hardware, software, and applications resilient to failure; designing algorithms that scale to systems of such a large size; and programming systems that could have hundreds of millions of cores.
For de Supinski, the biggest challenge will involve memory bandwidth and capacity. “The amount of memory and the bandwidth to memory continue to shrink relative to the computational capability of the systems we’re building,” he says. “We may not be able to use the increased computational capabilities effectively in future systems because of that. Projections show that our typical applications will first be limited by memory bandwidth and then by memory capacity unless we do something very different.”
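A rough way to see the squeeze de Supinski describes is a machine-balance calculation in the style of the roofline model (a standard analysis tool, not one the article itself invokes). The node figures and kernel intensities below are hypothetical, chosen only to illustrate the trend:

    # Roofline-style sketch: a kernel's attainable rate is capped by peak
    # compute or by memory bandwidth, whichever binds first.
    # All numbers here are hypothetical illustrations, not measurements.

    PEAK_GFLOPS = 1000   # hypothetical node: 1 teraflops peak compute
    PEAK_BW_GBS = 100    # hypothetical node: 100 GB/s memory bandwidth

    def attainable_gflops(flops_per_byte):
        """Attainable rate for a kernel with the given arithmetic intensity."""
        return min(PEAK_GFLOPS, PEAK_BW_GBS * flops_per_byte)

    # Low-intensity kernels (e.g., stencil or sparse codes that move many
    # bytes per flop) sit far below the compute peak: bandwidth is the wall.
    for intensity in (0.5, 2.0, 10.0):
        rate = attainable_gflops(intensity)
        bound = "bandwidth-bound" if rate < PEAK_GFLOPS else "compute-bound"
        print(f"{intensity:4.1f} flops/byte -> {rate:6.0f} GFLOPS ({bound})")

As peak compute grows faster than memory bandwidth from one machine generation to the next, the arithmetic intensity a code needs to stay compute-bound keeps rising, which is exactly the projection de Supinski warns about.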
For Thakur, funding will be key. “Overcoming these challenges requires a concerted effort of research and development involving academia, research labs, and vendors,” he says. “This, in turn, is critically dependent on the availability of sustained funding for research and for the acquisition and deployment of the next generation of supercomputers.”