Supercomputers: New Software Needed - InformationWeek



Next hurdle for high-performance computing is figuring out how to handle unstructured data.

Supercomputing, in the broadest sense, is about finding the perfect combination of speed and power, even as the definition of perfection shifts with each advance in technology. But the single biggest challenge in high-performance computing (HPC) is now on the software side: creating code that can keep up with the processors.

"As you go back and try to adapt legacy codes to modern architecture, there's a lot of baggage that comes along," said Mike Papka, director of the Argonne Leadership Computing Facility and deputy associate laboratory director for computing, environment and life sciences at Argonne National Laboratory. "It's not clear to me what the path forward is … [the Department of Energy] is very interested in a modern approach to programming, what applications look like."


Much attention has been given to rating the speed of supercomputers. Twice a year, the top 500 supercomputers are ranked by processing speed, most recently in November, when the Tianhe-2 (Milky Way-2) supercomputer at China's National University of Defense Technology achieved a benchmark speed of 33.86 petaflops (Pflop/s). Titan, a Cray supercomputer operated by Oak Ridge National Laboratory that topped the list in June 2012, came in second at 17.59 Pflop/s.

The next level is exascale computing: machines capable of a million trillion (10^18) calculations per second, or one exaflop. HPC may reach that level by 2020, Papka said, but before then -- perhaps in the 2017-2018 timeframe -- the next generation of supercomputers may reach 400 Pflop/s.
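To put these benchmark figures on one scale, here is a quick illustrative calculation (my arithmetic, not from the article; the speeds are those reported above):

```python
# Illustrative arithmetic on the benchmark figures above. The unit constants
# are standard SI prefixes: a flop/s is one floating-point operation per second.
PFLOPS = 10**15  # one petaflop/s
EFLOPS = 10**18  # one exaflop/s: a million trillion operations per second

tianhe2 = 33.86 * PFLOPS  # Tianhe-2, No. 1 in November 2013
titan = 17.59 * PFLOPS    # Titan, No. 2
target = 400 * PFLOPS     # the 400-Pflop/s machine Papka describes

# Exascale is roughly a 30x jump from the 2013 leader.
print(EFLOPS / tianhe2)   # ~29.5
print(tianhe2 / titan)    # Tianhe-2's lead over Titan, ~1.9x
```

In other words, even the proposed 400-Pflop/s machine would cover less than half the distance from Tianhe-2 to an exaflop.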

"If all the stars aligned, the money's there, and developers had the resources [by] combining Oak Ridge and Argonne, we have made the case that the scientific community needs a 400-petaflop machine," Papka said. "Vendors have work to do, labs have infrastructure to put in place -- heating, cooling, floor space. It's not just buying machines any more, you've got to have the software [and] applications in place."

One of the challenges in building faster supercomputers is designing an operating system capable of handling that many calculations per second. Argonne, in collaboration with two other national laboratories, is working on one such operating system, a project called Argo.

Tony Celeste, director of federal sales at Brocade, said another emerging trend in HPC is a growing awareness of its applicability to other IT developments, such as big data and analytics. "There are a number of emerging applications in those areas," he said. "Software now, networks in particular, have to move vast amounts of data around. The traffic pattern has changed; there's a lot of communication going on between servers, and between servers and supercomputers ... It's changing what supercomputing was 10, 15 years ago."

Other important trends Celeste identified include an emphasis on open rather than proprietary systems and a growing awareness of energy efficiency as a requirement.

Patrick Dreher, chief scientist in the HPC technologies group at DRC, said the growing interest in HPC outside the circles of fundamental scientific research is driven by "demand for better, more accurate, more detailed computational simulations across the spectrum of science and engineering. It's a very cost-effective way to design products, research things, and much cheaper and faster than building prototypes."

Dreher's colleague, Rajiv Bendale, director of DRC's science and technology division, said the HPC community's emphasis is shifting a little away from the speed/power paradigm and toward addressing software challenges. "What matters is not acquiring the iron, but being able to run code that matters," Bendale said. "Rather than increasing the push to parallelize codes, the effort is on efficient use of codes."
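Bendale's point about efficiency mattering more than raw parallelization echoes a classic result, Amdahl's law: the speedup of a parallel code is capped by its serial fraction, no matter how many processors the machine has. A minimal sketch (my illustration, not from the article):

```python
def amdahl_speedup(parallel_fraction: float, n_cores: int) -> float:
    """Ideal speedup when `parallel_fraction` of the runtime can use n_cores."""
    serial = 1.0 - parallel_fraction
    return 1.0 / (serial + parallel_fraction / n_cores)

# A code that is 95% parallel tops out near 20x speedup, no matter how many
# cores the machine has -- efficient code matters as much as the iron.
for cores in (16, 1024, 1_000_000):
    print(cores, round(amdahl_speedup(0.95, cores), 1))
```

The diminishing returns are why shrinking a code's serial portion, or simply using it more efficiently, can pay off more than chasing additional cores.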


User Rank: Ninja
1/2/2014 | 6:26:05 PM
Re: Efficiency
Most probably, liquid cooling would be the standard, as HPC systems would generate heat that air cooling could not disperse. I feel centralization is the one goal that would cause a unit to opt for an HPC system rather than perform the same computation in the cloud, and if centralization is very important to the operation, then cost does not matter.
User Rank: Ninja
1/1/2014 | 6:06:13 PM
No need of a new OS
How come nobody thought of using Windows 8.1 or its successor? Ah, no, the article mentions the emphasis on open systems, not proprietary ones. My bad. Windows is a no-go then.
User Rank: Ninja
1/1/2014 | 10:46:42 AM
I can see how energy efficiency would be a factor here. As computing power increases on these machines, they sap more power. I'm also interested in hearing about how the cooling systems work with these machines. I would think that liquid cooling would be the standard. Thoughts anyone?