Navy and NASA both model the ocean but are building separate supercomputers

Paul McDougall, Editor At Large, InformationWeek

July 30, 2004

3 Min Read

When it comes to supercomputing, one size apparently doesn't fit all. As part of its efforts to solve advanced, mathematically intensive problems such as weather modeling, NASA last week said it would deploy a supercomputer consisting of 20 Silicon Graphics Inc. servers, each containing hundreds of Intel processors. At the same time, the Navy is turning to a system that incorporates hundreds of IBM servers, each with just eight processors, for its own high-end computing needs.

The Navy's system, dubbed Kraken after a mythical sea monster that sinks ships, will run at 20 teraflops--about three times faster than any system now in use by the Department of Defense. Theoretically, NASA's system could run at 60 teraflops, a mark that, if reached, would make it the fastest computer on Earth.

IBM is building Kraken for the Navy's Major Shared Resource Center at the Naval Oceanographic Office at Stennis Space Center in Mississippi. It features 368 IBM eServer p655 systems--each containing eight Power4+ processors--connected so they operate as a single unit. The system, which runs IBM's AIX Unix operating system, is slated to begin operating by September.
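For a rough sense of where the 20-teraflop figure comes from, the back-of-envelope sketch below multiplies Kraken's processor count by an assumed per-processor peak. The 1.7-GHz clock speed and four floating-point operations per cycle are assumptions about the Power4+ chips, not figures supplied by the Navy or IBM.

```python
# Back-of-envelope peak estimate for Kraken (assumed clock speed and flops/cycle).
nodes = 368                 # IBM eServer p655 systems
cpus_per_node = 8           # Power4+ processors per system
clock_hz = 1.7e9            # assumed Power4+ clock speed (1.7 GHz)
flops_per_cycle = 4         # assumed floating-point operations per cycle per processor

peak_flops = nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.1f} teraflops")  # roughly 20 teraflops
```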

Steve Adamec, director of the resource center at Stennis, says he believes the IBM architecture of a large, parallel, clustered environment is best suited for the types of problems the Navy needs to solve. Defense scientists will use the system to, among other things, create finely detailed models of ocean surfaces. "The people who ride in the gray-hulled ships above and below the water will benefit from this," Adamec says.


The Navy's new supercomputing system will provide finely detailed models of the ocean's surface. Photo by U.S. Navy
Kraken will provide the biggest stress test yet for IBM's new pSeries High Performance Switch interconnect fabric, which lets processors communicate across multiple systems. "Our hope is that it's going to significantly reduce latency," Adamec says. Kraken will cost the Defense Department "less than $100 million," he says, declining to be more specific.

NASA opted to band together 20 Linux-based SGI Altix systems--each containing 512 64-bit Intel processors--to build the Space Exploration Simulator at its Ames Research Center. NASA is deploying the machine as part of Project Columbia, a collaborative effort with Intel and SGI to advance the study and exploration of outer space.
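The same back-of-envelope arithmetic roughly matches Columbia's theoretical 60-teraflop figure. The 1.5-GHz clock speed and four floating-point operations per cycle below are assumptions about the 64-bit Intel Itanium-class processors in the Altix systems, not numbers provided by NASA or SGI.

```python
# Back-of-envelope peak estimate for NASA's Columbia system (assumed figures).
altix_nodes = 20            # SGI Altix systems
cpus_per_node = 512         # 64-bit Intel processors per system
clock_hz = 1.5e9            # assumed processor clock speed (1.5 GHz)
flops_per_cycle = 4         # assumed floating-point operations per cycle per processor

peak_flops = altix_nodes * cpus_per_node * clock_hz * flops_per_cycle
print(f"{peak_flops / 1e12:.1f} teraflops")  # roughly 61 teraflops, in line with the quoted 60
```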

Columbia project manager Bill Thigpen says that, in contrast to Kraken's approach of a massively parallel cluster of small servers, he wanted to minimize the number of servers that make up the project's supercomputer. "Getting the performance we need becomes problematic on smaller-node systems," he says. The system cost NASA about $50 million, though full commercial pricing would likely top $200 million, Thigpen says. Among other things, NASA scientists will use the simulator to model characteristics of the ocean that they haven't been able to capture in the past.

That sounds similar to the Navy's intended use of Kraken. Adamec and Thigpen insist the requirements and operational procedures of the Navy and NASA are sufficiently different to justify the separate purchases, although Thigpen notes that the government of late has been pushing for more shared use of expensive computer systems. The federal Office of Management and Budget approved NASA's purchase of the SGI system "on the condition that we offer some of its unique features to other agencies, so we will be doing that," he says.

About the Author

Paul McDougall, Editor At Large, InformationWeek

Paul McDougall is a former editor for InformationWeek.
