The relentless doubling of cores per microprocessor chip will result in servers with far more horsepower than software can use, says Gartner.

Antone Gonsalves, Contributor

January 28, 2009

2 Min Read

The relentless doubling of cores per microprocessor chip will drive total processor counts in upcoming generations of servers well beyond levels for which key business software has been engineered, a market research firm said Wednesday.

Technologies that will be affected by this evolution include operating systems, middleware, virtualization tools, and other applications, Gartner said. As a result, companies and other organizations will be faced with "difficult decisions, hurried migrations to new versions and performance challenges."

"Looking at the specifications for these software products, it is clear that many will be challenged to support the hardware configurations possible today and those that will be accelerating in the future," Gartner analyst Carl Claunch said in a statement. "The impact is akin to putting a Ferrari engine in a go-cart; the power may be there, but design mismatches severely limit the ability to exploit it."

On average, the number of processors per chip doubles with each generation, which arrives roughly every two years, Gartner said. The increase comes from some combination of more cores and more hardware threads per core.

For example, this year's 32-socket, high-end server with eight-core chips in each socket would deliver 256 processors. In two years, with 16 processors per socket expected in the market, the machine swells to 512 processors total. Four years from now, the server would host 1,024 processors.
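The arithmetic in that example can be sketched in a few lines of Python (the 32-socket configuration and per-socket counts are taken from the article; the year labels are illustrative, assuming the two-year doubling cadence Gartner describes):

```python
SOCKETS = 32  # high-end server configuration cited in the article

def total_processors(processors_per_socket: int) -> int:
    """Total processors in the box = sockets x processors per socket."""
    return SOCKETS * processors_per_socket

# Processors per socket doubling every two years, per Gartner's projection.
for year, per_socket in [(2009, 8), (2011, 16), (2013, 32)]:
    print(year, total_processors(per_socket))
# 2009 -> 256, 2011 -> 512, 2013 -> 1,024
```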

Gartner said organizations need to pay attention to this evolution because there are limits on the ability of software to make use of all this horsepower.

"Most virtualization software today cannot use all 64 processors, much less the 1,024 of the high-end box, and database software, middleware, and applications all have their own limits on scalability," Claunch said. "There is a real risk that organizations will not be able to use all the processors that are thrust on them in only a few years time."

Gartner points out that software programs have both hard and soft limits on the number of processors they can effectively handle. The hard limit is stated in the product documentation. The soft limit, however, can be uncovered only through word of mouth or real-world experience: owing to characteristics of the software's design, application performance may actually decline as more processors are added.

"Most virtualization software today cannot use all 64 processors, much less the 1,024 of the high-end box, and database software, middleware, and applications all have their own limits on scalability," Claunch said. "There is a real risk that organizations will not be able to use all the processors that are thrust on them in only a few years time."

Additional information is available in the Gartner report "The Impact of Multicore Architectures on Server Scaling," available through the firm's Web site.
