Big Data Software Targets Server Energy Efficiency

Hardware improvements only go so far. Multicore servers need newer software to run most efficiently, says Actian's CTO.

Jeff Bertolucci, Contributor

August 5, 2013

4 Min Read

As data sets grow ever larger and more challenging to manage, enterprises are taking a closer look at the efficiency of their server hardware and software. That's a welcome development, because there's plenty of room for improvement in server efficiency: a 2010 study by the nonprofit environmental group Greening Greater Toronto found that most servers in data centers operate at a shockingly low 4% average utilization.

Ugly, yes, but the good news is that things are getting better. In fact, a number of tech startups are developing high-performance servers that are energy efficient as well.

Calxeda, an Austin, Texas-based company that uses low-power ARM processors in energy-saving servers, recently earned kudos from consulting firm Frost & Sullivan for its EnergyCore Architecture, which could help accelerate the growth of big data applications in the enterprise.

"As enterprises handle petabytes of unstructured and structured data in a single platform, they need highly scalable and sustainable data warehousing that consumes less energy," said Frost & Sullivan senior research analyst Swapnadeep Nayak in a statement.


Penguin Computing is another promising contender in the ARM-based server market. It recently announced that it's developing an efficient cloud storage system with Calxeda and Inktank. Another newcomer, Lopoco, offers low-power servers as well.

The feds are getting involved, too. The U.S. Department of Energy (DOE) last month said it hopes to establish minimum energy-efficiency standards for computer servers, in part because of the growth of big data and cloud-based services.

"The aggregate energy use of servers is significant and rising as cloud computing becomes more ubiquitous. Individuals and enterprises increasingly rely on centralized applications and data storage. Coverage of servers will enable the conservation of energy supplies through both labeling programs and the regulation of server energy efficiency," the DOE proposal states.

Urging hardware manufacturers to make their servers more energy efficient is all well and good, but those efforts often neglect the software side of the equation, said Mike Hoskins, CTO of database software company Actian.

"The gaping, yawning gap here -- and where the real opportunity is -- is to make software more intelligent and efficient, and make to it use the ... hardware you've already bought," Hoskins told InformationWeek in a phone interview.

A large part of the problem, he said, is legacy code that's decades old.

"Hardware advances have been amazing over the last 30 years, and yet an enormous amount of software out there -- legacy software, very famous software -- is frankly 30 or 40 years old and hasn't been updated much," Hoskins added.

"There's a lot of software that's just single-threaded; it doesn't even understand the multicore revolution. It hasn't really made the transition to efficient optimizations around parallelism," he noted. "You can take famous software, stick it on a 16-core machine, punch 'run,' and one core gets really tired and the other 15 do nothing."

Perhaps not surprisingly, Hoskins pointed to Actian's growing stable of software products -- the company recently acquired ParAccel and its massively parallel processing (MPP) database -- as an example of modern software that dramatically increases server utilization on industry-standard hardware.

In recent benchmark data from the nonprofit Transaction Processing Performance Council, the top results in the 100-, 300-, 500- and 1,000-GB categories were posted by systems running Actian's Vectorwise database, Hoskins said.

Actian has been on a buying spree this year. In addition to grabbing ParAccel, it purchased data management and analytics firm Pervasive Software for $162 million in February. Before joining Actian, Hoskins was CTO of Pervasive.



About the Author

Jeff Bertolucci

Contributor

Jeff Bertolucci is a technology journalist in Los Angeles who writes mostly for Kiplinger's Personal Finance, The Saturday Evening Post, and InformationWeek.
