Google's DeepMind AI Cuts Data Center Power Bills

Google is using its DeepMind artificial intelligence to cut the energy used to cool its data centers by up to 40%, improving its overall power usage effectiveness (PUE) by 15%.

Dawn Kawamoto, Associate Editor, Dark Reading

July 20, 2016

3 Min Read


Google has dramatically cut the energy used to cool its data centers, by up to 40%, with the help of its DeepMind artificial intelligence, the company announced Wednesday in a blog post.

Alphabet's Google began using machine learning two years ago to save energy and money at its data centers. Over the past few months, Google added artificial intelligence from its DeepMind research and significantly improved on its results.

Google was able to cut the energy it uses to cool its data centers by up to 40% and, after accounting for non-cooling inefficiencies and electrical losses, improve its overall power usage effectiveness (PUE) by 15%.
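To make the relationship concrete, here is a minimal sketch of how a cut in cooling energy translates into a lower PUE. The energy figures are hypothetical placeholders, not Google's actual breakdown; the real-world improvement depends on how much of a facility's overhead goes to cooling.

```python
# Illustrative only: hypothetical energy figures, not Google's actual breakdown.
# PUE = total facility energy / IT equipment energy (lower is better, 1.0 is ideal).

it_energy = 100.0       # energy delivered to IT equipment (arbitrary units)
cooling_energy = 15.0   # hypothetical cooling overhead
other_overhead = 5.0    # hypothetical lighting, power distribution losses, etc.

def pue(it, cooling, other):
    """Power usage effectiveness: total building energy over IT energy."""
    return (it + cooling + other) / it

before = pue(it_energy, cooling_energy, other_overhead)
after = pue(it_energy, cooling_energy * (1 - 0.40), other_overhead)  # 40% cooling cut

print(f"PUE before: {before:.2f}")  # 1.20
print(f"PUE after:  {after:.2f}")   # 1.14
```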

For Google, these energy reduction results could bode well should it decide to turn the technology into a source of revenue. The company may already be thinking along those lines, since it noted other potential applications for the work in its blog:

Because the algorithm is a general-purpose framework to understand complex dynamics, we plan to apply this to other challenges in the data center environment and beyond in the coming months. Possible applications of this technology include improving power plant conversion efficiency (getting more energy from the same unit of input), reducing semiconductor manufacturing energy and water usage, or helping manufacturing facilities increase throughput.

Google plans to roll out its energy conservation system more broadly. It will reveal the details of how it was done in an upcoming publication, the blog noted. 

Although Google has been using machine learning to squeeze more energy efficiency out of its data centers since 2014, it said its efforts to reduce data center energy usage stretch back a decade, from investing in green energy resources to building its own efficient servers. The ultimate goal for Google is to power its data centers completely with renewable energy.

[See Google I/O 2016: AI, VR Get Day in the Sun.]

Google noted that its electricity consumption in 2014 was 4,402,836 MWh. That is equivalent to powering 366,903 homes in the US, based on average annual household consumption, according to a Bloomberg report. Even a 10% energy savings at its data centers could potentially yield hundreds of millions of dollars in savings for Google over a number of years, given what electricity costs per MWh in the US, Bloomberg estimated.
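As a rough sanity check on those figures, the arithmetic below works out the implied per-home consumption and what a 10% savings might be worth. The electricity price is an assumed placeholder, not a number from the article, and the 2014 total covers all of Google's electricity use, not just its data centers.

```python
# Back-of-the-envelope check of the Bloomberg figures. The electricity price
# below is an assumed placeholder, not a number from the article.

total_mwh_2014 = 4_402_836      # Google's reported 2014 electricity use (MWh)
us_homes_equivalent = 366_903   # homes powered, per the Bloomberg report

print(f"Implied average home use: {total_mwh_2014 / us_homes_equivalent:.1f} MWh/year")
# -> roughly 12 MWh per home per year

assumed_price_per_mwh = 70      # hypothetical US electricity price, $/MWh
savings_fraction = 0.10         # the 10% savings scenario cited in the article

annual_savings = total_mwh_2014 * savings_fraction * assumed_price_per_mwh
print(f"Annual savings at that price: ${annual_savings / 1e6:.0f}M")
print(f"Over ten years: ${annual_savings * 10 / 1e6:.0f}M")
```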

In outlining some of the details of how it achieved the savings, Google stated in its blog:

Using a system of neural networks trained on different operating scenarios and parameters within our data centers, we created a more efficient and adaptive framework to understand data center dynamics and optimize efficiency.

We accomplished this by taking the historical data that had already been collected by thousands of sensors within the data center -- data such as temperatures, power, pump speeds, setpoints, etc. -- and using it to train an ensemble of deep neural networks. Since our objective was to improve data center energy efficiency, we trained the neural networks on the average future PUE (Power Usage Effectiveness), which is defined as the ratio of the total building energy usage to the IT energy usage. We then trained two additional ensembles of deep neural networks to predict the future temperature and pressure of the data center over the next hour. The purpose of these predictions is to simulate the recommended actions from the PUE model, to ensure that we do not go beyond any operating constraints.

We tested our model by deploying on a live data center.
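For readers who want a feel for the general shape of that setup, the sketch below trains small ensembles of neural networks on synthetic sensor data to predict future PUE, temperature, and pressure, and uses the latter two to screen a candidate action against operating constraints. It is a minimal illustration assuming scikit-learn, with placeholder feature counts, network sizes, and limits; it is not Google's implementation.

```python
# Minimal sketch of the described setup: ensembles of neural networks trained
# on historical sensor readings to predict future PUE, temperature, and
# pressure. All data, feature counts, and limits here are synthetic placeholders.
import numpy as np
from sklearn.neural_network import MLPRegressor

rng = np.random.default_rng(0)

# Stand-in for historical sensor data (temperatures, power, pump speeds,
# setpoints, ...), one flattened feature vector per time step.
n_samples, n_features = 2_000, 20
X = rng.normal(size=(n_samples, n_features))

# Stand-in targets: average future PUE, plus next-hour temperature and pressure.
w1, w2, w3 = (rng.normal(size=n_features) for _ in range(3))
y_pue = 1.12 + 0.05 * np.tanh(X @ w1) + 0.01 * rng.normal(size=n_samples)
y_temp = 22.0 + 0.5 * (X @ w2)
y_pressure = 101.0 + 0.2 * (X @ w3)

def train_ensemble(X, y, n_models=5):
    """Train several small neural nets on bootstrap resamples and average them."""
    models = []
    for seed in range(n_models):
        idx = rng.integers(0, len(X), size=len(X))  # bootstrap resample
        net = MLPRegressor(hidden_layer_sizes=(32, 32), max_iter=300, random_state=seed)
        net.fit(X[idx], y[idx])
        models.append(net)
    return models

def ensemble_predict(models, X):
    return np.mean([m.predict(X) for m in models], axis=0)

pue_models = train_ensemble(X, y_pue)
temp_models = train_ensemble(X, y_temp)
pressure_models = train_ensemble(X, y_pressure)

# Screen a candidate control setting: the temperature and pressure predictions
# simulate the effect of the recommended action, which is only acceptable if
# it stays within (hypothetical) operating constraints.
candidate = rng.normal(size=(1, n_features))
pred_pue = ensemble_predict(pue_models, candidate)[0]
pred_temp = ensemble_predict(temp_models, candidate)[0]
pred_pressure = ensemble_predict(pressure_models, candidate)[0]

TEMP_LIMIT, PRESSURE_LIMIT = 27.0, 103.0  # hypothetical constraints
if pred_temp <= TEMP_LIMIT and pred_pressure <= PRESSURE_LIMIT:
    print(f"Candidate acceptable; predicted PUE {pred_pue:.3f}")
else:
    print("Candidate rejected: predicted values exceed operating constraints")
```

In the real system, candidate actions would presumably come from an optimization step over the PUE model's recommendations; here a random candidate simply stands in for that step.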

Google said it plans to share this information in greater detail with other data center and industrial system operators, so that they can benefit as well and help reduce the world's overall carbon footprint.

About the Author(s)

Dawn Kawamoto

Associate Editor, Dark Reading

Dawn Kawamoto is an Associate Editor for Dark Reading, where she covers cybersecurity news and trends. She is an award-winning journalist who has written and edited technology, management, leadership, career, finance, and innovation stories for such publications as CNET's News.com, TheStreet.com, AOL's DailyFinance, and The Motley Fool. More recently, she served as associate editor for technology careers site Dice.com.
