
Google’s Use of Neural Networks to Predict Datacenter Efficiencies

There should be no doubt that the future of IT infrastructure is in the public cloud. Exhibit C: Google has built a neural-network model to predict datacenter efficiency, and it is using the model's predictions to guide infrastructure changes on the raised floor.

An enterprising engineer noticed that the datacenter is full of sensors whose data was being used only for threshold-based monitoring, and decided Google could do something better with it. Building a team of analysts to comb through that data would be slow and costly; training deep-learning models to correlate the data and make predictions is cheaper, more effective, and also more Google-y.
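To make the idea concrete, here is a minimal sketch of the technique the whitepaper describes: regressing a datacenter efficiency metric (PUE) on sensor readings with a small feed-forward neural network. This is a toy illustration, not Google's actual model; the sensor features and the PUE-like target below are fabricated for the example.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic "sensor" data: 500 samples x 5 features (think IT load,
# outside air temperature, chilled-water setpoint, ...), scaled to [0, 1].
X = rng.random((500, 5))
# Fabricated PUE-like target: a nonlinear mix of the features near 1.1.
y = 1.1 + 0.2 * X[:, 0] * X[:, 1] + 0.1 * np.sin(3 * X[:, 2])

def init(n_in, n_hidden, seed=1):
    """Random weights for a one-hidden-layer regression network."""
    r = np.random.default_rng(seed)
    return {
        "W1": r.normal(0, 0.5, (n_in, n_hidden)),
        "b1": np.zeros(n_hidden),
        "W2": r.normal(0, 0.5, (n_hidden, 1)),
        "b2": np.zeros(1),
    }

def forward(p, X):
    h = np.tanh(X @ p["W1"] + p["b1"])          # hidden activations
    return h, (h @ p["W2"] + p["b2"]).ravel()   # predicted PUE

def train(p, X, y, lr=0.1, epochs=500):
    """Full-batch gradient descent on mean squared error."""
    n = len(y)
    for _ in range(epochs):
        h, pred = forward(p, X)
        err = pred - y
        # Backpropagate the error through both layers.
        gW2 = h.T @ err[:, None] / n
        gb2 = err.mean(keepdims=True)
        dh = (err[:, None] @ p["W2"].T) * (1 - h ** 2)
        gW1 = X.T @ dh / n
        gb1 = dh.mean(axis=0)
        for k, g in zip(("W1", "b1", "W2", "b2"), (gW1, gb1, gW2, gb2)):
            p[k] -= lr * g
    return p

params = init(5, 16)
_, before = forward(params, X)
mse_before = np.mean((before - y) ** 2)
params = train(params, X, y)
_, after = forward(params, X)
mse_after = np.mean((after - y) ** 2)
print(f"MSE before: {mse_before:.4f}  after: {mse_after:.4f}")
```

The production model in the paper is of course far larger and trained on real telemetry, but the workflow is the same: collect the sensor streams you already have, pick the efficiency metric you care about, and let the network learn the correlations.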

It’s brilliant work. What’s your long-term strategy for increasing the efficiency of your datacenter? Take a look at the whitepaper linked below for insight into what Jim Gao at Google built. It’s a great read.

Machine Learning Applications for Data Center Optimization


Chris Saunders

I am a Sr. Cloud Solutions Architect for Red Hat.

email chrisb@redhat.com
twitter @canadianchris
