There should be no doubt that the future of IT infrastructure is in public cloud. Exhibit C: Google has built a neural network model to predict datacenter efficiency, and they’re using its predictions to make decisions about infrastructure changes on the raised floor.
An enterprising engineer saw that the datacenter is full of sensors whose data was being used only for threshold-based monitoring, and decided Google could do something better with it. Building a team of analysts to comb through that data would be slow and costly. Training deep learning models to correlate the data and make predictions is cheaper, more effective, and also more Google-y.
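To make the idea concrete, here is a minimal sketch of what "predict efficiency from sensor data" can look like. It is not Google's model: the whitepaper describes a much larger neural network trained on real datacenter telemetry to predict PUE, while the feature names, synthetic data, and scikit-learn regressor below are purely illustrative assumptions.

```python
# Sketch only: a small neural network that predicts datacenter efficiency (PUE)
# from a handful of hypothetical sensor readings. All data here is synthetic.
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Hypothetical sensor features: IT load (kW), outside air temp (C),
# chilled water setpoint (C), number of cooling towers running.
X = rng.uniform(low=[500, -5, 5, 1], high=[5000, 35, 12, 8], size=(2000, 4))

# Synthetic PUE target with noise, just so the example runs end to end.
y = (1.1 + 0.00004 * X[:, 0] + 0.004 * X[:, 1] - 0.002 * X[:, 2]
     + rng.normal(0, 0.01, 2000))

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)

# Scale inputs, then fit a small feed-forward network.
model = make_pipeline(
    StandardScaler(),
    MLPRegressor(hidden_layer_sizes=(50, 50), max_iter=2000, random_state=0),
)
model.fit(X_train, y_train)
print("R^2 on held-out data:", round(model.score(X_test, y_test), 3))

# A trained model can answer "what if" questions, e.g. how predicted PUE
# responds if the chilled water setpoint is raised by one degree.
baseline = np.array([[3000.0, 20.0, 7.0, 4.0]])
adjusted = np.array([[3000.0, 20.0, 8.0, 4.0]])
print("Predicted PUE change:",
      model.predict(adjusted)[0] - model.predict(baseline)[0])
```

The payoff is that last step: once the model fits the sensor data well, operators can simulate changes before touching anything on the floor.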
It’s brilliant work. What’s your long-term strategy for increasing the efficiency of your datacenter? Take a look at the whitepaper linked below for insight into what Jim Gao at Google built. It’s a great read.
Machine Learning Applications for Data Center Optimization
Categories: Private Cloud, Public Cloud, Technology
Chris Saunders
Infrastructure nerd learning to code and be a better human