Simulation School for Our Young Machine Learners

22 August, 2018

As humans, we are hurtling towards a not-so-distant future entirely supported and optimized by data. Whether it’s our growing network of smart connected devices and wearables, a transport system of autonomous vehicles or a world built around Virtual and Augmented Reality, data is the driving force behind making these technologies a reality. As the volumes of data spiral out of control, we’ve turned to machines to analyze, understand and make informed decisions about how we should react next. The data center is no different.

The modern data center generates an almost unlimited stream of data about its performance. Whether it’s processor power, chip temperature, chilled water valve positions or cold aisle humidity, the abundance of monitored data can be overwhelming. Furthermore, the data center is a highly complex, finely balanced environment, where small changes can have a massive impact on the entire space.

So, it makes sense that many companies are leveraging machines to utilize this vast amount of data for operational decision-making. The future data center is one that is automated, self-aware and self-controlling. However, the key to success will be how well the algorithms making the decisions in the background are designed and implemented. Whether these are the next generation of cooling controls or algorithms that manage power and application deployment within your virtualized infrastructure, the future data center will utilize artificial intelligence (AI) and machine learning (ML) to ensure optimum performance.

The backbone of any AI system is the training set that drives its algorithms. Recommendation systems designed and optimized by Netflix, Amazon and Google are all based on training sets created by collecting large data volumes, mapped to user behavior over long periods. However, there are several challenges in employing a similar approach for data centers. For data centers that have yet to be built, there is no ‘real’ measured data to analyze. In operational data centers, the system must make changes to a live production environment in order to learn, which adds risk to operations. So, how do you collect training set data without turning your production data center into an AI lab?

The answer lies in simulation, which offers a digital clone of the data center that can assess the impact of any change on its performance. With a simulation model, rich training sets can be created by varying a plethora of variables. Let’s imagine you want to predict the best location for a piece of IT equipment in a rack based on available power, space, cooling, networking and so on. This may seem simple, but the answer depends on several variables, including:

  • Temperature of the rack over time
  • Airflow at the rack
  • Location 
  • Type of rack
  • Types of IT installed
  • Number of changes to IT over time
  • Type of floor grille installed
  • Control systems

The ability to provide the AI algorithms with data sets that link all of these variables is extremely valuable, and it is risk free compared to making the same changes in a live production environment.
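
To make this concrete, here is a minimal Python sketch of how a simulation-driven training set might be assembled. It is illustrative only: run_cfd_simulation is a hypothetical stand-in for a call into a CFD tool such as 6SigmaDCX, and the variable names simply mirror the list above.

```python
# A sketch of building a training set from repeated simulation runs.
import itertools
import random

RACK_LOCATIONS = ["A1", "A2", "B1", "B2"]       # candidate rack positions
IT_TYPES = ["1U-server", "blade-chassis", "storage-array"]
GRILLE_TYPES = ["25%-open", "56%-open", "damper-controlled"]

def run_cfd_simulation(location, it_type, grille, airflow_cfm):
    """Hypothetical wrapper around a CFD solver run.

    In practice this would configure and solve a calibrated model of
    the room, then return the simulated rack inlet temperature (C).
    The placeholder below just returns a plausible random value."""
    return 18.0 + random.uniform(0.0, 12.0)

training_set = []
for location, it_type, grille in itertools.product(
        RACK_LOCATIONS, IT_TYPES, GRILLE_TYPES):
    airflow_cfm = random.uniform(400.0, 900.0)  # vary supply airflow too
    training_set.append({
        "location": location,
        "it_type": it_type,
        "grille": grille,
        "airflow_cfm": airflow_cfm,
        # The simulated outcome becomes the label the learner predicts.
        "max_inlet_temp_c": run_cfd_simulation(
            location, it_type, grille, airflow_cfm),
    })

print(f"Generated {len(training_set)} labelled samples")
```

Each run of the model produces one labelled sample, so sweeping the variables above yields exactly the kind of broad, risk-free training set that a live facility could never safely provide.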

Research into co-simulation between 6SigmaDCX and these algorithms is gaining traction. A team from Nanyang Technological University, Singapore, led by Prof. Yonggang Wen at the School of Computer Science and Engineering, has been working with Future Facilities to develop an automated CFD model calibration method. The aim of this project is to help train the AI component for application deployment, and to improve the model’s accuracy when assumptions have been made about its inputs. As the team explains:

“…the CFD model can be used to combine with AI algorithms like reinforcement learning to make AI more robust. Model-based Reinforcement Learning is becoming more and more popular, as the training cost with a real DC is almost unacceptable in practice. An AI control agent can be trained based on the CFD model’s generated data to cover more situations than collecting data from a real physical environment.”
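
To illustrate the idea in that quote, below is a minimal, self-contained Python sketch of model-based reinforcement learning. Everything in it is an assumption for illustration: simulated_step is a hypothetical surrogate for a CFD-derived model, and the simple Q-learning agent is a generic stand-in, not the method developed by Future Facilities or NTU.

```python
# A sketch of training a cooling-control agent entirely inside a
# simulated environment, so no experiments touch the live facility.
import random
from collections import defaultdict

SETPOINTS = [16.0, 18.0, 20.0, 22.0]   # candidate supply-air setpoints (C)
ALPHA, GAMMA, EPSILON = 0.1, 0.9, 0.2  # learning rate, discount, exploration

def simulated_step(rack_temp, setpoint):
    """Hypothetical surrogate for a CFD model: returns the next rack
    inlet temperature and a reward that penalises both overheating
    (distance from a 24 C target) and over-cooling (wasted energy)."""
    next_temp = 0.7 * rack_temp + 0.3 * (setpoint + random.uniform(2.0, 5.0))
    reward = -abs(next_temp - 24.0) - 0.1 * (24.0 - setpoint)
    return next_temp, reward

def bucket(temp):
    """Discretise temperature into 1-degree bins for the Q-table."""
    return round(temp)

q_table = defaultdict(float)

for _ in range(500):                    # training episodes, all simulated
    temp = random.uniform(20.0, 30.0)
    for _ in range(50):                 # control steps per episode
        state = bucket(temp)
        if random.random() < EPSILON:   # explore a random setpoint
            action = random.choice(SETPOINTS)
        else:                           # exploit the best known setpoint
            action = max(SETPOINTS, key=lambda a: q_table[(state, a)])
        next_temp, reward = simulated_step(temp, action)
        best_next = max(q_table[(bucket(next_temp), a)] for a in SETPOINTS)
        q_table[(state, action)] += ALPHA * (
            reward + GAMMA * best_next - q_table[(state, action)])
        temp = next_temp

best = max(SETPOINTS, key=lambda a: q_table[(bucket(24.0), a)])
print(f"Preferred setpoint when the rack sits near 24 C: {best}")
```

The point of the design is that all of the thousands of trial-and-error interactions hit the model rather than the production facility; only a fully trained policy would ever be connected to real cooling controls.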

As we continue to look for ways to improve data center performance, using machines to understand, run and optimize the environment is a natural next step. However, machine learning requires a large amount of data to fully understand the environment, and the algorithms struggle with large steps into the unknown, or with new designs where there is no operational data set to learn from. In these cases, simulation connected to AI will prove invaluable for training the algorithms in how the real data center will perform before letting them loose on a live facility.


Blog written by: Mark Fenton, Product Manager
