Why Data Center Operators Need to Rethink the Rule of Thumb Practice


Many of us may feel we’re winning the battle against over-provisioning of data center resources, but this exercise in caution is still worryingly widespread in the data center industry. As data demands continue to increase—with 77 percent of data center managers experiencing increased pressure on their infrastructure—we can no longer afford to rely on estimations and “the rule of thumb” to plan facilities. When we think about the pressure for efficiency and maximization of space in today’s colocation market, it is clear that optimization is impossible without precision.

The reality is that accurate decisions on cooling, power and capacity are being compromised by the pressures teams are under. We recently commissioned a study through Sapio Research and found that nearly a third (29 percent) of data center managers compromise on decision making all of the time, while 45 percent compromise at least some of the time. The damage done by a lack of data is equally clear: performance, quality of work and the ability to meet deadlines are just some of the areas that suffer when teams don’t have the right information at the right time.

So, when it comes to approaching the company board or key business decision makers about making significant changes or updates, how can you win them over? The industry has two choices today:

  1. Walk into a meeting with the company board armed only with backwards-looking data from monitoring systems and guesswork about what will happen after the expansions are installed and switched on.
  2. Alternatively, you can walk in with the technology and data that show you’re in command of your facility: every eventuality is planned for and you’re operating in an optimized state.

Our industry has matured beyond the outdated security blanket of over-provisioning. We’re past needing to patch together retrospective data and gut instinct. The technical capabilities exist to operate data centers with capacity maximized, costs minimized and resiliency kept safely in place. This isn’t make-believe anymore; it’s real life.


Future Facilities' 6SigmaDCX software gives you this control using a digital twin model of your data center—covering you from design into operations. It enables data center professionals to continually optimize their data center infrastructure without risking a “failed experiment.” Data center operators now have the ability to safely simulate the impact of future change on a data center's resiliency, physical capacity and cooling efficiency. It’s a win for IT, facilities and the board.
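To make the idea concrete, here is a minimal, purely illustrative sketch of what “checking a proposed change against the facility’s budgets before committing” can look like. All class names, figures and thresholds below are hypothetical; this is not the 6SigmaDCX API, which relies on full physics-based simulation of the facility rather than the simple bookkeeping shown here.

```python
# Toy headroom check for a proposed rack deployment.
# Illustrative only: names, values and the "IT load equals heat load"
# simplification are assumptions, not part of any vendor product.
from dataclasses import dataclass

@dataclass
class Rack:
    name: str
    it_load_kw: float          # planned IT power draw
    airflow_cfm: float         # airflow the IT equipment requires

@dataclass
class Room:
    power_capacity_kw: float   # usable power budget for IT
    cooling_capacity_kw: float # heat the cooling plant can reject
    supply_airflow_cfm: float  # airflow the cooling units can deliver

def assess_change(room: Room, installed: list[Rack], proposed: list[Rack]) -> dict:
    """Compare the room's budgets against the load after a proposed change."""
    racks = installed + proposed
    total_power = sum(r.it_load_kw for r in racks)
    total_airflow = sum(r.airflow_cfm for r in racks)
    return {
        "power_headroom_kw": room.power_capacity_kw - total_power,
        "cooling_headroom_kw": room.cooling_capacity_kw - total_power,  # IT load ~ heat load
        "airflow_headroom_cfm": room.supply_airflow_cfm - total_airflow,
        "safe": (total_power <= room.power_capacity_kw
                 and total_power <= room.cooling_capacity_kw
                 and total_airflow <= room.supply_airflow_cfm),
    }

if __name__ == "__main__":
    room = Room(power_capacity_kw=500, cooling_capacity_kw=550, supply_airflow_cfm=80_000)
    installed = [Rack("row-A", 320, 48_000)]
    proposed = [Rack("new-row-B", 120, 20_000)]
    print(assess_change(room, installed, proposed))
```

The point of a digital twin is precisely that it goes beyond this kind of spreadsheet arithmetic: instead of assuming aggregate budgets are evenly available, it simulates where the air and heat actually go, so a change that looks safe on paper can be caught before it creates a hot spot on the floor.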

The reality is that as demands on our industry have increased, we need to rethink how we operate and make sure we are up to the challenge. Those demands are not going to stop where they are now, either. We need to think about how data centers are managed and run not just for today, but for tomorrow as well. It’s time we leave the rule of thumb behind and refocus on facts.

To introduce this level of control and visibility in your data center, find out more about how the digital twin for the data center is shaking up operations and data center management here.


Blog written by: Robert Schmidt, Director of Sales + Client Innovation - Data Center Infrastructure Software SME
