From the facility level down to the individual server and switch, organizations reeling in the wake of the pandemic are looking to eliminate risk from their business. The question is whether the data center industry and its supply chain are ready for these new demands.
A tendency towards risk aversion has always been a valuable asset for anyone involved in operating a data center - whether at the facility level or in the delivery of compute performance. At times, the demands of the business have felt at odds with this need to reduce risk.
Today a new challenge has emerged. The requirements of the business are proliferating - more apps, greater availability, improved redundancy - yet at the same time risk aversion has taken on a new form, accentuated by the pandemic. From the C-suite through to lines of business, key stakeholders want to know there's "no chance" of missing opportunities because of downtime.
This pressure is pushing IT and infrastructure suppliers to deliver solutions that perform at the optimal level, while providing the maximum guarantee against outages.
As Moore's Law offers fewer bragging rights in IT equipment, and organizations look to sweat their assets harder, the ability to guarantee reliable performance has become the key to effective tech design. And as the Uptime Institute has suggested, organizations are likely to look for supplier solutions that limit downtime and include more comprehensive replacement agreements.
In such a context, there's a major win to be gained by suppliers who can provide this sort of assurance - who can guarantee the performance of their equipment under pressure, and during disrupted, fast-changing situations. But this trust has to be earned, and to earn it, the modeling of electronic components has to become more accurate.
The detail required to understand the behavior of components, their interactions and their thermal performance keeps growing. At the same time, the simplified forms of CFD many vendors have had to rely on leave too much to guesswork.
Our Digital Twin model for the data center ensures that designers and operators can simulate change within their facilities in a risk-free virtual environment. And the same CFD that powers this can be used to create accurate models of electronic equipment. That means gaining the ability to simulate the thermal performance of electronic components: ranging from the smallest ICs to the largest, most powerful servers.
When your design parameters require you to reduce power consumption, system weight and cost - without compromising performance and reliability - there isn't much room for failure. However, thermal simulation allows you to experiment with your equipment design - such as modifying heatsink geometry or reducing fan speeds - in a virtual environment. This is faster, cheaper and safer than physical experimentation and measurement.
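To make the kind of what-if question concrete, here is a minimal sketch of a first-order trade-off study in Python. It uses a lumped thermal-resistance model - a drastic simplification of the full CFD described above - and every number in it (power, resistances, airflow figures) is an illustrative assumption, not data from any real product or tool:

```python
# Toy steady-state thermal model: T_junction = T_ambient + P * (R_jc + R_sink).
# All values below are hypothetical, chosen only to illustrate the trade-off
# between heatsink size and fan speed.

def junction_temp(power_w, t_ambient_c, r_junction_case, r_sink):
    """Steady-state junction temperature for a single conduction path."""
    return t_ambient_c + power_w * (r_junction_case + r_sink)

def sink_resistance(base_r_c_per_w, airflow_cfm, ref_cfm=30.0):
    """Assumed toy correlation: convective resistance falls roughly with
    the square root of airflow relative to a reference operating point."""
    return base_r_c_per_w * (ref_cfm / airflow_cfm) ** 0.5

# Compare two design options for a hypothetical 95 W processor at 35 C inlet.
options = [
    ("baseline heatsink, full fan speed", 0.30, 30.0),
    ("larger heatsink, reduced fan speed", 0.22, 20.0),
]
for label, base_r, cfm in options:
    t_j = junction_temp(95.0, 35.0, 0.25, sink_resistance(base_r, cfm))
    print(f"{label}: T_junction = {t_j:.1f} C")
```

A model this crude ignores everything a CFD solver captures - geometry, recirculation, component interactions - but it shows the shape of the question: whether a larger heatsink can buy back the thermal margin lost by slowing the fans, before any hardware is built.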
We're all operating in an increasingly risk-averse business environment. So highly accurate simulation, driven by CFD, will ensure data center facilities and their equipment suppliers stay in the driving seat when it comes to delivering reliable performance.
Blog written by: Sherman Ikemoto, Managing Director
7 December, 2020