In a world where artificial intelligence and machine learning are the new frontiers, cloud applications are shaping the design and operations of the next generation of data centers. End users are increasingly looking to the cloud for scalability, infrastructure reliability and faster deployment of applications that can increase productivity.
To meet these new demands, hyperscale owner-operators must eliminate the uncertainty caused by new assets and infrastructural changes. The degree to which you can predict and plan for the impact of these changes determines whether you can ensure critical goals are met without overspending on design and operations.
How can you see the impact of these potential changes? Our industry-leading digital twin software is a 3D virtual model of your physical data center that can be used to test design and operational changes. With the ability to model everything from the smallest components in the facility to the data center itself, the digital twin has the accuracy and power to safeguard your business from the uncertainty of data center performance. Utilizing the same principles of prediction that the health, insurance and automotive industries use to mitigate business risk, the digital twin gives you insight into future risk that no other tool can match.
Server and Rack Design
To run multiple software applications, IT equipment must be designed with an awareness of both the applications it will run and the data center that will house it. Utilizing a digital prototype of the server, you can design the optimal internal layout, maximizing airflow through the server while keeping the electronics cool at their highest workloads.
The digital twin’s powerful tools allow you to model IT equipment by importing existing CAD files and utilizing a wide range of intelligent parts, such as components, heat sinks, fans and PCBs. It also allows you to evaluate a wide range of scenarios, such as fan control algorithms, fan failures and component temperatures during stress conditions. Knowing the true boundary conditions of a piece of IT equipment allows you to address and solve thermal issues during the project’s architectural phase – when changes are easier and less expensive to make.
Figure 1. Using Computational Fluid Dynamics (CFD), you can design and test a range of server layouts
Additionally, the servers can be tested in various rack configurations to ensure that any variant of IT stack-up works at different power densities.
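As a rough illustration of the physics behind those power-density tests (a back-of-the-envelope sketch, not part of the digital twin itself), the airflow a rack needs scales with its heat load. A common air-cooling rule of thumb is CFM ≈ 3.16 × W / ΔT(°F) for air at typical data center conditions:

```python
def required_airflow_cfm(power_watts: float, delta_t_f: float = 25.0) -> float:
    """Approximate airflow (CFM) needed to remove a given heat load.

    Uses the standard air-cooling rule of thumb CFM ~= 3.16 * W / dT(F),
    valid for air near sea level at typical data center temperatures.
    """
    return 3.16 * power_watts / delta_t_f

# Compare the same IT stack-up at different rack power densities
for rack_kw in (5, 10, 20):
    cfm = required_airflow_cfm(rack_kw * 1000)
    print(f"{rack_kw} kW rack -> ~{cfm:.0f} CFM at a 25 F rise")
```

A 20 kW rack needs four times the airflow of a 5 kW rack at the same temperature rise, which is exactly the kind of constraint the CFD model checks far more precisely than this rule of thumb.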
Figure 2. After designing the server in the digital twin, you can test its performance in a range of rack configurations
DC Design and Operations
The digital twin gives you the opportunity to "design out" potential issues before the data center is built. The biggest risks during design are over-engineering and overspending on CapEx (capital expenditure). On the other hand, a poorly designed facility can lead to overspending in OpEx (operational expenditure) when power and cooling infrastructure upgrades become unavoidable.
The variables that affect data center designs are numerous: power densities, rack layouts, cooling control systems and more. With the digital twin, the entire design can be validated using data from the actual server designed by the hardware team – a critical component in understanding the validity of the design.
Figure 3. Addressing the entire design from server to the data center and beyond
Putting it All Together
Given the seamless transfer of data across the entire platform, you can run various scenarios at the data center level to validate the design end to end. In operations, a key challenge is the constant shifting of power from one section of the data center to another, which can result in hotspots or capacity losses if the control system isn’t tuned appropriately. You can test the impact of varying power loads via an external API, and because the servers in the model react to changes in power and external ambient temperature, you can understand where potential hotspots may occur.
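The actual API surface is vendor-specific, but the scenario logic it lets you drive can be sketched. In this illustrative example (the zone names, capacities and helper functions are all hypothetical, not the product’s API), load is shifted between two zones and any zone whose heat load now exceeds its cooling capacity is flagged as a hotspot risk:

```python
# Illustrative only: the digital twin exposes its own external API;
# this sketch just shows the kind of power-shifting scenario it can drive.
zones = {
    "row_A": {"power_kw": 40.0, "cooling_kw": 50.0},
    "row_B": {"power_kw": 45.0, "cooling_kw": 50.0},
}

def shift_power(zones, src, dst, kw):
    """Move kw of IT load from one zone to another (a single scenario step)."""
    zones[src]["power_kw"] -= kw
    zones[dst]["power_kw"] += kw

def hotspot_risks(zones):
    """Zones where the demanded heat load exceeds available cooling."""
    return [name for name, z in zones.items() if z["power_kw"] > z["cooling_kw"]]

shift_power(zones, "row_A", "row_B", 10.0)
print(hotspot_risks(zones))  # row_B now draws 55 kW against 50 kW of cooling
```

A CFD-backed twin goes well beyond this bookkeeping – it resolves the airflow that determines whether the excess heat actually produces a hotspot – but the scenario structure is the same: perturb the load, then inspect the result.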
Designing with the End in Mind
Simulating real-world data center components in the digital twin powered by Computational Fluid Dynamics (CFD) helps you to design with the end in mind: you can evaluate the performance of a server in the rack, and see how that rack will be affected by deployment in the data center. This removes a great deal of risk from the design process.
The digital twin’s targeted software tools are designed to meet the needs of both server mechanical engineers and data center design engineers, providing a holistic, physics-based solution for every person on your team.
This blog was originally published on June 6, 2018 and updated on June 20, 2019.
Blog written by: Akhil Docca, Director of Marketing