iCooling: How You Can Cut Millions of Dollars in Data Center Power Bills
The Ideal and the Reality
Network evolution and a rapidly growing data business are driving a boom in the data industry. But as data center networks keep expanding, they keep consuming more power. Consider a 10 MW data center: over its 10-year lifecycle, power consumption costs account for more than 60% of overall DC OPEX! Power usage effectiveness (PUE), the ratio of total facility energy to the energy consumed by IT equipment alone, is the major KPI describing how efficiently a data center uses energy. IT vendors have been busy exploring ways of cutting PUE. Here are a few examples:
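The PUE ratio is simple enough to compute directly. A minimal sketch (the example energy figures are illustrative, not from any specific facility):

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """PUE = total facility energy / IT equipment energy.

    A PUE of 1.0 would mean every watt goes to IT gear; real data
    centers are always above 1.0 because of cooling, power
    conversion, and lighting overheads.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# A facility drawing 15.8 MWh while its IT equipment uses 10 MWh:
print(round(pue(15_800, 10_000), 2))  # 1.58
```

The lower the ratio, the less energy is spent on overheads such as cooling, which is exactly the lever the approaches below try to pull.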
Subsea: An underwater DC deployed by Microsoft in the waters off the coast of Scotland’s Orkney Islands. All components are pre-integrated, and seawater can be used for free cooling.
The coop: Yahoo’s DC in Lockport, New York. The Yahoo Computing Coop is cooled almost entirely by outside air currents, completely eschewing mechanical water chillers, and achieves a PUE of about 1.08.
The evaporator: Google’s DC in Saint-Ghislain, Belgium. Its evaporative cooling system implements free cooling using water drawn from a nearby industrial canal, achieving a yearly PUE of 1.11.
According to the Uptime Institute Global Data Center Survey, the average DC PUE in 2018 was 1.58. The rapid decline from 2.5 in 2007 has become much more gradual since 2013, when the average stood at 1.65.
While the three data centers mentioned above make efficient use of natural cooling sources and sit well below the average PUE, their success comes at the cost of flexibility. Their deployments are tightly coupled with specific environments, so they can't be replicated at scale: not all DCs can be built in seas and mountains.
So, the question you might be asking is: how can we reduce PUE in every situation?
Traditional chilled water cooling systems consist of water chillers, pumps, cooling towers, and terminals. Energy consumption for cooling is associated with the heat dissipation of devices, device configuration, facility environment, and external climate conditions. After O&M reaches a certain maturity level, hardware energy savings or optimizations based on human expertise cannot further reduce power consumption.
Three issues need to be resolved when adjusting a chilled water cooling system:
- Ensure that all components are running in a high-efficiency range (optimal for components).
- Find out the optimal combination of components (in a global context). For example, what is the optimal frequency of the cooling tower, cooling water pump, chiller, and chilled water pump for 1000 kW cooling output? Which combination is the most energy-efficient?
- Associate IT loads with the cooling system to balance the heat and cooling capacity.
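The second issue, finding the globally optimal combination of component settings, can be pictured as a search problem. A toy sketch of an exhaustive search follows; the power and cooling models here are fabricated placeholders (real curves come from equipment datasheets and measured operating data), so only the structure of the search is meaningful:

```python
from itertools import product

# Hypothetical power models (kW drawn) for each component as a
# function of its normalized speed setting in [0.5, 1.0].
def chiller_power(s):       return 120 * s ** 2.5
def cooling_tower_power(s): return 15 * s ** 3   # fan affinity law: power ~ speed^3
def pump_power(s):          return 30 * s ** 3

def cooling_output_kw(chiller_s, tower_s, pump_s):
    # Toy model: delivered cooling rises with all three settings.
    return 1400 * chiller_s * (0.6 + 0.4 * tower_s) * (0.7 + 0.3 * pump_s)

def best_combination(demand_kw, steps=11):
    """Exhaustively search setting combinations that meet the cooling
    demand and return the one with the lowest total power draw."""
    settings = [0.5 + 0.05 * i for i in range(steps)]  # 0.50 .. 1.00
    best = None
    for c, t, p in product(settings, repeat=3):
        if cooling_output_kw(c, t, p) < demand_kw:
            continue  # does not satisfy the heat load
        power = chiller_power(c) + cooling_tower_power(t) + pump_power(p)
        if best is None or power < best[0]:
            best = (power, c, t, p)
    return best

power, c, t, p = best_combination(1000)
print(f"chiller={c:.2f} tower={t:.2f} pump={p:.2f} -> {power:.1f} kW")
```

Even this toy version shows why manual tuning falls short: the cheapest combination for 1000 kW of cooling is not simply every component at its individually efficient point, and the search space explodes as more components and finer settings are added.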
Optimization based on expert experience can only resolve the first issue, so it is difficult to ensure that the cooling system performs optimally as a whole. In addition, manual adjustment is slow and cannot keep pace with energy-saving requirements.
For a complex chilled water cooling system, a new control algorithm is needed to achieve overall optimal performance. This is where big data and AI come in.
Huawei’s iCooling Solution
iCooling optimizes DC energy efficiency using AI. The solution applies automatic inference based on a deep neural network (DNN): the DNN determines the system parameters that yield the optimal PUE under the given climate conditions and SLA, maximizing DC energy efficiency.
It works as follows:
- Data Collection: collects the operating parameters of the refrigeration station, air conditioners, IT loads, and other systems. In phase 2 of the Huawei Langfang Cloud Data Center, for example, the operating parameters for nine months were collected from more than 700 data collection points.
- Data Governance: performs dimension reduction, de-noising, and cleansing of raw data. Huawei’s powerful automated governance tool processes 100 million pieces of raw data within one hour, providing high-quality data for subsequent model training.
- Model Training: uses the high-quality data to train a PUE prediction model with a DNN. The trained model achieves a prediction accuracy of 99.5%, with an error of less than 0.005.
- Inference & Decision-making: publishes the trained prediction model onto the inference platform. Using the prediction model, iCooling can find the optimal parameter combination for the current outdoor environment and IT loads from 170,000 combinations within one minute. It can also apply multi-layer filtering according to requirements, deliver the most appropriate instructions for execution, and then report the results.
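The inference and decision-making stage can be sketched as a constrained search over a trained predictor. In the sketch below, `predicted_pue` is a stub standing in for the trained DNN, and all frequencies, ranges, and constants are invented for illustration (the real system evaluates on the order of 170,000 combinations):

```python
from itertools import product

def predicted_pue(outdoor_temp_c, it_load_kw, params):
    """Stub standing in for the trained DNN. A toy formula: PUE rises
    with outdoor temperature and load, and falls as the cooling
    settings approach a (fabricated) sweet spot."""
    chiller_hz, pump_hz, tower_hz = params
    base = 1.25 + 0.004 * outdoor_temp_c + 0.00001 * it_load_kw
    penalty = (abs(chiller_hz - 42) + abs(pump_hz - 38)
               + abs(tower_hz - 35)) * 0.002
    return base + penalty

def choose_settings(outdoor_temp_c, it_load_kw, sla_min_hz=30):
    # Candidate frequencies for three components: 21 values each,
    # i.e. 9,261 combinations in this toy version.
    freqs = range(30, 51)
    best_params, best_pue = None, float("inf")
    for params in product(freqs, repeat=3):
        if min(params) < sla_min_hz:
            continue  # multi-layer filtering: drop SLA-violating combos
        p = predicted_pue(outdoor_temp_c, it_load_kw, params)
        if p < best_pue:
            best_params, best_pue = params, p
    return best_params, best_pue

params, best = choose_settings(outdoor_temp_c=20, it_load_kw=8000)
print(params, round(best, 3))
```

The key design point is that the expensive physical experiment (actually changing setpoints and waiting for the plant to settle) is replaced by fast queries against a learned model, which is what makes searching the full combination space in under a minute feasible.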
Data Centers Can Be Cool
The Huawei Langfang Cloud Data Center uses iCooling@AI technology together with the optimal cooling products developed by Huawei to build data linkage between IT loads, the cooling system, and the external environment without increasing hardware costs. This reduces the data center's PUE by 0.1, which means a saving of about US$1 million each year.
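A back-of-envelope calculation shows why a 0.1 PUE cut is worth roughly a million dollars a year. The IT load and electricity price below are assumptions for illustration, not Huawei's actual Langfang figures:

```python
def annual_savings_usd(it_load_mw, pue_reduction, price_usd_per_kwh):
    """Billed energy = PUE x IT energy, so cutting PUE by d saves
    d x annual IT energy x electricity price (8,760 hours/year)."""
    it_kwh_per_year = it_load_mw * 1000 * 8760
    return pue_reduction * it_kwh_per_year * price_usd_per_kwh

# Assumed: a 10 MW IT load, a 0.1 PUE reduction, power at ~US$0.12/kWh.
print(f"US${annual_savings_usd(10, 0.1, 0.12):,.0f} per year")
```

Under these assumptions the saving comes to just over US$1 million per year, consistent with the figure quoted above.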
AI has developed rapidly alongside improvements in algorithm theory. With AI, power distribution systems can intelligently analyze the root causes of faults, evolving from passive alarms to proactive prevention; iCooling-powered cooling systems can slash energy consumption to achieve green energy targets; and O&M systems can run smoothly without staff onsite.
Disclaimer: Any views and/or opinions expressed in this post by individual authors or contributors are their personal views and/or opinions and do not necessarily reflect the views and/or opinions of Huawei Technologies.