
Chilling Perspective: Navigating the Future of Data Center Cooling – Debunking Myths and Embracing Innovation

By Nabeel Mahmood

Data centers have proliferated rapidly worldwide, driven by the exponential growth of digital data and the surge in computing demand. They are growing not just in number but also in capacity, complexity, and energy efficiency. The advent of artificial intelligence (AI) has further propelled this growth. AI, with its intense computational demands, is driving a surge in data consumption and processing needs, making data centers more indispensable than ever. At the same time, AI’s predictive capabilities enable more efficient data center operations, from cooling systems management to load balancing, optimizing energy usage and reducing operational costs. In essence, the symbiosis of data centers and AI is shaping the evolution of global digital infrastructure and influencing sustainable practices within the industry.

The increasing demand for data centers has also led to the rise of edge computing, which processes data closer to its source rather than at a centralized location. Edge computing is particularly suitable for time-sensitive applications and devices that require low-latency processing, such as self-driving cars and Internet of Things (IoT) devices. This trend has significant implications for data centers, which must adapt and evolve to meet the changing needs of the industry. Moreover, data centers are no longer limited to traditional physical infrastructure. The rise of cloud computing has driven a shift towards virtualized data centers, where resources are pooled and managed through software-defined technology, enabling more efficient resource allocation and scalability and making it easier for organizations to manage their growing data needs.

Alongside these trends, chips’ thermal design power (TDP) has risen sharply over the past 16 years. According to IDTechEx, TDP increased four-fold between 2006 and 2022, with servers reaching IT loads exceeding 750W. This upward trend has created a crucial need for more efficient thermal management at both the micro (server board and chip) and macro (data center) levels. In response, leading data center operators have collaborated with cooling solution vendors to launch innovative pilot projects and commercialize ready-to-use cooling solutions, aiming to enhance cooling performance and meet sustainability targets.
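
As a rough illustration of how steep that trend is, the figures above imply roughly a 9% year-over-year increase in TDP. The back-of-the-envelope sketch below uses only the IDTechEx four-fold figure cited above; the ~190W baseline for 2006 is purely an illustrative assumption, chosen so the 2022 value lands near the 750W loads mentioned in the text.

```python
# Back-of-the-envelope sketch of the TDP trend described above.
# The four-fold increase from 2006 to 2022 comes from the IDTechEx figure
# cited in the text; the ~190 W baseline for 2006 is an illustrative
# assumption, not a measured value.

growth_factor = 4.0           # four-fold TDP increase (IDTechEx, 2006-2022)
years = 2022 - 2006           # 16-year span

# Implied compound annual growth rate of TDP
cagr = growth_factor ** (1 / years) - 1
print(f"Implied TDP growth: {cagr:.1%} per year")   # roughly 9% per year

tdp_2006_w = 190.0            # hypothetical 2006 TDP in watts (assumption)
tdp_2022_w = tdp_2006_w * growth_factor
print(f"Illustrative trajectory: {tdp_2006_w:.0f} W (2006) -> "
      f"{tdp_2022_w:.0f} W (2022)")                  # ~760 W by 2022
```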

In 2023, heat generation in data centers is substantially higher than it was a decade ago. This can be attributed to the increase in the number and complexity of servers, the growth in data processing needs, and the rise of AI and edge computing. Data centers are estimated to consume 3% of the world’s total electricity and contribute 2% of global carbon emissions. This dramatic increase in heat generation presents a significant challenge in terms of cooling and sustainability.

With increasing TDPs and energy consumption, data centers significantly impact the environment. We must continue innovating and deploying realistic, environmentally friendly cooling strategies to mitigate that impact. While some recent advancements are promising, continuous research and innovation are necessary to further improve the sustainability of data centers.

As data center workloads intensify, surpassing 100kW/rack in some cases, traditional cooling methods will struggle to keep up. While immersion cooling has shown great promise, another approach worth considering is the rear door heat exchanger (RDHx). RDHx units attach to the back of server racks, capturing and removing heat at the source. They can handle high-density workloads without increasing the cooling load on the data center’s air conditioning system, and they are highly efficient, reducing cooling energy consumption by up to 75%.
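
To make the heat-removal arithmetic concrete, here is a minimal water-side sizing sketch for an RDHx, based on the standard sensible-heat relation Q = m·c_p·ΔT. The 50kW rack load and the 10°C water temperature rise are illustrative assumptions, not figures from any particular RDHx product.

```python
# Minimal sketch: chilled-water flow needed for an RDHx coil to absorb a rack's heat.
# Uses the sensible-heat balance Q = m_dot * c_p * delta_T on the water side.
# The 50 kW rack load and 10 degC water temperature rise are illustrative
# assumptions, not specifications of any particular RDHx unit.

rack_load_w = 50_000          # rack heat load in watts (assumed)
cp_water = 4186.0             # specific heat of water, J/(kg*K)
delta_t = 10.0                # water temperature rise across the coil, K (assumed)
rho_water = 998.0             # water density, kg/m^3

mass_flow = rack_load_w / (cp_water * delta_t)        # kg/s
volume_flow_lpm = mass_flow / rho_water * 1000 * 60   # litres per minute

print(f"Required water flow: {mass_flow:.2f} kg/s "
      f"(~{volume_flow_lpm:.0f} L/min)")              # ~1.2 kg/s, ~72 L/min
```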

Rack loads above 100kW, while technically feasible, present several practical challenges. Increased power density can create hot spots, which compromise the performance and lifespan of equipment. In addition, the supporting infrastructure, such as power distribution and cooling systems, must be designed to handle such high loads. For these reasons, 100kW/rack loads are not typically seen in day-to-day operations, though they may become more common as technology evolves and efficiencies improve.
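
The same sensible-heat relation shows why densities in this range strain pure air cooling. The 15°C air temperature rise and the standard air properties below are illustrative assumptions, but the order of magnitude is the point:

```python
# Rough estimate: airflow needed to carry 100 kW away from a single rack.
# Uses Q = m_dot * c_p * delta_T with typical room-condition air properties.
# The 15 degC front-to-back air temperature rise is an illustrative assumption.

rack_load_w = 100_000        # rack heat load in watts
cp_air = 1005.0              # specific heat of air, J/(kg*K)
rho_air = 1.2                # air density, kg/m^3
delta_t = 15.0               # air temperature rise front-to-back, K (assumed)

mass_flow = rack_load_w / (cp_air * delta_t)    # kg/s of air
volume_flow_m3s = mass_flow / rho_air           # m^3/s
volume_flow_cfm = volume_flow_m3s * 2118.88     # cubic feet per minute

print(f"~{volume_flow_m3s:.1f} m^3/s of air "
      f"(~{volume_flow_cfm:,.0f} CFM) per rack")  # ~5.5 m^3/s, ~11,700 CFM
```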

The evolution of cooling technologies in the world of computers and data centers has brought forth a debate: Is liquid cooling the death of air cooling? Based on my research, the answer to this question isn’t a simple yes or no. Both liquid and air cooling have their merits and drawbacks, making each suitable for different use cases.

Liquid Cooling Pros and Cons

Nowadays, liquid cooling is often associated with superior performance. It is more efficient at heat dissipation because water has a far higher heat capacity than air, which can lead to longer component lifespans and better overclocking potential. Moreover, liquid cooling systems are generally quieter, as they don’t rely solely on fans to dissipate heat. They also offer aesthetic appeal for those who prefer a cleaner, more streamlined look inside their computer cases.
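
That heat-capacity advantage is easy to quantify. Using approximate room-temperature property values, a unit volume of water absorbs on the order of a few thousand times more heat per degree of temperature rise than the same volume of air:

```python
# Quick comparison of how much heat a unit volume of water vs. air can absorb
# per degree of temperature rise (volumetric heat capacity = density * c_p).
# Property values are approximate room-temperature figures.

rho_water, cp_water = 998.0, 4186.0   # kg/m^3, J/(kg*K)
rho_air, cp_air = 1.2, 1005.0         # kg/m^3, J/(kg*K)

vhc_water = rho_water * cp_water      # ~4.18e6 J/(m^3*K)
vhc_air = rho_air * cp_air            # ~1.21e3 J/(m^3*K)

print(f"Water: {vhc_water:.3g} J/(m^3*K)")
print(f"Air:   {vhc_air:.3g} J/(m^3*K)")
print(f"Ratio: ~{vhc_water / vhc_air:,.0f}x")   # roughly 3,500x
```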

However, liquid cooling systems come with their own set of challenges. They have more points of failure, including the pump and hose attachments. There’s also the possibility of the fluid gunking up over time. Additionally, they are more complex to install and maintain, making them less suitable for novice users or those who prefer simplicity.

Air Cooling Pros and Cons

On the other hand, air cooling is generally simpler, more cost-effective, and easier to install. High-quality air coolers can provide excellent cooling performance for most users, especially those not planning on heavy overclocking. Air coolers are also less prone to mechanical failure since they primarily consist of a heatsink and fan(s). An air cooler will function as long as the fan is operational and the heatsink isn’t blocked.

The main drawback of air cooling is noise, as fans need to run at high speeds to cool components effectively. 

While liquid cooling might offer performance and aesthetic appeal, it is not necessarily the death of air cooling. Liquid cooling itself spans several approaches, including, but not limited to, direct and indirect liquid cooling, immersion cooling, direct-to-chip cooling, and cold plate cooling. Air cooling still holds its ground as a reliable, cost-effective solution. The choice between the two largely depends on individual needs, budget, technical expertise, and personal preference. Rather than seeing one technology as the death of the other, it is more constructive to view them as complementary solutions catering to different market segments.

Another innovative approach involves capturing CO2 from the ventilation airflow within data centers. Beyond heat recovery, this technology offers a promising way to reduce the carbon footprint of data centers. Using liquid CO2 as a cooling medium could also shift the cooling paradigm significantly: besides providing effective cooling, it brings interesting side benefits such as temperature and smoke control.

While we can theoretically use CO2 in data centers, more research and development are required to make these solutions practical and efficient. As we continue to rely on digital technologies, it is crucial that we also continue to explore and invest in ways to make data centers more sustainable. Let’s continue to innovate and adapt our strategies, ensuring we build data centers that are both robust and resource-efficient. Doing so can create a more sustainable industry while meeting the growing demand for data and technology. Let’s keep pushing towards a greener and more efficient future for data centers. Together, we can make it happen.