What Will Accelerate the Adoption of 2-Phase Immersion Cooling?

Data center processing has reached an inflection point. Chips and servers now generate so much heat that the cooling methods most data centers employ are no longer adequate.

The demand for video streaming, application services and data-intensive technologies like artificial intelligence (AI), IoT and 5G is growing rapidly. High-performance computing and cryptocurrency mining often require cooling capabilities beyond what air and other traditional methods can deliver.

Enter 2-phase immersion cooling. 

This groundbreaking technology immerses servers and other IT equipment in a non-conductive fluid with excellent thermal characteristics, providing orders of magnitude more heat rejection than air cooling. The server components (CPUs, GPUs, ASICs and power supplies) heat the fluid until it boils into a vapor. The heat energy in the vapor is then transferred through a condensing coil placed just above the ‘vapor zone’ and rejected to an outside fluid loop, typically connected to a fluid cooler (also known as a dry cooler, since no water is consumed to reject the heat). The condensed vapor falls back into the tank as a liquid, completing a perpetual, self-contained, 2-phase cooling cycle: liquid – gas – liquid.
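To make the cycle concrete, here is a rough back-of-the-envelope sketch (not from the article) of how much fluid a given heat load boils off. The latent heat of about 90 kJ/kg is an assumption, chosen as a typical order of magnitude for engineered dielectric fluids; real values depend on the specific fluid.

```python
# Back-of-the-envelope boil-off estimate for 2-phase immersion cooling.
# LATENT_HEAT_J_PER_KG is an assumed round number (~90 kJ/kg); check the
# datasheet of the actual dielectric fluid for the real figure.
LATENT_HEAT_J_PER_KG = 90_000

def boil_off_rate_kg_per_s(heat_load_w: float) -> float:
    """Mass of fluid vaporized per second for a given heat load in watts."""
    return heat_load_w / LATENT_HEAT_J_PER_KG

# A 10 kW tank load vaporizes roughly 0.11 kg of fluid per second; the
# condensing coil returns the same mass as liquid, so the cycle is closed.
print(f"{boil_off_rate_kg_per_s(10_000):.2f} kg/s")
```

Because the condenser returns exactly the mass that boils off, the tank's fluid level stays constant and no fluid is consumed in steady-state operation.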

2-phase immersion cooling drastically reduces energy consumption, water use, data center floor space, and horizontal and vertical IT equipment space. The six major factors driving the need for this revolutionary technology are: 

  1. Data center efficiency gains have stalled since 2018
  2. Chip power / IT power densities are increasing rapidly
  3. Data center water use has surpassed energy use as an environmental concern
  4. Compute power is moving toward the edge and compacting
  5. Data center e-waste is a growing problem
  6. Billions are being invested in corporate sustainability initiatives

Data Center Efficiency Gains Have Stalled Since 2018

Data center efficiency gains have flat-lined and even reversed. The last two decades saw steady improvements in air cooling, including hot/cold-aisle arrangements, close-coupled cooling, fan speed control, staged and inverter-driven compressors, adiabatic assist and many other innovations. But according to the Uptime Institute, average data center PUEs have actually been rising since 2018 (the wrong direction!). This is due in part to air cooling reaching a ‘technology development tap-out’, and in part to today’s higher-powered chips and processors being too energy dense to be cooled efficiently with air.
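PUE (Power Usage Effectiveness) is simply total facility power divided by IT power, so the metric the Uptime Institute tracks can be sketched in a few lines; the figures below are hypothetical, purely to illustrate what "rising PUE" means.

```python
def pue(total_facility_kw: float, it_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT equipment power.
    1.0 is the theoretical ideal (zero cooling/overhead energy)."""
    return total_facility_kw / it_kw

# Hypothetical site: 1 MW of IT load plus 500 kW of cooling and other
# overhead gives a PUE of 1.5; trimming overhead to 100 kW gives 1.1.
print(pue(1_500, 1_000))  # 1.5
print(pue(1_100, 1_000))  # 1.1
```

A rising industry-average PUE means the overhead term (mostly cooling) is growing relative to the IT load it serves.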

Chip Power / IT Power Densities Are Increasing Rapidly

While Moore’s Law long ago established that transistor counts would double roughly every two years, the compute used by leading AI workloads now doubles every three and a half months. Handling these workloads requires the most powerful chips ever designed, and those chips generate massive amounts of heat that cannot be cooled effectively or efficiently with air.

For example, in April 2021 Cerebras released its WSE-2 chip, which boasts 2.6 trillion transistors and 850,000 AI-optimized cores and draws 23 kW of power. Most air cooling systems in data centers can only handle about 8 kW to 12 kW per rack, so even though you could physically fit three WSE-2 systems in a rack, you could not blow enough air through the rack to cool even one of them. And even if you somehow achieved an air-cooled solution, with AI compute doubling every few months this approach still wouldn’t be sustainable for long.
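The rack arithmetic above can be spelled out as a quick sanity check. The 23 kW figure is the system draw cited in the text; the 12 kW rack budget is the upper end of the typical air-cooled range and is used here as an assumption.

```python
# Can a typical air-cooled rack handle even one WSE-2-class system?
RACK_AIR_BUDGET_KW = 12  # upper end of the typical 8-12 kW air-cooled range
WSE2_SYSTEM_KW = 23      # system power draw cited in the text

# Integer division: how many 23 kW systems fit inside a 12 kW cooling budget.
systems_air_can_cool = RACK_AIR_BUDGET_KW // WSE2_SYSTEM_KW
print(systems_air_can_cool)  # 0 -- the rack's air budget can't cool one
```

The cooling budget, not the physical rack space, is the binding constraint.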

Data Center Water Use Has Surpassed Energy Use as an Environmental Concern

We all know that the majority of the electricity we use is generated from fossil fuels. What many don’t know is that thermoelectric power plants consume large volumes of water producing the steam that turns their turbines and cooling it again afterwards. On top of that, chillers and air handling units can consume massive volumes of water to reject heat from data centers. These compute facilities therefore use billions of liters of water per year: according to recent estimates, data centers consumed over 660 billion liters of water in 2020 alone.
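For a sense of scale, annual water use can be sketched from a facility's IT load and its Water Usage Effectiveness (WUE, liters of water per kWh of IT energy). The 1.8 L/kWh default below is an often-cited industry average, used here purely as an assumption; real values vary widely by site and cooling design.

```python
HOURS_PER_YEAR = 8_760

def annual_water_liters(it_load_kw: float, wue_l_per_kwh: float = 1.8) -> float:
    """Estimated liters of water consumed per year for a given IT load.
    WUE varies widely by site; 1.8 L/kWh is only an illustrative average."""
    return it_load_kw * HOURS_PER_YEAR * wue_l_per_kwh

# A 1 MW IT load at 1.8 L/kWh consumes roughly 15.8 million liters a year.
print(f"{annual_water_liters(1_000):,.0f} L")
```

Dry coolers, the heat-rejection path typical of 2-phase immersion systems, drive the on-site portion of that figure toward zero.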

Compute Power is Moving Toward the Edge and Compacting

Edge computing means processing data within devices or at edge data centers located geographically close to end users, rather than within a centralized data center. Keeping data close to users cuts backhaul time and cost and drastically reduces response times. The result is significantly higher bandwidth and ultra-low latency, enabling technologies that require real-time relay of massive datasets, such as 5G, industrial IoT, AR and VR gaming, smart city sensors, and drones.

Not only does edge computing require an enormous amount of processing power, but dense user populations are typically located in cities, where space is limited and priced at a premium. Edge data centers must often support dense servers and heavy workloads in compact spaces, located in harsh climates with high or low temperatures, dust, dirt, contaminants and particulates. Air cooling methods do not allow for this compaction because the coils, fans, compressors and ducting take up a significant amount of space. Furthermore, because air-cooled systems rely on airflow to reject heat, that airflow carries dust and debris that can clog coils, reduce performance and even lead to premature failure.

Data Center E-Waste is a Growing Problem

To wasted space, water and energy we must add the physical waste associated with outdated cooling systems. Because high-power air-cooled servers require larger fans and heat sinks, they take up even more space, are shrouded in more sheet metal, require more racks, and generate vast volumes of waste packaging. All of this electronic waste harms the environment. The goal is to do more with less, and liquid cooling provides the opportunity to achieve exactly that.

Billions Are Being Invested in Corporate Sustainability Initiatives

There is now heightened social, corporate, governmental and consumer attention on the environmental impact of corporations, the resources they consume and their effect on our ecosystem. Sustainability initiatives are no longer a nice-to-have; they are a high-priority directive driven by the C-suite and boards of directors.

There is no better example of organizations putting their money where their mouth is and implementing real-world sustainability initiatives than the “big three” cloud providers: AWS, Google Cloud, and Microsoft. Microsoft has pledged to be carbon negative by 2030. Google has set an ambitious goal of running solely on carbon-free energy at all of its data centers by 2030. And AWS has pledged to power all operations with 100% renewable energy by 2025.

Since cooling and thermal management can consume approximately 40% of a data center's energy, replacing outdated cooling methods is low-hanging fruit for enterprise sustainability goals.
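A rough illustration of why that fruit hangs so low (both percentages in the example are assumptions for the sketch, not measured results): if cooling is 40% of facility energy and a new cooling method eliminated 90% of that cooling energy, total facility energy would fall by roughly 36%.

```python
def facility_energy_saved(cooling_fraction: float, cooling_cut: float) -> float:
    """Fraction of total facility energy saved when cooling energy shrinks.
    cooling_fraction: share of facility energy spent on cooling (e.g. 0.40).
    cooling_cut: fraction of that cooling energy eliminated (e.g. 0.90)."""
    return cooling_fraction * cooling_cut

# Hypothetical: 40% cooling share, 90% reduction -> ~36% facility savings.
print(f"{facility_energy_saved(0.40, 0.90):.0%}")  # 36%
```

Few other single retrofits move a facility's total energy bill by anything close to that fraction.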

2-Phase Immersion Cooling Will Be the Standard

2-phase immersion cooling is a tailor-made solution to the trends discussed above. The savings in energy costs, space, and carbon emissions will not only contribute significantly to corporate sustainability goals, but will enable exponentially higher compute power and a brave new technological future.
