Data Center Cooling Systems: Challenges in Standardization and Comparison
https://opticooltechnologies.com/rear-door-heat-exchanger-rdhx-cooling-systems-challenges-in-standardization-and-comparison/ | Tue, 14 Oct 2025

The Rise of RDHx in High-Density Data Centers

As the heat density of racks in high-density data centers continues to rise rapidly, Rear-Door Heat Exchanger (RDHx) cooling systems have emerged as a highly effective alternative to traditional Computer Room Air Conditioning (CRAC) systems. A rear-door heat exchanger is a liquid-based cooling system mounted on the back of a server rack that extracts heat directly at the source. OptiCool’s two-phase liquid cooling for data centers efficiently removes virtually all of the heat leaving the rack and redistributes it at a controlled temperature. RDHx systems offer superior heat removal capabilities, making them a compelling choice among modern energy-efficient cooling technologies for data centers.

The Problem: Lack of Uniform Rating Criteria

However, due to the fast-paced evolution of data center cooling technologies and efficiency standards, standardized rating criteria for RDHx systems—which typically include fans, heat exchangers, and controllers—are not yet fully established. Consequently, manufacturers often report cooling capacities based on varying and sometimes arbitrary operating conditions. This lack of uniformity makes it difficult for data center managers to perform fair and accurate comparisons between products and evaluate RDHx cooling efficiency.

Table 1. Operating conditions for published RDHx cooling capacities.

| Parameter | Manufacturer 1 | Manufacturer 2 | Manufacturer 3 |
| --- | --- | --- | --- |
| Published Cooling Capacity, kW | 78 | 50 | 55 |
| Supply Liquid Temperature, °F | 57.2 | 53.6 | 55-65 |
| Air Temperature at RDHx Exit, °F | 75.2 | 69.8 | 75 |
| Air Temperature at RDHx Inlet (Server Outlet), °F | Not Available | Not Available | Not Available |

Notably, none of the manufacturers provide the RDHx inlet temperature—a critical parameter that significantly influences RDHx performance. Without this data, the published cooling capacities do not accurately reflect the systems’ true capabilities.

Recommendations for Objective Comparison and Evaluation

To enable meaningful comparisons across RDHx systems, it is recommended to use standardized operating conditions, including:

  • Air temperature exiting the servers
  • Air temperature exiting the RDHx
  • Liquid supply temperature entering the RDHx

These conditions should align with the server exhaust air temperatures outlined in ASHRAE Standard 127. Additionally, the RDHx exit air temperature should fall within the recommended range specified in the ASHRAE TC9.9 Datacom Equipment Thermal Guidelines.

Once standardized conditions are established, cooling capacity should be determined using both the air-side and liquid-side enthalpy methods. According to ASHRAE 127, the results from both methods should agree within 5% to ensure accuracy.
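
To make the cross-check concrete, the sketch below computes cooling capacity from both sides and applies the 5% agreement test. It is a minimal sketch that assumes single-phase water on the liquid side, sensible-only heat transfer on the air side, and typical property constants; the flow rates and temperatures are hypothetical values, not measured data.

```python
# Cross-check of air-side vs. liquid-side cooling capacity (5% criterion per ASHRAE 127).
# Assumes single-phase water coolant and sensible-only air-side heat transfer;
# property constants are typical values and the inputs below are hypothetical.

CP_AIR = 1.006    # kJ/(kg*K), specific heat of dry air
CP_WATER = 4.186  # kJ/(kg*K), specific heat of liquid water

def air_side_kw(m_dot_air, t_air_in_c, t_air_out_c):
    """Heat removed from the air stream, kW (mass flow in kg/s, temperatures in degC)."""
    return m_dot_air * CP_AIR * (t_air_in_c - t_air_out_c)

def liquid_side_kw(m_dot_liq, t_liq_in_c, t_liq_out_c):
    """Heat absorbed by the coolant, kW (mass flow in kg/s, temperatures in degC)."""
    return m_dot_liq * CP_WATER * (t_liq_out_c - t_liq_in_c)

def within_tolerance(q_air, q_liq, tol=0.05):
    """True if the two methods agree within tol (5% per ASHRAE 127)."""
    return abs(q_air - q_liq) / ((q_air + q_liq) / 2.0) <= tol

q_air = air_side_kw(m_dot_air=3.2, t_air_in_c=35.0, t_air_out_c=24.0)      # ~35.4 kW
q_liq = liquid_side_kw(m_dot_liq=2.0, t_liq_in_c=13.0, t_liq_out_c=17.2)   # ~35.2 kW
print(f"Air-side: {q_air:.1f} kW | Liquid-side: {q_liq:.1f} kW | Agree: {within_tolerance(q_air, q_liq)}")
```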

Measuring Energy Efficiency

Beyond cooling capacity, energy efficiency is a key metric for RDHx system evaluation. This can be expressed as the cooling capacity-to-power input ratio, calculated as:

Cooling Capacity ÷ Total Power Input

  • Cooling Capacity: Determined using the enthalpy methods described above.
  • Total Power Input: Includes power consumed by fans, controllers, and other internal components, measured at the rated voltage and power source.

Note: The fluid pump is typically external to the RDHx, so its power draw is not captured in the measurement above. To assess overall system efficiency, pumping power must be included; it can be estimated from the pressure drop across the RDHx:

kW_pump = (dp × vol) ÷ η

Where:

  • kW_pump = pump power input, kW
  • dp = pressure drop across the RDHx, kPa
  • vol = volumetric flow rate, m³/s
  • η = pump efficiency, including motor and impeller efficiencies, at the rated voltage and frequency

This ratio provides a clear measure of cooling energy efficiency. For example, if RDHx System A and System B both deliver 30 kW of cooling, but System A consumes 0.4 kW and System B consumes 0.5 kW, System A has the higher efficiency and would contribute to a lower Power Usage Effectiveness (PUE).
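
The short sketch below ties these pieces together: it estimates the external pump’s share from the pressure drop and then compares the two hypothetical systems above. The pressure drop, flow rate, and pump efficiency used here are illustrative assumptions, not vendor specifications.

```python
# Efficiency ratio sketch: cooling capacity divided by total power input,
# with the external pump's share estimated from the pressure drop across the RDHx.
# All input numbers are illustrative assumptions, not vendor specifications.

def pump_power_kw(dp_kpa, flow_m3_s, efficiency):
    """kW_pump = (dp x vol) / eta; note that kPa x m^3/s = kJ/s = kW."""
    return dp_kpa * flow_m3_s / efficiency

def efficiency_ratio(cooling_kw, door_power_kw, pump_kw):
    """Cooling capacity-to-power input ratio, including the pump's attributed share."""
    return cooling_kw / (door_power_kw + pump_kw)

# Hypothetical pressure drop, flow rate, and pump efficiency
pump_kw = pump_power_kw(dp_kpa=35.0, flow_m3_s=0.002, efficiency=0.6)  # ~0.12 kW

system_a = efficiency_ratio(cooling_kw=30.0, door_power_kw=0.4, pump_kw=pump_kw)
system_b = efficiency_ratio(cooling_kw=30.0, door_power_kw=0.5, pump_kw=pump_kw)
print(f"System A: {system_a:.0f} kW of cooling per kW of input power")
print(f"System B: {system_b:.0f} kW of cooling per kW of input power")
```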

OptiCool RDHx systems – powered by two-phase liquid cooling for data centers – are specifically engineered to minimize power consumption and maximize the cooling capacity-to-power input ratio, thereby enhancing overall energy efficiency in high-density environments.

What’s Next: Industry Standardization Efforts

Organizations such as ASHRAE and AHRI (the Air-Conditioning, Heating, and Refrigeration Institute) are actively working to update and expand standards to accommodate the rapid advancements in RDHx technologies. Rating-test standards are expected to be established soon for RDHx systems, covering both water- and refrigerant-based designs.

Until these standards are finalized, data center managers and operators must take additional steps—such as requesting detailed operating conditions and performing standardized evaluations—to ensure they select RDHx systems that meet their cooling and efficiency requirements.

As RDHx technology continues to advance, energy-efficient high-density data center cooling will play a key role in shaping next-generation standards for sustainable, low-PUE digital infrastructure.

Why are Rear Door Heat Exchangers (RDHx) the most versatile liquid cooling option for data centers?
https://opticooltechnologies.com/why-are-rear-door-heat-exchangers-rdhx-the-most-versatile-liquid-cooling-option-for-data-centers/ | Thu, 09 Oct 2025

For the past decade, traditional air cooling with CRAC units, CRAH units, and fan walls has been the industry workhorse: pump a lot of air across the room, perhaps with containment aisles, fix the hotspots, and for the most part it keeps systems in range. But today, escalating rack and heat densities are surpassing what traditional air cooling was ever designed to handle.

Consider the NVIDIA DGX H200, with 8 GPUs and a total draw of 10.2kW. Six of these can fit in a 48U rack, creating a rack that draws over 61kW. On the roadmap are GPUs that will continue driving heat densities upward—well beyond what traditional air cooling can manage. As fan speeds rise to compensate, efficiency falls and energy costs increase. NVIDIA CEO Jensen Huang has indicated that racks drawing around 200kW are already emerging, and the company’s public roadmap points to designs approaching 600kW within just a few years. This shift underscores the limits of air cooling and the growing necessity for scalable liquid cooling solutions.

Liquid Cooling Differences

When things get hot, liquid cooling is usually the answer. Thanks to its high heat capacity, liquid cooling efficiently manages workloads that air-based systems can no longer handle. Three distinct approaches exist:

  • Rear door heat exchangers (RDHx)
  • Direct-to-chip (DTC)
  • Immersion Cooling

DTC and immersion cooling systems are designed to support ultra-high heat loads, but they require specialized infrastructure and upfront investment. On the other hand, RDHx systems are designed for a wide range of heat loads – from traditional enterprise racks to AI/HPC racks up to 120kW – making them the most versatile choice.

Why RDHx is the Practical, Low-Risk Choice

Traditional air-cooling cools the entire room. RDHx cools at the rack level, providing only the cooling that each rack actually requires. Rear doors often include reserve capacity that can be used immediately, without reconfiguring the rack, interrupting workloads, or bringing in skilled labor. If additional cooling is needed, the door can simply be upgraded to a higher-capacity model.

Depending on the design, RDHx systems may return either conditioned or warm air to the room. Two-phase RDHx systems, like OptiCool’s, extract heat and expel room-neutral air, enabling them to work seamlessly with existing air-cooling systems or to operate independently. This flexibility makes it possible to add AI and HPC racks with power densities up to 120kW incrementally as demand grows, without major upfront infrastructure changes. And with AI hardware evolving so quickly, long-term adaptability matters. Unlike DTC and immersion systems, which are vendor-specific and locked to AI workloads, RDHx doors can be redeployed to standard racks when equipment ages out, continuing to reduce load on legacy air systems.

Liquid Cooling: Single-Phase vs. Two-Phase

Most liquid cooling solutions rely on water and operate in single-phase mode. OptiCool’s RDHx is different: it uses a refrigerant in a 2-phase system that leverages phase change for maximum efficiency. When refrigerant absorbs heat, it vaporizes instead of rising in temperature, unlocking enormous heat capacity at the boiling point.
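
To give a sense of the scale involved, the back-of-the-envelope sketch below compares the heat one kilogram of coolant can absorb in single-phase operation (a temperature rise only) with the latent heat absorbed when a refrigerant vaporizes. The property values, and the use of R-134a as a stand-in refrigerant, are rough, representative assumptions for illustration only; they are not OptiCool’s working-fluid specifications.

```python
# Rough per-kilogram comparison: sensible heating (single-phase) vs. phase change (two-phase).
# Property values are approximate, representative figures, not vendor data.

CP_WATER = 4.186       # kJ/(kg*K), specific heat of liquid water
CP_R134A_LIQ = 1.4     # kJ/(kg*K), approx. specific heat of liquid R-134a
H_FG_R134A = 178.0     # kJ/kg, approx. latent heat of vaporization of R-134a near 25 degC

delta_t = 10.0         # K, a typical single-phase supply/return temperature rise

water_sensible = CP_WATER * delta_t       # ~42 kJ absorbed per kg of water
r134a_sensible = CP_R134A_LIQ * delta_t   # ~14 kJ per kg if the refrigerant only warmed up
r134a_latent = H_FG_R134A                 # ~178 kJ per kg when the refrigerant boils instead

print(f"Water, single-phase ({delta_t:.0f} K rise): {water_sensible:.0f} kJ/kg")
print(f"Refrigerant, single-phase ({delta_t:.0f} K rise): {r134a_sensible:.0f} kJ/kg")
print(f"Refrigerant, two-phase (boiling at constant temperature): {r134a_latent:.0f} kJ/kg")
```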

This design:

  • Extracts heat from the room rather than simply introducing cold air, creating a more efficient thermal cycle.
  • Delivers higher capacity: unlike single-phase systems, which rely on more flow or larger equipment to handle higher densities, two-phase cooling achieves higher heat removal in a smaller footprint.
  • Uses less energy: by harnessing the latent heat of vaporization, two-phase systems require significantly less power to extract heat. OptiCool’s 60kW door, for example, operates at just 800 watts for the pump and 790 watts for the door.

OptiCool is the only vendor offering 2-phase refrigerant RDHx technology, combining the versatility of rear door cooling with the efficiency of phase change systems.

Learn more about our new 120kW RDHx — the highest-capacity rear door heat exchanger on the market.

Cooling the Gold Rush: Why the Future of Data Centers Depends on Partnership, Reliability, and Care
https://opticooltechnologies.com/cooling-the-gold-rush-why-the-future-of-data-centers-depends-on-partnership-reliability-and-care/ | Fri, 29 Aug 2025

By Bill Bentley, Dedicated Industry Collaborator – OptiCool Technologies

In today’s data center landscape, the conversation is dominated by a single theme: the race for power and cooling. Demand is surging, deployments are accelerating – and behind the rush for capacity lies a very human reality that too often gets overlooked.

Customers urgently need solutions, but they don’t always have a clear picture of what those solutions should be or how their specific facilities will evolve over the next five, ten, or even twenty years; no one does. They are turning to industry experts not just for products, but for guidance—trusting us to shepherd them through decisions that will shape the reliability and future of their critical spaces.

The Human Side of Data Center Cooling

Cooling and power aren’t just technical requirements. For a data center operator, they are lifelines. A failure in either system doesn’t only risk uptime—it risks the confidence their company places in them, individually. I’ve been part of on-call rotations and crisis war rooms. Operators face immense pressure, knowing that every equipment choice carries career-defining implications.

Yet, in too many sales processes, partnership, compassion, and empathy take a back seat to closing the deal. What’s missing is a sense of community and shared purpose—something our industry once valued deeply.

Why Reliability Must Guide Cooling Innovation

In the scramble to deliver the latest innovation, reliability often becomes a secondary concern. Faster and newer overshadow better and proven. But reliability should never be compromised.

True innovation means working together—industry leaders, customers, and manufacturers—to design systems tailored to real, specific needs. It means caring enough to create products that aren’t just impressive in a spec sheet, but dependable in real-world operation for years to come.

Principles of Data Center Cooling Done Right

At OptiCool, our goal isn’t just the next sale. We focus on helping our customers create environments that are reliable, efficient, and ready for long-term growth. That means delivering solutions that:

  • Do what they promise – no overpromising, no surprises.
  • Stand the test of time – built for reliability and operational certainty.
  • Drive efficiency – because sustainability is as much about smart engineering as it is about environmental responsibility.

These aren’t just features. They’re principles that guide every solution we design. The result is a data center that’s not only prepared for the next wave of demand, but one that operators can trust—day after day, year after year.

From Gold Rush to Community Effort

The growth in our industry has been called “the new gold rush.” If that’s true, then why not mine it together? 

Let’s move beyond isolated transactions and return to a team mentality—where success is shared, innovation is collaborative, and care for the customer’s mission is at the heart of every solution.

Because in life, it takes a community to achieve greatness. In our industry, it’s no different. Let’s make sure we build that community—one reliable, well-designed solution at a time.

Learn how OptiCool helps operators create cooling systems that are efficient, scalable, and built to last. [Link]
