The Data Center Liquid Cooling Landscape
Data centers are the backbone of our digital world. Every time we send a text message, run a Google search, stream content, use artificial intelligence or cloud computing, or email colleagues, vast amounts of data are routed through, processed by, and stored in these critical facilities.
Data centers consume significant amounts of electricity to operate 24/7, which means they also generate enormous amounts of heat. Traditionally, air cooling was sufficient to prevent equipment from overheating, but rising rack densities have dramatically increased heat loads. Consequently, facilities are shifting away from air cooling in favor of a more efficient solution: liquid cooling.
A single large data center can consume as much electricity as 50,000 homes. [1]

Across new builds and retrofits, data center liquid cooling technologies rely on coolant distribution units (CDUs) and related thermal management equipment to maintain allowable component temperatures. Cooling is therefore a mission-critical discipline in modern facilities, directly tied to uptime and operating costs.
Liquid Cooling Technologies Used in New Data Center Builds and Retrofits
Data center operators, together with their engineering and construction partners, evaluate several liquid cooling methods to identify options that align with workload densities, facility constraints, and long-term operating strategy. Most facilities adopt a single approach, though many air-cooled sites are introducing hybrid designs to capture some liquid cooling benefits.
Below are the three most common approaches to data center liquid cooling.
Direct-to-Chip Liquid Cooling
Direct-to-chip liquid cooling circulates coolant through cold plates mounted directly on CPUs and GPUs, removing heat at the source rather than through room air. This approach supports high density racks and more predictable thermal performance.
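To give a sense of scale (the numbers below are illustrative assumptions, not figures from this article), the coolant flow a direct-to-chip loop must deliver follows from the basic heat-balance relation Q = ṁ·cp·ΔT. A minimal sketch in Python, assuming a water-like coolant:

```python
# Rough coolant flow estimate for a direct-to-chip cold plate loop.
# Assumptions (illustrative only): water-like coolant with
# cp ~ 4186 J/(kg*K) and density ~ 1000 kg/m^3.

def required_flow_lpm(heat_load_kw: float, delta_t_c: float,
                      cp_j_per_kg_k: float = 4186.0,
                      density_kg_per_m3: float = 1000.0) -> float:
    """Volumetric flow (L/min) needed to absorb heat_load_kw
    with a coolant temperature rise of delta_t_c (deg C)."""
    mass_flow_kg_s = heat_load_kw * 1000.0 / (cp_j_per_kg_k * delta_t_c)
    return mass_flow_kg_s / density_kg_per_m3 * 1000.0 * 60.0

# Example: an 80 kW rack with a 10 deg C coolant temperature rise
flow = required_flow_lpm(80.0, 10.0)  # on the order of 115 L/min
```

The takeaway is that high-density racks need substantial, continuous flow through narrow channels, which is why clean, leak-tight fluid conveyance matters so much in these loops.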
Immersion Cooling
In an immersion cooling data center, servers are submerged in a dielectric fluid that absorbs heat directly from components. This liquid cooling approach is gaining traction for workloads with extreme power densities (>100 kW per tank), and brings with it additional considerations around fluid compatibility, fluid contamination control, and long-term material stability.
Rear Door Heat Exchanger (RDHx) Systems
A rear door heat exchanger replaces or augments the rear rack door with a liquid-cooled heat exchanger that captures server exhaust heat before it enters the data hall. RDHx systems are often used as a middle-ground between air-cooled and fully liquid cooled rooms, enabling meaningful rack power density gains without redesigning the entire IT infrastructure.
Flexible Stainless Steel Hose Assemblies in Data Center Cooling Systems
When it comes to fluid conveyance, flexible stainless steel hose stands out as a clean, leak-tight, and reliable solution. Because stainless steel hose assemblies require less maintenance and fewer replacements that could otherwise disrupt or jeopardize operations, they play a vital role across key data center applications.
- Coolant distribution units (CDUs): Central to each of the three approaches to liquid cooling are coolant distribution units (CDUs). The brains behind many data center cooling systems, CDUs circulate coolant (deionized water, glycol mixtures, or, in the case of immersion cooling systems, dielectric fluids) between IT equipment and a facility's primary cooling infrastructure. They regulate coolant flow and temperature to prevent overheating and maintain stable operation. Flexible connections are used between pumps, heat exchangers, and manifolds inside CDUs, where space is often constrained and vibration absorption is important. They are also used as supply and return lines connecting CDUs to the secondary cooling loop.
- Facility cooling: Metal hoses are used at key interfaces, such as chiller and pump connections, within a data centerโs facility water system to manage vibration from equipment, accommodate thermal expansion and contraction of piping, and handle tight bends.
- Armored conduit: Electrical wiring is often encased in flexible metal hose that provides a protective shield against moisture, harsh environmental conditions, and physical compression in tight or confined spaces.
- Hot aisle containment connections: In air-cooled facilities, flexible connections are relied on to support the separation of hot and cold air, often to channel air from hot aisles back to the cooling system's return air intake.
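The flow and temperature regulation a CDU performs can be pictured as a simple feedback loop. The sketch below is a hypothetical proportional controller with illustrative setpoints and gains, not a description of any vendor's control firmware:

```python
# Hypothetical CDU control sketch: raise pump speed as the
# secondary-loop supply temperature drifts above its setpoint.
# Setpoint, gain, and speed limits are illustrative assumptions.

def pump_speed_pct(supply_temp_c: float,
                   setpoint_c: float = 30.0,
                   base_speed: float = 40.0,
                   gain_pct_per_c: float = 10.0) -> float:
    """Pump speed (% of max), clamped to a 20-100% operating band."""
    error = supply_temp_c - setpoint_c          # how far above setpoint
    speed = base_speed + gain_pct_per_c * error # proportional response
    return max(20.0, min(100.0, speed))
```

Real CDUs layer redundancy, alarms, and more sophisticated control on top of this basic idea, but the sketch shows why stable, leak-tight plumbing between pumps, heat exchangers, and manifolds is a precondition for the control loop to work at all.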

Why Metal is Better in Data Center Liquid Cooling Systems
Due to their low cost and ease of installation, rubber hoses have been used in some data center applications. Over time, however, rubber hoses can become brittle, crack, and leak, especially in rooftop chiller systems exposed to heat and UV radiation. These failures introduce serious risks, including floods, equipment damage, and outages.
Through experience, operators have found that rubber hoses often cannot withstand the operating conditions of modern data centers. Metal hoses maintain ductility over time, resist UV degradation, and deliver long service life when properly installed, making them a more durable and reliable solution for mission-critical cooling systems.
How Penflex Supports CDU Manufacturers and Data Center MEPs
Data center liquid cooling equipment is built to operate continuously under strict uptime expectations, often targeting "five nines" (99.999%) availability. As a result, OEMs and design firms evaluate every connecting component through a reliability and lifecycle lens, prioritizing material compatibility and long-term stability to prevent leaks and failures.
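For context, "five nines" leaves almost no room for failure. A quick back-of-the-envelope calculation shows the annual downtime budget that availability level implies:

```python
# Annual downtime budget implied by an availability target.

def downtime_minutes_per_year(availability: float) -> float:
    """Minutes of downtime per year allowed at the given
    availability (e.g. 0.99999 for five nines)."""
    minutes_per_year = 365.25 * 24 * 60  # ~525,960 minutes
    return (1.0 - availability) * minutes_per_year

# Five nines (99.999%) allows roughly 5.26 minutes per year
budget = downtime_minutes_per_year(0.99999)
```

A single hose leak that forces even a partial shutdown can consume years' worth of that budget, which is why connecting components are scrutinized so closely.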
While metal hoses are a relatively recent addition to data centers, Penflex brings over a century of expertise in fluid conveyance and metal forming technology to the table. We understand the critical features that make metal hoses perform, and excel, where it matters most.
Clean ID technology: Modern liquid cooling hardware uses narrow flow channels and precision cold plates, which makes coolant cleanliness a critical performance factor. The industry increasingly expects components to be clean, minimizing particulate shedding, residue, or trapped contaminants that could foul microchannels or degrade heat transfer efficiency. Penflex's Clean ID hose is manufactured without internal tooling, lubricants, or forming fluid, resulting in a final product that contains no traces of grease, moisture, oils, or particulate that's left behind by standard manufacturing processes.
Superior cleanliness: In addition to our Clean ID options, we can accommodate customer-specified cleaning procedures, including oxygen cleaning services, upon request. To maintain cleanliness, all assemblies are individually bagged after preparation.
Sanitary fittings: Penflex maintains a deep network of sanitary fitting suppliers to support hygienic connections commonly specified in data center liquid cooling systems.
Corrosion resistance: All hoses are prone to deterioration if they're continuously exposed to harsh substances and intense heat or pressure, but our austenitic stainless steel hoses have superior corrosion resistance. Their compatibility with glycol-based refrigerants and other cooling fluids significantly extends their lifespan. Material compatibility is not simply a preference; it is a baseline requirement to prevent leaks, flow restriction, or contamination that can cascade into performance loss.
Quality welds: High-quality welds are essential to hose integrity and long-term performance, especially in data centers. At Penflex, we achieve this through a proven process that includes TIG welding and argon purging, which together produce clean, strong welds without compromising corrosion resistance. Our team includes ASME Section IX-certified welders, on-site certified welding instructors, and non-destructive examination (NDE) specialists, ensuring every assembly is tested and leak-tight before it leaves our facility.
Broad size range and configuration capability: Assemblies are available across a wide diameter range, from ¼" to 24", and can be configured to fit common OEM architectures and facility interface needs.
Increased flexibility: Penflex's metal hoses are flexible, simplifying installation in even the most confined or hard-to-reach spaces. Each data center liquid cooling hose can be easily adjusted to fit various heights and orientations without compromising its performance or durability.
Recommended Products for Data Center Cooling Applications
Clean ID
Removing the need for post-production cleaning processes, Clean ID represents the latest innovation in hose forming technology.
P5 Series
Large diameter hose options for commercial HVAC applications.
P3 Series
Lightweight, flexible annular hose ideal for tight configurations.
Beyond the Product: The Penflex Partnership
At Penflex, we see every order as an opportunity to continue building our relationship with a current customer, or the beginning of a partnership with a new customer. Our customers count on us not only for a quality product, but also for expertise, responsive service, and resources to help them get the most out of their investment.
Dedicated Engineering Support
Whether you’re leveraging direct-to-chip liquid cooling, immersion cooling, or rear door heat exchanger (RDHx) technologies, our sales engineers work closely with you from project prototype through delivery, helping define requirements for new products and replacements. A dedicated account manager means you have a single point of contact who understands your business and responds quickly with thoughtful solutions.
Practical Technical Resources
From online calculators and engineering bulletins to in-person and virtual training, we give your team the tools to make informed decisions and the confidence to install, operate, and maintain hoses for maximum longevity.
Full Quality Documentation
On request, we provide complete record packs, including material test reports, leak and pressure test results, and cleaning procedure documentation, so you can meet compliance requirements and maintain full traceability.
Our commitment to service is reflected in our 99% customer retention rate and an average customer relationship of 20 years. We aim to make every interaction count, combining technical know-how with the kind of service that turns first-time buyers into long-time partners.
Footnotes
[1] MIT Energy Initiative, Massachusetts Institute of Technology. The multi-faceted challenge of powering AI. Retrieved November 17, 2025 from https://energy.mit.edu/news/the-multi-faceted-challenge-of-powering-ai/.