Data is defined as facts or statistics collected together for reference or analysis, and there is far more of it in the world today than there was even five or 10 years ago. Trends such as IoT, smart technology, and cloud computing are driving rapid growth in the data center cooling market. In fact, according to a report from Grand View Research Inc., the global data center cooling market was valued at $6.10 billion in 2015 and is expected to reach $17.78 billion by 2024, a compound annual growth rate exceeding 11 percent from 2016 to 2024.
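A quick arithmetic check of those two endpoints, assuming nine compounding periods (2016 through 2024), bears out the report’s figure:

```python
# Back-of-the-envelope check of the Grand View Research figures (USD billions).
start, end = 6.10, 17.78        # 2015 market size and 2024 forecast
periods = 9                     # compounding years, 2016 through 2024
cagr = (end / start) ** (1 / periods) - 1
print(f"CAGR = {cagr:.1%}")     # ~12.6%, i.e., "exceeding 11 percent"
```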


GREATER EFFICIENCY

According to Eddie Rodriguez, strategic marketing manager, Danfoss Turbocor Compressors, the data center cooling market is growing due to increased use of electronic devices, such as smartphones and tablets, and cloud-based services that require more data storage.

“We are seeing a trend towards using cooling technologies that are more efficient and use environmentally friendly refrigerants that limit the impact on global warming,” Rodriguez said. “We are also seeing increased usage of air-cooled chillers compared to water-cooled chillers for data center cooling. Air-cooled chillers offer advantages such as reduced maintenance and lower system installation costs compared to water-cooled chillers.”

According to Rodriguez, higher electrical rates and government legislation that is phasing out the use of refrigerants with high global warming potential are driving the market.

“Increased costs and shortages of water have driven many data centers to use air-cooled chillers instead of water-cooled,” he added.

Steve Madara, vice president, Global Thermal Sales, GM/Executive Management, Vertiv Co., agreed that the data center cooling market is showing solid growth, saying it is being driven by a proliferation of big data, IoT, AI, cloud, and edge computing activity.

“Larger cooling systems — from large indirect evaporative air handlers to large pumped refrigerant split and packaged systems — are showing high growth, as colocation and cloud hosting providers build new data centers to support greater capacity and lower latency requirements,” he said. “At the same time, equipment and applications remaining in enterprise data centers often have higher levels of criticality, and these facilities are being upgraded for higher cooling capacities and densities. We expect demand for thermal management offerings to continue, spurred by the growth in big data, IoT, and machine learning that will increase the amount of data used by cloud, on-premise, and edge applications.”

There are more cooling options than ever before, which help companies save money by operating more efficiently and managing their thermal environments more effectively, Madara explained.

“Data center managers are constantly challenged to find new approaches to physical infrastructure as they add capacity to their data centers or other IT spaces to drive down operating and capital costs,” he said. “Traditional raised-floor designs using perimeter cooling have given way to non-raised-floor data centers or data centers using package cooling systems external to the buildings. Beyond chilled water, companies are using economization solutions such as indirect evaporative and pumped refrigerant cooling. Today’s customers demand modular systems that can be easily scaled and fit easily into existing physical infrastructure and software management systems.”

While efficiency is still a must-have for new cooling offerings, customers are also becoming aware of water-related costs as they calculate the costs of cooling options, Madara added.

“Beyond the increase in demand, the desire to drive down energy costs remains a concern for all companies, whether their facilities are on-premise or outsourced,” he said. “Colocation and cloud hosting providers are especially focused on using cooling solutions that lower peak power to reduce their generator and electrical infrastructure costs. Colocation and cloud hosting providers are showing preference for working with vendor partners that can speed delivery and deployment to achieve faster time to market in this very competitive business.”

Arthur Lu, president, Durkee America Inc., said data center cooling is expanding in different directions. Additionally, new technologies make semiconductor chips more powerful while consuming less energy.

“Data centers are upgrading server equipment not only to expand speed and capacity but also to increase energy efficiency,” Lu said. “Another trend we see is companies making data centers larger, greener, and more energy efficient. One example is siting data centers in colder climates; Facebook and Google, for instance, have each located data centers in northern Europe. Colder climates allow free cooling from a nearby water source or cold outside air for equipment cooling.”

Energy is a large expense in running a data center, and cooling can account for a large portion of that consumption, Lu noted.

Michael Zarrilli, data center solutions (DCS) lead, Building Technologies & Solutions, Johnson Controls Inc., agreed, saying cooling can be up to 40 percent of a data center’s operating expenses.

“It’s critical that data centers utilize the most efficient cooling methods available to drive down PUE [power usage effectiveness],” Zarrilli said. “As a result, we are seeing a shift from traditional cooling methods to free-cooling methods, such as direct evaporative cooling (DEC) and air-cooled chillers with free-cooling.

“However, most free-cooling methods heavily rely on water usage, which is increasingly becoming problematic due to geographic water constraints,” he continued. “As a result, we expect WUE [water usage effectiveness] to increase in importance when data centers think about their cooling needs.”
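Both metrics reduce to simple ratios: PUE is total facility energy divided by IT equipment energy (1.0 is the theoretical ideal), while WUE is annual site water usage divided by IT equipment energy, typically expressed in liters per kWh. A minimal sketch of the arithmetic, using hypothetical annual figures:

```python
# PUE and WUE calculations -- all annual figures below are hypothetical.
it_energy_kwh = 10_000_000       # servers, storage, and network gear
cooling_energy_kwh = 4_500_000   # chillers, fans, and pumps
other_energy_kwh = 500_000       # lighting, UPS losses, etc.

total_energy_kwh = it_energy_kwh + cooling_energy_kwh + other_energy_kwh
pue = total_energy_kwh / it_energy_kwh     # 1.5 here; closer to 1.0 is better

water_liters = 20_000_000        # annual draw of an evaporative cooling plant
wue = water_liters / it_energy_kwh         # liters per kWh of IT energy

print(f"PUE = {pue:.2f}, WUE = {wue:.2f} L/kWh")  # PUE = 1.50, WUE = 2.00 L/kWh
```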


LIQUID COOLING

There are multiple types of equipment in the data center infrastructure using water to cool the data center environment, noted Mat Hery, product manager for mission-critical cooling at Nortek Global HVAC.

“We have water chillers, cooling towers, chilled-water CRAHs or CRACs, adiabatic coolers, in-row coolers, and rear-door heat exchangers — all these products use pumped water [often with additives] to cool the servers or other infrastructure equipment. Water is a fantastic fluid because it’s abundant [right now], it’s almost incompressible, and it has great sensible thermal properties.”

Hery said he is seeing a trend of direct liquid cooling in the industry.

“Direct liquid cooling generally involves a relatively short ROI [return on investment] and a reduced total cost of ownership compared to traditional ways of cooling a data center; this, of course, depends on the workloads and density of the data center and the existing infrastructure. It’s nearly silent, as opposed to air cooling, which requires loud centrifugal fans, making the data center a more comfortable environment to work in. It’s the most effective way to cool a data center, because it removes multiple intermediate mediums and targets the heat source directly, supporting densities of hundreds of kW in a 42U rack. And it’s the most efficient way to cool a data center.”


CREATIVE SOLUTIONS

According to Daniel Jones, president, UV Resources, data centers are looking for increasingly affordable and creative ways to deliver the environment needed to maintain server equipment at optimum temperatures.

“A single data center can consume more power than a medium-sized town,” he said. “UV-C’s ability to restore and maintain cooling capacity and airflow can reduce HVAC energy use by up to 35 percent, offering an untapped savings opportunity for managers of everything from small server rooms to large data centers. While upper-limit set point temperatures have increased over the years, many large data centers are in geographically undesirable locations that do not allow for ‘free’ cooling, i.e., mechanical cooling will be necessary. Maintaining mechanical cooling systems becomes critical; therefore, performance accessories, such as UV-C fixtures, while a relatively small cost, are an integral part of ensuring continuous cooling performance.”

The need for data centers will continue to grow as technology becomes more and more accessible throughout the world, Jones noted.

“However, data centers are not thought of as revenue generators for large corporations,” he said. “This fact drives data center developers to geographical areas where the cost per square foot is as low as possible, creating the demand for more and more mechanical cooling options.”

Joe Reele, vice president, Data Center Solution Architects, Schneider Electric, said the market is starting to see a shift in the types of cooling technology being deployed and utilized.

“IT technology and environmental regulations are big drivers, but don’t underestimate the drive for data center owners to minimize cost while speeding up delivery — all while maintaining the agreed-upon risk profile,” he said. “With all that as a high-level backdrop, we are seeing some cooling trends developing.

“Typically, in the past, we’ve seen a lot of traditional cooling systems [centrifugal chillers/cooling towers], and they work well, but the trend we’re starting to see take a little more foothold is some form of economization in the base design,” Reele continued.

With that comes different types of technology for data center cooling; indirect air economization, for example, is a pretty innovative technology that is becoming more readily accepted, he noted.

“You’re not sucking in raw outside air and pumping that air filled with contaminants directly into the data center,” Reele said. “In addition, we continue to see owners shy away from the more traditional approach of ‘perimeter-based cooling’… meaning CRAC units installed within the white space envelope. Why take up revenue-generating space or critical business production space with cooling units? Owners are looking at ways to maximize the white space and having cooling equipment installed where a rack could go. Furthermore, as the density of the rack is starting to increase, the need for ‘point of use’ cooling systems is gaining more traction as well.”

In-row cooling that is maximized for space savings and performance is one way to achieve a more efficient cooling method, according to Reele. This type of system focuses on the most important equipment to be cooled — the IT equipment — and is located directly next to the equipment, minimizing losses and bad airflow characteristics that typically plague a perimeter-based approach.

“As technology changes, new and more efficient ways to cool become paramount,” Reele continued. “Environmental regulations also play a role in the innovation cycle. We are starting to see more jurisdictions take much tighter control of refrigerant-based systems, driving the market to develop something other than a refrigerant-based system.”


CHALLENGES

Because data centers require cooling 24 hours a day, seven days a week, 365 days a year, and can be installed in regions with a wide range of climate conditions, they represent a challenge for the cooling industry.

“It’s a big challenge to provide a product that can operate efficiently at all conditions, year-round,” said Rodriguez. “Data centers are very critical applications that cannot afford any downtime because of reliability or maintenance issues. This represents a challenge for manufacturers to provide not only a highly efficient product but also one that operates reliably with minimal maintenance. Danfoss has developed compressors for both air- and water-cooled chillers that can be used for year-round data center cooling. Danfoss Turbocor compressors feature oil-free, magnetic bearing technology that reduces maintenance and minimizes the risk of downtime due to issues with complex oil management systems.”

According to Lu, the challenges data centers face today include maintaining uniform cooling and energy efficiency throughout the entire high-density rack system.

“It’s important to avoid hot and cold air mixing, which can reduce cooling efficiency,” he said. “Another challenge is scalability. The cooling system needs to be flexible and accommodate future growth and expansion. Using a fabric air dispersion system can facilitate cold-aisle and hot-aisle air distribution by delivering cool air more evenly and directly to the entire surface of equipment racks’ cold aisle air intake. This effectively reduces the risk of uneven cooling in the high-density rack system. And a pre-engineered, modular cooling system is preferred to accommodate scalability and flexibility demand.”

Extracting heat from data centers requires equipment to continuously operate at its original design capacity. However, as that equipment ages, its ability to maintain desired temperatures and humidity levels declines, Jones noted.

“Most often, the culprit is reduced coil heat-transfer effectiveness, or the ability of the air handling units’ (AHU) cooling coils to remove heat from the air,” he said. “These inefficient heat transfer rates derive primarily from the buildup of organic contaminants on, and through, the coil’s fin areas.

“UV-C eliminates microbes on cooling coils to significantly improve airflow and heat-exchange efficiency levels while reducing HVAC energy use up to 25 percent,” Jones continued. “The operating cost for a UV-C system that is on year-round is far less than 1 percent of the power to operate that air conditioning system. That amount is a noteworthy bargain in systems that return 5 percent or more of their capacity.”
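The bargain Jones describes is easy to sanity-check. The sketch below is purely illustrative (all values are hypothetical) and treats the restored 5 percent of capacity as energy the air conditioning system no longer has to make up:

```python
# Rough cost/benefit check of the UV-C figures -- all numbers hypothetical.
system_power_kw = 100.0      # air conditioning system electrical draw
uvc_power_kw = 0.6           # UV-C fixtures: well under 1% of system power
capacity_restored = 0.05     # 5% or more of cooling capacity recovered

hours_per_year = 8760        # UV-C lamps run year-round
uvc_energy_kwh = uvc_power_kw * hours_per_year
# Approximate the recovered capacity as energy the system avoids expending:
avoided_kwh = system_power_kw * capacity_restored * hours_per_year

print(uvc_energy_kwh, avoided_kwh)   # ~5,256 kWh spent vs. ~43,800 kWh avoided
```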

One of the biggest challenges data centers face is when the cooling system is already installed, Reele explained.

“Once cooling is installed in a data center or building, it is what it is,” he said. “What I mean by that is electricity is electricity. If I have a megawatt of distribution, and all of a sudden the IT kit changes in three years and I go through a refresh, I may have to run a couple of new distribution circuits, but at the end of the day, electricity is there. If you put in a chilled water plant, and the next thing you know, the IT technology is pushing you toward liquid cooling, that’s a pretty significant change. So cooling is not nearly as flexible over the long haul as electricity or other aspects of a data center, and that’s a challenge.”

Madara agreed, saying the biggest challenge lies in keeping up with the pace of change in capacity management and controlling costs.

“Colocation providers, by their nature, often have unpredictable capacity demands, while enterprise data center managers are continually working through how to achieve the right mix of capacities through on-premise and public and private cloud data centers,” he said. “These factors have big impacts on the types of cooling equipment used, with customers demanding highly modular and scalable systems that can be deployed in multiple form factors.

“Related to this dynamic environment is a growing focus among customers on thermal controls, both in their complexity and their ease of integration to existing building management systems and data center infrastructure management systems,” he continued. “Within this context, the set points established by commissioning a cooling system today will be obsolete not far into the future, so controls with advanced algorithms are needed to continually adjust thermal equipment operation to meet data center targets for efficiency and protection.”

As time goes on, cooling system flexibility becomes more important, Madara noted.

“While one facility might be perfect for traditional split-system, perimeter cooling, another might require a fully integrated outdoor packaged system,” he said. “Also, as the cost of real estate in areas with high concentrations of data centers increases, we’re seeing more and more multistory facilities. These buildings are often candidates for rooftop cooling systems. A surprising number of data centers still haven’t utilized aisle or cabinet containment to provide a more stable thermal environment and increase capacity and efficiency. Uncontained aisles allow hot and cold air to mix, which lowers the temperature of return air to the existing cooling units, thus reducing their efficiency. Aisle containment increases return air temperatures and reduces airflow requirements, both increasing the cooling equipment efficiency and capacity. This can be done with perimeter cooling, raised-floor cooling, or with row-based cooling that puts heat removal closer to the source.”
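The airflow arithmetic behind Madara’s containment point is straightforward. For standard air, sensible heat removal follows Q ≈ 1.08 × CFM × ΔT (Q in Btu/h, ΔT in °F), so raising the return-air temperature widens the ΔT across the cooling coil and directly cuts the airflow the units must move. A short sketch with hypothetical numbers:

```python
# Sensible-heat airflow check -- numbers are hypothetical, for illustration.
# Standard-air approximation: Q [Btu/h] ~= 1.08 * CFM * delta_T [deg F]

def required_cfm(heat_load_btuh: float, delta_t_f: float) -> float:
    """Airflow needed to remove a sensible heat load at a given coil delta-T."""
    return heat_load_btuh / (1.08 * delta_t_f)

heat_load = 341_200   # ~100 kW of IT load in Btu/h (1 kW ~= 3,412 Btu/h)

# Uncontained aisles: mixing leaves return air only ~10 deg F above supply.
# Containment separates the streams, raising the difference to ~20 deg F.
print(round(required_cfm(heat_load, 10)))   # ~31,593 CFM without containment
print(round(required_cfm(heat_load, 20)))   # ~15,796 CFM -- half the airflow
```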

Publication date: 8/13/2018
