FIGURE 1. Heat densities of data center equipment are projected to continue rising throughout this decade. (Chart reprinted with permission of The Uptime Institute from a white paper titled “Heat Density Trends in Data Processing, Computer Systems, and Telecommunications Equipment,” Version 1.0.)

Most businesses now rely heavily on information technology (IT) equipment, which is typically housed and operated in dedicated rooms. That equipment generates a great deal of heat, yet it is itself highly sensitive to heat - a problem that persists even in fall and winter, when the rest of the building requires heating.

“Over the past 30 years, the IT industry has seen a geometric decrease in the floor space required to achieve a constant quantity of computing and storage capability,” states the white paper “Heat Density Trends in Data Processing, Computer Systems, and Telecommunications Equipment,” published by The Uptime Institute*. The efficiency of the equipment hasn’t improved at the same rate, however; it consumes more power and kicks out more heat in a smaller space. Smaller equipment, in other words, does not mean less heat.

And more users are realizing that these smaller units can be stacked. According to the institute, “The resulting power consumption and heat densities that can be created are almost unimaginable, especially to the layperson.”

Businesses of all sizes are coming to depend increasingly on computers and associated electronics equipment for a wide range of activities: general operations, accounting, Internet transactions, e-mail, phones, hotel pay-per-view and satellite TV systems, and more, pointed out HVAC manufacturer MovinCool in a white paper, “How to Avoid IT Equipment Overheating in Winter.”

“As a result, businesses are using many more pieces of electronics equipment than ever before, which they are often storing in server or telecom closets.”

Keeping heat-sensitive IT equipment in server or telecom rooms cool is a challenge many companies face - even during cold-weather months, when most of the rest of the building is heated. “As businesses of all types have come to rely more and more on electronics equipment, such as servers and telecom switches, this problem has become a critical one,” wrote MovinCool.

The contractor who undertakes this type of work needs to be prepared for quick response during emergencies. One contractor, Commercial Express HVAC in Chantilly, Va., has built this into a lucrative niche - with a heavy emphasis on fast service at any hour.

“It’s all about performance,” said company president Jim Whitescarver. “For our particular niche, they’re not so much focused on energy as they are on performance and reliability.”

Success depends on a balance of performance, maintenance, and the right contractor, he said. That includes redundancy to keep IT systems up and running despite occasional cooling breakdowns. “One customer requires a one-hour response time from us,” Whitescarver said. “They have a lot of mission-critical contracts involving computer room facilities.” These high-end, behind-the-scenes, secure rooms demand 24/7 service.

“We just finished up a Department of State job here with seven Liebert systems,” Whitescarver continued. “We’re not only installing them, we’re also designing the room.” In that process “you look at capacity first, and you look at redundancy. If you have a 50-ton load, you don’t want a 50-ton unit.”
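A minimal Python sketch of the sizing logic behind that comment appears below: split the load across several units and add at least one spare so the room keeps running if a unit fails or is taken down for service (an N+1 arrangement). The 20-ton unit size and the single spare are assumptions chosen for illustration, not details of the contractor’s actual design.

```python
import math

def size_units(load_tons: float, unit_tons: float, spare_units: int = 1) -> int:
    """Return how many cooling units of a given size cover the load,
    plus the requested number of redundant (spare) units (N+1 by default)."""
    needed = math.ceil(load_tons / unit_tons)  # units required just to carry the load
    return needed + spare_units                # add redundancy so one unit can fail or be serviced

# Hypothetical example: a 50-ton load served by 20-ton units with N+1 redundancy
print(size_units(50, 20))  # -> 4 units (3 to carry the load, 1 spare)
```

Splitting the load this way also means the remaining units can carry the room while any one of them is offline for maintenance.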

Contractors entering the potentially lucrative area of IT cooling design need to weigh several distinct concerns, and their customers take reliability very seriously.



SMALLER AND HOTTER

“Increasingly, electronics equipment is housed in a converted closet,” MovinCool pointed out. “The equipment is susceptible to malfunctioning or damage due to overheating, so keeping it cool is essential.” Like any dedicated space, these special-purpose rooms work best when they are included in the building’s initial design, so their cooling requirements can be factored in up front. “Yet for one reason or another, this is often overlooked,” the company said.

Newer, higher-performing computers pack more transistors operating at higher speeds into their processors, driving up processor power consumption. “For example, Intel’s next 64-bit processor is expected to consume somewhere between 140 and 180 watts of power,” explained Liebert.

“Compare that with the Pentium 4, which consumes about 83 watts at full power. Then consider that the Pentium 4 represented a 319 percent increase in power consumption over its predecessor, which had a maximum power consumption of 26 watts.”

These advances, in turn, drive increases in power consumption by servers and switches. “And remember that in electronics, all power is transformed into heat - heat that must be removed from the data center,” Liebert said.
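Because essentially every watt the equipment draws reappears as heat, the cooling burden can be estimated directly from the electrical load. The following sketch uses the standard conversions of about 3.412 BTU/hr per watt and 12,000 BTU/hr per ton of refrigeration; the 4-kW rack is a hypothetical example chosen only because it sits near the density threshold discussed further on.

```python
BTU_PER_WATT_HOUR = 3.412      # 1 W of electrical power is about 3.412 BTU/hr of heat
BTU_PER_TON = 12_000           # 1 ton of refrigeration = 12,000 BTU/hr

def heat_load(watts: float) -> tuple[float, float]:
    """Convert an electrical load in watts to BTU/hr and tons of cooling."""
    btu_hr = watts * BTU_PER_WATT_HOUR
    tons = btu_hr / BTU_PER_TON
    return btu_hr, tons

# Hypothetical 4 kW rack
btu_hr, tons = heat_load(4_000)
print(f"{btu_hr:,.0f} BTU/hr is about {tons:.2f} tons of cooling")  # ~13,648 BTU/hr, ~1.14 tons
```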

Even though computers and servers have gotten smaller, they now “kick out more heat in smaller spaces,” stated MovinCool. “This high-density cooling requirement presents interesting challenges for mechanical contractors.”

“The history of the computer industry is the story of the quest to pack more processing power into a smaller footprint,” said Liebert. “One misconception that has resulted from this trend is the idea that power consumption and heat generation are shrinking along with equipment size. This is not the case.”


Traditional approaches to data center cooling, such as underfloor cooling, “have consistently been able to scale to meet these increases,” Liebert said. “However, with their recent advances, computer system manufacturers have achieved densities that are testing the limits of traditional cooling systems.”

“Unfortunately, very few computer rooms or technology spaces have sufficient useable cooling and/or airflow capacity to handle the projected heat loads and heat density,” said The Uptime Institute.

“In the small to midsized rooms, we do run into capacity problems,” said Whitescarver. “Often you’ve got an air distribution problem too - air conditioning in the wrong areas, but not necessarily capacity problems.” These are retrofit jobs for the fast-response contractor.

Rack densities greater than 3-4 kW are now becoming common, said Liebert. “That appears to be the threshold at which current approaches to cooling reach their practical limits. System manufacturers are expected to drive power densities even higher in the years ahead.” The Uptime Institute predicts that power densities will continue rising throughout this decade.

“While many existing technology spaces and data centers are likely to be able to provide sufficient electrical power, most will struggle or may not be able to provide sufficient air circulation and air-cooling capacity if large numbers of future high-performance IT products are installed,” said Liebert. “As the projected trends occur over the next three to six years, air from under the floor by itself will not be sufficient to remove the heat being generated.”



FIGURE 2. A computational fluid dynamics model (side view) showing air temperatures for high-density racks cooled by traditional underfloor cooling in the Liebert XD test lab. Red in the image indicates areas where air is above 93°F.

COOLING ELECTRONICS

“At the start of the computer era, when mainframe computers were the rule, only the largest companies could afford them,” said MovinCool. “Mainframes, which generated very large amounts of heat, were housed in their own large rooms. They were cooled by dedicated air conditioning systems, which were supplied by the computer manufacturer as part of a total equipment package.”

Today’s server technology still requires dedicated server rooms cooled by specialized precision-cooling systems.

However, “New servers and communication switches generate as much as 10 times the heat per square foot as systems manufactured just 10 years ago,” pointed out Liebert. “As these new systems are installed alongside previous-generation systems, they create hot zones within the data center that cannot be effectively managed using traditional approaches to cooling. New strategies and technologies must be implemented to provide the cooling high-density systems require for reliable operation.”

Sometimes the threat of electronics overheating is not immediately apparent. “To prevent an emergency from occurring later on, however, it is essential to understand the dangers involved, so that the proper precautions can be taken before it becomes too late,” said MovinCool. “Although there may be no outward signs of overheating and the equipment may not fail immediately, even moderately excessive heat can shorten its life cycle.”

In addition, if the air conditioning system is turned down according to occupancy schedules, on weekends or on holidays, temperatures in equipment rooms can quickly soar. “If the equipment runs 24 hours a day, seven days a week, as is often the case, it can easily be affected,” said MovinCool.

Keep in mind that the electronics equipment itself generates heat, and in a small server closet, temperatures can rise quickly. “Unless adequate air conditioning is provided, there is a high risk of heat-caused equipment failure and possible costly damage. For many businesses, system downtime can be even costlier.”
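To see how quickly a small, unventilated closet can heat up, the rough estimate below treats the room air as the only thermal mass, using air’s volumetric heat capacity of roughly 0.018 BTU per cubic foot per degree F. The 2-kW load and 500-cubic-foot closet are assumed values, and because the calculation ignores the thermal mass of walls and equipment it overstates the speed of the rise; it is meant only to show the order of magnitude.

```python
BTU_PER_WATT_HOUR = 3.412   # 1 W of electrical load is about 3.412 BTU/hr of heat
AIR_HEAT_CAPACITY = 0.018   # BTU per cubic foot per deg F (approximate, standard air)

def temp_rise_rate_f_per_min(load_watts: float, room_volume_ft3: float) -> float:
    """Rough rate of temperature rise (deg F per minute) in a sealed room,
    counting only the heat capacity of the air itself."""
    btu_per_min = load_watts * BTU_PER_WATT_HOUR / 60.0
    return btu_per_min / (AIR_HEAT_CAPACITY * room_volume_ft3)

# Hypothetical 2 kW of equipment in a 500-cubic-foot closet with cooling off
print(f"{temp_rise_rate_f_per_min(2_000, 500):.1f} deg F per minute")  # about 12.6 deg F/min
```

In practice the walls and hardware absorb much of that heat, but the example shows why an unconditioned closet can go from comfortable to critical in short order.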

When servers start to overheat, they usually shut down in time to prevent damage. Network routers, however, are more susceptible to heat. “These handle a company’s internal and external data transmissions, including e-mail and IP-telephone communications,” explained MovinCool. “If they overheat, they can sustain permanent damage and need to be replaced, often at considerable cost.”



SOLUTIONS, PITFALLS

Finding the right cooling solution depends on determining what specifically needs to be addressed. This requires some coordination between the HVAC contractor and the IT department.

High-density system heat-control strategies have included increasing equipment spacing, adding more precision air conditioning units, and attempting to increase airflow through perforated floor tiles. “Each of these approaches may provide some improvement, but none represent a true solution to the problem and several actually increase data center operating costs,” said Liebert.

When airflow in raised-floor sites is measured, the company said, it typically averages 250 cfm or less through each perforated floor tile. “Spacing the equipment to allow existing airflow to dissipate a 16-kW rack would require aisle widths of more than 16 feet - hardly a practical solution considering the cost of data center space.

“Adding more room-level precision air conditioners may not prove to be practical or effective either,” the company continued. “Space in the controlled environment is often scarce and expensive. In addition, room-level precision air conditioners may not provide cooling where it is most needed: around new high-density systems. Optimizing the placement of room-level air conditioners to handle the hot spots results in all other equipment being cooled less efficiently, increasing costs.” Individual space needs must be evaluated.

“The amount of air that can be pushed through a raised floor is also a limiting factor,” the company said. “It may not be possible to push enough air through the floor tiles to handle increasing cooling requirements, regardless of how many precision air conditioners are installed.” Doubling the floor height can increase capacity by as much as 50 percent, but this solution requires removing all existing equipment while the new floor is installed.
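The airflow ceiling can be put in numbers with the common sensible-heat relation for standard air, q (BTU/hr) ≈ 1.08 x cfm x ΔT (°F). The sketch below applies it to a hypothetical 16-kW rack with an assumed 20°F rise across the equipment; both figures are illustrative, with the 250-cfm-per-tile value taken from the passage above.

```python
BTU_PER_WATT_HOUR = 3.412   # 1 W is about 3.412 BTU/hr

def cfm_required(load_kw: float, delta_t_f: float = 20.0) -> float:
    """Airflow (cfm) needed to remove a sensible heat load, using the
    standard-air relation q [BTU/hr] ~= 1.08 * cfm * dT [deg F]."""
    btu_hr = load_kw * 1_000 * BTU_PER_WATT_HOUR
    return btu_hr / (1.08 * delta_t_f)

# Hypothetical 16 kW rack with an assumed 20 deg F supply-to-return rise
cfm = cfm_required(16)
tiles = cfm / 250   # perforated tiles delivering ~250 cfm each, per the article
print(f"{cfm:,.0f} cfm, or about {tiles:.0f} perforated tiles")  # ~2,527 cfm, ~10 tiles
```

At ten or so tiles’ worth of air for a single rack, it is easy to see why spacing equipment far enough apart to rely on existing underfloor airflow quickly becomes impractical.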



SPECIAL SOLUTIONS

As if these demands weren’t enough, contractors need to factor in providing cooling while the rest of the building needs heating, as well as meeting the IT room’s humidity requirements. Providing adequate cooling when the weather turns cool adds one more level of complication, because the building’s HVAC system switches into heating mode. The IT room’s cooling system, then, needs to be dedicated to that space.

“Electronics equipment usually requires a cooler ambient temperature than the human occupants of a building,” said MovinCool. “As a result, even during the summer months when the building’s main air conditioning system is on, electronics equipment may not receive adequate cooling to ensure long-term reliability.”

Precision-cooling systems have been the traditional solution. In server closets, MovinCool suggested that new, self-contained, ceiling-mounted industrial spot air conditioners offer an alternative.

The heat load from electronics equipment also tends to be drier - more sensible and less latent - than that of a general-purpose space. Consequently, precision air conditioners are designed to deliver a higher sensible heat ratio than comfort cooling systems. “That means they spend less energy removing moisture from the air and more energy removing heat,” explained Liebert. “In fact, these systems are typically equipped with a humidification device that allows them to add moisture to keep the environment at the 45-50 percent relative humidity range that is optimum for electronics.”
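The sensible heat ratio (SHR) the company refers to is simply the sensible portion of the load divided by the total (sensible plus latent) load. The sketch below compares two hypothetical loads; the specific BTU figures are assumptions chosen to show why an IT room, with almost no moisture load, favors a high-SHR precision unit over a comfort-cooling unit.

```python
def sensible_heat_ratio(sensible_btu_hr: float, latent_btu_hr: float) -> float:
    """SHR = sensible load / total load. A higher SHR means more of the unit's
    capacity goes to removing heat rather than moisture."""
    total = sensible_btu_hr + latent_btu_hr
    return sensible_btu_hr / total

# Hypothetical loads: an IT room is almost entirely sensible load,
# while an occupied office carries a significant latent (moisture) load
print(sensible_heat_ratio(sensible_btu_hr=34_000, latent_btu_hr=2_000))   # ~0.94
print(sensible_heat_ratio(sensible_btu_hr=26_000, latent_btu_hr=10_000))  # ~0.72
```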

According to Liebert, “Precision air systems can continue to meet the humidity control and air filtration requirements of the room, which change little based on equipment density. But they are not able to meet the increased demand for sensible cooling that is being created by high-density equipment. As a result, hot spots are being created that cannot be easily reached by raised-floor systems.”

One way to address this, the company said, is to arrange equipment racks to support alternating hot and cold aisles. “This can increase the efficiency of raised-floor cooling and should be adopted as a first step in addressing heat-density issues. In this approach, equipment racks are placed face to face so the cooling air being pushed into the cold aisle is drawn into the face of each rack and exhausted out the back onto the adjacent hot aisle.”

This raises the temperature of air drawn into the precision air conditioners, “enabling them to operate closer to their capacity,” the company said. “Blanking panels can be used to fill open spaces in racks to keep hot air from being drawn back through the rack.”
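The effect of better hot/cold separation on return-air temperature can be approximated with a simple mixing model: air that bypasses the racks returns at supply temperature, while air that passes through the racks returns warmer by the rack temperature rise. The supply temperature, rack rise, and bypass fractions below are assumed values for illustration.

```python
def return_air_temp_f(supply_f: float, rack_rise_f: float, bypass_fraction: float) -> float:
    """Return-air temperature at the cooling unit when a fraction of the cold
    supply air bypasses the racks and mixes with the hot exhaust."""
    return supply_f + (1.0 - bypass_fraction) * rack_rise_f

# Hypothetical: 65 deg F supply air, 25 deg F rise through the racks
print(return_air_temp_f(65, 25, bypass_fraction=0.4))  # poor separation -> 80.0 deg F return
print(return_air_temp_f(65, 25, bypass_fraction=0.1))  # aisles + blanking panels -> 87.5 deg F
```

Warmer return air gives the cooling coil a larger temperature difference to work with, which is why the arrangement lets the units operate closer to their rated capacity.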



CONCLUSIONS

Our sources agree that power densities have exceeded the capacities of traditional approaches to cooling, and that densities will climb even higher. According to Liebert, “Over the next five years, data centers installing new systems will be faced with two challenges:”

• Managing hot spots within the data center.

• Managing increasing overall temperatures as high-density systems displace older systems, creating higher heat across the room.

New cooling equipment is being developed, or is currently available, to address these growing needs for contractors and their customers.

* “Heat Density Trends in Data Processing, Computer Systems, and Telecommunications Equipment,” copyright 2006, all rights reserved. All sections reprinted with permission.

For more information, visit www.liebert.com, www.movincool.com, and www.upsite.com.

Publication date: 04/27/2009