How many people are using fewer websites and spending less personal or work time online? How many businesses are adjusting their business plans to scale back their online presence and their ability to conduct transactions online?
Questions like that get to the essence of the data center market. A Gartner Inc. report projects data center infrastructure spending to reach $200 billion in 2021, a 6% increase over the prior year, after a predictable pandemic-related dip in 2020. A different study from Market Research expects a compound annual growth rate of 2% between 2019 and 2025.
Of course, data centers need maintenance, and HVAC is a key component.
Aisle Temps and Independent Systems: Keep ’Em Separated
Contractors in this space understand that the financial and reputational stakes for data center owners and their clients are like data center interior temperatures themselves: higher than in other applications. Contractors providing service and maintenance should understand the other differences and how those can affect their hiring, training, and processes.
“Equipment typically used in data center applications is referred to as the Mercedes-Benz of HVAC equipment, therefore requiring experienced technicians for routine service and maintenance,” said Dino Mangione, executive vice president of sales and marketing for Donnelly Mechanical, headquartered in Queens Village, New York.
A primary difference: the computer room air conditioner, or CRAC, dealing with the intense heat loads generated by servers. The specialized units demand higher investment for the owner, and the required manufacturer training, diagnostics, and/or OEM parts and boards contribute to a higher total cost of ownership, Mangione explained.
With high performance comes high sensitivity, and that in turn heightens attention on filters, as Brian Hooper noted. Hooper is vice president of operations for MSI Mechanical Systems Inc. in Salem, New Hampshire.
“If the filters are not cleaned or changed, you can get dust particulates in the air, which can actually set off the fire alarms depending on how sensitive the sensors are and where they are located,” he said.
That susceptibility goes hand in hand with the challenging window of conditions for the interior environment. Any contractor working with data centers needs to be on comfortable terms with the ASHRAE guidelines and best practices for mission critical facilities.
The basic challenges for indoor conditions are familiar enough: humidity and temperature. As with other spaces, winter dryness wants to pull humidity levels down. Worst case, that can lead to static and terrible consequences for equipment and uptime.
On the other hand, Hooper said, “if you have a data center in warmer climates, you have to battle the higher humidity if you are taking in fresh air from outdoors.
“The more common problem,” he added, “is having the house systems battle with the temperature controls for the room.”
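The winter-dryness risk Hooper describes comes down to dew point: warm but dry supply air leaves little moisture to dissipate static charge. The sketch below estimates dew point with the Magnus approximation (the coefficients are one common parameterization, and the alarm threshold is illustrative, not a spec quoted in this article or by ASHRAE here):

```python
import math

def dew_point_c(temp_c: float, rh_percent: float) -> float:
    """Estimate dew point (deg C) via the Magnus approximation."""
    a, b = 17.27, 237.7  # Magnus coefficients (one common parameterization)
    gamma = (a * temp_c) / (b + temp_c) + math.log(rh_percent / 100.0)
    return (b * gamma) / (a - gamma)

# Winter supply air: warm but dry. 22 degC at 20% RH works out to a
# dew point of roughly -2 degC.
dp = dew_point_c(temp_c=22.0, rh_percent=20.0)
print(f"Dew point: {dp:.1f} degC")
if dp < -9.0:  # illustrative lower moisture limit, not a quoted standard
    print("Warning: air dry enough to raise static-discharge risk")
```

A real monitoring setup would take the moisture limits from the current ASHRAE guidance and the equipment vendor's specifications rather than a hard-coded constant.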
Keeping those separate systems 100% independent while they operate in close proximity, with one serving mission critical functions, is another task most HVAC contractors do not face.
Many are surprised at what that server space temperature range actually is.
“Many servers within computer room suites now have higher thresholds than was typical in the past, allowing cooling settings to be programmed in the mid-70s as compared to the mid-60s,” Mangione said.
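The widened window Mangione describes can be sketched as a simple envelope check. The 18–27 °C range below is the commonly cited ASHRAE TC 9.9 recommended dry-bulb envelope for data spaces; treat it as an assumption for illustration, and verify against current ASHRAE guidance and vendor specs before relying on it:

```python
# Hedged sketch: check temperature settings against a recommended envelope.
# 18-27 degC is the commonly cited ASHRAE TC 9.9 recommended dry-bulb range.
RECOMMENDED_C = (18.0, 27.0)

def f_to_c(deg_f: float) -> float:
    return (deg_f - 32.0) * 5.0 / 9.0

def in_envelope(temp_f: float) -> bool:
    lo_c, hi_c = RECOMMENDED_C
    return lo_c <= f_to_c(temp_f) <= hi_c

# The mid-60s and mid-70s settings from the article both land inside the
# envelope; the mid-70s setting simply sits higher in the range, which is
# where the energy savings come from.
for setpoint_f in (65.0, 75.0):
    status = "inside" if in_envelope(setpoint_f) else "outside"
    print(f"{setpoint_f} degF -> {status} recommended envelope")
```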
That shift in conventional thinking and design has taken place this century, as equipment evolved and as data challenging older and colder norms persuaded the industry that performance would not suffer.
“This results in energy cost savings, fewer breakdowns, and reduced capacity for redundancy within these environments,” said Mangione. Given the upward growth of data centers over the past couple of decades, the difference in any one of those areas on its own would translate to meaningful savings in energy and equipment expenditures.
The other data center design norm that has evolved from a generation ago is the space layout itself for server racks and other equipment. While a data center might have had equipment lined around the perimeter once upon a time, Hooper noted, research and the need to manage ever-increasing heat loads have steered designers toward a “hot aisle, cold aisle” approach.
“In its simplest form,” he continued, “hot aisle/cold aisle design involves lining up server racks in alternating rows, with cold air intakes facing one way and hot air exhausts facing another.”
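Hooper's description can be rendered as a toy layout generator, purely illustrative, in which rack rows alternate orientation so every aisle between them is all-intake (cold) or all-exhaust (hot):

```python
def hot_cold_layout(rack_rows: int) -> list[str]:
    """Toy hot aisle/cold aisle plan: alternate rack-row orientation so
    adjacent rows share either a cold (intake) or hot (exhaust) aisle."""
    layout = ["cold aisle"]  # a cold aisle feeds the first row's intakes
    for i in range(rack_rows):
        if i % 2 == 0:
            layout.append("rack row (intakes face previous aisle)")
            layout.append("hot aisle")
        else:
            # Flipped row: its exhausts share the hot aisle just appended,
            # and its intakes face the cold aisle that comes next.
            layout.append("rack row (intakes face next aisle)")
            layout.append("cold aisle")
    return layout

for element in hot_cold_layout(4):
    print(element)
```

The point of the alternation is that no rack ever draws intake air from an aisle another rack is exhausting into, which is exactly the mixing problem the design exists to prevent.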
Managing Preventive Maintenance
With that cursory look at the data center application landscape, attention turns to the best ways to manage maintenance. Ask these men about preventive maintenance (PM) agreements and some details may vary, but a shared challenge is the occasional owner impulse to look for ways to trim expenses.
GREEN-LIGHTING MAINTENANCE: Facility managers sometimes feel the pressure to save money by skipping scheduled preventive maintenance. Sensitive servers and systems mean that rarely ends well, said Dino Mangione of Donnelly Mechanical. Courtesy of PxHere, CC0 public domain
“If the customer does not keep up with quarterly PMs,” Hooper offered, “then the data center will never work properly.” Other systems in the building may allow for biannual check-ups. HVAC is just too critical on a day-to-day basis in this setting, where tolerance for subpar performance is necessarily low.
Dirty filters, for example, mean insufficient airflow and equipment alarms (if not the aforementioned fire alarms). Broken belts are another common victim of overlooked PM.
A data center’s specific location will influence system vulnerability, too. Donnelly’s Mangione uses the example of “keeping condenser coils clean when located close to street level in a busy city.” The hot aisle/cold aisle layout strategy will also affect how contractors maintain air distribution, he noted.
Donnelly typically recommends monthly HVAC inspections to protect the data center’s all-important uptime.
“The savings achieved through fewer inspections generally do not outweigh the lost revenue and productivity that are inevitable during an outage,” Mangione said.
That won’t necessarily stop owners from trying, based on Hooper’s examples.
“Facility managers will look at their yearly budgets and say, ‘We need to cut back, so let’s not do the PM this quarter,’” but then that leads to additional service calls.
“I know the work comes out of different budgets and that is why this is done,” Hooper continued, “but you don’t save any money if you continue with these business practices long term.”
Better to pursue steady predictive and preventive maintenance instead of lurching from emergency to emergency … but what is an emergency, anyway? That is another part of putting together a good PM relationship.
“Once a frequency is agreed upon,” said Mangione, “standards and scopes are defined to ensure there is an understanding between client and contractor as to what constitutes an emergency; who is to respond and when; how it gets responded to; and finally, how the site is to be maintained, serviced, and accessed during each visit.”
Especially in the event of a sudden problem, those are not issues either party wants to have to sort out or debate on the fly. After that, just as many facilities employ a level of equipment redundancy to ensure performance, Mangione commented that Donnelly does the same.
“We train multiple dedicated technicians for each data center to ensure service redundancy and knowledge of the site across our service division.”
Mangione offered one other tip that sounds like advice for the owner but would actually benefit the contractor just as much, if not more.
“One of the simplest and most effective recommendations for our data center clients is to keep a critical spare parts stock on site.”
The ever-increasing complexity of mission critical HVAC equipment has increased the chances that part availability could be an obstacle in a pinch. On the other hand, Mangione said, identifying the most critical items and having a spare on hand opens the door to a contractor being able to “respond and resolve the failure almost immediately.”
When given the chance to decide or contribute input on equipment upgrades, Mangione also cites variable frequency drives and units that can ramp up or down to match load demand as ways data centers can temper energy costs that can, well, run hot.
State rebates may further reduce the sting of capital investments, whether they are necessary or just a good idea.
As Hooper said, data center equipment tolerances lead to tight designer specifications. But it winds up being the facilities team and contractor partner who carry the daily responsibility to deliver steady environments within even one or two degrees.
“The world relies on technology,” Mangione summed up. Contractors can decide if an operations environment with that amount of pressure — and its requisite amount of preparation and training — is a deterrent or a business opportunity. For those who can stand the literal and figurative heat, the data center market doesn’t look to cool down anytime soon.