There are dozens of data center cooling methods, but multitenant, shared-location data center operator Involta LLC says its HVAC design team has developed one of the industry’s most efficient concepts.

Involta Northpointe, a recently opened, 40,000-square-foot data center in the Northpointe Industrial Park of Freeport, Pennsylvania, is already recording a power usage effectiveness (PUE) of 1.3, which places it in the top 5 percent of multitenant data centers nationwide for efficiency, according to company officials.
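
Power usage effectiveness is the ratio of a facility’s total energy use to the energy consumed by the IT equipment alone, so 1.0 is the theoretical ideal. The short sketch below illustrates the calculation with hypothetical round numbers, not Northpointe’s metered loads:

```python
def power_usage_effectiveness(total_facility_kw: float, it_load_kw: float) -> float:
    """PUE = total facility power / IT equipment power; 1.0 is the theoretical ideal."""
    if it_load_kw <= 0:
        raise ValueError("IT load must be positive")
    return total_facility_kw / it_load_kw

# Hypothetical round numbers, not Northpointe's metered loads: a 1,000 kW IT load
# plus 300 kW of cooling, power-distribution and lighting overhead gives PUE 1.3.
print(power_usage_effectiveness(1_000 + 300, 1_000))  # 1.3
```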

The performance statistics haven’t gone unnoticed. Involta recently signed one of the nation’s major health care providers, University of Pittsburgh Medical Center, as Northpointe’s anchor tenant.

Uptime Institute, a certification organization for data center design, construction, management and operations, issued Northpointe a Tier III certification, which certifies, among other capabilities, that the HVAC system can cool 725 kilowatts of critical heat load even during a power interruption.
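
For a sense of scale, 725 kilowatts of heat corresponds to roughly 206 tons of refrigeration. The quick conversion below uses only standard constants; the heat-load figure is the one cited for the certification:

```python
BTU_PER_HOUR_PER_KW = 3_412      # heat equivalent of one kilowatt
BTU_PER_HOUR_PER_TON = 12_000    # definition of one ton of refrigeration

critical_heat_load_kw = 725      # figure cited for the Tier III certification
tons = critical_heat_load_kw * BTU_PER_HOUR_PER_KW / BTU_PER_HOUR_PER_TON
print(round(tons))               # ~206 tons of continuous cooling capacity
```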

Involta has continually striven for higher efficiencies. Northpointe’s HVAC design, for example, is 52 percent more efficient than, and uses half the energy of, Involta’s first multitenant location, which opened in 2008.

The performance leading up to the Northpointe prototype didn’t come quickly. Officials attribute it to a series of progressive green HVAC design modifications Involta’s design team has made while constructing and retrofitting its 12 other shared locations in Arizona, Pennsylvania, Ohio, Minnesota, Iowa and Idaho. Innovations include data center-specific air dispersion, variable frequency drives on cooling systems, and supply/return air plenum designs.

The Involta team includes the company’s chief security officer, Jeff Thorsteinson, and its data center operations director, Lucas Mistelske. It also includes outsourced consultants, architects and engineers: Jason Lindquist, P.E., associate at consulting engineering firm Erikson Ellison & Associates in New Brighton, Minnesota; Scott Friauf, president of general contractor Rinderknecht & Associates in Cedar Rapids, Iowa; Solum Lang Architects, also in Cedar Rapids; and fabric air dispersion manufacturer DuctSox Corp. of Dubuque, Iowa.

Server rooms

The common wall shared by the mechanical and server rooms has a row of air conditioning units with a specially designed sheet metal plenum that combines the airflow and sends it to fabric ductwork inside the server rooms.

‘Strategic air dispersion’

Northpointe follows a common industry methodology: computer room air conditioners supplying displacement ductwork runs centered above the electronics racks’ cold aisles. That, however, is where the similarities stop.

“The data center industry has come to realize that strategic air dispersion, not more cooling volume, is the secret to effective rack cooling, facility efficiency and minimal equipment failures,” Thorsteinson said.

Traditional metal ductwork in earlier Involta locations, whether recessed in ceilings or exposed over cold aisles, fell short of delivering efficient and effective cooling, even though there was sufficient computer room conditioning capacity and room temperatures met the recommendations of ASHRAE Standard 90.4, “Energy Standard for Data Centers,” and the “Data Center Power Equipment Thermal Guidelines and Best Practices” white paper. The main shortcoming was metal duct’s inherently high velocities, which created turbulence that prevented electronic equipment fans from drawing cooling air into the racks. Velocities of 800 feet per minute and beyond also caused inefficient return air volumes.

Involta collaborated with fabric duct manufacturer DuctSox to develop “DataSox,” an air dispersion duct that’s specifically aimed at solving challenges unique to data centers.

The design solved velocity, volume and turbulent air dispersion issues. At Northpointe, it’s positioned over the cold aisle in double 36-inch-diameter, 36-foot-long runs. The majority of the air is distributed through micro-perforations on the bottom half of the round, static-free fabric. Field-adjustable, directional nozzles running linearly down both sides allow higher concentrations of air at hot spots.
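
Discharge velocity is simply airflow divided by the area it passes through, which is why dispersing the same volume through a long fabric surface rather than a few metal outlets lowers velocity so sharply. The sketch below is illustrative only; the airflow and perforated-area figures are assumptions, not published Northpointe data:

```python
import math

def velocity_fpm(airflow_cfm: float, area_sqft: float) -> float:
    """Face velocity (ft/min) = volumetric airflow (ft^3/min) / flow area (ft^2)."""
    return airflow_cfm / area_sqft

duct_diameter_ft = 3.0                              # 36-inch-diameter DataSox run
inlet_area = math.pi * (duct_diameter_ft / 2) ** 2  # ~7.1 sq ft round inlet

# Assumed figures for illustration only.
airflow_cfm = 5_000.0        # airflow carried by one run
dispersion_area = 170.0      # sq ft of perforated fabric on the bottom half of a 36-ft run

print(round(velocity_fpm(airflow_cfm, inlet_area)))       # ~707 fpm entering the duct
print(round(velocity_fpm(airflow_cfm, dispersion_area)))  # ~29 fpm leaving the fabric
```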

Sheet metal dampers

Lindquist specified metal dampers for duct takeoffs in the event a duct run is uninstalled for commercial laundering, reconfiguration or adjustments. Generally, data center-specific fabric air dispersion is designed for a particular project’s specifications. In the field, however, the nozzles can be throttled and redirected to eliminate the damper balancing commonly required in conventional ductwork projects.

“This unique approach that the Involta team innovated in its recent data centers is very impressive and, according to our tests, has outperformed a lot of other HVAC concepts we’ve looked at,” said Lindquist, who has designed more than 12 data center mechanical systems.

The computer room air conditioners discharge 64°F air, and the racks generally draw in 64°F to 67°F air. Temperatures at the air conditioners’ return plenum range from 82°F to 95°F.
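
Those supply and return temperatures matter because the heat an airstream carries is proportional to both airflow and the temperature rise across the racks, roughly Q (BTU/h) ≈ 1.08 × CFM × ΔT for standard air. The sketch below applies that rule of thumb using the article’s temperatures and an assumed airflow; it is not a published Northpointe calculation:

```python
def sensible_heat_btuh(airflow_cfm: float, delta_t_f: float) -> float:
    """Sensible heat carried by standard air: Q ~= 1.08 * CFM * delta-T."""
    return 1.08 * airflow_cfm * delta_t_f

supply_f = 64          # reported supply air temperature
return_f = 88          # midpoint of the reported 82-95 F return range
airflow_cfm = 40_000   # assumed total airflow for one data hall (illustrative)

heat_btuh = sensible_heat_btuh(airflow_cfm, return_f - supply_f)
print(round(heat_btuh / 3_412))  # ~304 kW of heat removed under these assumptions
```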

Cold aisle temperature differentials in conventionally designed data center air distribution systems can surpass 10°F. Northpointe’s design, however, records very slim cold aisle differentials of only 2°F from top to bottom.

A precursor to this design, Involta’s Marion, Iowa-based location, was retrofitted from metal duct and conventional air handlers to data center-specific DataSox and air conditioners with variable frequency drives and other enhancements. The HVAC retrofit reduced energy usage by 80,000 kilowatt-hours monthly.
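
Annualized, that is a substantial reduction. The arithmetic below extends the reported monthly figure; the electricity rate is a hypothetical placeholder, not Involta’s actual tariff:

```python
monthly_savings_kwh = 80_000    # reported reduction from the Marion retrofit
assumed_rate_per_kwh = 0.08     # hypothetical commercial rate in dollars, for illustration

annual_kwh = monthly_savings_kwh * 12
print(annual_kwh)                                # 960000 kWh saved per year
print(round(annual_kwh * assumed_rate_per_kwh))  # ~$76,800 per year at the assumed rate
```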

Buffer space

The buffer space between the data hall and the mechanical room is insulated.

Mechanical configuration

Northpointe’s mechanical room configuration, designed by general contractor Rinderknecht and installed by local mechanical contractor McKamish of Pittsburgh, splits the data center into 200- and 180-rack halls. In the centrally located mechanical room, each bank of ten 24-ton DA085 upflow air conditioners by Vertiv of Columbus, Ohio, is positioned along the wall of the room it supplies. For example, the 200-rack room is anchored by 10 computer room air conditioners, controlled by variable frequency drives, supplying approximately 13,000 total cubic feet per minute. Each room air conditioner offers redundant refrigerant circuits and fans. The units use two-stage scroll compressors and switch to free cooling when outdoor ambient temperatures drop to 54°F or less, rejecting heat to rooftop high-efficiency micro-channel condensers.
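
The free-cooling changeover is essentially an outdoor-temperature threshold decision. The control-logic sketch below is a simplified illustration; the function, the deadband value and the mode names are assumptions, not the actual Vertiv or building-automation sequence:

```python
FREE_COOLING_SETPOINT_F = 54.0   # switch to free cooling at or below this outdoor temperature
HYSTERESIS_F = 2.0               # assumed deadband to avoid rapid mode cycling

def select_cooling_mode(outdoor_temp_f: float, current_mode: str) -> str:
    """Return 'free_cooling' or 'compressor' based on outdoor air temperature."""
    if outdoor_temp_f <= FREE_COOLING_SETPOINT_F:
        return "free_cooling"
    if outdoor_temp_f >= FREE_COOLING_SETPOINT_F + HYSTERESIS_F:
        return "compressor"
    return current_mode  # inside the deadband, hold the current mode

print(select_cooling_mode(48.0, "compressor"))    # free_cooling
print(select_cooling_mode(58.0, "free_cooling"))  # compressor
```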

The variable frequency drives operate the air conditioners at 20 to 40 percent capacity; however, the i-Vu building automation system by Carrier Corp. of Syracuse, New York, can call for more capacity in high-humidity situations.

“Running at these lower fan speeds obviously saves us a lot of energy,” Thorsteinson said.
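
The fan affinity laws explain why: airflow scales roughly with fan speed, but fan power scales roughly with the cube of speed, so running at partial speed cuts power disproportionately. The sketch below shows the idealized relationship, ignoring motor and drive losses:

```python
def fan_power_fraction(speed_fraction: float) -> float:
    """Affinity-law approximation: fan power varies with the cube of fan speed."""
    return speed_fraction ** 3

for speed in (1.0, 0.6, 0.4, 0.2):
    print(f"{speed:.0%} speed -> ~{fan_power_fraction(speed):.1%} of full fan power")
# 100% -> 100.0%, 60% -> 21.6%, 40% -> 6.4%, 20% -> 0.8%
```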

The data halls are constructed of tornado-proof, 12-inch-thick, precast concrete cores. Roof insulation averages approximately R-36, which far surpasses ASHRAE Standard 90.1 requirements and adds to the facility’s total energy savings.

Rinderknecht also designed a supply plenum and a separate return air plenum that connect to each data hall’s bank of computer room air conditioners. The return air arrangement collects rising warm air and delivers it to a plenum shared by the computer rooms.

Certification

Certification of the HVAC system by Uptime wasn’t easy. Uptime Institute certifiers had never before seen such a plenum arrangement and air-delivery system. They required unusual data from the engineering firm, such as calculations of mechanical spine pressurization and unprecedented worst-case scenarios of extreme pressurization, airflow and temperature events.

“They were initially quite skeptical of our HVAC approach and required test data that was well beyond typical certification requirements, but ultimately we proved the energy efficiency, airflow uniformity and performance claims,” Thorsteinson said.  

Involta Northpointe

Involta Northpointe’s 40,000-square-foot data center in Freeport, Pennsylvania, saves energy thanks in part to its HVAC design.

Rinderknecht was also proactive in helping Involta obtain utility rebates for LED lighting, lighting controls, building automation controls, uninterruptible power supplies, static-transfer switches, direct-current circuits, Energy Star-rated transformers and a host of other gear.

Besides the Uptime Institute certification and the green HVAC system, potential customers touring an Involta facility are wowed by the visual impact of the unique air dispersion, Thorsteinson said.

“Their (DataSox) unique appearance always prompts questions, which is always a good thing,” he said. “Afterward they typically view them as innovative and smart.”

This article and its images were supplied by DuctSox Corp.