CASE STUDY: TCS Utilizes Airflow Containment to Overcome Limited Power Availability and Win Two ASHRAE Awards
Seattle-based TCS saw plans for its new data center run into a wall when the local utility company said the grid didn't have enough available power to support the expansion. But in this case, it seemed running into a wall was exactly what TCS needed. Utilizing a cooling wall introduced by the design-build firm McKinstry, TCS' new data center not only overcame the problem of limited power availability, it made TCS one of the most efficient data centers in the Pacific Northwest. Deployed through an evaporative-only cooling wall and airflow containment provided by Chatsworth Products' (CPI) Passive Cooling Solutions, the TCS data center expansion went on to earn two ASHRAE awards, one national and one regional, and an average PUE of 1.15.

[Caption: TCS built a showcase data center that earned two ASHRAE awards and a regular PUE of 1.15 by using 50 F-Series TeraFrame Cabinets with Vertical Exhaust Ducts, Evolution Cable Management and CPI Rack Systems.]

Challenge

If you call 911 from a cell phone located anywhere in the US, the signal is likely to pass through TCS' data center before it ever reaches emergency personnel. If you get lost, TCS gets you home by managing navigation systems for some of the nation's largest cell phone service providers. These aren't responsibilities that can be taken lightly; they are literally a matter of life and death.

"The Seattle switch sends the information to us and then we look you up, see where you are and provide your location," said Jeri Ginnaty, TCS Manager, Data Facilities Group. "Then it takes the data from your phone and the information we compiled and calculated, and sends it to the public safety answering point. They would put those two together and send the emergency responder to your location. For all of that to happen it takes about 600 milliseconds."
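The PUE figure cited above is simply the ratio of total facility power to IT equipment power, so 1.15 means the entire cooling and electrical overhead adds only 15 percent on top of the server load. A minimal sketch of that calculation; the kilowatt values below are illustrative assumptions chosen to produce 1.15, not TCS' actual metered numbers:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power Usage Effectiveness: total facility power / IT power.
    1.0 is the theoretical ideal; traditional CRAC-cooled rooms often
    run near 2.0, while TCS reports an average of 1.15."""
    return total_facility_kw / it_equipment_kw

# Illustrative numbers only: 230 kW total draw supporting a 200 kW IT load.
print(round(pue(230.0, 200.0), 2))  # → 1.15
```

The same ratio explains the article's 30/70 split: a facility where cooling consumes nearly 70 percent of the power budget operates far above a PUE of 2, leaving little capacity for servers.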
Summary

TCS overcomes limited power availability and wins two ASHRAE awards with a data center design that reduced power consumption and increased expansion capacity with CPI Passive Cooling Solutions, white equipment cabinets and evaporative-only cooling.

Challenge

Overcome limited power availability and cooling inefficiencies associated with the critical need for a data center expansion.

Solution

Deployed evaporative-only cooling, a full airside economizer and an airflow containment strategy that utilized CPI's glacier white F-Series TeraFrame Cabinets with Vertical Exhaust Ducts in a hot/cold aisle layout. The room's all-white layout of equipment cabinets and Evolution Cable Management also helped reduce energy used for lighting by 30%.

Customer Profile

TeleCommunication Systems, Inc. (TCS) produces solutions for mission-critical wireless communications that demand proven high levels of reliability, security and availability. Solutions include: secure deployable communication systems, location-based wireless infrastructure, navigation, VoIP Enhanced 911 services (E911; E112), text messaging infrastructure for wireless operators and related professional services for communications solutions.

Industry

Telecommunications, Wireless
[Caption: By containing the air and directing it into the ceiling through a Vertical Exhaust Duct, CPI's F-Series TeraFrame Cabinets helped TCS expand its data center with a cooling strategy that eliminated mechanical cooling and greatly reduced energy consumption.]

As cell phone use rises, so does the need for these critical services. TCS has seen that demand continually grow in its legacy data center, and in 2009 the company decided it was time to expand. By keeping its legacy data center (an open-rack design using seismic frames arranged in hot and cold aisles) in operation, an additional data center in the same building would mean faster service, the ability to serve more customers and the creation of a new testing lab for software development. But with the legacy data center and proposed expansion both relying on CRAC units for cooling, it would also mean the use of more power. A lot more power.

"We found out we had absorbed every spare ounce of energy the building had available," said Stephen Walgren, TCS Data Facility Engineer. "For us to grow we would have to go to Seattle City Light and have them put in a new primary feed."

Limited power availability was only half the problem. A data center expansion using traditional mechanical cooling would only allow about 30 percent of the building's power supply to be used for the servers. And with cooling units siphoning off nearly 70 percent of the building's energy, that left little room to increase computing capacity or plan for other expansions in the future.

"We were driven by two reasons," said Ginnaty. "One was the amount of power available to us in the building and the other was the cost of
maintaining the data center. To do it the old way, we wouldn't have had enough power to make it efficient to build. We couldn't run all the servers. So we looked at less energy to cool this system, and now we have over 70 percent of our power available for the servers."

A Different Approach to Cooling

TCS needed a cooling strategy that was ambitious, innovative and, above all, incredibly efficient. McKinstry, which operates across the US but is headquartered in Seattle, was up to that challenge and joined the project in 2010 with a brave new approach. Rather than find a way to accommodate the energy usage that CRACs would put on TCS' limited power availability, McKinstry opted for a design that not only eliminated CRACs, it eliminated mechanical cooling altogether.

Recognizing that Seattle has a relatively mild climate, McKinstry Project Manager Kyle Victor and McKinstry Operations Manager Joe Wolf worked with primary designers Jeff Sloan and Alireza Sadigh to set aside one wall in the data center to become the cooling wall. Acting as the connecting point between the data center and a controlled flow of the Pacific Northwest's signature cool air, this cooling wall resulted in a system that uses 100 percent evaporative airside economization. Optimized by airflow containment within CPI's F-Series TeraFrame Cabinet System with Vertical Exhaust Duct and carefully monitored air pressure levels, McKinstry's design delivered a precise and effective answer to TCS' efficiency needs.

"CPI's cabinets are totally passive and that's very efficient," said Ginnaty. "All the work is being done by the servers and the room is maintained at the supply temperature we're feeding the servers."

Because Seattle's weather can be unpredictable at times, concerns were raised early on that a few warm summer days could heat the data center to dangerous levels and raise humidity.
However, when weighed against equipment specs on the servers and a proposed temperature set point of 70-73 F (21-23 C), Seattle's weather and temperature fluctuations were deemed manageable and unlikely to disrupt the cooling system.

"We do allow the temperature set point to reset to a max of 80 F (27 C)," said Victor. "There's potential for a 100 F (38 C) day in Seattle with high humidity, and the temperature in here might get up to 80 F (27 C) with some humidity. We discussed early in the project that this would be a possibility. But after looking at equipment specs we see they're designed with those tolerances, and a lot of equipment is designed to go higher."

[Caption: One of the ways TCS was able to optimize its cooling system was by keeping hot air contained in the cabinet and separate from the room through a combination of Vertical Exhaust Ducts and custom-designed brush-sealed grommets.]

While CPI's cabinets and Vertical Exhaust Ducts work to keep hot air isolated from the room by passively directing it into the plenum, the evaporative cooling system's exhaust fans still require energy to keep the room supplied with cool air. To ensure that this fan energy is minimized, the entire system is maintained at a finite pressure difference of 1/100 of an inch between the room, the building and outside. This tightly monitored pressure level is fully automated, allowing each fan to adjust whenever servers in the data center are added or removed.

"You put in a new server that wasn't in here previously and it draws 100 CFM (cubic feet per minute); the pressure transducers that control the fans can sense that," explained Victor. "They'll adjust their speed by exactly that amount, automatically, to compensate for that additional 100 CFM requirement. So you're never using more fan energy than you need to be using for that exact load."
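Victor's description amounts to closed-loop pressure control: the transducers hold a fixed differential-pressure setpoint, and fan output rises or falls to track airflow demand. A simplified proportional-control sketch of that behavior; the setpoint matches the 1/100-inch figure above, but the gain and CFM values are illustrative assumptions, not McKinstry's actual control parameters:

```python
SETPOINT_IN = 0.01  # target differential pressure, 1/100 of an inch
GAIN = 5000.0       # illustrative proportional gain: CFM adjustment per inch of error

def fan_cfm(current_fan_cfm: float, measured_dp: float) -> float:
    """One proportional control step: if pressure sags below the setpoint
    (e.g. a new server pulls extra air), raise fan output; if it rises
    above, back the fans off. Real controllers iterate continuously."""
    error = SETPOINT_IN - measured_dp
    return current_fan_cfm + GAIN * error

# A new server drops the measured differential from 0.010 to 0.008 in.;
# this step adds 5000 * 0.002 = 10 CFM, and the loop keeps adjusting
# until the setpoint is restored.
print(fan_cfm(1000.0, 0.008))  # → 1010.0
```

The payoff Victor describes follows from this structure: because fan speed is driven only by the pressure error, the system automatically spends exactly the fan energy the installed load requires, no more.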
Primary elements of the cooling system were finally set and expected to bring TCS' energy consumption well below levels seen in typical mechanical cooling, but one final piece of the system still remained: how to deploy IT equipment without exhaust air having an impact on the room. Chatsworth Products, Inc. Case Study
Cooling with Cabinets

To meet the telecommunications needs and expansion hopes of TCS' new data center, the core IT infrastructure demanded a solution that included quality engineered equipment cabinets and flexible cabling solutions. Such a precise cooling system also needed cabinets that satisfied equally critical efficiency and airflow containment needs. Rather than merely exist in the data center's white space, these cabinets would need to be incorporated into the room's design as a primary element of IT infrastructure and the cooling system.

Wolf had already seen an example of this approach while touring King County's data center, also located in Seattle. King County used a slightly different approach to evaporative airside cooling, but the finished data center still experienced dramatic savings by deploying CPI Passive Cooling within rows of F-Series TeraFrame Cabinets with Vertical Exhaust Duct. This system worked by passively moving hot exhaust air out of the cabinets and into the overhead plenum, leaving it completely contained and separate from the room's cool supply air.

"That was one of the first places I saw the (ducted) cabinets," said Wolf. "I talk with Casey Scott (Northwest Regional Sales Manager for CPI) pretty frequently and he's always keeping me up to speed on what's new and what's innovative."

Known throughout the industry as the pioneers that first introduced passive cooling into data center environments, CPI and its ducted cabinets were the obvious choice for King County's data center: they could remove the heat, isolate it from supply air and help the airside system maintain efficient pressure levels. Further optimized with a comprehensive sealing strategy that closes empty rack-mount unit (RMU) space, cable openings and the bottoms of cabinets, this cabinet-level approach to airflow containment was also what the TCS/McKinstry design needed.
"In order to control these fans, which are based on very fine pressure differentials, these cabinets had to be totally sealed," said Victor. "We wanted to have everything sealed except the opening at the server that's pulling the air through."

[Caption: CPI Rack Systems and Evolution Cable Management, both in glacier white, were used to increase cabling capacity and reduce lighting costs in the TCS data center expansion.]

When deployed as a total thermal solution, CPI Passive Cooling has the proven ability to eliminate hot spots, allow higher set points on cooling equipment and reduce a data center's total energy costs by up to 40 percent. For King County, the use of CPI Passive Cooling had already resulted in an average PUE of 1.5, but for TCS the efficiency potential was even greater.

Connections and Collisions

A solid plan to save energy was in place, but one of the data center's most critical elements still needed to be deployed: a reliable cabling infrastructure that optimized connectivity, airflow and convenience. Specifically, TCS needed a cabinet and cable management solution that easily accommodated cable changes, and a design that deployed across the data center without conflicting with factors like seismic protection, fire suppression and airflow.

"We had huge collision checks," said Wolf. "It's tight, from sprinkler heads to smoke detectors to lights to up above the ceiling to the conduits. Everything is really crammed in there."

The final design included a slab floor and cabinets on ISO-Bases, which would require an overhead cabling approach that was flexible enough to move at least eight inches in either direction during a seismic event. This also tied into how the Vertical Exhaust Duct would meet the plenum. Rather than have the ducts penetrate the ceiling tiles, they extended to the tile surface, allowing them to easily move across the ceiling if the cabinets shifted during seismic events.
CPI helped accommodate this design by manufacturing customized top panels and rear doors that could house grommets that allowed cabling to pass through while airflow remained contained.

"The cabinet needs to be able to move and not yank on anything," said Wolf. "We didn't want to run these cords through the center and into the front because then you're creating a pinch point and something is going to give. So the grommets were custom. Basically, the tops and backs were custom."

Extra length would need to be added to many of the cables to accommodate potential shifts during seismic events. However, too much extra cabling in a cabinet could obstruct airflow. CPI solved that problem with its F-Series TeraFrame Cabinets, which were 51U in height and deep enough to address cabling concerns by providing ample room between the frame and equipment rails.

"Day to day our cabinets are fairly full of servers," said Ginnaty. "But on the backside of the cabinet, because of cable management, it's large enough so you can easily manage all the cables. You're not blocking any of the airflow. In a smaller cabinet all that would be jam-packed across the back and you would have a lot of airflow issues."

Extra cabling would also be an issue at the data center's intermediate distribution frames (IDFs), which were deployed across the data center on glacier white Standard (Two-Post) Racks by CPI equipped with Evolution Vertical Cable Managers, also in glacier white.

"We have lots of space to work with at the IDF, and if the cable is a little long there's a great place to hide it in the IDF," said Walgren. "It's been good for us and the cabinets work so well. After touring a number of data centers we were convinced these were the cabinets we wanted. The difference is huge, and I can't say enough good things about that M6 rail set. It's meaty enough to support whatever we've put in there. We can get around in these cabinets and run cabling comfortably top to bottom of the right side. You can actually get in and work."

[Caption: McKinstry carefully planned the TCS data center to avoid conflicts between seismic protection, fire suppression, power delivery and airflow. Each of those elements was deployed in a confined space that still provides easy access to cabling and equipment.]

Added Incentive

Seattle City Light had already established that TCS would not be able to expand its data center with a traditional cooling approach. However, the utility was willing to balance that limitation with an incentive that gave TCS the opportunity to present two separate approaches and compare energy usage for each one. If TCS went with the more efficient approach, the utility company would reward that effort with a cash rebate.

Now operational for nearly two years, TCS has been carefully documenting its energy usage and is inching closer and closer to reaping those rewards.

"It comes down to measurement and verifying how you're operating," said Victor. "What we're proving is that we're operating more efficiently than the
code-required baseline efficiency, and by how much. We had to build a speculative case that said, 'Here's how many kilowatt-hours per year the mechanical system would utilize if we just installed a baseline code-compliant mechanical system. And here's how many kilowatt-hours a year we use with this more efficient system.' Then there's a delta; we're using far fewer kilowatt-hours a year, so they're incentivizing those saved kilowatt-hours. I think it's 23 cents per kilowatt-hour saved."

CPI Passive Cooling is further reducing that load by allowing the server fans to push heat through the Vertical Exhaust Duct and into the plenum without the use of additional energy.

"It runs very efficiently," said Ginnaty. "The only power being used for air movement is the server fans. And in conjunction with the cabinets isolating and directing the airflow, they are doing all the work. That's where the biggest savings is. Right now we're about 70 percent full, and I think it's about 30 kW for the IT equipment not including DC; it's probably 50 kW with that."

White Cabinets: A Brighter Future

Choosing a cabinet's color may be a matter of aesthetics for some data centers, but for others that decision revolves around practicality. This is especially true for data centers joining the gradual shift from traditional black cabinets to the bright, clean look of a white cabinet. With a surface that reflects light instead of absorbing it, data center technicians are finding it easier to see inside the cabinet for equipment changes and maintenance.

"You do anything you can to create light," said Walgren. "You put dark cabinets in there and the room is going to get darker, especially with having the ISO-Bases. The cabinets in the other data center, if you drop a screw in there it's gone forever. We've had screwdrivers that disappeared in there."
Already leaning toward a full deployment of white cabinets, racks and cable management, the design found its final nudge in the form of yet another reduction in power usage.

"We were able to reduce lighting by 30 percent by going to white cabinets," said Ginnaty. "We were going with white because it looks really nice, and the other reason is working inside the cabinet. But the room doesn't require the same amount of light."

For a design team that adamantly worked to uncover every possible avenue toward savings, the choice to use white cabinets helped bring the design one step closer to meeting the efficiency goals set by the local utility.

"There was actually a rebate component to having the white cabinets because it reduced the total lumens required to light the space," said Victor.

Conclusion

If there is one common thread this data center has seen, it is a persistent commitment to efficiency. Spanning from the early planning phases through today, nearly two years after coming online, TCS, McKinstry and CPI have proven that an efficient data center design can overcome power limitations and drastically reduce energy consumption, in this case by an estimated 513,590 kWh per year. ASHRAE recognized those efforts with a first-place regional technology award in 2011 and second-place honors on the national level in 2013 for Category III (Industrial Facilities or Processes, Existing).

"We've seen drastic savings over our other data center," said Ginnaty. "Right now you can see we're pushing 72 degrees into the room and it's 84 degrees inside the back of this cabinet. But you can see none of this warm air is coming into the room. It's all in the cabinet."
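Combining the estimated annual savings with the incentive rate Victor quoted gives a sense of the rebate's scale. A quick check using the article's 513,590 kWh/year and $0.23/kWh figures; the resulting dollar amount is straightforward arithmetic from those two numbers, not a figure TCS or Seattle City Light reported:

```python
kwh_saved_per_year = 513_590  # estimated annual savings from the article
rebate_per_kwh = 0.23         # Seattle City Light incentive quoted by Victor

annual_rebate = kwh_saved_per_year * rebate_per_kwh
print(f"${annual_rebate:,.2f}")  # → $118,125.70
```

On top of the rebate, those saved kilowatt-hours never appear on the utility bill in the first place, which is why the measurement-and-verification effort Victor describes pays for itself.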
Standing near the cooling wall, Victor adds, "The air velocity is much higher here, but the temperature here is the same as the temperature down there on the far end, and you have all these servers in between."

TCS' issue of limited power capacity is long gone and has since evolved into a total power capacity of 400 kW. And with this efficient design only using about 250 kW of that load, TCS' data center expansion now has a very different limitation: floor space.

"We're looking at bringing in at least 10 more cabinets, possibly 20, in the first quarter," said Ginnaty. "And our next step would be to fill up this area, which would give us another 40 or 50 cabinets."

[Caption: Leading the way in designing an innovative and efficient data center that earned TCS two ASHRAE awards were, from left to right, Joe Wolf and Kyle Victor of McKinstry, and Jeri Ginnaty, TCS Manager, Data Facilities Group.]

About Chatsworth Products

Chatsworth Products (CPI) is a global manufacturer providing voice, data and security products and service solutions that optimize, store and secure technology equipment. CPI products offer innovation, configurability, quality and value with a breadth of integrated system components, covering virtually all physical layer needs. Unequalled customer service and technical support, as well as a global network of industry-leading distributors, assure customers that CPI is dedicated to delivering products and services designed to meet their needs. Headquartered in the US, CPI operates global offices within the US, Mexico, Canada, China, the Middle East and the United Kingdom. CPI's manufacturing facilities are located in the US, Asia and Europe. CPI is listed with the General Services Administration (GSA) under Federal Supply Schedule IT 70.
Products are also available through GSA Advantage and through Government Wide Acquisition Contracts (GWACs), including GSA Connections and NITAAC-ECS III.

About TeleCommunication Systems, Inc.

TeleCommunication Systems, Inc. (TCS) produces solutions for mission-critical wireless communications that demand proven high levels of reliability, security and availability. Solutions include: secure deployable communication systems, location-based wireless infrastructure, navigation, VoIP Enhanced 911 services (E911; E112), text messaging infrastructure for wireless operators and related professional services for communications solutions.

About McKinstry

Established in 1960, McKinstry is a full-service design, build, operate and maintain (DBOM) firm with over 1,600 employees and approximately $400 million in annual revenue. McKinstry's professional staff and tradespeople deliver consulting, construction, energy and facility services. As an early adopter of the DBOM process, the company advocates collaborative and sustainable solutions that are designed to ensure occupant comfort, improve systems efficiency, reduce facility operational costs, and ultimately optimize client profitability for the life of their building.

Chatsworth Products, Inc. All rights reserved. CPI, CPI Passive Cooling, GlobalFrame, MegaFrame, Saf-T-Grip, Seismic Frame, SlimFrame, TeraFrame, Cube-iT Plus, Evolution, OnTrac, Velocity and QuadraRack are federally registered trademarks of Chatsworth Products. econnect and Simply Efficient are trademarks of Chatsworth Products. All other trademarks belong to their respective companies.
Managing Cooling Capacity & Redundancy In Data Centers Today
Managing Cooling Capacity & Redundancy In Data Centers Today About AdaptivCOOL 15+ Years Thermal & Airflow Expertise Global Presence U.S., India, Japan, China Standards & Compliances: ISO 9001:2008 RoHS
GUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY. Public Sector ICT Special Working Group
GUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY Public Sector ICT Special Working Group SERVER ROOM ENERGY EFFICIENCY This guide is one of a suite of documents that aims to provide guidance on ICT energy efficiency.
Power and Cooling for Ultra-High Density Racks and Blade Servers
Power and Cooling for Ultra-High Density Racks and Blade Servers White Paper #46 Introduction The Problem Average rack in a typical data center is under 2 kw Dense deployment of blade servers (10-20 kw
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers Deploying a Vertical Exhaust System www.panduit.com WP-09 September 2009 Introduction Business management applications and rich
Small Data / Telecommunications Room on Slab Floor
Small Data / Telecommunications Room on Slab Floor Small Data / Telecommunications Room on Slab Floor During a warm summer period in 2008, Great Lakes experienced a network outage due to a switch failure
AIR-SITE GROUP. White Paper. Green Equipment Room Practices
AIR-SITE GROUP White Paper Green Equipment Room Practices www.air-site.com Common practices to build a green equipment room 1 Introduction Air-Site (www.air-site.com) is a leading international provider
2006 APC corporation. Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms
Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms Agenda Review of cooling challenge and strategies Solutions to deal with wiring closet cooling Opportunity and value Power
Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency
A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences November 2011 Powered by DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no
- White Paper - Data Centre Cooling. Best Practice
- White Paper - Data Centre Cooling Best Practice Release 2, April 2008 Contents INTRODUCTION... 3 1. AIR FLOW LEAKAGE... 3 2. PERFORATED TILES: NUMBER AND OPENING FACTOR... 4 3. PERFORATED TILES: WITH
Raised Floor Data Centers Prepared for the Future
Raised Floor Data Centers Prepared for the Future Raised Floors Protect My Investment by Providing a Foundation for Current and Future Technologies. Creating the Future Proof Data Center Environment The
DataCenter 2020: first results for energy-optimization at existing data centers
DataCenter : first results for energy-optimization at existing data centers July Powered by WHITE PAPER: DataCenter DataCenter : first results for energy-optimization at existing data centers Introduction
Enclosure and Airflow Management Solution
Enclosure and Airflow Management Solution CFD Analysis Report Des Plaines, Illinois December 22, 2013 Matt Koukl- DECP Mission Critical Systems Affiliated Engineers, Inc. Contents Executive Summary...
Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Kenneth G. Brill, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies, Inc. 505.798.0200
OCTOBER 2010. Layer Zero, the Infrastructure Layer, and High-Performance Data Centers
Layer Zero, the Infrastructure Layer, and The demands on data centers and networks today are growing dramatically, with no end in sight. Virtualization is increasing the load on servers. Connection speeds
Re Engineering to a "Green" Data Center, with Measurable ROI
Re Engineering to a "Green" Data Center, with Measurable ROI Alan Mamane CEO and Founder Agenda Data Center Energy Trends Benchmarking Efficiency Systematic Approach to Improve Energy Efficiency Best Practices
Elements of Energy Efficiency in Data Centre Cooling Architecture
Elements of Energy Efficiency in Data Centre Cooling Architecture Energy Efficient Data Center Cooling 1 STULZ Group of Companies Turnover 2006 Plastics Technology 400 Mio A/C Technology 200 Mio Total
Data Center Components Overview
Data Center Components Overview Power Power Outside Transformer Takes grid power and transforms it from 113KV to 480V Utility (grid) power Supply of high voltage power to the Data Center Electrical Room
Energy Efficiency Opportunities in Federal High Performance Computing Data Centers
Energy Efficiency Opportunities in Federal High Performance Computing Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By Lawrence Berkeley National Laboratory
Data Centers. Mapping Cisco Nexus, Catalyst, and MDS Logical Architectures into PANDUIT Physical Layer Infrastructure Solutions
Data Centers Mapping Cisco Nexus, Catalyst, and MDS Logical Architectures into PANDUIT Physical Layer Infrastructure Solutions 1 Introduction The growing speed and footprint of data centers is challenging
Best Practices for Wire-free Environmental Monitoring in the Data Center
White Paper 11800 Ridge Parkway Broomfiled, CO 80021 1-800-638-2638 http://www.42u.com [email protected] Best Practices for Wire-free Environmental Monitoring in the Data Center Introduction Monitoring for
Data Center Cooling: Fend Off The Phantom Meltdown Of Mass Destruction. 670 Deer Road n Cherry Hill, NJ 08034 n 877.429.7225 n
Data Center Cooling: Fend Off The Phantom Meltdown Of Mass Destruction How To Preserve Your Servers And Prevent Overheating Your high-performance, multiprocessor servers are working hard, computing tons
Element D Services Heating, Ventilating, and Air Conditioning
PART 1 - GENERAL 1.01 OVERVIEW A. This section supplements Design Guideline Element D3041 on air handling distribution with specific criteria for projects involving design of a Data Center spaces B. Refer
Airflow Utilization Efficiency (AUE) Can we cool today s servers with airflow?
AirflowUtilizationEfficiency(AUE) How to use River Cooling to reduce energy costs while improving server performance. By ThomasS.Weiss Sr.VicePresident TriadFloorsbyNxGen,LLC This paper offers insight
HIGHLY EFFICIENT COOLING FOR YOUR DATA CENTRE
HIGHLY EFFICIENT COOLING FOR YOUR DATA CENTRE Best practices for airflow management and cooling optimisation AIRFLOW MANAGEMENT COLD AND HOT AISLE CONTAINMENT DC ASSESSMENT SERVICES Achieving true 24/7/365
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
Data Center Design Guide featuring Water-Side Economizer Solutions. with Dynamic Economizer Cooling
Data Center Design Guide featuring Water-Side Economizer Solutions with Dynamic Economizer Cooling Presenter: Jason Koo, P.Eng Sr. Field Applications Engineer STULZ Air Technology Systems jkoo@stulz ats.com
Benefits of Cold Aisle Containment During Cooling Failure
Benefits of Cold Aisle Containment During Cooling Failure Introduction Data centers are mission-critical facilities that require constant operation because they are at the core of the customer-business
Improving Data Center Energy Efficiency Through Environmental Optimization
Improving Data Center Energy Efficiency Through Environmental Optimization How Fine-Tuning Humidity, Airflows, and Temperature Dramatically Cuts Cooling Costs William Seeber Stephen Seeber Mid Atlantic
Data Center Optimization Services. Maximize Your Data Center
Data Center Optimization Services Maximize Your Data Center Energy savings was definitely the key driver in my decision to use the SmartAire dampers and DirectAire panels. - Larry Hood Senior Construction
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS SUMMARY As computer manufacturers pack more and more processing power into smaller packages, the challenge of data center
Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant
RESEARCH UNDERWRITER WHITE PAPER LEAN, CLEAN & GREEN Wright Line Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant Currently 60 percent of the cool air that
Data Center Design and Energy Management
Agenda EasyStreet Online Services, Inc. Data Center Design and Energy Management Introduction to EasyStreet Speaker Bio EasyStreet Sustainability Data Center 1 (Retrofit completed May 2010) Data Center
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED.
Liquid Cooling Solutions for DATA CENTERS - R.M.IYENGAR BLUESTAR LIMITED. Presentation Goals & Outline Power Density Where we have been- where we are now - where we are going Limitations of Air Cooling
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Prepared for the U.S. Department of Energy s Federal Energy Management Program
How to Meet 24 by Forever Cooling Demands of your Data Center
W h i t e P a p e r How to Meet 24 by Forever Cooling Demands of your Data Center Three critical aspects that are important to the operation of computer facilities are matching IT expectations with the
Optimum Climate Control For Datacenter - Case Study. T. Prabu March 17 th 2009
Optimum Climate Control For Datacenter - Case Study T. Prabu March 17 th 2009 Agenda 2 About EDEC (Emerson) Facility Data Center Details Design Considerations & Challenges Layout Design CFD Analysis Of
Reducing Data Center Energy Consumption
Reducing Data Center Energy Consumption By John Judge, Member ASHRAE; Jack Pouchet, Anand Ekbote, and Sachin Dixit Rising data center energy consumption and increasing energy costs have combined to elevate
W H I T E P A P E R. Computational Fluid Dynamics Modeling for Operational Data Centers
W H I T E P A P E R Computational Fluid Dynamics Modeling for Operational Data Centers 2 Executive Summary Improving Effectiveness of CFD Technology in Cooling of Data Centers IT managers continue to be
Rack Blanking Panels To Fill or Not to Fill
Rack Blanking Panels To Fill or Not to Fill A Dell Technical White Paper Dell Data Center Infrastructure David L. Moss Joyce F. Ruff Learn more at Dell.com/PowerEdge/Rack THIS WHITE PAPER IS FOR INFORMATIONAL
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably March 2011 Powered by Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Review:
Reducing the Annual Cost of a Telecommunications Data Center
Applied Math Modeling White Paper Reducing the Annual Cost of a Telecommunications Data Center By Paul Bemis and Liz Marshall, Applied Math Modeling Inc., Concord, NH March, 2011 Introduction The facilities
Energy-efficient & scalable data center infrastructure design
Seminar am 30. September 2010 in Wetzlar Nachhaltigkeit durch UPI (Unified Physical Infrastructure) Vortrag von Stefan Fammler: Energieeffiziente Strategien im Rechenzentrum Energy-efficient & scalable
High Density Data Centers Fraught with Peril. Richard A. Greco, Principal EYP Mission Critical Facilities, Inc.
High Density Data Centers Fraught with Peril Richard A. Greco, Principal EYP Mission Critical Facilities, Inc. Microprocessors Trends Reprinted with the permission of The Uptime Institute from a white
F-SERIES TERAFRAME HD CABINET SYSTEM
P R O D U C T D ATA S H E E T F-SERIES TERAFRAME HD CABINET SYSTEM K E Y F E AT U R E S Safely transports up to 3,000 lb (1360 kg) of electronic equipment from a factory integration site direct to a customer
Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs
WHITE PAPER Reducing Room-Level Bypass Airflow Creates Opportunities to Improve Cooling Capacity and Operating Costs By Lars Strong, P.E., Upsite Technologies, Inc. 505.798.000 upsite.com Reducing Room-Level
In Row Cooling Options for High Density IT Applications
In Row Cooling Options for High Density IT Applications By Ramzi Y. Namek, PE, LEED AP, BD+C; Director of Engineering 1 Users wishing to deploy high density IT racks have several In Row Cooling solutions
The New Data Center Cooling Paradigm The Tiered Approach
Product Footprint - Heat Density Trends The New Data Center Cooling Paradigm The Tiered Approach Lennart Ståhl Amdahl, Cisco, Compaq, Cray, Dell, EMC, HP, IBM, Intel, Lucent, Motorola, Nokia, Nortel, Sun,
COMPARISON OF EFFICIENCY AND COSTS BETWEEN THREE CONCEPTS FOR DATA-CENTRE HEAT DISSIPATION
COMPARISON OF EFFICIENCY AND COSTS BETWEEN THREE CONCEPTS FOR DATA-CENTRE HEAT DISSIPATION 1/16 Contents: 1. Introduction: the increasing importance of energy efficiency 1.1 Power consumers in the data
Driving Data Center Efficiency Through the Adoption of Best Practices
Data Center Solutions 2008 Driving Data Center Efficiency Through the Adoption of Best Practices Data Center Solutions Driving Data Center Efficiency Through the Adoption of Best Practices Executive Summary
WHITE PAPER. Creating the Green Data Center. Simple Measures to Reduce Energy Consumption
WHITE PAPER Creating the Green Data Center Simple Measures to Reduce Energy Consumption Introduction Energy Awareness Driving Decisions in the DataCenter The continued thirst for energy is a recurring
Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360
Autodesk Revit 2013 Autodesk BIM 360 Air, Fluid Flow, and Thermal Simulation of Data Centers with Autodesk Revit 2013 and Autodesk BIM 360 Data centers consume approximately 200 terawatt hours of energy
