IT Equipment Design Evolution & Data Center Operation Optimization. Don Beaty Roger Schmidt
1 IT Equipment Design Evolution & Data Center Operation Optimization Don Beaty Roger Schmidt
2 Copyright Materials Copyright 2015 by ASHRAE. All rights reserved. No part of this presentation may be reproduced without written permission from ASHRAE, nor may any part of this presentation be reproduced, stored in a retrieval system or transmitted in any form or by any means (electronic, photocopying, recording or other) without written permission from ASHRAE. 2
3 AIA/CES Registered Provider ASHRAE is a Registered Provider with The American Institute of Architects Continuing Education Systems. Credit earned on completion of this program will be reported to CES Records for AIA members. Certificates of Completion for non AIA members are available on request. This program is registered with the AIA/CES for continuing professional education. As such, it does not include content that may be deemed or construed to be an approval or endorsement by the AIA of any material of construction or any method or manner of handling, using, distributing, or dealing in any material or product. Questions related to specific materials, methods, and services will be addressed at the conclusion of this presentation. 3
4 USGBC Education Partner IT Equipment Design Evolution & Data Center Operation Optimization Approved for: 3 General CE hours By ASHRAE GBCI cannot guarantee that course sessions will be delivered to you as submitted to GBCI. However, any course found to be in violation of the standards of the program, or otherwise contrary to the mission of GBCI, shall be removed. Your course evaluations will help us uphold these standards. 0 LEED-specific hours Approval date: July 2014 Course ID:
5 Opening Comments Was the dominant computer cooling method liquid or air in 1980? In 2000? What will it be in 2020? IT hardware has been REALLY CHANGING and EVOLVING (for example: extreme low energy servers, CPUs, GPUs, multi-core, disaggregation, variable speed, software defined networks). IT manufacturers respond to customer needs and demands, which vary greatly and can be any combination of: LOWER cost, MORE storage, MORE energy efficiency, MORE computing capability. Significant changes in hardware include changes in hardware operating conditions. Customer needs, including operational needs, continue to drive the evolution and optimization of hardware. Operational data received in recent years, combined with IT manufacturers' analysis, has produced some surprising and important discoveries, including unintended consequences. 5
6 Opening Comments Tower Server (tall & narrow) 1U Server (short & wide) Blade Server (square) Is Hardware Eating Itself? Is Software Eating Hardware? All indications are that Software is Eating Hardware; what does this mean? 6
7 Presentation Overview PRESENTATION TITLE: IT Equipment Design Evolution & Data Center Operation Optimization PART 1 Hardware Overview Including Trends Previous Trends PART 2 Hardware Overview Including Trends Current / Projected Trends PART 3 Hardware Basics PART 4 Hardware Requirements, Discoveries & Concerns PART 5 Facilities Air Cooling Architecture PART 6 Liquid Cooling Closing Comments 7
8 Part 1: Hardware Overview Including Trends Previous Trends History has consistently proven that many aspects of hardware continue to aggressively trend upward. Two exceptions to the upward trend are Electronic Packaging & Compute Energy 8
9 Incredible Performance Improvements As the number of transistors goes up, energy per transistor goes down. [Graph: Number of Transistors vs. Energy per Transistor] ~1 million factor reduction in energy / transistor over 30+ years. IBM Graphic Modified By DLB
10 Chip Cooling (Bipolar vs. CMOS) Historical Trend [Graph: Module Heat Flux (Watts/cm²) over time, showing the end of bipolar water cooling and the transition to CMOS] IBM Graphic Modified By DLB 10
11 Moore's Law (Microprocessor Transistor Counts) Historical & Predictive Trend [Graph: Number of Transistors, 2.3k to 2.6B on a log scale, vs. Date of Introduction; doubling period reduced from 24 months to 18 months] Source: Wikipedia Modified By DLB
12 Kryder's Law (Moore's Law for Storage) Historical & Predictive Trend [Graph: storage capacity (GB) on a log scale] Source: Wikipedia / Scientific American 2005 Modified By DLB 12
13 Internet Traffic Trend Historical Trend [Graph: Amsterdam International Internet Exchange monthly input traffic (TB), July 2001 to January 2013, vs. Intel (doubling every 18 months) and Moore's Law (doubling every 24 months)] Source: ams-ix.com Modified by DLB 13
14 (Jakob) Nielsen's Law of Internet Bandwidth Historical & Predictive Trend: a high end user's connection speed grows by 50% per year. [Graph: connection capacity on a log scale] nngroup.com Modified By DLB 14
15 2000 & 2005 IT Equipment Power Predictive Trends Uptime Institute Graphic Modified By DLB; ASHRAE Graphic Modified By DLB (both normalized to a 0.65 m² product footprint). Thermal Management Consortium in 2000: published through The Uptime Institute. ASHRAE 2005 publication: Datacom Equipment Power Trends & Cooling Applications. History validated these trends. 15
16 Data Center Generational Impact 2007 to 2011 Generations 1 to 4 have the SAME IT performance. IBM comparative change (4U, 4-processor):

Metric                Gen 1 (Year 1 Baseline)   Gen 4 (Year 4)    Gen 1 to 4 Change
IT Performance        100%                      100%              Same
IT Power per Rack     100%                      119%              +19%
Rack Count            100% (60 racks)           13% (8 racks)     -87%
Total IT Power        100%                      18%               -82%
Space Required        100%                      15%               -85%
Cooling Required      100%                      20%               -80%

16
17 ITE Environment ASHRAE Psychrometric Chart 2004 / 2008 Recommended Criteria (prior to 2004: typically 20°C ±0.6°C):

Criterion           2004                2008
Low End Temp.       20°C (68°F)         18°C (64.4°F)
High End Temp.      25°C (77°F)         27°C (80.6°F)
Low End Moisture    40% RH              5.5°C DP (41.9°F)
High End Moisture   55% RH              60% RH & 15°C DP (59°F)

17
18 Part 2: Hardware Overview Including Trends Current / Projected Trends Compute Performance Growth is faster than Compute Power Growth but power is still growing ASHRAE Trend Charts and TGG Maturity Model are valuable for Planning the Future 18
19 ITE Environment ASHRAE Psychrometric Chart 2011 Classes A1 and A2 are EXACTLY the SAME as previous Classes 1 & 2: Apply to new AND legacy equipment New Classes A3 and A4 do NOT include legacy equipment 19
20 Temperature Rate of Change Clarification NOTE: For tape storage equipment, no more than 5°C in an hour. For all other IT equipment, no more than 20°C in an hour AND no more than 5°C in any 15 minute period. (The 5°C and 20°C temperature changes are NOT instantaneous temperature rates of change.) 20
21 Temperature Rate of Change Example Graphs [Graphs: inlet air temperature (°C) vs. time (hours)] Examples of conforming rate of change for tape drives (5°C per hour) and for other IT equipment, e.g. servers (20°C per hour and 5°C per 15 minutes). IBM Graphics Modified By DLB 21
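The limits above lend themselves to a simple compliance check. The sketch below is an illustration, not from the slides (the function and variable names are invented); it scans a log of inlet-temperature samples for the worst change over any 60-minute and any 15-minute window:

```python
def rate_of_change_ok(samples, is_tape=False):
    """samples: list of (minute, inlet_temp_C) tuples, sorted by time.

    Checks the rate-of-change limits quoted on the slide:
    tape storage: <= 5 C in any hour; all other IT equipment:
    <= 20 C in any hour AND <= 5 C in any 15-minute period.
    """
    def max_delta(window_min):
        worst = 0.0
        for i, (t0, temp0) in enumerate(samples):
            for t1, temp1 in samples[i:]:
                if t1 - t0 <= window_min:
                    worst = max(worst, abs(temp1 - temp0))
        return worst

    if is_tape:
        return max_delta(60) <= 5.0
    return max_delta(60) <= 20.0 and max_delta(15) <= 5.0
```

For example, a ramp from 20°C to 27°C over an hour in 30-minute steps conforms for servers but not for tape drives.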
22 Volume Server Power Demand & Performance Growth (excludes extreme low energy servers):

Volume Server          CAGR        Growth in 3 Years   Growth in 5 Years
Power Demand Growth    2% - 6%     5% - 20%            10% - 30%
Performance Growth     25% - 30%   90% - 120%          200% - 250%

Compounded Annual Growth Rate (CAGR); growth projections are rounded. IBM 22
23 Volume Server Power Trends to 2020 (fully configured, fully utilized max load) [ASHRAE table reformatted by DLB Associates: heat load per chassis (Watts), heat load per 42U rack, and % increase 2010 to 2020, by form factor and socket count: 1U (1s, 2s, 4s), 2U (2s, 4s), 4U (2s), 7U blade (2s), 9U blade (2s); numeric values not legible in this transcription] Market requirements force IT manufacturers to maximize performance per volume (creating high heat load per rack). These rack heat loads will result in increased focus on improving data center ventilation solutions and localized liquid cooling solutions. High risk to generalize; one shoe definitely does NOT fit all. 23
24 Volume Server Power Trends Simple Adjustment Factor Example How to adjust the published trends for your environment: 1) Trend chart value for a 1U, 2s volume server in 2010: 600 Watts. 2) ACTUAL MEASURED value for YOUR 1U, 2s server: 300 Watts. 3) Calculated adjustment factor for YOUR 1U, 2s server = 300 Watts / 600 Watts = 0.50. Apply the 0.50 factor (your data center) to the published heat load per chassis and heat load per 42U rack values (original value: 1.00). 24
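The three steps above reduce to a single ratio that scales any published trend value. A minimal sketch (the function names and the 1,200 W future trend value are hypothetical illustrations, not from the slide):

```python
def adjustment_factor(trend_watts, measured_watts):
    # Step 3 on the slide: your measured value / the published trend value
    return measured_watts / trend_watts

def adjusted_trend(trend_watts, measured_watts, future_trend_watts):
    # Scale a future published trend value by your measured factor
    return adjustment_factor(trend_watts, measured_watts) * future_trend_watts

factor = adjustment_factor(600, 300)          # slide example: 0.50
estimate = adjusted_trend(600, 300, 1200)     # hypothetical future value scaled to 600 W
```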
25 TGG Data Center Maturity Model Concept Color Coding Levels: Level 5: Visionary (5 years away); Levels 4 and 3: Reasonable steps (between current best practices and the visionary 5 year projection); Level 2: Best practice; Level 1: Part best practice; Level 0: Minimal / no progress. Color coding: Clear: not a DCMM level; Black: theoretical maximum; Yellow: target; Green: achieved. DLB Associates 25
26 TGG Data Center Maturity Model Concept Color Coding Level 5 (Visionary: 5 Yrs. Away) Level 4 (Reasonable Step) Level 3 (Reasonable Step) Level 2 (Best Practice) Level 1 (Part Best Practice) Level 0 (Minimal / No Progress) Targeted, Theoretical Max. and Non Levels VARY with a Given Aspect of the Data Center DLB Associates 26
27 TGG Data Center Maturity Model Definitions FACILITY 1. Power (Critical Power Path Efficiency Building Entrance to IT load, Architecture, Operations, Generation) IT 5. Compute (Utilization, Workload Management, Operations, Power Management, Server population) 2. Cooling (PUE Cooling Contribution, RCI (high) & RCI (low) if applicable, Mechanical / Refrigerant Cooling reduction, Environmental set point range at inlet conditions to IT equipment, Environmental monitoring and control, Operations) 3. Other Facility (Operational Resilience, Resilience vs. Need, Lighting, Building/Shell, M&E Waste, Procurement) 4. Management (Monitoring, PUE, Waste heat reuse (as measured by ERF/ERE), CUE, WUE, xue/additional metrics) 6. Storage (Workload, Architecture, Operations, Technology, Provisioning) 7. Network (Utilization, Workload, Operations, Technology, Base performance, Provisioning) 8. Other IT (Overall, Utilization, IT sizing, Internal Power Supply Efficiency, Service Catalog / SLA's, Incentivizing change for efficient behavior, E Waste, Procurement) 27
28 TGG Data Center Maturity Model Definitions Matrix (example row) Facility Cooling 2.1, PUE Cooling Contribution (annual average): Level 0: 1.0; Level 1: 0.5; Level 2: 0.35; Level 3: 0.2; Level 4: 0.1; Level 5: 0.05. Data center efficiency & sustainability investment (financial, time & resource) increases from Level 0 to Level 5. 28 TGG Graphic Modified by DLB Associates
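The example row maps directly to a lookup. A sketch, assuming a data center is credited with the highest level whose annual-average target it meets (the function name is invented):

```python
# Level thresholds from the slide's Facility Cooling 2.1 example row,
# ordered strictest (best) first.
COOLING_LEVELS = [(5, 0.05), (4, 0.10), (3, 0.20), (2, 0.35), (1, 0.50), (0, 1.00)]

def dcmm_cooling_level(annual_avg_cooling_pue_contribution):
    """Return the highest DCMM level whose target the value meets."""
    for level, target in COOLING_LEVELS:
        if annual_avg_cooling_pue_contribution <= target:
            return level
    return 0  # above even the Level 0 figure: minimal / no progress
```

For instance, an annual average cooling contribution of 0.30 lands between the Level 2 (0.35) and Level 3 (0.20) targets, so it earns Level 2.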
29 Data Center Maturity Model ASHRAE TC 9.9 Envelope Examples (TEMPERATURE ONLY, Upper Limit) For each level, Examples 1 and 2 pair a normal temperature range with an annual excursion allowance: Level 0 (Minimal / No Progress): 21°C ±1°C (R), no excursions; Level 1 (Part Best Practice) and Level 2 (Best Practice): excursions <10 hrs.; Level 3 (Reasonable Step): excursions <100 hrs.; Level 4 (Reasonable Step): excursions <10 hrs.; Level 5 (Visionary, 5 years away): excursions <25 hrs. [specific temperature limits not legible in this transcription]. (R) = ASHRAE Recommended Envelope, (A1) = ASHRAE Allowable A1, (A2) = ASHRAE Allowable A2. DLB Associates 29
30 Part 3: Hardware Basics Some basics on server design, configuration and misconceptions 30
31 Server Power & IT Hardware Airflow Rate SERVER POWER: idle power is 25 to 50% of production power (100%). IT HARDWARE AIRFLOW RATE: 75 cfm/kW under normal conditions, 150 cfm/kW under worst case conditions. IBM Graphic Modified By DLB 31
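Those two rules of thumb give a quick airflow budget for a rack or room. A sketch using the slide's figures (the function name is invented):

```python
def airflow_budget_cfm(it_load_kw, cfm_per_kw=75):
    """Estimate airflow demand from IT load.

    75 cfm/kW is the slide's normal-conditions figure;
    use ~150 cfm/kW for worst-case conditions.
    """
    return it_load_kw * cfm_per_kw

normal = airflow_budget_cfm(10)                   # 750 cfm for a 10 kW rack
worst = airflow_budget_cfm(10, cfm_per_kw=150)    # 1,500 cfm worst case
```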
32 Thermal Report Example: Generic Server Configurations: Minimum (1-way 1.5 GHz processor, 16 GB memory), Full (2-way 1.65 GHz processor, max. memory), Typical (1-way 1.65 GHz processor, 16 GB memory); each listed with typical heat release and nominal airflow in cfm (m³/h) at 35°C (95°F), 120 V [numeric values not legible in this transcription]. NOTE: Most new server fans are variable speed. 32
33 Thermal Report Comparison to Nameplate Nameplate: 920 W (1 kVA with PF = 0.92). ASHRAE Thermal Report: 420 to 600 W actual heat release. 33
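The nameplate figure is derived from apparent power and power factor, which is why sizing cooling to it overstates the real load. A small sketch of the arithmetic (the function names are invented):

```python
def nameplate_watts(kva, power_factor):
    # Real power implied by the nameplate rating
    return kva * 1000.0 * power_factor

def oversizing_ratio(nameplate_w, thermal_report_w):
    # How far nameplate-based sizing overshoots the reported heat release
    return nameplate_w / thermal_report_w

nameplate = nameplate_watts(1.0, 0.92)    # 920 W, as on the slide
ratio = oversizing_ratio(920.0, 600.0)    # ~1.53x even at the max reported load
```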
34 Thermal Report Energy Star Example 34
35 2012 ASHRAE Whitepaper ASHRAE TC 9.9 Whitepaper on IT Equipment Thermal Management & Controls (free download). The purpose of the whitepaper was to: 1) Describe mainstream ITE cooling systems 2) Describe ITE power and thermal management 3) Describe interactions between ITE and the data center 35
36 Common Misconceptions with Respect to Servers MISCONCEPTION 1: ITE FANS consume 10% to 20% of total ITE power. In reality: idle conditions, as low as 1% of total load; typical conditions, 2% to 4% of total load; extreme conditions, 8% to 15% of total load (limited to no fan speed control). MISCONCEPTION 2: ITE is managed based upon chassis temperature rise. Thermal management within servers is primarily driven to ensure compliance with component temperature specifications. Component temperatures are often very similar over a wide range of ambient temperatures. Temperature rise is generally not a consideration in the thermal management scheme. Exhaust temperature may be a consideration for safety reasons, in which case the temperature rise of the air passing through the chassis must be determined in some manner. 36
37 Common Misconceptions with Respect to Servers (cont.) MISCONCEPTION 3: All mainstream IT equipment is designed and performs SIMILARLY. Poor design: IT equipment designed WITHOUT precision thermal management (low end volume servers typically do not monitor all available sensors and have simple fan speed control (FSC) algorithms). Sensors: today's well designed servers integrate a large number of thermal sensors, along with activity and power sensors, to drive fan speeds as low as possible and reduce wall power. Server fans: power consumption of server fans has improved significantly over time. 37
38 Requirements for Server Cooling Thermal Design ASHRAE 3 Levels of Limiting Component Temperature 38
39 Platform Power Thermal Management Power Management Used for Thermal Compliance 39
40 Component Temperature Driven by Three Effects ASHRAE 3 Sources of Component Temperature: Self Heating Air Heating System Ambient 40
41 Boundaries for Thermal Management Design considerations include: Usage models Environmental conditions Rack & room level airflow protocols Components selection, location & configuration ASHRAE Design considerations must be evaluated against: Cost Performance Energy objectives 41
42 Component Packaging to Meet Thermal Requirements Processor package: TIM 2, TIM 1, Case/IHS, Die, Substrate. Temperatures: T_LA (local ambient), T_s (sink), T_c (case/IHS), T_j (junction). Processor package plus cooling components. Abbreviations: T = Temperature, TIM = Thermal Interface Material, IHS = Integrated Heat Spreader 42
43 System Considerations Good thermal design: integrates real time optimization; delivers PRECISELY the performance needed; consumes the LOWEST power while meeting component specifications; incorporates the best acoustic performance without compromising power / performance. SHOW & TELL ASHRAE 43
44 Server Control Process & Sensing for Thermal Control Process SERVER CONTROL PROCESS Diagrammatic Sensor Location Example Temperature INPUTS Power Activity Fan Conditions Fan Speed Control ALGORITHMS Power Management State, Traffic Limitation Fan Speeds OUTPUTS Performance Settings Power Settings ASHRAE Graphic Modified By DLB ASHRAE Graphic Modified By DLB Common for 2 socket server to have more than 30 sensors 44
45 Testing and Validation Classical Testing and Industry Standards: 1) Acoustics 2) Electromagnetic Compatibility 3) Shock and Vibrations 4) Environmental and Thermal Stress 5) Volatile Organic Compounds 6) Product Safety 45
46 Testing and Validation Acoustics IBM IBM Emission Sound Pressure Level (Semi Anechoic Chamber) Sound Power Level (Reverberation Room) Measurements in accordance with ISO 7779: Measurement of airborne noise emitted by information technology and telecommunications equipment 46
47 Testing and Validation Electromagnetic Compatibility IBM IBM Radiated & Conducted Emissions (10 meter Semi Anechoic & OATS Chamber) Radiated Immunity (3 meter Semi Anechoic Chamber) Regulatory Compliance: Emissions Tests Immunity Tests 47
48 Testing and Validation Shock & Vibration Show & Tell 48
49 Testing and Validation Environmental / Thermal Stress Chamber IBM 49
50 Testing and Validation Environmental Chamber Verification of extremes of environmental envelope IBM IBM Worst case cabling configurations considered for testing 50
51 Testing and Validation Volatile Organic Compounds Nested Testing Strategy (VOCs, Ozone, Particulates) Test top model in each product line Perform thorough emissions analysis Pass remaining models on substantial equivalency Track Emissions Profiles Polymers IC Card substrate Supplies Dynamic Testing Targets Regulations/Standards Government Agency Guidelines Product Emissions Chamber (15 m x 8 m x 3 m) IBM 51
52 Testing and Validation Product Safety Product Safety Requirements driven by: 1) Legal/Regulatory Requirements Manufacturer based Marketing/Sales based Importer/Exporter based Customer based 2) Good Corporate Citizenship Process for Obtaining Certifications: 1) Early involvement in the design to assure the product will be compliant 2) Building block approach to certify safety critical components 3) Accreditation of local labs to carry out testing 52
53 ITE Environment ASHRAE Psychrometric Chart 2011 Classes A1 and A2 are EXACTLY the SAME as previous Classes 1 & 2: Apply to new AND legacy equipment New Classes A3 and A4 do NOT include legacy equipment 53
54 ITE Environment 2011 Environment Specifications Table (Partial) 54
55 Table of Contents EXECUTIVE SUMMARY 1) Introduction 2) Survey of Maximum Temperature Ratings 3) Cooling Design of Networking Equipment 4) Equipment Power and Exhaust Temperatures 5) Environmental Specifications 6) Reliability 7) Practical Installation Considerations 8) ASHRAE TC9.9 Recommendations 9) Summary 10) References APPENDIX A: DEFINITION OF ACRONYMS AND KEY TERMS APPENDIX B: ACOUSTICS APPENDIX C: TOUCH TEMPERATURE. Data Center Networking Equipment Issues and Best Practices: whitepaper prepared by ASHRAE Technical Committee (TC) 9.9, Mission Critical Facilities, Data Centers, Technology Spaces, and Electronic Equipment. 2013, American Society of Heating, Refrigerating and Air-Conditioning Engineers, Inc. All rights reserved. This publication may not be reproduced in whole or in part; may not be distributed in paper or digital form; and may not be posted in any form on the Internet without ASHRAE's expressed written permission. Inquiries for use should be directed to publisher@ashrae.org
56 Networking Recommendations New networking equipment designs draw cooling air from the front face of the rack, with the air flow direction from the front of the rack to the rear, and the hot exhaust exiting the chassis at the rear face of the rack. This front to rear cooled equipment should be rated to a minimum of ASHRAE Class A3 [40 C (104 F)] and preferably ASHRAE Class A4 [45 C (113 F)]. The development of new products that do not adhere to a front to rear cooling design is not recommended. It is recommended that networking equipment, where the chassis does not span the full depth of the rack, have an air flow duct that extends all of the way to the front face of the rack. The equipment should be designed to withstand a higher inlet air temperature than the data center cooling supply air if: 1) the equipment is installed in an enclosed space that does not have direct access to the data center air cooling stream, or 2) the equipment has a side to side air flow configuration inside an enclosed cabinet. 56
57 Networking Recommendations (cont.) Networking equipment manufacturers should provide very specific information on the types of installations for which their equipment is designed. Users should follow the manufacturer's installation recommendations carefully. Any accessories needed for installation, such as ducting, should either be provided with the equipment or be readily available. By following these recommendations, the risk of equipment overheating can largely be avoided, and the compatibility of networking equipment with other types of equipment in rack and data center level solutions will be significantly improved. 57
58 Common Air Flow & Mechanical Design Configurations Side View of Rack (side panel of rack and chassis removed) Front View of Rack Large Switch, Typically Full & Half Size Rack with Front to Rear Air Flow 58
59 Common Air Flow & Mechanical Design Configurations (cont.) Side View of Rack (side panel of rack and chassis removed) Front View of Rack Mid Size Switch with Side to Side Air Flow 59
60 Common Air Flow & Mechanical Design Configurations (cont.) Top View of Equipment (top panel of chassis removed) Front View of Equipment (ports can sometimes also be in rear) Small Networking Equipment with S Shaped Air Flow (other networking equipment diagrams are shown in the whitepaper) 60
61 Part 4: Hardware Requirements, Discoveries & Concerns 61
62 IT Equipment Environment Envelope Definitions RECOMMENDED The purpose of the recommended envelope is to give guidance to data center operators on maintaining high reliability and also operating their data centers in the most energy efficient manner. ALLOWABLE The allowable envelope is where the IT manufacturers test their equipment in order to verify that the equipment will function within those environmental boundaries. PROLONGED EXPOSURE Prolonged exposure of operating equipment to conditions outside its recommended range, especially approaching the extremes of the allowable operating environment, can result in decreased equipment reliability and longevity. Occasional short term excursions into the allowable envelope MAY be acceptable. OPERATING AT COLDER TEMPERATURES WASTES ENERGY NEEDLESSLY! 62
63 ITE Environment 2011 Environment Specifications Table (Partial) 63
64 2011 ALLOWABLE Environmental Envelopes High RH Effects: IT Reliability Enhanced Corrosion High Operating Temp. Effects: IT Reliability DC Airflow Transient Response Low RH Effects: IT Reliability ESD IBM 64
65 Ambient Inlet Temperature Impacts [Graph: power increase due to temperature, extrapolated from 20°C (68°F) through 25°C (77°F), 27°C (80.6°F) and 32°C (89.6°F) to 35°C (95°F), lowest to highest] Airflow and total power increase with temperature. Fan power required increases with the cube of fan speed (rpm). The total power increase includes both fan and component power. 65
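The cube relationship means small temperature-driven fan speed increases carry a large power penalty. A sketch of the fan affinity law the slide references (the function name is invented):

```python
def fan_power_ratio(rpm_new, rpm_baseline):
    """Fan affinity law: fan power scales with the cube of fan speed."""
    return (rpm_new / rpm_baseline) ** 3

# A 20% fan speed increase costs roughly 73% more fan power.
penalty = fan_power_ratio(1.2, 1.0)
```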
66 IT Equipment Airflow Trends IT equipment AIRFLOW demands are increasing PROPORTIONALLY to POWER. This SIGNIFICANT increase in airflow demand has not been obvious (many operators have been caught off guard). How fast will end users and facility operators discover this AIRFLOW INCREASE? Significant airflow perturbation exists when HPC applications are initialized (can be on the order of >1,000 CFM per rack). 66
67 Data Center Air Flow Problem? Increase Aisle Pitch? Change Vent Type? Reduce Server Count? Front Rack Plenum? Liquid Cooling? Rear Door Heat Exchanger? WHAT DO YOU DO? Add Row Cooling to Supplement? Passive Chimney? Active Chimney? 67
68 How to Choose? Choice depends largely on facility design: 1) Existing or new facility, rack density, redundancy, flexibility, etc. 2) Lifecycle cost (TCO) is essential 3) Likely to use more than one approach Separate high and low density. Scale cooling resources as you scale demand. 68
69 Close Coupled Cooling HEAT EXCHANGER HEAT EXCHANGER RACK RACK IBM Rear Door Heat Exchanger (Side View) RACK RACK RACK HEX RACK RACK RACK Overhead Heat Exchanger (Side View) IBM In Row Heat Exchanger (Top View) IBM 69
70 Impact of Additional Hardware Connected to a Rack Rack Cover Server Rack Cover Additional Hardware dp 1 dp 2 dp 3 dp 4 dp 5 Side View of Rack Blowup of Rack & Server IBM 70
71 Bypass Air Internal to Rack Rack Cover Rack Cover Server 100% Server 100% Server OFF Server 100% Side View of Rack Server 100% Blowup of Servers 71
72 Multi Vendor Pressure Testing [Graph: server airflow (CFM %) vs. inlet pressure (inches H₂O), from inhibiting flow to aiding flow] Component temperatures remained essentially constant over the range of pressures tested. Products tested: current generation 2U, current generation 1U, current generation 1U, last generation 1U, current generation blade, current generation 2U, current generation 1U. ASHRAE 72
73 Server Delta T Reduction at Increased Operating Temperature 58 C Effective Ceiling 58 C Effective Ceiling 60 C Effective Ceiling IBM 73
74 Higher Temperature Impacts on Data Center Operation Hardware failure rate for volume servers: [Graph: X-factor vs. inlet temperature, 15°C (59°F) to 45°C (113°F)] IBM 74
75 Higher Temperature Impacts on Data Center Operation (cont.) CRAH RACK RACK CRAH RACK RACK CRAH RACK RACK IBM Hot Aisle Containment Cold Aisle Containment Direct Rack Exhaust IBM IBM Impacts to Air Flow & Temperature Inlet to Servers: Loss of utility power resulting in all CRAHs being off until generator power engages Loss of chilled water to some or all the CRAHs Airside economizer room where inlet air temperature goes above ASHRAE recommended by design 75
76 Raw Data Example Insulating floor/shoe Large voltages Accumulation (who sits up 10 times?) ESD mitigating floor/shoe Much lower voltages Much quicker discharge ASHRAE 76
77 Definition of Event Voltage (taking off and dropping a sweater) ASHRAE 77
78 Higher Temperature Impacts on Data Center Operation (cont.) Recording device: a stainless steel, brass or copper electrode (or wrist strap) can be used with a charge plate monitor on a 91 cm × 91 cm support material. IBM ANSI/ESD STM 97.2 test procedure: the induced voltage on the body of a person is measured and recorded while walking. 78
79 Main Experiment Main experiment: 1) Well defined walking experiment. Awareness experiments: 2) Random walking and scraping 3) Taking off and dropping a sweater 4) Sitting up from chairs 5) Cart experiment. The test setup for the random walking and scraping experiment is similar to the well defined walking experiment; the person walks randomly fast and sometimes scrapes his feet. The principal setup of the human walking test is in accordance with ANSI/ESD S
80 Low RH Effects on Data Center Operation Low RH Effects on Data Center Operation: 15 C (59 F) and 15% RH Floor Shoes 3M 4530 Asia 3M China Slip On 3M 6432 Cond 3M Full Sole 3M 8413 Running Shoe 3M Green Diss DESCO 2 Meg Sole 3M Low Diss DESCO Heel 3M Thin VPI DESCO Full Sole 2 Meg Epoxy 1A Hush Puppy Flexco Rubber Red Wing HPL F Stat a Rest HPL N Sperry Korean Vinyl Heel Strap Standard Tile Wax 80 IBM
81 Probability of an ESD Related Risk In order to derive the relative rate of ESD related failures, it is necessary to obtain the probability of observing a voltage above a certain value. Three threshold values have been selected: 500 V (service test limit); 4,000 V and 8,000 V (operational test limits). 81
82 ESD Risks at the Three Threshold Values (pattern walking at 80.6°F / 27°C)

Cumulative Probability P(V > V0) with ESD Floors and ESD Shoes (BEST)
Condition   V0 = 500 V   V0 = 4,000 V   V0 = 8,000 V
45% RH      1.47e-9 %    1.69e-17 %     3.82e-20 %
25% RH      9.74e-3 %    3.05e-7 %      9.61e-9 %
8% RH       3.76e-4 %    6.80e-10 %     8.30e-12 %

Cumulative Probability P(V > V0) with Non-ESD Floors and Non-ESD Shoes (WORST)
Condition   V0 = 500 V   V0 = 4,000 V   V0 = 8,000 V
45% RH      4.7%         0.013%         (illegible)
25% RH      23%          1.13%          0.27%
8% RH       48.8%        2.28%          0.43%

Cumulative Probability P(V > V0) with ESD Floors and Non-ESD Shoes
Condition   V0 = 500 V   V0 = 4,000 V   V0 = 8,000 V
45% RH      0.15%        7.44e-9 %      1.17e-11 %
25% RH      5.8%         7.14e-9 %      2.12e-8 %
8% RH       12.2%        2.38e-4 %      3.01e-7 %

82 ASHRAE
83 Dust Can Degrade Computer Reliability Dust is everywhere. Even with our best filtration efforts, fine dust will be present in a data center and will settle on electronic hardware. Dust settled on printed circuit boards can lead to electrical short circuiting of closely spaced features by absorbing water, getting wet and, thus, becoming electrically conductive. Dust can enter electrical connectors causing: Power connections to overheat, and Signal connections to become electrically noisy. IBM 83
84 Higher Temperature Impacts on Data Center Operation (cont.) Dust Contaminants & Relative Humidity Corrosion No corrosion IBM Each salt contaminant has a deliquescent relative humidity above which the salt will absorb moisture, becoming wet and electrically conductive. Wet salts can corrode metals. 84
85 Gaseous Pollutants/RH Effects IBM 85
Reducing Data Center Energy Consumption By John Judge, Member ASHRAE; Jack Pouchet, Anand Ekbote, and Sachin Dixit Rising data center energy consumption and increasing energy costs have combined to elevate
More informationCooling Small Server Rooms Can Be. - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com
Cooling Small Server Rooms Can Be Inexpensive, Efficient and Easy - Jim Magallanes Computer Room Uptime: www.cruptime.com Uptime Racks: www.uptimeracks.com Server Rooms Description & Heat Problem Trends
More informationIn Row Cooling Options for High Density IT Applications
In Row Cooling Options for High Density IT Applications By Ramzi Y. Namek, PE, LEED AP, BD+C; Director of Engineering 1 Users wishing to deploy high density IT racks have several In Row Cooling solutions
More informationHow High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions
Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power and cooling savings through the use of Intel
More informationCooling Audit for Identifying Potential Cooling Problems in Data Centers
Cooling Audit for Identifying Potential Cooling Problems in Data Centers By Kevin Dunlap White Paper #40 Revision 2 Executive Summary The compaction of information technology equipment and simultaneous
More informationCase Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia
Case Study: Innovative Energy Efficiency Approaches in NOAA s Environmental Security Computing Center in Fairmont, West Virginia Prepared for the U.S. Department of Energy s Federal Energy Management Program
More informationHigh Density Data Centers Fraught with Peril. Richard A. Greco, Principal EYP Mission Critical Facilities, Inc.
High Density Data Centers Fraught with Peril Richard A. Greco, Principal EYP Mission Critical Facilities, Inc. Microprocessors Trends Reprinted with the permission of The Uptime Institute from a white
More informationBenefits of. Air Flow Management. Data Center
Benefits of Passive Air Flow Management in the Data Center Learning Objectives At the end of this program, participants will be able to: Readily identify if opportunities i where networking equipment
More informationManaging Data Centre Heat Issues
Managing Data Centre Heat Issues Victor Banuelos Field Applications Engineer Chatsworth Products, Inc. 2010 Managing Data Centre Heat Issues Thermal trends in the data centre Hot Aisle / Cold Aisle design
More informationDOE FEMP First Thursday Seminar. Learner Guide. Core Competency Areas Addressed in the Training. Energy/Sustainability Managers and Facility Managers
DOE FEMP First Thursday Seminar Achieving Energy Efficient Data Centers with New ASHRAE Thermal Guidelines Learner Guide Core Competency Areas Addressed in the Training Energy/Sustainability Managers and
More informationCase Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers
Case Study: Opportunities to Improve Energy Efficiency in Three Federal Data Centers Prepared for the U.S. Department of Energy s Federal Energy Management Program Prepared By Lawrence Berkeley National
More informationDriving Data Center Efficiency Through the Adoption of Best Practices
Data Center Solutions 2008 Driving Data Center Efficiency Through the Adoption of Best Practices Data Center Solutions Driving Data Center Efficiency Through the Adoption of Best Practices Executive Summary
More informationThe Effect of Data Centre Environment on IT Reliability & Energy Consumption
The Effect of Data Centre Environment on IT Reliability & Energy Consumption Steve Strutt EMEA Technical Work Group Member IBM The Green Grid EMEA Technical Forum 2011 Agenda History of IT environmental
More informationAisleLok Modular Containment vs. Legacy Containment: A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings
WH I TE PAPE R AisleLok Modular Containment vs. : A Comparative CFD Study of IT Inlet Temperatures and Fan Energy Savings By Bruce Long, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies,
More informationAnalysis of data centre cooling energy efficiency
Analysis of data centre cooling energy efficiency An analysis of the distribution of energy overheads in the data centre and the relationship between economiser hours and chiller efficiency Liam Newcombe
More informationImproving Data Center Energy Efficiency Through Environmental Optimization
Improving Data Center Energy Efficiency Through Environmental Optimization How Fine-Tuning Humidity, Airflows, and Temperature Dramatically Cuts Cooling Costs William Seeber Stephen Seeber Mid Atlantic
More informationSelecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers
C P I P O W E R M A N A G E M E N T WHITE PAPER Selecting Rack-Mount Power Distribution Units For High-Efficiency Data Centers By Anderson Hungria Sr. Product Manager of Power, Electronics & Software Published
More informationGUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY. Public Sector ICT Special Working Group
GUIDE TO ICT SERVER ROOM ENERGY EFFICIENCY Public Sector ICT Special Working Group SERVER ROOM ENERGY EFFICIENCY This guide is one of a suite of documents that aims to provide guidance on ICT energy efficiency.
More informationVerizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001
Verizon SMARTS Data Center Design Phase 1 Conceptual Study Report Ms. Leah Zabarenko Verizon Business 2606A Carsins Run Road Aberdeen, MD 21001 Presented by: Liberty Engineering, LLP 1609 Connecticut Avenue
More informationHow High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions
Intel Intelligent Power Management Intel How High Temperature Data Centers & Intel Technologies save Energy, Money, Water and Greenhouse Gas Emissions Power savings through the use of Intel s intelligent
More informationDirect Fresh Air Free Cooling of Data Centres
White Paper Introduction There have been many different cooling systems deployed in Data Centres in the past to maintain an acceptable environment for the equipment and for Data Centre operatives. The
More informationData Center Energy Profiler Questions Checklist
Data Center Energy Profiler Questions Checklist Step 1 Case Name Date Center Company State/Region County Floor Area Data Center Space Floor Area Non Data Center Space Floor Area Data Center Support Space
More informationElement D Services Heating, Ventilating, and Air Conditioning
PART 1 - GENERAL 1.01 OVERVIEW A. This section supplements Design Guideline Element D3041 on air handling distribution with specific criteria for projects involving design of a Data Center spaces B. Refer
More informationElements of Energy Efficiency in Data Centre Cooling Architecture
Elements of Energy Efficiency in Data Centre Cooling Architecture Energy Efficient Data Center Cooling 1 STULZ Group of Companies Turnover 2006 Plastics Technology 400 Mio A/C Technology 200 Mio Total
More informationImproving Data Center Efficiency with Rack or Row Cooling Devices:
Improving Data Center Efficiency with Rack or Row Cooling Devices: Results of Chill-Off 2 Comparative Testing Introduction In new data center designs, capacity provisioning for ever-higher power densities
More informationGlossary of Heating, Ventilation and Air Conditioning Terms
Glossary of Heating, Ventilation and Air Conditioning Terms Air Change: Unlike re-circulated air, this is the total air required to completely replace the air in a room or building. Air Conditioner: Equipment
More informationSTULZ Water-Side Economizer Solutions
STULZ Water-Side Economizer Solutions with STULZ Dynamic Economizer Cooling Optimized Cap-Ex and Minimized Op-Ex STULZ Data Center Design Guide Authors: Jason Derrick PE, David Joy Date: June 11, 2014
More informationHot Air Isolation Cools High-Density Data Centers By: Ian Seaton, Technology Marketing Manager, Chatsworth Products, Inc.
Business Management Magazine Winter 2008 Hot Air Isolation Cools High-Density Data Centers By: Ian Seaton, Technology Marketing Manager, Chatsworth Products, Inc. Contrary to some beliefs, air is quite
More informationRe Engineering to a "Green" Data Center, with Measurable ROI
Re Engineering to a "Green" Data Center, with Measurable ROI Alan Mamane CEO and Founder Agenda Data Center Energy Trends Benchmarking Efficiency Systematic Approach to Improve Energy Efficiency Best Practices
More information7 Best Practices for Increasing Efficiency, Availability and Capacity. XXXX XXXXXXXX Liebert North America
7 Best Practices for Increasing Efficiency, Availability and Capacity XXXX XXXXXXXX Liebert North America Emerson Network Power: The global leader in enabling Business-Critical Continuity Automatic Transfer
More informationPower and Cooling for Ultra-High Density Racks and Blade Servers
Power and Cooling for Ultra-High Density Racks and Blade Servers White Paper #46 Introduction The Problem Average rack in a typical data center is under 2 kw Dense deployment of blade servers (10-20 kw
More informationThe Different Types of Air Conditioning Equipment for IT Environments
The Different Types of Air Conditioning Equipment for IT Environments By Tony Evans White Paper #59 Executive Summary Cooling equipment for an IT environment can be implemented in 10 basic configurations.
More informationEnergy Efficient Server Room and Data Centre Cooling. Computer Room Evaporative Cooler. EcoCooling
Energy Efficient Server Room and Data Centre Cooling Computer Room Evaporative Cooler EcoCooling EcoCooling CREC Computer Room Evaporative Cooler Reduce your cooling costs by over 90% Did you know? An
More informationData Centers: How Does It Affect My Building s Energy Use and What Can I Do?
Data Centers: How Does It Affect My Building s Energy Use and What Can I Do? 1 Thank you for attending today s session! Please let us know your name and/or location when you sign in We ask everyone to
More informationChallenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre
Challenges In Intelligent Management Of Power And Cooling Towards Sustainable Data Centre S. Luong 1*, K. Liu 2, James Robey 3 1 Technologies for Sustainable Built Environments, University of Reading,
More informationBlade Server & Data Room Cooling Specialists
SURVEY I DESIGN I MANUFACTURE I INSTALL I COMMISSION I SERVICE SERVERCOOL An Eaton-Williams Group Brand Blade Server & Data Room Cooling Specialists Manufactured in the UK SERVERCOOL Cooling IT Cooling
More informationGuide to Minimizing Compressor-based Cooling in Data Centers
Guide to Minimizing Compressor-based Cooling in Data Centers Prepared for the U.S. Department of Energy Federal Energy Management Program By: Lawrence Berkeley National Laboratory Author: William Tschudi
More informationPresentation Outline. Common Terms / Concepts HVAC Building Blocks. Links. Plant Level Building Blocks. Air Distribution Building Blocks
Presentation Outline Common Terms / Concepts HVAC Building Blocks Plant Level Building Blocks Description / Application Data Green opportunities Selection Criteria Air Distribution Building Blocks same
More informationRittal Liquid Cooling Series
Rittal Liquid Cooling Series by Herb Villa White Paper 04 Copyright 2006 All rights reserved. Rittal GmbH & Co. KG Auf dem Stützelberg D-35745 Herborn Phone +49(0)2772 / 505-0 Fax +49(0)2772/505-2319 www.rittal.de
More information- White Paper - Data Centre Cooling. Best Practice
- White Paper - Data Centre Cooling Best Practice Release 2, April 2008 Contents INTRODUCTION... 3 1. AIR FLOW LEAKAGE... 3 2. PERFORATED TILES: NUMBER AND OPENING FACTOR... 4 3. PERFORATED TILES: WITH
More informationWhite Paper. Data Center Containment Cooling Strategies. Abstract WHITE PAPER EC9001. Geist Updated August 2010
White Paper Data Center Containment Cooling Strategies WHITE PAPER EC9001 Geist Updated August 2010 Abstract Deployment of high density IT equipment into data center infrastructure is now a common occurrence
More informationBytes and BTUs: Holistic Approaches to Data Center Energy Efficiency. Steve Hammond NREL
Bytes and BTUs: Holistic Approaches to Data Center Energy Efficiency NREL 1 National Renewable Energy Laboratory Presentation Road Map A Holistic Approach to Efficiency: Power, Packaging, Cooling, Integration
More informationEducation Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009
Education Evolution: Scalable Server Rooms George Lantouris Client Relationship Manager (Education) May 2009 Agenda Overview - Network Critical Physical Infrastructure Cooling issues in the Server Room
More informationUnified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers
Unified Physical Infrastructure SM (UPI) Strategies for Smart Data Centers Deploying a Vertical Exhaust System www.panduit.com WP-09 September 2009 Introduction Business management applications and rich
More informationData Center 2020: Delivering high density in the Data Center; efficiently and reliably
Data Center 2020: Delivering high density in the Data Center; efficiently and reliably March 2011 Powered by Data Center 2020: Delivering high density in the Data Center; efficiently and reliably Review:
More informationAIA Provider: Colorado Green Building Guild Provider Number: 50111120. Speaker: Geoff Overland
AIA Provider: Colorado Green Building Guild Provider Number: 50111120 Office Efficiency: Get Plugged In AIA Course Number: EW10.15.14 Speaker: Geoff Overland Credit(s) earned on completion of this course
More informationData Centre Testing and Commissioning
Data Centre Testing and Commissioning What is Testing and Commissioning? Commissioning provides a systematic and rigorous set of tests tailored to suit the specific design. It is a process designed to
More informationEnergy and Cost Analysis of Rittal Corporation Liquid Cooled Package
Energy and Cost Analysis of Rittal Corporation Liquid Cooled Package Munther Salim, Ph.D. Yury Lui, PE, CEM, LEED AP eyp mission critical facilities, 200 west adams street, suite 2750, Chicago, il 60606
More informationDataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences
DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no significant differences November 2011 Powered by DataCenter 2020: hot aisle and cold aisle containment efficiencies reveal no
More informationAPC APPLICATION NOTE #92
#92 Best Practices for Designing Data Centers with the InfraStruXure InRow RC By John Niemann Abstract The InfraStruXure InRow RC is designed to provide cooling at the row and rack level of a data center
More informationUnified Physical Infrastructure (UPI) Strategies for Thermal Management
Unified Physical Infrastructure (UPI) Strategies for Thermal Management The Importance of Air Sealing Grommets to Improving Smart www.panduit.com WP-04 August 2008 Introduction One of the core issues affecting
More informationCANNON T4 MINI / MICRO DATA CENTRE SYSTEMS
CANNON T4 MINI / MICRO DATA CENTRE SYSTEMS Air / Water / DX Cooled Cabinet Solutions Mini Data Centre All in One Solution: Where there is a requirement for standalone computing and communications, or highly
More informationDefining Quality. Building Comfort. Precision. Air Conditioning
Defining Quality. Building Comfort. Precision Air Conditioning Today s technology rooms require precise, stable environments in order for sensitive electronics to operate optimally. Standard comfort air
More informationCombining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency
A White Paper from the Experts in Business-Critical Continuity TM Combining Cold Aisle Containment with Intelligent Control to Optimize Data Center Cooling Efficiency Executive Summary Energy efficiency
More informationCenter Thermal Management Can Live Together
Network Management age e and Data Center Thermal Management Can Live Together Ian Seaton Chatsworth Products, Inc. Benefits of Hot Air Isolation 1. Eliminates reliance on thermostat 2. Eliminates hot spots
More informationCooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Lars Strong, P.E., Upsite Technologies, Inc. Kenneth G. Brill, Upsite Technologies, Inc. 505.798.0200
More informationHow High Temperature Data Centers and Intel Technologies Decrease Operating Costs
Intel Intelligent Management How High Temperature Data Centers and Intel Technologies Decrease Operating Costs and cooling savings through the use of Intel s Platforms and Intelligent Management features
More informationHow Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems
How Does Your Data Center Measure Up? Energy Efficiency Metrics and Benchmarks for Data Center Infrastructure Systems Paul Mathew, Ph.D., Staff Scientist Steve Greenberg, P.E., Energy Management Engineer
More informationManaging Cooling Capacity & Redundancy In Data Centers Today
Managing Cooling Capacity & Redundancy In Data Centers Today About AdaptivCOOL 15+ Years Thermal & Airflow Expertise Global Presence U.S., India, Japan, China Standards & Compliances: ISO 9001:2008 RoHS
More informationUniversity of St Andrews. Energy Efficient Data Centre Cooling
Energy Efficient Data Centre Cooling St. Andrews and Elsewhere Richard Lumb Consultant Engineer Future-Tech Energy Efficient Data Centre Cooling There is no one single best cooling solution for all Data
More informationThe Mission Critical Data Center Understand Complexity Improve Performance
The Mission Critical Data Center Understand Complexity Improve Performance Communications Technology Forum Fall 2013 Department / Presenter / Date 1 Rittal IT Solutions The State of the Data Center Market
More informationIT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS
IT White Paper MANAGING EXTREME HEAT: COOLING STRATEGIES FOR HIGH-DENSITY SYSTEMS SUMMARY As computer manufacturers pack more and more processing power into smaller packages, the challenge of data center
More informationUsing Portable Spot Air Conditioners in Industrial Applications
Using Portable Spot Air Conditioners in Industrial Applications Cooling People, Processes and Equipment In today s highly competitive manufacturing sector, companies are constantly looking for new ways
More informationEnergy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources
Energy Recovery Systems for the Efficient Cooling of Data Centers using Absorption Chillers and Renewable Energy Resources ALEXANDRU SERBAN, VICTOR CHIRIAC, FLOREA CHIRIAC, GABRIEL NASTASE Building Services
More informationCooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings
WHITE PAPER Cooling Capacity Factor (CCF) Reveals Stranded Capacity and Data Center Cost Savings By Kenneth G. Brill, Upsite Technologies, Inc. Lars Strong, P.E., Upsite Technologies, Inc. 505.798.0200
More informationCreating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant
RESEARCH UNDERWRITER WHITE PAPER LEAN, CLEAN & GREEN Wright Line Creating Data Center Efficiencies Using Closed-Loop Design Brent Goren, Data Center Consultant Currently 60 percent of the cool air that
More informationAIRAH Presentation April 30 th 2014
Data Centre Cooling Strategies The Changing Landscape.. AIRAH Presentation April 30 th 2014 More Connections. More Devices. High Expectations 2 State of the Data Center, Emerson Network Power Redefining
More informationThe CEETHERM Data Center Laboratory
The CEETHERM Data Center Laboratory A Platform for Transformative Research on Green Data Centers Yogendra Joshi and Pramod Kumar G.W. Woodruff School of Mechanical Engineering Georgia Institute of Technology
More informationChiller-less Facilities: They May Be Closer Than You Think
Chiller-less Facilities: They May Be Closer Than You Think A Dell Technical White Paper Learn more at Dell.com/PowerEdge/Rack David Moss Jon Fitch Paul Artman THIS WHITE PAPER IS FOR INFORMATIONAL PURPOSES
More informationBICSInews. plus. july/august 2012. + Industrial-grade Infrastructure. + I Say Bonding, You Say Grounding + Data Center Containment.
BICSInews m a g a z i n e july/august 2012 Volume 33, Number 4 plus + Industrial-grade Infrastructure and Equipment + I Say Bonding, You Say Grounding + Data Center Containment & design deployment The
More informationState of the Art Energy Efficient Data Centre Air Conditioning
- White Paper - State of the Art Energy Efficient Data Centre Air Conditioning - Dynamic Free Cooling - Release 2, April 2008 Contents ABSTRACT... 3 1. WHY DO I NEED AN ENERGY EFFICIENT COOLING SYSTEM...
More informationEnergy Impact of Increased Server Inlet Temperature
Energy Impact of Increased Server Inlet Temperature By: David Moss, Dell, Inc. John H. Bean, Jr., APC White Paper #138 Executive Summary The quest for efficiency improvement raises questions regarding
More informationHow To Run A Data Center Efficiently
A White Paper from the Experts in Business-Critical Continuity TM Data Center Cooling Assessments What They Can Do for You Executive Summary Managing data centers and IT facilities is becoming increasingly
More informationEnergy Efficient Data Centre at Imperial College. M. Okan Kibaroglu IT Production Services Manager Imperial College London.
Energy Efficient Data Centre at Imperial College M. Okan Kibaroglu IT Production Services Manager Imperial College London 3 March 2009 Contents Recognising the Wider Issue Role of IT / Actions at Imperial
More informationA Comparative Study of Various High Density Data Center Cooling Technologies. A Thesis Presented. Kwok Wu. The Graduate School
A Comparative Study of Various High Density Data Center Cooling Technologies A Thesis Presented by Kwok Wu to The Graduate School in Partial Fulfillment of the Requirements for the Degree of Master of
More informationFive Strategies for Cutting Data Center Energy Costs Through Enhanced Cooling Efficiency
Five Strategies for Cutting Data Center Energy Costs Through Enhanced Cooling Efficiency A White Paper from the Experts in Business-Critical Continuity TM Executive Summary As electricity prices and IT
More informationDataCenter 2020: first results for energy-optimization at existing data centers
DataCenter : first results for energy-optimization at existing data centers July Powered by WHITE PAPER: DataCenter DataCenter : first results for energy-optimization at existing data centers Introduction
More informationUsing Outside Air and Evaporative Cooling to Increase Data Center Efficiency
Using Outside Air and Evaporative Cooling to Increase Data Center Efficiency Mee Industries Inc. 16021 Adelante Street,Irwindale, California 91702 telephone 1.800.732.5364 fax 1.626.359.4660 info@meefog.com
More informationData Centre Cooling Air Performance Metrics
Data Centre Cooling Air Performance Metrics Sophia Flucker CEng MIMechE Ing Dr Robert Tozer MSc MBA PhD CEng MCIBSE MASHRAE Operational Intelligence Ltd. info@dc-oi.com Abstract Data centre energy consumption
More informationBest Practices. for the EU Code of Conduct on Data Centres. Version 1.0.0 First Release Release Public
Best Practices for the EU Code of Conduct on Data Centres Version 1.0.0 First Release Release Public 1 Introduction This document is a companion to the EU Code of Conduct on Data Centres v0.9. This document
More informationTypical Air Cooled Data Centre Compared to Iceotope s Liquid Cooled Solution
Typical Air Cooled Data Centre Compared to Iceotope s Liquid Cooled Solution White Paper July 2013 Yong Quiang Chi University of Leeds Contents Background of the Work... 3 The Design of the Iceotope Liquid
More information2006 APC corporation. Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms
Cooling Solutions and Selling Strategies for Wiring Closets and Small IT Rooms Agenda Review of cooling challenge and strategies Solutions to deal with wiring closet cooling Opportunity and value Power
More informationIMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT
DATA CENTER RESOURCES WHITE PAPER IMPROVING DATA CENTER EFFICIENCY AND CAPACITY WITH AISLE CONTAINMENT BY: STEVE HAMBRUCH EXECUTIVE SUMMARY Data centers have experienced explosive growth in the last decade.
More information