Green Data Centre: Is There Such A Thing?
Dr. T. C. Tan, Distinguished Member, CommScope Labs
Topics
- The Why?
- The How?
- Conclusion
The Why?
Drivers for Green IT
- According to Forrester Research, IT accounts for about 2-10% of overall energy consumption
- Gartner estimates that by 2008, 48% of all IT budgets will be spent on energy alone
- In a recent Uptime Institute survey, about 42% of respondents said their data centre would run out of power capacity by 2009 (in 12-24 months)
- Reduce energy cost
- Improve efficiency
- Improve company image
- Corporate social responsibility
- Attract green customers
- Regulations
Energy Efficiency: Critical Objective for Data Centres
(Sources: US EPA Server and Data Centre Energy Efficiency report & EU Efficient Servers report)
[Chart: annual electricity use (TWh) of data centres in the USA and EU-27, 2006 actual vs. 2011 projection]
Data Centre Energy Consumption in 2006
[Chart: annual electricity use (TWh) in the USA and EU-27, broken down into infrastructure (cooling, power), network equipment, storage, high-end servers, mid-range servers and volume servers]
Power Trend Chart (Source: The Uptime Institute 2000 & The Thermal Consortium)
Revised Power Trend Chart (Source: ASHRAE Datacom Equipment Power Trends and Cooling Applications)
EU Regulations
- Energy Performance of Buildings Directive, EPBD (2002/91/EC): requires energy performance certification
- Energy-using Products Directive, EuPD (2005/32/EC): sets eco-design requirements for energy-using products
- WEEE (2002/96/EC): sets collection, recycling and recovery obligations for waste electrical and electronic equipment
- RoHS (2002/95/EC): restricts the use of hazardous substances in electrical and electronic equipment
Possible Outcomes from the US Environmental Protection Agency (EPA) Report
- The report was required by Public Law 109-431 from Congress
- May mandate steps for US Government data centres
- May recommend steps for commercial data centres
US Green Building Council: LEED Program
- Leadership in Energy and Environmental Design (LEED): a green building rating system for new construction and major renovations
- Four levels: Platinum, Gold, Silver, Certified
- Accepted benchmark for the design, construction, and operation of high-performance green buildings
The How?
IEEE 802.3az Energy Efficient Ethernet
- IEEE Study Group formed: January 2007
- Some objectives:
  - Define a mechanism to reduce power consumption during periods of low utilisation
  - Define a protocol to co-ordinate transitions to and from a lower level of power consumption
  - Link status should not change as a result of the transition
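To make the low-utilisation objective concrete, here is a minimal Python sketch of the general idea: a link drops into a low-power idle state when traffic is quiet and wakes when traffic returns, without the link ever reporting down. The state names, thresholds and the next_state() helper are illustrative assumptions, not the 802.3az mechanism itself.

# Illustrative sketch only: a toy model of the Energy Efficient Ethernet idea of
# dropping into a low-power idle (LPI) state during quiet periods while the link
# stays "up". Thresholds, timings and state names are invented for illustration
# and are not taken from the IEEE 802.3az specification.

ACTIVE, LOW_POWER_IDLE = "active", "low_power_idle"

ENTER_LPI_BELOW = 0.05   # assumed utilisation threshold to request low power
EXIT_LPI_ABOVE = 0.10    # assumed utilisation threshold to wake the link

def next_state(state, utilisation):
    """Decide the next power state from the observed link utilisation."""
    if state == ACTIVE and utilisation < ENTER_LPI_BELOW:
        return LOW_POWER_IDLE          # quiet link: request lower power
    if state == LOW_POWER_IDLE and utilisation > EXIT_LPI_ABOVE:
        return ACTIVE                  # traffic is back: wake the transmitter
    return state                       # otherwise stay put; link status never changes

if __name__ == "__main__":
    state = ACTIVE
    for util in [0.40, 0.03, 0.02, 0.01, 0.20, 0.50]:
        state = next_state(state, util)
        print(f"utilisation={util:.2f} -> {state}")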
Possible IEEE 802.3az Timeline
The Green Grid
- Non-profit consortium: Microsoft, Intel, AMD, Dell, HP, Sun, IBM, APC, Cisco, etc.
- Develops and promotes energy efficiency in data centres
- Lobbies for LEED data centre certification
- Provides metrics for describing data centre power efficiency
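The Green Grid's best-known efficiency metrics are PUE (Power Usage Effectiveness) and its reciprocal DCiE; the slide does not detail them, so the minimal Python sketch below shows how they are computed, using made-up power figures.

# Sketch of The Green Grid's data centre efficiency metrics:
#   PUE  = total facility power / IT equipment power   (lower is better, >= 1.0)
#   DCiE = IT equipment power / total facility power   (PUE's reciprocal, as a %)
# The example numbers are made up for illustration.

def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    return total_facility_kw / it_equipment_kw

def dcie(total_facility_kw: float, it_equipment_kw: float) -> float:
    return 100.0 * it_equipment_kw / total_facility_kw

if __name__ == "__main__":
    total, it = 1000.0, 500.0                # kW drawn by the whole site vs. the IT load
    print(f"PUE  = {pue(total, it):.2f}")    # 2.00: half the power feeds cooling, UPS losses, etc.
    print(f"DCiE = {dcie(total, it):.0f}%")  # 50%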
The Green Grid (cont.)
Provides guidelines for energy-efficient data centres:
- Floor layout: hot aisle/cold aisle layout and optimum locations of vented floor tiles and CRAC units (use CFD)
- Proper configuration of server dynamic power management software
- Installation of more efficient power equipment and energy-efficient lighting
- Server and storage consolidation (retire legacy servers)
- Server and storage virtualisation (VSERVER and VSAN)
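As a rough illustration of why consolidation and virtualisation appear in the list above, the sketch below estimates the power saved by retiring lightly loaded legacy servers and re-hosting their workloads on a few virtualised machines; all server counts and wattages are assumed figures, not measurements.

# A back-of-the-envelope consolidation estimate (illustrative numbers only):
# how much power is saved by retiring lightly loaded legacy servers and moving
# their workloads onto fewer virtualised hosts.

def consolidation_saving_kw(legacy_count, legacy_watts_each,
                            host_count, host_watts_each):
    """Return (before_kW, after_kW, saving_kW) for a simple consolidation."""
    before = legacy_count * legacy_watts_each / 1000.0
    after = host_count * host_watts_each / 1000.0
    return before, after, before - after

if __name__ == "__main__":
    # Assumed figures: 40 legacy volume servers at ~300 W each, consolidated
    # onto 5 virtualised hosts at ~500 W each.
    before, after, saving = consolidation_saving_kw(40, 300, 5, 500)
    print(f"before {before:.1f} kW, after {after:.1f} kW, saving {saving:.1f} kW")
    # Cooling and power-distribution losses scale with the IT load, so the
    # facility-level saving is roughly this figure multiplied by the site's PUE.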
Hot & Cold Aisles Concept
[Diagram: cabinet rows arranged front to front and rear to rear, forming alternating cold and hot aisles]
Equipment layout:
- Front to front on the cold aisle, with perforated floor tiles supplying cool air that the equipment fans draw through; power cables are routed under the raised floor
- Back to back on the hot aisle, with solid floor tiles, using the equipment fan exhausts; IT cables are routed under the raised floor
Raised Floor and CRAC Placement
- Proper raised floor depth: recommended depth is 457-610 mm (18-24 in)
- Proper CRAC placement:
  - Maximise return air flow
  - Align CRACs with hot aisles unless ducting or another heat-containment method is used
- Minimise obstacles to good air flow (e.g. CRAC piping); remove old cables and pipes
CRAC and Cabinet/Rack Placement
- Air velocity under the floor is highest closest to the CRAC
- High velocity = low pressure (Bernoulli's principle)
- Low pressure = low air delivery through the floor tiles
- Place low-heat cabinets/racks (e.g. patching) closest to the CRAC
- Place high-heat cabinets at the best airflow positions
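The velocity/pressure statement follows from Bernoulli's principle for steady, incompressible flow; a compact statement of the relation (ignoring the height term for the near-level underfloor plenum) is:

% Bernoulli's principle along a streamline (steady, incompressible flow,
% height term dropped for the near-level underfloor plenum):
% where the air velocity v is high (close to the CRAC), the static pressure p
% that pushes air up through the perforated tiles is low.
\[
  p_1 + \tfrac{1}{2}\rho v_1^{2} \;=\; p_2 + \tfrac{1}{2}\rho v_2^{2}
  \quad\Longrightarrow\quad
  v_1 > v_2 \;\Rightarrow\; p_1 < p_2 .
\]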
Apply Practical Physics
[Diagram: cabinet inlet temperatures of roughly 15-25 ºC, comparing a cabinet without blanking panels to one with blanking panels]
- Install blanking panels in unused rack spaces
- Place the highest-heat equipment towards the bottom of the cabinet
Typical Server Consumption: share of peak power by component (Source: EPA Report)
- CPU: 32%
- Peripheral: 20%
- PSU losses: 15%
- Memory: 14%
- Motherboard: 10%
- Disk: 5%
- Fan: 4%
Use thin clients where appropriate
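Assuming a nominal total peak draw (300 W here, an assumed figure rather than one from the slide), the shares above translate into per-component watts; a minimal Python sketch:

# Splitting a server's peak power draw across components using the EPA-style
# shares from the slide; the 300 W total is an assumed figure for illustration.

PEAK_SHARE = {          # fraction of peak power per component (from the slide)
    "CPU": 0.32, "Peripheral": 0.20, "PSU losses": 0.15, "Memory": 0.14,
    "Motherboard": 0.10, "Disk": 0.05, "Fan": 0.04,
}

def component_watts(total_peak_watts: float) -> dict:
    return {part: round(share * total_peak_watts, 1)
            for part, share in PEAK_SHARE.items()}

if __name__ == "__main__":
    assert abs(sum(PEAK_SHARE.values()) - 1.0) < 1e-9   # shares cover the whole budget
    for part, watts in component_watts(300.0).items():
        print(f"{part:<12} {watts:6.1f} W")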
Conclusion
- Business as usual is NOT AN OPTION
- Cost and regulations will mandate efficiency improvements
- Change the architecture
- Apply optimisation techniques
- Install energy metering tools
- Manage stored data: audit regularly and delete what is no longer needed
- Maximise use of free cooling (i.e. outside air)
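As a rough way to size the free-cooling opportunity mentioned above, the sketch below counts the hours in a year when outside air is below an assumed supply-air set point, i.e. hours when outside air could cool the room without running chillers; the set point and the synthetic temperature trace are illustrative assumptions, not site data.

# Rough free-cooling estimate (illustrative): count how many hours in a year of
# hourly outside-air temperatures fall below the supply-air set point.

import math

SUPPLY_SETPOINT_C = 18.0   # assumed economiser cut-off temperature

def free_cooling_hours(hourly_temps_c, setpoint_c=SUPPLY_SETPOINT_C):
    return sum(1 for t in hourly_temps_c if t < setpoint_c)

if __name__ == "__main__":
    # Synthetic year: a daily temperature swing superimposed on a seasonal swing.
    temps = [12 + 10 * math.sin(2 * math.pi * h / 8760)       # seasonal term
               + 4 * math.sin(2 * math.pi * (h % 24) / 24)    # daily term
             for h in range(8760)]
    hours = free_cooling_hours(temps)
    print(f"{hours} of 8760 hours ({100 * hours / 8760:.0f}%) allow free cooling")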
Conclusion (cont.)
- Change processes:
  - Adopt a green procurement policy
  - IT needs to know its own power consumption (typically handled by the facilities department)
- Change behaviour:
  - Get employees participating (via blogs, e-boards, etc.)
  - Switch off unused printers and PCs at the end of each work day
Do Green Data Centres Exist?
- Not really
- The IT industry must take the lead
- Provide standardised benchmarking metrics for green data centres
Thank You