Architecture for Modular Data Centers. It is all about tailoring the most optimized solution, and being able to prove it to you.



Transcription:

Architecture for Modular Data Centers. It is all about tailoring the most optimized solution, and being able to prove it to you. (AA)

Facing new pressures: data centers are at a tipping point. Three forces are converging: increased computing demand, changing cost dynamics, and a data center lifecycle mismatch.
- In the next decade, server shipments will grow 6x, and storage 69x (IBM / consultant studies).
- Per square foot, annual data center energy costs are 10 to 30 times those of a typical office building. [4]
- Data centers have doubled their energy use in the past five years. [5]
- US commercial electricity costs increased by 10 percent from 2005 to 2006. [6]
- Eighty-six percent of data centers were built before 2001. [7]
- Twenty-nine percent of clients identified data center capability as affecting server purchases (Ziff Davis).
- According to Gartner, "The underlying consumption of energy in large data centers to power and cool hardware infrastructure is likely to increase steadily during the next ten years." [2]
Sources: 2. Gartner, "Data Center Power and Cooling Scenario Through 2015," Rakesh Kumar, March 2007. 4. William Tschudi, March 2006. 5. Koomey, February 2007. 6. EPA Monthly Forecast, 2007. 7. Nemertes Research, "Architecting and Managing the 21st Century Data Center," Johna Till Johnson, 2006.

Commodity Data Center Growth
- Software as a Service: services without value-add are going off premise; payroll, security, etc. all went years ago. There are substantial economies of scale: services run at 10^5+ systems under management rather than ~10^2, and IT outsourcing is also centralizing compute centers.
- Commercial high-performance computing: better understand customers, optimize the supply chain.
- Consumer services: Google is estimated at half a million systems across 30 data centers.
Basic observation: no single system can reliably reach five nines; redundant hardware brings resultant software complexity. Once software redundancy is in place, the most economic hardware solution is large numbers of commodity systems, as the availability sketch below illustrates.
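To make the five-nines arithmetic concrete, here is a minimal sketch (Python). The 99% per-node availability figure and the 3-of-5 quorum are illustrative assumptions, not numbers from the presentation:

```python
from math import comb

def service_availability(n: int, k: int, node_availability: float) -> float:
    """Probability that at least k of n independent nodes are up (binomial tail)."""
    return sum(
        comb(n, i) * node_availability**i * (1 - node_availability)**(n - i)
        for i in range(k, n + 1)
    )

# Assumed: each commodity node is 99% available on its own ("two nines").
print(f"single node:   {service_availability(1, 1, 0.99):.5f}")  # 0.99000
print(f"3-of-5 quorum: {service_availability(5, 3, 0.99):.7f}")  # ~0.9999901, past five nines
```

Under these assumptions, software redundancy over five 99%-available commodity nodes already clears five nines, which is the economic argument for commodity fleets.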

The Energy Efficiency Initiative: Principles
1. Energy usage in the data center has a significant impact today and will have an even greater impact in the future.
2. Real solutions are available now that can reduce data center energy usage.
3. To meet the challenge, collaboration is a must across IT technology vendors, data center design-and-build businesses, infrastructure technology providers, energy utilities, and governments.
4. Think green, and think ahead: understanding your energy facts is key, and expert advice can help make savings real.

Where does the energy go? The data center energy challenge affects both the physical data center and the IT infrastructure. [Bar chart: percent of total data center electricity use by category. Cooling systems: chiller/cooling tower, humidifier, computer room air conditioner. Information technology. Electrical and building systems: power distribution unit, uninterruptible power supply, switchgear/generator, lighting. Per-category values are not recoverable from the transcript.]

Problem Description

Hot Spots and Recirculation Zones. An ASHRAE Class 1 environment requires inlet air temperature to be maintained between 15 and 32 °C, yet chilled air does not reach the top of the racks. [Diagram: cold aisle / hot aisle / cold aisle arrangement.]

Conventional Data Center Flow Chart. [Diagram: an A/C unit pushes cold input air through the raised-floor plenum, built on pedestals above the concrete subfloor, up into the server racks; hot output return air travels back to the A/C unit. Noted problems: perforated tiles do not support the high flow rates required by high-density servers, and underfloor obstructions from chilled-water pipes and cables restrict airflow.]

Raised Floor, Typical Condition. [Diagram: 100% of the cold airflow leaves the A/C unit into the underfloor plenum, but airflow resistance from the floor support pedestals, cable trays, and solid floor tiles means only 35-45% of it emerges through the perforated floor tiles.]

Optimized Airflow Assessment for Cabling: underfloor savings. A comprehensive assessment of the existing data center cabling infrastructure provides an expert analysis of the overall cabling plant and the actions to take. The service is designed to improve airflow for optimized cooling, simplify change management, and improve cabling resiliency. Offering benefits: improved airflow under the raised floor creates a more energy-efficient data center; hot spots due to bypass airflow are reduced; cabling systems become more manageable; and operational costs associated with cable installation and change management drop. [Before/after photos.]

2nd law of thermodynamics

Resulting operational cost

Product heat density trend chart

Today's Commonly Known Practice: useful rules of thumb.
1. For a quick calculation, divide the number of kilowatts by 3.5 to get tons of cooling.
2. An average rack of 2U servers dissipates 4-5 kW.
3. A full communications rack (40U) of switches will draw 6 kW.
4. An advanced server with two network cards and two disks will draw 400-500 W in practice (2007 data).
5. Servers with a single network card typically draw 300 W per server.
6. A server's actual draw will be about 50% of its nominal rating (in a typical computer room; not in labs!).
Examples (reproduced in the sketch below): one blade center (14 blade servers, 3,600 W) requires one ton of cooling; a blade-server rack with 6 blade centers (84 blade servers) will need 6 tons of cooling; a rack filled with 40 advanced pizza-box servers at 400 W each (50% of nominal) dissipates 16 kW, which requires 4.5 tons of cooling.
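These rules of thumb are easy to script. A minimal sketch (Python); the function names are mine, and the 800 W nominal rating is an assumption chosen so the 50% derating matches the slide's 400 W actual draw:

```python
def kw_to_cooling_tons(kw: float) -> float:
    """Slide's rule of thumb: divide kilowatts by 3.5 to get tons of cooling."""
    return kw / 3.5

def rack_load_kw(servers: int, nominal_watts: float, derate: float = 0.5) -> float:
    """Actual draw is ~50% of nominal in a typical computer room (not in labs)."""
    return servers * nominal_watts * derate / 1000.0

# The slide's examples:
print(kw_to_cooling_tons(3.6))        # one blade center, 3,600 W -> ~1.0 ton

rack_kw = rack_load_kw(40, 800)       # 40 pizza-box servers at 400 W actual each
print(rack_kw)                        # 16.0 kW
print(kw_to_cooling_tons(rack_kw))    # ~4.6 tons (the slide rounds to 4.5)
```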

PUE: Power Usage Effectiveness, the bill that arrives in the envelope at the end of every month. PUE is a coefficient defined as the ratio between the total power delivered to the computer room and the power consumed by the computing systems themselves; the difference is the infrastructure power required to support those systems and remove their heat, for example lighting, air conditioning, and power supplies. In computer rooms, most of that support power goes to the cooling systems. For example, for a blade system that costs about $4,000 and dissipates 500 W, the ongoing operating costs (excluding technician and fault costs, and excluding infrastructure installation) over three years come to $2,628 in a standard computer room with a PUE of 2.0, and $3,942 in a poorly designed computer room with a PUE of 3.0.
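Those three-year figures can be reproduced directly; a minimal sketch (Python), assuming the $0.10/kWh electricity tariff implied by the slide's own numbers:

```python
def three_year_energy_cost(it_watts: float, pue: float, usd_per_kwh: float = 0.10) -> float:
    """Facility energy cost over 3 years: IT load scaled by PUE, priced per kWh."""
    facility_kw = it_watts * pue / 1000.0   # IT load plus cooling/support overhead
    hours = 3 * 365 * 24                    # 26,280 hours
    return facility_kw * hours * usd_per_kwh

print(three_year_energy_cost(500, 2.0))   # 2628.0 -- standard computer room
print(three_year_energy_cost(500, 3.0))   # 3942.0 -- poorly designed room
```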

Scalable Modular Data Center: efficiency within weeks. A cost-effective, high-quality 500-1,000 square foot (roughly 50-100 square meter) data center that can be designed and installed in nearly any working environment in less time than a traditional raised floor. Offering benefits: rapid deployment, with a fully functional, cost-effective data center in 8-12 weeks; an energy-efficient design, with UPS systems and in-row, load-variable cooling; and high quality, as a complete turnkey solution from planning through installation and start-up.

Environmental conditions: the ASHRAE Thermal Guidelines define conditions at the inlet to the IT equipment. In practice, operating temperatures are often much lower than recommended, and humidity is often controlled more tightly than recommended.

Solution

Data Center Stored Cooling Solution: the cool battery. A turnkey, patented thermal storage solution that improves the efficiency of the cooling system and reduces energy costs; the ice cube for your data center. Offering benefits: improved efficiency, with a 40-50 percent improvement in chiller efficiency; reduced cost, shifting up to 30 percent of energy usage out of peak time; access to lower-cost power; and free cooling. A rough sense of the economics follows below.
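Shifting load out of peak hours pays because peak and off-peak tariffs differ. A minimal sketch (Python); the daily cooling load and the two tariff rates are illustrative assumptions, not figures from the presentation:

```python
def annual_cooling_cost(kwh_per_day: float, peak_share: float,
                        peak_rate: float, offpeak_rate: float) -> float:
    """Annual cooling energy cost given the share of kWh bought at the peak tariff."""
    daily = kwh_per_day * (peak_share * peak_rate + (1 - peak_share) * offpeak_rate)
    return daily * 365

# Assumed: 2,000 kWh/day of cooling energy, $0.15/kWh peak, $0.07/kWh off-peak.
before = annual_cooling_cost(2000, 0.50, 0.15, 0.07)   # half the load at peak
after  = annual_cooling_cost(2000, 0.20, 0.15, 0.07)   # 30 points shifted off-peak
print(f"before: ${before:,.0f}  after: ${after:,.0f}  saved: ${before - after:,.0f}")
```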

Solution for New Data Centers. [Piping diagram: a primary cooling loop (building chilled water) feeds a heat exchanger on each of three secondary cooling loops (conditioned water). Each secondary loop comprises a flow control valve, redundant pumps, an expansion tank, supply and return distribution manifolds with quick connects, and flexible hose with a maximum length of 15.24 meters (50 feet).]

Liquid Cooling Loops within Data Center

Solution Cost and ROI. Solution cost varies with the application's requirements. Below are some typical return-on-investment figures for some of the applications (see the payback sketch that follows). Vette Cooltronic doors: 1.5-3 years for new facilities, not including floor-area savings and saturation scenarios. Future's software and consultancy: 1-1.5 years, without saturation scenarios.
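Payback periods like these come from a simple ratio of capital cost to annual savings; a minimal sketch (Python), where both dollar figures are illustrative assumptions only:

```python
def simple_payback_years(capital_cost: float, annual_savings: float) -> float:
    """Years for cumulative savings to repay the upfront investment."""
    return capital_cost / annual_savings

# Assumed: a rear-door cooling retrofit costing $30,000 that saves $15,000/year.
print(simple_payback_years(30_000, 15_000))   # 2.0 years, inside the 1.5-3 year range
```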

Customers

Thank You!