Architecture for Modular Data Centers: It is all about tailoring the most optimized solution for you, and being able to prove it.
Facing new pressures: Data centers are at a tipping point
Increased computing demand, changing cost dynamics, and a data center lifecycle mismatch:
- In the next decade, server shipments will grow 6x, and storage 69x (IBM / consultant studies)
- Per square foot, annual data center energy costs are 10 to 30 times those of a typical office building (William Tschudi, March 2006)
- Data centers have doubled their energy use in the past five years (Koomey, February 2007)
- US commercial electricity costs increased by 10 percent from 2005 to 2006 (EPA Monthly Forecast, 2007)
- Eighty-six percent of data centers were built before 2001 (Nemertes Research, "Architecting and Managing the 21st Century Data Center", Johna Till Johnson, 2006)
- Twenty-nine percent of clients identified data center capability as a factor in server purchases (Ziff Davis)
- According to Gartner, the energy consumed in large data centers to power and cool hardware infrastructure is likely to increase steadily during the next ten years (Gartner, "Data Center Power and Cooling Scenario Through 2015", Rakesh Kumar, March 2007)
Commodity Data Center Growth
- Software as a Service: services without value-add are going off premise (payroll, security, etc. all went years ago)
- Substantial economies of scale: services at 10^5+ systems under management rather than ~10^2
- IT outsourcing is also centralizing compute centers
- Commercial high-performance computing: better understand customers, optimize the supply chain
- Consumer services: Google estimated at half a million systems in 30 data centers
- Basic observation: no single system can reliably reach five 9s of availability (redundant hardware is needed, with resultant software complexity)
- With software redundancy, the most economic hardware solution is large numbers of commodity systems
The Energy Efficiency Initiative: Principles
1. Energy usage in the data center has a significant impact today and will have an even greater impact in the future
2. Real solutions are available now that can reduce data center energy usage
3. Meeting the challenge requires collaboration across IT technology vendors, data center design-and-build businesses, infrastructure technology providers, energy utilities, and governments
4. Think green, and think ahead: understanding your energy facts is key, and expert advice can help make savings real
Where does the energy go? The data center energy challenge affects both the physical data center and the IT infrastructure.
[Chart: percentage of total data center electricity use, by component]
- Cooling systems: chiller / cooling tower, humidifier, computer room air conditioner
- Information technology: the IT load itself
- Electrical and building systems: power distribution unit, uninterruptible power supply, switchgear / generator, lighting
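A breakdown like the one charted above determines PUE directly. A minimal Python sketch, using assumed placeholder percentages (not the chart's actual figures):

```python
# Illustrative sketch only: the percentages below are assumed placeholder
# values, NOT the actual figures from the chart above. PUE follows directly
# from such a breakdown: total facility power divided by IT power.
breakdown_pct = {
    "Information technology": 35.0,
    "Chiller / cooling tower": 25.0,
    "Computer room air conditioner": 15.0,
    "Uninterruptible power supply": 10.0,
    "Power distribution unit": 5.0,
    "Switchgear / generator": 4.0,
    "Humidifier": 3.0,
    "Lighting": 3.0,
}

def pue_from_breakdown(shares: dict) -> float:
    """PUE = total facility power / IT equipment power."""
    return sum(shares.values()) / shares["Information technology"]

print(f"PUE = {pue_from_breakdown(breakdown_pct):.2f}")  # prints "PUE = 2.86"
```

Note how even this generous assumption (35% of power reaching the IT load) yields a PUE near 3, the "poorly designed" end of the scale discussed later in this deck.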
Problem Description
Hot Spots and Recirculation Zones: An ASHRAE Class 1 environment requires the inlet air temperature to be maintained between 15 and 32 °C, yet chilled air often does not reach the top of the racks. [Diagram: alternating cold aisle / hot aisle / cold aisle layout]
Conventional Data Center Flow Chart
[Diagram: the A/C unit (1) pushes cold input air (2) through the underfloor plenum between the concrete subfloor and the raised floor area; air rises through perforated tiles into the server racks (3); hot output air returns to the A/C unit (4)]
- Perforated tiles do not support the high flow rates required by high-density servers
- Underfloor obstructions from chilled water pipes and cables restrict airflow
Raised Floor: Typical Condition
[Diagram: 100% of the cold airflow leaves the A/C unit into the subfloor plenum, but airflow resistance from cable trays, floor support pedestals, and solid floor tiles above the concrete subfloor means only 35-45% of the cold airflow reaches the perforated floor tiles]
Optimized Airflow Assessment for Cabling: under-floor savings
A comprehensive assessment of the existing data center cabling infrastructure provides an expert analysis of the recommended cabling actions. The service is designed to improve airflow for optimized cooling, simplified change management, and improved cabling resiliency.
Offering benefits:
- Improved airflow under the raised floor creates a more energy-efficient data center
- Reduced hot spots due to bypass airflow
- Improved manageability of cabling systems
- Reduced operational costs associated with cable installation and change management
2nd law of thermodynamics
Resulting Operational Cost
Product Heat Density Trend Chart
Today's Common Practice: Useful Rules of Thumb
1. For a quick calculation, divide the number of kilowatts by 3.5 to get tons of cooling.
2. An average rack of 2U servers dissipates 4-5 kW.
3. A full 40U communications rack of switches consumes 6 kW.
4. An advanced server with two network cards and two disks actually consumes 400-500 W (2007 data).
5. Servers with one network card typically consume 300 W per server.
6. Actual server consumption is about 50% of nameplate (in a typical computer room; not in labs!).
Examples:
- One BladeCenter (14 blade servers, 3,600 W) requires one ton of cooling.
- A blade rack with 6 BladeCenters (84 blade servers) requires 6 tons of cooling.
- A rack full of 40 advanced 1U "pizza box" servers at 400 W each (50% of nameplate) dissipates 16 kW, requiring 4.5 tons of cooling.
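The rules of thumb above translate directly into a quick sizing calculation. A minimal sketch (the function names are mine, not from the slide):

```python
# Sketch of the rules of thumb above (2007 figures); function names are mine.
def cooling_tons(kw: float) -> float:
    """Rule 1: divide kilowatts by 3.5 to get tons of cooling."""
    return kw / 3.5

def rack_load_kw(servers: int, nameplate_w: float, derate: float = 0.5) -> float:
    """Rule 6: actual draw is ~50% of nameplate in a typical computer room."""
    return servers * nameplate_w * derate / 1000.0

# The last example above: 40 advanced 1U servers, 400 W actual each
# (800 W nameplate at 50%), giving 16 kW and about 4.5 tons of cooling.
load = rack_load_kw(servers=40, nameplate_w=800)
print(load)                           # 16.0
print(round(cooling_tons(load), 1))   # 4.6 (the slide rounds down to 4.5)
```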
PUE: Power Usage Effectiveness, the bill that arrives in the envelope at the end of every month.
PUE is a coefficient defined as the ratio between the total power delivered to the computer room and the power consumed by the computing systems themselves. The difference is the infrastructure power required to support the computing systems and remove their heat: lighting, air conditioning, power supplies. In computer rooms, most of this support power goes to the cooling systems.
For example, for a blade system that costs about $4,000 and dissipates 500 W, the accompanying electricity cost (excluding technician and fault costs, and excluding the cost of installing the infrastructure) over three years is $2,628 in a standard computer room with a PUE of 2.0, and $3,942 in a poorly designed computer room with a PUE of 3.0.
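The slide's worked example implies an electricity rate of about $0.10/kWh (0.5 kW x PUE 2.0 x 8,760 h x 3 y x $0.10 = $2,628). A sketch reproducing those numbers, with that rate stated as an assumption:

```python
# Reproducing the slide's worked example. The $0.10/kWh rate is an assumption
# inferred from the slide's own figures (it makes both totals come out exactly).
HOURS_PER_YEAR = 8760
RATE_USD_PER_KWH = 0.10  # assumed electricity rate

def three_year_energy_cost(it_watts: float, pue: float, years: float = 3.0) -> float:
    """Total electricity cost: IT power scaled by PUE over the period."""
    kwh = it_watts / 1000.0 * pue * HOURS_PER_YEAR * years
    return kwh * RATE_USD_PER_KWH

print(round(three_year_energy_cost(500, 2.0)))  # 2628: standard room, PUE 2.0
print(round(three_year_energy_cost(500, 3.0)))  # 3942: poorly designed, PUE 3.0
```

The $1,314 difference between the two rooms is a third of the blade system's purchase price, which is the deck's argument for treating PUE as a first-class design metric.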
Scalable Modular Data Center: efficiency within weeks
A cost-effective, high-quality 500-1,000 square foot (50-100 square meter) data center. The unit can be designed and installed in nearly any working environment in less time than a traditional raised-floor build.
Offering benefits:
- Rapid deployment: a fully functional, cost-effective data center in 8-12 weeks
- Energy-efficient design: UPS systems and in-row, load-variable cooling
- High quality: a complete turn-key solution from planning to installation and start-up
Environmental conditions: ASHRAE Thermal Guidelines define conditions at the inlet to the IT equipment. In practice, operating temperatures are often well below the recommended range, and humidity is often controlled more tightly than recommended.
Solution
Data Center Stored Cooling Solution: the cool battery
A turnkey, patented thermal storage solution to improve the efficiency of the cooling system and reduce energy costs. The ice cube for your data center.
Offering benefits:
- Improved efficiency: 40-50 percent improvement in chiller efficiency
- Reduced cost: shift up to 30 percent of energy usage out of peak time
- Access to lower-cost power
- Free cooling
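The peak-shifting benefit above can be sketched numerically. The tariff rates and daily cooling load below are assumed illustrative values, not figures from the slide:

```python
# Hedged sketch of the peak-shifting benefit above. The tariff rates and the
# daily peak-hour cooling load are assumed illustrative values, not slide data.
PEAK_RATE = 0.18      # $/kWh during peak hours (assumed)
OFF_PEAK_RATE = 0.08  # $/kWh off-peak (assumed)

def daily_shift_savings(peak_cooling_kwh: float, shift_fraction: float = 0.30) -> float:
    """Charge thermal storage off-peak, discharge at peak: save the rate delta."""
    shifted_kwh = peak_cooling_kwh * shift_fraction
    return shifted_kwh * (PEAK_RATE - OFF_PEAK_RATE)

# e.g. 2,000 kWh of peak-hour cooling load per day
print(round(daily_shift_savings(2000.0), 2))  # 60.0 dollars per day
```

This captures only the tariff arbitrage; the slide's 40-50 percent chiller-efficiency gain (chillers run more efficiently at night) would come on top of it.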
Solution for New Data Centers
[Diagram: a primary cooling loop (building chilled water) feeds three secondary cooling loops (conditioned water) through heat exchangers. Each secondary loop has a flow control valve, redundant pumps, an expansion tank, supply and return distribution manifolds, quick connects, and flexible hose with a maximum length of 15.24 meters (50 feet).]
Liquid Cooling Loops within Data Center
Solution Cost and ROI
Solution cost varies with application requirements. Below are some typical return-on-investment figures for some of the applications:
- Vette Cooltronic's doors: 1.5-3 years for new facilities, not including floor-area savings and saturation scenarios
- Future's software and consultancy: 1-1.5 years, without saturation scenarios
Customers
Thank You!