Airflow Utilization Efficiency (AUE): Can we cool today's servers with airflow?




How to use River Cooling to reduce energy costs while improving server performance.

By Thomas S. Weiss, Sr. Vice President, Triad Floors by NxGen, LLC

This paper offers insight into how airflow can be utilized and manipulated to cool servers in the data center. The analysis presents new findings on how perforated tiles perform and frames that performance in financial terms. The paper provides a more detailed understanding of the capabilities and science of raised-floor cooling, methods for testing tiles, and a total-cost-of-ownership example. It is intended to help data center managers make the most cost-effective decisions about keeping cooling costs low while reducing hot spots and improving server performance.

Can we cool today's servers with airflow?

Cooling in the data center has been the role of the raised floor for the last 40 years. Air is sent into the underfloor plenum, and perforated tiles are placed in front of the server racks. The hope is that the air will rise to the servers and be pulled through them by the server fans. This system has been relatively effective while heat loads within the servers were low.

In the past couple of years, data center servers have been asked to do much more. With the use of virtualization products such as VMware, servers now run at 60-80% of capacity, compared with 20-40% previously. This adds a significant amount of heat to the server's internal processing. Coupled with the emergence of video, digital security, IP telephony, and CRM/ERP applications, power usage is now pushing 7 kW per rack, with many sites looking to scale to 12-25 kW and beyond.

Higher heat loads within the server rack have begun to highlight the inefficiencies of raised-floor cooling systems. As heat loads rise, hot spots emerge, followed by skyrocketing energy costs. The inefficiencies of the airflow tile have made it impossible for the air to reach all the servers in the rack.
The problem with conventional airflow systems results from the inability of the airflow to penetrate the boundary layers of heat on the front, the sides, and the top of the rack/cabinet. As a result, data center managers have to lower the set point of the data center to keep the servers cool. Servers can function at an intake temperature of 80 degrees, but data centers are set at 72, 68, and sometimes as low as 65 degrees. Since every degree of movement in the set point equates to roughly a 4% impact on the energy bill, data centers now spend up to 60% of the utility bill on the cooling systems.
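The 4%-per-degree rule of thumb above turns set-point decisions into simple arithmetic. A minimal sketch (the 4% figure is the paper's; the function name and structure are illustrative):

```python
def cooling_savings_pct(setpoint_raise_degrees, pct_per_degree=4.0):
    """Estimate cooling-energy savings (percent) from raising the set point,
    using the paper's rule of thumb of ~4% per degree Fahrenheit."""
    return setpoint_raise_degrees * pct_per_degree

print(cooling_savings_pct(8))   # an 8-degree raise -> 32.0 (percent)
print(cooling_savings_pct(12))  # a 12-degree raise -> 48.0 (percent)
```

The 8-degree case matches the customer result reported later in this paper. The rule is linear, so treat it as an estimate rather than an exact model of chiller behavior.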

When we analyze in detail what happens when the airflow system does not penetrate the boundary layers of heat, we first find a large discrepancy between the temperatures at the bottom of the rack and the top of the rack. Since the top of the rack is the hottest part of the server environment, the heat at the top of the rack drives the temperature of the whole data center. Higher temperatures at the top of the rack force us to lower the temperature in the room so we can keep the upper servers below 90 degrees. The reason for such a discrepancy between the bottom and the top is that the air coming out of the floor sits in the cold aisle instead of flowing into the hot servers. This inefficient system of "leaking air" into the cold aisle makes the server fan the primary catalyst for cooling the server.

When we analyze the entire rack with thermal imaging, we see that cooling is primarily present where there is a fan. This leads to warmer temperatures at the perimeter of the servers, creating hot spots at the sides, top, and bottom of the cabinet. In effect, the tiles cool only a small portion of the boundary layer of heat on the front side of the cabinet. Since the air does not flow to the heat source, the only way data center managers can keep the heated areas of the servers cool is to make the room colder. They, in effect, have to turn the data center into a refrigerator.

This inefficiency of the airflow system has led some companies to move away from raised-floor cooling toward more effective, if extremely expensive, solutions. These alternatives consist of active cooling devices and designs that use fans, liquid cooling, and customized racking. Cooling is improved, but the solution can cost as much as $500,000 for as few as 12 racks. These systems are 10 times more expensive than their passive counterparts.
So from a capital-outlay perspective, the most cost-effective cooling system would be a passive solution capable of supplying air to the front of the servers, starting at 8 inches and stratifying up to 7 feet. Air needs to flow to the servers. The tile needs to work for the server. The air in the cold aisle should be more than present; it should attack the boundary layer of heat on the front and top of the rack. This would allow us to keep every server in the rack at the same temperature and would block the hot air from coming over the top of the rack and heating the upper servers.

The end result of deploying an efficient airflow system is that temperatures could be consistent at the top and bottom of the rack, and set points could rise to the temperature of the air coming out of the floor. If we were able to use air movement effectively, data center cooling costs could be reduced by as much as 50%.

The tile is the entity that delivers the air to the servers, but it is only one part of the system. We have broken the airflow ecosystem into four components, each of which plays a part in the efficiency of the airflow system. Each component supports the flow of air while minimizing waste and loss of thermal integrity. Each and every molecule of cool, moving air should go through or across a server.

These four components are:

1. The underfloor space
2. The perforated tile
3. The cabinet/rack
4. The return air system

This paper primarily focuses on one part of the system: the perforated tile as an airflow delivery device. Through a detailed analysis of perforated tile performance in an open-plenum environment, we have found the need for three distinct performance characteristics:

1. The tile should be constructed so that it ensures positive flow out of every part of the tile (no short cycling).
2. It should divert the air to the servers so it can penetrate the boundary layer of heat on the front side of the cabinet/rack.
3. It should be able to "stratify" the air to the top of the cold aisle.

Servers need a constant flow of air to keep them cool. All three of these performance parameters are necessary, and together they lead to consistent temperatures and improved cooling.

When we evaluate traditional perforated tiles against these requirements, we find the requirements difficult to meet. Traditional tiles have been tested in a duct and were never designed to divert the air. When we measure the performance of the tile in an open plenum, we find the design of the tile to be flawed, leading to mixing, no dispersion of the air into the servers, and an inability to flow to the upper servers.

This flaw comes from the "flat bottom" design found on most perforated tiles. The best way to illustrate the impact of this design flaw is to compare flat-bottom tiles to a car window while the car is moving down the road.

With your car windows open, you find inconsistent flow coming into the car. Air going by the window passes part of the "open area" of the window, providing some air into the car but pulling air out of the car as well. The only way to get the air to flow directly into the car is to angle it toward the area needing the air. This angling of the air toward the heated item creates wind chill on the surface of the item to be cooled and dissipates heat by letting the air pass by the item. In other words, we need a vent window to divert the air to the servers.
Flat-bottom tiles are like a car window: air passes by part of the tile, which leads to mixing, less directed flow, and a low stratification line. These flaws have led to design considerations that reduce the impact of the flaw but result in a great deal of inefficient airflow utilization. These airflow system designs have focused on pressure and open space instead of velocity and wind chill.

In their study of airflow in the data center, "Cooling Techniques that Meet 24 by Forever Demands of Your Data Center", Dr. Bob F. Sullivan and Kenneth G. Brill estimate that only 28% of the air in a raised-floor airflow system actually gets to the servers. This means 72% of the air is wasted and is present only to guide the other 28%. We, in effect, use air to divert air. Consequently, the same study finds that we deploy 2.6 times more cooling capacity than is necessary.

The hidden drain of energy: flat-bottom short cycling

For years, perforated tiles have been made with flat bottoms. The air was supposed to come through the tiles based on the amount of open space in the tile and the static pressure in the underfloor plenum. The assumption was that the air was sitting still (the lake) and would leak consistently through the holes in the tile. All testing of perforated tiles was done in a duct, not in an open plenum, so testing never contemplated the real-life environment of these tiles.

When we analyze the performance of flat-bottom tiles in an open-plenum environment, we quickly learn that if the air is moving too fast, not only does the air fail to go through the tile, the tile actually pulls air into the underfloor plenum. This is what is known as a short cycle. It was thought that the problem could be corrected by moving the tile farther away from the CRAC unit; once the air slows down, it begins to move through the tile. The reason air does not go through a tile, and is instead pulled back into the underfloor plenum, is the Venturi effect: by Bernoulli's principle, fast-moving air under the tile has lower static pressure than the still air above it.

Air passing by the open area of the tile therefore pulls air downward through it. So when a tile is close to a CRAC unit, the velocity of the air overrides the static pressure under the floor, pulling air from above the floor into the plenum. This leads to lower pressure at the front of the tile. And since this happens whenever air passes by any part of a tile, it must be happening all over the data center.
Even if air comes out of the back of the tile, at certain velocities air will be pulled back into the plenum at the front of the tile. This pull makes it difficult to disperse air to the front of the server and pulls the air back down into the plenum, creating a short cycle within the tile. This pull, and the short cycle itself, has the unwanted effect of making it virtually impossible to get air to the upper servers; it fails to direct air to the lower servers, and it mixes the cool air with the warm air.
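The physics of the short cycle can be made concrete with Bernoulli's principle: fast-moving air under the tile gives up static pressure equal to its dynamic pressure, ½ρv². A rough sketch (the air density, jet velocity, and plenum-pressure figures are illustrative assumptions, not from the paper):

```python
def dynamic_pressure(velocity_m_s, air_density_kg_m3=1.2):
    """Dynamic pressure q = 0.5 * rho * v^2, in pascals.
    For incompressible flow, static pressure drops by about this much
    as the underfloor air speeds up."""
    return 0.5 * air_density_kg_m3 * velocity_m_s ** 2

# A 5 m/s underfloor jet near the CRAC discharge (assumed values):
q = dynamic_pressure(5.0)
print(q)  # 15.0 Pa
# Raised-floor plenums are often pressurized to only ~10-25 Pa, so this
# local static-pressure deficit can cancel the plenum pressure and pull
# room air DOWN through the front of the tile -- the short cycle.
```

This is why moving a tile farther from the CRAC unit helps: velocity falls with distance, the dynamic-pressure term shrinks, and the plenum's static pressure wins again.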

Triad thermal image compared to a flat-bottom tile: note the pull at the front of the flat-bottom tile on the right, and the lower temperature of the air coming out of the three Triad tiles. Air coming out of the Triad tile is 4 degrees colder than air coming out of the flat-bottom tile. (Thanks to the European-based Daxten for this tile testing.)

The Hi-Plume Fin

In 2003, a global Fortune 100 computer manufacturer began to work on the short-cycle issue with its vendor, Triad Floors by NxGen, LLC. Triad analyzed the inefficiencies of the perforated tile and concluded that getting air to stratify to over 7 feet, and dispersing air into the front of the servers, were mandatory performance qualifications for the perforated tile. As a direct result of this analysis and a detailed study of fluid dynamics, Gary Meyer, founder and president of Triad Floors, created the patented Hi-Plume fin.

The fin is designed not only to create positive airflow through every hole in the tile; it also has a curved shape that creates a dispersed pattern of airflow out of the top of the tile. By altering the pressure under the tile, the Hi-Plume fin uses fluid dynamics to, in effect, "bend" the air outward so it can flow into the servers and up to 7 feet.

The end result of introducing the Hi-Plume fin design to the airflow cooling environment has been cooling temperatures at the front of the server that are:

- 3-5 degrees cooler in the bottom third of the rack,
- 5-10 degrees cooler in the middle third, and
- 10-20 degrees cooler in the upper part of the rack.

Below are the test results from a customer who deployed Triad's 65% Hi-Plume fin tile. Please note there are only three 65%-open-area Triad tiles in this row, compared to the six 56% tiles that were there previously: three tiles replacing six, with less air but better cooling through directional airflow.

Front-of-rack temperatures (degrees F) at six positions:

  Top     Before:  69.4   69.4   78.1   82.0   83.1   83.1
          After:   67.6   62.3   63.7   68.1   66.6   66.8
          Diff:    -1.8   -7.1  -14.4  -13.9  -16.5  -16.3

  Middle  Before:  66.7   66.7   78.0   78.0   70.0   70.0
          After:   67.3   64.8   64.3   63.0   65.2   65.6
          Diff:    -0.4   -1.9  -11.7  -15.0   -4.8   -4.4

  Bottom  Before:  65.8   65.8   70.0   70.0   69.7   69.7
          After:   64.7   65.1   65.5   64.8   66.4   65.9
          Diff:    -1.1   -0.7   -4.5   -5.2   -3.3   -3.8

  Triad tile placement:  Solid tile  XXXX  Solid tile  XXXX  Solid tile  XXXX
  56% tile placement:    XX  XX  XX  XX  XX  XX
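The per-position differences in the table can be reproduced directly from the before/after readings. A small verification script (the readings are from the table above; the code itself is illustrative):

```python
# Top-of-rack, front-of-rack readings (degrees F) at six tile positions,
# taken from the customer test table above.
top_before = [69.4, 69.4, 78.1, 82.0, 83.1, 83.1]
top_after  = [67.6, 62.3, 63.7, 68.1, 66.6, 66.8]

# Per-position change after replacing six 56% tiles with three Hi-Plume tiles.
diffs = [round(a - b, 1) for a, b in zip(top_after, top_before)]
print(diffs)  # [-1.8, -7.1, -14.4, -13.9, -16.5, -16.3]

# Average drop at the hottest part of the rack.
avg_drop = round(sum(diffs) / len(diffs), 1)
print(avg_drop)  # -11.7
```

An average drop of nearly 12 degrees at the top of the rack is what allows the set point, which is driven by the top-of-rack temperature, to rise.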

The test results confirm that open space / CFM does not necessarily equate to improved cooling. The test results show lower temperatures with less air and less open space. Also note the consistent temperature at all levels of the rack: the flow of air to the servers in the rack is consistent regardless of height.

This customer was able to raise their set point by 8 degrees, enabling them to realize a cooling cost savings of 32%. This confirms that CFM is not the sole criterion for measuring cooling, and it shows why we see so much wasted air in the airflow system.

The River and the Lake

Data center managers are learning about the inefficiencies of the tile and the costs associated with adding more and more cooling capacity to the data center airflow system. In their pursuit of reduced hot spots and improved cooling, two competing design philosophies have developed. One provides air at sufficient velocity to stratify to 7 feet while flowing into and over the front of the servers; the other creates a large pool of very cool air leaking into the cold aisle. We call these two methods the "River" and the "Lake".

The lake environment relies on creating a large pool of cold air under the floor and in the cold aisle. This pool, or lake, of air is pressurized under the floor. As the pressure is released through the tile, the air is expected to fill the cold aisle. The servers are cooled by the internal fans pulling the cold air from the middle of the aisle through the server.

The river, on the other hand, is designed to use the flow of air to cool. It relies on the movement of air to provide a dissipative effect on the servers and through the server fans. The river method treats the data center as an ecosystem of airflow: a molecule of cool air leaves the CRAC unit, moves through the underfloor plenum, through the tile, and directly into the front of the server, cooling the server and then retreating through the hot upper plenum back to the CRAC unit.

When you compare these two designs, you find the lake needs its air to be a great deal cooler, and the plenum space needs to be deep to keep the air cool and pressurized.
The air sits in areas under the floor and pools in the cold aisle. Much of the air in this system is in place to "contain" the air so there is flow out of the tile. Even with this large pool of air, we find it difficult to get the air to the server boundary layers at the front and top of the rack. As a result, in a lake design, we use only 28% of the air to cool the servers.

The river, on the other hand, relies on the velocity and movement of the air and does not require the air to be as cool (less energy is needed to cool the river). The river is more concerned with the movement of air (like a fan) and the need to get the air to the server. Air is redirected and concentrated by directional flow to the servers. If you want a river to work, you need to keep the air moving to and through the server.

There are three benefits to the river system. First, by breaking down the boundary layer at the front and top of the rack, you improve the cooling capabilities of the airflow system. Second, you need less air than you do in a lake system: the river system contains the air under the floor, focuses the air onto the server by managing the air out of the tile, uses the cabinet door to further channel the air to the servers and, finally, supports an 18-degree delta-T in the hot aisle to pull the air from the back of the servers. Third is cost: a well-designed river system can reduce the need for CRAC/CRAH-generated air by up to 50% and can lower upper-server temperatures by 10 to 20 degrees. This can lead to a reduction in energy costs of as much as 60% while eliminating 40% of the air-handling devices. You get the best of both worlds: lower energy costs for a lower acquisition cost. Here is an example:

Current environment / Lake Cooling
  10,000 sq ft data center
  15 CRAC units x $47,000 per unit = $705,000
  Utility bill = $62,000 per month x 12 = $744,000 annually

River Cooling
  9 CRAC units x $47,000 per unit = $423,000
  CRAC savings = $282,000
  Cooling portion of utility bill = $62,000 x .5 = $31,000 per month
  Monthly cooling savings = $7,440; annual utility savings = $7,440 x 12 = $89,280

Cost of floor system and improved tiles:
  Underfloor walls      $25,000
  Tile replacement      $23,000
  Total cost            $48,000

  CRAC savings          $282,000
  Utility savings       $89,280
  Total savings         $371,280
  Minus River products  $48,000
  Net savings           $323,280
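The capital and first-year arithmetic in the example can be checked in a few lines (all dollar figures are the paper's; the variable names are mine):

```python
CRAC_UNIT_COST = 47_000  # per the example

lake_capex = 15 * CRAC_UNIT_COST           # $705,000 for 15 CRAC units
river_capex = 9 * CRAC_UNIT_COST           # $423,000 for 9 CRAC units
crac_savings = lake_capex - river_capex    # $282,000 avoided capital

river_products = 25_000 + 23_000           # underfloor walls + tile replacement
annual_utility_savings = 7_440 * 12        # $89,280, per the example

net_first_year = crac_savings + annual_utility_savings - river_products
print(net_first_year)  # 323280
```

The net figure matches the example's bottom line; note that the utility savings recur annually, while the CRAC savings are a one-time capital avoidance.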

Airflow Utilization Efficiency (AUE)

Since the temperature at the top of the rack drives the cooling strategy for the whole data center, creating a measurement that factors in this reading seems only logical. And since the temperature supplied by the CRAC unit is our cooling source, creating a measurement that evaluates the integrity of the air as it moves through the floor and to the server seems logical as well.

The Airflow Utilization Efficiency (AUE) calculation provides a way of evaluating the thermal loss in our airflow system: we compare the temperature at the top of the rack to the supply air temperature. Here are two examples:

  85 degrees top of rack - 55 degrees supply air = AUE of 30
  83 degrees top of rack - 58 degrees supply air = AUE of 25

This is a simple way to determine whether the cool air coming out of the CRAC unit experiences thermal loss, leading to a low set point and higher energy usage. The smaller the AUE number, the more efficiently your airflow system is working. In addition to watching supply air temperatures, we would suggest the CRAC unit/supply air be optimized so that temperatures at the top of the rack are as close to 80 degrees as possible. Lastly, since the difference is expressed in degrees, you can compare AUE values and estimate the reduction in energy costs:

  1 degree = 4% reduction in energy costs
  12 degrees = 48% reduction in energy costs

River systems quite often support an AUE of 10. That is:

  80 degrees top of rack - 70 degrees supply air = AUE of 10

River systems can do this with half the CRAC units used in a lake system, and the impact on energy consumption is improved as well. In the end, the reduced capital expense and recurring costs make the River Cooling System an extremely compelling solution for any data center.

Summary

Raised-floor cooling can be a cost-effective solution for cooling servers, and the River Cooling System is the most efficient way of using air to cool. When choosing between river cooling environments and lake environments, it is imperative to look at all the costs associated with the solution, including installation costs, equipment costs, and energy costs.

By factoring in all three cost elements, companies will be able to choose the best solution for their specific needs. By improving airflow utilization efficiency and using passive cooling, we can lower capital costs. AUE is improved by getting air out of the floor without loss of temperature and dispersing the flow into the front of the server. This can only be done when the air is directed through the tile to the front of the server. Flat-bottom tiles cannot divert air to the server and do not stratify to seven feet; they short cycle and mix cold-aisle air with underfloor air, leading to thermal loss and higher energy costs. Triad's Hi-Plume fin improves AUE by getting air to the servers in a dispersed flow while protecting underfloor air from the warm air above.

(For questions regarding this document, please contact Tom Weiss at tom.weiss@triadfloors.com or visit www.triadfloors.com)
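As a closing illustration, the AUE metric and its energy implication reduce to two one-line functions (a minimal sketch; the function names are mine, the constants and example temperatures are the paper's):

```python
def aue(top_of_rack_f, supply_air_f):
    """Airflow Utilization Efficiency: thermal loss, in degrees F, between
    the supply air and the top of the rack. Smaller is better."""
    return top_of_rack_f - supply_air_f

def energy_reduction_pct(aue_improvement_degrees, pct_per_degree=4.0):
    """Estimated energy-cost reduction from improving (lowering) AUE,
    using the paper's 4%-per-degree rule of thumb."""
    return aue_improvement_degrees * pct_per_degree

print(aue(85, 55))               # 30 -- a lossy, lake-style system
print(aue(80, 70))               # 10 -- a typical river system
print(energy_reduction_pct(12))  # 48.0 (percent)
```

Tracking AUE over time, rather than supply temperature alone, shows whether cold air is actually reaching the hottest servers.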