IS-ENES/PRACE Meeting: EC-EARTH 3, A High-Resolution Configuration
Motivation
- Generate a high-resolution configuration of EC-EARTH to:
  - Prepare studies of high-resolution ESMs in climate mode
  - Prove and improve EC-EARTH 3's capabilities as a scientific tool
  - Study and improve the scalability of EC-EARTH 3
  - Work towards PRACE systems
- These activities were boosted by the OASIS Dedicated User Support 2010. Moreover, the National Supercomputing Centre (NSC) of Sweden supported the work within IS-ENES.
- Acknowledgements to Eric Maisonnave (CERFACS) and Chandan Basu (NSC).
Configuration: Component models
- Atmosphere: IFS, ECMWF's forecasting system, cycle 36R1, with EC-EARTH changes regarding long integrations, ocean coupling, aerosols, ...
- Ocean: Nemo + LIM2, release 3.3beta (continuously updated)
- Coupler: OASIS3, development trunk (continuously updated)
- Packaging: new development for EC-EARTH 3, focused on portability, consistency, and easy configuration
Configuration: Grids and coupling
- Atmosphere: T799/N800 grid with 62 levels (approx. 0.25° horizontal resolution)
- Ocean: ORCA025 grid (approx. 0.25° horizontal resolution)
- Coupling setup:
  - EC-EARTH-specific implementation in IFS (ongoing development)
  - Nemo/LIM coupling interface with minor EC-EARTH changes
  - OASIS3 pseudo-parallel mode using 10 instances
  - Number of coupling fields: 20 (16 + 4)
Configuration: Test platform
- Distributed-memory cluster (Dell PowerEdge)
- 1268 compute nodes, each with 2x AMD Opteron (4 cores each) and 2x8 GB DDR2
- 10144 cores in total (of which we used just over 1600)
- Full-bisection-bandwidth InfiniBand fabric (2 GB/s per link)
- Scali/Platform MPI, OpenMPI
- Intel Compiler Suite 10.1
Results: Load balancing
- The OASIS tool was used to study the load balance between IFS and Nemo and to evaluate the coupling overhead
- Disadvantage: sequential OASIS only
- Balance ratio of 1:4 cores for ocean:atmosphere; it varies with the overall processor number
- Performance numbers (not so much the performance itself!) depend very much on load balancing
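The 1:4 ocean:atmosphere ratio can be turned into a concrete core split for a given allocation. The sketch below is illustrative only; the function name, the rounding policy, and the reservation of cores for the OASIS3 instances are assumptions, not part of the EC-EARTH tooling.

```python
def split_cores(total_cores, ocean_share=1, atmos_share=4, oasis_cores=10):
    """Split a core budget between ocean and atmosphere at a fixed ratio,
    reserving cores for the pseudo-parallel OASIS3 instances."""
    budget = total_cores - oasis_cores           # cores left for the two components
    unit = budget / (ocean_share + atmos_share)  # size of one ratio "share"
    ocean = round(unit * ocean_share)
    atmos = budget - ocean                       # remainder goes to the atmosphere
    return ocean, atmos, oasis_cores

# Roughly the "just over 1600 cores" allocation mentioned for the test platform
ocean, atmos, oasis = split_cores(1610)
print(ocean, atmos, oasis)  # 320 1280 10
```

Since the balance ratio varies with the overall processor number, the ratio arguments would in practice be re-derived from a sequential OASIS run at each target core count.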
Results: Scaling analysis
- Scalability is crucial for targeting large systems; however, it is a tricky business
- For coupled systems, scalability is multi-dimensional
- Scalability is evaluated by:
  - Assuming the balance ratio provided by a sequential OASIS run
  - Guessing/trying otherwise
  - Starting from a multi-core run
- Results are compared to standalone IFS runs
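When the baseline is itself a multi-core run rather than a serial one, speedup and efficiency follow the standard relative definitions. The helpers below are a minimal sketch of those formulas (the function names are illustrative, not EC-EARTH tooling):

```python
def relative_speedup(t_base, t_n):
    """Speedup of a run with time t_n relative to a baseline with time t_base."""
    return t_base / t_n

def parallel_efficiency(p_base, t_base, p_n, t_n):
    """Parallel efficiency relative to a p_base-core baseline run:
    perfect scaling from p_base to p_n cores gives 1.0."""
    return (p_base * t_base) / (p_n * t_n)

# Hypothetical example: doubling cores cuts runtime from 100 s to 60 s
print(relative_speedup(100.0, 60.0))                # ~1.67
print(parallel_efficiency(512, 100.0, 1024, 60.0))  # ~0.83
```

For a coupled model the same computation applies per component and to the coupled system as a whole, which is one reason the scalability picture is multi-dimensional.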
Results: Scalability
Results: Scalability (IFS only)
Results: Parallel efficiency
Results: Parallel efficiency (cont.)
Results: Scalability (cont.)
Results: Data issues
- It is mandatory to have realistic output activity, even at high resolution
- Output data had a manageable size during the tests
- Open questions: preprocessing? Data transfer? Long runs? Ensembles?
- Initial tests of Nemo's new I/O system (IOM): apparently an appropriate architecture for massively parallel systems, but still experimental, with no satisfying results (yet?!)
Conclusions
- The EC-EARTH 3 high-resolution configuration was up and running surprisingly fast and with few problems
- Portability is extremely important (and not for free)
- Scalability does not degrade seriously (if at all) compared to standalone IFS runs. Is this sufficient? Does it scale to O(10,000) cores? Hard to tell.
- After component scalability, the coupling setup is (not surprisingly) crucial for performance
- OASIS3 did not prove to be a bottleneck in the chosen configuration; it might later
- The work raised many interesting questions...