High-Performance Scientific Computing in the UK: STFC and the Hartree Centre. Mike Ashworth



High-Performance Scientific Computing in the UK: STFC and the Hartree Centre. Mike Ashworth, Scientific Computing Department and STFC Hartree Centre, STFC Daresbury Laboratory, mike.ashworth@stfc.ac.uk

High Performance Computing in the UK STFC's Scientific Computing Department STFC's Hartree Centre Scientific Computing Highlights

Chart: History of UK academic HPC provision (Linpack Gflop/s, TOP500 lists, Jan 1993 to Jan 2013), showing total UK university provision and total UK national provision (incl. HPCx, HECToR, Hartree, DIRAC) rising from the Gflop/s range through Tera- to Peta-scale, against a Moore's Law reference line doubling every 14 months.

Chart: History of UK academic HPC provision by service (Linpack Gflop/s, TOP500 lists, Jan 1993 to Jan 2013), marking CSAR, HPCx, HECToR, the SRIF-funded regional centres, and Hartree & DIRAC on the climb from Tera- to Peta-scale.
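
A quick back-of-envelope illustration of the 14-month doubling trend marked on the charts above; the Jan-1993 baseline of 1 Gflop/s is an assumption for scale only, and the real figures come from the TOP500 lists, not this calculation:

```python
# Illustrative sketch: growth implied by a Moore's-Law-style trend that
# doubles every 14 months, as marked on the TOP500 provision charts above.
# The starting value (1 Gflop/s in Jan 1993) is an assumption for scale only.

def projected_gflops(start_gflops: float, months: float, doubling_months: float = 14.0) -> float:
    """Performance after `months`, assuming one doubling every `doubling_months`."""
    return start_gflops * 2.0 ** (months / doubling_months)

if __name__ == "__main__":
    start = 1.0  # Gflop/s, hypothetical Jan-1993 baseline
    for years in (5, 10, 15, 20):
        flops = projected_gflops(start, years * 12)
        print(f"After {years:2d} years: {flops:12.1f} Gflop/s")
    # After 20 years the factor is 2**(240/14) ~ 1.4e5, i.e. a Gflop/s-class
    # baseline reaches the ~100 Tflop/s range; the UK provision curves climb
    # from Gflop/s toward Pflop/s over the same Jan-93 to Jan-13 span.
```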

The UK National Supercomputing Facilities. The UK National Supercomputing Services are managed by EPSRC on behalf of the UK academic communities. HPCx ran from 2002 to 2009 using IBM POWER4 and POWER5. HECToR is the current system (2007-2014), located at Edinburgh and operated jointly by EPCC and STFC; HECToR Phase 3 is a 90,112-core Cray XE6 (660 Tflop/s Linpack). ARCHER is the new service, due for installation around Nov/Dec 2013, with service starting in early 2014.

DiRAC (Distributed Research utilising Advanced Computing) is an integrated supercomputing facility for UK research in theoretical modelling and HPC-based simulation in particle physics, astronomy and cosmology, funded by STFC. The flagship DiRAC system is a 6-rack IBM Blue Gene/Q with 98,304 cores (1.0 Pflop/s Linpack), located at Edinburgh. Image: computer-simulated glow of dark matter (credit: Virgo).

UK Tier-2 regional university centres. N8 Research Partnership: the N8 HPC centre, a £3.25M facility based at the University of Leeds, an SGI system with 5,312 cores. Possibly to become S6 with Cambridge and Imperial College. An IBM x86 system with 11,088 cores at the University of Southampton. A GPU system at STFC RAL: HP with 372 Fermi GPUs, the #3 GPU system in Europe.

UK academic HPC pyramid. Tier-0: PRACE systems. Tier-1 (national): HECToR, Hartree, DIRAC. Tier-2 (regional universities): S6 and N8 regional clusters. Below that, local university and institute systems.

High Performance Computing in the UK STFC's Scientific Computing Department STFC's Hartree Centre Scientific Computing Highlights

Organisation chart: HM Government (& HM Treasury); RCUK Executive Group.

STFC's Sites. Daresbury Laboratory, Daresbury Science and Innovation Campus, Warrington, Cheshire; Rutherford Appleton Laboratory, Harwell Science and Innovation Campus, Didcot, Oxfordshire; Polaris House, Swindon, Wiltshire; UK Astronomy Technology Centre, Edinburgh, Scotland; Chilbolton Observatory, Stockbridge, Hampshire; Isaac Newton Group of Telescopes, La Palma; Joint Astronomy Centre, Hawaii.

Understanding our Universe: STFC's Science Programme. Particle Physics: Large Hadron Collider (LHC), CERN; the structure and forces of nature. Ground-based Astronomy: European Southern Observatory (ESO), Chile; Very Large Telescope (VLT), Atacama Large Millimeter Array (ALMA), European Extremely Large Telescope (E-ELT), Square Kilometre Array (SKA). Space-based Astronomy: European Space Agency (ESA); Herschel, Planck, GAIA, James Webb Space Telescope (JWST); bi-laterals with NASA, JAXA, etc.; STFC Space Science Technology Department. Nuclear Physics: Facility for Antiproton and Ion Research (FAIR), Germany; nuclear skills for medicine (isotopes and radiation applications), energy (nuclear power plants) and environment (nuclear waste disposal).

STFC's Facilities. Neutron Sources: ISIS, the pulsed neutron and muon source, and the Institut Laue-Langevin (ILL), Grenoble, providing powerful insights into key areas of energy, biomedical research, climate, environment and security. High Power Lasers: the Central Laser Facility, providing applications in bioscience and nanotechnology, and HiPER, demonstrating laser-driven fusion as a future source of sustainable, clean energy. Light Sources: Diamond Light Source Ltd (86% owned by STFC), providing new breakthroughs in medicine, environmental and materials science, engineering, electronics and cultural heritage, and the European Synchrotron Radiation Facility (ESRF), Grenoble.

Scientific Computing Department: major funded activities. 160 staff supporting over 7,500 users; applications development and support; compute and data facilities and services; research, with over 100 publications per annum; over 3,500 training days delivered per annum. Skills span systems administration, data services, high-performance computing, numerical analysis and software engineering, with expertise across the length and time scales from processes occurring inside atoms to environmental modelling, covering the major science themes and capabilities. Director: Adrian Wander, appointed 24th July 2012.

Scientific Highlights. Journal of Materials Chemistry 16, no. 20 (May 2006): issue devoted to HPC in materials chemistry (esp. use of HPCx). Phys. Stat. Sol. (b) 243, no. 11 (Sept 2006): issue featuring scientific highlights of the Psi-k Network (the European network on the electronic structure of condensed matter, coordinated by our Band Theory Group). Molecular Simulation 32, no. 12-13 (Oct-Nov 2006): special issue on applications of the DL_POLY MD program written and developed by Bill Smith (the 2nd special edition of Mol. Sim. on DL_POLY; the 1st was about 5 years earlier). Acta Crystallographica Section D 63, part 1 (Jan 2007): proceedings of the CCP4 Study Weekend on protein crystallography. The Aeronautical Journal 111, no. 1117 (March 2007): UK Applied Aerodynamics Consortium special edition. Proc. Roy. Soc. A 467, no. 2131 (July 2011): HPC in the Chemistry and Physics of Materials. Metrics for the last 5 years: 67 grants of order £13M; 422 refereed papers and 275 presentations. Three senior staff have joint appointments with universities; seven staff have visiting professorships; six members of staff have been awarded Senior Fellowships or Fellowships under the Research Councils' individual merit scheme; five staff are Fellows of senior learned societies.

High Performance Computing in the UK STFC's Scientific Computing Department STFC's Hartree Centre Scientific Computing Highlights

Opportunities. Political Opportunity: demonstrate growth through economic and societal impact from investments in HPC. Business Opportunity: engage industry in HPC simulation for competitive advantage. Scientific Opportunity: build multi-scale, multi-physics coupled applications and tackle complex Grand Challenge problems. Technical Opportunity: exploit new Petascale and Exascale architectures and adapt to multi-core and hybrid architectures.

Government Investment in e-infrastructure, 2011. 17th Aug 2011: Prime Minister David Cameron confirmed a £10M investment in STFC's Daresbury Laboratory, including £7.5M for computing infrastructure. 3rd Oct 2011: Chancellor George Osborne announced £145M for e-infrastructure at the Conservative Party Conference. 4th Oct 2011: Science Minister David Willetts indicated a £30M investment in the Hartree Centre. 30th Mar 2012: John Womersley, CEO of STFC, and Simon Pendlebury of IBM signed a major collaboration agreement at the Hartree Centre. (Photos clockwise from top left.)

Intel collaboration. STFC and Intel have signed an MOU to develop and test technology that will be required to power the supercomputers of tomorrow. Karl Solchenbach, Director of European Exascale Computing at Intel, said: "We will use STFC's leading expertise in scalable applications to address the challenges of exascale computing in a co-design approach."

Tildesley Report. BIS commissioned a report on the strategic vision for a UK e-infrastructure for science and business. Prof. Dominic Tildesley led the team, which included representatives from universities, Research Councils, industry and JANET. The scope included compute, software, data, networks, training and security. Mike Ashworth, Richard Blake and John Bancroft from STFC provided input. Published in December 2011; Google the title to download it from the BIS website.

Chart: Hartree Centre approximate capital spend 2011/12 (£M), £37.5M in total: BlueGene/Q 12, idataplex 6, and segments of 6, 6, 5.3 and 2.2 for Data Intensive, Disk & Tape, Visualization and Infrastructure.

Hartree Centre IBM BG/Q Blue Joule. #18 in the Jun 2013 TOP500 list, #6 in Europe, #1 system in the UK. 6 racks, 98,304 cores, 6,144 nodes with 16 cores and 16 GB per node; 1.25 Pflop/s peak. One rack to be configured as BGAS (Blue Gene Advanced Storage): 16,384 cores and up to 1 PB of Flash memory.
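
The headline figures on this slide are internally consistent; a small sketch checking the arithmetic (the 204.8 Gflop/s per-node peak is the standard Blue Gene/Q figure, assumed here rather than taken from the slide):

```python
# Sanity check of the Blue Joule figures quoted above.
# Assumption (not on the slide): a Blue Gene/Q node peaks at 204.8 Gflop/s
# (16 cores x 1.6 GHz x 8 flops/cycle).

nodes = 6144
cores_per_node = 16
gb_per_node = 16
node_peak_gflops = 16 * 1.6 * 8                 # 204.8 Gflop/s per node

total_cores = nodes * cores_per_node            # 98,304 cores, as quoted
total_memory_tb = nodes * gb_per_node / 1024    # ~96 TB of DRAM
peak_pflops = nodes * node_peak_gflops / 1e6    # ~1.26 Pflop/s, quoted as 1.25

print(f"cores:  {total_cores}")
print(f"memory: {total_memory_tb:.0f} TB")
print(f"peak:   {peak_pflops:.2f} Pflop/s")
```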

Hartree Centre IBM idataplex Blue Wonder. #222 in the Jun 2013 TOP500 list. 8,192 cores, 170 Tflop/s peak; each node has 16 cores on 2 Intel Sandy Bridge sockets (AVX etc.). 252 nodes with 32 GB, 256 nodes with 128 GB, 4 nodes with 256 GB and 12 nodes with X3090 GPUs. ScaleMP virtualization software provides up to 4 TB of virtual shared memory.

Hartree Centre Datastore. Storage: 5.76 PB of usable disk storage and a 15 PB tape store.

Hartree Centre Visualization. Four major facilities: Hartree Vis-1, a large visualization wall supporting stereo; Hartree Vis-2, a large surround and immersive visualization system; Hartree ISIC, a large visualization wall supporting stereo at ISIC; and Hartree Atlas, a large visualization wall supporting stereo in the Atlas Building at RAL, part of the Harwell Imaging Partnership (HIP). Virtalis is the hardware supplier.

Hartree Centre Mission. The Hartree Centre at the STFC Daresbury Science and Innovation Campus will be an International Centre of Excellence for Computational Science and Engineering. It will bring together academic, government and industry communities and focus on multi-disciplinary, multi-scale, efficient and effective computation. The goal is to provide a step change in modelling capabilities for strategic themes including energy, life sciences, the environment, materials and security.

Douglas Rayner Hartree, Father of Computational Science. The Hartree-Fock method, the Appleton-Hartree equation, the Differential Analyser, numerical analysis. Douglas Rayner Hartree PhD, FRS (1897-1958): "It may well be that the high-speed digital computer will have as great an influence on civilization as the advent of nuclear power" (1946). Photo: Douglas Hartree with Phyllis Nicolson at the Hartree Differential Analyser at Manchester University.

Responding to the Challenges. Expert optimisation of existing software: profiling and identification of hotspots, use of libraries, tackling issues associated with large core counts, serial bottlenecks, redesign of I/O, etc. Application co-design: software and hardware must evolve together, which requires close collaboration between hardware architects, computer scientists and application software experts. Re-engineering software requires a specialised development platform: the highest possible core count, configured and operated for software development (interactive use, profiling, debugging etc.). The Hartree Centre: a Research Collaboratory with IBM.
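
As an illustration of the first step, hotspot profiling, here is a minimal sketch using Python's built-in cProfile; halo_exchange and stencil_update are hypothetical stand-ins for the kernels of a real application:

```python
# Minimal hotspot-profiling sketch using Python's standard library.
# halo_exchange() and stencil_update() are hypothetical stand-ins for the
# kernels of a real simulation code; in practice one would profile the
# production application with an HPC profiling tool.
import cProfile
import pstats
import time


def halo_exchange(n):
    time.sleep(0.001 * n)          # stand-in for communication cost


def stencil_update(n):
    s = 0.0
    for i in range(n * 10000):     # stand-in for floating-point work
        s += i * 0.5
    return s


def timestep(n):
    halo_exchange(n)
    stencil_update(n)


if __name__ == "__main__":
    profiler = cProfile.Profile()
    profiler.enable()
    for step in range(20):
        timestep(5)
    profiler.disable()
    # Print the routines that dominate the runtime (the "hotspots").
    pstats.Stats(profiler).sort_stats("cumulative").print_stats(5)
```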

Government Investment in e-infrastructure, 2013. 1st Feb 2013: Chancellor George Osborne and Science Minister David Willetts opened the Hartree Centre and announced a further £185M of funding for e-infrastructure, including £19M for the Hartree Centre for power-efficient computing technologies and £11M for the UK's participation in the Square Kilometre Array. This investment forms part of the £600 million investment in science announced by the Chancellor at the Autumn Statement 2012. "By putting our money into science we are supporting the economy of tomorrow." Photo: George Osborne opens the Hartree Centre, 1st February 2013.

Collaboration with Unilever. 1st Feb 2013: also announced was a key partnership with Unilever on the development of Computer Aided Formulation (CAF). Months of laboratory bench work can be completed within minutes by a tool designed to run as an app on a tablet or laptop connected remotely to the Blue Joule supercomputer at Daresbury. The aggregation of surfactant molecules into micelles is an important process in product formulation; this tool predicts the behaviour and structure of different concentrations of liquid compounds, both in the bottle and in use, and helps researchers plan fewer and more focussed experiments. Photo: John Womersley, CEO STFC, and Jim Crilly, Senior Vice President, Strategic Science Group at Unilever.

Power-efficient Technologies Shopping List: the £19M investment in power-efficient technologies covers a system with the latest NVIDIA Kepler GPUs, a system based on Intel Xeon Phi, a system based on ARM processors, an active storage project using IBM BGAS, a dataflow architecture based on FPGAs, and an instrumented machine room. Systems will be made available for development and evaluation projects with Hartree Centre partners from industry, government and academia.

High Performance Computing in the UK STFC's Scientific Computing Department STFC's Hartree Centre Scientific Computing Highlights

Hartree Centre Projects in the First 12 Months: 56 projects, 200M BG/Q hours and 14M idataplex hours. In addition, 2.5% of the BG/Q is contributed to PRACE DECI calls.

Current Unified Model. The Met Office Unified Model is "unified" in the sense of using the same code for weather forecasting and for climate research. It combines dynamics on a lat/long grid (the New Dynamics formulation, Davies et al. 2005) with physics (radiation, clouds, precipitation, convection etc.), and also couples to other models (ocean, sea-ice, land surface, chemistry/aerosols etc.) for improved forecasting and earth system modelling. Chart, 2003-2011: performance of the UM (dark blue) versus a basket of models (Met Office, ECMWF, USA, France, Germany, Japan, Canada, Australia), measured by 3-day surface pressure errors. ENDGame is to be operational in 2013. From Nigel Wood, Met Office.
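
A toy sketch of the coupling pattern described above: a dynamical core, physics parametrisations, and a more slowly coupled component model. This illustrates only the structure; it is not the Unified Model's actual interface, and all names and numbers are illustrative:

```python
# Toy illustration of a coupled atmosphere model loop: dynamics + physics
# every step, with a component model (here an "ocean") coupled less often.

def dynamics_step(state):
    state["winds"] = [w * 0.99 for w in state["winds"]]   # advection/adjustment stand-in
    return state

def physics_step(state):
    state["temperature"] += 0.1                           # radiation, convection, ... stand-in
    return state

def couple_ocean(state, ocean):
    ocean["sst"] += 0.01 * state["temperature"]           # exchange surface fluxes
    state["surface_flux"] = 0.001 * ocean["sst"]
    return state, ocean

atmosphere = {"winds": [10.0, 12.0], "temperature": 280.0, "surface_flux": 0.0}
ocean = {"sst": 288.0}

for step in range(24):
    atmosphere = dynamics_step(atmosphere)
    atmosphere = physics_step(atmosphere)
    if step % 6 == 0:            # couple less frequently than the model timestep
        atmosphere, ocean = couple_ocean(atmosphere, ocean)

print(atmosphere["temperature"], ocean["sst"])
```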

Limits to Scalability of the UM. The current version (New Dynamics) has limited scalability; the latest ENDGame code improves this, but a more radical solution is required for Petascale and beyond. The problem lies with the spacing of the lat/long grid at the poles: at 25 km resolution the grid spacing near the poles is 75 m, and at 10 km it reduces to 12 m. Chart: UM scaling (17 km model) against number of POWER7 nodes, compared with perfect scaling. From Nigel Wood, Met Office.
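
The pole problem follows directly from the geometry of a lat/long grid: the east-west spacing shrinks with the cosine of latitude. A rough sketch of the calculation, assuming an Earth radius of 6371 km and taking the first grid row one grid spacing from the pole (the exact UM figures depend on the details of its grid staggering):

```python
import math

R_EARTH_KM = 6371.0   # assumed mean Earth radius


def polar_zonal_spacing_m(resolution_km: float) -> float:
    """Approximate east-west spacing (m) of the lat/long grid row nearest the pole,
    for a grid whose spacing at the equator is `resolution_km`."""
    dlat_deg = math.degrees(resolution_km / R_EARTH_KM)   # latitude increment
    nearest_row_lat = 90.0 - dlat_deg                     # first row one increment from the pole
    return resolution_km * 1000.0 * math.cos(math.radians(nearest_row_lat))


for res in (25.0, 10.0):
    print(f"{res:4.0f} km grid -> ~{polar_zonal_spacing_m(res):5.0f} m near the poles")
# Gives roughly 100 m and 16 m; the slide quotes 75 m and 12 m, the difference
# coming from the details of the UM's actual grid.
```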

Challenging Solutions. GUNG-HO (Globally Uniform Next Generation Highly Optimized; "working together harmoniously") targets a brand new dynamical core. Scalability: choose a globally uniform grid which has no poles (candidate grids include triangles, the cubed sphere and Yin-Yang). Speed: maintain performance at high and low resolution and for high and low core counts. Accuracy: maintain the standing of the model. Space weather implies a 600 km deep model. A five-year project, 2011-2015, with operational weather forecasts around 2020! From Nigel Wood, Met Office.

Summary. HPC is flourishing in the UK: new Government investment supports a wide range of e-infrastructure projects (incl. data centres, networks, ARCHER, Hartree). The Hartree Centre is a new centre developing and deploying new software for industry, government and academia. We are very happy to talk about partnership with centres in China. For more information on the Hartree Centre see http://www.stfc.ac.uk/hartree

If you have been, thank you for listening. Mike Ashworth, http://www.cse.stfc.ac.uk/ mike.ashworth@stfc.ac.uk