PLGrid Infrastructure Solutions For Computational Chemistry


PLGrid Infrastructure Solutions For Computational Chemistry
Mariola Czuchry, Klemens Noga, Mariusz Sterzel, ACC Cyfronet AGH
2nd Polish-Taiwanese Conference "From Molecular Modeling to Nano- and Biotechnology", Opole, 27 VIII 2015

Agenda
- ACC Cyfronet AGH - a computing centre of excellence
- PLGrid Infrastructure
  - What is PLGrid?
  - PL-Grid Consortium
  - Available solutions and resources
- Access to the Infrastructure
  - Who is eligible for free access
  - PLGrid Users Portal
  - How to apply for resources

ACC Cyfronet AGH

Main activity areas

PLGrid Infrastructure

PLGrid is... a computing infrastructure for Polish science which gives access to information technologies and services at a well-established and guaranteed level

PLGrid Consortium
- Academic Computer Centre CYFRONET AGH, Kraków (Coordinator)
- Tricity Academic Computer Centre, Gdańsk
- Poznan Supercomputing and Networking Center, Poznań
- Interdisciplinary Centre for Mathematical and Computational Modelling, Warszawa
- Wroclaw Centre for Networking and Supercomputing, Wrocław

PLGrid Infrastructure
- Nearly 3000 users use the Infrastructure
- There are more than 1000 articles in highly cited scientific journals with acknowledgments to the PLGrid Infrastructure
- The Infrastructure is used by university and research institute staff members, PhD students, undergraduate students, and other people involved in Polish scientific projects funded by the National Science Centre and the National Centre for Research and Development
[Histogram: number of scientific articles produced with the help of the PLGrid Infrastructure, sorted by journal rank]

PLGrid development
- New generation domain-specific services in the PL-Grid infrastructure for Polish Science
- Competence Centre in the Field of Distributed Computing Grid Infrastructures

Offer for scientists

Main services

PL-Grid Infrastructure Hardware
- 588.02 TFLOPS
- 40 288 cores
- 107.9 TB RAM
- 5.8 PB storage

TOP500, June 2015 - HPC systems from Poland
- 49: Prometheus (ACC Cyfronet AGH), 1.65 PFLOPS (PLGrid)
- 126: Tryton (TASK), 0.63 PFLOPS (partially in PLGrid)
- 135: Bem (WCSS), 0.63 PFLOPS (partially in PLGrid)
- 155: ŚWIERK COMPUTING CENTRE, 0.49 PFLOPS
- 269: Zeus (ACC Cyfronet AGH), 0.37 PFLOPS (PLGrid)
- 380: Orion (ICM), 0.20 PFLOPS
- 418: Nostromo (ICM), 0.19 PFLOPS

PL-Grid Infrastructure Hardware 2015
- 2.5+ PFLOPS
- 85 000+ cores
- 300+ TB RAM
- 15+ PB storage

PL-Grid Infrastructure Hardware 2015
Available CPUs:
- Intel Xeon 4-, 6-, 12-core (up to 24 cores per node), new Haswell microarchitecture
- AMD Opteron 6-, 12-, 16-core (up to 64 cores per node)
Computational coprocessors:
- GPGPU: NVIDIA Tesla and Kepler (up to 8 units per node)
- Intel Xeon Phi (up to 2 coprocessors per node)
Various node configurations:
- 8-64 cores per node
- up to 512 GB RAM per node
- vSMP (Intel Xeon): up to 6 TB RAM and 768 cores
Detailed resource description: www.plgrid.pl/oferta/zasoby_obliczeniowe/opis_zasobow/hpc

Access to resources and services
PLGrid enables various ways of interacting with the Infrastructure (a CLI submission example is sketched below):
- local queuing systems on clusters (CLI and GUI)
- grid middleware (UNICORE, QosCosGrid, gLite)
- virtual machines
- web portals (InSilicoLab, Rimrock, PLG-Data, DataNet, ...)
Access management through the PLGrid Users Portal (https://portal.plgrid.pl)
Portfolio of services:
- general services - https://docs.plgrid.pl/uslugi
- domain-specific services - https://docs.plgrid.pl/uslugi_dziedzinowe
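As an illustration of the first access path (the command line on a cluster login node), the following is a minimal sketch of submitting a batch job programmatically. It is only a sketch: it assumes a SLURM-style scheduler exposed through sbatch, a hypothetical partition name plgrid and a Gaussian module, none of which are specified on the slide and all of which differ between clusters.

```python
#!/usr/bin/env python3
"""Minimal sketch: submit a batch job from a cluster login node.

Assumptions (not taken from the slides): the cluster uses SLURM,
the partition is called "plgrid", and a Gaussian module is available.
Adjust the script body to the software you actually run.
"""
import subprocess
import tempfile

BATCH_SCRIPT = """#!/bin/bash
#SBATCH --job-name=chem-test
#SBATCH --partition=plgrid      # hypothetical partition name
#SBATCH --nodes=1
#SBATCH --ntasks-per-node=12
#SBATCH --time=01:00:00

module load gaussian            # module name depends on the cluster
g09 input.gjf                   # example application run
"""

def submit(script_text: str) -> str:
    """Write the batch script to a temporary file and submit it with sbatch."""
    with tempfile.NamedTemporaryFile("w", suffix=".sh", delete=False) as f:
        f.write(script_text)
        path = f.name
    # sbatch prints e.g. "Submitted batch job 123456"
    out = subprocess.run(["sbatch", path], capture_output=True, text=True, check=True)
    return out.stdout.strip()

if __name__ == "__main__":
    print(submit(BATCH_SCRIPT))
```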

PLGrid - software
- Chemistry & Biology: ADF, AMBER, CFOUR, CP2K, Dalton, GAMESS, Gaussian, Molcas, Molpro, MOPAC, NWChem, Open Babel, TURBOMOLE, AutoDock/AutoGrid, BLAST, Clustal, Siesta, Quantum Espresso, VASP (user's own license required), CPMD, Gromacs, NAMD
- Physics: FLUENT, Meep, OpenFOAM
- Nanotechnology: ABINIT, Quantum Espresso, NAMD
- Multipurpose: Mathematica, MATLAB
- Development: Intel, PGI and GNU compilers, MKL, CUDA, MPI, OpenMP, Allinea, Python, R, Ruby
- Databases
Sequential, parallel (MPI, OpenMP) and interactive runs are possible (see the MPI sketch below); users' own licences can be used
https://apps.plgrid.pl/
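Because Python, MPI and OpenMP all appear in the development stack, a minimal parallel run could look like the sketch below. It assumes the mpi4py package is installed on the cluster, which the slide does not state, and that the script is launched under mpiexec or mpirun.

```python
# hello_mpi.py - minimal sketch of a parallel (MPI) run in Python.
# Assumes mpi4py is available on the cluster, which the slide does not state.
from mpi4py import MPI

comm = MPI.COMM_WORLD          # communicator spanning all launched processes
rank = comm.Get_rank()         # this process's index
size = comm.Get_size()         # total number of processes

# Each rank reports itself; rank 0 additionally sums a value gathered from all ranks.
local_value = rank * rank
total = comm.reduce(local_value, op=MPI.SUM, root=0)

print(f"rank {rank} of {size}: local value {local_value}")
if rank == 0:
    print(f"sum of rank^2 over all ranks: {total}")
```

A typical launch inside a batch job would be: mpiexec -n 4 python hello_mpi.py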

Rimrock - Robust Remote Process and Job Controller
https://submit.plgrid.pl
- The Rimrock application simplifies interaction with remote servers
- It allows executing applications in batch mode or starting interactive applications
- Application output can be fetched online
- Input can be sent using a simple REST interface (see the sketch below)
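The slide does not document the REST interface itself, so the sketch below only illustrates the general submit-and-poll pattern over HTTPS. The /api/jobs path, the payload fields and the PROXY header are assumptions made for illustration, not the documented Rimrock API; consult the service documentation at https://submit.plgrid.pl for the real endpoints.

```python
# Sketch of talking to a job-submission REST service such as Rimrock.
# Endpoint path, payload fields and the "PROXY" header are illustrative assumptions.
import time
import requests

BASE_URL = "https://submit.plgrid.pl"      # service address from the slide
PROXY = open("proxy.pem").read()           # placeholder grid-proxy credential

def submit_job(host: str, script: str) -> dict:
    """POST a batch script to a hypothetical jobs endpoint."""
    resp = requests.post(
        f"{BASE_URL}/api/jobs",
        json={"host": host, "script": script},
        headers={"PROXY": PROXY},
    )
    resp.raise_for_status()
    return resp.json()                     # expected to contain a job identifier

def wait_for_output(job_id: str, poll_seconds: int = 30) -> dict:
    """Poll the job resource until it reports a finished state."""
    while True:
        resp = requests.get(f"{BASE_URL}/api/jobs/{job_id}", headers={"PROXY": PROXY})
        resp.raise_for_status()
        job = resp.json()
        if job.get("status") in ("FINISHED", "ERROR"):
            return job
        time.sleep(poll_seconds)
```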

PLG-Data
Access to data through a web browser

InSilicoLab
http://insilicolab.grid.cyfronet.pl
A complete in-browser workspace for scientists who wish to perform quantum chemistry calculations
Assists with:
- preparing input for various scientific applications
- performing computations on the grid infrastructure
- controlling complex and recurrent jobs
- collecting output files
- analysing obtained results (also from many calculations at once)

InSilicoLab - available scientific domains
Chemistry:
- chemical computations using Gaussian, GAMESS, Turbomole and TeraChem
- Trajectory Sculptor - advanced tool for manipulation of Molecular Dynamics trajectories
  - automatic extraction of relevant parts of the input structure (based on user choices)
  - automatic processing of frames chosen by the user
  - results easily reused in quantum chemical calculations
Astrophysics:
- hydrodynamics calculations with finite element methods
- computations for the Cherenkov Telescope Array (CTA) consortium
Geophysics:
- digital research space for induced seismicity research in the IS-EPOS project

Tools enhancing project management
- Adobe Connect - integrated platform for teleconference services
- Confluence - wiki-style software for knowledge aggregation
- JIRA - tool for planning scientific or IT projects and tracking their progress
- Stash - version control with Git

Access to infrastructure

PLGrid Users Portal
- On-line registration through the PLGrid Users Portal (https://portal.plgrid.pl)
- User verification based on the Polish Science Database (http://www.nauka-polska.pl/)
- In the PLGrid Users Portal users can:
  - manage access to tools and services
  - monitor utilization of resources
  - manage their computational grants and grid certificates
- Access to all PLGrid resources through one account and one passphrase (or grid certificate)

Requesting resources
Computational grants in PLGrid:
- "Resources for a good start" - personal grant
- Computational grants
  - all necessary documents and forms handled through the web portal
  - clear and simple evaluation methods
  - guaranteed access to negotiated resources
  - guaranteed data security

User Support
- Trainings
  - e-Learning - https://portal.plgrid.pl/ or https://blackboard.cyfronet.pl/
  - hands-on sessions with trainers - info at http://www.cyfronet.krakow.pl/
- Domain experts help through the PLGrid Helpdesk service system
  - https://helpdesk.plgrid.pl
  - helpdesk@plgrid.pl
- Users Manual available online: https://docs.plgrid.pl/podrecznik_uzytkownika
- Users Forum: https://zapytaj.plgrid.pl/

Registration: https://portal.plgrid.pl
Contact: helpdesk@plgrid.pl, +48 12 632 33 55 ext. 312