Cloud-WIEN2k. A Scientific Cloud Computing Platform for Condensed Matter Physics




Penn State, August 2013
Cloud-WIEN2k: A Scientific Cloud Computing Platform for Condensed Matter Physics
K. Jorissen, University of Washington, Seattle, U.S.A.
Supported by NSF grant OCI-1048052
www.feffproject.org

Materials Science
Materials Science research: theoretical models, evaluated on a computer, are usually needed for the interpretation and quantification of measurements. But HPC is often not readily available.
[Diagram: sample -> measurement -> theoretical model (Hψ=Eψ) -> interpretation]

Anecdote (High-Performance Computing is everywhere)
Computational linguistics: we automatically identify semantically related words in the 400-million-word Dutch Twente corpus to:
- statistically find contextual associations and quantify association strength
- identify syntactical relations between words
- relevant to automatic translation software
Multivariate analysis with dozens of variables implies large computational needs.
--- an English Lit major
https://perswww.kuleuven.be/~u0042527/lingpub.htm

Quest
How do we bring the best theory and simulations to the scientists who need them? (Often applied scientists, not computational specialists.)
SOLUTION: Scientific Cloud Computing

Are state-of-the-art calculations only for specialists?
FEFF-old (simple Einstein model for phonons):
- GUI
- Easy install
- Runs on a laptop
- Load file & click Run
- ~1 day to learn

Are state-of-the-art calculations only for specialists?
FEFF-gold (accurate ab initio model for phonons):
- Dynamical Matrix (DFT) -- ABINIT
- Debye-Waller Factors -- DMDW
- X-ray Absorption -- FEFF
DFT requires a cluster; codes must be configured; complex workflow; command-line only. ~0.x grad students to learn.
Invented / published 2006-2009. Clearly an improvement -- yet nobody uses it.

Are state-of-the-art calculations only for specialists?
Hardware barrier: advanced codes need clusters.
- Buy a cluster? IT support?
- Supercomputing center?
- Collaborate with specialists?
Software barrier: running the codes is difficult.
- Installation of software is tricky
- Lacking user-friendliness
- Multi-code workflows are difficult
t >> 1 before improved theory reaches applied research!

Scientific Cloud Computing
An interface simplifies the workflow (hides the cloud from the app).
- Developer makes a virtual XAS compute node with preinstalled WIEN2k.
- User requests a 5-node Cloud Cluster for 3 hours when needed (~$20).

SCC Virtual Machine Image
Contains utilities for parallel scientific computing: MPI, compilers, libraries, NFS, ... Becomes a compute node in an SCC Cloud Cluster.
Developer-optimized scientific codes for your research:
- WIEN2k for electronic structure calculations
- latest version
- optimized for performance
- MPI parallelization for large calculations
"My new research group was looking for a way to implement MEEP-mpi (MIT Electromagnetic Equation Propagation) to simulate EM fields in nanoscale optical devices for cavity QED experiments. We believe that Amazon EC2 is an economical and time-saving solution for our finite-difference time-domain (FDTD) simulations. My group's research iterates between fabrication and simulation, thus it is advantageous to buy computing power only when needed. Moreover, it is a relief not to have to maintain our own small cluster within our group." -- Kai-Mei Fu, University of Washington (USA)

SCC Java interface: Java interface library (jar), for developers of GUIs (e.g. the FEFF GUI).
SCC Linux interface: collection of shell scripts, for savvy users and developers.

WIEN2k-cloud (WIEN2k GUI, DFT):
- Starts a cluster in the EC2 cloud
- Uploads the initialized calculation
- Runs the calculation in the EC2 cloud
- Downloads results to the laptop
- Deletes the EC2 cluster
Other workflows / data flows can be added.
Requires:
- create an EC2 account
- install the SCC program
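This five-step workflow can be sketched in Java against the SCC interface. The edu.washington.scc library itself is not reproduced in these slides, so the Cluster class below is a hypothetical logging stub; its method names (launch, put, executeCommand, get, terminate) mirror the ExecuteCloudContext.java example shown in the backup slides, while the node count, paths, and signatures here are illustrative assumptions.

```java
// Hypothetical sketch of the WIEN2k-cloud workflow. Cluster is a stand-in
// stub for the SCC cluster handle that only logs each step; it is NOT the
// real edu.washington.scc API.
public class Wien2kCloudSketch {

    // Minimal stand-in for the SCC cluster handle (hypothetical).
    static class Cluster {
        void launch(int nodes, double hours) {
            System.out.println("launch: " + nodes + " nodes for " + hours + " h");
        }
        void put(String local, String remote) {
            System.out.println("upload: " + local + " -> " + remote);
        }
        void executeCommand(String command) {
            System.out.println("run: " + command);
        }
        void get(String remote, String local) {
            System.out.println("download: " + remote + " -> " + local);
        }
        void terminate() {
            System.out.println("terminate cluster");
        }
    }

    public static void main(String[] args) {
        Cluster clust = new Cluster();
        clust.launch(5, 3.0);                 // start a 5-node cluster in the EC2 cloud
        clust.put("case/", "/cloud/case/");   // upload the initialized calculation
        clust.executeCommand("run_lapw -p");  // run WIEN2k (run_lapw -p is WIEN2k's parallel SCF driver)
        clust.get("/cloud/case/", "case/");   // download results to the laptop
        clust.terminate();                    // delete the EC2 cluster
    }
}
```

The point of the design is visible even in the stub: the user-facing program only sequences five calls, and everything cloud-specific stays behind the cluster handle.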

Performance: LOOSELY Coupled Processes
DFT Kohn-Sham equations on a 128-k-point grid: good scaling.

Performance: TIGHTLY Coupled Processes
Kohn-Sham equations for a large system at 1 k-point: VERY demanding of network performance. HPC cluster instances deliver good speedup.

5. WIEN2k Performance Benchmarks: TIGHTLY Coupled Processes
Kohn-Sham equations for a large system at 1 k-point: H size 56,000 (25 GB); 1200-atom unit cell; SCALAPACK+MPI diagonalization, matrix size 50k-100k. VERY demanding of network performance.
Runtime (16x8 processors):
- Local (Infiniband): 3h48
- Cloud (10 Gbps): 1h30 ($40)
HPC cluster instances deliver a speedup similar to a local Infiniband cluster.
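As a quick check on these numbers (my own arithmetic on the slide's figures, not part of the talk): the cloud run finishes roughly 2.5x sooner in wall-clock time, even though the slide's claim is about parallel speedup (scaling) being similar, not absolute runtime.

```java
// Wall-clock comparison for the slide's benchmark (H size 56,000, 16x8 processors).
public class BenchmarkRatio {
    public static void main(String[] args) {
        double localMinutes = 3 * 60 + 48;  // local Infiniband cluster: 3h48
        double cloudMinutes = 1 * 60 + 30;  // EC2 HPC instances, 10 Gbps: 1h30 ($40)
        double ratio = localMinutes / cloudMinutes;
        System.out.printf("cloud run is %.1fx faster in wall-clock time%n", ratio);
    }
}
```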

Scientific Cloud Computing can bring novel theory & HPC modeling to more researchers.
www.feffproject.org
Comp. Phys. Comm. 183 (2012) 1911
We acknowledge:
- FEFF: S. Story, T. Ahmed, B. Mattern, M. Prange, J. Vinson
- UW: R. Coffey, E. Lazowska, J. Loudermilk
- Amazon: D. Singh
- NSF: C. Bouldin
Supported by NSF OCI-1048052

Backup stuff

4. Cloud Computing on the Amazon EC2 cloud
[Diagram: My Laptop (FEFF interface) -- 1. Create cluster, 2. Calculations, 3. Stop cluster -- Cloud Compute Instances: MPI Master and MPI Slaves, each running FEFF9]
* K. Jorissen et al., Comp. Phys. Comm. 183 (2012) 1911

Developer's view: ExecuteCloudContext.java:

import edu.washington.scc.*;

// Launch the new cluster with cs specifications:
ClusterResult rl = clust.launch(cs);
// Initialize the FEFF calculation on the cloud cluster:
// Copy feff.inp:
ClusterResult rp = clust.put(localWorkingDir + "/feff.inp", cloudWorkingDir + "/feff.inp");
// Run the FEFF9-MPI calculation:
ClusterResult rf9 = clust.executeCommand(feff9CommandLine, cloudOut);
// Copy the output files back to the local computer:
ClusterResult rg = clust.get(cloudWorkingDir, localWorkingDir);
// Terminate the cloud cluster:
ClusterResult rt = clust.terminate();

End User's view: the FEFF GUI. [screenshot]