How To Use bwgrid




bwgrid Treff HD/MA (SS 2013)
Sabine Richling, Heinz Kredel
Universitätsrechenzentrum Heidelberg / Rechenzentrum Universität Mannheim
15 May 2013

Course Organization: What is bwgrid Treff?

Participants:
- Current users of the bwgrid clusters HD/MA
- Students and scientists interested in Grid Computing
- Members of the Universities of Heidelberg and Mannheim

Scope:
- bwgrid status and plans
- Lectures and workshops
- Questions and discussions
- User contributions
- Meeting you in person

bwgrid Treff Archive

Slides of past bwgrid meetings:
- bwgrid website Heidelberg: http://www.urz.uni-heidelberg.de/server/grid (right column, "Archiv")
- bwgrid website Mannheim: http://tiny.uni-mannheim.de/bwgrid (right column, "bwgrid Treff")

Agenda for 15 May 2013:
- bwgrid news
- Future of bwgrid
- New access mode

bwgrid Introduction

What is bwgrid?

bwgrid Virtual Organization (VO):
- D-Grid community project of the universities in Baden-Württemberg
- The VO bwgrid is open to all members of the participating universities

bwgrid resources:
- Compute clusters at 8 locations
- Central storage unit in Karlsruhe

bwgrid objectives:
- Verify the functionality and the benefit of Grid concepts for the HPC community in Baden-Württemberg
- Manage organizational, security, and license issues
- Develop new cluster and Grid applications

bwgrid Resources

Compute clusters:
- Mannheim/Heidelberg: 280 nodes (interconnected to a single cluster via a direct link)
- Karlsruhe: 140 nodes
- Stuttgart: 420 nodes (struck out on the original slide)
- Tübingen: 140 nodes
- Ulm (Konstanz): 280 nodes (joint cluster with Konstanz; hardware located in Ulm)
- Freiburg: 140 nodes
- Esslingen: 180 nodes (more recent hardware)

Central storage in Karlsruhe:
- 128 TB (with backup)
- 256 TB (without backup)

bwgrid Cluster Mannheim/Heidelberg: Configuration

[Diagram: the Mannheim and Heidelberg clusters are coupled over a 28 km InfiniBand link with Obsidian range extenders. Each site has its own Lustre file system (bwfs MA / bwfs HD). User management on the Mannheim side uses VOMRS, LDAP, PBS, and passwd; the Heidelberg side uses Shibboleth (SP, IdP) and AD.]

bwgrid News

bwgrid Cluster Mannheim/Heidelberg: News

Usage of the workspace file system is often very high:
- Check free space for workspaces on /bwfs/scratch with: df
- Check the amount of data in your workspaces with: lfs quota -u $USER /bwfs/scratch
- Clean up workspaces frequently
- Delete data before releasing workspaces

Several maintenance windows due to problems with the storage systems:
- Failure of storage controllers
- Offline file system checks were necessary

New software modules (since January 2013):
- chem/amber/12
- phys/delft3d/5.00.00.1234
- cae/dune geoinversion/2.0
- math/mathematica/9.0.1 (MA)
- devel/perl/5.14.2
- math/matlab/r2013a

Module search: http://www.bw-grid.de/das-bwgrid/software/software-suchen/
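The two workspace checks above can be wrapped in a small script. This is a sketch only: it assumes the HD/MA workspace file system is mounted at /bwfs/scratch as named on the slide, and it lets you pass a different path as the first argument for testing elsewhere.

```shell
#!/bin/sh
# Sketch: inspect workspace usage before submitting jobs.
# Assumption: /bwfs/scratch is the workspace file system (per the slide);
# pass another path as the first argument to check a different mount.
SCRATCH="${1:-/bwfs/scratch}"

# Free space on the workspace file system
df -h "$SCRATCH"

# Your own data volume there; 'lfs quota' only works on a Lustre client
if command -v lfs >/dev/null 2>&1; then
    lfs quota -u "$USER" "$SCRATCH"
else
    echo "lfs not found: not a Lustre client on this machine" >&2
fi
```

Running it without arguments checks /bwfs/scratch; on a machine without Lustre it still reports the free space and notes that the quota check is unavailable.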

Future of bwgrid

bwhpc Performance Pyramid

[Figure only: the bwhpc performance pyramid.]

bwhpc Top-Level Systems

Usage requires the submission of a project proposal and a review procedure.

Hermit (Tier 1): http://www.hlrs.de/systems/platforms/cray-xe6-hermit/
- Installation step 1 (since 12/2011): 3552 nodes, 1 PFlops
- #27 in the TOP500 list of 11/2012 (http://www.top500.org/)
- A further installation step is planned for 2013

ForHLR (Tier 2): KIT press release 114/2012, http://www.kit.edu/visit/pi_2012_11366.php
- Planned PFlops system for 26 million EUR (2013-2015)
- Research fields: environment, energy, nanostructures, nanotechnologies, and materials sciences

bwunicluster

- Location: KIT Karlsruhe
- Number of nodes: 400
- Planned start of operation: August 2013
- CPU time quota per university according to its financial contribution
- Focus on: new HPC users; education and teaching; scientists from research areas not covered by a bwforcluster

bwforcluster

- 4 clusters: Heidelberg/Mannheim, Ulm, Freiburg, Tübingen
- Each cluster focuses on different fields of research
- Different hardware, but a similar user environment

bwhpc Timeline

[Timeline chart, 2008-2019: the bwgrid clusters (Stuttgart; Karlsruhe; HD/MA and Ulm/Konstanz; Tübingen and Freiburg; Esslingen) are phased out and succeeded by Hermit (HLRS), bwunicluster (Karlsruhe), ForHLR (Karlsruhe), and the bwforcluster systems (HD/MA and Ulm; Tübingen and Freiburg).]

bwgrid Central Storage in Karlsruhe

- Available at least until the end of 2013
- Currently no plans for a direct follow-up system
- Possible alternative: bwlsdf

What is bwlsdf?
- Large-Scale Data Facility for the bw universities: http://bwlsdf.scc.kit.edu
- Sub-projects:
  - bwsync&share: online storage for staff and students
  - bwfilestorage: file-based storage (access with scp, ftp, http, NFS)
  - bwblockstorage: block-based storage for IT centers
  - bwfederatedstorage: integration of distributed storage

New Access Mode

New Access Mode for bwhpc: Basics

Status:
- Developed by the bwidm project: http://www.bw-grid.de/en/bwservices/bwidm/
- Based on Shibboleth: http://shibboleth.net
- Will replace access via Grid certificates and Grid middleware
- Web registration required; login with a standard ssh client
- Access to the bwgrid cluster HD/MA from all bw universities

Plans for 2013/2014:
- Access to all bwgrid clusters from all bw universities
- Access to the bwunicluster from all bw universities
- Access to the bwforclusters from all bw universities

Shibboleth Access: Web Registration

Address for web registration: https://sp-grid-webregistration.uni-mannheim.de

1. DFN-AAI test federation: choose your university.
2. Shibboleth service of your university: log in with your account and password.
3. bwgrid registration page: read and accept the use conditions, choose a bwgrid cluster, and supply a short project description.

You are notified of registration and activation by e-mail.

Shibboleth Access: Login with an ssh Client

Shibboleth login server for bwgrid HD/MA: zug-hd-2.urz.uni-heidelberg.de

Account names for login:
- EPPN: account@university
- Local username: sbwxxxxx

Examples for login with ssh and file transfer with scp:

    ssh f38@uni-heidelberg.de@zug-hd-2.urz.uni-heidelberg.de
    ssh -l emuell@uni-mannheim.de zug-hd-2.urz.uni-heidelberg.de
    ssh sbw00123@zug-hd-2.urz.uni-heidelberg.de
    scp sbw00123@zug-hd-2.urz.uni-heidelberg.de:file-there file-here
    scp file-here f38@uni-heidelberg.de@zug-hd-2.urz.uni-heidelberg.de:file-there
    scp -o User=emuell@uni-mannheim.de file-here zug-hd-2.urz.uni-heidelberg.de:file-there
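Because EPPN usernames contain an @ sign, typing them on the ssh command line is error-prone. A ~/.ssh/config entry can shorten the commands; this is a sketch in which the host alias bwgrid-hd is an invented placeholder and the user name is the example EPPN from the slide, so substitute your own.

```
# ~/.ssh/config -- sketch; the alias "bwgrid-hd" is a placeholder,
# and the User line must be your own EPPN or local username.
Host bwgrid-hd
    HostName zug-hd-2.urz.uni-heidelberg.de
    User emuell@uni-mannheim.de
```

With this entry, the examples above shorten to "ssh bwgrid-hd" and "scp file-here bwgrid-hd:file-there".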

References Useful Links bwgrid Documentation MA: http://www.uni-mannheim.de/rum/zs/hpc/bwgrid_cluster bwgrid Documentation HD: http://www.urz.uni-heidelberg.de/server/grid bwgrid: http://www.bw-grid.de bwgrid Portal: http://portal.bw-grid.de User Support: https://helpdesk.ngi-de.eu Contact MA: dgrid-support@mailman.uni-mannheim.de Contact HD: dgrid-support@listserv.uni-heidelberg.de Richling/Kredel (URZ/RUM) bwgrid Treff SS 2013 22 / 22