Using open source and commercial visualization packages for analysis and visualization of large simulation dataset
Simon Su, Werner Benger, William Sherman, Eliot Feibush, Curtis Hillegas
Princeton University, Louisiana State University, Indiana University, Princeton Plasma Physics Laboratory, Princeton University

Abstract

Scientists today are generating more data than ever. In any high performance computing center, analyzing and visualizing large amounts of simulation data is becoming more and more challenging. High performance computing centers will therefore need to manage not only their computational resources but also state-of-the-art data analysis and visualization capabilities. Simulation data will also need to be generated and visualized on the same storage hardware, because transferring data elsewhere for analysis and visualization after a simulation is no longer feasible at the data sizes now being produced. The scientific visualization community has been developing software capable of reading, analyzing, and visualizing large simulation data, and numerous open source and commercial visualization packages are available to scientists for data analysis and visualization purposes. However, no single visualization package can satisfy all visualization requirements. Most visualization centers therefore use a combination of open source and commercial packages in their visualization projects, and they additionally develop their own customized visualization modules for project-specific requirements that the standard packages do not support.

1. Introduction

Scientific data visualization is a crucial part of any modeling and simulation research, small or large. The simulation data generated needs to be analyzed to verify the accuracy of the model.
Especially in this era of petascale and exascale computing, simulation data are only going to get larger[1]. Therefore, ultrascale visualization is a crucial part of any high performance computing center. The private sector and the United States federal government have been developing and funding visualization software toolkits capable of analyzing and visualizing large simulation datasets. These toolkits give domain scientists the ability to visualize large datasets at interactive rates and at different stages of their modeling and simulation research. Once the visualization pipeline is established, domain scientists can visualize their data within minutes of the completion of their simulation on the supercomputer. Some simulation algorithms generate output in stages, and domain scientists can use the established visualization pipeline to start analyzing and verifying the accuracy of the model before the simulation completes. They can stop the simulation if the results are not what they would expect to find, instead of wasting their resource allocation on the supercomputer, and then fine-tune their model and re-run the simulation. When the simulation is complete, domain scientists can also use the same visualization pipeline to create comprehensive visualizations of their dataset that enable them to present their research to their peers and to the general public. In addition to the visualization of scientific data, the ability to interact in real time and in three dimensions (3D) with the 3D simulation data during the visualization process is also crucial to analyzing and understanding the dataset[2]. Many visualization packages, such as VisIt[3], VISH[4], ParaView[5], and AVS/Express[6], can plot and create visualizations from simulation data in the formats that they can load.
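The staged-output workflow described above can be sketched as a simple monitoring loop. The sketch is purely illustrative: the stage data and the `looks_physical` validity check below are hypothetical placeholders, not part of any particular simulation code.

```python
def looks_physical(stage_output, bound=1e3):
    """Hypothetical sanity check: all values stay within a physical bound."""
    return all(abs(v) <= bound for v in stage_output)

def monitor_stages(stages):
    """Validate each stage as it arrives; stop early on a bad result.

    Returns (stages_accepted, aborted) so the scientist can decide
    whether to fine-tune the model and re-run the simulation."""
    accepted = 0
    for output in stages:
        if not looks_physical(output):
            return accepted, True   # abort: stop spending the allocation
        accepted += 1
    return accepted, False

# Synthetic stage outputs: the third stage diverges, so the run
# would be stopped after two accepted stages.
stages = [[0.1, 0.5], [2.0, 3.0], [1e9, 4.0], [0.2, 0.3]]
```

The same loop structure applies whether the check is a bound test, a conservation-law check, or a visual inspection triggered per stage.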
However, only a handful of visualization packages fully support immersive displays and 3D interaction with the displayed dataset. AVS/Express provides an additional module to support immersive display and 3D interaction with the dataset; ParaView, on the other hand, has only limited support for these capabilities. Another way to achieve immersive display and 3D interaction in scientific data visualization is to develop the visualization application using a visualization software toolkit such as VTK[7] together with a software toolkit that supports immersive display and 3D interaction, such as FreeVR[8] or VR Juggler[9].
In section 2, we describe the visualization hardware setup of the Visualization Center. Sections 3 and 4 describe the open source and commercial toolkits used in the Visualization Center, respectively. Section 5 describes a unique visualization toolkit that is freely available for research and non-military use. We discuss the advantages of using both open source and commercial software in section 6, and we conclude in section 7.

2. Hardware Environment

The Visualization Center has a single-wall rear-projection display using a Sony SRX-S110 projector with 10,000 lumens and a 4096 x 2160 native resolution, which translates to over 8.8 million pixels on the screen, more than four times the pixels of an office monitor at 1920 x 1080 resolution. Figure 1 illustrates the Sony SRX-S110 projector.

Figure 1. Sony SRX-S110

The screen is 120" tall and 216" wide, with OptiTrack[10] optical tracking to support 3D interaction within visualization applications. Figures 2 and 3 illustrate the elevation view and the plan view of the design of the visualization system.

Figure 2. Elevation View

Figure 3. Plan View

The projector is driven by an Intel-based graphics workstation with an Nvidia Quadro Plex 2200 D2. The four DVI-D outputs of the Quadro Plex 2200 D2 are connected to the four DVI-D inputs of the Sony SRX-S110 projector. Each DVI-D connection is configured at 2048 x 1080 resolution, and the four connections combine to provide the projector with its 4096 x 2160 input. The graphics workstation has 8 CPU cores and 24 gigabytes of system memory. In addition, a fiber connection card provides a dedicated data connection to the GPFS file system server, which serves as common data storage for the research community. The graphics workstation runs both the Microsoft Windows and PUIAS Linux operating systems.
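The tiled-input arithmetic above is easy to verify: four 2048 x 1080 DVI-D inputs tile 2 x 2 into the projector's 4096 x 2160 native resolution, which works out to roughly 8.8 million pixels, a bit more than four times a 1920 x 1080 office monitor.

```python
# Four DVI-D inputs, each 2048 x 1080, tiled 2 x 2 into the projector.
tile_w, tile_h = 2048, 1080
native_w, native_h = tile_w * 2, tile_h * 2   # 4096 x 2160

projector_pixels = native_w * native_h        # total pixels on screen
monitor_pixels = 1920 * 1080                  # a typical office monitor

print(projector_pixels)                        # 8847360
print(projector_pixels / monitor_pixels)       # just over 4x
```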
3. Open Source Toolkits in Visualization and Virtual Reality

Data visualization is a highly customized process that depends largely on the type of data and the visualization required. The use of open source visualization software therefore allows the software to be freely customized to address individual data visualization needs. Although access to the source code allows the user to customize the visualization software, a good software design will also enable the user to add new visualization features and load new data formats without getting too deep into the core of the visualization package. This reduces how much a new user must understand in order to extend the visualization software. Although commercial software vendors offer paid support for their visualization software, government-funded open source visualization projects can also provide excellent support through their user mailing lists. Many of the users of
the open source visualization software fall into a similar category of user and visualize similar datasets. Many of the issues faced by a new user may already have been addressed by someone on the mailing list, so both the developers and the users of the visualization software can help address a new user's issue. This free support model works well when there is a large user base for the visualization software. The following open source visualization projects and 3D interactive and immersive software projects are good examples of software toolkits with good support from the developers and the user community.

VisIt

VisIt[3] is visualization software that can be used to visualize large-scale simulation datasets. It enables the user to perform general post-processing and visualization of extremely massive datasets in parallel by utilizing the processing power of a supercomputer, in much the same way that the complex modeling and simulation processes that generate the massive datasets run on the supercomputer. Furthermore, VisIt provides the user with simplified access to the supercomputer's massive computational power by decoupling the graphics component and the computational component of the complex visualization software. The most powerful feature of VisIt is the ability to run the graphical component of VisIt on the graphics workstation in the user's office while running the computational processing component on a remote supercomputer. The user operates the graphical user interface on the local graphics workstation to start a visualization plot; this initiates the computational process on the remote supercomputer and gathers all the processed data, combining it into the final data visualization plot on the local graphics workstation.
This enables the user to process and visualize massive datasets that would otherwise be impossible to handle on the local graphics workstation in the user's office. The targeted use cases of VisIt include data exploration, comparative analysis, visual debugging, quantitative analysis, and presentation graphics. Like any large and complex software toolkit, VisIt's core functionality depends on several other software toolkits: the Qt widget library for its user interface, the Python programming language for a command line interpreter, and the Visualization ToolKit (VTK) library for its data modeling and visualization algorithms. The use of existing toolkits allows the core development team of VisIt to focus their efforts on parallelization for large data sets, user interface design, implementation of custom data analysis routines, visualization of non-standard data models such as adaptive refinement meshes and mixed-material zones, and creation of a robust overall product. VisIt runs on a variety of operating systems, including Microsoft Windows, Apple Macintosh, and many UNIX variants, including AIX, IRIX, Solaris, Tru64, and, of course, Linux, with ports for SGI's Altix, Cray's XT4, and many commodity clusters. VisIt's basic design is a client-server model in which the server is parallelized. The client-server aspect allows for effective visualization in a remote setting, while the parallelization of the server allows the largest data sets to be processed at a reasonably interactive rate, which is crucial for the visual debugging process. The client-server model also allows the user to mix VisIt processes running on various operating systems, permitting the use of nonhomogeneous hardware platforms. The user can run VisIt's graphical user interface on a Microsoft Windows machine and connect to parallel computational processes running on a supercomputer under the Linux operating system.
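VisIt's actual client-server protocol is internal to the tool, but the decoupling idea can be sketched in plain Python. In the hypothetical sketch below, `reduce_on_server` stands in for the parallel compute engine (which touches the massive data) and `plot_on_client` for the local GUI (which only ever sees a small summary).

```python
def reduce_on_server(dataset, n_bins=4):
    """Stand-in for the parallel compute engine: reduce a large dataset
    to a small, plot-ready summary (here, a simple histogram)."""
    lo, hi = min(dataset), max(dataset)
    width = (hi - lo) / n_bins or 1.0   # guard against a constant field
    bins = [0] * n_bins
    for v in dataset:
        i = min(int((v - lo) / width), n_bins - 1)
        bins[i] += 1
    return bins

def plot_on_client(summary):
    """Stand-in for the local GUI: renders only the small summary."""
    return ["#" * count for count in summary]

# The 'massive' dataset never leaves the server side; only the
# n_bins-sized summary crosses to the client workstation.
data = [i % 10 for i in range(1000)]
summary = reduce_on_server(data)
```

In the real system the two halves run as separate processes on separate machines; the point of the sketch is only that the data crossing between them shrinks from the full dataset to the summary.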
VisIt's user community has used VisIt to visualize many large data sets, including a two-hundred-sixteen-billion data point structured grid, a one billion point particle simulation, and curvilinear, unstructured, and AMR meshes with hundreds of millions to billions of elements. At the Visualization Center, we have been using VisIt to process 5 gigabytes of simulation data, which would otherwise be impossible to visualize given the hardware resources we have on a single machine.

ParaView

ParaView[5] is an open source data analysis and visualization application that runs on a variety of operating systems, including Microsoft Windows, Apple Macintosh, and several Linux variants. ParaView users can use the different filters provided and effortlessly build visualizations to analyze their data using qualitative and quantitative techniques. ParaView supports both 3D interactive and scripted processing of large datasets. ParaView takes advantage of distributed-memory computing resources to analyze extremely large datasets, and it is a scalable visualization application that runs on a wide variety of hardware architectures, ranging from a single-processor desktop or laptop machine to multi-processor shared-memory supercomputers or clusters of computers. ParaView can process a wide variety of data formats, including structured and unstructured, time-varying and static datasets. ParaView takes advantage of the Visualization ToolKit (VTK) to provide a comprehensive suite of visualization algorithms, and it is designed so that its set of visualization algorithms is easily extensible, allowing new algorithms to be added to enhance the capability of the toolkit.
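The filter model that ParaView (and VTK underneath it) exposes composes independent operations into a pipeline. The tiny chain below mimics that idea in plain Python; the `clip` and `scale` filters are hypothetical illustrations, not ParaView's API.

```python
def clip(threshold):
    """Filter factory: keep only points at or above the threshold."""
    return lambda points: [p for p in points if p >= threshold]

def scale(factor):
    """Filter factory: scale every point by a constant factor."""
    return lambda points: [p * factor for p in points]

def run_pipeline(points, filters):
    """Apply each filter in order, as a visualization pipeline would."""
    for f in filters:
        points = f(points)
    return points

# Compose two filters: clip away negative values, then rescale.
result = run_pipeline([-2.0, 0.5, 1.0, 3.0], [clip(0.0), scale(10.0)])
```

Because each filter is self-contained, new algorithms slot into the chain without changing the others, which is the extensibility property described above.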
At the Visualization Center, ParaView is used to visualize engineering and atmospheric simulation data where the numerical data can be easily presented and analyzed by the domain scientists.

FreeVR

FreeVR is an open source virtual reality (VR) library that provides the developer with all the functionality needed to create a complex VR application. It has been designed to work with a wide variety of the input and output hardware commonly used in VR systems. VR applications developed with the FreeVR library can easily be run on existing virtual reality hardware facilities, as well as on newly established VR hardware systems. FreeVR uses a configuration file that provides an overall specification of the VR hardware system running the VR application. The configuration file, which is unique to each VR hardware system, provides a layer of abstraction that hides the complexity of accessing different VR devices from the application developer. A VR application will be able to run on any VR hardware system that has the correct FreeVR configuration file, without modification of any part of the application source code. The FreeVR library has very few dependencies and compiles on any Linux platform with minimal effort. Like many other VR integration libraries, FreeVR only manages the rendering window; the developer must use VTK, OpenGL, or another scene graph management toolkit to render graphics into the rendering window managed by FreeVR. The currently available FreeVR library runs on multi-processor shared-memory systems; a cluster-capable version of FreeVR is under intense development and will soon be available as an open source software library. The FreeVR library is used to develop architectural walkthrough applications at the Visualization Center.

VR Juggler

The name of the toolkit, VR Juggler[9], implies that the toolkit juggles various components together to provide the complete functionality of a typical VR library.
It is a VR toolkit developed and supported by a team of researchers at the Virtual Reality Applications Center at Iowa State University. The VR Juggler toolkit consists of several components, including Gadgeteer, the Juggler Configuration and Control Library (JCCL), Tweek, Sonix, CORBA, VRPN, OpenAL/AudioWorks, and OpenGL, that sit on top of the operating system layer. The Gadgeteer component provides device management functionality for VR Juggler, enabling access to local or remote I/O devices. The JCCL component provides the ability to configure and monitor application performance. Tweek is the distributed model-view-controller implementation within VR Juggler. The Sonix component provides the library with simple sound-abstraction support. The latest VR Juggler library supports both multi-pipe and cluster-based rendering systems. VR Juggler also supports different VR system configurations through XML-based configuration files; this is transparent to the application layer, creating a uniform interface through which the VR application developer accesses the hardware layer of different VR systems. The toolkit also provides a Java-based configuration tool to assist users in creating the complex XML-based configuration files. Like any complex software toolkit, VR Juggler depends on a number of software libraries, including CppDom, GMTL, the Java Developer Kit, omniORB, OpenAL, OpenALUT, Audiere, VRPN, and Boost, to provide different functionality within the VR Juggler library. During compilation, the VR Juggler build process relies on the Flagpoll software library for setting the correct compiler and linker flags, and compilation of VR applications developed with VR Juggler relies on the Doozer software package for many of the compile-flag settings. VR Juggler uses an object-based methodology in which application developers overload virtual functions in the main class.
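The overload-and-hand-to-kernel pattern can be illustrated schematically. VR Juggler itself is a C++ toolkit; the Python below is only a sketch of the control flow, and the class and method names are hypothetical, not the real VR Juggler API.

```python
class App:
    """Base class: the kernel calls these hooks each frame."""
    def init(self): pass
    def pre_frame(self): pass
    def draw(self): pass

class WalkthroughApp(App):
    """Application code only overloads the pre-defined hooks."""
    def __init__(self):
        self.frames = 0
        self.log = []
    def init(self):
        self.log.append("init")
    def pre_frame(self):
        self.frames += 1                    # e.g. read tracker input here
    def draw(self):
        self.log.append(f"draw {self.frames}")

def kernel_run(app, n_frames):
    """Stand-in for the kernel: it owns the loop and all system
    resources; the application owns only the callback bodies."""
    app.init()
    for _ in range(n_frames):
        app.pre_frame()
        app.draw()

app = WalkthroughApp()
kernel_run(app, 3)
```

The inversion of control is the point: the application never drives the loop, so the same application object can be managed on any VR system the kernel is configured for.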
Application developers write an application class that overloads pre-defined functions and pass the object to the VR Juggler kernel, which is responsible for managing and controlling all the resources of the VR application across the system. VR Juggler does not provide rendering or scene graph functionality; application developers use the VTK library or OpenGL to render graphics into the rendering window managed by VR Juggler. The VR Juggler library is used to develop architectural walkthrough and multimedia 3D applications at the Visualization Center.

4. Commercial Toolkits in Visualization and Virtual Reality

Research groups that have limited expertise in visualization are more likely to use commercially available visualization toolkits. One of the key advantages of using commercial visualization software is the option to purchase support that can be budgeted into the project proposal. In general, commercially supported visualization toolkits also have more structured support, with training classes and more expert users who can be hired to work on the project. The Visualization Center has also made available a few commercial visualization software packages that are commonly used by the existing user base of the center. The choices of visualization software provided by the center
are determined mainly by the requests of the users of the center. Some users have preferred visualization software that they would like to use in the hardware environment of the Visualization Center.

AVS/Express

AVS/Express provides a comprehensive and versatile data visualization tool that can be used by both application developers and end users to visualize their datasets. It enables the user to perform rapid data analysis and apply rich visualization techniques via a comprehensive graphical user interface. AVS/Express provides powerful visualization for users across multiple domains, including science, business, engineering, medicine, telecommunications, and environmental research. The Visualization Center is using AVS/Express for scientific data visualization.

ArcGIS

ArcGIS is a collection of geographic information system (GIS) software products that provides users with spatial analysis, data management, and mapping capabilities. The Visualization Center is using ArcGIS for visualization of maps and geographical features.

5. VISH - Freely available visualization toolkit

VISH is scientific visualization software that provides a platform for bridging the gap between algorithm development and the accessibility of new visualization techniques to application scientists. The framework enables development of new visualization algorithms that are not bound to their runtime environment, which helps facilitate the integration of newly developed visualization algorithms into the domain scientist's daily workflow. The developers of VISH designed the software framework to allow the development and implementation of visualization algorithms independently of any specific application. This gives developers a framework in which to test their new visualization algorithms and also enables them to share their new visualization modules among other applications and with their peers.
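VISH's sharing model, in which algorithms are coded against one abstract interface and then reused across applications, can be sketched as a minimal plugin registry. The names below are illustrative, not the actual VISH API.

```python
REGISTRY = {}

def register(name):
    """Decorator: publish an algorithm under the shared interface."""
    def wrap(fn):
        REGISTRY[name] = fn
        return fn
    return wrap

@register("mean")
def mean_field(values):
    """A visualization-support algorithm: mean of a scalar field."""
    return sum(values) / len(values)

@register("peak")
def peak_field(values):
    """Another algorithm sharing the same interface: peak value."""
    return max(values)

def apply_plugin(name, values):
    """Any application that knows only the registry can run any plugin;
    the algorithm code never sees the application it runs inside."""
    return REGISTRY[name](values)
```

The algorithm implementations depend only on the registry's interface, which is the application-independence property VISH aims for.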
The VISH framework also has minimal overhead in dependencies and external components, which allows a new user to compile and use the framework without a steep learning curve; the minimal overhead and dependencies also allow for easy deployment to end users. Many components of VISH can be used independently of the framework, which increases their usefulness outside it. The VISH software framework acts like an operating system for visualization algorithms, with a minimal set of abstract APIs serving as a framework for development and allowing plugins, which provide implementations of the same interfaces, to be shared across applications. The vision of VISH is to become a visualization "shell" ("shell" in the sense of a structural framework or skin); it is not an application by itself but an implementation of the infrastructure necessary for scientific visualization. It consists of a collection of interfaces, in the form of libraries, which application developers can use to implement their visualization algorithms. The algorithms implemented see only the VISH API and remain application-independent. At the Visualization Center, the VISH framework is used to support visualization projects that require new and experimental visualization algorithms that have not yet been released in other open source or commercial visualization packages.

6. Up-front economic advantages and increased productivity achieved

The Visualization Center utilizes a suite of freely and commercially available visualization packages to support multi-disciplinary visualization projects. The use of open source toolkits has greatly enhanced the visualization capability of the center in addition to the commercial visualization software, and it is important to be able to leverage the advantages of different toolkits to address our visualization challenges.
Many visualization challenges are more easily addressed with open source toolkits than with commercial ones, thanks to the accessibility of the application source code and the support of the software's large user community, whose members have worked on similar issues and are willing to share their experiences. Without a doubt, the use of open source visualization software has increased the productivity of the visualization projects at the center. In addition to the usability advantages, the center has also benefited greatly in economic terms from the use of both freely and commercially available software toolkits, significantly reducing the initial setup and acquisition cost of the software suite needed to support the center's visualization projects. The use of freely available toolkits shaved tens of thousands of dollars from the center's initial software budget and provided the users of the center a broader range of software toolkits to use in their visualization projects. Table 1 summarizes the software toolkits used in the Visualization Center.
Table 1. Software features summary

Software       Significant features
VisIt          Large data visualization
ParaView       Large data visualization
FreeVR         VR integration library
VR Juggler     VR integration library
AVS/Express    Large data visualization
ArcGIS         Geographical data visualization
IDL/ENVI       Image processing, data visualization

The availability of different software toolkits at the center also encourages collaborations with other research institutions running the same toolkits. A visualization application developed by another visualization center can easily be ported to the software and hardware environment of the center if it is developed on the same software framework.

7. Conclusion

The Visualization Center provides high-end visualization hardware and software that help facilitate scientific and non-scientific discovery across multiple disciplines. The goal of the center is to provide unified visualization resources and support that would otherwise be unavailable, or too expensive to maintain, for individual researchers at their respective research laboratories or departments. The Visualization Center will keep adding open source and commercial visualization software toolkits to the list of software available for use at the center. The use of both free and commercial software packages has significantly reduced the economic requirements of running the center without sacrificing its technical ability.

8. Acknowledgement

The authors would like to thank PICSciE and the university faculty members for their generous contributions to the funding of the initial setup cost of the Visualization Center.

References

[1] Kwan-Liu Ma, Chaoli Wang, Hongfeng Yu, Kenneth Moreland, Jian Huang, and Rob Ross, Next-Generation Visualization Technologies: Enabling Discoveries at Extreme Scale, SciDAC Review, IOP Publishing, Spring 2009, Issue 12.
[2] Wen Qi, Jean-Bernard Martens, Robert van Liere, and Arjan Kok, Reach the virtual environment: 3D tangible interaction with scientific data, Proceedings of the 17th Australia Conference on Computer-Human Interaction, Australia, November 23-25, 2005.
[3] Hank Childs, Eric S. Brugger, Kathleen S. Bonnell, Jeremy S. Meredith, Mark Miller, Brad J. Whitlock, and Nelson Max, A Contract-Based System for Large Data Visualization, Proceedings of IEEE Visualization 2005, Minneapolis, Minnesota, October 23-25, 2005.
[4] Werner Benger, Georg Ritter, and René Heinz, The Concepts of VISH, High-End Visualization Workshop, Austria, Lehmanns Media-LOB.de, Berlin, June 18-21, 2007.
[5] Amy Squillacote, The ParaView Guide, Kitware Inc.
[6] AVS/Express, accessed June 27.
[7] Kitware Inc., VTK User's Guide, Kitware Inc.
[8] William Sherman, FreeVR, accessed June 29.
[9] A. Bierbaum, C. Just, P. Hartling, K. Meinert, A. Baker, and C. Cruz-Neira, VR Juggler: A virtual platform for virtual reality application development, Proceedings of IEEE Virtual Reality 2001, Yokohama, IEEE Computer Society Press, Los Alamitos, 2001.
[10] OptiTrack optical tracking system, accessed June 27.
Wildfire Prevention and Management in a 3D Virtual Environment
Wildfire Prevention and Management in a 3D Virtual Environment M. Castrillón 1, P.A. Jorge 2, I.J. López 3, A. Macías 2, D. Martín 2, R.J. Nebot 3,I. Sabbagh 3, J. Sánchez 2, A.J. Sánchez 2, J.P. Suárez
Building Platform as a Service for Scientific Applications
Building Platform as a Service for Scientific Applications Moustafa AbdelBaky [email protected] Rutgers Discovery Informa=cs Ins=tute (RDI 2 ) The NSF Cloud and Autonomic Compu=ng Center Department
Data Centric Systems (DCS)
Data Centric Systems (DCS) Architecture and Solutions for High Performance Computing, Big Data and High Performance Analytics High Performance Computing with Data Centric Systems 1 Data Centric Systems
VisIt Visualization Tool
The Center for Astrophysical Thermonuclear Flashes VisIt Visualization Tool Randy Hudson [email protected] Argonne National Laboratory Flash Center, University of Chicago An Advanced Simulation and Computing
Cray: Enabling Real-Time Discovery in Big Data
Cray: Enabling Real-Time Discovery in Big Data Discovery is the process of gaining valuable insights into the world around us by recognizing previously unknown relationships between occurrences, objects
ABSTRACT. Keywords Virtual Reality, Java, JavaBeans, C++, CORBA 1. INTRODUCTION
Tweek: Merging 2D and 3D Interaction in Immersive Environments Patrick L Hartling, Allen D Bierbaum, Carolina Cruz-Neira Virtual Reality Applications Center, 2274 Howe Hall Room 1620, Iowa State University
Teaching Computational Thinking using Cloud Computing: By A/P Tan Tin Wee
Teaching Computational Thinking using Cloud Computing: By A/P Tan Tin Wee Technology in Pedagogy, No. 8, April 2012 Written by Kiruthika Ragupathi ([email protected]) Computational thinking is an emerging
How To Install Linux Titan
Linux Titan Distribution Presented By: Adham Helal Amgad Madkour Ayman El Sayed Emad Zakaria What Is a Linux Distribution? What is a Linux Distribution? The distribution contains groups of packages and
Part I Courses Syllabus
Part I Courses Syllabus This document provides detailed information about the basic courses of the MHPC first part activities. The list of courses is the following 1.1 Scientific Programming Environment
Equalizer. Parallel OpenGL Application Framework. Stefan Eilemann, Eyescale Software GmbH
Equalizer Parallel OpenGL Application Framework Stefan Eilemann, Eyescale Software GmbH Outline Overview High-Performance Visualization Equalizer Competitive Environment Equalizer Features Scalability
Collaborative modelling and concurrent scientific data analysis:
Collaborative modelling and concurrent scientific data analysis: Application case in space plasma environment with the Keridwen/SPIS- GEO Integrated Modelling Environment B. Thiebault 1, J. Forest 2, B.
Manjrasoft Market Oriented Cloud Computing Platform
Manjrasoft Market Oriented Cloud Computing Platform Aneka Aneka is a market oriented Cloud development and management platform with rapid application development and workload distribution capabilities.
NVIDIA GRID OVERVIEW SERVER POWERED BY NVIDIA GRID. WHY GPUs FOR VIRTUAL DESKTOPS AND APPLICATIONS? WHAT IS A VIRTUAL DESKTOP?
NVIDIA GRID OVERVIEW Imagine if responsive Windows and rich multimedia experiences were available via virtual desktop infrastructure, even those with intensive graphics needs. NVIDIA makes this possible
STATISTICA Solutions for Financial Risk Management Management and Validated Compliance Solutions for the Banking Industry (Basel II)
STATISTICA Solutions for Financial Risk Management Management and Validated Compliance Solutions for the Banking Industry (Basel II) With the New Basel Capital Accord of 2001 (BASEL II) the banking industry
Virtual Machines. www.viplavkambli.com
1 Virtual Machines A virtual machine (VM) is a "completely isolated guest operating system installation within a normal host operating system". Modern virtual machines are implemented with either software
JavaFX Session Agenda
JavaFX Session Agenda 1 Introduction RIA, JavaFX and why JavaFX 2 JavaFX Architecture and Framework 3 Getting Started with JavaFX 4 Examples for Layout, Control, FXML etc Current day users expect web user
An Integrated Simulation and Visualization Framework for Tracking Cyclone Aila
An Integrated Simulation and Visualization Framework for Tracking Cyclone Aila Preeti Malakar 1, Vijay Natarajan, Sathish S. Vadhiyar, Ravi S. Nanjundiah Department of Computer Science and Automation Supercomputer
Big Data Executive Survey
Big Data Executive Full Questionnaire Big Date Executive Full Questionnaire Appendix B Questionnaire Welcome The survey has been designed to provide a benchmark for enterprises seeking to understand the
Network operating systems typically are used to run computers that act as servers. They provide the capabilities required for network operation.
NETWORK OPERATING SYSTEM Introduction Network operating systems typically are used to run computers that act as servers. They provide the capabilities required for network operation. Network operating
Petascale Visualization: Approaches and Initial Results
Petascale Visualization: Approaches and Initial Results James Ahrens Li-Ta Lo, Boonthanome Nouanesengsy, John Patchett, Allen McPherson Los Alamos National Laboratory LA-UR- 08-07337 Operated by Los Alamos
Visualization and Data Analysis with VIDA. Joe Corkery OpenEye Scientific Software
Visualization and Data Analysis with VIDA Joe Corkery OpenEye Scientific Software OpenEye Small software company Efficient large scale 3D computations Tools for managing computed data VIDA Primarily a
www.xenon.com.au STORAGE HIGH SPEED INTERCONNECTS HIGH PERFORMANCE COMPUTING VISUALISATION GPU COMPUTING
www.xenon.com.au STORAGE HIGH SPEED INTERCONNECTS HIGH PERFORMANCE COMPUTING GPU COMPUTING VISUALISATION XENON Accelerating Exploration Mineral, oil and gas exploration is an expensive and challenging
Wyse vworkspace Supports Higher Education s Desktop Virtualization Needs
Wyse vworkspace Supports Higher Education s Desktop Virtualization Needs Prepared by Chris Lyman, Senior Systems Consultant Dell cloud client-computing Solutions Abstract As interest in alternative approaches
Big Data Management in the Clouds and HPC Systems
Big Data Management in the Clouds and HPC Systems Hemera Final Evaluation Paris 17 th December 2014 Shadi Ibrahim [email protected] Era of Big Data! Source: CNRS Magazine 2013 2 Era of Big Data! Source:
The Visualization Pipeline
The Visualization Pipeline Conceptual perspective Implementation considerations Algorithms used in the visualization Structure of the visualization applications Contents The focus is on presenting the
ML2VR Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity
ML2VR Providing MATLAB Users an Easy Transition to Virtual Reality and Immersive Interactivity David J. Zielinski Ryan P. McMahan Wenjie Lu Silvia Ferrari Motivation: DiVE: Duke Immersive Virtual Environment.
Virtualization and the U2 Databases
Virtualization and the U2 Databases Brian Kupzyk Senior Technical Support Engineer for Rocket U2 Nik Kesic Lead Technical Support for Rocket U2 Opening Procedure Orange arrow allows you to manipulate the
IBM Tivoli Monitoring for Applications
Optimize the operation of your critical e-business applications IBM Tivoli Monitoring for Applications Highlights Helps maintain the performance and availability of your application environment including
Applications to Computational Financial and GPU Computing. May 16th. Dr. Daniel Egloff +41 44 520 01 17 +41 79 430 03 61
F# Applications to Computational Financial and GPU Computing May 16th Dr. Daniel Egloff +41 44 520 01 17 +41 79 430 03 61 Today! Why care about F#? Just another fashion?! Three success stories! How Alea.cuBase
GEDAE TM - A Graphical Programming and Autocode Generation Tool for Signal Processor Applications
GEDAE TM - A Graphical Programming and Autocode Generation Tool for Signal Processor Applications Harris Z. Zebrowitz Lockheed Martin Advanced Technology Laboratories 1 Federal Street Camden, NJ 08102
IBM Software Group. Lotus Domino 6.5 Server Enablement
IBM Software Group Lotus Domino 6.5 Server Enablement Agenda Delivery Strategy Themes Domino 6.5 Server Domino 6.0 SmartUpgrade Questions IBM Lotus Notes/Domino Delivery Strategy 6.0.x MRs every 4 months
CHAPTER 15: Operating Systems: An Overview
CHAPTER 15: Operating Systems: An Overview The Architecture of Computer Hardware, Systems Software & Networking: An Information Technology Approach 4th Edition, Irv Englander John Wiley and Sons 2010 PowerPoint
STUDY AND SIMULATION OF A DISTRIBUTED REAL-TIME FAULT-TOLERANCE WEB MONITORING SYSTEM
STUDY AND SIMULATION OF A DISTRIBUTED REAL-TIME FAULT-TOLERANCE WEB MONITORING SYSTEM Albert M. K. Cheng, Shaohong Fang Department of Computer Science University of Houston Houston, TX, 77204, USA http://www.cs.uh.edu
Integrated Open-Source Geophysical Processing and Visualization
Integrated Open-Source Geophysical Processing and Visualization Glenn Chubak* University of Saskatchewan, Saskatoon, Saskatchewan, Canada [email protected] and Igor Morozov University of Saskatchewan,
Programming models for heterogeneous computing. Manuel Ujaldón Nvidia CUDA Fellow and A/Prof. Computer Architecture Department University of Malaga
Programming models for heterogeneous computing Manuel Ujaldón Nvidia CUDA Fellow and A/Prof. Computer Architecture Department University of Malaga Talk outline [30 slides] 1. Introduction [5 slides] 2.
Bachelor of Games and Virtual Worlds (Programming) Subject and Course Summaries
First Semester Development 1A On completion of this subject students will be able to apply basic programming and problem solving skills in a 3 rd generation object-oriented programming language (such as
Integrating TAU With Eclipse: A Performance Analysis System in an Integrated Development Environment
Integrating TAU With Eclipse: A Performance Analysis System in an Integrated Development Environment Wyatt Spear, Allen Malony, Alan Morris, Sameer Shende {wspear, malony, amorris, sameer}@cs.uoregon.edu
Stream Processing on GPUs Using Distributed Multimedia Middleware
Stream Processing on GPUs Using Distributed Multimedia Middleware Michael Repplinger 1,2, and Philipp Slusallek 1,2 1 Computer Graphics Lab, Saarland University, Saarbrücken, Germany 2 German Research
Star System. 2004 Deitel & Associates, Inc. All rights reserved.
Star System Apple Macintosh 1984 First commercial OS GUI Chapter 1 Introduction to Operating Systems Outline 1.1 1.2 1.3 1.4 1.5 1.6 1.7 1.8 1.9 1.10 1.11 1.12 Introduction What Is an Operating System?
DISTRIBUTED SYSTEMS AND CLOUD COMPUTING. A Comparative Study
DISTRIBUTED SYSTEMS AND CLOUD COMPUTING A Comparative Study Geographically distributed resources, such as storage devices, data sources, and computing power, are interconnected as a single, unified resource
2) Xen Hypervisor 3) UEC
5. Implementation Implementation of the trust model requires first preparing a test bed. It is a cloud computing environment that is required as the first step towards the implementation. Various tools
DIY Parallel Data Analysis
I have had my results for a long time, but I do not yet know how I am to arrive at them. Carl Friedrich Gauss, 1777-1855 DIY Parallel Data Analysis APTESC Talk 8/6/13 Image courtesy pigtimes.com Tom Peterka
NVIDIA IndeX Enabling Interactive and Scalable Visualization for Large Data Marc Nienhaus, NVIDIA IndeX Engineering Manager and Chief Architect
SIGGRAPH 2013 Shaping the Future of Visual Computing NVIDIA IndeX Enabling Interactive and Scalable Visualization for Large Data Marc Nienhaus, NVIDIA IndeX Engineering Manager and Chief Architect NVIDIA
Deploying and managing a Visualization Farm @ Onera
Deploying and managing a Visualization Farm @ Onera Onera Scientific Day - October, 3 2012 Network and computing department (DRI), Onera P.F. Berte [email protected] Plan Onera global HPC
NVIDIA CUDA GETTING STARTED GUIDE FOR MICROSOFT WINDOWS
NVIDIA CUDA GETTING STARTED GUIDE FOR MICROSOFT WINDOWS DU-05349-001_v6.0 February 2014 Installation and Verification on TABLE OF CONTENTS Chapter 1. Introduction...1 1.1. System Requirements... 1 1.2.
How To Build A Cloud Computer
Introducing the Singlechip Cloud Computer Exploring the Future of Many-core Processors White Paper Intel Labs Jim Held Intel Fellow, Intel Labs Director, Tera-scale Computing Research Sean Koehl Technology
Large Vector-Field Visualization, Theory and Practice: Large Data and Parallel Visualization Hank Childs + D. Pugmire, D. Camp, C. Garth, G.
Large Vector-Field Visualization, Theory and Practice: Large Data and Parallel Visualization Hank Childs + D. Pugmire, D. Camp, C. Garth, G. Weber, S. Ahern, & K. Joy Lawrence Berkeley National Laboratory
Unprecedented Performance and Scalability Demonstrated For Meter Data Management:
Unprecedented Performance and Scalability Demonstrated For Meter Data Management: Ten Million Meters Scalable to One Hundred Million Meters For Five Billion Daily Meter Readings Performance testing results
Chapter 1 - Web Server Management and Cluster Topology
Objectives At the end of this chapter, participants will be able to understand: Web server management options provided by Network Deployment Clustered Application Servers Cluster creation and management
Status and Integration of AP2 Monitoring and Online Steering
Status and Integration of AP2 Monitoring and Online Steering Daniel Lorenz - University of Siegen Stefan Borovac, Markus Mechtel - University of Wuppertal Ralph Müller-Pfefferkorn Technische Universität
QuickSpecs. NVIDIA Quadro K5200 8GB Graphics INTRODUCTION. NVIDIA Quadro K5200 8GB Graphics. Technical Specifications
J3G90AA INTRODUCTION The NVIDIA Quadro K5200 gives you amazing application performance and capability, making it faster and easier to accelerate 3D models, render complex scenes, and simulate large datasets.
Public Cloud Partition Balancing and the Game Theory
Statistics Analysis for Cloud Partitioning using Load Balancing Model in Public Cloud V. DIVYASRI 1, M.THANIGAVEL 2, T. SUJILATHA 3 1, 2 M. Tech (CSE) GKCE, SULLURPETA, INDIA [email protected] [email protected]
Reporting Services. White Paper. Published: August 2007 Updated: July 2008
Reporting Services White Paper Published: August 2007 Updated: July 2008 Summary: Microsoft SQL Server 2008 Reporting Services provides a complete server-based platform that is designed to support a wide
James Ahrens, Berk Geveci, Charles Law. Technical Report
LA-UR-03-1560 Approved for public release; distribution is unlimited. Title: ParaView: An End-User Tool for Large Data Visualization Author(s): James Ahrens, Berk Geveci, Charles Law Submitted to: Technical
The Big Data methodology in computer vision systems
The Big Data methodology in computer vision systems Popov S.B. Samara State Aerospace University, Image Processing Systems Institute, Russian Academy of Sciences Abstract. I consider the advantages of
Network device management solution
iw Management Console Network device management solution iw MANAGEMENT CONSOLE Scalability. Reliability. Real-time communications. Productivity. Network efficiency. You demand it from your ERP systems
A Framework for End-to-End Proactive Network Management
A Framework for End-to-End Proactive Network Management S. Hariri, Y. Kim, P. Varshney, Department of Electrical Engineering and Computer Science Syracuse University, Syracuse, NY 13244 {hariri, yhkim,varshey}@cat.syr.edu
