Code_Saturne 2.0. Code_Saturne development team. Code_Saturne User Meeting, 7th and 8th December 2009


Outline
1. Enhancements and new features
2. Perspectives
3. Open source initiative

Enhancements and new features (with respect to the latest validated version, 1.3)

Release schedule and validation
- Previous versions
  - Latest validated version 1.3.2 released in April 2008
  - Corrective version 1.3.3 released for 2009
  - Intermediate development version 1.4 released concurrently with version 1.3.3
- On the road to the next validated version, 2.0
  - Two development snapshots: 2.0-beta1 (June 2009) and 2.0-beta2 (August 2009)
  - Validation of version 2.0-beta2
    - First stage (August-November): about 30 test cases, more than 200 simulations
    - Second stage (starting now): on a selected set of test cases known to have had issues in the first stage
  - Next version: 2.0-rc (release candidate)
    - Now consolidating fixes issued after the first validation stage
    - To be publicly released soon

Graphical interface
- Fully rewritten in PyQt 4, now without any regression compared to the 1.3 version
- Rewritten for better integration in SALOME
- Natively supported on GNU/Linux, Mac OS X and Windows systems
- Drag-and-drop feature for time-average and profile definitions
- Many new features:
  - Head-loss zone definition
  - Fluid-structure interaction for internal / external coupling
  - Lagrangian simulation setup
  - Oxy-combustion setup for coal combustion
  - Mathematical expression interpreter (see next slide)

Mathematical Expression Interpreter (MEI)
- A new low-level library, alongside BFT and FVM, dedicated to the interpretation of mathematical expressions
- Used both by the solver and by the interface (as a syntax checker)
- Mathematical expressions can be used in several places:
  - Fluid properties
  - Boundary condition setup
  - Fluid-structure interaction (internal coupling with spring modelling)
- Will be extended to other functionalities when needed
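As an illustration of what such an interpreter provides, here is a minimal Python sketch (hypothetical, not the actual MEI API or syntax): the expression is parsed once to validate its syntax, as the interface does, then evaluated against user-defined symbols, as the solver does.

```python
import ast
import math
import operator as op

# Operators allowed in expressions (a deliberately restricted set).
_OPS = {ast.Add: op.add, ast.Sub: op.sub, ast.Mult: op.mul,
        ast.Div: op.truediv, ast.Pow: op.pow, ast.USub: op.neg}

def check_syntax(expr):
    """Return True if the expression parses (the interface-side check)."""
    try:
        ast.parse(expr, mode="eval")
        return True
    except SyntaxError:
        return False

def evaluate(expr, symbols):
    """Evaluate the expression with user symbols (the solver-side use)."""
    def _eval(node):
        if isinstance(node, ast.Expression):
            return _eval(node.body)
        if isinstance(node, ast.Constant):
            return node.value
        if isinstance(node, ast.Name):
            return symbols[node.id]
        if isinstance(node, ast.BinOp):
            return _OPS[type(node.op)](_eval(node.left), _eval(node.right))
        if isinstance(node, ast.UnaryOp):
            return _OPS[type(node.op)](_eval(node.operand))
        if isinstance(node, ast.Call) and node.func.id == "exp":
            return math.exp(_eval(node.args[0]))
        raise ValueError("unsupported construct")
    return _eval(ast.parse(expr, mode="eval"))

# A viscosity law as it might be entered in the interface
# (mu0, T, T0 are hypothetical symbol names for this sketch):
mu = evaluate("mu0 * exp(-(T - T0) / 100.0)",
              {"mu0": 1.0e-3, "T": 293.15, "T0": 293.15})
```

The same expression string can thus be both checked in the GUI and evaluated at run time, which is the point of sharing one library between interface and solver.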

Integration in the SALOME platform
- The Code_Saturne graphical interface can be embedded in the SALOME platform
  - Compliant with SALOME 5.1.3 (soon to be released)
  - Will be publicly released in early 2010
  - Will also be available at the MFEE department: /home/salome/runsalome
- Extends the capabilities of the Code_Saturne interface
  - Boundary zone selection
  - Code_Saturne simulations can be launched and stopped from within SALOME
  - Virtual results and drafts directories for easier user file management
- Compared to the former Tcl/Tk interface, the 2.0 interface is far more responsive

One script to run them all
- The different user scripts have been unified into a single one: code_saturne <command> [options]
- Examples:
  - Documentation: code_saturne info --guide=user --reader=acroread
  - Case creation: code_saturne create --study=coupling --nsyr=2
  - Mesh verification: code_saturne check_mesh mesh.neu --join --color 98 99
  - Compilation test: code_saturne compile --test
  - GUI launch: code_saturne gui
- It is now possible to have different versions of Code_Saturne simultaneously available, e.g.:
  - alias cs='/path/to/saturne/bin/code_saturne'
  - alias csdev='/path/to/mysaturnedev/bin/code_saturne'
- Script completion is available for easier typing of a command
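The single-entry-point design above can be sketched with Python's argparse subcommands (illustrative only; the real code_saturne script and its options are those listed above):

```python
import argparse

# One program, several subcommands, each with its own long/short options
# (a sketch of the dispatch pattern, not the actual code_saturne script).
def build_parser():
    parser = argparse.ArgumentParser(prog="code_saturne")
    sub = parser.add_subparsers(dest="command", required=True)
    create = sub.add_parser("create", help="create a study or case")
    create.add_argument("--study", "-s")
    create.add_argument("--nsyr", type=int, default=0)
    sub.add_parser("gui", help="launch the graphical interface")
    return parser

# Equivalent of: code_saturne create --study=coupling --nsyr=2
args = build_parser().parse_args(["create", "--study", "coupling", "--nsyr", "2"])
```

One parser per subcommand is also what makes per-command shell completion straightforward to generate.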

Architectural changes
- Code_Saturne kernel-related changes
  - Switch to the Autotools for configuration and installation
    - Coherent with what was done for the other packages of the code
    - Far more standard in the Unix world than the previous homemade build process
    - Far easier to maintain, and to detect errors in the build process
  - Update of the Fortran source files to the Fortran 95 standard
    - The Fortran 77 standard showed limitations in terms of code complexity and maintenance
    - Mostly only changes from fixed-form to free-form files (a script will be provided to ease the conversion)
    - Fortran 95 features (dynamic memory allocation, modules, ...) to be used progressively
- Global changes
  - Prerequisite detection now emits a warning if an optional package is not found when detection is automatic, but exits with an error if the package has been explicitly requested by the user
  - Switch to the standard management of long/short options (e.g. --study / -s)
  - English translation of the study structure directories (lance becomes runcase, ...)
  - All executables begin with cs_, except the main script code_saturne

Preprocessing enhancements
- Addition of new readers
  - NOPO 64-bit file format from Simail (version 7)
  - CCM file format from STAR-CCM+
    - Requires linking with the libccmio library, available on request through the CD-adapco web site
  - A couple of fixes have also been issued for existing formats
- Preparation for larger mesh reading
  - A configure option has been added to handle long integers (> 2 billion entities): ./configure --enable-long-int
    - Useful for very large meshes on machines with more than 128 GB of memory
  - The same option (--enable-long-gnum) has been added for the FVM configuration
- Communication between the pre-processor and the solver
  - A single preprocessor_output file is now generated by the pre-processor
  - This file and the solver restart files share the same unified format; both can be inspected with the cs_io_dump tool
  - A partitioner, cs_partition, reads this file and generates domain_nxxxx indirection files for domain splitting
  - No need to regenerate the pre-processor information when changing the number of processors in a simulation
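The need for long integers can be seen directly: a global entity count above about 2.1 billion no longer fits in a signed 32-bit integer, while a 64-bit one holds it comfortably. A small check with Python's standard struct module (unrelated to the actual Code_Saturne sources):

```python
import struct

n_cells = 3_000_000_000  # a mesh size beyond the 32-bit signed range

# Packing into a 32-bit signed int ('i') fails for such a count...
try:
    struct.pack("i", n_cells)
    fits32 = True
except struct.error:
    fits32 = False

# ...while a 64-bit signed int ('q') stores it without trouble.
packed = struct.pack("q", n_cells)
```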

A new parallelized joining algorithm
- Another bottleneck is removed for very large meshes!
- Can be considered production-ready, but not yet used by default
  - Activated by a Fortran subroutine, usjoin.f90
  - Cannot handle periodicity at the moment (work in progress!)
- Faster than the previous algorithm, even on a single processor
- Same standard parameters as before; more advanced parameters are now available to solve corner cases

A couple of other HPC-related features
- MPI I/O features
  - Only used for reading and writing the preprocessor_output and restart files
  - Activated by the cs_solver option --mpi-io, if FVM is compiled with MPI I/O support
  - Mixed feedback, depending on the MPI distribution and the architecture
- Parallel partitioning
  - A space-filling curve algorithm has been implemented in the solver
  - Activated by the ALGDOM subroutine
  - Generally worse results than with sequential METIS partitioning, but satisfying nonetheless
- Porting to a number of current HPC facilities
  - In the framework of the PRACE project: BlueGene/L and /P, NEC SX-9, Cray XT, ...
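The space-filling curve idea can be sketched in a few lines of Python (a Morton / Z-order curve on integer cell coordinates; hypothetical helper names, not the ALGDOM implementation): cells are sorted along the curve and cut into equal contiguous chunks, which tends to keep each chunk spatially compact.

```python
def morton_key(i, j, k, bits=10):
    """Interleave the bits of integer cell coordinates (Z-order / Morton)."""
    key = 0
    for b in range(bits):
        key |= ((i >> b) & 1) << (3 * b)
        key |= ((j >> b) & 1) << (3 * b + 1)
        key |= ((k >> b) & 1) << (3 * b + 2)
    return key

def sfc_partition(cells, n_ranks):
    """Sort cells along the curve, then cut into contiguous balanced chunks."""
    order = sorted(cells, key=lambda c: morton_key(*c))
    n = len(order)
    return [order[r * n // n_ranks:(r + 1) * n // n_ranks]
            for r in range(n_ranks)]

# Toy example: a 4x4x1 structured mesh split across 4 ranks.
cells = [(i, j, 0) for i in range(4) for j in range(4)]
parts = sfc_partition(cells, 4)
```

No graph of the mesh is needed, which is why such partitioning parallelizes easily, at the cost of somewhat worse edge cuts than graph partitioners like METIS.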

Algebraic multigrid algorithm
- Used for the pressure solution, instead of the standard conjugate gradient
  - Stabilized since the 1.4 development version
  - Also used for other purely diffusive scalar variables
- Compatible with parallelism and periodicity
  - Periodicity of translation and/or rotation is supported
  - Scalable up to a large number of cells and/or processors
- May alleviate convergence issues on meshes of very poor quality
  - Smoother evolution of the CPU time per iteration than with the standard conjugate gradient algorithm
- Major improvement in elapsed CPU time
  - Up to 10x faster on the pressure solution
  - Up to 3-4x faster on the global elapsed time!
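The principle behind the multigrid speed-up can be sketched on a 1D Poisson problem (a pure-Python toy, unrelated to the actual Code_Saturne solver): a few Jacobi sweeps damp the high-frequency error, and a coarse-grid correction removes the smooth error that makes plain iterative solvers slow on fine meshes.

```python
# Two-grid V-cycle for -u'' = f on (0,1), u(0) = u(1) = 0,
# discretized as A u = f with A = tridiag(-1, 2, -1) / h^2.

def apply_A(u, h):
    n = len(u)
    return [(2*u[i] - (u[i-1] if i > 0 else 0.0)
             - (u[i+1] if i < n-1 else 0.0)) / h**2 for i in range(n)]

def residual(f, u, h):
    Au = apply_A(u, h)
    return [f[i] - Au[i] for i in range(len(u))]

def jacobi(u, f, h, sweeps=3, w=2.0/3.0):
    """Weighted Jacobi: damps the high-frequency error components."""
    for _ in range(sweeps):
        Au = apply_A(u, h)
        u = [u[i] + w * (f[i] - Au[i]) * h**2 / 2.0 for i in range(len(u))]
    return u

def thomas(f, h):
    """Direct tridiagonal solve, used here as the exact coarse-grid solver."""
    n = len(f)
    a, b, c = -1.0/h**2, 2.0/h**2, -1.0/h**2
    cp, dp = [0.0]*n, [0.0]*n
    cp[0], dp[0] = c/b, f[0]/b
    for i in range(1, n):
        m = b - a*cp[i-1]
        cp[i] = c/m
        dp[i] = (f[i] - a*dp[i-1]) / m
    x = [0.0]*n
    x[-1] = dp[-1]
    for i in range(n-2, -1, -1):
        x[i] = dp[i] - cp[i]*x[i+1]
    return x

def two_grid_cycle(u, f, h):
    u = jacobi(u, f, h)                                   # pre-smoothing
    r = residual(f, u, h)
    rc = [0.25*r[2*i] + 0.5*r[2*i+1] + 0.25*r[2*i+2]      # restrict residual
          for i in range((len(r) - 1) // 2)]
    ec = thomas(rc, 2*h)                                  # coarse-grid solve
    e = [0.0]*len(u)                                      # prolong correction
    for i, v in enumerate(ec):
        e[2*i+1] += v
        e[2*i] += 0.5*v
        e[2*i+2] += 0.5*v
    u = [u[i] + e[i] for i in range(len(u))]
    return jacobi(u, f, h)                                # post-smoothing

n = 31                       # fine grid: h = 1/32
h = 1.0 / (n + 1)
f = [1.0]*n
u = [0.0]*n
r0 = max(abs(v) for v in residual(f, u, h))
u = two_grid_cycle(u, f, h)
r1 = max(abs(v) for v in residual(f, u, h))
```

One cycle already cuts the residual sharply, and the cost per cycle stays proportional to the number of cells, which is the source of the near-grid-independent convergence that pays off on large pressure systems.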

Atmospheric modelling
- Porting of the main features of Mercure_Saturne
- Implemented as a specific physics in Code_Saturne, triggered by usppmo.f90
- Neutral atmosphere modelling
- Potential temperature formulation for neutral or non-neutral atmospheres
- A meteorological data file can be given via a METEO_DATA variable in the runcase

Combustion modelling
- Accounting for possible oxy-combustion in coal combustion
- Extension of the heavy fuel oil combustion
  - Possibly several initial droplet sizes
  - Not yet validated

Cooling tower modelling
- Based on the former code N3S-Aero
- Implemented as a specific physics in Code_Saturne, triggered by usppmo.f90
- Poppe and Merkel models available
- Post-processing of the exchange zones
- Important fixes since the 1.4 development version, but still needs to be polished
  - Parallel simulation not fully functional
  - User subroutines given as an example
  - Boundary condition setup can be difficult

Rotor / stator interaction modelling
- Main features
  - Dedicated to incompressible flows; no turbine modelling
  - Based on code/code coupling, thus a non-conservative method, but less than 1% mass loss in the different tests
  - Validated against Ubaldi's test case
- Two different approaches available
  - Frozen-rotor method (full domain only)
  - Unsteady method
- See the dedicated talk by B. Audebert

Code_Saturne coupling features
- Coupling with SYRTHES
  - Now handled with a user subroutine, ussyrc.f90 (or the interface)
  - Already prepared for the next version of SYRTHES
    - Version 4 of SYRTHES will be fully parallel; it is under testing by the SYRTHES development team
- Coupling with Code_Aster
  - For fluid-structure interaction simulations
  - Coupling through the YACS module of SALOME
  - Still in the final stage of integration in the standard version of Code_Aster; validation is in progress
- Coupling with Code_Saturne itself
  - Mainly used for rotor/stator interaction modelling
  - Can also be used for standard coupling

Caveats and deprecated features
- Known caveats
  - The particle-tracking algorithm in the Lagrangian module may show discrepancies on some architectures
    - Due to the internal handling of floating-point precision with some processor / compiler combinations
    - No such discrepancies on standard architectures: i386-like, x86_64 and IA-64 processors with the GCC or Intel compilers
    - For portability questions, contact the support
  - The cooling tower module may fail in a parallel run when an exchange zone has more than a single neighbour domain
- Deprecated features
  - Lagrangian modelling for coal combustion: not used for a long time, so probably broken
  - Matisse engineering module (EDF-specific): the solver code adaptation is still functional, but the graphical user interface is not

Perspectives

Fully parallelized tool-chain
- The Code_Saturne 2.0 tool-chain is nearly fully parallel: the last bottleneck for standard calculations is mesh concatenation
- The next version will be!
  - A development version for parallel mesh concatenation is already under testing; Daresbury Laboratory is helping to debug the new algorithm
  - Parallel domain partitioning: possible use of ParMETIS or PT-Scotch, and improvement of the space-filling curve algorithm
  - The Lagrangian particle-tracking algorithm is prepared for parallel runs, but needs to be tested thoroughly

Further developments
- Several development axes for the Navier-Stokes solver
  - Opportunity of a velocity-pressure coupled solver
  - Pseudo-compressible solver scheme for dilatable flows
  - Switch to a lower precision for the iterative linear solvers
    - Very good feedback on the different tests
    - Will probably increase EPSILO(IVAR) to 1.e-5 instead of 1.e-8
- Refactoring of the boundary condition management
  - For more robustness in the simulations and increased flexibility
- Treatment of uncertainties
  - Tests of plugging the OpenTURNS platform (open source) into Code_Saturne
  - If convenient, triggering from the Code_Saturne graphical interface
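The trade-off behind relaxing the linear-solver precision can be illustrated with a toy conjugate gradient counting iterations at two stopping thresholds (pure Python, hypothetical sketch; EPSILO(IVAR) is a per-variable setting in the actual code): a looser relative threshold can only be reached at the same iteration or earlier.

```python
# Conjugate gradient on the SPD system tridiag(-1, 2, -1) x = b,
# stopping when ||r|| <= tol * ||b||.
def matvec(x):
    n = len(x)
    return [2*x[i] - (x[i-1] if i > 0 else 0.0)
            - (x[i+1] if i < n-1 else 0.0) for i in range(n)]

def cg_iterations(b, tol):
    n = len(b)
    x = [0.0]*n
    r = b[:]
    p = r[:]
    rs = sum(v*v for v in r)
    b_norm = rs ** 0.5
    it = 0
    while rs ** 0.5 > tol * b_norm and it < 10*n:
        Ap = matvec(p)
        alpha = rs / sum(p[i]*Ap[i] for i in range(n))
        x = [x[i] + alpha*p[i] for i in range(n)]
        r = [r[i] - alpha*Ap[i] for i in range(n)]
        rs_new = sum(v*v for v in r)
        p = [r[i] + (rs_new/rs)*p[i] for i in range(n)]
        rs = rs_new
        it += 1
    return it

b = [1.0]*50
it_loose = cg_iterations(b, 1e-5)   # candidate relaxed tolerance
it_tight = cg_iterations(b, 1e-8)   # former tighter tolerance
```

Whether the looser tolerance is acceptable then depends on the outer time-stepping loop absorbing the extra linear-solve error, which is what the validation tests mentioned above assess.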

Further developments (cont'd)
- Physical modelling
  - Low-Reynolds second-order turbulence model (Elliptic Blending RSM, R. Manceau)
  - Thermal flux modelling
  - Turbulence improvements for rotation modelling
  - Synthetic turbulence generation for RANS/LES coupling or LES inlet setup
  - Ionic mobility
  - Specific enhancements for fire-driven flows
  - Pump modelling (conservative method)
  - Stabilization of the heavy-fuel oil combustion
  - Unification of coal and heavy-fuel oil combustion
- Architectural changes
  - Use of Fortran 95 specific features (dynamic memory allocation, modules, ...)
  - Unification of the different packages into a single one

Code_Saturne open source initiative

The long-awaited community features
- The Code_Saturne web site finally has its forum and bug-tracker!
  - https://www.code-saturne.info
- Registering is mandatory, for moderation and survey purposes; to register, please contact the support
- A big thank-you to our colleagues at SINETICS for providing us with the facilities, especially Christophe Mouton

Code_Saturne packaging and distribution
- A new Python installer is provided to the user
  - Automatic download of every prerequisite (except the optional libccmio library)
  - Automatic installation of every source package
  - Should be fully working on most standard systems
- The code is packaged in Debian
  - Work in progress (only beta2 is present as of now)
  - About 10 different architectures supported
- 1.3.3 is already available as a FreeBSD port
- Still no Windows version, but:
  - Code compilation is regularly tested on Cygwin and MinGW systems
  - The new Qt interface is directly portable to Windows
  - Every contribution would be welcome in this area ;-)

Code_Saturne documentation
- All the user subroutines have been translated into English
  - The low-level libraries (BFT, FVM and MEI) are fully in English
  - The preprocessor is written in French, and will remain so
  - Regarding the solver: the C source files are fully in English, and the comments in the Fortran source files will be translated as they are modified (work in progress!)
- Code_Saturne documentation is improving
  - Man pages have been added for all executables, e.g.: man cs_preprocess
  - A quick reference card is available: a double A4 format with the main commands, to be printed as a memento
  - The theory manual is still to be reorganized and translated
  - A developer's guide is to be written in early 2010

Thank you for listening! And a special thank-you for every contribution to Code_Saturne, from users and developers alike!