NinJo Project Status, June 2004

[Photo: the NinJo team at the last team meeting]

Overview
- User input
- Layers
- Diagram-based applications
- Servers
- Batch
- Hardware and software benchmarks
- NinJo hardware and Java VMs
- Features and versions
- Training and roll-out

User Input
- A typical software development cycle consists of writing and reviewing requirement specs (and design documents, of course).
- A user meeting is held after each major NinJo release: 0.7 (December 2003), 0.8 (May 2004), 0.9 (September 2004).
- The results of each evaluation meeting are integrated into the release planning.
- User input mainly concerns usability and features.

User Input: Enhancing the Usability
- We tried to integrate most of the user requests, e.g.: Navigator, fast access to use cases ("mygui"), favorites, spinner buttons (e.g. grid layer: model / model run, level), spacing, undo/redo.

User Input: Configuration
- In NinJo nearly everything is configurable: the application itself, the workspace, the scenes, the attributes.
- Not all parameters of the evaluation versions were properly configured.
- We had to establish a working group that takes care of all configuration issues and provides color tables, plot models, and isoline settings.
Layers: Radar Layer
- MSC has created its first layer!
- Currently only WebStart-able.
- Remember the client architecture: layers plug into the layer container. But everything will be redesigned to allow for a generic layer; a workshop on the generic layer will follow shortly.

Layers: Radar, SCIT Layer
- Display of storm cells and their attributes.

Layers: Satellite Layer
- Displays all satellites; the user need not care about projection and satellite position.
- MSG integrated; color composites available.
- Work on reintroducing mosaics and polar-orbiting satellites is in progress.

Layers: Streamline Layer
- Switzerland has created its first layer too: streamlines!
- Algorithms are based on the work of B. Jobard, although completely rewritten in Java.

Diagrams: Meteograms
- The meteogram is the first application to use the highly flexible diagram framework.
- It will be able to display observations and NWP data.
- Benny Koza will give a talk on the diagram framework.
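The layer-container idea above can be sketched roughly as follows. This is a minimal illustration, assuming a plugin-style design; the names (`Layer`, `LayerContainer`, `paint`) are assumptions for this sketch, not the actual NinJo API.

```java
// Minimal sketch of a layer-container architecture as described above.
// All names are illustrative assumptions, not the actual NinJo API.
import java.util.ArrayList;
import java.util.List;

interface Layer {
    String getName();
    // A real implementation would paint into a Graphics2D; a StringBuilder
    // stands in here so the sketch stays self-contained.
    void paint(StringBuilder canvas);
}

class LayerContainer {
    private final List<Layer> layers = new ArrayList<Layer>();

    void addLayer(Layer layer) {
        layers.add(layer);
    }

    // Paint all layers bottom-up, as a scene stack would.
    String render() {
        StringBuilder canvas = new StringBuilder();
        for (Layer layer : layers) {
            layer.paint(canvas);
        }
        return canvas.toString();
    }
}

public class LayerDemo {
    public static void main(String[] args) {
        LayerContainer scene = new LayerContainer();
        scene.addLayer(new Layer() {
            public String getName() { return "Satellite"; }
            public void paint(StringBuilder c) { c.append("[Satellite]"); }
        });
        scene.addLayer(new Layer() {
            public String getName() { return "Radar"; }
            public void paint(StringBuilder c) { c.append("[Radar]"); }
        });
        System.out.println(scene.render()); // prints "[Satellite][Radar]"
    }
}
```

The point is only the structure: new layers (radar, satellite, streamlines) plug into one container; a generic layer would add common configuration and data-access hooks to the shared interface.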
Diagrams: Aerological Diagrams
- Integrates many data types: both observations (TEMPs and surface data) and NWP data.
- Runs in a secondary window.

Diagrams: Cross-Sections
- A first prototype shows that the diagram framework can be adapted successfully.
- NWP: based on model or pressure levels; radar: based on volume scans.
- Currently NWP-only.
- Runs in a secondary window.

Layers: GRIB Enhancements
- Transparency
- Intelligent weather depiction
- New arrows
- Field cropping

Application: Instant 3D
- First prototype: automated mapping of 2D parameters into a 3D scene.
- Makes use of Unified-GOF (2D scene graphs can be reused).

Batch Processing
- Image and vector products can be created: jpg, png, tiff, ... and PDF, Flash, PS, including animations.
- Flexible legends with an HTML-style language; sophisticated layouts and a NinJo scheduler to come (March 2004).
- SMS-based scheduling.
- Basis for application serving (NinJo 1.0+).

Applications: AP 2003
- Supporting the weather forecast process: warning and monitoring, nowcasting, data modification.
- Point-data based; consolidated forecasts.
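As a rough illustration of how a batch process can produce raster products without any GUI, the sketch below renders into an off-screen image and encodes it as PNG via `javax.imageio`. The drawing itself is a placeholder, not NinJo's actual rendering code; only the off-screen/ImageIO pattern is the point.

```java
// Hedged sketch: off-screen rendering of a raster product, as a batch
// process creating jpg/png/... files might do.
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.image.BufferedImage;
import java.io.File;
import java.io.IOException;
import javax.imageio.ImageIO;

public class BatchProductDemo {

    // Render a placeholder "product" into an off-screen image.
    static BufferedImage renderProduct(int width, int height) {
        BufferedImage img =
            new BufferedImage(width, height, BufferedImage.TYPE_INT_RGB);
        Graphics2D g = img.createGraphics();
        g.setColor(Color.WHITE);
        g.fillRect(0, 0, width, height);             // background
        g.setColor(Color.BLUE);
        g.drawRect(10, 10, width - 20, height - 20); // placeholder content
        g.dispose();
        return img;
    }

    public static void main(String[] args) throws IOException {
        BufferedImage img = renderProduct(640, 480);
        File out = new File("product.png");
        ImageIO.write(img, "png", out); // "jpg" works the same way
        System.out.println("wrote " + out.getName());
    }
}
```

Because the rendering is entirely off-screen, this style of code can run headless (`-Djava.awt.headless=true`), which matters for server-side batch production.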
NinJo Hardware Benchmarks: Goals
- Provide benchmarks on which the decision on the future NinJo client hardware can be based, and find the most suitable benchmark for the NinJo client hardware invitation to tender.
- Compare the performance of the NinJo client software on different platforms: Windows (Intel Xeon), Windows (AMD Opteron), Mac OS X (G5), and Linux (Intel Xeon).
- Optimize the performance of the Linux servers.
- Make sure that the planned amount of data can be handled by the NinJo system.

Server Benchmark Hardware
- Fujitsu Siemens Primergy RX300, SuSE Linux 9
- 2 x Intel Xeon 3.06 GHz, 4 GB memory
- 2 system disks (RAID 1), 36 GB @ 15K each
- 4 data disks, 73 GB @ 15K each
- Adaptec RAID controller U320, 2-channel (2i/2e), 128 MB
- Sun or IBM JDK

Server Benchmark Results
[Chart: data throughput during import (MB/s) under different RAID levels (RAID 0, 1, 5, 10) and VMs, with 1, 2, or 4 parallel imports; roughly 3.5 to 21 MB/s depending on configuration.]
[Chart, logarithmic scale: NinJo Linux server versus Sun 480; import and client-server data-retrieval times for lat/lon, metadata, sounding, time-series, and volume data; RAID 10 with 4 import threads (Sun and IBM VMs) versus SUN MCH with 2 import threads.]

Client Benchmark Hardware
- Fujitsu-Siemens Celsius R610: 2 x Xeon 3.2 GHz (1 MB cache), 2 GB RAM, 2 x 73 GB U320 SCSI disks @ 15K (RAID 0), NVIDIA FX 1000 (128 MB); Windows XP or SuSE Linux 9.0
- Apple G5: 2 x G5 @ 2 GHz, 2 GB RAM, 1 x 160 GB SATA @ 7.2K, ATI Radeon 9800 (128 MB); Mac OS X 10.3
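The import benchmarks above vary the number of parallel import threads (1, 2, or 4). A minimal sketch of that pattern with `java.util.concurrent` follows; the task body (`importFile`) is a stand-in for the real import work, not NinJo code.

```java
// Hedged sketch: running N import tasks on a fixed-size thread pool, as in
// the "1 / 2 / 4 import threads" benchmark runs.
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.Future;

public class ParallelImportDemo {

    // Placeholder for importing one file; returns the bytes "imported".
    static long importFile(int sizeMb) {
        return sizeMb * 1024L * 1024L;
    }

    // Run all imports on a pool of the given size and sum the imported bytes.
    static long runImports(int threads, int[] fileSizesMb) throws Exception {
        ExecutorService pool = Executors.newFixedThreadPool(threads);
        try {
            List<Future<Long>> results = new ArrayList<Future<Long>>();
            for (final int size : fileSizesMb) {
                results.add(pool.submit(() -> importFile(size)));
            }
            long total = 0;
            for (Future<Long> f : results) {
                total += f.get(); // waits for the task to finish
            }
            return total;
        } finally {
            pool.shutdown();
        }
    }

    public static void main(String[] args) throws Exception {
        long bytes = runImports(4, new int[] {10, 20, 30, 40});
        System.out.println("imported " + (bytes >> 20) + " MB"); // 100 MB
    }
}
```

Varying the `threads` argument while keeping the task list fixed is exactly what the benchmark runs do; with disk-bound work the sweet spot depends on the RAID configuration, as the throughput chart shows.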
- IBM IntelliStation A Pro: 2 x Opteron 248 @ 2.2 GHz, 3 GB RAM, 1 x 36 GB SCSI system disk, 1 SATA data disk @ 7.2K, NVIDIA FX 1100 (128 MB); Windows XP

Client Benchmarks
[Chart, logarithmic scale: times in ms for "Favorites" and "New favorites with graphics" on Opteron/Windows XP, Xeon/Windows XP, Xeon/Linux (Sun and IBM VMs), and G5/Mac OS X, comparing Java 1.4.x and Java 1.5 beta; roughly 8,000 to 143,000 ms depending on platform and VM.]
Layer-Based Client Benchmarks
[Chart: application performance per layer; times in ms for GeoRaster, GeoVector, point data, satellite, and grid/isoline layers on Opteron/XP, Xeon/XP, Xeon/Linux (Sun VM), Xeon/Linux (IBM VM), and Mac G5; roughly 170 to 7,100 ms.]

Milestones
[Gantt chart, 2000-2004 by quarters: requirement specs; design phase (architecture and frameworks); prototype 1; framework training; implementation; versions 0.2 through 0.9; version 1.0; integration.]

Key Features: NinJo 1.0 (late 2004 / early 2005)
- Animation (usable GUI and handling)
- AutoMon (warning and monitoring, AP 2003)
- Batch (to be integrated in SMS, without sophisticated legends)
- Capture (data snapshots for training and reports)
- Cross-sections (using the diagram framework)
- Graphical editor (first version, without data editing)
- SCIT / KONRAD
- Lightning
- Meteogram
- Radar (perhaps without cell view)
- Satellite (incl. SAF and polar-orbiting satellites)
- Soundings

Key Features 2: NinJo 1.1 (summer 2005)
- Archive (RDBMS access)
- Batch (automated legends)
- Formulas
- On-screen analysis (field modification)
- Point forecast editing (AP 2003: MMO)
- Layers based on the base point-data layer:
  - Road weather layer (SWIS)
  - Storm warning layer (Sturmwarn)
  - MOS layer (GMOS, TAF guidance)
  - Warning layer (AP 2003: EPM)
- ... and further requested functionalities in all layers

Key Features 3: NinJo 1.2 (late 2005)
- 3D (Instant 3D)
- OOG (AP 2003: consolidated PTP)
- Replay
- Trajectories
- On-screen analysis (field modification, if requested)
- ... and further requested functionalities in all layers

Training and Roll-Out
- Training at [...] and MetInfoBw.
- Training the trainers (July).
- Forecaster training: early September to mid-October.
- System administrator training: late January / early February 2005.
- The first clients will be installed for the forecaster training.
- The second batch of machines, including servers, arrives at the end of January 2005; installation during February 2005.
- Parallel operation with MAP (the legacy workstation system) until NinJo 1.1.