The PACS Software System (A high level overview). Prepared by: E. Wieprecht, J. Schreiber, U. Klaas. November 5, 2007. Issue 1.0. PICC-ME-DS-003


1. Introduction

The PCSS, the PACS ICC Software System, is the basic software system supporting PACS users and developers. This document provides a concise overview of the various capabilities of the versatile PACS Common Software System (PCSS) with regard to PACS engineering and scientific data visualization and reduction. It supports both PACS users and the software specialists developing the software. PACS users can be any astronomer, who does not necessarily have deep insight into the instrument behavior, but also instrument and calibration experts.

1 General Overview

PCSS contains PACS-specific software (e.g. PACS-specific I/A s/w, the PACS Simulator), the Herschel Common Science System (HCSS) and required additional software packages (e.g. JFreeChart, JSky) ready for installation. Development is currently under way to merge the ICC software packages into the HCSS package and to provide an "intelligent" installer. A basic design feature of the PCSS development is the provision of a platform allowing a seamless transition between all phases of the mission. The PCSS therefore supports the following functionalities:

- Ground Test Data Analysis (AVM, ILT, IST, ...)
- Instrument Calibration
- S/W Development environment
- Interactive Scientific Data Analysis
- Standard Product Generation
- Trend Analysis
- Quick Look Analysis
- Instrument Simulation
- Instrument performance and health checks
- Data Quality Checks

These functionalities are presented in individual overviews below.

2 Relation to HCSS

The Herschel Common Science System (HCSS) is being developed by the Herschel Science Center (HSC) and the Herschel Instrument Control Centers (ICCs) to provide the complete software system for

the Herschel Observatory mission. The intention is to provide a common system that is able to handle test data, observation planning, mission planning and instrument data from observations within one common development. An important element of this common development is Data Processing (DP). DP handles computed, stored or simulated data and has access to much of the software developed for other purposes within the HCSS (e.g. Quick Look Analysis, which runs on real-time data or on data streams replayed from files or even from the operational database). Branches of the HCSS have also been developed for handling Herschel instrument-specific tasks, so the software packages for HIFI, PACS and SPIRE also reside within the HCSS framework and are available within DP. A more detailed description of the HCSS system is given in xxxxxxxxxxxxxxx

3 Interactive Data Processing

In an interactive session, started via the user interface "jide", the HCSS-specific data formats can be used for convenient array manipulations and mathematical operations. An example is shown in Fig. 1.

Illustration 1: User Interface jide

Via the ProductAccessLayer (PAL) it is possible to query a database like a Product Pool, and to select and access the data files for an interactive session, see Fig. 2. Data Pools may reside locally or can be accessed remotely, with caching mechanisms if needed.
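The PAL access pattern described above can be illustrated with a minimal sketch. All class and method names here are invented for illustration; they are not the actual HCSS ProductAccessLayer API.

```python
# Toy sketch of a PAL-style product pool query (illustrative names only,
# not the actual HCSS ProductAccessLayer API).

class Product:
    def __init__(self, urn, meta, data):
        self.urn = urn      # unique resource name within the pool
        self.meta = meta    # metadata dictionary (e.g. instrument, obsid)
        self.data = data    # payload, e.g. a detector array

class ProductPool:
    """A toy local pool; a real pool may also be remote, with caching."""
    def __init__(self):
        self._store = {}

    def save(self, product):
        self._store[product.urn] = product

    def query(self, **criteria):
        # Select products whose metadata match all given criteria.
        return [p for p in self._store.values()
                if all(p.meta.get(k) == v for k, v in criteria.items())]

pool = ProductPool()
pool.save(Product("urn:pacs:frames:001", {"instrument": "PACS", "obsid": 42}, [1, 2, 3]))
pool.save(Product("urn:pacs:frames:002", {"instrument": "SPIRE", "obsid": 43}, [4, 5, 6]))

hits = pool.query(instrument="PACS")
print([p.urn for p in hits])   # only the PACS product matches
```

The point of the design is that an interactive session only deals with metadata queries and product objects; whether the pool is a local directory or a remote database is hidden behind the same interface.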

Illustration 2: Product Access Layer (PAL)

Tools supporting the visual inspection of data are also provided (PlotXY, Display).

Illustration 3: Plot: Spectrometer Ramps

Illustration 4: Image: Pipeline Result, Photometer PointSource Tests

From within jide it is even possible to generate the detector selection data used in the uplink and downlink systems:
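As an illustration of the idea only (the real PACS selection tables are instrument-specific and the function here is invented), generating such detector selection data could be sketched as:

```python
# Toy sketch of generating a detector selection mask (illustrative only;
# real PACS uplink/downlink selection tables are instrument-specific).

def detector_selection(n_rows, n_cols, selected):
    """Return a boolean matrix with True for pixels selected for downlink."""
    mask = [[False] * n_cols for _ in range(n_rows)]
    for r, c in selected:
        mask[r][c] = True
    return mask

sel = detector_selection(2, 3, [(0, 0), (1, 2)])
n_selected = sum(v for row in sel for v in row)
print(n_selected)  # 2 pixels selected
```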

A more detailed description is given in: xxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxxx

2. Test Data Analysis

This supports the handling and data reduction of representative instrument telemetry from the ground tests, including special telemetry of the ground test facility equipment, such as x-y-stage positions, external and internal blackbody temperatures and external chopper wheel positions. The ground test database has been/will be built up from AVM, ILT and IST data and will serve as a reference database for in-flight calibration. The first generation of calibration files for engineering calibration and data processing has been derived with this system. An extension of test data analysis to in-orbit data will form the basis of the instrument commissioning evaluation.

3. Instrument Calibration

Calibration analysis comprises specific aspects of the ground test data analysis as well as the evaluation of the in-flight Commissioning and Performance Verification calibration measurements, providing the baseline satellite and instrument calibrations for the mission and the calibration refinement during the Routine Phase. Calibration observations require more flexibility from the DP system, because some of them require non-standard instrument configurations and observing modes for engineering and instrument optimization. Also, in the early phases of the mission the Standard Product Generation is still evolving and being consolidated. The PCSS system has therefore been designed to be highly flexible in handling this kind of data. Calibration experts can contribute their own scripts in an interactive session and follow Calibration Procedures (CAPs), a mixture of guidelines and actual code, in order to process the data and generate a calibration product. Once a calibration product has been generated, the calibration framework is used to save it and attach it to a certain instrument configuration (e.g. instrument model, hardware configuration, time period).
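Attaching a calibration product to an instrument configuration and validity period can be sketched as follows. This is a minimal toy model, not the actual PCSS calibration framework; all names are invented.

```python
# Toy sketch of a calibration registry keyed by instrument configuration,
# with time-ordered validity periods (not the actual PCSS framework).

from bisect import bisect_right

class CalRegistry:
    """Keeps calibration products per key, ordered by start of validity."""
    def __init__(self):
        self._products = {}   # key -> sorted list of (start_time, product)

    def attach(self, key, start_time, product):
        entries = self._products.setdefault(key, [])
        entries.append((start_time, product))
        entries.sort(key=lambda e: e[0])

    def lookup(self, key, time):
        # Latest product whose validity starts at or before `time`.
        entries = self._products.get(key, [])
        i = bisect_right([t for t, _ in entries], time)
        if i == 0:
            raise LookupError("no calibration valid at this time")
        return entries[i - 1][1]

reg = CalRegistry()
reg.attach(("FM", "saturation_limits"), 100, "calprod_v1")
reg.attach(("FM", "saturation_limits"), 200, "calprod_v2")
print(reg.lookup(("FM", "saturation_limits"), 150))  # calprod_v1 is valid
```

The design choice sketched here is that a new calibration product never overwrites an old one; it simply opens a new validity period, so any past processing remains reproducible.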
The history mechanism of the PCSS framework supports the traceability of calibration data generation to a large extent. The PCSS framework permits convenient evaluation of Housekeeping data via command line and GUI interfaces (see also Trend Analysis). Users access raw and converted (to engineering units) Housekeeping data using MIB mnemonics. Tools are available to merge any Housekeeping data with scientific detector data. Science data are accessible via the specified PACS Products. Two data formats can be distinguished, namely raw and on-board processed data. Calibration experts can, for example, receive raw data via the so-called buffer transmission mode, while for the standard instrument modes with on-board data reduction, raw data are only provided for a small number of pixels in parallel. Therefore, for a verification of the on-board science data reduction software, the PCSS system is able to mimic it. This environment can be used to find and test new on-board reduction algorithms. Last but not least, for the generation of some calibration products raw data are favorable (e.g. saturation limits

of the read-out electronics).

4. S/W development environment

There is considerable synergy between calibration, test analysis and professional software development within the same framework. Algorithms can be prototyped and tested, and calibration products derived by instrument and calibration experts, and then transferred by the system experts into an overall design of a logical data processing chain, both interactive and automatic, including processing and data quality flags. Software solutions by the software specialists serve a wider community and need not be tailored to specific analysis tasks. Standardized data products on various levels enable the straightforward transfer to specific software packages for further analysis.

5. Interactive science data analysis

Interactive science data analysis allows the visual inspection of all data processing steps, which is an essential feature in consolidating a data processing system towards fully or semi-automatic processing. Specific software modules can be exchanged relatively easily, input parameters tuned and different algorithms intercompared.

6. Standard Product Generation (Pipeline)

Standard Product Generation (Pipeline) processing is the automatic or interactive processing of scientific data from raw telemetry up to reasonable scientific results. There is a Herschel-wide convention on the processing levels of the different instruments.

Raw Telemetry: All telemetry packets produced by the instrument in the course of an observation. In PACS IA this level is stored/manipulated as a PacketSequence.

Decompressed Science Data: This is an artificial level, since the data are not stored and not visible for a general user.
However, in an interactive step-by-step data analysis the product can be inspected.

Level 0 data: A complete set of data as a starting point for scientific data reduction. It is saved in a Level 0 Pool in the form of FITS files. After Level 0 data generation no connection to the database is possible any more; all relevant information, such as uplink information, therefore needs to be retrieved from the database beforehand. Level 0 contains the following data components: Science data are organized in user-friendly classes, namely the Frames class for on-board reduced data

and the Ramps or PhotRaw class for additional raw channel data. Auxiliary data are provided for the time period covered by the Level 0 data and comprise spacecraft pointing (attitude history), time correlation, and selected spacecraft housekeeping. This information is partly merged as status entries into the basic science classes or is available as pointing products. Decoded HK data are provided in the form of tables with converted and raw HK values. Associated observations containing calibration information or trend analysis results from the whole operational day, or even a longer period, are optional.

Illustration 6: PACS Frames class

Level 0.5 data: Processing up to this level is AOT independent, and therefore non-AOT engineering observations can also be processed up to this level. Additional information such as processing flags is added to the Frames class and basic unit conversions are applied. The data are saved in the Product Pool.

Level 1 data: This data generation is AOT dependent. The resulting product contains the basic astrophysical

flux calibration. For PACS photometry this is a data cube with flux densities and associated sky coordinates. It is the input for the actual image construction. The product format for photometer data will be the Frames or FramesStack class. The Level 1 data are saved in the Product Pool. The goal is that Level 1 product generation can be done automatically to a large extent.

Level 2 data: These data products can be used for scientific analysis. Processing to this level contains the actual image construction and is highly AOT dependent. Specific software may be plugged in. For optimal results, many of the processing steps along the route from Level 1 to Level 2 may require human interaction. Drivers are both the choice of the right processing parameters and the optimization of the processing for the scientific aims of the observation. The result is an Image product.

Level 3 data: These are publishable science products with Level 2 products as input. They are not restricted to data from the specific instrument, but can be combined with theoretical models, laboratory data, multi-wavelength data from other observations, and catalogues. Their formats should be VO compatible.

Various GUI tools support the intermediate and final Product inspection, e.g. the MaskViewer for inspecting Pixel Mask settings like Saturation, Radiation hits, Malfunction pixel, etc.

or the DatasetInspector, which gives a quick and convenient overview of Products and Datasets within a jide session.

Illustration 8: Dataset Inspector

Standard Product Generation is designed in a modular way, see Fig. 9. It can be run within or outside the Pipeline Framework, for a single observation or as bulk processing, producing e.g. first-look products. Developers, instrument engineers and scientists can run the pipeline stepwise and inspect the intermediate results. It is possible to vary the processing parameters for each pipeline step or to exchange the calibration data used in an application. Stepwise execution of the pipeline therefore gives transparent access to any telemetry data at any time. Even intermediate results can be saved in ProductPools or exported in the form of FITS files. A copy option permits keeping intermediate products from before and after a processing step in an interactive session, so that the result of an application can be compared with its input. The modular design of the Standard Product Generation supports users in modifying processing steps, recombining the order of some steps, or adding self-written scripts to the processing chain.
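The stepwise execution with a copy option can be sketched as a small toy pipeline. The step functions and data layout are invented for illustration; the real PACS pipeline steps operate on Frames products.

```python
# Toy sketch of a modular, stepwise pipeline with a "copy" option, so the
# input and output of each step can be compared (illustrative names only).

import copy

def flag_saturation(frames):
    frames["flags"] = ["SAT" if v >= 10 else "OK" for v in frames["signal"]]
    return frames

def subtract_offset(frames):
    frames["signal"] = [v - 1 for v in frames["signal"]]
    return frames

def run_stepwise(frames, steps, keep_copies=True):
    """Run pipeline steps one by one, optionally keeping intermediates."""
    intermediates = []
    for step in steps:
        if keep_copies:
            intermediates.append(copy.deepcopy(frames))  # state before this step
        frames = step(frames)
    return frames, intermediates

frames = {"signal": [3, 10, 7]}
result, saved = run_stepwise(frames, [flag_saturation, subtract_offset])
print(result["signal"])  # [2, 9, 6]: offsets subtracted after flagging
print(result["flags"])   # ['OK', 'SAT', 'OK']
```

Because each step is an ordinary function, steps can be reordered, swapped for user-written variants, or run one at a time with inspection in between, which is exactly the flexibility the modular design aims at.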

Illustration 9: Photometer Level 0.5 generation flow (all steps can be executed separately)

Certain Standard Product Generation steps (mostly up to product Level 0.5) are executed AOT independently, and it is therefore possible to use them also for engineering observations (non-AOT observations).

For astronomers, data reduction starts with the so-called Level 0 Products, which are generated in an automatic way. Especially during ILT and IST, but also in problematic cases during operations, it is possible to do the Level 0 product generation and telemetry inspection within an IA session. The detailed description of the PACS Pipeline is currently given in the PACS DP User Manual.

7. Trend Analysis

Trend analysis is used to investigate the temporal evolution of certain engineering parameters or detector data, or to correlate several parameters with each other and search for triggering events. The PACS Trend Analysis (PTA) is a widget-oriented pure Java program to carry out trend analysis on PACS Housekeeping and Science Data. It can work both on telemetry files and on database contents, it can import PacketSequences or Tables, and it offers display, modification, plot and merge capabilities.

Illustration 10: PACS Trend Analysis (PTA)

Additional tools allow querying the Database or Product Pools for all kinds of (also unforeseen) long-term trend analyses. Queries may be time consuming, but give high flexibility to react to unexpected analysis requirements. The pipeline framework will produce pre-defined trend information when processing all PACS data.
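Correlating two parameters and searching for triggering events, as described above, can be sketched schematically. The parameter names and limit value here are hypothetical.

```python
# Schematic trend-analysis sketch: correlate two housekeeping parameters
# sampled at the same times, and flag limit violations (hypothetical names).

def pearson(x, y):
    """Pearson correlation coefficient of two equal-length series."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    vx = sum((a - mx) ** 2 for a in x) ** 0.5
    vy = sum((b - my) ** 2 for b in y) ** 0.5
    return cov / (vx * vy)

# Two hypothetical HK parameters on a common time grid.
cooler_temp    = [1.0, 1.1, 1.3, 1.6, 2.0]
detector_drift = [0.1, 0.12, 0.15, 0.19, 0.24]

r = pearson(cooler_temp, detector_drift)
events = [i for i, t in enumerate(cooler_temp) if t > 1.5]  # limit violations
print(round(r, 3), events)
```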

Speed-optimized interfaces support user access to these data.

8. Quick Look Analysis (QLA)

PACS Quick Look Analysis (QLA) is a near-real-time application that reads, decompresses and shows data.
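The QLA pattern of consuming a packet stream in near real time can be sketched as a toy loop. zlib stands in for the actual on-board compression, and the function names are invented.

```python
# Toy sketch of a Quick Look Analysis loop: read packets from a stream,
# decompress them, and hand them to a display callback (illustrative only;
# zlib stands in for the actual on-board compression scheme).

import zlib

def make_packet(payload: bytes) -> bytes:
    return zlib.compress(payload)

def qla_loop(packet_stream, display):
    """Consume packets as they arrive; the same loop serves a live link,
    a replayed file, or packets fetched from a database."""
    for packet in packet_stream:
        science = zlib.decompress(packet)   # undo on-board compression
        display(science)

shown = []
packets = (make_packet(p) for p in [b"frame-1", b"frame-2"])
qla_loop(packets, shown.append)
print(shown)  # [b'frame-1', b'frame-2']
```

The key point is that the loop is agnostic about the packet source, which is what lets the same quick-look code run on real-time, replayed, and archived data.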

9. Instrument Simulator

The PCSS system contains the PACS Simulator, which produces simulated detector data from an input sky, adding known and modeled instrumental artifacts. The results can be read into an IA session and processed back with the available software, which makes it possible to verify the processing steps. Simulated data are also used to optimize an observing mode, in the sense that one can intercompare the processing results depending, e.g., on the frequency of internal calibrations, chopping and mechanism speeds, S/C raster step size and scanning speeds.

10. Instrument performance and health checks

This is a specific aspect of trend analysis and quick look analysis. Trend analysis of the engineering parameters of mechanical elements may indicate an upcoming degradation and allow countermeasures to be taken beforehand by changing the operational procedure. Health checks are performed, e.g., by watching for limit violations and event flags.

11. Data quality checks

Data quality checks can be performed by checking for the occurrence of certain pipeline processing flags or by visual inspection of the first-look data. A pipeline-generated QualityProduct and proper error computations within the pipeline support this task. In case of a strange data appearance, the data set can be investigated back to the raw data, if necessary, and any instrumental malfunction traced. If a quality check fails due to an instrumental, spacecraft or telemetry transmission failure, re-scheduling of the observation can be triggered.
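A flag-based quality check that triggers re-scheduling can be sketched as follows. The flag names are invented for illustration; they are not the actual fields of the PACS QualityProduct.

```python
# Sketch of a data-quality check on pipeline flags: if any flag marks an
# instrumental, spacecraft or transmission failure, the observation is a
# candidate for re-scheduling (flag names are illustrative only).

FAILURE_FLAGS = {"TM_GAP", "INSTRUMENT_ANOMALY", "POINTING_LOST"}

def needs_rescheduling(quality_product):
    """Return the set of failure flags found in a quality product."""
    return FAILURE_FLAGS & set(quality_product.get("flags", []))

good = {"obsid": 1, "flags": ["SATURATION_WARN"]}
bad  = {"obsid": 2, "flags": ["TM_GAP", "SATURATION_WARN"]}

print(needs_rescheduling(good))  # empty set: no re-scheduling needed
print(needs_rescheduling(bad))   # transmission gap triggers re-scheduling
```

Warnings that only affect individual pixels (like the saturation warning above) are deliberately excluded from the failure set; those are handled by the pixel masks during processing rather than by re-observing.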