AUTOMATION OF THE DATA MANAGEMENT PROCESS WITHIN THE FIELD OPERATIONAL TEST EUROFOT

Dipl.-Ing. Mohamed Benmimoun, Institut für Kraftfahrzeuge, RWTH Aachen University (IKA), mbenmimoun@ika.rwth-aachen.de
Dipl.-Ing. Felix Fahrenkrog, Institut für Kraftfahrzeuge, RWTH Aachen University (IKA), fahrenkrog@ika.rwth-aachen.de
Dipl.-Ing. Jörg Küfen, Institut für Kraftfahrzeuge, RWTH Aachen University (IKA), kuefen@ika.rwth-aachen.de
Mimoun El Ghaouty, Institut für Kraftfahrzeuge, RWTH Aachen University (IKA), ghaouty@ika.rwth-aachen.de
Univ.-Prof. Dr.-Ing. Lutz Eckstein, Institut für Kraftfahrzeuge, RWTH Aachen University (IKA), eckstein@ika.rwth-aachen.de

ABSTRACT

A Field Operational Test (FOT) is a study undertaken to evaluate a function, or combination of functions, under normal operating conditions in environments typically encountered by the host vehicle using quasi-experimental methods [1]. It is essentially a large-scale, quasi-experimental, on-road evaluation of one or more driver support functions. Field operational tests make it possible to evaluate the impact of system functions in their intended environment, by their intended users, on a scale and with a duration large enough for statistically valid conclusions to be drawn.

The eurofot project is the first large-scale field operational test of multiple advanced driver assistance systems (ADAS) conducted in Europe. The project aims to investigate the impact of ADAS and to encourage their deployment by providing valuable information on the short- and long-term impact of ADAS. The FOT establishes a comprehensive technical and socio-economic assessment for evaluating the impact of ADAS on safety, environment and user acceptance in real-life situations. Altogether, about 1000 vehicles (cars and trucks) equipped with eight different ADAS technologies will take part in the field operational test. The FOT is coordinated by five vehicle management centers (VMC) and carried out at various test sites across six European countries. The project duration is 46 months and it will end in February 2012.

Within the field operational test a large amount of data is collected, processed and analyzed. For all these processes a reliable and well-structured data management strategy is necessary in order to guarantee a traceable workflow and consistent results. Furthermore, the data management processes have been automated in order to realize a performance- and time-efficient process. In this paper the approach for the automation of the data management process is introduced.

Topics: Field operational test; Data management; Automation; Remote data acquisition

INTRODUCTION

At the German1-VMC a fleet of 200 vehicles is managed, consisting of 60 trucks and 140 passenger cars from different manufacturers. The vehicles are equipped with different ADAS functions, as shown in Figure 1.

Figure 1. Tested ADAS and fleet composition at the German1-VMC

The tested ADAS functions cover Adaptive Cruise Control (ACC), Lane Departure Warning (LDW), Forward Collision Warning (FCW) and Curve Speed Warning (CSW). All 200 vehicles are equipped with a data acquisition system (DAS), which enables the recording and temporary storage of all relevant values as well as the transfer of the recorded data to a central storage server. The estimated amount of data at the German1-VMC adds up to approximately 6 TB for one year of field test operation.

DATA ACQUISITION

For the measurements in the field the German1-VMC has equipped the test vehicles with data acquisition systems (DAS). These DAS record measurements from up to four CAN busses of the vehicles and, additionally, GPS information. Other signal sources on the vehicle side are not used at the German1-VMC, in order to ease the integration of the DAS compared to other scenarios such as the integration of additional sensor equipment (e.g., video sensors). All measurements collected by the DAS installed in the vehicles are recorded on a per-trip basis: recording starts as soon as the vehicle's engine is started and is completed as soon as the engine is switched off.
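
The per-trip recording scheme can be pictured with the following minimal Python sketch. It is purely illustrative; the callables engine_running, read_can_and_gps and write_frame are hypothetical placeholders for the DAS firmware interfaces, which are not described in this paper.

import time
from datetime import datetime

def record_trip(engine_running, read_can_and_gps, write_frame):
    # engine_running()   -> bool : True while the ignition is on (hypothetical)
    # read_can_and_gps() -> dict : current samples from up to four CAN busses plus GPS (hypothetical)
    # write_frame(...)           : persists one sample frame to the FLASH storage (hypothetical)
    trip_id = datetime.utcnow().strftime("%Y%m%d_%H%M%S")
    while engine_running():
        write_frame(trip_id, time.time(), read_can_and_gps())
        time.sleep(0.1)          # 10 Hz base cycle; 1 Hz status signals would be sub-sampled
    return trip_id               # recording is completed once the engine is switched off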

At the German1-VMC only data from the vehicle's CAN busses as well as GPS is collected. The signals to be collected from the CAN busses have been identified on the basis of the research questions of the project and the pre-defined hypotheses to be tested. Afterwards the availability of these signals for the different vehicle manufacturers has been checked, in order to avoid that relevant data is missing for the analysis at the end. This step is important due to the fact that vehicles of three different car manufacturers are used at the German1-VMC and, depending on the vehicle type, the available signals vary. In addition to the availability of signals it has to be ensured that the requirements of the analysis regarding sampling frequency as well as accuracy are fulfilled.

For the German1-VMC the decision was made to record data continuously with defined sampling rates instead of recording only specific events. The advantage of this approach is that it avoids the loss of potentially relevant data due to ill-suited event recognition (e.g. wrong thresholds); the detection cannot be adapted at a later stage if relevant raw data has not been collected. Therefore the filtering and identification of interesting data and events is performed offline after the data has been uploaded onto a server. One advantage of this approach is that improvements and optimizations can be made during or after the FOT without loss of data. Hence evolutions of algorithms do not affect the consistency of the processed data. As a consequence of this approach, a huge amount of data needs to be uploaded and processed within a recursive data analysis process through multiple iterations. This requires a well-structured and coordinated data management and processing strategy.

The data recorded by the DAS is limited to 120 signals. Of course, the signals can only be stored as long as they are available on the CAN bus. Signal dynamics have been taken into account to ensure that the recorded data is valuable for the testing of the hypotheses and to avoid invalid conclusions due to signal misinterpretations. As a result, the vehicle-side measurements are composed of around 40% signals with a sampling frequency of 10 Hz (mostly driving and engine dynamics) and 60% with a sampling frequency of 1 Hz (status information). Overall, around 550 signal samples are generated per vehicle and second.
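
The figure of around 550 samples per vehicle and second follows directly from this signal split; the short calculation below is merely a consistency check of the numbers given above, not project code.

TOTAL_SIGNALS = 120
fast = round(0.4 * TOTAL_SIGNALS)    # about 48 dynamic signals sampled at 10 Hz
slow = TOTAL_SIGNALS - fast          # about 72 status signals sampled at 1 Hz

samples_per_second = fast * 10 + slow * 1
print(samples_per_second)            # 552, i.e. roughly 550 samples per vehicle and second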

Data upload

In the first stage, the data measured on the connected CAN channels is stored on a FLASH storage device installed in the DAS. Furthermore, the DAS at the German1-VMC offers the possibility to communicate with the device during the operational time of the field test using an integrated GPRS module. This offers the opportunity to realize two functionalities, monitoring and data upload, which are of importance for the execution of the FOT, and ensures completely automated operation without user interaction throughout the entire operation time. Autonomous operation means that no user interaction is required on either the driver side or the operator side as long as no malfunctions of the DAS and the managed processes are detected. At the same time the drivers are completely kept out of the data retrieval loop.

Using the GPRS connection of the DAS, the data from the field is continuously transferred throughout the entire duration of the FOT. In addition, the operating conditions and states of each DAS are monitored continuously. By this means malfunctions, possible problems with cabling, hardware defects and loss of data can be detected at short notice. The continuous transfer of data from the DAS to a centralized server system makes it possible to start analyzing data at an early stage of the FOT and to test the validity and quality of the data from the beginning.

The upload procedures are designed to work in parallel with the recording of data during vehicle operation. They cover the continuous transfer of data as well as status information for all DAS to a centralized server system using the GPRS connection of the DAS. With around 1.2 GB per month and vehicle, the measured vehicle data (e.g., vehicle speed, acceleration) represents the largest amount of data to be transferred [2].

The possible loss of the GPRS connection needs to be considered, as the vehicles move between different cells of the mobile communications network or drive abroad. For this reason an incremental transfer strategy was chosen for the data transmission. A fixed follow-up time ensures that gaps in GPRS connectivity can be compensated after a trip is completed. The approach for data retrieval at the German1-VMC is presented in the figure below.

Figure 2. Data retrieval process at the German1-VMC

The data retrieval process at the German1-VMC is based upon three pillars. The first is data collection from the vehicle's CAN busses in the real environment; the second relates to external data sources, such as geo-information from digital maps or weather information. The last pillar is realized by a set of algorithms to classify (i.e., event and situational-variable recognition) and pre-analyze all data acquired within the first two pillars (e.g., calculation of the performance indicators needed for testing the hypotheses) [3].
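
The incremental transfer strategy with a fixed follow-up time mentioned above can be sketched as follows; the chunk size, the follow-up time of 300 s and the callables read_chunk and send_chunk are assumptions for illustration only and do not reflect the actual DAS implementation.

import time

def upload_trip(read_chunk, send_chunk, trip_size, follow_up_s=300, chunk_size=64 * 1024):
    # read_chunk(offset, size) -> bytes : reads recorded data from the FLASH storage (hypothetical)
    # send_chunk(offset, data) -> bool  : True if the server acknowledged the chunk (hypothetical)
    # The upload resumes from the last acknowledged offset, so gaps in GPRS coverage only
    # delay the transfer; the follow-up window after trip completion compensates for them.
    offset = 0
    deadline = time.time() + follow_up_s       # assumed to be called once the trip is completed
    while offset < trip_size and time.time() < deadline:
        data = read_chunk(offset, min(chunk_size, trip_size - offset))
        if send_chunk(offset, data):
            offset += len(data)                # advance only on acknowledgement
        else:
            time.sleep(10)                     # connectivity gap: retry without losing progress
    return offset == trip_size                 # True if the trip has been transferred completely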

For the processing of all data, a variety of different configuration and parameter sets needs to be taken into account. To ensure that all requirements for the data processing are fulfilled, a strict and predefined processing flow needs to be followed, and thus the coordination of several subtasks needs to be considered. The traceability of the coordinated processes as well as the coordination itself are thereby two of the focus points which need to be ensured by a centralized data management.

DATA MANAGEMENT

Data management at the German1-VMC is designed to handle all data processing related tasks within a centralized platform. This covers the subtasks of uploading and retrieving data, data conversion operations, execution of the data pre-processing, data storage along with trace and meta-information, and finally support for the data analysis. Besides the automated data upload, the succeeding processes have been automated in order to ensure a fully autonomous data retrieval and pre-processing operation. The required tasks and data formats have been harmonized. They have been defined in such a manner that a software tool chain consisting of several modular components can ensure well-structured and coordinated processing, especially during the evolution of processing algorithms in later iterations, while guaranteeing extensibility in the future. The software framework designed for the German1-VMC is presented in the figure below.

Figure 3. Software architecture used at the German1-VMC (main components: Upload Processor, Conversion Manager, Plausibility and Check Manager, Event, Enrichment, and PI Manager, Data Manager, Diagnostic Processor, Configuration Manager, Fleet Manager GUI and administration interface, Central Management System, algorithm pool with extensions, SQL database and central data storage)
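
A strongly simplified sketch of how such a modular tool chain could be coordinated is given below; the function names and the data passed between them are illustrative stand-ins for the actual framework components shown in Figure 3, not the real implementation.

def conversion_manager(raw):          # convert uploaded DAS data to the standardized file format
    return {"converted": raw}

def plausibility_check(data):         # flag implausible or inconsistent measurements
    data["plausible"] = True
    return data

def enrichment_and_pi(data):          # event classification, enrichment, performance indicators
    data["events"] = []
    return data

def store_in_sql(data):               # persist standardized data and intermediary results
    print("stored:", sorted(data.keys()))

# The Central Management System passes the data to the next module
# each time one processing step has been accomplished.
PIPELINE = [conversion_manager, plausibility_check, enrichment_and_pi]

def cms_process(uploaded_trip):
    data = uploaded_trip
    for step in PIPELINE:
        data = step(data)
    store_in_sql(data)

cms_process({"trip": "example_trip.dat"})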

The architecture is based on a modularized approach. The main functions cover data retrieval, processing and storage in an SQL-based database as well as the management and monitoring of all DAS, of data consistency and of the processing workflow. The main data processing algorithms are not part of the framework itself, but are embedded as extensions. These extensions can be dynamically defined and selected at any time. Using versioning and other meta-information, the system ensures consistent data pools and results during the data analysis phase.

During the normal workflow, data is passed between these software components each time one process step is accomplished. The coordination of the interaction between the software components is performed by the Central Management System (CMS). After data has been successfully uploaded to the server, the CMS is responsible for passing the data to the Data Manager, which subsequently manages the data processing. The processing covers the conversion of the data to the standardized file format mentioned above, the check and plausibility analysis as well as the enrichment and classification processes using the algorithm pools. The processed data is afterwards stored on an SQL-based server, in order to make the standardized data and the processed intermediary results available for the analysis.

Complementary services such as data exchange, the management and versioning of data and configurations, the logging of executed operations and status reports, and the distribution of computation work are needed to complete the software infrastructure of the framework. Moreover, a graphical user interface makes the system manageable by an operator. Therefore additional services have been implemented and integrated into the software architecture. Besides a graphical user and administration interface, processes for diagnostic as well as configuration purposes have been implemented and are used by all software components of the framework.

By means of the Graphical User Interface (GUI), the Fleet Manager, all integrated software modules of the framework can be accessed. Starting from the global CMS operation state and workflow down to the individual operation and state history of each DAS, all information can be visualized. Besides an overview of all managed DAS on the server side and their current operation state, which is presented in the lower part of the Fleet Manager, additional processes of the framework have been integrated and can be directly accessed via the GUI. By selecting one DAS, detailed state information can be displayed, e.g. the current firmware version on the DAS, the content and state of the FLASH storage, the current GPS position, etc.

Each DAS needs to be registered at the server before it starts uploading data. As soon as the Upload Processor receives an incoming message from a new DAS, the CMS compares the new IP address with the pre-defined IP addresses stored by the SIM-Card Manager. The SIM-Card Manager contains all available IP addresses of the used SIM cards and the corresponding vehicle manufacturer taking part in the field operational test. After the check has been completed, the CMS automatically escalates an email notification to the system administrator that a new DAS has tried to connect to the server. No data will be uploaded by the new DAS to the server as long as the administrator has not activated the corresponding processes for data exchange in the Fleet Manager. The GUI is presented in the figure below.
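
The registration check for newly connecting DAS described above amounts to a whitelist comparison; the IP addresses, manufacturer names and the notify_admin callable in the sketch below are purely illustrative and not taken from the eurofot configuration.

KNOWN_SIM_IPS = {                # maintained by the SIM-Card Manager: IP address -> manufacturer
    "10.12.0.17": "OEM A",       # example entries, not real addresses
    "10.12.0.42": "OEM B",
}
ACTIVATED = set()                # DAS activated by the administrator in the Fleet Manager

def handle_incoming(ip, notify_admin):
    # Decide whether an incoming DAS may exchange data with the server.
    if ip not in KNOWN_SIM_IPS:
        notify_admin("Unknown device tried to connect from " + ip)
        return False
    if ip not in ACTIVATED:
        notify_admin("New DAS of " + KNOWN_SIM_IPS[ip] + " (" + ip + ") awaits activation")
        return False             # no data upload until the administrator activates the processes
    return True

handle_incoming("10.12.0.17", notify_admin=print)   # first contact of a known but inactive DAS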

Figure 4. Graphical user interface of the "Fleet Manager"

The operations of the Upload Processor and the passing of data to the processing operations are performed through the framework's components, especially the Upload Processor and the Data Manager, in conjunction with a global coordinating instance called the CMS Core. To keep track of these data hand-overs, the processing operations and the reasons leading to decisions inside the server modules, a diagnostic process supporting the infrastructure modules has been implemented and integrated into the software architecture. All diagnostic data can be shared between the software components to trace previously occurred errors in the processing chain. The logged diagnostic data can be accessed through the Diagnostic Explorer, which is part of the Fleet Manager. The GUI of the Diagnostic Explorer is shown in Figure 5.

The diagnostic data is structurally grouped into multiple endpoints, sessions and processes, which are pre-classified by the server modules to guide the operator's attention. The diagnostic information is logged for each DAS managed by the Fleet Manager. As soon as a DAS communicates with the server, a session is created and all relevant diagnostic information is logged. The session is closed when the communication has been interrupted for a defined time period. Each of these sessions can be accessed by means of the Diagnostic Explorer. The sessions include information on the operations performed during the communication phase, e.g. synchronization of data, updates of software versions, registration status, etc. Furthermore, diagnostic information on the executed MATLAB algorithms for data processing as well as status information on the Upload Processor is logged.
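
The session-based grouping of diagnostic information can be illustrated as follows; the timeout value and the class interface are assumptions made for this sketch and do not mirror the internal data model of the Diagnostic Processor.

import time
from collections import defaultdict

SESSION_TIMEOUT_S = 120                      # assumed silence period after which a session closes

class DiagnosticLog:
    # Group diagnostic entries by DAS endpoint and communication session.
    def __init__(self):
        self.sessions = defaultdict(list)    # (das_id, session_no) -> list of log entries
        self.last_seen = {}                  # das_id -> (session_no, timestamp of last entry)

    def log(self, das_id, process, message, severity="info"):
        session_no, seen = self.last_seen.get(das_id, (0, 0.0))
        now = time.time()
        if now - seen > SESSION_TIMEOUT_S:   # communication was silent too long: new session
            session_no += 1
        self.last_seen[das_id] = (session_no, now)
        self.sessions[(das_id, session_no)].append((now, process, severity, message))

log = DiagnosticLog()
log.log("DAS-007", "UploadProcessor", "synchronization of data started")
log.log("DAS-007", "DataManager", "MATLAB event classification finished")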

Figure 5. Diagnostic Explorer for observing the activities of all connected DAS

The total number of operations, and thereby of sessions and processes logged by the diagnostic services, is quite large. Therefore, besides the logging and reporting of detailed diagnostic information, the framework offers the possibility to directly escalate certain issues by email once they have been detected by means of the logged diagnostic information. Errors detected during a session are logged and automatically escalated by the issue management to the system administrator. Issues that are directly escalated cover fleet maintenance needs derived from the DAS state history as well as unhandled or suspicious operations within the data management and data processing. The diagnostic and status information of the DAS is not only presented on the GUI, but also automatically processed by the framework in order to identify possible issues. An example of an email notification created by the software framework is shown in Figure 6.

Figure 6. Email notification by the eurofot server system

In this example the framework has identified, by means of the status information provided by a DAS, a missing or unmounted SD card, because no information on the content of the FLASH storage device is available.
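
The automatic escalation of detected issues by email could in essence be implemented as shown below; the addresses, the SMTP host and the example text are placeholders and do not correspond to the real eurofot server setup.

import smtplib
from email.message import EmailMessage

def escalate_issue(das_id, issue_text,
                   admin="admin@example.org",           # placeholder recipient
                   sender="eurofot-server@example.org",  # placeholder sender
                   smtp_host="localhost"):               # placeholder mail server
    msg = EmailMessage()
    msg["Subject"] = "[eurofot] issue detected for " + das_id
    msg["From"] = sender
    msg["To"] = admin
    msg.set_content(issue_text)
    with smtplib.SMTP(smtp_host) as server:
        server.send_message(msg)

# Example call for the case shown in Figure 6 (requires a reachable SMTP server):
# escalate_issue("DAS-012", "No FLASH storage content reported: SD card may be missing or unmounted.")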

Furthermore, a Configuration Manager, the SIM-Card Manager, an experimental design overview (showing when a defined phase of the FOT has started for each DAS), an Account Manager as well as the issue management are integrated into the Fleet Manager. An overview of the Configuration Manager as well as the issue management, together with the registered modules and the processor load of the server, is presented in the figure below.

Figure 7. Overview of additional processes accessed by means of the Fleet Manager

Besides providing information from the connected DAS and the operating processes, the data management needs to handle the planning of all managed processes and to coordinate their execution. These processes are presented in the following.

PROCESS EXECUTION

A large number of technically different vehicle configurations needs to be taken into account at the German1-VMC, which creates the need for different configuration and algorithm sets. Additionally, aspects like data integrity, data safety and privacy need to be ensured. Therefore a manual approach to data retrieval, decoding, analysis and storage is not feasible and entails too many risks. In particular, the fact that the investigation of research questions can already start prior to the end of the FOT leads to the difficulty that previously executed algorithms and parameter sets may evolve.

Process planning is related to the execution order and scheduling of the pre-processing functions. With the increasing amount of data and the high complexity of the pre-processing chain (including the dependencies between the algorithms), this planning becomes highly complicated. Thus, strategies like proper automated planning of the execution order and parallelization of independent execution trees are essential. This is especially important if the algorithms used for the analysis evolve during the FOT. In this case the consistency of all processed data still needs to be guaranteed, meaning that already processed data needs to be re-processed with maintainable effort. An entire re-calculation of 6 TB of data and an expected number of more than 100,000 trips each time one algorithm evolves is not feasible. Therefore it is necessary to focus only on the relevant or affected data sets. This is realized by using a petri-net graph with versioning information for algorithms and processed data.

These challenges lead to the need for an intelligent and automatable strategy to ensure data consistency throughout the entire data management process. Within the eurofot approach, an encapsulation of all processing steps on the server side was therefore implemented. This includes the management and handling of different versions and configurations of algorithms and parameters. The server-side modules need to ensure scalability and data exchange between the several processing stages. The figure below gives an overview of how process planning is performed using the software framework developed at ika.

Figure 8. Process for execution planning and data flow

The realized concept achieves a decoupling of algorithm implementation and execution. All algorithms are encapsulated with additional meta-information. This meta-information covers the required input signals of the algorithm as well as the generated output signals, versioning information for the detection of evolution, and the author, comments and annotation of the functionality, which can be used by the process supervisors. The algorithm together with this meta-data is committed to the CMS software framework and embedded as an extension in the algorithm pool of the framework. The processing of data and the encapsulation of algorithm and parameter sets is one important piece of groundwork for the automation process. The eurofot solution follows a scalable and extendable software framework approach to ensure easy adaptation to other FOT scenarios.

To start the execution chain as soon as new data is available, the CMS performs a dependency analysis based on the meta-information from the algorithm pool by building a petri-net of the processes using the signal information. In this way unresolved dependencies (e.g. missing input signals) can be identified and the order of execution determined. The execution of the processing is performed by distributing the functions together with the required FOT data to a computation cluster. The management of modifications (e.g. the integration of additional algorithms or the evolution of already existing ones) is handled autonomously by means of the petri-net approach: as soon as new processes are registered or modified, the CMS automatically performs re-processing of all affected functions in order to keep the data consistent.
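
How the declared input and output signals can be used to derive an execution order and to determine the algorithms affected by a modification is illustrated by the following simplified sketch. The real framework uses a petri-net with full versioning information; here a plain dependency graph with invented algorithm and signal names stands in for it.

from graphlib import TopologicalSorter            # Python 3.9+

RAW_SIGNALS = {"speed", "acceleration", "gps"}    # signals recorded by the DAS (examples)

ALGORITHMS = {                                    # meta-information committed with each extension
    "event_detection": {"inputs": {"speed", "acceleration"}, "outputs": {"events"}, "version": 2},
    "map_enrichment":  {"inputs": {"gps"}, "outputs": {"road_type"}, "version": 1},
    "pi_calculation":  {"inputs": {"events", "speed"}, "outputs": {"indicators"}, "version": 1},
}

def execution_order(algorithms):
    # Order the algorithms so that every input signal is produced before it is consumed.
    produced_by = {sig: name for name, a in algorithms.items() for sig in a["outputs"]}
    graph = {}
    for name, a in algorithms.items():
        deps = set()
        for sig in a["inputs"]:
            if sig in produced_by:
                deps.add(produced_by[sig])
            elif sig not in RAW_SIGNALS:
                raise ValueError("unresolved dependency: " + name + " needs " + sig)
        graph[name] = deps
    return list(TopologicalSorter(graph).static_order())

def affected_by(changed, algorithms):
    # All algorithms whose results must be re-processed when 'changed' evolves.
    affected = {changed}
    for name in execution_order(algorithms):
        produced = set().union(*(algorithms[a]["outputs"] for a in affected))
        if algorithms[name]["inputs"] & produced:
            affected.add(name)
    return affected

print(execution_order(ALGORITHMS))                 # event_detection and map_enrichment before pi_calculation
print(affected_by("event_detection", ALGORITHMS))  # event_detection plus everything downstream of it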

With the presented approach, a fully automated data upload and processing chain is achieved.

SUMMARY

In this paper the approach for the management of data collected within a field operational test has been presented. The data management is realized by a set of processes which ensure that data is collected, processed for analysis purposes and stored on an SQL server for the final analysis. All these processes need to be coordinated and their execution order determined. Due to the large amount of data that will be collected within the field test (up to 6 TB), the decision was made to realize the entire data management and processing by means of an automated software framework. Moreover, the automation of all processes ensures the needed consistency of the output of the implemented processes.

The software framework consists of several software modules. Each software module is responsible for a certain task within data collection and data processing. The process chain is coordinated by the Central Management System (CMS). The CMS passes the uploaded data between the different processes until it has been stored on the SQL server. All needed diagnostic information is logged by a diagnostic processor in order to guarantee traceability and management of all executed processes. A GUI has been implemented which provides information on all managed DAS as well as access to all relevant processes. The managed DAS exchange status information with the server side so that the operation state of each DAS can be monitored during its operational time. By means of the status and diagnostic information the CMS draws conclusions on the state of the DAS and the executed processes. As soon as an error is detected, the issue management escalates an email notification to the system administrator, so that a reaction is possible at short notice. The implemented software framework thus provides the software infrastructure needed to realize the automation of the entire process chain.

REFERENCES
1. FESTA Consortium (2008). FESTA Handbook, FESTA Deliverable 6.4, May 2008.
2. Benmimoun, M., Küfen, J., Benmimoun, A. (2009). eurofot: Optimised data retrieval process for a large-scale field test: manageable by automation?, Transport Research Arena 2010, Brussels.
3. Benmimoun, M., Fahrenkrog, F., Benmimoun, A. (2010). Automatisierte Situationserkennung zur Bewertung des Potentials von Fahrerassistenzsystemen im Rahmen des Feldversuchs eurofot, VDI/VW-Gemeinschaftstagung Fahrerassistenz und Integrierte Sicherheit 2010, Wolfsburg.