Master's thesis (tesi di laurea specialistica)
Experimental analysis of tools for fault injection on systems
Academic Year 2009/2010
Advisor: Prof. Domenico Cotroneo
Co-advisors: Ing. Roberto Natella, Ing. Ricardo Barbosa
Candidate: Anna Lanzaro (Matr. 885/379)
Context
This work was promoted by the European project CRITICAL STEP. The aim of the project is the Transfer of Knowledge (ToK) between academia and industry, in order to develop new technologies and standards for safety-critical systems.
CINI -> Critical Software: ToK on SOFTWARE FAULT INJECTION
Software Fault Injection
Used for the emulation of software faults:
- Validation of fault-tolerance mechanisms
- Dependability benchmarking
It is adopted for the emulation of programming errors: it modifies parts of the code in order to put bugs in the application.
Injecting faults in the binary code is difficult, due to the gap between source and executable code. Fault injection in high-level code is more realistic and accurate; therefore, it can be seen as a useful technique for assessing Xception's accuracy.
Objectives
In this work, we consider Xception, a prototype Software Fault Injection tool to be adopted for the dependability evaluation of many safety-critical systems, NASA's among others.
Xception will be compared to the prototype tool SCIFI, developed by CINI, in order to evaluate its accuracy and to provide feedback for its improvement.
The study will be made in the context of a real-world case study from the space domain.
Which software faults should be injected?
A previous field data study found that software faults belong to a small set of fault operators. Xception and SCIFI are expected to inject the same faults.
Xception tool
Xception injects faults in the target application at the binary-code level, using a library of fault operators based on common programming errors. The changes correspond to the code that would be generated by the compiler if the faults were in the source code.
Fault injection process:
- Identification of specific low-level instruction patterns
- Generation of faulty versions of the target application
SCIFI tool
SCIFI injects faults in the target application at the source-code level; the changes correspond to real programming errors. It is based on the same library of fault operators as Xception.
Fault injection process:
- Identification of fault locations through the Abstract Syntax Tree
- Generation of patch files, each containing the code of an individual fault
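One such patch file might look like the following unified diff (a hypothetical sketch; the file name, function, and line numbers are invented, and the actual SCIFI patch layout is not shown in this work). Here an OMIA-like fault removes the if construct around a statement:

```diff
--- a/timer.c
+++ b/timer.c
@@ -10,7 +10,5 @@
 int safe_div(int a, int b) {
     int r = 0;
-    if (b != 0) {
-        r = a / b;
-    }
+    r = a / b;
     return r;
 }
```

Keeping one patch per fault lets each faulty version of the application be built independently from the same fault-free baseline.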
How to evaluate Xception's accuracy?
Faults are injected in the target application (application code + OS code) both in the binary code and in the source code.
Ideal case: for each fault injected in the executable, there exists a corresponding fault injected in the source code.
[Figure: Xception's injection in the binary (e.g. 0003ef4c <rtems_timer_cancel+0xd8> blr <_Timer_Get()>) vs. SCIFI's injection in the source (e.g. rtems_task Init(rtems_task_argument ignored) {...}) — what is the relationship?]
Problem statement
Xception may not correctly recognize all the bit patterns corresponding to bugs at the programmer's level.
- False positives: bit patterns not corresponding to constructs in the program in which a fault could exist
- False negatives: constructs in which a fault could exist that are not recognized in the executable file
Case study
A satellite data handling system, named Command and Data Management System (CDMS), manages all data transactions between the ground and a spacecraft's OBS. CDMS is composed of 6 subsystems, each one with a specific task. Faults are injected in both the application and the OS code.
[Build flow: CDMS code + RTEMS code -> CDMS object code + RTEMS static libraries (.a) -> case study binary (OBS + RTEMS)]
Experimental methodology 1/2
1. Setup of the case study: generation of the fault-free application
2. Generation of faults: fault operators are applied using both Xception and SCIFI; information is collected about the operator and the location of each fault in the code (file, function, line of code)
3. Analysis of generated faults: comparison of the injected faults
   - False positives (injected only by Xception)
   - False negatives (injected only by SCIFI)
   - Correct faults (injected by both tools)
4. Inspection and validation of results: analysis of a sample of faults (5%) and collection of statistics based on the obtained results
SCIFI Tool vs Xception
[Bar chart: number of faults injected per operator by the SCIFI Tool and by Xception]
Some operators exhibit significant differences in the number of injected faults. We noticed that 22% of Xception's faults were incorrectly generated, and they were removed from the analysis.
Comparing the results 1/2
[Pie charts: breakdown of the faults generated by Xception and by the SCIFI Tool into Common, Macro/Inline, Bugs in SCIFI, and Other Errors]
- Common: correct faults injected by both Xception and SCIFI
- Macro/Inline: FPs and FNs due to C macros or inline functions
- Other Errors: FPs and FNs not due to macros or inline functions
- Bugs in SCIFI: not real FPs/FNs, but noise in the analysis due to bugs in SCIFI
FP = Bugs in SCIFI + Macro/Inline + Other Errors
FN = Bugs in SCIFI + Macro/Inline + Other Errors
Comparing the results 2/2
OMIA: Operator for Missing If construct Around statements
OWPFV: Operator for Wrong variable used in Parameter of Function call
[Pie charts: per-operator breakdown (OMIA and OWPFV) of the faults generated by Xception and by the SCIFI Tool into Inline, Macro, Bugs in SCIFI, Other Errors, and Common]
Analysis of FPs and FNs
Evaluating Xception's accuracy:
Errors related to macros (21%): when a macro or inline function is faulty, the faulty code is replicated several times in the binary code (e.g. the instructions of <_Timer_Get()> appear at each expansion site, such as 0003ef4c <rtems_timer_cancel+0xd8> blr <_Timer_Get()>). Xception injects a fault only in one copy of the macro/inline function at a time.
Errors related to Xception's bugs (5%). Some types of Xception errors:
- OMIEB/OMIFS/OMIA: Xception does not distinguish between switch-case constructs and if constructs
- OWPFV: Xception wrongly injects faults in some if conditions
Conclusions
From the analysis results: False Positives: 52%; False Negatives: 60%. Investigating FPs and FNs can help to improve Xception (FP: 18%; FN: 37%).
Limitations of the analysis:
- The SCIFI Tool contains some bugs
- A comparison based on code locations can fail in some cases
Future developments:
- Further improvements of the measures
- Improvement of the tools based on the obtained results