
D6.1 GEEWHEZ Test plan

Coordinator: Zoomarine Italia S.p.A. (Claudio Di Capua)
Research for the benefit of specific groups
Project Start Date: 1st October 2011
Duration: 24 months
Grant Agreement - 25 August 2011
Version: 1.0
Project co-funded by the European Commission within the Seventh Framework Programme (2007-2013)
Dissemination Level: Public

Document information

Title: GEEWHEZ Test plan
Workpackage: 6
Responsible: UC3M
Due date: Project month 04 (January 2012)
Type: Report
Status: Version 1.0
Dissemination: Public
Authors: Mario Muñoz Organero, Marco Hernaiz Cao, Claudia Brito Pacheco, Marco Vettorello, Juan Rosell Ortega
Project URL: www.geewhez.eu

Table of Contents:

1 Executive Summary
2 Introduction
  2.1 Objectives
  2.2 Scope
    2.2.1 Testing Techniques
  2.3 Outside the Scope
3 Outline of Planned Tests
  3.1 Unit Testing
  3.2 Integration Testing
  3.3 System Testing
    3.3.1 User Interface Testing
    3.3.2 Security Testing
    3.3.3 Performance Testing
    3.3.4 Recovery Testing
  3.4 Acceptance Testing
  3.5 Regression Testing
4 Test Plan Criteria
  4.1 Pass/Fail Criteria
  4.2 Suspension Criteria and Resumption Requirements
    4.2.1 Suspension Criteria
    4.2.2 Resumption Requirements
5 Test Deliverables
6 Environmental Needs
  6.1 Hardware
  6.2 Software
  6.3 Tools
7 Responsibilities
8 Schedule
9 Risks and Contingencies
10 References
11 Acronyms

List of Figures:

Figure 1: Black-Box Testing
Figure 2 Testing document production phases
Figure 3 Test Plan Schedule

List of Tables:

Table 1 Comparison between White- and Black-Box Testing
Table 2 Levels of Software Testing
Table 3 Hardware required
Table 4 Software required
Table 5 Tools required
Table 6 Responsibilities

1 Executive Summary

This deliverable captures the GEEWHEZ project Test Plan, which describes the scope, approach, resources, and schedule of the testing activities. The document also identifies the test plan deliverables, the participants responsible for implementing each task, and the risks associated with the plan. This document (D6.1, delivered at month 4 of the project) is intended to serve as a framework document for the consideration of these issues. During the development of the testing activities it shall be used in conjunction with the following documents:

- Considered scenarios
- Analysis of scenarios and extraction of functional and non-functional requirements
- System Architecture Specification

2 Introduction

2.1 Objectives

This Test Plan collects all the information necessary to plan and control the testing activities to be performed for the GEEWHEZ system. To achieve this objective, the document addresses the following issues:

1. Outline the testing approach that will be used; in other words, provide a methodology stating what the team involved in the test activities should verify and the types of tests they will perform.
2. List the resulting deliverables of the testing activities.
3. Identify both human and non-human resources required to carry out the Test Plan.
4. Provide a timeline with milestones for the testing phase.

The testing activities described in this Test Plan are intended to:

1. Ensure that the GEEWHEZ system meets the specifications and design criteria defined in the documents Analysis of scenarios and extraction of functional and non-functional requirements and System Architecture Specification.
2. Ensure that the GEEWHEZ system is stable and bug-free and that the risk of software/hardware failure is reduced to a minimum.

In a nutshell, these tasks aim to verify the proper operation of the GEEWHEZ platform and its modules.

2.2 Scope

This document provides a test plan describing the testing activities to be performed to verify the accuracy, reliability and completeness of the GEEWHEZ system. The test plan consists of unit, integration, system, acceptance and regression testing. The testing techniques to be applied include white- and black-box testing.

2.2.1 Testing Techniques

Software testing is one of the verification and validation (V&V) software practices. Verification (the first V) asks the question "Are we building the system right?", whereas validation (the second V) asks "Are we building the right system?". Answering these questions requires two complementary testing techniques: white- and black-box testing. The main difference between the two techniques is the tester's view of the system.

Generally speaking, white-box testing is a verification technique that takes into account the internal mechanism of a system or component [1]. Its main objective is to verify the internal workings of the system, specifically the logic and the structure of the code. Software engineers can use it to examine whether their code works as expected. White-box testing is also known as structural testing, clear box testing, and glass box testing.

Black-box testing (also called functional testing or behavioral testing) is a validation technique that ignores the internal mechanism of a system or component and focuses solely on the outputs generated in response to selected inputs and execution conditions [1]. The goal of this type of testing is to check how well the system conforms to its specifications. The software tester does not (or should not) have access to the source code itself. Black-box testing attempts to find errors in the external behavior of the code in the following categories [2]: (1) incorrect or missing functionality; (2) interface errors; (3) errors in data structures used by interfaces; (4) behavior or performance errors; and (5) initialization and termination errors.

Figure 1: Black-Box Testing

As mentioned above, it is best if the person who plans and executes black-box tests is not the programmer of the code and knows nothing about its structure. The programmers of the code are innately biased and are likely to test that the program does what they programmed it to do. The following table summarizes the main differences between the two testing techniques:

                              White-Box Testing                   Black-Box Testing
Tester visibility             Code structure                      System's inputs/outputs
A failed test case reveals    A problem (fault)                   A symptom of a problem (failure)
Controlled?                   Yes; it helps to identify the       No; it can be hard to find the
                              specific lines of code involved.    cause of the failure.

Table 1 Comparison between White- and Black-Box Testing

To make the distinction between fault and failure clearer, consider their definitions: a fault is an incorrect step, process, or data definition in a program [1], whereas a failure is the inability of a system or component to perform its required function within the specified performance requirement [1].
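To make the contrast concrete, the following minimal Java sketch applies both viewpoints to the same hypothetical method. The class, method and fee values are invented for illustration only and are not part of the GEEWHEZ code base; TestNG (listed in Section 6.3) is used for the assertions.

```java
import org.testng.annotations.Test;
import static org.testng.Assert.assertEquals;

// Hypothetical unit under test, invented for illustration only.
class TicketPricer {
    /** Spec: children under 12 pay half the base entrance fee. */
    double price(int age, double baseFee) {
        if (age < 12) {               // internal branch, visible to a white-box tester
            return baseFee / 2;
        }
        return baseFee;
    }
}

public class TicketPricerTest {

    // White-box view: the tester reads the code, sees the age < 12 branch
    // and deliberately exercises both sides of it, including the boundary.
    @Test
    public void coversBothSidesOfTheAgeBranch() {
        TicketPricer pricer = new TicketPricer();
        assertEquals(pricer.price(11, 10.0), 5.0, 1e-9);  // branch taken
        assertEquals(pricer.price(12, 10.0), 10.0, 1e-9); // boundary: branch not taken
    }

    // Black-box view: the tester derives inputs only from the written spec
    // ("children under 12 pay half price"), without reading the implementation.
    @Test
    public void adultPaysTheFullFee() {
        assertEquals(new TicketPricer().price(30, 10.0), 10.0, 1e-9);
    }
}
```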

2.3 Outside the Scope

Tests omitted from this Test Plan include installation and job stream testing. These tests should be done during the deployment phases at each ATP.

3 Outline of Planned Tests

The participants involved in the testing activities will use the system documentation to prepare all test case specifications. This approach will also verify the accuracy and comprehensiveness of the documentation in the areas covered by the tests. The partners in charge of the research and development of the GEEWHEZ modules are responsible for performing the tests described below: unit, integration, system, acceptance and regression testing.

For each test to be performed on the GEEWHEZ system, four aspects are specified: the testing technique (white-box or black-box testing), the specification (the actual code structure, the low/high-level design or the system requirements), the scale (whether the tester examines a small piece of code or the whole system and its environment), and the tester (the software developer, an independent tester or the customer).

3.1 Unit Testing

Unit testing tests individual hardware or software components and their functions in isolation. It is important for ensuring that a component is solid before it is integrated with other components.

Testing Technique: White-box testing
Specification: Code structure and/or low-level design

This low-level form of testing consists of white-box testing. Simple unit faults may have to be found later by black-box testing if adequate white-box testing is not done. Using white-box testing techniques, testers (usually the developers creating the code) verify that the code does what it is intended to do at a very low structural level. The tester performs this task by writing test code (contained in a test case, which in turn belongs to a test suite) that calls a method with certain parameters and checks that the return value is as expected (see the TestNG sketch after Section 3.2).

3.2 Integration Testing

Integration testing is a type of testing in which software components, hardware components, or both are combined and tested to confirm that they interact according to their requirements [1]. Integration testing can continue progressively until the entire system has been integrated.

Testing Technique: White- and black-box testing
Specification: Low- and high-level design

Integration testing allows all the individually tested units to be tested together as a whole. Using both white- and black-box testing techniques, the tester (still usually the software developer) verifies that units work together when they are integrated into a larger code base. To plan these integration test cases, testers look at low- and high-level design documents.
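The following minimal sketch illustrates the unit-testing approach described in Section 3.1, using TestNG (the framework listed in Section 6.3). The water-parameter checker and its thresholds are hypothetical stand-ins, not the real GEEWHEZ module code.

```java
import org.testng.annotations.BeforeMethod;
import org.testng.annotations.Test;
import static org.testng.Assert.assertFalse;
import static org.testng.Assert.assertTrue;

public class WaterParameterCheckerTest {

    // Hypothetical unit under test: flags water pH values outside a safe range.
    static class WaterParameterChecker {
        boolean isPhSafe(double ph) {
            return ph >= 6.5 && ph <= 8.5;
        }
    }

    private WaterParameterChecker checker;

    @BeforeMethod
    public void setUp() {
        checker = new WaterParameterChecker(); // fresh fixture for each test case
    }

    // Each @Test method is a test case; TestNG collects them into a suite.
    @Test
    public void acceptsPhInsideSafeRange() {
        assertTrue(checker.isPhSafe(7.2));
    }

    @Test
    public void rejectsPhOutsideSafeRange() {
        assertFalse(checker.isPhSafe(9.1));
        assertFalse(checker.isPhSafe(5.0));
    }

    @Test
    public void acceptsBoundaryValues() {
        assertTrue(checker.isPhSafe(6.5));
        assertTrue(checker.isPhSafe(8.5));
    }
}
```

TestNG's automatically generated report for such test cases can feed the Test Reports described in Section 5.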

3.3 System Testing

System testing is conducted on a complete, integrated system to evaluate the system's compliance with its specified requirements in representative environments [1]. Because system testing is done with a full system implementation and environment, several classes of testing can be performed that examine non-functional properties of the system. This Test Plan includes the following:

- User Interface Testing
- Security Testing
- Performance Testing
- Recovery Testing

Testing Technique: Black-box testing
Specification: High-level design

3.3.1 User Interface Testing

The purpose of user interface testing is to verify that the system's GUI meets its written specifications. The GUI will be tested by comparing the user interface requirements specified in the document Analysis of scenarios and extraction of functional and non-functional requirements with the actual implementation of the GEEWHEZ system. These requirements may cover user interface issues such as aesthetics, validation, navigation, usability and data integrity conditions.

3.3.2 Security Testing

Security testing is performed to determine that the system protects data, so that there is no information leakage, and that it maintains its functionality as intended. It covers the following properties:

- Authentication: allow a receiver to have confidence that the information it receives originated from a specific known source.
- Authorization: determine that a requester is allowed to receive a service or perform an operation.
- Confidentiality: protect data or information from disclosure to parties other than the intended ones.
- Integrity: check that the information or data received by the intended receiver has not been altered in transmission.
- Non-repudiation: interchange of authentication information with some form of provable time stamp.

The system's or component's security will be evaluated against the security requirements specified in the document Analysis of scenarios and extraction of functional and non-functional requirements. These requirements may cover security issues such as passwords and permissions, and whatever login or authentication method is in use. For instance, in order to verify permissions requirements, the tester will verify that a user who is not logged in as an administrator cannot carry out administrative functions (a hedged sketch of such a check follows this subsection).

Although some basic security tests are usually included in system testing, system testing does not focus on items such as how to obtain administrative privileges outside of the user login. Although security is often presented as merely an aspect of system testing, it really needs to be considered and planned separately. It may take place alongside system testing, but it has almost the opposite focus.
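As an illustration of the permissions check above, the sketch below uses REST Assured 1.x (Section 6.3) driven by TestNG. The /admin/users endpoint, the test credentials, the base URI and the expected HTTP status codes are assumptions made for the example; the actual GEEWHEZ administrative interface may differ.

```java
import com.jayway.restassured.RestAssured;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
import static com.jayway.restassured.RestAssured.expect;
import static com.jayway.restassured.RestAssured.given;

public class AdminPermissionsTest {

    @BeforeClass
    public void configure() {
        RestAssured.baseURI = "http://localhost:8080"; // hypothetical test deployment
    }

    // An anonymous (not logged in) caller must not reach administrative functions.
    @Test
    public void anonymousUserCannotListUsers() {
        expect().statusCode(401)               // assumed: server demands authentication
                .when().get("/admin/users");   // hypothetical admin endpoint
    }

    // A regular, non-administrator account must be refused as well.
    @Test
    public void regularUserCannotListUsers() {
        given().auth().basic("visitor", "visitorPass")  // hypothetical test account
               .expect().statusCode(403)                // assumed: forbidden for non-admins
               .when().get("/admin/users");
    }
}
```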

3.3.3 Performance Testing

Performance testing verifies that a system or component performs according to customer expectations in terms of response time, availability, portability, and scalability. The system's or component's performance will be evaluated against the performance requirements specified in the document Analysis of scenarios and extraction of functional and non-functional requirements.

3.3.4 Recovery Testing

The purpose of recovery testing is to check how fast the system can restart after any type of crash or hardware failure. Recovery testing is also done to ensure that system backup and recovery facilities operate as designed.

3.4 Acceptance Testing

This formal testing is conducted to validate the system's compliance with all its requirements (functional and non-functional) in the customer's environment. These requirements are specified in the document Analysis of scenarios and extraction of functional and non-functional requirements. It also ensures appropriate acceptance of the system by the users (the SMEs owning the GEEWHEZ modules).

Testing Technique: Black-box testing
Specification: Requirements specification

Once the entire system has been fully tested, it is ready to be delivered to the SMEs. The SMEs will be responsible for writing black-box acceptance tests based on their expectations of the functionality, with the assistance of the test team. The test team will run these tests before attempting to deliver the system.

3.5 Regression Testing

Regression testing is selective retesting of a system or component to verify that modifications have not caused unintended effects and that the system or component still complies with its specified requirements [1]. It is usually done to ensure that changes applied to the system have not adversely affected previously tested functionality.

Testing Technique: White- and black-box testing
Specification: Changed documentation (requirements and/or design specification) and high-level design

Since regression tests are run throughout the development cycle, there can be white-box regression tests at the unit and integration levels and black-box tests at the integration, system and acceptance test levels. It is assumed that several iterations of the regression test will be done in order to test system modifications made during the system test period. A regression test will be performed for each new version of the system to detect unexpected impact resulting from system modifications. The following guidelines should be used when choosing a set of regression tests, also referred to as the regression test suite (a sketch of how such a suite can be selected follows the table below):

- Choose a representative sample of tests that exercise all the existing software functions.
- Choose tests that focus on the software components/functions that have been changed.
- Choose additional test cases that focus on the software functions that are most likely to be affected by the change.

The following table summarizes the five levels of testing included in this Test Plan:

Testing Type   Opacity                 Specification                               General Scope                                 Tester
Unit           White-Box               Actual Code Structure, Low-Level Design     Small unit of code no larger than a class     Programmer who wrote the code
Integration    White-Box / Black-Box   Low-Level Design, High-Level Design         Multiple classes                              Programmer(s) who wrote the code
System         Black-Box               Requirements Analysis                       Whole system in representative environments   Independent tester
Acceptance     Black-Box               Requirements Analysis                       Whole system in customer's environment        Customer
Regression     White-Box / Black-Box   Changed Documentation, High-Level Design    Any of the above                              Programmer(s) or independent testers

Table 2 Levels of Software Testing
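One way to realise such a regression test suite is with TestNG groups (TestNG is listed in Section 6.3): test methods are tagged, and a run can then select only the tagged tests. The group names and the placeholder checks below are illustrative conventions, not project-mandated ones.

```java
import org.testng.annotations.Test;
import static org.testng.Assert.assertTrue;

public class LoginRegressionTest {

    // Representative test exercising existing functionality; tagged so a
    // regression run can pick it up together with all other "regression" tests.
    @Test(groups = { "regression" })
    public void existingLoginStillWorks() {
        assertTrue(true); // placeholder for the real check
    }

    // A test focused on a recently changed component gets an extra group,
    // so a run can also target just the changed area.
    @Test(groups = { "regression", "watermonitoring" })
    public void changedPhAlarmStillTriggers() {
        assertTrue(true); // placeholder for the real check
    }
}
```

A testng.xml suite definition that includes only the regression group can then be rerun unchanged for each new version of the system.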

4 Test Plan Criteria

4.1 Pass/Fail Criteria

For software modules, all test suites must be completed successfully. For hardware modules, a specified number of tests must be completed without errors, and only a specified percentage may show minor defects.

4.2 Suspension Criteria and Resumption Requirements

4.2.1 Suspension Criteria

Test suite execution will be suspended if a critical failure is discovered that impedes the ability or value of performing the associated test(s).

4.2.2 Resumption Requirements

When a new version of the system is developed after a suspension of testing has occurred, a regression test as described in Section 3.5 will be run.

5 Test Deliverables

The following documents will be generated during the GEEWHEZ testing process:

a) Test Plan (this document).

b) Test Case Specifications. For each testing requirement of the GEEWHEZ modules and the Middleware, these documents specify the exact input values that will be entered, the values of any standing data that is required, the exact output values and changes of the internal system state that are expected, and any special steps for setting up the tests. They also specify how the tester will physically run the test, the physical set-up required, and the procedure steps that need to be followed.

c) Test Reports. These documents record, for each GEEWHEZ module and the Middleware, the details of which Test Cases have been run, the order in which they were run, and the results of the tests. The result is either that the test passed, meaning that the actual and expected results were identical, or that it failed, meaning that there was a discrepancy. If there is a discrepancy, the report includes all details of the incident, such as actual and expected results, when it failed, and any supporting evidence that will help in its resolution. The report will also include, if possible, an assessment of the impact of an incident upon testing. This report can be generated automatically during unit testing by the testing framework used (e.g. TestNG), but it should include the impact on the overall testing procedure.

d) Test Summary Report. This report brings together all pertinent information about the testing, including an assessment of how well the testing has been done, the number of incidents raised and outstanding, and, crucially, an assessment of the quality of the system.

The following picture illustrates the testing document production phases and how the documents are related to each other. The Test Plan is an essential part of the GEEWHEZ project documentation. As mentioned in Section 1, it shall be used in conjunction with other project documents to prepare the Test Case Specifications documents, one for each GEEWHEZ module (Water Monitoring System, Surveillance System, Leisure Services and Administrative Tools) plus one more for the Middleware. Each of these will consist of two or more Test Case Specifications. After all Test Cases defined for a module have been executed, a new document called Test Report, containing the results of this module's test executions, is generated. Once the entire system has been fully tested, all Test Reports are integrated; as a result, the Test Summary Report is produced.

Figure 2 Testing document production phases

6 Environmental Needs

This section presents the non-human resources required for the GEEWHEZ Test Plan.

6.1 Hardware

The following list summarizes the system resources required in the test environment:

Name                              Quantity   Type and Other Notes
Wireless Network Infrastructure   1          A tested and deployed wireless network infrastructure connected to the ATP intranet.
Android smartphones               10         Android smartphones with different OS versions (2.1 to 4.0).
Mainframe Cabinet                 1          Cabinet containing the CPU in order to protect it.
Thermographic camera              2          Thermographic cameras for night/day vision.
Visible range camera              1          Visible range camera.
CPU                               1          CPU for industrial environments.
Water Treatment Integrated Unit   18         Central Unit connected to probes and pumps for monitoring and adjusting water parameters.

Table 3 Hardware required

6.2 Software

The following list shows all the software elements required in the test environment:

Name                                    Version   Licenses   Type and Other Notes
Windows OS                              7         3          Operating System
Eclipse Indigo SDK                      3.7.1                Open-source IDE, mostly written in Java.
JDK                                     1.7                  Development environment for building applications, applets, and components using the Java programming language.
IE, Firefox, Chrome, Safari and Opera                        Web Browsers
PostgreSQL                              9.1                  Sophisticated open-source Object-Relational DBMS.

Table 4 Software required

6.3 Tools

The following tools will be employed to support the test process:

Brand Name                Version   Vendor              Type and Other Notes
Trac                      0.12.2    Edgewall Software   Defect/issue tracking. http://minerva.netgroup.uniroma2.it/geewhez
Maven                     3.0.3     Apache              Software project management and comprehension tool. http://maven.apache.org/
TestNG                    6.3.1     TestNG              Testing framework inspired by JUnit and NUnit. http://testng.org/doc/index.html
Monkey                    r16       Google              Test suite for the Android UI that generates pseudo-random streams of user events such as clicks, touches, or gestures, as well as a number of system-level events.
Android JUnit Extension   r16       Google              Component-specific test case classes for the Android environment.
REST Assured              1.5       Jayway              Framework for testing and validating REST services, based on JUnit. http://code.google.com/p/rest-assured/
Selenium                  2.17      Selenium HQ         Web browser automation tool for testing web applications; tests can be automated with TestNG. http://seleniumhq.org/
JMeter                    2.5.1     Apache              Graphical server performance testing tool used to simulate a heavy load on a server, network or object, in order to test its strength or to analyse overall performance under different load types. http://jmeter.apache.org/
t.a.w.                    1.0       CTIC                Accessibility tool for the analysis of Web sites, based on W3C guidelines. http://www.tawdis.net/
Fiddler2                  2.0                           Web debugging proxy which logs all HTTP(S) traffic between the computer and the Internet. http://fiddler2.com/fiddler2/
Watcher                   1.5.4                         Web security testing tool and passive vulnerability scanner. http://websecuritytool.codeplex.com/
Wallflower                -         Microsoft           Benchmark used for white-box tests of surveillance algorithms. http://research.microsoft.com/enus/um/people/jckrumm/wallflower/testimages.htm

Table 5 Tools required

About the water monitoring system testing environment: the GEEWHEZ consortium has submitted to the REA an amendment asking for the inclusion in the consortium of a new certified partner in charge of the future development of this system. If the amendment is approved, this new partner will provide a complete description of the water monitoring test procedure and tools.
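As an example of how these tools can be combined, the following hedged sketch drives a browser with Selenium WebDriver 2.x from a TestNG test. The URL, element ids and expected page title are assumptions for illustration, not the real GEEWHEZ user interface.

```java
import org.openqa.selenium.By;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;
import org.testng.annotations.AfterClass;
import org.testng.annotations.BeforeClass;
import org.testng.annotations.Test;
import static org.testng.Assert.assertEquals;

public class LoginPageUiTest {

    private WebDriver driver;

    @BeforeClass
    public void openBrowser() {
        driver = new FirefoxDriver(); // Firefox is among the browsers in Table 4
    }

    @Test
    public void loginFormSubmits() {
        driver.get("http://localhost:8080/geewhez/login");         // assumed test deployment
        driver.findElement(By.id("username")).sendKeys("visitor"); // hypothetical element ids
        driver.findElement(By.id("password")).sendKeys("visitorPass");
        driver.findElement(By.id("loginButton")).click();
        assertEquals(driver.getTitle(), "GEEWHEZ - Home");         // assumed landing page title
    }

    @AfterClass
    public void closeBrowser() {
        driver.quit();
    }
}
```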

7 Responsibilities

The following table shows the participants responsible for implementing each task:

Task                                                                      Involved Participant(s)
Develop and execute Middleware test suites                                MATEMATICI, UC3M
Develop and execute Surveillance System test suites                       T-CON, FAICO
Develop and execute Water Monitoring System test suites                   Technovation (*)
Develop and execute Leisure Services test suites                          T-CON
Develop and execute Administrative Tools test suites                      UC3M
Develop and execute GEEWHEZ First Integration test suites                 FAICO
Develop and execute GEEWHEZ Second Integration test suites                T-CON, UC3M
Develop and execute GEEWHEZ Final Integration and Prototype test suites   T-CON, UC3M

Table 6 Responsibilities

(*) About the participant involved in the development and execution of the Water Monitoring System test suites: the GEEWHEZ consortium has submitted to the REA an amendment asking for the inclusion in the consortium of a new certified partner in charge of the future development of this system. If the amendment is approved, this new partner will be responsible for performing this task.

8 Schedule

Figure 3 Test Plan Schedule

9 Risks and Contingencies

The following are the overall risks to the project, with special emphasis on the testing process:

- Lack of personnel resources when testing is to begin
- Lack of availability of required hardware
- Late delivery of the hardware
- Delays in training on the system and/or tools
- Changes to the original requirements or designs

If the requirements change after their formal definition, the following actions will be taken:

- The test schedule and development schedule will move out an appropriate number of days.
- The number of tests performed will be reduced (1).
- The number of acceptable defects will be increased (1).
- Resources will be added to the test team.
- The test team will work overtime (this could affect team morale).
- The scope of the plan may be changed.
- There may be some optimization of resources. This should be avoided, if possible, for obvious reasons.

(1) This item could lower the overall quality of the delivered system.

10 References

[1] IEEE, "IEEE Standard 610.12-1990, IEEE Standard Glossary of Software Engineering Terminology", 1990.
[2] R. Pressman, Software Engineering: A Practitioner's Approach. Boston: McGraw Hill, 2001.

This Test Plan is based on the IEEE 829-2008 Standard for Software Test Documentation.

11 Acronyms

Acronym   Description
ATP       Animal Theme Park
CPU       Central Processing Unit
DBMS      DataBase Management System
FAICO     Fundación Andaluza de Imagen, Color y Óptica
GUI       Graphical User Interface
HTTP      HyperText Transfer Protocol
HTTPS     HyperText Transfer Protocol Secure
IDE       Integrated Development Environment
IE        Internet Explorer
JDK       Java Development Kit
OS        Operating System
REST      Representational State Transfer
SDK       Software Development Kit
SMEs      Small and Medium-sized Enterprises
SQL       Structured Query Language
UC3M      Universidad Carlos III de Madrid
UI        User Interface
W3C       World Wide Web Consortium