Suunto 2.0 web - Quality Assurance Plan


T-76.4115 Software Development Project: Iteration 2

"Quality in a product or service is not what the supplier puts in. It is what the customer gets out and is willing to pay for." - P. Drucker

Table 1. Document changelog.

Version | Date       | Done                  | Author
0.1     | 18.10.2008 | First draft           | Häppölä
0.2     | 29.10.2008 | Second draft          | Häppölä
1.0     | 30.10.2008 | First release         | Häppölä
1.5     | 2.11.2008  | Additional details    | Seppälä
2.0     | 7.11.2008  | Updates for I1        | Häppölä
2.3     | 5.12.2008  | Post-I1 updates       | Häppölä
2.5     | 18.11.2008 | Updates for I2        | Häppölä
3.0     | 19.11.2008 | Reviewed and approved | Häppölä, Palomäki, Seppälä

Table of Contents

1 Introduction
  1.1 Details concerning the project
2 Quality Assurance Framework and Goals
  2.1 Quality goals
  2.2 Testing levels
3 Quality Assurance Practices
  3.1 Quality and testing practices
    3.1.1 Unit testing
    3.1.2 Test cases
    3.1.3 Explorative testing
    3.1.4 Coding standards
    3.1.5 Side-by-side programming
    3.1.6 Code reviews and exchange of ideas
    3.1.7 Heuristics
    3.1.8 Documentation
    3.1.9 General project management
    3.1.10 Dialogue with the customer
  3.2 Summary of testing practices
  3.3 Testing Environments and Software
  3.4 Stress testing and scalability
  3.5 Summary of QA activities
4 Deliverables and QA Feedback
  4.1 Quality assurance documentation
  4.2 Quality of QA material
  4.3 Test cases and defect tracking
  4.4 Quality feedback

1 Introduction

This document describes the quality assurance strategies and practices of the Suunto 2.0 Web project. The document is part of the requirements of the T-76.4115 Software Development Project course at Helsinki University of Technology. The intended audience of this document is listed in Table 2. The first part of this document describes the goals of the overall quality process and the conceptual framework of the project; the second chapter covers the quality practices and their connection to the quality goals at different levels of software design; finally, the plan discusses the planned effects and the resulting documentation of quality assurance. This document will be updated during the project as needed.

Table 2. Audience of the document.

Audience   | Purpose of the document
Customer   | Describe the planned quality assurance practices and frameworks to give an overview of quality management and the planned testing methods of the whole project.
Mentor     | Communicate the scope and context of the QA work and its practices.
Team Neula | Serve as a guideline for the planned practices and their effect on software development and testing as well as project management.
Tester     | Guide testing and give information on the most important testing goals and practices.

1.1 Details concerning the project

The Suunto 2.0 web project consists of several autonomous and functionally separate small applications that are built for different social platforms in different programming languages. Some run server side, most on the user's computer. The only actual interconnecting link between the applications is the MySuunto database, which provides the user's training data that is then visualized by each application. Conceptually, the applications work as middleware between their platform and the MySuunto database, delivering aggregated, simple information through attractive, convenient and lightweight user interfaces, or gadgets.

This detachment and multiplicity informs quality assurance as well as general software development, and is discussed in more detail in the separate architectural technical specification, the requirements document and the general project plan. Noteworthy, however, is that individual applications can in practice be delivered at different times, as they are developed simultaneously. We are multitasking development and design: there are constantly two applications under development, and new prototypes are designed at the end of every sprint. For additional information, the reader should refer to the project plan. The Iteration 1 QA report also describes the emergent quality practices and the overall framework used in the course of the project.

2 Quality Assurance Framework and Goals

2.1 Quality goals

The quality goals of the project are derived primarily from the non-functional requirements and are mostly qualitative by nature (Table 3).

Table 3. Quality goals

ID   | Quality goal   | Description | Verification
QG 1 | Compatibility  | Applications developed in this project must be seamlessly compatible with the platform they are developed for and with the Suunto DB. | Will be ensured by extensive testing and by following the platform development instructions. However, the customer's DB deployment is delayed, so this integration may not be feasible at all during the project.
QG 2 | Layout         | Especially commercial marketing attractiveness and visual compliance with the Suunto brand guidelines. | Approval and dialogue with the customer's design team, and possibly also testing with external reference groups (such as the peer group) with this perspective in mind.
QG 3 | Usability      | Must be simple for the user to use and to add to his/her platform. Applications should work with virtually no technical know-how. Lightweight appearance and intuitive user interface. | Feedback from customers and reference groups.
QG 4 | Localization   | Several different languages must be supported. | The aspect will be carried along through the whole design and development process.
QG 5 | Innovativeness | Innovative approaches to gadget design. A large role in design has been given to the team. The process is iterative and requirements are refined as development proceeds. | Brainstorming sessions. Evolutionary and iterative idea development and updating of requirements throughout the project using feedback from the customer.
QG 6 | Exploration    | Gadgets must provide the customer with new technological insight into emerging social technologies and platforms. | Documentation of learning and continuous bilateral idea exchange.
QG 7 | Variety        | In total 2 commercial quality applications, 2 working quality applications, 1 prototype application and 10 descriptions will be delivered to the customer by the end of the project. | Emphasis on project planning and time tracking.

2.2 Testing levels

There are four testing levels according to the V-model of software development. Figure 1 shows their respective links to different design levels and phases of the software development process. In our project, however, there is practically no integration testing level, since the modules are autonomous applications and, due to their small size, usually do not consist of meaningful smaller submodules. The implementation of the customer's database is also delayed, so it may not be available for testing at all during the project.

Figure 1. V-model of software development [Wikimedia Commons]

3 Quality Assurance Practices

Quality strategies and their implementation are updated during each iteration when needed, and the adaptation of quality practices can be described as a trial-and-error learning process following the Deming wheel (Figure 2).

Figure 2. The Deming PDCA cycle (Plan, Do, Check, Act)

Given that the project consists of several totally autonomous applications, many of the originally planned high-level quality practices could not really be used for value-adding purposes, as became apparent during Iteration 1. Several approaches were tried as an exercise, and their actual practicability and usability for the project were evaluated at the end of the iteration. Please refer to the QA report for details.

The development process of an individual application, with the respective quality goals and levels, is described in the framework of Figure 3. The reader should note that only a fraction of the finally delivered applications are required to reach commercial quality.

Figure 3. Application development process and four levels of quality: customer feedback and the requirements cycle drive each application from prototype description (Level 0) through prototype (Level 1) and working quality (Level 2) to commercial quality (Level 3) and delivery, with the main focus shifting from innovating and platform learning to coding, testing and UI finalizing.

As we are developing several applications simultaneously while waiting for comments from the customer, the status of development and the required quality were divided into four levels (Table 4).

Table 4. Levels of quality

Level of quality      | Description
Prototype Description | A written document and visual presentation of the gadget idea. Also includes a small user story. No actual technical implementation yet.
Prototype             | The first working visual representation and technical implementation of the idea on the platform, after exploration of its technical specifications, possibilities and limitations.
Working Quality       | An application that has the most important functions described in the description implemented, and a usable (but not yet finalized) user interface.
Commercial Quality    | Some minor additional features requested by the customer, and a finished user interface following the customer's wishes and the comments from the design team. Thoroughly tested according to the test cases.

3.1 Quality and testing practices

3.1.1 Unit testing

Unit testing is the lowest level of white-box testing and the primary element in the quality assurance chain. The developer is responsible for writing unit tests. Regular runs and automated unit testing make sure that the code base doesn't break. Unit testing will be performed throughout the project where possible; different platforms and programming languages place certain limitations on the tools available. Given that the applications are small and run through web platforms, unit testing is mainly limited to the database interfaces and will be done mostly as an academic exercise.

3.1.2 Test cases

Test cases will be written in compliance with the functional requirements, prioritizing the most important ones. They are based on our prototype definitions, their user stories¹ and the customer feedback received during the development process.

Functional testing will mostly be done at the end, prior to delivery, due to the customer's focusing wishes: altered requirements, altered levels of quality and changes in the actual development process. The reader should refer to the user requirements document and the QA report for deeper reasoning.

¹ This creative and iterative requirement design process is presented in more detail in the project plan and the requirements document.
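As a minimal sketch of the unit-testing practice described above, the following Java example tests a hypothetical helper that parses a training record as it might arrive from the MySuunto database interface. The class, method and record format (TrainingDataParser, parseDistanceKm, "distance;duration") are illustrative assumptions, not the project's actual API; in the project, JUnit or PHPUnit would drive such tests, while plain hand-rolled checks keep this sketch self-contained.

```java
// Sketch of a database-interface unit test. TrainingDataParserTest and the
// "distanceInMetres;durationInSeconds" record format are hypothetical
// examples, not the project's real API.
public class TrainingDataParserTest {

    // Hypothetical unit under test: converts the distance field of a
    // "distance;duration" record from metres to kilometres.
    static double parseDistanceKm(String record) {
        String[] fields = record.split(";");
        return Double.parseDouble(fields[0]) / 1000.0;
    }

    // Minimal check helper so the sketch runs without a test framework.
    static void check(boolean condition, String name) {
        if (!condition) {
            throw new AssertionError("Test failed: " + name);
        }
    }

    public static void main(String[] args) {
        // Normal case: 12500 m should become 12.5 km.
        check(parseDistanceKm("12500;3600") == 12.5, "normal distance");
        // Boundary case: zero distance stays zero.
        check(parseDistanceKm("0;0") == 0.0, "zero distance");
        System.out.println("All parser tests passed");
    }
}
```

Running such tests regularly, for example from a build script, serves the goal stated above: making sure the code base doesn't break as the applications evolve.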

3.1.3 Explorative testing

After each sprint the customer will be given a prototype to test. The application can later be enhanced and the requirements altered to meet new emergent ideas. Our team and the peer group will also be given the opportunity to play around with the applications to find potential caveats as well as enhancement ideas (within the NDA restrictions). Explorative testing also plays an important role before prototypes are delivered to the customer. Charters and logs will be written.

3.1.4 Coding standards

Coding standards are described in the project plan. They did, however, need some reviewing to better meet the project's needs.

3.1.5 Side-by-side programming

As a main practice, we will have two developers working on each application, so side-by-side programming follows by definition. We have weekly coding sessions in which nearly all team members are present. There is a strong emphasis on continuous communication and mutual sanity checking of the code.

3.1.6 Code reviews and exchange of ideas

Code reviews will be arranged at least at the end of each sprint. This also serves the purpose of mutual learning of different technologies, as we have two teams working on two different applications.

3.1.7 Heuristics

More abstract goals regarding usability and visual layout are measured through constant feedback from the customer and by using external reference groups (e.g. the peer group) for testing and new ideas.

3.1.8 Documentation

Documentation includes both internal documentation and research documentation concerning the findings, opportunities and technical issues of new platforms and technologies. The latter will be passed on to the customer as well.

3.1.9 General project management

As the project consists of 18 separate applications and the functional requirements can be heavily altered during iterations, the emphasis must at all times be on overall project management, tracking and, above all, the big picture.

As was discovered during the first iteration, platform learning takes more time than expected. The customer's needs regarding variety and level of quality might also change during the project.

3.1.10 Dialogue with the customer

Constant dialogue with the customer is a must due to the iterative application design and building process. New prototypes will be sketched as the first applications reach the prototype phase, so customer feedback is of enormous importance. Please refer to the user requirements document.

3.2 Summary of testing practices

The planned testing practices at each testing level, with the planned tools, are summarized in Table 5.

Table 5. Testing levels, planned practices and tools

Testing level       | Practices | Tools
Unit level          | Unit testing will be performed where applicable. | JUnit, PHPUnit, GWT
Integration testing | Compatibility with the platforms and the Suunto database will be tested with test cases and explorative testing. The database connection will hopefully be available by the end of the project. | Bugzilla
System testing      | Explorative testing. Meetings at the end of sprints and constant feedback from the customer. Code reviews at the end of sprints and constant side-by-side programming. | Bugzilla, Testopia
Acceptance testing  | Compatibility with the functional requirements will be ensured by test cases and an explorative acceptance session with the customer. | Bugzilla, Testopia

3.3 Testing Environments and Software

In addition to the tools described in Table 5, testing will be done mainly in real environments, except for the Facebook applications, which will be tested non-publicly in the developer area. The Facebook server application will run on a rented Debian Linux virtual server. Since it is delivered mainly as a proof-of-concept artifact, it will still need further development and larger-scale server architecture considerations after our delivery if it is to be released into production.

The defect tracking system used in this project is Bugzilla. It has been enhanced with the Testopia test case management suite, which integrates seamlessly into Bugzilla's tracking system and runs inside it. Some additional QA tools, such as CruiseControl, were set up as an exercise, but their fit in this project proved debatable and they were abandoned at the beginning of the second iteration.

3.4 Stress testing and scalability

Apart from Facebook, the applications run on the client side, so the restricting elements are the user's computer and the Suunto database. The former shouldn't be an issue, since the main target is to make the applications as lightweight as possible and individual computer performance doesn't affect the system as a whole.

The latter element is out of the scope of this project.

The Facebook application may need some focus on scalability in communities with many users. However, the optimization possibilities within this project are very limited, since the Facebook application won't likely be commercialized as such. The final public version would need further consideration of the customer's own future large-scale server architecture, as the application is currently built on a custom rented server running only that application.

3.5 Summary of QA activities

QA practices and the goals they support are visualized in the quality palette of Table 6.

Table 6. Quality palette, mapping each quality practice (unit testing, test cases, explorative testing, coding standards, side-by-side programming, code reviews, heuristics, documentation, general project management, dialogue with the customer) to the quality goals it supports (QG 1 Compatibility, QG 2 Layout, QG 3 Usability, QG 4 Localization, QG 5 Innovativeness, QG 6 Exploration, QG 7 Variety), with the effect rated 1-3.

4 Deliverables and QA Feedback

4.1 Quality assurance documentation

The quality assurance plan will be updated as needed to clarify details and to better match empirically found best practices. Other quality documentation will be delivered within the course framework and as described in the project plan schedule.

4.2 Quality of QA material

One of the issues to track is the quality, repeatability and consistency of bug reports and other quality-related material.

4.3 Test cases and defect tracking

Our defect tracking system, Bugzilla, has been enhanced with the Testopia test case management suite, which manages test plans, test cases and test runs, and provides extensive reports with different views. This conveniently lets us avoid the hassle of sharing Excel sheets of test cases and matrices. We can keep the big picture of the overall project, even with five separate applications in development, by consolidating them as separate products in this environment. The test case management tool can also be seen as an exercise for larger development projects.

4.4 Quality feedback

The primary source of quality feedback and awareness is the feedback from the customer at the end of every sprint. This gives us an opportunity to correct our steering, development and quality practices if needed. It is also the responsibility of the Quality Manager to follow the quality status of the project as a whole. We also try to keep our quality praxis as transparent as possible to make it easily communicable to the customer.