Quality Attribute Generic Scenarios

Different quality attributes

Availability is concerned with system failure and the duration of system failures. System failure means that the system does not provide the service for which it was intended.

Modifiability is about the cost of change, both in time and money.

Performance is about timeliness: events occur, and the system must respond to them in a timely fashion.

Security is the ability of the system to prevent or resist unauthorized access while providing access to legitimate users. An attack is an attempt to breach security.

Testability refers to the ease with which the software can be made to demonstrate its faults, or the lack thereof. To be testable, the system must control its inputs and be able to observe its outputs.

Usability is how easy it is for the user to accomplish tasks and what support the system provides for the user to accomplish them. Its dimensions are:
- learning system features
- using the system efficiently
- minimizing the impact of errors
- adapting the system to the user's needs
- increasing confidence and satisfaction

Quality attribute scenarios

A quality attribute scenario is a quality-attribute-specific requirement. It has six parts:
1. Source of stimulus: the entity (e.g., a human or a computer system) that generated the stimulus
2. Stimulus: a condition that needs to be considered
3. Environment: the conditions under which the stimulus occurs
4. Artifact: the elements of the system that are stimulated
5. Response: the activity undertaken after the arrival of the stimulus
6. Response measure: when the response occurs, it should be measurable so that the requirement can be tested
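To make the six-part structure concrete, here is a minimal sketch (an illustration, not anything from the book) that captures a scenario as a plain Python data structure; the class name, field names, and sample values are assumptions chosen for this example.

from dataclasses import dataclass

@dataclass
class QualityAttributeScenario:
    source: str            # who or what generated the stimulus
    stimulus: str          # the condition that needs to be considered
    environment: str       # the conditions under which the stimulus occurs
    artifact: str          # the elements of the system that are stimulated
    response: str          # the activity undertaken after the stimulus arrives
    response_measure: str  # must be measurable so the requirement can be tested

# An illustrative availability scenario assembled from values that appear
# later in this document (the 30-second bound is an invented example):
scenario = QualityAttributeScenario(
    source="internal to system",
    stimulus="crash",
    environment="normal operation",
    artifact="system's processors",
    response="log the failure and continue in degraded mode",
    response_measure="unavailability time interval under 30 seconds",
)
print(scenario)

Writing scenarios in a uniform shape like this makes it easy to check that none of the six parts is missing and that the response measure is actually testable.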

Availability

General availability scenario
Source: internal to system; external to system
Stimulus: crash; omission; timing; no response; incorrect response
Artifact: system's processors; communication channels; persistent storage
Environment: normal operation; startup; shutdown; repair mode; degraded (failsafe) mode; overloaded operation
Response: prevent the failure; log the failure; notify users/operators; disable source of failure; be temporarily unavailable; continue (normal/degraded)
Response measure: time interval available; availability %; detection time; repair time; degraded mode time interval; unavailability time interval

Sample availability scenario

Availability tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Availability: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Availability tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec2.html
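As a taste of what the availability tactics cover, the following sketch implements one well-known fault-detection tactic, the heartbeat: a monitor declares a component failed once its periodic heartbeat messages stop arriving. This is an illustrative sketch; the class and method names are assumptions, not an API from the book.

import time

class HeartbeatMonitor:
    def __init__(self, timeout_seconds):
        self.timeout = timeout_seconds
        self.last_beat = {}

    def beat(self, component):
        # Called by each monitored component on every heartbeat.
        self.last_beat[component] = time.monotonic()

    def failed_components(self):
        # Detection time is bounded by the timeout, which ties the tactic
        # directly to the "detection time" response measure above.
        now = time.monotonic()
        return [c for c, t in self.last_beat.items() if now - t > self.timeout]

monitor = HeartbeatMonitor(timeout_seconds=0.5)
monitor.beat("worker-1")
time.sleep(0.6)                       # simulate a missed heartbeat
print(monitor.failed_components())    # ['worker-1']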

Modifiability

General modifiability scenario
Source: end user; developer; system administrator
Stimulus: add/delete/modify functionality, quality attribute, capacity, or technology
Artifact: code; data; interfaces; components; resources; configurations
Environment: runtime; compile time; build time; initiation time; design time
Response: make modification; test modification; deploy modification
Response measure: cost in effort; cost in money; cost in time; cost in number, size, and complexity of affected artifacts; extent to which the change affects other system functions or qualities; new defects introduced

Sample modifiability scenario

Modifiability tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Modifiability: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Modifiability tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec3.html
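Many of the modifiability tactics amount to reducing coupling so that a change stays local. The sketch below illustrates the "encapsulate" and "use an intermediary" tactics by hiding a concrete technology behind a small interface; all names are illustrative assumptions.

from abc import ABC, abstractmethod

class MessageSender(ABC):
    # Callers depend on this interface, not on a concrete technology.
    @abstractmethod
    def send(self, recipient: str, body: str) -> None: ...

class EmailSender(MessageSender):
    def send(self, recipient, body):
        print(f"email to {recipient}: {body}")

class SmsSender(MessageSender):
    def send(self, recipient, body):
        print(f"sms to {recipient}: {body}")

def notify(sender: MessageSender, user: str) -> None:
    # Replacing email with SMS is now a change at one call site, which
    # shrinks the "number of affected artifacts" response measure.
    sender.send(user, "your build finished")

notify(EmailSender(), "alice@example.com")
notify(SmsSender(), "+15551234567")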

Performance

General performance scenario
Source: internal to the system; external to the system
Stimulus: periodic events; sporadic events; bursty events; stochastic events
Artifact: system; component
Environment: normal mode; overload mode; reduced capacity mode; emergency mode; peak mode
Response: process events; change level of service
Response measure: latency; deadline; throughput; jitter; miss rate; data loss

Sample performance scenario

Performance tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Performance: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Performance tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec4.html
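Response measures such as latency and throughput only constrain a design if they are actually measured. The following is a minimal, illustrative harness (the handler is a stand-in invented for this example) that records per-event latency and overall throughput:

import time

def handle(event):
    return event * 2          # stand-in for real event processing

latencies = []
start = time.perf_counter()
for event in range(100_000):
    t0 = time.perf_counter()
    handle(event)
    latencies.append(time.perf_counter() - t0)
elapsed = time.perf_counter() - start

latencies.sort()
print(f"throughput:     {len(latencies) / elapsed:,.0f} events/s")
print(f"median latency: {latencies[len(latencies) // 2] * 1e6:.1f} microseconds")
print(f"max latency:    {latencies[-1] * 1e6:.1f} microseconds")

A scenario's response measure would then be phrased against numbers like these, for example "median latency under 2 milliseconds in normal mode".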

Security

General security scenario
Source: identified user; unknown user; hacker from outside the organization; hacker from inside the organization
Stimulus: attempt to display data; attempt to modify data; attempt to delete data; access system services; change system's behavior; reduce availability
Artifact: system services; data within the system; component/resource of the system; data produced/consumed by the system

Sample security scenario

Security tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Security: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Security tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec5.html
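Two security tactics that also appear in Table 4 below are "authenticate user" and "grant/block access". The sketch that follows is deliberately simplified and illustrative: the stores are in-memory dictionaries, and a real system would use salted, slow password hashing (e.g., bcrypt) rather than a bare SHA-256.

import hashlib
import hmac

# Illustrative stand-ins for a real identity store and permission store.
USERS = {"alice": hashlib.sha256(b"s3cret").hexdigest()}
PERMISSIONS = {"alice": {"display_data"}}

def authenticate(user, password):
    stored = USERS.get(user)
    if stored is None:
        return False
    supplied = hashlib.sha256(password.encode()).hexdigest()
    # Constant-time comparison resists timing attacks.
    return hmac.compare_digest(stored, supplied)

def authorize(user, action):
    return action in PERMISSIONS.get(user, set())

if authenticate("alice", "s3cret") and authorize("alice", "display_data"):
    print("access granted")
else:
    print("access blocked")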

Testability

General testability scenario
Source: unit tester; integration tester; system tester; acceptance tester; end user; automated testing tools
Stimulus: execution of tests due to the completion of a code increment
Artifact: portion of the system being tested
Environment: design time; development time; compile time; integration time; deployment time; run time
Response: execute test suite and capture results; capture cause of fault; control and monitor state of the system
Response measure: effort to find a fault; effort to achieve a given % of coverage; probability of a fault being revealed by the next test; time to perform tests; effort to detect faults; length of the longest dependency chain; time to prepare the test environment; reduction in risk exposure

Sample testability scenario

Testability tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Testability: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Testability tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec6.html
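The general scenario above lists "control and monitor state of the system" as a desired response, and a common tactic for it is to give a component specialized test interfaces. Here is a minimal, illustrative sketch; all names are assumptions made for this example.

class RateLimiter:
    def __init__(self, limit):
        self.limit = limit
        self._count = 0

    def allow(self):
        if self._count < self.limit:
            self._count += 1
            return True
        return False

    # Test hooks: let a test drive the component into a given state
    # directly instead of replaying thousands of requests.
    def set_count_for_test(self, n):
        self._count = n

    def get_count_for_test(self):
        return self._count

def test_limit_reached():
    limiter = RateLimiter(limit=1000)
    limiter.set_count_for_test(1000)   # control the state
    assert limiter.allow() is False    # observe the result
    assert limiter.get_count_for_test() == 1000

test_limit_reached()
print("test passed")

Hooks like these cut the "time to perform tests" response measure, at the price of a slightly larger interface.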

Usability

General usability scenario
Source: end user (possibly in a special role)
Stimulus: use the system efficiently; learn to use the system; minimize the impact of errors; adapt the system; configure the system
Artifact: system; portion of the system with which the user is interacting
Environment: runtime; configuration time
Response: provide the features needed; anticipate the user's needs
Response measure: task time; number of errors; number of tasks accomplished; user satisfaction; gain of user knowledge; ratio of successful operations to total operations; amount of time/data lost when an error occurs

Sample usability scenario

Usability tactics

More information
You can find more information in the book Software Architecture in Practice. Please note that the website of the book contains an outdated version (version 2), while this document is based on version 3.
Usability: http://www.ece.ubc.ca/~matei/eece417/bass/ch04lev1sec4.html
Usability tactics: http://www.ece.ubc.ca/~matei/eece417/bass/ch05lev1sec7.html
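One usability tactic for minimizing the impact of errors is to support undo. The sketch below is a minimal illustration (the Document class and its behavior are assumptions for this example): every change records a restore point, so a single undo discards only the bad edit.

class Document:
    def __init__(self):
        self.text = ""
        self._restore_points = []

    def insert(self, s):
        self._restore_points.append(len(self.text))  # state before the change
        self.text += s

    def undo(self):
        # Bounds the "amount of time/data lost when an error occurs"
        # response measure to a single edit.
        if self._restore_points:
            self.text = self.text[: self._restore_points.pop()]

doc = Document()
doc.insert("Hello, ")
doc.insert("wrold!")   # the user mistypes
doc.undo()             # one undo discards only the bad edit
doc.insert("world!")
print(doc.text)        # Hello, world!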

Table 1: Availability Generic Scenario
Source: internal to system; external to system
Stimulus: crash; omission; timing; no response; incorrect response
Artifact: system's processors; communication channels; persistent storage
Environment: normal operation; degraded (failsafe) mode
Response: log the failure; notify users/operators; disable source of failure; continue (normal/degraded)
Response measure: time interval available; availability %; repair time; unavailability time interval

Table 2: Modifiability Generic Scenario
Source: end user; developer; system administrator
Stimulus: add/delete/modify functionality or quality attribute
Artifact: system user interface; platform; environment
Environment: at runtime, compile time, build time, or design time
Response: locate places in the architecture to be modified; modify; test the modification; deploy the modification
Response measure: cost in effort, money, and time; extent to which the change affects other system functions or qualities

Table 3: Performance Generic Scenario
Source: a number of sources, both external and internal
Stimulus: periodic events; sporadic events; bursty events; stochastic events
Artifact: the system, or possibly a component
Environment: normal mode; overload mode; reduced capacity mode
Response: process stimuli; change level of service
Response measure: latency; deadline; throughput; capacity; jitter; miss rate; data loss

Table 4: Security Generic Scenario
Source: a user or system that is legitimate, an imposter, or unknown, with full or limited access
Stimulus: attempt to display/modify data; access services
Artifact: system services; data
Environment: normal operation; degraded (failsafe) mode
Response: authenticate user; hide identity of user; grant/block access; encrypt data; detect excessive demand
Response measure: time/effort/resources to circumvent security measures, with probability of success

Table 5: Testability Generic Scenario
Source: unit developer; increment integrator; system verifier; client acceptance tester; system user
Stimulus: analysis, architecture, design, class, or subsystem integration completed; system delivered
Artifact: piece of design; piece of code; complete system
Environment: at design time; at development time; at compile time; at deployment time
Response: provides access to state values; observes results; compares
Response measure: % coverage; probability of failure; time to perform tests; length of time to prepare the test environment

Table 6: Usability Generic Scenario
Source: end user
Stimulus: wants to learn the system, use the system, recover from errors, adapt the system, or feel comfortable
Artifact: system
Environment: at runtime, configure time, or install time
Response: (see below)
Response measure: task time; number of errors; number of tasks accomplished; user satisfaction; gain of user knowledge; amount of time/data lost