IPA/SEC Data entry form Version 3.0 for IPA/SEC White Paper 20xx on software development projects in Japan




Information-Technology Promotion Agency, Japan (IPA)
Software Engineering Center (SEC)

Contents
Introduction ... 1
Data entry form Version 3.0 ... 2

Introduction

This document describes the data entry form for the IPA/SEC White Paper 20xx on software development projects in Japan. The contents of this document are equivalent to Appendix B of White Paper 2008 and later. It is recommended that this document be read in conjunction with the other IPA/SEC publications. Relevant IPA/SEC publications include the following:

IPA/SEC White Paper 20xx on software development projects in Japan
IPA/SEC Data item definitions Version 3.0 for IPA/SEC White Paper 20xx on software development projects in Japan

For information about this document, contact:
IPA/SEC, Bunkyo Green Court Center Office, 2-28-8 Hon-Komagome, Bunkyo-ku, Tokyo 113-6591, JAPAN
Web: http://sec.ipa.go.jp/

Copyright (C) 2010 IPA/SEC. All rights reserved.
Information-Technology Promotion Agency, Japan / Software Engineering Center

Color legend for entry cells:
Rose: Mandatory
Beige: Mandatory
Light yellow: Important
Light green: Recommended
Light blue: Alternative
Automatic entry (entry disabled)

IPA/SEC Data Entry Form Ver. 3.0
Each row of the form gives: Category, No., Data item, whether an alternative is to be chosen (*), Description, and Free text or alternatives.

10084 Proprietary project ID: The project identification assigned by the company that offered the project data. This data item is also used for sub-system identification. Example: 1-1, 1-2 (IDs distinguishing sub-system projects from each other).
11001 Whole/sub flag (*): The flag that identifies whether the data belongs to the whole-system project or a sub-system project.
11002 Grouping ID: Assign the same group ID to member projects of the same group. * Write free text for this data item regardless of the choice made for data item 11001.
10085 Reliability of company-evaluated project data (*): The reliability of the project data.

(1) General Characteristics of Development Projects
103 Project type (*): The type of project (development or not).
104 Stability of the existing system (*): Choose an alternative for the stability of the existing system if "maintenance/service" or "enhancement" is chosen for item 103.
105 Project category (*): The category of the project. Free text: name for "Other".
106 Entrusted development working site (*): Choose one to three alternatives for the working site if "entrusted development" is chosen for item 105.
107 Project purpose: The primary purpose(s) of the project (multiple choice). Alternatives: Software development (*), Infrastructure-building (*), Operational environment preparation (*), System migration (*), Maintenance (*), Operation support (*), Consulting (*), Project management (*), Quality assurance (*), On-site environment preparation/adjustment for a running system (*), Customer training (*), Other (description). * Write "O" for every alternative that fits your project.
108 New customer or old customer (*): Did the project serve a new customer or an old customer?
109 New business or not (*): Was the project aimed at a new industry or business, or at an existing one?
118 Source of outsourced workforce (*): Choose one to three alternatives if the project used an outsourced workforce. * An affiliate refers to a company that has capital transactions with another.
119 Outsourcing country: Write one or more country names if item 118 has the value "c" or "d". Example: China, India.
110 New subcontractors or not (*): Choose one to three alternatives if data item 118 has any value other than "e". (Keep consistency with item 118.)
111 Using new technology or not (*): Whether or not the project used new technology.
112 Clearness of responsibility and roles of project team members (*): How clearly were the responsibility and roles of project team members defined?
113 Clearness of goals and priority (*): How clearly were the project objectives (delivery date, quality, technologies, etc.) and their priority defined?
114 Working space (*): The working space for the project team.
115 Project environment (acoustic noise) (*): The level of acoustic noise in the working environment.
116 Project success_self-evaluation (*): Evaluate whether or not the project is an overall success with respect to QCD. * A project is a success if its planning was appropriate and its planned goals were achieved. A project that had no planning is a success if it ended up with desirable results.
120 Evaluation of planning (Cost) (*): Whether or not the cost planning was valid.
121 Evaluation of planning (Quality) (*): Whether or not the objectives of delivered quality were valid.
122 Evaluation of planning (Development schedule) (*): Whether or not the development schedule planning was valid.
123 Evaluation of results (Cost) (*): Evaluation of the results of cost planning.
124 Evaluation of results (Quality) (*): Evaluation of the achievement of delivered quality objectives.
125 Evaluation of results (Development schedule) (*): Whether or not the schedule planning was valid. Evaluate the schedule planning based on the state of delay in product delivery with respect to the delivery date specified by the customer.
126 Reason for QCD objectives failure (*): The reason why the cost, quality, and development schedule (delivery date) objectives were not achieved (for example, when data item 123 has the value "c", "d", or "e"). Choose one to three alternatives.
117 Subjective evaluation of customer satisfaction (*): How do you feel about the customer's satisfaction? Choose one alternative based on your own impression.

(2) Project Applications
201 Industry type (*): The type of industry the developed system is used for, or the type of industry in which the project's customer works. (Choose one to three alternatives.)
202 Business type (*): The type of business the developed system is used for. (Choose one to three alternatives.)
203 System applications (*): The application of the system developed by the project. (Choose one to three alternatives.)
204 User accessibility (*): Whether the system developed by the project is accessible to limited users or is open to the public.
205 Users (persons): The number of users who use the developed system. This data item is valid if data item 204 has the value "a" (accessible to limited users).
206 User sites (sites): The number of user sites where servers or other devices are installed.
207 User concurrency (persons): The maximum number of users who concurrently use the developed system.

(3) System Characteristics
301 Type of developed system (*): The type of the software developed by the project. Free text: name for "Other".
302 Use of business application package (*): Did the project use one or more business software packages? # Excluding in-house business software packages.
303 First-time use of business application package (*): Whether or not the company of the project used the business software package(s) for the first time. This data item is valid if data item 302 has the value "a" (Yes).
304 Name of business software package: The name of the software package(s) used for the project. Valid if data item 302 has the value "a" (Yes). Example: SAP, Oracle Applications.
305 Functional size ratio of business software package (%): A rough estimate of the ratio of the total functional size of the used business software package(s) to the functional size of the whole developed system. Valid if data item 302 has the value "a" (Yes).
306 Customization cost ratio of business software package (%): The ratio of the customization cost of the used business software package(s) to the total cost of the package(s). Valid if data item 302 has the value "a" (Yes).
307 Processing mode (*): The processing mode in which the developed system is used. (Choose one to three alternatives.)
308 Architecture (*): The type of architecture of the developed system. (Up to three types are selectable, from the largest size to smaller ones.)
309 Target platform (*): The primary operating system platform of the developed system. (Choose one to three alternatives.)
310 Use of Web technology (*): What kinds of Web technology did the project use? (Choose one to three alternatives.)
311 Online transaction processing system (*): The software used for online transaction processing. Free text: name for "Other".
312 Primary programming languages (1) to (5) (*): The programming languages primarily used. Free text: language for "Other". *1 Up to five languages are selectable, from the most frequently used to lesser-used ones. *2 Choose "w: Other" for unlisted languages such as CGI, Java applets, and EJB, and write the names of the languages.
313 Use of DBMS (*): What kind of DBMS did the project use? (Choose one to three alternatives.)

(4) Development Techniques
401 Development life cycle model (*): The development life cycle model. Free text: name for "Other".
402 Use of operation support tool (*): Did the project use an operation support tool? Free text: name for "Other".
403 Examined similar projects or not (*): Did the project examine one or more similar past projects in the planning phase? # Choose "b: No" if similar projects existed but the project did not examine them.
404 Use of project management tool (*): Did the project use a project management tool?
405 Use of configuration management tool (*): Did the project use a configuration management tool? # Example configuration management tools: ClearCase, CVS, Subversion, PVCS, SCCS, VSS.
406 Use of design support tool (*): Did the project use a design support tool?
407 Use of documentation tool (*): Did the project use a documentation tool?
408 Use of debug/testing support tool (*): Did the project use a debug/testing support tool?
409 Use of CASE tool (*): Did the project use an upstream or integrated CASE tool?
411 Use of code generator (*): Did the project use a code generator? * If the code generator used is an in-house tool and its name cannot be disclosed, write "In-house tool".
412 Application of development methods (*): The schematic development approach applied to the project. Free text: name for "Other".
413 Re-use rate_development planning document (%): re-used pages / total pages. (A computation sketch for the reuse-rate items follows this table.)
414 Re-use rate_requirements definition document (%): re-used pages / total pages.
415 Re-use rate_basic design document (%): re-used pages / total pages.
416 Re-use rate_detailed design document (%): re-used pages / total pages.
417 Reuse rate of source code (%): The ratio of the reused SLOC size of the source code to the total SLOC size of the source code.
418 Reuse rate of software components (%): The reuse ratio of reused software components, such as library components, in terms of functional size: the approximate ratio of the functional size of the reused software components to the total functional size of the developed system.
419 Reuse rate of test cases for integration test (%): re-used test cases / total test cases.
420 Reuse rate of test cases for system test (%): re-used test cases / total test cases.
421 Reuse rate of test cases for acceptance test (%): re-used test cases / total test cases.
422 Use of development frameworks (*): Did the project use a development framework? Examples: Struts, .NET Framework, JBoss, J2EE.
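The reuse-rate items above (413 to 421) are simple ratios of reused units to total units. As a minimal sketch, assuming nothing beyond the two counts the form asks for, the following Python fragment shows the calculation; the function name and the sample figures are illustrative only.

    def reuse_rate(reused, total):
        """Reuse rate in percent: re-used pages / total pages (items 413-416),
        re-used test cases / total test cases (items 419-421), and so on."""
        if total <= 0:
            raise ValueError("total must be positive")
        return 100.0 * reused / total

    # Illustrative numbers only: 120 of 400 basic design document pages were reused.
    print(round(reuse_rate(120, 400), 1))  # 30.0, entered as a percentage for item 415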

(5) User Requirement Management
501 Clearness of user requirements specifications (*): The degree of clearness the requirements specifications had at the beginning of the basic design phase.
502 User participation in user requirements specifications (*): The degree of user participation in the requirements specifications.
503 User expertise in computing (*): The level of user expertise in computer systems and system development.
504 User expertise in applied business (*): The level of user expertise in the applied business.
505 Clearness of user role and responsibility (*): How clearly were the role and responsibility of the user and those of the vendor defined?
506 User acknowledgment of requirements specifications (*): Did the user acknowledge the requirements specifications?
507 User comprehension of system design (*): The degree of user understanding of the system design.
508 User acknowledgment of system design (*): Did the user acknowledge the system design?
509 User participation in acceptance test (*): The degree of user participation in the acceptance test.
511 Members participating in requirements definition (persons): The number of key persons who defined the requirements.
512 Level of requirements (Reliability) (*): The level of reliability requirements in terms of the failure rate, recovery time, data recovery, and other factors.
513 Level of requirements (Usability) (*): The level of usability requirements in terms of the ease of learning the software, ease of learning its operation, ease of operation management, the sophistication of the graphical interface design, and other factors.
514 Level of requirements (Performance and efficiency) (*): The level of performance and efficiency requirements in terms of the response time, processing time, processing power, the utilization of system resources such as hard disks and memory, and other factors.
515 Level of requirements (Maintainability) (*): The level of maintainability requirements in terms of the ease of software correction, ease of fault locating, ease of fault identification, ease of software change, protection against possible troubles caused by software changes, the ease of verifying the validity of software corrections, and other factors.
516 Level of requirements (Portability) (*): The level of portability requirements in terms of the ease of adjustment to a new environment, ease of installation in the environment, ease of concurrent operation with other software components, ease of porting from other software, and other factors.
517 Level of requirements (Running cost) (*): The level of requirements in terms of the system running cost.
518 Level of requirements (Security) (*): The level of system security requirements.
519 Legal restrictions (*): Legal restrictions placed on the developed system.

Staff skills
601 PM skill (*): The skill level of project managers. Score the PM skill in accordance with the job "Project Management" of the IT Skill Standard Version 1.1.
602 Staff skill_application domain experience (*): The skill level of staff with respect to the application domain at which the developed system is aimed.
603 Staff skill_analysis and design experience (*): The skill level of staff with respect to system analysis and design.
604 Staff skill_programming language and software tool experience (*): The skill level of staff with respect to programming languages and software tools.
605 Staff skill_development platform experience (*): The skill level of staff with respect to the use of the development platform.

1012 General comment:
(1) The contract of this project (primary subcontract, secondary subcontract, in-house contract).
(2) If the system size is measured in SLOC, clarify in what kind of quantity the size was measured (number of lines, number of steps, number of physical lines, or number of logical lines).
(3) Write remarks such as the outsourced effort (converted from the amount of money ordered, or actual effort).

Size

(1) FP size
- Planned FP size (unadjusted), by phase: after system planning, after requirements definition, after basic design, after detailed design.
- Actual FP size: unadjusted, adjusted, and the adjustment factor.
- FP measurement method (*): name for "Other" method. Write the name of the method if it is a customized version.
- Purity of measurement method for actual FP size (*)
- FP measurement support technology (*)
- Inclusiveness of existing FP size

(2) FP size of upgraded part
* Enter the FP size of the existing system and the FP sizes of the added, changed, and/or deleted parts if "upgrade" is chosen for item 103.
- Per-part planned and actual FP size: existing FPs, added FPs, changed FPs, deleted FPs.

(3) SLOC size
Write SLOC sizes in SLOC (not in KSLOC).
- Planned SLOC size, by phase: after system planning, after requirements definition, after basic design, after detailed design.
- Actual SLOC size, with comment line inclusion (*), comment line ratio (*), blank line inclusion (*), and blank line ratio (*). (A measurement sketch follows this section.)
- Inclusiveness of existing SLOC size.
- Part-based SLOC size: planned and actual SLOC size of the existing, added/new, changed, and deleted parts.
- Per-programming-language actual SLOC size (top 5 languages): enter the actual SLOC sizes of the top five programming languages, each with its language name, comment line inclusion (*), comment line ratio (*), blank line inclusion (*), and blank line ratio (*). This table is independent of the part-based SLOC size table.
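The SLOC items above ask for the total size together with comment-line and blank-line ratios. The following Python sketch shows one plausible way to derive those three figures for a single source file; treating "#"-prefixed lines as comments is an assumption made for illustration, not a counting rule prescribed by the form.

    def sloc_summary(path):
        """Total lines, comment-line ratio, and blank-line ratio for one source file.
        Assumes '#'-style line comments; adjust to the language actually measured."""
        total = blank = comment = 0
        with open(path, encoding="utf-8") as f:
            for line in f:
                total += 1
                stripped = line.strip()
                if not stripped:
                    blank += 1
                elif stripped.startswith("#"):
                    comment += 1
        return {
            "total_lines": total,
            "comment_line_ratio_pct": 100.0 * comment / total if total else 0.0,
            "blank_line_ratio_pct": 100.0 * blank / total if total else 0.0,
        }

    # Example call (hypothetical path): print(sloc_summary("src/module.py"))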

(4) Detailed FP size (for IFPUG)
* If "IFPUG" is chosen for item 701, enter the number and the FP size of each basic FP element (EI, EO, EQ, ILF, and EIF) per degree of complexity (Large, Medium, Small), both planned and actual. (A computation sketch follows section (7) below.)
- Transactional functions:
  EI: FP = Large × 6 + Medium × 4 + Small × 3
  EO: FP = Large × 7 + Medium × 5 + Small × 4
  EQ: FP = Large × 6 + Medium × 4 + Small × 3
- Data functions:
  ILF: FP = Large × 15 + Medium × 10 + Small × 7
  EIF: FP = Large × 10 + Medium × 7 + Small × 5

(5) Detailed FP size (for methods other than IFPUG)
* If the FP measurement method used is of the type "Other" and is based on the NESMA indicative, NESMA estimated, or IFPUG method, enter the total number of transactional functions, the total number of data functions, and their total FP sizes (planned and actual).

(6) Detailed FP size (COSMIC-FFP)
* If the FP measurement method used is COSMIC-FFP, enter its detailed data: the number of triggering events, functional processes, and data groups, and the number of subprocesses of each type (Entry, Exit, Read, Write), together with the size in Cfsu.

(7) Other indices related to the size
- Design document volume: development planning, requirements definition, basic design, detailed design.
- DFD: number of data sets and processes.
- Number of database tables, GUI screen types, report formats, and batch processes.
- Use cases: number of use cases (Simple, Typical, Complex) and number of actors.
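To make the IFPUG weights of section (4) concrete, the short Python sketch below computes the unadjusted FP total from the Large/Medium/Small counts of each element type; the sample counts are invented for illustration and do not come from any real project.

    # IFPUG complexity weights as listed in section (4): (Large, Medium, Small)
    WEIGHTS = {
        "EI": (6, 4, 3),
        "EO": (7, 5, 4),
        "EQ": (6, 4, 3),
        "ILF": (15, 10, 7),
        "EIF": (10, 7, 5),
    }

    def unadjusted_fp(counts):
        """counts maps an element type to its (Large, Medium, Small) counts."""
        total = 0
        for element, (large, medium, small) in counts.items():
            w_large, w_medium, w_small = WEIGHTS[element]
            total += large * w_large + medium * w_medium + small * w_small
        return total

    # Illustrative counts only.
    example = {"EI": (2, 5, 10), "EO": (0, 4, 6), "EQ": (0, 2, 3),
               "ILF": (1, 3, 0), "EIF": (0, 1, 2)}
    print(unadjusted_fp(example))  # 185 unadjusted FP for these counts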

Effort, development schedule, number of staff

- Unit of effort (*)
- Conversion ratio from person-months to person-hours: if the effort in the project data is measured in person-months, enter the number of hours one person works per month at a working rate of 100%. Otherwise, enter "1" here. (A worked conversion example follows this section.)
- Planned development effort [person-hours]: at the beginning of basic design and at the beginning of detailed design.
- Phases covered by the effort table: Planning, Requirements Definition, Basic Design, Detailed Design, Implementation, Integration Test, System Test, Acceptance Test, Out-of-Category, Whole Project.
- Origination of actual tasks (*)
- Origination of requirements specifications changes (*)
- Development schedule (*1): planned and actual beginning date and completion date, or the number of months; offered data (reference).
- Idling duration (*2): beginning date and completion date, or the number of months.
- In-house effort [person-hours], per phase: Development, Management (*3), Other (*4), Out-of-category (*5), and the in-house subtotal; offered data (reference) in hours.
- Review effort (in-house) [person-hours]: effort, number of reviews (times), and number of issues identified.
- Outsourced effort [person-hours]: Development, origination of tasks, and expenditure ratio (%).
- Total effort (in-house + outsourced) [person-hours], per phase; offered data (reference) in hours.
- Number of staff: in-house average and peak, outsourced average and peak.

(*1) Enter the beginning date and the completion date, or the number of months (to one decimal place), in the development schedule cells. You can enter both.
(*2) The duration in which the project remained idle (for example, waiting for a signature from the customer or for the arrival of test data). The active duration of the project is obtained by subtracting the idling duration from the whole project development schedule.
(*3) If the project management effort was collected separately, enter its amount.
(*4) If the project has actual effort that falls into neither development effort nor management effort, enter that effort here (for example, effort for infrastructure-building, operation environment preparation, system migration, operation support, or consulting).
(*5) Enter the effort that does not fall into any defined category.

Quality and reliability

- Definition of test case count
- Definition of software bug count
- Personnel assignment to test tasks (*)
- Personnel assignment to quality assurance tasks (*)
- Existence of quantitative delivery quality standard (*): if you choose Yes, write a description.
- Existence of third-party reviews (*)
- Test cases and identified defects, for Integration Test, System Test, and Follow-up (operation) at 1 month, 3 months, 6 months, and 12 months: number of test cases and identified defects (failures and faults), each broken down into Very critical, Critical, and Insignificant, with totals (*1).

Grades of criticality:
- Very critical: the defect causes damage to the customer, and quick countermeasures have to be taken.
- Critical: the defect causes no damage to the customer, but quick countermeasures have to be taken.
- Insignificant: the defect causes no damage to the customer, and there is no need for quick countermeasures.
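As a worked example of the person-month to person-hour conversion described above, the Python fragment below applies the entered conversion ratio to a recorded effort figure; the 160 hours-per-month value is purely illustrative and not a ratio prescribed by the form.

    def to_person_hours(effort, unit, hours_per_month=1.0):
        """Convert a recorded effort figure to person-hours.
        unit is "person-months" or "person-hours"; hours_per_month is the
        conversion ratio entered in the form (1 when effort is already in hours)."""
        if unit == "person-months":
            return effort * hours_per_month
        return effort

    # Illustrative only: 12.5 person-months at an assumed 160 working hours per month.
    print(to_person_hours(12.5, "person-months", hours_per_month=160))  # 2000.0 person-hours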