
New criteria for assessing a technological design

Kees van Hee and Kees van Overveld
April 2012

1. Introduction

In 2010 we developed a set of criteria for the evaluation of technological design projects of the PDEng programmes of the Stan Ackermans Institute (SAI). We have separated the evaluation of the final outcome, the technological design (see [1]), from the evaluation of the process that has led to the outcome (see [2]). The main reason for this separation is that the project is teamwork and that the supervisors may have a substantial influence on the outcome, because they are responsible for an acceptable final result. So in theory the outcome of a design project may be very good while the contribution of the trainee is poor, and conversely the outcome may not be very exciting while the contribution of the trainee was splendid.

The criteria documented in [1] are based on 9 aspects of a design, measured by 27 indicators. Each indicator has an operational scale, which means there is an objective method to determine its value. This is important if one wants to compare the quality of different technological designs.

In the past academic year we have experimented with the criteria in all programmes of SAI. Afterwards we evaluated the applicability of the criteria. The following observations were made:

1. The evaluation takes too much time, because there are too many indicators and some indicators are too difficult to apply (e.g. "complexity" and "economic value").
2. Some indicators are not applicable in every situation, so the evaluators have to decide which indicators are relevant and which are not. This is confusing.
3. Some indicators should be relative with respect to the (given) problem.
4. There was no recipe to arrive at a final judgment.

In some programmes a short course was given to trainees before they started their project. This turned out to be very useful, because the trainees started by thinking about the value of their contributions. It was a pity that the evaluators did not take the course, which might have helped them to overcome some of the difficulties that arose.

Based on these experiences we have redesigned the evaluation of technological designs. We have made the following changes:

1. We distinguish five aspects (instead of the earlier 9) that cover only 12 criteria (instead of the earlier 27 indicators).
2. Each criterion is valued on a 5-point ordinal subjective scale. We offer one interpretation of the scales, but the evaluators have the freedom to choose another one, depending on the type of project.
3. The criteria (called indicators in the earlier report [1]) are only offered as an aid to compute an aspect's value in case the members of the evaluation team cannot agree.
4. We offer a simple recipe to determine a final mark.

2. Technological designs

In [1] we give an extensive analysis of the merits and the objectives of the SAI programmes. We summarize some of it here and place the programmes in the international context. The programmes of SAI fit in the third cycle of the Bologna declaration ([3]). This means that the trainees are expected to deliver a scientific or technological contribution to society. In the case of engineering programmes it is important to determine what such a contribution should be. In order to do so, we cite the following definition of engineering of the American Engineering Council of Professional Development (ECPD/ABET) (see [4]): "The creative application of scientific principles to design or develop structures, machines, apparatus, manufacturing processes or works." This means that the solution of an engineering problem is an artifact, i.e. a man-made system that is either a tangible or an intangible product, or a process. Artifacts as we see them serve an economic or societal purpose, which means they have a value. The artifacts we consider are designed according to scientific principles, which means that there should be a systematic method for synthesizing the design and that a design is evaluated using scientifically based analysis methods.

In our programmes we consider the technological design of an artifact as the outcome of the project. A technological design of an artifact can occur in various modalities. One is an abstract representation of the artifact in the form of a symbolic model, which can be an informal model, a mathematical model or a computer model. Another modality is a physical model in the form of a prototype. In most cases a technological design consists of more than one model, each describing a facet of the artifact. A technological design serves the following purposes:

- Communication between the stakeholders about the artifact;
- Documentation of the artifact, for instance for instructing users, installation, maintenance and future modifications;
- Analysis of the artifact. The analysis relates to the behavior of the artifact in a certain context. We distinguish formal analysis and experimental analysis;
- Construction of the artifact, which means that the design serves as a blueprint.

We distinguish three kinds of projects: (1) a technological design of a complete artifact, (2) a design of a component of a larger artifact, (3) a redesign or a reconfiguration of an existing artifact. In each project the emphasis can be on different phases of a design: the focus can be on the requirements, on the modeling or on the analysis of the artifact.

3. Evaluation of technological designs

To evaluate technological designs we distinguish 5 aspects or views:

1. Functionality. Answering the question: what is the artifact doing in its environment?
2. Construction. Answering the question: how will the artifact do this?
3. Realizability. Answering the question: how can the artifact be realized?
4. Impact. Answering the question: what are the risks and benefits of the artifact for its environment?
5. Presentation. Answering the question: what does the artifact look like in all its details?

In total we distinguish 12 criteria to evaluate an artifact. They are grouped per aspect. For each criterion we give a 5-point scale varying from 1 to 5. The idea is that the members of an evaluation team express their judgment as a value on this scale. When the values for one criterion differ by at least two points among the team members, we recommend a detailed analysis of the different judgments rather than merely averaging the individual members' scores (see the sketch below). For this analysis we offer a set of quantitative indicators to make an objective judgment.

Some criteria can be interpreted in different ways. For each criterion we suggest one interpretation, but the evaluation team can choose its own in a concrete situation. There are two ways to define scales. One interpretation is the justification, or the quality of the argumentation; another interpretation is the value of the solution. For example, one could have found a very profitable solution but have no arguments that it is the best there is. On the other hand, one could have found a straightforward solution for which a proof is given that no better solution is available. It is up to the evaluation team to choose an interpretation; however, the choice should be documented. (This has to be incorporated in the examination rules.)
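To make this screening step concrete, here is a minimal sketch in Python (the report prescribes no tooling; the function name, data layout and output are illustrative assumptions, while the 1-5 scale and the two-point disagreement threshold come from the text above):

    # Sketch: average a criterion's team scores when the team roughly agrees,
    # and flag the criterion for detailed analysis when scores spread >= 2 points.
    def screen_scores(team_scores: dict[str, list[int]], threshold: int = 2):
        """team_scores maps a criterion name to the 1-5 scores of all evaluators."""
        agreed, disputed = {}, []
        for criterion, scores in team_scores.items():
            if max(scores) - min(scores) >= threshold:
                disputed.append(criterion)  # discuss, e.g. using the indicators of [1]
            else:
                agreed[criterion] = sum(scores) / len(scores)
        return agreed, disputed

    agreed, disputed = screen_scores({
        "Satisfaction": [4, 4, 5],  # spread 1: safe to average
        "Inventivity": [2, 4, 3],   # spread 2: analyze before scoring
    })
    print(agreed)    # {'Satisfaction': 4.333...}
    print(disputed)  # ['Inventivity']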

4. The criteria

The criteria are grouped per aspect. They all have a 5-point scale.

1. Functionality

a. Satisfaction. This concerns the extent to which the designed artifact satisfies the requirements. Often the formal requirements develop during the project, based on merely informal initial requirements. In case the requirements are relatively easy to meet, the evaluation team will be stricter in weighing the discrepancies than in case the requirements are very difficult to meet. So in a way the evaluation team will judge the satisfaction relative to the difficulty of the problem.
Scale: 1 = Poor fit to the requirements; 2 = Insufficient fit to the requirements; 3 = More or less meets the requirements; 4 = Meets the requirements; 5 = Exceeds the requirements.

b. Ease of use. This concerns the ease of use for the stakeholders. The stakeholders are e.g. end users, operators, and the engineers responsible for installation and maintenance of the artifact.
Scale: 1 = Very difficult; 2 = Difficult; 3 = Acceptable; 4 = Easy; 5 = Very easy.

c. Reusability. The extent to which the artifact can be used in other situations. We distinguish the notions of scale, context and (application) domain; in different disciplines these notions may have different meanings.
Scale: 1 = No reuse; 2 = In same context, same scale; 3 = In same context, different scale; 4 = In different context, same domain; 5 = In different domains.

2. Construction

a. Structuring. This concerns the partitioning of the artifact into logical or physical components. Structuring may use hierarchy, which means that subsystems can be considered as components themselves. The structuring is often called the architecture of an artifact. Structuring is important for understanding the construction of an artifact, and it is used for instance for manufacturing and maintenance. The structuring has 4 elements: (1) overview, with or without hierarchy, (2) a low degree of coupling between components, (3) high cohesion within components, (4) clear interfaces. The score counts how many of these elements are present, as sketched below.
Scale: 1 = None; 2 = 1 out of 4; 3 = 2 out of 4; 4 = 3 out of 4; 5 = All 4.
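Because the Structuring score is simply a count of the four elements named above, it maps directly onto the 1-5 scale. The following sketch (again Python; purely illustrative, not part of the evaluation procedure) makes the mapping explicit:

    # Sketch: the Structuring score is the number of present elements plus one,
    # so 0 elements -> 1 (None) and 4 elements -> 5 (All 4).
    def structuring_score(overview: bool, low_coupling: bool,
                          high_cohesion: bool, clear_interfaces: bool) -> int:
        present = sum([overview, low_coupling, high_cohesion, clear_interfaces])
        return present + 1

    print(structuring_score(True, True, False, True))  # 3 out of 4 -> score 4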

b. Inventivity. The measure of originality. One way to express this is by the surprise factor.
Scale: 1 = No surprise at all; 2 = Surprise for laymen; 3 = Surprise for peers; 4 = Surprise for professionals; 5 = Surprise for supervisors.

c. Convincingness. This concerns the evidence that the construction will work and has the defined functionality. Here we distinguish several forms of proof. An empirical proof is a statistical argument based on either simulations or experimentation with a prototype.
Scale: 1 = No proof; 2 = Informal proof; 3 = Empirical proof based on simulation; 4 = Empirical proof based on a prototype; 5 = Formal and empirical proof.

3. Realizability

a. Technical realizability. This concerns the certainty that it is technically possible to produce the artifact.
Scale: 1 = Unknown if it can be produced; 2 = Informal arguments; 3 = Model-based analysis; 4 = Prototype is realized; 5 = 0-series is produced.

b. Economic realizability. This concerns the business case for the artifact. A business case can be scored in two ways: the analysis is convincing, or the outcome is such that it is easy to convince stakeholders to invest in it. The following scale combines the two.
Scale: 1 = No business case; 2 = Accurate estimate of costs; 3 = Accurate estimates of costs and revenues; 4 = A well-substantiated financing plan; 5 = Business case committed by stakeholders.

4. Impact

a. Societal. This concerns the influence the artifact will have on societal values such as sustainability or health and well-being.
Scale: 1 = Negative; 2 = None; 3 = Low positive; 4 = Moderate positive; 5 = High positive.

b. Risks. This may concern either the risks during the development of the artifact or the risks related to the use of the artifact. Both the analysis of the risks and the measures for their mitigation are important.
Scale: 1 = Risks not analyzed; 2 = Risks informally analyzed; 3 = Risks scientifically analyzed; 4 = Risk mitigation measures taken; 5 = Risks scientifically analyzed and adequately mitigated.

5. Presentation

The presentation includes the documentation of the artifact, but it may also concern a prototype or an animation.

a. Completeness.
Scale: 1 = Very poor; 2 = Poor; 3 = Marginal; 4 = Good; 5 = Very good.

b. Correctness.
Scale: 1 = Unreliable presentation; 2 = Many errors found; 3 = Acceptable number of errors; 4 = Few errors found; 5 = No errors found.

5. Final mark

Six of the 12 criteria concern the kernel of the artifact, namely the functionality and the construction, while the other six cover the remaining aspects. To reach a final judgment, the scores on the 12 criteria need to be aggregated. This can be done by means of multiplicative weights, which makes it possible to express differences in relative importance among the criteria. By default, all weights could be taken equal; alternatively, the weights for the criteria 1a to 2c could be made bigger than those for 3a to 5b. Other motivated choices can also be appropriate. In case multiplicative weights are used, the motivation should be documented. A computational sketch of this recipe follows below.
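As a concrete reading of the recipe, the following sketch in Python computes a weighted final mark from the 12 criterion scores. The example scores and the example weights favouring the six kernel criteria are illustrative assumptions; the report prescribes neither.

    # Sketch: aggregate the 12 criterion scores (1-5) into a final mark
    # using multiplicative weights, then normalize back to the 1-5 scale.
    scores = {
        "Satisfaction": 4, "Ease of use": 3, "Reusability": 4,      # Functionality
        "Structuring": 5, "Inventivity": 3, "Convincingness": 4,    # Construction
        "Technical realizability": 4, "Economic realizability": 3,  # Realizability
        "Societal": 3, "Risks": 4,                                  # Impact
        "Completeness": 4, "Correctness": 5,                        # Presentation
    }
    KERNEL = {"Satisfaction", "Ease of use", "Reusability",
              "Structuring", "Inventivity", "Convincingness"}
    # Example weighting: kernel criteria (1a-2c) count twice as much as the rest.
    weights = {name: (2.0 if name in KERNEL else 1.0) for name in scores}

    final_mark = sum(weights[c] * scores[c] for c in scores) / sum(weights.values())
    print(f"Final mark: {final_mark:.2f}")  # Final mark: 3.83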

6. References

1. Kees van Hee and Kees van Overveld. Criteria for assessing a technological design. SAI report, 2010.
2. Criteria for assessing the design process of a technological design. SAI report, 2010.
3. Bologna declaration. http://eacea.ec.europa.eu/education/eurydice/documents/thematic_reports/122en.pdf
4. Engineering Council of Professional Development. Canons of ethics for engineers. New York, 1947.

Appendix - Evaluation Form: Aspects of the Design*

Project name:
Designer:
Company:
Company supervisor:
University supervisor:

Scale: fail, poor, fair, good, excellent.

Aspect        | Criterion               | Value per criterion | Weight of criterion | Score of aspect
Functionality | Satisfaction            |                     |                     |
              | Ease of use             |                     |                     |
              | Reusability             |                     |                     |
Construction  | Structuring             |                     |                     |
              | Inventivity             |                     |                     |
              | Convincingness          |                     |                     |
Realizability | Technical realizability |                     |                     |
              | Economic realizability  |                     |                     |
Impact        | Societal                |                     |                     |
              | Risks                   |                     |                     |
Presentation  | Completeness            |                     |                     |
              | Correctness             |                     |                     |
Total score   |                         |                     |                     |

* Please use the scale descriptions on the following page when filling out the evaluation form.

Aspects for assessing a technological design

1. Functionality

Satisfaction: Fail = Poor fit to the requirements; Poor = Insufficient fit to the requirements; Fair = More or less meets the requirements; Good = Meets the requirements; Excellent = Exceeds the requirements.
Ease of use: Fail = Very difficult; Poor = Difficult; Fair = Acceptable; Good = Easy; Excellent = Very easy.
Reusability: Fail = No reuse; Poor = In same context, same scale; Fair = In same context, different scale; Good = In different context, same domain; Excellent = In different domains.

2. Construction

Structuring: Fail = None; Poor = 1 out of 4; Fair = 2 out of 4; Good = 3 out of 4; Excellent = All 4.
Inventivity: Fail = No surprise at all; Poor = Surprise for laymen; Fair = Surprise for peers; Good = Surprise for professionals; Excellent = Surprise for supervisors.
Convincingness: Fail = No proof; Poor = Informal proof; Fair = Empirical proof based on simulation; Good = Empirical proof based on a prototype; Excellent = Formal and empirical proof.

3. Realizability

Technical realizability: Fail = Unknown if it can be produced; Poor = Informal arguments; Fair = Model-based analysis; Good = Prototype is realized; Excellent = 0-series is produced.
Economic realizability: Fail = No business case; Poor = Accurate estimate of costs; Fair = Accurate estimates of costs and revenues; Good = A well-substantiated financing plan; Excellent = Business case committed by stakeholders.

4. Impact

Societal: Fail = Negative; Poor = None; Fair = Low positive; Good = Moderate positive; Excellent = High positive.
Risks: Fail = Risks not analyzed; Poor = Risks informally analyzed; Fair = Risks scientifically analyzed; Good = Risk mitigation measures taken; Excellent = Risks scientifically analyzed and adequately mitigated.

5. Presentation

Completeness: Fail = Very poor; Poor = Poor; Fair = Marginal; Good = Good; Excellent = Very good.
Correctness: Fail = Unreliable presentation; Poor = Many errors found; Fair = Acceptable number of errors; Good = Few errors found; Excellent = No errors found.
