Management Methods for Reducing Span Time during New Product Development




Proceedings of the 2011 Industrial Engineering Research Conference
T. Doolen and E. Van Aken, eds.

Management Methods for Reducing Span Time during New Product Development

Samuel Suss, Onur Hisarciklilar, Vince Thomson
Mechanical Engineering Department, McGill University
817 Sherbrooke Street West, Montreal, Canada H3A 2K6

Abstract
Shorter span time for new product development (NPD) has become a measure of success in the global marketplace. All of the present techniques to improve NPD (concurrent engineering, dedicated design teams, enhanced computer tools, etc.) require excellent communication. A model of NPD was built to investigate the parameters that affect communication and how they impact design project effort and span time. The model considered the effect of coordination, in the form of task decomposition, allocation of resources, and information exchange methods, upon information flow at the design team, system integration and project levels. The model was validated against present practices in aerospace companies. Results from simulations give insight into the impact of these coordination mechanisms on the span time and effort of NPD projects under various levels of uncertainty. The effects of coordination mechanisms at the project level were also tested. In particular, the use of the coordination principles of the sprint method (Scrum), widely employed in the software industry, during product design showed significant span time reduction for projects with moderate to high uncertainty. Other coordination mechanisms, such as the use of virtual data models during NPD, were also studied and showed great promise for span time reduction.

Keywords
Product development, coordination, engineering design

1. Introduction
New product development (NPD) is a vitally important part of the product lifecycle, consuming a large proportion of the overall time for bringing a product to market and setting about 70% of product cost [1].
The implementation of engineering design tools, concurrent engineering practices and product data management systems has contributed to reduced NPD cycle times in recent years. However, in large NPD projects where hundreds of engineers work to develop complex products, there remain significant inefficiencies. The amount of waste in aerospace and defence product development programs is estimated at 60-90% of the charged time, with about 60% of all tasks being idle at any given time. The actual time engineers spend on value-added activities is much less than half of the total working time. Much efficiency is lost through wasted communication, waiting for information, and lack of coordination [2].

Planning and managing an NPD project to minimize span time requires a detailed understanding of the mechanisms that drive the process, how they interact, and how sensitive project performance characteristics are to each of them. To this end, we performed simulations using a model [3, 4] to study span time and effort for various sets of conditions. Section 2 describes the model. Simulation results follow in Section 3, with conclusions in Section 4.

2. Description of the Model
New product development is an undertaking composed of the various multi-functional activities done between defining a technology or market opportunity and starting production. The goal of NPD is to create the recipe for producing a product [5]. Thus, the output of product development is not products, which are physical objects, but rather information. Although flawlessly doing the same thing twice in a row in manufacturing is a success, flawlessly doing the same thing twice in a row in NPD is a failure, since duplicating a recipe adds no value. Although many business processes seek an identical result repeatedly, NPD seeks to do something new, once. NPD involves creativity and innovation, which are nonlinear and iterative [6].
2.1 Features of NPD incorporated in the model
The design process within NPD can be viewed as a system of interrelated activities which are performed to increase knowledge or to reduce uncertainty about a design solution [7]. Product design is typically decomposed into a hierarchy of subtasks. This is done to cope with the complexity of the design task and to gain the benefits of specialization. Many subtasks have dependencies between them which require management or coordination. These dependencies are primarily information that is generated within some subtasks and required by others; therefore, the performance of the entire design task depends on the progress of individual subtasks and the exchange of information between them. If the coordination of information is carried out efficiently and effectively, design projects have better results in terms of span time and effort.

In order to find how best to coordinate the flow of information as the subtasks are being carried out, the structure and synchronization of the information must be analyzed. In this paper, we consider this information flow to consist of the following types of information: design, which can be specified directly, e.g., materials and geometry; performance, which is a consequence of design information, e.g., fatigue life and weight; and requirements, which constrain design or performance information, such as the geometry needed to meet overall product performance. This follows the approach of others who have worked on the analysis of information exchange in design processes [8, 9].

One of the most prominent features of a design activity is its iterative character. The majority of development time and cost has been shown to be spent on iterative activity [10]. Here, we consider the possibility of iterations that occur when a task is being performed where not all of the required input information is available at the outset. The work begins with partial information, which is uncertain and is updated periodically as the process unfolds. If two tasks are reciprocally dependent, the information received by the first task is used to generate new information required by the second task, and vice versa. This type of iteration is classified as progressive or incremental.
Unnecessary iteration occurs when a task has gone too far with the partial or preliminary information it has received, and upon receiving further information the task must rework some of what it has already done. Iteration that comes about as a result of a design review is also taken into account. Here, we reason that as tasks are completed and results reviewed, there is a probability that a task must be reworked in its entirety. This likelihood of design version rework diminishes with the completeness of the information exchange that has taken place with the interdependent tasks in the process. The likelihood of design version rework also diminishes when the uncertainty of the information developed by the end of the phase is sufficiently reduced. The iteration brought about by design version rework is classified as feedback or repetition.

Another prominent feature of design activity is its uncertainty. One of the main distinctions regarding uncertainty in engineering design is between lack-of-knowledge (epistemic) and random (aleatory) uncertainty [11, 12]. Some epistemic uncertainty is reducible, for instance, via further studies, measurements or expert consultation. Random uncertainty describes the inherent variation associated with a system or environment. In design, aleatory uncertainty is often manifested as changes in design information brought about by events beyond the control of an individual designer, such as changes to product requirements or to estimates of performance parameters. This aleatory uncertainty cannot be reduced through more information or knowledge acquisition during the design process. Wood et al. [13] propose that as a design progresses the level of epistemic uncertainty is reduced, whereas a degree of stochastic uncertainty usually remains.
Product development also includes system-level oversight tasks where information about the various activities being carried out on subsystems is scrutinized and tested to ensure that the subsystems being developed function correctly together as a system [14]. The information examined by the system-level team is often sent to other development teams for use in their own tasks. In this way, the interdependencies between design activities are coordinated. Schedule considerations also constrain the product development process when decisions are made to end the work on design tasks. This occurs if the work performed has reached a certain quality, regardless of whether all of the interdependencies have been fully exploited to achieve an optimum design. This trade-off of span time versus nonconformance failure at a design review is often used as a tactic to speed up the development of new products.

2.2 Modeling the features of NPD and their interactions
The informational dependency relationship between any pair of tasks can be quantified and compiled as a numerical dependency structure matrix D, which is the product of the initial estimated uncertainty and sensitivity between each pair of tasks [15]. The model sets the total amount of information that must be communicated between interdependent tasks as being directly proportional to their dependency strength Dij. Empirical results supporting this assumption were published by Allen [16]. We can therefore form an information exchange matrix NC which is directly proportional to the dependency matrix D.

Execution of each development task is modeled as being composed of three subtasks: work, which consists of the technical, production work entailed in the task; read, which consists of reading and interpreting all incoming information; and prep, which consists of preparing the information generated by executing the task into communicable form (i.e., preparing reports, summarizing data, etc.), as well as coordinating the actual communication.
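As a rough sketch of the proportionality between dependency strength and communication load described above, the information exchange matrix NC can be formed by scaling the dependency matrix D. The proportionality constant `units_per_strength` and the example matrix below are illustrative assumptions, not values from the paper.

```python
def information_exchange_matrix(D, units_per_strength=10):
    # NC[i][j]: units of information that task i must send to task j,
    # taken as directly proportional to dependency strength D[i][j]
    # (element values 0 to 9, diagonal zero, as in Table 1).
    return [[units_per_strength * d for d in row] for row in D]

# Hypothetical three-task project.
D = [[0, 3, 1],
     [2, 0, 4],
     [0, 5, 0]]
NC = information_exchange_matrix(D)
total_units = sum(map(sum, NC))  # total communication load across the project
```

Column sums of NC then give the total number of information units addressed to each task, which drives the read-subtask load in the model.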
We consider that these subtasks cannot be performed simultaneously and that each requires the exclusive use of the development team while it is being performed (modeled as a finite resource). We do not require, however, that the entire work subtask be done to completion each time it is begun; rather, it can be performed as a series of steps between which communication subtasks are performed. In a particular instance, the amount of effort required by an individual development team (in man-hours) to perform its specific work subtask can be determined by a triangular probability density function (PDF) of given minimum, median and maximum values. Similarly, the amount of effort required by this team to perform a read or prep subtask on a unit amount of information can also be determined from a triangular PDF for each respective subtask.

Information is generated by the work subtask in discrete units, and is gathered and prepared in batches in the prep subtask for communication to other development teams. The communication interval is one of the input parameters whose effects on project span time were studied. Each unit of information is addressed randomly to a task in the system, but the total number of units of information addressed to each task is equal to the corresponding element in NC. Dispatched units of information are processed in the addressee's read subtask.

Information is generated in each development task in proportion to the amount of effort expended in its work subtask. However, the precision of this information increases only asymptotically, according to the way in which the epistemic uncertainty reduces. The epistemic uncertainty of each discrete information entity developed in a task is a function of the current state of progress of the task. The aleatory uncertainty of each discrete information entity is a random number proportional to the epistemic uncertainty at the time of its creation and a scaling factor.
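The triangular-PDF effort draws described above map directly onto Python's standard library. The parameter values below are hypothetical, and the middle parameter is treated as the most likely (mode) value.

```python
import random

def subtask_effort(minimum, mode, maximum):
    # Draw the effort (man-hours) for one execution of a work, read or
    # prep subtask from a triangular PDF; in the model, rows of the
    # WK, RD and PREP matrices supply the three parameters per task.
    return random.triangular(minimum, maximum, mode)

random.seed(1)
# Hypothetical work subtask: 80 h minimum, 120 h most likely, 200 h maximum.
draws = [subtask_effort(80, 120, 200) for _ in range(10_000)]
mean_effort = sum(draws) / len(draws)  # close to (80 + 120 + 200) / 3, about 133 h
```

Note that `random.triangular` takes its arguments in the order (low, high, mode), which the wrapper above hides behind the more natural (minimum, mode, maximum) ordering.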
Rework occurs in a work subtask if the work has progressed too far with the information previously received. This occurs if the uncertainty of input information received from any task before a new work period is greater than the uncertainty of input information received from that task in a prior work period. If work in a task has progressed too far beyond the percentage of input information received, further work on the task is suspended until more input information has been received. However, depending on the perceived uncertainty of the information previously received, a decision may be made to continue after a certain period of time regardless. This is based on observations in collaborative projects, where schedule constraints force a manager to decide to risk continuing work on a task in order to allow the project to move forward. The missing information is estimated based on previous information received and the perceived imprecision of this previous information.

Progress in design refers to a process that achieves a succession of improvements or refinements on the way toward the final outcome [17], or to a process in which the level of uncertainty in the artefact is reduced as the design progresses [18]. This reasoning was incorporated into the way in which tasks in the model make progress through technical and communication work done during a simulation. The information exchanged between interdependent tasks is given uncertainty attributes that can lead to rework, where these uncertainty attributes are based on the state of the task that sent the information. The ability of the simulated tasks to complete work while exchanging the information needed to continue to make progress is what drives the tasks to their final outcome. An S-shape appears to be the best function to represent task progress, as it can embody a growth phenomenon similar to learning [19].
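A minimal sketch of the S-shaped progress curve and its uncertainty counterpart, using a Gompertz form with two shape coefficients; the specific coefficient values and scaling factor below are illustrative assumptions, not calibrated values from the model.

```python
import math
import random

def gompertz_progress(t, b, c):
    # S-shaped work progress on a normalized task state t in [0, 1];
    # b and c control how fast the curve approaches its lower and
    # upper asymptotes at activity start and end.
    return math.exp(-b * math.exp(-c * t))

def epistemic_uncertainty(t, b, c):
    # Uncertainty reduction curve: upper asymptote (1) minus progress.
    return 1.0 - gompertz_progress(t, b, c)

def aleatory_uncertainty(t, b, c, m):
    # Random component proportional to the current epistemic uncertainty
    # and a scaling factor m (the M vector of Table 1).
    return m * random.random() * epistemic_uncertainty(t, b, c)

# With b=5, c=8: uncertainty starts near 1 and decays along an S-curve.
u_start = epistemic_uncertainty(0.0, 5, 8)
u_end = epistemic_uncertainty(1.0, 5, 8)
```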
This was adapted in the model to calculate the reduction in epistemic uncertainty as a function of task state with an S-shape function (given that the work progress curve can be transformed into an uncertainty reduction curve by subtracting the S-shape function from its upper asymptote). The model makes use of a Gompertz equation as the S-shape function because of its flexibility in allowing different rates of approach to the upper and lower asymptotes at activity start and end. Two coefficients govern the shape of the Gompertz curve and can be set to values that cover a range of possible epistemic uncertainty reduction behaviour. The aleatory uncertainty is calculated for any task state with the use of a random number generator and a scaling factor. Further details can be found in [3, 4]. Using discrete event simulation [20], the logic of the model was executed to produce the process behavior for each set of conditions examined.

2.3 Validation of the model
Ideally, the best way to validate a model of an NPD process would be to compare data from a comprehensive series of actual processes with data from simulations of the same processes. In practice, however, this has not yet been feasible because of the long cycle time of complex PD processes and the difficulty in obtaining sufficiently detailed historical data. Validation of the model to date has been based on comparing model results with the predictions of other models and with the data from real processes that are available for several cases, and on checking that the specific behaviors incorporated in the model's logic match those that have been reported to take place in NPD. Further validations of results are being performed as data becomes available. The results shown here serve to illustrate the operation of the model and how it can explain the mechanisms that drive the NPD process.

3. Scenarios investigated with the model
The following table describes the parameters used as input to the model to create each scenario. In scenarios where comparisons of results are made, we maintained the total work requirement in the NPD project (as calculated by summing the average of the triangular PDF of each task across all phases) as a constant.

Table 1: Input parameters defining scenarios for the NPD model

CI - Vector defining the nominal communication interval (hours) for each task
BR - Vector of flags indicating whether each task broadcasts information to all other tasks
DeltaT - Vector defining Δt (hours), the maximum work cycle time between which communications are attended to by each development team
D - Matrix defining the dependency strength between each pair of tasks, with element values of 0 to 9
INTMAX - System-level integrator capacity
IQ - Integrator queue time threshold (hours) at which an additional integrator resource is added
MINT, SINT - Mean value and standard deviation of the normal PDF for the integrator process
LAT - Matrix defining the mean value and standard deviation of the PDF for communication latency
M - Vector defining the scaling factor for aleatory uncertainty for each task
MWI - Vector defining the number of hours each task waits for new information while in the starve condition before another cycle of work is done
OVF - Matrix defining the degree of overlap between each pair of tasks in a project
PREP - Matrix defining the parameters of the triangular PDF for the prep subtask of each task
RD - Matrix defining the parameters of the triangular PDF for the read subtask of each task
WK - Matrix defining the parameters of the triangular PDF for the work subtask of each task
B, C - Vectors defining the Gompertz function values for the epistemic uncertainty of each task
NPT - Number of tasks in each phase
TP - Number of phases in the project
QMR - Vector defining the minimum number of entities in the read queue before attending to this queue
QMP - Vector defining the minimum number of entities in the prep queue before attending to this queue
SCH - Vector defining the scheduled span time for each task

3.1 Number of tasks in the project decomposition and the relationship to integrator resource management
Figure 1 shows simulation results for the normalized span time of a project with 2 phases that have been split into NPT development tasks. The total amount of work to be performed in each case is constant and is used to normalize the results for project span time. In Figure 1, the tasks are highly interdependent, so that each task requires information from every other task, and the tasks all begin simultaneously. In the case shown, the reduction in epistemic uncertainty is slow and there is zero aleatory uncertainty. In this series of simulations, the PDFs for the work subtasks were equal. The results show that the initial reduction in span time is linear with an increase in the number of tasks, but levels off as the delays brought about by information exchange and system-level integration with a higher number of tasks come into play. As can be seen, further increases in the number of tasks have less effect on the span time. Insufficient integrator resource capacity causes the span time and effort to rise sharply because tasks are not able to complete due to the information flow bottleneck. Simulation shows that the information flow bottleneck causes an increase in design version rework. Even with sufficient resource capacity, if the resources are not deployed early enough, the consequent delays to the project result in increased effort and span time with fewer tasks in the decomposition. This is controlled in the simulation by the parameter IQ, which is the threshold that triggers the addition of an integrator resource.
In the simulations we examined, keeping the value of IQ below 5% of the nominal time required to perform a task ensured that the integrator queue time did not impede the completion of tasks. Thus, relationships for the required integrator capacity to avoid bottlenecks can be found for specific projects with estimated uncertainty reduction profiles.
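The integrator-capacity behaviour in this section can be caricatured with a few lines of queueing logic. This toy event loop is an illustrative assumption, not the paper's model: packets of information queue for system-level integrators, and a packet whose queueing delay exceeds the IQ threshold triggers the deployment of one more integrator.

```python
import heapq

def integrator_queue_sim(arrivals, service_time, capacity, iq_threshold):
    # free_at holds the next-free time of each integrator (a min-heap).
    free_at = [0.0] * capacity
    heapq.heapify(free_at)
    max_wait = 0.0
    for t in arrivals:                    # sorted arrival times of packets
        start = max(t, heapq.heappop(free_at))
        wait = start - t
        max_wait = max(max_wait, wait)
        if wait > iq_threshold:           # bottleneck detected:
            heapq.heappush(free_at, t)    # deploy one more integrator
        heapq.heappush(free_at, start + service_time)
    return max_wait, len(free_at)

packets = [float(i) for i in range(20)]   # one packet per hour
# Undersized: capacity grows as queueing delays exceed the IQ threshold.
wait_low, cap_low = integrator_queue_sim(packets, 3.0, 1, 2.0)
# Amply sized: no queueing delay at all, and no extra resources added.
wait_high, cap_high = integrator_queue_sim(packets, 3.0, 5, 2.0)
```

Even in this caricature, the qualitative result matches the section: with too little initial capacity, queue delays blow past the threshold before extra resources arrive, whereas sufficient capacity deployed from the start keeps the information flow unimpeded.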

Figure 1 - Normalized span time variation with the number of tasks

3.2 Effect of dependency and task overlap
Figure 2 shows the effects of overlapping tasks in NPD projects where the tasks are sequentially dependent. For example, with 3 sequentially dependent tasks, task 3 requires information from tasks 1 and 2, task 2 requires information from task 1, and task 1 does not require any information from other tasks. The case shown is for slowly reducing uncertainty and a moderate magnitude of aleatory uncertainty. As can be seen in Figure 2, full overlap leads to the lowest span times, but with higher effort. The additional effort is due to rework that develops when work in downstream tasks, based on interim information received from the upstream task before its work is complete, is lacking in precision.

Figure 2 - NPD project effort versus span time for sequentially dependent tasks

As can be seen in the figure, for full overlap, if the information is updated more often (shorter communication interval), the span time initially decreases further without any further increase in effort. However, if the communication interval is too short, span time does not decrease any further and effort increases rapidly. This occurs when unstable interim information is communicated and acted upon too often, creating more rework.

3.3 Effect of non-equal task sizes
Figure 3 shows how span time is affected if one of the tasks in the task decomposition of a project requires a greater proportion of effort than all the rest. The figure shows how the span time for a project divided into 4 interdependent tasks and 2 phases changes with the effort multiple (NW) of the one unequal task. Thus, when NW=2, one of the tasks requires twice the average total effort of each of the other 3 tasks. In each case the average total project effort is identical. The significant increase in span time can be understood by studying the simulation. The longer time taken by the larger task to generate information sufficient to keep the other tasks from reaching a starve condition makes the pace of the project as a whole follow that of the larger task. Even though the total work required is equal for all values of NW, the smaller tasks cannot make progress and are often idle waiting for input information.

Figure 3 - Effect on span time when one task is larger than the others

3.4 Adapting sprint methods to non-software NPD
In the sprint method as applied to software development, each sprint is required to produce a complete, tested version of the software that completely answers a planned set of requirements. Each successive sprint adds further requirements so that, at the end of the project, the software meets the complete list of requirements for the final product. Analogously, for non-software NPD, the model is divided into a series of short phases, analogous to sprints, with a design review at the end of each phase. A key element here is that the tasks involved in each phase collectively solve a specific set of well-defined design problems that can be evaluated during the design review.
NPD of a complex product is essentially a series of activities providing the information that allows key design decisions to be made; thus, the design review at the end of each of these phases or sprints formalizes the point at which these decisions are made. The tasks in each phase leading up to a design review are those that generate the information required to make these key decisions. As in the sprint method, the work in each phase must be coordinated intensely so that the design review is successful and rework of a phase is not required. In practice, this is facilitated by smaller phases.

Figure 4 illustrates the results of a comparison between two projects requiring the same amount of effort: a typical project with 2 phases and design reviews, and a project using the sprint method with 6 phases and design reviews. The results show that there is little difference between the standard and sprint methods when there is rapid reduction in uncertainty, whereas there is a marked difference when there is slow reduction in uncertainty. Similar results, but with smaller differences, are predicted for a comparison measuring effort. Thus, the model predicts that more frequent course calibrations (in the form of design reviews between sprints) prevent the tasks from straying too far from the correct direction, and hence from incurring more rework, when there is high uncertainty in interim information.
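The intuition here can be illustrated with a deliberately simple back-of-the-envelope model, which is an assumption for illustration only, not the simulation used in the paper: suppose work drifts off course between reviews at a rate `drift_rate`, so that expected rework at each review grows with the square of the phase length, while every design review adds a fixed overhead.

```python
def sprint_span(total_work, n_phases, drift_rate, review_cost):
    # Toy span estimate: base work, plus per-phase rework that grows with
    # the square of phase length (drift accumulates between reviews),
    # plus a fixed cost per design review.
    phase = total_work / n_phases
    rework = drift_rate * phase * phase
    return total_work + n_phases * (rework + review_cost)

# High drift (slow uncertainty reduction): six short sprints beat two phases.
two_high = sprint_span(1000, 2, 0.002, 20)    # 2040 h
six_high = sprint_span(1000, 6, 0.002, 20)    # about 1453 h
# Low drift (rapid uncertainty reduction): extra reviews are mostly overhead.
two_low = sprint_span(1000, 2, 0.0001, 20)    # 1090 h
six_low = sprint_span(1000, 6, 0.0001, 20)    # about 1137 h
```

This reproduces the qualitative pattern of Figure 4: shorter sprints pay off strongly when uncertainty reduces slowly, and make little (or slightly negative) difference when it reduces quickly.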

3.5 Effect of the communication interval
As we have seen, exchanging interim information when there is task overlap is an effective way to shorten span time for both sequential and reciprocal dependencies. However, interim information can be imprecise and is subject to the instability of aleatory uncertainty. Depending on the rate at which the imprecision of information generated by a task is reduced, and on the magnitude of the instability of this information, acting on interim information can lead to unnecessary rework. Simulations show that there is an optimum interval for communicating interim information to other tasks that minimizes span time and effort. Simulation studies of various scenarios of uncertainty and task dependency indicate that the optimum communication interval is about 8% of the nominal time required to execute a task, as calculated from its triangular PDF.

Figure 4 - Effect of increasing the number of design reviews and uncertainty

3.6 Effect of other coordination improvements
The sensitivity of the span time to methods or policies that can facilitate the timely exchange of information can be ascertained using the simulation model described here. For example, under conditions of high interdependence and uncertainty, communication latency (delays between the time information is prepared for communication and the time it reaches its destination) can increase the span time of a project by 40% if delays are greater than 15% of the nominal time required to perform a task. These delays, where information must travel through several layers of an organization, can accumulate, cause unnecessary rework, and have significant knock-on effects on the work of many tasks. Depending on the structure of the product development system and the levels of uncertainty, simulations can evaluate the impact of methods to reduce this latency.
Product data management systems or virtual models of the new product being developed, if well designed, can reduce communication latency, facilitate the preparation of information for communication, and reduce the time required to interpret new data. However, too much information, too often, can propagate imprecise and unstable data, causing unnecessary rework and wasted communication effort.

4. Conclusion
Simulations of NPD processes were conducted to study the effects of process structure, critical resource management, communication policies, and uncertainty on rework, project span time and effort. Results showed that significant reductions in project span time and effort can be achieved by applying the following methods with sufficient knowledge of task interdependencies and uncertainty reduction profiles:
a. overlap tasks when there is sufficiently frequent, interim information exchange (sections 3.2 and 3.5);
b. structure projects such that task sizes are similar (section 3.3);
c. provide sufficient resource capacity in critical support tasks (section 3.1);
d. implement sprint methods where high uncertainty occurs (section 3.4);
e. adopt policies to reduce communication effort and latency (section 3.6).

The greatly reduced span time when using the sprint method (Figure 4) is consistent with the other results: untimely information exchange by integrators increases span time (Figure 1), and unequal task sizes cause delays in information exchange (Figure 3). The sprint method forces the setting of task size, resource allocation, and information exchange such that the information required by dependent tasks is exchanged in a timely and predictable way. As a consequence, dependent tasks are not starved for information, since synchronous sprints and design reviews set short, fixed periods for receiving information.

Overall, this simulation model can be used to help plan and manage actual NPD projects by providing guidelines for key process attributes. Further research to gather relevant process data should be performed to validate the use of the model for the quantitative evaluation of NPD process improvement strategies.
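The reason synchronous sprints prevent information starvation, as discussed in the conclusion, is that a fixed exchange cadence bounds how long a dependent task can wait for needed information: the wait can never exceed one sprint length. The following sketch makes that bound concrete; the 100-unit project horizon and uniform arrival of information needs are illustrative assumptions.

```python
import math
import random

def mean_wait(sprint_len: float, n_runs: int = 10000, seed: int = 3) -> float:
    """Average time a dependent task waits for needed information when
    exchanges happen only at fixed sprint boundaries.  The need arises at
    a uniform random time in a 100-unit project; the wait is the time to
    the next boundary, so it is bounded above by the sprint length.
    """
    rng = random.Random(seed)
    total = 0.0
    for _ in range(n_runs):
        need = rng.uniform(0.0, 100.0)                       # when info is needed
        boundary = sprint_len * math.ceil(need / sprint_len)  # next exchange
        total += boundary - need
    return total / n_runs

w5, w20 = mean_wait(5.0), mean_wait(20.0)
print(f"mean wait: sprint=5 -> {w5:.2f}, sprint=20 -> {w20:.2f}")
```

Shorter sprints give shorter and, more importantly, predictable waits (mean of roughly half a sprint, worst case one sprint), which is why the sprint method keeps dependent tasks supplied with information in a timely way.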