Management Methods for Reducing Span Time during New Product Development
Proceedings of the 2011 Industrial Engineering Research Conference
T. Doolen and E. Van Aken, eds.

Management Methods for Reducing Span Time during New Product Development

Samuel Suss, Onur Hisarciklilar, Vince Thomson
Mechanical Engineering Department, McGill University
817 Sherbrooke Street West, Montreal, Canada H3A 2K6

Abstract
Shorter span time for new product development (NPD) has become a measure of success in the global marketplace. All of the present techniques for improving NPD (concurrent engineering, dedicated design teams, enhanced computer tools, etc.) require excellent communication. A model of NPD was built to investigate the parameters that affect communication and how they impact design project effort and span time. The model considered the effect of coordination, in the form of task decomposition, allocation of resources, and information exchange methods, upon information flow at the design team, system integration and project levels. The model was validated against present practices in aerospace companies. Results from simulations give insight into the impact of these coordination mechanisms on the span time and effort of NPD projects under various levels of uncertainty. The effects of coordination mechanisms at the project level were also tested. In particular, the use of coordination principles from the sprint method (Scrum), widely employed in the software industry, during product design showed significant span time reduction for projects with moderate to high uncertainty. Other coordination mechanisms, such as the use of virtual data models during NPD, were also studied, and showed great promise for span time reduction.

Keywords
Product development, coordination, engineering design

1. Introduction
New product development (NPD) is a vitally important part of the product lifecycle, consuming a large proportion of the overall time for bringing a product to market and determining about 70% of product cost [1].
The implementation of engineering design tools, concurrent engineering practices and product data management systems has contributed to reduced NPD cycle times in recent years. However, in large NPD projects where hundreds of engineers work to develop complex products, there remain significant inefficiencies. The amount of waste in aerospace and defence product development programs is estimated at 60 to 90% of the charged time, with about 60% of all tasks being idle at any given time. The actual time engineers spend on value-added activities is much less than half of the total working time. Much efficiency is lost in wasted communication, waiting for information and lack of coordination [2]. Planning and managing an NPD project to minimize span time requires a detailed understanding of the mechanisms that drive the process, how they interact, and how sensitive project performance characteristics are to each of them. To this end we performed simulations using a model [3, 4] to study span time and effort for various sets of conditions. Section 2 describes the model. Simulation results follow in Section 3 with conclusions in Section 4.

2. Description of the Model
New product development is an undertaking comprising the various multi-functional activities done between defining a technology or market opportunity and starting production. The goal of NPD is to create the recipe for producing a product [5]. Thus, the output of product development is not products, which are physical objects, but rather information. Although flawlessly doing the same thing twice in a row in manufacturing is a success, flawlessly doing the same thing twice in a row in NPD is a failure, since duplicating a recipe adds no value. Although many business processes seek an identical result repeatedly, NPD seeks to do something new, once. NPD involves creativity and innovation, which are nonlinear and iterative [6].
2.1 Features of NPD incorporated in the model
The design process within NPD can be viewed as a system of interrelated activities which are performed to increase knowledge or to reduce uncertainty about a design solution [7]. Product design is typically decomposed into a hierarchy of subtasks. This is done to cope with the complexity of the design task and to gain the benefits of specialization. Many subtasks have dependencies between them which require management or coordination. These dependencies are primarily information that is generated within some subtasks and required by others; therefore, the performance of the entire design task depends on the progress of individual subtasks and the exchange of information between them. If the coordination of information is carried out efficiently and effectively, design projects have better results in terms of span time and effort. In order to find how best to coordinate the flow of information as the subtasks are being carried out, the structure and synchronization of the information must be analyzed. In this paper we consider this information flow to consist of the following types of information: design, which can be specified directly, e.g., materials and geometry; performance, which is a consequence of design information, e.g., fatigue life and weight; and requirements, which constrain design or performance information, such as the geometry needed to meet overall product performance. This follows the approach of others who have worked on the analysis of information exchange in design processes [8, 9]. One of the most prominent features of a design activity is its iterative character. The majority of development time and cost has been shown to be consumed by iteration [10]. Here, we consider the possibility of iterations that occur when a task is being performed and not all of the required input information is available at the outset. The work begins with partial information which is uncertain and is updated periodically as the process unfolds. If two tasks are reciprocally dependent, the information received by the first task is used to generate new information required by the second task, and vice versa. This type of iteration is classified as progressive or incremental.
Unnecessary iteration occurs when a task has gone too far with the partial or preliminary information it has received, and upon receiving further information the task must rework some of what it has already done. Iteration that comes about as a result of a design review is also taken into account. Here, we reason that as tasks are completed and results reviewed, there is a probability that a task must be reworked in its entirety. This likelihood of design version rework diminishes with the completeness of the information exchange that has taken place with the interdependent tasks in the process. The likelihood of design version rework also diminishes when the uncertainty of information developed by the end of the phase has been sufficiently reduced. The iteration brought about by design version rework is classified as feedback or repetition. Another prominent feature of design activity is its uncertainty. One of the main distinctions regarding uncertainty in engineering design is between lack-of-knowledge (epistemic) and random (aleatory) uncertainty [11, 12]. Some epistemic uncertainty is reducible, for instance, via further studies, measurements or expert consultation. Random uncertainty describes the inherent variation associated with a system or environment. In design, aleatory uncertainty is often manifested as changes in design information brought about by events beyond the control of an individual designer, such as changes to product requirements or to estimates of performance parameters. This aleatory uncertainty cannot be reduced through more information or knowledge acquisition during the design process. Wood et al. [13] propose that as a design progresses the level of epistemic uncertainty is reduced, whereas a degree of aleatory uncertainty usually remains.
Product development also includes system level oversight tasks where information about the various activities being carried out on subsystems is scrutinized and tested to ensure that the subsystems being developed function correctly together as a system [14]. The information examined by the system level team is often sent to other development teams for use in their own tasks. In this way the interdependencies between design activities are coordinated. Schedule considerations also constrain the product development process when decisions are made to end the work on design tasks. This occurs if the work performed has reached a certain quality, regardless of whether all of the interdependencies have been fully exploited to achieve an optimum design. This trade-off of span time versus nonconformance failure at a design review is often used as a tactic to speed up the development of new products.

2.2 Modeling the features of NPD and their interactions
The informational dependency relationship between any pair of tasks can be quantified and compiled as a numerical dependency structure matrix D, which is the product of the initial estimated uncertainty and the sensitivity between each pair of tasks [15]. The model sets the total amount of information that must be communicated between interdependent tasks as being directly proportional to their dependency strength Dij. Empirical results supporting this assumption were published by Allen [16]. We can therefore form an information exchange matrix NC which is directly proportional to the dependency matrix D. Execution of each development task is modeled as being composed of three subtasks: work, which consists of the technical production work entailed in the task; read, which consists of reading and interpreting all incoming information; and prep, which consists of preparing the information generated by executing the task into communicable form (i.e., preparing reports, summarizing data, etc.), as well as coordinating the actual communication.
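The proportionality between the dependency matrix D and the information exchange matrix NC can be sketched as follows. The scale factor k, the rounding to whole information units, and the function name are illustrative assumptions; the paper states only that NC is directly proportional to D.

```python
import numpy as np

def information_exchange_matrix(D, k=10):
    """Sketch: NC[i][j] = units of information that must flow between
    tasks i and j, taken as directly proportional to the dependency
    strength D[i][j] (element values 0 to 9, as in Table 1 below).
    The constant k is an assumed units-per-dependency-point factor."""
    D = np.asarray(D, dtype=float)
    NC = np.rint(k * D).astype(int)   # proportional, rounded to whole units
    np.fill_diagonal(NC, 0)           # a task sends no information to itself
    return NC

# Three tasks with assumed dependency strengths on the 0..9 scale:
D = [[0, 3, 1],
     [3, 0, 9],
     [1, 9, 0]]
NC = information_exchange_matrix(D)
```

A real study would calibrate k against observed communication volumes, in the spirit of the empirical results of Allen [16].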
We consider that these subtasks cannot be performed simultaneously and that each requires the exclusive use of the development team while it is being performed (modeled as a finite resource). We do not require, however, that the entire work subtask be done to completion each time it is begun; rather, it can be performed as a series of steps in between which communication subtasks are performed. In a particular instance, the amount of effort required by an individual development team (in man-hours) to perform its specific work subtask can be determined by a triangular probability density function (PDF) of given minimum, median and maximum values. Similarly, the amount of effort required by this team to perform a read or prep subtask on a unit amount of information can also be determined from a triangular PDF for each respective subtask. Information is generated by the work subtask in discrete units, and is gathered and prepared in batches in the prep subtask for communication to other development teams. The communication interval is one of the input parameters whose effects on project span time were studied. Each unit of information is addressed randomly to a task in the system, but the total number of units of information addressed to each task is equal to the corresponding element in NC. Dispatched units of information are processed in the addressee's read subtask. Information is generated in each development task in proportion to the amount of effort made in its work subtask. However, the precision of this information increases only asymptotically according to the way in which the epistemic uncertainty reduces. The epistemic uncertainty of each discrete information entity developed in a task is a function of the current state of progress of the task. The aleatory uncertainty of each discrete information entity is a random number proportional to the epistemic uncertainty at the time of its creation and a scaling factor.
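The triangular-PDF effort sampling just described can be sketched as follows. All parameter values are invented for illustration; note that Python's random.triangular takes the mode as its third argument.

```python
import random

# Assumed (minimum, mode, maximum) values in person-hours; the paper's
# WK, RD and PREP matrices would supply these per task.
WORK = (80.0, 120.0, 200.0)   # effort for the whole work subtask
READ = (0.2, 0.5, 1.0)        # effort per unit of incoming information
PREP = (0.3, 0.6, 1.2)        # effort per unit of outgoing information

def sample_effort(pdf, units=1):
    """Draw one effort value from a triangular PDF, scaled by the
    number of information units processed (1 for the work subtask)."""
    lo, mode, hi = pdf
    return units * random.triangular(lo, hi, mode)

random.seed(1)
work_hours = sample_effort(WORK)
read_hours = sample_effort(READ, units=25)   # 25 incoming units, assumed
prep_hours = sample_effort(PREP, units=25)
```

In the simulation each draw is made per task instance, so repeated runs yield the distribution of project span time and effort rather than a single point estimate.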
Rework occurs in a work subtask if the work has progressed too far with the information previously received. This occurs if the uncertainty of input information received from any task before a new work period is greater than the uncertainty of input information received from that task in a prior work period. If work in a task has progressed too far beyond the percentage of input information received, further work on the task is suspended until more input information has been received. However, depending on the perceived uncertainty of information previously received, a decision may be made to continue after a certain period of time regardless. This is based on observations in collaborative projects, where schedule constraints force a manager to make the decision to risk continuing work on a task in order to allow the project to move forward. The missing information is estimated based on previous information received and the perception of imprecision of this previous information. Progress in design refers to a process that achieves a succession of improvements or refinements on the way toward the final outcome [17] or to a process in which the level of uncertainty in the artefact is reduced as the design progresses [18]. This reasoning was incorporated into the model in the way in which tasks in the model make progress through technical and communication work done during a simulation. The information exchanged between interdependent tasks is given uncertainty attributes that can lead to rework, where these uncertainty attributes are based on the state of the task that sent the information. The ability of the simulated tasks to complete work while exchanging information needed to continue to make progress is what drives the tasks to their final outcome. An S-shape appears to be the best function to represent task progress as it can embody a growth phenomenon similar to learning [19]. 
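As a hedged illustration of this S-curve idea, the Gompertz form the model adopts can be written as follows; the coefficient values b and c (the per-task B, C parameters of Table 1) are arbitrary choices made only to show the shape, not values from the paper.

```python
import math

def gompertz_progress(state, b=5.0, c=8.0):
    """S-shaped progress in [0, 1] as a function of task state in [0, 1].
    b and c govern the rates of approach to the lower and upper asymptotes."""
    return math.exp(-b * math.exp(-c * state))

def epistemic_uncertainty(state, b=5.0, c=8.0):
    """Uncertainty = upper asymptote (1) minus the S-shaped progress,
    so uncertainty is high at task start and decays toward zero."""
    return 1.0 - gompertz_progress(state, b, c)
```

With these assumed coefficients, uncertainty starts near 1, falls fastest in the middle of the task, and approaches 0 at completion, matching the learning-like growth phenomenon cited above.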
This was adapted in the model to calculate the reduction in epistemic uncertainty as a function of task state with an S-shape function (given that the work progress curve can be transformed into an uncertainty reduction curve by subtracting the S-shape function from its upper asymptote). The model makes use of a Gompertz equation as the S-shape function because of its flexibility in allowing for different rates of approach to the upper and lower asymptotes at activity start and end. Two coefficients govern the shape of the Gompertz curve and can be set to values that cover a range of possible epistemic uncertainty reduction behaviour. The aleatory uncertainty is calculated for any task state with the use of a random number generator and a scaling factor. Further details can be seen in [3, 4]. Using discrete event simulation [20], the logic of the model was executed to produce the process behavior for each set of conditions examined.

2.3 Validation of the model
Ideally, the best way to validate a model of an NPD process would be to compare data from a comprehensive series of actual processes with data from simulations of the same processes. In practice, however, this has not yet been feasible because of the long cycle time of complex PD processes and the difficulty of obtaining sufficiently detailed historical data. Validation of the model to date was based on comparison of model results with the predictions of other models, with data from real processes that are available for several cases, and on comparison of the specific behaviors incorporated in the model's logic with those that have been reported to take place in NPD. Further validations of results are being performed as data becomes available. The results shown here serve to illustrate the operation of the model, and how it can explain the mechanisms that drive the NPD process.

3. Scenarios investigated with the model
The following table describes the parameters used as input to the model to create each scenario. In scenarios where comparisons of results are made, we maintained the total work requirement in the NPD project (as calculated by summing the average of the triangular PDF of each task across all phases) as a constant.

Table 1: Input parameters defining scenarios for the NPD model

CI: Vector defining the nominal communication interval (hours) for each task
BR: Vector containing the flag value for each task indicating whether to broadcast information to all other tasks
DeltaT: Vector defining Δt (hours), the maximum work cycle time between which communications are attended to by each development team
D: Matrix defining the dependency strength between each pair of tasks, with element values of 0 to 9
INTMAX: System level integrator capacity
IQ: Integrator queue time threshold (hours) at which an additional integrator resource is added
MINT, SINT: Mean value and standard deviation of the normal PDF for the integrator process
LAT: Matrix defining the mean value and standard deviation of the PDF for communication latency
M: Vector defining the scaling factor for aleatory uncertainty for each task
MWI: Vector defining the number of hours each task waits for new information while in the starve condition before another cycle of work is done
OVF: Matrix defining the degree of overlap between each pair of tasks in a project
PREP: Matrix defining the parameters of the triangular PDF for the prep subtask for each task
RD: Matrix defining the parameters of the triangular PDF for the read subtask for each task
WK: Matrix defining the parameters of the triangular PDF for the work subtask for each task
B, C: Vectors defining the Gompertz function values for the epistemic uncertainty for each task
NPT: Number of tasks in each phase
TP: Number of phases in the project
QMR: Vector defining the minimum number of entities in the read queue before attending to this queue
QMP: Vector defining the minimum number of entities in the prep queue before attending to this queue
SCH: Vector defining the scheduled span time for each task

3.1 Number of tasks in the project decomposition and the relationship to integrator resource management
Figure 1 shows simulation results for the normalized span time of a project with 2 phases that have been split into NPT development tasks. The total amount of work to be performed in each case is constant and is used to normalize the results for project span time. In Figure 1, the tasks are highly interdependent, so that each task requires information from every other task and all tasks begin simultaneously. In the case shown, the reduction in epistemic uncertainty is slow and there is zero aleatory uncertainty. In this series of simulations, the PDFs for the work subtasks were equal. The results show that the initial reduction in span time is linear with an increase in the number of tasks, but levels off as the delays brought about by information exchange and system level integration with a higher number of tasks come into play. As can be seen, further increases in the number of tasks have less effect on the span time. Insufficient integrator resource capacity causes the span time and effort to rise sharply because tasks are not able to complete due to the information flow bottleneck. Simulation shows that the information flow bottleneck causes an increase in design version rework. Even with sufficient resource capacity, if the resources are not deployed early enough, the consequent delays to the project result in increased effort and span time with fewer tasks in the decomposition. This is controlled in the simulation by the parameter IQ, which is the threshold that triggers the addition of an integrator resource.
It turns out that keeping the value of IQ less than 5% of the nominal time required to perform a task ensures that the integrator queue time does not impede the completion of tasks during the simulations we examined. Thus, relationships for the required integrator capacity to avoid bottlenecks can be found for specific projects with estimated uncertainty reduction profiles.
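The integrator-capacity rule of Section 3.1 can be sketched as follows. The nominal task time and the function names are hypothetical; the 5% threshold is the simulation finding quoted above.

```python
# Assumed nominal task duration: the mean of the task's triangular work PDF.
NOMINAL_TASK_HOURS = 160.0

# Keep the queue-time threshold IQ below 5% of the nominal task time so
# that integration waits do not impede task completion (Section 3.1).
IQ = 0.05 * NOMINAL_TASK_HOURS      # 8.0 hours for this assumed task

def integrators_needed(current, queue_wait_hours):
    """Add one integrator resource whenever the observed integration
    queue wait exceeds the threshold IQ; otherwise keep current capacity."""
    return current + 1 if queue_wait_hours > IQ else current

after_overload = integrators_needed(2, queue_wait_hours=9.0)  # exceeds IQ
after_normal = integrators_needed(2, queue_wait_hours=5.0)    # within IQ
```

The point of the rule is timing: capacity added only after long queues form arrives too late to prevent the rework cascade the simulation observed.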
Figure 1: Normalized span time variation with the number of tasks

3.2 Effect of dependency and task overlap
Figure 2 shows the effects of overlapping tasks in NPD projects where the tasks are sequentially dependent; for example, with 3 sequentially dependent tasks, task 3 requires information from tasks 1 and 2, task 2 requires information from task 1, and task 1 does not require any information from other tasks. The case shown is for slowly reducing uncertainty and a moderate magnitude of aleatory uncertainty. As can be seen in Figure 2, full overlap leads to the lowest span times, but with higher effort. The additional effort is due to rework that develops when work in downstream tasks, based on interim information received from the upstream task before its work is complete, is lacking in precision.

Figure 2: NPD project effort versus span time for sequentially dependent tasks

As can be seen in the figure, for full overlap, if the information is updated more often (shorter communication interval), the span time initially decreases further without any further increase in effort. However, if the communication interval is too short, span time does not decrease any further and effort increases rapidly. This occurs when unstable interim information is communicated and acted upon too often, creating more rework.

3.3 Effect of non-equal task sizes
Figure 3 shows how span time is affected if one of the tasks in the task decomposition of a project requires a greater proportion of effort than all the rest. The figure shows how span time for a project divided into 4 interdependent tasks and 2 phases changes with the effort multiple (NW) of the one unequal task. Thus, when NW=2, one of the tasks requires twice the average total effort of each of the other 3 tasks. In each case the average total project effort is identical. The significant increase in span time seen in the figure can be understood by studying the simulation. The longer time taken by the larger task to generate information sufficient to keep the other tasks from reaching a starve condition makes the pace of the project as a whole follow that of the larger task. Even though the total work required is equal for all values of NW, the smaller tasks cannot make progress and are often idle waiting for input information.

Figure 3: Effect on span time when one task is larger than the others

3.4 Adapting sprint methods to non-software NPD
In the sprint method as applied to software development, each sprint is required to produce a complete, tested version of the software that completely answers a planned set of requirements. Each successive sprint goes on to add additional requirements so that at the end of the project the software meets the complete list of requirements for the final product. Analogously, for non-software NPD, the model divides the project into a series of short phases, analogous to sprints, with a design review at the end of each phase. A key element here is that the tasks involved in each phase collectively solve a specific set of well-defined design problems that can be evaluated during the design review.
NPD of a complex product is essentially a series of activities providing the information that allows key design decisions to be made; thus, the design review at the end of each of these phases or sprints formalizes the point at which these decisions are made. The tasks in each phase leading up to a design review are those that generate the information required to make these key decisions. As in the sprint method, the work in each phase must be coordinated intensely so that the design review is successful and rework of a phase is not required. In practice, this is facilitated by smaller sized phases. Figure 4 illustrates the results for a comparison between two projects requiring the same amount of effort: a typical project with 2 phases and design reviews, and a project using the sprint method with 6 phases and design reviews. The results show that there is little difference between the standard and sprint methods when there is rapid reduction in uncertainty, whereas there is a marked difference when there is a slow reduction in uncertainty. Similar results, but with smaller differences, are predicted for a comparison measuring effort. Thus, the model predicts that more frequent course calibrations (in the form of design reviews between sprints) prevent the tasks from straying too far from the correct direction, which would result in more rework, when there is high uncertainty in interim information.
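The phase structure compared in Figure 4 can be sketched as follows. The review predicate here is a placeholder: the paper's model decides phase rework from information-exchange completeness and residual uncertainty, which this sketch does not reproduce.

```python
def run_project(total_work, n_phases, review_passes):
    """Execute phases in order; a phase that fails its design review is
    reworked in its entirety before the project moves on. A real model
    would bound retries; here the caller's predicate controls them."""
    phase_work = total_work / n_phases
    effort = 0.0
    for phase in range(n_phases):
        effort += phase_work
        while not review_passes(phase):
            effort += phase_work        # whole phase reworked
    return effort

# Same total work split into 2 standard phases vs. 6 sprint-style phases.
# With reviews that always pass, total effort is identical; the span time
# and rework differences in Figure 4 arise only once uncertainty makes
# reviews fail, which this placeholder predicate does not model.
baseline = run_project(1200.0, n_phases=2, review_passes=lambda p: True)
sprint = run_project(1200.0, n_phases=6, review_passes=lambda p: True)
```

The structural point survives even in this toy form: with more phases, a failed review forfeits one-sixth rather than one-half of the project's work.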
3.5 Effect of the communication interval
As we have seen, exchanging interim information when there is task overlap is an effective way to shorten span time for both sequential and reciprocal dependencies. However, interim information can be imprecise and is subject to the instability of aleatory uncertainty. Depending on the rate at which the imprecision of information generated by a task is reduced and on the magnitude of the instability of this information, acting on interim information can lead to unnecessary rework. Simulations show that there is an optimum interval for communicating interim information to other tasks that minimizes span time and effort. Simulation studies of various scenarios of uncertainty and task dependency indicate that the optimum communication interval is at about 8% of the nominal time required to execute a task, as calculated from its triangular PDF.

Figure 4: Effect of increasing the number of design reviews and uncertainty

3.6 Effect of other coordination improvements
The sensitivity of the span time to methods or policies that can facilitate the timely exchange of information can be ascertained using the simulation model described here. For example, under conditions of high interdependence and uncertainty, communication latency (the delay between the time information is prepared for communication and the time it reaches its destination) can increase the span time of a project by 40% if delays are greater than 15% of the nominal time required to perform a task. These delays, where information must travel through several layers of an organization, can accumulate, cause unnecessary rework, and have significant knock-on effects on the work of many tasks. Depending on the structure of the product development system and the levels of uncertainty, simulations can evaluate the impact of methods to reduce this latency.
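The communication-interval heuristic of Section 3.5 can be sketched as follows. The 8% figure is the simulation finding quoted above; the PDF values and function names are invented for illustration.

```python
def nominal_task_hours(tri_pdf):
    """Nominal task duration taken as the mean of its triangular PDF
    of (minimum, mode, maximum) effort values."""
    lo, mode, hi = tri_pdf
    return (lo + mode + hi) / 3.0

def communication_interval(tri_pdf, fraction=0.08):
    """Heuristic from Section 3.5: communicate interim information at
    roughly 8% of the nominal task execution time."""
    return fraction * nominal_task_hours(tri_pdf)

# For an assumed task with a (90, 150, 210) hour triangular work PDF,
# the nominal duration is 150 hours and the suggested interval 12 hours.
ci = communication_interval((90.0, 150.0, 210.0))
```

Intervals much shorter than this propagate unstable interim information and inflate rework; intervals much longer starve dependent tasks, as described in Sections 3.2 and 3.5.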
Product data management systems or virtual models of the new product being developed, if well designed, can reduce communication latency, facilitate the preparation of information for communication, and reduce the time required to interpret new data. However, too much information, communicated too often, can propagate imprecise and unstable data, causing unnecessary rework and wasted communication effort.

4. Conclusion
Simulations of NPD processes were conducted to study the effects of process structure, critical resource management, communication policies, and uncertainty on rework, project span time and effort. Results showed that significant reductions in project span time and effort can be achieved by applying the following methods with sufficient knowledge of task interdependencies and uncertainty reduction profiles:
a. overlap tasks when there is sufficiently frequent, interim information exchange (sections 3.2 and 3.5);
b. structure projects such that task sizes are similar (section 3.3);
c. provide sufficient resource capacity in critical support tasks (section 3.1);
d. implement sprint methods where high uncertainty occurs (section 3.4);
e. adopt policies to reduce communication effort and latency (section 3.6).
The greatly reduced span time when using the sprint method (Figure 4) is consistent with the other results. Untimely information exchange by integrators increases span time (Figure 1). Unequal task size causes delays in information exchange (Figure 3). The sprint method forces the setting of task size, resource allocation and information exchange such that the information required by dependent tasks is exchanged in a timely and predictable way. As a consequence, dependent tasks are not starved for information, since the synchronous sprints and design reviews set short, fixed periods for receiving information. Overall, this simulation model can be used to help in the planning and management of actual NPD projects by providing guidelines for key process attributes. Further research to gather relevant process data should be performed to validate the use of the model for quantitative evaluation of NPD process improvement strategies.

References
1. Wheelwright, S. and K. Clark, 1992, "Revolutionizing Product Development," New York: The Free Press.
2. Oppenheim, B.W., 2004, "Lean Product Development Flow," Systems Engineering, 7(4).
3. Suss, S., K. Grebici, and V. Thomson, 2010, "The Effect of Uncertainty on Span Time and Effort within a Complex Design Process," in Modelling and Management of Engineering Processes, P. Heisig, J. Clarkson, and S. Vajna, Editors, Springer: London.
4. Suss, S. and V. Thomson, 2010, "Coordination of Complex Product Development Processes," in ASME 15th Design for Manufacturing and the Lifecycle Conference, DETC2010, Montreal, Canada: ASME.
5. Reinertsen, D., 1999, "Lean thinking isn't so simple," Electronic Design, 47(10): p. 48H.
6. Kline, S.J., 1985, "Innovation is not a linear process," Research Management, 28(4).
7. Grebici, K., D.C. Wynn, and P.J. Clarkson, 2008, "Modelling the relationship between uncertainty levels in design descriptions and design process duration," in Proceedings of the International Conference on Integrated Design and Manufacturing in Mechanical Engineering, Beijing, China.
8. Krishnan, V., S.D. Eppinger, and D.E. Whitney, 1997, "A model-based framework to overlap product development activities," Management Science, 43(4).
9. O'Donovan, B.D., et al., 2003, "Signposting: Modelling uncertainty in design processes," in Proceedings of the International Conference on Engineering Design (ICED), Stockholm.
10. Cooper, K.G., 1993, "The Rework Cycle: Benchmarks for the Program Manager," Project Management Journal, 24(1).
11. Oberkampf, W.L., et al., 2004, "Challenge problems: uncertainty in system response given uncertain parameters," Reliability Engineering & System Safety, 85(1-3).
12. Mourelatos, Z.P. and J. Zhou, 2005, "Reliability estimation and design with insufficient data based on possibility theory," AIAA Journal, 43(8).
13. Wood, K.L., E.K. Antonsson, and J.L. Beck, 1990, "Representing imprecision in engineering design: comparing fuzzy and probability calculus," Research in Engineering Design, 1(3).
14. Yassine, A.A., et al., 2003, "Information hiding in product development: the design churn effect," Research in Engineering Design, 14(3).
15. Yassine, A.A. and D. Braha, 2003, "Complex Concurrent Engineering and the Design Structure Matrix Method," Concurrent Engineering: Research and Applications, 11(3).
16. Allen, T.J., 2007, "Architecture and communication among product development engineers," California Management Review, 49(2).
17. Safoutin, M.J., 2003, "A methodology for empirical measurement of iteration in engineering design processes."
18. Hykin, D.H.W. and L.C. Laming, 1975, "Design case histories: Report of a field study of design in the United Kingdom engineering industry," Proceedings of the Institution of Mechanical Engineers, 189.
19. Carrascosa, M., S.D. Eppinger, and D.E. Whitney, 1998, "Using the design structure matrix to estimate product development time," in Proceedings of the ASME Design Engineering Technical Conferences (Design Automation Conference), Atlanta, Georgia, USA.
20. ARENA, 2006, Simulation Software, Rockwell Automation Technologies.
CSE 435 Software Engineering. Sept 16, 2015
CSE 435 Software Engineering Sept 16, 2015 2.1 The Meaning of Process A process: a series of steps involving activities, constraints, and resources that produce an intended output of some kind A process
Project Time Management
Project Time Management Study Notes PMI, PMP, CAPM, PMBOK, PM Network and the PMI Registered Education Provider logo are registered marks of the Project Management Institute, Inc. Points to Note Please
Justifying Simulation. Why use simulation? Accurate Depiction of Reality. Insightful system evaluations
Why use simulation? Accurate Depiction of Reality Anyone can perform a simple analysis manually. However, as the complexity of the analysis increases, so does the need to employ computer-based tools. While
Simulation for Business Value and Software Process/Product Tradeoff Decisions
Simulation for Business Value and Software Process/Product Tradeoff Decisions Raymond Madachy USC Center for Software Engineering Dept. of Computer Science, SAL 8 Los Angeles, CA 90089-078 740 570 [email protected]
CHAPTER 3 SECURITY CONSTRAINED OPTIMAL SHORT-TERM HYDROTHERMAL SCHEDULING
60 CHAPTER 3 SECURITY CONSTRAINED OPTIMAL SHORT-TERM HYDROTHERMAL SCHEDULING 3.1 INTRODUCTION Optimal short-term hydrothermal scheduling of power systems aims at determining optimal hydro and thermal generations
Deployment of express checkout lines at supermarkets
Deployment of express checkout lines at supermarkets Maarten Schimmel Research paper Business Analytics April, 213 Supervisor: René Bekker Faculty of Sciences VU University Amsterdam De Boelelaan 181 181
Risk Management for IT Security: When Theory Meets Practice
Risk Management for IT Security: When Theory Meets Practice Anil Kumar Chorppath Technical University of Munich Munich, Germany Email: [email protected] Tansu Alpcan The University of Melbourne Melbourne,
Simulation and Lean Six Sigma
Hilary Emmett, 22 August 2007 Improve the quality of your critical business decisions Agenda Simulation and Lean Six Sigma What is Monte Carlo Simulation? Loan Process Example Inventory Optimization Example
The problem with waiting time
The problem with waiting time Why the only way to real optimization of any process requires discrete event simulation Bill Nordgren, MS CIM, FlexSim Software Products Over the years there have been many
Chapter 4 Software Lifecycle and Performance Analysis
Chapter 4 Software Lifecycle and Performance Analysis This chapter is aimed at illustrating performance modeling and analysis issues within the software lifecycle. After having introduced software and
PROJECT TIME MANAGEMENT
6 PROJECT TIME MANAGEMENT Project Time Management includes the processes required to ensure timely completion of the project. Figure 6 1 provides an overview of the following major processes: 6.1 Activity
FUNBIO PROJECT RISK MANAGEMENT GUIDELINES
FUNBIO PROJECT RISK MANAGEMENT GUIDELINES OP-09/2013 Responsible Unit: PMO Focal Point OBJECTIVE: This Operational Procedures presents the guidelines for the risk assessment and allocation process in projects.
A Comparison of System Dynamics (SD) and Discrete Event Simulation (DES) Al Sweetser Overview.
A Comparison of System Dynamics (SD) and Discrete Event Simulation (DES) Al Sweetser Andersen Consultng 1600 K Street, N.W., Washington, DC 20006-2873 (202) 862-8080 (voice), (202) 785-4689 (fax) [email protected]
How To Test For Elulla
EQUELLA Whitepaper Performance Testing Carl Hoffmann Senior Technical Consultant Contents 1 EQUELLA Performance Testing 3 1.1 Introduction 3 1.2 Overview of performance testing 3 2 Why do performance testing?
Knowledge-Based Systems Engineering Risk Assessment
Knowledge-Based Systems Engineering Risk Assessment Raymond Madachy, Ricardo Valerdi University of Southern California - Center for Systems and Software Engineering Massachusetts Institute of Technology
14TH INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN 19-21 AUGUST 2003
14TH INTERNATIONAL CONFERENCE ON ENGINEERING DESIGN 19-21 AUGUST 2003 A CASE STUDY OF THE IMPACTS OF PRELIMINARY DESIGN DATA EXCHANGE ON NETWORKED PRODUCT DEVELOPMENT PROJECT CONTROLLABILITY Jukka Borgman,
Agile and lean methods for managing application development process
Agile and lean methods for managing application development process Hannu Markkanen 24.01.2013 1 Application development lifecycle model To support the planning and management of activities required in
Module 11. Software Project Planning. Version 2 CSE IIT, Kharagpur
Module 11 Software Project Planning Lesson 28 COCOMO Model Specific Instructional Objectives At the end of this lesson the student would be able to: Differentiate among organic, semidetached and embedded
A Case Study of the Systems Engineering Process in Healthcare Informatics Quality Improvement. Systems Engineering. Ali M. Hodroj
A Case Study of the Systems Engineering Process in Healthcare Informatics Quality Improvement By Ali M. Hodroj Project Report submitted to the Faculty of the Maseeh School of Engineering and Computer Science
CRITICAL CHAIN AND CRITICAL PATH, CAN THEY COEXIST?
EXECUTIVE SUMMARY PURPOSE: This paper is a comparison and contrast of two project management planning and execution methodologies: critical path methodology (CPM) and the critical chain project management
Prescriptive Analytics. A business guide
Prescriptive Analytics A business guide May 2014 Contents 3 The Business Value of Prescriptive Analytics 4 What is Prescriptive Analytics? 6 Prescriptive Analytics Methods 7 Integration 8 Business Applications
Systems Engineering Complexity & Project Management
Systems Engineering Complexity & Project Management Bob Ferguson, PMP NDIA: CMMI Technology Conference November 2007 Outline A conversation Defining complexity and its effects on projects Research into
Realizing the Benefits of Finite Capacity Scheduling to Manage Batch Production Systems
Presented at the WBF North American Conference Baltimore, MD, USA 30 April - 4 May 2007 67 Alexander Drive PO Box 12277 Research Triangle Park, NC 27709 +1.919.314.3970 Fax: +1.919.314.3971 E-mail: [email protected]
DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN. April 2009 SLAC I 050 07010 002
DRAFT RESEARCH SUPPORT BUILDING AND INFRASTRUCTURE MODERNIZATION RISK MANAGEMENT PLAN April 2009 SLAC I 050 07010 002 Risk Management Plan Contents 1.0 INTRODUCTION... 1 1.1 Scope... 1 2.0 MANAGEMENT
Keywords document, agile documentation, documentation, Techno functional expert, Team Collaboration, document selection;
Volume 4, Issue 4, April 2014 ISSN: 2277 128X International Journal of Advanced Research in Computer Science and Software Engineering Research Paper Available online at: www.ijarcsse.com A Document Driven
Fairfield Public Schools
Mathematics Fairfield Public Schools AP Statistics AP Statistics BOE Approved 04/08/2014 1 AP STATISTICS Critical Areas of Focus AP Statistics is a rigorous course that offers advanced students an opportunity
Analysis Of Shoe Manufacturing Factory By Simulation Of Production Processes
Analysis Of Shoe Manufacturing Factory By Simulation Of Production Processes Muhammed Selman ERYILMAZ a Ali Osman KUŞAKCI b Haris GAVRANOVIC c Fehim FINDIK d a Graduate of Department of Industrial Engineering,
IMPROVING THE DESIGN-CONSTRUCTION INTERFACE
Improving the Design-Construction Interface IMPROVING THE DESIGN-CONSTRUCTION INTERFACE Luis F. Alarcón 1 and Daniel A. Mardones 2 ABSTRACT In building projects customer requirements, constructive aspects
VOLATILITY AND DEVIATION OF DISTRIBUTED SOLAR
VOLATILITY AND DEVIATION OF DISTRIBUTED SOLAR Andrew Goldstein Yale University 68 High Street New Haven, CT 06511 [email protected] Alexander Thornton Shawn Kerrigan Locus Energy 657 Mission St.
Arena 9.0 Basic Modules based on Arena Online Help
Arena 9.0 Basic Modules based on Arena Online Help Create This module is intended as the starting point for entities in a simulation model. Entities are created using a schedule or based on a time between
A Framework for Adaptive Process Modeling and Execution (FAME)
A Framework for Adaptive Process Modeling and Execution (FAME) Perakath Benjamin [email protected] Madhav Erraguntla [email protected] Richard Mayer [email protected] Abstract This paper describes the
LEAN AGILE POCKET GUIDE
SATORI CONSULTING LEAN AGILE POCKET GUIDE Software Product Development Methodology Reference Guide PURPOSE This pocket guide serves as a reference to a family of lean agile software development methodologies
Quantitative Inventory Uncertainty
Quantitative Inventory Uncertainty It is a requirement in the Product Standard and a recommendation in the Value Chain (Scope 3) Standard that companies perform and report qualitative uncertainty. This
Automated Scheduling Methods. Advanced Planning and Scheduling Techniques
Advanced Planning and Scheduling Techniques Table of Contents Introduction 3 The Basic Theories 3 Constrained and Unconstrained Planning 4 Forward, Backward, and other methods 5 Rules for Sequencing Tasks
Agile support with Kanban some tips and tricks By Tomas Björkholm
Agile support with Kanban some tips and tricks By Tomas Björkholm Foreword A year ago I held an Open Space at Scrum Gathering in Stockholm about Agile Support. I have since received several requests to
Computing & Communications Services
2010 Computing & Communications Services 2010 / 10 / 04 Final Kent Percival, M.Sc., P.Eng. Defining the Value of the Business Analyst In achieving its vision, key CCS partnerships involve working directly
AGILE BUSINESS INTELLIGENCE
AGILE BUSINESS INTELLIGENCE OR HOW TO GIVE MANAGEMENT WHAT THEY NEED WHEN THEY NEED IT Evan Leybourn Author Directing the Agile Organisation Melbourne, Australia [email protected] INTRODUCTION
Modeling Stochastic Inventory Policy with Simulation
Modeling Stochastic Inventory Policy with Simulation 1 Modeling Stochastic Inventory Policy with Simulation János BENKŐ Department of Material Handling and Logistics, Institute of Engineering Management
USER S GUIDE for DSM@MIT
USER S GUIDE for DSM@MIT TABLE OF CONTENTS 1. OVERVIEW...3 2. INSTALLATION...5 3. FUNCTIONS...7 3.1 Inputs for the Structuring Module...7 3.2 Analyses in the Structuring Module...8 3.3 Editing the DSM...13
Expert Reference Series of White Papers. Intersecting Project Management and Business Analysis
Expert Reference Series of White Papers Intersecting Project Management and Business Analysis 1-800-COURSES www.globalknowledge.com Intersecting Project Management and Business Analysis Daniel Stober,
Improving Software Development Economics Part II: Reducing Software Product Complexity and Improving Software Processes
Improving Software Development Economics Part II: Reducing Software Product Complexity and Improving Software Processes by Walker Royce Vice President and General Manager Strategic Services Rational Software
COMBINING THE METHODS OF FORECASTING AND DECISION-MAKING TO OPTIMISE THE FINANCIAL PERFORMANCE OF SMALL ENTERPRISES
COMBINING THE METHODS OF FORECASTING AND DECISION-MAKING TO OPTIMISE THE FINANCIAL PERFORMANCE OF SMALL ENTERPRISES JULIA IGOREVNA LARIONOVA 1 ANNA NIKOLAEVNA TIKHOMIROVA 2 1, 2 The National Nuclear Research
Using Analytic Hierarchy Process (AHP) Method to Prioritise Human Resources in Substitution Problem
Using Analytic Hierarchy Process (AHP) Method to Raymond Ho-Leung TSOI Software Quality Institute Griffith University *Email:[email protected] Abstract In general, software project development is often
Finite Capacity Portfolio and Pipeline Management
Finite Capacity Portfolio and Pipeline Management Under pressure to develop more products while holding their budgets constant, New Product Development (NPD) organizations are operating under severe resource
A Risk Management System Framework for New Product Development (NPD)
2011 International Conference on Economics and Finance Research IPEDR vol.4 (2011) (2011) IACSIT Press, Singapore A Risk Management System Framework for New Product Development (NPD) Seonmuk Park, Jongseong
White Paper Operations Research Applications to Support Performance Improvement in Healthcare
White Paper Operations Research Applications to Support Performance Improvement in Healthcare Date: April, 2011 Provided by: Concurrent Technologies Corporation (CTC) 100 CTC Drive Johnstown, PA 15904-1935
Agile and lean methods for managing application development process
Agile and lean methods for managing application development process Hannu Markkanen 27.01.2012 1 Lifecycle model To support the planning and management of activities required in the production of e.g.
JOURNAL OF OBJECT TECHNOLOGY
JOURNAL OF OBJECT TECHNOLOGY Online at www.jot.fm. Published by ETH Zurich, Chair of Software Engineering JOT, 2006 Vol. 5. No. 8, November-December 2006 Requirements Engineering Tasks Donald Firesmith,
Recurrent Neural Networks
Recurrent Neural Networks Neural Computation : Lecture 12 John A. Bullinaria, 2015 1. Recurrent Neural Network Architectures 2. State Space Models and Dynamical Systems 3. Backpropagation Through Time
Mining the Software Change Repository of a Legacy Telephony System
Mining the Software Change Repository of a Legacy Telephony System Jelber Sayyad Shirabad, Timothy C. Lethbridge, Stan Matwin School of Information Technology and Engineering University of Ottawa, Ottawa,
Software Metrics & Software Metrology. Alain Abran. Chapter 4 Quantification and Measurement are Not the Same!
Software Metrics & Software Metrology Alain Abran Chapter 4 Quantification and Measurement are Not the Same! 1 Agenda This chapter covers: The difference between a number & an analysis model. The Measurement
OPTIMIZATION OF LOAD HAUL DUMP MINING SYSTEM BY OEE AND MATCH FACTOR FOR SURFACE MINING
OPTIMIZATION OF LOAD HAUL DUMP MINING SYSTEM BY OEE AND MATCH FACTOR FOR SURFACE MINING *Ram Prasad Choudhary Department of Mining Engineering, National Institute of Technology Karnataka, Surathkal-575025
SINGLE-STAGE MULTI-PRODUCT PRODUCTION AND INVENTORY SYSTEMS: AN ITERATIVE ALGORITHM BASED ON DYNAMIC SCHEDULING AND FIXED PITCH PRODUCTION
SIGLE-STAGE MULTI-PRODUCT PRODUCTIO AD IVETORY SYSTEMS: A ITERATIVE ALGORITHM BASED O DYAMIC SCHEDULIG AD FIXED PITCH PRODUCTIO Euclydes da Cunha eto ational Institute of Technology Rio de Janeiro, RJ
Applying Lean on Agile Scrum Development Methodology
ISSN:2320-0790 Applying Lean on Agile Scrum Development Methodology SurendRaj Dharmapal, Dr. K. Thirunadana Sikamani Department of Computer Science, St. Peter University St. Peter s College of Engineering
www.stephenbarkar.se Lean vs. Agile similarities and differences 2014-08-29 Created by Stephen Barkar - www.stephenbarkar.se
1 www.stephenbarkar.se Lean vs. Agile similarities and differences 2014-08-29 Purpose with the material 2 This material describes the basics of Agile and Lean and the similarities and differences between
ABHINAV NATIONAL MONTHLY REFEREED JOURNAL OF RESEARCH IN SCIENCE & TECHNOLOGY www.abhinavjournal.com
SOFTWARE DEVELOPMENT LIFE CYCLE (SDLC) ANALYTICAL COMPARISON AND SURVEY ON TRADITIONAL AND AGILE METHODOLOGY Sujit Kumar Dora 1 and Pushkar Dubey 2 1 Programmer, Computer Science & Engineering, Padmashree
CCPM: TOC Based Project Management Technique
CCPM: TOC Based Project Management Technique Prof. P.M. Chawan, Ganesh P. Gaikwad, Prashant S. Gosavi M. Tech, Computer Engineering, VJTI, Mumbai. Abstract In this paper, we are presenting the drawbacks
Executive Guide to SAFe 24 July 2014. An Executive s Guide to the Scaled Agile Framework. [email protected] @AlShalloway
An Executive s Guide to the Scaled Agile Framework Al Shalloway CEO, Net Objectives Al Shalloway CEO, Founder [email protected] @AlShalloway co-founder of Lean-Systems Society co-founder Lean-Kanban
Oracle Real Time Decisions
A Product Review James Taylor CEO CONTENTS Introducing Decision Management Systems Oracle Real Time Decisions Product Architecture Key Features Availability Conclusion Oracle Real Time Decisions (RTD)
Bottlenecks in Agile Software Development Identified Using Theory of Constraints (TOC) Principles
Master thesis in Applied Information Technology REPORT NO. 2008:014 ISSN: 1651-4769 Department of Applied Information Technology or Department of Computer Science Bottlenecks in Agile Software Development
The Study on the Effect of Background Music on Customer Waiting Time in Restaurant
Send Orders for Reprints to [email protected] The Open Cybernetics & Systemics Journal, 2015, 9, 2163-2167 2163 Open Access The Study on the Effect of Background Music on Customer Waiting Time
Gerard Mc Nulty Systems Optimisation Ltd [email protected]/0876697867 BA.,B.A.I.,C.Eng.,F.I.E.I
Gerard Mc Nulty Systems Optimisation Ltd [email protected]/0876697867 BA.,B.A.I.,C.Eng.,F.I.E.I Data is Important because it: Helps in Corporate Aims Basis of Business Decisions Engineering Decisions Energy
Airline Fleet Maintenance: Trade-off Analysis of Alternate Aircraft Maintenance Approaches
2003 2004 2005 2006 2007 2008 2009 2010 Cost per Flight Hour (USD) Airline Fleet Maintenance: Trade-off Analysis of Alternate Aircraft Maintenance Approaches Mike Dupuy, Dan Wesely, Cody Jenkins Abstract
SIMPLIFIED PERFORMANCE MODEL FOR HYBRID WIND DIESEL SYSTEMS. J. F. MANWELL, J. G. McGOWAN and U. ABDULWAHID
SIMPLIFIED PERFORMANCE MODEL FOR HYBRID WIND DIESEL SYSTEMS J. F. MANWELL, J. G. McGOWAN and U. ABDULWAHID Renewable Energy Laboratory Department of Mechanical and Industrial Engineering University of
Extensive operating room (OR) utilization is a goal
Determining Optimum Operating Room Utilization Donald C. Tyler, MD, MBA*, Caroline A. Pasquariello, MD*, and Chun-Hung Chen, PhD *Department of Anesthesiology and Critical Care Medicine, The Children s
Mapping an Application to a Control Architecture: Specification of the Problem
Mapping an Application to a Control Architecture: Specification of the Problem Mieczyslaw M. Kokar 1, Kevin M. Passino 2, Kenneth Baclawski 1, and Jeffrey E. Smith 3 1 Northeastern University, Boston,
Lean Software Development and Kanban
1 of 7 10.04.2013 21:30 Lean Software Development and Kanban Learning Objectives After completing this topic, you should be able to recognize the seven principles of lean software development identify
Scrum in a Large Project Theory and Practice
Scrum in a Large Project Theory and Practice Agile World 2012 Munich, July 12, 2012 Dr. Sebastian Stamminger Scrum in Large Projects Agenda Theory Case Study Teams Our Process Challenges Lessons Learned
9. Model Sensitivity and Uncertainty Analysis
9. Model Sensitivity and Uncertainty Analysis 1. Introduction 255 2. Issues, Concerns and Terminology 256 3. Variability and Uncertainty In Model Output 258 3.1. Natural Variability 259 3.2. Knowledge
Governments information technology
So l u t i o n s Blending Agile and Lean Thinking for More Efficient IT Development By Harry Kenworthy Agile development and Lean management can lead to more cost-effective, timely production of information
for Oil & Gas Industry
Wipro s Upstream Storage Solution for Oil & Gas Industry 1 www.wipro.com/industryresearch TABLE OF CONTENTS Executive summary 3 Business Appreciation of Upstream Storage Challenges...4 Wipro s Upstream
Dynamic Simulation and Supply Chain Management
Dynamic Simulation and Supply Chain Management White Paper Abstract This paper briefly discusses how dynamic computer simulation can be applied within the field of supply chain management to diagnose problems
