Tools and Methods Based on Knowledge Elicitation to Support Engineering Design


Università Politecnica delle Marche
Facoltà di Ingegneria, Dipartimento di Meccanica
Scuola di Dottorato di Ricerca della Facoltà di Ingegneria
Curriculum: Ingegneria Meccanica e Gestionale, IX ciclo nuova serie

Tools and Methods Based on Knowledge Elicitation to Support Engineering Design

Academic tutor: Prof. Ferruccio Mandorli
Course coordinator: Prof. Nicola Paone
PhD candidate: Ing. Paolo Cicconi

November 2010

Index

Summary
Introduction
Engineering tools and methods
    Knowledge based engineering
    Expert systems
    Developing framework for expert systems
    Modularity and configuration
    Configuration Virtual Prototypes
    Knowledge Based Systems Analysis and Design Support
    MOKA
    Object-Oriented Design
Knowledge formalization and representation
    Knowledge classification
    Knowledge elicitation methods
        Direct/Indirect
        Interaction type
        Type of knowledge obtained
    Knowledge: implicit, explicit, tacit
    Knowledge sharing
Methodology approach
    Traditional design method
    Common framework to optimize design method
    Activity to formalize knowledge
    Basic methodology level
    Intermediate automation level
    High automation level
    Evaluating business skills
    Evaluating designer skills
    Knowledge base
    Developing methods
    Developing basic automation level
    Developing intermediate automation level
    Developing high automation level
    Who does designing and developing?
    Risk evaluation
    Optimal automation level
Small enterprises: test case of FGR Srl
    Product description: Drip Emitters
    Introduction to drip emitter design
    Proposed approach
    Drip emitters: design process
    A knowledge base for supporting dripper design
    Dripper design parameters
    Chosen test cases description
    Product virtual analysis
    Experimental tests
    Design parameters discussion and correlation
    New dripper design method
Medium enterprises: test case of G.I.&E. Spa
    Context and plant lay-out design
    Solving the routing problem
    Approach to layout knowledge management
    Gathering of a knowledge base for layout configuration
    Knowledge formalization and representation
    Technology employed for knowledge management
    System implementation
    Software architecture
    Routing approach
    Routing algorithm
    From 2D to 3D visualization
    Energy performance knowledge management
    Test case plant
Conclusion
Appendix A: Object-Oriented programming
List of publications
Bibliography
Acknowledgements

Summary

Successful companies are those that have been able to develop their human resources and to create the conditions in which knowledge and know-how can evolve. This does not happen by chance: it begins when a company realizes that knowledge is its greatest asset. These considerations become ever more pressing given the evolution of Western economies and the difficulty of satisfying consumers who increasingly demand smaller quantities of far more customizable products. The challenge for Italian engineering enterprises is to sell quality products while continuing to develop innovative solutions quickly and keeping costs down. To succeed, they have to invest in the most important part of an industrial company, the design process, in order to secure the company's future with dedicated strategies for innovation and technology. There is no single guide to optimizing a given type of design, but there are many alternatives, both commercial and custom-built, for engineering and automating the industrial design process. This research thesis analyses the state of the art of Knowledge Based Systems and their implementation issues. A deep analysis of Knowledge Elicitation was required in order to capture, and then formalize, not only explicit knowledge but also implicit and tacit knowledge. The typical design flow was carefully generalized in order to identify its critical points and to establish a basis for introducing process improvements. Three levels of design methodology are proposed, one for each degree of automation. The basic level of automated design relies on designer experience: designers with good engineering skills and basic computer skills develop small tools, such as feature libraries or customized variable tables, to reduce repetitive tasks in the planning stage. At the intermediate level of automation, the designer needs stronger engineering skills, because design is based on virtual prototyping, that is, on the engineering use of commercial applications that automate calculations for rapid physical assessment of the product. At the highest level of automation, design is completely based on business knowledge implemented in dedicated software applications; in addition to engineering expertise, skills are then required in object-oriented programming and, in particular, in the programming interfaces provided by many engineering software packages.

In formulating the methodology of this thesis, many enterprises located in the Marche region were followed; for the discussion, however, only the two most significant application examples are reported. In the first test case, a working methodology based on virtual prototyping was evaluated and implemented; in the second, knowledge was thoroughly formalized in order to build a software application dedicated to the specific requirements of the designers.

Introduction

This thesis presents three years of PhD research work at the Mechanical Department of the Polytechnic University of Marche. The research contribution was shared between the University and an Italian SME called FGR Srl. This introduction outlines the global context, the engineering design tasks involved, and the importance of knowledge elicitation.

The 21st century is the scene of an extreme form of globalization. Local markets have rapidly been transformed into a global market with few rules and little ethics, in which the main sales channel is the worldwide computer network. Recent events have shown how aware Europe and America are of a period of economic crisis driven by sudden changes. Many exit strategies have been attempted, but the first solutions have been to encourage, with monetary incentives, business and resources in local areas, increasing regional wealth, saving high transport costs and reducing many risks. Some economists have seen in this strategy the possibility of a deglobalization, which should be recognized as a new, intelligent way of managing the global market. Today an organization can survive only through mutual aid with its neighbours in a geographical area: individual results are not the most important goals; what counts are the results of the community.

Against this complex background, companies must raise their competitiveness in the global market in order to survive. In the mechanical production field this objective can be pursued by reducing time-to-market and increasing product quality. But competitiveness goes beyond production: the concept also covers the supply chain, and it is valid for service companies too. In particular, the author's research focuses on engineering design, the most influential and decisive aspect of an enterprise.

In industrial design, computer-aided tools and methods are widely used to reduce time and costs and to optimize the design process. Good product design in a short time is achievable through sophisticated virtual models that represent all functional and manufacturing aspects. Nowadays, virtual prototyping technologies permit high-level simulation of many design aspects: geometry, kinematics, strength, fluid dynamics, production processes, and so on. By performing a wide range of such analyses on a computer, the number of physical product prototypes can be greatly reduced. That means lower costs, shorter time-to-market and an overall higher quality level.

However, this approach presents some problems, especially for Small and Medium Enterprises (SMEs). These firms often operate in low-value, mass-product markets, where performance and quality are all the more important in order to maintain, and possibly increase, market share and to resist competitors from emerging countries. Virtual prototyping techniques therefore need to be employed effectively in design departments. On the other hand, SMEs often lack the resources and competences to employ such systems effectively. The software is usually expensive and requires highly skilled, dedicated operators; by contrast, people in a small department need broad, generic knowledge in order to cope with many different design aspects. As a result, the tools used are often limited to CAD systems, and product optimization is performed by time-consuming trial and error. For instance, even when the need for Finite Element Analysis (FEA) or Computational Fluid Dynamics (CFD) tools is recognized, their cost is too high compared to product value and batch sizes.

This research work aims to develop methodologies that improve the use of tools and methods based on knowledge elicitation. The main objective is to analyse how to elicit and formalize knowledge in a real company and then to build a working method based on commercial or tailored tools. To introduce tailored applications, a brief state of the art is described, starting from expert systems and arriving at KBE systems. The object-oriented approach is then explained, with particular attention to its relation with modularity and configuration. Great importance is given to knowledge management, and especially to the elicitation phase, which captures hidden tacit knowledge from its holders. Non-explicit knowledge is the core of a company's know-how: by formalizing implicit expertise it is possible to understand many of a company's choices and actions, to improve product and production design, and to facilitate the designers' duties. The research investigates three levels of methodology, each corresponding to a degree of automation in engineering design: the high level makes intensive use of commercial and customized tools to eliminate all repetitive and lengthy phases of a typical engineering design; the intermediate level reduces design time through the intelligent use of commercial tools; and the basic level is a structured outline of operations integrated with entry-level computer-aided tools. The research results have been applied to two different test cases, one in a small enterprise and one in a medium enterprise. Each case is described and discussed in depth together with its main goals.

1 Engineering tools and methods

Nowadays there are many types of software tools that aid engineering work in industrial design. CAD (Computer Aided Design) systems are the best-known systems used in technical departments to support product drafting. This concept is often reduced to computer-aided geometrical representation, but such a view does not do justice to modern CAD systems. CAD competence goes beyond pure geometrical modelling and offers the possibility of building a parametric, configurable product enriched with expert knowledge. Many CAD systems also offer functions similar to CAE (Computer Aided Engineering) systems, including finite element analysis for multiphysics simulation. There are therefore many commercial applications to choose from, but these systems address generic problems rather than particular, customized situations. Standard tools come with methods for their own operation, not for the design process as a whole, whereas a design methodology is required to increase the value of a mechanical design department, improve quality, and efficiently reduce costs and time-to-market. This section briefly presents the classic knowledge-based theory for supporting engineering design in an industrial context.

1.1 Knowledge based engineering

Engineering knowledge is complex, diversified and interrelated, involving implicit knowledge, tacit knowledge, background knowledge and underlying assumptions (R. Brimble, 2000). Today the key question is how to manage knowledge effectively in a company. The main goals are to capture and reuse knowledge so as to reduce the time needed to find solutions when building products. Knowledge Based Engineering (KBE) automates the engineering design phase by embedding engineering knowledge within software applications. However, the development and maintenance of complex KBE applications requires both Knowledge Engineering and Software Engineering techniques: knowledge must be captured, represented, validated and maintained; software must be specified, designed, coded and tested. A systematic approach is therefore very important within large development teams in order to facilitate long-term maintenance and reuse.

Knowledge Based Engineering (KBE) is a technical domain comprising methodologies and tools to acquire, formalize and represent in IT systems the knowledge of a specific application field. KBE is a special type of Knowledge Based System with a particular focus on product engineering design and downstream activities such as analysis, manufacturing, production planning, cost estimation and even sales. Such applications aim to shorten the product configuration phase, to aid decision-making activities and to automate repetitive procedures. Nowadays many companies try to invest in KBE systems. Configuration is often applied in consolidated production situations to standardize functional groups and improve economies of scale. By means of a suitable analysis, it is possible to determine a product platform for future production. A further development is the definition of variants through the assembly of "intelligent" modules that encapsulate the configuration rules and the design parameters (F. Mandorli, 2000). However, this research focuses on cases whose final solutions cannot be determined explicitly on the basis of specific design parameters alone. Here the final configuration is the result of many design activities, and the impact of each single selection or choice needs to be assessed in terms of cost, performance, assemblability and so on. In the absence of decision support tools, such a task is generally performed intuitively on the basis of the expert's personal skill. In order to evaluate alternative solutions, the designer must be able to manage the different types of knowledge that make up the configuration model. The goal is to develop a system to support the expert during decision-making. The problem of formalizing, integrating and structuring the different types of knowledge involved in both the design for configuration phase and the configuration of the solution phase is therefore a crucial point. The implementation of this support tool requires knowledge of the product domain, which can be classified into at least two kinds: explicit knowledge and tacit knowledge. Explicit knowledge is rational and sequential, and can be found in books, manuals and catalogues. Tacit knowledge, on the contrary, is more closely linked to individual experience, and is therefore very difficult to describe. Knowledge is mainly drawn from the development team, made up of people with different tasks and composed of internal and external collaborators. In SMEs some competences cannot be found internally because of the reduced staff, so it is important to formalize and store this knowledge in order to avoid continuous outsourcing expenses (M. Cederfelt, 2005) (P. Bermell-Garcìa, 2001).

Knowledge recovery should be carried out in such a way as to gather information without slowing down the enterprise's activities. In this analysis phase the basis for future development is established, since rules and tacit knowledge are collected. The development phase then follows: the expert team defines the tasks and implements a methodology and the related tools. The third step is the system test, in which the tools start to be employed in the design department. Generally, KBE is the basis for developing Knowledge Based Systems (KBSs), artificial intelligence applications that support the designer's activity by imitating human problem solving and supporting decision-making, learning and action in complex problems. A KBS is a technology based on a knowledge base and on methods to recognize similar situations. From the above categorization it emerges that KBS implementation requires formalization approaches suited to the specific problem domain and type.

1.2 Expert systems

Expert systems are computer programs intended to solve real-world problems, achieving the same level of accuracy as human experts (Shadbolt, et al., 1989). One of the greatest obstacles in expert system analysis is the acquisition of the knowledge that human experts use in their problem solving; this issue is central to the development of knowledge-based systems (Hayes-Roth, et al., 1983). Expert systems are available in a large number of areas (Durkin, 1993), such as control, design, diagnosis, instruction, interpretation, monitoring, planning, prediction, selection and simulation. These systems evolved as the first commercial products of Artificial Intelligence (Hasan, et al., 2011); eleven categories of expert system methodologies can be defined (Liao, 2004):

- Rule-based systems
- Knowledge-based systems
- Neural networks
- Fuzzy expert systems
- Object-oriented methodology
- Case-based reasoning
- Modelling
- System architecture
- Intelligent agents
- Ontology
- Database methodology

An expert system is software based on captured human knowledge, used for problem solving without consulting human experts. These applications of artificial intelligence were introduced by researchers in the Stanford Heuristic Programming Project; principal contributors to the technology were Bruce Buchanan, Edward Shortliffe, Randall Davis, William Van Melle, Carli Scott, and others at Stanford. The principal distinction between expert systems and traditional problem-solving programs is the way in which the problem-related expertise is coded. In traditional applications, problem expertise is encoded in both program and data structures; in the expert system approach, all of the problem-related expertise is encoded in data structures. An example related to tax advice contrasts the two approaches. In the traditional approach, data structures describe the taxpayer and the tax tables, while the program contains rules (encoding expert knowledge) that relate information about the taxpayer to tax table choices. In the expert system approach, the latter information is also encoded in data structures, called the knowledge base (or rulebase); the program then becomes an inference engine that is relatively independent of the problem domain (taxes) and processes the rules without regard to the problem area they describe, the processing sequence or the focus. A minimal sketch of this organization follows the list below. This organization of an expert system has several benefits:

- New rules can be added to the knowledge base without rebuilding the program, allowing changes to be made to a system rapidly.
- Rules are arguably easier for (non-programmer) domain experts to create and modify than code. Commercial rule engines typically come with editors that allow rule creation and modification through a graphical user interface, and that also perform checks such as consistency and redundancy checks.
- Modern rule engines allow a hybrid approach: some allow rules to be "compiled" into a form that is more efficiently machine-executable. Also for efficiency reasons, rule engines allow rules to be defined more expressively and concisely by letting software developers create functions in a traditional programming language such as Java, which can then be invoked from either the condition or the action of a rule.
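As an illustration of the "rules as data" organization described above, the following minimal sketch in Python separates a declarative rulebase from a small, domain-independent engine that forward-chains over it and records which rules fired. The tax-flavoured rules and thresholds are invented placeholders, not real tax knowledge or the structure of any actual system.

```python
# Minimal sketch of "rules as data": the rulebase is plain data, and a small,
# domain-independent engine applies it. The tax rules below are invented.

facts = {"income": 30000, "dependents": 2}

# Each rule: (name, condition over the facts, conclusion to assert).
rules = [
    ("low-income", lambda f: f["income"] < 40000, ("bracket", "low")),
    ("family", lambda f: f["dependents"] >= 2, ("deduction", "family")),
    ("family-low", lambda f: f.get("bracket") == "low"
                             and f.get("deduction") == "family", ("form", "short")),
]

def infer(facts, rules):
    """Forward-chain until no rule adds a new fact; record fired rules."""
    fired = []
    changed = True
    while changed:
        changed = False
        for name, cond, (key, value) in rules:
            if key not in facts and cond(facts):
                facts[key] = value
                fired.append(name)  # trace of how each conclusion was reached
                changed = True
    return facts, fired

facts, fired = infer(facts, rules)
print(facts)  # includes the derived 'bracket', 'deduction' and 'form' facts
print(fired)  # ['low-income', 'family', 'family-low']
```

Note that adding a new tuple to `rules` changes the system's behaviour without touching `infer`, which is exactly the benefit claimed in the first bullet above.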

In many expert systems a rulebase and an inference engine cooperate to simulate the reasoning process that a human expert follows in analysing a problem and arriving at a conclusion. To simulate human reasoning, these systems need a vast amount of knowledge stored in the knowledge base. Generally, the knowledge base of such an expert system consists of a relatively large number of if-then statements, interrelated in a manner that, in theory at least, resembles the sequence of mental steps involved in human reasoning. Because of the large storage capacity needed for the rulebase and its related programs, most expert systems have in the past been run only on large information handling systems. Recently, the storage capacity of personal computers has increased to the point where it is becoming possible to run some simple expert systems on them. In some applications, the nature of the application and the amount of stored information necessary to simulate human reasoning are simply too vast to hold in the active memory of a computer. In other applications, not all of the information is always needed in the reasoning process. For example, when an expert system uses a single integrated rulebase to diagnose the minimum configuration of a data processing system, much of the rulebase is not required, since many optional components will not be present in the system. Nevertheless, early expert systems required the entire rulebase to be stored, since all the rules were in effect chained or linked together by the structure of the rulebase. When the rulebase is segmented, preferably into contextual segments or units, it becomes possible to eliminate the portions of the rulebase containing data or knowledge that are not needed in a particular application. Segmenting the rulebase also allows the expert system to run on systems with much smaller memory capacities than earlier arrangements, since each segment of the rulebase can be paged into and out of the system as needed. Segmenting the rulebase into contextual segments requires the expert system to manage various inter-segment relationships as segments are paged into and out of memory during execution. Since the system permits a rulebase segment to be called and executed at any time during the processing of the first rulebase, provision must be made to store the data accumulated up to that point, so that later, when the system returns to the first segment, it can proceed from the last rule node processed. Provision must also be made so that data collected by the system up to that point can be passed to the second segment of the rulebase after it has been paged into the system, and data collected during the processing of the second segment can be passed back to the first segment when the system returns to complete its processing.
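The segmentation mechanism just described might be sketched as follows. This is a hypothetical toy, not the design of any particular system: segments are paged in only when needed, and the accumulated facts are handed from one segment to the next and back.

```python
# Hypothetical sketch of rulebase segmentation: contextual segments are
# loaded on demand instead of holding the whole rulebase in memory, and
# accumulated facts are passed between segments. All names are invented.

SEGMENTS = {  # stand-ins for segments stored outside main memory
    "base":    [("needs_printer_check", lambda f: f.get("has_printer"))],
    "printer": [("printer_ok", lambda f: f.get("printer_online"))],
}

loaded = {}

def load_segment(name):
    """Page a segment in only when one of its rules is actually needed."""
    if name not in loaded:
        loaded[name] = SEGMENTS[name]  # a real system would read from disk here
    return loaded[name]

def run(facts):
    for name, cond in load_segment("base"):
        if cond(facts):
            facts[name] = True
    if facts.get("needs_printer_check"):
        # Suspend the base segment, carry the accumulated facts into the
        # second segment, and resume with whatever it concluded.
        for name, cond in load_segment("printer"):
            if cond(facts):
                facts[name] = True
    return facts

print(run({"has_printer": True, "printer_online": True}))
```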

The term "expertise" implies specialized knowledge held by an expert. Expertise management differs from knowledge management in that it aims to find the people holding particular types of specialized knowledge or skill; the main question is the "know-who" aspect. Expert rules are created from the domain expertise, and the knowledge base stores the rules of the expert system. Three kinds of individual generally interact with an expert system:

- the end user, who uses the system for problem-solving assistance;
- the problem domain expert, who builds and supplies the knowledge base, providing the domain expertise;
- the knowledge engineer, who assists the experts in determining the representation of their knowledge, enters this knowledge into an explanation module, and defines the inference technique required to obtain useful problem-solving activity. Usually the knowledge engineer represents the problem-solving activity in the form of rules, which is referred to as a rule-based expert system.

An understanding of the inference rule concept is important for understanding expert systems. An inference rule is a statement with two parts, an "if" clause and a "then" clause. This kind of rule is what gives expert systems the ability to find solutions to diagnostic and prescriptive problems. An expert system's rulebase is made up of many such inference rules. They are entered as separate rules, and it is the inference engine that uses them together to draw conclusions. Because each rule is a unit, rules may be deleted or added without affecting other rules (though this can affect which conclusions are reached). One advantage of inference rules over traditional programming is that they use reasoning which more closely resembles human reasoning: when a conclusion is drawn, it is possible to understand how it was reached. Furthermore, because the expert system uses knowledge in a form similar to the expert's own, it may be easier to elicit this information from the expert. Compared to traditional programming techniques, expert-system approaches provide added flexibility (and hence easier modifiability) through the ability to model rules as data rather than as code. In situations where an organization's IT department is overwhelmed by a software-development backlog, rule engines, by facilitating turnaround, provide a means that can allow the organization to adapt more readily to changing needs.

In practice, modern expert-system technology is employed as an adjunct to traditional programming techniques, and this hybrid approach allows the strengths of both to be combined. Rule engines thus allow control through programs (and user interfaces) written in a traditional language, and also incorporate necessary functionality such as interoperating with existing database technology. However, an expert-system/rule-based approach is not optimal for all problems, and considerable knowledge is required not to misapply the technology. Ease of rule creation and modification can be double-edged: a system can be sabotaged by a non-knowledgeable user who is all too easily enabled to add worthless rules, or rules that conflict with existing ones. Reasons for the failure of many systems include the absence (or negligent use) of facilities for system audit, detection of possible conflicts, and rule lifecycle management (e.g. version control and thorough testing before deployment). The problems to be addressed here are as much organizational as technological.

1.3 Developing framework for expert systems

Systems known as Knowledge-Based Engineering (KBE) or, later, Rule-Driven Design (RDD) have been developed since the 1980s in order to improve design productivity (Raffaeli, et al., 2009). The first notable commercial systems of this type were ICAD and The Concept Modeller. They were already based on an object-oriented (O-O) language to define the product architecture, part parameters, and the methods to choose, dimension and build assemblies. Windows-based applications soon appeared, such as RuleStream and Selling Point by Oracle (Mandorli, et al., 2002) (Mandorli, et al., 2001). The target was set on small enterprises, with perhaps a single mechanical engineer having good know-how in CAD and programming. A typical application is built on an object-oriented programming platform or shell. The core of a KBE shell is the interpreter of the generative design language and the dependency manager that drives model re-generation when model inputs change. Finally, it is worth mentioning that high-end CAD packages offer environments, often referred to as knowledge modules, that enable dimensioning rules to be defined on the basis of design parameters (Bodein, et al., 2009). Even when integrated with the CAD system and its represented entities, those environments often require coding in dedicated languages to define conditional expressions, and they show limited functionality.

Moreover, a generic, integrated approach to working with parametric associative CAD systems is needed, since in real design environments many difficulties arise in obtaining well-structured models. Analysis and use in real industrial contexts have revealed that, although significant improvements have been made in user-system interaction, tool interfaces and IT infrastructure, the methodological approach is still weak. In fact, product configuration software tools have appeared on the market with very little success (Schotborgh, et al., 2009), and the majority of the applications developed have been restricted to the sales support field, where implementation details can be neglected. The main causes can be recognized as:

- the difficulty of eliciting, representing and formalizing design knowledge;
- the need to continuously incorporate new technologies, new standards and new requirements as the product evolves;
- the need for engineers with programming competence for proper system use and maintenance.

These tools were conceived as product platform development environments in which designers were often asked to cope with data structures and coding. Even where users could update the system, they were not willing, or not prepared, to do so. The focus was set on obtaining geometrical solutions as the result of choices among a series of variants. Design knowledge is usually represented by graphs, sets of formulas and code routines. Although the tools are general-purpose and flexible, they require a great deal of effort to implement new design solutions; as a result, they remain strictly bound to the specific application field. These systems show two important gaps (Raffaeli, et al., 2010). The first is the distance between the sales and the technical departments: as mentioned before, new product specifications and requirements should be incorporated without costly implementation phases and requirement translations. Secondly, knowledge about the product and the production system should be formalized in a way that is familiar to the designer's background. The design for configuration phase must be accomplished through the introduction of alternative product structures, design parameters, parametric rules and conditional expressions, in order to express the design rationale completely. Such input is still based on trees, rules, expressions, code and basic shapes: the distance between the system and the designer is still considerable.
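As a rough illustration of the dependency manager role mentioned above for KBE shells, the one that drives model re-generation when inputs change, the following toy sketch re-evaluates derived values whenever a parameter is set. All names and formulas are invented for illustration; they do not come from any real KBE shell.

```python
# Toy sketch of a dependency manager: derived model values are declared as
# functions of parameters, and changing an input re-generates the dependents.

class Model:
    def __init__(self):
        self.params = {}
        self.rules = {}  # derived name -> (input names, function)

    def define(self, name, inputs, fn):
        self.rules[name] = (inputs, fn)

    def set(self, **kwargs):
        self.params.update(kwargs)
        self.regenerate()

    def regenerate(self):
        # Re-evaluate derived values in declaration order; a real dependency
        # manager would topologically sort them by their dependencies.
        for name, (inputs, fn) in self.rules.items():
            if all(i in self.params for i in inputs):
                self.params[name] = fn(*(self.params[i] for i in inputs))

m = Model()
m.define("shaft_diameter", ["torque"], lambda t: max(20, 0.01 * t))
m.define("bearing_size", ["shaft_diameter"], lambda d: d + 5)
m.set(torque=3000)
print(m.params["shaft_diameter"], m.params["bearing_size"])  # 30.0 35.0
```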

1.4 Modularity and configuration

These theories find interesting application in design automation, since configuration is often applied in consolidated production situations to standardize functional groups and improve economies of scale. In fact, a product is by definition modular if it is possible to accomplish a rapid, low-cost design modification that fully satisfies the market requirements (Raffaeli, et al., 2010). Generally speaking, the term design automation in mechanical design covers a wide range of tasks, from the automated generation and modification of detailed geometry using parametric modelling, through the automated generation of components in embodiment design, to the automated generation of new concepts. However, most current computer-aided (CAx) tools claiming design automation capabilities are limited to supporting the detail design phase by automating modifications to existing geometry, as reviewed in the next paragraph. Research in design synthesis automation aims to extend CAx tools with support for the rapid, automatic generation of topologically different design variants for a given problem specification. In mechanical engineering applications, however, apart from the specific area of structural topology optimization, few methods have made the transition from research to practice. Research in computational synthesis has been under way for several decades. An interesting review can be found in Lin (Lin, et al., 2009), where conventional knowledge-based engineering systems using symbolic representations are compared to shape grammars, which operate on shapes directly, both to generate new shapes and to transform the geometry of existing ones (Stiny, 1980). Typically, the design process of a configurable product takes place in two steps: the design for configuration and the configuration of the solution (Tiihonen, et al., 1999). Design for configuration is the process of defining the functional and technological properties, as well as the dimensioning and assembly rules, of the basic and optional components that can be combined to form the final product. Configuration of the solution is the process of defining a specific member of the product family that meets the user's needs, as sketched below.
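A minimal sketch of these two steps, assuming an invented conveyor-like product family, might look as follows: the family definition (modules, options and rules) is the outcome of design for configuration, while the `configure` function performs the configuration of the solution for one customer.

```python
# Sketch of the two-step process: "design for configuration" defines the
# family (modules, options, rules); "configuration of the solution" picks a
# valid member of the family. The conveyor example is invented.

FAMILY = {
    "modules": {
        "drive": ["gear_motor", "direct_motor"],
        "frame": ["steel", "aluminium"],
        "cover": [None, "standard_cover"],
    },
    # Rules constraining which combinations form a valid product.
    "rules": [
        lambda c: not (c["frame"] == "aluminium" and c["drive"] == "direct_motor"),
    ],
}

def configure(choices):
    """Return a product instance if the choices satisfy the family rules."""
    for module, options in FAMILY["modules"].items():
        if choices.get(module) not in options:
            raise ValueError(f"invalid option for {module}: {choices.get(module)}")
    for rule in FAMILY["rules"]:
        if not rule(choices):
            raise ValueError("configuration violates a family rule")
    return dict(choices)

print(configure({"drive": "gear_motor", "frame": "aluminium", "cover": None}))
```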

1.5 Configuration Virtual Prototypes

A Configuration Virtual Prototype (CVP) represents a class of potential product instances. A CVP is an extended product definition which includes requirements, solutions and geometry, as well as different types of design knowledge, in a structured manner. Three design domains have been recognized as crucial in a CVP definition, as shown in Figure 1-1: product specification, geometrical data and product knowledge.

Figure 1-1 Representation of the CVP framework

The first aspect is related to sales requirements, in terms of the product specification that leads to a particular product instance. Typically, specifications include the product layout in terms of module arrangement, marketing and functional requirements, required performances and the choice of design solutions. All these features are generally expressed by a set of parameters or options whose availability is constrained by reciprocal conditions. The main outcome is a set of specific design choices and a required product structure; the choice and arrangement of modules, as well as the detailed list of components, are mainly derived from these choices. The analysis of product development processes in several companies showed that the strategic choices related to new product lines are characterized, in the first phases, by the definition of Market Requirements (MRs) and Product Requirements (PRs) (Germani, et al., 2006). MRs are determined by marketing experts on the basis of customer analysis; PRs are the technical perspective of the product concept. Their elaboration leads to the realization of the specific new solution. They have to be interrelated and structured through priority and dependence relations. Generally, the design knowledge useful in the first product development phases, especially the tacit knowledge, is rarely formalized; when schematically represented, it is classified by flat methods, without structured hierarchical relations and using only one level of representation. In the CVP approach, information is organized on two levels: the first correlates the market requirements and the product requirements; the second translates them into functional requirements (FRs). The knowledge formalization and elaboration tasks are based on the Design Structure Matrix (DSM) methodology, which uses matrices to represent interdependency relations between market requirements and product requirements, as in the sketch below.
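A minimal DSM sketch, with invented requirements, might represent such interdependency relations as a binary matrix:

```python
# Minimal Design Structure Matrix (DSM) sketch relating market requirements
# (rows) to product requirements (columns); 1 marks a dependency.
# The requirements themselves are invented examples.

market_reqs = ["easy maintenance", "low noise", "compact size"]
product_reqs = ["modular casing", "damped supports", "integrated drive"]

dsm = [
    # modular  damped  integrated
    [1,        0,      0],  # easy maintenance
    [0,        1,      1],  # low noise
    [0,        0,      1],  # compact size
]

def related(mr):
    """List the product requirements a given market requirement depends on."""
    row = dsm[market_reqs.index(mr)]
    return [pr for pr, flag in zip(product_reqs, row) if flag]

print(related("low noise"))  # ['damped supports', 'integrated drive']
```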

The second aspect is related to the geometrical definition of the product in terms of CAD models and technical documentation. As mentioned before, a product is here regarded as an assembly of configurable modules which embody the rules for geometrical configuration. Basically, product components and assemblies are the results of parametric models originally defined interactively by the designer. These models embody the detailed geometric definition, as well as the dimensioning parameters and the strategies for building assemblies through mating constraints. Additionally, simplified layouts are considered a powerful means to input and define the product structure, the desired solution and the design parameters, and a CAD-independent product data structure is necessary to manage configuration tasks through standardized high-level functionalities.

Finally, knowledge management plays a significant role in the framework definition. While the product specifications fix which parts are necessary, and the geometrical data how those parts are shaped and assembled, the product knowledge contains the rules and constraints that establish why the functional requirements and details have been defined in a certain way. In the design process of a new product variant, or in a product redesign process, the CVP definition is updated through an acquisition process of new specifications, geometries and knowledge. Specifically, this knowledge covers the mapping of product or marketing requirements to functional requirements, the mapping of functional requirements to specific design solutions, the mapping of design solutions into product structures in terms of parts and assemblies, the dimensioning rules of parts and assemblies, and the multi-level assembly building strategy that forms the final product model. Knowledge for part dimensioning is normally expressed in terms of formulas and if-then-else statements. Part of it can be recovered from the parametric CAD models on which the CVP is based: they embody the explicit definition of design parameters and some dimensioning rules, which are more effectively input in the CAD system than in configuration tools. Assembly strategy, by contrast, has traditionally been based on procedures that position parts by coordinates in a completely unstructured manner; this approach requires onerous coding activity and ties the solution strongly to the specific product variant. In the CVP such knowledge is recognized as implicitly stored in assembly template models, where the designer has already defined a strategy through a sequence of mating constraints. An algorithm is presented hereafter to elicit this knowledge and use it as the source of the product assembly building strategy.
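Dimensioning knowledge of the kind described above, formulas plus if-then-else statements, can be sketched as follows. The flange relations and values are invented placeholders, not figures from a design code.

```python
# Sketch of part-dimensioning knowledge as a formula plus if-then-else rules.
# The flange example is illustrative only, not a real design standard.

def flange_thickness(pressure_bar, diameter_mm):
    """Formula part: thickness grows with pressure and diameter."""
    return 0.02 * diameter_mm + 0.5 * pressure_bar

def bolt_count(diameter_mm):
    """If-then-else part: discrete choices driven by a design parameter."""
    if diameter_mm <= 100:
        return 4
    elif diameter_mm <= 250:
        return 8
    else:
        return 12

d, p = 180, 16
print(flange_thickness(p, d), bolt_count(d))  # 11.6 8
```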

Traditional problems connected with tool revision can be overcome if the information exchange between these different domains is kept as automatic as possible. This means that tools must be provided to manage the three product information domains highlighted above. The designer is required to input information using semantics close to his or her standard working environment; that information is then automatically drawn into the system and transformed by the configuration tools into usable formalisms.

1.6 Knowledge Based Systems Analysis and Design Support

The Knowledge Acquisition and Documentation Structuring (KADS) research offers a structured development process for knowledge-based systems (Wielinga, et al., 1992). It focuses on Expert Systems (ES) that reason about situations with the goal of extending the situation description to reveal causality, i.e. what happens and why. To do this, the ES requires a knowledge base that describes facts, conditions, inferences and dependencies. This causality knowledge is challenging to acquire from domain experts, as they are experts in problem solving, not in explaining their solutions (Fensel, 1995). As a result, development methods for ES moved away from the concept of knowledge acquisition as direct knowledge transfer and instead introduced a Knowledge Engineering (KE) process. During this KE process, a specialized knowledge engineer develops, or designs, a computational model of an expert's knowledge. This cyclic process requires the knowledge engineer to observe and interpret the original knowledge, and to verify the correctness of the new computational model (Schotborgh, et al., 2009). KADS offers several semi-formal models to structure the knowledge of experts and aid the modelling activity. A further formalization of this approach is the Knowledge Acquisition and Representation Language, KARL (Fensel, 1995). KARL supports the knowledge engineer in formalizing knowledge into a software language. The result is a formal modelling language that can infer and reason without supervision, given certain strict mathematical conditions. Because of this formalization, KARL provides support such as graphical representation and an interpreter and debugger of knowledge. One advantage of this knowledge engineering approach is that the knowledge model can have high expressiveness, and its problem-solving capabilities can exceed those of a single expert. Summarizing, one could say that knowledge-based systems that reason about cause and effect require knowledge engineering, because causal knowledge is difficult to extract directly; the modelling is done in a formal modelling language because the reasoning algorithm requires mathematical rigour to work through the knowledge base autonomously.

The type of engineering design problems addressed in this thesis does not require causality knowledge, because only automation is required. This reduces the need for knowledge acquisition (Wielinga, et al., 1992) of causality knowledge, and for a formal mathematical language or reasoning algorithm.

1.7 MOKA

MOKA is a European research project that started in 1998 and was active for 30 months. It provides a methodology to develop Knowledge-Based Engineering (KBE) applications. MOKA is an acronym for Methodology and software tools Oriented to Knowledge based engineering Applications. Its goal, similar to that of this thesis, is to reduce the investment and risk of KBE development. Its scope is routine design engineering with a strong link to geometry (Stokes, 2001). The MOKA approach prescribes the knowledge engineering process and supports it with a software tool. A standardization of knowledge was developed, called ICARE (an acronym for Illustration, Constraint, Activity, Rule and Entity). ICARE is divided into a part that describes the design object (constraints, entities and illustrations) and a part that describes the design process (illustrations, activities and rules). The entire process of KBE development is described as follows:

1. knowledge gathering: collection of raw knowledge from design experts, giving a broad view of the design object, the processes, related aspects and background information;
2. structuring: development of the so-called Informal model of knowledge, divided into object information and design process descriptions; the ICARE concepts are used to facilitate this step and the next;
3. formalizing: refinement of the Informal model into a rigorous Formal model of the application knowledge, which is used to build the KBE system; this model consists of two sections, the Product Model, which describes the object and related knowledge, and the Design Process Model, which defines the execution and decision-making order, plus the process of selecting among choices;
4. implementation: software development of a KBE application.

MOKA focuses on the second and third steps. Software tools have been developed to allow non-KBE specialists to structure and formalize the relevant knowledge using the ICARE concepts. The process is methodologically described in the MOKA handbook (Stokes, 2001).
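The ICARE forms could, for instance, be captured as simple records while building the Informal model. The following sketch is a loose illustration of that idea; the record types echo the ICARE names, but the fields are assumptions, not the MOKA standard, and the dripper values are invented.

```python
# Hypothetical sketch of ICARE-style forms as simple records for the
# Informal model; field names are assumptions, not the MOKA standard.

from dataclasses import dataclass, field

@dataclass
class Entity:  # part of the design-object description
    name: str
    attributes: dict = field(default_factory=dict)

@dataclass
class Constraint:  # limits placed on entities
    entity: str
    expression: str

@dataclass
class Rule:  # part of the design-process description
    condition: str
    action: str

knowledge = [
    Entity("dripper_labyrinth", {"channel_width_mm": 0.8}),
    Constraint("dripper_labyrinth", "channel_width_mm >= 0.5"),
    Rule("flow_rate < target", "reduce channel length"),
]
for item in knowledge:
    print(type(item).__name__, item)
```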

Several commonalities and differences can be identified between MOKA and this thesis. The goals are quite similar: to reduce the development effort of knowledge-based software that supports the design process. But there are also differences. MOKA has a strong link to geometry and geometric modelling: it uses assemblies and parts explicitly. This thesis does not; it adopts the concepts of parameters and topological elements. MOKA does not prescribe the knowledge gathering step needed to determine the system boundaries and acquire the relevant knowledge; this thesis aims to do so. MOKA's scope is broadly comparable to that of this thesis, aiming at any engineering design knowledge. Perhaps because of this wide scope, MOKA handles the solving algorithm (the Design Process Model) as case-specific knowledge, whereas this thesis has a narrower scope but uses a generic solving algorithm. Finally, the MOKA project has defined a methodology to capture and formalize engineering knowledge, allowing it to be retained and exploited, for example within KBE applications (Brimble, et al., 2000).

1.8 Object-Oriented Design

An object contains encapsulated data and procedures grouped together to represent an entity. The object interface, that is, how the object can be interacted with, is also defined. An object-oriented program is described by the interaction of these objects. Object-oriented design is the discipline of defining the objects and their interactions to solve a problem that was identified and documented during object-oriented analysis. The use of the O-O representation of products and design knowledge is the fundamental characteristic that allows the correspondence between modular products and a KBE application (Mandorli, et al., 2000). Such systems were important contributions to representing knowledge and to automating and aiding design activities (Colombo, et al., 1992). Apart from the quality of their functionality and implementation, their main limitation lies in the capabilities required of the users. In many industrial fields, such as automotive or aerospace, this approach has been adopted to automate design procedures (Mandorli, et al., 2001) (Germani, et al., 2004). The naturalness and simplicity of Object-Oriented Analysis also make it an effective methodology for transferring knowledge from the problem expert to the computer expert implementing the application. Naturalness and simplicity derive from the fact that the basic concepts, namely objects and attributes, classes and members, systems and parts, correspond to the methods of organization that we are all accustomed to using to investigate and understand the reality that surrounds us (Mandorli, 1995).

Applying this method to the solution of our problem is quite natural once the basic concepts of the domain, component by component, geometrical, functional and technological, are interpreted in terms of objects, classes and properties.
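A short sketch of this mapping, with an invented gear component, shows how a component can become a class whose geometrical, functional and technological aspects become properties. The gear example is illustrative; only the pitch-diameter relation (pitch diameter = module × number of teeth) is standard gear practice.

```python
# Sketch of object-oriented analysis applied to a mechanical component:
# the component is a class, its aspects are properties, a part is an instance.

class Component:
    def __init__(self, name):
        self.name = name
        self.geometry = {}      # geometrical aspect
        self.function = None    # functional aspect
        self.technology = None  # technological aspect

class Gear(Component):
    def __init__(self, name, teeth, module_mm):
        super().__init__(name)
        self.geometry = {"teeth": teeth, "module_mm": module_mm}
        self.function = "transmit torque"
        self.technology = "hobbing"

    @property
    def pitch_diameter(self):
        # Standard relation: pitch diameter = module * number of teeth.
        return self.geometry["module_mm"] * self.geometry["teeth"]

g = Gear("output gear", teeth=40, module_mm=2.0)
print(g.function, g.pitch_diameter)  # transmit torque 80.0
```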

2 Knowledge formalization and representation

Knowledge acquisition is often based on the analysis of interviews with experts and on the gathering of experimental tests. It is therefore necessary to elicit knowledge, convert the tacit part into explicit knowledge, and formalize it. The eliciting phase focuses on who, or what, holds the knowledge.

Figure 2-1 Design knowledge eliciting process

2.1 Knowledge classification

A basic classification divides knowledge into three levels:

- Metaphysics level: a broad, specialized field concerned only with ultimate questions, some of which are difficult to answer within human logic and rationality (e.g. philosophy, para-psychology); the main questions concern life and death.
- Arts level: imaginative representation of subjects; the main problems lie in imagination and reasoning, where a correct answer is only a probability.
- Science level: knowledge about problems that can be proved and solved through experimental methods. This knowledge is used in rational problems, although in epistemology science does not exist without the arts and metaphysics, since these enclose the creative and imaginative capacities. The science level is the principal domain for solving engineering problems.

For each knowledge level it is possible to identify an arbitrary classification, which often coincides with the knowledge representation used in problem-solving methodologies.

2.2 Knowledge elicitation methods

Many Knowledge Elicitation (KE) methods have been used to obtain the information required to solve problems. These methods can be classified in many ways; one common way is how directly they obtain information from the domain expert.

25 p. 25 how directly they obtain information from the domain expert. Direct methods involve directly questioning a domain expert on how they do their job. In order that these methods could be successful, the domain expert has to be reasonably articulate and willing to share information. The information has to be easily expressed by the expert, which is often difficult when tasks frequently performed often become 'automatic.' Indirect methods are used in order to obtain information that cannot be easily expressed directly (Burge, 1998). There are many different knowledge elicitation techniques and it is often difficult to choose between them. Because the success of a knowledge elicitation effort is dependent on the technique chosen, much work has been done to classify knowledge elicitation techniques. In J. Burge research (Burge, 1998) there are discussed three KE categories: Direct/Indirect; Interaction Type; Type of Knowledge Obtained Direct/Indirect Direct methods are those that obtain the information directly from the expert, so the required information is obtained by asking direct questions or from direct observation. During KE sessions using direct techniques, the expert verbalizes the needed information. Instead, in indirect methods the needed information is not requested directly; so the results of the knowledge elicitation session must be analysed in order to extract the needed information. The analysis required depends on the technique and the goals of the knowledge elicitation session. The advantage of an indirect approach is that these methods can sometimes obtain additional information than that provided by direct methods, producing more information, in fact much knowledge may be implicit; another advantage is that some subjects are not as verbal as other subjects and are unlikely to give full and detailed answers to direct questions Interaction type The interaction type is based on knowledge grouping by the type of interaction with the domain expert. Table 2-1 Elicitation methods shows the categories and the type of information produced with typical results.

| Category | Examples | Type | Results |
|---|---|---|---|
| Interview | Structured; unstructured; semi-structured | Direct | Varies depending on questions asked |
| Case study | Critical incident method; forward scenario simulation; critical decision method | Direct | Procedures followed; rationale |
| Protocols | Protocol analysis | Direct | Procedures followed; rationale |
| Critiquing | Critiquing | Direct | Evaluation of problem-solving strategy compared to alternatives |
| Role playing | Role playing | Indirect | Procedures; difficulties encountered due to role |
| Simulation | Simulation; Wizard of Oz | Direct | Procedures followed |
| Prototyping | Rapid prototyping; storyboarding | Direct | Evaluation of proposed approach |
| Teachback | Teachback | Direct | Correction of misconceptions |
| Observation | Observation | Direct | Procedures followed |
| Goal related | Goal decomposition; dividing the domain | Direct | Goals and subgoals; grouping of goals |
| List related | Decision analysis | Direct | Estimate of worth of all decisions for a task |
| Construct elicitation | Repertory grid; multi-dimensional scaling | Indirect | Entities; attributes; sometimes relationships |
| Sorting | Card sorting | Indirect | Classification of entities |
| Laddering | Laddering | Indirect | Hierarchical map of the task domain |
| 20 questions | 20 questions | Indirect | Information used to solve problems; organization of problem space |
| Document analysis | Document analysis | Indirect | Varies depending on available documents; interaction with experts |

Table 2-1 Elicitation methods

Interviewing consists of asking the domain expert questions about the domain of interest and how they perform their tasks. The question list can be unstructured, semi-structured or structured. It is very difficult to know which questions to ask, especially if the interviewer is not familiar with the domain, so the success of an interview session depends on the questions asked and on the expert's ability to externalize their knowledge. The model is built by the knowledge engineer from the information obtained during the interview and then reviewed with the domain expert. In some cases the models can be built interactively with the expert, especially if software tools for model creation are available.

In case study methods, different examples of problems or tasks within a domain are discussed. The problems consist of specific cases that can be typical, difficult or memorable; these cases are used as a context within which directed questions are asked.

Protocol analysis involves asking the expert to perform a task while "thinking aloud". The intent is to capture both the actions performed and the mental process used to determine them. As with all direct methods, the success of protocol analysis depends on the expert's ability to describe why they are making their decisions. In some cases the expert may not remember why they do things a certain way, and in many cases the verbalized thoughts will be only a subset of the knowledge actually used to perform the task. One method used to augment this information is interruption analysis: the knowledge engineer interrupts the expert at critical points in the task to ask why they performed a particular action.

Critiquing is an approach in which the expert evaluates the results of previous KE sessions in order to validate them.

In role playing, the expert adopts a role and acts out a scenario in which their knowledge is used. The aim is to analyse the situation from a different perspective and to reveal information that was not discussed when the expert was asked directly.

The simulation method is used when it is not possible to actually perform the task, so the task is simulated using a computer system or other means. For example, the design process can be simulated using a multi-agent system that mimics a design team.

In prototyping, the expert is asked to evaluate a prototype of the proposed system being developed. This is usually done iteratively as the system is refined.

Teachback methods have the knowledge engineer teach the information back to the expert, who then provides corrections and fills in gaps.

In observation methods, the knowledge engineer observes the expert performing a task. This prevents the knowledge engineer from inadvertently interfering in the process, but provides no insight into why decisions were made.

In goal related methods, focused discussion techniques are used to elicit information about goals and subgoals. For example, in reclassification the domain expert is asked to specify what evidence would indicate that a given goal is the correct one. In list related methods, the expert is asked to provide lists of information, usually decisions.

So far, all the KE methods discussed have been direct methods. The first classification that includes indirect methods is construct elicitation, used to obtain information about how the expert discriminates between entities in the problem domain. The most commonly used construct elicitation method is Repertory Grid Analysis. The domain expert is presented with a list of domain entities and asked to describe the similarities and differences between them; these similarities and differences are analysed to derive the important attributes of the entities. After the initial list of attributes is complete, the knowledge engineer works with the domain expert to assign a rating to each entity/attribute pair. The type of rating depends on the information being elicited: in some cases attributes are rated as present/not present for each entity; in others a scale is used, and the attribute is ranked by the degree to which it is present.

Sorting methods are based on sorting domain entities to determine how the expert classifies their knowledge. The domain expert is presented with a list of entities to be sorted and asked to sort them either along pre-defined dimensions or along any dimension they feel is important. Subjects may be asked to perform multiple sorts, each using a different dimension.

In laddering, a hierarchical structure of the domain is formed by asking questions designed to move up, down and across the hierarchy; examples of such concepts are goals and subgoals, or tasks and subtasks. To elicit concepts higher in the hierarchy the expert is asked "why" questions; to elicit concepts lower in the hierarchy the expert is asked "how" questions; to elicit concepts at the same level, the expert is asked for alternatives.

The 20 questions method is used to determine how the expert gathers information. The knowledge engineer selects an item in the problem space (such as a problem or a solution), and the expert asks questions to determine which item the knowledge engineer has chosen.

Document analysis involves gathering information from existing documentation. It may or may not involve interaction with a human expert to confirm or add to this information. Some document analysis techniques, particularly those involving a human expert, can be classified as direct; others, such as collecting artifacts of performance (documents or notes) in order to determine how an expert organizes or processes information, are classified as indirect.
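A small sketch of a repertory grid, with invented entities, attributes and ratings, shows how the elicited ratings can be compared to see which entities the expert treats as similar:

```python
# Repertory-grid sketch: domain entities rated against elicited attributes;
# comparing rating vectors hints at how the expert discriminates between
# entities. Entities, attributes and ratings are invented examples.

entities = ["steel frame", "aluminium frame", "plastic frame"]
attributes = ["stiffness", "corrosion resistance", "cost"]

grid = {  # rating 1 (low) .. 5 (high) per attribute
    "steel frame":     [5, 2, 3],
    "aluminium frame": [4, 4, 4],
    "plastic frame":   [2, 5, 1],
}

def distance(a, b):
    """Crude dissimilarity between two entities' rating vectors."""
    return sum(abs(x - y) for x, y in zip(grid[a], grid[b]))

pairs = [(a, b) for i, a in enumerate(entities) for b in entities[i + 1:]]
for a, b in sorted(pairs, key=lambda p: distance(*p)):
    print(a, "vs", b, "->", distance(a, b))
```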

Type of knowledge obtained

Besides being grouped into direct and indirect categories, KE methods can also be grouped by the type of knowledge obtained. For example, many of the indirect KE methods are more useful for obtaining classification knowledge, while direct methods are more suited to obtaining procedural knowledge. The most important types of information elicited are: procedures; problem-solving strategies; goals and sub-goals; classifications; relationships; evaluations. Many methods fit into more than one category and are listed more than once. Moreover, this classification shows the information most commonly extracted with each method and does not imply that only that type of information can be elicited.

Knowledge: implicit, explicit, tacit

Generally speaking, according to Polanyi's research work (Polanyi, 1966) (Raelin, 1997), knowledge has a double aspect: tacit and explicit. Tacit knowledge is linked to individual experience, while explicit knowledge is rational and sequential. Combining explicit and tacit forms, knowledge can be distinguished into five categories: conceptualization, reflection, experimental, transactive and expert (Raelin, 1997) (Dooley, et al., 1999). Conceptualization knowledge is purely analytical and sourced from applied theory; reflection knowledge is linked to the direct observation of empirical data; experimental knowledge investigates the behaviour of phenomena within certain boundary conditions starting from an observational phase, with high validity; transactive knowledge is stored by individual group members and identifies the existence and location of knowledge held by others; finally, expert knowledge is the tacit knowledge acquired through experimental activity. Tacit knowledge is contrasted with explicit or propositional knowledge. Very loosely, tacit knowledge collects all those things that we know how to do but perhaps do not know how to explain (at least symbolically). The term tacit knowledge comes to us courtesy of Michael Polanyi, a chemical engineer turned philosopher of science. This biographical detail is not incidental, for Polanyi emerged from his laboratory with the news that the philosophers had

scientific practice all wrong: their account of how science proceeds was massively weighted towards the propositional, encoded, formulaic knowledge that is exchanged between laboratories, and almost totally ignorant of the set of skills required to actually work in one of those laboratories. Polanyi's plea is that we recognise the importance of this second, embodied (and hence personal) sort of knowledge, and that we collapse the hierarchy that sees hands-on skills and unwritten rules neglected and devalued while the propositional report is privileged. In that traditional view, tacit knowledge is messy, difficult to study, and of negligible epistemic worth, whereas proper knowledge exists in propositional form (which is, conveniently, much easier to study). Tacit knowledge is thus difficult to transfer to another person by writing it down or verbalising it. For example, stating to someone that screws are threaded is a piece of explicit knowledge that can be written down, transmitted, and understood by a recipient. However, the ability to speak a language, use algebra, or design and use complex equipment requires all sorts of knowledge that is not always known explicitly, even by expert practitioners, and that is difficult to transfer explicitly to other users. While the notion of tacit knowledge appears simple, it has far-reaching consequences and is not widely understood. People are often not aware of the tacit knowledge they possess or of how valuable it can be to others, and the effective transfer of tacit knowledge generally requires extensive personal contact and trust. Another example of tacit knowledge is the ability to ride a bicycle. The formal knowledge is that, to balance, if the bike falls to the left one steers to the left, and that to turn right the rider first steers to the left and then, when the bike leans, steers to the right; but knowing this formally is no help in riding a bicycle, and few riders are in fact aware of it. Tacit knowledge is not easily shared: it often consists of habits and culture that we do not recognize in ourselves. In the field of knowledge management, the concept of tacit knowledge refers to knowledge that is known only by an individual and is difficult to communicate to the rest of an organization; knowledge that is easy to communicate is called explicit knowledge. The process of transforming tacit knowledge into explicit knowledge is known as codification or articulation. The tacit aspects of knowledge are those that cannot be codified, but can only be transmitted via training or gained through personal experience. Tacit knowledge has been described as know-how, as opposed to know-what (facts), know-why (science), or know-who (networking). It involves learning and skill, but not in a way that can be written down.

The tension evident in the previous two paragraphs stems from Nonaka's model of organizational knowledge creation, in which he proposed that tacit knowledge could be converted into explicit knowledge. Tacit knowledge is thus presented both as uncodifiable ("tacit aspects of knowledge are those that cannot be codified") and as codifiable ("transforming tacit knowledge into explicit knowledge is known as codification"). This ambiguity is common in the knowledge management literature; Nonaka departed from Polanyi's original view of "tacit knowing" without empirical or conceptual foundation. The salient characteristic of the tacit knowledge approach is the basic belief that knowledge is essentially personal in nature and therefore difficult to extract from the heads of individuals (Sanchez). In effect, this approach to knowledge management assumes, often implicitly, that the knowledge in and available to an organization will largely consist of tacit knowledge that remains in the heads of individuals. Working from the premise that knowledge is inherently personal and will largely remain tacit, the tacit knowledge approach typically holds that the dissemination of knowledge in an organization can best be accomplished by transferring people, as knowledge carriers, from one part of the organization to another. Further, this view holds that learning in an organization occurs when individuals come together under circumstances that encourage them to share their ideas and (hopefully) to develop new insights together that will lead to the creation of new knowledge. Recommendations for knowledge management practice proffered by researchers and consultants working within the tacit knowledge approach naturally tend to focus on managing people as individual carriers of knowledge. To make wider use of the tacit knowledge of individuals, managers are urged to identify the knowledge possessed by various individuals in the organization and then to arrange the kinds of interactions between knowledgeable individuals that will help the organization perform its current tasks, transfer knowledge from one part of the organization to another, and/or create new knowledge useful to the organization. Let us consider some examples of current practice in each of these activities that are typical of the tacit knowledge approach. Most managers of organizations today do not know what specific kinds of knowledge the individuals in their organization possess. This common state of affairs is reflected in the lament usually attributed to executives of Hewlett-Packard in the 1980s: "If we only knew what we know, we could conquer the world." As firms become larger, more knowledge intensive, and more globally dispersed, the need for their managers to "know what we know" is becoming acute. Thus, a common initiative within the tacit knowledge approach is an effort to improve the understanding of who knows what in an organization -- an effort that is

sometimes described as an effort to create "know-who" forms of knowledge. An example of such an effort is the creation within Philips, the global electronics company, of "yellow pages" listing experts with different kinds of knowledge across Philips' many business units. Today, on the Philips intranet, one can type in the keywords for a specific knowledge domain and the yellow pages will retrieve a list of the people within Philips worldwide who have stated that they have such knowledge. Contact information is also provided for each person listed, so that anyone in Philips who wants to know more about that kind of knowledge can get in touch with the listed individuals. An example of the tacit knowledge approach to transferring knowledge within a global organization is provided by Toyota. When Toyota wants to transfer knowledge of its production system to new employees in a new assembly factory, such as the factory recently opened in Valenciennes, France, it typically selects a core group of two to three hundred new employees and sends them for several months of training and work on the assembly line in one of its existing factories. After several months of studying the production system and working alongside experienced Toyota assembly line workers, the new workers are sent back to the new factory site. These repatriated workers are accompanied by one or two hundred long-term, highly experienced Toyota workers, who then work alongside all the new employees in the new factory to ensure that knowledge of Toyota's finely tuned production process is fully implanted there. Toyota's use of Quality Circles also provides an example of the tacit knowledge approach to creating new knowledge. At the end of each work week, groups of Toyota production workers spend one to two hours analyzing the performance of their part of the production system to identify actual or potential problems in quality or productivity. Each group proposes countermeasures to correct identified problems, and discusses the results of the countermeasures taken during the week to address the problems identified the week before. Through personal interactions in such Quality Circle settings, Toyota employees share their ideas for improvement, devise steps to test new ideas, and assess the results of their tests. This knowledge management practice, repeated weekly as an integral part of the Toyota production system, progressively identifies, eliminates, and even prevents errors. As the improvements developed by Quality Circles have accumulated over many years, Toyota's production system has become one of the highest-quality production processes in the world (Bowen, et al., 1997). Implicit knowledge management employs tools, techniques and methodologies that capture these seemingly elusive thought processes and make them more generally available to

the organization. In this way, the thought processes used by the best thinkers become a leverageable asset for the organization. To accomplish this, one needs the ability to dissect the expert's explanation of the component steps of a process. This process is both science and art. Most importantly, it is essential not to begin with preconceived assumptions or to let opinions cloud the data collection process. If the analyst can keep an open mind, process logic or expertise can eventually be codified into a series of related modules. Much of the success of implicit knowledge mining resides in the ability of the interviewer to elicit the right level of detail from the expert, and not to assume too quickly that the reasoning behind certain approaches or tasks is not discernable. Often it is necessary to guide an expert through their own thought process, through the steps used to arrive at conclusions that the expert believes (and has convinced others to believe) are tacit or instinctive behaviour. This is not to say that all tacit knowledge can be converted into implicit knowledge: there will always be bodies of know-how and experience that remain tacit. Tacit knowledge is also not an effective way to achieve alignment between personal and organizational values (storytelling and mentoring are better ways to achieve value alignment). The goal of implicit knowledge management is to determine how much of the tacit knowledge in an organization defies any form of codification, and to mine that which does not. Once an organization is willing to accept that some of its tacit knowledge can be captured, it can initiate the process of identifying and documenting the portion labelled implicit. This process is advanced by structured methodologies that employ interviewing techniques and a schema for capturing thought processes. Explicit knowledge is knowledge that has been or can be articulated, codified, and stored in some medium, and it can be readily transmitted to others. The information contained in encyclopedias (including Wikipedia) is a good example of explicit knowledge. The most common forms of explicit knowledge are manuals, documents, procedures, and how-to videos. Knowledge can also be audio-visual: works of art and product designs can be seen as other forms of explicit knowledge in which human skills, motives and knowledge are externalized. In contrast to the views held by the tacit knowledge approach, the explicit knowledge approach holds that knowledge is something that can be explained by individuals, even though some effort and some forms of assistance may sometimes be required to help individuals

articulate what they know. As a result, the explicit knowledge approach assumes that the useful knowledge of individuals in an organization can be articulated and made explicit. Working from the premise that important forms of knowledge can be made explicit, the explicit knowledge approach also holds that formal organizational processes can be used to help individuals articulate the knowledge they have, in order to create knowledge assets. These explicit knowledge assets can then be disseminated within the organization through documents, drawings, standard operating procedures, manuals of best practice, and the like. Information systems are usually seen as playing a central role in facilitating the dissemination of explicit knowledge assets over company intranets or between organizations via the internet. The view that knowledge can be made explicit and managed explicitly is usually accompanied by the belief that new knowledge can be created through a structured, managed, scientific learning process. Experiments and other forms of structured learning can be designed to remedy important knowledge deficiencies, or market transactions and strategic partnering may be used to obtain specific forms of needed knowledge or to improve an organization's existing knowledge assets. The recommendations for knowledge management practice usually proposed by researchers and consultants working within the explicit knowledge approach focus on initiating and sustaining organizational processes for generating, articulating, categorizing, and systematically leveraging explicit knowledge assets. Some examples of knowledge management practice in this mode help to illustrate the approach. In the 1990s, Motorola was the global leader in the market for pagers. To maintain this leadership position, Motorola introduced new generations of pager designs at short, regular intervals. Each new pager generation was designed to offer more advanced features and options for customization than the preceding generation. In addition, a new factory with higher-speed, more flexible assembly lines was designed and built to produce each new generation of pager. To sustain this high rate of product and process development, Motorola formed teams of product and factory designers to design each new generation of pager and factory. At the beginning of their project, each new team of designers received a manual of design methods and techniques from the team that had developed the previous generation of pager and factory. The new team then had three deliverables at the end of its project: (i) an improved and more configurable next-generation pager design, (ii) the design of a more efficient and flexible assembly line for the factory that would produce the new pager, and (iii) an improved design

manual that incorporated the design knowledge provided to the team in the manual it had received, plus the new and improved design methods that the team had developed to meet the product and production goals of its project. This manual would then be passed on to the next design team given the task of developing the next generation of pager and its factory. In this way, Motorola sought to make explicit and capture the knowledge developed by its engineers during each project and to systematically leverage that knowledge in launching the work of the next project team. In addition to its tacit knowledge management practice of moving new employees around to transfer knowledge of its production system, Toyota also follows a highly disciplined explicit knowledge management practice of documenting the tasks that each team of workers and each individual worker is asked to perform on its assembly lines. These documents provide a detailed description of how each task is to be performed, how long each task should take, the sequence of steps to be followed in performing each task, and the steps to be taken by each worker in checking his or her own work (Bowen, et al., 1997). When improvements are suggested, either by solving problems on the assembly line as they occur or in the weekly Quality Circle meetings of Toyota's teams of assembly line workers, those suggestions are evaluated by Toyota's production engineers and then formally incorporated into revised task description documents. In addition to developing well-defined and documented process descriptions for routine, repetitive production tasks, some organizations have also created explicit knowledge management approaches to support more creative tasks such as developing new products. In the Chrysler unit of DaimlerChrysler Corporation, for example, several platform teams of development engineers are responsible for creating the next-generation platforms on which Chrysler's future automobiles will be based. Each platform team is free to actively explore and evaluate alternative design solutions for the many different technical aspects of its vehicle platform. However, each platform team is also required to record the design solution it has selected for each aspect of the platform in a "Book of Knowledge" on Chrysler's intranet. This catalogue of developed design solutions is then made available to all platform teams to consult during their development processes, so that good design solutions developed by one platform team can also be located and used by the others. Other firms have taken this explicit knowledge management approach to product development even further. For example, GE Fanuc Automation, one of the world's leading industrial automation firms, develops design methodologies that are

applied in the design of new kinds of components for its factory automation systems. In effect, instead of leaving it up to each engineer in the firm to devise a design solution for each new component needed, GE Fanuc's engineers work together to create detailed design methodologies for each type of component the firm uses. These design methodologies are then encoded in software so that the design of new component variations can be automated: the desired performance parameters for each new component variation are entered into the automated design program, and GE Fanuc's computer system automatically generates a design solution for the component. In this way, GE Fanuc tries to make explicit and capture the design knowledge of its engineers and then to systematically re-use that knowledge by automating most new component design tasks. Even a casual review of the many articles and consulting recommendations on knowledge management practice today soon reveals a plethora of recommended processes and techniques. Unfortunately, especially for the many managers looking to researchers and consultants for insights to guide the development of sound knowledge management practices, many of these recommendations seem unconnected to each other, and in the worst cases quite at odds with each other. Close analysis, however, usually reveals that the many ideas for practice being advanced today can be grouped into one of two fundamentally different views of knowledge itself and of the resulting possibilities for managing knowledge in organizations: the two views characterized above as the tacit knowledge approach and the explicit knowledge approach, whose basic premises and practical implications have just been discussed.

Knowledge sharing

Knowledge sharing is an activity through which knowledge is exchanged among people, friends, or members of a family, a community or an organization. Organizations have recognized that knowledge constitutes a valuable intangible asset for creating and sustaining competitive advantage. Knowledge sharing activities are generally supported by knowledge management systems; however, technology constitutes only one of the many factors that affect the sharing of knowledge in organizations, alongside organizational culture, trust, and incentives. The sharing of knowledge constitutes a major challenge in the field of knowledge management because some employees tend to resist sharing their knowledge with the rest of the organization (Dalkir, 2005).

Knowledge sharing is an important focus in the strategic management field; indeed, in many industries the importance of developing the ability to properly use the knowledge contained in the firm's network has become evident. Benchmarking has demonstrated the potentially great benefits of best-practice transfer. Instances of failure in downsizing, on the other hand, have revealed the costs of losing knowledge. Empowerment and globalization have created local knowledge with potential for utilization elsewhere, and information technology has given individuals increasingly differentiated knowledge, unknown to the head office. Moreover, the very basis of some organizational activities is the sharing of knowledge, both between units and with outside partners and clients.

Methodology approach

A methodology based on the company's main expert knowledge is fundamental to support engineering design. In this PhD thesis, the method investigated aims to define a framework based on the elicitation, sharing and formalization of the main industrial knowledge, all finalized to the optimization of engineering design. The resulting method can be implemented in any industrial context to improve productivity by leveraging the company's own abilities; the ultimate goal is the codification of rules and data into a customized expert application that automates design processes and reduces time to market. This research has investigated three methodology levels built on a common base framework:

High automation level;
Intermediate automation level;
Basic methodology level.

Each level corresponds to a degree of automation of the engineering design process: the high level makes extensive use of commercial and customized tools to eliminate all the repetitive and time-consuming phases of a typical engineering design; the intermediate level reduces design time through the intelligent use of commercial tools; and the basic level is essentially an operational outline integrated with entry-level computer-aided tools.

Traditional design method

First, an explanation of the traditional way of designing in an industrial enterprise. The design process is commonly based only on the experience of engineers supported by CAD-CAM systems and trial-and-error procedures. Nowadays, Computer Aided Engineering (CAE) systems can be successfully employed to investigate the performance of a virtual product without the realization of physical prototypes. The integration of virtual prototyping tools in design is very important in shortening the whole production cycle. However, specific knowledge is required for a correct interpretation of the results: the nominal CAD model always differs from the real one, so CAD/CAE system outputs must be matched with experimental tests in order to draw correct conclusions.

The standard technology available to the engineer is often limited to a geometric modelling system (CAD) alone. Such applications can integrate plug-ins supporting a superficial level of analysis of product performance (static resistance, modal analysis, fluid dynamics, thermal analysis, etc.), but with inaccurate results; the engineer has to interpret the results very carefully, so the automation level is very low and essential support comes from personal expertise and the feedback of old projects (tables, results, reports, drawings, etc.). This way of working is particularly widespread in SMEs (Small and Medium Enterprises), in which the designers have only basic application tools to operate and elaborate new solutions. In some cases two-dimensional CAD is still used, to contain the costs of software tools and user training. The next figure shows a schematic representation of the standard design procedure.

Figure 3-1 Classic design process

Figure 3-1 shows a new project beginning with a new order, with specific input data and a defined output. Geometric modelling is supported by the engineer's expertise, combined with old project data such as tables, drawings, reports, feedback, etc.; testing is carried out only through the realization of physical prototypes, so an incorrect result sends the design back to the modelling phase and the engineer has to make changes based on his experience. This cycle is the real weakness of the standard design procedure: many iterations may be needed

before a valid solution is found. All the know-how resides in the engineer, who has to complete many repetitive tasks, mostly without contributing his real added value.

Common framework to optimize the design method

The common base framework is the methodology used to analyse the company's traditional design process and to implement one of the three methodology levels. A deep analysis is required to elicit and formalize knowledge in order to define an optimized design method.

Figure 3-2 Process to analyse and optimize a traditional design

Figure 3-2 represents the process of analysing and optimizing a traditional standard design process in an industrial context. Procedures and flows are studied to understand know-how and know-who; in this phase the contribution of the knowledge holders (company employees) and of the knowledge engineers (external consultants) is needed, together with a classification of the tools in use. All the actors support the final knowledge formalization, which is the basis of an optimized design methodology implementable at one of the three automation levels: high, intermediate and basic. The automation level is a combination of management tasks, available tools and the knowledge holders' expertise.

Figure 3-3 Task load between computer and user at each automated design level

Figure 3-3 qualitatively describes the difference between a highly automated and a lowly automated design. In basic automation procedures, most tasks belong to the user (e.g. engineer, expert, manager), who is only slightly supported by specific computer-aided applications; at the intermediate automation level, the designer is strongly supported by commercial applications (parametric CAD/CAE, commercial CAD plug-ins, spreadsheets, databases, etc.); finally, the high automation level consists of a design process integrating classic computer-aided software with customized applications aimed at design automation and virtual prototyping.

Activities to formalize knowledge

Knowledge formalization is a complex activity, which can be represented as in Figure 3-4. The diagram shows in detail the way knowledge is captured and rules are formulated. The Actors block identifies the industrial knowledge holders (who have to make their knowledge explicit) and the specialized knowledge engineers (who have to acquire and elaborate the know-how); the Targets block symbolizes the main expected goals.

Figure 3-4 Activities to optimize engineering design

In detail, the actors are the people involved in a project to optimize the engineering design of an industrial enterprise. This team is clearly divided into two groups: knowledge holders and knowledge engineers. The first group, the knowledge holders, can be identified with company personnel such as engineers, managers, industrial experts, sales staff and factory workers. This group is appointed by the company management, which retains overall responsibility; each member has to collaborate with the knowledge engineers to make the principal topics for improvement explicit. The knowledge engineers are external engineers expert in knowledge management processes; although usually referred to simply as engineers, they must also be experts in information technology and communication. Their background must be broad, ranging from mechanics to electronics and from computer science to mathematics. The main abilities required of this group are:

the ability to interview;
the ability to listen and to recognize key words;
the ability to understand the know-who in order to optimize the capture of know-how;
a good general culture and preparation on many topics;
the ability to expand their own knowledge quickly;
the ability to synthesize;
the ability to manage knowledge and to elaborate it.

Knowledge holders and engineers meet each other in technical briefings to establish the targets together. First, the company personnel determine the topics to address, specifying the required input and output data; almost simultaneously, the knowledge engineers review their skills on the specific topics to prepare the kick-off briefing. Many meetings are needed to define the problem targets, but each of them is useful for the final knowledge acquisition, since the knowledge engineers can observe the know-who and thus where to find the know-how. The Targets block is closely related to the knowledge elicitation phase and the result check, but it is also linked to the actors through briefings and meeting discussions. Know-how and know-who represent the knowledge management phase, with knowledge elicitation, sharing and formalization. Elicitation is the way to capture all the relevant knowledge domains, but how can engineering knowledge be captured? The answer is relatively simple in the explicit knowledge domain: it is possible to consult handbooks, search online, study new topics and review old knowledge. It is far more onerous, however, to elicit the main implicit knowledge, which depends on personal expertise (as described above, hidden knowledge is very difficult to externalize and even more difficult to transpose and implement). During knowledge acquisition, this research used eight different knowledge elicitation methods: interview, case study, simulation, prototyping, teachback, observation, 20 questions and document analysis. Interview, 20 questions, teachback and observation are the most direct methods for asking for information and explanations about different topics; case study and document analysis are indirect ways to acquire data from the company history; simulation and prototyping are very similar and aim at understanding the rules governing design and product performance. The acquired data are then shared among all the actors so that they can be completed and expanded. The knowledge sharing phase is fundamental to validate the elicitation one; it requires technical briefings and meetings for consulting the internal company personnel and for discussion among all the actors. A negative response may restart part of the preceding knowledge elicitation phase. The knowledge formalization step then belongs to the knowledge engineers, who elaborate customized rules to guide the engineering design phase towards the industrial aims; this step is closely intertwined with elicitation, since the design methodology has to be based on the company know-how. Finally, the results of knowledge formalization are rules, tables, graphs, etc., which have to pass the knowledge owners' check before being validated.

Basic methodology level

This is the first level of automated design, the one with most tasks still assigned to the designer. It is typical of classical SMEs, where the design engineer has to cover every field. While today a

classical design process is based only on computer-aided drafting systems, the proposed basic methodology extends this view to parametric feature-based modelling, which allows users to create libraries of feature templates, each defined by its shape and main geometric parameters. Using parametric feature-based models, the designer can eliminate many repetitive operations and use the time saved to optimize product quality and process efficiency, increasing the company's competitiveness. The application tools used to set up the automated design have to be very easy to use and configure, because a short start-up time is essential; otherwise, more complicated aided applications would steal valuable design time from the users.

Figure 3-5 Automated design at basic level

Figure 3-5 shows the addition of user-defined tools to a standard design process; this variation provides important support for geometric modelling. The benefits of this methodology are: reduced time to market, immediate advantages, affordable software costs, and designers needing only basic computer skills. The main disadvantages are: a design process not completely supported by automated procedures, and the lack of a virtual prototyping approach to reduce the cost of physical prototypes.
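To make the idea of a feature template concrete, the following C++ fragment is a purely illustrative sketch: the feature type, proportions and validity rule are invented and do not come from the thesis test cases. It shows dependent dimensions driven by user-defined parameters, with the template itself enforcing a sizing rule:

#include <iostream>
#include <stdexcept>

// Illustrative parametric feature template: a counterbored hole whose
// dependent dimensions derive from two driving parameters.
// The proportions below are invented sizing rules, not real design data.
struct CounterboredHole {
    double bolt_diameter;    // driving parameter [mm]
    double plate_thickness;  // driving parameter [mm]

    double hole_diameter() const { return 1.1 * bolt_diameter; }
    double counterbore_diameter() const { return 1.8 * bolt_diameter; }
    double counterbore_depth() const {
        double depth = 0.6 * bolt_diameter;
        // A validity check of the kind a template should enforce
        // instead of leaving it to the user.
        if (depth > 0.5 * plate_thickness)
            throw std::invalid_argument("plate too thin for counterbore");
        return depth;
    }
};

int main() {
    CounterboredHole hole{8.0, 20.0};  // M8 bolt, 20 mm plate
    std::cout << "hole d = " << hole.hole_diameter() << " mm, "
              << "c'bore d = " << hole.counterbore_diameter() << " mm, "
              << "c'bore depth = " << hole.counterbore_depth() << " mm\n";
    return 0;
}

In a commercial parametric CAD system the same relations would normally be entered as expressions on the model parameters or in a linked spreadsheet rather than as compiled code; the point is that the template, not the user, carries the sizing rules.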

Intermediate automation level

The intermediate level increases the reliability of the initial prototype: the performance of the virtual geometric model can be evaluated with commercial applications used to analyse generic aspects such as acoustics, thermal behaviour, fluid dynamics, statics, vibration, injection moulding, machining, etc. This level is an expansion of the basic one, with the support of affordable virtual prototyping to reduce the time and material costs of physical prototypes and their testing.

Figure 3-6 Automated design at intermediate level

Figure 3-6 shows the introduction of a Virtual Prototyping control after the Simulation Tools analysis; these steps are kept separate because the simulations concern many physics-based systems, so all the results have to be evaluated in a decisional control for model acceptance. The boundary condition data are embedded in the input data and order specifications, and this information must be written into the product model properties and parameters. The greatest benefit is the reduction of physical testing on prototypes; moreover, an early qualitative estimate of product and process performance is very important in mechanical design, so the final designs are more reliable. The disadvantages are that the additional commercial tools require users with stronger computer skills and a better preparation on physical-analysis topics, otherwise these technologies would slow down the design; and, even if material costs are effectively reduced, there may be an increase in the number of virtual models and in long virtual simulations. Virtual prototyping is also relatively expensive for an SME; for this reason, in this methodology it is kept separate from the basic user-defined tools introduced at the previous level.
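As a hedged illustration of the decisional control for model acceptance described above (the specification fields, tolerance and numerical values are invented for the example), a minimal check of simulated performance against the order specification might look like this:

#include <cmath>
#include <iostream>

// Invented example: decide whether a simulated virtual prototype meets
// the order specification, or whether the model must go back to the
// geometric-modelling phase.
struct Specification {
    double target_flow_lph;  // required discharge [L/h]
    double tolerance_lph;    // admissible deviation [L/h]
    double max_stress_mpa;   // admissible stress [MPa]
};

struct SimulationResult {
    double flow_lph;    // e.g. from a CFD analysis
    double stress_mpa;  // e.g. from a structural analysis
};

bool accept(const Specification& spec, const SimulationResult& sim) {
    bool flow_ok = std::fabs(sim.flow_lph - spec.target_flow_lph)
                   <= spec.tolerance_lph;
    bool stress_ok = sim.stress_mpa <= spec.max_stress_mpa;
    return flow_ok && stress_ok;
}

int main() {
    Specification spec{2.0, 0.1, 25.0};
    SimulationResult sim{2.06, 18.4};
    std::cout << (accept(spec, sim) ? "model accepted"
                                    : "back to modelling") << "\n";
    return 0;
}

A negative outcome corresponds to the arrow back to the modelling phase in Figure 3-6.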

High automation level

The highest level introduces a knowledge-based approach to support design. It includes linked tools, some user-defined and others customized or commercial, which implement the formalized knowledge rules needed for a correct design; this phase may also enclose virtual prototyping and testing.

Figure 3-7 Automated design at high level

Figure 3-7 highlights the automation control container, which is driven only by external design supervision: the engineer's expertise is translated into knowledge-based tools, so this methodology requires only the supervision of the design procedure. A correct model is elaborated directly at this step; eventual prototype costs are not completely eliminated, but a correct knowledge-based approach increases design efficiency and quality, because all the repetitive and time-consuming procedures are assigned to the computer. Often these tools are customized and implemented for a specific industrial context. The most important benefit is the possibility of supporting the design process without a highly expert user: the standard designer does not need advanced computer skills or outstanding

mechanical experience on specific topics, because all the know-how is in the formalized knowledge base. The possibility of using personally tailored tools is also important. What are the disadvantages? The most important is that such a system has to implement correctly all the design rules and the product and process know-how. Customized tools are often expensive, and their cost is proportional to their level of process automation, even though generic commercial CAE systems can be even more expensive. In conclusion, this third automation level is more common in medium-large enterprises, where it is easier to carry out a substantial knowledge elicitation effort: a large company is more structured than a small one, so knowledge sharing can be established as the starting point of the knowledge formalization phase.

Evaluating business skills

From the enterprise point of view, business skills are the knowledge and abilities rooted in the company: its whole history, old projects, tables, reports, designers' experience and knowledge, employees' abilities, etc. Skills are examined here from the engineering and computing points of view; although other capabilities could be investigated, these two were the most important in this research work.

Figure 3-8 Business skills trend

Figure 3-8 gives a graphical representation of the fundamental business skills at the different design automation levels. In summary, the overall engineering skills always increase with the introduction of automation tools and methods: automating a process requires a deep definition of the know-how and a high skill level to formalize and implement knowledge, so automated design is built on good-quality traditional design. The curve for engineering skills is blue; its gradient grows rapidly from the basic level to the intermediate one, because this passage requires the change from a design based on experience to one based on performance optimization and cost reduction; the gradient increases more slowly between the intermediate and high automation levels because, at that point, the best of the engineering skills has already been exploited. The computer skills of a standard enterprise are represented by two curves: the solid line describes the typical IT skills of an industrial company, while the dotted line shows the skills that may be acquired. The two curves increase together from the basic level to the intermediate one, much like the engineering skills discussed above; from the intermediate level onwards they fork, because a company may or may not directly incorporate the IT knowledge needed to automate design: this depends on the managers' decision to buy or to implement in-house the tailored application tools supporting engineering design. The computer skills required for building an automated design process are very high, so the task is often delegated to an external consulting service, with a substantial reduction in development responsibility but at a corresponding price.

Evaluating designer skills

This paragraph explains how the skills required of a standard designer vary across the three solution levels described above. These skills are evaluated as the abilities sufficient to lead a design process using specific automation tools. While the previous paragraph described the complete business skills, here we consider those related to the standard designer, who could be, for example, a junior engineer.

Figure 3-9 Engineering and computer skills required of a user

The graph in Figure 3-9 shows the engineering and computer skills of a generic design user at each of the levels discussed above, as a function of the corresponding automation level. A specific comment on each level follows. Basic level: design at this level is largely based on experience, so good engineering skills are required, though not the highest, because many tasks are still repetitive; computer skills are basic, because the application tools used are simple and technical; as the automation level increases, the two skill sets grow hand in hand. Intermediate level: this is the level of elaborate calculations and simulations; automation here not only reduces repetitive operations, but also supports performance simulation to evaluate product and process behaviour; this level demands very high engineering and computer skills, because professional tools are used that are very specific in terms of set-up, boundary conditions and mathematical model definition, as well as in the interpretation of results (physical analyses, optimizations, comparisons) and their elaboration (data management, reports, graphics, etc.). High level: this is the zone of highly automated tools, which help the designer complete tasks in little time and with less attention than the previous level; the limiting case is automatic design at a mouse click; this level can be reached through a deep analysis and formalization of knowledge into customized tools, so that all the knowledge is archived in applications and databases; the single user only has to be able

to use the software, and much less to carry out engineering optimization, since optimization is the responsibility of the knowledge tool administrators. Which is the best approach for an optimized engineering design? The answer depends on the enterprise size and human resources. The next paragraphs discuss the risks of the three implementation levels; in principle, however, the intermediate methodology is the one that most reliably leads to very good results, because it improves the company know-how.

Knowledge base

A knowledge base is a container of structured data that stores and retrieves the required expertise. This structure can be divided into several knowledge domains; often there are two knowledge entities: a Data Table and a Rule Table. The data table holds the domain ontology of the expert model and the acquired knowledge; the rule table maintains the entity relations. Each automation level relies, more or less evidently, on a knowledge base that supports the engineering design. At the basic and intermediate automation levels the knowledge base is very simple, being represented only by reports, parametric models, feature templates and tables. The high level includes a complete knowledge base interfacing with advanced tools, so that tables, databases, rules, graphs and model templates are managed by tailored applications based on the enterprise know-how. The knowledge base is built after the elicitation and formalization phases; its complete definition depends on good knowledge sharing between the knowledge holders and those who have to receive and implement all the information, data and rules.
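To make the Data Table / Rule Table distinction concrete, here is a deliberately minimal C++ sketch; the record fields and the single rule (an invented constraint between channel dimensions) are illustrative only and are not taken from the thesis test cases:

#include <iostream>
#include <string>
#include <vector>

// Minimal knowledge-base sketch: a Data Table of domain facts and a
// Rule Table of entity relations. Fields and rule are invented.
struct DataRecord {                 // one row of the Data Table
    std::string entity;             // e.g. "channel_depth"
    double value;                   // stored knowledge value
    std::string unit;
};

struct Rule {                       // one row of the Rule Table
    std::string name;
    bool (*check)(const std::vector<DataRecord>&);  // entity relation
};

static double lookup(const std::vector<DataRecord>& d, const std::string& e) {
    for (const auto& r : d)
        if (r.entity == e) return r.value;
    return 0.0;
}

int main() {
    std::vector<DataRecord> data{{"channel_depth", 0.9, "mm"},
                                 {"channel_width", 1.1, "mm"}};
    Rule aspect{"depth <= width", [](const std::vector<DataRecord>& d) {
                    return lookup(d, "channel_depth")
                           <= lookup(d, "channel_width");
                }};
    std::cout << aspect.name << ": "
              << (aspect.check(data) ? "satisfied" : "violated") << "\n";
    return 0;
}

In a real implementation the two tables would live in a database and the rules in a dedicated rule engine or scripting layer, but the split between stored facts and the relations that constrain them is the same.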

Developing methods

This section explains the implementation of each level of the discussed methodology. While the previous sections described automated design from the point of view of the designer (engineer, expert, etc.), here we take the point of view of whoever implements the methodology with specialized tools. Different IT tools can be used to implement customised applications. The first consideration when selecting the most appropriate development tool is the balance needed between the amount of control that will be incorporated into the system and the development effort (Mandorli, et al., 2001). Remarks about the effort and skill required to develop a tailored application are particularly significant if we consider the loss of information that may occur when the domain expert has to interact with the IT expert who will develop the application. This drawback can be avoided if the domain expert can develop, test and finally use the application himself. Design automation requires a degree of software implementation that depends on the automation level: as the weight of automation increases, the implementation skills required increase, and an expert developer is needed to build the design system. Skill in IT techniques is not a usual asset in a mechanical department, and sometimes this is also true of an electronics department. Therefore, the question of who does the development is an important one to evaluate before undertaking an innovation process, and a risk estimation in terms of costs and benefits is essential. The main development problem is to determine how the design process update will be implemented and to whom the task will be assigned; the options are two: assign every task to the designers or to internal employees (in-house), or commission everything to expert external consultants. The following sections also provide a table supporting decision-making about development planning.

Developing the basic automation level

At this level the IT skill requirements are very low; only a little ability is required to exploit the standard functionalities of a CAD system. CAD feature libraries: standard feature-based CAD systems allow users to create customized libraries of feature templates, defining a shape and its main geometric parameters; the control of the correct use of the feature template during the definition of a new model is left to the user; this operation is often supported by software wizards, so the required IT skill level is very basic. Parametric modelling: an expert user can model an object with parametric dimensions driven by user-defined variables, so engineering formulas and laws can be implemented in a template model; this activity is the basis of design configuration. Excel equations: most commercially available CAD systems provide a connection link between Microsoft Excel documents and the geometric CAD kernel; at the basic level, therefore, a fairly good knowledge of Microsoft Excel or a similar application (e.g. OpenOffice) is required; this task requires a basic

IT skill common to many design users (engineers or technical experts). Formulas and rules can thus be implemented with greater precision and functionality. At this level it is better that the IT tasks are assigned to each user-designer, or at most to a specialized company employee such as an internal IT resource; the people carrying out the basic development should belong to the enterprise, to reduce external dependencies and additional costs.

Developing the intermediate automation level

The intermediate level concerns the computer skills needed to implement macros and VBScript, and to use commercial FEM systems and technical computing languages. On CAD systems, parametric modelling and the feature-based approach can also be supported by macros, i.e. rules or patterns specifying how a certain input sequence should be mapped to an output sequence by a defined procedure, or by simple VBScript, a lightweight programming language with a fast interpreter for use in a wide variety of Microsoft environments. Skill in using commercial software is required to elaborate automatic calculations and to analyse and represent data. The first type of skill, in particular, is very difficult to find in a standard mechanical designer, while the second is more common in an engineer, although it demands more IT aptitude and engineering skill from the single designer. This level is therefore seldom completely implemented by internal company employees; in many cases an external consultancy is necessary.

Developing the high automation level

At the top level there is the implementation of tailored applications to support engineering design. Particular computer skills are required to develop knowledge-based tools with object-oriented programming languages such as VisualBasic.NET or C++, so it is advisable to assign this task to a professional partner expert in these topics if the company has no equivalent figure in-house. Developing knowledge applications requires direct experience: know-how about Windows (or, alternatively, Linux) applications is the basis, and particular knowledge and experience in using SDK (Software Development Kit) libraries is fundamental. Expert developers of engineering-aided tools usually maintain customized programming libraries to interface tailored applications with commercial software such as CAD/CAE systems, databases, spreadsheets, etc.
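As an indicative sketch only — the CadSession class below is a hypothetical stand-in for the vendor-specific API/SDK calls mentioned above, and the sizing rule is invented — a tailored application typically drives the CAD kernel along these lines:

#include <iostream>
#include <string>

// Hypothetical CAD-automation interface: the class name and methods are
// invented placeholders for the calls a real CAD vendor's API exposes.
class CadSession {
public:
    void open_template(const std::string& path) {
        std::cout << "open template " << path << "\n";
    }
    void set_parameter(const std::string& name, double value) {
        std::cout << "set " << name << " = " << value << "\n";
    }
    void rebuild() { std::cout << "rebuild model\n"; }
    void export_step(const std::string& path) {
        std::cout << "export " << path << "\n";
    }
};

// Knowledge-based sizing step: formalized design rules (here a single
// invented proportion) compute the parameters; the CAD kernel only
// regenerates the parametric template.
int main() {
    CadSession cad;
    const double flow_lph = 2.0;                         // requested discharge
    const double channel_depth = 0.5 + 0.1 * flow_lph;   // invented rule [mm]

    cad.open_template("dripper_template.prt");
    cad.set_parameter("channel_depth", channel_depth);
    cad.rebuild();
    cad.export_step("dripper_2lph.step");
    return 0;
}

The essential point is the division of labour: the formalized design rules compute the parameter values, while the CAD kernel only regenerates the parametric template and exports the result.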

Specific expertise is required of the IT specialist who develops knowledge-based applications connected to a CAD kernel. The important basic topics are: Boundary Representation (B-rep) theory, which defines the method used by modern CAD systems to represent geometric shapes through their limits (a solid is represented as a collection of connected surface elements, the boundary between solid and non-solid); how to connect the CAD system to a custom application and how to access the CAD geometric kernel through dedicated APIs (Application Programming Interfaces) and SDKs; the geometric and mathematical knowledge needed for the most important operations; basic engineering knowledge; and, finally, good expertise in an object-oriented programming language. This minimum skill set is very difficult to find in a classical design engineer, but also in a standard programmer: the developer of applications supporting highly automated design must have particular skills that cannot be found everywhere, so there is a balance to be struck between relying on IT consultants and acquiring IT competences internally. In conclusion, the scope of the resulting applications concerns design automation, virtual prototyping and product configuration; supporting the high automation design level therefore requires a double knowledge: one about the engineering of the product and process (the ability of the design team), and another about software development (software developers and expert knowledge engineers). At this level, the company manager has to decide who develops the tailored application tools, which resources to use and how to plan the project.

Who does the designing and the developing?

This research work requires a comparison between the designer's tasks and the developer's. The previous sections explained the design work and its duty to carry knowledge forward, while the preceding paragraphs described the IT needs of implementing customized application tools. The designer is more competent in engineering tasks and the software developer in purely computing topics. Three very important questions remain open: How should the tasks be divided? Is an IT specialist needed? If so, is it convenient to call in an IT consultant? The next figure gives a symbolic representation of the engineering design and software development activities.

Figure 3-10 Development and design activities

Figure 3-10 is organized like a table with two columns and three rows (one for each automation level: basic, intermediate and high): the first column reports the main development tasks, the second a summary description of the design. While the design actions belong exclusively to the engineer, the development ones are poised between two figures (programmer and designer); it is therefore possible to weigh the choice of assigning the computing tasks to the designer, who is also the final user of the new work methodology.

Figure 3-11 Recommended choice for assigning development tasks

The basic level belongs to the designer. He has the competence and knowledge to make the most of commercial tools through their ready-made customizable functionalities, so the presence of a computer programmer is unnecessary; what is required instead is a highly specific engineering skill in using the available software technologies. At the intermediate level, the designer is still the best choice to develop the improvements, but the weight of the programmer becomes more balanced, because the required computer skills are not trivial. Lightweight applications such as macros and VBScript may be within the reach of a modern designer, so for an SME it is very important and convenient to have such a skilled engineer, whereas for a big enterprise the use of an internal IT resource may be equally viable. In both cases, the computer expert and the designer need knowledge of the geometric representation used by the three-dimensional kernels of the main commercial CAD applications. The final level is the opposite of the first: the computer competences required are very high, and a typical designer usually does not have such specific IT abilities and skills.

Figure 3-12 Qualitative need for an external IT consultancy to automate the design process

The graph in Figure 3-12 qualitatively describes how to decide on an external IT consultancy. Red marks symbolize conditions in which calling an IT consultant is inadvisable, yellow ones indicate uncertain situations, and green ones indicate that assigning the IT tasks to external consultants is recommended. The graph is organized like a table and essentially constitutes a diagonal matrix. Each case should be evaluated in depth, so the previous graphs are only general guidelines for deciding how to manage the development. The next section illustrates the strategic planning of the design process rebuilding.

Risk evaluation

Risk concerns the deviation of one or more results of one or more future events from their expected value. Technically, the value of those results may be positive or negative; however, general usage tends to focus only on the potential harm that may arise from a future event, which may accrue either from incurring a cost ("downside risk") or from failing to attain some benefit ("upside risk"). Risk is often considered a state of uncertainty in which some of the possibilities involve a loss or another undesirable outcome, but in general successes achieved under risky conditions are the most profitable. In design change and optimization, the negative risk concerns the eventuality of not achieving improvements proportional to the time and money invested: small efforts are never very risky, but large investments require high benefits to justify the project. It is fundamental to analyse the company risk before beginning a design transformation towards one of the three automation levels described. This research hypothesizes a particular behaviour of the risk-benefit ratio (Figure 3-13) in order to give a qualitative assessment of multiple scenarios.

Figure 3-13 A colour-scale graph rating risks/benefits

The diamond chart highlights a hypothetical linear increase of the risk/benefit ratio from the blue zone to the red one. There are two very important independent parameters: the enterprise size, which can be small, medium or large and is directly related to the available resources; and the engineering skills, which measure the enterprise abilities and knowledge required for the design rebuilding and therefore guide the choice of automation level. In the background there is the assumption that only big enterprises can count on nearly unlimited resources, since this property depends on company size, while the evaluated skills are linked only to engineering. A colour scale indicates qualitatively the value of the risk-benefit ratio: red points indicate high risks leading to small benefits,

yellow and light-green points symbolize a balance between risks and benefits, and blue is the colour of very low risks with high related benefits. The combination of basic-level automation with very small enterprises is the least risky condition, so the related benefits are very significant; this is the case of small technical departments with few designers (1-3 people) who can automate their design methods on their own, at low cost and under their direct control. At this level many benefits are expected from even a little design automation, especially in terms of time saving. Conversely, small enterprises would face many risks and few benefits in developing a highly automated design process, for two reasons: these business entities have very limited resources, so a large increase in development costs is very significant; and the marginal cost of the improvement is very high compared with the slow return of a small production volume. The opposite behaviour is found in large enterprises, which can count on strong resources and a structured organization: a high level of design automation is fully supported by project and resource management, and the large production volume and high number of designs reduce the cost impact of the newly developed applications and methodology. This combination is most profitable in the highly automated design area, because the company backbone can absorb the costs, spreading them over the production volume, and can organize training on the applications and methodologies. Even if at the basic automation level the absolute value of risk is still low, the benefit contribution is also very small, so the risk-benefit ratio is evaluated as unsuitable. Configurations with an acceptable risk-benefit value also include medium enterprises, which are best balanced when implementing the intermediate automation level, with limited risks and good benefits; this combination can build on the company engineering skills, because such an organization is structured but still compact in decision-making and design.

Optimal automation level

Adopting and maintaining a strategy is very important when implementing tools and methods based on knowledge elicitation to support a specific engineering design. The main steps are:

analyse the current design process and highlight the main critical points;
identify the available resources (human, hardware and software);
fix the most suitable development level for the automated design;
identify the knowledge holders, possibly the knowledge engineers and, where needed, the IT consultants;

- plan the knowledge elicitation, sharing and management;
- manage the implementation of the chosen method;
- validate the results.

When automating a design process there are two very important phases: the initial risk evaluation and the development planning. The risk evaluation tells how hazardous the adoption of a specific new design method is; the project planning is important not only for scheduling but also for determining actors and tasks. A deep innovation in design requires project planning to provide resources (human, hardware and software) and time. This section explains how to decide the innovation level of automated design by evaluating the enterprise size, the skill load and the cost-benefit relation derived from the previous risk analysis. In accordance with the risk evaluation and the development scenarios described, a simple table (Figure 3-14) has been compiled to support the decision about the transition from a standard design practice to an automated design: the columns report the enterprise sizes and the rows the attainable automation levels of the design process.

Figure 3-14 Graph supporting the choice of the design automation level to implement

For each combination there is a score which forecasts how successful the result will be for each type of enterprise; there is also a qualitative estimation of benefits and costs, and the weight percentage between engineering and IT load, to support decisions on human resources. The green colour, lying on the main diagonal, marks the best choices: small enterprises gain a high advantage from a design based on the basic automation level, because medium benefits come with very low additional costs and the required skills are purely engineering ones. As the automation level increases, their benefits decrease while costs grow together with the IT requirements. Medium enterprises see little difference among the choices; their best condition is the intermediate level, because moderate costs yield high estimated results, while the other choices are balanced in terms of benefits versus costs. Large enterprises are often the opposite of small ones: their advantageous condition lies in the highly automated design zone, because the high costs bring very high revenues, whereas at the basic level high costs add little additional benefit. No universal yardstick exists to support the choice of the optimal automated design level to implement in a company through methods and tools based on knowledge elicitation; this section has described a guide table to be used before planning a company strategy. However, in Figure 3-14 the complete set of green and yellow points effectively represents the best average choices for each company size.
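The verdicts of this guide table can be captured in a small lookup structure. The following Python sketch transcribes the qualitative ratings discussed above; the wording of the verdicts is illustrative, not taken verbatim from Figure 3-14:

```python
# Qualitative risk-benefit verdicts by (enterprise size, automation level),
# transcribed from the discussion of Figures 3-13 and 3-14.
SUITABILITY = {
    ("small",  "basic"):        "best: low cost, purely engineering skills",
    ("small",  "intermediate"): "acceptable",
    ("small",  "high"):         "unsuitable: high risk, little benefit",
    ("medium", "basic"):        "acceptable",
    ("medium", "intermediate"): "best: moderate cost, high estimated results",
    ("medium", "high"):         "acceptable",
    ("large",  "basic"):        "unsuitable: costs without benefit",
    ("large",  "intermediate"): "acceptable",
    ("large",  "high"):         "best: costs absorbed by production volume",
}

def recommend(size: str) -> str:
    """Return the automation level rated 'best' for the given enterprise size."""
    return next(level for (s, level), verdict in SUITABILITY.items()
                if s == size and verdict.startswith("best"))

print(recommend("medium"))  # -> "intermediate"
```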

Small enterprises: test case of FGR Srl

Some research activities have been conducted with specific partner enterprises. One of them is FGR Srl of Camerano (AN), a small Italian mechanical company which works on mould construction for injection moulding and is specialized in the design and production of drip emitters (a plastic device used in modern drip irrigation). According to the theory explained in the previous sections, this type of company should implement a basic automated design; in fact, a design methodology of that kind is already in place: the small technical department consists of two designers, who have already implemented a personal design methodology to eliminate repetitive procedures and optimize operations. The research activity for FGR has therefore been focused on the introduction of virtual prototyping to reduce material cost and time-to-market in drip emitter design. This section presents the product, the old design process and the addition of virtual prototyping based on both commercial CAE software and a tailored knowledge base. The aim is to build a knowledge base made of the gathered data and of design rules, to support the definition of a new product as new specifications come in.

Product description: Drip Emitters

The drip emitter (or dripper) is an important device in water-saving agriculture and characterizes the whole development of modern agriculture. The use of drip emitters is fundamental in arid regions or wherever rainfall is decreasing. The task of this component is to dissipate pressure and to deliver water at a constant rate by lowering the pressure energy. Shapes vary, as shown in Figure 4-1. Dimensions are usually very small, and the water flows through micro-orifices, such as labyrinth channels, which make the pressure drop. The discharge rate is usually 1 to 8 L/h and is linked to the small width and depth of the flow path, which is about 0.5 to 1.5 mm high.

Figure 4-1 Various design solutions for drip emitters

Drippers are equally spaced inside irrigation lines, which are laid on the ground or just a few centimetres below the surface. During pipe extrusion the drippers are welded to the pipe's inner surface. The pipe diameter is around 16 mm and its thickness varies between 0.12 and 1.5 mm. In agriculture many pipelines are used and the intake pressure is variable: in horizontal fields the nominal pressure is 1 bar, while in sloping fields the pressure can reach even 4 bar in the lower areas (Figure 4-2).

Figure 4-2 Drip emitters in irrigation lines

There are two big families of drip emitters: the flat dripper and the round type. Each of them can be divided into two subfamilies: unregulated and regulated drippers. The flow rate of an unregulated dripper varies with the inlet water pressure. On the contrary, a regulated emitter maintains a relatively constant flow rate under varying water pressure, within the limits specified by the manufacturer; the latter show good performance in sloped fields, where the intake pressure is inevitably variable. The most important properties of drip tubing irrigation systems are uniformity, anti-clogging capacity and life-span of all components. A well designed dripper should maximise these aspects and ensure a good hydraulic performance. Uniformity is the property of each dripper of a piping line to provide almost the same discharge rate, within a range of ±10%. Anti-clogging capacity is the ability of an emitter to reduce the precipitation of suspended particles; in fact, these devices can easily clog. Efficient turbulence can create reverse whirlpools in low velocity zones, and this effect prevents the sedimentation of suspended particles. Another method to reduce clogging is the introduction of a filter at the water inlet section. This filter is often made of a grid which blocks particles larger than one third of the smallest labyrinth cross section.

Dripper life is linked to the plastic material used to produce the device. Many producers employ only thermoplastic materials; most drippers are made of high density polyethylene, because this choice is a good compromise between physical and moulding properties.

Introduction to drip emitter design

The drip emitter design process is commonly based only on the experience of engineers, supported by CAD-CAM systems and trial-and-error procedures. Nowadays Computer Aided Engineering (CAE) systems can be successfully employed to investigate the performance of emitters without realizing any physical prototype.

Figure 4-3 Standard drip design

The previous figure (Figure 4-3) represents the standard dripper design process, based on a knowledge based approach but without virtual prototyping: many physical tests are therefore required to validate the design, and the initial drafting is often changed many times (even 3-4). The methodology work is divided in two parts: first, the investigation of a methodology to optimize the use of commercial CAE systems to simulate product performance; second, the study of a specific knowledge base to reduce simulation time. In particular, CAE systems include Computational Fluid Dynamics (CFD) software, which is useful to calculate the hydraulic performance of the emitter, such as the output flow rate and the pressure drop in the labyrinth. On the other hand, the production can be analysed with

the help of moulding simulation systems, in order to investigate product integrity, mould cycle duration and efficiency. The integration of virtual prototyping tools in the design flow is very important in shortening the whole production cycle. However, some specific knowledge is required for a correct interpretation of the results. CFD outcomes depend strongly on the geometry, which is itself uncertain: the nominal CAD model differs from the effective dimensions of a real dripper assembled into a pipeline. The extrusion process, used to form the pipe and stick the dripper, creates a permanent junction between the parts: the dentate path penetrates into the internal face of the pipe and the actual depth of the channel is reduced. The effective depth is not easily predictable, because it depends on the type of materials, the geometry, the external pipe thickness, the extrusion temperature, the extrusion speed, etc. Therefore CAD/CAE outputs must be matched with experimental tests in order to draw correct conclusions.

Proposed approach

The aim of this activity is the development of a framework for the implementation of knowledge based applications to support the design of products requiring complex virtual and experimental analysis. The steps leading to a valid knowledge base to be embodied in a support tool can be summarised as follows: an investigation phase based on dialogue with customers and suppliers, research about the product, the application of virtual prototyping tools, the study of production and assembly processes and materials, and finally the study of the particular experimental set-ups. The principle of the approach is recognised in the DOE (Design of Experiments) method: characteristic input and output parameters are defined for the specific problem, and the test cases in the knowledge base play the role of the experiments (McCreary, 2007). After the data has been gathered, it is stored in a system following the steps listed here:

- Target identification: the definition of the principal objectives of the study;
- Input parameter identification: the analysis of all the variables on which the problem depends. These parameters can be divided into geometrical, physical, process and operating

parameters. They respectively represent physical constraints, material properties, production process parameters and parameters linked to the operating conditions;
- Output parameter identification: these parameters are affected by changes in the input ones, so they are part of the specifications and must be experimentally verified. They can be divided into functional and quality parameters;

Figure 4-4 Diagram showing the proposed approach

- Selection of design parameters: only the parameters which most influence the study are analysed. Reducing the parameters saves time and money in the next steps. This selection must be justified to guarantee the consistency of the analysis; as a result of the verification step, these parameters may be changed;
- Experiment planning: this step is important to predict the duration and cost of the analysis. It is recommended to reduce the experiments to the essential ones. This phase can be integrated or substituted by virtual prototyping technologies;
- Experiment execution and data collection: this phase is based on experimental set-ups and measurement operations. All data must be organised in a structured database;

- Data analysis: the central step, in which the virtual and experimental results are used to find rules and conditions; this phase is therefore linked to knowledge capture. Its output is a first-attempt theory;
- Verification: theoretical assumptions and their correctness are verified here. This step can lead back to the selection of design parameters;
- Knowledge formalization: the final step, in which the product knowledge is formalised in terms of parameter correlations, ready to be reused in similar problems.

Once the data is acquired and the knowledge formulated, a tool to support the design process can be implemented. The core of this support tool is a structured multidisciplinary database collecting all the design aspects of the analysed test cases, together with the rules linking input and output parameters. A specific design solution is extracted by recognising the product category and similar test cases; parameter correlations and rules are then used to predict the product behaviour.

Drip emitters: design process

The research is focused on dripper design and production. Companies define the exact shape of drippers on the basis of specifications and then design and produce the injection moulds for their realization. Usually they sell the product, sometimes only the moulds; above all, they provide a specific dripper design service. Dripper production follows the mass customisation paradigm, today highly diffused in the global market. The dripper is not a standard product, and the customers are pipe producers: these firms buy drippers which are inserted in the pipeline during the extrusion process. Every customer requires different specifications, based on the specific irrigation application and on the technologies used to manufacture the pipeline. A new order specifies some overall dimension requirements, a specific flow rate, a certain intake pressure, specific environmental working conditions and other functional requirements. All these variables lead to the need for a new design, which is often similar to a previous one. However, this does not mean that the design process can be fully reused: small changes require the repetition of all the design, manufacturing and testing steps, as pointed out below.

Currently the time for designing and realising the final prototype of a drip emitter is quite long (almost 3 months) and includes four steps: the design of the emitter, the design of the injection moulding process, the assembly process between dripper and external pipe, and the experimental set-up of the emitter pipelines (Figure 4-5).

Figure 4-5 Diagram showing dripper design phases and iterations between companies

An initially wrong design can greatly increase the cost of the whole realization process. For instance, negative results from the experimental set-up require a product revision and the repetition of all the design and manufacturing steps. Moreover, drippers are usually designed and produced by one firm while they are assembled by the pipe producers; that means the overall iteration time is long and the whole design process can span months. The first design step is the most complex, because engineers must consider at least four fundamental aspects: fluid dynamics, geometrical and dimensional constraints, the influence of geometry on the moulding process, and the choice of materials. A new project begins with the analysis of the geometrical constraints. The overall dimensions depend on the different extruding machines which insert drippers into the pipeline; each machine uses a particular track to convey the emitters, so the product geometrical limits cannot be standardized. Secondly, the designer must fulfil the customer's fluid dynamics specifications. In particular, every dripper has its own characteristic discharge rate, linked to a particular agricultural application. This parameter is very critical: there are no rules or methods to compute this value analytically, due to the complexity of the geometry.

CFD simulation may be employed, but many parameters influence the results, and it is important to know them precisely in order to obtain good outcomes. In practice the designer usually bases his work only on experience. At first he fixes a possible labyrinth path; then he works only by varying the depth of the channel. In choosing the geometry he must take into account anti-clogging properties, life-span of the parts and overall performance. A dentate design is usually preferred since it meets these aspects. The profile is often triangular, since it guarantees a turbulent flow which increases pressure dissipation and prevents the sedimentation of suspended grains. In addition, an intake filter is added to stop bigger particles. After the geometry definition, the moulding process is designed. The main aspects are related to line productivity and to the correct, constant properties of the product. This is very important for the quality of dripper pipelines, because every dripper must emit almost the same water quantity to guarantee a balanced irrigation of every plant in the field. Discharge uniformity is a central parameter the designer must control along the whole process. Besides, the realization of the moulds requires many types of machine tools, such as copper electrodes and mills with an accuracy of about 0.01 mm. The compromise between performance, cost and fast realization is hard to reach and requires knowledge linked to experience. After a pilot batch has been obtained, the customer tests a first assembly line to experimentally measure the effective discharge rate. Results are often not very good, so the first dripper model may need a deep revision and the repetition of all the previous phases. This leads to a trial-and-error loop which terminates only when the experimental results are sufficiently good. This loop spans all production steps, so it is very expensive for the company, which needs to employ many resources to modify the first dripper design.

A knowledge base for supporting dripper design

This section addresses the construction of a knowledge base for a dripper design support tool. Some meaningful test cases are examined both from a virtual and an experimental point of view. This information is used to extract the main design parameters and their correlations.
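To make the structure of such a knowledge base concrete, the following Python sketch shows one possible way to store test cases with their input and output parameters and to retrieve homogeneous cases by product family. The thesis does not prescribe this implementation, and all names are illustrative:

```python
from dataclasses import dataclass

@dataclass
class TestCase:
    """One analysed dripper: input parameters and the measured output."""
    family: str               # "flat" or "round"
    labyrinth_type: str       # "A", "B" or "C"
    path_depth_mm: float
    pipe_thickness_mm: float
    discharge_rate_lh: float  # measured at the nominal 1 bar intake pressure

class DripperKnowledgeBase:
    def __init__(self):
        self.cases: list[TestCase] = []

    def add(self, case: TestCase) -> None:
        self.cases.append(case)

    def similar(self, family: str, labyrinth_type: str) -> list[TestCase]:
        """Retrieve the homogeneous test cases whose correlations can be
        reused to predict the behaviour of a new design in the same family."""
        return [c for c in self.cases
                if c.family == family and c.labyrinth_type == labyrinth_type]
```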

Dripper design parameters

To test the introduced methodology, two different cases from the flat and round dripper families have been analysed. The input parameters which influence the performance of all drippers can be divided into: geometric parameters, such as dentate path shape, path depth and pipe thickness; process parameters, such as moulding pressure, moulding temperature and assembly process temperature; dripper and pipe material properties; and operating parameters, such as water pressure, water temperature and clogging state. The output parameters can be mainly recognised in discharge rate and lifetime. All these parameters are numerous, heterogeneous and linked in complex ways. Some hypotheses were therefore formulated to simplify the approach. The study focused on geometric and operating parameters, while factors linked to material and moulding process were considered constant: the material was fixed as high density polyethylene both for the dripper and the pipe; the operating temperature and the clogging state were respectively fixed at about 23 °C and at the absence of any clogging sediment. As output parameter only the discharge rate was taken into account, while lifetime was ignored since it mainly depends on the chosen material and on the conditions of use. In the test cases a constant pressure of about 1 bar was fixed, and attention focused on the geometrical parameters, such as the dentate path geometry and the path depth, which deeply influence dripper performance. Generally speaking, parameters are chosen out of convenience considerations: very often a new design starts from an existing model which maintains most of the geometric choices, such as structure, labyrinth shape, inlet position and so on. For that reason a new design is often based on the choice of a product family and then concentrates on parameters such as path depth, overall length and number of labyrinth bends which, conveniently varied, lead to the desired performance.

Chosen test cases description

Three kinds of flat drippers, characterised by three different dentate labyrinths, were analysed. Moreover, for each flat device three different path depths were considered. All the drippers were also experimentally tested on pipelines of different pipe thickness.

Figure 4-6 Three types of dentate paths analysed to study the flat dripper (flow path in blue)

In particular, the external dimensions of the flat drippers are respectively 35x8 mm (flat type A), 20x8 mm (flat type B) and 30x8 mm (flat type C). The path depth varies between 0.75, 1 and 1.25 mm. Finally, the thickness of the assembly pipe on which the drippers were installed was chosen as 0.15 or 0.3 mm. Figure 4-6 shows the three types of path. At first sight, it may be observed that the first path is long but each dentate tip is very rounded, the second path is sharply shaped, and the last path has an almost rectangular dentate module. After simulations and experiments, it is possible to discuss the influence of the geometry design on the discharge rate. On the other side, round emitters have cylindrical symmetry, so they are quite different from the flat type. The approach used is the same as for the flat ones: three labyrinth types were chosen (see Figure 4-7), but in this case only the pipe thickness was studied, varying between 0.7, 1 and 1.2 mm, while the channel depth was kept fixed. The pipe thickness effect is more evident here than in flat drippers. This is mainly due to the cooling phase after pipe extrusion: radial tensions weld the dripper to the pipe with partial material overlapping. A thicker pipe causes stronger tensions and therefore a deeper material deformation; as a result, the effective labyrinth channel cross section is smaller than the nominal one.

Figure 4-7 The round dripper chosen as second test case (flow path in blue)

The external diameter of these emitters is 16 mm, while the lengths are 50, 40 and 35 mm. All three types have a nominal labyrinth depth of about 0.8 mm. As happens for most designs, two parameters were investigated: path depth and pipe thickness. The latter is not strictly a dripper design parameter, but it highly influences the results, so it can be considered as one of them. The other parameters were kept constant within homogeneous product families.

Product virtual analysis

The chosen dripper models were both experimentally and numerically analysed. Fluid dynamics aspects were simulated with a commercial CFD system, Fluent by Fluent Inc. All geometries were meshed with grids of 0.1 mm spacing, leading to more than 1x10^5 cells. From the literature it is clear that the water flow in the emitter dentate path can be considered turbulent, so the standard k-ε model was used to calculate the fluid dynamic quantities (B.E., et al., 1974). The flow inside the emitters can be considered a viscous, steady, incompressible flow described by the following fundamental equations (Wei, 2006):

Continuity equation:

$\frac{\partial u_i}{\partial x_i} = 0$    (1)

Navier-Stokes equation:

$\frac{\partial (\rho u_i u_j)}{\partial x_j} = -\frac{\partial P}{\partial x_i} + \frac{\partial}{\partial x_j}\left[\mu_e \left(\frac{\partial u_i}{\partial x_j} + \frac{\partial u_j}{\partial x_i}\right)\right]$    (2)

where $\mu_e$ is the effective viscosity. Gravity and surface roughness effects were neglected. In the simulation, standard boundary conditions for flow inlets and outlets were set: the relative pressure at the inlet was set to 1 bar, corresponding to the normal working pressure of the emitters, while at the outlet the pressure was fixed to zero. The outcomes of the numeric CFD analysis are reported in the following paragraph.

Experimental tests

The experimental phase consisted in the design of the moulds and in the realization of the different flat and round drippers discussed above. Tests were then carried out to measure the output parameters.

Figure 4-8 The dripper test machine

The data were gathered in two different ways: with a standard discharge rate measurement on extruded emitter piping, and by means of an innovative test machine. Since the discharge depends on the type of pipe, the second test was designed to simulate the tube interference effect: basically, a silicone cylinder encloses the dripper and lets the water flow into the labyrinth. A detail of this machine is reported in Figure 4-8.

Figure 4-9 Schematic diagram of the experimental set-up used in measuring the discharge rate of drippers

Figure 4-9 illustrates a scheme of this innovative dripper measurement set-up. This machine can test drippers simulating the effects of the pipe, leading to time and cost savings: dripper performance can be measured before pipe extrusion. Flat and round drippers were tested by means of this machine. For the flat drippers nine experiments were planned, combining the three dentate paths with the three path depths; for the round drippers three typologies of dentate path, all sharing the same depth, were analysed. Tables 4-1 and 4-2 report these experimental results along with the CFD outcomes.

Table 4-1 Comparison between simulated and measured water discharge rates for flat drippers (columns: Type; Path depth (mm); CFD discharge rate (L/h); Measured discharge rate (L/h); Difference (%))

Table 4-2 Comparison between simulated and measured water discharge rates for round drippers (columns: Type; Path depth (mm); CFD discharge rate (L/h); Measured discharge rate (L/h); Difference (%))

Afterwards, experiments with the classical method for measuring the dripper discharge rate were carried out. A measuring station was set up with a pump providing water to five drip tubes; each pipeline is one metre long, for a total of 25 drippers. These measurements are very time consuming compared with those performed on the test machine, but they permit analysing the effect of the pipe on dripper performance. For the flat drippers eighteen measurement combinations were used, because of the two tube thicknesses; for the round drippers nine additional tests were carried out. These measurements are reported in Table 4-3 and Table 4-4.

Table 4-3 Flat drippers discharge rate data measured with the standard method (columns: Type; Path depth (mm); Pipe thickness (mm); Discharge rate (L/h))

Table 4-4 Round drippers discharge rate data measured with the standard method (columns: Type; Path depth (mm); Pipe thickness (mm); Discharge rate (L/h))

Design parameters discussion and correlation

The data were analysed to find correlations between input and output parameters, and some one-to-one correlations emerged. For instance, the effect of path depth on discharge rate is evident in flat emitters: the relation between the two parameters is almost linear in our test cases (Figure 4-10). In flat dripper type A the ratio between discharge rate and path depth is almost 2 L/h per mm; in other terms, a depth increase of 25% causes a 25% higher water flow. This behaviour can be observed in round emitters too, but no experimental or CFD data for it are available in this study.

Figure 4-10 Discharge rate versus path depth for flat drippers

Similar is the influence of the pipe thickness on the discharge rate: especially in round emitters, a thicker pipe reduces the cross section of the labyrinth channels and therefore makes the flow rate decrease. Usually this effect is neglected, which leads to problems with the pipeline assembly firms: each customer registers a different dripper performance depending on the pipe being used and on the specific extrusion process parameters, for instance the speed rate or the cooling effects. The CFD analysis shows large differences from the experimental data: the gap is from 28 to 33% for round emitters and from 8 to 12% for the flat ones. Of course these errors are linked to the quality of the mesh model created for the CFD analysis and to the accuracy of the test machines. Moreover, the CFD results also depend on some hydraulic parameter assumptions which would require further investigation. However, the main reason is that the CFD was based on the nominal dripper geometry, which does not consider the effect of the pipe collapsing into the labyrinth; therefore, on the test machine the discharge rate is always smaller than the CFD result. Numeric experiments should be corrected to account for this effect, for instance by reducing the nominal path depth for simulation purposes only. This choice was not made, and the data

were reported as they came out from the virtual or physical models. In fact, the aim of this work is to show the correlation of design parameters among homogeneous families of products: as long as the CFD error is repeatable and the correlation between parameters assured, new design behaviours can be predicted on the basis of old, known ones. The correlation between labyrinth area and volume is also worth exploring further. Each single dentate tip causes a pressure drop linked to its geometry; however, considering the labyrinth as a whole, the area-to-volume ratio accounts for frictional effects on the walls. Increasing this ratio leads to more flow resistance and then to a reduction of the water discharge. The correlations which emerged apply to the specific dripper design family: it was noticed that different dripper types show different levels of correlation between parameters. However, within homogeneous families, the results can be extended to new designs, and performance can be predicted with a sufficient degree of reliability.
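As an illustration of how a family-specific correlation can be exploited, this Python sketch fits the near-linear depth-discharge relation from a few measured points and predicts the discharge of a new design in the same family. The sample numbers are placeholders consistent with the roughly 2 L/h per mm slope reported for flat type A, not measured data:

```python
def fit_linear(points):
    """Ordinary least-squares fit y = a*x + b over (x, y) pairs."""
    n = len(points)
    sx = sum(x for x, _ in points)
    sy = sum(y for _, y in points)
    sxx = sum(x * x for x, _ in points)
    sxy = sum(x * y for x, y in points)
    a = (n * sxy - sx * sy) / (n * sxx - sx * sx)
    b = (sy - a * sx) / n
    return a, b

# Placeholder (depth mm, discharge L/h) pairs for one dripper family.
measured = [(0.75, 1.5), (1.00, 2.0), (1.25, 2.5)]
slope, intercept = fit_linear(measured)

# Predict the discharge of a new design at 1.1 mm depth within the same family.
print(round(slope * 1.1 + intercept, 2))  # -> 2.2
```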

New dripper design method

Finally, the resulting design method is summarised in Figure 4-11 to give the reader a graphical overview. The method framework derives from the initial, mainly experience-based methodology, with the addition of formalized knowledge about empirical laws, computational fluid dynamics tests and physical tests.

Figure 4-11 Resultant new dripper design process

The input data and requirements are the same, but the dripper design is now based on a stronger knowledge base, supported by new equations; a virtual prototyping evaluation is used to validate the intermediate drawings, analysing the main performance figures under fixed boundary conditions. The importance of a strong knowledge base lies in the definition of the initial drawing, which can then be consolidated by prototyping tools, either commercial or tailored software. The physical prototype building phase is therefore postponed to a later step, with a small increase in the effort spent on the first drawing but with large savings of money and time on materials and machine tool operations. A negative final test is still possible, but with low probability, and the resulting drawing revisions are very few.

Medium enterprises: test case of G.I.&E. Spa

The second test case reported here concerns a collaboration between the Mechanical Department of the Polytechnic University of Marche and GI&E Holding S.p.A. of Porto Recanati (MC), a medium-large Italian holding working in many areas such as green power technologies, environment, engineering and building. In particular, the collaboration involved the Turbec department, which focuses on micro gas turbines for power generation and for efficient cogeneration systems. The final objective was to implement a software tool to support the layout configuration of small cogeneration plants, calculating the necessary components, the estimated costs and the routing paths of electric cables and piping. The latter was a very important step towards attaining a high level of automated design. Generally, the design of industrial plants requires managing many geometrical and non-geometrical data to reach a satisfactory solution in terms of costs, performance and quality. An approach is presented to support designers in the elicitation and formalization phase of the required knowledge. An integrated prototypal software application then accomplishes the layout configuration tasks through a customized graphic wizard. A routing algorithm is presented to automate the calculation and modelling of piping and electrical cables while respecting design constraints. A cogeneration plant powered by micro gas turbines has been chosen as test case to evaluate the proposed design method and tool; the result of the study has been implemented in an application called Mg Configurator, which guides sales staff and engineers through plant design with a wizard methodology. The next sections explain the application context, the routing problem and its solution, the knowledge management and the system implementation with the related results.

Context and plant layout design

The increasing cost of traditional energy sources and the general sensibility to the preservation of the earth's environment have provided a strong push towards renewable energy sources and towards the

minimization of losses and wastes. For many companies this means the birth of new business related to the design and installation of cogeneration plants, which aim to reuse heat for conditioning while producing electricity on site. In the layout definition of these plants it is often mandatory to realize virtual prototypes to focus on solutions, costs, performances and parts arrangement, and thus prevent errors during the installation phase. In this context the routing problem is very important, as it is in many other fields such as electronic applications, navigation systems and computer networks. Basically, the principal design objectives are the correct dimensioning of parts, the convenient arrangement of plant components and the definition of piping or wiring following short and simple paths. In many small design departments pipeline routes are still designed with two-dimensional CAD systems, with many limitations and poor design support. In the 1990s there was a push towards 3D piping design (Yamada, et al., 1998). Nowadays, route planning in a virtual three-dimensional environment has been further enhanced by the birth of piping-specific tools for medium and high level CAD systems. The main advantages of parametric 3D piping modelling are:

- virtual representation of the working region boundaries;
- exact definition of obstacles;
- a 3D model of each component: pipes, valves, filters, connections, engines, turbines, etc.;
- visible determination of physical interferences, accessibility and quality of paths;
- easy printing of perspective views and immediate understanding of the plant also for non-technicians (customers for instance);
- rapid change of dimensions through parametric features.

Geometrical data are often provided as three-dimensional models of structures, obstacles, routing search spaces, and source and target terminals [2]. On the other side, design knowledge is often unstructured and based on the experience of senior engineers. In fact, standard piping design support tools cannot support reasoning and incorporate only limited automatic routing features. Moreover, existing tools support piping modelling and optimal equipment location, but they do not offer an interactive configuration phase in which the designer is helped in the layout arrangement.

This research activity aims to define the framework of a knowledge based system to support the design, configuration and modelling of plants in mechanical and industrial areas. To this aim, an approach to represent plant knowledge is presented along with the related data management tools. The layout configuration approach is based on the realization of a virtual prototype of the plant, following the Configurable Virtual Prototype (CVP) method already described in (Raffaeli, et al., 2009). The CVP stores the product knowledge and supports the configuration task.

Solving the routing problem

While the functional task is to transfer a specific volume of fluid between two endpoints, a piping design problem is much broader and involves the definition of an optimal route in a specific geometrical environment with assigned constraints. Routing is basically an iterative design task involving the geometrical definition of the given working space and a knowledge base of design rules and best practices. In the literature many mathematical algorithms have been developed to solve the routing task; some of them are recalled in this paragraph.

Shortest path algorithm
One of the main solutions to pipe routing is to find the shortest path between two points in space using Dijkstra's graph method (Dijkstra, 1959). Yamada developed this method by modelling a three-dimensional geometrical space including the influence of pipelines (Yamada, et al., 1998). His system does not find the best route but helps the designer decide the optimal piping route with important information about the path length. This information is important since costs and pressure losses are directly proportional to path length; however, this approach neglects the effect of curves. Graphically, each pipeline divides the plant space into different regions with boundary surfaces; mathematically, bidirectional functions between pipelines and partitions are established. A limit of this method is the scarce information about equipment and building structures.

Maze algorithm
Lee's algorithm, also known as the grid expansion algorithm (Lee, 1961) [12], is one of the earliest algorithms developed to enable automatic routing. The domain is divided by a grid of cells in which the obstacles are marked. The algorithm calculates the optimal route between two endpoints without interference with obstacles. The method assigns values

to each node depending on its distance from the target. Every maze algorithm guarantees a solution, but it is not efficient: it requires a lot of memory and is consequently very slow.

Escape algorithm
This algorithm, also called the line-search algorithm, generates the solution very quickly without much memory consumption, but it may not converge (Hightower, 1969) [13]. This iterative method calculates each line of a bi-dimensional path by extending the end of the previous line along its direction. The iteration loop repeats until a new line crosses the target point.

Genetic algorithm (GA)
A genetic algorithm approach was studied by Ito (Ito, 1999). This method finds the optimal pipe route using an objective function which calculates the fitness value of a generic path within a set of candidate routes. The best paths are then combined with a crossover method to generate new piping route variants. The method is iterative and stops when the score reaches a target value. The fitness function is based on: route path length, arrangement of pipes of the same categories, maintenance spaces, number of curves and absence of interference with obstacles. For each aspect a range of scores is defined.

From the analysis of the routing algorithms found in the literature, the most complete method appears to be the genetic algorithm: its fitness function can introduce many useful conditions such as shortest path, low pressure drop, lack of interference, minimum workspace and so on. A weakness of GA in routing problems is the possibly long calculation time. On the other side, the fastest is the escape algorithm, but it does not consider the conditions present in GA and in the cell-generation method. None of the presented algorithms combines a strategy for searching the optimal path with knowledge of the particular problem being solved: solutions are reached only on the basis of geometrical constraints. In this research activity the solution is instead guided by the specific plant design and the related knowledge. In particular, the routing algorithm employed for the test case is an evolution of the escape algorithm, in which paths are calculated considering performance, normative and geometric constraints; the optimal solution is searched for by iterating loops.
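For reference, a minimal sketch of Lee's grid expansion on a 2D grid (a breadth-first search, which is what makes the returned path shortest in steps) may clarify why the maze approach is complete but memory-hungry; the grid encoding and names are illustrative:

```python
from collections import deque

def lee_route(grid, start, target):
    """Lee's grid expansion (BFS). grid[y][x] is True where an obstacle
    blocks the cell. Returns a shortest obstacle-free path, or None."""
    rows, cols = len(grid), len(grid[0])
    parent = {start: None}
    queue = deque([start])
    while queue:
        cell = queue.popleft()
        if cell == target:
            path = []
            while cell is not None:       # walk parents back to the start
                path.append(cell)
                cell = parent[cell]
            return path[::-1]
        x, y = cell
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if 0 <= nx < cols and 0 <= ny < rows \
                    and not grid[ny][nx] and (nx, ny) not in parent:
                parent[(nx, ny)] = (x, y)
                queue.append((nx, ny))
    return None  # target unreachable

# Example: route around a one-cell obstacle.
g = [[False, True, False],
     [False, False, False]]
print(lee_route(g, (0, 0), (2, 0)))  # [(0,0), (0,1), (1,1), (2,1), (2,0)]
```

The `parent` dictionary, which must cover every visited cell, is exactly the memory cost the text attributes to maze algorithms.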

Approach to layout knowledge management

A knowledge based approach has been required to formulate a generic methodology to manage the layout configuration. Knowledge acquisition is often based on the analysis of interviews with experts and on the gathering of experimental tests. It is therefore necessary to elicit knowledge, convert the tacit knowledge into explicit knowledge and formalize it. As described before, the knowledge formalization process can take five different forms, which are complementary to one another. In this research activity, knowledge has first been classified into different domains, each of them formalized in specific ways (Figure 2-1). The eliciting phase focused on identifying who or what held the knowledge.

Gathering of a knowledge base for layout configuration

Coming to the specific knowledge required by plant layout design, it has been classified into four domains: configuration, geometry, normative and performance. In particular, Table 5-1 reports an example of the information gathered for the specific cogeneration application field.

Table 5-1 Micro-cogeneration plant design knowledge domains
- Configuration domain: setup type, fuel type, placing type, number of machines, components, etc.
- Geometrical domain: planimetry, geometrical boundaries, physical obstacles, interference detection, minimum workspace, etc.
- Normative domain: component dimensioning, normative requirements, local laws.
- Performance domain: fluid dynamics performance, electrical power, thermal power, thermo-fluid-dynamic efficiency, pressure losses, etc.

Figure 5-1 Knowledge domains interdependency

Figure 5-1 shows the connection between the four levels of knowledge. The gearbox representation is a metaphor for the interactions among the domains: while the configuration level is the prominent aspect, the final motion transmission, which represents the layout being designed, also depends on the geometrical, normative and performance balances; in a linear transmission each gear is fundamental for the final output. Even if geometrical and normative knowledge interact, the normative domain often acts as a sort of layout verification. All levels are necessary for a highly customized result, which has been found to be the most important characteristic of the design process. Therefore, an effective design support system should provide an outlook on all these aspects. The information gathered for each domain is now briefly discussed.

Configuration knowledge domain
This domain gathers product specifications, market requirements, functional requirements and concept solutions. It emerges basically from the background of company experts. It concerns configuration typologies, components, options, etc., but it also contains rules for component selection and component relations. Five levels of configuration design have been identified in this research: setup type (power, cogeneration, tri-generation), fuel type (natural gas, methane, diesel), placing type (indoor, outdoor), number of machines, and components (selection, data, assembly rules). Configuration knowledge is an analytical, explicit knowledge, as for instance setup type and fuel type: it can be recovered from books and product specifications. On the contrary, the relations between components derive from expert knowledge; assembly rules and component choices have a tacit nature and have been extracted during technical meetings with people from the design and production departments.

Geometrical knowledge domain
This knowledge manages geometrical data along with expertise rules. Data are represented by wall planimetries and by the positions of components and obstacles in the installation building. Rules refer to interference detection and parts arrangement, in order to guarantee a minimum workspace around the plant components. Tacit knowledge about the positioning of the plant layout was elicited from the technical workers who are responsible for the final plant installation. The information gathered during talks with the technicians has been formalized in geometric formulas and rules.

Normative knowledge domain
Laws and standards provide rules to guarantee a safe installation and operation of the plant. This kind of knowledge comes into play both during parts choice and as verification of the obtained configuration. Normative knowledge is often purely analytical, based on constraints, and is stored in official publications. It is normally readily available, but technical meetings have been useful for interpreting rules and laws in the specific

design context. This knowledge can be regarded as fixed and unavoidable: when opposite requirements emerge, the indications of the other domains must be accommodated to meet the particular normative request.

Performance knowledge domain
This aspect is a synthesis of conceptualization, system modelling, experimentation and expertise. Empirical and theoretical approaches mix to estimate plant performance. Total performance is evaluated in terms of fluid dynamics performance, such as noise or pressure losses, of electrical and thermal power outputs, and of the related uncertainties. This knowledge has been formalised through diagrams, tables, formulas and corrective factors.

Knowledge formalization and representation

Class diagrams have been designed to formalize all knowledge domains following the UML approach. These diagrams helped to organize the gathered knowledge and were the basis for developing the software tool presented in the next section. The classes have been populated with fields representing the data selected in the knowledge base acquisition phase. The Configuration class (see Figure 5-2) represents the global information relative to a certain plant. It includes and connects all the other knowledge domains, which are formalized in separate sub-class diagrams. The Configuration class holds all the information about the plant layout, such as typology, fuel and performance. Its main field is the components collection, which gathers all the plant equipment. The Component class manages many fundamental data, such as: id code, category type, cost, geometrical dimensions and position, related codes, alternative options and the definition of parts compatibility.

Figure 5-2 Class diagram tree representing the layout knowledge

A mix of conceptual and expert knowledge has been collected. All component data are placed in datasheets, in which analytical data are expressed as numeric values, rules are defined as text strings and compatibilities are expressed as Boolean checks (Figure 5-3). Layout configuration knowledge has therefore been formalized in standard classes, diagrams and tables. Information about the general configuration, the design parameters, the components collection, the plant performance, the layout and the geometrical data is included. Geometrical data provide dimensions and relative mounting positions of parts. Mathematical functions have been defined to determine lengths, areas, volumes, intersections, projections, rotations, translations, etc. The following step of the approach concerned the elaboration of graphical and non-graphical means to let a generic user interact easily and intuitively with the data structure. A bi-dimensional graphical representation has been elaborated in order to capture a schematic plant layout. Graphic objects represent all the main plant components in a simplified but exhaustive manner and are linked to the classes storing the internal knowledge. Each plant component links to an internal data class which permits editing of its principal characteristics, geometrical data, rules and custom functions through graphic forms.
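As an illustration of the data structure behind Figure 5-2 and of the compatibility checks of Figure 5-3, here is a minimal Python sketch; the thesis implementation is a VB.NET class library, and all field names are illustrative:

```python
from dataclasses import dataclass, field

@dataclass
class Component:
    """One plant part, mirroring the Component class of the diagram."""
    id_code: str
    category: str                              # e.g. "turbine", "duct", "valve"
    cost: float
    dimensions: tuple[float, float, float]     # bounding dimensions (mm)
    position: tuple[float, float, float]       # mounting position (mm)
    options: list[str] = field(default_factory=list)
    compatible_with: set[str] = field(default_factory=set)  # ids of compatible parts

@dataclass
class Configuration:
    """Global plant information connecting all knowledge domains."""
    setup_type: str        # "power", "cogeneration", "tri-generation"
    fuel_type: str         # "natural gas", "methane", "diesel"
    placing: str           # "indoor", "outdoor"
    machine_number: int
    components: list[Component] = field(default_factory=list)

    def check_compatibility(self) -> bool:
        """All selected parts must declare each other as compatible,
        mirroring the Boolean checks of the datasheet."""
        ids = {c.id_code for c in self.components}
        return all((ids - {c.id_code}) <= c.compatible_with
                   for c in self.components)

    def total_component_cost(self) -> float:
        return sum(c.cost for c in self.components)
```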

Figure 5-3 Detail of the table capturing the compatibility between components for each configuration option

Technology employed for knowledge management

This paragraph addresses the knowledge management technologies used in this research. The main framework is a home-made data structure implemented in an object-oriented programming language (Microsoft VB.NET). The knowledge management tools have been chosen in order to facilitate the maintenance and usability of the software. The main data collections are stored in standard Microsoft Excel data sheets. Each sheet is organized in tables that store information about component attributes such as price, name, code, description, etc.; the user can easily update the data without any coding. The main framework manages the connection with Excel through API libraries. A Configuration class data structure is the container of the geometrical and non-geometrical data about the designed layout. Design rules are implemented in terms of IF-THEN-ELSE statements and component compatibility conditions. Geometries are internally managed as 3D entities and represented on a two-dimensional layout using points, vectors, lines and polygons. The final 3D shaded plant layout is obtained by interacting with a feature-based parametric CAD system through an Application Programming Interface (API) based on COM technology.
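A sketch of this arrangement follows, written in Python with the openpyxl library for brevity, whereas the tool itself connects to Excel from VB.NET through the COM API; the file, sheet and column names are assumptions:

```python
import openpyxl

def load_components(path="components.xlsx", sheet="Components"):
    """Read component attribute rows (price, name, code, ...) from a
    datasheet whose first row is assumed to hold the attribute names."""
    wb = openpyxl.load_workbook(path, data_only=True)
    ws = wb[sheet]
    rows = ws.iter_rows(values_only=True)
    header = next(rows)
    return [dict(zip(header, row)) for row in rows
            if any(v is not None for v in row)]

# Design rules stay outside the data: plain IF-THEN-ELSE statements over the
# loaded attributes, e.g. preselecting parts for a given fuel and placing mode.
def preselect(components, fuel, placing):
    return [c for c in components
            if c.get("fuel") in (fuel, None)
            and c.get("placing") in (placing, None)]
```

Keeping the attribute tables in Excel, as the text notes, lets domain experts update prices or compatibilities without touching the code.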

System implementation

The proposed approach has been developed in the context of the layout configuration of micro-cogeneration plants. The core of such plants is the T100 micro gas turbine produced by Turbec Spa, a company of the Ghergo Industry & Engineering Group (G.I.&E. Holding). A Windows-based application called Mg Configurator was developed in the Microsoft Visual Studio .NET environment, using the Visual Basic .NET language, to design and customize the installation of cogeneration plants. The application permits the configuration of all installation components and builds a detailed two-dimensional representation. Finally, it is possible to export the plant layout into the three-dimensional environment of a widely diffused CAD system, SolidWorks 2009 by Dassault Systèmes.

Software architecture

Mg Configurator has been implemented as a wizard structure, where a main container can alternately load eight distinct child forms.

Figure 5-4 First step of the configuration wizard structure

Each child form has a common basic structure but shows specific data and functionalities. The graphical elements also differ, but all forms share a common data structure; these forms can therefore be considered as interrelated tools operating on the layout project from different design perspectives. A description of each tool follows.

I. Product general configuration tool
The configuration tool is the first step of a layout configuration (Figure 5-4). This module allows defining the main configuration features, such as setup type, fuel type, number of machines and placing mode. These first selections act as filters for the automatic choice of components.

II. Components choice tool
The second step is the component selection, through a data grid view control. The application preselects a standard list of components on the basis of the features chosen in the first step. This tool reads configuration data and compatibility rules from the datasheet and guides the user to a correct selection of components and options.

III. Planimetry tool
A graphical panel has been implemented to visualize the installation area of a plant. It is possible to import the local plant geometry from a .dxf file or from a sketch feature of a SolidWorks document. A wireframe representation is used for the walls. Predefined graphical controls representing the obstacles, windows and doors present in the real installation area can then be added. It is also possible to import curved walls, which are approximated by polylines.

IV. Component placing tool
The component layout is defined in a fourth tool called Placing, in which the user can manage graphical controls representing turbines, compressors, co-generators, etc. Each control has fields with specific data about the pipeline connections, so flange types, 3D positions and directions are available for the specific component position. The user has to specify the external connections with similar graphical controls for the correct detection of the piping endpoints.

V. Routing tool
In the routing tool, mathematical and iterative functions have been implemented to calculate the routing paths of piping and electrical cables. The visual result is a bi-dimensional graphical visualization, but the software manages a collection of 3D nodes.

VI. 3D modelling tool
To export the 2D visualization into a more comprehensible 3D one, a software module able to connect to the SolidWorks 2009 CAD system has been developed. Parametric model templates have been designed to represent each component in a 3D environment. Routing paths are modelled from the definition of their mean axis through a sweep feature.

VII. 3D viewer tool
A viewer tool has been integrated in the software system to visualize the resulting 3D plant after its generation in the separate CAD system. This module embeds the eDrawings Viewer OCX control.

VIII. Estimated cost tool
The layout plant design process is completed by the estimation of the cost, which is computed as the sum of the components cost and the piping cost. The first is the sum of the component costs, while the second is derived from the necessary pipe span lengths and curves.

Routing approach

The routing tool is one of the main features of Mg Configurator. The available routing types are:

- intake duct: to take in air for combustion and refrigeration;
- ventilation duct: to remove hot air from the turbine cabinet;

- fuel duct: to provide fuel for combustion;
- gas evacuation duct: for the safe evacuation of unneeded fuel gas;
- water ducts: to transport hot and cold water in cogeneration and tri-generation plants;
- electric cables: for electricity management.

The principal task of the routing interface is the calculation of three-dimensional routing paths, avoiding any physical interference with other components and with the geometrical boundaries. In this work the path is calculated from the coordinates and directions of the endpoints, without implementing static templates of simple routes. Every complex route is split into n sub-routes. The generation function evaluates possible interferences and goes around each obstacle found along the path, working in 3D space (Figure 5-5).

Figure 5-5 Example of a route avoiding an obstacle

In particular, the main input data for a route generation are taken from the plant Configuration class: the geometrical definition of the extreme points (3D coordinates and directions), the route type (air intake, gas exhaust, fuel, etc.), the reference normative, and the definition of the installation space (dimensions), obstacles and components (position and dimensions). The proposed approach is based on a double iteration algorithm, as schematically shown by the block diagram in Figure 5-6. The external cycle generates complete route paths until one is valid and matches all design constraints. The internal iteration cycle calculates each node of a single route until the final extreme is reached. The generation of each point is determined by assigning scores to the principal directions as a function of their distance from the endpoint. The point validation is based on the condition of no interference; similarly, the validation of a route depends on the normative and on an acceptable fluid dynamic performance, which is mainly connected to the path length. This particular iterative approach allows the generation of many different valid candidate paths, and then the selection of the most performing one on the basis of total cost or total fluid dynamic losses.

Figure 5-6 Block diagram of the routing calculation algorithm

Routing algorithm

The algorithm described above has been implemented in the VB.NET programming language. The main loop of the program, illustrating the basic steps of the algorithm, is reported in Figure 5-7.

Figure 5-7 Main loop instructions of the proposed routing algorithm

The code is completed by the implementation of the required sub-functions and by a control to avoid infinite loops: in the worst cases, after a defined number of cycles the algorithm exits without a solution.
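Figure 5-7 reports the VB.NET main loop; the following Python sketch reproduces the same double-iteration scheme under simplifying assumptions: axis-aligned moves of fixed step, a pluggable interference predicate (sketched in the next subsection), and a maximum path length standing in for the normative and fluid dynamic checks.

```python
import math
import random

AXES = [(1, 0, 0), (-1, 0, 0), (0, 1, 0), (0, -1, 0), (0, 0, 1), (0, 0, -1)]

def path_length(path):
    return sum(math.dist(a, b) for a, b in zip(path, path[1:]))

def next_node(point, target, step, interferes):
    """Score the six axis-aligned candidates: moves toward the target score
    higher, but a random term keeps the other directions possible (the
    'fuzzy random' assignment described below)."""
    candidates = []
    for d in AXES:
        cand = tuple(p + step * c for p, c in zip(point, d))
        if not interferes(point, cand):
            score = -math.dist(cand, target) + random.uniform(0.0, step)
            candidates.append((score, cand))
    return max(candidates)[1] if candidates else None

def generate_route(start, end, interferes, step=0.5,
                   max_nodes=500, max_routes=50, max_length=None):
    """Outer loop: keep generating candidate routes until one is valid.
    Inner loop: grow a single route node by node until the endpoint."""
    for _ in range(max_routes):
        path, point = [start], start
        while len(path) < max_nodes:
            if math.dist(point, end) <= step:
                path.append(end)                    # endpoint reached
                break
            point = next_node(point, end, step, interferes)
            if point is None:                       # dead end: new attempt
                break
            path.append(point)
        if path[-1] == end and (max_length is None
                                or path_length(path) <= max_length):
            return path
    return None  # exit without a solution after max_routes cycles
```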

In the algorithm the most relevant functions are the computation of the intermediate route nodes and the interference detection and correction.

Calculating intermediate path points
A function incrementally defines the coordinates and direction of each intermediate node while the whole path is being calculated. At each step, six candidate directions aligned with the main coordinate axes are considered; a score is assigned to each of them considering the current direction and the distance from the final extreme point. In this approach the score assignment is a particular fuzzy random function: directions toward the target receive relatively high scores, while opposite ones receive low scores. In this way the directions with higher scores become more probable, but the other directions are not completely discarded. The basically random aspect allows the calculation of different points at each cycle, so many acceptable solutions can be found iteratively.

Interference detection and correction
While the path is being computed, a function is delegated to check interference. This function computes a bounding box for each layout plant component (walls, doors, obstacles, turbines, etc.); if a line of a route intersects the volume of a bounding box, an interference is detected. The approximation of a component with its bounding box is normally acceptable for the geometries involved and guarantees high algorithm efficiency. If an interference is detected for a route line, a specific routine tries to reduce the length of the last segment within an iteration cycle; if the interference is not eliminated after a certain number of iterations, a new path is recalculated.
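A minimal sketch of the bounding-box test just described, using the standard slab method for an exact segment versus axis-aligned box intersection (the box representation as min/max corner pairs is an assumption):

```python
def segment_hits_box(p0, p1, box_min, box_max):
    """Exact segment vs. axis-aligned bounding box test (slab method):
    clip the segment parameter t in [0, 1] against each coordinate slab."""
    t_enter, t_exit = 0.0, 1.0
    for a, b, lo, hi in zip(p0, p1, box_min, box_max):
        d = b - a
        if abs(d) < 1e-12:
            if a < lo or a > hi:      # parallel to the slab and outside it
                return False
        else:
            t0, t1 = (lo - a) / d, (hi - a) / d
            if t0 > t1:
                t0, t1 = t1, t0
            t_enter, t_exit = max(t_enter, t0), min(t_exit, t1)
            if t_enter > t_exit:
                return False
    return True

def interferes(p0, p1, boxes):
    """A route segment p0-p1 interferes if it crosses any component's box."""
    return any(segment_hits_box(p0, p1, bmin, bmax) for bmin, bmax in boxes)
```

Wrapping this predicate over the plant's box list, e.g. `lambda p0, p1: interferes(p0, p1, boxes)`, yields the interference callable expected by the route generator sketched above.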
From 2D to 3D Visualization

Using Mg Configurator, users can configure micro-cogeneration plant layouts in a two-dimensional graphic editor (Figure 5-8), managing piping, machines and equipment with 2D graphic blocks without any CAD interaction.
Figure 5-8 Example of a two-dimensional representation of a gas turbine layout

After layout definition, an integration between the CAD system and Mg Configurator automatically realises a 3D model of the plant. The automatic modelling is based on the SolidWorks API (Application Programming Interface), which allows a direct link with the geometric modelling kernel. Custom programming libraries have been implemented to generate the 3D model documents. The 3D modelling tool generates assembly documents from parametric geometric data, migrating the internal geometry representation into a CAD document. This process is basically CAD-independent and can be extended to other CAD systems. CAD models of the standard components have been interactively modelled and stored in a database. During the generation of the plant assembly document, parameters are updated to meet the desired dimensions and parts are located at the coordinates read from the 2D layout. Moreover, the application supports the modelling of piping and electric wires starting from sketches, through 3D sweep features. An additional 3D viewer tool uses the eDrawings OCX graphic control, a multi-CAD viewer included in the SolidWorks suite. This solution lets the user evaluate the newly generated assembly model in three-dimensional space (Figure 5-9) without using the CAD system. At this step it is also possible to generate drafts and distribute printed views. In conclusion, the CAD system is used exclusively as a modelling kernel and can also run on a separate server machine; therefore a single CAD licence is enough for many Mg Configurator installations.

Figure 5-9 An example of 3D plant generation
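As an indication of how such an API link is set up, a minimal late-bound VB.Net fragment is sketched below. The template path is a placeholder, the exact calls should be verified against the installed SolidWorks version, and this is not the Mg Configurator source code.

    ' Minimal late-bound sketch (Option Strict Off); placeholder paths.
    Dim swApp As Object = CreateObject("SldWorks.Application")
    swApp.Visible = False   ' CAD used purely as a modelling kernel

    ' Create a new assembly document from a template.
    Dim assy As Object = swApp.NewDocument("C:\Templates\assembly.asmdot", 0, 0, 0)

    ' Each stored component is first opened in the session (ISldWorks.OpenDoc6)
    ' and then placed at the coordinates read from the 2D layout through
    ' IAssemblyDoc.AddComponent5; piping and wires are generated as 3D sweep
    ' features along the routed paths.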

Energy performance knowledge management

Knowledge about performance has been formalized in tables and formulas to evaluate energy power and efficiency. In particular, this section briefly illustrates the calculation of piping pressure losses and of global performance.

Piping pressure losses

Energy losses in pipes decrease the nominal performance of standard plants, so piping design requires a fluid dynamic analysis to estimate the pressure drop. In straight pipes, losses have been computed with the Darcy-Weisbach formula (Equation 5-1), which links the pressure loss (ΔP) to the friction factor (f), the pipe length (L), the average flow velocity (V), the fluid density (ρ) and the pipe diameter (D):

$\Delta P = f \cdot \dfrac{L}{D} \cdot \dfrac{\rho V^{2}}{2}$   (Equation 5-1)

The friction factor f is not a constant: it depends on the parameters of the pipe and on the velocity of the fluid flow, and it is known with high accuracy only within certain flow regimes. For laminar flows f equals 64/Re, where Re is the Reynolds number; for turbulent flow, the friction factor is found using a diagram such as the Moody chart or by solving equations such as the Colebrook equation (Equation 5-2), where λ is the friction factor f, ε the roughness of the pipe and D the inside diameter:

$\dfrac{1}{\sqrt{\lambda}} = -2 \log_{10}\left( \dfrac{2.51}{Re\,\sqrt{\lambda}} + \dfrac{\varepsilon}{3.7\,D} \right)$   (Equation 5-2)
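Anticipating the iterative solution adopted in the Mg Application (described next), a minimal VB.Net sketch of Equations 5-1 and 5-2 could be the following; the module and function names are illustrative, not the thesis code. The Colebrook equation is solved by fixed-point iteration on x = 1/√λ.

    Module PressureLoss
        ' Fixed-point iteration on x = 1/Sqrt(lambda):
        '   x = -2 * Log10( eps/(3.7*D) + 2.51*x/Re )   (Equation 5-2)
        Function ColebrookFriction(Re As Double, eps As Double, D As Double) As Double
            If Re < 2300 Then Return 64.0 / Re    ' laminar regime: f = 64/Re
            Dim x As Double = 7.0                 ' turbulent start value (lambda ~ 0.02)
            For i As Integer = 1 To 50
                Dim xNew As Double = -2.0 * Math.Log10(eps / (3.7 * D) + 2.51 * x / Re)
                Dim delta As Double = Math.Abs(xNew - x)
                x = xNew
                If delta < 0.000001 Then Exit For
            Next
            Return 1.0 / (x * x)
        End Function

        ' Darcy-Weisbach pressure drop in Pa (Equation 5-1); rho = fluid density [kg/m3].
        Function PressureDrop(f As Double, L As Double, D As Double,
                              rho As Double, V As Double) As Double
            Return f * (L / D) * (rho * V * V / 2.0)
        End Function
    End Module

As a rough order-of-magnitude check, water at 2 m/s in a 50 mm steel pipe (Re ≈ 10^5, ε = 0.045 mm) gives λ ≈ 0.02 and a drop on the order of 1 kPa per metre of pipe.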

In the Mg Application, a root-finding algorithm has been developed to iteratively calculate the friction factor λ from the Colebrook formula.

Performance

Plant performance depends on the number of gas turbines chosen in the configuration phase. The turbine manufacturer provides tables and graphs to determine the performance in terms of power and efficiency. Standard operating conditions and the relative performances for the specific test case are reported in Table 5-2. The estimated power and efficiency must be multiplied by corrective factors that depend on altitude, temperature, and the pressure drops at the air inlet flange and at the exhaust flange:

$P_{ele} = f_{1} \cdot f_{2} \cdot f_{3} \cdot f_{4} \cdot P_{nom}$   (Equation 5-3)

The previous formula (Equation 5-3) shows that the effective electrical power is obtained from the nominal power through the corrective factors listed in Table 5-3 (a minimal code sketch of this formula follows Table 5-3).

Temperature: 15 °C
RH: 60%
Pressure drop (inlet-outlet): 0 Pa
Fuel (natural gas), PCI: 39 MJ/Nm³
Fuel supply pressure: … bar
Altitude: 0 m
Electrical power: 100 kW
Electrical efficiency: 30%
Thermal power: 165 kW
Global efficiency: 80%

Table 5-2 Operating conditions of the Turbec gas turbine

f1: altitude correction factor
f2: temperature correction factor
f3: air intake inlet pressure drop correction factor
f4: exhaust pressure drop correction factor

Table 5-3 Corrective factors for the electrical power
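As a minimal illustration of Equation 5-3, assuming the four factors have already been read or interpolated from the manufacturer's tables (hypothetical function name):

    ' Effective electrical power from nominal power and the four corrective
    ' factors of Table 5-3 (dimensionless values from manufacturer data).
    Function EffectivePower(pNom As Double, f1 As Double, f2 As Double,
                            f3 As Double, f4 As Double) As Double
        Return f1 * f2 * f3 * f4 * pNom
    End Function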

Test case plant

This section presents an example of a plant configured with Mg Configurator. The layout is an outdoor installation with four co-generative micro gas turbines powered by natural gas. The general configuration data were inserted in a primary form (as in Figure 5-4, the first step of the configuration wizard structure) and the necessary components were then detailed in the following steps.

Figure 5-10 Dimensional test case layout plant

The geometrical layout was configured with the dedicated graphics editor, as shown in Figure 5-10. It contains the machines (numbered big blocks), the columns (blocks at the corners), the planimetry drawn with black boundary lines, the principal electrical cables (oblique lines), and the hot and cold water circuits. Piping has been generated automatically using the software functionalities. Finally, Figure 5-11 shows the plant isometric view, which was automatically generated by Mg Configurator in about one minute.

Figure 5-11 Isometric view of the configured outdoor cogeneration plant

The next table (Table 5-4) shows the comparison between the performance data evaluated by the knowledge-based application and the real tested values.

Performance             Evaluated   Tested
Electrical power        293 kW      386 kW
Electrical efficiency   29.5%       29.1%
Thermal power           652 kW      644 kW
Global efficiency       79.8%       78.7%

Table 5-4 Evaluated and tested performance

Conclusion

Starting from an analysis of the traditional design method, three different levels of automated design have been analysed and described in the methodology section. Each approach has been verified in several mechanical enterprises collaborating with the Polytechnic University of Marche. In this thesis only two of the most significant test cases are explained in detail. In the FGR test case, a method based on virtual prototyping has been introduced to support dripper design, reducing physical tests and enabling the development of more efficient and better-performing products. Very important results have been achieved using the methodology based on the intermediate level of automated design, and a knowledge base has been developed containing the fundamental parameters and empirical laws. The Gi&E case has presented a reusable knowledge-based framework for the configuration of plants. In particular, it has presented an approach to support the technical configuration phase and the geometrical layout definition of industrial plants, with particular attention to the piping routing problem. A Windows application, based on engineering knowledge, has been developed around a user-friendly wizard interface that guides the designer through a complete layout configuration with a high level of automation. In conclusion, the methodology approach and the test case results have validated the strategy of eliciting knowledge to develop tools and methods that support engineering design.

Appendix A: Object-Oriented programming

Object-oriented programming (OOP) is a programming paradigm that uses "objects" (data structures consisting of data fields and methods) together with their interactions to design applications and computer programs. Programming techniques may include features such as data abstraction, encapsulation, modularity, polymorphism, and inheritance (Wikipedia, 2010). Many modern programming languages now support OOP. An object is a discrete bundle of functions and procedures, often relating to a particular real-world concept such as a bank account holder or a hockey player. Other pieces of software can access the object only by calling those of its functions and procedures that it allows outsiders to call. A large number of software engineers agree that isolating objects in this way makes software easier to manage and keep track of. However, a significant number of engineers feel the reverse may be true: that software becomes more complex to maintain and document, or even to engineer from the start. The conditions under which OOP prevails over alternative techniques (and vice versa) often remain unstated by either party, making rational discussion of the topic difficult and often leading to heated debates over the matter. Object-oriented programming has roots that can be traced to the 1960s. As hardware and software became increasingly complex, manageability often became a concern. Researchers studied ways to maintain software quality and developed object-oriented programming in part to address common problems by strongly emphasizing discrete, reusable units of programming logic. The technology focuses on data rather than processes, with programs composed of self-sufficient modules ("classes"), each instance of which ("objects") contains all the information needed to manipulate its own data structure ("members"). This is in contrast to modular programming, which had been dominant for many years and focused on the function of a module rather than specifically on the data, but which equally provided for code reuse and self-sufficient reusable units of programming logic, enabling collaboration through the use of linked modules (subroutines). This more conventional approach, which still persists, tends to consider data and behaviour separately. An object-oriented program may thus be viewed as a collection of interacting objects, as opposed to the conventional model, in which a program is seen as a list of tasks (subroutines) to perform. In OOP, each object is capable of receiving messages, processing data, and sending messages to other objects.

Each object can be viewed as an independent 'machine' with a distinct role or responsibility. The actions (or "methods") on these objects are closely associated with the object. For example, OOP data structures tend to 'carry their own operators around with them' (or at least to "inherit" them from a similar object or class). In the conventional model, the data and the operations on the data do not have a tight, formal association. Not all of these concepts are found in all object-oriented programming languages, and object-oriented programming that uses classes is therefore sometimes called class-based programming. In particular, prototype-based programming does not typically use classes; as a result, a significantly different yet analogous terminology is used to define the concepts of object and instance. The fundamental features that support the OOP programming style in most object-oriented languages are (Pierce, 2002):

Dynamic dispatch: when a method is invoked on an object, the object itself determines what code gets executed by looking up the method at run time in a table associated with the object. This feature distinguishes an object from an abstract data type (or module), which has a fixed (static) implementation of the operations for all instances. It is a programming methodology that provides modular component development while at the same time being very efficient.

Encapsulation (or multi-methods, in which case the state is kept separate).

Subtype polymorphism.

Object inheritance (or delegation).

Open recursion: a special variable (syntactically it may be a keyword), usually called this or self, that allows a method body to invoke another method body of the same object. This variable is late-bound; it allows a method defined in one class to invoke another method that is defined later, in some subclass thereof.

A class is a template for an object: a user-defined data type that contains variables, properties, and methods. A class defines the abstract characteristics of a thing (object), including its characteristics (its attributes, fields or properties) and the things it can do (its behaviours, methods, operations or features). One might say that a class is a blueprint or factory that describes the nature of something. Classes provide modularity and structure in an object-oriented computer program. A class should typically be recognizable to a non-programmer familiar with the problem domain,

meaning that the characteristics of the class should make sense in context. Also, the code for a class should be relatively self-contained (generally using encapsulation). Collectively, the properties and methods defined by a class are called its members. One can have an instance of a class; the instance is the actual object created at run time. The set of values of the attributes of a particular object is called its state. The object consists of the state and of the behaviour defined in the object's classes.

A method is a set of procedural statements for achieving a desired result. It performs different kinds of operations on different data types. In a programming language, methods (sometimes referred to as "functions") are verbs.

Inheritance is a process in which a class inherits all the state and behaviour of another class. This type of relationship is called a child-parent or is-a relationship. "Subclasses" are more specialized versions of a class, which inherit attributes and behaviours from their parent classes and can introduce their own. Each subclass inherits these members, meaning that the programmer only needs to write the code for them once.

Abstraction is simplifying complex reality by modelling classes appropriate to the problem, and working at the most appropriate level of inheritance for a given aspect of the problem.

Encapsulation conceals the functional details of a class from objects that send messages to it. Encapsulation is achieved by specifying which classes may use the members of an object. The result is that each object exposes to each class a certain interface: those members accessible to that class. The reason for encapsulation is to prevent clients of an interface from depending on those parts of the implementation that are likely to change in the future, thereby allowing those changes to be made more easily, that is, without changes to clients. Members are often specified as public, protected or private, determining whether they are available to all classes, to subclasses, or only to the defining class. Some languages go further: Java uses the default access modifier to restrict access also to classes in the same package; C# and VB.NET reserve some members to classes in the same assembly using the keywords internal (C#) or Friend (VB.NET); Eiffel and C++ allow one to specify which classes may access any member.

Polymorphism allows the programmer to treat derived-class members just like their parent class's members. More precisely, polymorphism in object-oriented programming is the ability of objects belonging to different data types to respond to calls of methods of the same name, each one according to an appropriate type-specific behaviour.
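To make these definitions concrete, the following minimal VB.NET sketch (illustrative class names, unrelated to any system described in this thesis) shows a class with encapsulated state, a subclass in an is-a relationship, and a polymorphic method call:

    ' A base class: encapsulated state behind a read-only property (members).
    Public Class Duct
        Private _length As Double             ' hidden state (encapsulation)

        Public Sub New(length As Double)
            _length = length
        End Sub

        Public ReadOnly Property Length As Double
            Get
                Return _length
            End Get
        End Property

        Public Overridable Function Description() As String
            Return "Generic duct, " & _length & " m"
        End Function
    End Class

    ' A subclass: is-a relationship; inherits Length, overrides behaviour.
    Public Class FuelDuct
        Inherits Duct

        Public Sub New(length As Double)
            MyBase.New(length)
        End Sub

        Public Overrides Function Description() As String
            Return "Fuel duct, " & Length & " m"
        End Function
    End Class

    ' Polymorphism: the call is dispatched on the run-time type of each object.
    Module Demo
        Sub Main()
            Dim ducts As Duct() = {New Duct(3.0), New FuelDuct(5.5)}
            For Each d As Duct In ducts
                Console.WriteLine(d.Description())   ' dynamic dispatch
            Next
        End Sub
    End Module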

List of publications

Cicconi, Raffaeli, A Knowledge Base approach for affordable virtual prototyping: the drip emitters test case, Competitive Design - Proceedings of the 19th CIRP Design Conference, edited by Rajkumar Roy and Essam Shehab, Cranfield, United Kingdom, Cranfield University, 2009, pp. …, ISBN ….

Cicconi, Mandolini, Germani, Knowledge-based tool for cost estimation in agile product design, 3rd International Conference on Changeable, Agile, Reconfigurable and Virtual Production (CARV 2009) Proceedings, edited by M.F. Zaeh, Munchen, Germany, Herbert Utz Verlag GmbH, 2009, CD-ROM, pp. …, ISBN ….

Cicconi, Raffaeli, Knowledge based plants layout configuration and piping routing, Proceedings of the 20th CIRP Design Conference 2010, Nantes, France, 2010, Springer, in press.

Raffaeli, Cicconi, Mengoni, Germani, Modular Product Configuration: an Automatic Tool for Eliciting Design Knowledge from Parametric CAD Models, Proceedings of the ASME 2010 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference IDETC/CIE 2010, Montreal, Quebec, Canada, ASME, 2010, part number DETC…, CD-ROM, ISBN ….

Cicconi, Germani, Mandolini, How to Support Mechanical Product Cost Estimation in the Embodiment Design Phase, New World Situation: New Directions in Concurrent Engineering - Proceedings of the 17th ISPE International Conference on Concurrent Engineering, edited by Jerzy Pokojski, Shuichi Fukuda, Jozef Salwinski, Cracow, Poland, Springer-Verlag London Limited, 2010, pp. …, ISBN ….

Bibliography

Launder B.E. and Spalding D.B. The numerical computation of turbulent flow [Book Section]. - [s.l.] : Comput. Meth. Appl. Mech.

Bodein Y., Rose B. and Caillaud E. Improving CAD performance: a decisional model for knowledgeware implementation [Conference Proceedings] // Proceedings of ICED. - Vol. N.8, pp. ….

Spear S. and Bowen H.K. Decoding the DNA of the Toyota production system [Journal] // Harvard Business Review. - p. ….

Brimble R. and Sellini F. The MOKA Modelling Language [Conference Proceedings] // 12th International Conference, EKAW 2000, Juan-les-Pins, France, October 2-6, 2000, Proceedings. - Juan-les-Pins : Springer. - Volume 1937/2000.

Burge J.E. Knowledge Elicitation for Design Task Sequencing Knowledge [Book]. - Worcester : Worcester Polytechnic Institute.

Colombo G., Cugini U. and Mandorli F. An Example of Knowledge Representation in Mechanical Design Using Object-Oriented Methodology [Conference Proceedings] // Proceedings of the 1992 European Symposium on Simulation and AI in Computer-Aided Techniques. - pp. ….

Colombo L.P.M., Armanasco F. and Perego O. Experimentation on a cogenerative system based on a microturbine [Journal] // Applied Thermal Engineering. - p. 27: ….

Dalkir K. Knowledge Management in Theory and Practice [Book]. - Oxford : Elsevier Inc.

Dijkstra E.W. A note on two problems in connexion with graphs [Journal] // Numerische Mathematik. - p. 1: ….

Dooley K., Anderson J. and Liu J. Process Quality Knowledge Bases [Journal] // Journal of Quality Management. - p. 4(2).

Durkin J. Expert Systems: Catalog of Applications [Book]. - Akron, OH : Intelligent Computer Systems Inc., 1993.

Fensel D. The Knowledge Acquisition and Representation Language KARL [Book]. - [s.l.] : Kluwer Academic Publishers.

Germani M. and Mandorli F. Self-configuring components approach to product variant development [Article] // AIEDAM Special Issue: Platform Product Development for Mass Customization. - Vol. 18(1), pp. ….

Germani M., Mengoni M. and Raffaeli R. Design Structure Matrix used as Knowledge Capture Method for Product Configuration [Conference Proceedings] // Proceedings of the 9th International Design Conference - DESIGN. - Edited by D. Marjanovic. - Vol. 1, pp. ….

Hasan S.S. and Isaac R.K. An integrated approach of MAS-CommonKADS, Model-View-Controller and web application optimization strategies for web-based expert system development [Article] // Expert Systems with Applications. - p. 38, ….

Hayes-Roth F., Waterman D.A. and Lenat D.B. Building Expert Systems [Book]. - London : Addison-Wesley.

Hightower D.W. A solution to line routing problems on the continuous plane [Journal] // Proceedings of the Sixth Design Automation Workshop, IEEE. - p. ….

Ito T. A genetic algorithm approach to piping route path planning [Journal] // Journal of Intelligent Manufacturing. - p. 10: ….

Lee C.Y. An algorithm for path connections and its applications [Journal] // IRE Transactions on Electronic Computers. - p. EC-10: ….

Liao S.-H. Expert System Methodologies and Applications - a decade review from 1995 to 2004 [Article] // Expert Systems with Applications. - p. 28, ….

Lin Y.S. [et al.] A method and software tool for automated gearbox synthesis [Conference Proceedings] // Proceedings of the 2009 ASME Design Engineering Technical Conferences & Computers and Information in Engineering Conference. - San Diego, California. - Paper number DETC….

Mandorli F. [et al.] An Approach to Implement Feature Based Applications [Conference Proceedings] // International IFIP Conference on Feature Modeling and Advanced Design-For-The-Life-Cycle Systems FEATS.

Mandorli F. [et al.] How to Implement Feature-Based Applications using KBE Technology [Journal] // International IFIP Conference on Feature Modeling and Advanced Design-For-The-Life-Cycle Systems FEATS.

Mandorli F. and Bordegoni M. Product model definition support for knowledge aided engineering applications development [Conference Proceedings] // Proceedings of ASME-DETC.

Mandorli F. Sistemi per lo sviluppo di applicazioni KAE: un esempio nel settore degli armadi componibili [Conference Proceedings] // Proceedings of IX Convegno Nazionale Associazione Nazionale Disegno di Macchine ADM.

Mandorli F., Berti S. and Germani M. A KBE system to manage the module configuration using the corporate knowledge [Conference Proceedings] // Proceedings of DESIGN. - Vol. 1, pp. ….

McCreary Tips And Tricks For Using Simulation DOE To Assess The Complex Interactions Of Your Process [Conference Proceedings] // Proceedings of the Winter Simulation Conference.

Pierce B. What is Object-Oriented Programming? [Book Section] // Types and Programming Languages. - [s.l.] : MIT Press, 2002.

Raelin J. A model of work-based learning [Journal] // Organization Science. - p. 8(6): ….

Raffaeli R. [et al.] An approach to support the implementation of product configuration tools [Conference Proceedings] // Proceedings of the 2009 ASME Design Engineering Technical Conferences & Computers and Information in Engineering Conference. - San Diego, California : ASME. - p. DETC….

Raffaeli R. [et al.] Modular Product Configuration: an Automatic Tool for Eliciting Design Knowledge from Parametric CAD Models [Conference Proceedings] // Proceedings of the ASME 2010 International Design Engineering Technical Conferences & Computers and Information in Engineering Conference IDETC/CIE 2010. - Montreal, Quebec, Canada : ASME, 2010. - Vol. DETC….

Sanchez R. Tacit Knowledge versus Explicit Knowledge Approaches to Knowledge Management Practice / Copenhagen Business School. - Frederiksberg : [s.n.].

Schotborgh W. [et al.] Why is design automation software not everywhere? [Conference Proceedings] // Proceedings of ICED. - Vol. N.8, pp. ….

Shadbolt N. and Burton M. The empirical study of knowledge elicitation techniques [Book].

Stiny G. Introduction to Shape and Shape Grammars [Article] // Environment and Planning. - Vol. 7, pp. ….

Stokes M. (MOKA Consortium) Managing Engineering Knowledge: MOKA Methodology for Knowledge Based Engineering Applications [Book]. - [s.l.] : Professional Engineering Publishing.

Tiihonen J. [et al.] Modeling Configurable Product Families [Conference Proceedings] // Proceedings of ICED. - Vol. 2, pp. ….

Wei Q. Study on hydraulic performance of drip emitters by computational fluid dynamics [Journal] // Agricultural Water Management. - [s.l.] : Elsevier, p. ….

Wielinga B.J., Schreiber A.T. and Breuker J.A. KADS: a modelling approach [Book Section] // Readings in Knowledge Acquisition and Learning / ed. Buchanan B., Wilkins D. - San Mateo : Morgan Kaufmann.

Wikipedia Object-oriented programming [Online] // en.wikipedia.org. - 2010.

Yamada Y. and Teraoka Y. An Optimal Design of Piping Route in a CAD System for Power Plant [Journal] // Comput. Math. Applic. - p. 35(6): ….

Acknowledgement

The author wishes to thank all the Design Tools and Methods group of the Mechanical Department and the Polytechnic University of Marche for the great opportunity to attend a Ph.D. course. Particular gratitude goes to the working-group coordinators: Full Prof. Ferruccio Mandorli, Prof. Eng. Michele Germani, Prof. Eng. Maura Mengoni and Prof. Eng. Roberto Raffaeli. An additional acknowledgement goes to Prof. Eng. Roberto Raffaeli for his technical and human support in many research activities, and to Prof. Eng. Michele Germani for the many research opportunities created in these years. Thanks to all colleagues, including: Eng. Marco Mandolini, Andrea Finaurini, Eng. Margherita, Eng. Alessandro Morbidoni, Eng. Claudio Favi, and many others. For the preparation of this Ph.D. thesis, a great acknowledgement goes to Prof. Ferruccio Mandorli for his support and for his ideas and insights. Best thanks to Mr Francesco and Mrs Rosanna Ruschioni for the opportunity to research new methodologies in their enterprises: FGR Srl and Eruroplast Snc. Other credits go to the many companies collaborating with the University, including Gi&E Spa and Biesse Spa. The greatest thanks to my family: Mum, Dad and Vincenzo, for all the help and moral support! A very special cuddle to my great Love: Valeria!!! I cannot forget all my special friends: Alessandro, Paolo, Cesidio, Valerio, Francesco, Samuele, Giuseppe, etc.
