A comparison of BON and Hodge-Mock software development methodologies
December 2012
Abbas Naderi Afooshteh (abiusx@acm.org)
91211054
Sharif University of Technology

1. Introduction
This whitepaper provides a comparison between two object-oriented software development methodologies, based on the criteria introduced in the next section. Better Object Notation (BON) was coined by Jean-Marc Nerson and Kim Waldén in 1989 to act as a substitute for the UML. It was renamed Business Object Notation in 1993, and its practices are still maintained at its official homepage, http://www.bon-method.com. Hodge-Mock, on the other hand, was crafted to satisfy the needs of industry for an air traffic control system, then generalized and published as a paper in March 1992. Both methodologies claim to embody the best practices of software development of their time, though many of these claims have been disproven in practice.
The document is composed of three main sections: the first describes the criteria of comparison; the second describes each methodology briefly, then summarizes its strengths and weaknesses; the last provides a comparison of the two in an easily graspable form.

2. Criteria
The criteria described below are used as the framework of assessment for the two methodologies. They consist of 11 statements and are based on the paper by Ramsin et al. [3].
I. Clarity, rationality, accuracy and consistency of definition
II. Coverage of the generic development lifecycle activities (Analysis, Design, Implementation, Test, Maintenance)
III. Support for umbrella activities, especially including:
A. Risk management
B. Project management
C. Quality assurance
IV. Seamlessness and smoothness of transition between phases, stages and activities
V. Basis in the requirements (functional and non-functional)
VI. Testability and tangibility of artifacts, and traceability to requirements
VII. Encouragement of active user involvement
VIII. Practicability and practicality
IX. Manageability of complexity
X. Extensibility, configurability, flexibility, scalability
XI. Application scope
These criteria are referenced by their respective Roman numerals frequently in the rest of this document.

3. Business Object Notation (BON)
Nerson introduced BON as an alternative to UML, intended to be more traceable and to offer smoother transitions between phases [2]. Due to the relatively young state of object-oriented methodologies at the time of its writing, many existing approaches lacked smoothness of transition (IV) and traceability of artifacts to requirements and to former-phase artifacts (VI). Despite its name, BON is not just a notation: it includes well-described processes for most of the SDLC (II). The rest of this section describes key features of BON as presented in the original paper [2].

(a) Analysis
BON places clear emphasis on looking for classes instead of detecting processes in the existing organization, which was the common habit of analysts in the structured-programming days. It suggests looking for data abstractions that become classes and attach services to themselves, instead of looking for services that later attract data. It also encourages clustering classes based on three grouping criteria: subsystems, abstraction levels and end-user standpoint (coupling of classes). Clustering is a key feature of BON and helps a great deal in code reuse and in modeling complex systems. For this purpose, a cluster chart is drawn, showing classes in clusters along with their definitions and roles. To achieve this smoothly (IV), the suggested approach is to start with a general cluster, then divide it by refactoring classes based on the triple criteria mentioned above.
Figure 1: BON Cluster Chart
BON is inspired by CRC in having collaborative requirement gathering and object charts in the analysis phase (V). A dynamic model is then required, depicting parts of the system's behavior, along with a static model showing the structure of the system.
External events are listed so that boundary classes can be marked by tracing external events (events that do not originate within the system). They are categorized as either a class, or an input to a class. Internal events are defined as time-related stimuli, which can be relative or absolute. The object communication protocol is then derived from them. The dynamic model is mostly based on these events and ensures that objects can successfully reach each other.
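As a rough sketch of how such an event list might be captured, the following hypothetical Python fragment models external events and internal (time-related) events; the event names and fields are invented for illustration and are not taken from the BON paper:

```python
# Hypothetical sketch of a BON-style event list (entries are invented).
from dataclasses import dataclass

@dataclass
class Event:
    name: str
    external: bool               # external stimulus vs. internal event
    absolute_time: bool = False  # for internal events: absolute vs. relative

events = [
    Event("customer_places_order", external=True),
    Event("nightly_report_due", external=False, absolute_time=True),
    Event("timeout_after_request", external=False, absolute_time=False),
]

# Boundary classes are found by tracing external events to the classes
# that receive them.
external_events = [e.name for e in events if e.external]
```

Listing events as data like this makes the tracing step mechanical: each external event either becomes a class or is routed as an input to one.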
Figure 2: BON Event Chart
Figure 3: BON Class Chart
The class chart is then sketched from the initial list of classes, in a table with three columns holding native textual descriptions. The first column holds the questions: the information other classes can ask of this one. The second column lists the commands: the services this class can provide to others. The third holds the constraints: the knowledge that the class maintains. On top of the three columns, the type of the class and its behavioral resemblance are mentioned for later genspec operations. Class relationships are then classified as either inheritance or client-supplier forms. In the client-supplier form, association and aggregation are the available relationships.

(b) Design
In the first design step, generalization and the addition of solution-domain classes to the system are performed concurrently. The class charts, which were vague representations of classes, are then turned into class descriptions: more formal, and including almost any structural information a class can hold. Class descriptions also define the inputs/outputs of methods and their pre/post conditions. The behavioral resemblance noted in the class charts is then turned into inheritance relationships. At this step, indexing takes place, which is the commenting, versioning and listing of sources for a class. This is done for the sake of code reuse in other projects, which BON puts great value in, and the indexes are kept in a repository.
Figure 4: BON Class Description
A class description holds class features, and the entries from the class chart columns become comments on these features. Questions are turned into attributes, commands into methods (called procedures in BON) and constraints become assertions and class invariants. It is important to note that BON was loosely coupled with the Eiffel programming language, which supports class invariants built in. Many common object-oriented languages, both of the BON era and of today, lack this feature, so BON suggests replacing invariants with assertions where the language does not provide them.
Figure 5: BON Indexing
The static model, which includes class structures, and the dynamic model, which depicts communication protocols between classes, are derived afterwards. These models can be big (especially static models), as they include all the classes of the system; Nerson depicts only a subset of the example system's static model in his paper, without describing how such a huge diagram could be managed for a vaster system.
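The mapping from class-chart entries to class features, and BON's fallback of emulating Eiffel-style contracts with assertions, can be sketched as follows. The `BankAccount` class and its constraint are hypothetical illustrations, not taken from Nerson's example system:

```python
# Hypothetical sketch: a class chart entry turned into a class description
# in a language without built-in contracts, using assertions as BON suggests.

class BankAccount:
    """Questions become queries, commands become procedures,
    and constraints become an invariant assertion."""

    def __init__(self, balance: int = 0) -> None:
        self._balance = balance
        self._check_invariant()

    def _check_invariant(self) -> None:
        # Constraint from the class chart: balance is never negative.
        assert self._balance >= 0, "invariant violated: negative balance"

    # Question: information other classes may ask of this one.
    def balance(self) -> int:
        return self._balance

    # Command: a service this class provides to others; pre/postconditions
    # are emulated with assertions in place of Eiffel's require/ensure.
    def withdraw(self, amount: int) -> None:
        assert 0 < amount <= self._balance      # precondition
        old = self._balance
        self._balance -= amount
        assert self._balance == old - amount    # postcondition
        self._check_invariant()
```

In Eiffel the invariant would be checked automatically on every feature call; here each procedure has to re-check it explicitly, which is exactly the trade-off BON accepts when the target language lacks contracts.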
Figure 6: BON Static and Dynamic Model Notations
Then the object creation chart is drawn, determining which class is responsible for instantiating objects of other classes.
Figure 7: BON Object Creation Table
BON classifies data structures into three different types: storage, traversal and access method. It claims that by mixing these three, all sorts of complex data structures can be formed.

(c) Generalization
The methodology contains an important step before the beginning of development. Its intent is to maximize future reuse of the software, but it is only suggested if a large software house is developing the software and there are good chances of reusing software components. The process is done by first designing an initial version of the system, then making specializations from that first version, and then abstracting high-level features into higher-level classes. These different layers of genspec operations have to maintain the interface, so that minimal change is required throughout the software. Finally, clusters are generalized further so that a whole cluster can be reused in another system.

(d) Tools
BON, like many methodologies of its era, considers CASE tools one of the greatest assets of a methodology, and its modeling techniques are inspired by O* and OOSA. It claims scalability via the clustering technique, allowing for a divide-and-conquer approach to analysis and design, though clustering is done by different approaches and could introduce inconsistencies. The method also claims high reverse engineerability, due to its object-oriented nature. BON requires thorough documentation to maintain traceability, and requires all texts and diagrams to be well documented and traced to each other. Systematic design is what it claims class invariants achieve: they maintain the order and balance of the whole system by enforcing intra-object states. Management of components is made easy by the indexing process in BON, which makes for better reuse and maintenance.

(e) Development
BON does not address development, testing, maintenance and the other phases of the SDLC. This is mostly due to the nature of the industry at the time of writing: many companies were already employing mature object-oriented development processes, and they only lacked standardized analysis and design phases.

4. Hodge-Mock
Hodge-Mock is another noteworthy object-oriented software development methodology, introduced in early 1992. It was intended as a guideline for developing a large-scale air traffic control system over a period of a few years, and thus takes enterprise-level analysis and design into consideration [1]. Like BON, Hodge-Mock was introduced in an era when industry made heavy use of object-oriented programming languages without a proper documentation notation, so it only addresses analysis and design. Similar to BON, Hodge-Mock claims a focus on traceability and transition between phases (IV).
It names itself a framework rather than a methodology, and claims it supports the full development lifecycle, though only analysis and design are described.

(a) Introduction
Hodge-Mock is inspired by many techniques (as it calls them). It categorizes techniques as functional, structural and object-oriented, and is itself based on object-oriented techniques. It also assumes that object orientation is the solution to every software problem, which is understandable considering the peak of object orientation at the time. The methodology was born to address the diversity in object-oriented design and implementation methods of its time; it employs many techniques and notations, namely Ward (and Nerson), Coad-Yourdon, Booch, Bailin, and McMenamin and Palmer (the latter not being object-oriented), and tries to unify their strengths.
Since it targets an ATC system, reducing development risks is emphasized. Hodge-Mock tries to address four common object-oriented problems found in earlier works:
- lack of clear and consistent guidelines within each phase
- lack of guidelines for transition and for showing traceability
- lack of automated tool support
- lack of practical, experienced object-oriented staff
As with other seminal methods, Hodge-Mock claims it is iterative and cyclic at the same time, and supports both a systematic (information) view of the system and a behavioral (interactive) view. It also relies heavily on CASE tools to check the consistency of models in either view. The methodology synthesizes many notations and techniques continually, and its documents and models are refined and enhanced in each phase to ensure traceability.

(b) Analysis
The first step is identifying the objects of the problem domain by answering the question "what is present within the system and its domain?". This task is broken down into four stages, namely requirement analysis, information analysis, event analysis and client-server diagrams, which are performed concurrently.
Requirement analysis tries to fix ambiguities and deficiencies in the requirement statement. It clarifies constraints, such as performance, from the customer's perspective. It also answers the questions "what do I need to know about the problem domain?" and "what and who are the sources of information on the problem domain?", and refines the requirement statement of the contract.
Information analysis defines the data and information of the system, which are then turned into objects and their attributes. First an ERD is drawn, then the ERD is converted into objects by answering "what object does this data element best describe?". Some of the entities are turned into objects and some are made into attributes. The Object Relational Diagram (ORD) is then drawn, which depicts objects, their attributes, sub-objects and behaviors. These objects are then grouped into subjects.
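The ERD-to-object conversion can be sketched as a simple assignment of data elements to the objects they best describe; the elements, objects and classification rule below are invented for illustration, not drawn from the Hodge-Mock paper:

```python
# Hypothetical sketch of the information-analysis step: each data element
# from the ERD is assigned to the object it best describes, becoming an
# attribute of that object.
data_elements = ["callsign", "altitude", "runway_length", "runway_surface"]

def best_object(element: str) -> str:
    # Answering "what object does this data element best describe?"
    # (the rule here is a stand-in for the analyst's judgment)
    return "Flight" if element in ("callsign", "altitude") else "Runway"

objects: dict = {}
for element in data_elements:
    objects.setdefault(best_object(element), []).append(element)
```

The resulting grouping is essentially the attribute portion of an ORD: every entity has either become an object or been absorbed as an attribute of one.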
Afterwards, the Object Description (OD) is created, which is a textual description of the objects. The OCR (Object Cross Reference) is the next model: a table which describes all objects and the services they provide for each other. Ideally, the OCR is generated from the OD via CASE tools. Finally, inheritance is observed and the ID (inheritance diagram) is drawn.
Event analysis involves defining the behavior of the objects within the system. In this stage, the system is considered a stimulus-response machine which reacts to external stimuli. This part also validates the models derived in the other parts of analysis. A list of system activities known as fundamental activities is generated, and custodial activities are added to it. Then the System Behavior Script (SBS) table is generated. The SBD (system behavior diagram) is a state machine derived from the SBS, which shows the externally visible states of the system.
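The stimulus-response view behind the SBS and SBD can be sketched as a lookup table whose rows map a (state, stimulus) pair to a response and a next state; the states and stimuli below are invented for illustration and are not taken from the Hodge-Mock ATC example:

```python
# Hypothetical sketch of an SBS as data: the SBD is the state machine
# that this table induces over the externally visible states.
from typing import Dict, Tuple

# (current state, external stimulus) -> (response, next state)
SBS: Dict[Tuple[str, str], Tuple[str, str]] = {
    ("idle",     "flight_plan_filed"): ("register_flight", "tracking"),
    ("tracking", "position_report"):   ("update_track",    "tracking"),
    ("tracking", "flight_closed"):     ("archive_flight",  "idle"),
}

def react(state: str, stimulus: str) -> Tuple[str, str]:
    """Look up the scripted response to a stimulus in a given state."""
    return SBS[(state, stimulus)]
```

Validating the analysis models by "navigating stimuli" then amounts to walking sequences of stimuli through this table and checking that every required data element and state is reachable.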
Figure 8: Hodge-Mock ID (Inheritance Diagram)
Some new data elements might be added to the system in event analysis, since external stimuli require some data elements to hold state. The other data elements are validated by navigating the stimuli. Finally, a client-server diagram (CSD) is drawn, which shows the objects in detail and how they interact with each other.
These four stages are not only done concurrently, they are done in a cyclic manner until the analysis is deemed complete. Hodge-Mock requires the analysis to be done rigorously and to contain precise information, as it is seamlessly transformed into the design.
Figure 9: Hodge-Mock SBD
The analysis phase does not consider the solution domain, thus storage and processing are considered unlimited. An evaluation stage is also necessary after every few operations, to ensure (via customer agreement) that things are going the way they are intended to go.

(c) System Design
The system design phase defines the detailed behavior and interactions of the system. Hodge-Mock claims that the system design phase still assumes unlimited computing and storage power, but also requires the phase to clarify how the system works (e.g., keeping the data in linked lists), which is an inconsistency (I). Additional objects are added to the system and existing objects are refined in this phase. The OID (object interaction diagram) is drawn, which is derived closely from the CSD but contains more details. Then the OBD (object behavior diagram) is derived from the OBS, only for objects that are complex and have inner states themselves. This diagram is somewhat similar to the SBD, but depicts object-level behavior instead of system-level behavior. The OD and OCR are refined such that the OD now defines services, and the OCR represents the parameters of each service in the columns of a table. This is ideally generated from the previous work with CASE tools. Evaluation is done with some scenarios (preferably collaborative with the customer), and then this phase is deemed complete.

(d) Software Design
The software design phase focuses on implementing the system in a certain programming language for a certain platform. Objects from the solution domain are added to the system in this phase, thus the OD, OBD, OID, OBS and OCR are thoroughly expanded in cases where the implementation requires additional details. Since the OD is very platform dependent, most of the OD models are refined with platform-specific system calls. Access specifications (private/public) are also defined. The Object Processing Diagram (OPD) is then drawn, showing the internal processing of objects. Methods are described as public/private in this step, then getters/setters are defined and the input/output parameters of each method are depicted. Hodge-Mock, like many late seminal methodologies, relies heavily on sophisticated CASE tools for consistency checks and rapid modeling.
Figure 10: Hodge-Mock Lifecycle
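The software-design refinements (access specifications plus getter/setter pairs) can be sketched as follows; the `Track` class and its attribute are hypothetical, chosen only to illustrate the step:

```python
# Hypothetical sketch of software-design refinements: access specifications
# and getters/setters added to an object description.
class Track:
    def __init__(self) -> None:
        # leading underscore marks the attribute "private" by convention
        self._altitude = 0

    # public getter/setter pair defined at the software-design step,
    # with input/output parameters made explicit
    def get_altitude(self) -> int:
        return self._altitude

    def set_altitude(self, value: int) -> None:
        self._altitude = value
```

At this point each method's visibility and parameters are fixed, which is the information the OPD then depicts for the object's internal processing.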
Figure 11: Hodge-Mock ERD Figure 12: Hodge-Mock OD (Object Diagram)
Figure 13: Hodge-Mock ORD (Object Relation Diagram) Figure 14: Hodge-Mock OCR
Figure 15: Hodge-Mock SBS Figure 16: Hodge-Mock CSD (Client Server Diagram)
Figure 17: Hodge-Mock OID (Object Interaction Diagram) Figure 18: Hodge-Mock OBS
Figure 19: Hodge-Mock OPD (Object Process Diagram)
5. Comparison
The two described methodologies are of the late seminal methodologies era, when object-oriented software engineering was gaining pace and methodologies were maturing. Since Hodge-Mock is inspired by early BON in some ways, it has a more mature approach than that of BON, e.g. naming all models and diagrams clearly instead of just providing visuals for them. Both methodologies omit describing the phases past design in their initial papers, and the rationale behind that was mentioned earlier. The table below compares the strengths and weaknesses of the two methods relative to each other against the defined criteria; for each criterion, the methodology considered more mature by the author is highlighted, with the intensity of the highlight depicting the degree of maturity.

I. Clarity, rationality, accuracy and consistency of definition
BON: Definition is pretty clear and visual representation is included. No inconsistency. Rational and accurate, completed by the website.
Hodge-Mock: Definition is pretty clear and visual representation is included. Some inconsistency. Somewhat rational and accurate.

II. Coverage of the generic development lifecycle activities (Analysis, Design, Implementation, Test, Maintenance)
BON: Only analysis and design discussed. Other activities in the completed version (website).
Hodge-Mock: Only analysis and design discussed. Some implementation notes included.

III. Support for umbrella activities (risk management, project management, quality assurance)
BON: No risk management. Some project management (code reuse, indexing, etc.). Little quality assurance.
Hodge-Mock: No risk management. Little project management. Good quality assurance (evaluation at each stage).

IV. Seamlessness and smoothness of transition between phases, stages and activities
BON: Main focus of the method, but some models are not seamlessly and smoothly converted.
Hodge-Mock: Main focus of the method; mostly seamless, pretty smooth.

V. Basis in the requirements (functional and non-functional)
BON: Based on a method similar to CRC, with some additions.
Hodge-Mock: Based on the requirement statement, refining it with user collaboration.

VI. Testability and tangibility of artifacts, and traceability to requirements
BON: Artifacts are not testable unless advanced CASE tools are used. Medium tangibility. Little traceability to requirements.
Hodge-Mock: Pretty testable via evaluation scenarios. Artifacts are tangible (naming and visual representation). Requirements are refined into artifacts.

VII. Encouragement of active user involvement
BON: User is not involved after requirement gathering.
Hodge-Mock: User is involved in validation and analysis, and to some degree in design.

VIII. Practicability and practicality
BON: Some transitions are vague; it is not clear how practical some parts of the process are. It is pretty practicable though.
Hodge-Mock: Practical but fairly complex. Requires advanced CASE tools to be practicable.

IX. Manageability of complexity
BON: Work units are pretty complex at some stages (e.g. clustering the whole system, initial analysis).
Hodge-Mock: Almost everything is broken down, but some stages are still complex for big systems.

X. Extensibility, configurability, flexibility, scalability
BON: Some extension included (clustering, indexing). Not very configurable; mostly suited to the proposed group of examples. Average flexibility; allows omitting some stages. Good scalability; supports considerably big systems.
Hodge-Mock: Not much extension, but built for big systems from the start. Not configurable. Somewhat flexible; allows omitting some parts for simple models (pseudocode, intra-object state). Pretty scalable; intended for huge enterprises.

Overall, one can say that BON is more suited to smaller projects, where the vagueness and validation have little cost and do not occur often, while Hodge-Mock is more mature and suited to enterprise-level projects, with more models and smaller transitions; still, both methods are quite similar, as they were proposed around the same time and intended to address the same needs.
6. References
[1] Hodge, L. R., & Mock, M. T. (1992). A proposed object-oriented development methodology. Software Engineering Journal.
[2] Nerson, J.-M. (1992). Applying object-oriented analysis and design. Communications of the ACM, 35(9).
[3] Ramsin, R., & Paige, R. F. (2010). Iterative criteria-based approach to engineering the requirements of software development methodologies. IET Software, 4(2), 91. doi:10.1049/iet-sen.2009.0032
[4] Ramsin, R., & Paige, R. F. (2004). Process-centred review of object oriented software development methodologies. Technical report.
[5] Waldén, K., & Nerson, J.-M. (2001). Seamless object-oriented software architecture.