Published January 26, 2014
By Steven Worsham and Kenneth von Halle

Steven Worsham, a Senior Business Analyst at Sapiens, uses The Decision Model in a variety of project environments within the mortgage and financial industries. His experience leading end-to-end decision modeling projects incorporates both Data Quality and Business Decisions within the Sapiens DECISION software application. Steven graduated from the University of North Carolina at Chapel Hill with a Bachelor's Degree in Business Administration and is a certified Decision Model Practitioner. He currently resides in Raleigh, North Carolina.

Ken von Halle is a Decision Analyst with Knowledge Partners International (KPI), helping clients become proficient in decision modeling. He has created decision models for a variety of purposes, including banking, mortgage eligibility, student loan eligibility, wealth management, data quality, and data migration. Ken currently resides in Mendham, NJ.
Previously, this column introduced the idea of using The Decision Model to specify and enforce Data Quality logic (i.e., DQ logic). That column covered DQ logic from the perspective of an organizational Data Quality Framework, which depicted various dimensions of Data Quality and how a particular project chose to represent each dimension using The Decision Model (TDM). Today, it is very common for organizations to use The Decision Model for managing DQ logic. The results are impressive and deliver unique advantages over other approaches. In some cases, organizations represent DQ logic in The Decision Model as part of requirements deliverables. In other cases, organizations create DQ logic in TDM-compliant software, which validates the logic against TDM principles, generates and executes test cases, and sometimes deploys to target technology. The information in this column comes from two decision analysts: Steven Worsham with Sapiens International Corporation and Ken von Halle with Knowledge Partners International LLC. They have created various kinds of DQ decision models for different organizations and industries, and they recommend dividing the DQ logic into four basic DQ categories. This month's column describes these DQ categories along with examples of how they fit into process models and decision models.

Background

How would you define a Data Quality Decision Model (i.e., DQ decision model)?

A DQ decision model contains the logic that ensures the data required by a business decision model is of acceptable quality. That is, it contains logic to test the quality of input data. A business decision model, on the other hand, contains logic that uses that data to arrive at a business decision. For example, a DQ decision model determines whether an address is, in fact, a valid address by testing its individual data fields (called fact types in The Decision Model) against postal requirements.
Subsequently, a business decision model uses that valid address to determine whether it is within an organization's marketing territory. This separation means DQ decision models make sure the input data is valid before that data is used in business decision models.

Do you need experience in data management or data architecture to create DQ decision models?

No, a decision analyst does not need knowledge of an organization's stored data structures to create decision models, even when dealing with data quality. Although DQ decisions must execute prior to business decisions in a run-time environment, we typically model the DQ decisions after we model the business decisions, or at least after thorough analysis of the business process. The reason is that the business decisions drive the DQ requirements (e.g., What data do we need? Is any of the data conditionally required? What should the data look like?). The good news is that the physical nature of the data has no bearing on the data quality requirements when working with decision models.

Do you need different skills to create DQ decision models than those needed to create business logic decision models?

You need the same skills to create DQ decision models as to create business decision models: a disciplined analytical approach to problem solving and attention to detail. In particular, it is the attention to detail that distinguishes excellent decision modelers.

Why use The Decision Model for DQ logic?

In your experience, why do companies choose to use The Decision Model for DQ logic in the first place?

There are at least three reasons. First, The Decision Model provides business control over the logic in a form understandable to non-technical audiences, ensuring that the logic is written accurately and is clearly understood throughout the enterprise. Second, The Decision Model can be deployed into any (and multiple) technologies for execution. This provides the ability to maintain a single source of record for DQ logic and ensures that the same logic is applied in the same way across the various execution technologies. Third, creating, maintaining, and governing a decision model repository creates appropriate transparency throughout an organization. A central decision model repository provides the ability to communicate DQ decision models across functional areas.

Why not include DQ logic within the business decision model that uses that input data?
First, executing DQ logic in its own decision model, prior to executing a decision model that uses that data, avoids running large decision models only to have them break due to low-quality or missing data. The separation guarantees that the business logic behaves as expected and that false conclusions are not reached due to poor data. Second, if DQ logic is maintained independently, it becomes reusable across multiple processes.
Third, when stored as a separate decision model, the management and governance of DQ logic is substantially simpler, allowing both business and technical users to comprehend the logic clearly. Experience has uncovered yet a fourth and powerful reason to create separate DQ models: more sophisticated options become available. For example, a DQ decision model can provide a complete error count for all data quality infractions along with corresponding error messages. This is quite powerful and is explored in greater detail below.

How do DQ decision models provide business control over the logic?

Most organizations perform DQ directly against actual data sources, so the DQ code executes against data as that data physically exists in electronic form. This leads to a dependence on technical professionals to translate DQ requirements from business people into technical DQ code. In contrast (and by design), decision models represent the business person's DQ requirements in a form business people understand. That's because, in The Decision Model, the input data is defined by business people as business-friendly fact types. Business-friendly fact types represent the data the way business people think of it and want to see it. This means there is not necessarily a one-to-one correspondence between input source data and the fact types referenced in a decision model. This is a unique and important characteristic of The Decision Model. Let's explore it further. Fact types in decision models have business-friendly names chosen by business Subject Matter Experts (SMEs), rather than names from a database or file. Fact types also have business-friendly domain values, likewise chosen by business SMEs, not necessarily the values actually found in the data source. It is best to illustrate with an example. Consider that a decision model needs the birth date of a driver because it uses that birth date as a condition to come to a decision about that driver.
To a business SME, this is simply Driver Birth date. However, in a database, this birth date may not be stored as Driver Birth date. It may be stored as a Person Birth date (in a Person table) where that person plays the role of driver (a data relationship) on a driver license (in a Driver License table). Presenting this data to an SME as a fact type called Driver Birth date involves joining those two tables. However, a business SME really has no need to know that multiple tables are involved. Therefore, a corresponding DQ decision model can represent this as one fact type, Driver Birth date, along with DQ logic to test the quality of that birth date. Unlike traditional DQ code, decision model DQ logic also applies to business-friendly interim fact types created from a combination of persisted data inputs, as opposed to the individual inputs themselves. For example, a fact type of Driver Age has its own domain restrictions (e.g., greater than 21 in some places if the driver is supervising a new driver), but usually only birth date is stored. Using The Decision Model, the business user can define an interim fact type for Driver Age and represent the DQ restrictions in the business-friendly representation. So business SMEs define fact types for decision models, regardless of their actual storage, along with logic to ensure they are of high quality. This is very unlike other approaches to DQ.

Creating DQ Decision Models

What do DQ decision models look like and how are they different from business decision models?

DQ decision models can be constructed in many fashions, but their structure is often much larger than that of business decision models in terms of the quantity of Rule Families. This is because DQ logic typically requires a separate Rule Family for each fact type in an entire input data set and addresses all DQ categories. Yet the individual DQ Rule Family tables tend to be smaller and less complex, usually requiring only one or two conditions. As with many decision models, there is usually a variety of acceptable structures that DQ decision models can take. Four DQ categories play a role in determining the acceptable structures.

Can you describe the four DQ categories?

The four primary DQ categories test for:

(1) DQ Complete: this logic tests for populated versus unpopulated fact types, including conditionally required fact types.
(2) DQ Domain Validity: this logic determines whether fact type values are within the allowed values.
(3) DQ Consistency: this logic tests for required relationships among fact types.
(4) DQ Value Validity: this logic determines whether fact type values are reasonable and accurate.

Can you describe the best way to model the four DQ categories?

There are two important considerations: how many DQ decision models, and what exactly those DQ decision models do.

How Many DQ Decision Models?
At one extreme, a single DQ decision model can test all four DQ categories for all input fact types. Such a decision model tends to consist of many Rule Families and is generally not recommended, for a variety of reasons. This one DQ decision model executes completely within one process task. On the other hand, there can be distinct DQ decision models for each DQ category. In this case, these decision models execute in their own process tasks and therefore execute in the sequence enforced in the process flow. Let's look at an example. Assume you are working with a process carried out by a police officer when he or she stops a car due to suspicious or careless driving. One business decision the officer makes is to determine whether the driver is compliant with respect to the use of interactive electronic devices. Let's call this business decision Driver Device Compliance; its logic tests whether the driver is using a cell phone, video game, or other hands-free device. However, before executing that business decision model, a DQ decision model must test the quality of the input data, specifically whether all required data is populated and whether its values are within the allowable domain values. Let's explore the creation of two DQ decision models. The first one, called Driver Device Compliance DQ Complete, tests whether or not required fact types are populated. The second one, called Driver Device Compliance DQ Domain Validity, tests for domain values. Figure 1 illustrates the process model for this option.
Figure 1: Process Model with Two DQ Decision Models

The first process task in Figure 1 contains the DQ decision model only for checking whether required (or conditionally required) fact types are populated. Figure 2 contains this Decision Model Diagram (Decision View) while Figure 3 contains the corresponding Rule Family.

Figure 2: Decision Model Diagram (Decision View) for Driver Device Compliance DQ Complete

Figure 3: Rule Family for Driver Device Compliance DQ Complete
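Since the Rule Family itself is shown only in the figure, the following is a minimal sketch of the kind of logic a DQ Complete Rule Family evaluates: every required fact type must be populated, and each unpopulated fact type yields an error message. The fact type names here are illustrative assumptions, not taken from the article's figures.

```python
# Sketch of a "DQ Complete" check for the Driver Device Compliance example.
# Each required fact type must be populated before the business decision
# model may execute. Fact type names are hypothetical.

REQUIRED_FACT_TYPES = ["Driver License Number", "Device Type", "Device In Use"]

def driver_device_compliance_dq_complete(fact_values: dict) -> dict:
    """Return a conclusion plus one message per unpopulated fact type."""
    missing = [ft for ft in REQUIRED_FACT_TYPES
               if fact_values.get(ft) in (None, "")]
    return {
        "conclusion": "Complete" if not missing else "Incomplete",
        "messages": [f"{ft} is not populated" for ft in missing],
    }

# Example: one required fact type is unpopulated, so the conclusion is
# "Incomplete" and the process would stop before the business decision.
result = driver_device_compliance_dq_complete(
    {"Driver License Number": "D1234567", "Device Type": "Cell Phone"})
```

Note that, as in the article's process model, a conclusion of "Incomplete" would halt the flow before the Domain Validity model or the business decision model ever runs.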
If the first DQ decision model is successful, the second process task in Figure 1 proceeds to the DQ decision model for checking fact type values. The Decision Model for this is in Figure 4, while its Rule Family is in Figure 5.

Figure 4: Decision Model Diagram (Decision View) for Driver Device Compliance DQ Domain Validity

Figure 5: Rule Family for Driver Device Compliance DQ Domain Validity

The sequence between these DQ decision models is enforced by the process flow.

Can you clarify when to create separate decision models for different DQ categories and when to bundle them all into one DQ decision model?

How many DQ decision models to create depends on the size and complexity of the logic. If there are not many fact types (e.g., fewer than ten data elements) and there are no complex relationships among them, one DQ decision model may suffice, but this is rare. Most often, separate decision models for different DQ categories work best.

What Exactly Should DQ Decision Models Do?
In general, DQ decision models can be re-active or pro-active. Re-active DQ decision models simply report on DQ errors, while pro-active DQ decision models are more sophisticated and actually work around the errors.

Re-Active DQ Decision Models

Re-active decision models function like a glass screen. They are passive in their evaluation of each fact type against DQ logic, simply detecting and reporting data quality errors. Re-active DQ decision models are useful in processes that require absolute assurance of data quality, due to their ability to catch all data quality errors along with specifications about those errors (e.g., unpopulated values, invalid fact type values, etc.). This allows the process to stop at the occurrence of DQ errors and attempt to remedy the data prior to executing business decision logic.

Pro-Active DQ Decision Models

Pro-active DQ decision models correct or enhance input fact type values that do not meet data quality standards. These decision models replace missing or incorrect data values with reasonable, assumed values. To accomplish this, an assumed value is set as the conclusion of a representative interim fact type when the true data value does not meet data quality standards. The representative interim fact type can then be used in a subsequent business decision model, allowing the process to continue despite the lapse in absolute data quality. It is recommended that any decision models using the assumed values include specific messages detailing the assumptions and stating that the resulting business conclusion is based on those assumptions. An example is a DQ decision model executing against a student loan application. It can supply missing data for income, for instance. The subsequent business decision model bases its conclusion on the supplied value, concluding that the student is eligible (if all other input data leads to an eligible conclusion) provided that the true income, once supplied, conforms to the DQ-assumed value.
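The pro-active pattern just described can be sketched as follows. The fact type name, the assumed income value, and the message wording are all illustrative assumptions; the article does not specify them.

```python
# Sketch of a pro-active DQ check: when the true value fails DQ, a
# representative interim fact type is set to an assumed value and the
# process continues, with a message recording the assumption.

ASSUMED_ANNUAL_INCOME = 25_000  # hypothetical stand-in value

def resolve_applicant_income(raw_income):
    """Return (interim fact type value, assumption messages)."""
    messages = []
    if raw_income is None or raw_income < 0:
        messages.append(
            f"Applicant Income missing or invalid; assumed {ASSUMED_ANNUAL_INCOME}. "
            "Conclusions based on this value may change once real data arrives.")
        return ASSUMED_ANNUAL_INCOME, messages
    return raw_income, messages

# Missing income: the interim fact type carries the assumed value, and the
# message travels along with any downstream business conclusion.
income, notes = resolve_applicant_income(None)
```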
Mixing Pro-Active and Re-Active Decision Models

It is interesting to note that a single DQ decision model may be a combination of pro-active and re-active data quality checks. In these instances, some fact types are show stoppers because the business decision requires the highest quality data for them, while for other fact types the business decision may be flexible enough to allow for supplied values. A decision model of this nature results in a negative conclusion if any of the show-stopper fact types do not pass data quality. It results in a positive conclusion if all show-stopper fact types pass all DQ tests; flexible fact types may be reset to supplied values. A negative conclusion from this decision would most likely prevent the business process from continuing with the business decisions.
Some people would be surprised to learn that DQ for valid field values is handled in decision models. Isn't this usually handled by the DBMS? Why use decision models?

It depends on the business's needs. Most of the time, the business side of an organization wants to be able to specify the DQ validation logic. For example, business people often want the ability to change, as needed, the allowable values for a fact type. Using decision models not only provides a business-friendly translation of the DQ validation logic, but also allows non-technical people to build and manage the model to their requirements, giving the business the ability to adjust the logic as they see fit.

Wrap Up

What was the biggest surprise in creating DQ decision models?

The biggest surprise was how complex DQ decision models can become. When modeling a DQ decision model for three fact types with relatively simple relationships, the DQ decision model quickly turned into eleven Rule Families. This proved that The Decision Model is capable of exposing the complexities hidden within DQ logic, but does so in an understandable, non-technical way.

What were the most sophisticated (or interesting) DQ decision models?

The most sophisticated example was the one mentioned above about a student loan application. It provides assumed values for unpopulated data, carries out the decision, and then informs the user which data values needed to be populated, along with a warning that the use of actual data values may result in a different conclusion. This is a lot of functionality to execute in a technology-independent, truly declarative representation that can deploy to any target environment.

Do you have any words of advice for people just starting out with DQ decision models?

Consider the following approach to start with:

1. Break up the DQ logic into 3-4 different, process-connected high-level decisions based on the DQ categories discussed in this article.
2. Begin with the DQ Complete decision model. This decision model should check that every fact type that needs to be populated is populated. It may include population logic for conditionally required fact types.
3. Next, focus on the DQ Domain Validity decision model. This decision model validates that each populated value is an acceptable value.
4. Then, create the DQ Consistency decision model to validate the relationships among the fact types (e.g., if STATE is NJ, then ZIP has to be a valid ZIP for NJ).
5. If any additional data quality considerations remain, consider creating one or more DQ Value Validity decision models to address logic for accuracy, reasonableness, and other DQ dimensions.

What is the most valuable but subtle advantage of using decision models for both business logic and DQ logic?

The benefit of having these separate DQ decision models is that, by the time the business logic decision model executes, it has all the data it needs at the quality it needs. The DQ decision models ensure that every required fact type value coming in is populated, its value is acceptable, its relationships to other fact types are valid, and other DQ logic is satisfied. This separation makes it easier to manage the DQ logic separately from the business logic. In addition, organizations managing a business decision repository (or BDMS, for Business Decision Management System) are able to deliver a central repository used by both business people and IT for managing all logic models across an enterprise, from business logic to DQ logic.

Notes

See Better, Faster, Cheaper Part II - The Decision Model Meets Data Quality Head On. For more information on Sapiens International Corporation, specifically DECISION software, and on Knowledge Partners International LLC, see their respective websites.

In reality, behind the scenes and not visible to the business SME, the Driver Birth date fact type is materialized as the product of a table join and presented in its business-friendly form. The specification for this join, however, can happen later; it need not slow down the creation, validation, and testing of decision models. More importantly, it need not be visible to the business SME.
The option of one DQ decision model for all DQ categories and fact types is usually unmanageable when input data sets are large.
This decision model can execute in any sequence because decision models are declarative, meaning that sequence is irrelevant to arriving at the conclusion.

There is the option of dividing a single DQ category decision into multiple decision models due to the size of the data set. Typically, best practice is to base the division on business concept (entity) or other characteristic. For example, DQ logic for DQ Complete may divide naturally into Borrower Information DQ Complete, Loan Information DQ Complete, Property Information DQ Complete, etc.

In this simple example, there are no relationships among fact types to test.

Other models, similar to the student loan application, are equally sophisticated.
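The staged approach recommended in the advice above (DQ Complete, then Domain Validity, then Consistency, then any Value Validity checks) can be sketched as a simple pipeline in which each stage stands in for one DQ decision model and the flow halts at the first stage that reports errors. The check functions, fact type names, and domain values are placeholder assumptions, not logic from the article.

```python
# Sketch of the staged DQ pipeline: each DQ category runs as its own
# decision model, in process order, and the pipeline stops at the first
# category that reports errors. All checks are simplified stand-ins.

def dq_complete(facts):
    # DQ Complete: every required fact type must be populated.
    return [f"{k} unpopulated" for k, v in facts.items() if v is None]

def dq_domain_validity(facts):
    # DQ Domain Validity: values must fall within allowed domains (stub).
    if facts.get("State") not in (None, "NJ", "NY", "PA"):
        return ["State outside allowed domain"]
    return []

def dq_consistency(facts):
    # DQ Consistency: cross-fact-type relationships, e.g. ZIP must match
    # STATE (simplified: NJ ZIP codes begin with "0").
    if facts.get("State") == "NJ" and not str(facts.get("Zip", "")).startswith("0"):
        return ["Zip is not a valid NJ ZIP"]
    return []

def run_dq_pipeline(facts):
    """Run the DQ decision models in process order; halt on first errors."""
    for check in (dq_complete, dq_domain_validity, dq_consistency):
        errors = check(facts)
        if errors:
            return False, errors   # halt before the business decision model
    return True, []
```

This mirrors the process-flow sequencing the article describes: the business decision model runs only if every DQ stage passes.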
Data Migration through an Approach An Executive Overview Introducing MIKE2.0 An Open Source Methodology for http://www.openmethodology.org Management and Technology Consultants Data Migration through an
December 2007 IBM Information Server FastTrack: The need for speed accelerating data integration projects Page 2 Contents 3 Creating a collaborative development environment 5 Optimizing data integration
12 th ICCRTS Adapting C2 to the 21 st Century Human Performance Technology: A Discipline to Improve C2 Concept Development and Analysis Tracks: Track 1: C2 Concepts, Theory, and Policy Track 6: Metrics
Achieving ITSM Excellence Through Availability Management Technology Concepts and Business Considerations Abstract This white paper outlines the motivation behind Availability Management, and describes
ServiceNow: Change Management Phase I Project Charter VERSION: 1.3 REVISION DATE: 9/28/2011 Approval of the Project Charter indicates an understanding of the purpose and content described in this deliverable.
Towards a Blended Workforce - the Evolution of Recruitment Process Outsourcing (RPO) Models The UK s ever-fragmenting workforce and the associated challenges of resourcing With research published in September
BIG DATA KICK START Troy Christensen December 2013 Big Data Roadmap 1 Define the Target Operating Model 2 Develop Implementation Scope and Approach 3 Progress Key Data Management Capabilities 4 Transition
A framework for creating custom rules for static analysis tools Eric Dalci John Steven Cigital Inc. 21351 Ridgetop Circle, Suite 400 Dulles VA 20166 (703) 404-9293 edalci,email@example.com Abstract Code
Web Team Roles & Responsibilities This document defines the approved Roles & Responsibilities for the Web Team and related Job Descriptions. Table of Contents Introduction 1 1.1 Roles & responsibilities
Information Paper The Roles and Domain of the Professional Accountant in Business Published by the Professional Accountants in Business Committee Professional Accountants in Business Committee International
Oracle Forms and SOA: Software development approach for advanced flexibility An Oracle Forms Community White Paper Malcolm Smith Atos Origin April 2008 Oracle Forms and SOA: Software development approach
Approach Prepared By: Sanjay Seth Data Quality Assessment Approach-Review.doc Page 1 of 15 Introduction Data quality is crucial to the success of Business Intelligence initiatives. Unless data in source
Fourth generation techniques (4GT) The term fourth generation techniques (4GT) encompasses a broad array of software tools that have one thing in common. Each enables the software engineer to specify some
A Tutorial on Quality Assurance of Data Models 1 QA of Data Models Barry Williams tutorial_qa_of_models.doc Page 1 of 17 31/12/2012 00:18:36 A Tutorial on Quality Assurance of Data Models 2 List of Activities
This is the site to apply: http://www.bkfs.com/corporateinformation/careers/pages/workwithus.aspx Automation Quality Assurance Manager - DEV0001R Black Knight is the premier provider of integrated technology,
Information Governance Workshop David Zanotta, Ph.D. Vice President, Global Data Management & Governance - PMO Recognition of Information Governance in Industry Research firms have begun to recognize the
GEOGRAPHIC INFORMATION SYSTEMS (GIS): THE BEDROCK OF NG9-1-1 THE TIME IS NOW FOR PSAPS AND REGIONAL AGENCIES TO TAKE ADVANTAGE OF THE ACCURATE GEOSPATIAL DATABASES THAT WILL BE KEY TO NEXT GENERATION EMERGENCY
Master Data Management Defining & Measuring MDM Maturity, A Continuous Improvement Approach DEFINE IMPROVE MEASURE Presentation by Mark Allen 1 About the Author Mark Allen has over 25 years of data management
DATA GOVERNANCE Enterprise Data Governance Strategies and Approaches for Implementing a Multi-Domain Data Governance Model Mark Allen Sr. Consultant, Enterprise Data Governance WellPoint, Inc. 1 Introduction:
The world of computing has grown from a small, unsophisticated world in the early 1960 s to a world today of massive size and sophistication. Nearly every person on the globe in one way or the other is
Business Analysis Standardization & Maturity Contact Us: 210.399.4240 firstname.lastname@example.org Copyright 2014 Enfocus Solutions Inc. Enfocus Requirements Suite is a trademark of Enfocus Solutions Inc.
WSJ: SOA Myths About Service-Oriented Architecture Demystifying SOA Service-oriented architecture (SOA) refers to an architectural solution that creates an environment in which services, service consumers,
db4o The Open Source Object Database Java and.net Agile Techniques for Object Databases By Scott Ambler 1 Modern software processes such as Rational Unified Process (RUP), Extreme Programming (XP), and