Enterprise Information Flow
White Paper
Table of Contents

1. Why EIF
   Answers to Tough Questions
2. Description and Scope of Enterprise Information Flow
   Data and Information Structures
   Data Attributes
   Data Sources
   Data Targets
   Data Flows
   Information Transformations
   Analytic Processing
3. Attributes
4. Terms and Abbreviations

Authors

Ondrej Zyka, Head of Data Management at Profinit
Ondrej has more than 15 years of Data Management experience and has spent his entire career focusing on Data Management, Data Quality, and Application Integration.

Ivo Mouka, Senior Consultant at Profinit
Ivo is a Senior Consultant with 20+ years of experience in Information Management, Business Intelligence, and Data Quality, with a background in Healthcare and Digital Publishing.
1. Why EIF

Enterprise Information Flow (EIF) is a new field that studies the life cycle of information in its entirety, including all interactions between the information and the surrounding environment. The huge increase in the amount of data processed in recent decades can be attributed to enterprises' hunger for ever more precise information to give them a competitive edge, to regulatory requirements, and to the fact that previously unavailable information is now accessible through new channels such as big data distilled from websites, public databases, or the information automatically collected on the users of various systems and devices. In an environment this complex, traditional solutions are bound to fail; a more intelligent approach is needed. The aim of EIF is to prevent wrong decisions by rigorously analyzing the sources and processing of information, identifying critical spots in the process, and eliminating emotion and the human factor when evaluating information quality.

The vision for EIF is to provide help in situations similar to the following cases:

When NASA lost a $125 million Mars orbiter in 1999 because a Lockheed Martin engineering team used English units of measurement while the agency's team used the more conventional metric system for a key spacecraft operation, Tom Gavin, the JPL administrator to whom all project managers reported, said: "This is an end-to-end process problem. A single error like this should not have caused the loss of Climate Orbiter. Something went wrong in our system processes, in checks and balances that we have, that should have caught this and fixed it."

Experience with high-risk operations on financial markets led the EU to introduce the Solvency II Directive for insurance companies.
It is scheduled to come into effect in January. Exact descriptions of the data flows and data calculations within the organization are essential to prove the appropriateness, completeness, and accuracy of information to regulatory bodies. This is one of the key capabilities of EIF.

Answers to Tough Questions

Could we prevent similar sudden crises that could result in considerable losses or even the fall of a company? Is our asset valuation sufficiently transparent? How correct and relevant is the data we use for cash flow predictions, and are the algorithms we use proven enough?

If these cases appear to be isolated events which rarely occur in your field of business, there is the common danger of creeping inefficiency that affects many large or growing companies and negatively impacts their financial results. John Schmidt of Informatica, in a blog article of May 2013, presents the following finding based on data in Forrester's 2013 IT Budget Planning Guide for CIOs: the cost of IT as a percent of revenue increases as organizations get larger. What is going on here? What happened to economies of scale? A key part of his answer is that IT spends a lot of time helping the organization automate business processes, but spends almost nothing on automating its own IT processes; IT is largely a manual activity.

The main benefit of EIF is that it introduces order into work with information, which is the precondition for automation and the effective use of tools. EIF seeks to answer a number of questions:
At the organizational level:

- Where and how was the information we use for our decisions created? Who is responsible for its quality?
- Is the analysis trustworthy? Who did it?
- On what basis was the client classified as credible? Are the estimates based only on our company figures, or have other sources, such as Central Bank estimates or EU predictions, also been taken into consideration?
- Who did the data enrichment and what information did they use?

The introduction of an Enterprise Information Flow discipline is a standard part of Enterprise Information Management in any organization striving to raise its Data Management Maturity level. One area where the EIF approach comes to good use is compliance with directives such as Solvency II for the insurance sector or Basel III for banks, with their strict demands on data quality.

At the personal level:

- Who is using information that concerns my person, and for what purpose?
- Is sensitive information being passed on to other parties? Is the information adequately aggregated or anonymized?
- Who is getting information about my phone calls, and how detailed is this information?
- Who can see my data on Facebook?
2. Description and Scope of Enterprise Information Flow

Enterprise Information Flow overlaps, to some extent, with existing and well-defined areas dealing with enterprise data: Data Management, Metadata Management, and Data Quality. To get a closer look at the behavior of information in an enterprise, it is necessary to expand the reach of the individual disciplines, to broaden the range of observed attributes, and to use more sophisticated methods for their processing.

Data management essentially means setting the rules, establishing the organizational structure, and defining and performing the required processes with the help of suitable tools. The Enterprise Information Flow approach involves looking at these segments of management from the perspective of individual data elements. In other words, it means monitoring what rules apply to each individual data element, who is responsible for its availability, quality, and so on.

The basic entities Enterprise Information Flow is concerned with are:

- Data and information structures
- Large collections of data attributes
- Sources and targets
- Data flows
- Information transformations
- Analytic processing

Let's describe the individual entities, their place in the Enterprise Information Flow concept, and the contribution of EIF compared to established approaches. The key features are described in the Data Flows and Information Transformations chapters, but let's not get ahead of ourselves; let's start with the basics.

Data and Information Structures

Metadata management deals with data structures in detail, including their technical realization and business purpose (the Business Glossary and its link to the data structures). This concept is well suited for structured data. However, we must be able to work equally effectively with semi-structured or unstructured data. In the case of semi-structured data, we can use the information contained in definition files such as XSD, DTD, RDF, Dublin Core, e-GMS, and AGLS.
Similarly, we can use definitions from systems where the data description is part of the data file; examples include XML, UML, HTML, and other similar types. In these cases, the definition is not static as it is with structured data; it is directly linked to the data contents. Consequently, it is necessary to deal not only with the structures that store the data, but also with the data itself.

The description of unstructured data and Big Data datasets is a major problem. We can often only obtain information on the existence and location of a dataset, along with a description of its contents or perhaps of the way it was created. Some examples are data from e-mails, discussion forums, or document management systems.
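Because the definition travels with the data in such self-describing formats, technical metadata can be harvested directly from the data contents. A minimal stdlib-only sketch (illustrative only, not part of any EIF tooling) that profiles the element structure of an XML document:

```python
# Illustrative: derive technical metadata (element counts, attribute names)
# straight from a self-describing XML document, since no static definition
# file is available.
import xml.etree.ElementTree as ET
from collections import Counter

def harvest_structure(xml_text: str) -> dict:
    """Return element names with occurrence counts and all attribute names seen."""
    root = ET.fromstring(xml_text)
    tags = Counter()
    attrs = set()
    for elem in root.iter():       # iterate over the root and all descendants
        tags[elem.tag] += 1
        attrs.update(elem.attrib)
    return {"elements": dict(tags), "attributes": sorted(attrs)}

doc = "<orders><order id='1'><item sku='A'/></order><order id='2'/></orders>"
meta = harvest_structure(doc)
# meta["elements"] -> {'orders': 1, 'order': 2, 'item': 1}
```

In practice, such harvested structure would be fed into the metadata repository as just another level of description granularity, alongside hand-written definitions.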
It is not technologically difficult to obtain analytical data such as the number of records, the average size of a record, the creation time, or the structure of the key identifiers of individual records. But to obtain a detailed description of the kind we are accustomed to with structured data is not feasible.

It is necessary to be able to work with, and to integrate, technical metadata at various levels of granularity. Sometimes we have an exact description of where to find the required information; at other times, only a note on the existence of the required information in particular records; and in some situations, the only possibility is to analyze particular datasets without any guarantee that they contain the information we are interested in.

Business metadata poses a similar granularity problem. On the one hand, we need very precise domain-specific definitions of terms such as nominal value, sale discount, liquidity risk, or commission rate. On the other hand, general definitions such as customer data, document, or observation permit defining the contents of data that is not precisely described or has not yet been analyzed. The value of a solution depends on its capability to maintain links across all layers of description granularity.

Data Attributes

Metadata management can be understood as the collection of various attributes about data. Enterprise Information Flow is characterized by collecting a large breadth of attributes on individual data components, and it puts particular emphasis on processing these sets of parameters. The objective is to obtain and process all attributes concerning not only the structure of the data, but also its significance for individual users, security, sources, processing, utilization, and data quality. The structure of the information obtained for particular types of data can be quite varied. Similarly, the sources of individual attributes are not only data repositories, but also all the other systems that process data, and even the users of the data.
The emphasis on processing the attributes leads to the necessity of an open solution where, based on the processing and analysis of attributes, new attributes for individual data items are dynamically created and the values of existing attributes are supplied or modified. Attributes may describe individual data fields or records (as with XML documents), or provide a description for a structure (as with relational databases) or for specific data sets (for example, all orders from one day, the results of a census, Big Data datasets, etc.).

Data Sources

The key to the description of a data source is the description of its structure. Moreover, EIF requires the capability to precisely identify all source data and information. Data sources must be assessed primarily from the point of view of reliability: it is necessary to identify the owners of the sources and their trustworthiness. For every source, the important data indicators must be defined, and their historical values recorded on a regular basis. From the organizational point of view, we distinguish internal and external sources. Other features of data sources we can observe are:

- Is the data recorded manually or automatically?
- Is the data the result of analytical processes or direct input?
- Who are the other users of the data?
- What is the opinion of the other users on data quality?

The concept of data source description must be sufficiently open to support the possible substitution of one source for another, and more particularly to allow modelling of how such a substitution may affect the entire life cycle of the data and its interaction with other systems and surroundings.
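As an illustration only (the field names below are our own, not a prescribed schema), a source description carrying ownership, origin flags, and a regularly recorded indicator history might look like this:

```python
# Illustrative sketch of a data source description in the spirit of EIF:
# owner, internal/external and manual/automatic flags, and a history of
# indicator values recorded on a regular basis. All names are assumptions.
from dataclasses import dataclass, field

@dataclass
class DataSource:
    name: str
    owner: str
    internal: bool       # internal vs. external source
    manual_entry: bool   # recorded manually or automatically?
    indicator_history: dict = field(default_factory=dict)  # name -> [(date, value)]

    def record(self, indicator: str, date: str, value: float) -> None:
        """Append one dated observation of an indicator."""
        self.indicator_history.setdefault(indicator, []).append((date, value))

    def latest(self, indicator: str):
        """Return the most recent (date, value) pair, or None if never recorded."""
        history = self.indicator_history.get(indicator, [])
        return history[-1] if history else None

src = DataSource("central_bank_rates", owner="Treasury", internal=False, manual_entry=False)
src.record("row_count", "2015-01-31", 1200)
src.record("row_count", "2015-02-28", 1180)
# src.latest("row_count") -> ("2015-02-28", 1180)
```

Keeping the indicator history as open key/value series, rather than fixed columns, is one way to satisfy the openness requirement: a substitution of one source for another can then be modelled by comparing the two sources' histories.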
A practical example is the capability to analyze available sources of data, such as the Open Data Directory, and to assist in the search for alternative sources of the data currently utilized.

Data Targets

As with data sources, the key to the description of data targets is the description of their data structure. Other important attributes of data targets concern their usability and their interaction with data users and the environment. The main features of data targets we can observe are similar to those of data sources. They include:

- Is it possible to process the target data automatically?
- What is the latency of the target data?
- Who uses the data? What about user satisfaction?
- What types of decisions are based on the data?
- Is the data for internal use only?

Data Flows

The description and analysis of data and information flows is a principal foundation of EIF. The description covers technologically formalized data flows (FTP, ETL procedures, data replication, XSLT, the use of ESBs and web services, BPM systems and BPEL transformations, etc.), which are proficiently handled and supported by contemporary metadata tools. Appropriate metadata tools should also be able to analyze data flows formed by scripts and SQL procedures.

Another big challenge in processing data flows lies in mastering repositories which unify data from various sources. These may be MDM systems, integrated entities in data warehouses, or shared file systems. In such situations, a plain data-flow view based on data structures and static transformation analyses links all the source and target data files transferred through the concentrator; such information on source-to-target links is entirely inadequate. The control of data flows requires the exact identification of the data lineage through concentrators for each individual data record.
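One way to meet this per-record requirement is to attach the lineage to the records themselves as they pass through a concentrator. A minimal sketch under assumed field names (the `_lineage` key is our own convention, not a standard):

```python
# Illustrative only: a concentrator hop that stamps each record with the
# source and flow it arrived through, so per-record lineage survives the
# merge of multiple sources. The "_lineage" field is an assumed convention.
def through_concentrator(source_name, flow_id, records):
    """Yield copies of the records, each annotated with the lineage of this hop."""
    for rec in records:
        annotated = dict(rec)  # shallow copy; leave the original record untouched
        annotated["_lineage"] = list(rec.get("_lineage", [])) + [
            {"source": source_name, "flow": flow_id}
        ]
        yield annotated

crm_rows = [{"customer": "ACME", "credit_limit": 500}]
merged = list(through_concentrator("crm", "daily_load", crm_rows))
# merged[0]["_lineage"] -> [{"source": "crm", "flow": "daily_load"}]
```

Because each hop appends rather than overwrites, a record that traverses several concentrators accumulates its full path, which is exactly the per-record lineage a plain structural source-to-target view cannot provide.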
Two methods can be used: either the required information can be added to every data record, or more sophisticated methods, based on the identification and transformation of specific data attributes, can be applied to data going through concentrators.

Yet another challenge of EIF is to process and document not only the data actually transmitted (or transformed in the process), but also the other data that is required for a particular transmission even though it is not itself transmitted. An example from relational databases is the data used in the WHERE clause of an SQL statement: this data is not included in the transmitted data stream, but it is necessary for understanding how the resulting data sets are formed.

Information Transformations

Unlike data transformations and data flows, where usable technical solutions exist, the situation at the information level is considerably less developed. EIF concentrates on the following areas:

The overall description of the information flow must be able to deal with information flows on multiple levels and to detect information in flows such as decisions based on watching data on a monitor or reading reports, telephone communication, etc. Even mere responsibility for data files must be considered a specific transformation, as it may considerably affect the trustworthiness or availability of the data. Overall, it is necessary to work with all transformations of information in which users are engaged. Such transformations include manual fixes of data, transformations started by a staff member, and transformations with parameters entered by a user.

With growing frequency, information processing must deal with transformations that are difficult to describe technically and that significantly change the quality of the information. The use of models based on artificial intelligence for scoring customers may serve as an example. Analyses performed by data scientists are another example of transformations that create entirely new information which did not exist in the system before; at the same time, it is hard to determine what data the result was based on. Another type of transformation that is hard to describe comprises the integration transformations used in Master Data Management, such as the transformations used for the cleansing and enrichment of data, algorithms for masking and anonymizing data, or algorithms for aggregating data.

EIF requires mastering these tasks concerning transformations:

- Efficient descriptions, ideally generated automatically from code, for the widest possible range of languages and tools used for transformations
- Descriptions of the input and output data sets of transformations at both the technological and the business level
- The key capability: a method which will generate the attributes of output data from the attributes of input data (security attributes, data quality attributes, descriptions of data sources and ownership, etc.)

Analytic Processing

The strength of EIF depends on the organization's ability to analyze all the information gathered. Standard analyses, such as where-used, data lineage, impact analyses, profiling, and historical searches over metadata or profiling data, must be supplemented with many other types of analyses that draw on the richness of the collected attributes. One very useful feature is the ability to quickly and easily create new types of impact analyses over the collected attributes.
These analyses may modify existing attributes or create new ones. The analyses take into consideration the data attributes, data flows, and transformations. The analytical procedures can vary widely. They can be as simple as the following examples:

- During the transmission of data strings, does any truncation occur?
- During the transmission of numeric values, does rounding occur?
- During transmission, are numeric values transformed into strings?
- During transmission, is there any reduction in data security requirements?
- Does the data that leaves the department or the company contain any sensitive information?
- What is the overall computational complexity (cost) of producing the resulting files?
- What is the critical path for obtaining the target data files?
- Do the same data quality criteria apply to all integrated data?
- How many people are involved in the process of obtaining the target data file?
- Which technologies are utilized for obtaining the target data file? Which of them contain sensitive data?

They can also be complex:

- What part of the data processing, from sources to results, is the most expensive, the slowest, has the weakest technical support, and is the most vulnerable from the security point of view?
- If an ETL tool were replaced, or if the enterprise switched to an ELT solution, how would the system parameters change?
- Is there a critical spot (a person, department, or system) through which all information regarding a particular decision process passes?
- What is the relationship between the costs of individual transformations and the number of decision processes for which they are used?
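Several of the simple checks above lend themselves directly to automation. A sketch of two of them, string truncation and numeric-to-string conversion, over a hypothetical source-to-target type mapping (the type names "decimal" and "varchar" are assumptions of this sketch):

```python
# Illustrative analytic checks in the spirit of the simple examples above;
# the type vocabulary and call signatures are assumptions, not a standard.
def truncated_values(values, target_length):
    """Return the string values that would be cut by a narrower target column."""
    return [v for v in values if len(v) > target_length]

def numeric_becomes_string(source_type, target_type):
    """True when a numeric source lands in a string-typed target column."""
    return source_type in {"int", "decimal", "float"} and target_type.startswith("varchar")

flagged = truncated_values(["OK", "a value far too long for its column"], 10)
# flagged -> ["a value far too long for its column"]
risky = numeric_becomes_string("decimal", "varchar(20)")  # -> True
```

Run over the collected structural attributes of every source-to-target mapping, even checks this small surface exactly the silent data degradation the chapter warns about.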
3. Attributes

EIF is based on the collection and processing of a large number of various attributes. Let's list some examples:

Structural attributes
- Placement of data in systems and environments
- Data structures
- Precision of the description of data structures
- Definition of data types
- Technical-level descriptions
- Business- or conceptual-level descriptions
- Data storage technology

Security attributes
- Physical access rights for modifying and using data
- Assignment of data security levels according to the security rules of the enterprise
- Access rules for modifying and using data
- Logical rights at the access level for modifying and using data
- Flags indicating that data has a link to a security incident
- Legal security requirements
- Requirements for anonymization and encryption

Data source attributes
- What is the source of the data
- What transformation created the data
- Whether the data was manually obtained or collected using technology
- Data quality rules used in data creation
- External or internal source
- Who else uses the source
- Known incidents associated with the source of the data
- Evaluation of the source by other users

Processing attributes
- Last transaction used
- Granularity of the primary data
- Technology used for processing
- Computational complexity of processing
- Latency of availability compared to the original data
- Latency of availability compared to the last time the data was stored
- Use of manually controlled transformations
- Up-to-date status of the data

Administration attributes
- Data owner
- Data steward
- Rules for data administration
- Last audit of administration
- Possibility of manual interventions
- Previous data owner
- Mode of receiving / handing over data

Usage attributes
- Users of the data
- User satisfaction
- Known incidents related to the use of the data
- Possible ways of publishing the data
- Rules for the use of the data
- Attributes for the availability of the data
- Automated processability of outputs

Data quality attributes
- Data quality indicators
- Data quality indicator values (profiling results)
- Opinions of users
- DQ-related issues
- List of corrections of the data
- Duplicity and rules for integration
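One illustrative way to hold such open-ended attribute sets (our own sketch, not a prescribed design) is a catalog keyed by data element and category, so that new categories and attributes can appear without schema changes:

```python
# Illustrative attribute catalog: element -> {category: {attribute: value}}.
# The element path and attribute names below are invented examples.
from collections import defaultdict

catalog = defaultdict(dict)

def set_attr(element, category, name, value):
    """Record one attribute value for a data element under a category."""
    catalog[element].setdefault(category, {})[name] = value

set_attr("dwh.customer.email", "security", "anonymization_required", True)
set_attr("dwh.customer.email", "quality", "null_ratio", 0.02)
set_attr("dwh.customer.email", "administration", "data_steward", "J. Novak")

# catalog["dwh.customer.email"]["security"] -> {"anonymization_required": True}
```

The open key/value shape matters more than the mechanism: analytic processing can then add, say, a derived "sensitive" flag as just another attribute, exactly as Chapter 2 requires.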
4. Terms and Abbreviations

- BPEL: Business Process Execution Language
- BPM: Business Process Management
- ESB: Enterprise Service Bus, an open-standards-based distributed messaging middleware
- ETL: Extract, Transform, and Load, a key process for data acquisition in data warehousing
- ELT: Extract, Load, Transform, an alternative process for data manipulation; unlike ETL, the data is extracted, loaded into the database, and only then transformed
- MDM: Master Data Management
- SOA: Service-Oriented Architecture
About Profinit and Manta Tools

Profinit is a member of the multinational New Frontier Group, a leader in the digital transformation of organizations and companies in Central and Eastern Europe. With more than 1,600 employees in 16 countries, we are also one of the 10 largest providers of ICT services in the entire CEE region and rank among the top firms in made-to-order software development, data management, data storage, and business intelligence.

Profinit's key Enterprise Information Management product is Manta Tools, the only solution which tells you what is really happening inside your BI environment. Manta Tools optimizes enterprise data flow, reveals data lineage, helps to perform impact analyses, and ensures the stability of the data architecture. It works with systems based on multiple technologies (Teradata, Informatica, Oracle, and IBM Cognos).
More informationSQL Server Master Data Services A Point of View
SQL Server Master Data Services A Point of View SUBRAHMANYA V SENIOR CONSULTANT SUBRAHMANYA.VENKATAGIRI@WIPRO.COM Abstract Is Microsoft s Master Data Services an answer for low cost MDM solution? Will
More informationData Virtualization A Potential Antidote for Big Data Growing Pains
perspective Data Virtualization A Potential Antidote for Big Data Growing Pains Atul Shrivastava Abstract Enterprises are already facing challenges around data consolidation, heterogeneity, quality, and
More information8 Ways that Business Intelligence Projects are Different
8 Ways that Business Intelligence Projects are Different And How to Manage BI Projects to Ensure Success Business Intelligence and Data Warehousing projects have developed a reputation as being difficult,
More informationEvaluating Data Warehousing Methodologies: Objectives and Criteria
Evaluating Data Warehousing Methodologies: Objectives and Criteria by Dr. James Thomann and David L. Wells With each new technical discipline, Information Technology (IT) practitioners seek guidance for
More informationIBM Information Management
IBM Information Management January 2008 IBM Information Management software Enterprise Information Management, Enterprise Content Management, Master Data Management How Do They Fit Together An IBM Whitepaper
More informationWhat's New in SAS Data Management
Paper SAS034-2014 What's New in SAS Data Management Nancy Rausch, SAS Institute Inc., Cary, NC; Mike Frost, SAS Institute Inc., Cary, NC, Mike Ames, SAS Institute Inc., Cary ABSTRACT The latest releases
More informationIndustry Models and Information Server
1 September 2013 Industry Models and Information Server Data Models, Metadata Management and Data Governance Gary Thompson (gary.n.thompson@ie.ibm.com ) Information Management Disclaimer. All rights reserved.
More informationData Virtualization and ETL. Denodo Technologies Architecture Brief
Data Virtualization and ETL Denodo Technologies Architecture Brief Contents Data Virtualization and ETL... 3 Summary... 3 Data Virtualization... 7 What is Data Virtualization good for?... 8 Applications
More informationINFORMATION TECHNOLOGY STANDARD
COMMONWEALTH OF PENNSYLVANIA DEPARTMENT OF PUBLIC WELFARE INFORMATION TECHNOLOGY STANDARD Name Of Standard: Data Warehouse Standards Domain: Enterprise Knowledge Management Number: Category: STD-EKMS001
More informationPOLAR IT SERVICES. Business Intelligence Project Methodology
POLAR IT SERVICES Business Intelligence Project Methodology Table of Contents 1. Overview... 2 2. Visualize... 3 3. Planning and Architecture... 4 3.1 Define Requirements... 4 3.1.1 Define Attributes...
More informationData Integration for the Real Time Enterprise
Executive Brief Data Integration for the Real Time Enterprise Business Agility in a Constantly Changing World Overcoming the Challenges of Global Uncertainty Informatica gives Zyme the ability to maintain
More informationTalend Metadata Manager. Reduce Risk and Friction in your Information Supply Chain
Talend Metadata Manager Reduce Risk and Friction in your Information Supply Chain Talend Metadata Manager Talend Metadata Manager provides a comprehensive set of capabilities for all facets of metadata
More informationEnabling Data Quality
Enabling Data Quality Establishing Master Data Management (MDM) using Business Architecture supported by Information Architecture & Application Architecture (SOA) to enable Data Quality. 1 Background &
More informationHow To Manage Risk With Sas
SOLUTION OVERVIEW SAS Solutions for Enterprise Risk Management A holistic view of risk of risk and exposures for better risk management Overview The principal goal of any financial institution is to generate
More informationAPPLYING FUNCTION POINTS WITHIN A SOA ENVIRONMENT
APPLYING FUNCTION POINTS WITHIN A SOA ENVIRONMENT Jeff Lindskoog EDS, An HP Company 1401 E. Hoffer St Kokomo, IN 46902 USA 1 / 16 SEPTEMBER 2009 / EDS INTERNAL So, Ah, How Big is it? 2 / 16 SEPTEMBER 2009
More informationBeyond Lambda - how to get from logical to physical. Artur Borycki, Director International Technology & Innovations
Beyond Lambda - how to get from logical to physical Artur Borycki, Director International Technology & Innovations Simplification & Efficiency Teradata believe in the principles of self-service, automation
More informationBeyond the Single View with IBM InfoSphere
Ian Bowring MDM & Information Integration Sales Leader, NE Europe Beyond the Single View with IBM InfoSphere We are at a pivotal point with our information intensive projects 10-40% of each initiative
More informationData Integration Checklist
The need for data integration tools exists in every company, small to large. Whether it is extracting data that exists in spreadsheets, packaged applications, databases, sensor networks or social media
More informationHadoop Data Hubs and BI. Supporting the migration from siloed reporting and BI to centralized services with Hadoop
Hadoop Data Hubs and BI Supporting the migration from siloed reporting and BI to centralized services with Hadoop John Allen October 2014 Introduction John Allen; computer scientist Background in data
More informationBusiness User driven Scorecards to measure Data Quality using SAP BusinessObjects Information Steward
September 10-13, 2012 Orlando, Florida Business User driven Scorecards to measure Data Quality using SAP BusinessObjects Information Steward Asif Pradhan Learning Points SAP BusinessObjects Information
More informationORACLE DATA INTEGRATOR ENTEPRISE EDITION FOR BUSINESS INTELLIGENCE
ORACLE DATA INTEGRATOR ENTEPRISE EDITION FOR BUSINESS INTELLIGENCE KEY FEATURES AND BENEFITS (E-LT architecture delivers highest performance. Integrated metadata for alignment between Business Intelligence
More informationManagement Update: The Cornerstones of Business Intelligence Excellence
G00120819 T. Friedman, B. Hostmann Article 5 May 2004 Management Update: The Cornerstones of Business Intelligence Excellence Business value is the measure of success of a business intelligence (BI) initiative.
More informationORACLE DATA INTEGRATOR ENTERPRISE EDITION
ORACLE DATA INTEGRATOR ENTERPRISE EDITION Oracle Data Integrator Enterprise Edition 12c delivers high-performance data movement and transformation among enterprise platforms with its open and integrated
More informationA TECHNICAL WHITE PAPER ATTUNITY VISIBILITY
A TECHNICAL WHITE PAPER ATTUNITY VISIBILITY Analytics for Enterprise Data Warehouse Management and Optimization Executive Summary Successful enterprise data management is an important initiative for growing
More informationMETA DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING
META DATA QUALITY CONTROL ARCHITECTURE IN DATA WAREHOUSING Ramesh Babu Palepu 1, Dr K V Sambasiva Rao 2 Dept of IT, Amrita Sai Institute of Science & Technology 1 MVR College of Engineering 2 asistithod@gmail.com
More informationApplied Business Intelligence. Iakovos Motakis, Ph.D. Director, DW & Decision Support Systems Intrasoft SA
Applied Business Intelligence Iakovos Motakis, Ph.D. Director, DW & Decision Support Systems Intrasoft SA Agenda Business Drivers and Perspectives Technology & Analytical Applications Trends Challenges
More informationENTERPRISE EDITION ORACLE DATA SHEET KEY FEATURES AND BENEFITS ORACLE DATA INTEGRATOR
ORACLE DATA INTEGRATOR ENTERPRISE EDITION KEY FEATURES AND BENEFITS ORACLE DATA INTEGRATOR ENTERPRISE EDITION OFFERS LEADING PERFORMANCE, IMPROVED PRODUCTIVITY, FLEXIBILITY AND LOWEST TOTAL COST OF OWNERSHIP
More informationData Integrity and Integration: How it can compliment your WebFOCUS project. Vincent Deeney Solutions Architect
Data Integrity and Integration: How it can compliment your WebFOCUS project Vincent Deeney Solutions Architect 1 After Lunch Brain Teaser This is a Data Quality Problem! 2 Problem defining a Member How
More informationScalable Enterprise Data Integration Your business agility depends on how fast you can access your complex data
Transforming Data into Intelligence Scalable Enterprise Data Integration Your business agility depends on how fast you can access your complex data Big Data Data Warehousing Data Governance and Quality
More informationThe following is intended to outline our general product direction. It is intended for informational purposes only, and may not be incorporated into
The following is intended to outline our general product direction. It is intended for informational purposes only, and may not be incorporated into any contract. It is not a commitment to deliver any
More informationGradient An EII Solution From Infosys
Gradient An EII Solution From Infosys Keywords: Grid, Enterprise Integration, EII Introduction New arrays of business are emerging that require cross-functional data in near real-time. Examples of such
More informationFROM DATA STORE TO DATA SERVICES - DEVELOPING SCALABLE DATA ARCHITECTURE AT SURS. Summary
UNITED NATIONS ECONOMIC COMMISSION FOR EUROPE CONFERENCE OF EUROPEAN STATISTICIANS Working paper 27 February 2015 Workshop on the Modernisation of Statistical Production Meeting, 15-17 April 2015 Topic
More informationAddressing Risk Data Aggregation and Risk Reporting Ben Sharma, CEO. Big Data Everywhere Conference, NYC November 2015
Addressing Risk Data Aggregation and Risk Reporting Ben Sharma, CEO Big Data Everywhere Conference, NYC November 2015 Agenda 1. Challenges with Risk Data Aggregation and Risk Reporting (RDARR) 2. How a
More informationWHITE PAPER. Enabling predictive analysis in service oriented BPM solutions.
WHITE PAPER Enabling predictive analysis in service oriented BPM solutions. Summary Complex Event Processing (CEP) is a real time event analysis, correlation and processing mechanism that fits in seamlessly
More informationCreating a Business Intelligence Competency Center to Accelerate Healthcare Performance Improvement
Creating a Business Intelligence Competency Center to Accelerate Healthcare Performance Improvement Bruce Eckert, National Practice Director, Advisory Group Ramesh Sakiri, Executive Consultant, Healthcare
More informationCustomer Insight Appliance. Enabling retailers to understand and serve their customer
Customer Insight Appliance Enabling retailers to understand and serve their customer Customer Insight Appliance Enabling retailers to understand and serve their customer. Technology has empowered today
More informationIntegrating Netezza into your existing IT landscape
Marco Lehmann Technical Sales Professional Integrating Netezza into your existing IT landscape 2011 IBM Corporation Agenda How to integrate your existing data into Netezza appliance? 4 Steps for creating
More informationUS Department of Education Federal Student Aid Integration Leadership Support Contractor June 1, 2007
US Department of Education Federal Student Aid Integration Leadership Support Contractor June 1, 2007 Draft Enterprise Data Management Data Policies Final i Executive Summary This document defines data
More informationSolving Your Big Data Problems with Fast Data (Better Decisions and Instant Action)
Solving Your Big Data Problems with Fast Data (Better Decisions and Instant Action) Does your company s integration strategy support your mobility, big data, and loyalty projects today and are you prepared
More informationReduce and manage operating costs and improve efficiency. Support better business decisions based on availability of real-time information
Data Management Solutions Horizon Software Solution s Data Management Solutions provide organisations with confidence in control of their data as they change systems and implement new solutions. Data is
More information5 Best Practices for SAP Master Data Governance
5 Best Practices for SAP Master Data Governance By David Loshin President, Knowledge Integrity, Inc. Sponsored by Winshuttle, LLC 2012 Winshuttle, LLC. All rights reserved. 4/12 www.winshuttle.com Introduction
More informationIt s about you What is performance analysis/business intelligence analytics? What is the role of the Performance Analyst?
Performance Analyst It s about you Are you able to manipulate large volumes of data and identify the most critical information for decision making? Can you derive future trends from past performance? If
More informationData Discovery, Analytics, and the Enterprise Data Hub
Data Discovery, Analytics, and the Enterprise Data Hub Version: 101 Table of Contents Summary 3 Used Data and Limitations of Legacy Analytic Architecture 3 The Meaning of Data Discovery & Analytics 4 Machine
More informationEnd to End Solution to Accelerate Data Warehouse Optimization. Franco Flore Alliance Sales Director - APJ
End to End Solution to Accelerate Data Warehouse Optimization Franco Flore Alliance Sales Director - APJ Big Data Is Driving Key Business Initiatives Increase profitability, innovation, customer satisfaction,
More informationA Design Technique: Data Integration Modeling
C H A P T E R 3 A Design Technique: Integration ing This chapter focuses on a new design technique for the analysis and design of data integration processes. This technique uses a graphical process modeling
More informationTRENDS IN THE DEVELOPMENT OF BUSINESS INTELLIGENCE SYSTEMS
9 8 TRENDS IN THE DEVELOPMENT OF BUSINESS INTELLIGENCE SYSTEMS Assist. Prof. Latinka Todoranova Econ Lit C 810 Information technology is a highly dynamic field of research. As part of it, business intelligence
More informationSOA Success is Not a Matter of Luck
by Prasad Jayakumar, Technology Lead at Enterprise Solutions, Infosys Technologies Ltd SERVICE TECHNOLOGY MAGAZINE Issue L May 2011 Introduction There is nothing either good or bad, but thinking makes
More informationFive best practices for deploying a successful service-oriented architecture
IBM Global Services April 2008 Five best practices for deploying a successful service-oriented architecture Leveraging lessons learned from the IBM Academy of Technology Executive Summary Today s innovative
More informationVermont Enterprise Architecture Framework (VEAF) Master Data Management Design
Vermont Enterprise Architecture Framework (VEAF) Master Data Management Design EA APPROVALS Approving Authority: REVISION HISTORY Version Date Organization/Point
More informationReport on the Dagstuhl Seminar Data Quality on the Web
Report on the Dagstuhl Seminar Data Quality on the Web Michael Gertz M. Tamer Özsu Gunter Saake Kai-Uwe Sattler U of California at Davis, U.S.A. U of Waterloo, Canada U of Magdeburg, Germany TU Ilmenau,
More informationTHOMAS RAVN PRACTICE DIRECTOR TRA@PLATON.NET. An Effective Approach to Master Data Management. March 4 th 2010, Reykjavik WWW.PLATON.
An Effective Approach to Master Management THOMAS RAVN PRACTICE DIRECTOR TRA@PLATON.NET March 4 th 2010, Reykjavik WWW.PLATON.NET Agenda Introduction to MDM The aspects of an effective MDM program How
More informationA New Foundation For Customer Management
The Customer Data Platform: A New Foundation For Customer Management 730 Yale Avenue Swarthmore, PA 19081 info@raabassociatesinc.com The Marketing Technology Treadmill Marketing automation. Inbound marketing.
More informationOracle BI Applications (BI Apps) is a prebuilt business intelligence solution.
1 2 Oracle BI Applications (BI Apps) is a prebuilt business intelligence solution. BI Apps supports Oracle sources, such as Oracle E-Business Suite Applications, Oracle's Siebel Applications, Oracle's
More informationSTRATEGIC AND FINANCIAL PERFORMANCE USING BUSINESS INTELLIGENCE SOLUTIONS
STRATEGIC AND FINANCIAL PERFORMANCE USING BUSINESS INTELLIGENCE SOLUTIONS Boldeanu Dana Maria Academia de Studii Economice Bucure ti, Facultatea Contabilitate i Informatic de Gestiune, Pia a Roman nr.
More informationEnabling Better Business Intelligence and Information Architecture With SAP Sybase PowerDesigner Software
SAP Technology Enabling Better Business Intelligence and Information Architecture With SAP Sybase PowerDesigner Software Table of Contents 4 Seeing the Big Picture with a 360-Degree View Gaining Efficiencies
More informationBefore You Buy: A Checklist for Evaluating Your Analytics Vendor
Executive Report Before You Buy: A Checklist for Evaluating Your Analytics Vendor By Dale Sanders Sr. Vice President Health Catalyst Embarking on an assessment with the knowledge of key, general criteria
More informationBest Practices in Leveraging a Staging Area for SaaS-to-Enterprise Integration
white paper Best Practices in Leveraging a Staging Area for SaaS-to-Enterprise Integration David S. Linthicum Introduction SaaS-to-enterprise integration requires that a number of architectural calls are
More informationReal World MDM Challenges and Benefits
Real World MDM Challenges and Benefits Gary Bahl, Sr. Manager MDM Product Marketing and Management, Talend, gbahl@talend.com Kumar Ramamurthy, Sr. Director, DWBI and Analytics, Virtusa, kumarr@virtusa.com
More information