White Paper: Data Migration

A White Paper by Bloor Research
Author: Philip Howard
Publish date: May 2011
"Data migration projects are undertaken because they will support business objectives. There are costs to the business if it goes wrong or if the project is delayed, and the most important factor in ensuring the success of such projects is close collaboration between the business and IT." Philip Howard

Free copies of this publication have been sponsored by
Executive introduction

In 2007, Bloor Research conducted a survey into the state of the market for data migration. At that time, there were few tools or methodologies available that were targeted specifically at data migration, and it was not an area of focus for most vendors. As a result, it was not surprising that 84% of data migration projects ran over time, over budget, or both. During the spring of 2011, Bloor Research re-surveyed the market to discover what lessons have been learned since. While a detailed analysis of the survey results will not be available until later in the year, it is encouraging that now only a minority of migration projects are not completed on time and within budget. This paper will discuss why data migration is important to your business and why the actual process of migration needs to be treated as a business issue, what lessons have been learned in the last few years, the initial considerations that organisations need to bear in mind before undertaking a data migration, and best practices for addressing these issues.
Data migration is a business issue

If you're migrating from one application environment to another, implementing a new solution, or consolidating multiple databases or applications onto a single platform (perhaps following an acquisition), you're doing it for business reasons. It may be to save money or to provide business users with new functionality or insights into business trends that will help to drive the business forward. Whatever the reasons, it is a business issue.

Historically, certainly in 2007, data migration was regarded as risky. Today, as our latest results indicate (nearly 62% of projects are delivered on time and on budget), there is much less risk involved in data migration, provided that projects are approached in the proper manner. Nevertheless, there are risks involved. Figure 1 illustrates some of the costs associated with overrunning projects, as attested by respondents in our 2011 survey. Note that these are all directly related to business costs. So, there are risks, and 30% of data migration projects are delayed because of concerns over these and other issues. These delays, which average approximately four months (but can be a year or more), postpone the accrual of the business benefits that are driving the migration in the first place. In effect, delays cost money, which is why it is so important to take the risk out of migration projects.

It is worth noting that companies were asked about the top three factors affecting the success of their data migration projects. By far the most important factor was business engagement, with 72% of organisations quoting this as a top three factor and over 50% stating it as the most important factor. Conversely, over one third of companies quoted lack of support from business users as a reason for project overruns. To summarise: data migration projects are undertaken because they will support business objectives.
There are costs to the business if it goes wrong or if the project is delayed, and the most important factor in ensuring the success of such projects is close collaboration between the business and IT. Whether this means that the project should be owned by the business (treated as a business project with support from IT) or delegated to IT with the business overseeing the project is debatable, but what is clear is that it must involve close collaboration.

Figure 1: Costs to the business of overrunning projects
Lessons learned

Figure 2: Increased use of tools and methodologies

                         Used in 2007   Used in 2011
Data profiling tool           10%            72%
Data cleansing tool           11%            75%
Formal methodology            72%            94%
In-house methodology          76%            41%

As previously noted, there has been a significant reduction in the number of overrunning or abandoned projects. What drove this dramatic shift? There are a number of areas that have changed significantly since 2007, which are summarised in Figure 2. This shows a very significant uptake in the use of data profiling and data cleansing tools, as well as more companies now using a formal methodology. With respect to methodology, there has been a significant move towards formalised approaches supplied by vendors and systems integrators rather than developed in-house. While we cannot prove a causal relationship between these changes and the increased number of on-time and on-budget projects, these figures are, at the very least, highly suggestive. Since these are lessons that have already been learned, we will not discuss them in detail, but they are worth summarising briefly.

Data profiling

Data profiling is used to identify data quality errors, uncover relationships that exist between different data elements, discover sensitive data that may need to be masked or anonymised, and monitor data quality on an ongoing basis to support data governance. One important recommendation is that data profiling be conducted prior to setting your budgets and timescales. This is because profiling enables identification of the scale of the data quality, masking and relationship issues that may be involved in the project, and thus enables more accurate estimation of project duration and cost. Exactly half of the companies using data profiling tools use them in this way, and they are more likely to have projects that run on time and on budget: 68% compared to 55%.
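To make the up-front profiling step concrete, the following is a minimal sketch, not a substitute for a dedicated profiling tool. The record layout and field names are hypothetical; a real project would profile the actual source tables.

```python
# A minimal sketch of pre-migration data profiling: count missing values
# per field and flag potential duplicate records. Field names are
# illustrative assumptions, not taken from any particular system.
from collections import Counter

records = [
    {"id": 1, "name": "Alice", "postcode": "SW1A 1AA"},
    {"id": 2, "name": "Bob", "postcode": None},
    {"id": 3, "name": "Alice", "postcode": "SW1A 1AA"},  # duplicate candidate
]

def profile(rows):
    """Report null counts per field and the number of duplicate candidates."""
    nulls = Counter()
    for row in rows:
        for field, value in row.items():
            if value is None:
                nulls[field] += 1
    # Rows sharing every non-key field are duplicate candidates
    seen = Counter(tuple(sorted((k, v) for k, v in r.items() if k != "id"))
                   for r in rows)
    duplicates = sum(n - 1 for n in seen.values() if n > 1)
    return dict(nulls), duplicates

null_counts, dup_count = profile(records)
print(null_counts)  # → {'postcode': 1}
print(dup_count)    # → 1
```

Even statistics this simple, run before budgets are set, give an early indication of the scale of the cleansing effort the survey respondents describe.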
While the use of data profiling to discover errors in the data is fairly obvious, its use with respect to relationships in the data may not be. Figure 3 (a diagrammatic representation of a business entity) shows a business entity for an estate agent, together with the relevant relationships. If you migrate this data to a new environment, then the relationship between the property and the land registry, for example, needs to remain intact, otherwise the relevant application won't work. Note that this is a business-level representation of these relationships. Data profiling tools need to be able to represent data at this level so that they support use by business analysts and domain experts, as well as in the entity-relationship diagrams used by IT. Ideally, they should enable automatic conversion from one to the other to enable the closest possible collaboration.

Data quality

Data quality tools are used to assure accurate data and to enrich it. It is only in recent years that more attention has been paid to data quality; yet many companies still ignore it. For example, the GS1 Data Crunch report, published in late 2009, compared the product catalogues of the UK's four leading supermarket chains with the same details at their four leading suppliers. It found that 60% of the supermarkets' records were duplicates, resulting in overstocking and wastage for some products and under-stocking and missed sales for others. The total cost of this poor data quality for these four companies over five years was estimated at £1bn. As another, more generic proof point, a Forbes Insights survey published in 2010 stated that data-related problems cost the majority of companies more than $5 million annually; one fifth estimate losses in excess of $20 million per year. Poor data quality can be very costly and can break or impair a data migration project just as easily as broken relationships. For example,
suppose that you have a customer address with a typo such as a 0 (zero) instead of an O. An existing system may accept such errors, but the new one may apply rules more rigorously and not allow them, meaning that you cannot access this customer record. In any case, the end result will be that this customer will never receive your marketing campaign. According to survey respondents in 2011, poor data quality, or lack of visibility into data quality issues, was the main contributor (53.4%) to project overruns where these occurred.

Methodology

Having an appropriate migration methodology is quoted in our survey as a critical factor by more respondents than any other except business engagement. It is gratifying to see that almost all organisations now use a formal methodology; among overrunning projects, more than a quarter blamed the fact that there was no formal methodology in place. In terms of selecting an appropriate methodology for data migration, it should be integrated with the overall methodology for implementing the target business application. As such, the migration is a component of the broader project, and planning for data migration activities should occur as early as possible in the implementation cycle.
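The customer-address example above, a zero typed in place of the letter O, can be sketched as a pre-migration validation check: apply the stricter rules of the target system to the source data before cut-over. The postcode pattern below is a deliberately simplified assumption, not the full UK postcode specification.

```python
# A sketch of pre-migration validation: run target-system format rules
# against source records so that digit-for-letter typos surface before
# cut-over. The regex is a simplified, illustrative UK postcode pattern.
import re

POSTCODE = re.compile(r"^[A-Z]{1,2}[0-9][A-Z0-9]? [0-9][A-Z]{2}$")

def invalid_postcodes(rows):
    """Return the ids of rows whose postcode fails the target-system rule."""
    return [r["id"] for r in rows
            if not POSTCODE.match(r.get("postcode") or "")]

rows = [
    {"id": 1, "postcode": "EC1V 0LL"},  # valid: digit zero in the right place
    {"id": 2, "postcode": "EC1V OLL"},  # letter O typed instead of zero
]
print(invalid_postcodes(rows))  # → [2]
```

An old system might store both rows happily; a new system that validates on load would reject the second, which is exactly the kind of mismatch worth finding during profiling rather than at go-live.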
Pre-migration decisions

There are various options that need to be considered before commencing a project. In particular:

Who will have overall control of the project? Is it the business leading with support from IT, or IT with support from the business?

Who owns the data? Owners will need to be identified for all key data entities, and they will need to be aware of what is expected of them and their responsibilities during the migration project. This is critical: we have previously discussed the importance of business engagement, and it is at this level that it is most important. Selected tools should support business-IT collaboration so that business users can work within the project using business-level constructs and terminology, which can be automatically translated into IT terms. This will assist in ensuring business engagement and will also facilitate agile user acceptance testing.

Will older data be archived rather than migrated to the new system? Almost 60% of the migrations in our most recent survey involved archival of historic data from earlier systems. In most cases, it is less expensive to archive historic data than to migrate and maintain it in the new system, reducing total cost of ownership. It also reduces the scale requirements for the migrated system. A key feature of archiving is that you need to understand relationships in the data (using data profiling), so it often makes sense to combine archiving and migration into a single project. Setting and monitoring retention policies is also a concern, as it is necessary to comply with legal requirements. This monitoring may be included as part of data governance (see next section).

What development processes will be employed? Will it be an agile methodology with frequent testing, or a more traditional approach? Agile approaches should include user acceptance testing as well as functional and other test types; the earlier that mismatches can be caught, the better.
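One cheap check that catches mismatches in the very first iteration is comparing the fields the target schema requires against those the source extract actually provides. The schemas below are hypothetical, purely to illustrate the idea.

```python
# A sketch of an early schema mismatch check: compare the fields an
# (assumed) target schema requires against those present in the source
# extract, so gaps surface in the first agile iteration, not at cut-over.
TARGET_REQUIRED = {"id", "name", "postcode", "email"}  # assumed target schema
source_fields = {"id", "name", "postcode"}             # assumed source extract

missing_in_source = TARGET_REQUIRED - source_fields
extra_in_source = source_fields - TARGET_REQUIRED

print(sorted(missing_in_source))  # fields that must be sourced elsewhere
print(sorted(extra_in_source))    # candidates for archiving, not migration
```

Anything in the first set needs a sourcing decision (the gap analysis discussed later); anything in the second is a candidate for archiving rather than migration.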
Does the project involve any sensitive data? According to our survey results, around 44% of projects involve sensitive data. If sensitive data is an issue, it will need to be discovered (profiling tools can do this) and then masked. In terms of masking, simple scrambling techniques may not be enough. For example, if you scramble a postal code randomly, this will be fine if the application only tests for a correct data format, but if it checks the validity of the code then you will need to use more sophisticated masking algorithms. It must also be borne in mind that relationships may need to be maintained during the masking process, for similar reasons. For example, a patient with a particular disease and treatment cannot be randomly masked, because there is a relationship between diseases and treatments that may need to be maintained. Manual masking methods were used by most companies in our survey (though more than 10% simply ignored the issue, thereby breaking the law), but this will only be appropriate for the simplest masking issues, where neither validity nor relationships are a concern.

How will the project be cut over? Is this to be a "turn the old system off and the new system on" cut-over over a long weekend? If the cut-over is to run over an extended period, best practice suggests that there should be a freeze on changes to the source system(s) during that time. Will a more iterative approach, perhaps with parallel systems running, be employed? Or will a zero-downtime approach be used? If systems are running in parallel after the migration is completed, then criteria need to be defined that determine when the old system can be turned off. Procedures then need to be in place to monitor those criteria so that the old system is turned off as soon as possible. Failure to define appropriate criteria can lead to old systems being maintained for far too long, which is a waste of time, money and resources.

Is the migration to be phased?
For example, if migrating to a new application and a new database, it might be decided to migrate the application first and the database later. Or, if the new environment will host much more detailed data than the current one, it may be best to migrate just the current data and then add the additional data in a second phase. This can be a useful approach if you are migrating between two very different environments.

Who will you partner with? The evidence from our 2011 survey suggests that projects managed using in-house resources are least likely to overrun. If you do not have such internal expertise, then a third
party will be needed to assist in the migration project and to provide a methodology. Results suggest that software vendors are better able to do this than systems integrators (59% with no overruns compared to 52%), perhaps because of their in-depth knowledge of the integration tools that they supply and/or their functional knowledge of target applications. A further consideration will be the degree of knowledge transfer the third party will be able to provide, allowing you to build your own internal expertise to support future projects. In addition, go-live is still a very critical and disruptive event that is often associated with the implementation of a new business application and new processes for the organisation. Experienced third-party support for the change management processes involved can be critical. Results suggest that software vendors with strong consultancy capabilities are better able to provide this than pure-play systems integrators. Other major factors in choosing a partner are experience with mission-critical and enterprise-scale migrations, the ability to drive business-IT collaboration and the change management that may involve, technical expertise with the tools to be used, functional and domain expertise with respect to the business processes involved, and a suitable methodology (as previously discussed).

Data governance

As noted, it is imperative to ensure good data quality in order to support a data migration project. However, it is all too easy to think of data quality as a one-off issue. It isn't. Data quality is estimated to deteriorate at between 1% and 1.5% per month: conduct a one-off data quality project now and in three years you will be back where you started. If investing in data quality to support data migration, plan to monitor the quality of this data on an ongoing basis (see box) and remediate errors as they arise.
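The deterioration estimate compounds month on month, which is why a one-off cleanse does not last. A quick calculation using the survey's own figures:

```python
# Compounding the estimated 1% to 1.5% monthly deterioration over three
# years shows how much of a one-off cleanse survives without ongoing
# monitoring and remediation.
for monthly_rate in (0.01, 0.015):
    remaining = (1 - monthly_rate) ** 36  # 36 months = 3 years
    print(f"{monthly_rate:.1%}/month -> {remaining:.0%} of records still clean")
```

At 1% per month, roughly 70% of records remain clean after three years; at 1.5%, under 60%, broadly consistent with being "back where you started" on a dataset that began well below 100% accuracy.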
In addition to addressing the issues raised in the previous section, consider using data migration as a springboard for a full data governance programme, if one is not in place already. Having appropriate data governance processes in place was the third most highly rated success factor in migration projects in our 2011 survey, after business involvement and migration methodology. While we do not have direct evidence for this, we suspect that part of the reason for this focus is that data governance programmes encourage business engagement. This happens through the appointment of data stewards and other representatives within the business, who become actively involved not only in data governance itself, but also in associated projects such as data migration.

Box: To monitor data quality, bear in mind that 100% accurate data is not obtainable and, even if it were, it would be prohibitively expensive to maintain. It is therefore important to set targets for things such as the number of duplicate records, records with missing data, records with invalid data, and so on. These key performance indicators can be presented by means of a dashboard, together with trend indicators that support efforts towards continuous improvement. Data profiling tools can not only be used to monitor data quality, but also to calculate key performance indicators and present them in this way.

In practice, implementing data governance will mean extending data quality to other areas, such as monitoring violations of business rules and/or compliance (including retention policies for archived data), as well as putting appropriate processes in place to ensure that data errors are remediated in a timely fashion. Response time for remediation can also be monitored and reported within your data governance dashboard. In this context, it is worth noting that regulators are now starting to require ongoing data quality control.
For example, the EU Solvency II regulations for the insurance sector mandate that data should be "accurate, complete and appropriate" and maintained in that fashion on an ongoing basis. The proposed MiFID II regulations in the financial services sector use exactly the same terminology, and we expect other regulators to adopt the same approach. It is likely that we will see the introduction of Sarbanes-Oxley II on the same basis. All of this means that it makes absolute sense to use data migration projects as the kick-off for ongoing data governance initiatives, both for potential compliance reasons and, more generally, for the good of the business.
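The KPI dashboard described in the monitoring box can be sketched as measured rates compared against agreed targets, with a simple trend indicator against the previous period. All figures below are illustrative assumptions.

```python
# A sketch of data quality KPIs: duplicate, missing and invalid-record
# rates compared against agreed targets, with a trend indicator versus
# the previous period. All rates and targets are illustrative.
targets  = {"duplicate_rate": 0.02, "missing_rate": 0.05, "invalid_rate": 0.01}
previous = {"duplicate_rate": 0.04, "missing_rate": 0.05, "invalid_rate": 0.02}
current  = {"duplicate_rate": 0.03, "missing_rate": 0.06, "invalid_rate": 0.01}

for kpi, target in targets.items():
    status = "OK" if current[kpi] <= target else "BREACH"
    trend = "improving" if current[kpi] < previous[kpi] else "worsening"
    print(f"{kpi}: {current[kpi]:.1%} (target {target:.1%}) {status}, {trend}")
```

In practice the measured rates would come from a profiling tool run on a schedule, and the same report would feed the trend indicators on the governance dashboard.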
Initial steps

We have discussed what needs to be considered before you begin. Here, we will briefly highlight the main steps involved in the actual process of migration. There are basically four considerations:

1. Development: develop data quality and business rules that apply to the data, and also define the transformation processes that need to be applied to the data when it is loaded into the target system.

2. Analysis: this involves complete and in-depth data profiling to discover errors in the data and the relationships, explicit or implicit, which exist. In addition, profiling is used to identify sensitive data that needs to be masked or anonymised. The analysis phase also includes gap analysis, which involves identifying data required in the new system that was not included in the old system; a decision will need to be made on how to source this. In addition, gap analysis will be required on data quality rules, because there may be rules that are not enforced in the current system but with which compliance will be needed in the new one. By profiling the data and analysing data lineage, it is also possible to identify legacy tables and columns that contain data but which are not mapped to the target system within the data migration process.

3. Cleansing: data will need to be cleansed, and there are two ways to do this. The existing system can be cleansed and the data then extracted as part of the migration, or the data can be extracted and then cleansed. The latter approach will mean that the migrated system is cleansed, but not the original. This will be fine for a short-duration big-bang cut-over, but may cause problems if systems run in parallel for any length of time. In general, a cleansing tool should be used that supports both business- and IT-level views of the data (this also applies to data profiling) in order to support collaboration and enable reuse of data quality and business rules, thereby helping to reduce cost of ownership.

4. Testing: it is a good idea to adopt an agile approach to development and testing, treating these activities as an iterative process that includes user acceptance testing. Where appropriate, it may be useful to compare source and target databases for reconciliation.

These are the main four elements, but they are by no means the only ones. A data integration (ETL: extract, transform and load) tool, for example, will be needed in order to move the data, and this should also have the collaborative and reuse capabilities that have been outlined. A further consideration is that, whatever tools are used, they should be able to maintain a full audit trail of what has been done. This is important because part of the project may need to be rolled back if something goes wrong, and an audit trail provides the documentation of what precisely needs to be rolled back. Further, such an audit trail provides documentation of the completed project, and it may be required for compliance with regulations such as Sarbanes-Oxley. Other important capabilities include version management, how data is masked, and how data is archived. Clearly it will be an advantage if all tools can be provided by a single vendor running on a single platform.
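The reconciliation mentioned under Testing can be sketched as a comparison of row counts and per-row checksums keyed by a business identifier. Real projects would run this against both databases; in-memory lists stand in here, and the record layout is an assumption.

```python
# A sketch of source/target reconciliation after a migration run: build
# a checksum per record, keyed by business id, and report records that
# are missing from the target or that differ from the source.
import hashlib

def checksum(row):
    """Deterministic checksum of a record, independent of field order."""
    payload = "|".join(f"{k}={row[k]}" for k in sorted(row))
    return hashlib.sha256(payload.encode()).hexdigest()

source = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "Bob"}]
target = [{"id": 1, "name": "Alice"}, {"id": 2, "name": "B0b"}]  # corrupted in flight

src = {r["id"]: checksum(r) for r in source}
tgt = {r["id"]: checksum(r) for r in target}

missing = sorted(set(src) - set(tgt))
mismatched = sorted(k for k in src.keys() & tgt.keys() if src[k] != tgt[k])
print("missing from target:", missing)     # → []
print("checksum mismatches:", mismatched)  # → [2]
```

Any ids reported here feed straight into the audit trail and rollback procedures discussed above: the checksum identifies exactly which records need to be re-migrated.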
Summary

Data migration doesn't have to be risky. As our research shows, the adoption of appropriate tools, together with a formal methodology, has led, over the last four years, to a significant increase in the number of migration projects delivered on time and on budget. Nevertheless, a substantial number of projects (more than a third) still overrun. What is required is careful planning, the right tools and the right partner. However, the most important factor in ensuring successful migrations is the role of the business. All migrations are business issues, and the business needs to be fully involved in the migration before it starts and on an ongoing basis if it is to be successful. As a result, a critical factor in selecting relevant tools will be the degree to which those tools enable collaboration between relevant business people and IT.

Finally, data migrations should not be treated as one-off initiatives. It is unlikely that this will be the last migration, so the expertise gained during the migration process will be an asset that can be reused in the future. Once data has been cleansed as part of the migration process, it represents a more valuable resource than it was previously, because it is more accurate. It will make sense to preserve the value of this asset by implementing an ongoing data quality monitoring and remediation programme and, preferably, a full data governance programme.

Further information

Further information about this subject is available from Bloor Research.
Bloor Research overview

Bloor Research is one of Europe's leading IT research, analysis and consultancy organisations. We explain how to bring greater Agility to corporate IT systems through the effective governance, management and leverage of Information. We have built a reputation for "telling the right story" with independent, intelligent, well-articulated communications content and publications on all aspects of the ICT industry. We believe the objective of telling the right story is to:

- Describe the technology in context to its business value and the other systems and processes it interacts with.
- Understand how new and innovative technologies fit in with existing ICT investments.
- Look at the whole market and explain all the solutions available and how they can be more effectively evaluated.
- Filter "noise" and make it easier to find the additional information or news that supports both investment and implementation.
- Ensure all our content is available through the most appropriate channel.

Founded in 1989, we have spent over two decades distributing research and analysis to IT user and vendor organisations throughout the world via online subscriptions, tailored research services, events and consultancy projects. We are committed to turning our knowledge into business value for you.

About the author

Philip Howard, Research Director - Data

Philip started in the computer industry way back in 1973 and has variously worked as a systems analyst, programmer and salesperson, as well as in marketing and product management, for a variety of companies including GEC Marconi, GPT, Philips Data Systems, Raytheon and NCR. After a quarter of a century of not being his own boss, Philip set up what is now P3ST (Wordsmiths) Ltd in 1992, and his first client was Bloor Research (then ButlerBloor), with Philip working for the company as an associate analyst. His relationship with Bloor Research has continued since that time and he is now Research Director.
His practice area encompasses anything to do with data and content, and he has five further analysts working with him in this area. While maintaining an overview of the whole space, Philip himself specialises in databases, data management, data integration, data quality, data federation, master data management, data governance and data warehousing. He also has an interest in event stream/complex event processing. In addition to the numerous reports Philip has written on behalf of Bloor Research, he also contributes regularly to other publications and was previously the editor of both Application Development News and Operating System News on behalf of Cambridge Market Intelligence (CMI). He has also contributed to various magazines and authored a number of reports published by companies such as CMI and The Financial Times. Away from work, Philip's primary leisure activities are canal boats, skiing, playing bridge (at which he is a Life Master) and walking the dog.
Copyright & disclaimer

This document is copyright © 2011 Bloor Research. No part of this publication may be reproduced by any method whatsoever without the prior consent of Bloor Research. Due to the nature of this material, numerous hardware and software products have been mentioned by name. In the majority, if not all, of the cases, these product names are claimed as trademarks by the companies that manufacture the products. It is not Bloor Research's intent to claim these names or trademarks as our own. Likewise, company logos, graphics or screen shots have been reproduced with the consent of the owner and are subject to that owner's copyright. Whilst every care has been taken in the preparation of this document to ensure that the information is correct, the publishers cannot accept responsibility for any errors or omissions.
2nd Floor, St John Street, LONDON, EC1V 4PY, United Kingdom
Tel: +44 (0) Fax: +44 (0) Web:
InDetail. RainStor archiving
InDetail RainStor archiving An InDetail Paper by Bloor Research Author : Philip Howard Publish date : November 2013 Archival is a no-brainer when it comes to return on investment and total cost of ownership.
White Paper. An Overview of the Kalido Data Governance Director Operationalizing Data Governance Programs Through Data Policy Management
White Paper An Overview of the Kalido Data Governance Director Operationalizing Data Governance Programs Through Data Policy Management Managing Data as an Enterprise Asset By setting up a structure of
Process Streamlining. Whitepapers. Written by A Hall Operations Director. Origins
Whitepapers Process Streamlining Written by A Hall Operations Director So, your processes are established and stable, but are clearly inefficient and you are not meeting your performance expectations.
Best Practices for Log File Management (Compliance, Security, Troubleshooting)
Log Management: Best Practices for Security and Compliance The Essentials Series Best Practices for Log File Management (Compliance, Security, Troubleshooting) sponsored by Introduction to Realtime Publishers
BUSINESS RULES AND GAP ANALYSIS
Leading the Evolution WHITE PAPER BUSINESS RULES AND GAP ANALYSIS Discovery and management of business rules avoids business disruptions WHITE PAPER BUSINESS RULES AND GAP ANALYSIS Business Situation More
Data Conversion for SAP. Using Accenture s load right method to improve data quality from extraction through transformation to load
Data Conversion for SAP Using Accenture s load right method to improve data quality from extraction through transformation to load The importance of data conversion As organizations drive to reduce costs
Inside Track Research Note. In association with. Enterprise Storage Architectures. Is it only about scale up or scale out?
Research Note In association with Enterprise Storage Architectures Is it only about scale up or scale out? August 2015 About this The insights presented in this document are derived from independent research
Discover, Cleanse, and Integrate Enterprise Data with SAP Data Services Software
SAP Brief SAP s for Enterprise Information Management Objectives SAP Data Services Discover, Cleanse, and Integrate Enterprise Data with SAP Data Services Software Step up to true enterprise information
(Refer Slide Time: 01:52)
Software Engineering Prof. N. L. Sarda Computer Science & Engineering Indian Institute of Technology, Bombay Lecture - 2 Introduction to Software Engineering Challenges, Process Models etc (Part 2) This
Red Hat Enterprise Linux: The ideal platform for running your Oracle database
Red Hat Enterprise Linux: The ideal platform for running your Oracle database 2 Introduction 2 Scalability 2 Availability 3 Reliability 4 Manageability 5 Red Hat subscriptions 6 Conclusion www.redhat.com
There s an Easier Way to Adopt Cloud. Leave it to Singtel Cloud Experts. Singtel Business
Singtel Business Product Factsheet Brochure Managed Cloud Expert Defense Services Services There s an Easier Way to Adopt Cloud. Leave it to Singtel Cloud Experts. Singtel Cloud Expert Services is dedicated
Realizing the Benefits of Data Modernization
February 2015 Perspective Realizing the Benefits of How to overcome legacy data challenges with innovative technologies and a seamless data modernization roadmap. Companies born into the digital world
0845 345 3300 [email protected] www.theaccessgroup.com THE FINANCIAL FRONTIER
0845 345 3300 [email protected] www.theaccessgroup.com THE FINANCIAL FRONTIER Executive summary Many firms use ERP-derived stock modules to manage inventory levels and positioning. Unfortunately
Report on a deposit licence for E-prints
!74 0.94.:2039 Report on a deposit licence for E-prints Circulation: PUBLIC Gareth Knight Arts & Humanities Data Service Notes 1. For the purposes of this document, an e-print is defined as an electronic
sdsys THAGORAS SCISYS UK LTD The National Archives Customer Relationship Management System and Integrated Email Marketing Solution RESPONSE TO TENDER
sdsys THAGORAS SCISYS UK LTD The National Archives Customer Relationship Management System and Integrated Email Marketing Solution RESPONSE TO TENDER The National Archives Redacted under! IFOI exemption
Eight key steps which help ensure a successful data migration project: A white paper for inspection management professionals
Eight key steps which help ensure a successful data migration project: A white paper for inspection management professionals Data migration defined Data migration is the selection, preparation, extraction,
Making Information Governance a Reality for Your Organization Maximize the Value of Enterprise Information
SAP Thought Leadership Paper Information Governance Making Information Governance a Reality for Your Organization Maximize the Value of Enterprise Information Table of Contents 6 The Importance of Information
Modernizing enterprise application development with integrated change, build and release management.
Change and release management in cross-platform application modernization White paper December 2007 Modernizing enterprise application development with integrated change, build and release management.
Enterprise Data Quality
Enterprise Data Quality An Approach to Improve the Trust Factor of Operational Data Sivaprakasam S.R. Given the poor quality of data, Communication Service Providers (CSPs) face challenges of order fallout,
Getting started with a data quality program
IBM Software White Paper Information Management Getting started with a data quality program 2 Getting started with a data quality program The data quality challenge Organizations depend on quality data
PROJECT MANAGEMENT FRAMEWORK
PROJECT MANAGEMENT FRAMEWORK DOCUMENT INFORMATION DOCUMENT TYPE: DOCUMENT STATUS: POLICY OWNER POSITION: INTERNAL COMMITTEE ENDORSEMENT: APPROVED BY: Strategic document Approved Executive Assistant to
8 Critical Success Factors When Planning a CMS Data Migration
8 Critical Success Factors When Planning a CMS Data Migration Executive Summary The first step to success. This paper is loaded with critical information that will promote the likelihood of your success
Case Management 101: 10 Things You Must Know About Case Management BUILD FOR CHANGE
Case Management 101: 10 Things You Must Know About Case Management BUILD FOR CHANGE Executive Summary In our connected world, there is no tolerance for disconnected approaches to work. This is true as
!!!!! White Paper. Understanding The Role of Data Governance To Support A Self-Service Environment. Sponsored by
White Paper Understanding The Role of Data Governance To Support A Self-Service Environment Sponsored by Sponsored by MicroStrategy Incorporated Founded in 1989, MicroStrategy (Nasdaq: MSTR) is a leading
SAP White Paper Enterprise Information Management
SAP White Paper Enterprise Information Management Including Business Content in a Comprehensive Information Management Program Enhance Efficiency and Compliance Through Process-Centric Information Management
Spotlight. Log and Event Management
Spotlight Log and Event Management A Spotlight Paper by Bloor Research Author : Philip Howard Publish date : December 2009 It makes sense to treat event management and log management as two sides of the
InDetail. Grid-Tools Test Data Management
InDetail Grid-Tools Test Data Management An InDetail Paper by Bloor Research Author : Philip Howard Publish date : March 2011 As far as we know, Grid-Tools is the only specialist vendor in this space.
The ROI of Data Governance: Seven Ways Your Data Governance Program Can Help You Save Money
A DataFlux White Paper Prepared by: Gwen Thomas The ROI of Data Governance: Seven Ways Your Data Governance Program Can Help You Save Money Leader in Data Quality and Data Integration www.dataflux.com
IBM ediscovery Identification and Collection
IBM ediscovery Identification and Collection Turning unstructured data into relevant data for intelligent ediscovery Highlights Analyze data in-place with detailed data explorers to gain insight into data
Choosing the Right Master Data Management Solution for Your Organization
Choosing the Right Master Data Management Solution for Your Organization Buyer s Guide for IT Professionals BUYER S GUIDE This document contains Confidential, Proprietary and Trade Secret Information (
Digital Continuity in ICT Services Procurement and Contract Management
Digital Continuity in ICT Services Procurement and Contract Management This guidance relates to: Stage 1: Plan for action Stage 2: Define your digital continuity requirements Stage 3: Assess and manage
Management Update: CRM Success Lies in Strategy and Implementation, Not Software
IGG-03122003-01 D. Hagemeyer, S. Nelson Article 12 March 2003 Management Update: CRM Success Lies in Strategy and Implementation, Not Software A customer relationship management (CRM) package doesn t ensure
Best Practices in Enterprise Data Governance
Best Practices in Enterprise Data Governance Scott Gidley and Nancy Rausch, SAS WHITE PAPER SAS White Paper Table of Contents Introduction.... 1 Data Governance Use Case and Challenges.... 1 Collaboration
Data Audit Solution. Data quality visibility in 5 days for improving corporate performance. TABLE OF CONTENTS. Purpose of this white paper
Solution Data quality visibility in 5 days for improving corporate performance. Purpose of this white paper This white paper describes the BackOffice Associates engagement and the increasing importance
GOVERNANCE AND MANAGEMENT OF CITY COMPUTER SOFTWARE NEEDS IMPROVEMENT. January 7, 2011
APPENDIX 1 GOVERNANCE AND MANAGEMENT OF CITY COMPUTER SOFTWARE NEEDS IMPROVEMENT January 7, 2011 Auditor General s Office Jeffrey Griffiths, C.A., C.F.E. Auditor General City of Toronto TABLE OF CONTENTS
The importance of selecting the right ERP solution
The importance of selecting the right ERP solution The benefits of selecting and successfully implementing the right ERP solution for your business are widespread. The right ERP solution, tailored to suit
Specialist Cloud Services Lot 4 Cloud EDRM Consultancy Services
Specialist Cloud Services Lot 4 Cloud EDRM Consultancy Services Page 1 1 Contents 1 Contents... 2 2 Transcend360 Introduction... 3 3 Service overview... 4 3.1 Service introduction... 4 3.2 Service description...
Data Migration for Legacy System Retirement
September 2012 Data Migration for Legacy System Retirement A discussion of best practices in legacy data migration and conversion. (415) 449-0565 www.gainesolutions.com TABLE OF CONTENTS The Importance
REFINING YOUR BUSINESS MODEL
Stage 4: Piloting Theme 4: Business Model REFINING YOUR BUSINESS MODEL Introduction As you build a track record of operation and impact, there will be an ongoing iterative refinement of your business model.
