White Paper. Agile data management with X88


Agile data management with X88
A White Paper by Bloor Research
Author: Philip Howard
Publish date: June 2011

This paper is a call for more forward thinking from data management practitioners and vendors, and for a move towards a more agile stance in the development of data management rules. One of the few companies that has recognised this deficiency and responded to it is X88.
Philip Howard

Introduction

This paper does not concern itself with the benefits of using data management (data quality, data integration, data governance, data enrichment, data transformation, test data management and so on) tools and processes. That is assumed. It is also assumed that you understand that these are not one-off exercises but ongoing requirements. Further, we are going to make the additional assumption that you realise the benefits of automating as much of your data management processes as possible. This is on the basis that the more that is automated, the less manual intervention will be required, which is both faster and more cost-effective. Of course, such processes cannot be entirely automated, but one would like to do as much as one can.

The way that you automate data management processes depends on the type of management under consideration and the type of tool in use. Here we need to make a distinction between test data management on the one hand and data quality and data integration processes on the other. If we consider agile development in general, the reason why it has not been as successful as it might have been is precisely because development may have been agile but the data to support it has not. During application development, data rules and data models often change, and when they do you need to re-generate the relevant test data. In practice, using traditional techniques, a typical lifecycle for such re-generation is measured in weeks when you would really like it to be measured in hours. The issue of agile data management with respect to test data is, precisely, the data itself, and the automation of the re-generation of that data.

In so far as data quality and data integration tools are concerned, these are either rules-based or semantic. The former dominate the market and it is with these that this paper is concerned and, since it is the rules themselves that provide whatever degree of automation is offered, it is on the creation and subsequent maintenance of these rules that this paper is focused.

In our view, not enough attention has been paid to rule definition and management by (most) vendors in this space. In particular, we will argue that an agile approach to rules development should be adopted, for precisely the same reasons that it makes sense to use an agile approach to development in general: it is more productive to test early and often rather than at the end of a development project. If you catch an error early on, it is much easier and quicker to rectify it at that stage than after you have completed development, when there may be other complications caused by that error. In particular, the practice of testing early and often includes user acceptance testing, which prevents the sort of specification mismatch that can occur when what the user meant wasn't actually reflected in the specification and/or wasn't correctly interpreted by the developer or, worse, the user was mistaken about what was needed.

In other words, we are arguing that the development of data management rules should be treated in exactly the same way as any other development project. Further, it is our view that an agile development methodology is particularly well suited to both data quality and data integration environments. This is because users should be heavily involved in the project in any case, either because they are sponsoring the project directly or because it is part of a larger data governance initiative, data migration or data warehousing project. This paper will therefore discuss the issues raised by data management rules and their development, which are actually common to rules-based environments in general, and consider potential solutions. We will also discuss X88's Pandora product, with particular reference to how that product supports data prototyping in an agile environment.

The issues with rules

What does a data management rule do? It takes data from one or more sources, manipulates it and then puts the result somewhere. The key is in the manipulation: you may improve values through data quality processes, you may derive values to load into an OLAP cube, you may enrich the data with geo-spatial coordinates, you may apply complex business logic, you may look for correlations between different pieces of data; you may, in fact, do almost anything.

Clearly, different types of data management tools have widely different inbuilt capabilities. Data integration tools, for example, typically come with a variety of pre-built functions so that you can add, concatenate and apply statistical functions to incoming data. However, these are not rules but merely functions that can be used to help in the development of business rules. Data quality tools, on the other hand, are typically delivered with various out-of-the-box rules that can be applied generically and which do not need to be developed. For example, you might have a rule that a credit card number has to be of the format xxxx xxxx xxxx xxxx where each x is a number between 0 and 9. Similarly, you might have pre-built rules about the validity of postal codes. There is no problem with rules such as these: credit card formats are unlikely to change and, as long as you keep up to date with PAF (postal address file) data, post code validation should be automatic.

However, when it comes to rules that are specific to your business, issues can become much more complex. In some cases you may be able to customise rules that are provided out-of-the-box by the vendor, but in many others you will have to build appropriate rules from scratch. As a general principle, we can say that you will always have to develop appropriate rules for data integration tasks, and you will almost always have to do so (at least for any non-trivial tasks) when using data quality tools.

As an illustration of this, consider the following real-life example from the healthcare sector. The organisation involved is owned by nearly 200 non-profit hospital and healthcare systems servicing more than 2,000 hospitals and over 53,000 other healthcare sites across the United States. Its primary mission is to drive down healthcare costs while improving the quality of patient care. What the company wants to do is improve its supply chain management. In order to do this it needs to cleanse and de-duplicate its supplier master records (not simple: 3M, for example, has more than 200 valid variations of its name, including subsidiaries), cleanse and de-duplicate its product records (very complex, because the same product can be supplied by different distributors with different product codes and different descriptions), and extract relevant cleansed data from source systems, transform it and then load that data into the data warehouse for analysis.

At the same time, we can assume that the company in question has recognised that simply fixing its data quality issues as a one-off project is not enough and has implemented a data governance project across the enterprise. This involves, amongst other things, monitoring both data quality and compliance with relevant regulations. The results are reported within a dashboard that displays appropriate KPIs (key performance indicators). Where do these KPIs come from? They might, as an example, represent weighted aggregations of data quality metrics. In other words, you gather raw data quality scores and then apply relevant business rules to these figures in order to calculate the KPIs.
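To make this concrete, the following is a minimal sketch of a KPI calculated as a weighted aggregation of rule-level quality scores. The rule names, weights and scores are all hypothetical; this illustrates the technique rather than any particular product's implementation.

```python
# Minimal sketch: a dashboard KPI computed as a weighted aggregation
# of raw, rule-level data quality scores. All names, weights and
# scores below are hypothetical.

# Fraction of records passing each data quality rule.
rule_scores = {
    "supplier_name_standardised": 0.92,
    "product_code_deduplicated":  0.85,
    "postcode_valid_against_paf": 0.97,
}

# Business-defined weights reflecting each rule's relative importance.
rule_weights = {
    "supplier_name_standardised": 0.5,
    "product_code_deduplicated":  0.3,
    "postcode_valid_against_paf": 0.2,
}

def quality_kpi(scores, weights):
    """Weighted average of rule scores, expressed as a percentage."""
    weighted = sum(scores[rule] * weights[rule] for rule in scores)
    return 100.0 * weighted / sum(weights.values())

print(f"Supplier data quality KPI: {quality_kpi(rule_scores, rule_weights):.1f}%")
```

The weights themselves are business rules: changing the relative importance of supplier name standardisation against postcode validity changes the KPI, which is exactly why a domain expert needs to be involved in defining them.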
Or, as another example from within the healthcare sector, you might need to monitor that sensitive patient data is being appropriately masked or redacted so that unauthorised persons cannot view that data. Here again, you would need to define relevant business rules, monitor actual practice and report the results in a KPI. In other words, there is a whole series of sets of business rules that need to be developed and tested, ranging from traditional data quality rules through transformation rules to the calculation of KPIs. Moreover, these data management rules, whether for data quality or data integration or for monitoring data governance or compliance, can be highly complex. Further, it is only a business analyst or domain expert that is likely to understand what rules are appropriate to the business but, and here is the rub, you will likely need a developer to actually build the rule.

This leads us on to the issue of exactly how you develop data quality, monitoring, transformation and other data management rules. In practice, development follows a traditional paradigm: a specification from the business, followed by development using the tool in question, followed by acceptance testing.

Bear in mind that we are not just talking about one-off rules but perhaps thousands. Moreover, there may be relationships within the data that need to be maintained: if you have accounts with both Federal Express and FedEx, you may want to match and then merge these, but you also need to ensure that any related orders (both current and historic) still retain their relationships with the merged entity.

The problem with this approach to development is twofold. Firstly, there are the commonplace issues that bedevil all specifications: business and IT do not fully understand each other, typically speaking different (technical) languages; the business changes its mind; both sides can make errors and mistakes; and there is often ambiguity in the specification. Secondly, if there is a specification mismatch, you won't find out about it until the very end of the process, when you perform acceptance testing, which is the most expensive time to find out. To put this another way: this is not an agile approach.

A further problem is that when you have thousands or tens of thousands of data quality rules, or any sort of rules for that matter, maintenance of those rules can become an issue: you can end up spending so much resource maintaining existing rules that you have little or no time to develop new rules, which may mean, for example, that you cannot extend your data management processes to new business domains as fast as the organisation would like. It is therefore important that the environment enables rule maintenance in as easy-to-use a manner as possible, as well as providing an optimal environment for the development of new rules.
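Returning to the Federal Express/FedEx example above, the sketch below shows, using an invented record layout, what merging duplicate entities while preserving their order relationships involves: every related order is re-pointed at the surviving record before the duplicate is removed.

```python
# Minimal sketch: merging duplicate account records while keeping
# related orders attached to the surviving entity. The record layout
# and the matching decision are invented for illustration.

accounts = {
    1: {"name": "Federal Express"},
    2: {"name": "FedEx"},          # same real-world customer as 1
}
orders = [
    {"order_id": "A-100", "account_id": 1},
    {"order_id": "A-101", "account_id": 2},
]

def merge_accounts(survivor_id, duplicate_id, accounts, orders):
    """Merge the duplicate into the survivor, re-pointing all related
    orders first so that current and historic orders retain their
    relationship with the merged entity."""
    for order in orders:
        if order["account_id"] == duplicate_id:
            order["account_id"] = survivor_id
    del accounts[duplicate_id]

# A matching rule (decided by hand here) has identified 1 and 2 as
# the same account; merge them and verify no order was orphaned.
merge_accounts(1, 2, accounts, orders)
assert all(o["account_id"] in accounts for o in orders)
```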
Next, there is the question of where the development takes place. You develop new rules within whatever tool you are using, but what do you test them against? The live data? That would be dangerous and, besides, it would put additional strain on the operational system, which would be counter-productive. Alternatively, you may take a copy of the operational system, or a subset of it, as you would do in conventional testing environments. Unfortunately, this has other consequences: it means that you will need extra licenses for the development and test systems, as well as extra hardware to run these on.

Finally, there is the question of sensitive data: who is allowed to see personal information (for example) within the data? If developers and/or testers are not permitted to see this, then you will need to mask the test data in some way in order to comply with data privacy regulations. Of course, the very transformations that you build into data quality rules can be used to mask data, but there remains the question of who tests the tester, so to speak. In any case, it is not as simple as replacing the 16 digits in a credit card number with an x; we will discuss this further in due course.

To summarise: we do not believe that most companies (and vendors) think of building data management rules as analogous to developing other types of application. We think they should: data rules and transformations suffer from exactly the same sort of issues as more generalised development, and the same sort of answers apply. Specifically, there are three major considerations:

1. An environment that supports the testing, development and validation of the rules and processes under consideration and, in particular, which supports early-stage user acceptance testing. Needless to say, this should be collaborative, and as easy to use and productive as possible.

2. The creation, generation or abstraction of a dataset that can be used as a test bed for the rules or processes that are of interest, and which can be re-generated or re-created, as required, in an agile manner. By preference, the generation or re-generation of this dataset should have as little impact on operational systems as possible.

3. Relevant output. If you are developing transformation rules for data integration, as an example, then you would like the output either to generate all the code or to integrate directly with the data integration tool in question, so that all relevant transformation processes can be generated automatically by that tool; or, if you want a vendor-neutral approach, you would like to generate a complete, documented specification that can be used to create the relevant workflows required to support the data integration process.

In the following sections we will discuss each of these requirements.

Agile development

Aside from the facilities provided by data management vendors themselves (which, as we have discussed, tend to be locked into a traditional and non-agile approach), the principal alternative approach is known as data prototyping. This is complementary with, rather than competitive to, the various data quality, integration and governance solutions that are available in the market, though it could also be profitably used where no such solutions are in place.

Data prototyping

While we might argue with the following definition, according to Wikipedia a data prototype is a form of functional or working prototype. It states that the justification for its creation is usually a data migration, data integration or application implementation project, and that the raw materials used as input are an instance of all the relevant data which exists at the start of the project. Wikipedia goes on to say that the objectives of data prototyping are to produce:

A set of data cleansing and transformation rules which have been seen to produce data that is all fit for purpose.

A dataset that is the result of those rules being applied to an instance of the relevant raw (source) data.

To achieve this, a data architect uses a graphical interface to interactively develop and execute transformation and cleansing rules against raw data. The resultant data is then evaluated and the rules refined. Beyond the obvious visual checking of the data on-screen by the data architect, the usual evaluation and validation approaches are to use data profiling software and then to insert the resultant data into a test version of the target application and trial its use.

In other words, this offers an agile approach to the development of data quality rules and transformations: build a rule, test it, refine it, test it again and so on, and get users involved in acceptance testing whenever it is appropriate. Perhaps the most important aspect is that this is built on top of data profiling capabilities. This is fundamental for data integration, as you must understand the relationships (Federal Express and its orders) that exist within the data in order to be able to satisfactorily build a prototype and incrementally test it.

However, there is a secondary issue: most data profiling tools profile the live data in situ. This is not practical in a development environment, because you would continually degrade the performance of the operational system by re-profiling it. This is why data quality providers do not typically offer prototyping: there would be too big a performance overhead on live systems. So this approach will only be practical if the profiling software takes a copy of the relevant data and metadata when it reads it for the first time. Such an approach means that all subsequent analysis has no impact on operational systems and, moreover, such analyses should perform better in their own right. The other point, of course, is that you do not actually need live data to test against: just representative data. We will discuss this further in due course.

In the same way that agile development allows a programmer to validate their work as they develop, prototyping allows the incremental validation of the specification of rules, during their development, by the subject matter experts. This is achieved by interactively studying and even profiling the data that results from the application of each (part of a) rule.
In the case of data integration projects, the final validation step is to load the data from the data prototype into the target application to ensure that it actually works: after all, the analyst could have made a mistake. This validation of a result by subject matter experts, rather than of an English specification or pseudo-code, is fundamentally different from the current approach and can bring the enormous time and effort savings alluded to in the introduction to this paper.
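As a minimal sketch of this build-test-refine cycle, assuming a one-off snapshot of source values so that repeated testing never touches the operational system, the loop below applies successive versions of a cleansing rule and lets a reviewer inspect the distinct results of each iteration. The rule logic and data are invented.

```python
import re

# One-off snapshot of raw source values (invented), so that repeated
# testing puts no load on the operational system.
snapshot = ["FedEx", "Federal Express", "FEDERAL EXPRESS CORP.", "Fedex Corp"]

def rule_v1(value):
    """First cut: upper-case and strip punctuation."""
    return re.sub(r"[^\w\s]", "", value).upper().strip()

def rule_v2(value):
    """Refinement after reviewing v1's output: also collapse the
    observed name variations to a single standard form."""
    cleaned = rule_v1(value)
    if cleaned.startswith(("FEDEX", "FEDERAL EXPRESS")):
        return "FEDERAL EXPRESS"
    return cleaned

# Build, test, inspect, refine: each version is run against the same
# snapshot and its distinct outputs are reviewed on-screen.
for version, rule in [("v1", rule_v1), ("v2", rule_v2)]:
    print(version, "->", sorted({rule(v) for v in snapshot}))
# v1 still leaves four variants; v2 collapses them to one standard name.
```

Because the snapshot never changes between iterations, each refinement can be judged against exactly the same evidence, which is what makes the cycle fast enough to involve subject matter experts directly.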

Agile data

In order to test application software (and we can think of data management rules as a form of application software) you need to have data to test against. The traditional ways of achieving this are to copy the operational database (often one copy for development and another for testing) or to subset the database. The problem with the former, and to a lesser extent the latter, is that it is expensive in terms of license costs and additional hardware support. Furthermore, the actual process of taking a copy of the database can itself cause a problem. In a data migration, for example, the schema for the new database may change during development, which effectively means that you need a new copy of the data to test against. This can seriously slow down the agility of the environment if, for example, it takes two weeks (which is fairly typical) to get a new copy of the database.

Another problem with taking a database copy is that the current data in the database will not exhibit all the potential errors that you might want to process through your rules: many outliers (which are the most interesting from a testing perspective) will only be present occasionally. This problem is exacerbated, of course, if you subset the data, because that process can eliminate even more outliers. However, by using all the data, rule development and validation can at least take account of all the actual variations in the data rather than trying to second-guess what might be in there.

Finally, if data within your dataset is sensitive, then it may need to be masked. However, this is not as simple as it may appear. For example, you could simply hide a credit card number by replacing each digit with an x, which will be fine if you are only concerned with data protection. However, if you want to test a payment application, then you will need to work with real (pseudo-)numbers in order to test your applications. Similarly, simple shuffling techniques (for example, replacing a zip code with 54321) will not work if your application requires a valid zip code. So, you will need to mask in such a way that the data remains valid.

Further, it may not simply be a question of identifying what data needs to be masked and then hiding it. This is because you need to ensure that data relationships remain intact during the masking process, as otherwise testing may break down. This will, of course, be dependent on the application, but in complex environments it can be critical. For example, a patient has a disease, which has a treatment, which has a consulting physician, who practices in a particular hospital and uses a designated operating theatre. If you scramble the data so that a patient with flu ends up having open heart surgery, then your software may break down simply because your masking routines have not ensured that important relationships remain intact. So, discovery of these relationships may be essential, and this falls within the domain of data profiling and, by inference, data prototyping.

There is another way, which is to generate synthetic data. This requires an understanding of the data model underlying the database(s) in question, which will typically be based on the database schema(s), complemented by relationships discovered through use, again, of data profiling. The two main advantages of this approach are that the data does not need to be masked, and that synthetic data can be re-generated very quickly (typically in a matter of hours rather than weeks) when required. There is also a third potential benefit: if you have a complete understanding of the data model and all its associated business rules, then you should be able to generate a dataset that includes all potential outliers. However, this is dependent on the thoroughness of your data model, the completeness of your data profiling, and the accuracy of your understanding of the processes involved. This can never be guaranteed to be 100%, so using synthetic data together with data prototyping may well be a successful marriage because, with data prototyping, you are effectively going beyond the analysis by trialling the transformed data within the new application to see if it actually works.
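As a minimal sketch of masking that keeps data valid, the code below generates a replacement card number with a correct Luhn check digit, so that a payment application's format validation would still pass; x-ing out the digits would not achieve this. The helper names are our own invention.

```python
import random

def luhn_check_digit(payload):
    """Compute the Luhn check digit for a string of digits, so the
    masked number passes standard card-number validation."""
    total = 0
    for i, ch in enumerate(reversed(payload)):
        d = int(ch)
        if i % 2 == 0:   # every second digit from the right is doubled
            d *= 2
            if d > 9:
                d -= 9
        total += d
    return (10 - total % 10) % 10

def mask_card_number(card_number, rng=random):
    """Replace a real card number with a random but format-valid one.
    The original value is discarded outright rather than transformed,
    so nothing about the real number leaks into the test data."""
    payload = "".join(str(rng.randint(0, 9)) for _ in range(len(card_number) - 1))
    return payload + str(luhn_check_digit(payload))

masked = mask_card_number("4111111111111111")
assert len(masked) == 16 and masked.isdigit()
print("masked card:", masked)
```

The same principle extends to the relationship problem described above: whatever substitution is applied to a patient, disease or physician must be applied consistently wherever that value appears, so that the links between records survive the masking.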

Agile output

We have little to add to the brief description provided above. If we assume that you are using data prototyping, then you would like it either to generate code, or to be integrated with relevant data quality and data integration platforms, or to generate a specification. In the last case, the word specification really does not do justice to what has been produced: what you have is a fully tested (including acceptance testing) and complete document describing precisely what needs to be done. Given that classical specification-based approaches frequently lead to misunderstandings and failures, calling this sort of output a specification does it a disservice.

In addition, as is sometimes the case in industry, the act of building a prototype may be all that is required to answer a question, prove a point or evaluate feasibility. The actual data produced by the data prototype may itself be the deliverable, especially if the requirement is a one-off. As a result, the output of an agile approach to data rule development could simply be the finding that further phases of the project are not necessary.

X88 Pandora

Pandora, from X88 Software, is a data management product which exists in three versions:

Data Discovery and Profiling Edition
Data Integration Prototyping Edition
Data Quality Management Edition

In fact, this is somewhat misleading, as the three editions are not different products, as their names might suggest, but supersets of one another, with the Data Quality Management Edition being the broadest in terms of functionality. In other words, both the Data Integration Prototyping and Data Quality Management editions include prototyping capabilities, and both depend on the data discovery and profiling capabilities of the entry-level product.

Data discovery and profiling is at the heart of the product. We do not intend to discuss this in detail. However, where Pandora differs from most profiling products in the market is that it copies the data from the system or systems being analysed into its own repository. This has the advantage that the profiling, relationship discovery, and dependency and key analysis you perform have no impact on source systems. Furthermore, you can potentially do a lot more with the information you retrieve when working with your own copy of the data rather than relying on operational systems. In particular, it is this that enables data prototyping.

Specifically, the two more advanced editions of Pandora include a graphical rule builder and automatic mapping report generation, as well as a business glossary (see Figure 1) and support for the definition of user-defined functions. The glossary is important because it supports collaboration between business analysts and developers, while user-defined functions allow you to define your own functions if the 300+ functions included within Pandora are not sufficient. However, the main point is that instead of defining rules (which may be data quality, data integration or transformation rules) in a spreadsheet or in a document, which is the typical approach adopted in most organisations, you can design these within the graphical rule builder and then test them, within the same environment, to see if they work in the way that you expect.

Figure 1: Pandora Business Glossary options

The graphical rule builder uses a point-and-click interface that does not require any programming or scripting, and does not require knowledge of SQL. Indeed, the repository is not relational and does not have the limits that SQL can impose on certain types of functions. Major capabilities include:

Joins: you can join data based on the relationships discovered during profiling. Any type of multi-system join is available, all of which are performed within the Pandora repository and without reference back to source systems. One specific feature worth noting is the dynamic table union function. This allows you to create the union of multiple source tables even when these are dissimilar. This is particularly useful in supporting MDM implementations, for example.

Sorts: the product supports single and multi-column sorting as well as reverse sorts. In particular, you can sort on a sub-value within a datatype.

Filters: data can be filtered via fuzzy matching, value-contains options, pattern matching, and range and datatype tests.

Grouping: Pandora supports multi-column group-by operations on both native and calculated or derived values. This is often combined with aggregation functions to provide record counts and value summations.
Other pre-built functions include support for aggregations (see Figure 2) and look-ups from reference tables, amongst others.

Figure 2: Aggregation options in Pandora
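Since Pandora's rule builder is graphical rather than code-based, the following is only an illustration, in ordinary Python over invented tables, of the join/filter/group-and-aggregate style of rule logic that these capabilities support; it is not Pandora's API.

```python
from collections import defaultdict

# Invented source tables. This sketch only illustrates the join /
# filter / group-and-aggregate style of rule logic discussed above;
# it is not Pandora's interface, which is graphical, not code-based.
suppliers = [
    {"supplier_id": 1, "name": "3M"},
    {"supplier_id": 2, "name": "3M Health Care"},   # subsidiary variant
    {"supplier_id": 3, "name": "Acme Supplies"},
]
invoices = [
    {"supplier_id": 1, "amount": 1200.0},
    {"supplier_id": 2, "amount": 350.0},
    {"supplier_id": 2, "amount": 80.0},
    {"supplier_id": 3, "amount": 45.0},
]

# Join: attach supplier names to invoices via the discovered key.
name_by_id = {s["supplier_id"]: s["name"] for s in suppliers}
joined = [{"name": name_by_id[i["supplier_id"]], "amount": i["amount"]}
          for i in invoices]

# Filter: a simple 'value contains' test on the supplier name.
filtered = [row for row in joined if "3M" in row["name"]]

# Group and aggregate: record counts and value summations per name.
totals = defaultdict(lambda: {"count": 0, "total": 0.0})
for row in filtered:
    totals[row["name"]]["count"] += 1
    totals[row["name"]]["total"] += row["amount"]

for name, agg in sorted(totals.items()):
    print(name, agg)
```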

In practice, once you have defined your rules, tested them and confirmed that they do what you want (which will, no doubt, be an iterative process), you are ready to generate your mappings. This is done automatically for you by the product, which takes into account all the rules and functions you have deployed and automatically generates the optimal mapping to meet those requirements. Note that this is illustrated (see Figure 3) using business terminology, in order to support business/IT collaboration and the sort of agile approach we have advocated in this paper.

Figure 3: Business-friendly mapping diagram

To do this, Pandora uses metadata (stored in its repository) matching. It uses exact field-name matches in the first instance, then a series of fuzzy matching rules to ascertain the most likely target fields for the available source fields. Where there are multiple opportunities for mapping, it provides the user with a list of potential candidates from which to choose. The auto-mapper validates all of the mappings to ensure that the source and target are compatible in terms of datatype, length, nullability and syntax rules. Inconsistencies are highlighted visually.
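As an illustration of the exact-then-fuzzy matching strategy just described (and not X88's actual implementation), the sketch below proposes target candidates for each source field using Python's standard difflib, with invented field names.

```python
import difflib

# Invented source and target field names; this illustrates the
# exact-match-first, fuzzy-match-second strategy described above,
# not X88's actual matching implementation.
source_fields = ["CUST_NAME", "CUST_ZIP", "ORDER_DT", "PROD_CODE"]
target_fields = ["CUST_NAME", "CUSTOMER_ZIP", "ORDER_DATE", "PRODUCT_CODE"]

def propose_mappings(sources, targets, cutoff=0.5):
    """Map each source field to its most likely targets: exact
    field-name matches first, then ranked fuzzy candidates."""
    mappings = {}
    for field in sources:
        if field in targets:
            mappings[field] = [field]    # exact match wins outright
        else:
            # ranked list of candidates for the user to choose from
            mappings[field] = difflib.get_close_matches(
                field, targets, n=3, cutoff=cutoff)
    return mappings

for src, candidates in propose_mappings(source_fields, target_fields).items():
    print(src, "->", candidates)
```

Presenting the fuzzy matches as a ranked candidate list, rather than committing to one, mirrors the behaviour described above: the user makes the final choice where the metadata alone is ambiguous.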

The key point here is that you can now just load the resulting data generated from your mappings into the target application and see how it runs, for acceptance testing purposes. Once you are happy with the results, you can use Pandora to generate, again automatically, a mapping specification document that, together with the associated data, marks a clear handover from Design to Build, and an objective sign-off for the Build when that is completed. Best practice will be to provide the Build team with the Pandora-generated specification document (an example of which is illustrated in Figure 4) and the same source data used to develop it. If the process developed by the Build produces output data that is identical to the Pandora data prototype, then the Build can be signed off; otherwise, it does not conform to the specification. This makes managing outsourced and similar developments much simpler than is typically the case. Note that, since the data prototype uses all the source data, most test cases will be catered for by default.

Figure 4: Pandora mapping report

Bear in mind that the mapping specification generated and passed on to the Build team is much more than a traditional mapping document or specification, because you know that it will work, thanks to the testing and design work that you have already done. The only thing that we could wish for is the ability to import the mappings directly into a relevant data integration tool and then use them to generate the relevant workflows within that tool. However, even without this capability, we believe Pandora to be extremely useful, and it is in use at a number of X88 customers in conjunction with high-end data integration products such as Informatica and Ab Initio.

Finally, it is worth noting that X88 Software is working on a Data Integration Edition of Pandora to provide an end-to-end solution for departmental projects or for organisations that do not have data integration products in place.
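As a minimal sketch of the sign-off comparison described above, the Build's output can be compared row-for-row with the data prototype's output, with any difference failing the check. The file layout and field names here are hypothetical.

```python
import csv

def datasets_identical(prototype_csv, build_csv, key_field):
    """Compare Build output with the data prototype row-for-row,
    keyed on an identifier, reporting whether they are identical.
    (File layout and field names here are hypothetical.)"""
    def load(path):
        with open(path, newline="") as f:
            return {row[key_field]: row for row in csv.DictReader(f)}
    prototype, build = load(prototype_csv), load(build_csv)
    if prototype.keys() != build.keys():
        return False                     # rows missing or added
    return all(prototype[k] == build[k] for k in prototype)

# Sign off the Build only if its output matches the prototype exactly:
# if datasets_identical("prototype_out.csv", "build_out.csv", "record_id"):
#     print("Build conforms to specification")
```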

Conclusion

The environments in which data management rules are developed and tested in most organisations today are old-fashioned, time-consuming and error-prone. This is primarily because data management is not treated as a form of application development, where things have moved on from the sort of approach that is classically presented within data quality and data integration environments. This paper is therefore a call for more forward thinking from data management practitioners and vendors, and for a move towards a more agile stance in the development of data management rules.

One of the few companies that has recognised this deficiency and responded to it is X88. The company's Pandora tool supports data prototyping for data integration, data quality and data management environments generally. Properly used, this approach can eliminate the specification mismatch that is the cause of so many overrunning, delayed or cancelled data management projects.

Further Information

Further information about this subject is available from Bloor Research.

Bloor Research overview

Bloor Research is one of Europe's leading IT research, analysis and consultancy organisations. We explain how to bring greater agility to corporate IT systems through the effective governance, management and leverage of information. We have built a reputation for telling the right story with independent, intelligent, well-articulated communications content and publications on all aspects of the ICT industry. We believe the objective of telling the right story is to:

Describe the technology in context of its business value and the other systems and processes it interacts with.

Understand how new and innovative technologies fit in with existing ICT investments.

Look at the whole market and explain all the solutions available and how they can be more effectively evaluated.

Filter noise and make it easier to find the additional information or news that supports both investment and implementation.

Ensure all our content is available through the most appropriate channels.

Founded in 1989, we have spent over two decades distributing research and analysis to IT user and vendor organisations throughout the world via online subscriptions, tailored research services, events and consultancy projects. We are committed to turning our knowledge into business value for you.

About the author

Philip Howard
Research Director - Data

Philip started in the computer industry way back in 1973 and has variously worked as a systems analyst, programmer and salesperson, as well as in marketing and product management, for a variety of companies including GEC Marconi, GPT, Philips Data Systems, Raytheon and NCR. After a quarter of a century of not being his own boss, Philip set up what is now P3ST (Wordsmiths) Ltd in 1992, and his first client was Bloor Research (then ButlerBloor), with Philip working for the company as an associate analyst. His relationship with Bloor Research has continued since that time and he is now Research Director. His practice area encompasses anything to do with data and content, and he has five further analysts working with him in this area. While maintaining an overview of the whole space, Philip himself specialises in databases, data management, data integration, data quality, data federation, master data management, data governance and data warehousing. He also has an interest in event stream/complex event processing.

In addition to the numerous reports Philip has written on behalf of Bloor Research, he contributes regularly to other publications and was previously the editor of both Application Development News and Operating System News on behalf of Cambridge Market Intelligence (CMI). He has also contributed to various magazines and published a number of reports for companies such as CMI and The Financial Times.

Away from work, Philip's primary leisure activities are canal boats, skiing, playing bridge (at which he is a Life Master) and walking the dog.

Copyright & disclaimer

This document is copyright © 2011 Bloor Research. No part of this publication may be reproduced by any method whatsoever without the prior consent of Bloor Research. Due to the nature of this material, numerous hardware and software products have been mentioned by name. In the majority, if not all, of the cases, these product names are claimed as trademarks by the companies that manufacture the products. It is not Bloor Research's intent to claim these names or trademarks as our own. Likewise, company logos, graphics or screen shots have been reproduced with the consent of the owner and are subject to that owner's copyright. Whilst every care has been taken in the preparation of this document to ensure that the information is correct, the publishers cannot accept responsibility for any errors or omissions.

2nd Floor, St John Street, London EC1V 4PY, United Kingdom
Tel: +44 (0)
Fax: +44 (0)
Web:


More information

Requirements-Based Testing: Encourage Collaboration Through Traceability

Requirements-Based Testing: Encourage Collaboration Through Traceability White Paper Requirements-Based Testing: Encourage Collaboration Through Traceability Executive Summary It is a well-documented fact that incomplete, poorly written or poorly communicated requirements are

More information

The Spectrum of Data Integration Solutions: Why You Should Have Them All

The Spectrum of Data Integration Solutions: Why You Should Have Them All HAWTIN, STEVE, Schlumberger Information Systems, Houston TX; NAJIB ABUSALBI, Schlumberger Information Systems, Stavanger, Norway; LESTER BAYNE, Schlumberger Information Systems, Stavanger, Norway; MARK

More information

White Paper. Exploiting the Internet of Things with investigative analytics

White Paper. Exploiting the Internet of Things with investigative analytics White Paper Exploiting the Internet of Things with investigative analytics A White Paper by Bloor Research Author : Philip Howard Publish date : May 2013 The Internet of Things has the potential to change

More information

A Shift in the World of Business Intelligence

A Shift in the World of Business Intelligence Search Powered Business Analytics, the smartest way to discover your data A Shift in the World of Business Intelligence Comparison of CXAIR to Traditional BI Technologies A CXAIR White Paper www.connexica.com

More information

The Ultimate Guide to Buying Business Analytics

The Ultimate Guide to Buying Business Analytics The Ultimate Guide to Buying Business Analytics How to Evaluate a BI Solution for Your Small or Medium Sized Business: What Questions to Ask and What to Look For Copyright 2012 Pentaho Corporation. Redistribution

More information

High-Performance Business Analytics: SAS and IBM Netezza Data Warehouse Appliances

High-Performance Business Analytics: SAS and IBM Netezza Data Warehouse Appliances High-Performance Business Analytics: SAS and IBM Netezza Data Warehouse Appliances Highlights IBM Netezza and SAS together provide appliances and analytic software solutions that help organizations improve

More information

In-Database Analytics

In-Database Analytics Embedding Analytics in Decision Management Systems In-database analytics offer a powerful tool for embedding advanced analytics in a critical component of IT infrastructure. James Taylor CEO CONTENTS Introducing

More information

SimCorp Solution Guide

SimCorp Solution Guide SimCorp Solution Guide Data Warehouse Manager For all your reporting and analytics tasks, you need a central data repository regardless of source. SimCorp s Data Warehouse Manager gives you a comprehensive,

More information

COURSE SYLLABUS COURSE TITLE:

COURSE SYLLABUS COURSE TITLE: 1 COURSE SYLLABUS COURSE TITLE: FORMAT: CERTIFICATION EXAMS: 55043AC Microsoft End to End Business Intelligence Boot Camp Instructor-led None This course syllabus should be used to determine whether the

More information

Squaring the circle: using a Data Governance Framework to support Data Quality. An Experian white paper

Squaring the circle: using a Data Governance Framework to support Data Quality. An Experian white paper Squaring the circle: using a Governance Framework to support Quality An Experian white paper June 2014 Introduction Most organisations wish for better quality data which makes it surprising just how many

More information

Module 1 Study Guide Introduction to PPO. ITIL Capability Courses - Planning, Protection and Optimization

Module 1 Study Guide Introduction to PPO. ITIL Capability Courses - Planning, Protection and Optimization Module 1 Study Guide Introduction to PPO ITIL Capability Courses - Planning, Protection and Optimization Introducing PPO Welcome to your Study Guide. This document is supplementary to the information available

More information

How to address top problems in test data management

How to address top problems in test data management How to address top problems in test data management Data reuse, sub-setting and masking Business white paper Table of contents Why you need test data management... 3 The challenges of preparing and managing

More information

Sagent Data Flow. from Group 1 Software. an extract from the Bloor Research report, Data Integration, Volume 1

Sagent Data Flow. from Group 1 Software. an extract from the Bloor Research report, Data Integration, Volume 1 Sagent Data Flow from Group 1 Software an extract from the Bloor Research report, Data Integration, Volume 1 Sagent Data Flow Sagent Data Flow Fast facts Sagent Data Flow, which is now provided by Group

More information

Data Quality; is this the key to driving value out of your investment in SAP? Data Quality; is this the key to

Data Quality; is this the key to driving value out of your investment in SAP? Data Quality; is this the key to Driving Whitby Whitby value Partners Partners from Business Driving Intelligence value from Business Business Intelligence Intelligence Whitby Partners 78 York Street London W1H 1DP UK Tel: +44 (0) 207

More information

Better Business Analytics with Powerful Business Intelligence Tools

Better Business Analytics with Powerful Business Intelligence Tools Better Business Analytics with Powerful Business Intelligence Tools Business Intelligence Defined There are many interpretations of what BI (Business Intelligence) really is and the benefits that it can

More information

Data Migration. How CXAIR can be used to improve the efficiency and accuracy of data migration. A CXAIR White Paper. www.connexica.

Data Migration. How CXAIR can be used to improve the efficiency and accuracy of data migration. A CXAIR White Paper. www.connexica. Search Powered Business Analytics, the smartest way to discover your data Data Migration How CXAIR can be used to improve the efficiency and accuracy of data migration A CXAIR White Paper www.connexica.com

More information

Knowledge Base Data Warehouse Methodology

Knowledge Base Data Warehouse Methodology Knowledge Base Data Warehouse Methodology Knowledge Base's data warehousing services can help the client with all phases of understanding, designing, implementing, and maintaining a data warehouse. This

More information

Eliminating Complexity to Ensure Fastest Time to Big Data Value

Eliminating Complexity to Ensure Fastest Time to Big Data Value Eliminating Complexity to Ensure Fastest Time to Big Data Value Copyright 2013 Pentaho Corporation. Redistribution permitted. All trademarks are the property of their respective owners. For the latest

More information

MicroStrategy Course Catalog

MicroStrategy Course Catalog MicroStrategy Course Catalog 1 microstrategy.com/education 3 MicroStrategy course matrix 4 MicroStrategy 9 8 MicroStrategy 10 table of contents MicroStrategy course matrix MICROSTRATEGY 9 MICROSTRATEGY

More information

Talend Metadata Manager. Reduce Risk and Friction in your Information Supply Chain

Talend Metadata Manager. Reduce Risk and Friction in your Information Supply Chain Talend Metadata Manager Reduce Risk and Friction in your Information Supply Chain Talend Metadata Manager Talend Metadata Manager provides a comprehensive set of capabilities for all facets of metadata

More information

Data warehouse and Business Intelligence Collateral

Data warehouse and Business Intelligence Collateral Data warehouse and Business Intelligence Collateral Page 1 of 12 DATA WAREHOUSE AND BUSINESS INTELLIGENCE COLLATERAL Brains for the corporate brawn: In the current scenario of the business world, the competition

More information

www.dotnetsparkles.wordpress.com

www.dotnetsparkles.wordpress.com Database Design Considerations Designing a database requires an understanding of both the business functions you want to model and the database concepts and features used to represent those business functions.

More information

ORACLE BUSINESS INTELLIGENCE, ORACLE DATABASE, AND EXADATA INTEGRATION

ORACLE BUSINESS INTELLIGENCE, ORACLE DATABASE, AND EXADATA INTEGRATION ORACLE BUSINESS INTELLIGENCE, ORACLE DATABASE, AND EXADATA INTEGRATION EXECUTIVE SUMMARY Oracle business intelligence solutions are complete, open, and integrated. Key components of Oracle business intelligence

More information

TRENDS IN THE DEVELOPMENT OF BUSINESS INTELLIGENCE SYSTEMS

TRENDS IN THE DEVELOPMENT OF BUSINESS INTELLIGENCE SYSTEMS 9 8 TRENDS IN THE DEVELOPMENT OF BUSINESS INTELLIGENCE SYSTEMS Assist. Prof. Latinka Todoranova Econ Lit C 810 Information technology is a highly dynamic field of research. As part of it, business intelligence

More information

Data Discovery, Analytics, and the Enterprise Data Hub

Data Discovery, Analytics, and the Enterprise Data Hub Data Discovery, Analytics, and the Enterprise Data Hub Version: 101 Table of Contents Summary 3 Used Data and Limitations of Legacy Analytic Architecture 3 The Meaning of Data Discovery & Analytics 4 Machine

More information

3 myths of email analytics. and how they are impacting your results

3 myths of email analytics. and how they are impacting your results 3 myths of email analytics and how they are impacting your results Date: 11/17/2008 The volume of insights you can gain by adding ad hoc analysis capabilities to your standard set of email reporting metrics

More information

Digital Asset Manager, Digital Curator. Cultural Informatics, Cultural/ Art ICT Manager

Digital Asset Manager, Digital Curator. Cultural Informatics, Cultural/ Art ICT Manager Role title Digital Cultural Asset Manager Also known as Relevant professions Summary statement Mission Digital Asset Manager, Digital Curator Cultural Informatics, Cultural/ Art ICT Manager Deals with

More information