In-database Analytical Systems: Perspective, Trade-offs and Implementation
Executive summary

TIBCO Spotfire is a visualization-based data discovery tool. It has always held its data in memory; this allows for very rapid processing but does place a restriction on the volume of data that can be analyzed. TIBCO has always acknowledged that for some customers this can be an issue. As an earlier white paper says:

However if you need to analyze TBytes of data in a single analysis, and people do, then in-memory systems are probably not for you. [1]

The significant change, which this paper addresses, is that Spotfire can now run its dimension-free data exploration environment against data held in Teradata, Oracle's Exadata, SQL Server 2008 and 2012, and in SQL Server Analysis Services (SSAS). Since these engines can comfortably handle data volumes far in excess of what can be held in memory, it is now possible to use Spotfire's unsurpassed analytical capabilities against some of the largest data sets in the world. This provides much greater flexibility, and the ability to use the same tool for both classes of data also provides some unexpected synergies.
How has Spotfire traditionally worked? In-memory vs. disk-based approaches

Data discovery tools, such as Spotfire, tend to work in one of two ways: they can either hold and manage the data themselves in a purpose-built, optimized store, or they can connect to an existing database engine (relational or multidimensional) and work with the data it holds. Of those that hold and manage the data themselves, some hold the data in memory and some hold it on disk. Both forms of storage have their pros and cons, which is why both continue to be used. So what are the trade-offs and how do they interact?

Price

Disk is much, much cheaper than RAM. How much cheaper depends on a large number of variables, notably the huge variety of memory and disk types, but it is probably safe to say that disk storage is between 30 and 50 times cheaper than RAM. Some sample prices gathered at the end of 2012 are:

             RAM $ per GByte    Disk $ per GByte    Approx. Ratio
    Laptop                                                    :1
    Server                                                    :1

So storage cost, considered alone, tends to favor disk-based systems.

Data volume

Data volume is intimately allied to price; the larger the volume of data you want to analyze, the more likely you are to favor a disk-based system. We tend to say that in-memory is suited to modest volumes of data, but one person's modest is another person's massive. To try to provide a sense of perspective, let's assume you want to analyze a single table with 20 columns. Half of the columns are text (limited to, say, 50 characters but with an average length of seven characters), five are dates, three integers and two reals. A 1.5 million row table saved as a CSV file is 236 MBytes, about a quarter of a GByte. If we double the number of columns (40) and rows (3 million), that's about 1 GByte. Double those two again, to 80 columns and 6 million rows, and it is still only 4 GBytes. Most laptops now have 8 GBytes of RAM, so even without compression this relatively large number of complex rows can be analyzed comfortably in memory.
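The sizing arithmetic above can be reproduced as a back-of-envelope calculation. The per-field widths assumed below (dates, integers, reals) are illustrative guesses, not figures from the paper:

```python
def estimate_csv_bytes(rows, text_cols=10, text_avg=7,
                       date_cols=5, int_cols=3, real_cols=2):
    """Rough CSV size estimate; the non-text field widths are assumptions."""
    DATE_W, INT_W, REAL_W = 10, 6, 8   # assumed average widths in characters
    cols = text_cols + date_cols + int_cols + real_cols
    row_bytes = (text_cols * text_avg + date_cols * DATE_W +
                 int_cols * INT_W + real_cols * REAL_W +
                 (cols - 1) + 1)       # commas plus a newline per row
    return rows * row_bytes

base = estimate_csv_bytes(1_500_000)             # the 20-column, 1.5M-row table
bigger = estimate_csv_bytes(6_000_000,           # 80 columns, 6 million rows
                            text_cols=40, date_cols=20,
                            int_cols=12, real_cols=8)
print(base / 1e9, bigger / 1e9)
```

The estimate lands in the same ballpark as the paper's quarter-GByte and 4 GByte figures, which is all a sizing exercise like this needs to show.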
So in-memory systems are clearly favored for the analysis of modest quantities of data, and disk-based systems where the volume of data is huge and simply won't fit in memory.

Speed, and the trade-offs necessary to achieve it

However, there is a third variable to consider. There is very good evidence that the productivity of the users of a system is highly influenced by the speed with which it can deliver answers to their questions (see the box out on Train of thought analysis).

Train of thought analysis

The ability to interact rapidly with an analytical system is paramount for generating results efficiently. It's not just the absolute time the querying session takes; it's the fact that sometimes the answers you get pose other questions, and you're able to follow this new train of thought to a point where a new aspect of your business is revealed. A rapid response has been shown to be vital; if responses are too slow, thoughts tend to wander, to think about coffee or tonight's game, and the will to follow a novel route is quashed by the undesirability of all the time wasted in waiting. The gain from such rapid responses can be impressive: research in the 1980s by IBM [2] showed that productivity increased by a dramatic 62% if the response time was improved.

We also tend to collect data as individual transactions, but analytical questions tend to examine the data at reasonably high levels of aggregation. So, for example, we might collect the data like this:

    CustFName   CustLName   Item             Date       Store         Units Sold
    Fred        Smith       Canned Peaches   8/5/2012   Walla Walla   2
    Sally       Jones       Candy            8/5/2012   Greenwich     1

but want to analyze the unit sales month by month across different stores, or state by state across different years. It is apparent that most of the analytical queries will be returning answers that are made up of aggregations of thousands or millions of rows. This isn't a problem for in-memory systems because aggregation calculations performed in memory are so fast.
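The jump from transaction-level rows to aggregated answers is easy to demonstrate. The sketch below uses SQLite purely as a stand-in for whichever engine holds the data; the table and column names are invented for the illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("""CREATE TABLE sales (
    cust_fname TEXT, cust_lname TEXT, item TEXT,
    sale_date TEXT, store TEXT, units_sold INTEGER)""")
conn.executemany(
    "INSERT INTO sales VALUES (?, ?, ?, ?, ?, ?)",
    [("Fred", "Smith", "Canned Peaches", "2012-08-05", "Walla Walla", 2),
     ("Sally", "Jones", "Candy", "2012-08-05", "Greenwich", 1),
     ("Fred", "Smith", "Candy", "2012-09-01", "Walla Walla", 3)])

# Month-by-month unit sales per store: the aggregated view the analyst wants,
# collapsing however many transaction rows exist into a handful of totals.
monthly = conn.execute("""
    SELECT store, strftime('%Y-%m', sale_date) AS month, SUM(units_sold)
    FROM sales
    GROUP BY store, month
    ORDER BY store, month""").fetchall()
print(monthly)
```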
Disk-based systems, on the other hand, would struggle to return these aggregations fast enough if they had to read each individual transaction from a specific, single location on a disk and perform the aggregations in real time. So disk-based systems adopt a variety of strategies to provide faster response times. SSAS, for example, typically pre-aggregates the data after loading and before the users start to query it. This has the disadvantage of introducing a delay between data loading and querying, but it does provide a level of performance that can approach in-memory systems. Teradata doesn't pre-aggregate the data; instead it parallelizes the query across multiple nodes, where each node potentially only handles a subset of the data. Both of these are excellent solutions in that they provide very high performance against large or huge data sets, but each comes at a cost - either a delay or the requirement for additional hardware resources. So, when speed of response is important (and it usually is) and the data volume is reasonable, in-memory systems are typically favored. However, disk-based systems have the advantage that they allow you to analyze truly massive quantities of data: we are talking terabytes and even petabytes. (This is not an exaggeration. Teradata currently has 35 customers in its so-called Petabyte club.) But that then begs the question: why would anyone want to even think of analyzing that quantity of data?

Data warehouses and Big Data

The value of consolidating data

Many people, particularly those at a high level in a company, often ask deceptively simple analytical questions. A classic example is the CEO who needs to know if the company is making a profit. It's very easy to ask the question, it takes just five words: Are we currently in profit? The answer is even simpler, a single word from a choice of two: Yes or No.
The question and the answer are simple, but providing that answer is, for companies of any size, mind-bogglingly complex. The problem is that almost all organizations run multiple transactional systems - for finance, HR, sales, manufacturing, CRM and so on. Data from all these systems must be taken into consideration before the simple question can be answered accurately. The solution is to:

- Extract the data required for analysis from the transactional systems
- Transform it into a consistent format and quality
- Load it into a centralized store for analysis

The centralized store of data is then referred to as a data warehouse.
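These extract-transform-load (ETL) steps can be sketched in a few lines. Real ETL tooling handles scheduling, error recovery and volume; the two source systems and all field names below are entirely hypothetical:

```python
# Two transactional systems that disagree on format (hypothetical sources).
finance_rows = [{"cust": "SMITH, FRED", "amount_usd": "12.50"}]
crm_rows = [{"customer_name": "Sally Jones", "spend": 8.0}]

def transform(rows, name_key, amount_key):
    """Normalize one source's rows to a consistent format and quality."""
    out = []
    for r in rows:
        name = r[name_key]
        if "," in name:                    # "SMITH, FRED" -> "FRED SMITH"
            last, first = [p.strip() for p in name.split(",")]
            name = f"{first} {last}"
        out.append({"customer": name.title(), "amount": float(r[amount_key])})
    return out

# Load: every source ends up in the single centralized store.
warehouse = []
warehouse += transform(finance_rows, "cust", "amount_usd")
warehouse += transform(crm_rows, "customer_name", "spend")
print(warehouse)
```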
The volume of data handled by a warehouse is typically huge for two reasons: firstly, it pulls a great deal of data from multiple transactional systems and, secondly, the warehouse often stores history going back two, five, perhaps ten years at the transactional level of detail. It is now common to refer to this data as big data (see box out). This historical data, although very bulky, is of enormous value to the business for the simple reason that we can often use past behavior to give insights into the future.

Big Data

Big data is a term with multiple meanings. If you gather five analytical specialists in one room, you will find six definitions of big data. In this paper we are using it simply to mean very large quantities of transactional data: the kind of volumes that you find in the data warehouse of a reasonably sized company. This definition has two advantages: it is straightforward, and it also describes, very neatly, a common class of data that needs to be analyzed. Constructing a data warehouse takes a huge amount of time and effort, but once that data has been consolidated and is in one location it becomes enormously valuable, which is why most organizations have, by now, made the investment of creating a data warehouse.

Summary so far

Analysis of data can either be performed on data that is held in memory or on data held on disk. The in-memory approach gives much faster analysis, but RAM is significantly more expensive than disk. There are ways in which disk-based analysis can be sped up, for example by pre-aggregating the data. The analytical requirements (and the regulatory reporting requirements) of many companies have driven them to create data warehouses that contain big data. This data is far too large to fit in memory, so it has to be disk-based.
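The pre-aggregation idea mentioned in the summary can be shown in miniature. SSAS builds far richer multidimensional aggregation structures than this; the sketch (all names invented) only demonstrates the core trade: pay the aggregation cost once, at load time, so that later queries read a small summary table instead of scanning transactions:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE txn (store TEXT, month TEXT, units INTEGER)")
conn.executemany("INSERT INTO txn VALUES (?, ?, ?)",
                 [("Walla Walla", "2012-08", 2),
                  ("Walla Walla", "2012-08", 5),
                  ("Greenwich", "2012-08", 1)])

# Load-time step: materialize the aggregate once. This is the delay between
# loading and querying that the paper describes.
conn.execute("""CREATE TABLE monthly_units AS
                SELECT store, month, SUM(units) AS units
                FROM txn GROUP BY store, month""")

# Query time: a lookup against the tiny summary table, no transaction scan.
row = conn.execute("""SELECT units FROM monthly_units
                      WHERE store = 'Walla Walla' AND month = '2012-08'""").fetchone()
print(row[0])
```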
The three major points that any system used for data discovery needs to address are the speed with which analyses can be performed, the clarity with which the results can be displayed to non-specialist users, and the ability to analyze any measure against any dimension and any other measure (see box out on Measures and Dimensions).
Measures and Dimensions

Analysis is often described in terms of measures and dimensions. A measure is simply a numeric value: units sold, number of customers, etc. KPIs (Key Performance Indicators) are all typically measures. But measures, no matter how many you have, are almost always meaningless without reference to at least one so-called dimension. For example, it may be the case that your company has sold 431,763,273,812 items since its inception. This might have some wow value, but it only begins to reveal information about your business if you take that measure and split it out so, for instance, it shows the unit sales of canned peaches in two stores, displayed as monthly totals over the past year. This is far more meaningful: the measure is now shown within the context of a product dimension (canned peaches), a location dimension (the store) and a time dimension (the last twelve months). Some analytical queries require a measure to be analyzed against another measure. For example, in analyzing the effectiveness of loss leaders in supermarket sales, it is common to plot sales price against cost price. Spotfire natively supports this kind of analysis; some other analytical systems do not.

Spotfire addresses all three of these, but the brutal truth is that most of the users who want to perform analysis don't care about the difference between in-memory and disk-based systems; they don't care about data volumes or pre-aggregation. They simply want a fast, friendly, intelligent interface into their data, wherever it resides and whatever volume it happens to have, so that they can extract useful information from it. And if we can give them that, we give them a hugely powerful tool. Printed reports are all very well, but they stifle the insight that comes from allowing a highly knowledgeable business user, with years of experience, to freely browse the historical data of the company looking for trends, anomalies and patterns.
A good data discovery tool allows that unique marriage of experience and data to flower into information and ultimately business advantage. Given this background, TIBCO felt it was important to expand the capabilities of Spotfire: on the one hand leaving its capability to analyze significant volumes of data in memory completely untouched (in fact, enhanced); on the other, providing the capability to hook directly into the disk-based data held in the major database engines. The remainder of this paper is about how this was achieved and explains some of the implications of being able to analyze both in memory and on disk.
How does it work?

Spotfire establishes a database connection to the database engine that is hosting the disk-based data warehouse. (The example we are using is a data warehouse. In practice, Spotfire can establish a database connection to any Teradata, Oracle Exadata, SQL Server or SSAS database.) The data remains held by that engine, but it appears in Spotfire and the user is free to manipulate it as normal using the familiar Spotfire interface. So, for example, you might be looking at ten years' worth of sales data and use the mouse to filter the data down to just white goods, month by month. As the filter is applied, the data in the interface updates. However, it is important to understand exactly how this is achieved, because an understanding of the mechanism reveals both the strengths and limitations of this approach. (What!? You mean it has limitations? Of course it does; all engineering solutions have their pros and cons.) The raw data remains at all times in the data warehouse, under the control of the database engine (Teradata, Oracle Exadata, etc.). As choices are made in Spotfire's interface, Spotfire generates queries in SQL (or in MDX in the case of SSAS). Those queries are sent to the data warehouse. They are executed there, both in terms of extracting the correct data and aggregating it where necessary. The processed data is then returned to Spotfire, where it is rendered into whatever presentation has been requested: bar chart, scatter plot, etc. Notice the clear separation of responsibilities here. Spotfire's job is to display the data, not only making it easy for the user to visualize that data but also allowing them to form their questions and data manipulations very intuitively. Finding the data and aggregating it is the responsibility of the engine, and the data returned by the external engine is displayed in Spotfire's data discovery interface.
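This division of labor (UI choices in, SQL out, aggregated rows back) can be sketched as a toy query generator. Spotfire's actual translator is far more sophisticated and is not public; every name below is invented, and a production version would need proper identifier quoting and parameterized values:

```python
def build_query(table, measure, group_by, filters):
    """Translate UI selections into one aggregated SQL query.
    Toy sketch only: no identifier quoting, no injection protection."""
    where = " AND ".join(f"{col} = '{val}'" for col, val in filters.items())
    sql = f"SELECT {group_by}, SUM({measure}) FROM {table}"
    if where:
        sql += f" WHERE {where}"
    sql += f" GROUP BY {group_by}"
    return sql

# The user clicks a "white goods" filter and asks for monthly totals:
print(build_query("Sales", "Amount", "SaleMonth", {"Category": "White Goods"}))
```

The engine then runs the generated statement and returns only the aggregated rows, which is what keeps the network traffic modest.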
The pros here are huge; users of Spotfire have just acquired the capability to connect to multi-terabyte or even petabyte data stores and analyze them in an intuitive, well understood way. The limitations are somewhat more subtle. Spotfire is capable of a huge range of data visualizations and manipulations, more than most comparable systems. In order to be able to work with this huge volume of disk-based data, Spotfire is relinquishing the actual handling of the data to another engine; inevitably some of Spotfire's specific manipulations aren't possible, simply because some of the engines cannot perform them. In practice, by far the majority of the traditional Spotfire functionality is available when connected to one of these disk-based systems. The exact set of Spotfire-specific manipulations that can or cannot be performed depends on the database engine to which the connection is made.
And there are a few cases where a specific engine offers, for example, an aggregation function that is not offered by Spotfire (or perhaps Spotfire offers a similar function but it doesn't map exactly onto the one offered by the database engine). In that case the function is usually made available to the user in the Spotfire interface. Of course, describing this is relatively straightforward: Spotfire establishes a database connection to the database engine that is hosting the data warehouse. In practice it is complex to achieve, because the different engines use different languages and/or dialects of the same language to describe the analysis required.

SQL: Teradata, Exadata, SQL Server

Three of these engines (Teradata, Oracle's Exadata and SQL Server) use SQL as their communication language, so Spotfire has to translate any request formed in its user interface into SQL before the query can be sent to the engine. Imagine several years of sales data held in a set of relational tables. One is an Invoice table, with SaleDate and Amount as columns. Suppose the Spotfire user requests the aggregated total for the invoices in Q1 2012. One SQL statement that could be used to extract the data necessary to deliver this information might read as:

SELECT Invoice.SaleDate, SUM(Invoice.Amount)
FROM Invoice
WHERE Invoice.SaleDate >= '2012-01-01'
  AND Invoice.SaleDate < '2012-04-01'
GROUP BY Invoice.SaleDate;

(This query includes a GROUP BY because Spotfire always brings back aggregated data to reduce the volume of data returned. Indeed, Spotfire intelligently controls the amount of data pulled from the on-disk store to reduce network traffic and optimize the speed of response.) This returns the daily invoice totals for the period of Q1 2012, which will number around 90 individual values. These can then be aggregated to give the requested total in a variety of ways, either by increasing the complexity of the SQL statement shown or by chaining several SQL statements together.
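The two-step pattern (daily totals from the engine, then a final roll-up) runs end to end below, with SQLite standing in for the warehouse engines named above and a synthetic one-invoice-per-day data set:

```python
import sqlite3
from datetime import date, timedelta

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoice (SaleDate TEXT, Amount REAL)")

# Synthetic data: one $10 invoice per day for the first half of 2012.
d = date(2012, 1, 1)
while d < date(2012, 7, 1):
    conn.execute("INSERT INTO Invoice VALUES (?, ?)", (d.isoformat(), 10.0))
    d += timedelta(days=1)

# Step 1: the engine returns one aggregated row per day in Q1 2012.
daily = conn.execute("""
    SELECT Invoice.SaleDate, SUM(Invoice.Amount)
    FROM Invoice
    WHERE Invoice.SaleDate >= '2012-01-01'
      AND Invoice.SaleDate < '2012-04-01'
    GROUP BY Invoice.SaleDate""").fetchall()

# Step 2: the chained aggregation that produces the single requested total.
total = sum(amount for _, amount in daily)
print(len(daily), total)
```

Q1 2012 has 91 days (2012 is a leap year), which matches the paper's "around 90 individual values".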
However, to the Spotfire user this process would be transparent; they would use the interface to request the data and it would appear. Within what is called standard SQL it is entirely possible to express the same query in multiple ways, some efficient and some woefully inefficient. And the very nature of truly intuitive data discovery means that it is impossible to know in advance which questions are going to be posed. So in practice it is an enormously complex job to write a translator that produces not just functionally correct SQL, but highly optimized, functionally correct SQL.
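One concrete reason translation is tricky: even something as simple as row limiting is phrased differently from engine to engine. This well-known illustration is not drawn from the paper: SQL Server writes SELECT TOP 10 ..., while engines such as PostgreSQL, MySQL and SQLite use LIMIT 10. SQLite, standing in for one dialect here, accepts one form and rejects the other:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (n INTEGER)")
conn.executemany("INSERT INTO t VALUES (?)", [(i,) for i in range(100)])

# SQLite/PostgreSQL/MySQL dialect: accepted.
rows = conn.execute("SELECT n FROM t LIMIT 10").fetchall()

# SQL Server dialect: a syntax error in SQLite.
try:
    conn.execute("SELECT TOP 10 n FROM t")
    top_accepted = True
except sqlite3.OperationalError:
    top_accepted = False

print(len(rows), top_accepted)
```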
And if that wasn't complex enough, it turns out that SQL is a non-standard standard. In other words, whilst all of these engines will understand and execute standard SQL, they all have different dialects. True, the Spotfire engineers could have ignored these variations and just sent standard SQL to the engines. However, using the specific dialect for a given engine often provides huge performance benefits, so the connections to each of these engines have been highly optimized for that particular engine.

MDX: SSAS

And just to keep the engineers on their toes, the fourth engine (SSAS) uses a totally different language called MDX. It uses MDX because the data in SSAS is not structured as relational tables, but as a multi-dimensional structure. In other words, the data is not held as traditional two-dimensional tables; rather, it is structured as a set of measures (such as Invoice Amount) held at the intersections of a set of dimensions (such as Date, Customer, Product, etc.). Microsoft (along with other manufacturers) has elected to use MDX to query these structures because the language is specifically designed to express multi-dimensional queries. In MDX the query described above is very different from SQL because date information will already be stored in the data structure within a date dimension.
Assuming that the Date dimension (DateDim) includes a level for Quarter, the query might look like this:

SELECT {[DateDim].[Calend].[2012 Q1]} ON Columns,
       {[Measures].[Invoice Amount]} ON Rows
FROM [SalesCube]

Note that while the SQL query above returns the individual values, which still need to be aggregated, this MDX query returns the aggregated total for Q1 2012. MDX makes some analytical queries much easier to express; for example, obtaining the four aggregated quarter values from 2012 can be expressed very easily:

SELECT {[DateDim].[Calend].[2012].Children} ON Columns,
       {[Measures].[Invoice Amount]} ON Rows
FROM [SalesCube]

The SQL required to perform the same operation is considerably more complex.
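For comparison, one SQL formulation of that same "four quarters of 2012" request might look like the sketch below, run against SQLite as a stand-in (each real engine would use its own date functions). The quarter has to be derived from the raw date at query time, which is exactly the bucketing the MDX cube gets for free from its Date dimension:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE Invoice (SaleDate TEXT, Amount REAL)")
conn.executemany("INSERT INTO Invoice VALUES (?, ?)",
                 [("2012-02-10", 5.0), ("2012-05-20", 7.0),
                  ("2012-08-01", 11.0), ("2012-11-30", 13.0)])

# Derive the quarter number from the month, then aggregate per quarter.
quarters = conn.execute("""
    SELECT (CAST(strftime('%m', SaleDate) AS INTEGER) + 2) / 3 AS Quarter,
           SUM(Amount)
    FROM Invoice
    WHERE SaleDate >= '2012-01-01' AND SaleDate < '2013-01-01'
    GROUP BY Quarter
    ORDER BY Quarter""").fetchall()
print(quarters)
```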
The bottom line is that the new features in Spotfire allow it to connect to four of the most powerful data warehouse engines seamlessly, no matter what language they use.

How this can be used to your advantage

Until now, Spotfire has been able to provide visualization-based data discovery against data that could be held in memory; now it can do the same against in-memory data and truly massive sets of data held on disk. And the ability to move seamlessly between these two produces a huge synergy. Suppose that your data warehouse holds the daily sales totals for 10,000 products, over ten years, in 2,000 different stores. This equates to a little over 73 billion rows of data. Suppose that you start using Spotfire to visualize the data and uncover a fascinating trend. You can continue to analyze against the main data warehouse, but you might find that the focus of your interest is a certain set of fifty products in one hundred stores over the last three years. That's about five million rows, and even with 80 columns that will fit comfortably into 4 GBytes of RAM. So you can extract that set from the warehouse and analyze it in memory. This has three potential advantages:

- it may be faster than your data warehouse (depending on the power of the warehouse and how many simultaneous users it has currently)
- you can use the complete set of functions and visualizations in Spotfire
- you can work offline from the warehouse

You don't have to subset the data in this precise way, of course; you could take 100 products from ten stores over the entire time period, or any other combination that yields a subset that will fit in memory. And this can work the other way around, of course. You may start off analyzing a subset of the warehouse in memory, find an interesting trend for a certain set of products and then want to know if that is true for other products. You can simply connect to the data warehouse and find out.
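The row counts quoted above check out. Leap years nudge the exact figures slightly, so the calculation below assumes an average of 365.25 days per year:

```python
products, stores = 10_000, 2_000
days_per_year = 365.25          # assumed average, including leap years

# One row per product, per store, per day.
warehouse_rows = products * stores * int(10 * days_per_year)
subset_rows = 50 * 100 * int(3 * days_per_year)

print(f"{warehouse_rows / 1e9:.2f} billion rows in the warehouse")
print(f"{subset_rows / 1e6:.2f} million rows in the in-memory subset")
```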
The bottom line is that both in-memory and disk-based data discovery have their advantages; Spotfire now allows you to do both, and it allows you to combine them.
And it isn't just data warehouses. If you have multiple database engines (perhaps a Teradata data warehouse and a SQL Server finance system) it is now possible to use the same data discovery tool to analyze data from multiple database engines. You can even analyze data from different engines in the same Spotfire report or analysis.

Summary

We are visual animals. Most people find visualization-based tools by far the most intuitive way of understanding their data. Both in-memory and disk-based systems have their pros and cons. In the past we have had to choose between them. TIBCO's Spotfire now allows us to use one tool to perform both types of analysis. This has the obvious advantage that we can use one tool across all of our sets of data, but it goes further than that. Remembering that both in-memory and disk-based systems have their own unique advantages, it means that in any given instance we can elect to use the one which has the pros we need at that point in our analysis. We can essentially have our cake and eat it.

About the Author

Professor Mark Whitehorn specializes in the areas of data analysis, data modelling, business intelligence (BI) and data science. Mark is an author, academic researcher (he holds the Chair of Analytics at the University of Dundee in Scotland) and also works as a consultant designing analytical systems for national and international companies.

Additional Contributors: Ben McGraw and Erik Petterson of TIBCO Software, Inc.

[1] In-memory Analytical Systems: Perspective, Trade-offs and Implementation. TIBCO White Paper.
[2] Factors affecting programmer productivity during application development, A. J. Thadhani, IBM Systems Journal, Vol. 23, No. 1, p. 19, 1984; A comparative study of system response time on program developer productivity, G. N. Lambert, IBM Systems Journal, Vol. 23, No. 1, p. 36, 1984.

TIBCO Software Inc., 212 Elm Street, Somerville, MA. All Rights Reserved.
TIBCO, TIBCO Software, The Power of Now, the TIBCO and Spotfire logos and TIBCO Spotfire are trademarks or registered trademarks of TIBCO Software Inc. in the United States and/or other countries. All other product and company names mentioned in this document are the property of their respective owners and are mentioned for identification purposes only.
Foundations of Business Intelligence: Databases and Information Management
Foundations of Business Intelligence: Databases and Information Management Wienand Omta Fabiano Dalpiaz 1 drs. ing. Wienand Omta Learning Objectives Describe how the problems of managing data resources
Streaming Analytics and the Internet of Things: Transportation and Logistics
Streaming Analytics and the Internet of Things: Transportation and Logistics FOOD WASTE AND THE IoT According to the Food and Agriculture Organization of the United Nations, every year about a third of
Harnessing the power of advanced analytics with IBM Netezza
IBM Software Information Management White Paper Harnessing the power of advanced analytics with IBM Netezza How an appliance approach simplifies the use of advanced analytics Harnessing the power of advanced
An Accenture Point of View. Oracle Exalytics brings speed and unparalleled flexibility to business analytics
An Accenture Point of View Oracle Exalytics brings speed and unparalleled flexibility to business analytics Keep your competitive edge with analytics When it comes to working smarter, organizations that
Intelligence Reporting Standard Reports
Intelligence Reporting Standard Reports Sage 100 ERP (formerly Sage ERP MAS 90 and 200) Intelligence Reporting empowers you to quickly and easily gain control and obtain the information you need from across
Sage 200 Business Intelligence Datasheet
Sage 200 Business Intelligence Datasheet Business Intelligence comes as standard as part of the Sage 200 Suite giving you a unified and integrated view of all your data, with complete management dashboards,
In-Memory Analytics for Big Data
In-Memory Analytics for Big Data Game-changing technology for faster, better insights WHITE PAPER SAS White Paper Table of Contents Introduction: A New Breed of Analytics... 1 SAS In-Memory Overview...
Lost in Space? Methodology for a Guided Drill-Through Analysis Out of the Wormhole
Paper BB-01 Lost in Space? Methodology for a Guided Drill-Through Analysis Out of the Wormhole ABSTRACT Stephen Overton, Overton Technologies, LLC, Raleigh, NC Business information can be consumed many
SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM
David Chappell SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM A PERSPECTIVE FOR SYSTEMS INTEGRATORS Sponsored by Microsoft Corporation Copyright 2014 Chappell & Associates Contents Business
Microsoft Dynamics NAV
Microsoft Dynamics NAV 2015 Microsoft Dynamics NAV Maximising value through business insight Business Intelligence White Paper December 2014 CONTENTS Reports were tedious. Earlier it would take days for
Data Visualization Techniques
Data Visualization Techniques From Basics to Big Data with SAS Visual Analytics WHITE PAPER SAS White Paper Table of Contents Introduction.... 1 Generating the Best Visualizations for Your Data... 2 The
QlikView 11.2 SR5 DIRECT DISCOVERY
QlikView 11.2 SR5 DIRECT DISCOVERY FAQ and What s New Published: November, 2012 Version: 5.0 Last Updated: December, 2013 www.qlikview.com 1 What s New in Direct Discovery 11.2 SR5? Direct discovery in
Data Warehouse: Introduction
Base and Mining Group of Base and Mining Group of Base and Mining Group of Base and Mining Group of Base and Mining Group of Base and Mining Group of Base and Mining Group of base and data mining group,
Microsoft 20466 - Implementing Data Models and Reports with Microsoft SQL Server
1800 ULEARN (853 276) www.ddls.com.au Microsoft 20466 - Implementing Data Models and Reports with Microsoft SQL Server Length 5 days Price $4070.00 (inc GST) Version C Overview The focus of this five-day
I N T E R S Y S T E M S W H I T E P A P E R INTERSYSTEMS CACHÉ AS AN ALTERNATIVE TO IN-MEMORY DATABASES. David Kaaret InterSystems Corporation
INTERSYSTEMS CACHÉ AS AN ALTERNATIVE TO IN-MEMORY DATABASES David Kaaret InterSystems Corporation INTERSYSTEMS CACHÉ AS AN ALTERNATIVE TO IN-MEMORY DATABASES Introduction To overcome the performance limitations
Whitepaper. 4 Steps to Successfully Evaluating Business Analytics Software. www.sisense.com
Whitepaper 4 Steps to Successfully Evaluating Business Analytics Software Introduction The goal of Business Analytics and Intelligence software is to help businesses access, analyze and visualize data,
Harness Your SAP Data with User-Driven Dashboards
AUGUST 2010 Harness Your SAP Data with User-Driven Dashboards Sponsored by Contents Introduction 1 The Problems of Big BI 2 The Road to Big BI 2 Unacceptable Delays 3 Big BI and Sticky Information 4 Power
How to Navigate Big Data with Ad Hoc Visual Data Discovery Data technologies are rapidly changing, but principles of 30 years ago still apply today
How to Navigate Big Data with Ad Hoc Visual Data Discovery Data technologies are rapidly changing, but principles of 30 years ago still apply today INTRODUCTION Data is the heart of TIBCO Spotfire. It
SQL Server 2014. In-Memory by Design. Anu Ganesan August 8, 2014
SQL Server 2014 In-Memory by Design Anu Ganesan August 8, 2014 Drive Real-Time Business with Real-Time Insights Faster transactions Faster queries Faster insights All built-in to SQL Server 2014. 2 Drive
How To Choose A Business Intelligence Toolkit
Background Current Reporting Challenges: Difficulty extracting various levels of data from AgLearn Limited ability to translate data into presentable formats Complex reporting requires the technical staff
Microsoft SQL Server Business Intelligence and Teradata Database
Microsoft SQL Server Business Intelligence and Teradata Database Help improve customer response rates by using the most sophisticated marketing automation application available. Integrated Marketing Management
Introducing CXAIR. E development and performance
Search Powered Business Analytics Introducing CXAIR CXAIR has been built specifically as a next generation BI tool. The product utilises the raw power of search technology in order to assemble data for
Perform-Tools. Powering your performance
Perform-Tools Powering your performance Perform-Tools With Perform-Tools, optimizing Microsoft Dynamics products on a SQL Server platform never was this easy. They are a fully tested and supported set
Microsoft Dynamics NAV
Microsoft Dynamics NAV Maximising value through business insight Business Intelligence White Paper October 2015 CONTENTS Reports were tedious. Earlier it would take days for manual collation. Now all this
Tap into Big Data at the Speed of Business
SAP Brief SAP Technology SAP Sybase IQ Objectives Tap into Big Data at the Speed of Business A simpler, more affordable approach to Big Data analytics A simpler, more affordable approach to Big Data analytics
Taming Big Data. 1010data ACCELERATES INSIGHT
Taming Big Data 1010data ACCELERATES INSIGHT Lightning-fast and transparent, 1010data analytics gives you instant access to all your data, without technical expertise or expensive infrastructure. TAMING
6.0, 6.5 and Beyond. The Future of Spotfire. Tobias Lehtipalo Sr. Director of Product Management
6.0, 6.5 and Beyond The Future of Spotfire Tobias Lehtipalo Sr. Director of Product Management Key peformance indicators Hundreds of Records Visual Data Discovery Millions of Records Data Mining or Data
Big Data and Its Impact on the Data Warehousing Architecture
Big Data and Its Impact on the Data Warehousing Architecture Sponsored by SAP Speaker: Wayne Eckerson, Director of Research, TechTarget Wayne Eckerson: Hi my name is Wayne Eckerson, I am Director of Research
Ignite Your Creative Ideas with Fast and Engaging Data Discovery
SAP Brief SAP BusinessObjects BI s SAP Crystal s SAP Lumira Objectives Ignite Your Creative Ideas with Fast and Engaging Data Discovery Tap into your data big and small Tap into your data big and small
KnowledgeSEEKER Marketing Edition
KnowledgeSEEKER Marketing Edition Predictive Analytics for Marketing The Easiest to Use Marketing Analytics Tool KnowledgeSEEKER Marketing Edition is a predictive analytics tool designed for marketers
OLAP Services. MicroStrategy Products. MicroStrategy OLAP Services Delivers Economic Savings, Analytical Insight, and up to 50x Faster Performance
OLAP Services MicroStrategy Products MicroStrategy OLAP Services Delivers Economic Savings, Analytical Insight, and up to 50x Faster Performance MicroStrategy OLAP Services brings In-memory Business Intelligence
CRGroup Whitepaper: Digging through the Data. www.crgroup.com. Reporting Options in Microsoft Dynamics GP
CRGroup Whitepaper: Digging through the Data Reporting Options in Microsoft Dynamics GP The objective of this paper is to provide greater insight on each of the reporting options available to you within
SalesLogix Advanced Analytics
SalesLogix Advanced Analytics SalesLogix Advanced Analytics Benefits Snapshot Increase organizational and customer intelligence by analyzing data from across your business. Make informed business decisions
Sage 200 Business Intelligence Datasheet
Sage 200 Business Intelligence Datasheet Business Intelligence comes as standard as part of the Sage 200 Suite giving you a unified and integrated view of important data, with complete management dashboards,
SQL Server 2012 Performance White Paper
Published: April 2012 Applies to: SQL Server 2012 Copyright The information contained in this document represents the current view of Microsoft Corporation on the issues discussed as of the date of publication.
PLATFORA INTERACTIVE, IN-MEMORY BUSINESS INTELLIGENCE FOR HADOOP
PLATFORA INTERACTIVE, IN-MEMORY BUSINESS INTELLIGENCE FOR HADOOP Your business is swimming in data, and your business analysts want to use it to answer the questions of today and tomorrow. YOU LOOK TO
The Benefits of Data Modeling in Business Intelligence
WHITE PAPER: THE BENEFITS OF DATA MODELING IN BUSINESS INTELLIGENCE The Benefits of Data Modeling in Business Intelligence DECEMBER 2008 Table of Contents Executive Summary 1 SECTION 1 2 Introduction 2
Performance management for your people: a five-step guide to transforming your organisation s culture
Performance management for your people: a five-step guide to transforming your organisation s culture Introduction Performance management (PM) is now widely recognised as an essential element of business
Empower Individuals and Teams with Agile Data Visualizations in the Cloud
SAP Brief SAP BusinessObjects Business Intelligence s SAP Lumira Cloud Objectives Empower Individuals and Teams with Agile Data Visualizations in the Cloud Empower everyone to make data-driven decisions
Fact Sheet In-Memory Analysis
Fact Sheet In-Memory Analysis 1 Copyright Yellowfin International 2010 Contents In Memory Overview...3 Benefits...3 Agile development & rapid delivery...3 Data types supported by the In-Memory Database...4
How To Model Data For Business Intelligence (Bi)
WHITE PAPER: THE BENEFITS OF DATA MODELING IN BUSINESS INTELLIGENCE The Benefits of Data Modeling in Business Intelligence DECEMBER 2008 Table of Contents Executive Summary 1 SECTION 1 2 Introduction 2
Empowering the Masses with Analytics
Empowering the Masses with Analytics THE GAP FOR BUSINESS USERS For a discussion of bridging the gap from the perspective of a business user, read Three Ways to Use Data Science. Ask the average business
Toronto 26 th SAP BI. Leap Forward with SAP
Toronto 26 th SAP BI Leap Forward with SAP Business Intelligence SAP BI 4.0 and SAP BW Operational BI with SAP ERP SAP HANA and BI Operational vs Decision making reporting Verify the evolution of the KPIs,
Create Mobile, Compelling Dashboards with Trusted Business Warehouse Data
SAP Brief SAP BusinessObjects Business Intelligence s SAP BusinessObjects Design Studio Objectives Create Mobile, Compelling Dashboards with Trusted Business Warehouse Data Increase the value of data with
Big Data Integration: A Buyer's Guide
SEPTEMBER 2013 Buyer s Guide to Big Data Integration Sponsored by Contents Introduction 1 Challenges of Big Data Integration: New and Old 1 What You Need for Big Data Integration 3 Preferred Technology
An Architectural Review Of Integrating MicroStrategy With SAP BW
An Architectural Review Of Integrating MicroStrategy With SAP BW Manish Jindal MicroStrategy Principal HCL Objectives To understand how MicroStrategy integrates with SAP BW Discuss various Design Options
DATA WAREHOUSE BUSINESS INTELLIGENCE FOR MICROSOFT DYNAMICS NAV
www.bi4dynamics.com DATA WAREHOUSE BUSINESS INTELLIGENCE FOR MICROSOFT DYNAMICS NAV True Data Warehouse built for content and performance. 100% Microsoft Stack. 100% customizable SQL code. 23 languages.
ORACLE BUSINESS INTELLIGENCE, ORACLE DATABASE, AND EXADATA INTEGRATION
ORACLE BUSINESS INTELLIGENCE, ORACLE DATABASE, AND EXADATA INTEGRATION EXECUTIVE SUMMARY Oracle business intelligence solutions are complete, open, and integrated. Key components of Oracle business intelligence
How To Handle Big Data With A Data Scientist
III Big Data Technologies Today, new technologies make it possible to realize value from Big Data. Big data technologies can replace highly customized, expensive legacy systems with a standard solution
Tableau and the Enterprise Data Warehouse: The Visual Approach to Business Intelligence
Tableau and the Enterprise Data Warehouse: The Visual Approach to Business Intelligence AUTHOR Dan Jewett Tableau Software DATE 08/01/2009 Organizations focus on acquiring customers, increasing revenues
In-Memory or Live Data: Which Is Better?
In-Memory or Live Data: Which Is Better? Author: Ellie Fields, Director Product Marketing, Tableau Software July 2011 p2 The short answer is: both. Companies today are using both to deal with ever-larger
Why developers should use ODBC instead of native proprietary database interfaces
P RODUCT O VERVIEW Why developers should use ODBC instead of native proprietary database interfaces The financial and technical basis for using ODBC with wire protocol drivers instead of native database
Getting Started Guide
Getting Started Guide Introduction... 3 What is Pastel Partner (BIC)?... 3 System Requirements... 4 Getting Started Guide... 6 Standard Reports Available... 6 Accessing the Pastel Partner (BIC) Reports...
Integrating IBM Cognos TM1 with Oracle General Ledger
Integrating IBM Cognos TM1 with Oracle General Ledger Highlights Streamlines the data integration process for fast and precise data loads. Enables planners to drill back into transactional data for the
