Information Fabric 3.0


For: Application Development & Delivery Professionals

Information Fabric 3.0
by Noel Yuhanna and Mike Gilpin, August 8, 2013; updated August 12, 2013

Key Takeaways

Enterprise Data Integration Challenges Have Grown Increasingly Critical
Integrating business data from the proliferation of data repositories to deliver the unified view needed to support business applications, analytics, and real-time insights has become a nightmare. The explosion of new data sources, including social media, mobile devices, partners, the marketplace, and machine-generated data, aggravates the problem.

Forrester's Information Fabric Uses New Technologies To Broaden Its Impact
Based on recent interviews with customers and vendors, we are updating our reference architecture to include new and enhanced technology supporting a wider range of applications and business requirements. Product implementations have matured and innovated with emerging technologies like big data, streaming and real-time data, APIs, and in-memory computing.

You Need A Data Virtualization Strategy To Succeed; Without One, You Risk Falling Behind
Without data virtualization, you risk knowing less about your customers. You'll get fewer real-time business insights, lose your competitive advantage, and spend more to address data challenges. Firms that invest in data virtualization technologies will respond more quickly, deliver more and better products, and grow faster than their competitors.

Forrester Research, Inc., 60 Acorn Park Drive, Cambridge, MA, USA

August 8, 2013; updated August 12, 2013

Information Fabric 3.0
Forrester's Reference Architecture For Enterprise Data Virtualization
by Noel Yuhanna and Mike Gilpin with David Murphy

Why Read This Report
Enterprises face growing challenges in bridging disparate sources of data to fuel analytics, predictive analytics, real-time insights, and applications. The data explosion is also exacerbating integration, security, performance, quality, and availability issues. Business users need reliable information fast (in real time) to make business decisions, while IT needs to lower costs, minimize complexity, and improve operational efficiency. Eight years ago, Forrester invented the category of data virtualization with our vision of the "information fabric"; these solutions continue to evolve to address the pressing problem of delivering comprehensive capabilities that enable dynamic, real-time data services. In updating our information fabric reference architecture to version 3.0, we reflect these new business requirements and accommodate new technology enhancements supporting big data, cloud, mobility, distributed in-memory caching, and dynamic services. Use information fabric 3.0 to inform and guide your data virtualization strategy.

Table Of Contents
Enterprise Data Integration Challenges Are Increasingly Critical
Information Fabric 3.0 Uses New Technologies To Broaden Its Impact
Information Fabric 3.0 Enables The Rapid Delivery Of Dynamic Services
Information Fabric Use Cases Are Expanding Toward Real Time
The Market Landscape Is Expanding Beyond Traditional Vendors
Recommendations: Information Fabric Should Be Part Of Your Data Management Strategy
What It Means: You Need A Data Virtualization Strategy To Avoid Falling Behind
Supplemental Material

Notes & Resources
Forrester interviewed several vendors and user companies, including Composite Software, Denodo Technologies, IBM, Informatica, and SAP.

Related Research Documents
Deliver On Big Data Potential With A Hub-And-Spoke Architecture, June 12, 2013
Server Virtualization Predictions For 2013, March 15, 2013
Information Fabric 2.0: Enterprise Information Virtualization Gets Real, April 9

2013, Forrester Research, Inc. All rights reserved. Unauthorized reproduction is strictly prohibited. Information is based on best available resources. Opinions reflect judgment at the time and are subject to change. Forrester, Technographics, Forrester Wave, RoleView, TechRadar, and Total Economic Impact are trademarks of Forrester Research, Inc. All other trademarks are the property of their respective companies. To purchase reprints of this document, please e-mail clientsupport@forrester.com.

ENTERPRISE DATA INTEGRATION CHALLENGES ARE INCREASINGLY CRITICAL

For decades, firms have deployed enterprise applications on independent databases, supporting custom data models, scalability, and performance while speeding delivery. It's become a nightmare to try to integrate the proliferation of data from these applications in order to deliver the unified view of business data required to support new business applications, analytics, and real-time insights. The explosion of new data sources, including social media, mobile devices, partners, the marketplace, and machine-generated data, further aggravates the problem. In a recent survey, data management professionals cited data integration as the second most challenging issue after performance, which is not surprising given the increase in data volume, variety, and velocity (see Figure 1).

Poorly integrated business data often leads to poor business decisions, reduces customer satisfaction and competitive advantage, and slows product innovation, ultimately limiting revenue. Business decision-makers place a very high priority on improving the use of data and analytics to improve business decisions and outcomes (see Figure 2). Our interviews with several organizations that have implemented data virtualization show that business stakeholders are demanding faster access to information that's consistent, reliable, and trusted across the enterprise. Business users want real-time business intelligence (BI), predictive analytics, customer sentiment analysis, and custom dashboards and reports to support enhanced business decisions. All of these scenarios depend on integrated and trusted data, which IT has to deliver. Yet many traditional data integration technologies, such as enterprise application integration (EAI), enterprise information integration (EII), and data federation, lack the agility that today's fast-moving environment requires.
This gap between business demand and data integration capability is the primary motivation for data and solution architects to innovate with the collection of technologies that Forrester labels the information fabric. We're updating our information fabric framework and reference architecture to version 3.0 to accommodate a range of new business requirements and corresponding technology capabilities:

The continuing data explosion demands a new data management approach. Traditional structured data continues to grow rapidly. In addition, mobile, social, and web applications generate far more data than traditional applications, and the need to integrate geolocation, sensor, clickstream, RFID, and blog data to support predictive and real-time analytics just ups the ante. Traditional means of data integration create so much duplicate data that they're now more of a problem than a solution, requiring enterprises to embrace new approaches to data management that help moderate rather than exacerbate data growth.

"Our data stores have become so large that it is becoming very difficult to extract the information that we want for our reporting, analytics, and insights in a reasonable amount of time. We are experiencing a tremendous data explosion for some of our highly critical applications, with some data repositories growing by more than 100% annually." (Director of IT, North American telco)

Tougher compliance requirements drive a growing need for data protection. Facing these requirements amid an increased threat level for both internal and external intrusions, data and security architects are seeking new ways to enhance data security measures that support all types of data. Today, many mission-critical data repositories are highly vulnerable due to poor authentication practices, the use of default configurations and logins, and weak data access control measures. Compliance mandates such as PCI, HIPAA, and GLBA also require firms to implement stronger global data security measures. [1]

"We have more than a thousand data repositories that are critical for our business to function, many of which contain highly sensitive information. We can't say that our data is 100% safe or protected today, but we are improving. Our goal is to centralize access to all information and provide a granular data access facility." (IT manager, North American retailer)

Business stakeholders demand access to more real-time information. As the pace of business accelerates, real-time business insight becomes critical, motivating business decision-makers to greatly increase spending on real-time analytics and big data (see Figure 3). Mobility is changing the way we interact and deal with information. Data mobilization enables rich interactions and advanced analytics in real time using devices such as tablets, smartphones, wearables, and other emerging mobile technologies. [2] But IT organizations struggle to respond quickly when these new apps need access to integrated data.

"Before, when we called our systems real-time, it was only real-time for 40% of our existing data. For the remaining 60%, we manually drill in through the databases and data warehouses (DWs) to get to the data, which is time-consuming and challenging. We are trying to move toward real time; hopefully, in 12 to 18 months we will have a real-time data virtualization platform that can help deliver new insights, grow our business, and give us a competitive advantage. That's our goal." (Director of IT, North American manufacturer)

External data sources grow to dominate the integration landscape. Many firms still focus largely on integrating internal data, despite the profusion of public and external data sources, including credit reports, property records, government open-data initiatives, and consumer-generated content on social media like Facebook, Twitter, LinkedIn, and Instagram. [3] Mobile apps on smartphones and tablets collect even more data, such as geolocation. This treasure trove of information can deliver deep new insights into customer and market behavior but requires new, open, cloud-friendly means of integration, such as APIs supporting the Open Data Protocol (OData), to become a core part of your firm's data integration approach. [4]

"We have many silos of information; across the life cycle of a piece of information, we have some overlap, derivatives, and enrichment of data going across different lines of business. Something that's relatively new to us is that we've now started to collect consumer data from many internal and external sources. It's an incredible amount of information but

we still lack the ability to perform real-time and predictive analytics on it." (Director of IT, North American entertainment firm)

Figure 1: Integrating Data Has Become More Challenging Than Ever
"How challenging are the following database management issues to your organization?" (Respondents indicating that an issue is "extremely challenging" or "somewhat challenging")

Delivering improved performance: 80%
Integrating data: 75%
Lack of people resources: 71%
Securing private data: 71%
Delivering higher availability: 69%
High data volume growth: 69%
Upgrading databases: 67%
Too many databases: 63%
Migrating databases: 63%
Lack of a database budget: 61%
Very large databases: 57%
Lack of system resources: 55%
Too many security patches to deploy: 52%

Base: 104 database management professionals (multiple responses accepted)
Source: February 2013 Global Database Management Online Survey
Source: Forrester Research, Inc.

Figure 2: Business Decision-Makers Are Prioritizing The Improved Use Of Data And Analytics
"Which of the following technology initiatives will you or your team ask IT to prioritize over the next 12 months?"
(Values: not on our agenda / low priority / moderate priority / high priority / critical priority / don't know)

Improve the use of data and analytics to improve business decisions and outcomes: 4% / 10% / 28% / 37% / 18% / 2% (high or critical priority: 55%)
Create a comprehensive strategy and implementation plan for public cloud and other as-a-service offerings: 23% / 22% / 24% / 18% / 7% / 6% (high or critical priority: 25%)
Implement a bring-your-own-device strategy: 35% / 23% / 20% / 13% / 4% / 4% (high or critical priority: 17%)
Improve IT project delivery performance: 7% / 13% / 33% / 32% / 11% / 4% (high or critical priority: 43%)
Improve IT budget performance: 7% / 16% / 37% / 27% / 8% / 5% (high or critical priority: 35%)

Base: 3,616 business budget decision-makers in North America, Europe, Asia Pacific, and Latin America (percentages may not total 100 because of rounding)
Source: Forrsights Business Decision-Makers Survey, Q
Source: Forrester Research, Inc.

Figure 3: Business Stakeholders Are Investing More In Big Data And Real-Time Analytics
"How do you expect your group's or department's spending on the following technologies and services to change in 2012?" (Respondents expecting to increase spending by 5% or more)

Big data solutions (N = 254): 68%
Real-time predictive business and customer analytics (N = 326): 63%
Software, security, data or business intelligence, infrastructure, or other as-a-service offerings (N = 614): 53%
Application development for mobile or tablets (N = 542): 71%
Packaged software applications other than collaboration (N = 1,011): 51%

Base: business decision-makers who expect to directly purchase each technology (percentages may not total 100 because of rounding)
Source: Forrsights Business Decision-Makers Survey, Q
Source: Forrester Research, Inc.

INFORMATION FABRIC 3.0 USES NEW TECHNOLOGIES TO BROADEN ITS IMPACT

In the early days of the information fabric (version 1.0), firms had to manually integrate a range of point technologies to create a fabric-like platform; the reference architecture represented an implementation pattern more than a market category. In version 2.0, this pattern became commercialized in a range of products from Composite Software, Denodo Technologies, IBM, and Informatica that provided more integrated solutions. Since then, mainstream usage has grown to the point that the CTO of an investment firm remarked that he couldn't imagine why anyone would not use data virtualization if they could.
Product implementations have also matured and become more innovative, embracing emerging trends like big data; streaming and other real-time data; APIs; the increasing use of distributed in-memory approaches and elastic caching; and stronger integration with appliance and cloud form factors. Based on our latest round of interviews with customers and vendors, Forrester is again updating the information fabric reference architecture to support a wider range of applications and business requirements using these new and enhanced technology capabilities. The new architecture is designed to guide information and application architects in creating a fabric tailored to their firm's requirements that can support application needs for the next several years. Forrester defines information fabric 3.0 as:

The integration of any data in real time or near real time from disparate data sources, whether internal or external, into coherent data services that support business transactions, analytics, predictive analytics, and other workloads and patterns.

Regardless of the physical location of data, any application, process, tool, or user can make one simple request to get a comprehensive view of business data. The information fabric integrates various data sources in real time and maintains data integrity within its framework. It minimizes complexity and hides heterogeneity by embodying a coherent model of data that reflects business requirements rather than the details of underlying systems and sources. The fabric can adapt to changing business policies and rules while delivering trusted data. It allows for centralized administration of its distributed in-memory resources, metadata repositories, storage, policies, access, and processing functions. It supports all types of data, whether structured, semistructured (XML/JSON), or unstructured (e-mails, images, audio, video, or content), and integrates all of these forms of information into a single smart data service.

Information fabric 3.0 offers several key capabilities, including (see Figure 4):

Enabling real-time data sharing within an enterprise or with customers and partners. The information fabric enables data sharing between peers, employees, partners, and customers. It allows any application, process, dashboard, tool, or user to access any business data, regardless of where the data is physically or logically located and regardless of the data format, whether structured, semistructured, or unstructured. Applications, processes, and tools can simply make standard SQL, XQuery, SOAP, REST, ODBC, or JDBC calls to the information fabric to access, insert, update, or delete business data.
The fabric focuses on eliminating excessive duplication of data across the organization and offers consistent, trusted data for all enterprise applications.

Supporting more complex business transactions. Unlike EII and EAI, data virtualization is more than a data integration technology; it's a data management framework that supports scalable transactional applications and workloads. Although we estimate the current adoption of transaction-focused information fabric to be less than 10%, adoption is accelerating, especially where organizations find it difficult to support unpredictable web-scale applications and workloads. The fabric delivers an elastic data platform flexible enough to support more users and data by scaling out additional system resources, whether on-premises or in the cloud.

Supporting a self-service data platform for IT and business users. Although most of the vendor solutions are still ramping up their self-service capabilities, enabling self-service for business users and IT is an increasingly important use case for the information fabric. The best solutions for enabling self-service provide tools designed for end users that support easy discovery and navigation of the data offered within the fabric, in forms ranging from easily navigable REST APIs to integration with self-service BI tools such as those from Microsoft, SAP, SAS, Tableau Software, and Tibco Software. [5]
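To make the access pattern above concrete, the following Python sketch shows what "one simple request against a virtual view" might look like behind the scenes: a hypothetical data service federates a relational source and a document-style source into a single customer view. All class and source names are invented for illustration and do not correspond to any vendor's product or API.

```python
import sqlite3

# Hypothetical sketch: a "customer" data service that federates two
# physically separate sources behind one logical view.

class CustomerDataService:
    """Resolves one logical request against multiple physical sources."""

    def __init__(self):
        # Source 1: a relational store (stand-in for an ERP database).
        self.erp = sqlite3.connect(":memory:")
        self.erp.execute("CREATE TABLE orders (cust_id TEXT, total REAL)")
        self.erp.executemany("INSERT INTO orders VALUES (?, ?)",
                             [("c1", 120.0), ("c1", 80.0), ("c2", 42.5)])
        # Source 2: a document-style store (stand-in for a CRM or SaaS feed).
        self.crm = {"c1": {"name": "Acme Corp", "segment": "enterprise"},
                    "c2": {"name": "Widget Ltd", "segment": "smb"}}

    def get_customer_view(self, cust_id):
        # One simple request; the service hides source heterogeneity.
        profile = self.crm.get(cust_id, {})
        total = self.erp.execute(
            "SELECT COALESCE(SUM(total), 0) FROM orders WHERE cust_id = ?",
            (cust_id,)).fetchone()[0]
        return {**profile, "cust_id": cust_id, "lifetime_spend": total}

svc = CustomerDataService()
print(svc.get_customer_view("c1"))
# {'name': 'Acme Corp', 'segment': 'enterprise', 'cust_id': 'c1', 'lifetime_spend': 200.0}
```

In a real fabric, the same resolution would happen behind a SQL, REST, or JDBC endpoint rather than a Python method, but the design point is the same: the consumer never addresses the physical sources directly.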

Leveraging new approaches to improve high-performance data access. Previous generations of distributed data access middleware were hampered by the lack of today's generation of massively parallel distributed in-memory technologies, which now enable the fabric to process, store, and access more information more quickly. In addition, the information fabric can use compute fabrics such as Hadoop, grid, cloud, and virtualized platforms to offload the processing of appropriate workloads to complement distributed in-memory frameworks. One large global energy firm paired its fabric with a farm of analytics appliances, simplifying access to a treasure trove of operational data and tripling its effective usage.

Providing an additional layer for automation. The information fabric is configured with metadata representing business and technical policies for data management across the enterprise. This enables greater automation of data integration, access, and management, thereby reducing administration effort, which has been a major source of operational cost in running previous generations of distributed data access middleware. It also brings greater agility; firms can often satisfy new information requirements by simply and rapidly configuring this metadata, reducing time-to-value from weeks to hours.
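The metadata-driven agility described above can be sketched in miniature: a generic engine applies declarative field mappings, so satisfying a new information requirement becomes a configuration change rather than new integration code. The mapping structure and all source and field names below are illustrative assumptions, not any vendor's metadata format.

```python
# Minimal sketch of metadata-driven integration: the engine is generic,
# and onboarding a new source means adding a mapping entry, not code.

# Metadata: per-source rules mapping raw fields onto a canonical model.
MAPPINGS = {
    "legacy_crm": {"customer_name": "name", "seg": "segment"},
    "partner_feed": {"acct": "name", "tier": "segment"},
}

def to_canonical(source, record):
    """Project a raw record onto the canonical model using metadata only."""
    rules = MAPPINGS[source]
    return {canon: record[raw] for raw, canon in rules.items() if raw in record}

print(to_canonical("legacy_crm", {"customer_name": "Acme", "seg": "enterprise"}))
# {'name': 'Acme', 'segment': 'enterprise'}

# Satisfying a new information requirement later is pure configuration:
MAPPINGS["saas_app"] = {"display_name": "name", "band": "segment"}
print(to_canonical("saas_app", {"display_name": "Widget", "band": "smb"}))
# {'name': 'Widget', 'segment': 'smb'}
```

Production fabrics hold far richer metadata (policies, lineage, taxonomies), but the time-to-value argument rests on exactly this separation of rules from engine.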
Figure 4: A Logical View Of Information Fabric 3.0. The diagram shows consumers (applications, processes, dashboards, visualization, reports, queries, transactions, predictive analytics, and other workloads and patterns) sitting on top of a real-time, distributed in-memory data platform with in-memory analytics, transaction processing, and other compute workloads; a data services platform providing quality, data modeling, discovery, transformation, governance, security, catalog, profiling, and events; derived data sources and external processing (BI, DBMS, data integration via EII, ETL, and ESB, EDW, data marts, data rationalization, data transformation, ODS, data quality, MDM, hub-and-spoke data hubs, Hadoop, MPP EDW, and NoSQL, i.e., big data); traditional data sources (CRM, ERP, and other apps); new data sources (social media, sensors, geolocation, and devices); and external data (social, marketplace, SaaS, and public cloud data). Source: Forrester Research, Inc.

INFORMATION FABRIC 3.0 ENABLES THE RAPID DELIVERY OF DYNAMIC SERVICES

The information fabric is not simply data federation or caching software; it's a data virtualization framework comprising many components, including data movement and staging, data security, data quality, data profiling, data modeling and mapping, in-memory processing, data governance, metadata, external data processing, and data discovery (see Figure 5). The evolution of new or improved capabilities in information fabric 3.0 is occurring across all of these components.

Figure 5: Information Fabric 3.0 Reference Architecture. The enterprise data virtualization reference architecture shows consumers (business intelligence, any app, predictive analytics, dashboards, search, transactions, processes, and messaging) connecting through XML, SOAP, SQL, ODBC, JDBC, HTTP, and REST APIs to a real-time, self-service information access layer. Below it sit monitoring and role-based tools; replication, streams, and real-time events; data security (access, audit, masking); an elastic, intelligent in-memory data fabric (dynamic optimization and execution); transaction management; data quality; a canonical model/logical data model; connectors/APIs; other workload/pattern management; data profiling; analytics management; data transformation; dynamic discovery; data governance; external processing and data hub (Hadoop, NoSQL, EDW, i.e., big data); and metadata (business rules, catalog, taxonomy, lineage). The fabric supports any data: structured, unstructured, semistructured, big data, and cloud, spanning audio, video, images, text, the web, social media, blogs, CRM, ERP, SCM, devices, sensors, and RFID data. Source: Forrester Research, Inc.

Elastic Intelligent In-Memory Resources Are The Core Of Information Fabric 3.0

The information fabric's use of elastic in-memory resources in version 3.0 goes beyond data caching, with a unified distributed in-memory data fabric that spans nodes, servers, and geographical locations. The unified in-memory data platform is not about duplicating cached data on every server in the compute farm.
Rather, it focuses on intelligently managing fabric data at runtime: knowing what data is cached and where, based on actual patterns of data access and usage, in order

to optimize runtime performance to conform to service-level policies. Regardless of where the data resides, the information fabric seamlessly integrates data from many physical servers and locations to deliver a single trusted view. To provide a complete information fabric solution, vendors must integrate several key components, updated in version 3.0, with the distributed in-memory framework, including:

Canonical and logical modeling speeds implementation of business views and policies. Support for model-driven development and management of the fabric layer is one of the most improved capabilities of data virtualization solutions in recent years, greatly increasing the utility and automation of fabric solutions. Integrating such tools with your fabric takes it beyond middleware to deliver a key information management capability for the enterprise. Forrester defines a canonical information model as: "A model of the semantics and structure of information that adheres to a set of rules agreed upon within a defined context for communicating among a set of applications or parties." [6] Given that the whole purpose of an information fabric is communicating among applications or parties, it's easy to see the crucial role a canonical model plays. Indeed, most of the data virtualization users we interviewed have also made canonicalization a key part of their implementation. [7] Tools that automate this process, from information discovery to model mapping to generation of physical artifacts to model management to model versioning and change management, are critical to success with larger data fabrics. The most advanced tools help large organizations deal with more complex scenarios, including model federation, semantic modeling, and integration with third-party modeling tools and metadata repositories.

Transaction management orchestrates information to sustain data integrity.
Although most past information fabric deployments were read-only, adoption of transactional applications is growing rapidly, often driven by the growth of mobile and cloud apps. Transaction management in the information fabric orchestrates data within the in-memory data fabric and between the fabric and sourced master data repositories, whether internal or external, to ensure integrity once applications commit or roll back a unit of work. Many new applications using the information fabric for transactions treat fabric data as master data, persisted to a local store for recoverability. The fabric keeps traditional data sources in sync using two-phase commit, the XA protocol, long-running transactions, or an eventually consistent model, depending on the transaction quality of service required.

Analytics management supports real-time and predictive analytics. The information fabric also offers the ability to assemble several sources of data for aggregation, cube, or ROLAP analysis, or to simply deliver pre-aggregated data to consumers. Analytics management ensures

that data has undergone the necessary transformation, quality checks, and aggregation to support real-time and predictive analytics. It integrates with in-memory analytics to support mathematical, geographical, statistical, and custom computational algorithms. Analytics management also integrates with third-party analytics and predictive analytics vendor solutions such as those offered by IBM, KXEN, Oracle, SAP, SAS, StatSoft, and Tibco Software. [8]

Support for other workloads and patterns enables specialized applications. Pharmaceutical, biotech, genomics, and other organizations with advanced use cases require fabrics to accept computational jobs packaged along with the data on which they operate. Advanced information fabrics support this requirement right in the main data fabric with a general-purpose capability for work packages. This capability can also be used to mimic the old idea of a stored procedure whenever optimum performance requires computation to take place as close to the data as possible. This approach is especially relevant to work that can be partitioned onto many nodes for massively parallel execution, including text analytics.

Data profiling, transformation, and quality have evolved to support dynamic use cases. Unlike past information fabric architectures, which relied on batch-oriented data profiling, transformation, and data quality, new and more dynamic use cases require new capabilities. Dynamic data profiling examines data and metadata from existing data sources and collects information about the data in real time, increasing transparency about the data's provenance and reliability. Dynamic transformation implements information fabric policies to change or enhance data in real time based on events or context, such as weather, stock market performance, customer behavior, or location.
Finally, dynamic quality cleanses information on the fly based on policies, configurations, or events by alerting on, flagging, or removing data that fails to meet the defined data requirements.

Data Source Integration Expands To Enlarge The Scope Of The Information Fabric

Data sources are the foundation of any information fabric deployment. Past information fabric implementations focused primarily on on-premises structured data found in databases, data marts, and DWs. Information fabric 3.0 increases the focus on external data, such as social media, marketplaces, and SaaS and cloud sources, and on unstructured and semistructured data formats. For an information fabric solution to provide a comprehensive means of incorporating a wide range of data sources, it must include several key components:

APIs connect your information fabric to the world. One API expert we interviewed estimated that more than 80% of the APIs his team delivers focus on information delivery; we estimate that the average is 60% or higher. So it's no surprise that firms using data virtualization and delivering APIs find that the two go together well. Most new APIs today use REST and often JSON, but enterprise-oriented fabric solutions must also support SOAP and messaging to

meet the full range of API requirements most large firms face, often for the same interface in multiple forms. [9] The evolution of the API economy increasingly places your fabric in the middle of many incoming and outgoing APIs, almost like a switch or router in the cloud between your firm and your partners. The CTO of a major healthcare firm commented that this pattern will dominate his industry in the future. [10]

Data connectors and adapters remain critical for conventional sources. Like other data integration solutions, the information fabric comes with a wide range of prebuilt connectors and adapters for enterprise data sources and applications, along with tools for building additional connectors and adapters for custom applications or other unique requirements. These typically build on conventional SQL-based data access standards such as JDBC and ODBC, but they also support the standards required by off-the-shelf applications such as SAP. In maturing to information fabric 3.0, these connectors and adapters often include additional support for real-time use cases, including introspection and dynamic formats, and for new or emerging standards such as JSON, XML, SOAP, REST, OData, and linked data.

Dynamic discovery has become essential to faster time-to-value. Today, enterprises are building new data services every week to support mobile apps or custom analytics that deliver new insights critical to competitive advantage. Integrating new data sources was more challenging in the past because data integration was never dynamic, requiring predefined access paths to sources. Adding a new source often required rewriting queries and establishing new access paths. Dynamic discovery automates the discovery of new internal or external data sources and presents them as new services for integration with existing ones.
Dynamic discovery can scan networks, clouds, external sources, and data-as-a-service marketplaces to offer services that will dynamically extend your information fabric.

Metadata is the heart and brain of the information fabric. An information fabric represents virtual data with metadata constructs, enabling teams to dynamically construct or alter data access to support real-time queries and reports. Integrating this metadata with other

metadata from other information fabrics enables a global information platform (see Figure 6). Metadata must be extensible to support new applications and data sources, whether internal or external, structured or unstructured, to support agility. When integrating more than one data fabric, metadata must be synchronized in real time among all platforms participating in the information fabric. This metadata must be able to describe not only the data model but also the data access path and the lineage of derived information.

Figure 6: Global Information Fabric With Data Intelligence. The diagram shows multiple information fabrics (social, marketplace, partner, and line-of-business fabrics) connected to an enterprise-focused, IT-managed information fabric to form a global platform. Data domains are content related to a specific topic, such as financial data or customer data. Source: Forrester Research, Inc.

Data Movement And External Processing Frameworks Are Also Evolving

Components that move data across layers provide the key capabilities of an information fabric. Past information fabric solutions primarily moved data in batches; however, information fabric 3.0 often moves data in real time or near real time. It also strives to move the processing closer to the data, especially with big data. Information fabric 3.0 integrates with Hadoop, HBase, grid, cloud, and virtualized platforms to support external data processing, creating additional data movement requirements. To support all these requirements, information fabric 3.0 now includes:

External data processing and hub-and-spoke patterns to extend the platform. Although the information fabric supports distributed in-memory execution, local caching, and fabric data persistence to databases and files, it can also leverage external compute grids, Hadoop clusters, HBase, NoSQL databases, virtualized platforms, traditional databases and DWs, and data hubs to store and process larger, more complex data sets. 11 For example, firms offload large amounts of clickstream, blog, and sensor data and other data sets to Hadoop or appliance clusters, which process and aggregate the information before loading the results into the information fabric's distributed in-memory platform. An information fabric also makes an ideal integration mechanism for making the connections required when implementing hub-and-spoke patterns. 12

Data movement components that adapt to new requirements. The information fabric sources data from databases, applications, logs, files, clickstreams, streams, sensors, devices, and other sources. Depending on the requirements, data from these sources can be moved in batches or in near-real-time or real-time streams. To support this variable data flow, the information fabric integrates with several technologies, such as extract, transform, load (ETL); change data capture; replication; and streams. It's often no longer necessary to source raw data directly from data sources; after transformation, aggregation, and integration, data can also be sourced from compute platforms such as hub-and-spoke, Hadoop, grid, cloud, and virtualized platforms.

Data Security, Self-Service, And Administration Extend To Support Larger Fabrics

Past information fabric solutions often lacked strong security features like comprehensive auditing and data masking. In addition, self-service features were very basic and focused primarily on IT staff.
With information fabric 3.0, data security is extended to support real-time data obfuscation, integrated security across fabrics, comprehensive auditing, and data-at-rest and data-in-motion encryption. In addition, self-service capabilities are increasingly designed for business users and analysts, offering them the ability to create, access, manage, and consume data services. This means that vendors must extend their solutions to include:

Comprehensive data security features that are tightly integrated. The information fabric delivers integrated and centralized data access, eliminating the need to manage authentication, authorization, and access control for each data source. Over the past three years, Forrester has found several enterprises that moved to data virtualization primarily to improve their data security and meet regulatory compliance requirements. The information fabric enables centralized data access and control, enforcing a stricter level of data security than traditional applications and databases have. With the information fabric, each person's access to information is based on a confirmed identity in a specific usage context, a more granular level of access control. In addition, the information fabric offers tight integration with other security frameworks, such as dynamic data masking, auditing, and encryption, to deliver highly secure data virtualization from the ground up.
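The dynamic data masking described above can be illustrated with a short sketch. The rules, roles, and masking scheme below are hypothetical, not any vendor's actual implementation: the same virtual view serves every consumer, and restricted columns are obfuscated at query time based on the caller's role.

```python
# Hypothetical sketch of dynamic data masking in a virtual data layer.

# Column -> set of roles allowed to see the clear value (illustrative).
MASK_RULES = {
    "ssn": {"compliance_officer"},
    "salary": {"hr_admin", "compliance_officer"},
}

def mask(value):
    """Obscure all but the last two characters of a value."""
    text = str(value)
    return "*" * (len(text) - 2) + text[-2:]

def query_view(rows, role):
    """Return rows with restricted columns masked for this role."""
    masked_rows = []
    for row in rows:
        out = {}
        for column, value in row.items():
            allowed = MASK_RULES.get(column)
            out[column] = value if allowed is None or role in allowed else mask(value)
        masked_rows.append(out)
    return masked_rows

rows = [{"name": "Ana", "ssn": "123-45-6789", "salary": "90000"}]
print(query_view(rows, role="analyst"))             # ssn and salary masked
print(query_view(rows, role="compliance_officer"))  # clear values
```

A real implementation would pull the rules and the caller's identity from the fabric's centralized access-control metadata rather than a hardcoded table.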

Complete monitoring of the health of the fabric. Monitoring the health of an information fabric is critical; there are many components that could stall or break, disrupting the information flow. Information fabric monitoring ensures that processes, connectors, interfaces, distributed processing engines, applications, and data sources are available to service any request. If a data source becomes unavailable, the information fabric can use alternate data sources. In addition, the parallel architecture of the information fabric provides the redundancy required to ensure continuous availability and automated elastic scaling to maintain delivery service levels.

Role-based tooling that focuses on simplicity and extends coverage to end users. Developers and data architects built (and were the main users of) early information fabric implementations to support data virtualization, data services, and the applications that consume them. As the information fabric moved into the mainstream, tools and capabilities supported many roles, enabling broader usage and greater ROI. Role-based tools went beyond developers and data architects to support information security pros, data stewards, enterprise architects, and database and infrastructure administrators. In version 3.0, role-based tools are expanding to include nontechnical roles such as business analysts and business users, enabling a self-service information fabric and self-service BI.

Information Fabric Use Cases Are Expanding Toward Real Time

In the past, the information fabric focused primarily on batch staging of integrated data to support queries, reporting, and basic analytics for customer relationship management (CRM), enterprise resource planning (ERP), and other custom applications. However, recent deployments increasingly support real-time use cases such as fraud detection, customer analytics, collaboration, social and mobile applications, and other real-time analytics.
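The shift from batch staging toward real-time delivery often rests on change data capture, one of the data movement technologies listed earlier. A minimal, purely illustrative sketch of the idea (all names invented) uses a version high-water mark so that each sync pass moves only what changed since the last one, instead of re-reading the whole source:

```python
# Illustrative change-data-capture loop: each sync pass copies only rows
# whose version counter advanced past the high-water mark.

source = []        # simulated source table; each row carries a version
target = {}        # materialized copy, keyed by primary key
last_synced = 0    # high-water mark: highest version already applied

def write(pk, data):
    """Simulate a source-system write, bumping the version counter."""
    next_version = max((row["version"] for row in source), default=0) + 1
    source.append({"pk": pk, "data": data, "version": next_version})

def sync():
    """Apply only changes newer than the high-water mark; return count."""
    global last_synced
    changes = [row for row in source if row["version"] > last_synced]
    for row in changes:
        target[row["pk"]] = row["data"]
        last_synced = row["version"]
    return len(changes)

write(1, "order placed")
write(2, "order shipped")
first_pass = sync()     # moves both rows
write(1, "order delivered")
second_pass = sync()    # moves only the one changed row
```

Run continuously at a short interval, this pattern approximates the near-real-time movement the report describes; production CDC tools read database logs rather than polling version columns.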
Firms get the greatest benefit from the information fabric with applications, processes, or users that need real-time access to widely distributed data obtained from heterogeneous internal and external sources. In addition, the information fabric has expanded beyond analytics and transactional workloads to support other workloads in pharmaceuticals, healthcare, intelligence, fraud detection, and risk management, offering a next-generation data management platform today.

We're still in the early stages of global information fabrics that intelligently connect multiple fabrics inside and outside a firm. However, some large organizations in financial services, government, and telecommunications are integrating multiple information fabrics centered on a particular domain, such as customer, partner, or financial data. Forrester expects that, in the coming years, these fabrics will connect lines of business within these firms, while also expanding to include partners, marketplaces, and social fabrics. Information fabric solutions will also become increasingly intelligent, using more semantic metadata to integrate data more dynamically. 13 Forrester has found many different emerging use cases for information fabric 3.0.
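The unified, real-time view these use cases depend on can be sketched in miniature. In the hypothetical example below, two in-memory SQLite databases stand in for separate CRM and billing systems; the virtual view joins them at query time rather than copying the data, and every schema and name is invented for illustration:

```python
# Hypothetical sketch of a virtual (federated) view: two independent
# stores are queried and joined on demand, so consumers see one unified
# customer record without the underlying data ever being copied.
import sqlite3

# Stand-in for a CRM system.
crm = sqlite3.connect(":memory:")
crm.execute("CREATE TABLE customers (id INTEGER, name TEXT)")
crm.execute("INSERT INTO customers VALUES (1, 'Acme Corp')")

# Stand-in for a separate billing system.
billing = sqlite3.connect(":memory:")
billing.execute("CREATE TABLE invoices (customer_id INTEGER, amount REAL)")
billing.executemany("INSERT INTO invoices VALUES (?, ?)",
                    [(1, 250.0), (1, 100.0)])

def customer_360(customer_id):
    """Resolve the virtual customer view at query time, across sources."""
    name = crm.execute(
        "SELECT name FROM customers WHERE id = ?", (customer_id,)
    ).fetchone()[0]
    total = billing.execute(
        "SELECT SUM(amount) FROM invoices WHERE customer_id = ?",
        (customer_id,)
    ).fetchone()[0]
    return {"name": name, "total_billed": total}

print(customer_360(1))  # unified record assembled from both sources
```

Because the join happens at query time, a consumer always sees current billing data; a data virtualization product adds the metadata, caching, optimization, and security layers around this basic idea.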

A Retailer Uses SAP To Deliver On-Demand Customer Analytics

Forrester spoke with an IT manager at a large North American retailer to understand how the firm deals with data integration and data management challenges. The growth of data from a few terabytes four years ago to tens of terabytes today has been a key concern for the company. Data from clickstreams, social media, and blogs has only added to the challenge. Data was spread across several databases, clusters of logs, and other repositories of unstructured data. The firm wanted real-time access to billing, revenue, and other customer data in order to better understand its customers and their usage patterns, enable competitive differentiation, improve business agility, and enhance the customer experience. It wanted answers to questions like "Why did a customer spend more than 2 minutes on the details of the product but not end up buying it?", "Which products are often sold together and why?", and "What other products did the customer browse before buying the product?" The retailer wanted real-time financial insights and immediate access to billed and unbilled revenue, as well as the ability to upsell and cross-sell products while the customer was on the call.

"Our applications, reports, and systems were never designed to support real-time analytics, so it was extremely challenging for us at first. We had to raise the bar from having data integrated and reported 24 hours later to doing it in closer to real time. It's like transitioning from a bicycle doing 10 km per hour to a Formula 1 race car that can handle more than 200 km per hour." (IT manager, large North American retailer)

This retailer chose SAP HANA and SAP Data Services to deliver a virtualized data layer. The retailer was able to extract data from many sources, including social media, blogs, and a CRM application, in real time to support on-demand customer analytics.
While the compressed data integrated into HANA so far takes up close to 8 TB, the firm plans to add more data, such as geolocation and customer preferences, to enable even deeper insights.

"We were able to get real-time insights into our products, customers, and business; it's been a game-changer. Thanks to the SAP HANA platform, we now know which products will have higher demand and when, can upsell and cross-sell more products to customers with a higher success rate, and are better able to target discounts and promotions." (IT manager, large North American retailer)

Informatica Enables Actionable Operational Intelligence At A Large Pharmaceutical Firm

Forrester spoke with an IT director at a large pharmaceutical company in North America with more than 10,000 employees. When other firms need to take new drugs to market, this firm runs clinical trials on them and submits the data to the US Food and Drug Administration. Both operational and clinical data have grown over the years. The firm needed to automate this flow of data so it could deliver it more often and improve operational efficiency. In the past, the firm used custom coding to

support data integration and movement, but this approach did not deliver the required scalability or performance and was unable to support unexpected increases in usage. The firm was taking three to six months to define requirements, extract data, and validate results, results that quickly became obsolete. Delivering new feeds required considerable rework to generate and test data flows.

"Multiple customers wanted our data, but not all of it was in our DW and databases. So we considered pulling the information together from multiple sources and collating it in a custom way for each of our partners. Each wanted to see different information; while 80% of the information is in our DW, 20% is elsewhere but is crucial for them to have. We started to write custom B2B feeds, but the cost to maintain them was huge. That's when we started to look around for a new approach." (IT director, North American pharmaceutical firm)

The firm chose the Informatica Data Services platform to support its data virtualization strategy. It created virtual data objects to connect to any data quickly without data refreshes. It was able to quickly map to the virtual (canonical) schema, allowing developers to visually inspect records and make changes as needed. Support for real-time profiling and preview enables data validation and speeds up development.

"With the data virtualization solution in place, we're now working on other use cases that are very interesting to us, such as exploratory data warehousing. These users don't want everything; they just want some information, with certain dimensions. These are conceptually like data marts, but we don't want to make them physical, so we create a data virtualization sandbox. We also have incoming data in many different forms, some of it particularly challenging, like patient data collected via interactive voice response, and we're now moving this into a consistent canonical form." (IT director, North American pharmaceutical firm)

Composite Software Helps Deliver A Virtual Data Layer For A Large Financial Company

Forrester spoke with a senior VP from a global investment services firm that provides investment products and services in more than 40 countries. The firm aims to find the right mix of products and services for its institutional customers via portfolio and investment management and plan administration solutions. The firm's advisors guide customers through strategic planning and implementation and help them evaluate the results. To accomplish this, the investment management and research team delivers research data and information to more than 300 employees, requiring the building of a research database using data from dozens of disparate sources and in several formats. To serve employees in a timely fashion with limited resources, developers were using various prebuilt SQL views to gain a certain level of efficiency. These views were functional but extremely resource-intensive and cumbersome to maintain.

"We wanted to federate data but avoid heavy lifting with ETL. Our goal was to expose the data they collect via internal services and join it together to build a data-as-a-service capability." (Senior vice president, global investment services firm)

The firm decided to use data virtualization technology to provide a single, unified, reusable view for real-time data access, improving performance and time-to-market while greatly reducing initial setup and ongoing support costs. The customer found the Composite Software solution easy to use to support high-performance queries; developers can obtain multiple views with only a few hours of work. In the previous legacy SQL environment, reaching performance goals required weeks of tuning. Developers can also create consolidated views that contain all of the fields they need and consistently return data in the form they require, making it easier to respond quickly to changes requested by the business. The firm delivers these views from a canonical model composed of 37 entities. With the data accessible in such a logical, easy-to-use structure, users without extensive research database or SQL knowledge can now easily query this view using familiar tools such as Excel, Cognos, and SQL Server Reporting Services. Composite helps the firm save more than $2 million per year in productivity improvements and achieve a 1.5% revenue increase through faster access to information by key business managers. Today, more than 1 million requests are made to the data virtualization layer each week; the demand for enterprise data exchange and integration is driving this.

"We are looking at how we can use data virtualization to support people doing makeshift research (that data typically doesn't get stored anywhere; it gets used once and then is lost) and aim to expand to more use by Excel jockeys, cowboys that come up with data to support an investment thesis. If we can start to draw those users into a virtual environment, then when they do find something that has a need for persistence, they can more easily promote that into the enterprise environment." (Senior vice president, global investment services firm)

Denodo Helps An Investment Management Firm Turn Data Into Information

Forrester spoke with the vice president of data and application architecture in the corporate technology group of one of the largest investment management firms in the US. The firm was finding it challenging to manage its growing data volumes and provide business users all the data they required in a consistent way. The firm's enterprise data program defines a unified view of data; the business provides data stewardship, while IT is responsible for building the data virtualization framework.

"We have a whole series of subject-area data stores. Rather than requiring everyone to know where they are and how to access them, we provide a single place to access a lot of our data. Business users wanted all data linked and actionable." (VP of data and application architecture, US investment management firm)

Data virtualization abstracts source-level complexity from data consumers, which requires architects to communicate the lineage information available within the platform. Business users want to know how accurate the data is after it has gone through several rounds of integration and transformation.

"With source-level data hidden, every time there is an issue with the data downstream, my team gets called: Why is it showing this data field? Why is it like this? They view it as a utility, so they call us like you'd call a utility, and they have to, because they don't know where the data is coming from. We get lots of calls about whether that was the right value or not. We end up as the central point of figuring out why this data had that value, and whether it's right. If I had it to do again, I'd make this a responsibility of the data stewards in the business, not of IT." (VP of data and application architecture, US investment management firm)

The firm uses Denodo Technologies to support its data virtualization strategy. So far, the firm has found that Denodo's virtualization software has allowed it to combine disparate data sources easily, creating a real-time data fabric for multiple consuming applications, processes, and tools. Because the data is not copied, it reduces redundancy and ensures timeliness. The firm has also delivered linkable and browsable data through RESTful services. The data virtualization project has been well received, delivering a centralized location for data that's linked and actionable.

"The only question is how long you can delay going with data virtualization. If you look to the future, you simply have to do it this way. Point-to-point spaghetti won't work in the future. You have to create these environments if you want to do it right." (VP of data and application architecture, US investment management firm)

IBM Is Helping Premier Improve Patient Care

Premier, which serves more than 2,600 US hospitals and 84,000 other healthcare sites, wanted to help providers identify which treatments benefit patients the most to ensure that they get the best care. Siloed applications made it difficult for healthcare firms to connect different data sources and metrics to see the big picture of how to drive healthcare transformation. To address this challenge, Premier's healthcare alliance is implementing a high-performance IT strategy of trusted information that enables its members to analyze data from more than 86,000 healthcare providers and identify best practices to improve patient health while safely reducing healthcare costs. The goal is for business users within the alliance to compare the effectiveness of care, both locally and against national benchmarks, to help them improve resource utilization while minimizing waste in healthcare delivery and administration costs. With help from IBM, the alliance is now developing a completely new IT data strategy to support data virtualization. The organization's IT division is building its new infrastructure from the


More information

INTELLIGENT BUSINESS STRATEGIES WHITE PAPER

INTELLIGENT BUSINESS STRATEGIES WHITE PAPER INTELLIGENT BUSINESS STRATEGIES WHITE PAPER Improving Access to Data for Successful Business Intelligence Part 2: Supporting Multiple Analytical Workloads in a Changing Analytical Landscape By Mike Ferguson

More information

A TECHNICAL WHITE PAPER ATTUNITY VISIBILITY

A TECHNICAL WHITE PAPER ATTUNITY VISIBILITY A TECHNICAL WHITE PAPER ATTUNITY VISIBILITY Analytics for Enterprise Data Warehouse Management and Optimization Executive Summary Successful enterprise data management is an important initiative for growing

More information

MDM for the Enterprise: Complementing and extending your Active Data Warehousing strategy. Satish Krishnaswamy VP MDM Solutions - Teradata

MDM for the Enterprise: Complementing and extending your Active Data Warehousing strategy. Satish Krishnaswamy VP MDM Solutions - Teradata MDM for the Enterprise: Complementing and extending your Active Data Warehousing strategy Satish Krishnaswamy VP MDM Solutions - Teradata 2 Agenda MDM and its importance Linking to the Active Data Warehousing

More information

How to Enhance Traditional BI Architecture to Leverage Big Data

How to Enhance Traditional BI Architecture to Leverage Big Data B I G D ATA How to Enhance Traditional BI Architecture to Leverage Big Data Contents Executive Summary... 1 Traditional BI - DataStack 2.0 Architecture... 2 Benefits of Traditional BI - DataStack 2.0...

More information

Pervasive Software + NetSuite = Seamless Cloud Business Processes

Pervasive Software + NetSuite = Seamless Cloud Business Processes Pervasive Software + NetSuite = Seamless Cloud Business Processes Successful integration solution between cloudbased ERP and on-premise applications leveraging Pervasive integration software. Prepared

More information

Next-Generation Data Virtualization Fast and Direct Data Access, More Reuse, and Better Agility and Data Governance for BI, MDM, and SOA

Next-Generation Data Virtualization Fast and Direct Data Access, More Reuse, and Better Agility and Data Governance for BI, MDM, and SOA white paper Next-Generation Data Virtualization Fast and Direct Data Access, More Reuse, and Better Agility and Data Governance for BI, MDM, and SOA Executive Summary It s 9:00 a.m. and the CEO of a leading

More information

Traditional BI vs. Business Data Lake A comparison

Traditional BI vs. Business Data Lake A comparison Traditional BI vs. Business Data Lake A comparison The need for new thinking around data storage and analysis Traditional Business Intelligence (BI) systems provide various levels and kinds of analyses

More information

WHITEPAPER. Why Dependency Mapping is Critical for the Modern Data Center

WHITEPAPER. Why Dependency Mapping is Critical for the Modern Data Center WHITEPAPER Why Dependency Mapping is Critical for the Modern Data Center OVERVIEW The last decade has seen a profound shift in the way IT is delivered and consumed by organizations, triggered by new technologies

More information

Future-Proofing Your Data Center Storage

Future-Proofing Your Data Center Storage A Custom Technology Adoption Profile Commissioned By HP October 2012 Introduction For years, organizations have made hard choices on storage spending in the face of high growth as well as high sensitivity

More information

ORACLE DATA INTEGRATOR ENTERPRISE EDITION

ORACLE DATA INTEGRATOR ENTERPRISE EDITION ORACLE DATA INTEGRATOR ENTERPRISE EDITION Oracle Data Integrator Enterprise Edition 12c delivers high-performance data movement and transformation among enterprise platforms with its open and integrated

More information

Why Big Data in the Cloud?

Why Big Data in the Cloud? Have 40 Why Big Data in the Cloud? Colin White, BI Research January 2014 Sponsored by Treasure Data TABLE OF CONTENTS Introduction The Importance of Big Data The Role of Cloud Computing Using Big Data

More information

BEYOND BI: Big Data Analytic Use Cases

BEYOND BI: Big Data Analytic Use Cases BEYOND BI: Big Data Analytic Use Cases Big Data Analytics Use Cases This white paper discusses the types and characteristics of big data analytics use cases, how they differ from traditional business intelligence

More information

ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS

ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS PRODUCT FACTS & FEATURES KEY FEATURES Comprehensive, best-of-breed capabilities 100 percent thin client interface Intelligence across multiple

More information

ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS

ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS Oracle Fusion editions of Oracle's Hyperion performance management products are currently available only on Microsoft Windows server platforms. The following is intended to outline our general product

More information

Big Data and Trusted Information

Big Data and Trusted Information Dr. Oliver Adamczak Big Data and Trusted Information CAS Single Point of Truth 7. Mai 2012 The Hype Big Data: The next frontier for innovation, competition and productivity McKinsey Global Institute 2012

More information

DATAMEER WHITE PAPER. Beyond BI. Big Data Analytic Use Cases

DATAMEER WHITE PAPER. Beyond BI. Big Data Analytic Use Cases DATAMEER WHITE PAPER Beyond BI Big Data Analytic Use Cases This white paper discusses the types and characteristics of big data analytics use cases, how they differ from traditional business intelligence

More information

SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM

SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM David Chappell SELLING PROJECTS ON THE MICROSOFT BUSINESS ANALYTICS PLATFORM A PERSPECTIVE FOR SYSTEMS INTEGRATORS Sponsored by Microsoft Corporation Copyright 2014 Chappell & Associates Contents Business

More information

Introducing Oracle Exalytics In-Memory Machine

Introducing Oracle Exalytics In-Memory Machine Introducing Oracle Exalytics In-Memory Machine Jon Ainsworth Director of Business Development Oracle EMEA Business Analytics 1 Copyright 2011, Oracle and/or its affiliates. All rights Agenda Topics Oracle

More information

TRANSFORM BIG DATA INTO ACTIONABLE INFORMATION

TRANSFORM BIG DATA INTO ACTIONABLE INFORMATION TRANSFORM BIG DATA INTO ACTIONABLE INFORMATION Make Big Available for Everyone Syed Rasheed Solution Marketing Manager January 29 th, 2014 Agenda Demystifying Big Challenges Getting Bigger Red Hat Big

More information

Business Transformation for Application Providers

Business Transformation for Application Providers E SB DE CIS IO N GUID E Business Transformation for Application Providers 10 Questions to Ask Before Selecting an Enterprise Service Bus 10 Questions to Ask Before Selecting an Enterprise Service Bus InterSystems

More information

Offload Enterprise Data Warehouse (EDW) to Big Data Lake. Ample White Paper

Offload Enterprise Data Warehouse (EDW) to Big Data Lake. Ample White Paper Offload Enterprise Data Warehouse (EDW) to Big Data Lake Oracle Exadata, Teradata, Netezza and SQL Server Ample White Paper EDW (Enterprise Data Warehouse) Offloads The EDW (Enterprise Data Warehouse)

More information

Oracle Database 12c Plug In. Switch On. Get SMART.

Oracle Database 12c Plug In. Switch On. Get SMART. Oracle Database 12c Plug In. Switch On. Get SMART. Duncan Harvey Head of Core Technology, Oracle EMEA March 2015 Safe Harbor Statement The following is intended to outline our general product direction.

More information

Next-Generation Cloud Analytics with Amazon Redshift

Next-Generation Cloud Analytics with Amazon Redshift Next-Generation Cloud Analytics with Amazon Redshift What s inside Introduction Why Amazon Redshift is Great for Analytics Cloud Data Warehousing Strategies for Relational Databases Analyzing Fast, Transactional

More information

Strategically Detecting And Mitigating Employee Fraud

Strategically Detecting And Mitigating Employee Fraud A Custom Technology Adoption Profile Commissioned By SAP and Deloitte March 2014 Strategically Detecting And Mitigating Employee Fraud Executive Summary Employee fraud is a universal concern, with detection

More information

Beyond the Single View with IBM InfoSphere

Beyond the Single View with IBM InfoSphere Ian Bowring MDM & Information Integration Sales Leader, NE Europe Beyond the Single View with IBM InfoSphere We are at a pivotal point with our information intensive projects 10-40% of each initiative

More information

Integrating SAP and non-sap data for comprehensive Business Intelligence

Integrating SAP and non-sap data for comprehensive Business Intelligence WHITE PAPER Integrating SAP and non-sap data for comprehensive Business Intelligence www.barc.de/en Business Application Research Center 2 Integrating SAP and non-sap data Authors Timm Grosser Senior Analyst

More information

BIG DATA: FROM HYPE TO REALITY. Leandro Ruiz Presales Partner for C&LA Teradata

BIG DATA: FROM HYPE TO REALITY. Leandro Ruiz Presales Partner for C&LA Teradata BIG DATA: FROM HYPE TO REALITY Leandro Ruiz Presales Partner for C&LA Teradata Evolution in The Use of Information Action s ACTIVATING MAKE it happen! Insights OPERATIONALIZING WHAT IS happening now? PREDICTING

More information

Pentaho High-Performance Big Data Reference Configurations using Cisco Unified Computing System

Pentaho High-Performance Big Data Reference Configurations using Cisco Unified Computing System Pentaho High-Performance Big Data Reference Configurations using Cisco Unified Computing System By Jake Cornelius Senior Vice President of Products Pentaho June 1, 2012 Pentaho Delivers High-Performance

More information

Service Oriented Data Management

Service Oriented Data Management Service Oriented Management Nabin Bilas Integration Architect Integration & SOA: Agenda Integration Overview 5 Reasons Why Is Critical to SOA Oracle Integration Solution Integration

More information

Achieving business agility and cost optimization by reducing IT complexity. The value of adding ESB enrichment to your existing messaging solution

Achieving business agility and cost optimization by reducing IT complexity. The value of adding ESB enrichment to your existing messaging solution Smart SOA application integration with WebSphere software To support your business objectives Achieving business agility and cost optimization by reducing IT complexity. The value of adding ESB enrichment

More information

BIG DATA: FIVE TACTICS TO MODERNIZE YOUR DATA WAREHOUSE

BIG DATA: FIVE TACTICS TO MODERNIZE YOUR DATA WAREHOUSE BIG DATA: FIVE TACTICS TO MODERNIZE YOUR DATA WAREHOUSE Current technology for Big Data allows organizations to dramatically improve return on investment (ROI) from their existing data warehouse environment.

More information

www.ducenit.com Self-Service Business Intelligence: The hunt for real insights in hidden knowledge Whitepaper

www.ducenit.com Self-Service Business Intelligence: The hunt for real insights in hidden knowledge Whitepaper Self-Service Business Intelligence: The hunt for real insights in hidden knowledge Whitepaper Shift in BI usage In this fast paced business environment, organizations need to make smarter and faster decisions

More information

WHITE PAPER. Data Migration and Access in a Cloud Computing Environment INTELLIGENT BUSINESS STRATEGIES

WHITE PAPER. Data Migration and Access in a Cloud Computing Environment INTELLIGENT BUSINESS STRATEGIES INTELLIGENT BUSINESS STRATEGIES WHITE PAPER Data Migration and Access in a Cloud Computing Environment By Mike Ferguson Intelligent Business Strategies March 2014 Prepared for: Table of Contents Introduction...

More information

Cloud Without Limits: How To Deliver Hybrid Cloud With Agility, Governance, And Choice

Cloud Without Limits: How To Deliver Hybrid Cloud With Agility, Governance, And Choice A Custom Technology Adoption Profile Commissioned By Dell November 2014 Cloud Without Limits: How To Deliver Hybrid Cloud With Agility, Governance, And Choice Introduction With more and more business applications

More information

VIEWPOINT. High Performance Analytics. Industry Context and Trends

VIEWPOINT. High Performance Analytics. Industry Context and Trends VIEWPOINT High Performance Analytics Industry Context and Trends In the digital age of social media and connected devices, enterprises have a plethora of data that they can mine, to discover hidden correlations

More information

What's New in SAS Data Management

What's New in SAS Data Management Paper SAS034-2014 What's New in SAS Data Management Nancy Rausch, SAS Institute Inc., Cary, NC; Mike Frost, SAS Institute Inc., Cary, NC, Mike Ames, SAS Institute Inc., Cary ABSTRACT The latest releases

More information

SOA REFERENCE ARCHITECTURE: SERVICE TIER

SOA REFERENCE ARCHITECTURE: SERVICE TIER SOA REFERENCE ARCHITECTURE: SERVICE TIER SOA Blueprint A structured blog by Yogish Pai Service Tier The service tier is the primary enabler of the SOA and includes the components described in this section.

More information

5 Keys to Unlocking the Big Data Analytics Puzzle. Anurag Tandon Director, Product Marketing March 26, 2014

5 Keys to Unlocking the Big Data Analytics Puzzle. Anurag Tandon Director, Product Marketing March 26, 2014 5 Keys to Unlocking the Big Data Analytics Puzzle Anurag Tandon Director, Product Marketing March 26, 2014 1 A Little About Us A global footprint. A proven innovator. A leader in enterprise analytics for

More information

By Makesh Kannaiyan makesh.k@sonata-software.com 8/27/2011 1

By Makesh Kannaiyan makesh.k@sonata-software.com 8/27/2011 1 Integration between SAP BusinessObjects and Netweaver By Makesh Kannaiyan makesh.k@sonata-software.com 8/27/2011 1 Agenda Evolution of BO Business Intelligence suite Integration Integration after 4.0 release

More information

IRMAC SAS INFORMATION MANAGEMENT, TRANSFORMING AN ANALYTICS CULTURE. Copyright 2012, SAS Institute Inc. All rights reserved.

IRMAC SAS INFORMATION MANAGEMENT, TRANSFORMING AN ANALYTICS CULTURE. Copyright 2012, SAS Institute Inc. All rights reserved. IRMAC SAS INFORMATION MANAGEMENT, TRANSFORMING AN ANALYTICS CULTURE ABOUT THE PRESENTER Marc has been with SAS for 10 years and leads the information management practice for canada. Marc s area of specialty

More information

7 Megatrends Driving the Shift to Cloud Business Intelligence

7 Megatrends Driving the Shift to Cloud Business Intelligence 7 Megatrends Driving the Shift to Cloud Business Intelligence Cloud business intelligence has the potential to unify all data, make it available to everyone and enable highly agile decision-making. Here

More information

Apache Hadoop in the Enterprise. Dr. Amr Awadallah, CTO/Founder @awadallah, aaa@cloudera.com

Apache Hadoop in the Enterprise. Dr. Amr Awadallah, CTO/Founder @awadallah, aaa@cloudera.com Apache Hadoop in the Enterprise Dr. Amr Awadallah, CTO/Founder @awadallah, aaa@cloudera.com Cloudera The Leader in Big Data Management Powered by Apache Hadoop The Leading Open Source Distribution of Apache

More information

Create a single 360 view of data Red Hat JBoss Data Virtualization consolidates master and transactional data

Create a single 360 view of data Red Hat JBoss Data Virtualization consolidates master and transactional data Whitepaper Create a single 360 view of Red Hat JBoss Data Virtualization consolidates master and transactional Red Hat JBoss Data Virtualization can play diverse roles in a master management initiative,

More information

The Unified Communications Journey

The Unified Communications Journey A Custom Technology Adoption Profile Commissioned By Cisco Systems How IT Is Responding To Increasing Demand For Mobile And Visual Collaboration July 2012 Introduction Today s IT managers have to navigate

More information

Benefits Of Leveraging The Cloud Extend To Master Data Management

Benefits Of Leveraging The Cloud Extend To Master Data Management A Custom Technology Adoption Profile Commissioned By Liaison Technologies April 2014 Benefits Of Leveraging The Cloud Extend To Master Data Management Introduction It is extremely difficult to imagine

More information

OPEN MODERN DATA ARCHITECTURE FOR FINANCIAL SERVICES RISK MANAGEMENT

OPEN MODERN DATA ARCHITECTURE FOR FINANCIAL SERVICES RISK MANAGEMENT WHITEPAPER OPEN MODERN DATA ARCHITECTURE FOR FINANCIAL SERVICES RISK MANAGEMENT A top-tier global bank s end-of-day risk analysis jobs didn t complete in time for the next start of trading day. To solve

More information

Data Integration for the Real Time Enterprise

Data Integration for the Real Time Enterprise Executive Brief Data Integration for the Real Time Enterprise Business Agility in a Constantly Changing World Overcoming the Challenges of Global Uncertainty Informatica gives Zyme the ability to maintain

More information

Microsoft Big Data. Solution Brief

Microsoft Big Data. Solution Brief Microsoft Big Data Solution Brief Contents Introduction... 2 The Microsoft Big Data Solution... 3 Key Benefits... 3 Immersive Insight, Wherever You Are... 3 Connecting with the World s Data... 3 Any Data,

More information

A Service-oriented Architecture for Business Intelligence

A Service-oriented Architecture for Business Intelligence A Service-oriented Architecture for Business Intelligence Liya Wu 1, Gilad Barash 1, Claudio Bartolini 2 1 HP Software 2 HP Laboratories {name.surname@hp.com} Abstract Business intelligence is a business

More information

Using Tableau Software with Hortonworks Data Platform

Using Tableau Software with Hortonworks Data Platform Using Tableau Software with Hortonworks Data Platform September 2013 2013 Hortonworks Inc. http:// Modern businesses need to manage vast amounts of data, and in many cases they have accumulated this data

More information

BIG DATA ANALYTICS REFERENCE ARCHITECTURES AND CASE STUDIES

BIG DATA ANALYTICS REFERENCE ARCHITECTURES AND CASE STUDIES BIG DATA ANALYTICS REFERENCE ARCHITECTURES AND CASE STUDIES Relational vs. Non-Relational Architecture Relational Non-Relational Rational Predictable Traditional Agile Flexible Modern 2 Agenda Big Data

More information

Architecting for the Internet of Things & Big Data

Architecting for the Internet of Things & Big Data Architecting for the Internet of Things & Big Data Robert Stackowiak, Oracle North America, VP Information Architecture & Big Data September 29, 2014 Safe Harbor Statement The following is intended to

More information

Informatica and the Vibe Virtual Data Machine

Informatica and the Vibe Virtual Data Machine White Paper Informatica and the Vibe Virtual Data Machine Preparing for the Integrated Information Age This document contains Confidential, Proprietary and Trade Secret Information ( Confidential Information

More information

Evolving Data Warehouse Architectures

Evolving Data Warehouse Architectures Evolving Data Warehouse Architectures In the Age of Big Data Philip Russom April 15, 2014 TDWI would like to thank the following companies for sponsoring the 2014 TDWI Best Practices research report: Evolving

More information

The Principles of the Business Data Lake

The Principles of the Business Data Lake The Principles of the Business Data Lake The Business Data Lake Culture eats Strategy for Breakfast, so said Peter Drucker, elegantly making the point that the hardest thing to change in any organization

More information

Faster Business Insights By Enabling Real-Time Data for Business Intelligence (BI) & Analytics A Best Practices White Paper

Faster Business Insights By Enabling Real-Time Data for Business Intelligence (BI) & Analytics A Best Practices White Paper Faster Business Insights By Enabling Real-Time Data for Business Intelligence (BI) & Analytics A Best Practices Page 1 of 11 Executive Summary In today s intelligent economy, companies must analyze and

More information

How the oil and gas industry can gain value from Big Data?

How the oil and gas industry can gain value from Big Data? How the oil and gas industry can gain value from Big Data? Arild Kristensen Nordic Sales Manager, Big Data Analytics arild.kristensen@no.ibm.com, tlf. +4790532591 April 25, 2013 2013 IBM Corporation Dilbert

More information