InDetail. Kdb+ and the Internet of Things/Big Data
An InDetail Paper by Bloor Research
Author: Philip Howard
Publish date: August 2014
"Kdb+ has proved itself in what is unarguably the most demanding big data market: financial trading and risk management. The company is now targeting other environments."
Philip Howard
Executive summary

Kdb+ is a column-based relational database with extensive in-memory capabilities, developed and marketed by Kx Systems. Like all such products, it is especially powerful when it comes to supporting queries and analytics. However, unlike other products in this domain, kdb+ is particularly good (in terms of both performance and functionality) at processing, manipulating and analysing data (especially numeric data) in real-time, alongside the analysis of historical data. Moreover, it has extensive capabilities for supporting time-series data. For these reasons Kx Systems has historically targeted the financial industry for trading analytics and black-box trading based on real-time and historical data, as well as real-time risk assessment; applications that are particularly demanding in their performance requirements. The company has had significant success in this market, with over a hundred major financial institutions and hedge funds deploying its technology. In this paper, however, we want to explore the use of kdb+ for big data and Internet of Things based use cases.

Fast facts

Kdb+ is a column-based, hybrid in-memory database with stream processing capabilities, primarily designed for analytic workloads. As far as in-memory capability is concerned, we refer to it as hybrid because it uses in-memory processing as much as it can, but recognises that in some cases it may be impracticable to load all relevant (typically historical) data into memory, and that you therefore need techniques that not only leverage memory-based processing but also optimise performance when not all the data fits into memory. This is similar to the approach taken by IBM with its DB2 BLU Acceleration, as an example. The product's stream processing capabilities (which mean that you can analyse very large quantities of information in-flight, in real-time) arise from the fact that kdb+ is tightly integrated with the product's development language, q.
This is a vector (array) processing language that is much more efficient than SQL and can be used to develop analytic applications as well as for query purposes.

Key findings

In the opinion of Bloor Research, the following represent the key facts of which prospective users should be aware:

- Data is stored in sequentially ordered columns. Note that other column-based databases are not typically ordered in this fashion, so you would expect performance for operations such as sorts to be much faster when using kdb+. As is usual with column-based databases, it is not necessary to define indexes because columns are self-indexing (and this is even more true in this case, thanks to the sequential ordering). However, you may define indexes if you wish to.
- Kdb+ supports compression at comparable levels to other vendors: that is, up to around 10x, depending on the type of data.
- In-memory processing is genuinely in-memory: it is not just a cache. However, the system has been designed so that both pure in-memory and hybrid in-memory/disk-based processing are optimised.
- The support for time series in kdb+ is especially relevant to many Internet of Things applications, such as smart metering, preventative maintenance and other environments where large quantities of data need to be collected at regular intervals and then analysed. Note that native support for time series is extremely rare across database products.
- We are pleased to hear that Kx has recently implemented geospatial capabilities, as many Internet of Things applications are both time and location based.
- The q language is significantly more efficient than other languages that you might use (both procedural and declarative) for analysis purposes. It allows very complex business logic to be developed quickly. However, its use does imply a learning curve. Alternatively, there are specific C# and JavaScript interfaces provided as part of kdb+. There are interfacing capabilities for a host of other languages and environments, including (but not limited to) Python, R, Matlab, Excel, and Mathematica.
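As a minimal illustration of the SQL-like queries q supports, consider the following sketch. The table, its columns and its values are our own invention for illustration, not taken from any Kx material:

```q
/ a small in-memory table of trades (illustrative data)
trade:([] sym:`ABC`DEF`ABC`DEF; price:101.5 45.2 101.7 45.0; size:100 200 150 300)

/ aggregate by symbol, much as SQL's GROUP BY would,
/ using q's built-in weighted average (wavg)
select vwap:size wavg price, hi:max price, lo:min price by sym from trade
```

Note how the aggregation expressions are ordinary q functions applied to whole columns; this is the integration of language and database referred to above.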
Because programming is typically in q rather than SQL, most business intelligence tools will not run natively against kdb+. However, Kx Systems has partnered with Datawatch, and that company's Panopticon product can be used for visualisation purposes. In addition, a number of Kx's partners have built business intelligence front-ends to the product. As a lowest common denominator, kdb+ supports ODBC.

Kdb+ is scalable, does not require a large investment in hardware, is relatively simple to maintain, performs extremely well and is highly available.

The bottom line

Kx Systems was founded more than 20 years ago, although the precursor to kdb+ (kdb) was not introduced until 1998. In the years since then, the company and its product have earned a well-deserved reputation in the financial services market. That, in itself, speaks volumes: hedge funds and other financial institutions have been processing big data since before the term came into common parlance, and Kx Systems has been a leading provider to that market. Today, Bloor Research believes that the company is well placed to exploit the capabilities of kdb+ in other big data markets and, especially, with respect to the Internet of Things. In fact, Kx Systems has already started to make inroads in this area, as we will discuss (by means of case studies) in due course. Our view is that Kx Systems has, perhaps serendipitously, built a platform that is ideal for many big data analytic requirements.
The product

Kdb+ runs on Solaris, Windows and Linux platforms and is currently at version 3.1. In addition to the standard kdb+, which is 64-bit, there is also a free 32-bit version. This runs on almost anything, including a Raspberry Pi! The standard version has a parallel architecture for deployment across multiple partitions (and, notably, you can implement different indexes on different partitions) and has a distributed capability so that it can scale across a clustered environment. This architecture helps to support the product's failover capabilities, which are supplemented by full logging facilities and automatic recovery. Load balancing and replication are both provided.

The product includes its own web server, which is used to support web-based queries that utilise either a standard URL link or which may derive from applications such as Microsoft Excel. The results are returned to the user's browser in either HTML or XML format, or they can be exported to an Excel spreadsheet. Unicode is supported.

Historically, there was an optional product called kdb-tick that was specifically designed for ingesting and analysing (stream processing) stock tick information. However, this has now been folded into kdb+. There is also a fast loading capability. The solution supports commodity servers and storage devices in any topology (local, remote, SAN, NAS, flash, SSD, HDD), providing flexibility in cost-effectively adding capacity when required.
Architecture

There are really two parts to kdb+: the q language and the database. We will discuss each in turn.

The q language

The origins of q lie in a programming language called APL, which was used from the early days of computing, primarily to process numerical data. In 1988, while he was at Morgan Stanley, Arthur Whitney developed A+ as a replacement for APL in order to support non-mainframe environments. However, while this crunched numbers more efficiently than APL, it was a proprietary development on behalf of Morgan Stanley. So, in 1993, Arthur co-founded Kx Systems and developed a further and more powerful replacement language, called k. This was extremely efficient but equally terse, and it was very difficult for non-experts to read or understand; so, during the first decade of this century, Arthur developed a more verbose version of k, called q. While you can still use k, almost all kdb+ customers use q, which, while still implying a learning curve, is much easier to use and understand: for example, it has familiar constructs from SQL, such as SELECT statements and WHERE clauses, as well as updates, deletes and so forth. Where it goes beyond other environments is in its mathematical functions, such as variances, and in the fact that developed applications are tightly integrated with the data and the database.

More technically, q is a vector programming language (that is, it addresses vectors [arrays] rather than tables) that uses memory-mapped files for numeric processing. This has important implications, not only in its own right (because you get better performance) but also because Intel is increasingly adding vector processing capabilities to its processors. As Kx Systems is an Intel partner, it aims to leverage each new piece of relevant functionality that Intel introduces, so that performance will continue to improve.
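A brief sketch of what "vector processing" means in practice: q's built-in functions operate on whole arrays at once, with no explicit loops. The values below are illustrative, not drawn from the paper:

```q
px:100 101 103 102 105f     / a vector of prices
avg px                      / mean of the whole vector: 102.2
var px                      / variance, a single built-in function
deltas px                   / successive differences: 100 1 2 -1 3
3 mavg px                   / 3-period moving average
```

Each line applies one operation to the entire vector, which is both why q programs are so short and why they map well onto the SIMD instructions mentioned above.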
There are debugging facilities built into the product and, if required, you can use these from an Eclipse environment. There are also a number of other IDEs (integrated development environments) that can be used in conjunction with kdb+. Because of the conciseness of the code, programs tend to consist of relatively few lines, which makes debugging relatively simple. Code is interpreted, but the interpreter takes up little more than 100K, which makes it easy for the environment to support many simultaneous processes. This small footprint not only means that installation is very fast, it also reduces the risks and costs associated with upgrades and maintenance.

The database

As has already been stated, data is stored in compressed format in a columnar database. Enough has been written (by both Bloor Research and others) about the advantages of column-based databases for analytics that we do not need to reiterate those arguments here. However, it is worth commenting on the approach kdb+ takes to in-memory processing. There are, essentially, three methods of implementing so-called in-memory databases. One is to take a cache-based strategy and call it in-memory, which it isn't. The second is to assume that all relevant data can fit into memory. The problem with this is that it becomes expensive if the datasets to be analysed are anything other than relatively small, and prohibitively expensive as datasets become very large; another problem is what happens if there is a system failure and you have to recover the system. The third option is to implement a hybrid in-memory/disk architecture that is optimised to use whatever resources are available, which, in our view, is the better solution. In particular, stream processing often requires historical data for context and, given that a year's worth of tick data (for example) comprises something like 6 or 7TB, you really do need this hybrid capability.
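The hybrid model rests on the memory-mapped files mentioned earlier: a table written to disk in kdb+'s splayed (one-file-per-column) format is mapped rather than read wholesale, so only the columns a query actually touches are paged into memory. A minimal sketch, with paths, sizes and column names of our own choosing:

```q
/ write a million-row table to disk, one file per column ("splayed")
`:/tmp/hist/t/ set ([] id:til 1000000; val:1000000?100f)

/ map it back: columns are loaded lazily, on demand
t:get `:/tmp/hist/t/
select avg val from t    / only the val column needs to be paged in
```

The same mechanism is what allows a query to span recent in-memory data and years of on-disk history without the whole dataset fitting in RAM.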
As far as availability is concerned, kdb+ stores memory images on disk to facilitate rapid recovery. A further feature of kdb+ worth mentioning is that it supports user-defined datatypes as well as built-in datatypes. In the latter case, time-series support (which can be accurate down to nanoseconds) is implemented by means of specific datatypes for elements such as dates. Similarly, a relational table is a base datatype.
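To illustrate, q's timestamp datatype carries nanosecond precision natively, and time-based bucketing is a one-line query. The readings table below is our own illustrative example:

```q
.z.p    / current timestamp, with nanosecond resolution

/ half-minute sensor readings, bucketed into one-minute averages with xbar
readings:([] ts:2014.08.01D09:00:00.000000000+0D00:00:30*til 6; temp:20.1 20.3 20.2 20.6 20.8 21.0)
select avg temp by 0D00:01:00 xbar ts from readings
```

Because timestamps are a base datatype rather than strings or composite columns, this kind of interval aggregation needs no user-defined machinery.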
Use cases

So far we have merely described kdb+ in terms of its technology, and we have suggested that it would be suitable for use outside of financial services, in other industries where big data and the Internet of Things are significant issues. In fact, Kx Systems has already started to make headway in this market, and it will be useful to discuss uses in general. Some of the following are live applications, some are planned by existing users and some are theoretical possibilities, but they all highlight the potential that kdb+ offers. There are, of course, endless possibilities when it comes to analysing time-sensitive, machine-generated data, but the following will give a flavour of the environments for which kdb+ is suitable.

Utilities

Historically, utility companies have collected meter readings via readers calling at the door or, more recently, by customers posting their own readings online. Such readings are typically processed in a transactional database of some kind, which also supports query and analytic processing of this data. However, with smart meters collecting data very much more frequently, the amount of data to be processed in such queries is beyond the scope of traditional databases, especially when low-latency queries are required. One company in this space has used kdb+ for a data mart to meet these query requirements, which include on-demand, batch, and ad hoc analysis and aggregation queries. The kdb+ database is updated in real-time via change data capture.

Smart devices

This is an area that other Kx customers are investigating, specifically in the healthcare sector, to capture medical sensor data (telemedicine) and data in hospitals in areas such as intensive care and neonatal units. In particular, companies think that new smart wristbands and similar devices, as they become part of the Internet of Things, are perfect candidates for kdb+, giving doctors the ability to predict events such as heart attacks before they occur.
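A sketch of the kind of smart-metering aggregation described above. The meter IDs, the 15-minute interval and the column names are all our own assumptions, not details of the customer's deployment:

```q
/ interval readings from two hypothetical meters
n:8
usage:([] meter:n?`m001`m002; ts:2014.08.01D00:00:00.000000000+0D00:15:00*til n; kwh:n?5f)

/ total consumption per meter per hour
select total:sum kwh by meter, hour:ts.hh from usage
```

The same query shape scales from a demo table to billions of interval readings, which is the point of using a columnar, time-series-aware store for this workload.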
Asset management

There are two aspects of asset management for which kdb+ might be suitable: the first is preventative maintenance and the second is real-time monitoring. These are closely allied but distinct. As an example, a compressor on an oil rig typically has around 120 sensors that are constantly being monitored. Immediate alerts need to be raised if, for example, vibration exceeds a certain tolerance level. Typically, this is currently done via banks of red lights, but real-time monitoring can not only raise alerts but also provide additional contextual information to engineers based on previous history. Preventative maintenance takes this same facility a step further, taking historical information, analysing it and predicting when problems are likely to occur so that pre-emptive action can be taken.

Risk analysis

Another application where kdb+ can be (and is) deployed is in supporting epidemiological studies, a typical scenario being post-market risk studies: that is, analysing how a particular drug is being used by different populations, its efficacy, whether any unexpected side-effects are present, and so on. The information to be analysed is mainly derived data generated by insurance companies from patient claims, as well as prescription data from reporting pharmacies. The results form part of a feedback loop with R&D.

Quality management

This is another area being explored by one of Kx's customers in manufacturing. In this environment there is a lot of time-series sensor data that could be quickly manipulated by kdb+. The company wants to correlate this manufacturing sensor data with the quality of its products.
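The threshold-alert pattern described under asset management reduces, in q, to a simple filtered query over the latest readings. The sensor values and the 0.8 tolerance below are hypothetical:

```q
/ latest reading per sensor (illustrative values)
sensors:([] id:til 6; vibration:0.2 0.9 0.4 1.3 0.5 0.1)

/ flag any sensor breaching tolerance
select id, vibration from sensors where vibration>0.8
```

In a live deployment the same where-clause would run continuously against the streaming feed, with the historical database supplying the contextual information mentioned above.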
Performance and competition

It is worth commenting on some of the conversations that were held with Kx customers in order to compile this report. In particular, when it comes to big data, most organisations will think of Hadoop or other NoSQL environments in the first instance. Kdb+ users, on the other hand, have concluded that while Hadoop is good for batch-based processing, and for environments where queries are predetermined, it does not offer the real-time and ad hoc performance that kdb+ can. Moreover, Hadoop requires considerably more system resources: one user commented that, after their proof of concept, they estimated they would require between 10 and 50 times more servers to support Hadoop than kdb+. A similar comparison with an appliance-based solution required sixteen times as many cores. The amount of programming and management overhead associated with Hadoop was also a major concern.

In terms of performance, one company, which has a database measured in hundreds of billions of rows, stated that queries that used to take weeks to run on their previous system now run in seconds or less. As far as learning q is concerned, the comment we were given was that q is easier to learn than some of the open source technologies and libraries. Moreover, we were told that it is much more efficient than SQL: by orders of magnitude.

On a slightly different topic, we want to briefly mention STAC. This is an independent benchmarking group funded by some 200 leading financial (and other) institutions. It benchmarks both hardware and software by keeping one fixed and changing the other. It currently uses kdb+ as its standard software for the STAC-M3 benchmark because that is the fastest database product it has been able to find (or that has applied).
The vendor

Kx Systems is a privately owned company in which First Derivatives has a minority shareholding. It was founded in 1993, and kdb was first introduced in 1998, with kdb-tick following thereafter. These were both 32-bit products, which have subsequently been replaced by the 64-bit versions discussed in this report. The company's historical approach to marketing has been both through direct marketing and via channels. It has a number of partners around the world that provide product sales, training and installation, as well as first-line support. Many of these partners, such as First Derivatives, have extended Kx's capabilities by offering specialised financial capabilities, with things like graphical user interfaces for business intelligence and extended complex event processing. In addition to these partners, the company also has OEM partners. For example, 1010Data, a cloud-based data warehousing solution for financial service companies, is based on Kx technology. However, while this has been Kx's approach to financial markets, it has been directly addressing big data opportunities outside the financial sector. As this becomes more mature, we expect the company to follow a similar channel model.

Kx web address:
Summary

Kdb+ has proved itself in what is unarguably the most demanding big data market: financial trading and risk management. The company is now targeting other environments. It has already achieved initial success in smart metering and pharmaceuticals/healthcare and, in our view, the technology is well suited to a variety of other environments, including preventative maintenance (asset management), smarter cities, the connected car and other environments where the collection and analysis of time-series based data is important.

We would have to say that Kx has been fortunate. We do not believe that it anticipated the Internet of Things. What it did was focus on offering the best possible performance for its chosen (financial) market. It just happens that the (big data) issues faced by that market are exactly analogous to many Internet of Things environments. Not only is kdb+ suitable for many of these use cases but, more importantly, the product has significant technical advantages in these areas. We therefore expect a significant expansion of the company's user base within the Internet of Things domain.

Further information

Further information about this subject is available from Bloor Research.
Bloor Research overview

Bloor Research is one of Europe's leading IT research, analysis and consultancy organisations, and in 2014 celebrates its 25th anniversary. We explain how to bring greater agility to corporate IT systems through the effective governance, management and leverage of information. We have built a reputation for telling the right story with independent, intelligent, well-articulated communications content and publications on all aspects of the ICT industry. We believe the objective of telling the right story is to:

- Describe the technology in context to its business value and the other systems and processes it interacts with.
- Understand how new and innovative technologies fit in with existing ICT investments.
- Look at the whole market and explain all the solutions available and how they can be more effectively evaluated.
- Filter noise and make it easier to find the additional information or news that supports both investment and implementation.
- Ensure all our content is available through the most appropriate channel.

Founded in 1989, we have spent 25 years distributing research and analysis to IT user and vendor organisations throughout the world via online subscriptions, tailored research services, events and consultancy projects. We are committed to turning our knowledge into business value for you.

About the author

Philip Howard, Research Director - Data Management

Philip started in the computer industry way back in 1973 and has variously worked as a systems analyst, programmer and salesperson, as well as in marketing and product management, for a variety of companies including GEC Marconi, GPT, Philips Data Systems, Raytheon and NCR. After a quarter of a century of not being his own boss, Philip set up his own company in 1992, and his first client was Bloor Research (then ButlerBloor), with Philip working for the company as an associate analyst.
His relationship with Bloor Research has continued since that time and he is now Research Director focused on Data Management. Data management refers to the management, movement, governance and storage of data, and involves diverse technologies that include (but are not limited to) databases and data warehousing, data integration (including ETL, data migration and data federation), data quality, master data management, metadata management, and log and event management. Philip also tracks spreadsheet management and complex event processing.

In addition to the numerous reports Philip has written on behalf of Bloor Research, he also contributes regularly to IT-Director.com and IT-Analysis.com, and was previously editor of both Application Development News and Operating System News on behalf of Cambridge Market Intelligence (CMI). He has also contributed to various magazines and written a number of reports published by companies such as CMI and The Financial Times. Philip speaks regularly at conferences and other events throughout Europe and North America.

Away from work, Philip's primary leisure activities are canal boats, skiing, playing bridge (at which he is a Life Master), dining out and walking Benji the dog.
Copyright & disclaimer

This document is copyright 2014 Bloor Research. No part of this publication may be reproduced by any method whatsoever without the prior consent of Bloor Research. Due to the nature of this material, numerous hardware and software products have been mentioned by name. In the majority, if not all, of the cases, these product names are claimed as trademarks by the companies that manufacture the products. It is not Bloor Research's intent to claim these names or trademarks as our own. Likewise, company logos, graphics or screenshots have been reproduced with the consent of the owner and are subject to that owner's copyright. Whilst every care has been taken in the preparation of this document to ensure that the information is correct, the publishers cannot accept responsibility for any errors or omissions.
2nd Floor, St John Street, LONDON, EC1V 4PY, United Kingdom
Tel: +44 (0)
web:
Key Attributes for Analytics in an IBM i environment
Key Attributes for Analytics in an IBM i environment Companies worldwide invest millions of dollars in operational applications to improve the way they conduct business. While these systems provide significant
Breaking News! Big Data is Solved. What Is In-Memory Computing and What Does It Mean to U.S. Leaders? EXECUTIVE WHITE PAPER
Breaking News! Big Data is Solved. What Is In-Memory Computing and What Does It Mean to U.S. Leaders? EXECUTIVE WHITE PAPER There is a revolution happening in information technology, and it s not just
Trading and risk management: Benefits of time-critical, real-time analysis on big numerical data
SOLUTION B L U EPRINT FINANCIAL SERVICES Trading and risk management: Benefits of time-critical, real-time analysis on big numerical data Industry Financial Services Business Challenge Ultra high-speed
Oracle Big Data Building A Big Data Management System
Oracle Big Building A Big Management System Copyright 2015, Oracle and/or its affiliates. All rights reserved. Effi Psychogiou ECEMEA Big Product Director May, 2015 Safe Harbor Statement The following
Modern IT Operations Management. Why a New Approach is Required, and How Boundary Delivers
Modern IT Operations Management Why a New Approach is Required, and How Boundary Delivers TABLE OF CONTENTS EXECUTIVE SUMMARY 3 INTRODUCTION: CHANGING NATURE OF IT 3 WHY TRADITIONAL APPROACHES ARE FAILING
Oracle Database - Engineered for Innovation. Sedat Zencirci Teknoloji Satış Danışmanlığı Direktörü Türkiye ve Orta Asya
Oracle Database - Engineered for Innovation Sedat Zencirci Teknoloji Satış Danışmanlığı Direktörü Türkiye ve Orta Asya Oracle Database 11g Release 2 Shipping since September 2009 11.2.0.3 Patch Set now
Cognos Performance Troubleshooting
Cognos Performance Troubleshooting Presenters James Salmon Marketing Manager [email protected] Andy Ellis Senior BI Consultant [email protected] Want to ask a question?
TIBCO Live Datamart: Push-Based Real-Time Analytics
TIBCO Live Datamart: Push-Based Real-Time Analytics ABSTRACT TIBCO Live Datamart is a new approach to real-time analytics and data warehousing for environments where large volumes of data require a management
A powerful duo. PRIMEQUEST and SQL Server
A powerful duo PRIMEQUEST and SQL Server The challenge: Moving forward to new worlds in IT More efficient, powerful and agile infrastructures are what IT managers want. And if costs can be cut and the
Affordable, Scalable, Reliable OLTP in a Cloud and Big Data World: IBM DB2 purescale
WHITE PAPER Affordable, Scalable, Reliable OLTP in a Cloud and Big Data World: IBM DB2 purescale Sponsored by: IBM Carl W. Olofson December 2014 IN THIS WHITE PAPER This white paper discusses the concept
What Is In-Memory Computing and What Does It Mean to U.S. Leaders? EXECUTIVE WHITE PAPER
What Is In-Memory Computing and What Does It Mean to U.S. Leaders? EXECUTIVE WHITE PAPER A NEW PARADIGM IN INFORMATION TECHNOLOGY There is a revolution happening in information technology, and it s not
Fact Sheet In-Memory Analysis
Fact Sheet In-Memory Analysis 1 Copyright Yellowfin International 2010 Contents In Memory Overview...3 Benefits...3 Agile development & rapid delivery...3 Data types supported by the In-Memory Database...4
Data Integration Checklist
The need for data integration tools exists in every company, small to large. Whether it is extracting data that exists in spreadsheets, packaged applications, databases, sensor networks or social media
QLIKVIEW INTEGRATION TION WITH AMAZON REDSHIFT John Park Partner Engineering
QLIKVIEW INTEGRATION TION WITH AMAZON REDSHIFT John Park Partner Engineering June 2014 Page 1 Contents Introduction... 3 About Amazon Web Services (AWS)... 3 About Amazon Redshift... 3 QlikView on AWS...
The 21st Century Time-series Database. Making real-time decisions on billions of current data events and trillions of historical records.
kx systems whitepaper The 21st Century Time-series Database Making real-time decisions on billions of current data events and trillions of historical records. From the outset, we have designed our products
ICONICS Choosing the Correct Edition of MS SQL Server
Description: This application note aims to assist you in choosing the right edition of Microsoft SQL server for your ICONICS applications. OS Requirement: XP Win 2000, XP Pro, Server 2003, Vista, Server
Part V Applications. What is cloud computing? SaaS has been around for awhile. Cloud Computing: General concepts
Part V Applications Cloud Computing: General concepts Copyright K.Goseva 2010 CS 736 Software Performance Engineering Slide 1 What is cloud computing? SaaS: Software as a Service Cloud: Datacenters hardware
Windows Embedded Security and Surveillance Solutions
Windows Embedded Security and Surveillance Solutions Windows Embedded 2010 Page 1 Copyright The information contained in this document represents the current view of Microsoft Corporation on the issues
The Future of Data Management
The Future of Data Management with Hadoop and the Enterprise Data Hub Amr Awadallah (@awadallah) Cofounder and CTO Cloudera Snapshot Founded 2008, by former employees of Employees Today ~ 800 World Class
ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS
ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS PRODUCT FACTS & FEATURES KEY FEATURES Comprehensive, best-of-breed capabilities 100 percent thin client interface Intelligence across multiple
White. Paper. EMC Isilon: A Scalable Storage Platform for Big Data. April 2014
White Paper EMC Isilon: A Scalable Storage Platform for Big Data By Nik Rouda, Senior Analyst and Terri McClure, Senior Analyst April 2014 This ESG White Paper was commissioned by EMC Isilon and is distributed
How To Use Shareplex
Data consolidation and distribution with SharePlex database replication Written by Sujith Kumar, Chief Technologist Executive summary In today s fast-paced mobile age, data continues to accrue by leaps
IBM WebSphere Premises Server
Integrate sensor data to create new visibility and drive business process innovation IBM WebSphere Server Highlights Derive actionable insights that support Enable real-time location tracking business
ORACLE BUSINESS INTELLIGENCE SUITE ENTERPRISE EDITION PLUS
Oracle Fusion editions of Oracle's Hyperion performance management products are currently available only on Microsoft Windows server platforms. The following is intended to outline our general product
WHAT IS AN APPLICATION PLATFORM?
David Chappell December 2011 WHAT IS AN APPLICATION PLATFORM? Sponsored by Microsoft Corporation Copyright 2011 Chappell & Associates Just about every application today relies on other software: operating
News and trends in Data Warehouse Automation, Big Data and BI. Johan Hendrickx & Dirk Vermeiren
News and trends in Data Warehouse Automation, Big Data and BI Johan Hendrickx & Dirk Vermeiren Extreme Agility from Source to Analysis DWH Appliances & DWH Automation Typical Architecture 3 What Business
IBM DB2 Near-Line Storage Solution for SAP NetWeaver BW
IBM DB2 Near-Line Storage Solution for SAP NetWeaver BW A high-performance solution based on IBM DB2 with BLU Acceleration Highlights Help reduce costs by moving infrequently used to cost-effective systems
Maximum performance, minimal risk for data warehousing
SYSTEM X SERVERS SOLUTION BRIEF Maximum performance, minimal risk for data warehousing Microsoft Data Warehouse Fast Track for SQL Server 2014 on System x3850 X6 (95TB) The rapid growth of technology has
IBM AND NEXT GENERATION ARCHITECTURE FOR BIG DATA & ANALYTICS!
The Bloor Group IBM AND NEXT GENERATION ARCHITECTURE FOR BIG DATA & ANALYTICS VENDOR PROFILE The IBM Big Data Landscape IBM can legitimately claim to have been involved in Big Data and to have a much broader
Data Integration Platforms - Talend
Data Integration Platforms - Talend Author : Philip Howard Publish date : July 2008 page 1 Introduction Talend is an open source provider of data integration products. However, while many open source
An Oracle White Paper November 2010. Leveraging Massively Parallel Processing in an Oracle Environment for Big Data Analytics
An Oracle White Paper November 2010 Leveraging Massively Parallel Processing in an Oracle Environment for Big Data Analytics 1 Introduction New applications such as web searches, recommendation engines,
Using In-Memory Computing to Simplify Big Data Analytics
SCALEOUT SOFTWARE Using In-Memory Computing to Simplify Big Data Analytics by Dr. William Bain, ScaleOut Software, Inc. 2012 ScaleOut Software, Inc. 12/27/2012 T he big data revolution is upon us, fed
Accelerating Enterprise Applications and Reducing TCO with SanDisk ZetaScale Software
WHITEPAPER Accelerating Enterprise Applications and Reducing TCO with SanDisk ZetaScale Software SanDisk ZetaScale software unlocks the full benefits of flash for In-Memory Compute and NoSQL applications
Using an In-Memory Data Grid for Near Real-Time Data Analysis
SCALEOUT SOFTWARE Using an In-Memory Data Grid for Near Real-Time Data Analysis by Dr. William Bain, ScaleOut Software, Inc. 2012 ScaleOut Software, Inc. 12/27/2012 IN today s competitive world, businesses
Big Data. Value, use cases and architectures. Petar Torre Lead Architect Service Provider Group. Dubrovnik, Croatia, South East Europe 20-22 May, 2013
Dubrovnik, Croatia, South East Europe 20-22 May, 2013 Big Data Value, use cases and architectures Petar Torre Lead Architect Service Provider Group 2011 2013 Cisco and/or its affiliates. All rights reserved.
INTEROPERABILITY OF SAP BUSINESS OBJECTS 4.0 WITH GREENPLUM DATABASE - AN INTEGRATION GUIDE FOR WINDOWS USERS (64 BIT)
White Paper INTEROPERABILITY OF SAP BUSINESS OBJECTS 4.0 WITH - AN INTEGRATION GUIDE FOR WINDOWS USERS (64 BIT) Abstract This paper presents interoperability of SAP Business Objects 4.0 with Greenplum.
Focus on the business, not the business of data warehousing!
Focus on the business, not the business of data warehousing! Adam M. Ronthal Technical Product Marketing and Strategy Big Data, Cloud, and Appliances @ARonthal 1 Disclaimer Copyright IBM Corporation 2014.
Five Technology Trends for Improved Business Intelligence Performance
TechTarget Enterprise Applications Media E-Book Five Technology Trends for Improved Business Intelligence Performance The demand for business intelligence data only continues to increase, putting BI vendors
hmetrix Revolutionizing Healthcare Analytics with Vertica & Tableau
Powered by Vertica Solution Series in conjunction with: hmetrix Revolutionizing Healthcare Analytics with Vertica & Tableau The cost of healthcare in the US continues to escalate. Consumers, employers,
Performance and Scalability Overview
Performance and Scalability Overview This guide provides an overview of some of the performance and scalability capabilities of the Pentaho Business Analytics Platform. Contents Pentaho Scalability and
ANY SURVEILLANCE, ANYWHERE, ANYTIME
ANY SURVEILLANCE, ANYWHERE, ANYTIME WHITEPAPER DDN Storage Powers Next Generation Video Surveillance Infrastructure INTRODUCTION Over the past decade, the world has seen tremendous growth in the use of
