White Paper
The Surprising Economics of Engineered Systems for Big Data (with Oracle and Intel)
By Nik Rouda, Senior Analyst, and Adam DeMattia, Research Analyst
December 2015
This ESG White Paper was commissioned by Oracle and Intel and is distributed under license from ESG.
White Paper: The Economics of Engineered Systems for Big Data (with Oracle and Intel)

Contents
Executive Summary
In Search of Big Data Infrastructure
Do-it-yourself Hadoop Can Be Difficult
Save Time and Up to 45% in Costs By Using Preconfigured Big Data Appliances
Delivering Big Data's Value Proposition
Introduction: In Search of Big Data Reality
The Enterprise Big Data Value Imperative
Hadoop's Role in an Enterprise Big Data Platform
Debunking Three Common Assumptions of Hadoop Big Data
The Real Costs and Benefits of Big Data Infrastructure
A Model for Hadoop Big Data Infrastructure Cost Analysis: Build Versus Buy
Specific Costs for Build Versus Buy Comparison
Oracle Big Data Appliance as an Alternative to Build
Looking for Big Data Savings? The Infrastructure Buy Option
Comparing to an Open Source Software Alternative
Critical Considerations Beyond Costs
Better Time to Market of the Buy Option
Better Performance with Oracle and Intel Tuning
The Bigger Truth
Serious Big Data Requires Serious Commitment
Avoid Common Assumptions about Big Data
Buy Will Often Trump Build for Both Big Data Infrastructure Costs and Benefits
Appendix

All trademark names are property of their respective companies. Information contained in this publication has been obtained from sources The Enterprise Strategy Group (ESG) considers to be reliable but is not warranted by ESG. This publication may contain opinions of ESG, which are subject to change from time to time. This publication is copyrighted by The Enterprise Strategy Group, Inc. Any reproduction or redistribution of this publication, in whole or in part, whether in hard-copy format, electronically, or otherwise to persons not authorized to receive it, without the express consent of The Enterprise Strategy Group, Inc., is in violation of U.S. copyright law and will be subject to an action for civil damages and, if applicable, criminal prosecution.
Should you have any questions, please contact ESG Client Relations. Intel, the Intel logo, Xeon, and Xeon Inside are trademarks or registered trademarks of Intel Corporation in the U.S. and/or other countries.
Executive Summary

In Search of Big Data Infrastructure

Enterprises of all types are betting on big data analytics to help them better understand customers, compete in the market, improve products and services, tune operations, and increase profits. Given the high degree of interest in analytics and the vast quantity of unutilized data, the first step most organizations need to take on the path toward realizing the benefits of big data analytics is to implement an appropriate infrastructure to process big data. Though most enterprises are not starting entirely from scratch, having already developed data warehouses and related business intelligence (BI) solutions, most realize that big data analytics require a different infrastructure than what they have used historically for data warehousing and BI. Many organizations, therefore, plan to invest in a new solution infrastructure to realize the promise of big data. ESG also discovered that buying purpose-built appliances for use on-premises, such as Oracle's Big Data Appliance (BDA), will likely be the preferred deployment option for nearly one-quarter of enterprise organizations seeking to implement new solutions.1

Do-it-yourself Hadoop Can Be Difficult

A few of the largest analytics pioneers, such as Google and Yahoo, have successfully built big data infrastructures from scratch. Those same firms were also primary participants in the birth and nurturing of Hadoop, the Apache open source project given credit for catalyzing the big data movement. Hadoop has matured over the past 10 years, due in part to feature enhancements and in part to growing support from a widening range of both startups and more established IT vendors. For many organizations, Hadoop-based solutions represent the first new, explicitly big data investment.
However, successful Hadoop implementations put into full production using do-it-yourself infrastructure components may require more time and effort than initially expected. Hadoop lures many big data hopefuls due to its apparent low infrastructure cost and easy access; as an open source technology, anyone can download Hadoop for free and spin up a simple Hadoop infrastructure in the cloud or on-premises. Unfortunately, many organizations also quickly find that they lack the time, resources, and expertise needed to quickly build an infrastructure for big data. A large portion of the effort and expertise required comes from two groups:

1. Hadoop engineers, who can architect an initial Hadoop infrastructure, feed in applicable data, help the data analysts work with the data store, and continue to manage the whole infrastructure over time.

2. Data scientists and analysts, who know how to render the tools of statistics in the context of big data analytics, and can also lead the human and business process of discovery and collaboration in order to yield actionable results.

Beyond this, however, ESG's research shows that most enterprise organizations feel that stakeholders from server, storage, network, security, and other IT infrastructure and operations teams are important to the success of a big data initiative. Thus, despite the hope and hype, Hadoop on a commodity stack of hardware does not always offer a lower-cost ride to big data analytics, or at least not as quickly as hoped. Many organizations implementing Hadoop infrastructures based on human expertise working with commodity hardware and open source software may experience unexpected costs, slower speed to market, and unplanned complexity throughout the lifecycle. Accordingly, ESG's research further shows that more than three-quarters of respondents expect it will take longer than six months to realize significant business value from a new big data initiative.
Save Time and Up to 45% in Costs By Using Preconfigured Big Data Appliances

Based on ESG validation of an Oracle model for a medium-sized Hadoop-oriented big data project, a buy infrastructure option like Oracle BDA could yield approximately 45% lower costs than an equivalent build do-it-yourself infrastructure. Now, clearly, discounting affects any pricing comparison, and some vendors are willing to discount heavily to win a foothold with a strategic customer. It's worth keeping in mind that even if other hardware and software vendors offer massive discounts to match or even beat the BDA price, some of the other aspects (e.g., ease of procurement, simpler patching and upgrades, performance, speed of deployment, and support for the entire system, not just the components) are also very important, even if no cost is directly attributed to these qualities. Using an engineered appliance will greatly reduce the effort of having staff from many IT disciplines do extensive platform evaluation, testing, development, integration, and tuning. For most enterprises, it is well worth exploring the idea of skipping in-house development of their own big data infrastructure, and instead deploying a purpose-built infrastructure solution such as Oracle's BDA.

1 Source: ESG Research Report, Enterprise Big Data, Business Intelligence, and Analytics Trends Redux, January. All other ESG research references in this white paper have been taken from this report.

Delivering Big Data's Value Proposition

Introduction: In Search of Big Data Reality

The media bandies about the term big data as if it were common at most enterprises today. To the contrary, many enterprises have only recently begun what will turn into a multi-year commitment and effort toward enhancing their data analytics capabilities. Similarly, much of the interest in big data springs from the vendors offering products and services based on the Apache Hadoop project, yet their marketing sometimes contributes to the misleading notion that the benefits of big data will come easily and inexpensively. In terms of infrastructure for Hadoop, the conventional wisdom suggests that on-premises or cloud-based commodity servers and storage will be "good enough" solutions for all environments, while the reality is that there are several viable options. What will the reality of big data be?
Assuming many enterprises will invest in new solutions to attain their respective big data goals, the Hadoop ecosystem may become the norm, but other approaches will also have their place. ESG believes that Hadoop will play at least a part in many big data solutions, even as Spark and other analytics technologies complement the core components. Assuming for now that Hadoop will eventually play some role in most organizations, what related infrastructure decisions will deliver the best ROI for big data solutions: do-it-yourself commodity infrastructures or more purpose-built platforms?

The Enterprise Big Data Value Imperative

Half of the respondents in ESG's aforementioned survey considered big data the highest priority business initiative. While many IT initiatives can help organizations do things better, big data helps organizations know what to do better. It is one thing to know that bookings decreased when compared with the same quarter last year (an example of business intelligence), but it is better to know why they decreased (analytics), and it is best to model and predict how they might increase going forward (one potential use of big data analytics). Despite the common belief that big data will eventually deliver on all the promise and hype, it seems less likely that organizations will successfully reach their big data goals by cutting corners. In fact, enterprises should avoid falling for the false promise of something earth-shattering for nearly nothing that seems associated with big data. Why not get real about big data? This starts with enterprises selecting the right big data infrastructure.

Hadoop's Role in an Enterprise Big Data Platform

Based on the aforementioned ESG research survey, a number of respondents (23% in total) are already using Hadoop framework technologies to process large and diverse sets of data for big data analytics.
The fact that IT vendors of all types, from startups to well-established vendors, have jumped on the Hadoop bandwagon has changed Hadoop's status at most enterprises from a mere curiosity a few years ago to a serious choice going forward. What Hadoop has most obviously lacked is maturity, needing the equivalent of integrated systems management software, but vendors like Cloudera, Hortonworks, and MapR have developed such software and support via their respective distributions to define a managed data lake or data hub.
Debunking Three Common Assumptions of Hadoop Big Data

While Hadoop has now had its tenth birthday, some common misconceptions are still associated with the data platform. ESG believes these myths present risks for enterprises, and they should be explored given the potential strategic impacts.

Assumption 1: Big data is not mission-critical. If big data will transform the business, then how should IT view big data from an architectural and operational perspective? Simple: Consider big data mission-critical, just like ERP. The CIO and the business leaders should plan to architect, build, and operate a big data facility that delivers dependably for the entire organizational value chain, including customers and partners, on an ongoing basis. CIOs already know the drill: 24x7x365 reliability, five-nines availability, appropriate performance, scalability, security, serviceability, and quick adaptability to changing business requirements. Thinking about big data in a mission-critical context may run counter to much of the way big data has been positioned in the industry: The idea that an enterprise-class big data solution can only run effectively on inexpensive, commodity hardware, using purely open source software, is a somewhat limited viewpoint. The amount and diversity of data; the number of points of integration; the wide variety of users; and the importance of analytic models and applications suggest that enterprises need an approach similar to what IT has used to deliver mission-critical solutions in the past.

Assumption 2: Hadoop is free. You can visit the Apache website for Hadoop and download all the elements you need for free. Why then does Hadoop end up costing so much?
First, the expertise required to implement and use Hadoop is expensive, whether you contract data scientists and Hadoop engineers, shift some of your top business analysts and DBAs and retrain them on Hadoop, or execute a combination thereof. Enterprises should ask themselves, "Do I have engineers on staff deeply familiar with configuring, allocating, and managing Hadoop infrastructure?" Second, few enterprises plan to depend solely on Hadoop for big data solutions; Hadoop is part of a larger puzzle, and connecting other data sources and tools to Hadoop carries hidden costs on the human, software, networking, and hardware fronts. Existing data warehouses, and related integration and BI solutions, will certainly count as part of the overall big data solution at most companies. Connecting to Hadoop for data ingest and analytics visualization purposes may require significant administrative effort.

Assumption 3: I already own the infrastructure I need or can come by it inexpensively, and am staffed to configure and manage that infrastructure. Consider a scenario where an enterprise recently shifted to a cloud-based Microsoft Exchange implementation (away from a self-managed server farm), and the enterprise thus owns a large number of generic servers and storage that are all paid for and fully depreciated. It might simply repurpose all those nodes for the Hadoop project. Alternatively, one can buy a rack of commodity servers with direct-attached storage at white-box prices. Unfortunately, the inexpensive infrastructure myth may not hold up in practice. While generic servers may be fine for smaller projects and proofs-of-concept, for large-scale or compute-intensive workloads, you may want to use higher-end servers, storage, and networking. One of the primary big data ROI factors involves the choice of infrastructure.
An enterprise-class big data infrastructure requires a robust and reliable architecture, and if your company does not currently possess sufficient personnel with the skills needed to architect and implement all the hardware, network, and systems software necessary for your big data solution, then you should consider an appliance instead. Identification, evaluation, and integration of a complete technology stack carries many hidden costs, particularly in the many hours of human effort on the project.
The Real Costs and Benefits of Big Data Infrastructure

A Model for Hadoop Big Data Infrastructure Cost Analysis: Build Versus Buy

What would an enterprise experience in terms of costs if IT rolled its own Hadoop infrastructure versus using an appliance that was purpose-designed and integrated for enterprise-class big data? Using Hadoop isn't just about clusters of server nodes; it is about the whole infrastructure, including systems management software, networking, and extra capacity for a variety of analytics processing purposes. A very important note is that the soft costs of labor time to evaluate, procure, test, deploy, and integrate a full stack of hardware and software are not examined here. These costs in time and money are easily underestimated, sometimes running to hundreds of hours or more, and alone are some of the most compelling reasons to consider a purpose-built big data appliance. That said, many organizations will look to their existing staff to manage this effort, and may heavily discount or exclude these human costs, not considering the opportunity cost of their time. ESG conducted a review of a buy option, the Oracle Big Data Appliance (powered by Intel Xeon processors), for a three-year total cost comparison with a closely matched commodity build infrastructure. While the exact pricing will of course vary by vendor and over time, this model uses the closest directly comparable equipment, including equivalently resourced HP ProLiant DL380 and/or Dell R730xd servers and Mellanox MIS-5025Q InfiniBand networking switches. The list prices are those that were publicly available at the time of writing. Few businesses will pay list price for equipment, but assuming that a similar discount level would be negotiated with Oracle as with other vendors, these calculations will use list costs.
With the same discounts applied, the magnitude of the savings might change, but the ratio should hold fairly constant. Evaluators should use this kind of ROI model as a reference while conducting their own calculations based on their specific environmental requirements. This exercise is primarily a validation of the comparable costs and financial calculations provided by Oracle; a similar past comparison may be found at the Oracle blog, "Price Comparison for Big Data Appliance and Hadoop." Though some readers can surely argue that lower cost servers and networking components are frequently chosen in real-world deployments, these options may be lacking in performance and reliability features, and simply aren't as directly comparable to the equipment included in the Oracle offering. As an example, storage capacity, and therefore density, in lower cost servers is generally much lower, which could then require two or even three racks of servers to match the capacity of a BDA. The Oracle Big Data Appliance itself works well for this comparison because it would serve well as an infrastructure for a medium-sized big data project, as depicted in Table 1, and because the cost and infrastructural details of the buy option are publicly disclosed. In order to make such a comparison, we will need a model project. Here are the assumptions for our theoretical medium-sized, enterprise-class big data project:

Users: 200 end-users: 150 business end-users, 50 data scientists/analysts.

Analytics Consumption: One-third ad hoc, one-third daily, one-third monthly.

Servers: Enterprise-grade with newer chipsets, backup power supply, high storage density (to ensure, for example, that Hadoop doesn't become storage-limited), plenty of cores to support parallelization (with more cores in the name node), plus plenty of memory to handle complex queries and columnar-based analytics.
Storage: Total raw data capacity of 1,728 terabytes (TB), in a balanced mix of refreshes (i.e., monthly, weekly, daily, real-time). Note that in the example in Table 1, the math suggests more terabytes (18 nodes * 96 TB/node = 1,728 TB), but one has to account for replication (3x replication in pure Hadoop, for example), peaks, compression, and sharding to arrive at the actual usable capacity. We will use a usability rate of 30%, which in this case yields about 518 TB.
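The capacity arithmetic above can be sketched in a few lines. The 30% usability rate is the paper's planning assumption covering 3x replication, peak headroom, compression, and sharding overhead:

```python
# Usable-capacity math for the theoretical medium-sized project.
NODES = 18
RAW_TB_PER_NODE = 96
USABILITY_RATE = 0.30  # paper's assumed usable fraction of raw capacity

raw_tb = NODES * RAW_TB_PER_NODE        # 1,728 TB raw across the cluster
usable_tb = round(raw_tb * USABILITY_RATE)

print(raw_tb, usable_tb)  # 1728 518
```

Sizing from the other direction works the same way: a project needing roughly 518 TB of usable capacity at a 30% usability rate implies about 1,728 TB of raw disk, which is what drives the 18-node, high-density server requirement.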
Network: Dedicated and fast bandwidth, with multiple InfiniBand switches to deal with contention; there can be varying amounts of replication and data movement within a cluster, but a big data infrastructure needs to ensure it doesn't become network-bound.

Queries: A diverse set of activities that spans from simple select statements to complex joins.

Integration and Information Management: Five new points of integration/transformation; during design, additional data sources will be added beyond, for example, a data warehouse. Those sources will require integration/transformation and information management work, and related licenses and hardware.

Project Time: Six months total running time, which includes one month for architecture, design, and procurement; one month for hardware/network configuration/implementation; two months for various development elements, MapReduce queries (assuming the use of Hive), statistics, user experience, integration, training, etc.; one month for final integrated/system testing and go-live; and one month for slack and over-runs.

For a full listing of all the resulting elements of cost associated with the theoretical project, see Appendix, Table 5.

Specific Costs for Build Versus Buy Comparison

Table 1 lists those project items where ESG believes there is a pricing choice between build and buy. The table reflects estimated pricing for the build consumption option only, with 18 nodes of quality hardware specified, directly comparable to the contents of an Oracle BDA.

Table 1.
Medium Big Data Project Three-year TCO Summary of Build Cost Items

Item | Cost | Notes
Servers | $708,561 | 18 servers at $39,365 each; enterprise-class, each with: two Intel Xeon 18-core processors, eight 16GB memory kits, twelve 8TB hard drives, network card, and dual power supplies
Networking | $37,500 | Three Mellanox 36-port InfiniBand switches at $10,500 each, plus $6,000 cabling
Rack enclosure | $5,000 | Including cables, looms, patch panels, etc.
Hardware support (three years) | $112,659 | 15% of list cost of above items
OS licenses | $66,042 | Red Hat Enterprise Linux at $1,223 annually for each of 18 nodes for 3 years
Hadoop licensing | $388,800 | Cloudera Enterprise Data Hub at $7,200 annually for each of 18 nodes for 3 years
Build Project Costs | $1,318,562 |

Source: Enterprise Strategy Group.

Oracle Big Data Appliance as an Alternative to Build

The Oracle Big Data Appliance was used as the buy candidate in this analysis, as it is a well-conceived big data infrastructure solution that will serve most enterprises well, both for initial big data projects and as organizations grow their requirements over time. Reasons for this include:

Software: The appliance includes software often required in big data projects, such as Cloudera's Distribution including Apache Hadoop (CDH) and Cloudera Manager for infrastructure administration; Oracle Linux; the Oracle JDK 8; MySQL Database Enterprise Server (Advanced Edition); the Oracle NoSQL Database (Community Edition); and
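As a quick check of the build-side arithmetic, the individual line items can be reconciled against the stated three-year total of $1,318,562; the three-year hardware support cost falls out as the residual of the other listed items, and works out to roughly 15% of the hardware list price:

```python
# Cross-check of the "build" line items against the stated 3-year total.
# Figures are the list prices quoted in the paper.
servers = 708_561
networking = 37_500
rack = 5_000
os_licenses = 66_042    # Red Hat: $1,223/node/year * 18 nodes * 3 years
hadoop = 388_800        # Cloudera EDH: $7,200/node/year * 18 nodes * 3 years
build_total = 1_318_562

# Hardware support is the remaining line item; recover it as the residual.
hw_support = build_total - (servers + networking + rack + os_licenses + hadoop)
hw_list = servers + networking + rack

print(hw_support)                      # 112659
print(round(hw_support / hw_list, 3))  # 0.15, i.e., 15% of hardware list price
```

The per-node license figures also reconcile: $1,223 x 18 nodes x 3 years = $66,042 for the OS, and $7,200 x 18 x 3 = $388,800 for the Hadoop distribution.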
R as a statistical development tool. At the time of this writing, the latest version of CDH 5.0 was included, with additional features such as Cloudera Navigator and Back-up and Disaster Recovery (BDR), plus additional functionality in Impala, Search, HBase, Accumulo, Spark, Kafka, and more. Although beyond the scope of this particular paper, Cloudera's full capabilities are relatively broad and deep as a Hadoop distribution, and should be explored and considered as a preferred option for the data platform. Additionally, security software functionality, including both network and disk encryption, Kerberos-based authentication, and more, is preinstalled and available as checkbox features, which will make protecting all that sensitive data more easily accomplished.

Hardware: Oracle's full rack Big Data Appliance includes 18 nodes with an aggregate total of 1,728 TB of raw storage capacity, 648 Intel Xeon cores, 2,304 GB of memory, and InfiniBand networking between nodes and racks. If this isn't enough, adding incremental nodes or whole racks is readily done. For some workloads, optimized networking throughout a big data infrastructure can be important to big data performance. The InfiniBand included in the Oracle Big Data Appliance, plus the storage and core design of the appliance, will greatly reduce the possibility of network bottlenecks for a Hadoop cluster. Again, much more information on the full specifications of the Big Data Appliance is readily available on Oracle's website, but the quality and capacity of the total rack is impressive.
Services: With an Oracle Big Data Appliance, you receive Oracle Premier Support; the Automated Service Request (ASR) feature that auto-detects major problems and maintains a heartbeat with Oracle support; configuration and installation support; a warranty; and straightforward options and techniques for expansion. Or, you can do it all yourself with a home-built "commodity" infrastructure. It is worth noting that the appliance is supported as a whole, reducing the risk and finger-pointing of having multiple vendors involved in troubleshooting the inevitable problems of a complex multi-component system. Finally, and this is particularly important for primarily Oracle shops, using the same InfiniBand, you can connect the Oracle Big Data Appliance to other Oracle engineered systems, such as the Oracle Exadata Database Machine and the Oracle Exalytics In-Memory Machine, both powered by Intel Xeon. But whether your organization is purely an Oracle shop or not, the Oracle Big Data Appliance presents a compelling alternative to do-it-yourself, Hadoop-oriented big data infrastructures for those companies that are serious about big data.

Looking for Big Data Savings? The Infrastructure Buy Option

Where can you save in a buy versus build scenario for big data? One big bucket of cost comes from the big data infrastructure itself. Of the three-year TCO of $1,318,562 in our theoretical build pricing, well over half of the costs come from server hardware and networking plus related support. It is very important to note that the labor time to order, assemble, and manage all the hardware and software was excluded, but would surely be very significant as well. Certainly, installation of a pre-configured appliance would be faster than onsite assembly of all components sourced from a number of independent vendors; see Appendix, Table 6.
The Oracle BDA buy option would deliver everything in the build scenario for (assuming list pricing) $725,000 over a three-year time horizon, fully loaded as described above (includes the Hadoop distribution, storage, network/bandwidth, hardware support, etc.).

Table 2. Buy Three-year TCO for the Oracle Big Data Appliance (full rack)

Item | Cost | Notes
Appliance (18 node) | $525,000 | Oracle BDA list price
Hardware and Software Support | $189,000 | $63K annually for 3 years of full support
Installation | $11,000 | By Oracle professional services
Total | $725,000 |

Source: Oracle, 2015.
ESG believes that for a true enterprise-class big data project of medium complexity, the appliance option could deliver roughly 45% cost savings versus IT architecting, designing, procuring, configuring, and implementing its own big data infrastructure. Frankly, that's a very surprising outcome, but the numbers hold up with the assumptions and configurations already described. But the build versus buy big data story doesn't end with a single project: There are additional and longer-term infrastructure costs and concepts to consider.

Table 3. Medium Big Data Implementation Three-year TCOs: Buy Wins

Item | Cost | Notes
Buy Total | $725,000 | Cost of Oracle Big Data Appliance, including installation and three years of support
Build Total | $1,318,562 | See Appendix for detailed inventory
Buy Savings | $593,562 | Over three-year period
ESG Estimated Savings | ~45% | Oracle BDA appliance lowers costs versus do-it-yourself

Source: Enterprise Strategy Group.

Big Data Build Proof-of-concept and Cluster Costs Misleading

Some would argue that for proof-of-concept projects, a major investment in a big data appliance like Oracle's Big Data Appliance, at a list price of $525,000 for the full rack of 18 servers, is simply too expensive; but a smaller starter option is available at a $185,000 list price. This entry point may be easier for many organizations to justify and, with six nodes supporting the same software environment, could be a very desirable scale-on-demand approach. The ultimate cost of expanding from six to 18 nodes would exceed purchasing the fully equipped system at the start by about 6%, but may help avoid initial overprovisioning for smaller environments.
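The roughly 6% expansion premium can be reproduced if incremental growth from six to 18 nodes is taken to mean buying three six-node starter units at list price; note that this expansion-pricing interpretation is an assumption, since the paper does not spell out incremental pricing:

```python
# Rough check of the ~6% expansion premium, assuming reaching 18 nodes
# incrementally means purchasing three 6-node starter units at list price.
# (This expansion-pricing model is an assumption, not stated in the paper.)
starter_list = 185_000    # 6-node starter configuration, list price
full_rack_list = 525_000  # 18-node full rack, list price

incremental_path = 3 * starter_list  # 555,000 for 18 nodes bought in stages
premium = incremental_path / full_rack_list - 1

print(f"{premium:.1%}")  # 5.7%, i.e., roughly the 6% premium cited
```

Under that assumption, the staged path costs $555,000 versus $525,000 up front, which is close to the ~6% figure in the text; support costs over the three years could shift the exact premium.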
While many may hesitate to make any such capital investment, the problem with the "keep it cheap, it is only a proof-of-concept" argument rests with the nature of a proof-of-concept: It must include all the elements of production complexity, or it may not represent the true scenario upon enterprise-wide rollout. If IT personnel spin up a commodity cluster, implement Hadoop, load some data into Hadoop, and run a few queries, the only real thing that has been proved is that Hadoop works. Such a proof-of-concept does not necessarily indicate the infrastructure's, and the related personnel's, ability to reliably support and process enterprise-grade big data analytics over the long term. In fact, making long-term big data architectural decisions based on a simplistic proof-of-concept could engender unforeseen long-term costs. Oracle's Big Data Appliance offers a shortcut to many of the integration efforts, with a full software stack including Cloudera's full Enterprise Data Hub edition, including all the latest innovations and updates. The Cloudera distribution is presented as an included perpetual license, not a per-server, ongoing subscription, and this alone has a significant impact on both cost and effort. The functionality of the data platform will be an asset because Cloudera's EDH is recognized as a technology leader in both range and quality of features, offering flexibility, scalability, performance, security, and management tools. Security features including strong authentication, network encryption, and on-disk encryption are an added bonus, if not a necessary consideration for any enterprise. Also included and fully integrated out-of-the-box are Oracle's Linux, JDK, and NoSQL database licenses, all of which increase the value versus procuring and assembling these software components individually.
Prospective customers should also use references from their peers, rather than relatively simple proof-of-concept projects, for making initial decisions on big data infrastructure.

Comparing to an Open Source Software Alternative

Again, many could argue that spending almost $40,000 per server for a Hadoop environment is extreme, but it is the actual public list cost of acquiring a server comparable to those in the Oracle BDA. Lesser servers would no doubt be cheaper, but would also have less storage capacity, memory, and processor resources, making them not really comparable. However, there may be some ways to reduce costs without implementing a totally different environment.
If InfiniBand networking isn't considered essential, a more modest setup might reduce the costs by an ESG estimate of approximately $12,500. This would impact performance, perhaps significantly so, depending on the data distribution, analytics algorithms, and workloads. Further, for the really brave, one might reduce the level of hardware vendor support across the board from premier 24x7 to a lower level for a further savings of perhaps $46,189. There is a definite risk in doing so, especially in a mission-critical enterprise environment, and while possible, it's not recommended! The other large portion of costs comes from software licensing, notably including Cloudera Hadoop (at $388,800 over three years) and, to a much smaller extent, Red Hat Linux (at $66,042). Suppose instead one was a firm believer in pure open source software, and was willing to go to the effort of supporting a generic Apache Hadoop software stack alongside a generic Linux instance. Both are freely available for download from a variety of sources, so it's an option. This also isn't particularly recommended, but it's not necessarily that rare either. Taking out all of these not-strictly-necessary costs does greatly reduce the cost of the build-your-own scenario, bringing the three-year TCO all the way down to $805,031. This might be called the "corner-cutting" build option. Shockingly, even after taking on all the additional effort and considerable support risk, reducing the network speed, and giving up all the advanced functionality and management capabilities of Cloudera and Red Hat, the outcome is that the Oracle BDA is still almost 10% less expensive. This is shown in Table 4.

Table 4.
Table 4. Medium "Corner-Cutting" Build Implementation Three-Year TCOs: Buy Still Wins

Item | Cost | Notes
Buy Total | $725,000 | Cost of Oracle Big Data Appliance, including installation and three years of support
Build Total | $805,031 | As described above
Buy Savings | $80,031 | Over three-year period
ESG Estimated Savings | ~9.9% | Oracle BDA appliance lowers costs versus do-it-yourself

Source: Enterprise Strategy Group, 2015.

Critical Considerations Beyond Costs

Big Data Longer-term Infrastructure Costs Favor Consistency

Though ESG believes that for medium-sized big data projects the buy option can deliver significant savings over build, companies should also consider what will happen with their next project: will you reuse the same infrastructure from the initial project, or will you create a series of big data project islands? Let's go back to lessons learned with ERP, a comparably heavy investment for many companies in years past, to consider an approach. One of the key issues many organizations faced in the early days of ERP, and that some still face, was piecemeal ERP implementations, resulting in a variety of infrastructures, databases, schemas, and user experiences. The result was a huge dependency on integration technologies and a never-ending ERP upgrade lifecycle. The time is now for IT and the business to realize that big data projects ultimately lead to something larger and more complex, and best practices like architectural guidelines should apply to big data just as they do today for ERP and CRM. ESG believes that customers choosing a consistent infrastructure for multiple big data projects across different departments will enjoy compounded savings due to the elimination of the learning curve associated with managing, maintaining, and tuning the infrastructure, plus the potential for infrastructure reuse.
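As a sanity check, the corner-cutting comparison in Table 4 reduces to simple arithmetic. The short Python sketch below reproduces ESG's figures from the costs quoted in this section; note that the "full" build baseline is implied rather than stated (it is the corner-cut TCO plus the items removed):

```python
# Reproduce the "corner-cutting" build TCO comparison (Table 4).
# All dollar figures are three-year costs quoted in this section.
removed = {
    "InfiniBand networking (drop to generic)": 12_500,
    "Premier 24x7 hardware support (downgrade)": 46_189,
    "Cloudera Hadoop licenses (go generic Apache)": 388_800,
    "Red Hat Linux (go generic Linux)": 66_042,
}

corner_cut_build_tco = 805_031   # build TCO after all cuts
buy_tco = 725_000                # Oracle BDA, installed, with 3 years of support

# Implied "full" build baseline -- derived, not stated in the paper:
implied_full_build = corner_cut_build_tco + sum(removed.values())

savings = corner_cut_build_tco - buy_tco
savings_pct = savings / corner_cut_build_tco

print(f"Implied full build TCO: ${implied_full_build:,}")
print(f"Buy savings vs. corner-cut build: ${savings:,} ({savings_pct:.1%})")
```

Even against the heavily stripped-down build, the appliance's $80,031 advantage works out to the ~9.9% in Table 4.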
Big Data Build Risk Never Disappears

Look forward three years from now: Your organization has become a successful big data organization, where IT easily adapts to new big data demands and where your executives, business users, and extended value chain all benefit and compete better due to big data. Will your organization easily reach that goal if every big data project is a build project? The IT department will constantly swim upstream to deal with new demands, constantly adjusting and updating custom infrastructures. The buy decision, however, may enable your organization to focus more on value and visualization versus procurement and deployment. ESG suggests that organizations take infrastructure development and deployment variability out of the pool of risk associated with reaching big data success. The risk of constantly reinventing and managing do-it-yourself infrastructures for big data may grow with the volume and complexity of big data deliverables.

Better Time to Market of the Buy Option

Beyond costs, determining the benefits is often the hardest part of calculating ROI for big data projects. The difficulty of predicting and detecting big data benefits comes from the fact that big data projects do not end per se, but rather turn into analytics applications, which grow, evolve, and lead to more discovery projects, providing benefits in continually evolving ways. Unfortunately, making this claim will not satisfy the CFO who is wondering when and from where the big data benefit will arrive. It needn't be that difficult, however. Usually some kind of basic business metric will offer an acceptable framework for benefit, which may receive tuning over time. For example, "better understanding of what products our customers like best" may link to increased product sales and a streamlined supply chain.
Or, "more accurately predicting energy demand" may link to a new premium pricing scenario for peak times or a well-tuned electrical grid bidding scheme. An agreed-upon metric for the benefits side of the equation forms the basis for ROI. But one variable seems to fit neatly into every single big data project, and that variable is time. Quite simply, the faster the business and IT are able to deliver the big data solution, the faster the business will realize any associated benefit. The tried-and-true time-to-market variable applies in every big data project. ESG performed a time-to-market benefit analysis associated with a medium-sized big data project (see Appendix, Table 6 for details), and the analysis revealed that a buy infrastructure will deliver the project about 35% faster, cutting the medium-sized project time from 26 weeks to 17 weeks. ESG believes that a buy versus build approach will yield a roughly one-third faster time-to-market benefit for big data analytics projects aimed at discovery, improved decision making, and business process improvement.

Better Performance with Oracle and Intel Tuning

There is one more major difference between building your own and choosing an engineered solution: tuning. If performance matters, and it always does, then IT will be required to tune the whole hardware and software stack to get optimal results. An ad hoc selection of commodity servers and open source software will by definition be untuned, and will require significant effort to become so. An Oracle BDA has already been extensively tuned in its design, through an active collaboration between Oracle and Intel that goes beyond the Xeon processors themselves.
Intel and Oracle have worked together to optimize the entire environment and have achieved great improvements versus a stock environment. Their self-reported benchmark comparison showed Oracle's BDA to be immediately ...x faster using BigBench on a 2TB input dataset.2 With further general-purpose tuning, they grew this speed advantage to almost 2x faster. Some of the adjustments included:

- Doubling the YARN container size.
- Enabling parallel execution in Hadoop.
- Enabling vectorization in Hadoop.
- Using Uber tasks for simple MapReduce queries.
- Tuning large pages in Linux.

Intel also reports that for specific workloads, additional custom tuning could deliver up to 3x faster results. Of course, this testing was done by experienced engineers in the vendors' labs. Even if a customer's IT department had staff with significant experience in tuning Hadoop, Linux, and other components of the environment, they may not see the same outcomes. Familiarity with the Intel Xeon processor architecture helps, and in any case it would add a lot of time and effort to tune a self-built cluster, all just to match what has already been done for the Oracle BDA.

The Bigger Truth

Oracle's Big Data Appliance has many advantages over a homegrown solution, and for those considering their deployment options for Hadoop, it is well worth taking the time to evaluate the total picture. Certainly many organizations will be more comfortable working within their existing Oracle relationship and enjoying the confidence of the quality services and support that accompany the fully vetted and pre-integrated appliance. Cost is outlined as one factor, but ultimately the satisfaction of the many users in the enterprise will determine the success of the big data initiative.

Serious Big Data Requires Serious Commitment

In order for big data to deliver on the promise of helping organizations look forward, just as ERP, for example, has helped automate business processes and business intelligence has helped organizations look backward, big data will require an enterprise-class facility.
Big data carries unique requirements for hardware, networking, software, and human skills. Enterprises should plan to build mission-critical infrastructures to support a series of big data projects and applications, yielding a big data facility that continually benefits the organization over time.

Avoid Common Assumptions about Big Data

Hadoop has been associated with sometimes over-hyped promises, like "you already have the data you need," "you already have the people you need," and "you can use inexpensive commodity infrastructures," which may not be true for many organizations. Hadoop is a tool that can be used in many ways to help organizations achieve big data results, but meeting end-user expectations of performance and availability often requires costly expertise and an enterprise-class infrastructure that spans the total needs for storage, processing, and lifecycle management.

Buy Will Often Trump Build for Both Big Data Infrastructure Costs and Benefits

A shorter path to lowering big data costs for the vast majority of enterprises can involve buying a preconfigured and purpose-built infrastructure such as an appliance. Rolling your own infrastructure often involves a long series of hidden risks and costs, and may undermine your organization's ability to deliver big data in the long term. In addition, the buy option for big data infrastructures compresses project durations. Thus, well-designed appliances often help deliver the one common benefit metric for big data: time to market.

2 Performance testing has not been validated by ESG Lab.
Appendix

Table 5. Elements of a Medium Enterprise Big Data Implementation Cost/Benefit: Build Cost Elements

Item | Value | Metrics/Comments

Hardware/Network
Nodes | 18 | Servers, each with 2 x 18-core Intel Xeon processors
Cores | 36 | Per node, as above
Memory | 128 | GB/server
Racks | 1 | Assumes up to 18 nodes/rack
Storage for nodes | 96 | TB/server for primary clusters, internal storage
Storage for information management in motion | 50 | Terabytes; assumes one-fourth of total data in motion maximum at any given time
Switches | 3 | Assumes InfiniBand, assumes 3x throughput improvement over generic 10GbE; plus an administration switch and assorted cabling
Hardware support | 15% | Of total hardware costs; third-party support

Software
Hadoop license costs | $7,200/node | Typically new licenses during early big data adoption; allow for two more licenses for warm backup

Source: Enterprise Strategy Group, 2015.

Table 6. Medium Big Data Project Build Versus Buy Project Time Analysis

Phase | Build Time | Buy Time/Savings (assumes 4-week months) | Reason for Reduction
Architecture, design, procurement | 1 month | 2 weeks/2-week savings | Hardware/network pre-designed, single-source procurement
Hardware, network configuration, and implementation | 1 month | 2 weeks/2-week savings | Appliance pre-configured; fewer pieces to implement separately and certify
Development, integration, training, etc. | 2 months | 7 weeks/1-week savings | Appliance stability provides more solid foundation for development, testing, etc.
Integrated system test, go live | 1 month | 3 weeks/1-week savings | Expect smooth system test and go-live due to appliance's fewer moving parts
Slack and over-run | 1 month | 3 weeks/1-week savings | More dependable infrastructure will lead to more dependable scheduling
Performance tuning | 2 weeks | 0 weeks/2-week savings | Oracle BDA pre-tuned
Project Totals | 6.5 months (26 weeks) | 4.25 months (17 weeks) | Buy Time-to-Market Benefit = 9 weeks, or ~35%

Source: Enterprise Strategy Group, 2015.
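The totals in Table 6 are straightforward to verify. A minimal sketch, assuming the 4-week months stated in the table:

```python
# Verify the build-versus-buy timeline totals in Table 6 (4-week months assumed).
phases = {
    # phase: (build_weeks, buy_weeks)
    "Architecture, design, procurement":        (4, 2),
    "Hardware/network config & implementation": (4, 2),
    "Development, integration, training":       (8, 7),
    "Integrated system test, go live":          (4, 3),
    "Slack and over-run":                       (4, 3),
    "Performance tuning":                       (2, 0),  # BDA is pre-tuned
}

build = sum(b for b, _ in phases.values())   # 26 weeks (6.5 months)
buy = sum(y for _, y in phases.values())     # 17 weeks (4.25 months)
saved = build - buy                          # 9 weeks

print(f"Build: {build} weeks, Buy: {buy} weeks")
print(f"Time-to-market benefit: {saved} weeks (~{saved / build:.0%})")
```

The 9 weeks saved against a 26-week build schedule is the source of the ~35% time-to-market figure cited in the body of the paper.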
What Table 6 suggests is that a medium-sized project will complete in more than one-third less time due to a buy infrastructure decision over build. The reduced procurement time, reduced configuration and design time, reduced implementation time, and a more dependable infrastructure out of the box, which speeds development and testing, are all factors that contribute to buy accelerating big data projects. The time-to-market benefit of buy keeps giving, too, because the same time savings repeats in every subsequent big data project.