25 March 2014 | Volume 5 Issue 2

The Blotter presents ITG's insights on complex global market structure, technology, and policy issues.

Big Data, Big Decisions
The coming sea change in technology investments

CONTRIBUTOR
David Meitz, Managing Director, Chief Technology Officer

CONTACT
Asia Pacific | Canada | EMEA | United States

Market participants do not need to be told that they are working in an era of Big Data. They experience it every day. Developing an appropriate response, however, is going to change that daily experience in a number of important ways. The relationship with technology will inevitably change. Internal relationships will be altered. And analytics will dominate any list of required capabilities. To understand how and why these changes are necessary, it is first essential to understand where the Big Data phenomenon has come from, and why it is here.

1. THE BIG DATA PHENOMENON

Big Data is not an isolated phenomenon. In fact, it might more accurately be viewed as the culmination of several separate but interrelated financial market trends of recent years. The first of these, and perhaps the most obvious, is the fragmentation of liquidity across multiple venues. It has always been clear that this would increase the volume of market data feeds overall. But the landscape has been rendered more complex by the data required for each individual trade as it splits across multiple venues. The need to reconcile, record and prove best execution for parent, child and even grandchild orders has placed its own demands on data. At the same time, stagnation in equities has led to far greater interest in, and demand for, multiple asset classes in a single portfolio, both for hedging and for speculation. Again, it is apparent that simply diversifying holdings will increase the data to be consumed.
But that, too, is compounded by the data-intensive nature of instruments like fixed income, foreign exchange, and derivatives, which increasingly have their place in today's portfolios. As regulatory demands pull many of these instruments onto exchanges, we can expect data volumes to increase once more. The third factor is that all this is happening at a time when commissions and fee income are less certain than ever before. The irony of data proliferation is that traditional sources of market data cannot be considered a source of alpha, since everyone has the same information. Instead, smart players are seeking out new, specialist data, from both formal and informal sources, on which to build their competitive proposition, and are diversifying into other revenue-generating services, including data-intensive research, with inevitable consequences for the volume of data they must handle.
Not all data is created equal

Clearly, the Big Data phenomenon is inextricably linked to broader market trends. However, it is not enough for firms to understand where data comes from. Addressing the data challenge depends on the ability to distinguish between data types and to decide how to interact with each type. Two factors are critical: how data is stored, and when it needs to be accessed. In most cases, the response to any given data type will be determined by its sensitivity to latency. For example, on the trading desk, where time is measured in sub-second increments, the speed at which pricing data becomes available to inform trading decisions is a key factor in determining its value. In contrast, tick data used for compliance purposes, such as retracing the pricing of an order, is measured in hours if not days. As with records of e-mails, phone calls, and instant messages retained for compliance and auditing, security is a more important consideration than latency.

[Figure: A layered data architecture. Consumption Layer: market research, algorithms, EMS/OMS and real-time analytics, fed by trade execution venues (the Street). Service Layer: a service bus carrying real-time market data, historical market data, research and a dynamic cache. Provision and Storage Layer: a hybrid storage model combining local storage at the data center with third-party cloud storage. Source: ITG, Inc.]

Similarly, data acquired in the name of market research might include important insights, such as retail sales data, cell phone usage trends, or medical device information, but it can in no way be regarded as latency-sensitive. Sensitivity to latency can also change over time. The gap between historical data used for dynamic or intra-day trading analytics and real-time pricing data is narrowing. For example, analysis of dynamic, intra-day costs on just-executed trades is designed to help make better-informed decisions when trading a large list.
Although the latency for this data cannot be as low as that for pricing data, the faster it is made available in the trading cycle, the more beneficial it is to the decision-making process. It is also important to note that the increase in data volumes has not necessarily improved the ratio of signal to noise. Almost two-thirds of incoming data has little if any value. Firms need to be able to distinguish between what is useful and what is not.
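The storage-versus-access distinctions described above can be sketched as a simple classification. The latency tiers, feed names and routing targets below are illustrative assumptions for the purposes of the sketch, not part of any ITG system:

```python
from dataclasses import dataclass
from enum import Enum

# Hypothetical latency tiers for the data types discussed above.
class LatencyTier(Enum):
    REAL_TIME = "sub-second"    # pricing data feeding trading decisions
    INTRA_DAY = "minutes"       # dynamic analytics on just-executed trades
    ARCHIVAL = "hours-to-days"  # compliance tick data, e-mails, call records

@dataclass
class DataFeed:
    name: str
    tier: LatencyTier
    security_critical: bool  # for archival data, security outweighs latency

def storage_target(feed: DataFeed) -> str:
    """Route a feed to a storage layer based on its latency sensitivity."""
    if feed.tier is LatencyTier.REAL_TIME:
        return "in-memory cache"
    if feed.tier is LatencyTier.INTRA_DAY:
        return "local high-speed storage"
    return "secure archive"

feeds = [
    DataFeed("market pricing", LatencyTier.REAL_TIME, False),
    DataFeed("compliance tick history", LatencyTier.ARCHIVAL, True),
]
for f in feeds:
    print(f.name, "->", storage_target(f))
```

The point of the sketch is that the routing decision is driven by when the data must be accessed, not by what it contains.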
Storage, access and interaction

This is the fundamental challenge of Big Data: understanding and analyzing all incoming information, determining its origin and type, and then deciding where and how it is to be stored and accessed. The treatment of real-time, interactive data demanded by the trading desk is very different from the treatment of essential but less time-critical data. The past decade has seen a regulation-driven emphasis on secure archiving and storage, and the development of effective user-access policies, both in data centers and across the enterprise. But the game is changing once again, and the emphasis is shifting towards ensuring that data remains usable, interactive and immediately available. That in turn changes the demands made on the technology deployed at firms: instant, secure and interactive access needs to co-exist with long-term archiving and storage.

2. THE PRACTICAL RESPONSE TO BIG DATA

However we look at it, there are costs attached to the Big Data challenge. With storage demands for mid-size firms growing at approximately 90 terabytes a year, existing capabilities are quickly becoming exhausted. Costs are not confined to the capital expenditure required for hardware and memory capacity. There are also operational costs attached to the support and maintenance of servers, support for writing queries against databases, and the addition of performance-monitoring applications to the growing infrastructure. Furthermore, as more applications draw on databases to access data for their own precise needs, the development costs associated with writing and testing application interfaces increase. At these volumes, even low-power servers and server architectures still require cooling, back-up and redundancy, all of which add to already high energy bills.
However, it is an unfortunate truth that the increase in data volumes has not been matched by a corresponding increase in trading volumes, which strains existing cost-revenue ratios. It is hard to draw a direct connection between investment in extra data management and storage capacity on the one hand and increased revenues on the other. In this cost-constrained environment, new and innovative approaches to data access and storage are required.

Building an infrastructure for data management

Cloud-based infrastructure, whether private or public, is one potential answer. Already widely deployed in other industries, cloud-based storage has begun to be adopted within the financial sector for certain sets of data, and its use today is far more advanced than it was only a few years ago. Although cloud providers have made good strides in enhancing the security of their offerings, an understandable resistance remains, particularly when it comes to storing vital client information. Redundancy and latency also remain key issues, although, as discussed above, there is plenty of non-latency-sensitive data that could usefully be stored in the cloud. Perhaps the real obstacle for cloud-based solutions is that most commercial models charge by access. For financial firms that leverage the cloud for heavily interactive data, that is likely to prove far more costly than local storage. Ultimately, the role of cloud computing within financial services is likely to be within a hybrid model that retains data centers and localized storage. But even in a hybrid model such as this, firms still need multi-layered storage facilities on site for real-time, intra-day transactions, and for the manipulation and review of various trade analytics, trade data, transaction activity and market data.
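A toy cost comparison illustrates why an access-charged cloud model suits archival data far better than heavily interactive data. All rates below are invented for illustration only; real vendor pricing will differ:

```python
# Toy cost comparison for the access-charged cloud model described above.
# All rates are illustrative assumptions, not real vendor prices.
def monthly_cost_cloud(gb: float, accesses: int,
                       storage_rate: float = 0.02,
                       access_rate: float = 0.0004) -> float:
    """Cloud: cheap storage, but every access is billed."""
    return gb * storage_rate + accesses * access_rate

def monthly_cost_local(gb: float, accesses: int,
                       storage_rate: float = 0.10) -> float:
    """Local: higher fixed storage cost, access effectively free."""
    return gb * storage_rate

archival = (1000, 50)            # 1 TB, rarely touched
interactive = (1000, 5_000_000)  # 1 TB, hit constantly by trading apps

for label, (gb, hits) in [("archival", archival), ("interactive", interactive)]:
    cloud, local = monthly_cost_cloud(gb, hits), monthly_cost_local(gb, hits)
    print(f"{label}: cloud ${cloud:,.2f} vs local ${local:,.2f}")
```

Under these assumed rates, the rarely touched archive is cheaper in the cloud, while the heavily interactive data set is far cheaper on local storage — which is the economics driving the hybrid model.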
Central to this solution will be some form of aggregated cache. A cache server enables firms to take a single instance of data, which can be archived and retrieved at a later stage, and analyze it immediately. A high-performance cache server enables firms to store and retrieve data at very high speed. In the right infrastructure, it allows a single instance of the data to be maintained and accessed by every application and function that requires it.

The human face of analytics

As Big Data establishes itself within the trading landscape, it is becoming apparent that intelligent analytics are critical. Their role in supporting trade performance is well understood, but analytics will increasingly become a critical feature in the management of the technology infrastructure itself. Interestingly, this is likely to lead to a closer relationship between technology and analyst functions, mirroring the closer relationship between technology and compliance that has developed over the past decade. This is because effective analytics depend on a keen understanding of the expected business result. Take dynamic TCA as an example: effective results are delivered by a core group of analytics developers who work closely with the business line to understand exactly what data is expected, at what frequency, and how that information should be displayed. Equally, an algorithmic trading platform benefits from a speedier time to market where there is a close relationship with technology teams. When detailed specifications come from more intimate interaction between development teams and business lines, using agile development practices, faster results can be delivered in smaller increments. The same principle applies to infrastructure analytics. Getting the right solution will depend on seamless relationships between technology and the business. 3.
RETHINKING TECHNOLOGY INVESTMENT

One of the more likely consequences of the Big Data phenomenon will be a change in the way that technology investment is directed. The past decade has been dominated by investment in essential OMS and EMS functionality. Now that this functionality has matured, the emphasis is moving towards capturing, analyzing and presenting richer data back to the EMS and OMS, with the goal of making smarter trading decisions. Increasingly, the EMS and OMS will be viewed as the means by which better-integrated analytics from multiple sources can be presented, published and consumed by all client-facing applications. This analytics capability is the key that opens the door to more efficient technology infrastructures. Advanced performance analytics will be joined by the ability to identify and assess incoming data, supporting multi-faceted and flexible storage, archiving and retrieval capabilities, as well as the manipulation and review of real-time transactions. But perhaps the biggest sea change of all will be in the way the technology itself is viewed and understood. Although the industry has been well served by discrete solutions and discrete functionality in the past, the evolution of trading technology will no longer be about simply adding new features or functions. More emphasis will be placed on the underlying architecture and modern development techniques that create a strong but highly flexible backbone running through every element of the business.
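The publish/subscribe pattern behind such a flexible backbone can be sketched in a few lines. This is an illustrative toy bus, not any particular product; the topic names, payload and consumers are invented for the example:

```python
from collections import defaultdict
from typing import Any, Callable

# Minimal publish/subscribe "service bus" sketch (illustrative only).
class ServiceBus:
    def __init__(self) -> None:
        self._subscribers: dict[str, list[Callable]] = defaultdict(list)

    def subscribe(self, topic: str, handler: Callable) -> None:
        """Register an application's handler for a topic."""
        self._subscribers[topic].append(handler)

    def publish(self, topic: str, payload: Any) -> None:
        """Deliver one published instance of the data to every subscriber."""
        for handler in self._subscribers[topic]:
            handler(payload)

bus = ServiceBus()
received = []
# Two hypothetical consumers (an EMS view and an OMS view) share the
# same published analytics update rather than each producing their own.
bus.subscribe("analytics.tca", lambda p: received.append(("ems", p)))
bus.subscribe("analytics.tca", lambda p: received.append(("oms", p)))
bus.publish("analytics.tca", {"symbol": "XYZ", "slippage_bps": 3.2})
print(received)
```

The design choice the sketch highlights is that the producer of the analytics knows nothing about its consumers: capabilities built for one product can be published once and subscribed to by any number of applications.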
Technologies such as service-oriented architecture that carry information easily and efficiently through an organization will dominate. Where individual trading tools, analytics and functional components have historically been tied neatly to one product or service, a service-oriented architecture enables those capabilities to be shared across an almost unlimited number of applications. Consequently, functionality that was originally produced for one particular product (charting, for example) could be published to the architecture and subscribed to by other applications. With this kind of architecture, the dynamic storage cache can take in volumes of data and publish them back in real time for consumption by any relevant application. This will be the killer application of the Big Data era, and all efforts toward managing and storing the data will need to be directed toward making sure that meaningful data can be published and ready for consumption as quickly as possible.

CONCLUSION

Caught in the immediate firestorm of overwhelming volumes of data, the long-term impact of Big Data can be hard to spot. Take a step back, however, and it becomes clear that Big Data is not just the latest in a long line of operational challenges that can be solved by money and microprocessors. Big Data is the product of significant and irreversible market changes in the financial sector, enabled by the pervasive information that technology has delivered to almost every area of society. Viewed in this light, Big Data is the catalyst for profound changes in the way firms manage their operations and their organizations. We cannot be sure what the end result will be. We can be sure, however, that Big Data is dramatically re-shaping the landscape, and that it requires a correspondingly dramatic change in the way technology is deployed to manage it.

© 2014 Investment Technology Group, Inc. All rights reserved.
Not to be reproduced or retransmitted without permission. The positions taken in this document reflect the judgment of the individual author(s) and are not necessarily those of ITG. These materials are for informational purposes only, and are not intended to be used for trading or investment purposes or as an offer to sell or the solicitation of an offer to buy any security or financial product. The information contained herein has been taken from trade and statistical services and other sources we deem reliable but we do not represent that such information is accurate or complete and it should not be relied upon as such. No guarantee or warranty is made as to the reasonableness of the assumptions or the accuracy of the models or market data used by ITG or the actual results that may be achieved. These materials do not provide any form of advice (investment, tax or legal). ITG Inc. is not a registered investment adviser and does not provide investment advice or recommendations to buy or sell securities, to hire any investment adviser or to pursue any investment or trading strategy. Broker-dealer products and services are offered by: in the U.S., ITG Inc., member FINRA, SIPC; in Canada, ITG Canada Corp., member Canadian Investor Protection Fund ("CIPF") and Investment Industry Regulatory Organization of Canada ("IIROC"); in Europe, Investment Technology Group Limited, registered in Ireland No ("ITGL") and/or Investment Technology Group Europe Limited, registered in Ireland No ("ITGEL") (the registered office of ITGL and ITGEL is Block A, Georges Quay, Dublin 2, Ireland). ITGL and ITGEL are authorised and regulated by the Central Bank of Ireland; in Asia, ITG Hong Kong Limited (SFC License No. AHD810), ITG Singapore Pte Limited (CMS Licence No ), and ITG Australia Limited (AFS License No ). All of the above entities are subsidiaries of Investment Technology Group, Inc. MATCH NowSM is a product offering of TriAct Canada Marketplace LP ("TriAct"), member CIPF and IIROC.
TriAct is a wholly owned subsidiary of ITG Canada Corp.