Data Management for Risk in Capital Markets: Leading practices from Xenomorph


December 2012

About Chartis Research

Chartis is a leading provider of research and analysis covering the global market for risk management technology. Our goal is to support enterprises seeking to optimise business performance through better risk management, corporate governance and compliance. We help clients make informed technology and business decisions by providing in-depth analysis and actionable advice on the broad spectrum of risk technology offerings. Areas of expertise include:

- Credit risk
- Operational risk and Governance, Risk and Compliance (GRC)
- Market risk
- Asset and Liability Management (ALM) and liquidity risk
- Financial crime
- Insurance risk
- Regulatory requirements, including Basel II, Basel III, Dodd-Frank and Solvency 2

Chartis is solely focused on risk technology, giving it a significant advantage over generic market analysts. Chartis has brought together a leading team of analysts and advisors from the risk management and financial services industries. This team has hands-on experience of implementing and developing risk management systems and programmes for Fortune 500 companies and leading consulting houses. Chartis Research is authorised and regulated in the United Kingdom by the Financial Services Authority (FSA) to provide investment advice.

No part of this publication may be reproduced, adapted, stored in a retrieval system or transmitted in any form by any means, electronic, mechanical, photocopying, recording or otherwise, without the prior permission of Chartis Research Ltd. The facts of this report are believed to be correct at the time of publication but cannot be guaranteed. Please note that the findings, conclusions and recommendations that Chartis Research delivers will be based on information gathered in good faith, whose accuracy we cannot guarantee. Chartis Research accepts no liability whatsoever for actions taken based on any information that may subsequently prove to be incorrect or errors in our analysis. See the Chartis Terms of Use on the Chartis Research website. RiskTech100 is a Registered Trade Mark of Chartis Research Limited. RiskTech Quadrant is a Registered Trade Mark of Chartis Research Limited.

Table of Contents

1- Executive Summary
2- The fundamental challenges of data management for risk
 2.1 Key business drivers
 2.2 Key technology challenges
3- Practical implications
 3.1 Control
 3.2 Agility
4- Industry solutions
 4.1 Control over data and analytics
  4.1.1 Risk data integration
  4.1.2 Data analytics
  4.1.3 Spreadsheet control
  4.1.4 Centralized data management - costs and benefits
 4.2 Agility
  4.2.1 Real-time data management
  4.2.2 Drill-down tools
  4.2.3 Visualization
  4.2.4 Fragmentation as the solution for agility - costs and benefits
5- Leading practices from Xenomorph
 5.1 Unified data and analytics
 5.2 Data model designed for change
 5.3 Open architecture for client customization
 5.4 Deeper analysis with easy deployment
6- Future Outlook
7- Further Reading

List of Figures and Tables

Figure 1: Regulation - the top driver for risk management
Figure 2: Data: the leading obstacle to enterprise risk management
Figure 3: Balance between control and agility
Figure 4: Data management process
Figure 5: Control and agility with TimeScape
Figure 6: TimeScape data flow
Figure 7: TimeScape's open architecture
Figure 8: Querying and visualization in TimeScape

1- Executive Summary

Since 2008, there has been a period of reflection and self-examination among financial institutions. In their efforts to discover what went wrong, data management for risk has become increasingly important and a higher priority for firms, as they now recognize that without the right data their view of risk will not be accurate. However, many are still unsure what they should do to remedy their current inadequacies. Having heard too many unfulfilled promises from technology vendors about enterprise data management systems that will solve all their data problems, they are skeptical of what can be accomplished.

In particular, financial institutions have begun to believe that they must choose between control and agility: that they can either get control over data or have agility, but not both. This has led firms either to implement rigid systems that put data in one repository, or to keep data in silos to enable agility. Neither of these options solves the problems most firms have with risk data: complex, opaque systems that fail to integrate data from across the enterprise and cannot reconcile different sets of data.

Under increased pressure from regulators, auditors, and shareholders, financial institutions are searching hard for technology solutions that will allow them to cut the Gordian knot of silo-based data management without forcing them into a rigid and inflexible centralized system. Achieving this is far from easy, but that does not mean that firms should tolerate inadequate data systems. There are technologies available that can help firms get control of data and retain the flexibility they need to manage their business. In particular, firms need to implement technologies that enable the full data management for risk process. Data management for risk should include extract-transform-load (ETL), storage, analytics, and visualization capabilities, ideally together in a single architecture. Improved data integration, storage, and validation technologies, combined with open architectures, better query tools, and new visualization techniques, can allow firms to create auditable master copy versions of risk data and allow relevant functions within the firm to use them. While a single master copy is conceptually an important requirement, in practice multiple master copies are needed across different business units.

Chartis research shows that many financial institutions have made progress towards improved risk management through better management of data, but they still have some way to go. Equally, while vendors are offering more advanced systems, many still do not provide the complete range of functionality needed to enable advanced data management for risk. This report highlights some of the key trends in data management for risk in capital markets and describes leading practices from Xenomorph, an innovative solutions provider in this emerging area.

2- The fundamental challenges of data management for risk

The events of 2008 were the result of business practices rewarding returns without sufficient reference to risk, but it is now clear that in many cases the management of financial institutions was driving blind. Firms struggled to aggregate and understand their risk exposures due to deficiencies in accessing, analyzing and understanding risk data from across their enterprise. The financial crisis therefore acted as a spur for firms to recognize these problems and highlighted the links between the challenges of managing data and the underlying risk. In this light, data management for risk is now taken more seriously and considered a vital part of the overall risk management program.

2.1 Key business drivers

The most significant drivers stem from the increasing complexity and volatility of capital markets. These include:

- Increased trading speed - market data is entering data management systems at a faster rate, and that data needs to be supplied to numerous functions of the firm more quickly;
- Increased complexity of products and transactions - data is arriving in new and more complex formats (e.g. spreadsheets, curves, volatility surfaces);
- Increased regulatory burden - regulations such as Dodd-Frank, Basel 3, and MiFID have increased the amount of data required for regulatory calculations and reporting;
- Increased internal demand for risk information - firms are trying to embed risk data in the front office and implement risk-based performance measures across the firm.

In the past, many financial institutions were not aware of the crucial importance of data management for improving risk management. This was partly because financial institutions and technology vendors focused on the many risk and regulatory compliance initiatives (e.g. Pillar 1 of Basel II) or the reporting requirements (e.g. Basel II Pillar 3, MiFID). These regulations did not fully specify data management or data governance requirements and, as a result, data management and its associated complexities were neglected. However, many of these projects ran over budget and did not deliver the desired outcomes, often because the challenges of data management were underestimated or ignored. As a result, there is now greater awareness that data must be brought under control for firms to understand their risks.

Recognizing the importance of risk data is part of the wider acceptance of the role of risk management in avoiding a repeat of the financial crisis, a process that is being driven by two main factors: regulatory pressure and business pressure. In the wake of the economic downturn, firms have been faced with an increased regulatory burden and stricter requirements for high-quality, audited data feeding their systems and reports. For example, under Basel 3, financial institutions are required to execute specific calculations, such as new liquidity risk and credit value adjustment metrics, and to provide more frequent, in-depth risk and compliance reports. The results of Chartis Research's 2012 RiskTech100 survey showed that complying with regulations was the leading factor affecting firms' risk management programs (see Figure 1, below).1 In some firms, these pressures are leading to a mere box-ticking approach; in other firms, they are pushing management to re-think their data management strategies and work towards new target architectures.

1 RiskTech100 2012, Chartis Research

Figure 1: Regulation - the top driver for risk management

[Bar chart: most important factors impacting firms' ERM programmes, scored from 1 (least important) to 5 (most important). Factors: cost reduction; integration of silo-based risk management processes and systems; need for upgrading and refreshing risk management processes and systems; increased shareholder pressure for better governance; adequacy of risk measurement methodologies; concerns over increased levels of credit losses; compliance with domestic and international regulations.]

Additionally, auditors, clients and shareholders are demanding a return to higher levels of profitability, supported by sound data management strategies that can deliver greater transparency and improved risk management. Hence financial institutions need to tackle a number of technology challenges to achieve better data management for risk.

2.2 Key technology challenges

Boards are beginning to see the links between the business drivers above and the costs of inadequate data management systems. In an attempt to address these issues, many financial institutions are now tackling the following key technology challenges:

- Maintaining control over data without sacrificing agility - the key challenge for many firms is to control their data, that is, to aggregate it and then integrate it into a data model that makes data across the enterprise consistent. This may conflict with the need for firms to update their systems to react to changing markets or to changes within the firm.
- Real-time data platform for risk management - increased trading speed (for example, through algorithmic trading) and intra-day regulatory reporting are making real-time risk management and, therefore, real-time data ever more important. To keep up with faster markets and enable on-demand risk decision support, data management systems need to support much faster risk data aggregation and provide real-time updates.
- Transparent access to risk data - accessibility of risk data is a crucial element of risk management systems. Risk functions and front-office staff both need to be able to access and view risk data easily in order to make effective decisions. This means both providing end-users with a clear view of risk information and allowing them to query the data easily and effectively.
- Legacy systems that are rigid and unable to change - this is a major problem for larger financial institutions, which are finding that the complex data and risk systems they installed in the past are difficult to change without performing a complete overhaul. Financial institutions need to find data management systems that can incrementally link into pre-existing components and reduce the complexity of their legacy systems.

These are significant technology challenges, and resolving them will have a major impact on the efficiency and accuracy of risk management systems.

3- Practical implications

Recent research by Chartis has shown that poor data management is seen by financial institutions as the number one obstacle to improved enterprise risk management (see Figure 2, below).2

Figure 2: Data: the leading obstacle to enterprise risk management

[Bar chart: main challenges to successful ERM, scored from 1 (not a challenge) to 5 (significant challenge). Challenges: achieving reliable intra-day ERM; individual responsibility and accountability for risk management; board-level/senior management support; adequate external advice; adequate in-house knowledge and expertise; adequate systems and data to support ERM; internal risk culture; clear definition of the firm's risk appetite.]

Poor data management has led to a number of negative consequences for financial institutions. These include:

- Inaccurate risk calculations - siloed data systems and a lack of consistent data mean that the risk function is unable to calculate risks correctly on a consistent basis, leading to reconciliation problems. As a result, the firm and its staff are ill-informed about the risks taken at a business-line and enterprise-wide level, leading to poor decision-making and costly losses.
- No enterprise risk view - inadequate data prevents the formation of an accurate view of risk across the enterprise, stopping the risk function from saying where risk should be reduced or mitigated and allowing areas of the firm or risk types to get out of control. For example, firms cannot get a clear view of capital, collateral, and liquidity across the firm. This prevents the optimal allocation of these assets and the ability to raise revenue from unused assets (e.g. through collateral optimization).
- Ineffective risk management systems - the rule of "garbage in, garbage out" applies to even the most expensively acquired and installed risk management systems, which will be nothing but a drain on resources if they don't receive high-quality data.
- Increased regulatory costs - the cost of poor data management extends to costs imposed by regulators. Firstly, poor data could lead to firms overestimating their regulatory capital or liquidity requirements, placing an unnecessary cost on themselves. Secondly, if regulators find that figures in reports cannot be reconciled (for example, reconciliation between capital requirements and the balance sheet is required under Basel 3), they can impose sanctions, such as not approving internal risk models (which would increase capital requirements) or fines. Moreover, compliance failures would have an extremely damaging effect on the firm's reputation, internally and externally, causing problems with shareholders and staff.

The unifying theme of these practical implications is that financial institutions need to improve their data management systems and make sure they have full control of their data while maintaining the agility needed to achieve business goals.

2 Collaborative Risk Management 2012, Chartis Research

3.1 Control

The problems associated with a lack of control over data stem from the need to aggregate data to provide an enterprise-wide view of risk and the requirement to transmit risk information to different areas of the firm. Firms need to have control at three crucial steps of data management for risk:

- Data aggregation - getting all the data in the right place and ensuring it is validated and consistent.
- Risk analytics - providing data to risk engines and ensuring reconcilable results are produced.
- Presentation of risk information - ensuring clear, reconcilable risk information is available to relevant staff, for internal use, and for use in regulatory reports.

A lack of control at these steps can ultimately lead to:

- Siloed data - a siloed data management environment prevents or obstructs the aggregation of risk data from across the enterprise, creating an incomplete and fragmentary view of risk. While most financial institutions and technology vendors see the need to integrate risk data, many do not recognize that risk is a multi-dimensional concept. Risk data needs to be assessed across asset classes, legal entities, counterparties, and risk types. Replacing one set of silos with another by only integrating data across business lines will not improve data management for risk sufficiently.
- Overuse of spreadsheets - while structured data and central databases are easy for risk functions and technology experts to use, many non-specialists find it difficult to access, understand, or use these centralized systems. In most financial institutions, traders, risk managers, and operational staff tend to rely on spreadsheets for financial and risk analysis. While spreadsheets are a valuable tool, they are difficult to control and prone to human error. Too often spreadsheets are bound to each individual's desktop and are not subject to central audit and supervision.
- Separation of the front and back office - while the use of data management in the back and middle office is well understood, front-office data management does not yet seem to feature in the thinking of many data management practitioners. The different technical needs of the front and back office have led to the fragmentation of data systems that could support both, so that there is little communication between the two, and what communication exists is often mutually incomprehensible.
- Transparency and audit - full, granular audit trails are increasingly important to regulators, who want more transparency and want to ensure that accurate measures of a firm's financial position are available. The Financial Stability Board has recommended that risk data aggregation, data completeness, and data architecture requirements be incorporated into Pillar 2 of Basel 3 to improve reporting and transparency. With data scattered throughout ad hoc systems, it will be virtually impossible to explain to senior management, auditors, or regulators how a particular key metric has been computed. Without a unified data model, multiple audit trails will not be reconcilable and will often be inaccurate.
- Reconciliation problems - reconciliation problems are a key productivity and risk management issue in many sell-side and buy-side firms. They are usually due to the lack of a consistent data management environment. When reconciliation problems do occur, it can be impossible for business or technical staff to explain why, and different areas of the firm will be operating on different and incompatible data and views of risk.
- Separation of real-time and batch data - real-time data management for risk is increasingly necessary for financial institutions. Conventional relational database management systems (RDBMS) can't keep up with the requirements of live risk computation, but batch data is still needed. Real-time risk data is too often segregated onto a different technical infrastructure from batch risk data, which makes it difficult to aggregate and reconcile the data, and to upgrade computations from batch to real-time.

3.2 Agility

The problems associated with a lack of agility center on the inability of financial institutions to alter their systems to react to market events and to use data to make timely decisions. If getting control over the data is essential for a strategic, enterprise-wide approach to risk, agility is required for tactical, real-time risk management decision-making. The problems related to a lack of agility include:

- Time to market - time to market has been a key challenge for financial institutions. Rigid systems mean that requests from end-users for new functionality are not satisfied in an adequate timeframe. In some cases it can take days for new functionality to be introduced into a system, by which time the market has moved again and traders have missed an opportunity.
- Support for new instruments - closed vendor-supplied systems prevent firms' IT support from adding new instruments themselves. Any significant delay in instrument support can mean the difference between market leadership and irrelevance. If data management systems are not flexible enough, traders will use spreadsheets until, and even beyond, the time that these systems eventually catch up.
- Inability to change or update data - the inflexible legacy architecture of many existing data management environments requires direct IT support to modify or add to underlying data and analytics. Market conditions often require changes to computations in the middle of a trading day when volatility demands it. Systems designed from the ground up to allow end-users to add, remove, or modify asset classes, models and analytics at the moment of need will provide traders and risk managers with a competitive advantage.
- Separating trading from risk - a lack of agility in a data management system can lead to trading and risk function technology systems being separated. There are legitimate reasons why risk and front-office data architectures have diverged over time, but this only leads to greater complexity in technology systems. Moreover, it furthers the divide between risk, trading, and IT within the organization, hampering collaboration. Collaboration on risk management between different functions is needed to allow firms to get a clear view of risk across the enterprise and to increase accountability, responsibility, and transparency.

These points show how poor data management can have serious repercussions for financial institutions. It can lead to a fragmented view of risk, inaccurate and contradictory risk numbers and audit trails, and a lack of communication and understanding between different departments.

4- Industry solutions

Risk technology and data management vendors are helping financial institutions to meet these fundamental challenges by offering to solve the practical problems associated with data. However, the challenges are presented as a trade-off between control and agility, and these aims are often viewed as contradictory and difficult to achieve simultaneously. As Figure 3 below shows, control is often seen as achieved through centralized, top-down data management systems, which typically limit the value that end-users can add through interaction with the data and are not particularly responsive to immediate business needs. By contrast, while agility potentially offers rapid response for business users, it is also associated with a lack of control and integration with core systems and processes.

Figure 3: Balance between control and agility

[Diagram contrasting Control (centralized management; top-down policies; standard data model; overnight batch processing; limited/restricted end-user interfaces) with Agility (silo/data-type specific; real-time/intraday; flexible data model; visualization; extendable and component-based; easy spreadsheet access).]

4.1 Control over data and analytics

Financial institutions recognize that they need to get control over data. However, different data management vendors have different definitions of what data control means. "Complete integration of risk data" or "an enterprise-wide data store" may describe different capabilities for different systems, but often the reality of what enterprise data management systems can deliver will not match up to what was promised.

4.1.1 Risk data integration

While the wider challenges of data management have been neglected, data integration has been a major goal for financial institutions and a key selling point for technology vendors for many years. However, this goal was based on a somewhat simplistic understanding of data management and a complacent view of the challenges surrounding risk data integration. While many providers offer to bring all data under control and provide full integration, they often focus either on only one step of the data management process (see Figure 4, below) or on only a limited set of asset classes and data types. Risk technology vendors often focus only on the step of the data management process that is relevant to their solution and area of expertise and call that data management, instead of considering or dealing with the full data management for risk process. By "enterprise data management", vendors often mean data aggregation from different business lines into a single repository. However, this does not necessarily mean that data will also be integrated across asset class, temporality, or risk type.

Furthermore, vendors may only have capabilities for individual stages of data management: they may focus only on storage (just aggregating data into a single repository), or on validation and reconciliation, or on data integration. This means that what is presented as an end-to-end process will actually require firms to add additional components, increasing project costs and implementation length. In reality, a unified data model requires a system that covers the whole process of data management, as shown in Figure 4 below, rather than one or two individual pieces. This is the only way to get full control over the data; a fragmented data system will only create more silos, more complexity, and more reconciliation problems. Data management systems need to bring all data together (from the simplest to the most complex) and make it consistent, by relying on a unified data model that doesn't simply aggregate data, but also prepares it for use in risk management technology systems, i.e. in analytics engines. Indeed, while derived data is often neglected and overlooked, integrated data management systems should have robust data analytics capabilities to ensure full and transparent auditability of the data and analytics feeding the risk processes.

Figure 4: Data management process

[Diagram: data sources (trade data, reference data, market data, risk data, models, spreadsheets) flow into data acquisition (event-driven data integration: extract, standardize, validate, enrich, transform, load), then data management (real-time data aggregation: multi-asset, multi-risk (market and credit risk), multi-business line, multi-legal entity, dynamic), then data analytics (volatilities, correlations, curves, spreads, valuations, validations, etc.), then data distribution (output for end-users: reports, dashboards, ad hoc queries, drill-down, etc.).]

4.1.2 Data analytics

Data analytics refers to the derivation of particular data objects, including such things as volatilities, correlations, validations, interpolations, curves, spreads, valuations, and hedging parameters. Data analytics is distinct from higher-level risk analytics or risk models that perform aggregated risk calculations, for example using techniques such as Value-at-Risk or simulations. The granular, derived data produced by data analytics supports both these higher-level risk calculations and underlying instrument valuations, and is a vital foundation for the overall risk management process.

Derived data from data analytics often falls between two stools. Many data management vendors consider these analytics and spreadsheet-like data to be beyond their field of expertise, as too business-oriented and quantitative, and a task for risk management system vendors. In contrast, many risk management system vendors see such derived data as too low-level, as they focus on regulatory and compliance-based metrics, such as liquidity risk ratios and capital requirement calculations. Many believe that they can sit on top of any database and just need to take the feeds from this source. It is often assumed that data is not a problem and that all data provided can be used reliably by the risk engines. Caught in this gap in coverage, financial institutions often turn to ad hoc tools to perform these calculations, resulting in spreadsheet proliferation and in data falling outside centrally managed processes.
Instead of improving transparency and data efficiency, the separation of data and data analytics only leads to the fragmentation of information and creates more complexity and discrepancies in the system.
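To make the distinction concrete, the sketch below illustrates the kind of granular data analytics described above: deriving an annualized historical volatility from a price series and interpolating a point on a rate curve. It is a minimal, generic Python illustration using hypothetical data and function names, not Xenomorph's implementation.

```python
import math

def historical_volatility(prices, periods_per_year=252):
    """Annualized volatility from a series of closing prices (log returns)."""
    returns = [math.log(p1 / p0) for p0, p1 in zip(prices, prices[1:])]
    mean = sum(returns) / len(returns)
    variance = sum((r - mean) ** 2 for r in returns) / (len(returns) - 1)
    return math.sqrt(variance * periods_per_year)

def interpolate_curve(curve, tenor):
    """Linearly interpolate a rate curve given as {tenor_in_years: rate}."""
    points = sorted(curve.items())
    for (t0, r0), (t1, r1) in zip(points, points[1:]):
        if t0 <= tenor <= t1:
            return r0 + (r1 - r0) * (tenor - t0) / (t1 - t0)
    raise ValueError("tenor outside curve range")

# Hypothetical inputs: daily closes and a simple zero curve.
closes = [100.0, 101.2, 100.7, 102.1, 101.5, 103.0]
zero_curve = {0.25: 0.012, 0.5: 0.014, 1.0: 0.017, 2.0: 0.021}

print(round(historical_volatility(closes), 4))        # annualized volatility
print(round(interpolate_curve(zero_curve, 0.75), 4))  # interpolated 9-month rate
```

The arithmetic itself is simple; the point is that such derived values should be produced, versioned, and audited inside the managed data environment rather than in ad hoc spreadsheets.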

4.1.3 Spreadsheet control

The increased proliferation of spreadsheets has been recognized as a problem not only by financial institutions, but also by regulators. The SEC is encouraging firms to reduce or eliminate the use of spreadsheets where they would cause firms to fail Sarbanes-Oxley internal control tests, and similar control measures apply under Solvency 2 and MiFID. As a result, firms have started to try to control spreadsheets in their data systems, for example by building an inventory and attempting to maintain the accuracy of the information in the spreadsheets. These methods effectively amount to a CCTV system that watches and records spreadsheet usage centrally. While this implies that spreadsheets are controlled, their use and the reasons for their use have not changed significantly. As a result, the underlying problem of data fragmentation and lack of integration with core systems remains.

Alternatively, some data management systems try to impose too much control in an attempt to limit spreadsheet use or eliminate it altogether. The forced conversion of complex objects into simpler, more structured data will bring this data under control, but it will also prevent real-time support for the Excel-like calculations used in the front office and make it more difficult to centralize them. In particular, attempting to eliminate spreadsheet use is a fool's errand. Counter-intuitively, extending control may simply lead to more data outside the system, as front-office staff will continue to use spreadsheets if they are necessary and will try to keep them outside the system.

4.1.4 Centralized data management - costs and benefits

Introducing a centralized data management system can help firms to get control over risk data by integrating data into a single data model and ensuring central validation and consolidation of data. However, while centralized data management can bring data under control, it can also lead to a limiting, top-down approach to the use of risk data. Under too much control, front-office teams and other functions may not be able to use risk data for their own purposes, to update the risk function on changes to the data, or to inform it about new emerging risks. Hence a centrally managed system, focused purely on control, may reduce the agility of data management processes, and in turn weaken the responsiveness of risk management in the firm.

4.2 Agility

What does agility in risk management mean? Agility is the ability of financial institutions to react rapidly to events and to carry out the necessary responses quickly. First of all, this refers to the flexibility of the overall system: how easy is it to change the technology system to react to the firm's or the market's changing circumstances? It also refers to the ability of end-users to use risk data to perform calculations in real-time and to access data in the system to support business decisions. This means having real-time risk data updates, query tools, and visualization to share risk information.

The problem with both of these attributes is that many technology vendors see them as being most easily delivered through separate silos of functionality. Additional point solutions are used to allow end-users to manage data within their own business and make real-time risk data aggregation less complex, and to bypass the problem of making the overall system more flexible.
However, this means that agility will often come at the expense of control.

4.2.1 Real-time data management

Being able to manage risk in real-time is increasingly crucial for financial institutions. In order to use real-time risk management systems, financial institutions need data management systems that can also operate in real-time. End-users need to be able to make real-time calculations based on a variety of data types and using a higher volume of data than ever before. A failure in delivering or accessing information frustrates risk managers who want to keep up with the trading desk and obtain real-time and intraday updates.

The problem for many financial institutions is that their systems have been built on architectures that support batch-data handling. These architectures struggle to cope with the move to real-time. Vendors with these architectures are prone to over-optimistic claims that their systems can adapt to real-time processing, or simply ignore real-time as a requirement that they hope will go away. At the same time, integrating and reconciling real-time systems with those designed for batch processing is also problematic. Systems such as tick databases have been designed purely for high throughput of relatively simple datasets, and as such struggle with reference data and the more complex datasets found in data management for risk. While such real-time systems may deal with some of the specific needs of the front office, risk management needs technology that is able to present a consistent view of analytics and data across all data types, asset classes and temporalities.

4.2.2 Drill-down tools

With transparency becoming a major goal for financial institutions and regulators, data management systems are required to provide high-performance capture, cleansing, storage, integration, analysis and distribution of reference, historical, real-time and derived data for any financial asset class. Software solutions need to be easy for non-specialists to use and allow both business and technologist users to easily access and take control of their data. However, both siloed systems and top-down data management systems can make it difficult for end-users to find and access the relevant data in the right format. Simplified, business-focused querying tools and languages are a good way to get round these problems, because they allow end-users to access the data and utilize it as they wish. Querying tools can be designed to allow end-users to perform their own analysis on the data, giving them the information they need to make decisions. Using query tools and working with risk data can also help end-users to understand the logic of risk calculations and measurements, while allowing them to make changes and calculations to support decisions. Query tools can also be configured to build up audit trails so that the logic of decisions can be explained.

4.2.3 Visualization

Visualization techniques, such as heat maps or candlestick charts of prices, may also be necessary to enable better decision-making and the incorporation of risk measures into front-office decisions. Risk data in its raw form may be too large or complex for staff within a firm and external users (auditors, regulators) to understand. Increasing the clarity of risk measures will improve collaboration between the risk function and other areas of the firm by making it easier for risk managers to explain what the risk data means. It will also make it easier for the risk function to get a clearer picture of the enterprise-wide view of risk.

4.2.4 Fragmentation as the solution for agility - costs and benefits

The solutions for agility need to be implemented at an enterprise-wide level in order for firms to maintain control over the data. However, many firms are using additional components as add-ons to their main system and implementing a fragmented approach based on point solutions, which enables agility but at the expense of control.
Instead of having a system that is prepared for future changes and is built for flexibility, many firms and vendors implement an ever-growing number of add-ons and extra components to provide additional capabilities when they are needed. While firms may be able to react quickly to events in the short term, they will find their systems becoming increasingly unwieldy in the long term. Moreover, it will become ever harder to gain control over the data in such a sprawling and fragmented system.

5- Leading practices from Xenomorph

Xenomorph is an established global provider of analytics and data management solutions to the financial markets. The firm focuses on an integrated approach to the management of reference data, market data and the analytics/models used to generate derived data. This approach covers all asset classes and all data types, and includes data management for the front, middle, and back office. The company's product, TimeScape, has an architecture designed to enable financial institutions to enjoy the benefits of both control and agility, without sacrificing one for the other. This architecture is ideally suited to address the challenges of data management for risk. Figure 5 provides an overview of the capabilities of TimeScape and how they bridge the gap between control and agility.

Figure 5: Control and agility with TimeScape

[Diagram: Control (all temporalities of data; all assets and data types; granular data access; unified data model) and Agility (add new instruments; real-time/intraday; spreadsheet flexibility; leverage end-user knowledge; add new data sources; add new analytics), bridged by shared capabilities: analytics and data together; validate complex data and analytics; reduced siloed systems and databases; transparent and consistent data; Spreadsheet Inside; Open Architecture.]

5.1 Unified data and analytics

A key aspect of TimeScape is the unified approach to the management of data and analytics. Analytics, and the derived data generated by them, are not excluded as a process to be managed elsewhere, but rather included in the core of the architecture. As Figure 6 illustrates, TimeScape can manage the full process of data acquisition, integration, validation, analysis and distribution. This means that analytics become a controlled part of the data management process and that the data does not become fragmented as intermediate data is generated and thrown away by business users. Instead, TimeScape can manage the process from end to end to ensure the consistency of all the data used across all risk management systems.

Figure 6: TimeScape data flow

[Diagram: data sources, datafeeds and models flow through connect/control, cleanse, create, consolidate, store, analyze and distribute stages. These cover data acquisition (FTP/file import management, interactive market/static datafeeds, streaming and snap real-time datafeeds, external database integration, customised Excel/API imports); cleansing (4-eyes data approval, automated data validation and cleansing, exception management); enrichment and creation (filling, aligning and proxy data rules, datafeed and database integration and normalisation, template management); consolidation (instruments of any asset class, curves and spread curves, volatility surfaces and cubes, indices, baskets and portfolios, derived and calculated data, instrument and data model design, fine-grain data access permissioning); storage (static, historic and real-time data, multiple instrument identifiers, multiple high-volume data sources, simple to complex data types, analytics and data together, financial object integration, temporal storage); analysis (statistical and time series analysis, basket/index composition and performance, curve/surface composition and consistency, instrument pricing, strategy backtesting, analytics and spreadsheet integration, audit trail, pricing model integration); and distribution (reports, file exports, Excel and API interfaces, broadcast, event management and notification) to front office, back office, P&L and risk.]
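As a rough illustration of the acquire-cleanse-analyze-distribute flow summarized in Figure 6, the Python sketch below chains a validation rule, a proxy-based enrichment rule, and a distribution step over a single price record, recording an audit trail as it goes. It is a generic outline under assumed record formats and rule names, not TimeScape code.

```python
from dataclasses import dataclass, field
from datetime import date
from typing import Optional

@dataclass
class PriceRecord:
    instrument: str
    source: str
    as_of: date
    price: Optional[float]
    audit: list = field(default_factory=list)  # audit trail of applied rules

def validate(rec, prior_price, max_move=0.10):
    """Flag day-on-day moves larger than 10% as exceptions."""
    if rec.price is not None and prior_price and abs(rec.price / prior_price - 1) > max_move:
        rec.audit.append("exception: price move exceeds tolerance")
        return False
    rec.audit.append("validated")
    return True

def enrich(rec, proxy_price=None):
    """Fill a missing price from a proxy instrument, recording the rule used."""
    if rec.price is None and proxy_price is not None:
        rec.price = proxy_price
        rec.audit.append("filled from proxy")
    return rec

def distribute(rec, consumers):
    """Push the cleansed, enriched record to downstream consumers (risk, P&L, reports)."""
    for consumer in consumers:
        consumer(rec)

# Hypothetical usage for one incoming record.
rec = PriceRecord("XYZ 5Y CDS", "vendor_a", date(2012, 12, 3), 101.3)
if validate(rec, prior_price=100.9):
    distribute(enrich(rec), consumers=[lambda r: print(r.instrument, r.price, r.audit)])
```

The value of keeping such rules inside one managed environment, as Figure 6 suggests, is that every downstream consumer sees the same cleansed value and the same audit trail.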

5.2 Data model designed for change

The unified approach to the management of data and analytics is made possible by the design of the TimeScape data model. This data model has the following key features:

- Support for All Data Types - the TimeScape data model supports simple data types such as numbers and text, but also more complex types such as links to other objects in the system, tabular and hierarchical data, spreadsheet regions, and even spreadsheet-like calculation definitions. Explicitly managing these data types within the system means that more complex business objects such as curves, spread curves, and volatility surfaces are easy to describe and understand.
- Rapid Addition of New Asset Classes - building upon the data type support outlined above, it is possible to add new fields and even new asset classes the moment they are needed. This means that business users get the responsiveness they need to meet new business requirements, and that they do not resort to difficult-to-support tactical solutions to fill gaps or delays in new data becoming available.
- Multiple-Sourced Data - another fundamental aspect is that one data vendor is unlikely to be the perfect source for all data, and that for some data attributes multiple sources may be necessary to obtain the coverage and quality required. This is fully supported by the TimeScape data model and fully integrated with the system's querying and data analysis capabilities.
- Time Series for All Data - many time-series and tick-database solutions are dedicated primarily to the high-volume management of numeric data. TimeScape has been designed to extend this temporal storage to any kind of data, not just price or tick histories. As such, not only can items such as index compositions be tracked over time, but curves, surfaces, or simply the relationship/ownership between one entity and another can also be represented and managed.
- Granular Data Access and Audit - all of the above flexibility counts for nothing if it is not controllable, and as a result it is possible to control who has access to what kind of data right down to an individual data attribute or data source. Additionally, all changes to data are auditable, making it easy to see what changed and when. This means that it is easy to rewind to the state of the data when a decision was taken, ultimately making it easier to deal with requests from clients, auditors and regulators.
- Easy to Upgrade - given that the data model exposed throughout TimeScape is built as an insulated layer above the physical data model, it is possible to upgrade the system without breaking the consistency of the existing data model and any modifications made to it. Combining this consistency of data model across releases with Xenomorph's commitment to maintain interface compatibility means that past investments in TimeScape are not problematic or wasted when a new system release comes along.

5.3 Open architecture for client customization

Complementary to the agility of its data model, the open architecture of TimeScape makes it easy for users and technologists to integrate it with other systems and to add new client-specific components. Pre-configured connectors allow it to pick up feeds and to be connected to other databases and analytics systems (including Thomson Reuters, Bloomberg, Markit, FAME, Numerix, Fincad, Matlab, R and more). Figure 7, below, shows how TimeScape can connect to third-party databases and third-party risk engines.
Moreover, the system is not a black box: it includes a number of Software Development Kits (SDKs) that allow users to add new components to the system and to update, change, and replace data feeds, messaging software, statistical analytics, and pricing models. Users and technologists can alter and add to the system without creating a series of new components or databases outside it, and hence agility is maintained without sacrificing control.
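To give a flavour of what a unified, temporally versioned data model combined with programmable extension points can mean in practice, the fragment below sketches storing a volatility surface, reading it back as it stood on an earlier date, and registering a user-defined analytic. All class and method names are hypothetical and purely illustrative; this is not the TimeScape SDK, whose actual interfaces are exposed through the APIs and SDKs shown in Figure 7.

```python
# Hypothetical, illustrative sketch only - not the actual TimeScape SDK.
from datetime import date

class AnalyticsDataStore:
    """Stand-in for a unified data-and-analytics store with temporal versioning."""

    def __init__(self):
        self._objects = {}    # (name, as_of_date) -> stored object
        self._analytics = {}  # analytic name -> callable

    def save(self, name, as_of, obj):
        self._objects[(name, as_of)] = obj

    def get(self, name, as_of):
        """Return the object exactly as it stood on (or before) the requested date."""
        candidates = [(d, o) for (n, d), o in self._objects.items()
                      if n == name and d <= as_of]
        return max(candidates, key=lambda c: c[0])[1] if candidates else None

    def register_analytic(self, name, func):
        # End-user analytics become managed, shareable objects rather than
        # private spreadsheet formulas.
        self._analytics[name] = func

    def run(self, name, *args):
        return self._analytics[name](*args)

store = AnalyticsDataStore()
store.save("EURUSD vol surface", date(2012, 11, 30), {("1M", 1.30): 0.085})
store.register_analytic("atm_vol", lambda surface: surface[("1M", 1.30)])

surface = store.get("EURUSD vol surface", as_of=date(2012, 12, 3))  # "as-of" read
print(store.run("atm_vol", surface))  # 0.085
```

The sketch compresses several of the ideas above: complex objects stored alongside simple ones, the ability to rewind to the data as it was on a given date, and user-added analytics living inside the managed environment.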

Figure 7: TimeScape's open architecture

[Diagram: TimeScape databases (XDB, SQL XDB, Oracle XDB) and datafeeds/databases (e.g. Reuters, Bloomberg, Markit, IDC, S&P, SIX, FAME) feed core services (query, pricing, analytics, data and connectivity services, event broadcast, tick capture, import server) and task services (validation, pricing, bulk data capture, tick data capture, data import/export, custom processes). SDKs for pricing, data and analytics integrate models and analytics libraries (e.g. Numerix, Fincad, R, Matlab). APIs (.NET, COM, C/C++, SOAP, Java, Microsoft Excel, R) and applications (TimeScape Workbench, web client, Microsoft Excel, Matlab, R) connect to third-party risk management systems and risk engines (e.g. Summit, Numerix Portfolio).]

5.4 Deeper analysis with easy deployment

Given the open architecture approach, TimeScape implements a number of technology approaches that allow third-party and client-specific analytics and data to be fully integrated into the data calculation, validation, and analysis capabilities of the system.

Querying Not Just for Technologists - TimeScape's query language, QL+, has been designed specifically for financial markets and the peculiarities of their data. It enables front-office staff to drill down into the data easily, allowing complex questions to be asked simply and intuitively. It contains many of the standard operators you would expect in SQL, such as WHERE, SORTBY and GROUPBY, but insulates the user from the complexities of the underlying table structure that SQL would require them to know. The QL+ language deals with all the intricacies of aligning and rebasing time series data, and also offers key features for all data types, such as on-the-fly data context rules for queries, expressing data source preferences, proxy instruments, and how data can be filled and interpolated. New objects and analytics can be added to it, and third-party spreadsheet analytics can be integrated automatically without any programming effort. As it has been designed for business users and technologists alike, queries can be built directly in TimeScape or accessed through any of TimeScape's programming interfaces. Figure 8 shows how QL+ can be used to analyze credit index data.
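For readers unfamiliar with this style of business-focused querying, the self-contained Python sketch below mimics the filter/sort/group pattern over some hypothetical credit index constituents. It is a loose analogy only; the syntax shown is not QL+, and the data is invented.

```python
# Minimal sketch of a business-focused query interface; illustrative only, not QL+.
from collections import defaultdict

class Query:
    def __init__(self, rows):
        self.rows = list(rows)

    def where(self, predicate):
        return Query(r for r in self.rows if predicate(r))

    def sortby(self, field, descending=False):
        return Query(sorted(self.rows, key=lambda r: r[field], reverse=descending))

    def groupby(self, field):
        groups = defaultdict(list)
        for r in self.rows:
            groups[r[field]].append(r)
        return dict(groups)

# Hypothetical credit index constituents.
constituents = [
    {"name": "Issuer A", "sector": "Financials",  "spread_5y": 310},
    {"name": "Issuer B", "sector": "Industrials", "spread_5y": 145},
    {"name": "Issuer C", "sector": "Financials",  "spread_5y": 220},
]

wide = (Query(constituents)
        .where(lambda r: r["spread_5y"] > 200)
        .sortby("spread_5y", descending=True))
print({sector: [r["name"] for r in rows] for sector, rows in wide.groupby("sector").items()})
# {'Financials': ['Issuer A', 'Issuer C']}
```

The intent mirrors the description above: a user asks "which constituents trade wider than 200bp, sorted and grouped by sector?" without needing to know how or where the underlying data is stored.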

Spreadsheet Agility with Control - Based upon QL+ and consistent with the theme of managing data and analytics together, Xenomorph has looked at the data management issues of spreadsheets and as a result has integrated spreadsheet-like calculations as a supported core data type within TimeScape's unified data model. This SpreadSheet Inside technology offers the agility of spreadsheet-like design with the control and transparency of centralized data management. Appropriately permissioned business users can employ this functionality to create and lock down analytics for sharing with others. This means that they have the ease and flexibility of spreadsheet design without the audit, reconciliation and operational risk issues of desktop-bound spreadsheets.

Seeing the Bigger Picture with Visualization - Xenomorph has developed a number of visualization capabilities to provide clearer information on risk for users. These include standard charting, but also heat maps that allow users to drill down on prices, as well as the visualization of derived data, such as spread curves and volatility surfaces (see Figure 8 again for visualization integrated with TimeScape QL+). This kind of visualization makes it possible to understand behaviors in large data sets very quickly, while still allowing the granular detail to be investigated more fully.

Figure 8: Querying and visualization in TimeScape
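As a generic illustration of the heat-map idea discussed above (not a Xenomorph screenshot or API), the short sketch below renders a hypothetical volatility surface as a heat map, assuming numpy and matplotlib are available. The data and styling are invented for illustration.

```python
import numpy as np
import matplotlib.pyplot as plt

tenors = np.array([0.25, 0.5, 1.0, 2.0, 5.0])   # years
strikes = np.array([80, 90, 100, 110, 120])      # % of spot
# Hypothetical implied vols with a simple smile and term-structure effect.
vols = 0.18 + 0.0005 * (strikes[:, None] - 100) ** 2 - 0.01 * np.log1p(tenors[None, :])

fig, ax = plt.subplots()
im = ax.imshow(vols, origin="lower", aspect="auto", cmap="viridis",
               extent=[tenors.min(), tenors.max(), strikes.min(), strikes.max()])
ax.set_xlabel("Tenor (years)")
ax.set_ylabel("Strike (% of spot)")
ax.set_title("Implied volatility surface (illustrative data)")
fig.colorbar(im, ax=ax, label="Implied vol")
plt.show()
```

A display like this lets a reader spot anomalies, say a spike in one strike/tenor bucket, at a glance, which is the behaviour-at-a-glance benefit described above, while drill-down tools provide the underlying detail.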

6- Future Outlook

The current business and regulatory environment is leading financial institutions to prepare for a radical departure from conventional modes of thinking about data management for risk. The inadequacies of past systems and the scale of the current requirements mean that assumptions about risk data are no longer going unquestioned as firms re-think their approach. Principally, firms are trying to strike a balance between control and agility. While it is difficult to build a system that enables agility and controls data, it is not impossible. Nor is it safe to sacrifice one for the other. Having neither of these attributes would be the worst-case scenario, but financial institutions also need to be aware that focusing too heavily on controlling data can lead to a loss of agility.

More importantly, when implementing a data management system within risk, firms need to focus on how the system can enable business goals. A focus on data control can often lead to firms implementing a system that achieves IT-led goals (e.g. getting all data into a repository) but neglects business requirements (e.g. the ability to access that data). Technology needs to serve the business, not the other way round. Firms can build a system in which control and agility objectives are not in conflict, but are instead complementary. Doing this will allow firms to build a technologically coherent system that also helps them to improve risk management and performance.

This means moving away from more traditional approaches to data management. Instead of a centralized, top-down system, financial institutions need a system that combines a unified data model with an architecture that is flexible and can be easily accessed by non-technologists. Financial institutions that adopt a data management for risk solution that incorporates a unified data model within an open architecture will reap numerous benefits, including more accurate data, easier validations and reconciliations, and improved use of data by staff outside the risk function. The combination of agility and control will allow them to improve risk management from both top-down and bottom-up perspectives.

7- Further Reading

Basel 3 Technology Solutions 2012
Collaborative Risk Management 2012
Global Risk IT Expenditure 2011
RiskTech100 2012

For all of these reports, see the Chartis Research website.