Asseco Aggregation Engine




Data Governance Policy Template

To ensure that the results provided by an engine or engine complex are always available and correct, the input data must be monitored for availability and reviewed for completeness and accuracy. The following is a selection of policies that an organization can tailor and adopt to ensure that this takes place.

Continuous Availability of Inputs

It is desirable that every input to the engine complex is continuously available, i.e. that the end of validity of one entry coincides with the start of validity of the following entry, for all generations of the element in question. This enables the engine to provide results based on past data, even if the calculations needed were not implemented (or indeed, imagined) at the time. A policy that ensures the continuous availability of input elements will also improve the general performance of results calculations, since fetching remote inputs can be time-consuming.

Automated Sources

Automated data sources must be configured to provide continuously valid data even in the face of temporary outages in the provider or the engine itself. Use appropriate soft validity settings for pull sources, such as database queries. Push sources are usually considered valid until a new entry is successfully uploaded.

Manual Inputs

To ensure that manual data is kept up to date, warning reports must be constructed that highlight manually sourced elements about to go out of validity. Use the CACHE_METADATA function to determine the authorization groups required, and the expiry time, for all relevant manual entries.

It is possible (with the proper authorization) to "fill in the blanks" in a timeline using the manual sourcing option, by setting a valid-from and valid-to date for the input manually. To fill in backdated data for an automated source, temporarily convert it to a manual source, enter the backdated entry, and convert it back. Audit reports will then show that the source data was entered in the present, and by whom.

The Data Governance Policy must specify which manual inputs to monitor, and what actions to take when manual inputs become invalid.
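The continuity requirement can be checked mechanically: sort an element's entries by valid-from and verify that each entry's valid-to coincides with the next entry's valid-from. The sketch below illustrates both the gap check and the kind of "about to expire" selection a manual-input warning report would perform. The entry structure and the seven-day warning horizon are assumptions for illustration, not part of the engine's API:

```python
from datetime import datetime, timedelta

def find_timeline_gaps(entries):
    """Return (gap_start, gap_end) pairs where the validity timeline of an
    element is broken, i.e. one entry's valid-to does not coincide with the
    next entry's valid-from."""
    ordered = sorted(entries, key=lambda e: e["valid_from"])
    gaps = []
    for prev, nxt in zip(ordered, ordered[1:]):
        if prev["valid_to"] != nxt["valid_from"]:
            gaps.append((prev["valid_to"], nxt["valid_from"]))
    return gaps

def expiring_soon(entries, now, horizon=timedelta(days=7)):
    """Entries whose validity ends within the warning horizon --
    candidates for a manual-input warning report."""
    return [e for e in entries if now <= e["valid_to"] <= now + horizon]

entries = [
    {"valid_from": datetime(2011, 1, 1), "valid_to": datetime(2011, 7, 1)},
    {"valid_from": datetime(2011, 7, 1), "valid_to": datetime(2011, 10, 1)},
    {"valid_from": datetime(2011, 11, 1), "valid_to": datetime(2012, 1, 1)},
]
print(find_timeline_gaps(entries))  # one gap, from 2011-10-01 to 2011-11-01
```

A scheduled job running such a check against all monitored elements, and routing its output to the responsible data owners, is one way to implement the warning reports this policy requires.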

Automated Data Validation

For data sources where the data quality is not assured, engine validation mechanisms must ensure that calculations are only based on valid data. In addition, the engine must be able to alert human users when input data fails validation, so that remedial action can take place as soon as possible.

It is best practice to perform validation on a separate engine from the one performing calculations and reporting. This has two benefits:

1. The consuming engine can ensure that there is a continuous availability timeline for an element, even if a particular data source is temporarily invalid.
2. The validation rules are typically organization-specific, and the engines implementing them are not subject to externally sourced upgrades in the same manner as, for instance, a Solvency II SCR calculation or QRT report.

In the validation engine, create a single element for each source to be validated, with an appropriately named input and output. Use the "Create pipeline element" tool to create validation elements and reports for all sources. Define the specific validation rules in the created validation element, and determine which failure mode a specific validation failure should trigger.

The Data Governance Policy must specify the general validation policy, including failure modes for failed validations.
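As an illustration of the kind of rules a validation element might implement, the sketch below validates one source row and maps each violated rule to a failure mode. The rules, field names, and the two failure modes shown are hypothetical; in practice the rules are organization-specific and expressed in the engine's own formula language:

```python
from enum import Enum

class FailureMode(Enum):
    WARN = "warn"      # flag the entry but let calculations proceed
    REJECT = "reject"  # fall back to the previous valid entry

# Each rule: (description, predicate, failure mode on violation).
# These example rules are purely illustrative.
RULES = [
    ("balance present", lambda row: row.get("balance") is not None, FailureMode.REJECT),
    ("balance non-negative", lambda row: row.get("balance", 0) >= 0, FailureMode.REJECT),
    ("currency known", lambda row: row.get("currency") in {"EUR", "USD", "GBP"}, FailureMode.WARN),
]

def validate(row):
    """Return (description, failure_mode) for every violated rule."""
    return [(desc, mode) for desc, pred, mode in RULES if not pred(row)]

failures = validate({"balance": -100, "currency": "NOK"})
```

Separating WARN from REJECT mirrors the policy requirement above: a REJECT keeps the consuming engine on the last valid entry (preserving the continuous timeline), while a WARN alerts a human without blocking calculations.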

Human Approval

A human approval mechanism can ensure that input data is properly sanitized before use, and that results calculations and reports are sanity-checked before distribution.

Inputs Approval

Approval at the input data level catches errors in input data that were not (or could not be) detected by the automated validation mechanisms, preventing calculations from being performed on invalid data. This improves the data quality of the final results, since there will then be no versions of the final report that are invalid.

Inputs approval is, like input validation, best handled in a dedicated engine, for the same reasons. After creating the validation element, use the "Create pipeline element" tool to create approval elements for each source to be approved.

Observe that all approvals are, by their nature, manual inputs, and should therefore be covered by warning reports as described under Continuous Availability of Inputs, to ensure that elements are reapproved as quickly as possible after changing. Note also that it is generally impossible to ensure a continuous timeline for an approved source within an engine, since there will always be unavoidable gaps in validity from the time an element is changed until it is reapproved. For this reason, input approvals should be used with caution, and alternate means of creating identical, but unapproved, reports from the same inputs should be provided.

The Data Governance Policy must specify which inputs require approval, which roles are authorized to approve them, and what actions to take if an input is not approved.

Results Approval

Approval at the results level ensures that the final results or reports are read and agreed upon by a human reviewer before they are broadcast to the final audience (the public, supervisory agencies, or the company board). Results approval can be added to the reporting engine, or managed in a dedicated engine like inputs approval. Results approvals are otherwise defined and configured exactly like input approvals.

The Data Governance Policy must specify which results require approval, which roles are authorized to approve them, and what actions to take if a result is not approved.

Tiered Approvals

Tiered approvals can be used in scenarios where different groups of line employees are responsible for preparing and approving subsections of a report, and a department head is responsible for collating and approving a summary or compilation. Tiered approvals are configured like other approvals, by adding approvals to elements that already have approvals. The configurator should be careful to ensure that the chained approval manual sources require different authorizations, and that there is separation of duties between the roles.

When describing tiered approvals in the Data Governance Policy, it is usually best to group all the approvals together, listing the different sub-approvals after the top-level approval.
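The separation-of-duties requirement for tiered approvals can be checked with a simple rule: no role may be authorized at more than one tier of the same approval chain. A minimal sketch, assuming a chain is represented as a bottom-up list of role sets (this representation is an illustration, not the engine's configuration format):

```python
def check_separation_of_duties(chain):
    """chain: list of approval tiers, bottom-up; each tier is the set of
    roles authorized to approve at that tier. Returns the roles that
    improperly appear in more than one tier (empty set = compliant)."""
    seen, violations = set(), set()
    for tier in chain:
        violations |= seen & set(tier)  # roles already used at a lower tier
        seen |= set(tier)
    return violations

# Two line teams approve subsections; a department head approves the summary:
chain = [{"risk-analyst", "actuary"}, {"dept-head"}]
assert check_separation_of_duties(chain) == set()  # roles are disjoint
```

Running such a check against the configured authorization groups when a tiered approval is set up (or changed) guards against a single role silently approving its own work at two levels.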

Periodic Results

Organizations may want to prepare, publish, and archive reports on a schedule, using inputs from specific dates. Archive reports are best managed on a separate engine, where an authorized user can create new elements and results.

To create an archive report, use the "Create new element" tool, naming both a result and a (manual) source. The archived result name should be the same as the live result name, suffixed by an appropriate identifier - for example, a quarter-end copy of the result "Balance Sheet" could be named "Balance Sheet 2011Q3". Change the manual source to an AAE source, fetching from the live result, and use the "Create pipeline element" tool to create an intermediate copy. Change the formula in the pipeline element from LOOKUP('elementname_2011q3') to LOOKUPAT('elementname_2011q3', TIME('2011-10-01')). Change the domain of the formula (e.g. from A1 to A1:XX999), and copy the styling and labels manually from the source, if those should be included.

Note that archive reports can be created at any time after the fact, provided the source data existed (or can be calculated) for the time specified. Archive reports created using this process do not have an expiry time, since they depend only on immutable data in the past; in practice, they are valid for 100 years.

The Data Governance Policy must specify which results to archive and publish, the frequency of the process, and who is responsible for this procedure.
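The effect of switching the formula from LOOKUP to LOOKUPAT is that the archive reads the live result as it was at a fixed instant, rather than its current value. The sketch below illustrates that point-in-time semantics over a versioned element; the data structure and half-open validity intervals are assumptions used only to show the idea, not the engine's internals:

```python
from datetime import datetime

def lookup_at(versions, at):
    """Return the value of the version whose validity interval covers the
    instant `at` -- an illustration of LOOKUPAT(..., TIME(...)) semantics.
    Intervals are treated as half-open: [valid_from, valid_to)."""
    for v in versions:
        if v["valid_from"] <= at < v["valid_to"]:
            return v["value"]
    return None  # no version was valid at that instant

balance_sheet = [
    {"valid_from": datetime(2011, 7, 1), "valid_to": datetime(2011, 10, 1),
     "value": "Q3 figures"},
    {"valid_from": datetime(2011, 10, 1), "valid_to": datetime(2012, 1, 1),
     "value": "Q4 figures"},
]

# Pinning the lookup to a date inside Q3 always returns the Q3 entry,
# no matter when the archive report is actually created:
snapshot = lookup_at(balance_sheet, datetime(2011, 9, 15))
```

Because the pinned instant lies in the immutable past, the result of such a lookup can never change, which is why archive reports built this way need no expiry time.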