Predictive Analysis Risk Analysis




Predictive Analysis / Risk Analysis
Maryland Association of CPAs Government and Not-For-Profit Conference
April 25, 2014

Overview
- Forensic Audit and Automated Oversight
- Data Analytics for Grant Oversight
- Data Analytics for Purchase Card

Forensic Audit
Definition of forensic audit: an audit that looks for financial misconduct, abuse, or wasteful activity.
- More than Computer-Assisted Audit Techniques (CAATs)
- Creates relationships, checks calculations, performs comparisons
- Assesses a limitless number of analytical relationships
- Summarizes large volumes of data
- Database comparison

Automated Oversight
- More efficient use of resources
  - 100% transaction review
  - Still use traditional audit techniques to test transactions
  - Opportunities to enhance oversight with fewer resources
- Improved risk identification
  - Business rules based on risks
  - Focus review on higher risks
- Recipients can use similar data analytics techniques
  - Monitor grant spending
  - Identify anomalies early

Data Versus Information
An endless maze of data... but no information.

Different Levels of Knowledge
- Data: facts, numbers
- Information: summary reports (ACL, IDEA, Access)
- Knowledge: descriptive analytics (SAS, SPSS, ACL, IDEA)
- Wisdom: predictive analytics (Modeler, Intelligent Miner, Enterprise Miner)
(NSF OIG does not endorse commercial products.)

Forensic Audit Approach
- Audit objectives and audit universe
- Structured brainstorming
  - Consider a Subject Matter Expert (SME) conference
  - Identify indicators of potential fraud and ways to find them in data
  - Process to identify financial risks; capture all ideas
- Map out the end-to-end process
  - Identify systems and key processes
  - Identify key controls
- Identify and obtain transaction-level data
  - Record layout, 1000-record dump
  - ACL, IDEA, and Monarch can read virtually any data format (flat files, delimited files, dBase files, MS Access, report files); no file size limits
- Build targeted business rules and run against data
- Examine anomalies; work with investigations
(NSF OIG does not endorse commercial products.)

Hardware and Software Applications
Hardware
- SQL servers
- Mainframe (QMF)
- Docking stations
- Terminal server
Software Applications
- Data mining and predictive analytics (e.g., Clementine)
- Data interrogation (e.g., ACL, IDEA, MS Access, Excel)
- Statistical analysis (e.g., SPSS and SAS)
- Link analysis (i2)
- Lexis-Nexis
- Data conversion utilities (Monarch)
- Internet, open-source research
- Access to system query tools
(NSF OIG does not endorse commercial products.)

Framework for End-to-End Payment Oversight: A Data Analytics Approach
[Diagram: source systems (accounting, entitlement, grant award, contracting, personnel) feed payment systems (direct, loan, grant, contract, transportation, purchase card, people, travel, insurance) and disbursing systems (Federal Reserve ACH, Fedwire, Treasury check, commercial bank). Award-level data (DUNS, CAGE, institution ID, award number, award amount) and transaction-level data (payee, institution ID, award number, invoice number, payment amount, date, bank account) flow into data analytics.]
- Match databases with key fields (awardee IDs)
- Apply risk indicators to each payment record
- Summarize risks by awardee (award-level risk)
- Data analytics: continuous auditing; quick response and ad hoc
- Identify high-risk institutions or payees; identify high-risk programs; surface questionable payments
- Oversight review by GATB, auditors, investigators, agencies
- Systems that can help validate transactions and identify awardee and program risk: GuideStar (nonprofits), IRS 990, Federal Audit Clearinghouse, Dun & Bradstreet, Master Death File (SSA), Do Not Pay (Treasury), CPARS, FPDS
Brett Baker, AIGA, NSF OIG
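The core of the framework above is a key-field match: join payment records to award records on awardee ID, apply risk indicators to each payment, and summarize risk at the awardee level. A minimal sketch in Python, using hypothetical record layouts and a single illustrative risk rule:

```python
# Hypothetical award-level and transaction-level records keyed on awardee ID.
awards = {
    "A100": {"institution": "Univ X", "award_amount": 500_000},
    "A200": {"institution": "Univ Y", "award_amount": 250_000},
}
payments = [
    {"awardee": "A100", "amount": 20_000, "after_expiration": False},
    {"awardee": "A100", "amount": 90_000, "after_expiration": True},
    {"awardee": "A200", "amount": 5_000,  "after_expiration": False},
]

# Match the two files on the key field and apply a risk indicator
# (here, one illustrative rule: payment made after award expiration).
risk_by_awardee = {}
for p in payments:
    award = awards.get(p["awardee"])
    if award is None:
        continue  # an unmatched payment would be a finding of its own
    risk = 1 if p["after_expiration"] else 0
    risk_by_awardee[p["awardee"]] = risk_by_awardee.get(p["awardee"], 0) + risk

print(risk_by_awardee)  # {'A100': 1, 'A200': 0}
```

In practice each payment would be scored against many business rules, and the per-awardee totals would rank institutions for oversight review.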

Data Analytics: Myths and Realities
Myths:
- Data only, no fieldwork
- Numbers exercise
- Process changes data
- Findings unsupported
- Not auditing
Realities:
- Focuses fieldwork
- Source data not changed
- Traditional techniques still test support
- Findings have stronger support
- Yellow Book compliance

Data Analytics Help...
- Determine reliability of data fields
  - Shape of the data (statistics)
  - Completeness of transactions and fields
- Show anomalies
  - Within a database
  - Between databases
  - Changes in behavior over time
- Develop risk profiles for comparisons
  - Awardee profiles
  - Award-type profiles
  - Program profiles

Common Data Analytics Tests
- Join
- Summarization
- Computed fields
- Blank fields (noteworthy if field is mandatory)
- Invalid dates
- Bounds testing
- Completeness
- Invalid codes
- Unreliable computed fields
- Illogical field relationships
- Trend analysis
- Duplicates
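A few of these tests are simple enough to sketch directly. The following Python fragment runs a duplicates test, a blank-mandatory-field test, and a bounds test over an illustrative list of payment records (field names and data are hypothetical):

```python
from collections import Counter
from datetime import date

# Illustrative transaction records; field names are hypothetical.
payments = [
    {"invoice": "INV-001", "vendor": "Acme", "amount": 1200.00, "date": date(2014, 1, 15)},
    {"invoice": "INV-002", "vendor": "",     "amount": 980.50,  "date": date(2014, 2, 3)},
    {"invoice": "INV-001", "vendor": "Acme", "amount": 1200.00, "date": date(2014, 1, 15)},
    {"invoice": "INV-003", "vendor": "Brix", "amount": -50.00,  "date": date(2014, 3, 9)},
]

# Duplicates test: the same invoice number appearing more than once.
counts = Counter(p["invoice"] for p in payments)
duplicates = [inv for inv, n in counts.items() if n > 1]

# Blank-field test: vendor is treated as a mandatory field here.
blank_vendor = [p["invoice"] for p in payments if not p["vendor"].strip()]

# Bounds test: payment amounts should be positive.
out_of_bounds = [p["invoice"] for p in payments if p["amount"] <= 0]

print(duplicates)      # ['INV-001']
print(blank_vendor)    # ['INV-002']
print(out_of_bounds)   # ['INV-003']
```

The same pattern scales to full transaction universes: each test is a filter over all records, not a sample.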

Data Analysis Fosters Creativity
- Can examine all records, not just a sample
- Tests not limited to:
  - What technical staff provides
  - Predetermined data formats and/or relationships
- Can create relationships, check calculations, and perform comparisons
- Assess limitless analytical relationships
  - Within large databases
  - Comparing large databases
- Identifies anomalies
- Useful to identify misappropriation of assets and fraudulent financial reporting

What is Data Mining?
Data mining refers to the use of machine learning and statistical analysis to find patterns in data sets.
- If you know exactly what you are looking for, use structured query language (SQL)
- If you only know vaguely what you are looking for, turn to data mining
- Until recently, most often used in marketing and customer analysis

Methods of Data Mining
Supervised modeling: predict patterns in data based on patterns of known information
- Decision trees
- Neural networks
Unsupervised modeling: identify anomalies or outliers by grouping similar transactions
- Kohonen networks
- K-means clusters
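To make the unsupervised idea concrete, here is a toy one-dimensional k-means in plain Python: it groups transaction amounts into k clusters, and the small cluster of unusually large amounts is the one worth a closer look. This is a sketch, not a production clusterer; the data are hypothetical.

```python
import random

def kmeans_1d(values, k, iters=20, seed=0):
    """Toy one-dimensional k-means: group values into k clusters."""
    random.seed(seed)
    centers = random.sample(values, k)
    clusters = [[] for _ in range(k)]
    for _ in range(iters):
        clusters = [[] for _ in range(k)]
        for v in values:
            # Assign each value to its nearest center.
            nearest = min(range(k), key=lambda i: abs(v - centers[i]))
            clusters[nearest].append(v)
        # Recompute each center as the mean of its cluster.
        centers = [sum(c) / len(c) if c else centers[i]
                   for i, c in enumerate(clusters)]
    return centers, clusters

# Hypothetical purchase amounts: most are small, a few are unusually large.
amounts = [25, 30, 28, 32, 27, 2900, 3100, 29, 31, 3050]
centers, clusters = kmeans_1d(amounts, k=2)

# The smaller cluster holds the outlier transactions.
outlier_cluster = min(clusters, key=len)
print(sorted(outlier_cluster))  # [2900, 3050, 3100]
```

Unsupervised grouping like this needs no labeled fraud cases; the anomalies surface simply because they do not resemble the bulk of the transactions.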

Control Charts

Comparing Data Files (Three-Bucket Theory)
Comparing the vendor table against disbursing transactions yields three buckets:
- Vendors not paid yet (in the vendor table only)
- Vendors paid and in the vendor table (in both files)
- Vendors paid but not in the vendor table (in disbursing transactions only)
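The three buckets fall straight out of set operations on the key field. A minimal sketch with hypothetical vendor IDs:

```python
# Hypothetical vendor IDs from the two files being compared.
vendor_table = {"V001", "V002", "V003", "V004"}
paid_vendors = {"V003", "V004", "V005"}  # vendors appearing in disbursing transactions

not_paid_yet   = vendor_table - paid_vendors   # bucket 1: in vendor table only
paid_and_known = vendor_table & paid_vendors   # bucket 2: in both files
paid_unknown   = paid_vendors - vendor_table   # bucket 3: paid but not in vendor table

print(sorted(not_paid_yet))    # ['V001', 'V002']
print(sorted(paid_and_known))  # ['V003', 'V004']
print(sorted(paid_unknown))    # ['V005']
```

Bucket 3 is typically the most interesting for fraud work: payments going to vendors the organization has never formally set up.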

Normal Distribution
[Chart: a bell curve with normal activity in the center and anomalous activity in both tails.]
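One simple way to operationalize "activity in the tails" is a z-score test: values more than a chosen number of standard deviations from the mean are flagged as anomalous. A minimal sketch (the data and the two-sigma threshold are illustrative):

```python
from statistics import mean, stdev

# Illustrative daily drawdown amounts ($K); the last value is a spike.
draws = [52, 48, 50, 51, 49, 47, 53, 50, 48, 140]

mu, sigma = mean(draws), stdev(draws)

# Flag values more than 2 standard deviations from the mean (both tails).
anomalies = [x for x in draws if abs(x - mu) > 2 * sigma]
print(anomalies)  # [140]
```

A control chart applies the same idea over time, drawing the mean and control limits so that out-of-limit points stand out visually.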

Building a Forensic Capability
- Develop organizational capability
  - All audit staff should have basic skill with data analysis tools
  - Access to internal and external data
  - Targeted audits are more efficient in time and cost
- Phased development
  - Hardware and software
  - Staff: system-savvy, analytical, business process knowledge
  - Training, then immediate application to work
- Forensic audit units perform more sophisticated analyses
- Tone at the top is an important component

U.S. Financial Assistance
Overview
- $550 billion in awards
- 88,000 awardees and 26 Federal grant-making agencies
- Project and research, block, and formula grants
- Outcomes promote the public good
Challenges
- Limited visibility into how Federal funds are spent by awardees
- Support for funding requests is much less than for contracts
American Recovery and Reinvestment Act (2009)
- $840 billion of assistance to stimulate the economy
- Greater accountability and transparency over spending than ever before

Grants Differ From Contracts
GRANTS:
- Promote services for the public good
- Merit review (competitive)
- Multiple awardees
- Award budget
- No government ownership
- Grant payments: summary drawdowns; no invoices for claims; expenditures not easily visible; salary percentages
CONTRACTS:
- Specified deliverables (goods and services)
- Competitive process
- One awardee
- Contract price
- Government ownership
- Contract payments: itemized payment requests; invoices to support claims; detailed costs; salary hourly rates

Framework for Grant Oversight
Data analytics-driven, risk-based methodology to improve oversight
- Identify institutions that may not use Federal funds properly
- Techniques to surface questionable expenditures
Life-cycle approach to oversight
- Mapping of the end-to-end process to identify controls
- 100% review of key financial and program information
- Focus attention on award and expenditure anomalies
Complements traditional oversight approaches
- Techniques to review processes and transactions are similar
- Transactions of questionable activities are targeted

End-to-End Process for Grant Oversight
PRE-AWARD RISKS
- Funding over time
- Conflict of interest
- False statements
- False certifications
- Duplicate funding
- Inflated budgets
- Candidate suspended/debarred
ACTIVE AWARD RISKS
- Unallowable, unallocable, unreasonable costs
- Inadequate documentation
- General ledger differs from draw amount
- Burn rate
- No/late/inadequate reports
- Sub-awards, consultants, contracts
- Duplicate payments
- Excess cash on hand / cost transfers
- Unreported program income
AWARD END RISKS
- No/late final reports
- Cost transfers
- Spend-out
- Financial adjustments
- Unmet cost share
Dr. Brett M. Baker, 2010

Internal Data Sources
- Proposals: budgets, panel scores
- Agency award systems, recipient reporting
External
- Excluded Parties List System (EPLS)
- Central Contractor Registration (CCR/SAM)
- Public tax filings
- Federal Audit Clearinghouse (A-133 audits)
Recipient financial system records
- General ledger and subsidiary ledgers
- Effort reporting
- Property
- Travel and purchase card
- Subaward monitoring

Risk Assessment and Identification of Questionable Transactions
Phase I: Identify High-Risk Institutions
- Agency award data: award proposals, quarterly expense reports, cash drawdowns
- External data: A-133 audits (FAC), D&B, Recovery Board, SAM (CCR, EPLS)
- Data analytics: continuous monitoring of grant awards and recipients
Phase II: Identify Questionable Expenditures
- Agency award data and external data (as in Phase I)
- Awardee transaction data: general ledger, subsidiary ledgers, subaward data
- Data analytics: apply risk indicators to GL data and compare to agency data
- Review questionable transactions
Dr. Brett Baker (2012)

Identification of Higher-Risk Institutions and Transactions
Dr. Brett Baker, AIGA, NSF OIG

Transaction-Level Risk Indicators: Grant Transactions
- Cost transfers to other projects or contracts
- Multiple payments of the same invoice
- Excessive costs / large budget reallocations
- Multiple adjusting entries, end-of-grant spend-out
- Unallowable charges, charges after expiration

Payroll Risk Indicators: Grantee Data
- Payments after grant expiration
- Multiple labor categories for the same employee
- Timesheets with signature differences
- Employee background, experience, or education does not meet requirements
- Fictitious employees, such as a contractor's relatives

Anomalous Drawdown Patterns
[Chart: drawdowns ($) plotted from grant award to grant expiration. A normal drawdown pattern rises steadily; anomalies include heavy start-up costs, a drawdown spike, and extinguishing remaining grant funds just before or just after expiration.]
Dr. Brett Baker, AIGA, NSF OIG

Burn Rate: Actual vs. Expected

Award   Amount ($K)   Expended ($K)   % Expend   Award Days   Days Active   % Total Days   Delta
1            10,000           9,000        90%         1095           769            70%    1.29
2             5,000           4,000        80%         1095           524            48%    1.67
3             2,000           1,500        75%         1095           404            37%    2.03
4             1,000             995        99%          365           200            55%    1.81
5            20,000          12,000        60%         1826           500            27%    2.22
6            10,000           5,000        50%         1826          1600            88%    0.57
Totals       48,000          32,495        68%         7302          3997            55%    1.24

Delta = % expended / % of award days elapsed; 1.00 would be normal.
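The delta column can be reproduced directly: it is the fraction of funds expended divided by the fraction of the award period elapsed, so a value well above 1.00 means funds are being drawn faster than time is passing. A minimal sketch using the first award's figures:

```python
def burn_rate_delta(award_amount, expended, award_days, days_active):
    """Ratio of percent expended to percent of award period elapsed.

    1.00 is normal; well above 1.00 suggests spending ahead of schedule.
    """
    pct_expended = expended / award_amount
    pct_elapsed = days_active / award_days
    return pct_expended / pct_elapsed

# Award 1 from the table: $9,000K expended of $10,000K, 769 of 1095 days active.
delta = burn_rate_delta(10_000, 9_000, 1095, 769)
print(round(delta, 2))  # 1.28 (the slide rounds the percentages first, giving 1.29)
```

Run across a whole portfolio, this one ratio quickly surfaces awards like number 5 (delta 2.22) for closer review.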

Award Burn Rate Profile Comparison

Example: Equipment Charges Incurred Immediately Before Grant Expiration Date
- Same day as expiration
- 57 days before expiration
- 46 days before expiration

Travel Related to Award?
- Just before award expiration
- Just after award expiration

Purchase Card Oversight Using Data Analytics
- Government purchase card overview
  - Simplified acquisition
  - High risk for abuse without strong oversight
  - Government Credit Card Fraud Prevention Act 2013
- DoD joint purchase card review (2002-2003)
- Current work at NSF

DoD Joint Purchase Card Review
Review objective
- Develop automated oversight capability to identify anomalies in purchase card data that may indicate fraud or abuse
- Improve field research, reporting, and process for audit and investigation
Universe reviewed
- 15 million purchase card transactions ($9 billion)
- 200,000 cardholders (CH) and 40,000 authorizing officials (AO)
Subject matter expert conferences
- Structured brainstorming: 35 auditors, investigators, GSA officials
- Developed 115 indicators of potential fraud; 46 codable
- Built targeted business rules to run against data


Top Indicator Combinations
- 97%: Adult websites; weekend/holidays
- 67%: Purchases from one vendor; CH = AO
- 57%: Adult websites
- 57%: Internet transactions; third-party billing
- 53%: Interesting vendors; many transactions
- 43%: Even dollars; near limit; same vendor; vendor does business with few CHs

Current NSF Purchase Card Work
Approach similar to DoD Joint Purchase Card Review
Transaction universe
- 3 years of purchase card activity
- 230 cardholders
- 34,000 transactions
- $17 million
Risk-based approach to testing
- Worked closely with Investigations
- Developed risk indicators (previous and new)

Risk Factor Flags
- Span of control: approving official has a span of control of 5 or more cardholders (risk value = 1)
- MCC codes
  - Transactions with suspect MCC codes (risk value = 2)
  - Blocked MCC codes (risk value = 3)
- One-to-one: transactions in which the merchant did business only with that particular NSF cardholder (risk value = 2)
- Weekend and holiday purchases: transactions on Saturdays, Sundays, or holidays (risk value = 3)

Risk Factor Flags (continued)
- Possible split purchase: cardholder has multiple purchases from the same merchant totaling more than $3,000 on the same day or within a few days (risk value = 3)
- Suspect Level 3 data: transactions with Level 3 data deemed suspect based on manual review, e.g., possible personal purchase, possible split transaction, or questionable legitimate business need (risk value = 3)
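Flags like these combine naturally into a per-transaction risk score. The sketch below scores two of the flags (weekend purchase and same-day split purchase over the $3,000 threshold); the transaction data, field names, and dates are hypothetical:

```python
from datetime import date

# Illustrative transactions; fields and values are hypothetical.
transactions = [
    {"id": 1, "cardholder": "CH1", "merchant": "OfficeMax", "amount": 1800.0, "date": date(2014, 4, 7)},
    {"id": 2, "cardholder": "CH1", "merchant": "OfficeMax", "amount": 1600.0, "date": date(2014, 4, 7)},
    {"id": 3, "cardholder": "CH2", "merchant": "Delta",     "amount": 450.0,  "date": date(2014, 4, 4)},
]

WEEKEND_RISK = 3        # risk value for weekend/holiday purchases
SPLIT_RISK = 3          # risk value for possible split purchases
SPLIT_THRESHOLD = 3000  # dollar threshold from the slide

def risk_score(txn, all_txns):
    score = 0
    # Weekend purchase flag (weekday() returns 5 for Saturday, 6 for Sunday).
    if txn["date"].weekday() >= 5:
        score += WEEKEND_RISK
    # Possible split purchase: same cardholder, merchant, and day, with more
    # than one purchase and a combined amount over the threshold.
    same_day = [t for t in all_txns
                if t["cardholder"] == txn["cardholder"]
                and t["merchant"] == txn["merchant"]
                and t["date"] == txn["date"]]
    if len(same_day) > 1 and sum(t["amount"] for t in same_day) > SPLIT_THRESHOLD:
        score += SPLIT_RISK
    return score

scores = {t["id"]: risk_score(t, transactions) for t in transactions}
print(scores)  # {1: 3, 2: 3, 3: 0}
```

Transactions 1 and 2 are flagged as a possible split ($3,400 at one merchant in one day); a real implementation would add the remaining flags (MCC codes, span of control, one-to-one merchants) and sum the risk values the same way.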

Example of Level 3 Data

Future Opportunities for Automated Oversight
Electronic invoices and receipts; debit cards
- Funding agency can view transaction charges
- OIGs/administrators can run analytics on the digital data
Continuous monitoring
- Grantee performs
- Agency performs
- Government-wide efforts

Questions?
Dr. Brett M. Baker
Assistant Inspector General for Audit
National Science Foundation Office of Inspector General
Phone: 703-292-7100