DATABASE MARKETING STRATEGY: Best Practices for Selecting a 3rd Party Data Provider
BEST PRACTICES FOR SELECTING A 3RD PARTY DATA PROVIDER

With $32 billion of the US economy contingent upon the exchange of data from 3rd party data suppliers to data brokers and end-users, there are thousands of suppliers looking to carve out their portion of the revenue. But what makes a high-quality, reputable data provider? There are a number of questions to ask when considering a new data supplier.

DATA QUALITY CHECKLIST

What hygiene steps are performed as the database is built?

Compiled databases intake massive amounts of raw, transactional data. In many cases, the data arrives in many different formats and in varying degrees of standardization. Significant pre-compilation processing is necessary to ensure accuracy and deliverability. For accurate consumer identification, an individual's name and postal address are the foundation of any database. To gauge database quality, it is necessary to understand how a provider performs the following functions:

- Name parsing: the ability to accurately separate first name, middle name, last name, and suffix, including corrections to misspelled names and recognition of multiple words contained within a single entry. There are many software programs that can be used for this task; however, the top database compilers will usually employ proprietary processes and algorithms to ensure accuracy. (For example, a name such as "Charles St. Church" could easily be parsed as "St. Charles Church" without specific algorithms correctly handling the surname. See the sketch following this list.)

- Address standardization: ensuring deliverability of postal mailing information. Many compilers utilize CASS-certified (Coding Accuracy Support System) software to ensure compliance with United States Postal Service standards. This processing improves the quality of postal address information, including correcting existing fields and appending missing ones. Address standardization usually includes DPV (Delivery Point Validation) processing as well. (For example, an input address of "123 Mian Street, Chicago, IL" would be corrected to "123 Main St., Chicago, IL 60544-2761" and given a "Y" designation as the DPV code.)

- Address correction: CASS-certified software will greatly improve raw data quality, but there is still the challenge of inaccurate input information that CASS-certified software cannot detect as errant. Quality compilers will institute their own proprietary hygiene processes that can apply significant corrections to inaccurate addresses after initial CASS-certified processing. (For example, an input record of "John Smith, 122075 E 22nd St, Tulsa, OK 74128" would be corrected to "John Smith, 12207 SE 22nd St, Tulsa, OK 74128".)
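A minimal sketch, in Python, of the two hygiene rules illustrated above: keeping surname particles attached during name parsing, and correcting a known street-name typo. The particle list and typo table are illustrative stand-ins for the proprietary algorithms and CASS-certified software quality compilers actually use.

    SURNAME_PARTICLES = {"st", "van", "von", "de", "la", "del", "mc", "mac"}
    SUFFIXES = {"jr", "sr", "ii", "iii", "iv"}
    STREET_TYPOS = {"mian": "Main"}  # illustrative; real tables are far larger

    def parse_name(raw: str) -> dict:
        """Split a full name, merging particles like 'St.' into the surname."""
        tokens = raw.split()
        suffix = ""
        if tokens and tokens[-1].lower().rstrip(".") in SUFFIXES:
            suffix = tokens.pop()
        last = [tokens.pop()] if tokens else []
        # Pull preceding particles into the surname, so "Charles St. Church"
        # parses as first="Charles", last="St. Church", not "St. Charles Church".
        while len(tokens) > 1 and tokens[-1].lower().rstrip(".") in SURNAME_PARTICLES:
            last.insert(0, tokens.pop())
        return {"first": tokens[0] if tokens else "",
                "middle": " ".join(tokens[1:]),
                "last": " ".join(last),
                "suffix": suffix}

    def correct_street(street: str) -> str:
        """Apply simple dictionary-based typo corrections to a street name."""
        return " ".join(STREET_TYPOS.get(w.lower(), w) for w in street.split())

    print(parse_name("Charles St. Church"))
    # {'first': 'Charles', 'middle': '', 'last': 'St. Church', 'suffix': ''}
    print(correct_street("123 Mian Street"))   # 123 Main Street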
How thorough is NCOA processing, and how frequent is it?

All files should be processed against NCOA (National Change of Address) during their initial compilation or update cycle (i.e., daily, weekly, monthly, or quarterly). In addition, it's important to understand whether the file is processed against the 18-month or 48-month NCOA file. The 18-month NCOA data includes moves within the past 18 months. The 48-month NCOA captures four years of moves and is ideal for files that have not recently been updated or for files that are being compiled or built. The 48-month NCOA contains significantly more moves, but is a more costly process. Quality data compilers will use the 48-month NCOA file to process change-of-address updates during the compilation or update process.

How often is the data updated?

Typically, files are updated daily, weekly, monthly, or quarterly. Providers should be able to give you a schedule or timeline of when you can expect the updates to occur, and should remain consistent with that delivery schedule. During the update process, it is also important to understand whether the data was updated with net new records or new validation dates, or whether the data is older with just a recent NCOA update.

Is the data single-sourced or multi-sourced/validated?

Some compilers will receive a record from a single source and immediately add it to their file in an effort to maintain or boost the total database universe. This can result in poor-quality data and a significant increase in duplicate individuals in the file. High-quality compilers will ensure that at least two unique data transactions indicate the same information before adding a record to the file (a brief sketch follows below).

What methods are used to validate and ensure long-term data accuracy?

When a database is initially compiled, all data endures stringent processing and validation. Once the initial compilation is complete, the database should follow predictable updating patterns, with ongoing validation and maintenance schedules. This is more than processing weekly or monthly NCOA updates. Your provider should have ongoing raw data feeds that either update old records within the database or provide a validation date indicating the data is still accurate. These validation dates should be available as an option for you to filter out older data or to select records within a specific date range.
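A brief sketch, assuming simple keyed records, of two practices described above: promoting a record only after two unique sources report it, and filtering by validation date. The field names and the 18-month cutoff are illustrative assumptions, not any specific provider's implementation.

    from collections import defaultdict
    from datetime import date, timedelta

    pending = defaultdict(set)   # record key -> source ids reporting it
    compiled = {}                # record key -> most recent validation date

    def ingest(key: str, source_id: str, seen_on: date) -> None:
        """Add or refresh a record; require two unique sources to promote."""
        if key in compiled:
            compiled[key] = max(compiled[key], seen_on)  # refresh validation date
            return
        pending[key].add(source_id)
        if len(pending[key]) >= 2:                       # multi-source check
            compiled[key] = seen_on
            del pending[key]

    def fresh(max_age_days: int = 548) -> list:          # ~18 months
        """Return keys validated within the allowed window."""
        cutoff = date.today() - timedelta(days=max_age_days)
        return [k for k, d in compiled.items() if d >= cutoff]

    today = date.today()
    ingest("smith|123 main st", "source_a", today)
    ingest("smith|123 main st", "source_b", today)
    print(fresh())   # ['smith|123 main st'] -- promoted after two sources agree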
How are deceased individuals identified?

Identifying deceased individuals is a significant means of cost savings and provides an additional layer of fraud protection. Recent changes in the way states report deceased data as part of the Social Security Death Index (SSDI) file, however, have limited the amount of information reported, making it more challenging to identify and flag an individual as deceased. Your data provider should be able to tell you whether they use the SSDI file as a standalone indicator, or whether they also receive additional deceased data from outside sources, including obituary and funeral home records. In addition, when the deceased data is compiled, is it compiled at the individual or household level? Some providers flag an entire household as deceased rather than just the deceased person, removing individuals from your file who could still be valuable customers and prospects.

What process is used to identify and remove duplicates, identify aliases, and link individuals within the database?

Your data provider should have processes in place to identify duplicates and individuals with aliases within the data. When raw data is received and processed, name normalization and common corrections are applied. In many instances, individuals appear multiple times over the course of years of data compilation. Surnames change due to marriage or divorce. Addresses change frequently. Individuals may use different variations of their name (aliases). Large database compilers will process all new data transactions against an internal historical, referential database. These referential databases hold millions or even billions of historical records and link all instances of an individual together with a single identification number. Instances may include multiple addresses, surname changes, and nicknames/aliases. Once the criteria for establishing a link have been met (typically a proprietary process, unique to the data supplier), the applied identification number ensures no duplication in the compiled file. (A toy version of this linkage appears at the end of this section.)

What delivery systems are available to access the data?

Typically, data is available through online count and order systems, batch data processing for appends and data hygiene (both manual and automated), real-time transactions for instant validation and identification of an individual, and licensing of entire databases. If you are looking for these options, ask whether there are additional set-up fees or development costs. Some providers charge significant fees for implementation. Others charge more for projects that require manual processing. Ensure you are aware of all costs associated with a new supplier.
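A toy version of the referential linkage described above: every observed variant of an identity maps to one persistent identification number. The match key here (nickname-normalized first name, last name, and date of birth) is an illustrative assumption; real suppliers apply proprietary match criteria covering address histories and surname changes as well.

    import itertools

    NICKNAMES = {"bill": "william", "bob": "robert", "liz": "elizabeth"}  # sample only

    _ids = itertools.count(1)
    link_index = {}   # match key -> persistent id
    variants = {}     # persistent id -> observed name variants

    def match_key(first: str, last: str, dob: str) -> tuple:
        f = first.strip().lower()
        return (NICKNAMES.get(f, f), last.strip().lower(), dob)

    def link(first: str, last: str, dob: str) -> int:
        """Return the persistent ID for this identity, creating it if new."""
        key = match_key(first, last, dob)
        if key not in link_index:
            pid = next(_ids)
            link_index[key] = pid
            variants[pid] = []
        pid = link_index[key]
        variants[pid].append(f"{first} {last}")
        return pid

    # "Bill Smith" and "William Smith" with the same DOB resolve to one ID,
    # so the compiled file carries a single, de-duplicated record:
    assert link("Bill", "Smith", "1970-01-01") == link("William", "Smith", "1970-01-01")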
DATA PROVIDER CHECKLIST

Typically, data compilers and brokers will offer as much transparency as possible into the sources of their data. In some cases, contractual obligations will limit disclosure of exact sources; however, even in these cases, descriptive information should be available as to the type of source that provides the data (e.g., "magazine subscription" rather than "ABC Magazine"). If your data provider is not willing to share how the data is compiled, you should be wary of its origin. Below is a list of questions to protect the integrity and security of data exchange within your organization.

Does your provider require sources that meet all data use regulations?

These can include GLBA (Gramm-Leach-Bliley Act), FCRA (Fair Credit Reporting Act), DPPA (Drivers Privacy Protection Act), CAN-SPAM email compliance, Do-Not-Call (DNC), and TCPA (Telephone Consumer Protection Act) guidelines. (See below for an overview of common data use regulations.)

Do data sources undergo annual 3rd party attorney audits to ensure source data is legally obtained and does not infringe on state or federal privacy legislation?

Your data provider should have scheduled third-party audits and documentation indicating the processes and procedures followed during the data exchange process.

Does the data reside in the cloud or in proprietary-managed, multiple co-location facilities?

Cloud-based services offer data providers flexibility in managing storage and capacity, up-time, speed, and redundancy of workload. Cloud-based services are easily scalable and can reduce cost within some organizations. These services can be hosted in any number of locations and, in many cases, may reside off-shore. Organizations that offer cloud-based services may not own or manage the on-site staff who have physical access to their servers, and anyone with access to the cloud provider's servers has access to the data. For disaster recovery purposes, cloud-based providers send many copies of the data residing on their servers to other data centers. Because of the lack of transparency about data handling practices, access to audit log data, and visibility into internal controls, cloud-based service providers are challenged in answering two primary questions: where is the client data located, and who has access to it?

Proprietary-managed co-location facilities offer full redundancy at separate, permanent, secure locations, giving the data supplier full control over the servers and support of the hardware. Permanent co-location facilities also provide the data supplier with exclusive access to the hosted data. To maintain compliance with many data security standards, you must be able to document who has access to your data, and it is more challenging in a cloud-based environment to document access points (a minimal access-logging sketch follows). Understanding how your data provider houses data and the security surrounding the platform or facilities is important for complying with both internal and external data security standards.
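A minimal access-logging sketch showing the kind of record keeping that makes "who has access to the data?" answerable. The JSON-lines format and field names are assumptions for illustration, not a required schema from any compliance framework.

    import json
    import time

    def log_access(log_path: str, user: str, dataset: str, action: str) -> None:
        """Append one timestamped access event to an audit log."""
        entry = {"ts": time.strftime("%Y-%m-%dT%H:%M:%SZ", time.gmtime()),
                 "user": user, "dataset": dataset, "action": action}
        with open(log_path, "a") as f:   # append-only audit trail
            f.write(json.dumps(entry) + "\n")

    def who_accessed(log_path: str, dataset: str) -> set:
        """Answer the audit question: which users touched this dataset?"""
        users = set()
        with open(log_path) as f:
            for line in f:
                entry = json.loads(line)
                if entry["dataset"] == dataset:
                    users.add(entry["user"])
        return users

    log_access("audit.log", "jdoe", "consumer_master", "export")
    print(who_accessed("audit.log", "consumer_master"))   # {'jdoe'}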
Does the data reside in a secure facility?

Quality data providers will utilize data centers in SSAE 16 Type 2 or SAS 70 facilities. These facilities adhere to strict standards for the reliability of power and cooling, the security of the premises where the data resides, and the quality of technical support. Statement on Auditing Standards (SAS) No. 70 was a widely recognized auditing standard developed by the American Institute of Certified Public Accountants (AICPA). A service auditor's examination performed in accordance with SAS 70 represents that a service organization has been through an in-depth examination of its control objectives and control activities, which often include controls over information technology and related processes. In today's global economy, service organizations and service providers must demonstrate that they have adequate controls and safeguards when they host or process data belonging to their customers. In addition, the requirements of Section 404 of the Sarbanes-Oxley Act of 2002 made SAS 70 audit reports even more important to the process of reporting on the effectiveness of internal control over financial reporting. In 2011, Statement on Standards for Attestation Engagements (SSAE) No. 16 took effect and replaced SAS 70 as the authoritative guidance for performing a service auditor's examination. SSAE 16 established a new attestation standard (AT 801) to contain the professional guidance.

Does the data provider maintain an Information Security Program that is continuously reviewed and updated?

As data regulations evolve and industry breaches frequently occur, a continual review and update of security policies and procedures is critical for all data suppliers. This includes policies such as restricting data storage to US-based, highly secure data centers, ensuring access-as-authorized, appropriate logging, secure firewalls, secure/encrypted data transmission, and intrusion detection/vulnerability processes.

COMMON DATA USE REGULATIONS

GLBA (Gramm-Leach-Bliley Act)

GLBA regulates the use of financial data. Primary standards include:

- Financial institutions are required to ensure the security and confidentiality of customer information; protect against any anticipated threats or hazards to the security or integrity of such information; and protect against unauthorized access to or use of customer information that could result in substantial harm or inconvenience to any customer
- The law requires these institutions to explain how they use and share your personal information
- The law also allows consumers to stop, or opt out of, certain information sharing
- The law requires that financial institutions describe how they will protect the confidentiality and security of consumer information

FCRA (Fair Credit Reporting Act)

FCRA regulates the collection, dissemination, and use of consumer information, including consumer credit information. FCRA:

- Provides a consumer with information about him or her in the agency's files and describes the steps to verify the accuracy of information disputed by the consumer. Under the Fair and Accurate Credit Transactions Act (FACTA), an amendment to the FCRA passed in 2003, consumers are able to receive one free credit report per year
- Requires that if negative information is removed as a result of a consumer's dispute, it may not be reinserted without notifying the consumer in writing within five days
- Prohibits credit reporting agencies from retaining negative information for an excessive period. The FCRA describes how long negative information, such as late payments, bankruptcies, tax liens, or judgments, may stay on a consumer's credit report: typically seven years from the date of the delinquency. The exceptions are bankruptcies (10 years) and tax liens (seven years from the time they are paid). (A brief date-arithmetic sketch of these retention windows follows the DPPA overview below.)

TCPA (Telephone Consumer Protection Act of 1991): October 2013 Updates

On February 15, 2012, the Federal Communications Commission (FCC) adopted substantial changes to the Telephone Consumer Protection Act of 1991 (TCPA). These changes, which took effect on October 16, 2013, include the following:

- Prior express written consent is required for all autodialed or prerecorded telemarketing calls or text messages to wireless numbers and prerecorded calls made to residential landlines, with the exception of informational calls, such as those from non-profit organizations, political calls, and calls for other noncommercial purposes (e.g., informational messages such as school closings)
- Consent must be unambiguous, with the consumer receiving clear disclosure that they will receive future calls that deliver prerecorded messages by or on behalf of a specific seller
- Specific requirements apply for allowing consumers to opt out of future robocalls during a robocall
- The Established Business Relationship exemption for prerecorded telemarketing calls to residential landlines has been eliminated, requiring companies to obtain express written consent from their consumers before delivering prerecorded telemarketing messages

DPPA (Drivers Privacy Protection Act of 1994)

The DPPA makes it illegal to obtain driver information for unlawful purposes or to make false representations to obtain such information. The act establishes criminal fines for noncompliance and creates a civil cause of action for drivers against those who unlawfully obtain their information. The DPPA governs permissible use guidelines for automobile data. These guidelines include:

- Use by any government agency to carry out its functions
- Use in connection with matters of motor vehicle or driver safety and theft
- Use in the normal course of business by a legitimate business or its agents, employees, or contractors, but only to verify the accuracy of personal information and to correct that information
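Date arithmetic only: a sketch of the FCRA retention windows summarized above (seven years for most negative items, ten for bankruptcies, and seven years from payment for tax liens). The rule table is a simplification for illustration, not legal guidance.

    from datetime import date

    RETENTION_YEARS = {
        "late_payment": 7,   # from date of delinquency
        "judgment": 7,       # from date of delinquency
        "tax_lien": 7,       # from date the lien is paid
        "bankruptcy": 10,
    }

    def drop_off_date(item_type: str, anchor: date) -> date:
        """Anchor is the delinquency date (or the payment date for tax liens)."""
        return date(anchor.year + RETENTION_YEARS[item_type],
                    anchor.month, anchor.day)

    # A late payment from March 2010 would typically age off in March 2017:
    print(drop_off_date("late_payment", date(2010, 3, 15)))   # 2017-03-15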
CAN-SPAM (Controlling the Assault of Non-Solicited Pornography And Marketing Act of 2003)

CAN-SPAM was created specifically to provide guidelines surrounding electronic mail messages whose primary purpose is commercial advertising or the promotion of commercial goods and services. Guidelines include:

- Header information cannot be unclear or misleading. The From, To, Reply-To, and routing information, including the originating domain name and email address, must be accurate and identify the person or business who initiated the message
- The email subject line must accurately reflect the content of the message
- Companies must disclose clearly and conspicuously that their message is an advertisement
- The email message must include a valid physical postal address
- Messages must include a clear and conspicuous explanation of how the recipient can opt out of receiving email in the future
- Any opt-out mechanism offered must be able to process opt-out requests for at least 30 days after the message was sent, and a recipient's opt-out request must be honored within 10 business days. Businesses cannot charge a fee, require the recipient to give any personally identifying information beyond an email address, or make the recipient take any step other than sending a reply email or visiting a single page on an Internet website as a condition for honoring an opt-out request
- The law makes clear that even if a company outsources its email marketing, the originating business is still legally responsible for complying with the law

DATA PARTNERSHIP

Understanding the processes your data provider adheres to when compiling a new file, processing and hosting your data, or reselling data to you from a data compiler protects the best interests of your business. Quality data compilation and structured regulations ensure your data supplier is mitigating risk for you and helping you achieve the data quality needed to reach your overall goals.

ABOUT INFUTOR DATA SOLUTIONS

Infutor Data Solutions provides marketers with access to elite consumer data, business data, new movers, telephone, automotive, and email data. Specializing in cost-effective solutions for retailers, non-profit and fundraising organizations, and direct marketers, Infutor has gained industry recognition and grown significantly over the past several years. In addition to providing high-quality compiled data and marketing solutions, Infutor also provides automated data processing, including telephone append, e-append, reverse e-append, and a variety of proprietary data cleansing processes to help marketers reach the maximum number of customers and prospects while reducing the cost of acquisition. Infutor's senior leadership team includes key executives from companies including TransUnion, Experian, Acxiom, Visa, and Accudata. Together they bring a wealth of experience in data sourcing, linkage, and database applications.

Infutor Data Solutions
15129 South Route 59, Plainfield, IL 60544
(312) 348-7900
sales@infutor.com
www.infutor.com