White Paper. The Definition of Personal Data: Seeing the Complete Spectrum
Omer Tene, Senior Fellow
Christopher Wolf, Founder and Co-Chair
The Future of Privacy Forum
January 2013

The Future of Privacy Forum (FPF) is a Washington, DC-based think tank that seeks to advance responsible data practices. The Forum is led by Internet privacy experts Jules Polonetsky and Christopher Wolf and includes an advisory board composed of leaders from industry, academia, law, and advocacy groups.
The Definition of Personal Data: Seeing the Complete Spectrum

European law considers data to be personal if they relate to an identified natural person or to a natural person who can be identified, directly or indirectly, by means reasonably likely to be used by the controller or by any other natural or legal person. 1 Under both the existing and the proposed European data privacy regimes, once data are defined as personal data, the full complement of legal requirements applies. 2 The implication of this binary approach to privacy law is that once data are deemed sufficiently de-identified or anonymized, no privacy-related obligations remain. It is no surprise, therefore, that the bar has historically been set quite high for determining when data can be considered sufficiently de-identified to exempt their controllers and processors from data protection requirements.

The current binary approach, we argue in this paper, is neither practical nor as protective of individuals as it can and should be. Whether or not data are reasonably likely to be related to an identifiable individual should depend not only on the application of technical measures of de-identification but also on the use of administrative safeguards and legal commitments undertaken by a controller (and imposed downstream on any third-party transferees). Only when data are publicly released without appropriate administrative or legal strings and obligations should a solely technical de-identification standard apply. And in those cases, differential privacy provides a robust framework for satisfying technical de-identification requirements.

1 General Data Protection Regulation ("GDPR"), Articles 4(1) and 4(2). References to the proposed GDPR are to the European Commission proposal of 25 January 2012: Proposal for a Regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation), COM(2012) 11 final. References to the Rapporteur's Report are to the Draft Report on the proposal for a regulation of the European Parliament and of the Council on the protection of individuals with regard to the processing of personal data and on the free movement of such data (General Data Protection Regulation) (COM(2012)0011 C7 0025/ /0011(COD)), Committee on Civil Liberties, Justice and Home Affairs, Rapporteur: Jan Philipp Albrecht.

2 There are sound reasons to question the application of the data protection framework to data held by a party who has no capability of re-identification, simply because the controller or any other natural or legal person could re-identify those same data. Consider, for example, a data processor that stores strictly encrypted data without holding the key. Such a processor would be subject to data protection obligations without a practical ability to make sense of the data stored on its equipment. However, this language dates back to the 1995 Data Protection Directive and is therefore not addressed in this White Paper.
The proposed approach enables stronger protections for individuals' right to privacy because it envisions the use of a more comprehensive range of data protection measures. In addition, this approach adjusts practically to the reality that, while contemporary mathematical capabilities may theoretically enable re-identification to occur, their impact can functionally and effectively be constrained by legal and organizational measures.

Pseudonymous data

The Rapporteur's proposal of a new concept of "pseudonymous data" is commendable. 3 Such a concept could ostensibly bridge between data that are subject to the full gamut of legal obligations and data that remain exempt from the reach of the law. Yet the definition of the new term in the Rapporteur's Report is vague (e.g., what does "specific to one given context" mean? Is online behavioral advertising a specific context? Does the term seek to distinguish between first- and third-party uses?); and, more importantly, under the Rapporteur's Report controllers lack sufficient (if any) incentive to pseudonymize data.

Specifically, pseudonymization should excuse controllers from certain obligations under the GDPR, such as obtaining explicit data subject consent or providing rights of access and rectification. However, according to the Rapporteur's Report, Article 10 of the GDPR, which excuses a controller from "acquir[ing] additional information in order to identify the data subject for the sole purpose of complying with any provision of this Regulation," does not apply to pseudonymous data. This is the result of the Rapporteur's Report's addition of the words "single out" to the Commission's proposed text of Article 10. Pseudonymous data are defined as information that is not directly identifiable but allows the singling out of a data subject. And under the Rapporteur's Report, Article 10 applies only to data that "do not permit the controller to identify or single out a natural person"; hence not to pseudonymous data. 4

Moreover, the Rapporteur's Report revisions to the consent provisions imply that consent to the processing of pseudonymous data may be given "by automated means using a technical standard with general validity in the Union (...) without collecting identification data." 5 Yet the obligation to document and prove data subject consent under Article 7(1) of the GDPR renders the collection of consent from unauthenticated users theoretically appealing but realistically impractical. Indeed, while the Rapporteur's Report relies on novel theories of authentication without identification, 6 the deployment of such infrastructures has been slow to gain traction in practical business settings. In addition, the Rapporteur's Report's explanatory comments refer to a Do Not Track standard, which has yet to be implemented as a technical standard, much less agreed upon from a policy perspective. 7

3 The Rapporteur's Report, proposed Article 4(2a).

4 This logical impasse is reinforced by the Rapporteur's Report revisions to recital 45, under which: "If the data processed by a controller do not permit the controller to identify or single out a natural person, the data controller should not be obliged to acquire additional information in order to identify the data subject for the sole purpose of complying with any provision of this Regulation" (adding the words "or single out").

5 The Rapporteur's Report, proposed Article 7(2a).

6 The Rapporteur's Report, proposed recital 52.

7 See discussion in Omer Tene & Jules Polonetsky, To Track or "Do Not Track": Advancing Transparency and Individual Control in Online Behavioral Advertising, 13 MINN. J. L. SCI. & TECH. 281 (2012).
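To make the technical idea behind pseudonymous data more concrete, the following is a minimal sketch, independent of the Report's legal definition and not drawn from any legal text or standard: a direct identifier is replaced by a token derived with a secret key, so that records about the same person can still be singled out and linked, but the person cannot be identified without the key. The field names, key handling, and example values are illustrative assumptions only.

    import hmac, hashlib

    # Illustrative key; in practice it would be stored separately under access controls.
    SECRET_KEY = b"held-by-the-controller-only"

    def pseudonymize(email: str) -> str:
        """Replace a direct identifier with a stable keyed token (HMAC-SHA256).

        The same input always yields the same token, so a data subject can still
        be 'singled out' and records linked across a dataset; but without the key
        the token cannot be traced back to the original identifier.
        """
        return hmac.new(SECRET_KEY, email.lower().encode("utf-8"), hashlib.sha256).hexdigest()

    # Two events from the same (hypothetical) user share a token, yet the stored
    # records contain no directly identifying field.
    events = [
        {"user": pseudonymize("alice@example.com"), "action": "ad_click"},
        {"user": pseudonymize("alice@example.com"), "action": "page_view"},
        {"user": pseudonymize("bob@example.com"), "action": "ad_click"},
    ]
    print(len({e["user"] for e in events}))  # 2 distinct pseudonymous users

Deleting or rotating the key is an administrative safeguard of the kind discussed below: it leaves the stored tokens unchanged but removes the controller's practical ability to match them against identified individuals.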
The end result of these shortcomings is that few controllers are likely to pursue data pseudonymization.

The Limits of Technical De-Identification

As noted, under current European privacy law, data that are not identifiable to an individual are exempt from legal obligations. For many years, technical measures of anonymization or de-identification were thus perceived as a silver bullet, allowing organizations to collect, analyze, retain and repurpose information while at the same time preserving individuals' privacy. With the advent of big data collection, storage and analysis, the societal value ascribed to de-identified data has become immense, driving innovations in healthcare, energy conservation, fraud prevention, data security, and more. 8 Optimally, the legal framework would allow for uses of data that maximize such value while minimizing privacy risk.

Yet over the past decade, it has become clear that in a world of big data, de-identification based solely on technical measures is increasingly strained by ever more effective re-identification techniques. 9 Today, ample evidence exists of re-identification of apparently de-identified data, through methods such as multiple queries on a single database or linking the data in one database with data in another, often publicly available online. 10 Moreover, measures commonly taken to de-identify personal data, such as scrubbing or aggregating certain data fields, degrade the utility of the data for future use. 11 The result is that strictly technical de-identification has become encumbered with a deadweight cost, reducing data utility without eliminating privacy risks.

Internal Use and Limited Sharing: Administrative and Legal Protections

The limitation of technology-based de-identification does not mean, however, that all data should henceforth come under the full remit of data protection laws, regardless of how remote the risk of re-identification. When using data internally or sharing them with third-party service providers or business associates, organizations can put in place stringent administrative and legal safeguards, in addition to technical de-identification, to greatly reduce the likelihood of re-identification. Moreover, organizations frequently have no interest in re-identifying specific individuals from de-identified datasets.

8 Omer Tene & Jules Polonetsky, Privacy in the Age of Big Data: A Time for Big Decisions, 64 STAN. L. REV. ONLINE 63 (2012).

9 Paul Ohm, Broken Promises of Privacy: Responding to the Surprising Failure of Anonymization, 57 UCLA L. REV. (2010).

10 See, e.g., Arvind Narayanan & Vitaly Shmatikov, Robust De-anonymization of Large Sparse Datasets, 2008 IEEE SYMPOSIUM ON SECURITY AND PRIVACY.

11 In technical terms, existing methods for aggregation of fields, such as k-anonymity, "fail to compose," meaning that they do not survive multiple instantiations.
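The utility cost of aggregating fields can be illustrated with a small sketch of k-anonymity style generalization. The records, field names, and generalization levels below are hypothetical assumptions, not drawn from the paper: dates of birth are coarsened and postcodes truncated until every released combination of quasi-identifiers is shared by at least k records, and each coarsening step discards detail a future analysis might have needed.

    from collections import Counter

    # Hypothetical patient records: (date_of_birth, postcode, diagnosis).
    records = [
        ("1980-02-17", "10115", "flu"),
        ("1980-07-03", "10119", "asthma"),
        ("1981-11-29", "10117", "flu"),
        ("1979-05-21", "10245", "diabetes"),
        ("1979-09-09", "10243", "flu"),
        ("1978-01-12", "10247", "asthma"),
    ]

    def generalize(dob, postcode, level):
        # Level 1: birth year + postcode district; level 2: birth decade + district.
        # Each level discards detail (utility) to enlarge the anonymity groups.
        year = dob[:4]
        if level == 1:
            return (year, postcode[:3] + "**")
        return (year[:3] + "0s", postcode[:3] + "**")

    def smallest_group(level):
        counts = Counter(generalize(dob, pc, level) for dob, pc, _ in records)
        return min(counts.values())

    print(smallest_group(level=1))  # 1 -> not even 2-anonymous on exact birth year
    print(smallest_group(level=2))  # 3 -> 3-anonymous, but ages survive only as decades

Even such a release would not withstand composition: combining it with a second, independently k-anonymized release of the same records can shrink the effective groups, which is the "fail to compose" problem noted in footnote 11.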
Treating all such data as personal would create perverse incentives for organizations to forgo administrative and legal safeguards and retain as much personal data as they can. Not all data are intended for public release or to be shared with unencumbered third parties. Different obligations should be imposed depending on whether the intended recipients of the data are the public at large, contractually bound service providers or business associates, researchers subject to institutional review boards, or internal employees. The UK Information Commissioner's Office, in its recent code of practice on anonymisation, suggests a similar approach: "It is important to draw a distinction between the publication of anonymised data to the world at large and limited access." 12

When assessing whether data should be considered personal and which data protection obligations should apply, regard should be had not only to technical de-identification but also to administrative and legal safeguards that make re-identification unlikely. Administrative safeguards include data security policies, access limits, employee training, data segregation guidelines, and data deletion practices that aim to prevent confidential information from leaking. Legal controls include contractual terms that restrict how service providers or business associates may handle, use or share information, penalties for contractual breaches, and auditing rights to ensure compliance. By implementing administrative and legal safeguards, organizations provide important privacy protections independent of technical de-identification.

In other words, the identifiability of data should be assessed on a spectrum rather than on the existing binary scale. Data that are identifiable only at great technical or legal cost should remain subject to the legal framework yet trigger only a subset of legal obligations. 13 For example, it makes little sense to provide individuals with a right of access and rectification to data that are not readily identifiable, as this would require data controllers to proactively authenticate identities and re-identify data, increasing privacy risks.

The explicit consent requirement in Article 4(8) of the GDPR, particularly when coupled with the burden-of-proof obligations under Article 7(1), will incentivize organizations to rely increasingly on an architecture of authenticated identities. In the context of the processing of pseudonymized or indirectly identifiable data, the requirement of explicit consent creates a perverse result that runs counter to other provisions of the GDPR, such as Article 10. The Rapporteur's Report, while recognizing the benefits of data pseudonymization, fails to provide sufficient incentive for organizations to deploy de-identification techniques, as it exculpates controllers from neither explicit consent requirements nor access and rectification obligations.

12 Information Commissioner's Office, ANONYMISATION: MANAGING DATA PROTECTION RISK CODE OF PRACTICE, November 2012, pplication/anonymisation_code.ashx.

13 Paul Schwartz & Dan Solove, The PII Problem: Privacy and a New Concept of Personally Identifiable Information, 86 NYU L. REV. (2011).
Given the stiff requirements for consent, a new legal basis should be added to the GDPR to authorize the processing of pseudonymized data without consent, based on a controller's legitimate interests. This would incentivize organizations to implement pseudonymization and prevent the use of authenticated identities strictly in order to prove compliance with the consent provisions of the GDPR. Certain organizations implement business models that require authenticated identities in one context (e.g., webmail services) while using anonymization or pseudonymization in other contexts (e.g., targeting ads). Such diversity should be encouraged, not stifled: in certain contexts individual accountability, which is based on identity authentication, is key, while in other contexts identification of specific individuals is neither required nor warranted.

Public Release: Differential Privacy

Just as data generation, analysis and re-identification methods have evolved over the past decade, so have privacy enhancing techniques. The existence and promise of such techniques should be recognized and their use incentivized in privacy laws. One new approach relevant to de-identification is called differential privacy. 14 Differential privacy provides a workable, formal definition of privacy-preserving data access, along with algorithms that can furnish mathematical proof of privacy preservation. Differential privacy avoids the ailments of de-identification by allowing data sharing in a way that maintains data quality while at the same time preserving individuals' privacy. It enables data controllers to share derivative data with third parties without subjecting any individual to more than a minimal risk of harm from the use of his or her data in computing the values to be released, even when those values are combined with other data that may be reasonably available. In other words, it allows data controllers and third parties to draw lessons and derive valuable conclusions from a data set without being able to determine whether or not such conclusions are based on the personal data of any given individual. Hence, differential privacy emphasizes not whether an individual can be directly associated with a particular revealed value, but rather the extent to which any revealed value depends on an individual's data.

14 Cynthia Dwork, Differential Privacy, in 33RD INTERNATIONAL COLLOQUIUM ON AUTOMATA, LANGUAGES AND PROGRAMMING (ICALP 2006); Cynthia Dwork, Frank McSherry, Kobbi Nissim & Adam Smith, Calibrating Noise to Sensitivity in Private Data Analysis, in PROC. OF THE 3RD THEORY OF CRYPTOGRAPHY CONFERENCE 265 (2006). Also see Ed Felten, Protecting Privacy by Adding Noise, TECH@FTC BLOG, June 21, 2012.
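The formal guarantee can be stated compactly. The following is a standard rendering of epsilon-differential privacy from the literature cited in footnote 14 (the symbols are ours, not the paper's): a randomized mechanism $\mathcal{K}$ satisfies $\varepsilon$-differential privacy if, for any two data sets $D_1$ and $D_2$ differing in the data of a single individual, and for any set $S$ of possible outputs,

    \Pr[\mathcal{K}(D_1) \in S] \;\le\; e^{\varepsilon} \cdot \Pr[\mathcal{K}(D_2) \in S].

Intuitively, whatever is released would have been released with nearly the same probability whether or not any one individual's data were included, which is precisely the "extent to which any revealed value depends on an individual's data." A standard way to achieve this for numeric queries is the Laplace mechanism described in the Dwork et al. paper: add noise with scale $\Delta f / \varepsilon$, where $\Delta f$ is the query's sensitivity (the maximum change one individual's data can cause in the answer). A minimal sketch for a counting query, with illustrative names and data only, might look as follows:

    import numpy as np

    def dp_count(values, predicate, epsilon: float) -> float:
        """Release a differentially private count: true count plus Laplace noise.

        A counting query has sensitivity 1 (adding or removing one person changes
        the count by at most 1), so Laplace noise with scale 1/epsilon yields
        epsilon-differential privacy for this single release.
        """
        true_count = sum(1 for v in values if predicate(v))
        return true_count + np.random.laplace(loc=0.0, scale=1.0 / epsilon)

    ages = [34, 29, 52, 41, 67, 30, 45]  # hypothetical raw data held by the controller
    print(dp_count(ages, lambda a: a >= 40, epsilon=0.5))  # noisy count of people aged 40 or over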
Differential privacy does not solve all privacy problems. 15 It does not provide protection vis-à-vis a data controller who is in possession of the data set containing the raw data. And it could leak information where positive correlations exist between individuals whose data reside in a given data set. Yet, compared to existing de-identification methods, differential privacy provides greatly increased privacy and salvages the utility of data that would otherwise have been suppressed under de-identification. In addition, as research progresses, differential privacy techniques continue to improve accuracy while keeping privacy loss fixed.

Conclusion

European law should avoid a rigid distinction between personal and non-personal data based on strictly technical criteria and divorced from the circumstances of data collection, retention, and use. Where data are maintained by a controller or shared with a restricted group of service providers or business associates, additional safeguards, administrative and legal, beyond technical de-identification can prevent re-identification. Where data are shared publicly or made freely available to transferees, technical safeguards such as differential privacy can be applied to provide adequate privacy. The GDPR should introduce the concept of pseudonymized data and give it credence by allowing the processing of such data without consent. This would prevent organizations from pursuing a perverse incentive to identify individuals strictly in order to comply with data protection law.

15 Other privacy enhancing technologies in this space include homomorphic encryption and crowd-blending. See Craig Gentry, A FULLY HOMOMORPHIC ENCRYPTION SCHEME, PhD Dissertation, Sept. 2009; Johannes Gehrke, Michael Hay, Edward Lui & Rafael Pass, Crowd-Blending Privacy, 32ND INTERNATIONAL CRYPTOLOGY CONFERENCE (CRYPTO 2012). Also see Tom Simonite, How to Share Personal Data While Keeping Secrets Safe, MIT TECH. REV., Aug. 7, 2012.