A Unified Ethical Frame for Big Data Analysis
Big Data Ethics Project
Version: October 2014
Introduction

Big data provides unprecedented opportunities to drive information-based innovation in economies, healthcare, public safety, education, transportation and almost every human endeavour. Big data also creates risk to both individuals and society unless effective governance is in place. That governance must be sensitive not only to privacy but also to reticence risk: the harm to individuals when data is not used because of ambiguity about how to apply laws, standards and regulations.

Governance must be holistic, taking into consideration concepts of good and bad from all potential stakeholders. That means the analysis should consider the benefits and risks to the individual, to society as a whole, and to the parties conducting big data discovery and application. Moreover, data protection requires a full understanding of the potential impact of big data on the full range of human rights, not just those related to privacy.

To establish big data governance, the Foundation believes in the need for a common ethical frame based on key values and for an interrogation framework. The latter consists of a set of key questions to be asked and answered to illuminate significant issues, both for industry and for those providing oversight of big data projects. Reviews must be conducted from the 360-degree ethical perspective discussed above; that assessment must take into consideration all human, societal and business interests and rights.

In formulating a frame, we concluded the following: governance requires enforcement; big data enforcement needs to be explored by stakeholders [1]; and interrogation frameworks should be customized (at least at the industry level and possibly down to the company level). To assure the project materials are approachable, they will be broken into four parts: Part A, Unified ethical frame; Part B, Interrogation framework; Part C, Enforcement discussion; and Part D, Industry interrogation models. Parts A and B will be completed in the project's first phase. Part C will focus on enforcement, and Part D will create examples [2]. It is anticipated that Parts A and B will be completed in 2014 and will be shared prior to starting the later parts.

This is a living document. As we learn more, for example in creating the interrogation documents in Part D and vetting all parts with the data community, we will make changes. Future amended documents will have a new version number and date on the title page.

[1] While key data protection concepts are enduring and sound, the Foundation believes current law, in many instances, does not contain the precise authority for some privacy agencies to enforce, nor the targeted incentives to encourage the balancing processes for business suggested in this framework.
[2] The Foundation has received a grant from Acxiom Corporation for a big data ethical tool for marketing. The Foundation is also in discussions to help create three other interrogation tools.
Part A: Key Values for a Shared Ethical Frame for Big Data

Introduction

Effective governance of big data analytics facilitates quality outcomes that create economic, research and social value while still preserving the ability of individuals to define themselves, where appropriate, and to avoid predestination. A common ethical framework provides the necessary foundation for organizations to build an accountable governance system for big data analytics. The first step in building such a process is defining core values, as described below, from which ethics-based rules and outcomes can evolve.

Kenneth Cukier's and Viktor Mayer-Schoenberger's book on big data defined the need for "algorithmists" to be interrogators of the big data process to achieve effective governance [3]. Whether the analysis is conducted by an individual or a team, mechanisms are necessary to assure that process interrogation is responsible and answerable to stakeholders. For any process to be responsible, it must rest upon a foundation of established societal norms [4], not only for privacy but also for consumer, citizen and subject protection, intellectual rights, liberty, freedom, and free expression. Those interests may vary from country to country and from legal system to legal system. However, the core interest in fairness remains critical to any discussion of basic rights.

Big data analytics often remain invisible to individuals, even when big data insights affect them. While transparency is important to market and regulatory checks and balances, internal processes that have, and can demonstrate, integrity are vital. This paper seeks to define the scope and concepts of key values for a shared ethical frame for big data that is demonstrable to both internal oversight entities and external oversight organizations [5].

Before describing the process to define core values, it is necessary to address how this document defines big data. Many definitions of big data have been articulated in different fields over the past few years [6]. Cukier's and Mayer-Schoenberger's definition is probably the most applicable. They write that big data "refers to things one can do at a large scale that cannot be done at a smaller one, to extract new insights or create new forms of value, in ways that change markets, organizations, the relationship between citizens and government, and more" [7]. While this definition links to other definitions of big data, such as Gartner's, where the emphasis is on the volume, velocity and variety of data, Cukier and Mayer-Schoenberger focus on the ability of big data to change the manner in which key questions are confronted by looking for interesting correlations between data sets that would not have been visible using legacy systems, small data sets and intuition [8]. Advanced analytic processes that make it possible to use unstructured data to conduct long-standing legacy forms of analysis with greater and more diverse data are included in the definition of big data, but this ethical frame is more focused on the processing of data that makes insights possible which would previously have been considered impossible.

The ethical frame was developed, in part, from prior work by the project leadership. That prior work includes Big Data and Analytics: Seeking Foundations for Effective Privacy Guidance, which suggests a two-phase approach to big data analytics [9]. The first phase is discovery [10], which yields new insights. The second phase is application, which puts the insights into effect. The first step in discovery consists of the aggregation and, where applicable, the de-identification of data. Big data analytics bring together very large, diverse and often unstructured data sets. Therefore, organizations conducting both big data discovery and application must conduct due diligence on the sources of data to assure the data is appropriate for the intended purpose. Data may be inappropriate for a number of reasons. The data use may be prohibited by law or by contract restrictions. Sometimes the data is too fragmented, or its accuracy is doubtful. In other instances, the data may have been provided, created or observed in a deceptive fashion. The ethics of big data analytics rests upon due diligence on the appropriateness of the source data. Due diligence processes must be proportional to both societal and individual risks and to the data's significance. Inadequate due diligence regarding the source data is comparable to data scientists not understanding the characteristics associated with the data they use, which creates the risk of inaccurate correlations that may lead to inappropriate outcomes.

In addition to due diligence, it is important to understand that the discovery phase is where one may find correlations between data sets that would not be visible without the muscle of modern high-speed computing and advanced analytic processes and technologies. In the discovery phase, one does not apply those insights but only conducts the research to illuminate them. Any implementation of the insights would occur in the application, not the discovery, phase. The discovery phase typically begins with a repurposing of data already in existence. It does not usually involve collection of data directly from the individual or from observations of the individual as part of its process. Therefore, the discovery phase is not usually personally impactful.

[3] Mayer-Schönberger, V. and K. Cukier (2013), Big Data: A Revolution That Will Transform How We Live, Work and Think, Houghton Mifflin Harcourt, New York, pp.
[4] For the purposes of this framework, the Foundation used The Universal Declaration of Human Rights as the source for societal norms. The Foundation acknowledges that, in some cases, the rights are still aspirational and, in some jurisdictions, norms have not evolved with time.
[5] External oversight organizations may be regulatory agencies, accountability agents and the voice of the crowd.
[6] Gartner, Inc., defines big data as "high-volume, high-velocity and high-variety information assets that demand cost-effective, innovative forms of information processing for enhanced insights and decision making"; Gartner, Inc. (n.d.), "Big Data", accessed 1 September.
[7] Mayer-Schönberger, V. and K. Cukier (2013), Big Data: A Revolution That Will Transform How We Live, Work and Think, Houghton Mifflin Harcourt, New York, p. 6.
[8] Legacy analytics relied on precisely designed and formatted samples. Big data makes use of complete data sets that are not always in structured fields. "Small data sets" refers to legacy data samples.
[9] The Centre for Information Policy Leadership (2013), Big Data and Analytics: Seeking Foundations for Effective Privacy Guidance.
[10] Discovery is typically research and is recognized as a compatible use under the EU Directive. However, processing still requires a legal basis. Establishing a legal basis in Europe can require the type of balancing process discussed in this paper. (See Article 7(f) of the EU Directive.)
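Purely as an illustrative sketch, and not as part of the frame itself, the two-phase structure and the source due diligence it depends on could be recorded along the following lines. Every name and check in this sketch is hypothetical.

"""Illustrative sketch only: a hypothetical way to encode the discovery/application
split and source due diligence as simple data structures. None of these names come
from the Foundation's framework."""

from dataclasses import dataclass
from enum import Enum


class Phase(Enum):
    DISCOVERY = "discovery"      # research: surface correlations; insights are not yet applied
    APPLICATION = "application"  # insights put into effect, where individual impact can occur


@dataclass
class DataSource:
    name: str
    legally_permitted: bool       # no statutory or regulatory prohibition on this use
    contract_allows_use: bool     # no contractual restriction on repurposing the data
    accuracy_documented: bool     # provenance and quality of the data are understood
    collected_deceptively: bool   # provided, created or observed in a deceptive fashion


def source_due_diligence(sources: list[DataSource]) -> list[str]:
    """Return the reasons, if any, a source is inappropriate for the intended purpose.

    An empty list means these (hypothetical) checks passed; a real review would be
    proportional to the societal and individual risks and to the data's significance.
    """
    problems: list[str] = []
    for s in sources:
        if not s.legally_permitted:
            problems.append(f"{s.name}: use prohibited by law or regulation")
        if not s.contract_allows_use:
            problems.append(f"{s.name}: use restricted by contract")
        if not s.accuracy_documented:
            problems.append(f"{s.name}: accuracy or provenance is doubtful")
        if s.collected_deceptively:
            problems.append(f"{s.name}: data was obtained deceptively")
    return problems

A project record carrying a Phase value could then trigger the heightened review discussed below only when insights move from discovery into application.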
A word of caution, however: the structured evaluation of the discovery phase is very important. The data used will have implications for the insights generated. Some data scientists have asserted that the sheer volume of data corrects for any flaws in the data itself. However, a recent Oxford University journal article by Bruening and Waterman discusses the risks associated with the discovery phase [11]. Bruening and Waterman argue that mistakes related to picking the wrong data sets, or even not understanding the characteristics related to the data, will affect the accuracy of the insights reached. If the wrong data sets have been selected to begin with, inaccurate insights or conclusions could result that may have negative societal impacts, including the chance for discrimination or harm.

Generally, if appropriate safeguards are in place, if the purpose is legitimate and if the data used is of a quality appropriate for that purpose, discovery should not have an impact on the individual. However, one cannot disregard the conditions just described.

The application phase is where impact on the individual is more likely to occur. In the application phase, the insights from discovery are used to make decisions that can be positively or negatively impactful. Such decisions might range from which drugs should be used in a medical protocol to what time to change the direction of high-occupancy lanes on a highway. Applications may particularly affect individuals if the insights are employed in an individually unique manner. For example, the discovery phase might identify factors that, when taken together, would predict the likelihood of a particular cancer. If the knowledge is administered in rank order so that individuals are labelled on that likelihood, the labelling might have a direct impact on those individuals. Similarly, a model generating a credit score that corresponds to the likelihood that an individual with that credit history will repay or default on a loan, and that becomes a label attached to an individual, may affect that individual either adversely or advantageously. All applications that make decisions affecting individuals require due diligence, whether targeted to a specific individual or not [12]. Those that touch on specific individuals require a heightened review.

The application phase may include processes that assess new insights and apply changes to the application. For example, network security systems may be trained to look for new anomalies and predict the likelihood that the anomalies will have a negative impact on the network, allowing the algorithms to detect and counter cyber-criminal actions or malware behaviour. The processes for looking for new risks and applying solutions are engineered into the application phase. Accordingly, the initial ethical analysis takes into consideration that these are learning, self-fixing systems.

Bright lines between discovery and application are not always apparent. However, processes designed purely to create new knowledge should be considered discovery, while insight development primarily designed to improve existing processes should be part of application.

Moreover, ethical big data analytics entails more than source due diligence; it includes consideration of a full spectrum of individual interests and human rights. It is not just a function of principles related to data protection or privacy. Global and regional texts define individual interests and rights and include principles that address employment, basic economic needs, family, free expression and the broad dispersal of the benefits of technology. Any ethical frame for big data analytics must take all of these interests into consideration. Privacy is an underpinning for many other interests, but not all of them. Reticence risk, that is, avoiding processing because one finds resolving the conflict between risks too difficult, leading to the loss of meaningful benefits to individuals and society as a whole, is as much a violation of fundamental rights as the loss of privacy. Digital predictions should empower, not limit, what individuals can achieve if left to their own free will.

Some processing of data and some applications of insights are prohibited by social values and laws. Any values assessment process begins with a review of, and compliance with, established laws. When processing data as part of a global process, one must be sensitive to regional and even national differences [13]. Many national laws were enacted before analytic-based research was well understood. National laws that provide only one legal permission mechanism to process personal data, for example explicit consent for research, are particularly problematic. In some cases, by working with enforcement agencies, a protective but flexible legal basis may be established. The Article 29 Working Party's work on compatible use and legitimate interests has better informed the market on more flexible, but legitimate, means to use big data. It is beyond the scope of this frame to suggest legislative change, but more effective means to govern big data and protect individuals may be needed in some jurisdictions.

For generations, free enterprise has led to faster growth, more opportunities, solutions to social problems (while creating others) and new wealth. Big data insights will be used by businesses to further their objectives. The intent of this frame is not to stifle business but, rather, to channel big data endeavours so that they are creative, beneficial and protective all at the same time.

[11] Krasnow Waterman, K. and P. Bruening (2014), "Big Data Analytics: Risks and Responsibilities", International Data Privacy Law, Volume 4, Issue 2, Oxford University Press, Oxford.
[12] Every legal system deals with the discovery and application phases in a slightly different manner. European data protection law, as implemented within the member states, requires an assessment of (a) whether the new data use is incompatible with the purposes specified at collection and (b) whether there is a legal basis for the processing. There are six legal bases for processing in Europe, only one of which is consent. Many Latin American privacy laws require explicit consent for all uses of personal data; yet gaining explicit consent may be problematic under existing interpretations of those laws. Other regimes, such as Canada's, allow implied consent. A key question in Canada is whether notices are clear enough that individuals would be able to anticipate the big data processing based on the notice. Consent is a legal requirement in many jurisdictions and, where effective, should be used. However, the Article 29 Working Party's paper on legitimate interests points out the other measures and legal bases that should be used when consent is ineffective. One of the bases for processing data is legitimate interests, which requires a balancing-of-interests analysis. The process this paper advocates relies upon a balancing of interests.
[13] See footnote 12.
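To make the idea of a learning, self-fixing application concrete, the following minimal sketch, offered only as an illustration with an invented scoring rule and threshold, shows how a monitoring process in the spirit of the network security example above might score each new observation against a baseline that adapts as behaviour shifts.

"""Illustrative sketch of the kind of learning, self-correcting application the text
describes (e.g. network anomaly detection). The scoring rule and threshold are
invented for illustration and are not taken from any real system."""

from collections import deque
from statistics import mean, pstdev


class AnomalyMonitor:
    """Flags observations that sit far outside a rolling baseline of recent behaviour."""

    def __init__(self, window: int = 1000, threshold: float = 4.0):
        self.history = deque(maxlen=window)   # the most recent observations define "normal"
        self.threshold = threshold            # how many standard deviations counts as anomalous

    def observe(self, value: float) -> bool:
        """Score a new observation, then fold it into the baseline.

        Returns True when the value looks anomalous. Because every observation also
        updates the baseline, the detector keeps adapting as behaviour shifts, which is
        the learning, self-correcting quality attributed to such applications above.
        """
        is_anomaly = False
        if len(self.history) >= 30:           # wait for a minimal baseline before scoring
            mu, sigma = mean(self.history), pstdev(self.history)
            if sigma > 0 and abs(value - mu) > self.threshold * sigma:
                is_anomaly = True
        self.history.append(value)
        return is_anomaly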
Values for an Ethical Frame

The unified ethical frame consists of five key values that the Foundation isolated and believes help define the important questions for an ethical code with respect to big data. They are used to create a balancing process that facilitates governance with integrity in the application of big data methodologies. The values establish the starting point for developing the interrogation framework necessary to assure a balanced ethical approach to big data. The five values are: Beneficial, Progressive, Sustainable, Respectful and Fair.

Prior to a values-based analysis, there should be an understanding of the intended purpose of the big data analytics. This understanding should clarify why the analysis will take place. Is the purpose to identify data correlations that will reveal broad questions, or is the problem statement more narrowly defined? It is during this stage that the evaluators should ascertain whether there are any legal, contractual or overarching organizational values that affect the integrity of the analytics as understood.

Beneficial

Both the discovery and application phases require an organization to define the benefits that will be created by the analytics and to identify the parties that gain tangible value from the effort. The act of big data analytics may create risks for some individuals and benefits for others or for society as a whole. Those risks must be counterbalanced by the benefits created for individuals, organizations, political entities and society as a whole. Some might argue that the creation of new knowledge is a value-creating process in itself. While big data does not always begin with a hypothesis, it usually begins with a sense of purpose about the type of problem to be solved. Data scientists, along with others in an organization, should be able to define the usefulness or merit that comes from solving the problem so that it might be evaluated appropriately. The risks should also be clearly defined so that they may be evaluated as well. If the benefits that will be created are limited or uncertain, or if the parties that benefit are not the ones at risk from the processing, those circumstances should be taken into consideration, and appropriate mitigation for the risk should be developed before the analysis begins.

Progressive

Because bringing large and diverse data sets together and looking for hidden insights or correlations may create some risks for individuals, the value from big data analytics should be materially better than what could be achieved without them. If the anticipated improvements can be achieved in a less data-intensive manner, that less intensive processing should be pursued. One might not know the level of improvement in the discovery phase; in the application phase, however, the organization should be better equipped to measure it. This application of new learnings to create materially better results is often referred to as innovation. There are examples of big data being used to reduce congestion, manage disaster relief and improve medical outcomes. These are all examples of material improvements. There are other examples, however, where organizations may analyse data and achieve only marginal improvements but use big data simply because it is new and interesting. Organizations should not create the risks associated with big data analytics if there are other processes that will accomplish the same objectives with fewer risks [14].

Sustainable

All algorithms have an effective half-life, a period in which they effectively predict future behaviour. Some half-lives are very long; others are relatively short. The models used in the mortgage securitization market to assign risk to sub-prime mortgages in the first decade of this century are examples of data scientists not understanding how the models themselves would influence the behaviour of various market players. That change in behaviour affected the models' validity, helping to facilitate a market decline. The half-life of an insight affects sustainability. Big data analysts should understand this concept and articulate their best understanding of how long an insight might endure once it is reflected in an application.

Big data insights, when placed into production, should provide value that is sustainable over a reasonable time frame. Considerations that affect the longevity of big data analytics include whether the source data will be available for a period of time in the future, whether the data can be kept current, whether one has the legal permissions to process the data for the particular application, and whether the discovery may need to be changed or refined to keep up with evolving trends and individual expectations. For example, an early application of big data analytics led to a significant reduction in fraud when the discovery phase produced new insights showing that a significant portion of identity fraud was not identity theft but rather came from synthetic, or manufactured, identities. Later insights showed that fraudsters changed the makeup of those fake identities as organizations improved their processes to catch them. As a result, the predictive algorithms were continually refined to sustain their effectiveness in detecting and preventing fraud.

There are situations where data, particularly de-identified data, might be available for the discovery phase but would not be available in the application phase because of legal or contractual restrictions. These restrictions affect sustainability.

[14] Data protection guidance often raises the issue of proportionality. Concepts of proportionality come into play when conducting assessments on all the values, but they come into play particularly on progressive.
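The notion of an effective half-life discussed under Sustainable can be pictured, again purely as an illustrative sketch with assumed names and an assumed retention factor, as a comparison of live performance against the level measured at deployment.

"""Illustrative sketch of tracking an insight's half-life: compare live predictive
performance against the level measured at deployment and flag the model for
refinement once it decays past a chosen floor. The metric and the 0.9 retention
factor are assumptions made for this example."""

from dataclasses import dataclass


@dataclass
class DeployedInsight:
    name: str
    baseline_accuracy: float      # performance measured when the insight went into production
    retention_floor: float = 0.9  # refresh once less than 90% of baseline performance remains


def needs_refresh(insight: DeployedInsight, recent_accuracy: float) -> bool:
    """True when measured performance has decayed enough to warrant refinement."""
    return recent_accuracy < insight.baseline_accuracy * insight.retention_floor


# Hypothetical example echoing the fraud story above: a synthetic-identity model whose
# accuracy slips as fraudsters change the makeup of the fake identities.
fraud_model = DeployedInsight("synthetic-identity-detector", baseline_accuracy=0.92)
if needs_refresh(fraud_model, recent_accuracy=0.78):
    print("Schedule refinement: the insight is past its effective half-life")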
Respectful

Respectful relates directly to the context in which the data originated and to the contractual or notice-related restrictions on how the data might be applied. The United States Consumer Privacy Bill of Rights speaks to data being used within context; European law discusses processing that is not incompatible with its defined purpose; and Canadian law allows for implied consent for evolving uses of data. Big data analytics may affect many parties in many different ways. Those parties include the individuals to whom the data pertains, the organizations that originate the data, the organizations that aggregate the data and those that might regulate the data. All of these parties have interests that must be taken into consideration and respected. For example, a specialized social network might display data pertaining to individuals that they would not expect to be used for, or that would be inappropriate for, employment-related purposes. Organizations using big data analytics should understand and respect the interests of all the stakeholders involved in, or affected by, the application. Anything less would be disrespectful.

Fair

Fairness relates to the insights and applications that are a product of big data, while respectful speaks to the conditions related to, and the processing of, the data. In lending and employment, United States law prohibits discrimination based on gender, race, genetics or age. Yet big data processes can predict all of those characteristics without actually looking for fields labelled gender, race or age. The same can be said about genotypes, particularly those related to physical characteristics. Section 5 of the United States Federal Trade Commission Act prohibits unfair practices in commerce that are harmful to individuals and not outweighed by countervailing benefits [15]. European guidance on the application of the data protection directive continually references fairness as a component of determining whether a use of data is incompatible or whether a legal basis to process is appropriate. Big data analytics, while meeting the needs of the organization that is conducting or sponsoring the processing, must be fair to the individuals to whom the data pertains. The analysis of fairness needs to look not only at protecting against unseemly or risky actions but also at enhancing beneficial opportunities. Human rights instruments speak to shared benefits of technology and broader opportunities related to employment, health and safety. Interfering with such opportunities is also a fairness issue.

[15] The FTC Policy Statement on Unfairness of 17 December 1980 states: "(1) whether the practice, without necessarily having been previously considered unlawful, offends public policy as it has been established by statutes, the common law, or otherwise - whether, in other words, it is within at least the penumbra of some common law, statutory or other established concept of unfairness; (2) whether it is immoral, unethical, oppressive or unscrupulous; (3) whether it causes substantial injury to consumers (or competitors or other businessmen)." U.S. Federal Trade Commission (1980), "FTC Policy Statement on Unfairness".
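Purely by way of illustration, and without suggesting that this is the Foundation's interrogation framework, a review team might record the outcome of a five-values assessment in a structure such as the following; the field names and the pass rule are invented for the example.

"""Illustrative sketch only: one hypothetical way to record the outcome of a
five-values assessment. It is not the Foundation's interrogation framework."""

from dataclasses import dataclass


@dataclass
class ValueFinding:
    value: str        # "beneficial", "progressive", "sustainable", "respectful" or "fair"
    satisfied: bool   # did the review conclude this value is met?
    rationale: str    # the documented reasoning, so the judgment is answerable later


def assessment_passes(findings: list[ValueFinding]) -> bool:
    """A project proceeds only when every one of the five values has been addressed
    and judged satisfied; anything unresolved goes back for mitigation."""
    required = {"beneficial", "progressive", "sustainable", "respectful", "fair"}
    satisfied = {f.value for f in findings if f.satisfied}
    return required <= satisfied


findings = [
    ValueFinding("beneficial", True, "Benefits and at-risk parties identified; mitigation agreed"),
    ValueFinding("progressive", True, "No less data-intensive way to reach a comparable result"),
    ValueFinding("sustainable", True, "Insight expected to remain valid; source data remains available"),
    ValueFinding("respectful", True, "Use is consistent with the context in which the data originated"),
    ValueFinding("fair", False, "Potential proxy discrimination has not yet been ruled out"),
]
print(assessment_passes(findings))  # False: the unresolved fairness concern blocks application

The point of such a record is not the code but the discipline it implies: each value is addressed explicitly, the rationale is documented, and an unresolved value blocks application until it is mitigated.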
Upcoming

In conducting this fairness assessment, organizations should take steps to balance individual interests with integrity. The project's next step is to develop the interrogation framework that will be used later in the development of interrogation questions in Part D. That paper should be available in late
Appendix A

Big Data Ethics Research Team

Martin Abrams - The Information Accountability Foundation
Paula Bruening - Intel Corporation
Jennifer Barrett-Glasgow - Acxiom
Lynn Goldstein - New York University Center for Science and Progress
Barbara Lawler - Intuit Inc.
Miranda Mowbray - HP Laboratories
Artemi Rallo Lombarte - Universitat Jaume I de Castelló
Scott Taylor - Hewlett-Packard
Hilary Wandall - Merck & Co., Inc.

Big Data Ethics Project Staff

Susan Smith - The Information Accountability Foundation, Project Manager
Nick Warren - The Information Accountability Foundation, Publications