The De-identification Maturity Model

Authors: Khaled El Emam, PhD, and Waël Hassan, PhD


A Privacy Analytics Whitepaper

De-identification Maturity Assessment

Privacy Analytics has developed the De-identification Maturity Model (DMM) as a formal framework for evaluating the maturity of de-identification services within an organization. The framework gauges an organization's readiness and experience with respect to de-identification in terms of people, processes, technologies, and consistent measurement practices. The DMM is used as a measurement tool and enables the enterprise to implement a fact-based improvement strategy.

Contents

Introduction
Structure of the De-identification Maturity Model
Key De-identification Practice Dimension
    P1 Ad-hoc
    P2 Masking
    P3 Heuristics
    P4 Risk-based
    P5 Governance
The Implementation Dimension
    I1 The Initial Level
    I2 The Repeatable Level
    I3 The Defined Level
    I4 The Measured Level
    The Cumulative Nature of the Implementation Dimension
The Automation Dimension
    A1 Home-grown Automation
    A2 Standard Automation
Use of the De-identification Maturity Model
    The Practice Dimension and Compliance
    Scoring
    Process Improvement
    Ambiguous Cases
Mapping to the Twelve Characteristics
Quantitative Assessment Scheme
References

About Privacy Analytics

Introduction

Privacy Analytics has developed the De-identification Maturity Model (DMM) as a formal framework for evaluating the maturity of de-identification services within an organization. The framework gauges an organization's readiness and experience with respect to de-identification in terms of people, processes, technologies, and consistent measurement practices. The DMM is used as a measurement tool and enables the enterprise to implement a fact-based improvement strategy.

The DMM is a successor to a previous analysis in which we identified twelve criteria for evaluating a de-identification methodology [1], and to an earlier maturity model which we developed to assess the identifiability of data [2]. The criteria used, in both instances, were based on contemporary standards. The set of twelve criteria, although useful for general evaluation purposes, poses two challenges regarding its application: (a) how can these criteria be used to evaluate the de-identification practices of organizations, and (b) do all twelve criteria need to be implemented at once? We have now developed a maturity model based on these twelve criteria: the De-identification Maturity Model.

The DMM is intended to serve a number of purposes. It can (a) be used by organizations as a yardstick to evaluate their de-identification practices, (b) provide a roadmap for improvement, helping organizations determine what they need to do next to improve their de-identification practices, and (c) allow different units or departments within a larger organization to compare their de-identification practices in a concise and objective way. Organizations with a higher maturity score on the DMM are considered to have better and more sophisticated de-identification practices.

Higher maturity scores indicate that the organization is able to: (a) defensibly ensure that the risk of re-identification is very small, (b) meet regulatory and legal requirements, (c) share more data for secondary purposes using fewer resources (greater efficiency), (d) share higher quality data that better meets the analytical needs of the data recipients, (e) de-identify data through consistent practices, and (f) better estimate the resources and time required to de-identify a data set.

Structure of the De-identification Maturity Model

The DMM has five maturity levels to describe the de-identification practices that an organization has in place, with level 1 being the lowest level of maturity and level 5 the highest. Borrowing a term from the ISO/IEC international standard [1] on software process assessment, we will assume that the scope of the DMM is an organizational unit (or OU for short). An OU is a general term for entities of all sizes, from small units of a few people assigned to a specific project up to whole enterprises. We therefore deliberately do not define an OU because the definition will be case specific. It may be a particular business unit within a larger enterprise, or it can be a whole ministry of health.

The DMM is a descriptive model developed through discussions with, and experiences and observations of, more than 60 OUs over the last five years. The model is intended to capture the stages that an OU passes through as it implements de-identification services. Based on our experiences, the DMM describes the evolutionary stages through which OUs achieve greater levels of sophistication and efficiency.

The DMM has three dimensions, as illustrated in Figure 1. The first dimension is the nature of the de-identification methodology that the OU has in place: this is the Key De-identification Practice dimension. The second dimension captures how well these practices are being implemented. This Implementation dimension covers elements such as proper management of de-identification practices, documentation of practices, and their measurement. The third dimension, Automation, assesses the degree of automation of the de-identification process. We examine each of these dimensions in detail below.

FIGURE 1: The three dimensions of the De-identification Maturity Model (the Key De-identification Practice, Implementation, and Automation dimensions).

The DMM can be represented in the form of a matrix, as illustrated in Figure 2. This matrix also allows for the explicit scoring of an OU's de-identification services.

FIGURE 2: The de-identification maturity matrix showing all three dimensions of the DMM.

Key De-identification Practice Dimension

P1 Ad-hoc

At this level, an OU does not have any defined practices for de-identification. The OU may not even realize that de-identification is necessary. If it is recognized as necessary, it is left to database administrators or analysts to figure out what to do, without much guidance. The methods used to de-identify the data are either developed in-house (e.g., invented by a programmer) or picked up by browsing the Internet. At Practice Level 1 (P1), such methods are not proven to be rigorous or defensible. OUs at this level will often lack adequate advice from their legal counsel, have a weak or non-existent privacy or compliance office, and/or have poor communication with this office.

OUs at Practice Level 1 tend to have a lot of variability in how they de-identify data, as the type and amount of de-identification applied will depend on the analyst performing it, and that analyst's experience and skill. They may apply various techniques to the data, such as rudimentary masking methods or other non-reviewed approaches. The quality of the data coming out of the OU will also vary, as the extent of de-identification may be indiscriminately high or low. Some OUs at Practice Level 1 recognize the low maturity of their de-identification capabilities and err on the conservative side by not releasing any data. For OUs at this level which do release data, a data breach would almost certainly be considered notifiable in jurisdictions where breach notification laws are in place.

P2 Masking

OUs at this level only implement masking techniques. Masking techniques focus exclusively on direct identifiers such as names, phone numbers, and health plan numbers. They include techniques such as pseudonymization, the suppression of fields, and randomization [3]. As we have documented elsewhere [1], masking is not sufficient to ensure that the risk of re-identification is very small for the data set.
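The masking techniques named here (pseudonymization, field suppression, randomization) can be sketched in a few lines of Python. This is an illustrative sketch only: the record fields, the key handling, and the function names are hypothetical, and it is not a complete de-identification method.

```python
import hashlib
import hmac
import random

SECRET_KEY = b"replace-with-a-securely-stored-key"  # hypothetical key management

def pseudonymize(identifier: str) -> str:
    """Replace a unique direct identifier (e.g., an MRN) with a keyed pseudonym.
    A keyed HMAC, unlike a bare hash, cannot be brute-forced without the key."""
    return hmac.new(SECRET_KEY, identifier.encode(), hashlib.sha256).hexdigest()[:16]

def mask_record(record: dict) -> dict:
    """Apply masking to the direct identifiers of one record."""
    masked = dict(record)
    masked["mrn"] = pseudonymize(record["mrn"])               # pseudonymization
    masked["name"] = None                                     # field suppression
    masked["phone"] = "555-%04d" % random.randint(0, 9999)    # randomization
    return masked

record = {"mrn": "MRN-0042", "name": "Jane Doe", "phone": "613-555-0199",
          "birth_year": 1974, "postal_code": "K1A 0B1"}
masked = mask_record(record)
# Note: birth_year and postal_code (indirect identifiers) are untouched.
# Masking alone leaves the re-identification risk from such fields unaddressed.
```

As the comment notes, even a correct masking pass of this kind says nothing about the indirect identifiers, which is precisely why Practice Level 2 is insufficient.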
Masking is necessary, but not sufficient, to protect against identity disclosure. Even if good masking techniques are used, it is possible to produce a data set with a high risk of re-identification. According to our observations, OUs often remain at Practice Level 2 for one or a combination of three reasons: (a) masking tool vendors erroneously convince them that masking is sufficient to ensure that the risk of re-identification is small, (b) the OU needs to disclose unique identifiers (such as social security numbers or health insurance numbers) to facilitate data matching but is uncomfortable doing so and consequently chooses to implement pseudonymization, and (c) the individuals tasked with implementing de-identification lack knowledge of the area and do not know about residual risks from indirect identifiers.

In health care, and increasingly outside of health care, this method of de-identification alone will not meet the expected standards for the protection of personal information (see the table mapping the DMM to existing standards below). Therefore, OUs at Practice Level 2 will also have to go through a notification process if they experience a breach, in jurisdictions where there are breach notification laws.

P3 Heuristics

At this level OUs have masking techniques in place and have started to add heuristic methods for protecting indirect identifiers. Heuristic methods are rules of thumb used to de-identify data. For example, the commonly used minimum cell size of five is a rule of thumb, as is the heuristic that no geographic area with fewer than 20,000 residents will be released. Oftentimes these heuristics are simply copied from another organization that is believed to be reputable or is perceived to have good practices for managing privacy. A discussion of heuristics can be found elsewhere [4].

This is a significant improvement over Practice Level 2, in that the OU is starting to look at ways to protect indirect identifiers, such as location, age, dates of service, and types of service, that can be used in combination to identify individuals. However, heuristics have two key disadvantages: (a) they do not ensure that the risk of re-identification is sufficiently small for the OU's particular data sets, and (b) they may result in too much distortion of the data set.
The primary reasons for these disadvantages are that heuristics do not rely on measurement of re-identification risk and do not take the context of the use or disclosure of the data into account.

P4 Risk-based

Risk-based de-identification involves the use of empirically validated and peer-reviewed measures to determine the acceptable re-identification risk and to demonstrate that the actual risk in the data sets is at or below this acceptable risk level. In addition to measurement, there are specific techniques that take the context of the de-identification into account when deciding on an acceptable risk level. Because risk can be measured, it is possible to perform only enough de-identification on the data to meet the risk threshold, and no more. In addition to risk measurement, OUs at this level quantify information loss, or the amount of change made to the data. By considering these two types of measures, the OU can ensure that the data experience minimal change while still meeting the risk threshold requirement. Of course, masking techniques are still used at this level to protect direct identifiers in the data set; these are assumed to be carried over from lower levels on this dimension. At this level, if practices are implemented consistently, it would be relatively straightforward to make the case that no notification is required if there is a data breach of the de-identified data.

P5 Governance

At the highest level of maturity, masking and risk-based de-identification are applied as described in Practice Level 4. However, now there is a governance framework in place, as well as practices to implement it. Governance practices include performing audits of data recipients, monitoring changes in regulations, and having a re-identification response process.
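The core of the risk-based approach described under P4, measuring re-identification risk and comparing it against a threshold, can be illustrated with a small sketch. It uses one common measure (the maximum risk, based on the size of the smallest equivalence class over the quasi-identifiers); the records, field names, and threshold are hypothetical, and real risk-based methodologies use several context-adjusted metrics rather than this single one.

```python
from collections import Counter

def max_reid_risk(records, quasi_identifiers):
    """Maximum re-identification risk: 1 / size of the smallest equivalence
    class formed by the quasi-identifier values. A singleton class means
    some record is unique on its quasi-identifiers (risk = 1.0)."""
    classes = Counter(tuple(r[q] for q in quasi_identifiers) for r in records)
    return 1.0 / min(classes.values())

records = [
    {"age_band": "40-49", "region": "East"},
    {"age_band": "40-49", "region": "East"},
    {"age_band": "40-49", "region": "East"},
    {"age_band": "50-59", "region": "West"},  # unique: a singleton class
]
risk = max_reid_risk(records, ["age_band", "region"])
threshold = 0.2  # illustrative only; acceptable risk depends on context and controls
print(risk <= threshold)  # False: the singleton record pushes the risk to 1.0
```

Because the risk is a measured quantity, the OU can generalize or suppress just enough (e.g., merging the singleton into a wider age band) to bring the measured risk under the threshold, which is the "only enough de-identification, and no more" property described above.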

The Implementation Dimension

The de-identification practices described above can be implemented with different levels of rigor. We define four levels of implementation: Initial, Repeatable, Defined, and Measured. Note that there is no Implementation dimension for OUs at the P1 level because there are no de-identification practices to implement; consequently, these cells are crossed out in Figure 2.

I1 The Initial Level

At the Initial level, the de-identification practices are performed by an analyst with no documented process, no specific training, and no performance measurements in place. The work is experiential, and the quality of the de-identification performed will depend largely on the skills and effort of the analyst performing it. This leads to variability in how well and how quickly de-identification can be performed. It also means that the OU is at risk of losing a significant amount of its de-identification expertise if these analysts leave or retire; there is no institutional memory being built up to maintain the practices within the OU.

I2 The Repeatable Level

At the Repeatable level, the OU has basic project management practices in place to manage the de-identification service. This means that (a) there are roles and responsibilities defined for performing de-identification, and (b) there is a known high-level process for receiving data, de-identifying it, and then releasing it to the data users. At this level there is a basic structure in place for de-identification. Also critical at this level is the involvement of the privacy or compliance office in helping to shape de-identification practices. Since these staff have a more intimate understanding of legislation and regulations, their input is advisable at the early stages of implementing de-identification practices.

I3 The Defined Level

The Defined level of implementation means that the de-identification process is documented and training in it is in place.
Documentation is critical because it is a requirement in privacy standards. The HIPAA Privacy Rule statistical method, for example, explicitly mentions documentation of the process as a compliance requirement. Training ensures that the analysts performing the de-identification will be able to do it correctly and consistently.

In order to comply with regulatory requirements, the nature of the documentation also matters. The purpose of the documentation is to demonstrate to an auditor or an investigator, in the event of a potentially adversarial situation, the precise practices that are used to de-identify the data. An auditor or investigator will be looking at the process documentation for a number of reasons, such as: (a) there has been a breach and the regulator is investigating it; (b) there has been a patient complaint about data sharing and concerns have been expressed in the media, resulting in an audit being commissioned; and/or (c) a client of the OU has complained that the data they are receiving from the OU are not de-identified properly. The documentation is then necessary to make the case strongly and convincingly that adequate methods were used to de-identify the data. Based on our experience, a two- or three-page policy document, for example, will generally not be considered sufficient.

I4 The Measured Level

The Measured level of implementation means that performance measures of the de-identification process are being collected and used. Measures can be based on tracking of the data sets that are released and of any data sharing agreements. For example, the OU can examine trends in data releases and their types over time to enable better resource allocation; for instance, overlaps in data requests could lead to the creation of standard data sets. Other measures can include data user satisfaction surveys and measures of response times and delays in getting data out.

The Cumulative Nature of the Implementation Dimension

The levels in the Implementation dimension are cumulative, in that it is difficult to implement a higher level without having implemented a lower level. For example, meaningful performance measures will be difficult to collect without having a defined process.

The Automation Dimension

Automation is essential for scalability. Any data set that is not trivial in size can only be de-identified using automated tools. Automation becomes critical as data sets become larger and as de-identification needs to be performed regularly (as in data feeds). Without automation, it will be difficult to believe that there is a de-identification process in place.
A1 Home-grown Automation

An OU may attempt to develop its own scripts and tools to de-identify data sets. Based on our experiences, solutions developed in-house tend to have fundamental weaknesses or to be incomplete. For example, some OUs may try to develop their own algorithms for creating pseudonyms. We have sometimes seen OUs develop their own hashing schemes for pseudonyms. We strongly advise against this, because such schemes will almost certainly not work correctly, or will be easy to re-identify (one should only use NIST-approved hashing schemes). But even seemingly straightforward masking schemes can be reversed if not constructed carefully [1]. It takes a considerable amount of expertise in this area to construct effective masking techniques. We have also observed home-grown de-identification scripts that distort the data too much, because they focus on modifying data to protect privacy without considering data utility.

A2 Standard Automation

Standard automation means adopting tools that have been used more broadly by multiple organizations and have received scrutiny. These may be publicly available (open source) or commercial tools. The advantage of standard tools is transparency, in that their algorithms have likely been reviewed and evaluated by a larger community. Any weaknesses have been identified and flagged, and the developer(s) of the tools have been made aware of them and have likely addressed them.

Data masking is not a trivial task. Developing pseudonymization or randomization schemes that are guaranteed not to be reversible (or at least to have a low probability of reversal) requires significant knowledge in the area of disclosure control. The measurement of re-identification risk is an active area of research among statisticians, and the de-identification of indirect identifiers is a complex optimization problem. The metrics and algorithms of any tools adopted should be proven to be defensible.
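The danger of home-grown pseudonym schemes discussed under A1 can be made concrete. In this hypothetical sketch, an OU pseudonymizes record numbers with a bare, unkeyed hash; because the identifier space is small and the scheme has no secret, an adversary reverses it with a simple dictionary attack.

```python
import hashlib

def naive_pseudonym(record_number: str) -> str:
    """A home-grown scheme: hash the identifier directly, with no secret key."""
    return hashlib.sha256(record_number.encode()).hexdigest()

# The "de-identified" value released by the OU for record number 4217.
released = naive_pseudonym("4217")

# An adversary who knows the identifier format (here, four digits) simply
# hashes every candidate and builds a reverse lookup table.
lookup = {naive_pseudonym(f"{n:04d}"): f"{n:04d}" for n in range(10000)}
recovered = lookup[released]
print(recovered)  # prints "4217": the pseudonym is fully reversible
```

A keyed construction (e.g., an HMAC with a properly managed secret key) defeats this particular attack, which is one reason the text recommends scrutinized standard tooling over in-house schemes.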

Use of the De-identification Maturity Model

The Practice Dimension and Compliance

Across multiple jurisdictions, OUs at levels P1 and P2 in the Practice dimension are generally not compliant with de-identification regulations. Practice Level 3 OUs may pass an audit, but with major findings, because they will not be able to provide objective evidence that their de-identification techniques ensure that the re-identification risk is very small (since there is no measurement in place). The outcome of an audit or an investigation will depend on how strict the auditor is, but in general, data de-identified in this way cannot defensibly be claimed to have a very small risk of re-identification. We therefore recommend that Practice Level 3 be treated as a transitional stage to higher Practice levels.

FIGURE 3: An example of scoring P2-I3-A1 on the maturity matrix.

Scoring

An OU scored on the DMM has three scores: the Practice score, the Implementation score, and the Automation score. For example, an OU that has purchased a data masking tool and implemented it, and has thoroughly documented the data masking process and its justifications, would be scored at P2-I3-A2. If the masking tool were home-grown, but with the same level of documentation and training, then the score would be P2-I3-A1. This is illustrated on the maturity matrix in Figure 3, where the check marks indicate the three dimensions of the score. The absolute minimum score for making any defensible claim of compliance with current standards would be P4-I3-A1.

We deliberately did not attach weights to the dimensions, as we do not have a parsimonious way of doing so. Based on our experiences, we would argue that an OU is best off improving its Practice score first, then focusing on Automation, followed by Implementation. Without the appropriate de-identification practices (Practice dimension) in place, adequate privacy protection is impossible. The OU must first have reasonable de-identification practices in place. Then, automation is necessary for the de-identification of data sets of any substantial size. Finally, the Implementation dimension should receive attention. Therefore, the priority ranking is Practice > Automation > Implementation.

Improvements beyond the minimum required for compliance may be motivated by a drive to achieve certain outcomes, such as finding efficiencies, improving service quality, and reducing costs. For example, further improvements allow for the release of higher quality data, the release of larger volumes of data, faster data releases, and lower operating costs.

In large OUs there may be multiple departments and units performing de-identification, and each of these departments and units may have a different DMM score. These scores may be quite divergent, reflecting significant heterogeneity within the OU. This makes it challenging to compute or present an OU-wide score. There are three ways to approach such heterogeneity: (a) present a range for each of the three scores to reflect that heterogeneity, (b) present the average for each score, or (c) define multiple department or unit profiles and present a separate assessment for each.

The third approach requires some explanation. In a case where there are two extreme sets of practices within an OU, some very good and some very poor (according to the maturity model), it would be difficult to present a coherent picture of the whole OU. Those departments or units with high maturity scores would be represented, characterized, and described in a "Pioneers" profile. Those with low maturity scores would be represented by a "Laggards" profile. Each profile's scores on the three DMM dimensions would be an average or a range of the scores of the departments or units represented.

Process Improvement

The DMM provides a roadmap for an OU to improve its de-identification practices. The Practice dimension target for an OU that only occasionally uses and discloses data for secondary purposes would be Practice Level 4. The target for an OU that uses and discloses data for secondary purposes on a regular basis would be Practice Level 5.

Process improvement plans generally address all three dimensions. These improvements may be performed simultaneously or staggered, depending on resources. It is very reasonable for an OU to skip certain Practice Levels. Recall that the DMM characterizes the natural evolution of de-identification practices that an OU goes through. In a deliberate process improvement context, an OU may skip Practice Levels to move directly to the targeted state. For example, an OU at Practice Level 1 (Ad-hoc) would not deliberately move to a still non-compliant Practice Level 2 (Masking), but directly to Practice Level 4 (Risk-based). As noted above, the Implementation Levels (Initial, Repeatable, Defined, and Measured) are cumulative. This means that skipping Implementation Levels is not recommended (and will not work very well).

Ambiguous Cases

We consider below some grey areas or edge cases that have been encountered in practice. These are intended to help interpret the DMM.

Case 1

Assume that an OU is using a masking technique, but that technique is known to be reversible. For example, the method used for creating the pseudonym from a social security number or medical record number has weaknesses in that one can infer the original identifier from the pseudonym. Weaknesses with other masking techniques have been described elsewhere [1]. Would that OU be considered at Practice Level 1 (Ad-hoc) or Practice Level 2 (Masking)? In general we would consider this OU to still be at Practice Level 1, since the approach used for masking would be considered ad hoc. To be a true Practice Level 2 OU, the masking methods must be known to be strong, in that they cannot be easily reverse engineered by an adversary.

Case 2

A Practice Level 2 (Masking) OU has implemented pseudonymization but has not implemented other forms of masking on direct identifiers. Would that OU still be considered at Practice Level 2? If all of the direct identifiers have not been determined and masked properly, then this OU is considered to be at Practice Level 1 (Ad-hoc). Pseudonymization by itself may not be sufficient. Also, OUs will sometimes create pseudonyms for some direct identifiers but not for others, a form of partial pseudonymization. Again, this would still keep the OU at Practice Level 1.

Case 3

A series of algorithms and checklists have been developed and implemented by an OU's staff to de-identify indirect identifiers. The algorithms always de-identify the data sets the same way and do not take the context into account. Because the context is not accounted for, this would be considered a Practice Level 3 (Heuristics) OU. Being able to adjust the parameters of the de-identification to account for the data use and disclosure context is important for the DMM definition of Practice Level 4 (Risk-based).
Case 4

An OU needs to disclose data to allow the data recipient to link the data set with another data set. The two data sets do not have a unique identifier in common, so probabilistic linkage is necessary. This means that indirect identifiers such as the patient's date of birth, postal code, and date of admission need to be disclosed without any de-identification. The necessity to disclose fields suitable for probabilistic linkage does not, however, mean that the data set can be considered to have a very small risk of re-identification. Unless the data custodian can demonstrate that the measured risk is very small, the data set cannot be considered to have a very small risk of re-identification. Furthermore, there are alternative approaches one can use for secure probabilistic matching that do not require the sharing of the indirect identifiers.

Mapping to the Twelve Characteristics

In an earlier analysis we documented twelve characteristics of de-identification methodologies that are mentioned in contemporary standards, such as guidance and best practice documents from regulators [1], [5]. The following mapping indicates how the DMM maps to these criteria. There are three conclusions that one can draw from this mapping.

First, the DMM covers the twelve characteristics and is therefore consistent with existing standards.

Second, the DMM covers some practices that are not mentioned in the standards. There are a number of reasons for this: (a) the standards describe a high maturity OU, whereas the DMM covers low maturity as well as high maturity OUs, so we also discuss some of the practices we see in low maturity OUs; and (b) the standards do not cover some practices that we believe are critical for the effective implementation of de-identification, such as automation (the Automation dimension) and performance measurement (the I4 level). These are practical requirements that enable the scaling of de-identification, and we have observed that large scale de-identification is becoming the norm because of the volume of health data being used and disclosed for secondary purposes.

Third, the union of current standards describes a P5-I3-A1 OU. As noted above, we consider P5 practices most suitable for OUs that perform a significant amount of de-identification, and the standards do not discuss performance measurement and automation.

Mapping the DMM to the Twelve Characteristics

CRITERION: Is the methodology documented?
MAPPING: I3. This is a requirement of the Defined Implementation Level.

CRITERION: Has the methodology received external or independent scrutiny?
MAPPING: P5. This is part of governance. (1)

CRITERION: Does the methodology require and have a process for the explicit identification of the data custodian and the data recipients?
MAPPING: P2 onwards. Customizing the fields to mask or to de-identify can depend on the data recipient. (2)

CRITERION: Does the methodology require and have a process for the identification of plausible adversaries and plausible attacks on the data?
MAPPING: P4. This kind of practice would mostly be relevant in a risk-based de-identification approach.

CRITERION: Does the methodology require and have a process for the determination of direct identifiers and quasi-identifiers?
MAPPING: This is generally considered in the structure of the Practice dimension.

CRITERION: Does the methodology have a process for identifying mitigating controls to manage any residual risks?
MAPPING: P4. This kind of practice would mostly be relevant in a risk-based de-identification approach.

CRITERION: Does the methodology require the measurement of actual re-identification risks for different attacks from the data?
MAPPING: P4. This kind of practice would mostly be relevant in a risk-based de-identification approach.

CRITERION: Is it possible to set, in a defensible way, re-identification risk thresholds?
MAPPING: P4. This kind of practice would mostly be relevant in a risk-based de-identification approach.

CRITERION: Is there a process and template for the implementation of the re-identification risk assessment and de-identification?
MAPPING: P4. This kind of practice would mostly be relevant in a risk-based de-identification approach.

CRITERION: Does the methodology provide a set of data transformations to apply?
MAPPING: I3. A documented methodology with appropriate training would explain the data transformations.

CRITERION: Does the methodology consider data utility?
MAPPING: P2 and P4. Masking methodologies usually have a subjective measure of data utility, and risk-based ones a more objective measure.

CRITERION: Does the methodology address de-identification governance?
MAPPING: P5 and I3. Elements of governance are covered in Practice Level 5 and also in Implementation Level 3.

Notes:
(1) This is also to ensure that there are no loopholes in the implementation of the de-identification methodology that would allow the leakage of information useful for re-identification. For example, if the data is being disclosed through an online table generation tool, then the data recipient should not be able to create overlapping tables to circumvent disclosure control restrictions.
(2) In the case of masking, the fields that are pseudonymized, for example, may depend on the role of the data recipient. There may be different heuristics for different classes of data recipients. In general, if a Privacy Impact Assessment has been performed, then the data custodians and data recipients would have been identified.

Quantitative Assessment Scheme

We now discuss a scoring scheme for the maturity model. This is the scoring scheme that we have been using, and we have found that it produces results with face validity. Recall that the objective of using the DMM is often to identify weaknesses and put in place a roadmap for improvement; a scoring scheme that supports that objective would be considered acceptable.

An assessment can be performed by the OU itself or by an external assessor. It consists of a series of questions around each of the dimensions. The response to a question is Yes (scored 4), No (scored 0), or Planned (scored 1). For each dimension in the DMM, start at the level 2 questions. Score the level 2 questions and take their average. If the average score is less than or equal to 3, then the OU is at level 1 on that dimension. If the average score is greater than 3, then proceed to the next level's questions, and so on.

Key De-identification Practice Dimension

Each criterion is answered Yes, No, or Planned.

Level 2 - Masking
- Does the OU's de-identification methodology require and have a process for the determination of direct identifiers and quasi-identifiers?
- Does the OU have practices for either deleting direct identifiers or creating pseudonyms from unique direct identifiers (such as MRNs, health insurance numbers, SSNs or SINs)?
- Does the OU have practices for the randomization of other non-unique direct identifiers?
- Is data utility explicitly considered when deciding which direct identifiers to mask and how?
- Does the OU require and have a process for the explicit identification of the data custodian and the data recipients?

Level 3 - Heuristics
- Does the OU's de-identification methodology require and have a process for the determination of direct identifiers and quasi-identifiers?
- Does the OU have practices for either deleting direct identifiers or creating pseudonyms from unique direct identifiers (such as MRNs, health insurance numbers, SSNs or SINs)?
- Does the OU have practices for the randomization of other non-unique direct identifiers?
- Is data utility explicitly considered when deciding which direct identifiers to mask and how?
- Does the OU require and have a process for the explicit identification of the data custodian and the data recipients?
- Does the OU have a checklist of indirect identifiers to always remove or generalize (similar in concept to the US HIPAA Privacy Rule Safe Harbor list)?
- Does the OU use general rules-of-thumb to generalize or aggregate certain indirect identifiers (e.g., never release more than three characters of the postal code) that are always applied for all data releases?

Level 4 - Risk-based
- Does the OU's de-identification methodology require and have a process for the determination of direct identifiers and quasi-identifiers?
- Does the OU have practices for either deleting direct identifiers or creating pseudonyms from unique direct identifiers (such as MRNs, health insurance numbers, SSNs or SINs)?
- Does the OU have practices for the randomization of other non-unique direct identifiers?
- Is data utility explicitly considered when deciding which direct identifiers to mask and how?
- Does the OU require and have a process for the explicit identification of the data custodian and the data recipients?
- Does the OU require and have a process for the identification of plausible adversaries and plausible attacks on the data?
- Does the OU have a process for identifying mitigating controls to manage any residual risks?
- Does the OU require the measurement of actual re-identification risks for different attacks from the data?
- Is it possible to set, in a defensible way, re-identification risk thresholds?
- Is there a process and template for the implementation of the re-identification risk assessment and de-identification?

Level 5 - Governance
- Does the OU's de-identification methodology require and have a process for the determination of direct identifiers and quasi-identifiers?
- Does the OU have practices for either deleting direct identifiers or creating pseudonyms from unique direct identifiers (such as MRNs, health insurance numbers, SSNs or SINs)?
- Does the OU have practices for the randomization of other non-unique direct identifiers?
- Is data utility explicitly considered when deciding which direct identifiers to mask and how?
- Does the OU require and have a process for the explicit identification of the data custodian and the data recipients?
- Does the OU require and have a process for the identification of plausible adversaries and plausible attacks on the data?
- Does the OU have a process for identifying mitigating controls to manage any residual risks?
- Does the OU require the measurement of actual re-identification risks for different attacks from the data?
- Is it possible to set, in a defensible way, re-identification risk thresholds?
- Is there a process and template for the implementation of the re-identification risk assessment and de-identification?
- Does the OU conduct audits of data recipients (or require third-party audits) to ensure that conditions of data release are being satisfied?
- Does the OU have an explicit process for monitoring changes in relevant regulations and precedents (e.g., court cases or privacy commissioner orders)?
- Is there a process for dealing with an attempted or successful re-identification of a released data set?
- Has the OU subjected its de-identification practices to external review and scrutiny to ensure that they are defensible?
- Does the OU commission re-identification testing to ensure that realistic re-identification attacks will have a very small probability of success?
- Does the OU monitor multiple data releases to the same recipients for overlapping variables that may increase the risk of re-identification?

Implementation Dimension

Each criterion is answered Yes, No, or Planned.

- Does the OU have someone responsible for de-identification?

Level 2 - Repeatable
- Does the OU have clearly defined expected inputs and outputs for the de-identification?
- Are there templates for data requests and de-identification reports, certificates, and agreements?
- Is the privacy or compliance function involved in defining or reviewing the de-identification practices?
- Is the de-identification process documented?

Level 3 - Defined
- Does the OU have evidence that the documented process is followed?
- Are all analysts who need to perform or advise on de-identification activities receiving appropriate training?

Level 4 - Measured
- Does the OU collect and store performance data, for example, on the number and size of de-identified data sets, the types of fields, the types of data recipients, and how long de-identification takes?
- Is trend analysis performed on the collected performance data to understand how performance is changing over time?
- Are actions taken by the OU to improve performance based on the trend analysis?
- Are satisfaction surveys of data recipients conducted and analyzed?
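Several of the Practice Dimension criteria above ask whether the OU derives pseudonyms from unique direct identifiers such as MRNs. One common way to do this is with a keyed hash; the sketch below is an illustrative assumption, not the DMM's prescribed method, and the function name and key handling are hypothetical.

```python
import hashlib
import hmac


def pseudonymize(identifier: str, secret_key: bytes) -> str:
    """Derive a stable pseudonym from a unique direct identifier (e.g., an MRN)
    using HMAC-SHA256. The same identifier and key always yield the same
    pseudonym, so records for one patient remain linkable, while the keyed hash
    cannot be reversed or recomputed without the secret key (unlike a plain,
    unsalted hash of the identifier)."""
    digest = hmac.new(secret_key, identifier.encode("utf-8"), hashlib.sha256)
    # Truncated here for readability; a production scheme would decide
    # pseudonym length based on collision-risk requirements.
    return digest.hexdigest()[:16]
```

In practice the secret key would be held by the data custodian and rotated or destroyed according to policy, since anyone holding it can recompute the mapping.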

Automation Dimension

Each criterion is answered Yes, No, or Planned.

Level 2 - Standard Automation
- Does the OU use off-the-shelf data masking tools?
- Does the OU use off-the-shelf data de-identification tools?
- Is the functioning of the masking tools transparent (i.e., documented and reviewed)?
- Is the functioning of the de-identification tools transparent (i.e., documented and reviewed)?
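The level-determination procedure in the scoring scheme can be sketched in code. This is a minimal illustration of the rule described above (Yes = 4, No = 0, Planned = 1; advance past a level only when the average exceeds 3); the function name and input layout are our own assumptions, not part of the DMM.

```python
def dimension_level(answers_by_level: dict[int, list[str]], max_level: int) -> int:
    """Compute an OU's maturity level on one DMM dimension.

    answers_by_level maps each level number (2..max_level) to the list of
    answers for that level's questions: "yes", "no", or "planned".
    Starting at the level 2 questions, the OU advances past a level only
    if the average score for that level exceeds 3; otherwise it remains
    at the last level it passed (level 1 if it fails at level 2).
    """
    score = {"yes": 4, "no": 0, "planned": 1}
    level = 1  # every OU is at least at the initial level
    for lvl in range(2, max_level + 1):
        answers = answers_by_level[lvl]
        average = sum(score[a] for a in answers) / len(answers)
        if average <= 3:
            break
        level = lvl
    return level
```

For example, on the Implementation Dimension (levels up to 4), an OU that answers Yes to all level 2 questions but has only planned its level 3 practices would be placed at level 2.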

References

[1] International Standards Organization. Accessed April 2013.
[2] K. El Emam, Risky Business: Sharing Health Data while Protecting Privacy. Trafford.
[3] K. El Emam, "Risk-based de-identification of health data," IEEE Security and Privacy, vol. 8, no. 3.
[4] K. El Emam, Guide to the De-identification of Personal Health Information. CRC Press (Auerbach).
[5] K. El Emam, "Heuristics for de-identifying health data," IEEE Security and Privacy.
[6] K. El Emam, The Twelve Characteristics of a De-identification Methodology. Privacy Analytics Inc.

About Privacy Analytics

As the developers of the only commercially available de-identification and masking software application, Privacy Analytics provides comprehensive HIPAA de-identification and re-identification solutions. Privacy Analytics works with clients in the healthcare industry, including health insurance, pharmaceutical, and medical device companies, to derive value and maximum granularity out of their data. Using a risk-based methodology, our solutions allow secondary users of personal information to have high-quality, de-identified data that protects the privacy of personal data for the purpose of conducting complex analytics, while still meeting the most stringent legal, privacy and compliance regulations.

Privacy Analytics, Inc.
King Edward Avenue, Suite 3042, Ottawa, ON, K1N 6N5
blog.privacyanalytics.ca

Copyright 2013 Privacy Analytics, Inc. All Rights Reserved.


More information

Compliance Management, made easy

Compliance Management, made easy Compliance Management, made easy LOGPOINT SECURING BUSINESS ASSETS SECURING BUSINESS ASSETS LogPoint 5.1: Protecting your data, intellectual property and your company Log and Compliance Management in one

More information

Administrative Services

Administrative Services Policy Title: Administrative Services De-identification of Client Information and Use of Limited Data Sets Policy Number: DHS-100-007 Version: 2.0 Effective Date: Upon Approval Signature on File in the

More information

what your business needs to do about the new HIPAA rules

what your business needs to do about the new HIPAA rules what your business needs to do about the new HIPAA rules Whether you are an employer that provides health insurance for your employees, a business in the growing health care industry, or a hospital or

More information

Practice guide. quality assurance and IMProVeMeNt PrograM

Practice guide. quality assurance and IMProVeMeNt PrograM Practice guide quality assurance and IMProVeMeNt PrograM MarCh 2012 Table of Contents Executive Summary... 1 Introduction... 2 What is Quality?... 2 Quality in Internal Audit... 2 Conformance or Compliance?...

More information

Report on FSCO s Compliance Reviews of Mortgage Brokerages. Financial Services Commission of Ontario Licensing and Market Conduct Division

Report on FSCO s Compliance Reviews of Mortgage Brokerages. Financial Services Commission of Ontario Licensing and Market Conduct Division Report on FSCO s Compliance Reviews of Mortgage Brokerages Financial Services Commission of Ontario Licensing and Market Conduct Division May 2010 TABLE OF CONTENTS EXECUTIVE SUMMARY...3 ABOUT FSCO...4

More information

Synapse Privacy Policy

Synapse Privacy Policy Synapse Privacy Policy Last updated: April 10, 2014 Introduction Sage Bionetworks is driving a systems change in data-intensive healthcare research by enabling a collective approach to information sharing

More information

HIPAA in the Cloud. How to Effectively Collaborate with Cloud Providers

HIPAA in the Cloud. How to Effectively Collaborate with Cloud Providers How to Effectively Collaborate with Cloud Providers Speaker Bio Chad Kissinger Chad Kissinger Founder OnRamp Chad Kissinger is the Founder of OnRamp, an industry leading high security and hybrid hosting

More information

Supplementary Policy on Data Breach Notification Legislation

Supplementary Policy on Data Breach Notification Legislation http://www.privacy.org.au Secretary@privacy.org.au http://www.privacy.org.au/about/contacts.html 4 May 2013 Supplementary Policy on Data Breach Notification Legislation Introduction It has been reported

More information

Tools for De-Identification of Personal Health Information

Tools for De-Identification of Personal Health Information Tools for De-Identification of Personal Health Information Prepared for the Pan Canadian Health Information Privacy (HIP) Group Authored by: Ross Fraser and Don Willison, September 2009 Executive Summary

More information

Key Steps to Meeting PCI DSS 2.0 Requirements Using Sensitive Data Discovery and Masking

Key Steps to Meeting PCI DSS 2.0 Requirements Using Sensitive Data Discovery and Masking Key Steps to Meeting PCI DSS 2.0 Requirements Using Sensitive Data Discovery and Masking SUMMARY The Payment Card Industry Data Security Standard (PCI DSS) defines 12 high-level security requirements directed

More information

3 Guidance for Successful Evaluations

3 Guidance for Successful Evaluations 3 Guidance for Successful Evaluations In developing STEP, project leads identified several key challenges in conducting technology evaluations. The following subsections address the four challenges identified

More information

The potential legal consequences of a personal data breach

The potential legal consequences of a personal data breach The potential legal consequences of a personal data breach Tue Goldschmieding, Partner 16 April 2015 The potential legal consequences of a personal data breach 15 April 2015 Contents 1. Definitions 2.

More information

The economics of IT risk and reputation

The economics of IT risk and reputation Global Technology Services Research Report Risk Management The economics of IT risk and reputation What business continuity and IT security really mean to your organization Findings from the IBM Global

More information

NCHICA HITECH Act Breach Notification Risk Assessment Tool. Prepared by the NCHICA Privacy, Security & Legal Officials Workgroup

NCHICA HITECH Act Breach Notification Risk Assessment Tool. Prepared by the NCHICA Privacy, Security & Legal Officials Workgroup NCHICA HITECH Act Breach Notification Risk Assessment Tool Prepared by the NCHICA Privacy, Security & Legal Officials Workgroup NORTH CAROLINA HEALTHCARE INFORMATION AND COMMUNICATIONS ALLIANCE, INC August

More information

Criteria and Risk-Management Standards for Prominent Payment Systems

Criteria and Risk-Management Standards for Prominent Payment Systems Criteria and Risk-Management Standards for Prominent Payment Systems Designation Criteria for Prominent Payment Systems The Bank of Canada has established five high-level criteria to help identify prominent

More information

HIPAA PRIVACY RULE & AUTHORIZATION

HIPAA PRIVACY RULE & AUTHORIZATION HIPAA PRIVACY RULE & AUTHORIZATION Definitions Breach. The term breach means the unauthorized acquisition, access, use, or disclosure of protected health information which compromises the security or privacy

More information

Building a Data Quality Scorecard for Operational Data Governance

Building a Data Quality Scorecard for Operational Data Governance Building a Data Quality Scorecard for Operational Data Governance A White Paper by David Loshin WHITE PAPER Table of Contents Introduction.... 1 Establishing Business Objectives.... 1 Business Drivers...

More information

Funding Privacy Commissioner of Canada: Secondary use of data from the EHR current governance challenges & potential approaches

Funding Privacy Commissioner of Canada: Secondary use of data from the EHR current governance challenges & potential approaches Don Willison, Sc.D. Senior Scientist, Ontario Agency for Health Protection and Promotion Associate Professor, Part time, Clinical Epidemiology & Biostatistics, McMaster University don.willision@oahpp.ca

More information

CORL Dodging Breaches from Dodgy Vendors

CORL Dodging Breaches from Dodgy Vendors CORL Dodging Breaches from Dodgy Vendors Tackling Vendor Security Risk Management in Healthcare Introductions Cliff Baker 20 Years of Healthcare Security experience PricewaterhouseCoopers, HITRUST, Meditology

More information

CLOUD COMPUTING FOR SMALL- AND MEDIUM-SIZED ENTERPRISES:

CLOUD COMPUTING FOR SMALL- AND MEDIUM-SIZED ENTERPRISES: CLOUD COMPUTING FOR SMALL- AND MEDIUM-SIZED ENTERPRISES: Privacy Responsibilities and Considerations Cloud computing is the delivery of computing services over the Internet, and it offers many potential

More information

Privacy Impact Assessment: care.data

Privacy Impact Assessment: care.data High quality care for all, now and for future generations Document Control Document Purpose Document Name Information Version 1.0 Publication Date 15/01/2014 Description Associated Documents Issued by

More information

Data Breach, Electronic Health Records and Healthcare Reform

Data Breach, Electronic Health Records and Healthcare Reform Data Breach, Electronic Health Records and Healthcare Reform (This presentation is for informational purposes only and it is not intended, and should not be relied upon, as legal advice.) Overview of HIPAA

More information

Conducting Surveys: A Guide to Privacy Protection. Revised January 2007 (updated to reflect A.R. 186/2008)

Conducting Surveys: A Guide to Privacy Protection. Revised January 2007 (updated to reflect A.R. 186/2008) Conducting Surveys: A Guide to Privacy Protection Revised January 2007 (updated to reflect A.R. 186/2008) ISBN 978-0-7785-6101-9 Produced by: Access and Privacy Service Alberta 3rd Floor, 10155 102 Street

More information

Data Protection Act. Conducting privacy impact assessments code of practice

Data Protection Act. Conducting privacy impact assessments code of practice Data Protection Act Conducting privacy impact assessments code of practice 1 Conducting privacy impact assessments code of practice Data Protection Act Contents Information Commissioner s foreword... 3

More information

How to Leverage HIPAA for Meaningful Use

How to Leverage HIPAA for Meaningful Use How to Leverage HIPAA for Meaningful Use The overlap between HIPAA and Meaningful Use requirements 2015 SecurityMetrics How to Leverage HIPAA for Meaningful Use 2 About this ebook Who should read this

More information