O'Reilly Webcast: Anonymizing Health Data


DESCRIPTION

Authors: Khaled El Emam, Luk Arbuckle

How can health data be released to analysts and app developers who desperately want it? Under current legislation, the use and disclosure of health data for secondary purposes is limited: patients must either consent to have their data used, which is often difficult to obtain and can lead to bias, or the data must be de-identified (there are some exceptions, but we won't address them in this webinar). To ensure that end users get data that is anonymized and highly useful, we focus on the HIPAA Privacy Rule De-identification Standard. We've built our risk-based methodology for anonymizing data around the foundation created by HIPAA's Statistical Method. In this webcast we'll share several of the case studies described in our O'Reilly book Anonymizing Health Data, which is devoted to examples of how we anonymized real-world data sets. In almost every case in which we've anonymized data, there have been new and interesting challenges to overcome.

TRANSCRIPT

Anonymizing Health Data Webcast

Case Studies and Methods to Get You Started

Khaled El Emam & Luk Arbuckle

Part 1 of Webcast: Intro and Methodology

Part 2 of Webcast: A Look at Our Case Studies

Part 3 of Webcast: Questions and Answers

Part 1 of Webcast: Intro and Methodology

To Anonymize or not to Anonymize

Consent needs to be informed.

Not all health care providers are willing to share their patients’ PHI.

Anonymization allows for the sharing of health information.

Compelling financial case: a breach costs ~$200 per patient.

Privacy-protective behaviors by patients.

Masking Standards

First name, last name, SSN.

Distortion of data—no analytics.

Removing a whole field.

Creating pseudonyms.

Replacing actual values with random ones.
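To make these masking operations concrete, here is a minimal Python sketch; the record and its field names are illustrative, not from the webcast. It removes a field outright, derives a keyed pseudonym (an unkeyed hash of a name or SSN is reversible by dictionary attack, which is why the pseudonym needs a secret), and replaces a value with a random one.

```python
import hashlib
import hmac
import random

SECRET_KEY = b"managed-separately-from-the-data"  # pseudonym key (assumption)

def mask(rec: dict) -> dict:
    out = dict(rec)
    # Removing a whole field: drop the SSN entirely.
    del out["ssn"]
    # Creating a pseudonym: keyed hash of the name, then drop the name.
    out["pseudonym"] = hmac.new(
        SECRET_KEY,
        (rec["first_name"] + rec["last_name"]).encode(),
        hashlib.sha256,
    ).hexdigest()[:12]
    del out["first_name"], out["last_name"]
    # Replacing actual values with random ones: fake the phone number.
    out["phone"] = f"613-555-{random.randint(0, 9999):04d}"
    return out

record = {"first_name": "Jane", "last_name": "Doe",
          "ssn": "123-45-6789", "phone": "613-555-0123", "diagnosis": "I10"}
print(mask(record))
```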

De-identification Standards

Age, sex, race, address, income.

Minimal distortion of data—for analytics.

Safe Harbor in HIPAA Privacy Rule.

What’s “Actual Knowledge”? (Safe Harbor, HIPAA Privacy Rule)

Info, alone or in combo, that could identify an individual.

Has to be specific to the data set—not theoretical.

Example: Occupation = Mayor of Gotham.

De-identification Standards (continued)

Heuristics, or rules of thumb.

Statistical method in HIPAA Privacy Rule.

Presentation Notes: A risk-based methodology is consistent with contemporary standards from regulators and governments, and is the approach we present in our book.

De-identification Myths

Myth: It’s possible to re-identify most, if not all, data.
Using robust methods, evidence suggests the risk can be very small.

Myth: Genomic sequences are not identifiable, or are easy to re-identify.
In some cases they can be re-identified, and they are difficult to de-identify using our methods.

A Risk-based De-identification Methodology

Presentation Notes: This is where things get heavy. We’ll start with some basic principles.

The risk of re-identification can be quantified.

The Goldilocks principle: balancing privacy with data utility.

Presentation Notes: The Goldilocks Principle is the trade-off between perfect data and perfect privacy.

The re-identification risk needs to be very small.

De-identification involves a mix of technical, contractual, and other measures.

Steps in the De-identification Methodology

Step 1: Select Direct and Indirect Identifiers

Step 2: Setting the Threshold

Step 3: Examining Plausible Attacks

Step 4: De-identifying the Data

Step 5: Documenting the Process

Step 1: Select Direct and Indirect Identifiers

Presentation Notes: We use masking for direct identifiers, and de-identification for indirect identifiers.

Direct identifiers (masked): name, telephone number, health insurance card number, medical record number.

Indirect identifiers, or quasi-identifiers (de-identified): sex, date of birth, ethnicity, locations, event dates, medical codes.

Step 2: Setting the Threshold

Maximum acceptable risk for sharing the data.

Needs to be quantitative and defensible.

Is the data going to be in the public domain?

What is the extent of the invasion of privacy when the data is shared?

Step 3: Examining Plausible Attacks

Recipient deliberately attempts to re-identify the data.

Recipient inadvertently re-identifies the data (“Holy smokes, I know her!”).

Data breach at the recipient’s site (“data gone wild”).

Adversary launches a demonstration attack on the data.

Presentation Notes: Yahoo!

Step 4: De-identifying the Data

Generalization: reducing the precision of a field. For example, dates converted to month/year, or year.

Suppression: replacing a cell with NULL. For example, a unique 55-year-old female in a birth registry.

Sub-sampling: releasing a simple random sample. For example, 50% of the data set instead of all the data.
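As a minimal sketch of the first two operations, assuming a toy pandas frame whose column names are illustrative: dates of birth are generalized to year, and the quasi-identifier values of any record that is unique on its quasi-identifiers are suppressed to NULL.

```python
import pandas as pd

df = pd.DataFrame({
    "dob":  pd.to_datetime(["1958-03-02", "1983-07-19", "1983-11-30"]),
    "sex":  ["F", "F", "F"],
    "diag": ["O80", "O80", "O82"],
})

# Generalization: reduce date of birth to year only.
df["birth_year"] = df["dob"].dt.year.astype("object")
df = df.drop(columns="dob")

# Suppression: NULL the quasi-identifiers of any record that is
# unique on (birth_year, sex) -- e.g., the lone 1958 birth here.
quasi = ["birth_year", "sex"]
class_size = df.groupby(quasi)[quasi[0]].transform("size")
df.loc[class_size < 2, quasi] = None
print(df)
```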

Step 5: Documenting the Process

Presentation Notes: From a regulatory perspective, it’s important to document the process that was used to de-identify the data set, as well as the results of enacting that process.

Process documentation—a methodology text.

Results documentation—data set, risk thresholds, assumptions, evidence of low risk.

Measuring Risk Under Plausible Attacks

T1: Deliberate Attempt
Pr(re-id, attempt) = Pr(attempt) × Pr(re-id | attempt)

Presentation Notes: The probability of an attack will depend on the controls in place to manage the data (mitigating controls).

T2: Inadvertent Attempt (“Holy smokes, I know her!”)
Pr(re-id, acquaintance) = Pr(acquaintance) × Pr(re-id | acquaintance)

Presentation Notes: On average people tend to have 150 friends. This is called the Dunbar number.

T3: Data Breach (“data gone wild”)
Pr(re-id, breach) = Pr(breach) × Pr(re-id | breach)

Presentation Notes: Based on recent credible evidence, we know that approximately 27% of providers that are supposed to follow the HIPAA Security Rule have a reportable breach every year.

T4: Public Data (demonstration attack)
Pr(re-id), based on the data set only

Presentation Notes: We assume that there is an adversary who has background information that can be used to launch an attack.
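Each attack reduces to the same product, so checking a release against a threshold is a few multiplications. A minimal Python sketch with illustrative probabilities only; a real assessment derives Pr(T) from the mitigating controls and motives, and Pr(re-id | T) from the data itself.

```python
def attack_risk(pr_context: float, pr_reid_given_context: float) -> float:
    """Pr(re-id, T) = Pr(T) * Pr(re-id | T) for a single attack T."""
    return pr_context * pr_reid_given_context

# Illustrative numbers only, not from a real risk assessment.
pr_reid_given_attack = 0.05
risks = {
    "T1 deliberate":   attack_risk(0.40, pr_reid_given_attack),
    "T2 acquaintance": attack_risk(0.87, pr_reid_given_attack),
    "T3 breach":       attack_risk(0.27, pr_reid_given_attack),
}
threshold = 0.1
print(max(risks.values()) <= threshold)  # True: every attack is under threshold
```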

Choosing Thresholds

Presentation Notes: So we can measure risk under plausible attacks, but how do we set an overall risk threshold?

Many precedents going back multiple decades.

Recommended by regulators.

All based on max risk, though.

Presentation Notes: Max risk is based on the record that has the highest probability of re-identification; average risk applies when the adversary is trying to re-identify someone they know, or everyone in the data set.

Presentation Notes: To set the threshold, we can look at the sensitivity of the data and the consent mechanism that was in place (invasion of privacy).

Part 2 of Webcast: A Look at Our Case Studies

Cross Sectional Data: Research Registries

Better Outcomes Registry & Network (BORN) of Ontario.

140,000 births per year.

Cross-sectional—mothers not traced over time.

Process of getting de-identified data from a research registry.

Researcher Ronnie wants data!

919,710 records from 2005-2011.

Presentation Notes: The data he wants...

Choosing Thresholds

Average risk threshold of 0.1 for Researcher Ronnie (and the data he specifically requested).

Presentation Notes: Based on a detailed risk assessment.

A threshold of 0.05 if there were highly sensitive variables (congenital anomalies, mental health problems).

Measuring Risk Under Plausible Attacks

T1: Deliberate Attempt
Low motives and capacity; low mitigating controls.
Pr(attempt) = 0.4

T2: Inadvertent Attempt (“Holy smokes, I know her!”)
119,785 births out of 4,478,500 women (= 0.027).
Pr(acquaintance) = 1 − (1 − 0.027)^(150/2) = 0.87

Presentation Notes: Worst case is 2008, a prevalence of 0.027. We use 150/2 friends because only women are considered.

T3: Data Breach (“data gone wild”)
Based on historical data: Pr(breach) = 0.27

T4: Public Data (demonstration attack)

Overall risk: Pr(re-id, T) = Pr(T) × Pr(re-id | T) ≤ 0.1
For the acquaintance attack: Pr(re-id, acquaintance) = 0.87 × Pr(re-id | acquaintance) ≤ 0.1
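A quick check of the numbers above; the prevalence and the Dunbar number come from the slides, and the rest is arithmetic.

```python
# Worst-case year (2008): prevalence of giving birth among Ontario women.
p = 119_785 / 4_478_500                     # ~0.027
# 150/2 = 75 friends, since only women are considered.
pr_acquaintance = 1 - (1 - p) ** (150 / 2)  # ~0.87
print(round(pr_acquaintance, 2))

# Overall risk constraint: 0.87 * Pr(re-id | acquaintance) <= 0.1,
# so de-identify until Pr(re-id | acquaintance) <= 0.1 / 0.87 ~ 0.115.
print(round(0.1 / pr_acquaintance, 3))
```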

De-identifying the Data Set

Meeting Thresholds: k-anonymity.

Candidate generalization schemes (MDOB = mother’s date of birth, BDOB = baby’s date of birth, MPC = maternal postal code):

MDOB in 1-yy; BDOB in wk/yy; MPC of 1 char.

MDOB in 10-yy; BDOB in qtr/yy; MPC of 3 chars.

MDOB in 10-yy; BDOB in mm/yy; MPC of 3 chars.
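Each scheme can be tested against the threshold by computing the equivalence classes it induces. A minimal sketch, assuming the generalized quasi-identifiers are columns of a pandas frame: max risk is 1 over the smallest class size, and average risk works out to the number of classes divided by the number of records.

```python
import pandas as pd

def risk_report(df: pd.DataFrame, quasi: list[str]) -> tuple[float, float]:
    """Max and average re-identification risk over equivalence classes."""
    sizes = df.groupby(quasi).size()     # class sizes f_i
    max_risk = 1 / sizes.min()           # driven by the smallest class
    avg_risk = len(sizes) / sizes.sum()  # mean of 1/f_i across records
    return max_risk, avg_risk

# A scheme is acceptable once avg_risk <= 0.1 (Ronnie's threshold); the
# quasi columns here would be the generalized MDOB, BDOB, and MPC.
```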

Year on Year: Re-using Risk Analyses

In 2006 Researcher Ronnie asks for 2005.
In 2007 he asks for 2006 (2005 has been deleted).
In 2008 he asks for 2007 (2006 deleted).
In 2009 he asks for 2008 (2007 deleted).
In 2010 he asks for 2009 (2008 deleted).

Can we use the same de-identification scheme every year?

BORN data pertains to very stable populations.

No dramatic changes in the number or characteristics of births from 2005-2010.

Revisit the de-identification scheme every 18 to 24 months.

Revisit if any new quasi-identifiers are added or changed.

Longitudinal Discharge Abstract Data: State Inpatient Databases

Linking a patient’s records over time.

Need to be de-identified differently.

Meeting Thresholds: k-anonymity?

De-identifying Under Complete Knowledge

State Inpatient Database (SID) of California

Researcher Ronnie wants public data!

Measuring Risk Under Plausible Attacks

For a public release, the demonstration attack drives the assessment:
T4: Public Data (demonstration attack)
Pr(re-id) ≤ 0.09 (maximum risk)

Presentation Notes: k = 11, so maximum risk is 1/11 ≈ 0.09.

De-identifying the Data Set

BirthYear in 5-yy (cut at 1910-); AdmissionYear unchanged; DaysSinceLastService in 28-dd (cut at 7-, 182+); LengthOfStay same as DaysSinceLastService.

Presentation Notes: Approximate complete knowledge.
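One plausible reading of the DaysSinceLastService scheme, sketched with pandas; the exact bin edges are an assumption for illustration (28-day-wide bins, bottom-coded below 7 days and top-coded at 182 and over).

```python
import pandas as pd

# 28-day-wide bins between the 7- and 182+ cuts; edges are assumed, and the
# final regular bin is narrower because of the 182+ top-code.
edges = [-1, 6, 34, 62, 90, 118, 146, 174, 181, float("inf")]
labels = ["<7", "7-34", "35-62", "63-90", "91-118",
          "119-146", "147-174", "175-181", "182+"]

days = pd.Series([3, 15, 40, 178, 200], name="DaysSinceLastService")
print(pd.cut(days, bins=edges, labels=labels))
```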

Connected Variables

QI to QI: Similar QI? Apply the same generalization and suppression.

QI to non-QI: Non-QI is revealing? Apply the same suppression, so both are removed.

Other Issues Regarding Longitudinal Data

Date shifting—maintaining order of records.

Long tails—truncation of records.

Adversary power—assumption of knowledge.
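The webcast doesn't spell out a date-shifting scheme, but a common approach, sketched below, shifts all of a patient's dates by a single random per-patient offset, which preserves the intervals between visits and therefore their order.

```python
import random
from datetime import date, timedelta

def shift_dates(visits: list[date], max_shift_days: int = 30) -> list[date]:
    """Shift every visit by one random per-patient offset.

    A single offset keeps intervals intact, so record order is maintained.
    """
    offset = timedelta(days=random.randint(-max_shift_days, max_shift_days))
    return [d + offset for d in visits]

print(shift_dates([date(2009, 1, 5), date(2009, 2, 17), date(2009, 6, 3)]))
```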

Other Concerns to Think About

Free-form text—anonymization.

Geospatial information—aggregation and geoproxy risk.

Medical codes—generalization, suppression, shuffling (yes, as in cards).

Secure linking—linking data through encryption before anonymization.

Part 3 of Webcast: Questions and Answers

More Comments or Questions? Contact us!

Khaled El Emam: kelemam@privacyanalytics.ca

Luk Arbuckle: larbuckle@privacyanalytics.ca