Monetising Personal Health Data in the aged care sector – Part 2 regulatory guidance

Background

This Alert provides additional detail about regulatory perspectives on approaches to de-identification. It supplements Part 1: Legal and Regulatory Pitfalls: Monetising Personal Health Data in the aged care sector (https://www.qualitycare.org.au/news/) for readers wanting more detailed background on de-identification issues. It also provides a select bibliography of further resources.

Regulatory responses and guidance

In the Australian context, both the Office of the Australian Information Commissioner (OAIC) and the Victorian privacy regulator, the Office of the Victorian Information Commissioner (OVIC), have produced helpful guidance about de-identification and its risks.

The OAIC guidance, produced in conjunction with CSIRO, is the most extensive. It notes that:

Information that has undergone an appropriate and robust de-identification process is not personal information, and is therefore not subject to the Privacy Act 1988 (Cth). Whether information is personal or de-identified will depend on the context. Information will be de-identified where the risk of an individual being re-identified in the data is very low in the relevant release context (or data access environment).[1] Put another way, information will be de-identified where there is no reasonable likelihood of re-identification occurring. (our emphasis)

The contextual nature of the challenges associated with de-identification is significant and is elaborated in the CSIRO/Data61 De-identification Decision-Making Framework.

The De-identification Decision-Making Framework notes that:

De-identification is a process of risk management but it is also a decision-making process: should we release this data or not and if so in what form? Considering all the elements involved, that decision can appear complex with many uncertainties. It requires thinking about a range of heterogeneous issues from ethical and legal obligations to technical data questions, and integrating the different perspectives on the topic of de-identification into a single comprehensible framework…

It also states that:

De-identification is not an exact science and, even using the De-Identification Decision-Making Framework (DDF) at this level, you will not be able to avoid the need for complex judgement calls about when data is sufficiently de-identified given your data situation (that is, your data and its interactions with factors such as other data, people, governance, and infrastructure).

It then describes the Decision-Making Framework as a complex and resource-intensive process consisting of ten components:

1. Describe your data situation

2. Understand your legal responsibilities

3. Know your data

4. Understand the use case

5. Meet your ethical obligations

6. Identify the processes you will need to assess disclosure risk

7. Identify the disclosure control processes that are relevant to your data situation

8. Identify who your stakeholders are and plan how you will communicate

9. Plan what happens next once you have shared or released the data

10. Plan what you will do if things go wrong

It recommends that these ten components be grouped “into three core de-identification activities”:

•     A data situation audit (Components 1-5). This activity will help you to identify and frame those issues relevant to your data situation. You will encapsulate and systematically describe the data, what you are trying to do with it and the issues thereby raised. A well-conducted data situation audit is the basis for the next core activity.

•     Risk analysis and control (Components 6-7). Here you consider the technical processes that you will need to use in order to both assess and manage the disclosure risk associated with your data situation.

•     Impact management (Components 8-10). Here you consider the measures that should be in place before you share or release data to help you to communicate with key stakeholders, ensure that the risk associated with your data remains negligible going forward, and work out what you should do in the event of an unintended disclosure or security breach.

Whether this framework remains current and fit-for-purpose is open to doubt. During the Commonwealth’s review of the Privacy Act, CSIRO itself noted that ‘the framework was already becoming dated’ and that there are ‘practical difficulties with continuously updating the framework.’

The Five Safes

Another proposed approach to de-identification is known as the ‘Five Safes.’ The CSIRO/Data61 De-identification Decision-Making Framework suggests that it can be used alone or in conjunction with the Five Safes.

The Five Safes originated in the UK’s Office for National Statistics in the early 2000s as a mechanism to provide safe research access to personal data. It is claimed to be “best practice in data protection while fulfilling the demands of open science and transparency.” It has been adopted by the Australian Bureau of Statistics and the Australian Institute of Health and Welfare and informs a number of public sector approaches to data sharing.

The Five Safes is an approach to thinking about, assessing and managing risks associated with data sharing and release. It has five dimensions:

Safe projects – is this use of the data appropriate, lawful, ethical and sensible?

Safe people – can the user be trusted to use it in an appropriate manner?

Safe data – does the data itself contain sufficient information to allow confidentiality to be breached?

Safe settings – does the access facility limit unauthorised use or mistakes?

Safe outputs – is the confidentiality maintained for the outputs of the management regime?

Although the Five Safes became a much-celebrated ‘solution’ to the de-identification impediments considered to apply in Australia, particularly among open data sharing advocates, its limitations as a panacea for de-identification risks have become apparent. Developed in the context of enabling public benefit research, deploying it successfully involves complex and dynamic risk management across a range of participants, accompanied by strong governance and complex legal instruments. Often, participants must obtain and maintain accreditation, usually from a public sector organisation with oversight of the sharing arrangements.

OVIC Guidance

Although OVIC’s regulatory oversight is restricted to the Victorian public sector, the legislation it administers, the Privacy and Data Protection Act 2014 (Vic), relevantly uses the same terminology as the Commonwealth Privacy Act 1988.

In its Introduction to De-identification, OVIC raises issues similar to those in the Commonwealth/CSIRO De-identification Decision-Making Framework and reaches similar conclusions regarding the problematic nature of publicly disclosing unit-level data:

When done robustly in a controlled environment, the application of de-identification techniques to unit-level data can indeed enhance privacy. However, in circumstances where data is released to the public through an “open data” policy, de-identification can be risky. The growing availability of new and more information increases the risk that de-identified data can be matched with other auxiliary information, leading to the re-identification of seemingly de-identified data.

The OVIC guidance additionally touches on some of the more advanced technical measures that can be relevant in considering approaches to de-identification. These are “k-anonymity” and “differential privacy.” A description of these techniques is beyond the scope of this Alert.
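By way of brief illustration only, the intuition behind k-anonymity can be sketched in a few lines of code: a dataset is k-anonymous when every record shares its combination of quasi-identifiers (attributes such as postcode, age band and sex that could be matched against auxiliary data) with at least k-1 other records. The sketch below, with entirely hypothetical attribute names and fabricated records, computes the k-anonymity level of a small dataset; it is not drawn from the OVIC guidance.

```python
from collections import Counter

def k_anonymity(records, quasi_identifiers):
    """Return the k-anonymity level of a dataset: the size of the
    smallest group of records sharing identical values for all the
    listed quasi-identifier attributes."""
    groups = Counter(
        tuple(record[attr] for attr in quasi_identifiers)
        for record in records
    )
    return min(groups.values())

# Fabricated unit-level records after generalisation (age in bands):
records = [
    {"postcode": "3000", "age_band": "80-89", "sex": "F"},
    {"postcode": "3000", "age_band": "80-89", "sex": "F"},
    {"postcode": "3000", "age_band": "80-89", "sex": "F"},
    {"postcode": "3001", "age_band": "70-79", "sex": "M"},
]

# The last record is unique on its quasi-identifiers, so the dataset
# is only 1-anonymous despite the first three records forming a group.
print(k_anonymity(records, ["postcode", "age_band", "sex"]))  # -> 1
```

A k of 1 means at least one individual is uniquely identifiable from the quasi-identifiers alone, which is why regulators caution against releasing unit-level data publicly even after apparent de-identification.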

Further reading

Commonwealth of Australia

OAIC and CSIRO/Data61, De-identification Decision-Making Framework, see https://www.oaic.gov.au/privacy/privacy-guidance-for-organisations-and-government-agencies/handling-personal-information/de-identification-decision-making-framework

State of Victoria

OVIC, An Introduction to De-identification, see https://ovic.vic.gov.au

USA

National Institute of Standards and Technology, De-identification of Personal Information, see https://nvlpubs.nist.gov/nistpubs/ir/2015/NIST.IR.8053.pdf

UK

Office for National Statistics, The ‘Five Safes’ – Data Privacy at ONS, see https://blog.ons.gov.uk/2017/01/27/the-five-safes-data-privacy-at-ons/

For a critique of the Five Safes: Chris Culnane, Ben Rubinstein and David Watts, Not Fit for Purpose: A Critical Analysis of the ‘Five Safes’, see https://arxiv.org/pdf/2011.02142.pdf

EU

European Data Protection Supervisor, 10 Misunderstandings Related to Anonymisation, see https://edps.europa.eu/system/files/2021-04/21-04-27_aepd-edps_anonymisation_en_5.pdf

This QCAA Alert was compiled by Professor David Watts.

David Watts, Professor, Thomas More Law School, Australian Catholic University, Melbourne.

David is one of Australia’s leading data protection experts with experience as a regulator, policy maker, public and private sector lawyer.
