The definition of “personal information” under s 6(1) of the Privacy Act 1988 (Cth) is a threshold question that determines the boundaries of what is regulated and what is protected by privacy laws.1

The underlying conceptual focus of defining “personal information” in Australian privacy laws is the revelation of identity.2 Australian privacy laws will only offer legal protection to individuals where they are identifiable.3 In the context of the digital environment, where there is a richness of data, advances in data analytics, and the predictive capabilities of machine learning, privacy “laws need to grapple with a broader view of the types of” data that warrant protection — regardless of whether individuals’ identity is known or revealed.

Key takeaways

  • A consequence of the current definition is that its underlying conceptual focus on the revelation of identity is too narrow. As a result, personal information collected from household devices, location data, and potentially personal information relating to COVID-19 — all information that should be protected — will often fall outside the scope of regulation.

  • Drawing a line between what is “personal information” and what is not is no easy task, and requires careful consideration.

  • A review of the definition is indispensable to ensure that the definition4 keeps pace with the data-driven nature of our world and resolves two prominent “grey areas”: metadata and data analytics.5

Metadata and “personal information”

Despite the Federal Court’s ruling in Privacy Commissioner v Telstra Corp Ltd6 (Telstra) and the amendment to the definition in 2012,7 a great deal of uncertainty remains as to whether the definition of “personal information” includes metadata such as IP addresses, device identifiers, location data, and other online identifiers.8

Some stakeholders advocate that any amendment to the definition is unnecessary, arguing the existing privacy regime is adequate.9 It is clear that data fragments can be regarded as “personal information” even if a larger body of information is required to achieve identification.10 On the other hand, some engineers are surprised to learn that a great deal of routine operational data, such as event logs and analytics, can be considered personal information.11

Some engineers and security professionals are comfortable with this uncertainty, claiming they “are at home with probabilities”12 of what personal information is caught by the definition, noting that “[u]ncertainty is what risk management is all about”.13

Reform of the definition to include metadata is also criticised on the basis that what can be considered “personal information” requires an evaluative conclusion on the totality of the facts in each case.14 Furthermore, the definition was intended to be “technology-neutral”.15

Lastly, stakeholders suggest “the proposed amendments are likely to increase compliance costs”,16 hindering innovation.

Nevertheless, stakeholders in favour of metadata constituting “personal information” advance cogent arguments for reform. First, the legislature’s intention as to the scope of the definition is clear: “the definition of personal information is intended to be expansive”17 to ensure an “infinite”18 range of information or opinions falls within the scope of regulation.

Second, metadata ought to be afforded protection under the Act because metadata tends to be collected “behind our backs”19 — removing much of the control and bargaining power from consumers and placing it in the hands of data controllers.20

Such practices are “hugely exacerbated by the explosion of wearable computers, connected devices . . . and sensors spread across the built environment”.21

Third, widening the scope of the definition will enhance consumer trust in data security, resulting in data-driven innovation and the productive utilisation of data.22 “Technologists are wont to demand certainty in the interpretation [of the definition] and can be frustrated by the wiggle room in the definition of personal information”.23

Fourth, including metadata as “personal information” is a mild extension. The definition is broad enough to capture the use of CCTV cameras, even though it is:

. . . not necessary for the entity using or storing the images or video recordings of an individual to know the individual’s name or details — it is enough that the individual is capable of being identified from the images or video recordings.24

This logic should be applied to metadata.

Lastly, reform will align Australia’s definition of “personal information” with international standards.25

The problem inherent in requiring a revelation of identity as the test for personal information

It is unviable and unrealistic for the definition of personal information to exclude metadata.

As European law has recognised: “dynamic IP addresses can be used to indirectly identify an individual where it is held with additional data that can be used to identify the individual”.26

A consequence of the current definition is that the underlying conceptual focus is the revelation of identity: an “assumption . . . that no harm can befall an individual from the handling of their personal information if they cannot be identified from the data”.27

This is particularly problematic in the context of the internet of things (IoT). The IoT refers to “an ecosystem in which applications and services are driven by data collected from devices that sense and interface with the physical world”.28 These devices generate enormous amounts of data about how we live our everyday lives, for example, “the number of times the doors of a smart refrigerator are opened”.29

The Internet of Things Alliance Australia submitted that:

. . . sensing and actuating products such as the range of Google Home products have increased Google’s ability to capture information from the home that “may over time, through the use of data analytics, yield highly personal information such as home occupancy and a wide range of behaviours’’.30

Much of this data would not currently be considered personal information under the definition.31

Further, the issue of ownership should be considered.32 Innovators and entrepreneurs who have invested in these new technologies will understandably claim to own their algorithms’ outputs, such as the individual attributes data analytics can yield.33

If an individual can be “individuated”, that is, if a dataset can be used to track, profile, target, contact, or subject them to a decision or action at an individual level, that is a privacy harm requiring protection.34 Perhaps removing the exemption that “personal information” does not include household information is an appropriate starting point.
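The mechanics of individuation can be sketched in a few lines of code. The following Python sketch is purely illustrative (every device identifier and event in it is hypothetical), but it shows how a dataset containing no names can still profile, and then target, a single person at an individual level:

```python
from collections import Counter

# "De-identified" smart-home events: (device_id, event).
# No name, address, or conventional identifier appears anywhere.
events = [
    ("device-7f3a", "fridge_door_open"),
    ("device-7f3a", "front_door_unlock"),
    ("device-7f3a", "fridge_door_open"),
    ("device-9c1b", "thermostat_adjust"),
    ("device-7f3a", "lights_off"),
]

# Behaviour patterns accumulate against the stable device identifier.
profiles = {}
for device_id, event in events:
    profiles.setdefault(device_id, Counter())[event] += 1

# The data holder can now direct decisions, messages, or actions at the
# most active household, without ever learning whose household it is.
target = max(profiles, key=lambda d: sum(profiles[d].values()))
print(target, dict(profiles[target]))
```

The point of the sketch is that a stable identifier, not a name, is all that tracking, profiling, and targeting require — which is precisely the harm the current identity-focused definition struggles to reach.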

Location data

Another illustration of how the current threshold issue hinges on a false assumption of “identifiability”35 as the source of harm is apparent in the context of location data. A recent case study found:

. . . publicly disclosed de-identified data about public transport cards used in the city of Melbourne could be used to find patterns showing young children travelling without an accompanying adult. Those children could be targeted by a violent predator as a result, without the perpetrator needing to know anything about the child’s identity.36

Location data is highly granular, yet none of these technologies depend on the data subject’s identifiability.37 Evidently, the assumption that identifiability is the crucial legal threshold is no longer valid, as harm can be caused simply by knowing an individual’s attributes without knowing their identity. As Anna Johnston advocates, “[i]n the digital environment, ‘not identified’ is no longer an effective proxy for ‘will suffer no harm’”.38

Personal information and biosecurity surveillance 

A further related issue produced by the current definition concerns the interpretation, during the COVID-19 pandemic, of a subset of “personal information”: “sensitive personal information”.

A new line of authority has arisen in the Fair Work jurisdiction,39 which held (among other things) that the personal information an employer sought to compel Mr Knight to divulge was not sensitive personal information “as it did not request sensitive health information about Mr Knight as defined in s 6 of [the Act]”.40 The information that Mr Knight was required to provide was: “All previous travel history outside of Australia; and [a]ny travel plans within the next 6 months with locations and dates”.41

This finding is liable to criticism. If “Mr Knight disclosed that he had been in a high-risk country, his employer would [undoubtedly] have requested medical follow-up”,42 which demonstrates why the information must have been health-related and sensitive, and the Tribunal’s finding erroneous.

Although the Tribunal’s detailed reasoning on this point is obscure (and the decision one that must be approached with some caution), this demonstrates how a narrow emphasis on the revelation of identity as the only test results in information that requires protection falling outside the scope of regulation.

The narrow concentration in Knight v One Key Resources (Mining) Pty Ltd T/A One Key Resources43 on the question of whether Mr Knight’s travel details were health information missed the point that revelation that Mr Knight had been in China or Italy (as the case may be) meant that he had also been in a country experiencing a COVID-19 wave, and therefore was at risk of being positive to the virus (obviously sensitive information).

In Telstra, the narrow concentration on the fact that metadata themselves did not reveal identity missed the point that enough metadata can easily connect and reveal identity.

This illustration acknowledges the risk of identification through data linkage.44 As Jon Fasman quotes Catherine Crump of Berkeley Law, “it’s the aggregation of data, not individual pinpoints, that creates a privacy issue”.45
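The risk of identification through data linkage can be sketched just as simply. The Python example below is purely illustrative (the card numbers, stops, timestamps, and social-media handle are all hypothetical), but it shows how two datasets that are each “de-identified” in isolation can be joined on shared metadata to attach a name to an “anonymous” record:

```python
# Dataset A: "de-identified" transport taps (card_id, stop, timestamp).
taps = [
    ("card-102", "Flinders St", "2021-03-01T08:02"),
    ("card-417", "Southern Cross", "2021-03-01T08:05"),
]

# Dataset B: a public social-media post pinning a named account
# to a place and time.
posts = [
    ("@alex_k", "Flinders St", "2021-03-01T08:02"),
]

# Index the taps by (stop, timestamp) — two data points.
linked = {(stop, ts): card for card, stop, ts in taps}

# Joining the datasets attaches the "anonymous" card to the account.
matches = [(handle, linked[(stop, ts)])
           for handle, stop, ts in posts
           if (stop, ts) in linked]
print(matches)
```

Neither dataset identifies anyone on its own; identification emerges only from connecting the pinpoints, just as the Melbourne transport-card study demonstrated.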

In a more artistic vein, it is like missing Seurat’s A Sunday afternoon on the island of La Grande Jatte by concentrating on the pointillism, or not seeing a Lichtenstein [Pop Art] painting for the Ben Day dots.46

A Sunday afternoon on the island of La Grande Jatte (and detail)
Georges Seurat, 1884–1886

The definition should be reformed to clarify that metadata is captured under the definition, as advocated by the Australian Competition and Consumer Commission (ACCC) in recommendation 16(a).47

Conclusion

In a digitalised era where data-driven technologies govern every aspect of the digital environment, it is disheartening to see that Australian privacy laws do not adequately address the challenges metadata and data analytics impose. The suggestion to “err on the side of caution by treating the information as personal information”48 is no longer sufficient. Nevertheless, drawing a line between what is “personal information” and what is not is no easy task.

The definition requires reform to ensure metadata is regulated, and should be broadened to include “any information that may ‘reflect’ a specific person (without necessarily identifying them)”49 — an approach adopted by the newer gold standards of privacy and data laws.50

It is not suggested that changing the definition of “personal information” will end our metadata-related concerns. Other reforms to the Act will be necessary to avoid each element of metadata, no matter how obscure and unrevealing on its own, requiring the same degree of protection.

There are three further suggestions for reform: first, defining “collection”, and establishing categories of collection to determine which forms of collection merit greater or lesser protection under privacy law. Second, defining the terms “public” and “private” domains to assist in understanding the type of information being collected and to identify the degree of protection it relevantly calls for. Third, clearer consideration of data ownership in the context of data analytics, to strike a balance between stakeholders’ competing interests: protection and innovation.

END NOTES

1.The scope of this legislation is confined to information or an opinion about an identified individual, or an individual who is reasonably identifiable: whether the information or opinion is true or not; and whether the information or opinion is recorded in a material form or not: Privacy Act 1988 (Cth), s 6(1).

2.A Johnston “The privacy challenges posed by location data” (2020) 17(6) PRIVLB 94, at 95.

3.Above.

4.Privacy Act, above n 1.

5.Australian Competition and Consumer Commission, Digital Platforms Inquiry Final Report (June 2019) (Digital Platforms Inquiry), citing Law Council of Australia, Submission to Australian Competition and Consumer Commission, Digital Platforms Inquiry (15 February 2019) 21.

6.Privacy Commissioner v Telstra Corp Ltd (2017) 249 FCR 24; 347 ALR 1; [2017] FCAFC 4; BC201700165.

7.Privacy Amendment (Enhancing Privacy Protection) Act 2012 (Cth).

8.Digital Platforms Inquiry, above n 5, 458–460.

9.Digital Platforms Inquiry, above n 5, 457.

10.S Wilson, “Seeing privacy through the engineer’s eyes” (2015) 12(3) PRIVLB 34, 35.

11.Above.

12.Above n 10.

13.Above n 10, 37.

14.V Scott, H Lauder and A Fehrenbach “Let’s get personal: the effect of the Federal Court’s interpretation of ‘personal information’ and the OAIC’s response” (2017) 14(6) PRIVLB 105, 107.

15.Explanatory Memorandum, Privacy Bill 1988 (Cth) 11–12.

16.Digital Platforms Inquiry, above n 5, 457.

17.Attorney-General’s Department, Privacy Act Review Issues Paper (October 2020) 16.

18.Above n 15.

19.Above n 10, 36.

20.Digital Platforms Inquiry, above n 5, 455.

21.Above n 10, 36.

22.Digital Platforms Inquiry, above n 5, 457.

23.Above n 10, 37.

24.A Hutches and J Perier “Smile — you’re on candid camera? How advances in video surveillance technology present privacy compliance issues for Australian businesses” (2016) 13(10) PRIVLB 221, 222.

25.Digital Platforms Inquiry, above n 5, 461.

26.Digital Platforms Inquiry, above n 5, 459.

27.Above n 2, 95.

28.M Thornhill, F D Roberts and K Y Yeo “The bottomless abyss of data in a technological and computerised world: The privacy of Australians’ data and the Internet of Things” (2020) 17(4) PRIVLB 66.

29.Above.

30.Above n 28, citing Internet of Things Alliance, Submission to Australian Competition and Consumer Commission, Digital Platforms Inquiry (15 February 2019), 1.

31.Office of the Australian Information Commissioner, Australian Privacy Principles Guidelines — Privacy Act 1988 (July 2019), para B.95.

32.Above n 10, 36.

33.Above.

34.Above n 2, 96.

35.Above n 2, 95.

36.Above n 2, 96, citing C Culnane, B I P Rubinstein and V Teague “Two Data Points Enough to Spot You in Open Transport Records” Pursuit 15 August 2019, https://pursuit.unimelb.edu.au/articles/two-data-points-enough-to-spot-you-in-open-transport-records.

37.Above n 2, 95.

38.Above n 2.

39.Knight v One Key Resources (Mining) Pty Ltd T/A One Key Resources [2020] FWC 3324.

40.Above, [83].

41.Above n 39, [14].

42.T Blyth “Just How private are employee records?” (2021) 17(10) PRIVLB 190, 192.

43.Above n 39.

44.M Finck and F Pallas “They who must not be identified — Distinguishing Personal from Non-Personal data under the GDPR” (2020), Max Planck Institute for Innovation and Competition, Research Paper No 19–14, 7, https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3462948.

45.J Fasman, We See It All, Scribe, 2021, p 46.

46.Interestingly, Lichtenstein restricted his paint colours to imitate the four colours of printers’ inks, yet created a picture that was comprehensible when viewed from further away (Ben Day dots are a system devised to increase the tonal range in commercial printing through a dot screen method). The author thanks Toby Blyth for his views on Knight and Telstra and on “connecting the dots” (and his references to art).

47.Digital Platforms Inquiry, above n 5, 458.

48.Office of the Australian Information Commissioner, What is personal information?, 5 May 2017, www.oaic.gov.au/privacy/guidance-and-advice/what-is-personal-information.

49.Above n 2, 98, quoting G Greenleaf and S Livingston “China’s Personal Information Standard: The long March to a Privacy Law” (2017) 150 Privacy Law and Business International Report 25. Cf Regulation (EU) 2016/679 of the European Parliament and of the Council of 27 April 2016 on the protection of natural persons with regard to the processing of personal data and on the free movement of such data, and repealing Directive 95/46/EC (General Data Protection Regulation) [2016] OJ L 119/1 (GDPR), Art 4(1). The definition under the GDPR arguably still requires identification rather than individuation.

50.See California Consumer Privacy Act of 2018, s 1798.140(o)(1); Nigeria Data Protection Regulation 2019, reg 1.3(xx); International Organization for Standardization, ISO/IEC 27701:2019 — Security techniques — Extension to ISO/IEC 27001 and ISO/IEC 27002 for privacy information management — Requirements and guidelines, www.iso.org/standard/71670.html. 

 

By Toby Blyth and Jessica Yazbek