
Putting privacy on your radar

One area of governance that is perhaps overlooked is information governance and privacy compliance.

While regulators are currently weighing in on organisations' cyber security risks and the obligations of boards and senior managers to address them, the basic 'privacy by design' principles, which minimise the collection of data and with it an organisation's inherent privacy risk profile, are sometimes overlooked.

Privacy Awareness Week 2022 (2-8 May) had the theme of ‘Privacy: The Foundation of Trust’. Organisations are finding that, as consumers become more aware of their rights, building trust in their data practices is an integral part of maintaining the customer relationship.

Case examples of technology not enhancing privacy

In its launch of Privacy Awareness Week, the Office of the Australian Information Commissioner (OAIC) considered three cases over the last year which involved businesses using technology to collect, use or disclose more data than they needed. This included the OAIC’s determinations on:

  • Clearview AI, which used web scraping software to trawl online content for images that it compiles into a facial recognition database and provides to law enforcement agencies
  • 7-Eleven, which had been collecting and using biometric information without consent
  • The Australian Federal Police (AFP) and its trial of Clearview AI's facial recognition tool without undertaking a privacy impact assessment in advance of the trial.

Clearview AI

Clearview’s facial recognition database essentially allows law enforcement to identify individuals in security camera footage using personal information, circumventing the need for search warrants.

In late 2021, the OAIC published a determination which concluded that Clearview had engaged in multiple contraventions of the Privacy Act in relation to the collection of the images which were determined to be biometric information, a type of sensitive information.

The Clearview case illustrates the danger of:

  • failing to obtain consent from users
  • collecting and using personal information for a purpose which no user would reasonably expect.

Regulators in jurisdictions across the world are seeking to put an end to Clearview's data practices.

The OAIC, along with privacy regulators in Canada, the UK, Germany, France and some US states, has ordered Clearview to stop collecting, using and disclosing the images (and the associated biometric information) and to delete all images from its databases. Those orders appear to have had little effect on Clearview so far.

7-Eleven

7-Eleven is another organisation which has been found to have breached the Privacy Act.

Over a 14-month period, 7-Eleven had been collecting facial images and fingerprints via a tablet device used to obtain customer feedback in stores. The feedback survey had been completed by over 1.6 million customers in the first nine months.

The facial recognition information was used to detect whether a customer had left multiple responses to the customer feedback survey within a short period of time, in which case the feedback may not have been genuine. The technology also allowed 7-Eleven to collect demographic information about its customer base.

The OAIC’s findings against 7-Eleven are a good reminder for businesses to:

  • obtain consent from customers to collect their sensitive information
  • only collect sensitive information that is 'reasonably necessary' for the business's functions or activities.

The OAIC found that, while improving customers' in-store experience was a legitimate function of 7-Eleven, the collection of customers' biometric information was not reasonably necessary for that purpose.

Australian Federal Police

For two months in 2019-2020, an AFP department used Clearview AI's facial recognition tool on a trial basis to determine whether it could assist in the investigation of child exploitation offences.

To test the efficacy of the database, AFP trial participants uploaded images of possible persons of interest, an alleged offender, victims, members of the public and members of the AFP. The OAIC subsequently investigated this trial and found that the AFP had interfered with the privacy of individuals whose images it uploaded to the Clearview database.

No privacy impact assessment had been undertaken in relation to the trial despite the fact that government agencies are required to undertake privacy impact assessments in relation to any ‘high privacy risk project’ and to take other reasonable steps to mitigate privacy risks.

Using technology to embed privacy by design: Lessons from the OAIC determinations

The importance of privacy by design, and of early intervention in process and system design, remains paramount. In many areas the answer is simple: if you don't need it, don't collect it; and if you can't protect it, don't collect it. Regular data hygiene, and regular updates to information collection and storage practices, are also key.

Unlike in Europe, where the General Data Protection Regulation (GDPR) mandates a Data Protection Officer with significant statutory obligations in relation to an organisation's compliance with data protection laws, the Australian principles-based regime is more flexible: no single person is required to take responsibility for privacy compliance.

As such, cross-collaboration between various lines of responsibility is often required to deal with the governance, privacy and security issues arising from the collection and storage of information.

Next Steps

For the rest of 2022, the takeaway is to facilitate this coordination, embed privacy early in any new project or program and, where there is a risk of overstepping the line, as in the 7-Eleven case, put governance approval processes in place to ask the key questions before projects go live. Ultimately, building privacy by design guidelines into governance processes is a small price to pay compared with the larger consequences of fines and reputational damage.

About the author:

Lyn Nicholson is General Counsel at Holding Redlich. Lyn advises on privacy and data protection issues. She also advises on directors’ duties and practical governance issues for both listed and unlisted companies. Her transactional work focuses on the services industry and technology, intellectual property and a range of issues that are legally ambiguous and often at the cutting edge of technology. Lyn has been recognised in the Best Lawyers 2023 list for her work in privacy and data security law.
