In May, the Prime Minister announced that legislation amending the Privacy Act 1988 (Cth) (Privacy Act) would go before Parliament this month, after an almost four-year review process that began in late 2020. In his announcement, the Prime Minister cited the two driving forces behind the Australian Government’s push to “bring forward” the legislation as outlawing doxxing and giving Australians (particularly women experiencing domestic and family violence) greater control and transparency over their personal information. The following day, the Attorney-General confirmed the legislation amending the Privacy Act would be “brought forward” in August to “overhaul” the Privacy Act. Interestingly, there are only eight sitting days in August for both houses and, given other reform agenda announcements since May, the government’s reform docket is presumably full.
Two months on, in early July, in a National Press Club Q&A, the Attorney-General seemed to skirt around the issue of whether the amending legislation would be ready to be tabled in Parliament this month, stating that he was “hopeful” it would be brought to Parliament “later this year”. So, once again, the business community is left waiting with bated breath to understand more specifically what impact the amending legislation may have.
The specifics of the amending legislation have not been released through an exposure draft bill (and we understand an exposure draft will not be released prior to tabling in Parliament), but the Australian Government’s response (released around 10 months ago) to the Privacy Act Review Report (the Report) will inform the draft legislation. Our previous examination of the Report and our analysis of the government’s response provide a detailed breakdown of the government’s direction of travel. In its response to the Report, the government ‘agreed’ with 38 of the 116 proposals, ‘agreed in-principle’ with 68, and ‘noted’ the remaining 10. We expect that the imminent (or not) draft legislation will at a minimum reflect the agreed proposals and, given where we are at 10 months on, is likely also to include most of those agreed in-principle. A spokesperson for the Attorney-General has confirmed that the legislation will address the entirety of the government’s response to the Report, rather than approaching it in tranches.
This article identifies areas we believe will be a priority for the Australian Government, the likely efficacy of the reforms given the current inadequate funding for the Office of the Australian Information Commissioner (OAIC), and what businesses can expect and do to prepare.
Agreed areas of reform
As we noted in our previous article on the government’s response to the Report, many of the ‘agreed’ proposals are not substantive changes to the privacy protections offered to individuals. Instead, they relate to enhancing the OAIC’s regulatory powers, or propose further consultation or consideration.
The proposals agreed to by the government in its response are likely to be focus areas in the upcoming amendments to the Privacy Act. Key areas which may form part of the reforms include:
Introducing additional protections for children (Proposal 16). This includes codifying existing OAIC guidance on consent and capacity, requiring entities to make collection notices and privacy policies ‘clear and understandable’, and requiring entities to have regard to the best interests of the child when applying the fair and reasonable test (see further below). The Report also proposes the introduction of a Children’s Online Privacy Code that applies to online services accessed by children. This proposal would complement the work being done by the eSafety Commissioner, including the current review of the Online Safety Act 2021 (Cth) and the recent amendment to the BOSE Determination, to protect children from harmful content online.
Strengthening enforcement of the Privacy Act and penalties (Proposal 25). In addition to the enhanced penalties and expanded OAIC powers introduced in December 2022, the Report proposes multiple tiers of civil penalty provisions, suggesting two new categories of penalties: (i) a mid-tier penalty for general privacy interference, and (ii) a low-tier penalty for ‘administrative breaches’, complete with Privacy Commissioner infringement notice powers. This proposal also seeks to clarify the meaning of a ‘serious interference’ with privacy, expand the powers of the OAIC and the courts, and require APP entities to identify, mitigate and redress loss.
Regulating the use of personal information in automated decision making (Proposal 19). If implemented as proposed, this will require regulated entities to ensure that their privacy policies are transparent about the types and uses of personal information in ‘substantially automated decisions’ that have a legal or otherwise significant effect on an individual. The Privacy Act will also include high-level indicators of the types of decisions that will be caught by these requirements. This reform is targeted at the regulation of AI as it infiltrates businesses’ practices and ways of working. The proposal also includes a right for individuals to request information about how substantially automated decisions using their personal information are made.
Regulating targeted advertising, direct marketing and trading (Proposal 20). These proposals suggest that we may see prohibitions on direct marketing, targeting and trading where it relates to the personal information of children. Any permitted targeting will also need to be ‘fair and reasonable’ and come with transparency requirements about the use of algorithms and profiling to recommend content to individuals. In addition, individuals will have a right to opt out of receiving targeted advertising and of the use or disclosure of their personal information for direct marketing.
Proposals agreed in-principle likely to make it into the amendment bill
Interestingly, several proposals in the Report that we consider to be priority areas for the reforms were only agreed in-principle by the government. While this demonstrates a hesitance by the government to commit to such proposals, we expect some of these to be included in the upcoming reforms. Proposals that strengthen protections against doxxing and enhance online safety are highly likely to be prioritised given the recent government announcements, namely:
Introducing a fairly broad standard of conduct requiring all organisations subject to the Privacy Act to act fairly and reasonably when collecting, using and disclosing personal information (Proposal 12). This proposal is intended to ensure data handling practices align with community expectations. It goes much further than the current requirement to collect personal information only by ‘lawful and fair’ means. The government stated in its response: “This new requirement will help protect individuals when their personal information is used in complex data processing activities which have emerged through technological advancement, such as screen scraping and AI”. It also noted that OAIC guidance and enforcement, and judicial assessment, would sketch out the “contours of the fair and reasonable test over time”. The Report helpfully includes a number of factors that may be taken into account to determine whether any collection, use or disclosure of personal information is fair and reasonable, but it is an otherwise unknown test at this stage. The OAIC sees this proposal as a ‘new keystone’ of the privacy framework, and the Privacy Commissioner has commented that the proposed test is “really important because it circumvents the reliance on consent, whereby individuals just kind of click consent or agree to terms and conditions, which actually lead to harmful privacy practices”.
Introducing a statutory tort for serious invasions of privacy that are intentional or reckless (and not merely negligent) (Proposal 27). To meet the requirements of the proposed tort, the invasion of privacy would have to be either a serious intrusion into the seclusion of an individual or a serious misuse of private information, and would be actionable only where a person in the position of the plaintiff would have had a reasonable expectation of privacy in all of the circumstances. Importantly, however, the invasion of privacy would not need to cause actual damage, and individuals may claim damages for emotional distress. Our previous articles here and here detail the long history of this proposal. It is this proposal that the government sees as one response to the doxxing issues highlighted earlier this year in the context of the Middle East conflict. The introduction of the tort would mark a significant change in the privacy law regime, as it extends the boundaries of the Privacy Act and would seek to legislate and govern relations directly between individuals themselves.
Introducing a right of erasure to provide individuals with the ability to request APP entities delete their personal information (Proposal 18.3) and a right to de-index online search results containing certain types of personal information (Proposal 18.5).
Amending the definition of consent to make it clear that consent must be voluntary, informed, current, specific and unambiguous (but does not need to be express) (Proposal 11). This amendment would reflect and make mandatory the current standard of consent in the APP Guidelines.
Broadening the definition of personal information by replacing ‘about’ with ‘relates to’ such that the definition is to read: “personal information means information or an opinion that relates to an identified individual, or an individual who is reasonably identifiable” (Proposals 4.1-4.4). This would allow the definition to capture a broader range of information as the individual would no longer need to be the subject of the information, albeit we note that the proposal also recognises that the nexus between the information and the individual is not to be tenuous or remote and the definition would ensure this “through drafting of the provision, explanatory materials and OAIC guidance”. Similar to the GDPR, it is proposed to include a non-exhaustive indicative list of information that may be ‘personal’ to assist APP entities. Importantly, this list includes location data, an online identifier, or one or more factors specific to the physical, physiological, genetic, mental, behavioural, economic, cultural or social identity or characteristics of an individual.
Introducing a direct right of action for individuals to enforce their privacy rights and to seek compensation in the courts in relation to an interference with their privacy (Proposal 26). The Report proposes a ‘gateway model’, which mirrors the system used by the Australian Human Rights Commission. It envisages that individuals or representative groups wishing to issue court proceedings for a privacy interference would first have to make a complaint to the OAIC, just as they do today. The OAIC (or a nominated external dispute resolution body) would assess whether the complaint should be referred to conciliation before it can proceed to court. After passing through the conciliation gateway, the complainant(s) would be entitled to bring an action in the Federal Court or the Federal Circuit Court.
Amending the notifiable data breaches scheme, including imposing tighter deadlines to report eligible data breaches to the OAIC (within a GDPR-esque 72 hours) and requiring entities to issue a statement to the OAIC or an affected individual outlining steps the entity has taken or intends to take in response to a breach (Proposals 28.2-28.3).
Imposing additional obligations when APP entities handle private sector employee records (Proposal 7). In particular, obligations are to be imposed relating to transparency of collection and use of employee information, protection against unauthorised access or interference, and eligible data breach reporting.
Requiring APP entities to conduct a Privacy Impact Assessment for high privacy risk activities (Proposal 13.1). A high privacy risk activity is proposed to be one that is likely to have a significant impact on the privacy of individuals, with OAIC guidance to be developed providing more detailed criteria on what such an activity might comprise. The Report notes that an indicative list of high privacy risk activities ‘could be’ set out in the Privacy Act or OAIC guidance and may include:
the processing of sensitive information and children’s personal information on a large scale;
online tracking, profiling and the delivery of personalised content and advertising to individuals; and
the use of biometric templates or biometric information for the purpose of verification or identification.
Introducing the concepts of controllers and processors (Proposal 22). This distinction would mean that a non-APP entity that processes information on behalf of an APP entity controller would have fewer obligations under the Privacy Act. This will be of particular interest to the approximately 95 percent of businesses that currently fall outside the scope of the Privacy Act due to the operation of the small business exemption. While there have been suggestions from the OAIC that this exemption will be scrapped, the government is yet to announce whether it will remove it. Regardless, the controller/processor distinction would lighten the blow for small to medium-sized businesses operating in Australia.
Without adequate OAIC funding, can the reforms be effective?
The success of the upcoming reforms is contingent on the OAIC having adequate funding to implement the likely significant changes to Australia’s privacy regime. This is currently not the case, as the Australian Government’s 2024-25 budget revealed that the OAIC will take an almost $11 million funding hit. This is curious timing given the OAIC has been, and continues to be, notoriously underfunded, and the Australian Government has committed to tabling the Privacy Act legislation by the end of the year.
The OAIC is central to effecting the Privacy Act reforms, as a significant number of the proposals in the Report pile work onto the OAIC: developing guidelines, conducting consultation on a range of unsettled issues, and enforcing and investigating compliance with the Privacy Act. Without adequate funding, the OAIC will struggle to undertake the work expected under these proposals, as well as its current enforcement program, which includes investigations into Optus and Latitude, court proceedings against Meta and Australian Clinical Labs, and, most recently, civil penalty proceedings initiated against Medibank on 5 June 2024 in relation to its October 2022 data breach.
What can you expect in the upcoming months and how can you prepare?
There’s no shying away from the fact that the amending legislation will be significant and will incorporate several of the issue areas identified by the Report. After draft legislation is tabled and passed, we anticipate an implementation period of between 12 and 24 months, although this has not been confirmed.
Ahead of the upcoming reforms, affected organisations should, to the extent they have not already, take stock of their current compliance position vis-à-vis the Privacy Act and seek to understand the material gaps between that position and the potential elements of the ‘go-forward’ Privacy Act following reform, particularly in those areas likely to undergo change or present new compliance requirements. Organisations could start to bridge any such gap by aligning their privacy governance, practices, procedures and documents with the new key fundamentals that will underlie the revised Privacy Act.
We also expect continued tougher enforcement by the OAIC, as evidenced recently by the proceedings initiated against Medibank and by recent commentary from the (relatively newly installed) Privacy Commissioner indicating a more proactive and proportionate approach as a regulator, with ‘outcome-based enforcement’ a priority focus, calling on organisations to “power up” privacy.
For more information on the changing landscape of the privacy regime, tune in to our podcast.