The Labor Government is stepping up its efforts to regulate online safety and harms on digital platforms, announcing that it intends to pass legislation to ban social media for under-16s, crack down on misinformation and disinformation, and now legislate ‘duty of care’ obligations to keep Australians safe and prevent online abuse on social media platforms.
On 13 November 2024, the Communications Minister, Michelle Rowland MP, announced the government’s decision to develop a Digital Duty of Care regulatory model (Duty of Care), which would place broad obligations on digital platforms to ‘keep users safe and help prevent online harms’. The announcement of the proposed Duty of Care follows a slate of reforms to address online safety, including:
Prime Minister Albanese’s announcement just a few days earlier, on 8 November 2024, that the government will legislate 16 as the minimum age for access to social media (Age Restriction).
The accelerated movement of the Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (Misinformation Bill) through the House of Representatives last week.
The Criminal Code Amendment (Deepfake Sexual Material) Bill 2024 (Deepfake Legislation), which commenced on 3 September 2024 and introduces criminal offences for sharing non-consensual deepfake sexually explicit material.
The government will introduce the Age Restriction legislation in the next fortnight and seek to pass the Misinformation Bill in the next Parliamentary sitting.
The government intends these reforms to address harms caused by offshore actors and to increase transparency about how digital platforms manage risks relating to foreseeable harms, misinformation and disinformation. Digital platforms will be required to:
Take reasonable steps to ensure ‘fundamental protections’ are in place so that their platforms cannot be accessed by children under 16 years of age.
Report on the outcomes of risk assessments relating to misinformation and disinformation on their platforms.
Take reasonable steps to prevent harms and undertake regular risk assessments against the enduring harms.
The Duty of Care
The proposed Duty of Care would place legal obligations on digital platforms to take reasonable steps to protect Australian users from foreseeable harm, with the risk of heavy penalties for systemic failures. The government considers that the Duty of Care, as part of a growing global effort, “will deliver a more systemic and preventative approach to making online services safer and healthier”.
The Duty of Care was recommended by an independent review of the Online Safety Act, undertaken by former ACCC Deputy Chair, Delia Rickard. While the report was received by the government on 1 November 2024, it is yet to be published. Although the government is still working through the report, Ms Rowland MP hinted “one message stood out: a Duty of Care is fundamental”.
Ms Rowland MP explained that the change to the Online Safety Act 2021 (Cth) (Online Safety Act) is “subtle but significant” and she emphasised, “What’s required is a shift away from reacting to harms by relying on content regulation alone and moving towards systems-based prevention, accompanied by a broadening of our perspective of what online harms are experienced by children”.
So far, the government has provided minimal detail about the proposed obligations and penalties under the Duty of Care framework, only noting that:
Digital platforms must take reasonable steps to prevent harms and must undertake regular risk assessments against the enduring harms. These requirements are core features under the EU Digital Services Act and the UK Online Safety Act 2023.
It will enable the Australian Communications and Media Authority (ACMA) to “draw on strong penalty arrangements” where platforms seriously breach their duty of care and where there are systemic failures.
The Online Safety Act expressly applies to ‘acts, omissions, matters and things outside Australia’, and it rightly finds its policy rationale and jurisdictional nexus in the harm which occurs onshore to Australians. Ms Rowland MP also noted that harmonisation with like-minded countries is a key objective, because policy principles that carry across markets will be more effective at driving change.
To complement the overarching Duty of Care, the government will legislate enduring categories of harm, which Ms Rowland MP says could include: harms to young people, harms to mental wellbeing, the instruction and promotion of harmful practices, and other illegal content, conduct and activity.
The proposal to legislate the Duty of Care was subsequently also recommended by the Joint Select Committee on Social Media and Australian Society in its final report, titled ‘Social media: the good, the bad and the ugly’, tabled in the Senate on 18 November 2024 (Social Media Inquiry Final Report). The Joint Select Committee recommended that the Australian Government introduce a single, overarching statutory duty of care on digital platforms for the wellbeing of their Australian users, requiring digital platforms to implement diligent risk assessments and risk mitigation plans to make their systems and processes safe for all Australians.
In emphasising harmonisation with ‘like minded’ countries, Ms Rowland MP referred to both the EU and UK models.
For context, the EU Digital Services Act places varying obligations on categories of services, platforms and providers to target illegal content and disinformation and to ensure transparent advertising. At one end of the spectrum, intermediary service providers must comply with orders to act against illegal content and publish annual reports on their content removal and moderation activities.
At the other end, very large online platforms and search engines must have an internal complaint handling system regarding the removal of content, comply with enhanced transparency obligations, conduct an annual risk assessment, establish an independent compliance function, and provide additional information and user optionality in relation to online advertising and recommender systems used on their platforms.
The UK Online Safety Act 2023 similarly creates a duty of care for online platforms, requiring regulated services to conduct risk assessments at defined intervals, assess whether the service is likely to be accessed by children and how likely children are to be harmed by content on the service, and take action against illegal or harmful content from their users. Failure to do so could result in penalties of up to £18 million or 10% of annual turnover, or in access to particular websites being blocked by the regulator.
The Labor Government will consult on how the proposed duty of care model would work, though it has not yet indicated when it intends to consult or to introduce legislation, which would need support from other parties in order to pass the Senate.
Age Restriction
On 8 November 2024, the government announced it will legislate 16 as the minimum age for access to social media, following endorsement by National Cabinet that same day. The proposed Age Restriction puts the onus on social media platforms to take reasonable steps to ensure ‘fundamental protections’ are in place.
Ms Rowland noted the legislation will contain positive incentives as part of an exemption framework to encourage safe innovation, and will also provide for access to social media-type services that enable education or health support for young people. Social media platforms may apply to the regulator for approval if they can demonstrate that they meet set criteria and provide positive benefits for children, or do not employ harmful features.
The government plans to introduce the legislation in the next Parliamentary sitting fortnight (between 18 and 28 November 2024). It will continue to work closely with stakeholders in the lead-up to the commencement of the law, with a lead time of at least 12 months following the Bill’s passage to give industry, governments and the eSafety Commissioner time to implement systems and processes.
Misinformation Bill
The Misinformation Bill was introduced into Parliament on 12 September 2024. It amends the Broadcasting Services Act 1992 (Cth) to:
Empower the ACMA to require digital communications platform providers to take steps to manage misinformation and disinformation risks. The Bill would give the ACMA new powers to set an enforceable industry code, or make standards, should companies fail to self-regulate under voluntary codes.
Increase transparency about how digital communications platform providers manage misinformation and disinformation, by imposing core obligations on those providers to:
Assess risks relating to misinformation and disinformation on their platforms and publish a report of the outcomes of that assessment.
Publish their policy or policy approach in relation to managing misinformation and disinformation.
Publish a media literacy plan setting out the measures the provider will take to enable end-users of the platform to better identify misinformation and disinformation.
Empower users to identify and respond to misinformation and disinformation on digital communications platforms.
The Misinformation Bill was considered by the House of Representatives on 7 November 2024 and is expected to be considered by the Senate in the next Parliamentary sitting. Given the Environment and Communications Legislation Committee is due to provide a report to the Senate on the provisions of the Misinformation Bill by 25 November 2024, we expect the Misinformation Bill may be considered by the Senate between 25 and 28 November 2024, if not early next year.
The Deepfake Legislation
The Deepfake Legislation, which commenced on 3 September 2024, imposes serious criminal penalties on those who share sexually explicit material without consent, regardless of whether the material is unaltered or has been created or altered in any way using technology.
Specifically, the Deepfake Legislation amended the Criminal Code Act 1995 (Cth) to create a new offence where a person uses a carriage service to transmit sexual material which depicts, or appears to depict, another person (who is, or appears to be, 18 years of age or over), where the person knows the other person does not consent to the transmission, or is reckless as to whether the other person consents to the transmission.
These offences will be subject to serious criminal penalties of up to:
six years’ imprisonment for sharing non-consensual deepfake sexually explicit material, and
seven years’ imprisonment where the offender was also responsible for the creation or alteration of the material (this is an aggravated offence).
There is a broad exception where ‘a reasonable person would consider transmitting the material to be acceptable’ having regard to a number of factors.
The Deepfake Legislation does not impose specific obligations on platforms to remove such materials, potentially leaving scope for this to be addressed through the Duty of Care.
What’s next
November is slated to be a big month for the government’s digital platforms agenda. While the government has not yet indicated when it will release draft Duty of Care legislation, it plans to introduce the Age Restriction legislation in the next Parliamentary sitting fortnight. We also expect the Misinformation Bill will be considered by the Senate between 25 and 28 November 2024 (if not, early next year), after the Senate receives the Environment and Communications Legislation Committee’s report on the Bill, which is due by 25 November 2024.
We will report further on the Duty of Care as further information becomes available – watch this space.
The authors would like to acknowledge knowledge lawyer Felicity Lee for her valuable assistance in the preparation of this update. Any mistakes are the authors’ alone.