Conspiracy theories… Unproven health remedies… Assertions that election processes are being rigged…

In her opinion piece on misinformation and disinformation, the Minister for Communications, Michelle Rowland, pointed to these issues as indicators of the pressing need for greater accountability and responsibility in tackling the spread of ‘falsehoods’ that can interfere with Australian democracy, the economy and society.

The Communications Legislation Amendment (Combatting Misinformation and Disinformation) Bill 2024 (the Bill) forms part of a loosely connected package of legislation currently before Parliament under the overarching Digital Duty of Care regulatory model. That package includes age restrictions on social media access, regulation of deepfake material (previously discussed in a related insights article), and anticipated amendments to the Online Safety Act.

The Senate is expected to consider the Bill next week, the last parliamentary sitting week of the year. This comes more than a year after the release of an initial exposure draft (the Exposure Draft), which was widely panned by commentators across the private sector, media, academia and the law.

The revised Bill pares back some of the more controversial elements of the Exposure Draft, adds several new obligations on platforms and increases the transparency standards applied to the Australian Communications and Media Authority (ACMA) in administering the regime. Even so, questions remain about whether these reforms adequately respond to legitimate concerns around freedom of expression. Recent criticism of the Bill from several independent crossbenchers and the Coalition, likening it to state-sanctioned censorship and to allowing platform providers to act as the ‘sole arbiters of truth’, may stir up sufficient consternation to prevent its passage.

Background

As with other speech-based harms (such as hate speech), reasonable opinions differ as to exactly what qualifies as misinformation and disinformation on the internet. Until relatively recently, governments have been wary of wading into such debates, leaving it to digital platforms to develop and enforce their own policies.

Locally, self-regulation of misinformation and disinformation has existed via the voluntary Australian Code of Practice on Disinformation and Misinformation (Voluntary Code) which commenced in February 2021. But as governments around the world continue to face pressure to respond to misinformation and disinformation online, the dial has been steadily shifting from voluntary self-regulation to more traditional forms of regulation being imposed on digital platforms.

Without displacing the Voluntary Code or immediately obviating the need for it, the Bill seeks to improve how digital platforms respond to misinformation and disinformation through measures including:

a) Positive obligations, such as the publication of risk assessments, content policies and media literacy plans, which were not included in the Exposure Draft.

b) Record-keeping requirements and mandatory reporting to the ACMA.

c) The ability for ACMA to report on information it collects and identify the digital platforms from which it originated, thereby placing a level of pressure on platform reputation that aims to increase accountability and drive change (similar to the reporting requirements in respect of the Basic Online Safety Expectations under the Online Safety Act, which the eSafety Commissioner has used to great effect).

d) The ability for ACMA to request and approve industry codes, or make industry standards, which enumerate further obligations on digital platforms (and would apply on a mandatory basis, unlike the Voluntary Code).

e) Allowing ACMA to exercise a graduated set of enforcement powers, including formal warnings, remedial directions and civil penalties.

Misinformation, disinformation and other key terms

The Bill intends to respond to online ‘misinformation’ and ‘disinformation’, terms that until relatively recently were not part of the cultural mainstream. Both terms are commonly used to describe information that is verifiably false or misleading or deceptive, with the distinction between the two usually coming down to intent.

The Bill characterises content disseminated using a digital service as ‘misinformation’ where:

(a)   The content contains information that is reasonably verifiable as false, misleading or deceptive.

(b)   The content is provided on the digital service to one or more end-users in Australia.

(c)   The provision of the content is reasonably likely to cause or contribute to serious harm.

(d)   The dissemination is not excluded dissemination.

These factors not only add some specificity to what is otherwise an impractically broad and contested concept, but they also introduce a materiality threshold through the serious harm requirement. Unlike the Exposure Draft, the Bill also introduces a standard of ‘reasonably verifiable’ when it comes to the falsity or misleading nature of content.

The definition of ‘disinformation’ adopts the characteristics described above but also requires that either:

(i) There are grounds to suspect that the person disseminating the content intends for it to deceive another person, or

(ii) The dissemination involves inauthentic behaviour.

The Bill also defines the following important terms:

Serious harm

Serious harm is:

(a)   harm to the operation or integrity of an Australian electoral or referendum process

(b)   harm to public health in Australia

(c)   vilification of a group in Australian society distinguished by race, religion, disability, sexual orientation (among other listed attributes) or vilification of an individual because of a belief that they are a member of such a group

(d)   intentionally inflicted physical injury to an individual in Australia

(e)   imminent damage to critical infrastructure or disruption of emergency services in Australia

(f)    imminent harm to the Australian economy

that has either:

(g)   significant and far-reaching consequences for the Australian community (or a segment thereof), or

(h)   severe consequences for an individual in Australia.

Compared to the Exposure Draft, we note that previous references to ‘hatred against’ certain groups have been replaced with vilification (likely in an attempt to avoid framing the Bill as a response to hate speech). In addition, references to disruption of public order or society, and to harm to the Australian environment, have been removed entirely.

Excluded dissemination

The following are defined as being excluded dissemination:

(a)   dissemination of content that would be reasonably regarded as parody or satire

(b)   dissemination of professional news content

(c)   reasonable dissemination of content for any academic, artistic, scientific or religious purpose.

The Exposure Draft had several further grounds of excluded dissemination, including content produced in good faith for entertainment and content authorised by a government body; however, these were removed from the Bill. Reasonable dissemination for a religious purpose was also added in the final Bill.

Inauthentic behaviour

Under the Bill, inauthentic behaviour refers to the use of automated systems or coordinated efforts in a way that is reasonably likely to mislead end-users about certain aspects of the content, such as its popularity or the identity or purpose of the person disseminating the content.

Inauthentic behaviour also includes activity aimed at frustrating the ability of a digital platform to comply with the Bill or enforce its own terms of use.

Given the often subjective and value-laden treatment of terms like misinformation and disinformation, it is important to point out that the Bill regulates only a specific form of misinformation and disinformation (that which meets the criteria outlined above). Regulatable misinformation and disinformation under the Bill must also have a specifically Australian impact, which we touch on in our analysis later in this article.

Who is impacted?

The ACMA’s powers are directed toward entities that qualify as a ‘digital communications platform’, with the Bill delineating the following four broadly defined sub-categories:

(a) Content Aggregation Services – digital communications services that primarily collate and present online content to end-users (for example, news aggregation services), excluding internet search engine services.

(b) Internet Search Engine Services – previously covered as a type of content aggregation service under the Exposure Draft, the Bill now distinguishes this sub-category separately, in a manner that generally accords with industry understandings of search engines.

(c) Connective Media Services – digital communications services that have an interactive feature and primarily facilitate interaction between end-users online (e.g. social media, dating websites, peer-to-peer marketplaces).

(d) Media Sharing Services – digital communications services that primarily provide audio, audio-visual or moving visual content to end-users. The Bill excludes the application of certain provisions, including codes and standards, to media sharing services that do not have any interactive features (such as broadcast and subscription video on demand services).

The Minister may also specify additional categories of in-scope digital communications services, as well as excluded services, via legislative instrument. The Bill carves out several service types, including internet carriage services, SMS/MMS services and email services.

Key Points of the Bill

The Bill primarily expands the ACMA’s powers to respond to misinformation and disinformation through a graduated enforcement approach, combining information-gathering powers with reserve powers to initiate, develop and enforce codes and standards where deemed appropriate.

Information powers

To support transparency of digital platform providers in relation to misinformation and disinformation, the Bill would grant ACMA information gathering powers.

These new information powers would allow ACMA to make record-keeping and reporting rules for digital communications services. ACMA could then request information from digital communications platforms about the measures they have taken to address misinformation or disinformation, including actions taken to moderate content and the effectiveness of those measures.

To promote transparency, ACMA would have the ability to publish the information collected under the information-gathering and record-keeping powers on its website, including the identity of the platform to which the information relates. As with other aspects of the Bill, these information powers would extend to all digital communications services (except where exempt), including services that never opted in to the Voluntary Code.

Codes and standards

While the Bill intends for ACMA’s increased information powers to supplement existing industry self-regulation through the Voluntary Code, the Bill also includes a series of graduated ‘reserve powers’ to ratchet up enforcement should voluntary measures prove ineffectual. The Bill reflects a similar (though not identical) code and standard scheme as the Online Safety Act.

The ACMA may ask industry to develop new official codes that, if registered by ACMA, would extend compulsory compliance to other digital service providers in the industry, including those that opted out of the Voluntary Code. If industry fails to produce a code, or such codes also prove ineffectual, the Bill grants ACMA powers to independently create a standard.

Once registered, both codes and standards would be enforceable, however standards would carry higher penalties for non-compliance. We also note that codes and standards approved by ACMA would be subject to parliamentary scrutiny and disallowance.

New core obligations

Unlike the Exposure Draft, the Bill also creates some ‘core upfront obligations on the platforms’ in the form of the following requirements placed on digital communications services:

  • To create media literacy plans aimed at helping users recognise false or misleading content by improving their understanding of the source and accuracy of information.

  • To conduct and report on regular risk assessments that identify and evaluate misinformation-related risks on their platforms, including user behaviour and platform design.

  • To establish and publicly share policies that outline how they address misinformation, ensuring transparency and accountability to the public and ACMA.

Enforcement

The ACMA may issue a formal warning to a digital platform provider if it is satisfied the provider has contravened a registered code or standard. ACMA will also be empowered to issue remedial directions to platforms that fail to comply with a registered code or standard, as well as infringement notices (as an alternative to pursuing civil penalties).

Significant civil penalties would apply to digital platform providers that routinely contravene provisions in a registered code or standard, or fail to comply with remedial directions. The Bill provides that the maximum penalty for contravention of a registered code is, for corporations, 10,000 penalty units ($3.13 million in 2024) or 2% of global turnover (whichever is greater), and for individuals, 2,000 penalty units ($626,000 in 2024).

Similarly, the maximum penalty for contravention of an industry standard is, for corporations, 25,000 penalty units ($7.83 million in 2024) or 5% of global turnover (whichever is greater), and for individuals, 5,000 penalty units ($1.56 million in 2024).
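For readers who want to sanity-check the arithmetic, the short sketch below works through the ‘whichever is greater’ comparison using the penalty-unit figures above. It is illustrative only: the $313 penalty unit value and the sample turnover figures are our assumptions for the example and are not drawn from the Bill.

```python
# Minimal sketch (illustrative only): the maximum corporate civil penalty under the
# Bill is expressed as the greater of a penalty-unit amount and a percentage of
# global turnover. The penalty unit value and sample turnovers are assumptions.

PENALTY_UNIT_VALUE = 313  # AUD per Commonwealth penalty unit (assumed for this example)

def max_corporate_penalty(penalty_units: int, turnover_pct: float, global_turnover: float) -> float:
    """Return the greater of the penalty-unit amount and the turnover-based amount."""
    unit_based = penalty_units * PENALTY_UNIT_VALUE      # e.g. 10,000 x $313 = $3.13m
    turnover_based = turnover_pct * global_turnover
    return max(unit_based, turnover_based)

# Registered code: 10,000 penalty units or 2% of global turnover (whichever is greater)
print(max_corporate_penalty(10_000, 0.02, 500_000_000))  # 2% of $500m = $10m, exceeding $3.13m

# Industry standard: 25,000 penalty units or 5% of global turnover (whichever is greater)
print(max_corporate_penalty(25_000, 0.05, 100_000_000))  # $7.825m from penalty units exceeds $5m
```

As the second example shows, the turnover-based limb only bites for very large providers; for smaller corporations the penalty-unit amount will usually be the operative maximum.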

Implications

It is clear the final Bill reflects an intent by the Australian Government to set several baseline expectations on digital communications platforms while at the same time equipping ACMA with the regulatory and enforcement levers it requires to effectively monitor misinformation and disinformation in Australia.

Compared to the Exposure Draft, the Bill is substantially tighter in its foundational terms and concepts. However, questions remain about potential ramifications on freedom of speech as well as concerns surrounding the Bill’s practical implementation and effectiveness.   

Speech

Assurances previously made by the Australian Government that the reform ‘does not seek to curtail freedom of speech’ were not present in the explanatory memorandum for the Bill. In addition, the Bill removed express references to the ‘freedom of political communication’ from four of the five areas of the amendments that featured them in the Exposure Draft.

The Bill now refers to that freedom only in the heading of section 54, the substance of which merely requires ACMA to be satisfied that a standard it approves is reasonably appropriate to protect the community and goes no further than necessary to provide that protection. It appears that, in keeping with the High Court’s position on the freedom of political communication, the Australian Government has opted to keep the freedom implied and instead focus on expressly ensuring ACMA satisfies itself as to the necessity of a given power being exercised.

The Australian Government is, however, still stressing that the Bill does not empower ACMA to directly regulate, investigate or take down particular content on digital communications platforms, or to require providers to remove content or block end-users outside of inauthentic behaviour situations. Instead, digital communications services are said to retain responsibility for the content they collate and host.

The Bill aims to incentivise digital communications platform providers to have robust systems and measures in place to address misinformation and disinformation on their services. It does not provide the ACMA with powers to directly regulate content on digital communications platforms themselves.

While the Australian Government has understandably drawn a line between itself and ACMA interfering directly with this kind of speech, the Bill arguably sets the foundations for a regulatory environment that requires private companies to do just that, whether intentionally or not. Professor Anne Twomey (a consultant with G+T), speaking before a Senate committee on the Bill, observed that, in effect, this will leave it up to the digital platforms ‘to make decisions about what is true and what is false’.

In many ways this mirrors the approach taken under the Online Safety Act, where responsibility for determining how online material falls under the existing Australian Classification Scheme (a legacy framework drafted to regulate films, computer games and publications) is distributed to online service providers, meaning the same material may be treated differently by hundreds of different private entities.

Practicality and potential effectiveness

Following on from what we outline above regarding the purportedly limited role of the ACMA, some commentators have queried how the regulator will be able to exercise its powers without doing the things the Australian Government has stressed it is not empowered to do (namely, considering individual pieces of content).

As with other forms of online content regulation, the actual content at issue, as well as its surrounding context, is critical to responding in a proportionate way. The proposed definitions of misinformation, disinformation and serious harm, which would serve to demarcate lawful content from regulated content, are also drafted to be applied to content on a case-by-case basis. Furthermore, clear boundaries are established to ensure that codes and standards cannot impose requirements in relation to private or encrypted messages or VoIP communications.

Without suggesting that ACMA ought to be given powers to arbitrate the truthfulness of content, it is questionable how ACMA, as the regulatory authority charged with administering the Bill, will be able to practically exercise its new powers without ever being required to do so. For instance, because the Bill’s powers are intended to be ‘graduated’ (and, in the case of codes and standards, ‘reserve’) in nature, with the exercise of further powers tied to the effectiveness of industry’s conduct, it may be difficult for ACMA to assess that effectiveness without forming its own views about particular content. This potential level of power vested in ACMA is one of the concerns voiced by opponents of the Bill.

Another potential limit on the Bill meeting the expectations the Australian Government has created for it is that what qualifies as regulatable misinformation and disinformation is now quite narrow. Following what were most often sensible recommendations from industry and other stakeholders, the Bill enshrines a standard for serious harm that is specific, localised and subject to a reasonably high materiality threshold.

When monitoring for violative content, digital communications platforms will not only have to navigate the usual complexities involved in identifying misinformation and disinformation; they will also need to consider the likelihood that the content will cause or contribute to serious harm, which itself requires weighing various criteria and the degree of impact in a localised Australian context. A further analysis is then required to ensure the content is not exempt as excluded dissemination, for example on the basis that it is satire or disseminated for a religious purpose.

The Australian Government has in the past responded to such concerns by pointing out that the Bill focuses on systems and processes, rather than individual pieces of content. However, it remains the case that digital communications platforms must operationalise the Bill to some degree at the level of individual pieces of content. This is not just for the sake of being able to practically comply at the massive scale of online content, but also so that legitimate speech does not become grouped in with regulatable speech by taking a bluntly systemic approach.

Despite the potential challenges created by the Bill, it should be kept in mind that it is intended to establish a regulatory framework that is able to adapt and respond to misinformation and disinformation in Australia over the long-term. Challenging societal issues often require imperfect or complex regulatory responses. In any case, the final version of the Bill does build in numerous levels of parliamentary oversight and mandated legislative review, which should keep an appropriate check on the effectiveness of the reforms once enacted.

Where to from here?

Given the Environment and Communications Legislation Committee is due to provide a report to the Senate on the provisions of the Bill by 25 November 2024, we expect the Bill may be considered by the Senate next week, during the last sitting week of the year. However, mounting and open criticism from the crossbench has called into question the likelihood of the Bill passing.