16/07/2024

Not since the 2019 Christchurch terrorist attack, a heinous crime live-streamed by its perpetrator, has the issue of violent online content been so acutely in the spotlight as it has been in Australia over the past few months. 

Despite significant and self-evident differences from the Christchurch attack, the recent stabbing of Mar Mari Emmanuel, the Founder and Bishop of a small, breakaway Assyrian-Christian denomination based in the Sydney suburb of Wakeley, looks set to be another defining moment in the regulation of online safety in Australia.

Footage depicting the incident raises important legal, ethical and policy questions, the answers to which carry implications far exceeding the confines of this single event and single piece of footage. However, the considerable political and media furore that emerged in the wake of the stabbing often failed to navigate these difficult questions in a meaningful way, particularly once the focus shifted to the ongoing stoush between the eSafety Commissioner and Elon Musk’s X Corp.

In this article, we examine how the Wakeley stabbing, the ensuing controversy concerning control of online content, and the eventual Federal Court case have underscored some of the limitations of Australia’s online safety laws.

The incident 

The incident itself has been widely and comprehensively reported on. Below we summarise the factual detail necessary for understanding the nuances of the proceedings between the eSafety Commissioner and X Corp: 

  • Mar Mari Emmanuel has a devout local following, as well as a considerable online following owing to his savvy use of social media. However, his religious and political views have long been a source of public controversy.
  • On 15 April 2024, the Bishop was stabbed while delivering a sermon that was being live-streamed through the church’s official online channels. In addition to the Bishop sustaining serious injuries and losing the use of an eye, a priest, a member of the congregation, and the suspected attacker were also injured. 
  • The suspect, a 16-year-old, was quickly apprehended by members of the congregation and soon arrested by local police. He has since been charged with committing a terrorist act, and a number of other individuals, alleged by police to be associates of the attacker, have been charged with related offences.
  • Immediately following the stabbing, a riot broke out in the streets directly outside the Christ the Good Shepherd Church, where an estimated 2,000 people gathered. In addition to considerable property damage, roughly 80 people were injured (mostly police officers), leading to multiple additional arrests.

Unpacking the footage 

Despite the newsworthiness of the actual events in Wakeley, in the weeks that followed, it was footage of the stabbing that became the dominant news story and public controversy. 

Mainstream reporting of debates over online content control can sometimes position online safety regulation as a straightforward affair: a piece of content is said to be harmful, our laws regulate harmful content, and thus the removal of the content must be justified. However, as with most areas of the law, the significance lies in the details.

As such, to understand the dispute between eSafety and X Corp, it is critical to be clear about what the footage depicted, its characteristics as a piece of online content, and how the Online Safety Act 2021 (Cth) (OSA) could apply to it.

As the Bishop regularly livestreams his sermons, the attack against him was captured clearly and was instantly available via the Church’s usual online channels. Following the live-stream, the footage was clipped to a duration of roughly 11 seconds and uploaded to social media.

The clip depicts the following:

  • As the Bishop is speaking, the suspected attacker approaches the pulpit, raises his right arm, and strikes the Bishop several times before falling into a scuffle with intervening members of the congregation and moving out of view of the camera. 
  • It is understood that the weapon being carried (allegedly a small flick knife) failed to open properly. As a result, no weapon, blood, impact wound, or other highly graphic detail is visible in the footage. 
  • The audio track of the footage includes the sound of several clear impacts between the assailant’s fist and/or part of his weapon and the Bishop, followed immediately by the footfalls and screams of the congregation.

Beyond these objective characteristics, it is important to appreciate what occurs once a very short, highly provocative piece of footage becomes online content.

Like any online content, once uploaded, the clip can be copied, altered, sliced and distributed in an indeterminate number of ways, ad infinitum. As this happens, the footage can take on additional meanings that are shaped by, among other things: who shares it, what iteration of the footage is shared, where it is shared, what comments and subsequent responses are shared alongside it, and how other users interact with it.

Regulating online content

There is no doubt that the 11-second clip of the stabbing is subject to the OSA. All forms of online material (whether in the form of text, data, sounds, visual images, or any other form or combination of forms) are capable of regulation under the OSA and its ‘Online Content Scheme’, which seeks to limit harms associated with certain kinds of illegal or restricted online material. 

Depending on the subject matter depicted in video footage, it may be regulated as either class 1 or class 2 material under the Online Content Scheme. The way in which the OSA classifies class 1 and class 2 material is drawn directly from longstanding classification concepts found in the Classification (Publications, Films and Computer Games) Act 1995 (Cth) and the subordinate instruments that make up Australia’s ‘National Classification Scheme’ (NCS).

Given the sheer scale of online content, regulation of specific material under the OSA occurs on a largely reactive basis, with the eSafety Commissioner able to deploy a range of powers depending on where the material is deemed to sit on the spectrum between class 1 and class 2 material.

Where the Wakeley footage falls on this spectrum, and what powers were therefore validly available to eSafety, sit at the core of this content controversy.

Regulating the Wakeley footage

From a legal perspective, the NCS and the OSA create a framework for identifying where violent material may reasonably be classified on a spectrum from harmful-to-some and warranting restriction (class 2 material), to harmful-to-all (or illegal) and requiring prohibition (class 1 material).

The OSA reserves its most stringent powers for class 1 material. Chief among these is the eSafety Commissioner’s ability to issue notices requiring social media service providers to take all reasonable steps to remove class 1 material from their platforms, regardless of jurisdiction; that is, the takedown power applies irrespective of where a provider is based or where the content is hosted. This is the power the eSafety Commissioner invoked against X Corp and other providers in relation to the Wakeley footage.

The table below summarises how class 1 material of a violent, non-sexual nature is characterised by the OSA and NCS:

OSA: Class 1 material
Violent material that, if it were classified by the Classification Board in a way corresponding to how a film would be classified under the NCS, would likely be classified as Refused Classification.
While the OSA itself defines class 1 material only by broad reference to the NCS, eSafety has created additional sub-categories that are themselves reflected in instruments like industry codes. One such sub-category is ‘extreme crime and violence material’.

NCS: Refused Classification (RC)
Violent material that, without justification:

  • Promotes, incites or instructs in matters of crime or violence, or includes or contains detailed instruction or promotion in matters of crime or violence;
  • Depicts, expresses or otherwise deals with matters of crime, cruelty or violence in such a way that the material offends against the standards of morality, decency and propriety generally accepted by reasonable adults to the extent that such material should not be classified; or
  • Includes or contains gratuitous, exploitative or offensive depictions of (i) violence with a very high degree of impact or which are excessively frequent, prolonged or detailed, or (ii) cruelty or real violence which are very detailed or which have a high impact.

We note that the reference to ‘without justification’ above acknowledges that the NCS also requires consideration of other principles and matters when assessing material, such as the importance of context, impact levels, and the persons to whom it is published. 

Example
A recent example of violent material classified as Refused Classification by the Australian Classification Board is the 17-minute bodycam footage of the 2019 Christchurch terrorist attacks. This footage depicts in full detail the murder of, and injury to, a significant number of innocent people. We understand that this footage has also been treated as pro-terror content.

The stabbing was plainly a violent incident, and the footage directly depicts that violence. Most audiences would likely find it shocking and confronting. But, as with most confronting media, opinion may differ on its appropriate characterisation. Indeed, neither the NCS nor the OSA provides an unfettered means of prohibiting confronting material. There exists a necessary social and legal tolerance for access to confronting material, including where such material depicts a public incident or matters otherwise in the public interest.

There is also the principle enshrined in the NCS that adults should be able to view what they want, particularly where they seek it out. This principle must itself be balanced against, for example, the need to take account of community concerns about depictions that condone or incite violence, or that advocate the carrying out of a terrorist act. Under the NCS, material that advocates terrorist acts must be Refused Classification, with ‘advocates’ covering the direct or indirect encouragement of, or instruction in, the carrying out of a terrorist act, or the direct praise of carrying out a terrorist act where such praise is substantially likely to lead a person to engage in carrying out a terrorist act. The NCS also makes clear that material does not advocate a terrorist act if the depiction could reasonably be considered to be part of public discussion or debate.

We note that, due to the complexity and sensitivity of such an analysis, and given that the eventual proceedings characterised the footage as ‘extreme crime and violence material’, this article does not examine the potential status of the footage as ‘pro-terror content’.

When considering the spectrum of harmful material available online, as well as a comparison with the prior treatment of RC material under the NCS, it is less certain that the Wakeley footage depicts ‘extreme crime and violence’, or that it rises to the threshold for Refused Classification set by the NCS. The length of the clip, its quality, and its lack of extremely graphic detail are all relevant considerations here.

However, it remains the case that the OSA and the NCS create a fairly broad scope for violent class 1 material. Violent material deemed ‘Refused Classification’ has ranged from the most brutal depictions of real-life violence noted above to 50 Cent: Bulletproof, a video game containing the simulated crimes of its namesake rapper protagonist. As such, while the Wakeley footage may not be the most extreme or graphic example of violent media, eSafety’s treatment of it as class 1 material is ultimately not entirely out of step with the framework created under the OSA and NCS. 

eSafety responds to the footage

On the day after the Wakeley stabbing, the eSafety Commissioner took several actions in response to the online presence of the footage, which eSafety deemed to be class 1 material. Using a mix of informal requests and official statutory notices, eSafety sought the removal of the footage from major social media platforms and private communications apps.

eSafety issued a formal removal notice to X Corp under section 109 of the OSA, requiring it to take all reasonable steps to ensure the removal of specified material from the X platform, on the basis that the material is class 1 material depicting “crime, cruelty and real violence in such a way that it offends against the standards of morality, decency and propriety generally accepted by reasonable adults to the extent that it would likely be classified RC” (the Notice).

For clarity, the Notice did not apply to all copies of the Wakeley footage, but to a specified list of 65 URLs, each corresponding to a post on X. Despite doubting (and ultimately challenging) the status of the Wakeley footage as class 1 material, X Corp geo-blocked each post listed in the Notice.

Unsatisfied with X Corp’s efforts to comply with the Notice, and amidst significant media reporting and comments from politicians across the political spectrum, the eSafety Commissioner commenced proceedings in the Federal Court. 

eSafety Commissioner v X Corp

On 22 April, the Federal Court issued an interim injunction against X Corp, which remained in effect until 10 May. The injunction required X Corp to hide the material identified by eSafety behind a notice, such that an X user would see only the notice rather than the post, and the notice itself could not be removed.

The interim injunction applied only to the 65 posts identified in the Notice, which X Corp insisted were by that point already geo-blocked for Australian users. Further, the injunctive relief was ambiguous as to whether the notices were required globally. As such, the extent to which X Corp substantially implemented the interlocutory orders is unclear.

The final injunctive relief sought by eSafety (a hearing on which was, at that point, still potentially months away) was, in effect, the removal of the material identified in the Notice from the X platform on a global basis. Interestingly, in its draft orders, the eSafety Commissioner used ‘remove’ in its ordinary sense, and not in its statutory sense under section 12 of the OSA (where material is considered ‘removed’ if it is “…neither accessible to, nor delivered to, any of the end-users in Australia using the service”).

The application by eSafety to extend the interlocutory injunction was ultimately refused by the Federal Court on 13 May, with the reasoning for this set out in eSafety Commissioner v X Corp [2024] FCA 499 (the Judgment). (We note that the proceedings were deemed a ‘public interest case’ by the Federal Court. As such, an almost complete public record is available, including written submissions, affidavits, and other documentation tendered by eSafety, X Corp and several intervening parties, totalling over 1000 pages. While this provides a rich record for further examination, to keep this article contained, we primarily engage with the Judgment and material cited directly therein.) 

The Judgment grappled with two core issues as they related to whether there was a ‘real issue to be tried’ in support of extending the interim injunction until final relief could be determined: 

  1. Whether the Notice was a valid exercise of eSafety’s power under the OSA; and 
  2. Whether, given the relevant provision of the OSA and the accompanying Notice could only ever require X Corp to “take all reasonable steps” to ensure the removal of the identified posts, eSafety’s proposed final relief went beyond what is required for compliance with the OSA.

Justice Kennett found no reason to view the Notice itself as an invalid exercise of power. While the Judgment is careful not to adjudicate the merits of eSafety’s decision to issue the Notice (noting that this was the subject of a separate merits review in the Administrative Appeals Tribunal), it does acknowledge that the assessment of classification to be performed under the NCS is of a “highly debatable nature”.

“While it is certainly arguable that the depiction of violence in the stabbing video is not sufficiently long, detailed or otherwise impactful to warrant an RC classification, it does not follow that the view taken by the delegate was not open.”

Of greater significance in the Judgment were Justice Kennett’s findings in relation to the scope of “all reasonable steps” under the Notice and the OSA, and of the potential for global takedown orders generally. 

Despite X Corp agreeing to geo-block the 65 identified posts, making them inaccessible to users with Australian IP addresses, eSafety contended that this was not sufficient for compliance with the Notice due to the use of Virtual Private Networks (VPNs). The argument was that as VPNs allow users in Australia to connect to the internet via an IP address not linked to Australia, users in Australia could still access footage of the Wakeley stabbing notwithstanding X Corp geo-blocking the posts. eSafety pointed out that X Corp had implemented takedowns with global effect in the past, including in response to notices issued by eSafety. 
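
To make the mechanics of this argument concrete, the sketch below models a naive IP-based geo-block of the kind described above. It is an illustrative assumption only: the URLs, the GeoIP lookup table and the function names are hypothetical stand-ins, and nothing in it reflects X Corp’s actual systems. The structural point it shows is that a geo-block keys off the connecting IP address, so a user whose traffic exits through an overseas VPN presents an overseas IP and is never caught by the block.

```python
# Minimal, illustrative sketch of IP-based geo-blocking (hypothetical names
# and data; not X Corp's actual implementation).

# Stand-ins for the URLs specified in a removal notice.
BLOCKED_URLS = {
    "https://x.com/example/status/1",
    "https://x.com/example/status/2",
}

# Toy GeoIP table; real platforms use commercial GeoIP databases.
GEO_DB = {
    "1.128.0.1": "AU",    # a direct Australian connection
    "203.0.113.7": "US",  # a US VPN exit node (documentation-range address)
}

def country_of(ip_address: str) -> str:
    """Map a connecting IP address to a country code.

    The lookup sees only the apparent network origin of the request,
    not where the user is physically located.
    """
    return GEO_DB.get(ip_address, "UNKNOWN")

def is_blocked(url: str, client_ip: str) -> bool:
    """Apply the geo-block: deny only when the request appears to come from AU."""
    return url in BLOCKED_URLS and country_of(client_ip) == "AU"

# A direct connection from an Australian IP is blocked ...
assert is_blocked("https://x.com/example/status/1", "1.128.0.1")

# ... but the same user routed through a US VPN presents a US IP, so the
# geo-block never triggers. This is the gap eSafety's argument relied on.
assert not is_blocked("https://x.com/example/status/1", "203.0.113.7")
```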

While the OSA does not directly contemplate the existence of VPNs, several of its sections would appear to contradict the position advanced by eSafety. For example, when defining its core online service categories, the OSA includes the concept of an ‘exempt service’, being one in which none of the material on the service is accessible or delivered to users in Australia. VPNs have been a popular tool in Australia since long before the passage of the OSA; if access via a VPN counted as accessibility to users in Australia, the concept of an ‘exempt service’ would be essentially redundant.

The Judgment notes that the OSA mentions neither IP addresses nor VPNs, but does so in a manner that lends support to eSafety’s view that ‘remove’ could extend to making material inaccessible even to users in Australia employing VPNs. However, it is the OSA’s threshold test, which requires recipients of removal notices only to take ‘all reasonable steps to ensure’ removal, that ultimately sees the Judgment resolve in favour of X Corp.

Justice Kennett found strength in the argument that global removal of the 65 posts was not a step it was ‘reasonable’ to require of X Corp, notwithstanding that X Corp might, of its own volition, choose to take it. The Judgment canvassed how such a requirement would clash with the ‘comity of nations’ and run contrary to principles of statutory interpretation that create a presumption against the violation of international law. This view was no doubt hardened by the absence of clear legislative intent for the OSA’s removal notices to have global effect.

“The result is that, read in context and in the light of normal principles of statutory construction, the 'reasonable steps' required by a removal notice issued under s 109 do not include the steps which the Commissioner seeks to compel X Corp to take in the present case.” 

As Justice Kennett ultimately found there was not a prima facie case for the grant of a final injunction in the terms sought by eSafety, the usual deliberations over the balance of convenience for the interlocutory relief were not necessary. The Judgment nevertheless includes the following observations on that front, noting that the final injunction sought would:

  • Have a “literally global effect on the operations of X Corp, including operations that have no real connection to Australia or Australia’s interests”. 
  • Impact the interests of millions of people unconnected with the litigation.
  • Be ineffective in preventing people who want to see the video from watching it.
  • Be unlikely to be enforced by a US court (where X Corp is based). 

Does eSafety v X Corp tell us something about our approach to online safety?

While the Judgment was only an interim decision ahead of the final hearing, the eSafety Commissioner ultimately chose to discontinue the Federal Court proceedings on 5 June, focusing instead on the ongoing merits review in the Administrative Appeals Tribunal.

In its associated press release, eSafety labelled the Wakeley footage “extremely violent”, with the potential of “inciting further violence and inflicting more harm on the Australian community”. The press release also made repeated reference to the ease with which the footage was available to children.

“Most Australians accept this kind of graphic material should not be on broadcast television, which begs an obvious question of why it should be allowed to be distributed freely and accessible online 24/7 to anyone, including children.”

The Wakeley footage is plainly inappropriate for children to see, and preventing children’s inadvertent exposure to harmful material is a significant challenge. However, the suggested equivalence between broadcast regulation and online content regulation is worth closer consideration. 

There is a reasonable societal understanding (and a fairly global one at that) that the content available online will exceed that of broadcast television in terms of its volume, nature and potential harms. For instance, while broadcast regulation may prevent the airing of graphic war documentaries during certain times of the day on free-to-air, that same subject matter is rightly accessible at all times of the day online. One regulatory approach responds to a context where viewers have no choice about what is broadcast into their homes, whereas the other seeks to balance harm minimisation with the free and open exchange of lawful information. 

Of course, eSafety’s correlation of broadcast and online safety regulation is not without legislative basis. As drafted, the OSA is made up of many concepts and procedures that originate in broadcast regulation, classification laws, and even to some degree the now moribund regulation of obscene literature. The sustainability of such an approach has been questioned, as has the somewhat square-peg reliance on NCS concepts by the OSA.

For example, some may be surprised to learn that, for most pieces of online content, the eSafety Commissioner is to consider the material as if it were a film being formally classified under the NCS. That is to say, all online material that is not a film, publication or computer game is classified by taking into account guidelines written for the formal classification of films by the Australian Classification Board.

At this point, it is also worth bearing in mind that even if eSafety had succeeded in extending its interim injunction, or had obtained its intended final injunctive relief, either outcome would have affected only the posts at the 65 URLs listed in the Notice. Even with well-intentioned and well-resourced enforcement, responding to online content case by case, via drafting designed for legacy media, is showing its limitations.

Unlike the comparatively contained formats of professionally produced film and computer games, user-generated online content is innately replicative. If anything, content controversies can sometimes result in increased virality and solicitation of the material at issue, including via the so-called ‘Streisand Effect’. While this does not mean attempts to regulate are futile, it is perhaps cause to recalibrate how we best do this in online settings. 

Coincidentally, the OSA is presently undergoing statutory review. The Judgment, and the content controversy surrounding it, provide a valuable case study for considering how the OSA has been designed and whether its powers are appropriately structured and expressed. Indeed, clarity could be provided over statutory ambiguities that have been left for eSafety to test in court, such as the intended international reach of removal powers.

The eSafety Commissioner’s office occupies a critical role in keeping Australians safer online but may require a more purpose-built legislative framework in order to best fulfil its mandate. 

 

""