Australia’s Adult Cyber Abuse Scheme Kicks Off Soon, Here’s What to Expect

From Sunday, Australia’s eSafety Commissioner will have sweeping new powers to, among other things, order the removal of material that seriously harms adults. These powers come by way of an Adult Cyber Abuse scheme, afforded by the Online Safety Act.

The Online Safety Act was passed last year, despite testimony from tech companies and civil liberties groups that the legislation was “rushed”.

The new powers have been labelled as overbearing, and they cover far more than just the Adult Cyber Abuse scheme. As one Twitter user put it previously, the Commissioner is imminently receiving the “master on/off switch to the internet”. But more on that later.

Of concern to many is that it is not yet known what the test or criteria will be for determining if content warrants removal.

As I wrote previously for ZDNet: “There is much to take into account, especially when much of ‘Australian culture’ includes the use of a curse word as a term of endearment; that tone, for example, can be hard to ascertain from a character-limited post.”

Google took issue with the powers, as did Facebook and Twitter, sex work associations, human rights advocates and even the Australian Greens. But the Act was waved through both houses and given Royal Assent, so now we have to grin and bear the ramifications of these sweeping powers.

The new powers afforded to the eSafety Commissioner come into effect on January 23, 2022. By June 2022, industry codes should be developed. So what exactly does this all mean?

First, what is inside the Online Safety Act?

  • a cyberbullying scheme to remove material that is harmful to children
  • an Adult Cyber Abuse scheme to remove material that seriously harms adults
  • an image-based abuse scheme to remove intimate images that have been shared without consent
  • Basic Online Safety Expectations for the eSafety Commissioner to hold services accountable
  • an online content scheme for the removal of “harmful” material through take-down powers
  • an abhorrent violent material blocking scheme to block websites hosting abhorrent violent material.

For the purpose of this article, we’re going to dive into the Adult Cyber Abuse scheme.

What do you have to do?

Well, you shouldn’t be sharing images of others without their consent in the first place, so the best thing you can do is keep not doing that. The pitch from the government is that the Online Safety Act exists to help victims, so if you or anyone you know is a victim of online abuse, reach out to the eSafety Commissioner.

What does the Adult Cyber Abuse scheme offer me?

Let’s assume you’re an adult. With the new Adult Cyber Abuse scheme, eSafety will be able to act as a safety net, giving Australian adults who have been subjected to serious online abuse somewhere to turn if online service providers have failed to remove the abusive content.

Your first port of call is the platform – Facebook, Instagram, Twitter – but if a platform fails to take action, you can visit www.esafety.gov.au to make a report.

eSafety will then investigate and make a ruling. Yep. There’s no black or white here; it’s up to eSafety to determine if the platform should in fact take something down.

“In relation to the Adult Cyber Abuse scheme, the bar for determining what ‘adult cyber abuse’ is has been set deliberately high, to ensure it does not stifle freedom of speech. Under the law, to reach the threshold the abuse must be both ‘intended to cause serious harm’, and ‘menacing, harassing or offensive in all the circumstances’,” eSafety told Gizmodo Australia.

What is serious harm, though?

Serious harm could include material which sets out realistic threats, places people in real danger, is excessively malicious or is unrelenting.

Following a report and investigation, eSafety can require the removal of adult cyber abuse material through a takedown notice. If the notice is not complied with, eSafety can impose civil penalties (including fines) on the person who posted the material, or on the provider of the service where it appears.

For Image Based Abuse (when an intimate image or video is shared without the consent of the person pictured), reports can be made directly to eSafety without going to the platform first.

For takedown notices, the platform will have 24 hours to comply (this was 48 hours, but now it’s just a day), and failure to comply could see companies facing fines of up to $555,000. A perpetrator who posts or threatens to post an intimate image could face criminal charges in their jurisdiction, and eSafety can also seek penalties of up to $111,000 against them. There is also a range of remedial actions eSafety can compel the perpetrator to take.

What else is happening from Sunday?

An enhanced cyberbullying scheme for Australian children. The takedown notice being afforded to adults from Sunday has already been in place for children, but the scheme will be enhanced this weekend.

Basically, instead of applying only to social media sites, the scheme protecting kids will extend to anywhere on the internet kids hang out, such as online gaming platforms, content sharing sites and messaging services.

What about those Basic Online Safety Expectations?

Yeah. Aside from being labelled as an abdication of responsibility, these Basic Online Safety Expectations (BOSE, like the headphone/speaker company) don’t actually exist yet.

They exist in draft form, and the Minister for Communications (currently Paul Fletcher) will make a decision on them once consultation is done. We’ll provide an explainer on that when they come into effect, but for now, the gist is that these expectations will apply to service providers including social media; “relevant electronic service of any kind”, such as messaging apps and games; and other designated internet services, such as websites.

This is what eSafety told us:

“It has raised the bar by establishing a wide-ranging set of Basic Online Safety Expectations, including that online service providers take reasonable steps to ensure that users are able to use the service in a safe manner. These expectations will help make sure online services are safer for all Australians to use. They will also encourage the tech industry to be more transparent about their safety features, policies and practices.”

New industry codes or standards

A new set of industry codes will also be developed to guide industry on compliance with their legal obligations under the new Act. To achieve this, the codes will cover the various segments of the online industry.

“The desired approach would see these codes developed by industry and then reviewed and registered by the eSafety Commissioner,” eSafety says.

If suitable codes cannot be agreed, or do not meet the desired safety outcomes initially outlined, the Commissioner has the power to impose industry-wide standards in place of the codes. The new codes should also promote the adoption of responsible industry processes and procedures for dealing with online safety and content issues.

Hopefully this makes things a little bit clearer. Stay safe out there and be kind to one another.

If you or someone you care about needs support, please call LifeLine Australia on 13 11 14, the National Sexual Assault, Domestic Family Violence Counselling Service on 1800 737 732 or MensLine Australia on 1300 789 978. 

If life is in danger, call 000.

