
Tightening the law around online content: Introduction of the Online Safety Act 2021 (Cth)

Malcolm Campbell

Co-authored by Francesca Zappia

The Online Safety Act 2021 (Cth) (‘the Act’) was passed on 23 June 2021 and commenced on 23 January 2022. The Federal Government introduced the legislation in a bid to strengthen industry standards and resolve the gaps in Australia’s existing online safety system. As a result of the COVID-19 pandemic, online interactions have become an increasingly relied-upon part of everyday life, particularly in the way people socialise, work, learn and enjoy entertainment. This increased usage of online platforms opens up a range of privacy and safety issues that users are often not aware of.

The Act allows the eSafety Commissioner (‘Commissioner’), Julie Inman Grant, to assess complaints relating to cyber abuse, image-based abuse and cyberbullying. Whilst the Act attempts to eliminate the harm caused to users, it has attracted criticism for its wide scope of application and the consequences that are likely to arise if its powers are misused.

What does the Act enforce?

The Act has a broad scope, allowing the Commissioner to deal with the removal of harmful online material relating to Australian children and adults, the sharing of intimate images without consent, abhorrent violent material and other harmful online content. Harmful online content is broken into Class 1, which is anything that offends against the standards of morality, decency and propriety, and Class 2, which refers to anything that would be classified as R18+.

The Act upholds a high threshold for cyber-abuse. Under s 7, the abuse must ‘intend to cause serious harm’ and be ‘menacing, harassing or offensive in all circumstances’. Examples of cyber-abuse causing serious harm include situations where the material sets out realistic threats, places individuals in imminent danger, is intended to be excessively malicious, or where the abuse is relentless. Importantly, if the situation does not meet the above threshold, the Commissioner is still able to offer support, information and advice to help the individual avoid harm.

Industry Standards

The Act enforces industry standards known as the Basic Online Safety Expectations (BOSE), which require online service providers to take reasonable steps to minimise the risk of harm. For example, the Act requires online service providers to create a safer online environment through:

  • Ensuring technological or other measures are in effect to prevent access by children;
  • Guaranteeing the service has clear and readily identifiable mechanisms that enable end-users to report and make complaints about cyber bullying material and breaches of the service’s terms of use; and
  • Providing a written statement of complaints and removal notices to the Commissioner within 30 days when required.

Who does the Act apply to?

The Act applies to the following services:

  • Designated internet service providers
  • Social media service providers
  • Electronic service providers (such as Outlook and WhatsApp)
  • Hosting service providers
  • App distributor service providers
  • Internet service providers
  • Internet search engine service providers
  • Ancillary service providers to the online industry

What powers does the Commissioner have?

The Commissioner will impose updated BOSE industry standards and technical requirements on the digital platforms to which the Act applies. This will be done through the online content schemes and the regime targeting abhorrent violent material set out within the Act.

One criticism of the Act’s enforcement model is the large role placed on the Commissioner, who is the sole decision-maker on complaints and breaches and, along with this, has been given substantial investigative and enforcement powers.

The Commissioner has the power to issue the following notices:

  1. Removal Notice: Social media, electronic, designated internet and hosting service providers may be given a removal notice, requiring them to remove, or take all reasonable steps to remove, the material (or cease hosting it) within 24 hours.
  2. Blocking Notice: Internet service providers will be requested or required to block access to material that depicts, incites or instructs abhorrent violent conduct. The Commissioner must be satisfied that the material is likely to cause significant harm to the Australian community.
  3. App Removal Notice: App distribution service providers may be given a notice to cease enabling end-users to download an app that facilitates the posting of certain material within 24 hours.
  4. Link Deletion Notice: Internet search engine providers may be given a notice requiring the provider to cease offering a link to certain material within 24 hours.

Where an individual or body corporate does not comply with a notice, the Commissioner may impose formal warnings, infringement notices, enforceable undertakings, injunctions and civil penalties. The civil penalties for individuals range between $22,200 (100 penalty units) and $111,000 (500 penalty units), and rise to $555,000 (2,500 penalty units) for corporations.

Individuals are able to make a report to the Commissioner where they believe a platform has failed to take action to ensure the safety of its users, or where they believe a breach of safety has occurred. The Commissioner has the discretion to act upon the complaint or to issue a formal warning requiring the online service provider to take reasonable steps to correct the breach.

The Commissioner does not have the power to investigate most online frauds and scams, spam, defamation or privacy breaches. In these situations, the Commissioner will refer individuals who have been targeted to obtain legal advice.

What does this mean for your business?

If your business falls within the scope of the Act, it will be essential to review and update your current online safety procedures and policies to ensure compliance with the BOSE as well as the relevant sections of the Act. If you are required to remove content at the Commissioner’s request, you should have effective mechanisms in place that allow this to be done within the stipulated 24-hour time frame.

It is important that a business has updated terms of use, safety policies and procedures (particularly those that deal with end-users), standards of conduct and policies in relation to the control and enforcement of those standards. The BOSE also requires businesses to have a mechanism which allows Australian residents to report and make complaints about breaches of terms of use and the service provided on their platform. The Commissioner may direct minor complaints back to the business and give them the chance to rectify the breach.

How can we help?

If you have questions or require assistance with the enforcement of the Online Safety Act 2021 (Cth), please do not hesitate to contact a member of Coleman Greig’s Commercial Advice Team who would be more than happy to assist you.

This material is provided by Coleman Greig Lawyers as general information only in summary form on legal topics current at the time of first publication. The contents do not constitute legal advice and should not be relied upon as such. Formal legal advice should be sought in particular matters.

