On 29 November 2024, the Australian Parliament passed the Online Safety Amendment (Social Media Minimum Age) Act 2024 (the Act), marking a significant shift in the regulation of online platforms. The Act amends the Online Safety Act 2021 and introduces a minimum age requirement of 16 years for social media account holders in Australia. This legislation makes Australia one of the first jurisdictions to mandate a national age threshold for access to social media platforms, reflecting growing international concern about the impacts of social media on children and adolescents.
The Australian Government’s announcement of the new law confirmed that the onus would be on social media platforms, not parents or young people, to implement systems that prevent under-16s from accessing their services. Social media platforms including Snapchat, TikTok, Instagram and X (formerly Twitter) are among those likely to be captured by the new obligations. Prime Minister Anthony Albanese framed the Act as a necessary intervention in response to mounting evidence of social harm, stating: “We know social media is doing social harm. This is a landmark reform. We know some kids will find workarounds, but we’re sending a message to social media companies to clean up their act.” (The Hon. Anthony Albanese MP, Media release – Albanese Government delivers world-leading legislation to protect children online (Prime Minister of Australia, 2024)).
The main features of the Act are:
- Minimum Age Requirement:
A statutory prohibition on individuals under the age of 16 creating or maintaining social media accounts in Australia.
- Obligations on Platforms:
Platforms defined as “age-restricted social media platforms” are required to take reasonable steps to prevent children under 16 from creating and holding social media accounts. Reasonableness is context-dependent, with factors such as technological feasibility, proportionality, and privacy compliance to be considered.
- Definition of “Age-Restricted Social Media Platform”:
A new definition of “age-restricted social media platform” to which the minimum age obligation applies. The definition captures electronic services that satisfy the following conditions:
- A sole or significant purpose of the service is to enable online social interaction between two or more users;
- Users can link to or interact with each other;
- Users can post content to the platform; and
- The service satisfies any additional criteria specified by Australia’s eSafety Commissioner under delegated legislative rules.
- Prohibition on Government ID Collection:
Platforms must not collect government-issued ID (e.g., passports or Digital ID) as a condition of account creation or age assurance. Instead, they are required to implement privacy-preserving age assurance methods that do not involve collecting sensitive personal information.
- Data Protection Safeguards:
Privacy protections that limit the use of information collected by platforms for the purposes of satisfying the minimum age obligation and require the destruction of that information following its use.
The Act imposes civil penalties of up to 30,000 penalty units for non-compliance, increasing to 150,000 penalty units (currently equivalent to AUD 49.5 million) for a breach of the minimum age obligation by companies. The maximum for companies is said to be in line with the maximum online safety civil penalties currently in place in Ireland, the EU and the UK (Australian Government, Online Safety Amendment (Social Media Minimum Age) Bill 2024 – Fact Sheet (Department of Infrastructure, Transport, Regional Development, Communications and the Arts, 2024) at 4) (Fact Sheet).
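For context, these dollar figures follow from multiplying the relevant number of penalty units by the prevailing Commonwealth penalty unit value, assumed here to be the AUD 330 value applying from 7 November 2024:
- 30,000 penalty units × AUD 330 = AUD 9.9 million
- 150,000 penalty units × AUD 330 = AUD 49.5 million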
The reforms are due to take effect within 12 months of the Act being passed (i.e., by December 2025). This deferred commencement is intended to provide Australian industry and the eSafety Commissioner with sufficient time to develop and implement systems to enforce the provisions of the Act. Whether the Act will prove workable in practice remains an open question.
Australia’s 2024/2025 Federal Budget allocated AUD 6.5 million for a technical trial of age assurance technologies, to determine their effectiveness, maturity and readiness for use in Australia (Fact Sheet at 2). It is anticipated that the outcomes of the trial, together with advice, will be provided to the Australian Government to inform what constitutes “reasonable steps” to prevent young people from creating or holding social media accounts. The trial is currently being conducted by a consortium headed by the Age Check Certification Scheme (ACCS), with the final report due to be delivered by the end of June 2025 (Allen, T, Age Assurance Technology Trial D6.1 – Project Plan (Department of Infrastructure, Transport, Regional Development, Communications and the Arts, 2024) at 43).
In March 2023, the eSafety Commissioner delivered a “Roadmap for Age Verification” to the Australian Government, addressing whether and how a mandatory age verification mechanism could be achieved in Australia (in the context of considering measures to prevent harm to children from online pornography). The Australian Government’s response noted that, at the time, age assurance technologies presented privacy, security, effectiveness and implementation issues, and that a decision to mandate age assurance was not ready to be taken (Australian Government, Government response to the Roadmap for Age Verification (Department of Infrastructure, Transport, Regional Development, Communications and the Arts, 2023) at 2). In this context, it is unclear how privacy and security issues will be addressed in the current technical trial, and whether the reforms will be ready for implementation in December 2025.
Prior to the enactment of the Act, the Law Council of Australia (LCA) also identified various concerns with the proposed law, including that the definition of “age-restricted social media platform” is “extremely broad and likely to bring uncertainty to its application”. The LCA further raised concerns regarding the overcollection of personal information, and expressed the view that further scrutiny of the relevant provisions was required to ensure that the protections are adequate (Law Council of Australia, Letter to Senator the Hon. Penny Wong – Online Safety Amendment (Social Media Minimum Age) Bill 2024 at 2).
TikTok made a submission to the Parliamentary Inquiry similarly emphasizing the need for clarity on various definitions, stating: “The Bill’s definitions need considerable work to ensure they are clear, enforceable and applied fairly and as expressly intended by the legislation” (Woods-Joyce, E, Letter to Environment and Communications Legislation Committee – Online Safety Amendment (Social Media Minimum Age) Bill 2024 (TikTok Australia and New Zealand, 2024) at 3).
Headspace (the National Youth Mental Health Foundation), in its submission, acknowledged concerns around social media and the risks of harm to young people, but commented on the need for young people and their families to receive support to help manage the changes, and for clear messaging around what the changes mean (Trethowan, J, Letter to Committee Secretary – Inquiry into the Online Safety Amendment (Social Media Minimum Age) Bill 2024 (Headspace, 2024) at 2).
It remains to be seen how the Act will be enforced in Australia and whether other countries will look to follow suit with similar reforms.