
Australia bans social media for children under 16

  • Staff Writer
  • Nov 29, 2024
  • 3 min read

Updated: Dec 14, 2024


[Image: child on smartphone]

Australia has passed a new law banning children under 16 from using social media. The Online Safety Amendment (Social Media Minimum Age) Bill 2024 requires social media giants such as Meta, TikTok and Snap to take steps to prevent minors from holding accounts, or face fines of up to AUD 49.5 million (USD 32 million).


Social media firms have a year to test methods of enforcing the ban across their platforms; after that, non-compliant companies face fines. Platforms used for health and educational purposes, such as YouTube, Messenger Kids, Kids Helpline and Google Classroom, are exempt from the ban.


Despite overwhelming support from parents, some privacy advocates and child rights groups opposed the ban, fearing it would isolate teenagers and cut off social support for vulnerable children.


Meta said it “respects” Australia’s new law but expressed concern over the “rushed legislation”, arguing that parliament did not properly consider the evidence. Meta added that the industry is already taking several steps to ensure age-appropriate experiences.


These steps include age verification for teenagers, which Meta introduced in 2022 and later expanded to multiple countries. Users who want to update their age to 18 or over must provide an ID, a video selfie or mutual friend verification.


In September, Meta announced that all Instagram accounts belonging to users under 16 would be made private by default, restricting others from viewing their profiles or tagging them in posts.

Instagram currently has over 1 billion monthly active users, of which an estimated 8% are aged 13 to 17, according to Statista.

TikTok, for its part, claims it removes 80 million underage accounts every year.


Snapchat parent Snap Inc said it will comply with the law but expressed concern over how the ban can be implemented. The firm said it will work closely with the Australian government and the eSafety Commissioner to find a solution.


Though Australia is the first country to take such a drastic step, several other countries have passed laws to protect children from online harms.

Last year, India passed the Digital Personal Data Protection Act (DPDPA) 2023, which requires verifiable parental consent before processing the personal data of children under 18.


In the US, the Children's Online Privacy Protection Act (COPPA) restricts tech firms from collecting personal information from children under 13 without parental consent. The US is also considering a new bill that would set a minimum age of 13 for social media use and make parental consent mandatory for minors.


Many of the top social media firms are already under scrutiny in the US for violating privacy laws and targeting children. 

For instance, 33 US states have sued Meta for allegedly violating COPPA and collecting personal information from over a million children under 13.


Meta, TikTok, Snap, X, YouTube and Twitch are also under FTC scrutiny for targeting children and illegally harvesting large volumes of personal data for targeted advertising.


The FTC claims that many of these platforms are failing to take adequate measures or enforce age restrictions to protect children, contributing to a youth mental health crisis. Several platforms are also accused of falsely claiming there are no children on their services in order to avoid liability.



Image credit: Pexels
