Instagram Rolls Out New Restrictions for Teen Users

October 14, 2025


Instagram rolled out new restrictions for teen accounts on Tuesday, amid mounting debate over safety protections for young users on the platform.

The photo-sharing app will soon restrict what teens can see, drawing on standards similar to those used for PG-13 movies. Under the changes, the app will hide or stop recommending posts that contain strong language, depict drug paraphernalia, or encourage “potentially dangerous actions,” according to its parent company, Meta.

The company said it will use “age prediction technology” to keep teens from circumventing the restrictions, which are set to take effect by the end of the year.

The changes come as Instagram faces criticism for allegedly failing to protect underage users from harmful content. A report last week found that nearly three in five teenagers aged 13 to 15 had encountered unsafe content and unwanted messages in the previous six months. Meta told TIME the report was “highly subjective” and “based on a basic misinterpretation of how our adolescent safety mechanisms function.”

Separately, a September investigation by online-safety groups and researchers at Northeastern University found that more than 40 child safety features promised by Instagram were flawed. Meta dismissed that study as well.

‘Content unsuitable for age’

The company moved to strengthen protections for younger users last year, introducing “teen accounts” that blocked users under 18 from certain mature content and made their profiles private by default.

Under the coming changes, teen users will be blocked from following accounts that share “content unsuitable for their age, or if an account’s name or biography implies it is inappropriate for teens,” Instagram said. Teens who already follow such accounts will no longer be able to see or interact with their content, and those accounts will likewise be barred from following teens, sending them direct messages, or commenting on their posts. The restrictions will also apply to celebrities and other widely followed adult accounts that post even a single piece of age-inappropriate content, Instagram told NBC News.

Instagram’s AI chatbot will also be adjusted so that it does not give age-inappropriate responses to users. Separately, AI chatbots and their makers have faced lawsuits over claims that the chatbots “investigate suicide methods.”

Instagram’s new restrictions will be enabled by default for teen users, who cannot opt out without a parent’s or guardian’s consent.

For parents who want even tighter oversight, Instagram is also rolling out a new option that lets them block an account from seeing, posting, or receiving comments under posts.

Instagram has faced personal injury lawsuits in both state and federal courts over allegations that it harms young people: more than 1,800 plaintiffs in Northern California have sued major social media companies, including Instagram and Meta, alleging they “negligently disregarded the consequences of their offerings on children’s psychological and physical well-being.” One lawsuit called Instagram an “addictive, detrimental, and occasionally lethal” service.

Still, the company billed the newly announced changes as the “most impactful enhancement to teen accounts” since their introduction in January 2024. The restrictions will eventually apply to the hundreds of millions of teens who use the app worldwide, though they will first roll out to users in the U.S., U.K., Australia, and Canada.