Instagram is introducing new safety features aimed at making the platform safer for teenagers.
With this update, all users under 18 will automatically be placed under stricter content settings, meaning they will see only posts considered age-appropriate for teens.
The platform will now reduce exposure to posts that include violence, strong language, or adult themes. It will also limit interactions with accounts that regularly share such content.
Search results are being tightened as well: sensitive or harmful content will be harder for teens to find, even when they rephrase their searches to look for it.
One of the biggest changes is that teens can no longer adjust these safety settings on their own. Any change will require a parent's approval, giving families more say in how the app is used.
The update also introduces a “limited content” option, which applies even stricter filters for families who want extra protection.
Parent company Meta says the changes are based on feedback from parents and are part of a larger effort to improve safety for younger users.
The move comes at a time when social media platforms are facing growing pressure to better protect teenagers online, especially around issues like mental health and exposure to harmful content.