Instagram boosts safety features around teenage users
Instagram is to restrict the ability of adults to contact teenagers who do not follow them on the platform as part of new safety tools aimed at protecting younger users.
Under the new rules, adults will be blocked from sending a direct message (DM) to any Instagram user under 18 if that account does not already follow them.
As part of a range of new measures, the social media platform will also begin sending safety alerts to users aged under 18, encouraging them to be cautious in conversations with adults they are already connected to but who have exhibited potentially suspicious behaviour – such as sending a large number of friend or message requests to teenage users.
In addition, Instagram said it was making it more difficult for adults to find and follow teenagers on the site by restricting teen accounts from appearing in the Suggested Users section of the app and hiding content from teen users in both Reels and Explore.
Younger users are also being encouraged to make their accounts private. Instagram said it was developing new artificial intelligence and machine learning technology to help it better identify the real age of younger users, after acknowledging that some young people were lying about their age in order to access the platform.
The Facebook-owned site’s terms of service require all users to be at least 13 years old to have an account.
“Protecting young people on Instagram is important to us,” the social media giant said.
“Today, we’re sharing updates on new features and resources as part of our ongoing efforts to keep our youngest community members safe.
“We believe that everyone should have a safe and supportive experience on Instagram. These updates are a part of our ongoing efforts to protect young people, and our specialist teams will continue to invest in new interventions that further limit inappropriate interactions between adults and teens.”
The online safety of teenagers on social media has been a key issue for technology firms for some time, with companies under continued scrutiny following repeated warnings from industry experts and campaigners over the dangers young people face online.
The Government is set to introduce an Online Safety Bill later this year, which is expected to bring stricter regulation around protecting young people online and harsh punishments for platforms found to be failing in a duty of care, overseen by the regulator Ofcom.
Responding to Instagram’s announcement, Andy Burrows, head of child safety online policy at the NSPCC, said: “Instagram’s decision to stop adults contacting children who don’t follow them is welcome, correcting a dangerous design decision that should never have been allowed in the first place.
“There are consistently more grooming offences on Instagram than any other platform. Our latest data shows it is the platform of choice for offenders in more than a third of instances where they target children for sexual abuse.
“Nothing in this package will change the dial on Instagram’s dangerous plans to introduce end-to-end encryption, which will blindfold themselves and law enforcement to abuse and mean that offenders can target children on the site unchecked.”