Instagram is introducing new safety features to better protect the youngest users on its platform.
In a blog post on Tuesday, the company announced that it is developing new artificial intelligence and machine learning technology to verify people's ages and "apply new age-appropriate features." Instagram currently requires users to be at least 13 to sign up for an account.
The company also said it is introducing a new feature that prevents adults from messaging users under 18 who don't follow their accounts. "When an adult tries to message a teen who doesn't follow them, they receive a notification that DM'ing them isn't an option," the company explained.
In addition, the platform will begin prompting young users to be more cautious when chatting with adults they’re already connected to.
“Safety notices in DMs will notify young people when an adult who has been exhibiting potentially suspicious behavior is interacting with them in DMs,” the company said. “For example, if an adult is sending a large amount of friend or message requests to people under 18, we’ll use this tool to alert the recipients within their DMs and give them an option to end the conversation, or block, report, or restrict the adult.”
Instagram also plans to make it harder for adults to find and follow teens, and to encourage young users to make their accounts private when signing up. The new features will roll out in some countries this month and expand globally in the near future.