Social media platform Instagram has announced new policies designed to limit interaction between adults and teenagers on the platform, with the aim of improving safety for younger users. Under the new rules, adults are barred from sending direct messages to teenage users who do not follow them.
Furthermore, the app has introduced safety prompts that will appear for teen users when they exchange direct messages with adults who have been flagged as suspicious. These prompts give teen users the option to block or report the adult, remind them that they should not feel pressured to respond, and encourage caution when sharing videos, photos, or information with people they do not know.
Instagram's moderation systems will issue these notices when adult users exhibit suspicious behavior. While the company has not explained in detail how the systems work, it has said that suspicious activity includes sending a large number of messages or friend requests to users under 18 years of age. The Facebook-owned app said the feature will become available in certain countries over the course of the month and will be rolled out globally soon after.
Instagram has also revealed plans to develop new artificial intelligence (AI) and machine learning (ML) technology to detect users' ages when they sign up for a new account. The move is part of the company's effort to stop users from lying about their age, though it declined to provide details about how the new ML systems will work.
Teenage users signing up for Instagram will also be prompted to make their profiles private. If they opt for a public account instead, the app will follow up with a notification outlining the benefits of a private account and reminding them to review their settings.
Source Credit: https://www.theverge.com/2021/3/16/22333580/instagram-bans-adults-messaging-teens-safety-notice-prompt