Instagram Is Coaching Its AI to Detect Offensive Captions

Instagram is going to warn users when the caption on a photo or video they are posting is considered offensive. The Facebook-owned company says it has trained an AI system to spot offensive captions.

Reconsider what words you use for a caption!

The idea behind the feature is to give users “a chance to pause and reconsider their words”.

Instagram announced the feature in a blog post on Monday, saying it would be rolled out immediately in select countries.

Instagram has taken this step after being ranked the worst online platform in a July 2017 cyberbullying study. The tool is designed to help counter online bullying, which has become a major issue for platforms such as Instagram, YouTube, and Facebook.

When a user uploads a photo or video and types a caption the system deems offensive, Instagram will show a prompt informing them that the caption is similar to others that have been reported for bullying. The user can then edit the caption before it is published.
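Instagram has not published details of its classifier, but the flow it describes can be illustrated with a toy sketch: compare a new caption against previously reported ones and warn the user before publishing if it looks similar. Everything below, including the sample captions, the similarity threshold, and the function names, is a hypothetical stand-in, not Instagram's actual code.

```python
# Illustrative sketch only: a toy similarity check standing in for
# Instagram's (undisclosed) AI system. All data and thresholds are made up.
from difflib import SequenceMatcher

# Hypothetical examples of captions previously reported for bullying.
REPORTED_CAPTIONS = [
    "you are so ugly and stupid",
    "nobody likes you, loser",
]
SIMILARITY_THRESHOLD = 0.6  # arbitrary cutoff chosen for this demo


def looks_like_bullying(caption: str) -> bool:
    """Return True if the caption closely resembles a previously reported one."""
    caption = caption.lower()
    return any(
        SequenceMatcher(None, caption, reported).ratio() >= SIMILARITY_THRESHOLD
        for reported in REPORTED_CAPTIONS
    )


def publish(caption: str) -> None:
    if looks_like_bullying(caption):
        # Mirror the flow described above: warn the user and offer a chance to edit.
        print("This caption looks similar to others reported for bullying.")
        print("Edit your caption, or share it anyway.")
    else:
        print("Caption posted.")


publish("nobody likes you loser")   # triggers the warning prompt
publish("great day at the beach")   # posts normally
```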

“In addition to limiting the reach of bullying, this warning helps educate people on what we don’t allow on Instagram and when an account may be at risk of breaking our rules”, Instagram wrote in the post.

Instagram launched a similar feature earlier this year that notified people when their comments on other users’ posts could be considered offensive.

“Results have been promising and we’ve found that these types of nudges can encourage people to reconsider their words when given a chance”, Instagram wrote.

Chris Stokel-Walker, internet culture writer and author of the book YouTubers, explained that the feature was part of a broader move by Instagram to be more aware of the wellbeing of its users.

“From cracking down on promoting images of self-harm, to hiding ‘likes’ so people outwardly are less likely to equate their self-worth with how many people press ‘like’ on their photos, the app has been making moves to try and roll back some of the more damaging changes it’s had on society,” he said.
