Instagram demoting inappropriate content from app
Facebook-owned photo and video sharing app Instagram has decided to demote sexually suggestive and other borderline content - video snippets, memes and pictures - being shared and viewed on its platform by its one billion global users.
"We have begun reducing the spread of posts that are inappropriate but do not go against Instagram's Community Guidelines, limiting those types of posts from being recommended on our Explore and hashtag pages," Guy Rosen, Vice President of Integrity, and Tessa Lyons, Head of News Feed Integrity, Facebook, wrote in a blog post on Wednesday.
Instagram has always maintained strict policies against nudity on its platform. It states that nudity is allowed only if a photo shows a nude sculpture or painting, post-mastectomy scarring, or a woman breastfeeding, The Next Web (TNW) reported. As part of its plans, Instagram specified that a sexually suggestive post shared on the app would still appear in Feed for followers, but the content may not be shown to the broader community. In 2014, there was a huge uproar when Instagram banned global artist Rihanna's topless photo.
A #FreeTheNipple movement also emerged on the app at the time, but Instagram refused to alter its policies. Facebook's post coincided with a plethora of "integrity" announcements it made at a press event held at its Menlo Park headquarters, focusing on safeguarding its family of apps. Mired in data breach scandals, Facebook is planning to work with experts to check the spread of fake news on its platform and reduce the reach of Groups that repeatedly share misinformation.
06:09 PM IST