
Study Says Instagram Algorithm Prioritises ‘Scantily-Clad Men and Women’

According to a study carried out by the European Data Journalism Network and AlgorithmWatch, pictures featuring under-dressed people had a greater chance of appearing in newsfeeds on the platform.

An independent study has found that Facebook-owned Instagram, the photo and video sharing social media platform, prioritises pictures of “scantily-clad men and women”, which the researchers believe is shaping the way content creators use the service.

According to the study, carried out by the European Data Journalism Network and AlgorithmWatch, a non-profit which aims to decode “algorithmic decision-making processes”, pictures featuring under-dressed people were more likely to be seen on the platform. Notably, while Instagram’s policies do not allow nudity, the researchers say its algorithm favours posts “that show skin”.

To make sense of which pictures were being pushed onto people’s feeds on Instagram, the researchers asked 26 volunteers to install a browser add-on and follow 37 professionals – 14 of them men – who post branded content relating to food, travel, fitness, fashion or beauty on their pages. The browser add-on would refresh each volunteer’s homepage at regular intervals and log what appeared at the top of their feed. The exercise would provide a picture of what Instagram thought each volunteer was interested in.
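A minimal sketch of what such a feed logger might look like is given below, purely for illustration. The real add-on ran inside the volunteers’ browsers; the `fetch_top_of_feed` function here is a hypothetical stand-in for that data collection, and all field names are invented.

```python
# Illustrative sketch only: the actual AlgorithmWatch add-on ran inside the
# volunteers' browsers. This loop just mimics the idea of polling a feed at
# regular intervals and logging whatever appears at the top.
import csv
import time
from datetime import datetime, timezone

def fetch_top_of_feed():
    """Hypothetical stand-in: return the posts currently shown at the top
    of the volunteer's Instagram home feed, in the order they appear."""
    return [
        {"post_id": "demo-1", "author": "creator_a", "category": "fitness"},
        {"post_id": "demo-2", "author": "creator_b", "category": "travel"},
    ]

def log_feed(path="feed_log.csv", interval_seconds=3600, rounds=3):
    """Poll the feed `rounds` times, `interval_seconds` apart, appending a
    timestamped record of what was seen (and at which rank) to a CSV file."""
    with open(path, "a", newline="") as f:
        writer = csv.writer(f)
        for _ in range(rounds):
            seen_at = datetime.now(timezone.utc).isoformat()
            for rank, post in enumerate(fetch_top_of_feed(), start=1):
                writer.writerow([seen_at, rank, post["post_id"],
                                 post["author"], post["category"]])
            f.flush()
            time.sleep(interval_seconds)

if __name__ == "__main__":
    log_feed(interval_seconds=1, rounds=2)  # short intervals for the demo
```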

The researchers’ reasoning was that a volunteer’s feed should ideally either be a fair reflection of the professionals’ posts or be tailored to that volunteer’s already established interests.

The researchers analysed 2,400 photos from 1,737 posts put up by the content creators. They found that 362 posts (21% of the total) contained “pictures showing women in bikinis or underwear, or bare chested men”. “In the newsfeeds of our volunteers, however, posts with such pictures made up 30% of all posts shown from the same accounts (some posts were shown more than once),” the researchers wrote.
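The basic comparison behind these numbers can be illustrated with simple ratios, as in the sketch below. The study’s own statistics are more careful than this; the sketch only shows how a feed share can be set against a posting share, using the figures reported above.

```python
# Simplified illustration of the over-representation comparison reported by
# the study: the share of "skin-showing" posts among everything the creators
# posted, versus their share of what actually appeared in volunteers' feeds.
# The study's per-category figures come from its own statistical modelling,
# not from this simple ratio.

skin_posts = 362
total_posts = 1737
share_posted = skin_posts / total_posts   # ~21% of the creators' posts
share_in_feeds = 0.30                     # ~30% of feed appearances

overrepresentation = share_in_feeds / share_posted
print(f"share of posts showing skin:      {share_posted:.1%}")
print(f"share of feed slots they took up: {share_in_feeds:.1%}")
print(f"shown roughly {overrepresentation - 1:.0%} more often than their "
      f"share of the posted content would suggest")
```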

The study showed that pictures of women in undergarments or a bikini were 54% more likely to be seen in the volunteers’ newsfeeds, and posts of bare-chested men were 28% more likely to be seen. Pictures of food or landscapes, by contrast, were 60% less likely to be seen. The researchers have published their full results online.

The researchers note that not every feed was skewed towards nudity: a small percentage of volunteers were shown posts that fairly reflected the content creators’ output. “It is likely that Instagram’s algorithm favors nudity in general, but that personalisation, or other factors, limits this effect for some users,” the researchers said.

Facebook did not respond to the researchers’ questions, but it reportedly sent a statement: “This research is flawed in a number of ways and shows a misunderstanding of how Instagram works. We rank posts in your feed based on content and accounts you have shown an interest in, not on arbitrary factors like the presence of swimwear.”

The authors of the study, however, believe they have good reason to stand by their findings. They cite a Facebook patent from 2015 which describes how the algorithm could decide which pictures to give a disproportionate amount of attention to. An ‘engagement metric’, drawn from the user’s past behaviour, is used to select content accordingly. “But the engagement metric can also be computed based on past behavior from all users of the service. The patent specifically states that the gender, ethnicity and ‘state of undress’ of people in a photo could be used to compute the engagement metric,” the researchers mention.
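To make the researchers’ concern concrete, here is a purely hypothetical toy model of an ‘engagement metric’. Nothing in it reflects Facebook’s actual code or weights; the patent only says that attributes such as the ‘state of undress’ of people in a photo could be among the inputs, and the sketch shows how even a small weight on such a feature would push those posts up the ranking.

```python
# Toy illustration only: a hand-written linear "engagement metric" with
# invented features and weights. The 2015 patent describes computing such a
# metric from users' past behaviour (and possibly attributes of the photo);
# it does not disclose any actual formula.

from dataclasses import dataclass

@dataclass
class PostFeatures:
    viewer_affinity: float     # how often this viewer engaged with the author
    global_engagement: float   # past engagement from all users on similar posts
    shows_skin: bool           # the contested attribute named in the patent

def engagement_score(p: PostFeatures) -> float:
    """Rank posts by a weighted sum of features (weights are invented)."""
    score = 0.6 * p.viewer_affinity + 0.3 * p.global_engagement
    if p.shows_skin:
        score += 0.1  # if such a weight existed, these posts would rank higher
    return score

posts = [
    ("landscape photo", PostFeatures(0.5, 0.4, shows_skin=False)),
    ("swimwear photo",  PostFeatures(0.5, 0.4, shows_skin=True)),
]
for name, feats in sorted(posts, key=lambda p: engagement_score(p[1]), reverse=True):
    print(f"{name}: {engagement_score(feats):.2f}")
```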

The study also points out that Facebook analyses pictures using computer vision – algorithms that scan images and help decide what to show a user. Such systems are known to “replicate and amplify the biases of their training data, leading to spurious, or fallacious, correlations.”

The researchers also reviewed Facebook’s computer vision patents and found that, of the 340 inventors named across the 238 patents filed, only 27 were women. “While our results show that male and female content creators are forced to show skin in similar ways if they want to reach their audience, the effect could be larger for females, and be considered a discrimination of female entrepreneurs,” the study said.
