
This article was published on June 15, 2020

Instagram’s algorithm prioritizes ‘scantily-clad’ photos, study finds

Researchers analyzed which images appear at the top of newsfeeds


Image by: Tofros.com from Pexels

If your Instagram newsfeed is filled with thirst traps, it might not be solely because you’re a total pervert. According to a new study, the gram’s algorithm makes photos showing skin more likely to appear.

Researchers from AlgorithmWatch and the European Data Journalism Network made the discovery by analyzing Instagram newsfeeds, talking to content creators, and studying patents.

The team asked 26 volunteers to install a browser add-on that automatically opens their Instagram homepage at regular intervals, and records which posts appear at the top of their newsfeeds. The volunteers then followed a selection of professional content creators who use Instagram to advertise their brands or attract new clients.

Of the 2,400 photos that the content creators posted, 362 (21%) showed bare-chested men, or women in bikinis or underwear. The researchers expected that if Instagram’s algorithm wasn’t prioritizing these pictures, the volunteers would see a similar mix of posts. But that didn’t happen: in the volunteers’ newsfeeds, semi-nude pictures made up 30% of the posts shown from those accounts.
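To make the comparison concrete, here is a minimal sketch of how an over-representation figure can be derived from the shares reported above. The formula (observed share divided by expected share) is an illustrative assumption, not necessarily the exact statistical method AlgorithmWatch used:

```python
# Minimal sketch: quantifying how over-represented "showing skin" posts were
# in volunteers' feeds, using the shares reported in this article.
# The lift formula below is an illustrative assumption, not AlgorithmWatch's
# exact methodology.

expected_share = 0.21   # share of the creators' posted photos showing skin
observed_share = 0.30   # share of such posts among those shown in volunteers' feeds

lift = observed_share / expected_share - 1
print(f"Expected: {expected_share:.0%}, observed: {observed_share:.0%}, "
      f"over-representation: {lift:+.0%}")
# -> Expected: 21%, observed: 30%, over-representation: +43%
```

The per-category figures quoted below (54% for women in bikinis or underwear, 28% for bare-chested men) would come from the same kind of expected-versus-observed comparison made within each category.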



The pictures of scantily-clad women were 54% more likely to appear in their newsfeeds, while posts with bare-chested men were 28% more likely to be shown. In contrast, posts showing pictures of food or landscapes were 60% less likely to pop up in their feeds.

Nicolas Kayser-Bril, a reporter at AlgorithmWatch, believes the algorithms are perpetuating the biases of certain users. He said on Twitter:

A minority of Instagram users see the platform as a free source of soft porn images and their behavior is probably picked up by ML systems, amplified, and pictures of nudity are pushed for all users, in a vicious cycle.

This algorithmic bias could push content creators — particularly women — into posting revealing photos to attract more viewers. It could also help shape the worldview of Instagram’s 1 billion monthly users.

Further research required

The researchers admit that the bias towards nudity didn’t apply to all the volunteers. They suspect this is because Instagram’s algorithm promotes nudity in general, but that other factors — such as personalization — limit the effect for some users.

They added that it’s impossible to draw concrete conclusions without access to internal data and production servers held by Instagram’s owner Facebook. Until that happens, the researchers plan to investigate further by recruiting more volunteers to install their monitoring add-on.

In a statement, Facebook disputed their findings:

This research is flawed in a number of ways and shows a misunderstanding of how Instagram works. We rank posts in your feed based on content and accounts you have shown an interest in, not on arbitrary factors like the presence of swimwear.

Nonetheless, the researchers believe that their findings reflect how Instagram’s algorithm works.

They note that Facebook has published a patent showing how its software can automatically choose which pictures appear in newsfeeds. Among the factors that could determine which images to prioritize, the patent specifically mentions “state of undress”.

That suggests Instagram may not only organize newsfeeds based on what a user wants, but also select pictures based on what the company thinks they want.
