In June 2020, Variety reported that the Facebook-owned photo- and video-sharing app Instagram would be reviewing its practices in the wake of the Black Lives Matter protests that erupted across America's streets and online after the murder of George Floyd. George Perry Floyd Jr. was an African-American man who was murdered on May 25, 2020, by a police officer in Minneapolis, Minnesota, during an arrest after a store clerk suspected Floyd had used a counterfeit $20 bill.
Instagram’s Adam Mosseri said at the time that Instagram was “taking a harder look” at how the photo- and video-sharing app affects different communities, starting with Black users. In a blog post published in June 2020, Mosseri continued that the company was “[hearing] concern about whether we suppress Black voices and whether our products and policies treat everyone equally.”
While Instagram — with more than 1 billion monthly users — is “a platform that stands for elevating Black voices,” Mosseri wrote, at the same time “Black people are often harassed, afraid of being ‘shadow-banned,’ and disagree with many content takedowns.” – Variety
Adam Mosseri is an Israeli-American businessman and the head of Instagram. He formerly served as an executive at Facebook.
Instagram users such as activist Tahyira Savanna, who uses the handle @iletthegoodtimesroll, had been tracking their own shadow-banned realities prior to the 2020 protests. Savanna has used her feed to showcase her activist work, which goes back to posts dated in 2012 after the murder of Trayvon Martin. Savanna is a graduate of Long Island University, where she studied criminal justice and policing science.
Mosseri went on in the blog post to assure users that something would be done, saying that Instagram was reviewing four areas of potential concern for how the app specifically affects Black users: harassment; account verification (with plans to make changes to “ensure it’s as inclusive as possible”); distribution (how content is filtered on Explore and hashtag pages, to determine where there may be vulnerability to bias); and algorithmic bias. You can read the full blog entry here: https://about.instagram.com/blog/announcements/ensuring-black-voices-are-heard
In August 2020, the Guardian ran a headline that read, “‘Censorship’: #IwanttoseeNyome outcry after social media platform repeatedly removes pictures of Nyome Nicholas-Williams.” The initial response on Instagram had been ecstatic: “stunning … beautiful … this should be in a gallery!” But within hours, Instagram had deleted the photo and warned Nicholas-Williams that her account could be shut down. Photographer Alexandra Cameron at the time accused Instagram of a disconnect between its positive statements on Black Lives Matter and its apparent unfair targeting of Black content creators. Under the platform’s community guidelines, nudity or sexual activity is restricted but is monitored on a case-by-case basis.
In August 2021, Instagram also had to apologize for removing the official poster for Spanish director Pedro Almodóvar’s new film from the social network because it showed a female nipple, after the poster’s designer complained of censorship.
White-owned companies, including social media management apps like Hootsuite, have long been party to these types of scams and scenarios. These companies help Instagram remain in the good-guy role by promoting top-performing articles designed to confuse paying users. In 2021, Hootsuite ran its own experiments and claimed that the shadow ban is a rumor: the ban, or lower engagement, isn’t caused by some racist algorithm but in fact by the use of banned hashtags.
In the article, Hootsuite’s social media marketing manager, Amanda Wood, and her team try to get shadowbanned on Instagram. In Hootsuite’s words:
So, how does one get shadowbanned on Instagram? People who claim to have been shadowbanned tend to use too many hashtags, use irrelevant hashtags, and leave generic comments on other users’ posts.
Over the course of the week, we posted content that would normally garner high engagement, but alternated between tagging it with 30 related hashtags (e.g. #vancouver, #vancity) and 30 unrelated hashtags (#skateboarding, #elevator). Stacey would also pop into accounts on her Explore page and comment “Nice post!” over and over again.
One caveat to this experiment was that Stacey wrote captions for each post explaining that she was trying to get shadowbanned, so her followers wouldn’t think she had been hacked. We can’t say whether this impacted the results of the test, but in any case we were more focused on testing the effects of the hashtags and comments.
After the week was up, we used Hootsuite Analytics to gather the results and came away with a few findings:
- Despite the account having a strong following with lots of active and supportive comments, there was a dip in engagement the week of the experiment (to 9.87%) compared to Stacey’s usual engagement of 17% throughout the summer.
- When comparing the engagement numbers from the last 3 months within the app natively, some of the shadowban posts made it to the top 10—suggesting that the hashtags were actually helping.
- However, the irrelevant hashtags appeared to bring almost no new users to the account. This makes sense: people who tap a hashtag are not looking for unrelated content, and won’t click through to it.
To sum up the experiment: from the perspective of hashtag use and comments, we found no evidence of a “shadow ban.” Instagram has been cracking down on bots and third-party apps, so it’s not surprising that imitating a bot won’t make the platform inclined to bump you up in the algorithm.
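Hootsuite’s write-up does not spell out how those engagement percentages were calculated. A common definition of an engagement rate is (likes + comments) divided by follower count, averaged per post and expressed as a percentage; the sketch below uses that assumed formula with hypothetical numbers, not Hootsuite’s actual methodology or data.

```python
# Hypothetical illustration of how figures like the 9.87% vs. 17% engagement
# rates above might be computed. The formula and all numbers here are
# assumptions for illustration, not Hootsuite's actual methodology.

def engagement_rate(posts, followers):
    """Average (likes + comments) / followers across posts, as a percentage."""
    rates = [(p["likes"] + p["comments"]) / followers for p in posts]
    return 100 * sum(rates) / len(rates)

# A hypothetical account with 1,000 followers
followers = 1000
normal_week = [{"likes": 150, "comments": 20}, {"likes": 160, "comments": 10}]
experiment_week = [{"likes": 90, "comments": 8}, {"likes": 95, "comments": 5}]

print(f"normal week: {engagement_rate(normal_week, followers):.2f}%")
print(f"experiment week: {engagement_rate(experiment_week, followers):.2f}%")
```

On these made-up numbers the normal week works out to 17.00% and the experiment week to 9.90%, mirroring the kind of dip the Hootsuite team reported.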
We found Amanda on LinkedIn (https://www.linkedin.com/in/amandacharlottewood/); she is a white woman from Canada. How is it that she got to run an experiment whose results do not reflect the reality for people like Nyome? Let’s examine a non-race-related aspect of this argument.
In September 2020, a user named Liv shared a story titled “I ‘Violated’ Instagram’s Community Guidelines For Saying ‘Men Are Trash’” on the site F Grls Club. She recounts the moment: “my caption read, ‘this is why men are trash and women deserve the world’. Less than 2 minutes later, I received a notification from Instagram, informing me that my story had been taken down for violating Instagram’s community guidelines.”
She continued: “the app allows women to be attacked by its users every day – users that are breaking its ‘community guidelines’. Honing in on and quickly removing content that could be interpreted as hate speech against men, then, is hypocritical. In short, women and non-binary people are challenged more on their nipples than misogynists are on their hate speech.”
Despite all of these accounts of the algorithm’s inequitable application, as of the time of this story nothing had been updated in Instagram’s community guidelines.