Instagram fails to protect female politicians from violent, racist, sexist comments

MARY LOUISE KELLY, HOST:

As the election revs up, so, too, do verbal attacks on politicians. That's not surprising, but social media is saturated with them, and trolls target female politicians with especially vitriolic messages. A new report looking at Instagram raises questions about how best to tackle the problem. NPR tech correspondent Dara Kerr reports. And before we start, a warning - this story has language that will offend some listeners.

DARA KERR, BYLINE: A group called The Center for Countering Digital Hate took a look at the Instagram accounts of 10 leading female politicians - five Democrats, including Vice President Kamala Harris, and five Republicans, including Congresswoman Marjorie Taylor Greene. The group combed through these accounts and says it found tens of thousands of toxic comments.

IMRAN AHMED: Things like, make rape legal. We don't want Blacks around us. Death to her.

KERR: Imran Ahmed is the founder of the group and says they reported a thousand of these abusive, vile comments to Instagram.

AHMED: Women who already are in politics receive a disproportionate amount of abuse and that - often that abuse is highly gendered. It can be threatening. It's designed to terrorize.

KERR: As violent as these comments sound, they're more prevalent than you might think. The group says that 1 in 25 comments directed at these female politicians was identified as toxic. So after contacting Instagram, Ahmed says the group waited a week to see if the company took any action.

AHMED: What we found was that 93% of the time, Instagram fails to act on that kind of threat, that kind of extreme identity-based hate.

KERR: To be clear, these kinds of hateful comments appear across all of social media. Instagram is not unique. The company said in a written statement that it, quote, "provides tools so that anyone can control who can comment on their posts" and that it will take action on content that violates its policies. Easier said than done. Faiza Patel works at the Brennan Center for Justice.

FAIZA PATEL: So there's a lot of different variables that go into content moderation, but the reality is it's always going to be imperfect.

KERR: Because content moderation is imperfect, she says, it can allow for hateful comments to stay up. And the result - it further normalizes that kind of toxic behavior.

PATEL: It's not enough to have great rules, right? That's just sort of typical, right? You also have to be able to enforce them.

KERR: The question is how to enforce. Imran Ahmed, from The Center for Countering Digital Hate, argues that government regulation is needed. He says tech companies aren't doing enough. They need to be held liable.

AHMED: It's about the message that we're sending to young women across the country.

KERR: Without action, he warns, this kind of flagrant abuse could deter the next generation of female leaders. Dara Kerr, NPR News. Transcript provided by NPR, Copyright NPR.


Dara Kerr
Dara Kerr is a tech reporter for NPR. She examines the choices tech companies make and the influence they wield over our lives and society.