#21 The People v Twitter Trolls

Social media abuse tries to shut people down, and is frighteningly common for women and people of colour. This is the story of how Amnesty crowdsourced thousands of volunteers to patrol Twitter and see if they could stop it.

Click on the play button above to stream it here. Or listen to the episode on Podcast One, Stitcher or Apple iTunes via these links. Or find it by searching “ChangeMakers” on your podcast app.

Or use a podcast app with our RSS feed. Broadcast 11 June 2019 as part of our third series.

Full transcript of Episode 21 – The People v Twitter Trolls

TARNEEN: My twitter was going off. I was getting thousands and thousands of people abusing me, being racist, being fat-phobic, telling me to kill myself.

HOST: That’s Tarneen Onus-Williams, a young Aboriginal and Torres Strait Islander human rights activist.

TARNEEN: Really misogynist, like, like rape things. I had people trying to hack into my Twitter account like about 30 times a day. I had people send me death threats in the mail.

HOST: Today on ChangeMakers we’re in cyberspace, covering the story of one organisation that’s trying to change the rules of engagement in the digital playground. It’s hard to understand why people are so awful to each other online. But they are. And after more than a decade, it’s clear social media networks have little interest in solving the problem themselves.

In this episode, instead of asking why social media has become so toxic, we’re looking at what can be done to bring it under control. Social media is such a vast, sprawling content explosion that the very idea of bringing it under control seems absurd. But this is the story of how Amnesty International used the very scale – and people power – of the internet for good, developing a groundbreaking crowdsourced research project to track the toxic experiences that women have online.

Let’s go.

HOST: I’m Amanda Tattersall, welcome to Changemakers, the podcast telling stories about people changing the world.

We are supported by the Sydney Policy Lab at the University of Sydney. They break down barriers between researchers, policy makers and community campaigners so we can build change together. Check them out at sydney.edu.au/policy-lab. And you can sign up to our email list at changemakerspodcast.org.

HOST: Let’s face it.  Social media has a woman problem.

TARNEEN: I’m part of Warriors of the Aboriginal Resistance, which is a group of young people nationally who organise around things.

HOST: As part of this resistance group, Tarneen organised the 2018 Invasion Day protests. Invasion Day is the name Aboriginal and Torres Strait Islander people – and increasingly other Australians – give to the day the British formally began colonising Australia. For Aboriginal and Torres Strait Islanders, it’s an in-your-face reminder of colonisation’s catastrophic impacts on their people.

TARNEEN (clip): “We’ve organised this to abolish Australia Day, ‘cos f-[bleep] Australia, f-[bleep] Australia, I f-[bleep]-en hope it burns to the ground.”

TARNEEN: I’ve said it in the context not to actually pour fuel on the ground and burn the country to the ground, but as in the systems that exist are continuing to abuse Aboriginal people, kill Aboriginal people, like the black deaths in custody, our imprisonment rates, child suicide and, like, I’m saying that those systems of abuse need to be burnt to the ground.

HOST: Some will consider Tarneen’s opinion controversial. But just because she had a strong opinion, controversially expressed, doesn’t mean she should be threatened and abused online about her gender, race and appearance. But she was.

And it’s this abusive backlash – suffered by women like Tarneen whenever they express a strong opinion – that prompted Amnesty to look at ways to protect the human rights of women on Twitter.

It was an interesting choice. Amnesty has traditionally been a strong supporter of social media freedom.

MILENA: In the context of Me Too, Twitter and social media have played a really important role in women speaking about their experiences and bringing harassers to justice.

HOST: Milena Marin is from Amnesty UK’s Technology and Human Rights Team.

SEYI: It allows people to challenge their beliefs, find their voice, cultivate their identity, to connect to people. 

HOST: Seyi Akiwowo is the Founder and Executive Director of Glitch, a not-for-profit organisation dedicated to ending online abuse. Seyi acknowledges the transformational power of online technology.

SEYI: It allows communities to develop online – so you hear about Latina Twitter, Feminist Twitter, and Black Twitter and Gay Twitter.

HOST: Social media not only connects people – it empowers them, says Tanya O’Carroll, Head of Digital with Amnesty.

TANYA: You know people using Facebook as a tool for protest. It felt really revolutionary that people could just be in the streets holding a phone.

HOST: Amnesty fully embraced the possibility of digital until 2013, then…

TANYA: Boom – Snowden disclosures. And you know suddenly there’s this, “Amnesty shouldn’t just be talking about tech for good and how do we use tech – like we need to be starting really seriously talking about technology as a major new threat to human rights.”

HOST: Edward Snowden blew the lid on governments running massive surveillance programs. But he also showed that digital platforms were collaborating with governments to deny human rights campaigners and protestors their freedom of expression.

TANYA: Most of the calls to companies were basically to not censor content and most of the thinking was you know you’re making arbitrary decisions with no due process to take down and block accounts and take down content and we’ve seen that happen with you know really important human rights video archives from Syria, for example.

HOST: But Amnesty, in advocating for Tarneen and women like her, started arguing that some social media controls were appropriate.

HOST: Hang on. Amnesty thinks some controls on speech are appropriate?

As is the experience for any woman online with a strong opinion, people’s responses to Tarneen’s comment on Invasion Day quickly got very personal and very toxic.

The Daily Telegraph’s Tim Blair blogged a photo of Tarneen, captioning it, “Tarneen wants Australia to burn, but probably not all the pie shops”.

Others online joined Blair’s fat-shaming crusade, with Tarneen also subjected to racist and sexist taunts, threats of sexual assault, and suggestions that she kill herself.

HOST: We know these attacks can have a very real impact on women’s lives.

VIDEO: “Good evening. After a long and public struggle with depression, television personality Charlotte Dawson’s life has ended tragically.

In 2012, Charlotte was admitted to hospital with depression after she received a barrage of vicious comments on Twitter. She posted a picture of herself holding tablets, with the comment, ‘I hope this ends the misery’.”

HOST: Charlotte Dawson was an anti-cyber bullying advocate who, days before her death in 2014, had appeared on Australian current affairs programs to expose vitriolic Twitter bullying, sharing how many people on Twitter had posted, “please go hang yourself”. Tarneen explains how the online abuse affected her.

TARNEEN: People are just like abusing you constantly, um, was like terrifying. Like it absolutely like ruined my self-worth. And I think that you begin to like not believe yourself, like the work that you’re doing, so it’s almost like, like, gaslighting yourself.

TARNEEN: I was like doing work around sexual assault and that’s something I can’t do now because, it’s like, it’s just I think it’s just too much to give, like those sorts of people, that part of myself, and I can’t share that part of myself now.

HOST: Ultimately, the consequence of this abuse is to exclude certain people from participating fully as online citizens.

TARNEEN: I, you know I was going to stop doing activism because of the abuse, like I was “I don’t think I’m going to do this anymore, like, it’s not worth it”. 

HOST: For Azmina Dhrodia, a former Amnesty researcher, Tarneen’s response to online abuse was all too common.

AZMINA: Women were talking about all of these horrible, horrible things that were happening to them online and the effect that it had on them, the effect that it had on how they expressed themselves, on how often they expressed themselves, or whether they expressed themselves. 

AZMINA: I was just looking at this and thinking this is a violation of your right to freedom of expression. 

TANYA: At the moment we’re prioritizing the freedom of expression of misogynistic men and trolls who want to send rape threats to women over the right of women to feel safe online and express themselves.

HOST: But just getting a handle on the scope of online abuse, let alone changing the online environment, is a big ask – because, as Milena explains, the online world is just so…. BIG.

MILENA: Every minute there’s 500 hours of video uploaded on Facebook – 3.3 million posts per minute.

HOST: And on Twitter?  Around 500 million tweets are sent every day – that’s 6,000 tweets per second and 200 billion tweets per year. In comparison, Amnesty’s Technology and Human Rights team had just three people. Three people.

While Tanya and Azmina saw there was a real opportunity to promote the digital human rights of women, they knew they couldn’t take on every social media platform.

TANYA: What came out was that Twitter was the space where this was happening most prolifically, and it was by far the company that was clearly doing the least to respond and to take reports from women seriously. They just were not addressing it.

HOST: But it wasn’t just the scale that was the problem. This was Amnesty. You know, the group famous for its letter writing.

Tanya, with her tech background, experienced a bit of culture shock when she joined Amnesty back in 2012.

TANYA: I mean there were desks filled with piles of paper. There were cabinets overflowing with research notes.

TANYA: Pen and paper was the most widespread tool for doing human rights research. And that’s because researchers go out into the field with their pen and paper and maybe, if you’re lucky, an audio recorder, and they meet people and they talk to victims of human rights abuses and gather testimony of human rights abuses and then they write that all up as a PDF report, like 60, 70 pages. A year after they’ve been in the field, it goes live. I mean that’s the model that we’ve been using for decades.

HOST: Field research remains an important component of Amnesty’s campaigns, but Tanya wanted to draw on other sources.

TANYA: We could be responding, and much closer to real time, by just looking at what was on social media, by reviewing the videos that were coming up on YouTube, and we might be able to get a story out much much much quicker if we were able to analyse that data and use it as evidence, rather than relying on field missions.

HOST: Amnesty needed to change.

TANYA: The first few years were very focused on, on how to get Amnesty to think differently about tech.

TANYA: It was finding the pockets across the movement in different countries where people were thinking in similar ways and wanting to try new things out and bringing those people together as a bit of a network.

And I think it has really changed in the last 7 years.

HOST: So plans began to emerge. Tanya and Azmina knew that just writing a report on the toxic treatment of women on Twitter would not build sufficient momentum for change.

So what would?

TANYA: And so we wanted to have multiple types of research outputs over a couple of years that would sort of build and create a campaign on a story arc that people would follow.

HOST: It wouldn’t just be about method. Story would be central to how Amnesty built this campaign.

TANYA: I think sometimes it is the sort of view that we’re fact finders primarily, because we have to be credible, we have to have watertight evidence, but we fail a little bit at the storytelling point.

HOST: Azmina decided that women’s personal stories would be central to the campaign.

AZMINA: So I made sure to speak to black women, to Muslim women, to Jewish women, to LGBT women, to non-binary people. I wanted to reach out across the spectrum of high profile individuals who use Twitter, so journalists, politicians, activists, comedians, games developers, comic book writers.

HOST: Seyi Akiwowo.

SEYI: I was one of the first people that Azmina interviewed for the Toxic Twitter report and I was very happy to do it, because I think it’s really important when we talk about online abuse to be intersectional because before this report the women that were being showcased around online abuse were white women.

HOST: Seyi had been invited to the European Parliament to talk about youth engagement in politics and democracy.

SEYI: And there was an emergency session about the Syrian refugee crisis and I made an impromptu speech in response to the awful, awful things I was hearing from the young far right of the Front National, which is a fascist political party in France.

HOST: One of the interesting things about trolling is you never know when an attack might come.

SEYI: A few months have passed after that moment and I was at the gym getting summer-body ready and my Spotify playlist kept cutting out, so I stopped the treadmill to find out why my ‘Rihanna – work, work, work’ playlist wasn’t playing properly. I pick up my mobile phone and I see a flood of notifications all full of racist abuse, sexist racial slurs, and even death threats across my social media account and that was my first time experiencing online abuse of this magnitude. It really made me understand the impact that online abuse has on human rights, freedom of expression, mental health and wellbeing, and safety, and, and opened my eyes actually to this huge, huge gap around duty of care.

SEYI: Social media companies were happy to say that, “Oh we are just the host platform – we – it’s not our fault we can’t take responsibility”. 

HOST: Azmina interviewed 86 high-profile women in the UK and US, held focus group discussions with young women, and Amnesty engaged Ipsos MORI to poll women in eight countries about their social media experiences.

The poll found 23 per cent of women surveyed had experienced online abuse or harassment – with that figure rising to 33 per cent in the US.

AZMINA: There’s so many different ways that people troll and abuse women online.

AZMINA: Especially if that tweet was around feminism, or human rights, or discrimination issues more broadly, the response to that would often be sexist or misogynist in nature.

AZMINA: So some of it ranges from, “Shut up, go back to the kitchen. I don’t want to hear your opinion, Go make a cup of tea love.” And some of it goes to very graphic and extreme threats of rape and torture, sexual violence or death.

HOST: Amnesty wanted its report to include more than just the qualitative research it’s famed for.  It wanted to crunch some serious data… Milena Marin was Amnesty’s data expert.

MILENA: And at the time one of the organisations that was interested in working with us was Accenture. They were keen to organize a hackathon for us and they asked me what kind of interesting data challenges I could have.

MILENA: One of them was the Twitter project and they were really interested in that.

HOST: The project involved analysing the level of abusive tweets sent to women politicians in the lead up to the 2017 UK general election. OK, that seems clear enough – but what’s a hackathon?

MILENA: A hackathon is a weekend where people work almost uninterrupted on an issue.

MILENA: People working at their laptops fuelled by a lot of caffeine and Red Bull and whatnot. 

MILENA: So in our case we had data scientists working with human rights researchers to try to understand how we could use Twitter data to establish volumes of abuse on the platform.

HOST: So, did all that caffeine and Red Bull and whatnot allow Amnesty to accurately quantify abuse?

MILENA: So this was an interesting first experiment. We tried to analyse tweets searching for keywords, but that generated a lot of false positives, because people use bad words in positive contexts. So it wasn’t really good enough, it wasn’t solid enough in terms of methodology, and our experiments, you know, within the hackathon, just showed us, really. It showed us that abuse comes in large volumes. It also showed us that we have to do more work in order to properly quantify that.
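
It’s easy to see why bare keyword matching falls over. Here is a minimal sketch of the problem – the keyword list and example tweets are invented for illustration, not Amnesty’s actual data:

```python
# A naive keyword filter: flag any tweet containing a "bad" word.
ABUSIVE_KEYWORDS = {"kill", "die", "ugly"}

def flag_by_keyword(tweet: str) -> bool:
    words = (w.strip(".,!?") for w in tweet.lower().split())
    return any(w in ABUSIVE_KEYWORDS for w in words)

print(flag_by_keyword("Go kill yourself, nobody wants you"))  # True  - real abuse
print(flag_by_keyword("This dessert is to die for!"))         # True  - false positive
print(flag_by_keyword("Get back in the kitchen, love"))       # False - abuse missed
```

Keywords catch the crudest cases, flag harmless banter, and miss abuse that uses no obvious slur at all – which is exactly the methodological gap the hackathon exposed.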

HOST: So Azmina took the work from the hackathon and did further analysis.

AZMINA: And that finding showed that the first black female MP in this country received almost half of all abuse directed at female MPs on Twitter, and 10 times more than any other woman MP that was running.

HOST: The MP was Dianne Abbott, the Shadow Home Secretary. Abbott found social media invaluable in getting her messages out to the electorate, and she enjoyed the cut and thrust of online debate.

DIANNE ABBOTT: The problem is when people are not engaging in debate or scrutiny, but just showering you with abuse – that you’re a nigger, that you’re a prostitute, threats against your safety. I get a double whammy. I’m abused as a female politician and I’m abused as a black politician.

HOST: This intersectional nature of abuse – where people’s specific identities can supercharge it – was one of Amnesty’s key findings.

AZMINA: And then that also includes, you know, racism Islamophobia, anti-Semitism, transphobia, homophobia, and so you know you’re just, you know, if you are a woman, and if you’re a woman with different identities, you know you are just experiencing so many different types of abuse online.

HOST: Dianne Abbott explains how Twitter abuse impacted on her.

DIANNE ABBOTT: It can make you second-guess yourself as to what you talk about and what you write about, because you know raising certain subjects will get a very predictable response.

I think the upsetting thing about it is the volume. When I was a new member of Parliament, you know, you might get one racist letter a week – you had to put it in an envelope, you had to put a stamp on it, you had to put it in a letterbox. Now, on some days, we get hundreds of items of abuse.

HOST: While Dianne Abbott was a member of the UK Labour Party, Amnesty found trolls were remarkably bipartisan.

MILENA: We also looked into political affiliation, trying to see whether, you know, there’s any difference if women represent liberal or conservative causes, whether it makes any difference in the abuse they got, and what we’ve seen is that it’s pretty much the same across the board.

HOST: Amnesty wasn’t just focussed on women’s experiences of abuse on Twitter, but how Twitter responded to their reports of abuse.

AZMINA: So stuff that was clearly against the Twitter Rules would not be found to be in violation of the Twitter Rules. So there was really nothing they could do about the content. And as a result they just lost faith in the company to do anything about abuse on the platform. So that meant that they had to change their behaviours, so they would tweet less, they would take social media breaks, lots of high profile women have, you know, very famously left Twitter.

HOST: It wasn’t just trolls who were silencing women on Twitter – it was Twitter itself.

SEYI: So I was engaged in that kind of consultation around recommendations on what we would like to see Twitter do. And I was heavily pushing for transparency.

SEYI: Why is there reluctance from Facebook and Twitter to share information about the moderators? 

HOST: Twitter employs moderators to enforce its policies against hateful conduct and abuse.

The problem is that moderators don’t necessarily do this – and the way Twitter applies its criteria for different levels of response to reports is unclear.

Twitter did not provide Amnesty with requested details about the content moderation process, moderator training or the number of moderators employed per region.

Then, a couple of weeks before Amnesty’s Toxic Twitter report was released, Twitter changed its tune, stating it would email account holders whose accounts were suspended to make clear which rules their tweet had broken.

While this was welcome, it was too little too late.

HOST: Amnesty released its report Toxic Twitter – A Toxic Place For Women on a day designed to provide maximum media impact.

The 21st of March 2018 – Twitter’s twelfth birthday. And it reported its findings in a new way.

MILENA: I think the Toxic Twitter campaign was one of the most interesting ones we’ve had and it’s one of the campaigns that really tried to break the rules. For example we didn’t have a traditional report at the end of it. It was a digital report. 

HOST: Amnesty designed its online report to be read by social media users. It embedded tweets and blogs from women and used video files.

MILENA: Digital technology makes us able to reach many, many more people. It makes us able to engage them in a more meaningful way. And it makes us able to get them closer to our cause.

HOST: The release of the Toxic Twitter report received huge media attention and… some attention that was… a little toxic…

AZMINA: Actually the most abuse I’ve ever experienced on Twitter was in response to publishing a report about violence and abuse against women on Twitter. And the irony is not lost on me there. And the further irony is that when we reported this abuse on Twitter it came back as saying this tweet is not in violation of Twitter’s rules.

HOST: Still, Amnesty’s report was on Twitter’s radar. Twitter released an updated Transparency Report, which contained limited data on levels of online violence and abuse, and Twitter’s responses to that content.

But this report was just the first step.

TANYA: And so after we launched the Toxic Twitter report and the campaign that went with that, we thought this is a real chance for us to do something different and to do something that also feels like a bit like we’re fighting back. You know that the idea that we’re not just telling the world there’s a problem but we’re saying be part of the solution to change this and to challenge this. And so we launched Troll Patrol.

HOST: So what is Troll Patrol?

It was designed to create a different scale of research about Twitter.

MILENA: So, in Troll Patrol what we’ve done, we looked at seven hundred seventy eight women politicians and journalists from the UK and US. We looked at all their mentions for the whole of 2017 and they received about 14 million tweets altogether in mentions.

HOST: Right. Let’s just let that sink in for a second. They wanted to track 14 million tweets for abusive content.

To do that they wanted to mark up which of those tweets were abusive and which were merely problematic. Problematic tweets were hurtful or hostile, particularly if repeated, but didn’t meet the threshold for abuse.

Then Amnesty wanted to identify the type of abuse – for example, sexist, racist or homophobic abuse – and flag whether the abuse was in the tweet itself, or in an attachment – as some people try to avoid Twitter keyword monitoring by concealing abuse in attachments.
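
In code, one volunteer’s judgement under this scheme might be captured in a record like the sketch below. The field and category names are illustrative assumptions, not Amnesty’s actual schema:

```python
from dataclasses import dataclass, field
from enum import Enum

class Verdict(Enum):
    NOT_ABUSIVE = "not abusive"
    PROBLEMATIC = "problematic"  # hurtful or hostile, but below the abuse threshold
    ABUSIVE = "abusive"

@dataclass
class TweetLabel:
    tweet_id: str
    verdict: Verdict
    abuse_types: list[str] = field(default_factory=list)  # e.g. ["sexist", "racist"]
    abuse_in_attachment: bool = False  # abuse concealed in an image, not the tweet text

# One decoder's judgement of one tweet.
label = TweetLabel("123456789", Verdict.ABUSIVE, ["sexist"])
```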

The data analysis and reporting required for all this was quite simply… huge.

MILENA: Of course, volume is a huge challenge and we’re not equipped for that. We’re not a technical organisation that can deal with large data streams.

MILENA: We don’t have the internal capacity to process that data.

MILENA: And at the same time we have millions of activists around the world that want to work with us, want to engage with us.

HOST: Amnesty had always involved its members in its campaigns.

MILENA: Amnesty is known as a letter writing organisation.

HOST: But while bombarding politicians and other decision-makers with letters was once an incredibly effective campaign technique, it doesn’t cut it in the digital world.

MILENA: Online we just ask people to sign petitions, like us on Facebook, or tweet us and then kind of, these were the limited options you could, you could, engage with Amnesty International online. 

TANYA: The petition boom that happened over a few years was so uninspiring.

HOST: All over the digital organising space people were looking for new ideas. I was part of those discussions. At GetUp, Australia’s largest digital campaign organisation, we too were exploring how we could do more than just ask people to sign petitions. While petitions were amazing when we all first started using them, rinse and repeat too many times and they got a little stale.

Everyone was looking for ways to innovate. What could a deeper form of digital engagement look like?

TANYA: And we really wanted something that was going to be a more meaningful way of digitally volunteering time and being part of the human rights change, being part of the action. 

HOST: So how many volunteers would it take to classify 14 million tweets?

In 2016, Amnesty had borrowed from citizen science and asked volunteers to join its new Decoders Program.

The Decoders Program killed two birds with one stone – it helped Amnesty crunch data and it helped to engage and mobilise its volunteers.

MILENA: So decoders use a type of crowdsourcing. It’s called micro-tasking. What we do with micro-tasking is we take a very large research task. We split it into small pieces, or micro-tasks, and then we distribute these micro-tasks over the Internet to lots of people. So it’s like an all hands on deck situation. People help and in the space of a few weeks we can process very very large volumes of data. And this helps Amnesty to do unprecedented large scale research.
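
The mechanics are simple enough to sketch. Here the batch size is an arbitrary assumption; the point is just the split-and-distribute pattern Milena describes:

```python
from itertools import islice

def micro_tasks(items, batch_size=50):
    """Split one very large labelling job into batches a volunteer can finish in a sitting."""
    it = iter(items)
    while batch := list(islice(it, batch_size)):
        yield batch

tweets = [f"tweet_{i}" for i in range(1_000)]
batches = list(micro_tasks(tweets))
print(len(batches))  # 20 batches of 50 tweets, each handed to a different volunteer
```

In practice each item is usually shown to several volunteers, so their independent judgements can be compared before a label is accepted.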

MILENA: We ask them to analyse satellite images of places like Darfur to determine when villages were burned. We ask them to help us understand the effect of oil spills in the Niger Delta on local populations by analysing investigation reports of oil spills and images of spills.

HOST: Could Amnesty’s volunteers analyse Twitter data in the same way they analysed satellite images?

MILENA: When it comes to online abuse, people have different interpretations and it’s a much more subjective decision. So it was, it was harder for us to come up with ways to analyse the data. 

HOST: Amnesty partnered with Element AI to design the Troll Patrol project.

Julien Cornebise was an Amnesty volunteer who’d worked on decoding satellite images of Darfur, and who now works for Element AI.

JULIEN: Element AI … It’s a software enterprise company.

JULIEN: We have this branch, which is this A.I. For Good Team, which works, in this case, pro bono with Amnesty International, to provide them with the technical resources that they need. And we do that as part of long partnerships, not just one off projects. 

JULIEN: I don’t go out to change or save the world – I mean I’m a, I’m a mathematician. I’m a computer scientist. I have no clue how to fix the world. 

JULIEN: The idea is to look at people working at Amnesty, like Tanya, like Milena.

JULIEN: They spend their entire careers studying these big problems that we face as a society or as a species even. And they have a track record of getting impact.

JULIEN: So what I see there is, well with the tools that I’m developing and that all the nerds around the world are developing, well maybe we can help them do what they already do, but do it at a bigger scale.

HOST: So what did Amnesty do? They set up a collaboration with a group that had complementary skills.

Element A.I. looked at the outcomes of the earlier hackathon to scope how decoders could label tweets as abusive or problematic.  

JULIEN: And then we helped them do the maths and realised, OK if we want to go for this amount of people, well it’s this amount of tweets, how would we set that up?

JULIEN: How do we choose which tweets to label? How do we pick them from among the 14 million tweets that have been sent that year to these women? Which ones do we put up for labelling, and how do you do that?

JULIEN: How do we make sure that we can zoom in on the amount of abuse to be able to characterise the abuse quite precisely from a statistical viewpoint? How can we make sure that any analyses that we do is statistically sound?
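
These are standard survey-sampling questions. A minimal sketch of the underlying idea – label a random sample, then put a confidence interval around the estimated rate and scale back up. All the numbers here are invented, and the project’s actual sampling design was more sophisticated than this:

```python
import math
import random

random.seed(0)
POPULATION = 14_000_000  # tweets mentioning the women in 2017
SAMPLE_SIZE = 10_000

# Draw a simple random sample of tweets to send out for labelling.
sample = random.sample(range(POPULATION), SAMPLE_SIZE)

# Suppose volunteers labelled 730 of the sampled tweets abusive or problematic.
positives = 730
p = positives / SAMPLE_SIZE

# 95% confidence interval for a proportion (normal approximation).
margin = 1.96 * math.sqrt(p * (1 - p) / SAMPLE_SIZE)
print(f"rate: {p:.1%} ± {margin:.1%}")            # 7.3% ± 0.5%
print(f"estimated total: {p * POPULATION:,.0f}")  # scaled back up to 14 million tweets
```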

HOST: Element A.I. and Amnesty also wanted to see if artificial intelligence, informed by machine learning, could help analyse the tweets.

JULIEN: I would say that artificial intelligence is a goal – the goal to create a machine that thinks. And the most efficient way to do that is a tool kit of machine learning. And that is a subfield of mathematics and statistics and computer science where instead of giving your computer a fish, you teach it how to fish. Instead of telling it here’s how you recognise a cat from a dog, you give it plenty of images of cats and dogs, and you give it an algorithm that can learn from examples.
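
Here is a toy version of “learning from examples”, using a generic off-the-shelf text classifier. It is purely illustrative – made-up training data and a stock scikit-learn pipeline, not the model Element AI actually built:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Tiny invented training set: tweets already labelled by humans (1 = abusive, 0 = not).
tweets = [
    "go kill yourself",
    "you are disgusting and worthless",
    "great speech today, thank you",
    "looking forward to the debate tonight",
]
labels = [1, 1, 0, 0]

# No hand-written rules: the model learns word patterns from the labelled examples.
model = make_pipeline(TfidfVectorizer(), LogisticRegression())
model.fit(tweets, labels)

print(model.predict(["you are worthless"]))    # likely [1]
print(model.predict(["thank you for today"]))  # likely [0]
```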

HOST: Julien and Amnesty hoped the decoders could train an algorithm that would allow machines to analyse tweets for abusive or problematic content.

With the Troll Patrol project scoped, it was time for Amnesty to get the decoders on board.

MILENA: We have sections and regional offices in over 100 countries. So we recruit volunteers through our sections and through our local presence in various countries around the world.

MILENA: We recruit them online. We recruit them through e-mail. We do physical events as well. We had over six thousand five hundred volunteers. They were from all over the world, from over 150 countries.

HOST: You don’t have to be a tech-head to be an Amnesty decoder.  

MILENA: We also have, you know, very young people that are excited to sometimes do their first project with Amnesty. We have people in their 60s and 70s that are retired and, you know, have a lot of time on their hands and want to do something meaningful.

MILENA: Of course we had lots of women participating in this project. It’s a project that was close to their hearts, but we had lots of men as well.

MILENA: We had lots of men from Pakistan, for example, participating, recognising that abuse and misogyny is a problem in their society as well, and they wanted to do something about it.

HOST: Jacqueline Wigfall is an African American teacher who joined Troll Patrol as a decoder.

JACQUELINE WIGFALL: I really got active with the decoder stuff. And that was during a time where I had finished teaching abroad, but I wasn’t working again here in the United States. So I had a lot of downtime and I was a little bit lonely.

HOST: Amnesty focussed on developing a real Troll Patrol community.

JACQUELINE: It was almost like a Facebook world.

JACQUELINE: The facilitators encouraged us to interact with other decoders and not only ask and answer questions, but we could like something or send hearts or support.

JACQUELINE: There were ways to post support to individual people or the group as a whole. You could send smiling faces and hearts, so you just kind of push the icon and there’d be a bouquet of flowers or there’d be, like, a kitty cat. And so they made it really fun. 

HOST: Amnesty knew they would need to provide volunteers with support – because the Troll Patrol decoders were exposed to some pretty awful stuff.

MILENA: We wanted to acknowledge that people can be traumatized by reading this content.

MILENA: The first and most important thing is to let people know that distressing content is coming. Don’t take people by surprise. So we put warnings all over the website, all over the promotional materials, in all e-mails we send to people.

MILENA: So we give people a chance to say that’s not for me.

MILENA: The other step was to try to educate people on how to deal with the abuse.

MILENA: Then one other important step was to automate the process of asking people to take breaks.

MILENA: So every time they would label a tweet as abusive they would see a message saying, “Hey, you just said that this tweet was abusive. We know this is hard. We know that you just read possibly some really nasty content. So try to take a break and be kind to yourself because you just read something horrible”. We also had a forum where people could discuss with each other and with us.

MILENA: And the forum was amazing. People showed so much solidarity with each other, and they were there to help each other and lift each other up.

HOST: The decoders couldn’t analyse all fourteen million tweets – but they could analyse a statistically valid sample.

MILENA: In just a few weeks, they worked two thousand five hundred hours, which is the equivalent of somebody working full time for one and a half years, and altogether they analysed six hundred thirty one thousand tweets.

HOST: Jacqueline alone labelled about 15,000 tweets.

JACQUELINE: I guess it’s showing up, it’s showing up with time if you have it. A willingness to learn, a capacity to listen, and maybe the slightest smidge of bravery.

AMANDA: You can make a difference

JACQUELINE: You can make a difference. 

HOST: And the results of the Troll Patrol study show that a difference really needs to be made. When the sample results were applied to all fourteen million tweets, well… let’s hear it from Milena.

MILENA: And what we found was that out of the 14 million, one point one million tweets were problematic or abusive. So that means about seven percent of the mentions of a woman would be either problematic or abusive. That’s 1 in 14 tweets. We found that, you know, looking over time, every 30 seconds one of the women in our group was abused.

MILENA: Women of colour were 34 percent more likely to be abused than white women. But among women of colour – Black, Asian, Latina and mixed-race – Black women stood out. They are 84 percent more likely to be abused than white women. So really they are the ones that get the brunt of online abuse.
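
Those headline figures are rounded, but they hang together. A quick back-of-the-envelope check of the “every 30 seconds” claim:

```python
abusive_or_problematic = 1_100_000   # flagged tweets across 2017
seconds_in_a_year = 365 * 24 * 3600  # 31,536,000

print(seconds_in_a_year / abusive_or_problematic)  # ≈ 28.7: one roughly every 30 seconds
```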

HOST: The Troll Patrol study also found that Twitter will not be able to replace its human moderators with machines any time soon.

MILENA: What we’ve seen is that machine learning and algorithms are not a solution for online abuse. They don’t get it right all the time, and they risk either censoring legitimate posts or missing abuse.

MILENA: So we are advising companies that for the time being human moderators are the best solution. We are not claiming to have all the solutions because Twitter still is not releasing information.

MILENA: So our main call to action to Twitter was to release more meaningful information and really engage a broader range of actors in solving this issue, because it’s clear that by themselves they’re not doing a good enough job. 

MILENA: And we are not confident that they are investing a lot of resources in understanding the problem well enough to be able to solve it. 

HOST: Amnesty released its Troll Patrol report in December 2018.  

MILENA: What we did was to have the data to back what women have long been telling us, that, you know, Twitter is a place where racism, misogyny, and homophobia are allowed to flourish unchecked. 

HOST: The report was embraced by women who’d been trolled.

MILENA: I received calls from people who said, “Thank you for doing that because, you know, I thought I’m crazy. I thought I’m weak for not being able to stand this. I thought it’s normal and you’re telling us it’s not normal.”

SEYI: I think this research helps women and black women confirm they’re not crazy, or they’re not silly, or they’re not oversensitive.

HOST: The report was also embraced by the mainstream media – after all, women journalists were regular targets of abuse… And new allies were emerging…

MILENA: So one of the incredible things that happened just a couple of days after the launch, because we had so much media pick-up, was that one of the organisations interested in the report was Citron Research. They are a financial investment newsletter advising people around the world where to invest their money. And they basically said that people should not invest in Twitter because it’s the Harvey Weinstein of social media. And immediately after, Twitter shares dropped 13 percent in a single day.

HOST: Money talks and Twitter is now talking to Amnesty and Element A.I. about how it can make its platform a safer place for women.

And following the Christchurch massacre in New Zealand, there is a greater willingness among social media companies to look at how they can better address hateful, violent and abusive content.

Amnesty’s work will continue to inform debate and positive change in the online space.

TARNEEN: It definitely needs to be on people who don’t experience online abuse to use their voices to continue to campaign against Twitter, to stop the trolling that happens. If it’s just people who are affected then like our voices are just gonna be shot down again…and it just further disenfranchises the people who are affected.

SEYI: Because let’s not forget, Facebook was set up to rate women. It was never set up to support, stand in allyship with, or protect women. We should be investing in digital citizenship education, so that when we have the new Facebook, the new MySpace, the new MSN in 2026 or whenever, it doesn’t have these fundamental flaws in the first place.

AZMINA: I think Amnesty working on these issues validated these women’s experiences as a human rights issue. It’s not just something that they were experiencing alone.

AZMINA: And I think that’s, that’s really important.

MILENA: But it wasn’t just Amnesty International – it was Amnesty International and six thousand five hundred volunteers. It was a tech partner. It was all the journalists that spoke about it and decided to cover this issue.

MILENA: So they could not ignore us any longer.
