By Milena Marin, 23 January 2017
Milena demonstrates how Amnesty International uses micro-tasking, the process of splitting a large job into smaller tasks, for their human rights research and how they first tested these methods on their biggest dataset: the urgent actions archive. She explains the technology they used, the outcomes of the campaign and how it helped identify the characteristics for a successful urgent action campaign.
Micro-tasking - the big picture
About a year ago, I started working with Amnesty International on an exciting civic tech project that is pushing the boundaries of the organisation in two key areas - engagement and research. The project encourages Amnesty supporters to contribute in new and meaningful ways and at the same time, it challenges traditional research methods by engaging non-experts as researchers.
Amnesty Decoders was born out of a desire to inspire a new generation of digital activists to contribute in real time to human rights research. With this project, Amnesty is building a community of digital activists who are able to work with mountains of messy, unstructured information and transform it into structured evidence of human rights violations.
This is powerful not only because it allows an unlimited number of people to generate valuable data for research, but also because this act engages people in human rights in a meaningful way that moves beyond usual forms of 'clicktivism' or engagements that focus on the sharing of information or the provision of financial support.
We believe this project can support many different types of research engagement, like the analysis of satellite images of refugee movements in war zones, or illegal house demolitions; the categorisation and verification of video footage showing war crimes; or even the ability to alert international decision-makers in real-time about hate speech threats made on social media.
Unpacking micro-tasking
Given what we wanted to achieve, the technology choice for this project was pretty straightforward. We needed something that would enable us to engage large numbers of people and that would allow anyone to participate without training or barriers to entry.
Micro-tasking
Micro-tasking is the process of splitting a large job into small tasks so it can be distributed to many people. This is especially useful for situations where people perform better than machines or algorithms such as when it is important to detect context, irony, positive or negative bias, or accurately recognise what is in an image or video.
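To make the idea concrete, here is a minimal sketch in Python (not Amnesty's actual tooling; the document IDs, volunteer names and review count are purely illustrative) of how a large archive could be split into per-document tasks, each handed to several volunteers:

```python
import random

def build_microtasks(documents, volunteers, reviews_per_doc=5):
    """Split a large job (a list of documents) into per-document
    micro-tasks, each assigned to several different volunteers."""
    tasks = []
    for doc in documents:
        # Redundancy: the same document goes to several people,
        # so no single contribution decides the result on its own.
        assignees = random.sample(volunteers, k=reviews_per_doc)
        tasks.append({"document": doc, "assignees": assignees})
    return tasks

# Hypothetical inputs, just to show the shape of the output.
docs = [f"UA-{i:04d}" for i in range(1, 6)]
people = ["ana", "bo", "chen", "dara", "eli", "femi"]
for task in build_microtasks(docs, people, reviews_per_doc=3):
    print(task["document"], "->", task["assignees"])
```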
Although micro-tasking projects are new to Amnesty International, there were plenty of examples we were able to look to, ranging from commercial services like Amazon’s Mechanical Turk to amazing citizen science projects like Zooniverse, open data platforms like Voz Data or non-profit initiatives like Geo-Tag-X.
Not-for-profit micro-tasking projects have demonstrated that many people have a genuine desire to contribute towards the "greater good", and technology makes it simple for anyone to pitch into a common effort. These projects were and are an immense inspiration to us and provided invaluable lessons about what works and what doesn’t. You can read more about our extensive research into micro-tasking here.
Amnesty's Decoders micro-tasking portal for the Decoding Darfur project
Our first pilot: Decode Urgent Actions
For our first pilot project, we looked into one of the biggest datasets curated by Amnesty International: the urgent actions archive. Urgent actions are a form of campaigning for individuals based on a simple idea - when someone is in immediate danger of serious human rights abuse, those responsible receive thousands of letters, faxes and messages through social media from all over the world demanding action. This pressure has the potential to protect that person from further abuse.
Since 1973, Amnesty has issued more than 25,000 Urgent Actions and in doing so has mobilised hundreds of thousands of people to stand up against injustice. The Amnesty database that documents these actions is one of the largest and most consistent databases of human rights violations that exist. Most of the documents the database links to are digitised and carefully archived with metadata including date, country and the type of human rights violations.
To help us mine and learn from this incredibly rich archive, we reached out to our supporters with a simple proposition: 'help expose human rights violations one click at a time'. In this case, all that supporters were required to do was to read a document and answer a few simple questions.
A gif of the Urgent Actions interface for a campaign on human rights in Burundi by Amnesty International
Just a month after the June 2016 launch we had already engaged 6,500 digital volunteers who collectively spent over 1,000 hours and decoded 12,500 tasks, helping us to review each document at least five times. This level and depth of participation far exceeded our expectations.
We were extremely lucky to have a community of researchers and designers from the Digital Methods Summer School in Amsterdam analyse and visualise the resulting data soon after our pilot was completed. This analysis helped us to quickly understand the validity of the data and to identify emergent trends.
We were able to validate 80% of the data put in front of our digital volunteers. The tool we used had a built-in redundancy system, meaning that each document was reviewed by at least five different people. In some cases it was impossible to establish consensus, but most of the contributions corroborated each other. From the validated data, we could quickly see that we had a positive outcome in 67% of cases.
Visualisation of regions' violations by outcome from 1980 to 2016, by Amnesty International
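In spirit, that validation step is a majority vote over the redundant reviews: an answer counts as validated when enough reviewers agree, and is flagged as unresolved otherwise. A minimal sketch, assuming hypothetical document IDs and answer labels rather than the real tool's data model:

```python
from collections import Counter

def consensus(answers, min_agreement=3):
    """Return the majority answer for one document, or None when
    reviewers do not agree strongly enough to call it validated."""
    counts = Counter(answers)
    answer, votes = counts.most_common(1)[0]
    return answer if votes >= min_agreement else None

# Five independent reviews per document (hypothetical data).
reviews = {
    "UA-0001": ["positive", "positive", "positive", "unclear", "positive"],
    "UA-0002": ["positive", "negative", "unclear", "negative", "positive"],
}
for doc_id, answers in reviews.items():
    result = consensus(answers)
    print(doc_id, "->", result or "no consensus")
```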
What characteristics make an urgent action campaign most successful? Spoiler - be a single woman!
Statistical analysis of the results has also helped us profile the characteristics of a successful urgent action campaign. On average, urgent action campaigns for women have been more successful than those for men, and calls on behalf of specific named individuals are more successful than those for groups of victims.
The average statistical likelihood of a positive urgent action outcome for all individuals is 67.3%. However, the likelihood of a positive outcome for a group of two to eight people drops to 56.1%, and for groups of more than nine people it is 46.6%.
Graphs and findings created for Amnesty International's research team
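Figures like these come from straightforward aggregation of the decoded records. As a rough sketch only, with invented field names and toy data rather than the real dataset, the positive-outcome rate by size of the group covered by an urgent action could be computed like this:

```python
def outcome_rates_by_group_size(records):
    """Group decoded urgent actions by how many people they cover
    and compute the share with a positive outcome."""
    buckets = {}
    for rec in records:
        size = rec["group_size"]
        if size == 1:
            bucket = "individual"
        elif size <= 8:
            bucket = "2-8 people"
        else:
            bucket = "9+ people"  # illustrative bucketing only
        total, positive = buckets.get(bucket, (0, 0))
        buckets[bucket] = (total + 1, positive + (rec["outcome"] == "positive"))
    return {b: positive / total for b, (total, positive) in buckets.items()}

# Toy records, purely illustrative.
sample = [
    {"group_size": 1, "outcome": "positive"},
    {"group_size": 1, "outcome": "negative"},
    {"group_size": 4, "outcome": "positive"},
    {"group_size": 12, "outcome": "negative"},
]
print(outcome_rates_by_group_size(sample))
```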
Putting the pieces together
The first and most important thing we learned was that it worked. Our core assumption that Amnesty supporters would be interested in this type of engagement was validated. This project showed that there is an immense appetite for deeper engagement among Amnesty volunteers. We started with a soft launch, working only with Amnesty's Swedish section and with supporters from countries without an official Amnesty presence.
The plan was to follow this with a public launch, using global social media channels and engaging more of our Sections around the world, but we never needed that stage: participation was so strong that we completed the project during the soft launch.
We were also able to validate our assumption that micro-tasking can be valuable for human rights research. We verified 80% of the data and learned important lessons on how to increase this percentage in the future. For example, when designing the project we prioritised user experience – getting people to try it out as soon as possible, without asking them to register or take a tutorial.
We wanted to design something so simple that people could start immediately. However, this generated some bad data – curious users initially experimented with random answers.
We also had a high number of “skips” – instead of answering the questions, people jumped to the next document. Initially we thought people would skip ambiguous documents where it was hard to answer the question, but we quickly realised that “skips” were not a function of how difficult a document was but rather a way to browse the data and read different types of Urgent Actions.
As a result of these lessons, data quality will certainly be our main focus in the next project. We will implement solutions such as encouraging more people to log in while respecting their privacy, running interactive tutorials, and possibly removing the option to skip tasks while assuring them that all we want is their best guess.
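One standard crowdsourcing technique we could draw on, described here purely as a sketch rather than something the pilot already implemented, is to score each contributor by how often their answers match the eventual consensus, so that accounts whose answers look like random clicking can be down-weighted or reviewed. The field names and data below are hypothetical:

```python
def agreement_rates(contributions, consensus_by_doc):
    """Fraction of each contributor's answers that match the
    consensus answer for the same document."""
    stats = {}  # contributor -> (matches, total)
    for c in contributions:
        doc, who, answer = c["doc"], c["contributor"], c["answer"]
        expected = consensus_by_doc.get(doc)
        if expected is None:
            continue  # no consensus yet, nothing to compare against
        matches, total = stats.get(who, (0, 0))
        stats[who] = (matches + (answer == expected), total + 1)
    return {who: m / t for who, (m, t) in stats.items()}

# Hypothetical contributions and consensus answers.
contribs = [
    {"doc": "UA-1", "contributor": "ana", "answer": "positive"},
    {"doc": "UA-1", "contributor": "bo", "answer": "unclear"},
    {"doc": "UA-2", "contributor": "ana", "answer": "negative"},
]
consensus = {"UA-1": "positive", "UA-2": "negative"}
print(agreement_rates(contribs, consensus))  # e.g. {'ana': 1.0, 'bo': 0.0}
```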
One interesting unintended effect of the project was the awareness it raised about Amnesty's work. Reading archive materials gave people a sense of what the organisation does, its historical efficacy and significance, and also the range of possible outcomes a campaign can contribute to. In this way, Amnesty Decoders is an education tool as well as a practical data collection and analysis tool.
Thousands of eyes in the sky
We are now looking forward to being able to roll out an ongoing flow of research projects to our growing network of Amnesty Decoders. Next up is analysing satellite images, looking for remote villages in Darfur that have been destroyed in the latest wave of attacks.
In this project, anyone with an internet connection and a drive to make a difference will be able to contribute, scanning small parcels of satellite imagery for destroyed villages.
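In practice, 'small parcels' means cutting a large area of imagery into a grid of tiles that one person can scan in a few seconds. A rough illustration of that tiling step, with made-up coordinates and tile size rather than the project's actual grid:

```python
def tile_bounding_box(min_lon, min_lat, max_lon, max_lat, step=0.05):
    """Cut a geographic bounding box into square tiles (parcels)
    that can each be shown to a volunteer as one micro-task."""
    tiles = []
    lat = min_lat
    while lat < max_lat:
        lon = min_lon
        while lon < max_lon:
            tiles.append((lon, lat, min(lon + step, max_lon), min(lat + step, max_lat)))
            lon += step
        lat += step
    return tiles

# An illustrative bounding box roughly over Darfur, split into small parcels.
parcels = tile_bounding_box(22.0, 11.0, 27.5, 16.0, step=0.05)
print(len(parcels), "parcels, first:", parcels[0])
```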
If you are interested in the project, you can follow its progress by signing up here, or stay in touch directly via @milena_iul.
Milena Marin is Senior Innovations Campaigner at Amnesty International and leads the Amnesty Decoders initiative. Prior to Amnesty, Milena led the School of Data at the Open Knowledge Foundation and worked at Transparency International.
Amnesty Decoders aims to move beyond 'clicktivism' by working with Amnesty supporters to contribute to human rights research in a more meaningful and, ultimately, more useful way, while challenging traditional research methods by engaging non-experts as researchers.