Richard Ngamita - It takes a village to fight bad actors

Richard Ngamita argues that there is room for more collaboration between big tech and civil society. A senior investigator, Ngamita has worked on threat intelligence for tech companies while maintaining a focus on civil society. Follow Ngamita's journey of investigation as he relays his experiences, the challenges we face and some of the tips and tools he uses.

Exposing some of these bad actors is actually very risky and one of the ways to, as much as possible, you know, protect ourselves, is to actually expose what is happening. I think the best way to protect yourself is to bring the truth to the light.

Download this episode.

Subscribe to the RSS feed or listen to this podcast on your preferred podcasting platform.

About the speaker

Richard is a senior investigator who has spent half of his career in tech companies within Threat Intelligence and Trust & Safety teams (Twitter, Facebook, Google), and half working on public-interest civil society tech projects in Sub-Saharan Africa, e.g. @Refunite and @Medic. He has most recently led technical investigations and policy research across a set of issues, like disinformation and dangerous organizations, that have come to occupy the public spotlight, working on the convergence of digital technology and politics.

Transcript

My name is Richard Ngamita, I'm a digital researcher and my background is in computer science. I was born and raised in Kampala, Uganda. The kind of work that I do is based on data: digital investigations, investigative research and what we normally also call threat intelligence. I've worked in civil society, but at the same time also in development tech and in big tech - working at Google, Facebook, and most recently Twitter - performing several investigations into disinformation and dangerous organizations, and just understanding how bad actors manifest on social media platforms, and in the real world.

Right after school, I actually went into business, so I opened up a barber shop and an internet cafe - we are talking about the early 2000s here. And that's something that I really enjoyed doing, seeing parents coming in with their children to get haircuts, and the parents themselves hopping onto our wifi or getting into the cafe side to check on their emails. It was really fun. I really enjoyed that side of things.

My first experience with digital research was actually at the internet cafe. I came across a number of cybersecurity risks that were handed over to me right away just by observing how people interact with or use the internet. There were several, what I would call, confidentiality breaches that I saw; several times users were not logging out, leaving their personal family photos and personally identifiable information (PII) on our drives. And this kept me thinking, how bad can this get? And do they really know the risks of what this is?

I think this was the first time that I also started understanding and trying to get a little technical in my investigations. Things like, okay, how can I automate clearing of the browsers at the end of the day in the internet cafe, so that no one leaves their information for the next day? And it helped me learn a lot, also from a human behavioral point of view, just understanding how users use platforms, the different communication styles, people's fears and risks. And in hindsight, I think for me that was the first time I got into investigations; yeah!
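To make that concrete, here is a minimal, hypothetical sketch of the kind of end-of-day cleanup script described above. The directories and the idea of running it from a nightly scheduler are illustrative assumptions, not Richard's actual setup.

```python
# Hypothetical end-of-day cleanup for a shared internet-cafe machine:
# wipe browser profiles and leftover downloads so nothing is kept for the next day.
# The directories below are illustrative placeholders, not an actual configuration.
import shutil
from pathlib import Path

CLEANUP_DIRS = [
    Path.home() / ".mozilla" / "firefox",  # browser profiles: history, cookies, saved logins
    Path.home() / "Downloads",             # files customers forgot to delete
]

def clear_shared_machine() -> None:
    for directory in CLEANUP_DIRS:
        if directory.exists():
            shutil.rmtree(directory, ignore_errors=True)  # remove everything left behind
        directory.mkdir(parents=True, exist_ok=True)      # recreate an empty folder
        print(f"cleared {directory}")

if __name__ == "__main__":
    # Intended to be run by a scheduler (e.g. a nightly cron job) at closing time.
    clear_shared_machine()
```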

My real work started off at Google, actually, as an analyst in Africa. Google was setting up its first offices in Nairobi, Kenya, and I joined the team. So think of it as an experiment laboratory, or a department of Google that was typically smaller than the independent research divisions within Google itself. And back then, our primary goal or objective was to grow the users in the region - that's sub-Saharan Africa. And one thing that this does to you is, you get exposed not only to different people, but to different problems and different data sets. And it brings out the hacky way of how you can perform your investigations.

At the same time at Google, I also worked with the spam team, and for me this was the first time I started experiencing adversarial activities and how bad actors abuse platforms - how users were abusing search and clicking on each other's ads. And how would you identify these actors? Connect them to real-world businesses? This was my first experience with attribution. And this introduced me to open source intelligence back then.

I also moved to civil society - think of it as development tech. I worked with an organization called Refunite, trying to connect a number of refugees with their family members. So think of it this way: after every crisis, families end up in different refugee camps - and it may sound easy today because of social media and everything - but back then it was very hard with the low internet penetration in some of these regions. And at the end of the day, this also opened up a new angle for me on how I could use my open source intelligence and technical digital investigative skills to turn some of these refugees' experiences into a tool for them to input data points, like names, last known phone number and last seen location, to have them find clues about where their loved ones would be. I think that's how I got started in my digital research world.

One of the things that I've done in the past was investigating the targeting of journalists. And this is a huge, huge challenge that is happening currently, whether in the real world or on social media platforms. One of the main challenges we faced was trying to understand who these adversaries or cyber harassers were. It took really technical, geopolitical and local contacts, and an understanding of the operations, to help in identifying these harassment campaigns, and it required longer investigations into dozens of users and accounts, as well as the interactions between them - and especially doing this while trying to be as safe as possible. Yeah.

And in these investigations, it was really tricky because you're dealing with authentic and inauthentic accounts that were all showing signs of harassment. Documenting some of these does not happen in a single occurrence. So let's say there could be attacks this month, then the actors go quiet for the next four months; they have specific months when they push their smear campaigns and character assassination narratives, and later on go quiet. So you have to really be patient in some of the work that you do, and it could take years for you to really uncover the source of what is really happening with the targeting and the harassment itself.

But over time, it was really good to see that whenever there was either an article or an activity, they all came in and you would see some of the threats that they were sharing against the journalist being targeted, the kind of manipulated images that they were sharing against her. And this was interesting because they dropped their guard and you would easily see the other kind of political narratives that they were supporting.

And for a researcher, that was a really good way to dig through and nail down who these people were, who they worked for, where they were coming from, and why they were doing this. But as I said, it's really challenging and takes time, and it's not one of those things that you will think of doing in just a month. It could take a few years to uncover.

One of the ways I perform my investigations is mainly to try to understand the problem at hand - that is, the Who, When, What, Where, and Why. And when you have that, you always want to start with some kind of a piece of information that you have. It could be an email, a phone number, a username, or a hashtag. You'll need to gather, analyze and validate the data before performing an investigation. Something else that is very important whenever performing my investigations is setting up some kind of environment to protect yourself. So think of a virtual machine, not using your personal laptop, starting off with creating some fake profiles for you to perform an investigation. Yes, you need to create what we normally call sock puppets to perform your investigations. You don't need to use your personal emails. You can use burner emails and burner phone numbers - virtual numbers like Google Voice and many others - and log into public wifi, just to protect yourself.
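As a small illustration of starting from a single seed data point, here is a sketch that pivots on a username against one public API. GitHub is used only because its API is open and well documented; it is not a platform named in the interview, and a real investigation would check many services, from a protected environment rather than a personal machine.

```python
# Illustrative sketch: pivot on a seed username against one public API (GitHub),
# keeping only fields that might lead to further identifiers.
# Run such checks from a protected environment (VM, sock-puppet account), not a personal machine.
import requests

def check_username(username: str):
    """Return selected public profile fields if the username exists on GitHub, else None."""
    resp = requests.get(f"https://api.github.com/users/{username}", timeout=10)
    if resp.status_code != 200:
        return None  # e.g. 404: no account with that handle on this platform
    data = resp.json()
    return {
        "name": data.get("name"),
        "company": data.get("company"),
        "blog": data.get("blog"),
        "location": data.get("location"),
        "created_at": data.get("created_at"),
    }

if __name__ == "__main__":
    seed = "example-handle"  # hypothetical seed username
    profile = check_username(seed)
    print(profile or f"No GitHub account found for '{seed}'")
```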

Exposing some of these bad actors is actually very risky and one of the ways to, as much as possible, you know, protect ourselves, is to actually expose what is happening. I think the best way to protect yourself is to bring the truth to the light. It takes, I think the term is, it takes a village to fight these bad actors.

As for the connection between politics and my investigative work: all the investigations that I've looked into have involved some kind of political party or regime producing or benefiting from digital disinformation.

I think a lot of people are not aware of the scale of disinformation that is happening in Sub-Saharan Africa. There's a clear, growing trend of domestic political actors that we are seeing deploying disinformation. One thing that is clear is we see them working with influencers for political hire. And during the elections themselves, many times these political actors use these individuals to create fake accounts, create malicious content and generate fake engagement. The goal is to try and game social media algorithms for the benefit of these political parties, or push their narratives and manipulate the minds of millions of citizens. The goal is also, at times, to sway public opinion during these elections. So think about it this way: just before any election, if you spend time as an investigator digging around social media platforms, you'll have a very high chance of finding coordinated disinformation. The same thing could be said around conflicts - just before any coup or these political takeovers we are seeing in West Africa, Burkina Faso, Mali, there's always some kind of massive coordinated disinformation from the actors, both domestic and foreign.

I've been investigating a trend of influencers and activists who are strongly connected to some of these narratives and are being amplified, and showing how they have a very close relationship with pushing Russian disinformation on the continent. And this is all happening on the platform itself, or on different platforms. And it's really hard to dig into this if you don't have some kind of background in disinformation research. It takes a lot of experience with patterns for you to understand that this is all coordinated in one way or another. There are huge numbers of fake accounts, or what we call sock puppets, on social platforms that are pushing these narratives and coordinating these activities.

Just to dig through that, my former colleague Ben Nimmo - a former colleague and a friend - coined the four Ds, which are: Dismiss, Distort, Distract, and Dismay. These are the main patterns that we're seeing from these actors. The first is that they dismiss; it is normally the first and, by far, the most common technique that they deploy. So what they do is push narratives like, ‘Hey, don't listen to this because of that’, and then they throw out an insult at the source. The second thing that they do is try to distort some of the narratives themselves - distorting is an easy concept to understand. The third D is distract. One of the activities that we can clearly see them doing is, if a conversation is uncomfortable and unfavorable for them, someone can attempt to change the subject - this turns into either a viral activity or some kind of blackmail - and they immediately switch the content and the narratives. Last but not least is dismay, which means to try and scare people off. This is a technique we've often seen them use when there is a policy debate or something like that. They'll use specific rhetoric that warns against the consequences of an action. So they can maybe target a journalist, or share warnings to specific opposition figures or politicians about what they're doing. Yeah. And this is a common pattern that we've seen with many of these actors.

One of the major challenges that I'll start off with: understanding the geopolitical narratives, cultural nuances and historical events that have led to what's happening is very important, and that takes time. As a researcher, you need to dive into this information and immerse yourself in it, so that you have no bias and you understand it all around; asking the right questions, and many times asking for help from locals or experts to interpret a situation for you, will help you unlock the why, or the where, or the what. Many times you know exactly that something wrong is happening, but the actors are becoming smarter and there is no data point that actually leads to some kind of attribution; so it takes time. Many times the data is not easily accessible in our investigations, or it is hidden behind paywalls, governments and social media platforms, making it harder to access their data through APIs. You might have seen recently the likes of Twitter setting their cheapest package for investigative journalists and researchers at almost $42,000 a month for only 50 million tweets. The other challenge, I think I would say, is researcher burnout. It is a real problem. Researchers and research funding agencies really need to consider taking a number of steps which can help researchers avoid the experience of burnout. It's a real, real huge challenge.

The unfortunate truth is there is not really much interest in some of this investigative work, or in funding disinformation research, especially in African countries. Having been on both sides - big tech and civil society or development tech - I can confidently state that. But one thing I can clearly say is that I'm seeing more and more collaboration between big tech and civil society. Without big tech collaborations or partnerships we wouldn't have seen some of the recent debunking of some of the most widely spread disinformation online in the region.

I've always said it's not actually about the tools themselves, but more about the tactics, the methodologies and the investigative mindset that specific individuals need to have. It's very important to learn how people behave, whether it's in breaking news or elsewhere; that will help you in knowing how to apply an investigation.

I'll just quickly go through some tools that I find very important for myself. For example, when dealing with government open source data or open data, the data is often locked up in PDFs, so a tool like Tabula, which liberates data tables locked in PDF files, is super important. A number of times you are investigating organizations and companies, offshore companies, and platforms like OpenCorporates and Open Ownership, which offer global searches for registered corporate entities and their associated individual officers or investors, are super important. A number of times you're dealing with the likes of IP addresses, trying to make some attributions - who created this website? And behind what IP address is it? And I find domain tools, like lookups and domain/IP historical data, super, super important. In some of the work that I do, the Wayback Machine is another important tool that helps me explore the history of a website. Are they just claiming that this is how it looks today, but who are the former employees that were documented on their About Us page a few years ago? You'll find that super important. Leveraging the crowd is also very important, and crowdsourcing is still king in some of the things that you can do; you'll find yourself learning so much from experts. So think of it this way: let's say I'm not the best cartographer or map specialist, but if I throw it out there, I'll find a mapping specialist who is willing to interpret for me why something is happening in a specific way.
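As one small example of the tools above, here is a sketch that queries the Wayback Machine's public CDX API for archived snapshots of a website, the kind of lookup used to find older versions of an "About Us" page. The domain is a placeholder, and this is only one of many ways to use the archive.

```python
# Illustrative sketch: list archived snapshots of a website via the Wayback Machine's CDX API,
# e.g. to find older versions of an "About Us" page. The domain below is a placeholder.
import requests

def list_snapshots(domain: str, limit: int = 10):
    """Return (timestamp, archive URL) pairs for a domain from the Wayback Machine."""
    resp = requests.get(
        "https://web.archive.org/cdx/search/cdx",
        params={"url": domain, "output": "json", "limit": limit},
        timeout=30,
    )
    resp.raise_for_status()
    rows = resp.json()  # first row is the header: urlkey, timestamp, original, ...
    return [
        (row[1], f"https://web.archive.org/web/{row[1]}/{row[2]}")
        for row in rows[1:]
    ]

if __name__ == "__main__":
    for timestamp, url in list_snapshots("example.org"):
        print(timestamp, url)
```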

The tools will always come and go, but the main thing, as I said earlier on, is just having an investigatory mindset and being inquisitive. The best tools and techniques can help you in your hunt for information, but what is central to being an effective researcher is an investigatory mindset and a love for problem solving.

There is a growing interest in Sub-Saharan Africa. So there is a mix of state power and private interests. The question is: why are global powers scrambling for Africa? We are seeing top officials from the US, China, Russia, Turkey, and other Middle Eastern countries who have visited over 14 African countries just in the last two months.

There are also private interests, including companies that are keen to exploit energy and commodity resources, backed by states, and all of these play into the disinformation world itself. That's why we really need more and more investigators on the continent, which has been a challenge due to a number of factors - whether it's training, whether it's opportunities for just getting started, or whether it's the dangers that come with this kind of investigative work.

So I think my closing words are: we need more and more investigators on the continent, but it's going to take leveraging all sectors - big tech, civil society and development tech - coming together.

Credits

The Exposing the Invisible podcast series is produced by Tactical Tech.

Interview, Production and Sound Design by Mariam Aboughazi.

Tactical Tech's Exposing the Invisible team includes Laura Ranca, Lieke Ploeger, Wael Eskandar, Marek Tuszynski and Christy Lange.

Theme Music by Wael Eskandar.

Additional Music:

Warm of Mechanical Heart by Kai Engel, Free Music Archive, licensed under an Attribution License (CC BY). Cendres by Kai Engel, Free Music Archive, licensed under an Attribution-NonCommercial License (CC BY-NC). Low Horizon by Kai Engel, Free Music Archive, licensed under an Attribution License (CC BY). Headway by Kai Engel, Free Music Archive, licensed under an Attribution License (CC BY).

Illustration by Ann Kiernan

