Verifying Information Online

How do we know if the information we are consuming is true, accurate or justified? How can we be confident that the information we use to inform our judgments and advice to decision makers is of good quality? In this blog, I’ll provide a snapshot into some of the tradecraft, tools and techniques used by intelligence analysts to better evaluate information and sources online, which can be neatly summarised using the following four key criteria.

  1. The relevance of the information.

  2. The reliability of the source.

  3. The credibility of the information.

  4. The corroboration of the information.

To remember this, I like to use the mnemonic (or pattern) R2C2. It’s a simple way to remember two ‘Rs’ for Relevance and Reliability and two ‘Cs’ for Credibility and Corroboration.

R2C2: Is the information RELEVANT?

The first step in verifying the information we’ve collected online is to do a quick check that it’s relevant. This might seem like an obvious point, but when we are collecting large amounts of information – sometimes quickly – it can be easy to pick up information that is only tangentially linked or, after closer examination, not relevant to our intelligence issue at all. When evaluating for relevance we want to check that the information is closely connected to our issue or problem and that it is appropriate to the current time period or circumstances of interest.

R2C2: Is the source RELIABLE?

Once we’ve done a quick relevance test, we can start to look more closely at the source of the information and ask ourselves some key questions to determine if the source is reliable. That is, we evaluate whether the source is consistently good in quality or performance and can be trusted.

R2C2: Is the information CREDIBLE?

Turning now to our next criterion, credibility. Is the information provided by the source convincing and able to be believed? To better evaluate credibility, we first need to consider if the information is plausible. If not, under what conditions would the information be plausible?

R2C2: Can I CORROBORATE this information?

Which leads us to our fourth criterion, corroboration. Can we corroborate this information? Do other quality sources tell a similar story? If not, why not? Is further research required to corroborate the information we’ve found?

What does it mean if the information we have can’t be corroborated? Could the information still be accurate even if it can’t be corroborated? And, if so, how confident are we that it’s true and why?

R2C2: It Gets You Thinking

These are all questions we need to ask ourselves when determining the quality or strength of our information.
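To make the criteria a little more concrete, here is a minimal sketch of how an analyst might record R2C2 judgments against a source. The class name, fields, numeric scores and threshold are all my own illustrative assumptions – this is not a standard scoring scheme, just one way to structure the questions above.

```python
from dataclasses import dataclass

# Hypothetical R2C2 record. Scores and the triage threshold below are
# illustrative assumptions, not part of any formal methodology.
@dataclass
class SourceEvaluation:
    source: str
    relevant: bool      # R: closely connected to the issue and time period?
    reliability: int    # R: 0-5, consistently good quality and trustworthy?
    credibility: int    # C: 0-5, plausible and believable?
    corroborated: bool  # C: do other quality sources tell a similar story?

    def worth_pursuing(self) -> bool:
        """Quick triage: irrelevant sources are dropped outright; the rest
        pass only if they score reasonably across the remaining criteria."""
        if not self.relevant:
            return False
        score = self.reliability + self.credibility
        if self.corroborated:
            score += 2
        return score >= 6  # arbitrary illustrative cut-off


# Example scores for the two sources examined later in this post.
buzzfeed = SourceEvaluation("BuzzFeed article", True, 2, 3, False)
katehon = SourceEvaluation("Katehon article", True, 1, 1, False)
```

Under these example scores neither source clears the triage bar, which mirrors the judgment calls we’ll make when we work through them below – but the point of the sketch is the structure of the questions, not the numbers.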

Using the R2C2 criteria should get you thinking about the nature of the information you’ve collected and the sources of that information. You will likely have identified some new information gaps, including the need to do more research on the sources of your information. You are probably also wanting to sort or prioritise your information in a way that helps you make better sense of what it’s telling you – so that you can come to sensible conclusions or make considered judgments about the problem or issue you’re working on. Why? Because ultimately the judgments and assessments about the issue you’re working on are going to inform your brief to your boss!

What does this look like in practice?

Let’s switch gears slightly – from the theoretical to something a little more practical – and use R2C2 to evaluate a couple of different information sources. To scope this activity, I have chosen two written examples and have deliberately avoided other types of collection, such as imagery or communications, as we will look at these in future Tools and Techniques webinars and blog posts. Also, I’m going to assume that these sources are relevant to the scenario I’m considering and will focus on one criterion per source. We will step through reliability and corroboration together, but I’ll leave you to have a go at credibility yourselves.

Source 1: BUZZFEED

Up first, I’ve chosen an article from BuzzFeed. For this source we are going to consider its reliability.

For those of you who are not familiar with BuzzFeed, it is an American internet media, news, and entertainment company with a focus on digital media. It started as an algorithm designed to pull stories from around the web that were going viral and was famous for its pop culture articles, quizzes and cat pictures.

Over the years, BuzzFeed has also invested millions into serious investigative journalism and even won a Pulitzer Prize in 2021 for its coverage of China's campaign against the Uyghurs in Xinjiang. BuzzFeed is left-leaning and, despite its entrance into serious journalism, is still often viewed as a questionable source. Nevertheless, it is popular. And chances are, if you’re conducting research on any number of topics, you’ll probably come across something relating to your issue or problem on BuzzFeed.

In this scenario, let’s pretend that we are researching whether Australia is safer than the United States. Using our criteria for evaluating sources (R2C2), how does it stack up for reliability?

Based on our simple understanding of BuzzFeed, it’s possibly going to score lower than we’d like, as we begin to answer some of our criteria questions. But we can’t be sure until we step through and try to evaluate its reliability for ourselves.

In most cases you will need to do more research on the author of the article and consider the type of news that is being reported. Have they relied on other sources to inform their judgments? Do they reference academic studies or reporting?

At a quick glance, we can see that this article is based primarily on the views of individuals, and although the title of the article isn’t misleading, I’m starting to wonder about some of the conclusions the author may be drawing from the source material. There’s a lot of emotive language, and Reddit references also feature heavily. In some cases, the references for direct quotes have been deleted, so I’m unable to see the context in which they were written. It’s not looking especially helpful, but it possibly does give us a place to start investigating – even if only to find other, higher-quality sources. Depending on your project and timeline, you will need to make a judgment about whether to review this source further or rule it out.

Source 2: KATEHON

Our next source is an article from the Russian ‘think tank’, Katehon. We are going to review this source in the context of identifying opportunities for corroborating the information presented in the article.

Katehon is a Moscow-based quasi-think-tank that is a proliferator of anti-Western disinformation and propaganda. It is led by individuals with probable links to the Russian state and Russian intelligence services. Within Russia’s broader disinformation and propaganda ecosystem, Katehon plays the role of providing supposedly independent, analytical material aimed largely at European audiences.

So, let’s act as if we are researching the war in Ukraine and have come across this article. At a very base level, it suggests that the Ukrainian President, Volodymyr Zelensky, is trying to start World War III. Using our criteria for evaluating sources (R2C2), how does it stack up on corroboration?

Let’s start by acknowledging that corroborating propaganda is often difficult unless we go straight to other propaganda sources – and there are certainly times we would want to do this. But, for this exercise, we are looking to dissect the information in the article and see if we can corroborate it. We are not making a judgment about whether we agree with the author's position on the war in Ukraine.

To start, we know we are going to have to do more research and look closely at the facts, rationale and arguments presented in the article, breaking them down into manageable chunks. We might try to ground truth any number of the provocative threads – Is Volodymyr Zelensky heading a Nazi regime? Is Ukraine a proxy in a war between Russia and the US? Was it a Russian or Ukrainian missile that struck Poland in November last year? Is Ukraine developing a ‘dirty bomb’ as the author claims? Then there are all the statistics (presumably aimed at a conservative US audience!) about how much money the US is spending on supporting Ukraine, and some emotive language and assertions about Zelensky’s character. We can quickly see that, analytically, this article is a minefield. But it is one we need to make our way through if we want to be able to make informed judgments about the issues, and ultimately better understand the war in Ukraine.

Importantly, what does it mean if we can’t corroborate the information? Could some of this information still be accurate even if we can’t corroborate it? And could some of the information still be accurate even if the conclusions drawn by the author are flawed? Is this a useful source? As always, it depends on how we think about the problem and the questions we are trying to answer. I probably wouldn’t recommend adopting the author’s views too quickly, but I’m prepared to acknowledge that this source does give us insight into Russian propaganda; it does highlight key events/issues that we might be looking to better understand if we are researching the war in Ukraine, and it does prompt some important questions for us about the value in corroborating information and what it means for our analysis if we can’t.

Are there any tools to help us evaluate sources and information?

There’s no doubt that evaluating sources and information online takes a bit of brain power and discipline. I chose the examples above for ease of explanation, but source evaluation is often complex even with the help of R2C2. So, are there any other tools that can help us on this journey, perhaps do some of the grunt work for us?

I never like to answer that question with a definitive yes. As an analyst, you’re always going to have to apply your critical thinking to a problem. That doesn’t mean we can’t get a little help along the way. Here are a couple of examples of tools I’ve chosen to get you started.