
Interview with Graham Brookie: “Polarization, Disinformation Drive Georgians Away From One Another”

Disinformation is an important tool for authoritarian regimes to gain control and remain in power, says Graham Brookie, managing editor and director of the Digital Forensic Research Lab, a start-up within the Atlantic Council focused on exposing and explaining disinformation in the digital sphere. Brookie, who also served in various positions at the White House and the National Security Council, spoke about how Kremlin disinformation tried to shape the narrative before, during, and after the 2008 Russia-Georgia War, and about how Russia and other state actors use disinformation to create and exacerbate conflicts. Voice of America’s Eka Maghaldadze spoke to Brookie in Washington, DC.

Tell us about the Digital Forensic Research Lab. Who are the Digital Sherlocks, and what is their goal?

The DFR Lab is one of the programs at the Atlantic Council. It is focused on how we deal with the issue of disinformation. Our mission is to identify, expose, and explain disinformation, with an emphasis on explaining it. Identifying and exposing is fact-checking: whether something is true or not true, whether it is fact or fiction. But we also focus on explaining every single case that we look into: what the beginning, middle, and end of the story are; who the characters involved are; what tactics they are using in the content they are creating; and, most importantly, whether it actually impacted an audience, whether it reached real people, whether a piece of disinformation caused people to change their behavior or think differently about that issue.

That is all based on the premise that facts themselves are the cornerstone of democracy. People are going to need facts in order to make collective decisions on things like elections, or local issues…

You’re always underlining the importance of not only exposing but also explaining disinformation, as well as the timing of that explanation. Can you tell us more about it? Does that help us respond better? And what are the risks of amplifying disinformation by giving a wider audience to a fake narrative or story?

We think that we have to engage with audiences when and where they are engaging with content. If we are going to be effective in countering disinformation, then we are going to have to be effective in shaping those conversations as they are happening. What we see consistently in our work is that the fake [narrative] always gets more engagement than the correction: a piece of disinformation that goes truly viral will reach more people than the correction of that disinformation.

We have to be ahead of the curve of those who spread disinformation, and right now people across Western societies, free and open societies, are frankly behind the curve.

Those who spread disinformation are two steps ahead of us, and we’ve been trying to catch up for a long time.

As for the response: if you respond to a story that is fake, then you’ve already accepted the premise of the story just a little bit. So we’ve got to figure out how to get above those stories and address the disinformation holistically.

That is a big challenge of directly exposing disinformation. We call it “the oxygen of amplification”: fire only burns if it has access to oxygen. The same is true of disinformation; it only spreads if we amplify it. A few of the main amplifiers of disinformation are the media specifically and, frankly, civil society. We’ve got to take that as a serious responsibility, to make sure that in our process of countering disinformation we are not promoting that same disinformation just by giving it more attention.

How is this kind of effort used to exacerbate conflicts or push an agenda, not by creating fake news, but by amplifying existing socio-political and other divisive issues?

Here is an example that is not necessarily Georgian, but something we have learned recently in the United States: the target of disinformation from foreign actors is ordinary citizens, because they want to catalyze more polarization. In 2016 in the U.S., the infamous Saint Petersburg troll factory created a lot of its own content and then spread it in the U.S. with catastrophic success, but we have seen them adapting their tactics over time. In the 2018 midterms here in the United States, the Russians were extremely active, and yet they were not creating their own content. What they were doing was taking pieces of very polarized or extreme content from actors inside the United States and then amplifying them.

The point of these kinds of operations is to drive us further away from each other rather than closer together, and to weaken our democracies. We are the target. And we see that playing out in Georgia right now.

There is a very specific political moment ongoing, with increasing polarization and very serious stakes, with human lives on the line, and we see these [kinds of] disinformation [campaigns] trying to drive Georgians further away from each other rather than closer together.

Would you say you could see this happening in Georgia in 2008 and afterwards, and in Ukraine since, or even before, 2014 and the annexation of Crimea?

Countries like Georgia are where the Russian government essentially developed and perfected a lot of the information operation tactics that we now see playing out in a number of other regions. The 2008 war is extremely informative [in this sense]. One of the most informative things we have to understand about Russian foreign information operations specifically is that after 2008 the Russian government made a decision, and disclosed that decision, because it had been caught on its back foot in public relations: its ability to conduct the war in Georgia was deeply inhibited by its ability to shape public opinion about that war. The person who runs RT and Sputnik, and state-run media generally, gave an interview in which she said that in 2008, in Georgia, they were on their back foot, and they figured it out. Now we see information operations and state-run media operating as an extension of Russian policy. And that is deeply informative about how they see not only Georgia, but influence operations as a tool within a wider toolkit for conducting Russian foreign policy, or Russian aggression, across the region.

How has disinformation, specifically digital disinformation, become such an important tool for state actors to interfere in other countries?

Disinformation is certainly part of the toolkit of regimes that are trying to interfere in other countries or shape what happens on the international stage. A few very specific, very important points about how we frame these discussions: foreign influence operations are different from disinformation, and disinformation is different from misinformation. Sometimes that does not translate into languages like Georgian, but the way we would categorize them is this: foreign influence operations are when a nation creates content or messaging to impact another country. That can be in hard cyberspace, where we think of hacking and things like that, or in the soft information space, where we think about the content spread by Sputnik or RT, state-owned media entities, troll farms, and things like that. That is different from disinformation itself, which we define as the spread of false information with intent, meaning an actor means to lie to whatever audience they are targeting.

And disinformation is a problem that does not respect borders. A lot of the time, foreign influence operations are designed to spread disinformation and then catalyze more disinformation domestically. The target is a domestic audience that then spreads the disinformation itself.

And that is a little different from misinformation, which is the spread of false information without intent. We all have family members who send us crazy text messages, and we need to check another source. All three of those categories shape how we think about addressing the problem; they are distinct challenges and need to be addressed separately.

Now it is a kitchen-table issue in the West: not only in Washington, DC, and capitals around the world, but among everyday people, foreign influence operations are a topic of conversation. And that is because they are extremely effective. That won’t be news to citizens in Georgia or across the Caucasus region: it is something they have dealt with for a long time, and frankly, they have a lot to teach the world about it. But the bottom line is that it is an increasingly effective tactic because it takes very few resources to spread a great deal of disinformation. By comparison, a government can conduct influence operations at far less cost than putting tanks on the ground in a country or a conflict zone. These are very cost-effective, and very effective, means of influencing conversations in other countries [aimed at] weakening adversaries. We are going to see more of this as the technology evolves.

What is its biggest use? How much do authoritarian regimes actually gain from it?

Control, control is the answer.

Authoritarian regimes have a vested interest in consolidating and controlling the way people in their countries communicate, and the narratives about their countries on the international stage, because they need to control their people in order to maintain power.

Take Russia as an example: over the years the infamous “Saint Petersburg troll factory” operated, the vast majority of its content was focused not on Ukraine, Georgia, or the United States, but on the Russian people. The bulk of the content they produced shows what their highest priority was: maintaining domestic support for a highly unpopular, non-democratic regime. That is the whole ball game for authoritarian regimes; that is what they have to gain, and frankly, that is what they have to lose if they don’t spread disinformation.

What are the key vulnerabilities in Eastern Europe for societies exposed to these kinds of efforts? What serves as fertile ground for a successful disinformation campaign?

Honestly, it is any polarizing issue that is getting a lot of engagement. In the 2018 midterms, they were amplifying very, very polarizing content on the far right and the far left, including some content that was anti-Russian. I am sure that made for a very ironic moment in their morning editorial meetings. They were spreading some anti-Russian content because they had made a decision to make us less capable of dealing with that issue as a society: the issue of election integrity, of fighting back against foreign influence operations. You see the same thing playing out in Georgia and across the region. Any issue that is extremely polarizing is vulnerable to disinformation, and the target is citizens themselves.

There is a lot of talk and research about disinformation, but do you think the real threats are still underestimated at the policy-making or strategic-planning level?

I think the threat of foreign influence operations needs to be taken as seriously as traditional security threats. This is a new type of warfare [with] lots of different areas of combat, and we’ve got to treat [it] as such, and that includes things like national security strategies.

One way that we think about it is that disinformation, even if it is a small operation, is designed to spark, to catalyze, to weaken an opponent. So we think of it as “poison in a well”: one drop of poison can affect the entire well, so the impact is meant to catalyze, and we need to respond by becoming more resilient. You can’t fight content with content, so it is going to take different strategies than what we’ve seen in a lot of other areas of national security. It also needs to be taken very seriously.

Disinformation is an issue that we are going to have to address collectively, and all of us have a part to play. Government, policy-makers, civil society, and the media [have their responsibilities in this], but I think the biggest role in this entire effort to be more resilient against disinformation, what we refer to as digital resilience, belongs to citizens, and citizens have a very real obligation to be informed. That is part of being a productive and engaged citizen in a democracy: you have the power to make collective decisions of your own, and you have a responsibility to be informed, to check another source, to make sure you are grounding your decisions in fact, and to be a productive member who can contribute to that discourse.

Do you think this scale and effectiveness have changed our attitude toward the online sphere? Have we lost trust forever?

I think technology is going to keep evolving at all times. The way we consume information and connect to one another is going to change over time, and I think those are really, really good trends; frankly, they are probably not reversible at this point. So, how can we maximize the opportunity created by those trends and mitigate the risks? As President Reagan said, “trust, but verify”: you should always be a little bit skeptical. That is what we would call a healthy degree of skepticism.

Partner Post
This material was prepared for Civil.ge by the Voice of America. In order to license this and other content free of charge, please contact Adam Gartner.
