Old Geopolitics – New Technologies

On the occasion of the publication of the Dahrendorf working paper “Beyond Regulation: Approaching the challenges of the new media environment”, Dahrendorf Research Associates Rafael Goldzweig and Marie Wachinger take a broader look at mis- and disinformation, its historical links to Russia and the increasingly central role of technology in geopolitics.

Ion Mihai Pacepa and Ronald Rychlak, authors of a book of that title, claim it was Lenin himself who coined the term “disinformation”, attempting to make the phenomenon sound less Russian and more Western. It may seem ironic that such a book was published by WND, a publisher widely considered part of the far-right, conspiracy-theory-promoting spectrum. Yet somehow this fits the confusing and complex issue we want to address.

Accusations of using mis- and disinformation to confuse people and shape their perceptions are frequently levelled against Russia. Allegations of bots and spies interfering in elections and fostering public distrust in established institutions have surfaced in many parts of the world in recent years. The dissemination of false information is greatly facilitated by the new media landscape, with its widespread reach and current lack of accountability. This blog post looks at the historical and current involvement of Russia, links it to the success of populist politicians, and outlines some of the current strategies to combat the problem.

A historical view on disinformation

False and misleading information has likely existed throughout human history and across all cultural boundaries. Through increased global economic and political interdependence, however, the 20th century elevated the level of politically motivated disinformation. One of the most prominent myths of Soviet propaganda was the 1983 story that the U.S. had created AIDS in order to kill African Americans and homosexuals. Former KGB spies have claimed that such stories, so-called “active measures”, were part of an orchestrated plan of ideological subversion. The goal was to destabilize Russia’s adversaries and sow mistrust among their populations through deception. This constituted the heart of the KGB’s “information warfare” strategy during the Cold War.

Current information wars and their beneficiaries

The political situation has of course changed drastically since the Cold War. Yet disinformation appears to have once again become a means of influencing geopolitics. The most recent allegations concern the ongoing Yellow Vest protests in France, where the surging presence of the hashtag #giletsjaunes has been found to coincide with Russian Internet activity supporting the protests.

Over the last couple of years, the Brexit campaign and the last French presidential election were also commonly associated with Russian interference. While Marine Le Pen did not win the French presidency, many bots distributed pieces of disinformation that favoured her. The most internationally famous case of alleged Russian election interference was certainly the 2016 US presidential election. Professor Kathleen Hall Jamieson of the University of Pennsylvania concluded that Russian hackers widely disseminated disinformation, seeking to suppress the votes of African Americans while working towards a GOP victory, and were “alarmingly successful in reframing the American political narrative in the crucial period” of the election campaign. The subsequent investigation into the incidents, led by Special Counsel Robert Mueller, has yet to publish its results.

These cases indicate that populists in particular have benefited from disinformation shared online. The extent of the problem is such that some fear it distorts democracy itself. Even though not all disinformation campaigns have decided election outcomes, their destabilizing effect is feared in many countries. In current efforts to counter such tendencies, it is striking how Russia (which currently spends 1.1 billion US dollars per year on pro-Kremlin media) constitutes the focus of action. This is particularly noteworthy since the United States also has a remarkable history of interfering in other countries’ elections, although arguably less through disinformation.

How are countries reacting to this?

In 2015, the European Union put in place the East StratCom Task Force “to address Russia’s ongoing disinformation campaigns”. Through its initiative EU vs Disinfo, the task force campaigns to “better forecast, address and respond to pro-Kremlin disinformation”. Just weeks ago, the EU announced a “war on disinformation” in order to protect next year’s European elections. In the United States, former president Obama signed the “Countering Foreign Propaganda and Disinformation Act” shortly before handing over the office to one of the supposed beneficiaries of disinformation campaigns, a successor himself famous for calling out established media outlets as “fake”.

Different legal approaches to regulating the tech companies that disseminate disinformation are currently developing in countries all over the world, while those companies are establishing codes of conduct and content-moderation practices of their own – with mixed success.

The new media environment serves would-be manipulators well at a time when geopolitics and global power relations are of central importance. Tech companies are struggling to develop AI-based tools to fight disinformation on their platforms, and they have not been very effective in preventing trolls and extremist groups from manipulating the online debate during political events. Clearly, more needs to be done. To read more on these challenges, the different legal approaches, and the measures taken by various countries, we recommend the new Dahrendorf Forum working paper on how governments and tech companies are dealing with the regulation of the information disorder.

Rafael Goldzweig & Marie Wachinger are Research Associates to the Dahrendorf Forum at the Hertie School of Governance.

The opinions expressed in this blog contribution are entirely those of the authors and do not represent the positions of the Dahrendorf Forum, its hosts the Hertie School of Governance and the London School of Economics and Political Science, or its funder Stiftung Mercator.