Testimony

Russian Social Media Influence
Understanding Russian Propaganda in Eastern Europe: Addendum

Todd C. Helmus

CT-496/1
Document submitted August 30, 2018, as an addendum to testimony before the Senate Select Committee on Intelligence on August 1, 2018.

For more information on this publication, visit www.rand.org/pubs/testimonies/CT496z1.html

Testimonies
RAND testimonies record testimony presented or submitted by RAND associates to federal, state, or local legislative committees; government-appointed commissions and panels; and private review and oversight bodies.

Published by the RAND Corporation, Santa Monica, Calif.
© Copyright 2018 RAND Corporation
RAND is a registered trademark.

Limited Print and Electronic Distribution Rights
This document and trademark(s) contained herein are protected by law. This representation of RAND intellectual property is provided for noncommercial use only. Unauthorized posting of this publication online is prohibited. Permission is given to duplicate this document for personal use only, as long as it is unaltered and complete. Permission is required from RAND to reproduce, or reuse in another form, any of its research documents for commercial use. For information on reprint and linking permissions, please visit www.rand.org/pubs/permissions.html.

www.rand.org

Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe

Testimony of Todd C. Helmus[1]
The RAND Corporation[2]

Addendum to testimony before the Select Committee on Intelligence, United States Senate
Submitted August 30, 2018

Following the hearing on August 1, 2018, the congressional committee sought additional information and requested answers to the questions in this document. The answers were submitted for the record.

[1] The opinions and conclusions expressed in this testimony are the author's alone and should not be interpreted as representing those of the RAND Corporation or any of the sponsors of its research.
[2] The RAND Corporation is a research organization that develops solutions to public policy challenges to help make communities throughout the world safer and more secure, healthier and more prosperous. RAND is nonprofit, nonpartisan, and committed to the public interest.

Questions from Senator Tom Cotton

Question 1: As most people are aware, the most detailed accounting of Russia's past activities is the Mitrokhin Archive. On page 243 of the Mitrokhin Archive, as detailed in The Sword and the Shield, it states:

It was the extreme priority attached by the Centre (KGB Headquarters) to discrediting the policies of the Reagan administration which led Andropov to decree formally on April 12, 1982, as one of the last acts of his fifteen-year term as chairman of the KGB, that it was the duty of all foreign intelligence officers, whatever their "line" or department, to participate in active measures. Ensuring that Reagan did not serve a second term thus became Service A's most important objective. On February 25, 1983, the Centre instructed its three American residencies to begin planning active measures to ensure Reagan's defeat in the presidential election of November 1984. They were ordered to acquire contacts on the staffs of all possible presidential candidates and in both party headquarters … The Centre made clear that any candidate of either party would be preferable to Reagan. Residencies around the world were ordered to popularize the slogan "Reagan Means War." The Centre announced five active measures "theses" to be used … his militarist adventurism; his personal responsibility for accelerating the arms race; his support for
repressive regimes around the world; and his responsibility for tension with his NATO allies. Active measures "theses" in domestic policy included Reagan's alleged discrimination against ethnic minorities, corruption in his administration, and Reagan's subservience to the military-industrial complex.

So in 1982, over thirty-five years ago, we had the KGB using active measures in the United States to sow racial discord, try to create problems with NATO, discredit our nuclear modernization, undercut military spending, highlight corruption, and try to encourage the U.S. to retreat from the world stage. Aren't the themes the KGB used in 1982 similar to those we're seeing the Russian intelligence services use on social media in 2018?

Answer: The focus of the RAND research used as a basis for my testimony before the committee was on Russia's propaganda efforts directed at Eastern Europe. The research for this study was conducted in 2017.[3] That study, as well as my other research, did not review this historical analog in great detail, and thus I cannot compare Russia's campaign against the Reagan presidency, as articulated above, and Russia's modern political warfare campaign against the United States at this time.

[3] Todd C. Helmus, Elizabeth Bodine-Baron, Andrew Radin, Madeline Magnuson, Joshua Mendelsohn, William Marcellino, Andriy Bega, and Zev Winkelman, Russian Social Media Influence: Understanding Russian Propaganda in Eastern Europe, Santa Monica, Calif.: RAND Corporation, RR-2237-OSD, 2018. As of August 30, 2018: www.rand.org/t/RR2237

Question 2: Isn't this Russian social media campaign really just old wine in new bottles, with perhaps a different distributor?

Answer: It is true that Russia has historically worked to meddle in the internal affairs of various foreign countries. For example, a recent RAND Corporation study highlighted a Russian political warfare campaign in Estonia known as the "Bronze Night," when Russia, in an effort to respond to the Estonian government's quest to move a statue commemorating the Soviet victory in World War II, launched cyber attacks against the country's web domains and possibly organized a major protest that left one dead and 150 injured.[5] According to a recent Center for Strategic and International Studies report, Russia has also cultivated "an opaque web of economic and political patronage" that sought to influence internal politics, state institutions, and economies of Hungary, Slovakia, Bulgaria, Latvia, and Serbia.[6] Russia has most certainly sought to assert influence in other nation-states as well.

[5] Linda Robinson, Todd C. Helmus, Raphael S. Cohen, Alireza Nader, Andrew Radin, Madeline Magnuson, and Katya Migacheva, Modern Political Warfare: Current Practices and Possible Responses, Santa Monica, Calif.: RAND Corporation, RR-1772-A, 2018. As of August 30, 2018: www.rand.org/t/RR1772

As the question suggests, what is clearly unique about recent Russian political warfare activities is its use of social media. The Kremlin initially developed its army of trolls (fake social media accounts managed by Russian agents) and social media bots (automated social media accounts) in order to influence the Russian domestic audience.[7] With some apparent success, the Kremlin then began to train these capabilities on foreign audiences, most immediately against Ukraine and then beyond. These social media operations, which have also included the use of Facebook ads and pages, are particularly unique and potentially powerful because of their ability to link specific messages with specific target audiences. A simple review of Facebook's capability for
ad-targeting illustrates its power as a potential tool for political warfare. Specifically, the medium allows advertisers access to "powerful audience selection tools" that can be used to "target the people who are right for your business."[8] Such tools can increase the efficiency and potential efficacy of messaging campaigns in ways that, prior to the social media age, were not available at scale to government propaganda campaigns. The social media campaigns can also mimic popular conversations and debates and so exert a kind of peer influence on American audiences. Ensuring that malign actors like Russia do not have easy access to such tools will prove a critical challenge to technology companies and policymakers in the years ahead.

[6] Heather A. Conley, James Mina, Ruslan Stefanov, and Martin Vladimirov, The Kremlin Playbook: Understanding Russian Influence in Central and Eastern Europe, Washington, D.C.: Center for Strategic and International Studies, 2016.
[7] Keir Giles, Russia's "New" Tools for Confronting the West: Continuity and Innovation in Moscow's Exercise of Power, London: Chatham House, Russia and Eurasia Programme, March 2016.
[8] Facebook, "Choose Your Audience," webpage, 2018. As of August 30, 2018: https://www.facebook.com/business/products/ads/ad-targeting

Question 3: We've heard from open testimony before this Committee that the Russians are using active measures to undermine our missile defense deployments and nuclear modernization efforts and to try and drive a wedge between the U.S. and NATO on these issues. Additionally, we know from Mitrokhin and Bob Gates's memoir "From the Shadows" that this was part of their playbook in the 1980s as well. To what extent have you looked for and seen Russian activity on this front on social media?

Answer: The focus of the RAND research used in my testimony before the committee was on Russia's propaganda efforts directed at Eastern Europe.[9] As part of that work, we identified and reviewed Russian efforts to drive a wedge between Russian speakers in the Baltics and their home states, the European Union, and members of the North Atlantic Treaty Organization (NATO). However, we did not look for or identify Russian efforts to drive a wedge between the United States and NATO.

[9] Helmus et al., 2018.

Questions from Senator Joe Manchin

Question 1: What modifications would you recommend to the large social media companies that would enable users to identify the source and potential funding of items posted on social media?

Answer: It seems logical to conclude that if consumers were able to determine whether particular social media content was the direct product of a foreign disinformation or influence campaign, then that content would potentially lose much of its influence value. If an intriguing social media post was outed as the product of a social media bot or identified as coming from a known Russian troll, then that content would seem to lose all credibility. Consequently, the report I co-wrote on the topic, Russian Social Media Influence, identified several ways that technology firms or other external entities, such as governments, could inform audiences quickly and directly of Russian propaganda content. We have previously noted that it is critical to highlight Russian propaganda in ways that are fast and that target at-risk audiences. Thus, our study highlighted several new approaches that could possibly take advantage of advances in modern information technology. For example, our study highlighted the potential use of Google Ads. This approach uses videos and other content embedded in Google search results to educate
populations who search for Russian-created fake news on Google and other search engines. The report also highlighted the potential value of viewpoint bots. A viewpoint bot can, in theory, use advanced algorithms to identify Russian bots or trolls engaged in hashtag campaigns. Once it identifies a bot or troll, the viewpoint bot posts messages to the offending hashtags, informing audiences of Russian influence efforts.

However, there may be a need for some caution in the implementation of any disinformation tagging campaign. In 2017, Facebook implemented a campaign to mark inaccurate posts with a "Disputed" tag. However, less than a year after its implementation, Facebook terminated the program because the effort was deemed ineffective.[11] In particular, Facebook's testing revealed that marking some content "false" or "disputed" did not necessarily change some audience members' opinions about the accuracy of the content. And Facebook cited research suggesting that strong language or visualizations, such as the "Disputed" marker, can actually "backfire and further entrench someone's beliefs."[12] Other researchers show what they call an "implied truth" effect, by which "false stories that fail to get tagged are considered validated and thus are seen as more accurate."[13] Consequently, it will be critical to ensure that any new efforts that tag false content undergo empirical evaluations to ensure that the regimens achieve their intended effect.

[11] Tessa Lyons, "Replacing Disputed Flags with Related Articles," Facebook, December 20, 2017. As of August 30, 2018: https://newsroom.fb.com/news/2017/12/news-feed-fyi-updates-in-our-fight-against-misinformation
[12] Jeff Smith, Grace Jackson, and Seetha Raj, "Designing Against Misinformation," Medium, December 20, 2017. As of August 30, 2018: https://medium.com/facebook-design/designing-against-misinformation-e5846b3aa1e2
[13] Gordon Pennycook and David G. Rand, "The Implied Truth Effect: Attaching Warnings to a Subset of Fake News Stories Increases Perceived Accuracy of Stories Without Warnings," working paper, December 8, 2017. As of August 30, 2018: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3035384

Question 2: Should there be disclaimers on anything other than personal information?

Answer: Unfortunately, our study on Russian social media operations in Eastern Europe did not address this type of policy response, so I will refrain from answering this question.

Question 3: Should everything posted on social media have a "tag" that allows users to determine who posted information, even if it was re-posted or shared by another person, so you can always determine the actual source?

Answer: Unfortunately, our study on Russian social media operations in Eastern Europe did not address this type of policy response, so I will refrain from answering this question.

Question from Senator Angus King

Question 1: At the hearing on August 1, 2018, I asked each witness to submit written policy recommendations to the Committee. Specifically, please provide recommendations on the following topics:

- Technical solutions, such as requirements to label bot activity or identify inauthentic accounts;
- Public initiatives focused on building media literacy;
- Solutions to increase deterrence against foreign manipulation; and
- Any additional policy recommendations.

Answer: While we were conducting field research in Estonia and Latvia and having phone conversations with numerous other regional activists, the recommendation we heard most frequently was the need for media literacy training. Several such efforts in Eastern Europe are currently under way.
For example, the nongovernmental organization Baltic Centre for Media Excellence, with some international funding, provides training to journalists in the Baltics and conducts media literacy training in the region. In addition to helping journalists avoid becoming "unwitting multipliers of misleading information," the organization works with school teachers in the region to help them "decode media and incorporate media research into teaching." The program also works to guide school children with media production programs and to help raise awareness of fake news on social media. In addition, the U.S. embassy in Latvia was looking to initiate media literacy programming. A local tech entrepreneur in Latvia is interested in creating a nongovernmental organization startup that would advocate for broader media literacy training and develop a Baltic-focused, crowdsourced fact-checking website along the lines of the popular English-language fact-checking site Snopes.[14]

Beyond these disparate efforts, we recommended establishing media literacy training as part of a national curriculum. Both Canada and Australia have developed such curricula. In addition, Sweden, based on concerns about Russian fake news and propaganda, has launched a nationwide school program to teach students to identify Russian propaganda.

Given that a curriculum-based training program will take time to develop and establish impact, we recommended that authorities in Eastern Europe launch a public information campaign that teaches the concepts of media literacy to a mass audience. This campaign, disseminated via conventional and social media, could be targeted to the populations in greatest need. It is likewise possible to meld public information campaigns with social media-driven training programs. Facebook has also launched its own media literacy campaign, most recently marked by distributing tips to users for spotting fake news stories.[15] As we noted, it would certainly be possible to develop such efforts for an East European and Ukrainian audience.

In theory, helping audiences, including those in the United States, better access, analyze, and evaluate media messages and their accuracy can help reduce the plague of fake news and limit the ability of Russia to blindly influence the U.S. public. However, the scientific evidence that media literacy training helps audiences detect the types of content produced by propagandists remains limited. Consequently, for both U.S. and European media literacy initiatives, it will be critical to scientifically evaluate the impact of such initiatives and determine the types of trainings (e.g., online versus offline, short course versus long course) that are suitable for specific audiences, mediums, and content.

[14] Helmus et al., 2018.
[15] See, for example, Facebook, "Tips to Spot False News," webpage, 2018. As of August 30, 2018: https://www.facebook.com/help/188118808357379