Disinformation





41 featured resources (updated Oct 20, 2020)


Government Reports

  • Antitrust and ‘Big Tech’ [September 11, 2019]

    From the Document: “Over the past decade, Google, Amazon, Facebook, and Apple (‘Big Tech’ or the ‘Big Four’) have revolutionized the internet economy and affected the daily lives of billions of people worldwide. While these companies are responsible for momentous technological breakthroughs and massive wealth creation, they have also received scrutiny related to their privacy practices, dissemination of harmful content and misinformation, alleged political bias, and–as relevant here–potentially anticompetitive conduct. In June 2019, the Wall Street Journal reported that the Department of Justice (DOJ) and Federal Trade Commission (FTC)–the agencies responsible for enforcing the federal antitrust laws–agreed to divide responsibility over investigations of the Big Four’s business practices. Under these agreements, the DOJ reportedly has authority over investigations of Google and Apple, while the FTC will look into Facebook and Amazon.”

    Library of Congress. Congressional Research Service

    Freeman, Wilson C.; Sykes, Jay B.

    2019-09-11

  • Can Public Diplomacy Survive the Internet?: Bots, Echo Chambers, and Disinformation

    “Scientific progress continues to accelerate, and while we’ve witnessed a revolution in communication technologies in the past ten years, what proceeds in the next ten years may be far more transformative. It may also be extremely disruptive, challenging long held conventions behind public diplomacy (PD) programs and strategies. In order to think carefully about PD in this ever and rapidly changing communications space, the Advisory Commission on Public Diplomacy (ACPD) convened a group of private sector, government, and academic experts at Stanford University’s Hoover Institution to discuss the latest trends in research on strategic communication in digital spaces. The results of that workshop, refined by a number of follow-on interviews and discussions, are included in this report. I encourage you to read each of the fourteen essays that follow, which are divided into three thematic sections: Digital’s Dark Side, Disinformation, and Narratives.”

    United States. Department of State

    Powers, Shawn; Kounalakis, Markos

    2017-05

  • China Deep Dive: ‘A Report on the Intelligence Community’s Capabilities and Competencies with Respect to the People’s Republic of China’ (Redacted)

    From the Introduction: “For the first time in three decades the United States is confronted by the rise of a global competitor. How the United States Intelligence Community meets the challenge of China’s arrival on the global stage, as well as the continued potential for highly disruptive transnational crises that originate within our competitors’ borders, the profound technological change transforming societies and communication across the globe, and the international order’s return to near-peer competition will have profound and long-lasting implications on our nation’s continued security, economic prosperity, and ability to preserve America’s democratic way of life.”

    United States. Congress. House. Permanent Select Committee on Intelligence

  • Combatting Targeted Disinformation Campaigns: A Whole-Of-Society Issue

    From the Disinformation Overview: “[T]he purpose of disinformation is to mislead. [D]isinformation is information created and distributed with the express purpose of causing harm. […] A targeted disinformation campaign, in the context of this paper, is more insidious than simply telling lies on the internet. One untrue meme or contrived story may be a single thread in a broader operation seeking to influence a target population through methods that violate democratic values, social norms and, in some jurisdictions, the law. […] Targeted disinformation campaigns are not a new phenomenon and sophisticated ones follow a predictable progression. [A]fter establishing the objective, a threat actor follow[s] distinct steps, discussed later in more detail: recon, build, seed, copy, amplify, and control to bring about an outcome.”

    United States. Department of Homeland Security

    2019-10

  • ‘Disinformation Online and a Country in Crisis’

    From the Overview: “Facebook’s Mark Zuckerberg has tried to frame the issue of reining in mis- and disinformation as not wanting to be ‘the arbiter of truth’. This entirely misses the point. The point is not about truth or falsehood, but about algorithmic amplification. The point is that social media decides every day what is relevant by recommending it to their billions of users. The point is that social media has learned that outrageous, divisive, and conspiratorial content increases engagement. The point is that online content providers could simply decide that they value trusted information over untrusted information, respectful over hateful, and unifying over divisive, and in turn fundamentally change the divisiveness-fueling and misinformation-distributing machine that is social media today. By way of highlighting the depth and breadth of these problems, I will describe two recent case studies that reveal a troubling pattern of how the internet, social media, and more generally, information, is being weaponized against society and democracy. I will conclude with a broad overview of interventions to help avert the digital dystopia that we seem to be hurtling towards.”

    United States. Congress. House. Committee on Energy and Commerce

    Farid, Hany

    2020-06-24?

  • First Responder’s Toolbox: Violent Extremists Likely Will Continue to Use Disinformation on Social Media Outlets to Instill Fear and Radicalize Others

    “This product highlights examples of official media releases by designated foreign terrorist organizations, such as ISIS [Islamic State of Iraq and Syria], and unofficial media releases by auxiliary news agencies and terrorist supporters. It is intended to describe how terrorists use disinformation to potentially influence Homeland and Western audiences, and introduce steps that can be taken to determine the credibility of the messaging.”

    National Counterterrorism Center (U.S.)

    2018-08-09

  • Government Responses to Disinformation on Social Media Platforms

    From the Comparative Summary: “Concerns regarding the impact of viral dissemination of disinformation on democratic systems of government, on political discourse, on public trust in state institutions, and on social harmony have been expressed by many around the world. These concerns are shared by countries with advanced economies as well as those with emerging and developing economies. […] This report is composed of individual surveys of the European Union (EU) and fifteen selected countries from around the globe. The countries surveyed vary geographically, culturally, in their systems of government, and in their commitment to democratic principles of governance, which include protections for freedom of expression, the right to privacy, and transparency and oversight of governmental actions, among other things. The surveys were prepared by the foreign law specialists and analysts of the Law Library of Congress’s Global Legal Research Directorate based on primary and secondary sources available in the Law Library of Congress’s collections, legal databases to which it subscribes, and open sources.”

    Law Library of Congress (U.S.)

    Levush, Ruth; Buchanan, Kelly (Kelly S.); Ahmad, Tariq . . .

    2019-09

  • Soviet ‘Active Measures’ Forgery, Disinformation, Political Operations

    “In late 1979, agents of the Soviet Union spread a false rumor that the United States was responsible for the seizure of the Grand Mosque of Mecca. In 1980, a French journalist was convicted by a French court of law for acting as a Soviet agent of influence since 1959. In August 1981, the Soviet news agency TASS alleged that the United States was behind the death of Panamanian leader Omar Torrijos. These are three examples of a stream of Soviet ‘active measures’ that seek to discredit and weaken the United States and other nations. The Soviets use the bland term ‘active measures’ (‘aktivnyye meropriyatiya’) to refer to operations intended to affect other nations’ policies, as distinct from espionage and counterintelligence.”

    United States. Department of State. Bureau of Public Affairs

    1981-10

  • Weapons of Mass Distraction: Foreign State-Sponsored Disinformation in the Digital Age

    From the Document: “If there is one word that has come to define the technology giants and their impact on the world, it is ‘disruption.’ The major technology and social media companies have disrupted industries ranging from media to advertising to retail. However, it is not just the traditional sectors that these technologies have upended. They have also disrupted another, more insidious trade – disinformation and propaganda. The proliferation of social media platforms has democratized the dissemination and consumption of information, thereby eroding traditional media hierarchies and undercutting claims of authority. The environment, therefore, is ripe for exploitation by bad actors. Today, states and individuals can easily spread disinformation at lightning speed and with potentially serious impact. […] The following interdisciplinary review attempts to shed light on these converging factors, and the challenges and opportunities moving forward.”

    Park Advisors

    Nemr, Christina; Gangware, William

    2019-03

Hearings

  • Examining Social Media Companies’ Efforts to Counter on-Line Terror Content and Misinformation, Hearing Before the Committee on Homeland Security, House of Representatives, One Hundred Sixteenth Congress, First Session, June 26, 2019

    This is the June 26, 2019 hearing titled “Examining Social Media Companies’ Efforts to Counter on-Line Terror Content and Misinformation,” held before the House Committee on Homeland Security. From the opening statement of Bennie G. Thompson: “In March, a white supremacist terrorist killed 51 people and wounded 49 more at two mosques in Christchurch, New Zealand. […] Shockingly, the terrorist was able to live-stream the attack on Facebook, where the video and its gruesome content went undetected initially. […] When New Zealand authorities called on all social media companies to remove these videos immediately, they were unable to comply. Human moderators could not keep up with the volume of videos being reposted, and their automated systems were unable to recognize minor changes in the video. […] This committee will continue to engage social media companies about the challenges they face in addressing terror content on their platforms.” Statements, letters, and materials submitted for the record include those of the following: Monika Bickert, Nick Pickles, Derek Slater, and Nadine Strossen.

    United States. Government Publishing Office

    2020

  • Russian Disinformation Attacks on Elections: Lessons from Europe, Hearing Before the Subcommittee on Europe, Eurasia, Energy, and the Environment of the Committee on Foreign Affairs, House of Representatives, One Hundred Sixteenth Congress, First Session, July 16, 2019

    This is the July 16, 2019 hearing on “Russian Disinformation Attacks on Elections: Lessons from Europe,” held before the U.S. House Subcommittee on Europe, Eurasia, Energy, and the Environment of the Committee on Foreign Affairs. From the opening statement of William R. Keating: “Today’s hearing is on Russia’s attacks on democratic elections through targeted disinformation campaigns and the takeaways from Europe where this activity has been accelerating for years. It is on what the EU and the European countries are doing themselves, what has been effective, what has not been, lessons learned.” Statements, letters, and materials submitted for the record include those of the following: Daniel Fried, Jessikka Aro, Jakub Kalensky, and Frederick W. Kagan.

    United States. Government Publishing Office

    2019

  • S. Hrg. 115-232: Open Hearing: Social Media Influence in the 2016 U.S. Election, Hearing Before the Select Committee on Intelligence, United States Senate, One Hundred Fifteenth Congress, First Session, November 1, 2017

    This document is the November 1, 2017 open hearing titled “Social Media Influence in the 2016 U.S. Election,” held before the Senate Select Committee on Intelligence. From the opening statement of Richard Burr, on the role social media platforms played in spreading disinformation and discord during the 2016 elections: “This is an opportunity for each of you to tell your respective stories and, if necessary, correct the record. My sense is that not all aspects of those stories have been told accurately. I’ll note for the record that this Committee is now having its seventeenth open hearing this year, and the twelfth at which we’ll be discussing Russia and Russia’s activities. Today, I’m hopeful we can provide the American people with an informed and credible assessment of how foreign actors used your platforms to circulate lies and to agitate unrest during last year’s elections.” Statements, letters, and materials submitted for the record include those of the following: Richard Burr, Chairman; Mark R. Warner, Vice Chairman; Colin Stretch; Sean Edgett; and Kent Walker.

    United States. Government Publishing Office

    2017

  • S. Hrg. 115-40, Pt. 1: Disinformation: A Primer in Russian Active Measures and Influence Campaigns, Panel I, Hearing Before the Select Committee on Intelligence of the United States Senate, One Hundred Fifteenth Congress, First Session, March 30, 2017

    This is the March 30, 2017 hearing on “Disinformation: A Primer in Russian Active Measures and Influence Campaigns, Panel I” before the Select Committee on Intelligence of the United States Senate. From Senator Richard Burr’s opening statement: “This morning the committee will engage in an activity that’s quite rare for us, an open hearing on an ongoing critical intelligence question: the role of Russian active measures past and present. As many of you know, this committee is conducting a thorough, independent, and nonpartisan review of the Russian active measures campaign conducted against the 2016 U.S. elections. Some of the intelligence provided to the committee is extremely sensitive, which requires that most of the work be conducted in a secure setting to maintain the integrity of the information and to protect the very sensitive sources and methods that gave us access to that intelligence. However, the Vice Chairman and I understand the gravity of the issues that we’re here reviewing and have decided that it’s crucial that we take the rare step of discussing publicly an ongoing intelligence question. That’s why we’ve convened this second open hearing on the topic of Russian active measures, and I can assure you to the extent possible the committee will hold additional open hearings on this issue.” Statements, letters, and materials submitted for the record include those of the following: Roy Godson, Eugene Rumer, and Clint Watts.

    United States. Government Publishing Office

    2017

  • S. Hrg. 115-40, Pt. 2: Disinformation: A Primer in Russian Active Measures and Influence Campaigns, Panel II, Hearing Before the Select Committee on Intelligence of the United States Senate, One Hundred Fifteenth Congress, First Session, March 30, 2017

    This is the March 30, 2017 hearing on “Disinformation: A Primer in Russian Active Measures and Influence Campaigns, Panel II,” held before the Select Committee on Intelligence of the United States Senate. The purpose of this hearing was to establish the extent to which a foreign adversary interfered in the 2016 presidential election in the United States. The witnesses presented unclassified information for Congress to determine the severity and impact of the events that took place. Statements, letters, and materials submitted for the record include those of the following: Kevin Mandia, Keith B. Alexander, and Thomas Rid.

    United States. Government Publishing Office

    2017

  • S. Hrg. 115-460: Open Hearing on Foreign Influence Operations’ Use of Social Media Platforms (Company Witnesses), Hearing Before the Select Committee on Intelligence of the United States Senate, One Hundred Fifteenth Congress, Second Session, September 5, 2018

    This is the September 5, 2018 hearing titled “Open Hearing on Foreign Influence Operations’ Use of Social Media Platforms (Company Witnesses)” held before the Senate Select Committee on Intelligence. From the opening statement of Richard Burr: “The purpose of today’s hearing is to discuss the role that social media plays in the execution of foreign influence operations. In the past, we’ve used terms like misinformation and divisive content to describe this activity.” Statements, letters, and materials submitted for the record include those of the following: Sheryl Sandberg and Jack Dorsey.

    United States. Government Publishing Office

    2019

  • Serial No. 114-37: Confronting Russia’s Weaponization of Information, Hearing Before the Committee on Foreign Affairs, House of Representatives, One Hundred Fourteenth Congress, First Session, April 15, 2015

    This is the April 15, 2015 hearing on “Confronting Russia’s Weaponization of Information” held before the House Committee on Foreign Affairs. From the opening statement of Edward R. Royce: “And today we are going to look at the danger of Russia’s misinformation campaign in Europe and, indeed, today that misinformation campaign is worldwide and we are also going to look at the failed response to that effort. And as we will hear today, Russia’s propaganda machine is really at this time in overdrive and part of the focus, from my standpoint, seems to be to subvert democratic stability. And, frankly, there is also an element of this that goes to the issue of fomenting violence in Eastern Europe.” Statements, letters, and materials submitted for the record include those of the following: Peter Pomerantsev, Elizabeth Wahl, and Helle C. Dale.

    United States. Government Publishing Office

    2015

International Perspective

Research & Analysis

  • Fake News, (Dis)information and Principle of Non-Intervention: Scope, Limits and Possible Responses to Cyber Election Interference in Times of Competition

    From the Abstract: “In the era of asymmetrical conflicts, Information and Communication Technologies (ICT) play an essential role due to their importance in the manipulation and conditioning of public opinion. Several threats are linked to the use of ICT but, in terms of inter-state strategic competition, one of the main dangers is represented by so-called ‘cyber election interference’, i.e. cyber election meddling activities carried out by foreign States to influence the electorate of a target State through the diffusion of ‘fake news’ or ‘alternative truths’, principally via the media and social networks (Facebook, Twitter, YouTube, etc.). The aim of this paper is to clarify whether and when this kind of interference constitutes a breach of international obligations, in particular of the principle of non-intervention in the internal affairs of a State, and also to envisage possible lawful responses under international law for States targeted by said interference.”

    Army Cyber Institute, West Point

    Rotondo, Annachiara; Salvati, Pierluigi

    2018-11-14

  • Misinformation Contagion: A View Through an Epidemiological Lens

    From the Thesis Abstract: “Misinformation and disinformation have increasingly been a focus of public and media scrutiny in recent years. What differentiates past forms of misinformation from present-day are the new tools of information warfare–primarily the internet, and specifically social media platforms–which have effectively weaponized intentional false narratives directed at populations most vulnerable to manipulation. Where there is a lack of diverse populations willing to think critically about important issues, the mass nudging of social and political opinion via misinformation and disinformation both widens societal divides and stimulates action (or sometimes inaction) based on a false narrative. This thesis explores how we can better understand and address the proliferation of misinformation by viewing it through an epidemiological lens. To aid in this examination, the processes of cognitive bias will be explained as they relate to interventional opportunities to prevent contraction and spread, develop immunity, and treat the disease of misinformation. Recommendations focus on building individual and herd immunity to false narratives, reducing the virulence of these messages, and making online environments less conducive to the spread of misinformation. These steps require significant commitment to policies that will be difficult to achieve in a partisan and polarized sociopolitical environment, but they are necessary to support fact-based democratic discourse and decision-making.”

    Naval Postgraduate School (U.S.); Naval Postgraduate School (U.S.). Center for Homeland Defense and Security

    Fenton, Scott C.

    2019-12

  • Moment of Change: Challenges and Opportunities When Covering Hate Speech and Mis/Disinformation

    From the Executive Summary: “Since the 2016 U.S. presidential election, news coverage of hate speech and mis/disinformation has skyrocketed. What was once a sleepy beat led by freelancers and activists has become a central topic of coverage for almost every news organization. As the news cycle is transformed by coverage of the COVID-19 [coronavirus disease 2019] pandemic and coverage of the 2020 presidential election ramps up, this beat is again at a critical juncture. To better understand the challenges and changes associated with this inflection point, we conducted 10 in-depth interviews with prominent journalists covering this beat. These interviews underscore critical debates in the field about platform accountability, the news agenda and news organizations’ infrastructure and support systems.”

    Massachusetts Institute of Technology. Media Laboratory

    Dave, Aashka; Chen, Claudia; Zuckerman, Ethan

    2020-06

Websites

Author: Leena Oh
