Understanding information disorder: three pillars for a new framework

The Council of Europe published a report titled “Information Disorder: Toward an Interdisciplinary Framework for Research and Policy Making” on September 27, 2017. The report, prepared by Claire Wardle and Hossein Derakhshan, grew out of an effort to offer a more nuanced perspective on information pollution, a problem too often reduced to the concept of “fake news.” The report scrutinizes information disorder through three definitions that anchor the new framework it presents: dis-information, mis-information and mal-information. I will explain these concepts with various examples, some of which come from the content of teyit.org.

Three types of information pollution: dis-, mis-, mal-information

The authors of the report discuss three types of information disorder, namely dis-information, mis-information and mal-information, along two dimensions: 1- the falsity of the information being spread and 2- the harm intended by spreading it. This conceptualization therefore distinguishes between an elderly Facebook user who spreads content posted in a Facebook group without realizing it is photoshopped, and the Macedonian teenagers who knowingly generated false content to help Donald Trump win the election. Concepts such as “information pollution” and “fake news” cannot make such distinctions.

Before moving on to the definitions, it is worth pointing out that it is unclear whether the authors have in mind the moment at which the information is created (“… is [not] . . . created” [p. 20]) or the moment at which it is shared (“… is when . . . shared” [p. 5]). For example, according to the definition on page 5 of the report, mis-information results from the “sharing of false information without the intent to harm,” whereas on page 20 it is defined as “false information created without the intent to harm.” There is a meaningful difference between the two definitions. If we limit the scope to the moment of creation, the concept of mis-information shrinks considerably: it is plausible to reckon that the deliberate creation of false information without the goal of harming would be limited to exceptional cases such as misunderstanding and irony. Considering the examples of mis-information the authors give (rumors shared by people caught in the heat of the moment during a crisis [p. 22] and the spread of a fake image depicting a swimming shark during Hurricane Sandy [p. 36]), it appears that both the moment of creation and the moment of sharing are intended to be captured. We could always open an epistemological discussion by asking whether information is recreated in every instance of sharing, but that is outside the scope of this article.

Disinformation = false information + intent to harm (ill will)

In the report, dis-information is defined as false information that is knowingly shared with an intent to harm (and the act of spreading such information). Although requiring ill will as an element may create practical difficulties when it comes to telling the concepts apart, disinformation is the only one of the three that is part of everyday Turkish (“dezenformasyon”). According to Oxford, this relatively young word originates from the Russian “dezinformatsiya” and has been used in English since the 1950s. The dictionary defines it as “false information which is intended to mislead” and gives a government-sponsored disinformation campaign as an example. Another example would be President Recep Tayyip Erdoğan, who showed a photo of former President İsmet İnönü during a speech and claimed that İnönü was waving a US flag in it. It is obvious that Erdoğan, who sparked a debate with the claim that İnönü held only a US flag during an official ceremony, acted with the aim of harming the opposition party, CHP, by accusing it of “Americanism” with the photo as evidence. In fact, İnönü was holding both a Turkish and a US flag. The example therefore contains both elements required for disinformation: false information and the intent to harm.

Apart from this blatant example, shrewder and non-political instances of disinformation can be found as well. On August 25, 2000, a press release about the computer-parts manufacturer Emulex appeared on the news portal Internet Wire, reporting that the company’s CEO had resigned and that its 1998 and 1999 earnings had been restated as losses instead of the initially reported gains. The story, which was published around the opening of the stock markets and was picked up by recognized sources such as Bloomberg, Dow Jones and CBS MarketWatch, caused the company’s stock to lose 61 percent of its value within a few hours, wiping out approximately $2 billion in market value. The release had been planted by Mark Jakob, a 23-year-old college student and former Internet Wire employee. Working through a broker, Jakob had “short sold” Emulex stock on August 17 and 18, 2000: he borrowed shares and sold them (at around $80), with the obligation to buy them back later. He therefore stood to profit if the stock lost value. Instead, Emulex rose to approximately $113 and he lost about $97,000. Jakob found the solution in disinformation: as the stock collapsed after his fake press release, he repeated the trade and ultimately made a profit of about $240,000. He was later arrested, fined and sentenced to 44 months in prison.
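For readers unfamiliar with the mechanics, the arithmetic of a short sale is simple enough to sketch in a few lines of Python. Only the roughly $80 sale price, the roughly $113 peak and the approximately $97,000 loss come from the story above; the share count and the post-hoax buyback price below are illustrative assumptions of mine, not figures from the case.

```python
# A minimal sketch of short-sale profit and loss.
# Assumed values: the share count (3,000) and the post-hoax buyback
# price ($45) are hypothetical; they are not figures from the case.

def short_sale_pnl(shares: int, sale_price: float, buyback_price: float) -> float:
    """A short seller sells borrowed shares first and buys them back later,
    so the profit is shares * (sale price - buyback price)."""
    return shares * (sale_price - buyback_price)

# Before the fake press release: shorted at ~$80, the stock climbs to ~$113.
print(short_sale_pnl(3_000, 80.0, 113.0))  # -99000.0, close to the reported ~$97,000 loss
# After the fake press release: the stock collapses, so covering turns a profit.
print(short_sale_pnl(3_000, 80.0, 45.0))   # 105000.0 (hypothetical buyback price)
```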

Misinformation = false information + mistake (good will)

The second concept in the report, mis-information, refers to false information shared without the intent to harm (and the process of spreading such information). Even though the difference between “disinformation” and “misinformation” seems quite apparent within the logic of the report, it can be said that the authors attributed a meaning to the concept different from its traditional one. The word “misinformation” has existed in English for over 400 years, and prominent dictionaries define it with small but significant differences. Oxford defines it as “false or inaccurate information, especially that which is deliberately intended to deceive.” Cambridge gives two meanings: “wrong information” and “information intended to deceive.” Merriam-Webster, on the other hand, has “incorrect or misleading information,” and dictionary.com has “false information that is spread, regardless of whether there is intent to mislead.” According to the dictionaries, then, the concept also encompasses “disinformation,” since it covers “false information” in general. Nevertheless, since the subject of this article is the conceptualization presented in the Council of Europe report, I will give examples consistent with it. The constituent elements of misinformation are 1- false information (as in disinformation) and 2- the absence of intent to harm (in contrast with disinformation).

Accordingly, a user who engages in misinformation does so mistakenly. Unlike disinformation, there is no intent and, presumably, no planning. Moments of disaster and panic therefore provide fertile ground for misinformation. In such cases false information, which probably originates in an act of disinformation, may be spread by well-intentioned people caught in the heat of the moment or trying to help, but usually because they have not adequately researched and verified the information’s accuracy. An apt example is some of the content shared after terrorist attacks with the claim that it comes straight from the scene. A Facebook user can likewise contribute to misinformation by sharing content such as “superfood that burns fat” or “miraculous diet that guarantees losing 10 kg in three days.” One example is the claim, shared on social media, that frozen lemon is more effective than chemotherapy in treating cancer. Obviously, the individuals who share such misinformation do not intend to harm. The unpleasantly popular anti-vaccine myths are another health-related example: it would be far-fetched to think that parents who share them intend to harm others, yet, albeit unknowingly, they risk doing irreversible damage to children by engaging in misinformation.

Mal-information = true information + intent to harm

The last concept is mal-information, which has not yet entered prominent dictionaries. One of the authors of the report, Hossein Derakhshan, has stated that they introduced the term. According to the authors, mal-information refers to the use of true information with the intent to harm. The term typically denotes the conveyance of information that ought to remain private into the public sphere. Like disinformation, it involves the intent to harm; it differs from both disinformation and misinformation in that the information at hand is true. The authors illustrated the common points of the three concepts, along with various examples, in a Venn diagram.

[Figure: the report’s Venn diagram showing dis-information, mis-information and mal-information, where they overlap, and examples of each.]

Attention should be drawn to the fact that, in contrast to the other definitions, this diagram is based on the moment of the information’s creation, and on its harmfulness assessed in an ostensibly objective manner rather than on the intent to harm.
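Since the framework ultimately rests on two yes/no questions (is the information false? is harm intended?), its decision logic can be summarized in a short sketch. The function and label names below are my own illustrative choices, not terminology from the report, and real cases are of course rarely this clear-cut.

```python
# A minimal sketch of the report's two-dimensional classification:
# falsity of the information x intent to harm.

def classify(is_false: bool, intends_harm: bool) -> str:
    """Map a piece of content onto the report's three categories."""
    if is_false and intends_harm:
        return "dis-information"          # false + intent to harm
    if is_false:
        return "mis-information"          # false + no intent to harm
    if intends_harm:
        return "mal-information"          # true + intent to harm
    return "no information disorder"      # true + no intent to harm

# The İnönü photo claim: false content used to damage an opponent.
print(classify(is_false=True, intends_harm=True))    # dis-information
# The Hurricane Sandy shark photo shared in good faith.
print(classify(is_false=True, intends_harm=False))   # mis-information
# True but private e-mails leaked to hurt a candidate.
print(classify(is_false=False, intends_harm=True))   # mal-information
```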

The prefix “mal-” originates from the Latin word “male,” which basically means “badly.” In the diagram above, the authors give as examples of mal-information leaked materials (e-mails, videos, etc.), information used for harassment (e.g., announcing the religious identity of a person belonging to a religious minority with the purpose of harassment) and “hate speech,” a concept I find problematic for the inherent ambiguity of its limits. In the diagram, the qualifier “some” is added before leaks, harassment and hate speech (“(some) leaks, (some) harassment, (some) hate speech”). The authors’ statement on page 22 (“However, hate speech, harassment and leaks raise a significant number of distinct issues, and there is not space in this report to consider those as well”) suggests they are aware of the problematic, ambiguous aspects of the mal-information concept.

One of these issues is addressed well by the example of mal-information the authors provide: Emmanuel Macron’s e-mails, leaked just before the second round of the French presidential election in May 2017. The fundamental question, however, is when a leaked material constitutes mal-information. At which stage, precisely, does the leak of the personal e-mails of Macron, who had resigned roughly a year earlier to run for president, become an act of mal-information? How much does the identity of the person the leaked materials belong to matter, given that Macron, as a presidential candidate, was a public figure? Does the controversy arise from the content of the leaked material? For instance, where can we draw the line if the leaked e-mails of a public figure are unrelated to his job yet push the boundaries of ambiguous concepts such as ‘public morality’ or ‘public interest’? Quoting Erdoğan’s remarks about allegedly leaked sex tapes of his political opponents, when is this “a matter of public rather than private (parties)”?

At which point do the ‘leaked documents’ of a government that abides by neither international nor domestic law constitute a ‘state secret’? When do they conflict with human rights? What are the limits of a ‘state secret’?

Predictably, similarly controversial questions arise with regard to hate speech. What exactly constitutes hate speech? When are statements attacked as hate speech in fact entirely within the scope of freedom of speech? Jeremy Waldron, a professor of law and philosophy, identifies minorities as those on the receiving end in his definition of hate speech (p. 27). Along the same line of thought, how do Islamophobic discourses constitute hate speech “in a country 99 percent of which is Muslim,” to follow the popular discourse? And who can guarantee that a regulation on hate speech, fully implemented and prosecuted, will actually protect minorities rather than result in prosecutions over alleged claims of Islamophobia, considering the servile status of the Turkish judiciary?

Although I find Claire Wardle and Hossein Derakhshan’s conceptualization extremely valuable and promising for the development of a fact-checking terminology, it could do with more refinement. For instance, since it is quite difficult to determine innocence at the various stages of a piece of information’s dissemination, a conceptualization resting on this ambiguous basis will inevitably produce gray areas. Likewise, in a process such as information disorder, which involves at least thousands of different users and motivations, it is not hard to imagine examples in which the concepts overlap or are practically impossible to distinguish. Is misinformation that involves hate speech not possible, for instance? Obviously, refinements addressing this and similar points, and larger developments in the field, cannot be achieved with a couple of articles on short notice. Fact-checking is a young field, after all. Developing an original terminology requires attempts like these, baby steps, and for those baby steps to spread until the meanings they carry become part of the language beyond any doubt. We are just at the beginning.

Sources

BBC, “The city getting rich from fake news”, December 5, 2016

Oxford English Dictionary, Disinformation

YouTube, Erdoğan’s statement about İnönü holding a US flag, October 7, 2018

teyit.org, “The claim that a photo shows İsmet İnönü holding only a U.S. flag”, October 8, 2018

Brendon Fowler, Cara Franklin & Robert Hyde, “Internet Securities Fraud: Old Trick, New Medium”, February 28, 2001

Oxford English Dictionary, Misinformation

Cambridge English Dictionary, Misinformation

Merriam-Webster English Dictionary, Misinformation

Dictionary.com English Dictionary, Misinformation

teyit.org, “The claim that frozen lemon is more effective than chemotherapy in cancer”, July 17, 2018

Verywell Family, 50 anti-vaccine myths, November 27, 2018

Time, An article about anti-vaccination and misinformation, March 5, 2014

Twitter, a tweet by Derakhshan, December 18, 2017

Oxford English Dictionary, Mal-

Jeremy Waldron, The Harm in Hate Speech, 2012

Translation: Sonay Ün

Cover Image: Illustration / Sébastien Thibault
