(Image from Harvard Library)
People often confuse misinformation and disinformation. Generally, misinformation is incorrect or misleading information. Disinformation is incorrect or misleading information that is deliberately intended to deceive people. Disinformation is a relatively new word that may date back only to the 1960s.
Definitions
Confusing the two words is understandable. A few reputable lexicons, such as the Cambridge Dictionary and Collins English Dictionary, define misinformation in a way that looks like the widely accepted definition of disinformation. If you happen to look up the terms there, you won't find the commonly accepted distinction I'm using. You will find it at Merriam-Webster, Dictionary.com, Harvard University, the American Psychological Association and so on.
It can be helpful to remember that misinformation and mistake start the same way, and that isn't a coincidence: misinformation is mistaken information. The prefix dis- in English can mean negation or the inverse of whatever follows it. Disinformation is the inverse of honest information: intentional and malevolent.
How It’s Constructed
Last November I wrote about how I sift information to decide what’s believable and what’s garbage. Whenever you draw my attention to information you want me to think about, that’s the essence of how I evaluate it. But I also consider how it’s constructed and what comes with it or rides alongside it.
Misinformation can develop a life of its own and start zipping around, but disinformation is actively pushed. The more organized and deliberate the spreading of false information looks, the more likely it is to be disinformation.
Disinformation is often an artful, deliberate construct. It frequently uses snippets of fact as a veneer of validity, but embeds those snippets in a nest of falsehoods aimed at whatever its proponents want to undermine.
Disinformation is designed to deceive people who stay too much in what Daniel Kahneman calls System 1 thinking. Everyone needs to use System 1 most of the time because it is fast, easy, automatic and intuitive. We have to make so many decisions and judgements in the course of ordinary life that we would get hardly anything done if we used slow, analytical, rational, unemotional System 2 most of the time.
Those of us in certain professions, such as mine, learn to make ourselves use System 2 on a regular basis. If we don't use it enough, we do terrible work in our rocket science, engineering, architecture and so on. We can't do well in such professions unless we make the extra effort to use System 2 thinking for important decisions and judgements.
System 2 is good at detecting misinformation and disinformation. The way to detect bad information is to use System 2 thinking to evaluate it. That’s why disinformation tries to hit emotional hot buttons and keep us in System 1.
What Comes With It
False material is especially likely to be disinformation when its proponents are adamant about ensnaring us in a prolonged, detailed discussion advocating ideas that don't stand on solid ground. The proponents string out the discussion as long as possible and act offended any time we try to end the engagement. This tactic aims to exhaust our energy, take up our time with no possibility of changing the proponents' minds, and divert us from putting fruitful effort into whatever the proponents want to undermine.
I could go into more detail about how this tactic works, but I would rather not inadvertently coach anyone who is trying to do it. There are occasions when it can be used for a positive purpose, but it is too commonly used for negative purposes. The key here is that this tactic often accompanies disinformation, where taking our time and energy away from productive activity is its purpose. The only good way to handle it is to refuse to be pulled in.
My Point
I like getting input from readers.
From that input, I want to promote only material that does reasonably well in my assessment of viability. The video by Dr. Klotman is an example. I passed it along to you because most of it looked solid to me. I called out the portions that I believe are misinformation, apparently mistakes rather than intentional attacks on truth, and they made up only a small part of the video. Everybody makes some mistakes. We can take in the solid parts of the video and set aside the small parts that contradict factual material.
Whenever I push back hard on some reader input, it is because the material looks to me like it contains too much misinformation or disinformation. When I refuse to get into a prolonged discussion about it, that means I see hallmarks of disinformation, and demands for a prolonged discussion look like the energy-sucking tactic I described.
This community has grown a bit lately. If you are new here, this should give you a sense of how I operate. I’m glad you are here. I hope you find my posts and methods worth your while.