As the holidays approach, you will probably get into political and cultural debates with family and friends. We at Snopes have put together a guide to fallacies, particularly false arguments that attack facts through methods that seem logical and can be deceptively convincing.
A "fallacy," according to the Internet Encyclopedia of Philosophy (IEP), is "a kind of error in reasoning." Logical fallacies are based on poor or faulty logic, and are used because they make an argument seem more persuasive or valid than it really is.
The IEP says fallacies can be classified in numerous ways including as formal or informal: "A formal fallacy can be detected by examining the logical form of the reasoning, whereas an informal fallacy depends upon the content of the reasoning and possibly the purpose of the reasoning. That is, informal fallacies are errors of reasoning that cannot easily be expressed in our system of formal logic (such as symbolic, deductive, predicate logic)."
The encyclopedia also argues that logical fallacies can be grouped in the following ways: "(1) the reasoning is invalid but is presented as if it were a valid argument, or else it is inductively much weaker than it is presented as being, (2) the argument has an unjustified premise, or (3) some relevant evidence has been ignored or suppressed. Regarding (2), a premise can be justified or warranted at a time even if we later learn that the premise was false, and it can be justified if we are reasoning about what would have happened even when we know it didn't happen."
The IEP has compiled a partial list of fallacies that can come into play during a heated debate. We've written about many of them in depth over the years, along with examples of how they can be used (see the links under each sub-headline below):
Ad hominem is Latin for "to the man." Also known as the personal attack fallacy, it is characterized by irrelevant name-calling or attacks on a person, their actions, or their character, instead of addressing their argument. According to the Ethics Center, the logical structure of an ad hominem attack is: "1. Person A makes a claim X. 2. Person B attacks person A. 3. Therefore, X is wrong."
We've covered such attacks before, like a viral meme from 2018 that claimed Democrats had elected a group of so-called "horrible" people, including anti-Semites, sexual predators, and low-IQ people. While it was true that the people named in the meme had been elected, the claims about their alleged crimes, mindsets, and bigotry were unsubstantiated. Another example of an ad hominem attack came when journalist Anderson Cooper targeted former President Donald Trump's appearance, describing him as "an obese turtle on his back" after Trump claimed fraud in the 2020 presidential election.
According to the American Psychological Association, the black sheep effect is "the tendency to evaluate a disreputable or disliked person more negatively when that person is a member of one's own group rather than of some other group." The behavior often occurs because people respond negatively to those who threaten their group's identity, especially when those people are strongly connected to that group.
The black sheep effect came into play with U.S. Rep. Liz Cheney, a Republican who was once a shoo-in for reelection in the August 2022 U.S. primaries. But after spearheading the GOP resistance to former Republican President Donald Trump, she lost her primary race to her Trump-endorsed opponent.
Confirmation bias is a psychological process that protects the human mind from conflicting or upsetting information. In short, it allows people to see what they want to see: they seek out information that reaffirms personal beliefs while discarding conflicting ideas, often without being aware of doing so.
Some examples of confirmation bias include searching online for information that confirms our preexisting beliefs, or giving more weight to pieces of information that support those beliefs. In some cases we engage in "cherry-picking": using only the morsels of data that align with our understanding of an issue while ignoring the full picture. Confirmation bias exists on all sides of the political spectrum.
An echo chamber is defined by the Oxford Learner's Dictionary as "an environment where a person only encounters information or opinions that reflect and reinforce their own." These closed-loop social circles present a special challenge to media literacy: they hinder a person's ability to see and understand perspectives and beliefs different from their own, while reinforcing potentially false and even dangerous information. Echo chambers exist across social media and create clusters of people who share opinions on controversial topics like gun control, vaccination, abortion, and more. They can form around anyone, no matter which side of a debate the person is on, and are often responsible for the spread of misinformation. The echo chamber works hand in hand with confirmation bias: the former is the environment in which your biases fester, and the two feed each other in a never-ending cycle.
A straw man is an informal fallacy that replaces a person's original argument with an exaggerated or distorted version, then attacks that extreme version as if it were the claim the person made in the first place. In essence, this new argument, the "straw man," becomes easier to refute or defeat. An example of such an argument: "Person A: The statue of Robert E. Lee should be removed from the town square. Person B: Oh, so now you want to erase all history that doesn't align with your political views?"
Another example of a straw man argument is the oft-repeated criticism of critical race theory that says discussing slavery and its impact on structural inequalities actually teaches schoolchildren to hate white people.
According to the psychology and philosophy website Effectiviology, "The red herring fallacy is a logical fallacy where someone presents irrelevant information in an attempt to distract others from a topic that's being discussed, often to avoid a question or shift the discussion in a new direction. For example, if a politician is asked how they feel about a certain policy, they might use the red herring fallacy by discussing how they feel about a related topic instead, to distract people from their failure to answer the original question."
One example of a red herring is what Republican U.S. Rep. Jim Jordan tweeted in response to public hearings held by the House Select Committee to Investigate the January 6 Attack on the U.S. Capitol. He wrote: "When's the primetime hearing on record crime in Democrat-run cities?"
The slippery slope fallacy leads an argument through a chain of events that the arguer suggests will end in an undesirable outcome, while offering little or no evidence that the chain will actually unfold. Often with this logical fallacy, a person will accept that the proposed chain of events (in other words, the slippery slope) will happen without questioning its likelihood.
One such example of a slippery slope is a claim that Bill Gates' funding of research on solar radiation management—a proposal intended as a last-ditch effort to combat global warming—is going to end up blocking out the sun. This, of course, is not true.
The false premise fallacy occurs when an argument (valid or not) is based on an unfounded assumption or an incorrect statement. For example, the question "Should Joe Biden and his son be investigated for their corrupt business dealings in Ukraine?" assumes that Biden and his son did in fact engage in corrupt business dealings in Ukraine, a claim for which there is no evidence. Another example comes from Biden's own claim, made to argue for banning assault weapons, that an AR-15's bullet "travels five times as rapidly as a bullet shot out of any other gun." While AR-15s are undeniably lethal and have been used to murder hundreds of people in shootings across the U.S., Biden's argument relied on an incorrect premise about the speed of an AR-15 bullet.