Tag Archives: Fallacy

The presentism fallacy

by Tim Harding

In recent times, there has been a trend in the popular media towards viewing past events and people through the prism of present-day attitudes. This trend is manifested in attempts to ‘cancel’ the past, including by silencing discussions, banning books, tearing down statues and so on.

In historical and literary analysis, presentism is a pejorative term for the introduction of present-day ideas and perspectives into depictions or interpretations of the past. Some modern historians seek to avoid presentism in their work because they consider it a form of cultural bias, and believe it creates a distorted understanding of their subject matter. The practice of presentism is regarded by some as a common informal fallacy when writing about the past.

Presentism also fails to take into account that, at the time in which historical events occurred, those involved did not enjoy the benefit of hindsight that has informed our present perspective. Yet to fully understand an historical event, we must view it not only with the benefit of hindsight, but also in the more limited context of its own times.

To avoid the fallacy of presentism, orthodox historians restrict themselves to describing what happened and why, attempting to refrain from using language that passes judgment. For example, in analysing the history of slavery, it is more useful to study the attitudes and circumstances of the past that led to slavery, rather than just present-day attitudes that simply condemn slavery without further analysis. One theory is that African-American slavery began for economic reasons and that racism was a consequential attempt to justify the practice, rather than being the prime cause of slavery. Such a theory would be overlooked by presentism.

This fallacy should not be confused with philosophical presentism, which is a technical position in ontology (the study of existence) that only the present exists.


Human accident fallacy

Some years ago, when I was working as a regulatory consultant to VicRoads, they asked me to use the term “road crash” rather than “road accident”. They pointed out that an accident is defined as an unintended event not directly caused by humans.

Vehicle collisions are not usually accidents; they are mostly caused by preventable behaviours such as drunk or careless driving, or intentionally driving too fast. Vehicle collisions invariably have a human cause, so the use of the term “accident” is misleading in this context.

The use of the word accident to describe car crashes was promoted by the US National Automobile Chamber of Commerce in the middle of the 20th century, as a way to make vehicle-related deaths and injuries seem like an unavoidable matter of fate, rather than a problem that could be addressed. The automobile industry accomplished this by writing customised articles as a free service for newspapers that used the industry’s preferred language. These articles were deliberately fallacious.

For this reason, the US National Highway Traffic Safety Administration has since 1994 asked media and the public not to use the word accident to describe vehicle collisions.  Similarly, the Australian National Road Safety Strategy 2021–30, endorsed by the responsible Federal and State Ministers, uses the term “road crash” rather than “road accident”. More widely, health and safety professionals generally prefer using the term “incident” in place of the term “accident”.

Confusingly, the common current meaning of the English word “accident” has almost nothing to do with Aristotle’s philosophical concept of the “fallacy of accident”, which was one of the thirteen fallacies that Aristotle discussed in his book On Sophistical Refutations. Aristotle’s accident fallacy is difficult to explain here, but doesn’t have anything to do with car crashes or people slipping on banana peels.


Ad hoc fallacy

In July 2009, Danish psychic/dowser Connie Sonne was given the chance to prove her claimed dowsing ability in the One Million Dollar Paranormal Challenge offered by the James Randi Educational Foundation. She was asked to identify, by dowsing, randomly selected cards hidden in envelopes, and she lost the challenge by selecting the wrong ones. In an interview afterward, she insisted that she lost merely because, “…it wasn’t time yet for my powers to be revealed.”

The ad hoc fallacy is not strictly an error of logic. Instead, it is a fallacious rhetorical tactic in which a person presents a new explanation that is unjustified or simply unreasonable, in an attempt to rescue their original claim after evidence contradicting it has emerged.

The Latin phrase “ad hoc” is literally translated as meaning “to this”. It refers to an idea or solution that is intended for a specific use, and not for any other uses. An ad hoc explanation is specifically constructed to be used in a particular case and is created hastily at the moment rather than being the result of deliberate, fact-based reasoning.

Another example encountered by skeptical investigators is the following typical conversation between a supposed psychic who claims to be able to read minds and a skeptic.

Skeptic: “If you’re psychic, then tell me what number I am thinking of.”

Psychic: “My powers don’t work in the presence of skeptics.”

In this example, the fallacious tactic is pretty obvious. The response that their powers don’t work around skeptics is clearly a ridiculous explanation, and it’s an explanation that one would never accept unless one was already convinced that the person was a psychic. Further, it makes it impossible to discredit them no matter how fraudulent they actually are (a lack of falsifiability is a hallmark of ad hoc fallacies).


Appeal to hypocrisy

Appeal to hypocrisy (also known as tu quoque, which is Latin for ‘you also’) is an informal logical fallacy that tries to discredit the validity of the opponent’s argument by asserting the opponent’s failure to act consistently in accordance with its conclusion(s). It is similar to ‘whataboutism’, which is an attempt to twist criticism back on the initial critic. The Oxford English Dictionary cites John Cooke’s 1614 stage play The Cittie Gallant as the earliest use of the term tu quoque in the English language.

The Appeal to Hypocrisy fallacy follows the pattern:

  1. Person A makes claim X.
  2. Person B asserts that A’s actions or past claims are inconsistent with the truth of claim X.
  3. Therefore, X is false.

An example would be:

Peter: ‘Based on the arguments I have presented, it is evident that it is morally wrong to use animals for food or clothing.’

Bill: ‘But you are wearing a leather jacket and you have a roast beef sandwich in your hand! How can you say that using animals for food and clothing is wrong?’

The appeal to hypocrisy fallacy can also appear in less structured ways, such as in the following example where Person B is driving a car with Person A as a passenger:

Person A: “Stop running so many stop signs.”

Person B: “You run them all the time!”

This argument is a fallacy because the moral character or past actions of the opponent are generally irrelevant to the validity of the argument. It is often used as a red herring tactic and is a special case of the ad hominem fallacy, which is a category of fallacies in which a claim or argument is rejected on the basis of facts about the person presenting or supporting the claim or argument.


No true Scotsman

No true Scotsman is a kind of informal fallacy in which one attempts to rescue a universal generalisation from a counterexample by changing the definition in an ad hoc fashion to exclude that counterexample. Rather than denying the counterexample or rejecting the original claim, this fallacy modifies the subject of the assertion to exclude the specific case or others like it by rhetoric, i.e. by asserting that those who perform that action are not part of our group, and thus that criticism of that action is not criticism of the group.

Philosophy professor Bradley Dowden explains the fallacy as an ‘ad hoc rescue’ of a refuted generalisation attempt. The following is a simplified rendition of the fallacy:

Person A: ‘No Scotsman puts sugar on his porridge.’

Person B: ‘But my uncle Angus likes sugar with his porridge.’

Person A: ‘Ah yes, but no true Scotsman puts sugar on his porridge.’

The introduction of the term is attributed to the British philosopher Antony Flew, as it originally appeared in his 1971 book An Introduction to Western Philosophy.

A practical example of this fallacy occurs when Marxists try to defend their regressive and unworkable ideology against the overwhelming evidence from the 20th century that almost every communist regime was brutally repressive, and that most of them resulted in poverty for everybody except the communist party elite. ‘But they weren’t true communists’, they say. Yeah, right.


The Recommendation Fallacy

By Tim Harding

(An edited version of this article was published in The Skeptic magazine, Vol. 37, No. 1, March 2017.)

I was once elected as a local government councillor in an inner Melbourne suburb.  The Council had serious concerns about poor staff performance, both in providing advice to Council and in implementing Council decisions.  As these problems appeared to be systemic rather than just the fault of the CEO, we brought in management consultants for an independent review of the Council administration.

After interviewing both Councillors and staff, the management consultants reported that there were indeed systemic organisational deficiencies related to poor staff culture.  One of the most pervasive problems was that few staff understood the difference between a recommendation and a decision.  They seemed to think that staff made decisions and the Council either ‘ratified’ or ‘overturned’ their ‘decisions’.  Some staff even mistakenly classified rejection of their recommendations by the elected Councillors as ‘political interference’! They did not seem to understand that their role was to provide professional advice to the Council, including options and recommendations. (There were also deficiencies in the implementation of Council decisions, but that is a separate issue).

I also found that this was a problem at junior levels in the state public service, but not in the middle and senior ranks. (Public servants tend not to get promoted if they do not even understand how organisations operate). Junior staff needed to be taught how to analyse problems and make recommendations, instead of indulging in what I called ‘problem referral’ without providing options or recommendations to management.  I have also found this staff deficiency in some NGOs, such as ANTaR, where I have been a board member.  It seemed to me that our education system did not teach students these fairly fundamental skills of working in a professional office environment.

A recommendation has no status other than as advice to a decision-maker.  Although a recommendation may well be persuasive, depending upon the expertise of its author, there is no obligation on a decision-maker to adopt any recommendation.  Thus it makes no sense to say that a recommendation has been ‘overturned’.  Only decisions can be overturned.  Similarly, a recommendation cannot be ‘implemented’ unless and until it has been adopted as a decision by a decision-maker who is authorised to make that decision. (The media often make these errors). These are errors of reasoning, just like other fallacies.  So I have dubbed such confusions between recommendations and decisions instances of the Recommendation Fallacy.



The Fallacy of Faulty Risk Assessment

by Tim Harding

(An edited version of this essay was published in The Skeptic magazine, Vol. 36, No. 3, September 2016.)

Australian Skeptics have tackled many false beliefs over the years, often in co-operation with other organisations.  We have had some successes – for instance, belief in homeopathy finally seems to be on the wane.  Nevertheless, false beliefs about vaccination and fluoridation just won’t lie down and die – despite concerted campaigns by medical practitioners, dentists, governments and more recently the media.  Why are these beliefs so immune to evidence and arguments?

There are several possible explanations for the persistence of these false beliefs.  One is denialism – the rejection of established facts in favour of personal opinions.  Closely related are conspiracy theories, which typically allege that facts have been suppressed or fabricated by ‘the powers that be’, in an attempt by denialists to explain the discrepancies between their opinions and the findings of science.  A third possibility is an error of reasoning or fallacy known as Faulty Risk Assessment, which is the topic of this article.

Before going on to discuss vaccination and fluoridation in terms of this fallacy, I would like to talk about risk and risk assessment in general.

What is risk assessment?

Hardly anything we do in life is risk-free. Whenever we travel in a car or even walk along a footpath, most people are aware that there is a small but finite risk of being injured or killed.  Yet this risk does not keep us away from roads.  We intuitively make an informal risk assessment that the level of this risk is acceptable in the circumstances.

In more formal terms, ‘risk’ may be defined as the probability or likelihood of something bad happening, multiplied by the severity of the consequences if it does happen.  Risk analysis is the process of discovering what risks are associated with a particular hazard, including the mechanisms that cause the hazard, then estimating the likelihood that the hazard will occur and the consequences if it does occur.

Risk assessment is the determination of the acceptability of risk using two dimensions of measurement – the likelihood of an adverse event occurring; and the severity of the consequences if it does occur, as illustrated in the diagram below.  (This two-dimensional risk assessment is a conceptually useful way of ranking risks, even if one or both of the dimensions cannot be measured quantitatively).

[Figure: two-dimensional risk assessment matrix – likelihood of an adverse event versus severity of consequences]

By way of illustration, the likelihood of something bad happening could be very low, but the consequences could be unacceptably high – enough to justify preventative action.  Conversely, the likelihood of an event could be higher, but the consequences could be low enough to justify ‘taking the risk’.

In assessing the consequences, consideration needs to be given to the size of the population likely to be affected, and the severity of the impact on those affected.  This will provide an indication of the aggregate effect of an adverse event.  For example, ‘high’ consequences might include significant harm to a small group of affected individuals, or moderate harm to a large number of individuals.
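To make the two-dimensional idea concrete, here is a minimal sketch in Python of how the two ratings might be combined into a single qualitative ranking. The rating scales, the scoring rule and the thresholds are illustrative assumptions only, not a prescribed methodology.

```python
# A minimal sketch of a two-dimensional risk ranking.
# The scales, scoring rule and thresholds are illustrative assumptions only.

LIKELIHOOD = {"rare": 1, "unlikely": 2, "possible": 3, "likely": 4, "almost certain": 5}
CONSEQUENCE = {"negligible": 1, "minor": 2, "moderate": 3, "major": 4, "catastrophic": 5}

def risk_rating(likelihood: str, consequence: str) -> str:
    """Combine both dimensions; ignoring either one is the faulty risk assessment."""
    score = LIKELIHOOD[likelihood] * CONSEQUENCE[consequence]
    if score >= 15:
        return "extreme"
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

# A rare event with catastrophic consequences still ranks as a high risk,
# which is the point of the desalination plant example below.
print(risk_rating("rare", "catastrophic"))    # high
print(risk_rating("likely", "negligible"))    # medium
```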

A fallacy is committed when a person focuses on the risks of an activity while ignoring its benefits, and/or takes account of only one dimension of risk assessment while overlooking the other.

To give a practical example of a one-dimensional risk assessment, the desalination plant to augment Melbourne’s water supply has been called a ‘white elephant’ by some people, because it has not been needed since the last drought broke in March 2010.  But this criticism ignores the catastrophic consequences that could have occurred had the drought not broken.  In June 2009, Melbourne’s water storages fell to 25.5% of capacity, the lowest level since the huge Thomson Dam began filling in 1984.  This downward trend could have continued at that time, and could well be repeated during the inevitable next drought.

[Photo: Melbourne’s desalination plant at Wonthaggi]

No responsible government could afford to ‘take the risk’ of a major city of more than four million people running out of water.  People in temperate climates can survive without electricity or gas, but are likely to die of thirst in less than a week without water, not to mention the hygiene crisis that would occur without washing or toilet flushing.  The failure to safeguard the water supply of a major city is one of the most serious derelictions of government responsibility imaginable.

Turning now to the anti-vaccination and anti-fluoridation movements, both commit the fallacy of Faulty Risk Assessment.  They focus on the very tiny likelihood of adverse side effects without considering the major benefits to public health from vaccination and the fluoridation of public water supplies, and the potentially severe consequences of not vaccinating or fluoridating.

Vaccination risks

The benefits of vaccination far outweigh its risks for all of the diseases where vaccines are available.  This includes influenza, pertussis (whooping cough), measles and tetanus – not to mention the terrible diseases that vaccination has eradicated from Australia such as smallpox, polio, diphtheria and tuberculosis.

As fellow skeptic Dr. Rachael Dunlop puts it:  ‘In many ways, vaccines are a victim of their own success, leading us to forget just how debilitating preventable diseases can be – not seeing kids in calipers or hospital wards full of iron lungs means we forget just how serious these diseases can be.’

No adult or teenager has ever died or become seriously ill in Australia from the side effects of vaccination; yet large numbers of people have died from the lack of vaccination.  The notorious Wakefield allegation in 1998 of a link between vaccination and autism has been discredited, retracted and found to be fraudulent.  Further evidence comes from a recently published exhaustive review examining 12,000 research articles covering eight different vaccines which also concluded there is no link between vaccines and autism.

According to Professor C Raina MacIntyre of UNSW, ‘Influenza virus is a serious infection, which causes 1,500 to 3,500 deaths in Australia each year.  Death occurs from direct viral effects (such as viral pneumonia) or from complications such as bacterial pneumonia and other secondary bacterial infections. In people with underlying coronary artery disease, influenza may also precipitate heart attacks, which flu vaccine may prevent.’

In 2010, increased rates of high fever and febrile convulsions were reported in children under 5 years of age after they were vaccinated with the Fluvax vaccine.  This vaccine has not been registered for use in this age group since late 2010 and therefore should not be given to children under 5 years of age. The available data indicate that there is a very low risk of fever, which is usually mild and transient, following vaccination with the other vaccine brands.  Any of these other vaccines can be used in children aged 6 months and older.

Australia was declared measles-free in 2005 by the World Health Organization (WHO) – before we stopped being so vigilant about vaccinating and outbreaks began to reappear.  The impact of vaccine complacency can be observed in the 2013 measles epidemic in Wales, where there were over 800 cases and one death, and many of those presenting were in the age group that missed out on MMR vaccination following the Wakefield scare.

After the link to autism was disproven, many anti-vaxers shifted the blame to thiomersal, a mercury-containing component of relatively low toxicity to humans.  Small amounts of thiomersal were used as a preservative in some vaccines, but not the MMR vaccine.  Thiomersal was removed from all scheduled childhood vaccines in 2000.

In terms of risk assessment, Dr. Dunlop has pointed out that no vaccine is 100% effective and vaccines are not an absolute guarantee against infection. So while it’s still possible to get the disease you’ve been vaccinated against, disease severity and duration will be reduced.  Those who are vaccinated have fewer complications than people who aren’t.  With pertussis (whooping cough), for example, severe complications such as pneumonia and encephalitis (brain inflammation) occur almost exclusively in the unvaccinated.  So since the majority of the population is vaccinated, it follows that most people who get a particular disease will be vaccinated, but critically, they will suffer fewer complications and long-term effects than those who are completely unprotected.

Fluoridation risks

Public water fluoridation is the adjustment of the natural levels of fluoride in drinking water to a level that helps protect teeth against decay.  In many (but not all) parts of Australia, reticulated drinking water has been fluoridated since the early 1960s.

The benefits of fluoridation are well documented.  In November 2007, the NHMRC completed a review of the latest scientific evidence in relation to fluoride and health.  Based on this review, the NHMRC recommended community water fluoridation programs as the most effective and socially equitable community measure for protecting the population from tooth decay.  The scientific and medical support for the benefits of fluoridation certainly outweighs the claims of the vocal minority against it.

Fluoridation opponents over the years have claimed that putting fluoride in water causes health problems, is too expensive and is a form of mass medication.  Some conspiracy theorists go as far as to suggest that fluoridation is a communist plot to lower children’s IQ.  Yet, there is no evidence of any adverse health effects from the fluoridation of water at the recommended levels.  The only possible risk is from over-dosing water supplies as a result of automated equipment failure, but there is inline testing of fluoride levels with automated water shutoffs in the remote event of overdosing.  Any overdose would need to be massive to have any adverse effect on health.  The probability of such a massive overdose is extremely low.

Tooth decay remains a significant problem. In Victoria, for instance, more than 4,400 children under 10, including 197 two-year-olds and 828 four-year-olds, required general anaesthetic in hospital for the treatment of dental decay during 2009-10.  Indeed, 95% of all preventable dental admissions to hospital for children up to nine years old in Victoria are due to dental decay. Children under ten in non-optimally fluoridated areas are twice as likely to require a general anaesthetic for treatment of dental decay as children in optimally fluoridated areas.

As fellow skeptic and pain management specialist Dr. Michael Vagg has said, “The risks of general anaesthesia for multiple tooth extractions are not to be idly contemplated for children, and far outweigh the virtually non-existent risk from fluoridation.”  So in terms of risk assessment, the risks from not fluoridating water supplies are far greater than the risks of fluoridating.

Implications for skeptical activism

Anti-vaxers and anti-fluoridationists who are motivated by denialism and conspiracy theories tend to believe whatever they want to believe, and dogmatically so.  Thus evidence and arguments are unlikely to have much influence on them.

But not all anti-vaxers and anti-fluoridationists fall into this category.  Some may have been misled by false information, and thus could possibly be open to persuasion if the correct information is provided.

Others might even be aware of the correct information, but are assessing the risks fallaciously in the ways I have described in this article.  Their errors are not ones of fact, but errors of reasoning.  They too might be open to persuasion if education about sound risk assessment is provided.

I hope that analysing the false beliefs about vaccination and fluoridation from the perspective of the Faulty Risk Assessment Fallacy has provided yet another weapon in the skeptical armoury against these false beliefs.

References

Rachael Dunlop (2015) Six myths about vaccination – and why they’re wrong. The Conversation, Parkville.

C Raina MacIntyre (2016) Thinking about getting the 2016 flu vaccine? Here’s what you need to know. The Conversation, Parkville.

Mike Morgan (2012) How fluoride in water helps prevent tooth decay.  The Conversation, Parkville.

Michael Vagg (2013) Fluoride conspiracies + activism = harm to children. The Conversation, Parkville.

Government of Victoria (2014) Victorian Guide to Regulation. Department of Treasury and Finance, Melbourne.



Association fallacy

An association fallacy is a faulty generalisation which asserts that the qualities of one thing are inherently qualities of another, merely via an irrelevant association.  Association fallacies come in various shapes and sizes; but most of them are a type of red herring fallacy that introduces irrelevant premises into an argument and draws an invalid conclusion.

The formal structure of the association fallacy is:

Premise 1: A is a B

Premise 2: A is also a C

Conclusion:  Therefore, all Bs are Cs.

The fallacy can be illustrated with an Euler diagram: ‘A’ satisfies the requirement that it is part of both sets ‘B’ and ‘C’, but when this is drawn as a diagram it can clearly be seen that part of set ‘B’ may lie outside set ‘C’, refuting the conclusion that ‘all Bs are Cs’.
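The same point can be checked mechanically. In the Python sketch below (the sets and members are invented purely for illustration), both premises about ‘A’ are true, yet set ‘B’ is not a subset of set ‘C’, so the conclusion ‘all Bs are Cs’ fails.

```python
# Illustrative sets only: B = things with four legs, C = dogs, A = a particular dog.
four_legged = {"Rex", "my cat", "the kitchen table"}   # set B
dogs = {"Rex"}                                         # set C
A = "Rex"

print(A in four_legged)       # True  - premise 1: A is a B
print(A in dogs)              # True  - premise 2: A is a C
print(four_legged <= dogs)    # False - yet not all Bs are Cs
print(four_legged - dogs)     # the counterexamples, e.g. 'my cat'
```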

An example of the association fallacy is the argument ‘All dogs have four legs; my cat has four legs. Therefore, my cat is a dog.’ Another example is where some atheists oppose the idea of free will simply because Christians believe in it.  (This is despite the fact that prominent atheists such as Professor Daniel Dennett also believe in free will).

Special cases of this fallacy include guilt by association, or conversely honour by association. Guilt by association can sometimes be a type of ad hominem fallacy, if the argument attacks a person because of the similarity between the views of someone making an argument and other proponents of the same argument.  An example would be ‘My opponent for office just received an endorsement from the Puppy Haters Association. Is that the sort of person you would want to vote for?’  In this way, appeals to emotion can also be included for added rhetorical effect.

A form of the association fallacy often used by those denying a well-established scientific or historical proposition is the so-called ‘Galileo Gambit’. The argument goes that since Galileo was ridiculed in his time but later acknowledged to be right, then because their non-mainstream views are provoking ridicule and rejection from other scientists, they too will later be recognised as correct.


Argument to moderation

Argument to moderation (Latin: argumentum ad temperantiam) is an informal fallacy which asserts that the truth can be found as a compromise between two opposite positions.  It is also known as the argument from middle ground, false compromise, grey fallacy and the golden mean fallacy.  It is effectively an inverse false dilemma, discarding both of two opposites in favour of a middle position. It is related to, but different from the false balance fallacy.

An individual demonstrating this fallacy implies that the positions being considered represent extremes of a continuum of opinions, that such extremes are always wrong, and the middle ground is always correct.  This is not necessarily the case.

The form of the fallacy goes like this:

Premise: There is a choice to make between doing X or doing Y.

Conclusion: Therefore, the answer is somewhere between X and Y.

This argument is invalid because the conclusion does not logically follow from the premise.  Sometimes only X or Y is right or true, with no middle ground possible.

To give an example of this fallacy:

‘The fact that one is confronted with an individual who strongly argues that slavery is wrong and another who argues equally strongly that slavery is perfectly legitimate in no way suggests that the truth must be somewhere in the middle.’[1]

Another example is:

‘You say the sky is blue, while I say the sky is red. Therefore, the best solution is to compromise and agree that the sky is purple.’

This fallacy is sometimes used in rhetorical debates to undermine an opponent’s position.  All one must do is present yet another, radically opposed position, and the middle-ground compromise will be forced closer to that position.  In pragmatic politics, this is part of the basis behind the Overton window theory.

In US politics this fallacy is known as ‘High Broderism’ after David Broder, a columnist and reporter for the Washington Post who insisted, against all reason, that the best policy was always the middle ground between the Republicans and the Democrats.

Related to this fallacy is design by committee, which is a disparaging term used to describe a project that has many designers involved but no unifying plan or vision, often resulting in a negotiated compromise; as illustrated by the aphorism ‘A camel is a horse designed by a committee’. The point is that a negotiated compromise is not necessarily true, right or even the optimal outcome. This does not mean that a negotiated compromise may not be appropriate in some cases.

References

[1] Susan T. Gardner (2009). Thinking Your Way to Freedom: A Guide to Owning Your Own Practical Reasoning. Temple University Press.



False equivalence

In some ways the False equivalence fallacy is the direct opposite of a False dilemma.

False equivalence is an informal fallacy that describes a situation where there is an apparent similarity between two things, but in fact they are not equivalent.  The two things may share some common characteristics, but they have important differences that are overlooked for the purposes of the argument.

The pattern of the fallacy often looks like this: if A has characteristics c and d, and B has characteristics d and e, then since they both have characteristic d, A and B are equivalent. In practice, often only a passing similarity is required between A and B for this fallacy to be committed.
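As a toy illustration of the pattern, the Python sketch below (the pet attributes are invented, echoing the examples that follow) shows that one shared characteristic does not make two things equivalent.

```python
# Illustrative only: sharing a characteristic does not make A and B equivalent.
cat_traits = {"soft", "cuddly", "meows"}   # A has characteristics c and d
dog_traits = {"soft", "cuddly", "barks"}   # B has characteristics d and e

shared = cat_traits & dog_traits           # the overlap the fallacy leans on
print(shared)                              # {'soft', 'cuddly'}
print(bool(shared))                        # True: they do share something
print(cat_traits == dog_traits)            # False: they are still not equivalent
```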

The following statements are examples of false equivalence:

‘They’re both soft, cuddly pets. There’s no difference between a cat and a dog.’

‘We all bleed red. We’re all no different from each other.’

‘Hitler, Stalin and Mao were evil atheists; therefore all atheists are evil.’

A more complex example is where somebody claims that more Australians are killed by sharks or road accidents than by terrorism, and that therefore we should not do anything to stop terrorism. This example ignores the fact that terrorist acts are prevented precisely because we do something, such as surveillance and intelligence gathering.  We also choose to take the risks of swimming in the ocean and driving in cars, but we cannot avoid the risk of terrorism no matter what we do.

False equivalence is occasionally committed in politics, where one political party will accuse its opponents of having performed equally wrong actions, usually as a red herring in an attempt to deflect criticism of its own behaviour. Two wrongs don’t make a right.

On the other hand, politicians might accuse journalists of False equivalence in their reporting of political controversies if the stories are perceived to assign equal blame to opposing parties.  However, False equivalence should not be confused with False balance – the media phenomenon of presenting two sides of an argument equally in disregard of the merit or evidence on a subject (a form of argument to moderation).

Moral equivalence is a special case of False equivalence where it is falsely claimed, often for ideological motives, that both sides are equally to blame for a war or other international conflict. The historical evidence shows that this is rarely the case.

Another special case of False equivalence is political correctness, which may be defined as language, ideas, policies or behaviour that seeks to minimise social offence in relation to occupation, gender, race, culture, sexual orientation, religion, belief or ideology, disability and age, to an excessive extent that inhibits free speech.
