Category Archives: Logical fallacies

Can does not imply ought

‘Ought implies can’ is an ethical principle ascribed to the 18th-century philosopher Immanuel Kant, which holds that if a person is morally obliged to perform a certain action, then logically that person must be able to perform it. It makes no sense to say that somebody ought to do something that is impossible for them to do. As a corollary, a person cannot be held responsible for something outside their control.

On the other hand, the converse relationship ‘Can implies ought’ does not apply. Just because a person can do something, it does not logically follow that they ought to do it.

When the Australian Governor-General Sir John Kerr dismissed the Whitlam Government in 1975, a common argument in favour of the dismissal was that he had the power to do it. However, this was a red herring – hardly anybody disputed the power of the Governor-General to dismiss the Prime Minister. What they did dispute was whether he ought to have done it.

To give another example, in most cases it is lawful to tell lies, and it is often possible to get away with lying. But that does not mean that lying is acceptable behaviour (except for ‘white lies’ to avoid hurting somebody’s feelings or to otherwise minimise harm).  

In terms of logic, this reverse relationship is a logical fallacy known as ‘Affirming the consequent’ (sometimes called the fallacy of the converse) which consists of invalidly inferring the converse from the original statement. This fallacy takes the following form:

Premise 1: If P, then Q.

Premise 2: Q.

Conclusion: Therefore, P. 

Applying this form to the current case:

Premise 1: If you ought to do something, then you can do it.

Premise 2: You can do it.

Conclusion: Therefore, you ought to do it.
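For readers who like to see the invalidity mechanically, here is a minimal truth-table sketch in Python (my own illustration, not part of the original argument; the helper function name is made up). It enumerates every truth assignment of P and Q and prints the case where both premises hold but the conclusion fails:

```python
# Minimal truth-table check of the form "If P, then Q; Q; therefore P".
# A single counterexample (premises true, conclusion false) is enough
# to show that the form is invalid.
from itertools import product

def implies(a: bool, b: bool) -> bool:
    # Material implication: false only when a is true and b is false.
    return (not a) or b

for p, q in product([True, False], repeat=2):
    if implies(p, q) and q and not p:
        print(f"Counterexample: P={p}, Q={q} (premises hold, conclusion fails)")
```

The counterexample it finds is P false, Q true: you can do it, yet it does not follow that you ought to.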

I realise that this is not a very common fallacy, and pointing it out might just seem like common sense, but it does have relevance for critical thinking and logical argument.


Special Pleading

Special pleading is a form of inconsistency in which the reasoner doesn’t apply his or her principles consistently. It is the fallacy of applying a general principle to various situations but not applying it to a special situation that interests the arguer even though the general principle properly applies to that special situation, too.

Example:

Everyone has a duty to help the police do their job, no matter who the suspect is. That is why we must support investigations into corruption in the police department. No person is above the law. Of course, if the police come knocking on my door to ask about my neighbors and the robberies in our building, I know nothing. I’m not about to rat on anybody.

In our example, the principle of helping the police is applied to investigations of police officers but not to one’s neighbors.

Source: The Internet Encyclopedia of Philosophy


Reverse causation fallacy

Recently, on the Australian “Sunrise” TV program, co-presenter David Koch said: “There have only been 3000 deaths from COVID, far less than that from influenza in the same period, so we should oppose the lockdowns”. This statement ignored the fact that, prior to and during the vaccination rollout, lockdowns are likely to have prevented many thousands more deaths.

Similarly, some people argue against counter-terrorism measures on the grounds that there have been relatively few successful terrorist attacks in Australia, ignoring the fact that counter-terrorism has deterred and disrupted many more terrorist plots than those that have been carried out.

Several years ago, I was working as a regulatory consultant helping to remake sunsetting Victorian water regulations. Amongst other things, these regulations require the installation of backflow prevention devices on the customer’s water service pipe, just after the water meter. Backflow can result in contaminants being drawn into the drinking water system if the mains pressure suddenly drops, for example as the result of a burst water main. If a customer has left a hose running in a chlorinated swimming pool, attached to a container of ‘hose-on’ fertiliser, weedicide or insecticide, or worse still in an industrial chemical bath, the risk to public health is obvious. A Treasury official asked me how many Victorians had died as a result of such backflow incidents. When I answered ‘none yet’, he said ‘so what is the problem?’. This ignored the fact that backflow prevention devices have been installed for many decades, thus preventing backflow from occurring.

These examples all make the logical error of confusing cause and effect, which is also known as the reverse causation fallacy. The low numbers of cases are caused, at least in part, by the preventative measures in place – they do not demonstrate that such measures are unnecessary. Without such measures, the numbers of cases would likely be many times higher.


Pseudoprofundity

by Tim Harding

Most skeptics are familiar with the term ‘pseudoscience’, which means non-scientific activities masquerading as science. Examples include astrology, alchemy, so-called ‘alternative medicine’ and ‘creation science’.

Less well-known is the term ‘pseudoprofundity’, which means claptrap masquerading as profound wisdom. Deepak Chopra springs to mind, with his pseudoprofound terms such as ‘quantum healing’ and ‘dynamically active consciousness’.

‘Love is only a word’ is a more general example of pseudoprofundity which Daniel Dennett calls a ‘deepity’ – the pretension of depth. This is a phrase which sounds like it contains a great depth of wisdom by virtue of being perfectly ambiguous. The philosophical blogger Jonasan writes that on one level it is clearly false that love is only a word. ‘Love’ is a word, but love itself is not (the inverted commas are important). The fact that ‘love’ is a word is also trivially true. We are thus left with a statement which can either be interpreted as obviously false or trivially true.  Jonasan notes that by failing to exercise our powers of analysis on this statement we end up thinking about both meanings together, rather than separating them and perhaps seeking clarification about which meaning is intended. This gives an illusion of profundity.

Stephen Law has pointed out that you can also achieve this effect without the need for an ambiguity in meaning. Ordinary trivially true platitudes such as ‘death comes to us all’ can be elevated to the level of profound insight if enunciated with enough gravitas. Likewise, one can take the other side of Dennett’s deepities – that of self-contradiction – and use it without even needing the trivially true side. Law again gives us one of the finest examples in ‘sanity is just another kind of madness’. It sounds profound, doesn’t it? Except that sanity cannot be a form of madness, because the two are defined as opposites.


Tribal truth fallacy

During the 2020 US presidential election campaign, one of the starkest visual differences between Trump and Biden supporters was in the wearing of masks. Most Biden supporters appeared to wear masks, whereas most Trump supporters didn’t.

Even during Trump’s announcement of his nomination of Judge Amy Coney Barrett to the US Supreme Court, very few people in the White House Rose Garden were wearing a mask. Worse still, some of the guests were seen hugging and kissing each other.

As a result, the White House has become a COVID-19 ‘hotspot’ or super-spreader location, with seven attendees at the nomination announcement testing positive for coronavirus – including President Trump and the First Lady. More people caught the virus at the White House election ‘celebration night’, and some 130 Secret Service agents have also tested positive.

It is clear that at least some, if not most, Trump supporters refuse to wear masks on political grounds. They seem to associate mask wearing with what they perceive to be ‘liberal’ pro-science attitudes. It is also possible that some Biden supporters might wear masks as a form of visual political opposition to the Trump supporters. In either case, this is irrational tribal behaviour.

A similar phenomenon may be occurring in the climate change debate. Some beliefs against human causes of climate change may be genuinely (but mistakenly) held on the basis of personal interpretation of the evidence. But at least some of the far-right opposition is due to a perception of climate science as some sort of left-wing plot against fossil fuel industries.

The far left is not immune from such irrational tribal behaviour either. At least some of the opposition to GMOs and vaccines seems to be based on ideological opposition to large agribusinesses and the pharmaceutical industry, rather than on evidence-based health concerns.

Another example is where some atheists oppose the idea of free will simply because Christians believe in it. This is despite the fact that prominent atheists such as Professor Daniel Dennett also believe in free will.

The tribal truth fallacy lies not in the falsity of beliefs about mask wearing, climate change, GMOs, vaccines or free will per se, but in these beliefs being based on identification with one’s own tribe or on opposition to a rival tribe.


Collecting the wrong information

Some people seem to think that if you have a problem or an issue, all you need to do is to collect enough information about it, and that will tell you the answer.

Robert McNamara has provided a stark counter-example. As well as being the US Secretary of Defense during the Cuban missile crisis and president of the Ford Motor Company, McNamara was a statistical analyst for the US Air Force during the Second World War.

In the story he told, the Americans did a huge analysis of data from repair reports on B17 bombers, the plan being to add armour to the areas of the aircraft most commonly damaged, because bomber losses were so high. You can’t armour the whole plane, because of weight, so it was incredibly important that the armour went in the right place.

As McNamara told the story, one of the men on the team, a Jewish mathematician named Abraham Wald (who had fled Austria because of Nazi anti-semitism), discussed the project with an armourer who had been shipped back from overseas to help the design team with the armour kits.

The armourer told Wald the project was all wrong: he had seen planes fly home with damage in all the areas they were armouring. Then Wald had a eureka moment – he drew a diagram of the bomber and ruled out every area where a plane had come home with a bullet hole in that part. The diagram was stark – no plane had ever come home with damage in certain angles of the cockpit (where a bullet would kill both pilots) or at the base of the vertical stabiliser.

Wald had realised that the planes that had been shot in these bullet-free zones never made it home to be accounted for. They changed the armour, and crew survival rates shot up.

McNamara told the story to make a point about data: he said that had they simply followed the data, it would have led them to armour the places where a plane could survive a hit and get back, not the places where a hit would always bring the aircraft down.

So you need to collect the right information, because collecting the wrong information can mislead you into a wrong conclusion – in the above case, the exact opposite of the right one.
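Wald’s insight is easy to reproduce with a toy simulation. The Python sketch below is purely illustrative – the hit areas and lethality figures are invented, not historical data – but it shows how recording damage only on planes that return makes the least dangerous areas look the most damaged:

```python
# Toy survivorship-bias simulation (all figures invented for illustration).
import random

random.seed(1)

# Assumed probability that a hit to each area brings the plane down.
LETHALITY = {"wings": 0.1, "fuselage": 0.2, "engines": 0.6, "cockpit": 0.9}

observed = {area: 0 for area in LETHALITY}  # damage recorded on returners only

for _ in range(10_000):
    area = random.choice(list(LETHALITY))   # each plane takes one hit
    if random.random() > LETHALITY[area]:   # the plane survives and flies home
        observed[area] += 1                 # only survivors reach the repair data

for area, hits in sorted(observed.items(), key=lambda kv: -kv[1]):
    print(f"{area:8s} hits seen in repair reports: {hits}")
# The repair data is dominated by hits to the least lethal areas; the most
# dangerous areas barely appear, precisely because those planes never came home.
```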


Appeal to the minority

by Tim Harding

Alternative things have been ‘cool’ since the 1960s. They include alternative music, alternative lifestyles and of course, so-called ‘alternative medicine’. At least part of the appeal of such alternatives is the rejection of majority views perceived as mainstream or conservative.                   

An appeal to the minority is a logical fallacy that occurs when something is asserted to be true because most people don’t believe it. It is the opposite of an appeal to popularity, in which an advocate asserts that because the great majority of people agree with his or her position on an issue, he or she must be right (The Skeptic, September 2012). The logical form of the appeal to the minority fallacy is:

Premise 1: X is a minority view (as compared to majority view Y).

Premise 2: Minority views are more often true than majority views.

Conclusion: X is more likely to be true than Y.
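The weak link is Premise 2, and a toy model makes this concrete. In the following Python sketch (all numbers invented for illustration), truth is assigned independently of popularity, and the proportion of true claims comes out the same among minority and majority views – minority status by itself is evidence of nothing:

```python
# Toy model of 'appeal to the minority' (numbers invented for illustration):
# truth is assigned independently of how popular a view is.
import random

random.seed(0)

counts = {"minority": [0, 0], "majority": [0, 0]}  # [true, total] per group

for _ in range(100_000):
    is_true = random.random() < 0.5     # a view is true half the time
    group = "minority" if random.random() < 0.2 else "majority"
    counts[group][0] += is_true
    counts[group][1] += 1

for group, (true_n, total) in counts.items():
    print(f"P(true | {group} view) ~ {true_n / total:.3f}")
# Both proportions hover around 0.5: unless Premise 2 is independently
# established, knowing a view is held by a minority tells us nothing.
```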

To give some examples of this fallacy: ‘just like Copernicus, we in the Flat Earth Society are willing to defy the wrong-headed orthodoxy of the mainstream scientific community’ and ‘the medical profession and pro-vaccine sheeple have been conned by Big Pharma to maximise their profits’.

This fallacy has also been called ‘second-option bias’, a well-documented phenomenon among fringe and counterculture groups whose members assume that any widely held opinion among the general population must be untrue, and that the prevailing contrary opinion must therefore be right. This is an important driver of conspiracy theories, pseudoscience, quackery and other fields where a person feels their views and ideas are being marginalised by mainstream society.

Ironically, an appeal to the minority is inherently limited. If someone successfully persuades other people that they are right, then their opinion would increasingly lose its minority status — and eventually would become majority opinion.


Appeal to Ignorance

by Tim Harding

The scope of the Appeal to Ignorance fallacy (Argumentum ad Ignorantiam in Latin) is more limited than its title would suggest. In the specific context of this fallacy, the word ignorance represents ‘a lack of contrary evidence’ rather than a lack of education or knowledge. The fallacy title was likely coined by the philosopher John Locke in the late 17th century.

In informal logic, this fallacy asserts that a proposition is true because it has not yet been shown to be false, or a proposition is false because it has not yet been shown to be true. This represents a type of false dichotomy, in that it excludes the possibility that there may have been an insufficient investigation to determine whether the proposition is either true or false. In other words, ‘absence of evidence is not evidence of absence.’
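The point about insufficient investigation can be made precise with a toy Bayesian calculation. The Python sketch below is my own illustration (the prior and ‘search power’ figures are invented): failing to find evidence counts against a claim only to the extent that the search was likely to find evidence if the claim were true.

```python
# Toy Bayesian sketch: how much does a failed search lower the probability
# that a claim is true? (Prior and search powers are invented.)

def posterior_given_no_evidence(prior: float, p_find_if_true: float) -> float:
    """P(claim true | no evidence found), assuming evidence is never
    'found' when the claim is false (i.e. no false positives)."""
    p_no_ev_if_true = 1.0 - p_find_if_true
    p_no_ev = prior * p_no_ev_if_true + (1.0 - prior)
    return prior * p_no_ev_if_true / p_no_ev

PRIOR = 0.5
for power in (0.05, 0.5, 0.95):
    post = posterior_given_no_evidence(PRIOR, power)
    print(f"search power {power:.2f}: P(true | nothing found) = {post:.3f}")
# A feeble search barely moves the prior; only a thorough search makes
# absence of evidence strong evidence of absence.
```

A thorough, well-powered search that finds nothing genuinely does lower the probability of a claim; the fallacy lies in treating a weak or non-existent search as if it settled the matter.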

In rhetorical debates, appeals to ignorance are sometimes used in an attempt to shift the burden of proof. A typical example is as follows: ‘In spite of all the talk, not a single flying saucer report has been authenticated. We may assume, therefore, there are no such things as flying saucers.’ An absurd but logically equivalent example is: ‘Although NASA has shown that the surface of the Moon is not made of green cheese, it has not conclusively demonstrated that the Moon’s core is not made of it; therefore, the Moon’s core is made of green cheese.’

This fallacy is a potential trap that empiricists need to be wary of falling into. We cannot prove the non-existence of anything, so the burden of proof lies with those who claim the existence of something, rather than those who doubt it. So, we should always remain open to the possibility of new evidence in support of a claim, even if no such evidence has ever been found.


Five most popular fallacies of 2018

The five most popular articles on this blog during 2018, apart from the home page, were all on fallacies:

  1. Fallacies of composition and division (9,187 views).
  2. Moral equivalence (6,376 views).
  3. False equivalence (4,281 views).
  4. Zero-sum fallacy (3,547 views).
  5. Perfect solution fallacy (3,229 views).


Why anti-vaxxers get it so wrong

by Tim Harding

The inability to accurately appraise one’s own knowledge is a cognitive bias known as the Dunning-Kruger Effect, first identified from social psychology experiments conducted in 1999. Dunning-Kruger effects occur when individuals’ lack of knowledge about a particular subject leads them to inaccurately gauge their expertise on that subject. Ignorance of one’s own ignorance can lead people who lack knowledge on a subject to think of themselves as more expert than those who are comparatively better informed.

A recent study published in the peer-reviewed journal Social Science & Medicine (and summarised in The Conversation) demonstrated that at least some anti-vaccination views are based on the Dunning-Kruger Effect. The study found that 71 per cent of those who strongly endorse misinformation about the link between vaccines and autism feel that they know as much as or more than medical experts about the causes of autism, compared with only 28 per cent of those who most strongly reject that misinformation.

The researchers also found that nearly a third (30 per cent) of people who think that they know more than medical experts about the causes of autism strongly support giving parents the latitude not to vaccinate their children. By contrast, only 16 per cent of those who do not think that they know more than medical professionals feel the same way.

The study also found that people who think they know more than medical experts are more likely to trust information about vaccines from non-expert sources, such as celebrities. These individuals are also more likely to support a strong role for non-experts in the process of making policies about vaccines and vaccination.

Whilst these recent research findings may not come as a surprise to seasoned skeptics, we now have empirical evidence to explain why at least some anti-vaccination views are so irrational.
