Category Archives: Logical fallacies

Policy vs rule confusion

Some people seem to be wittingly or unwittingly confused about the distinction between policies and rules. Unwitting confusion stems from not understanding the difference in meaning between these terms. Witting confusion stems from knowing the difference, but trying to soften the apparent strictness of the word ‘rule’, possibly for ideological motives.

A policy is a general guideline or principle that is used to make decisions and guide actions. It is a high-level document that provides direction for decision making, but allows for some discretion and flexibility in implementation.

A rule, on the other hand, is a specific and concrete statement that dictates a particular course of action. It is more specific and less open to interpretation than a policy, and is typically designed to ensure compliance with laws, regulations, or company policy. Rules provide a clear standard of behavior that must be followed and are typically enforceable with consequences for non-compliance.


Filed under Logical fallacies

Insinuation by questions

‘Insinuation by questions’ is a devious rhetorical tactic increasingly being used in political campaigns. It is also a way of dog-whistling to conspiracy theorists.

The aim of this tactic is to try to discredit popular political figures by undermining their trustworthiness. The method is to keep asking questions that have already been answered by the political figure under attack. These answers have often been corroborated by independent evidence. The theory is that if such questions keep being asked, voters will come to think that they must have some substance. This in turn will raise doubts and suspicions in the minds of some voters.

A notorious example of this tactic was the ‘birther movement’ in the USA, before and during President Obama’s term of office. Questions were continually raised as to whether Obama was actually born in Kenya, as alleged, rather than in Hawaii. These questions persisted despite Obama’s pre-election release of his official Hawaiian birth certificate in 2008, confirmation by the Hawaii Department of Health based on the original documents, the April 2011 release of a certified copy of Obama’s original long-form birth certificate, and contemporaneous birth announcements published in Hawaii newspapers. As usual, conspiracy theorists dismissed this convincing evidence as being part of the alleged conspiracy.

Closer to home, questions have recently been asked during the Victorian state election campaign about two incidents involving the Premier, Daniel Andrews. These were a car accident about 10 years ago, and his fall on some wet steps at his holiday accommodation in Sorrento. Both incidents were independently investigated, and Andrews’ account was corroborated. Yet questions continue to be asked via tabloid media such as the Herald Sun newspaper. One of the world’s leading experts on the amplification of conspiracies, University of Oregon Assistant Professor Whitney Phillips, says mainstream coverage like the Herald Sun’s presents a ‘huge turning point’ in helping to legitimise conspiracy theories.



Human accident fallacy

Some years ago, when I was working as a regulatory consultant to VicRoads, they asked me to use the term “road crash” rather than “road accident”. They pointed out that an accident is defined as an unintended event not directly caused by humans.

Vehicle collisions are not usually accidents; they are mostly caused by preventable behaviours such as drunk or careless driving, or intentionally driving too fast. Vehicle collisions invariably have a human cause, so the use of the term “accident” is misleading in this context.

The use of the word accident to describe car crashes was promoted by the US National Automobile Chamber of Commerce in the middle of the 20th century, as a way to make vehicle-related deaths and injuries seem like an unavoidable matter of fate, rather than a problem that could be addressed. The automobile industry accomplished this by writing customised articles as a free service for newspapers that used the industry’s preferred language. These articles were deliberately fallacious.

For this reason, the US National Highway Traffic Safety Administration has since 1994 asked media and the public not to use the word accident to describe vehicle collisions.  Similarly, the Australian National Road Safety Strategy 2021–30, endorsed by the responsible Federal and State Ministers, uses the term “road crash” rather than “road accident”. More widely, health and safety professionals generally prefer using the term “incident” in place of the term “accident”.

Confusingly, the common current meaning of the English word “accident” has almost nothing to do with Aristotle’s philosophical concept of the “fallacy of accident”, which was one of the thirteen fallacies that Aristotle discussed in his book On Sophistical Refutations. Aristotle’s accident fallacy is difficult to explain here, but doesn’t have anything to do with car crashes or people slipping on banana peels.



Can does not imply ought

‘Ought implies can’ is an ethical principle ascribed to the 18th century philosopher Immanuel Kant which claims that if a person is morally obliged to perform a certain action, then logically that person must be able to perform it. It makes no sense to say that somebody ought to do something that it is impossible for them to do. As a corollary, a person cannot be held responsible for something outside their control.

On the other hand, the converse relationship ‘Can implies ought’ does not apply. Just because a person can do something, it does not logically follow that they ought to do it.

When the Australian Governor-General Sir John Kerr dismissed the Whitlam Government in 1975, a common argument in favour was that he had the power to do it. However, this was an irrelevant red herring – hardly anybody disputed the power of the Governor-General to dismiss the Prime Minister. What they did dispute was whether he ought to have done it.

To give another example, in most cases it is lawful to tell lies, and it is often possible to get away with lying. But that does not mean that lying is acceptable behaviour (except for ‘white lies’ to avoid hurting somebody’s feelings or to otherwise minimise harm).  

In terms of logic, this reverse relationship is a logical fallacy known as ‘Affirming the consequent’ (sometimes called the fallacy of the converse) which consists of invalidly inferring the converse from the original statement. This fallacy takes the following form:

Premise 1: If P, then Q.

Premise 2: Q.

Conclusion: Therefore, P. 

Applying this form to the current case:

Premise 1: If you ought to do something, then you can do it.

Premise 2: You can do it.

Conclusion: Therefore, you ought to do it.
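For readers who like to see this checked mechanically, the invalidity of the form can be confirmed by enumerating all truth values of P and Q. Here is a minimal sketch in Python (the function name is my own invention):

```python
from itertools import product

# A minimal validity checker for the argument form above.
# An argument form is valid only if the conclusion is true in every
# row of the truth table where all the premises are true.
def affirming_the_consequent_is_valid():
    for p, q in product([True, False], repeat=2):
        premise1 = (not p) or q   # 'If P, then Q' (material conditional)
        premise2 = q              # 'Q'
        conclusion = p            # 'Therefore, P'
        if premise1 and premise2 and not conclusion:
            return False          # counterexample: premises true, conclusion false
    return True

print(affirming_the_consequent_is_valid())  # prints False
```

The counterexample is the row where P is false and Q is true: both premises hold, yet the conclusion fails. In the worked case, that is a person who can do something they are under no obligation to do.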

I realise that this is not a very common fallacy, and pointing it out might just seem like common sense, but it does have relevance for critical thinking and logical argument.



Special Pleading

Special pleading is a form of inconsistency in which the reasoner doesn’t apply his or her principles consistently. It is the fallacy of applying a general principle to various situations but not applying it to a special situation that interests the arguer even though the general principle properly applies to that special situation, too.


Everyone has a duty to help the police do their job, no matter who the suspect is. That is why we must support investigations into corruption in the police department. No person is above the law. Of course, if the police come knocking on my door to ask about my neighbors and the robberies in our building, I know nothing. I’m not about to rat on anybody.

In our example, the principle of helping the police is applied to investigations of police officers but not to one’s neighbors.

Source: The Internet Encyclopedia of Philosophy



Reverse causation fallacy

Recently on the Australian “Sunrise” TV program, co-presenter David Koch said: “There have only been 3000 deaths from COVID, far less than that from influenza in the same period, so we should oppose the lockdowns”. This statement ignored the fact that, prior to and during the vaccination rollout, lockdowns were likely to have prevented many thousands more deaths.

Similarly, some people argue against counter-terrorism measures on the grounds that there have been relatively few successful terrorist attacks in Australia, ignoring the fact that counter-terrorism has deterred and disrupted many more terrorist plots than those that have been carried out.

Several years ago, I was working as a regulatory consultant helping to remake sunsetting Victorian water regulations. Amongst other things, these regulations require the installation of backflow prevention devices on the customer’s water service pipe, just after the water meter. Backflow can result in contaminants being drawn into the drinking water system if the mains pressure suddenly drops as the result of a burst water main. If a customer leaves a hose running in a chlorinated swimming pool or attached to a container of “hose-on” fertiliser, weedicide or insecticide, or worse still in an industrial chemical bath, risks to public health can occur. A Treasury official asked me how many Victorians have died as a result of such backflow incidents. When I answered “none yet” he said “so what is the problem?”. This ignored the fact that backflow prevention devices have been installed for many decades, thus preventing backflow from occurring.

These examples all make the logical error of confusing cause and effect, which is also known as the reverse causation fallacy. The low numbers of cases are caused, at least in part, by the preventative measures in place – they do not demonstrate that such measures are unnecessary. Without such measures, the numbers of cases would be likely to be many times higher.




Pseudoprofundity

by Tim Harding

Most skeptics are familiar with the term ‘pseudoscience’, which means non-scientific activities masquerading as science. Examples include astrology, alchemy, so-called ‘alternative medicine’ and ‘creation science’.

Less well-known is the term ‘pseudoprofundity’, which means claptrap masquerading as profound wisdom. Deepak Chopra springs to mind, when he uses pseudoprofound terms such as ‘quantum healing’ and ‘dynamically active consciousness’.  

‘Love is only a word’ is a more general example of pseudoprofundity which Daniel Dennett calls a ‘deepity’ – the pretension of depth. This is a phrase which sounds like it contains a great depth of wisdom by virtue of being perfectly ambiguous. The philosophical blogger Jonasan writes that on one level it is clearly false that love is only a word. ‘Love’ is a word, but love itself is not (the inverted commas are important). The fact that ‘love’ is a word is also trivially true. We are thus left with a statement which can either be interpreted as obviously false or trivially true.  Jonasan notes that by failing to exercise our powers of analysis on this statement we end up thinking about both meanings together, rather than separating them and perhaps seeking clarification about which meaning is intended. This gives an illusion of profundity.

Stephen Law has pointed out that you can also achieve this effect without the need for an ambiguity in meaning. Ordinary trivially true platitudes such as ‘death comes to us all’ can be elevated to the level of profound insight if enunciated with enough gravitas. Likewise, one can take the other side of Dennett’s deepities – that of self-contradiction – and use it without even needing the trivially true side. Law again gives us one of the finest examples in ‘sanity is just another kind of madness’. It sounds profound, doesn’t it? Except that sanity cannot be a form of madness, because they are defined as opposites.



Tribal truth fallacy

During the 2020 US presidential election campaign, one of the starkest visual differences between Trump and Biden supporters was in the wearing of masks. Most Biden supporters appeared to wear masks, whereas most Trump supporters didn’t.

Even during Trump’s announcement of his nomination of Judge Amy Barrett to the US Supreme Court, very few people in the White House Rose Garden were wearing a mask. Worse still, some of the guests were seen hugging and kissing each other. 

As a result, the White House has become a COVID-19 ‘hotspot’ or super-spreader location, with seven attendees at the Judge Amy Barrett nomination announcement testing positive for coronavirus, including President Trump and the First Lady. More people caught the virus at the White House election ‘celebration night’, and a further 130 Secret Service agents have also tested positive.

It is clear that at least some, if not most, Trump supporters refuse to wear masks on political grounds. They seem to associate mask wearing with what they perceive to be ‘liberal’ pro-science attitudes. It is also possible that some Biden supporters might wear masks as a form of visual political opposition to the Trump supporters. In either case, this is irrational tribal behaviour.

A similar phenomenon may be occurring in the climate change debate. Some beliefs against human causes of climate change may be genuinely (but mistakenly) held on the basis of personal interpretation of the evidence. But at least some of the far-right wing opposition is due to a perception of climate science being some sort of left-wing plot against fossil fuel industries.

The far left is not immune from such irrational tribal behaviour either. At least some of the opposition to GMOs and vaccines seems to be based on ideological opposition to large agribusinesses and the pharmaceutical industry, rather than on evidence-based health concerns.

Another example is where some atheists oppose the idea of free will simply because Christians believe in it.  (This is despite the fact that prominent atheists such as Professor Daniel Dennett also believe in free will).

The tribal truth fallacy lies not in the falsity of beliefs about mask wearing, climate change, GMOs, vaccines or free will per se; but in the basis for these beliefs being identification with one’s own tribe or opposition to a rival tribe.



Collecting the wrong information

Some people seem to think that if you have a problem or an issue, all you need to do is to collect enough information about it, and that will tell you the answer.

Robert McNamara has provided a stark counter-example. As well as being the US Secretary of Defense during the Cuban missile crisis and president of the Ford Motor Company, McNamara was a statistical analyst for the US Army Air Forces during WW2.

In the story he told, the Americans did a huge analysis of data from repair reports on B-17 bombers, the plan being to add armour to the areas of the aircraft most commonly damaged, because bomber losses were so high. You can’t armour the whole plane, because of weight, so it was incredibly important that the armour went in the right place.

As McNamara told the story, one of the guys on the team, a Jewish mathematician named Abraham Wald (who had been thrown out of Austria because of Nazi anti-semitism), discussed the project with an armourer who’d been shipped back from overseas, who was helping the design team with the armour kits.

The armourer told Wald the project was all wrong: he’d seen planes fly home with damage in all the areas they were armouring. Then Wald had a eureka moment. He drew a diagram of the bomber and ruled out armouring every area where a plane had come home with a bullet hole. The diagram was stark: no plane had ever come home with damage in certain angles of the cockpit (where a bullet would kill both pilots) or at the base of the vertical stabiliser.

Wald had realised that the planes that had been shot in these bullet-free zones never made it home to be accounted for. They changed the armour, and crew survival rates shot up.

McNamara told the story to make a point about data – he said had they just followed the data, it would have led them to armour the places the plane could survive a hit and get back, not the places that would always bring the aircraft down if they were shattered.

So you need to collect the right information, because collecting the wrong information can mislead you to a wrong conclusion – in the above case the opposite conclusion to the right one.
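Wald’s insight can be illustrated with a toy simulation. This is only a sketch: the section names, the equal-hit assumption and the one-hit-per-plane model are invented for illustration, not drawn from the historical data.

```python
import random

random.seed(1)

SECTIONS = ["wings", "fuselage", "tail", "cockpit", "engines"]
VITAL = {"cockpit", "engines"}        # a hit here brings the plane down

# Damage recorded in repair reports, i.e. only on planes that made it home.
observed_damage = {s: 0 for s in SECTIONS}
for _ in range(10_000):
    hit = random.choice(SECTIONS)     # every section is equally likely to be hit
    if hit not in VITAL:              # fatal hits never reach the repair data
        observed_damage[hit] += 1

# The repair data shows no damage to the vital sections, not because they
# are never hit, but because planes hit there never come home.
print(observed_damage["cockpit"], observed_damage["engines"])  # prints: 0 0
```

Armouring the sections with the most recorded damage would therefore protect exactly the wrong places, which is the error Wald caught.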




Appeal to the minority

by Tim Harding

Alternative things have been ‘cool’ since the 1960s. They include alternative music, alternative lifestyles and of course, so-called ‘alternative medicine’. At least part of the appeal of such alternatives is the rejection of majority views perceived as mainstream or conservative.                   

An appeal to the minority is a logical fallacy that occurs when something is asserted to be true because most people don’t believe it. It is the opposite of an appeal to popularity where an advocate asserts that because the great majority of people agree with his or her position on an issue, he or she must be right (The Skeptic, September 2012). The logical form of the appeal to the minority fallacy is:

Premise 1: X is a minority view (as compared to majority view Y).

Premise 2: Minority views are more often true than majority views.

Conclusion: X is more likely to be true than Y.

To give some examples of this fallacy: ‘just like Copernicus, we in the Flat Earth Society are willing to defy the wrong-headed orthodoxy of the mainstream scientific community’ and ‘the medical profession and pro-vaccine sheeple have been conned by Big Pharma to maximise their profits’.

This fallacy has also been called ‘second-option bias’, a well-documented phenomenon among fringe and counterculture groups who assume that any widely held opinion among the general population must be untrue, and that the contrary minority opinion must therefore be right. This is an important driver of conspiracy theories, pseudoscience, quackery and other fields where a person feels their views and ideas are being marginalised by mainstream society.

Ironically, an appeal to the minority is inherently limited. If someone successfully persuades other people that they are right, then their opinion would increasingly lose its minority status — and eventually would become majority opinion.

