
False balance

by Tim Harding

False balance is a form of deliberate or unintended media bias in which opposing viewpoints are presented as more evenly supported than the evidence warrants. The media gives weight to evidence and opinions out of proportion to the actual support for each side, and/or withholds information that would establish one side’s claims as baseless. The impression is given of a scientific or evidence-based debate where there is actually none. The fallacy is related to false equivalence, but is not quite the same.

[Image: balance. Source: University of California Museum of Paleontology[1] (used with permission)]

This fallacy is also known as ‘Okrent’s law’, named after Daniel Okrent, the first public editor of The New York Times. He once said: ‘The pursuit of balance can create imbalance because sometimes something is true’, referring to the phenomenon of the press providing legitimacy to fringe or minority viewpoints in an effort to appear even-handed.

A notorious instance of false balance occurred on 16 August 2012, when WIN TV in Wollongong aired a news story about a measles outbreak in south-west Sydney. The story appeared to give equal weight to the professional advice of a medical practitioner that everyone should be immunised against measles and the amateur opinion of Meryl Dorey of the so-called Australian Vaccination Network (AVN). Ms Dorey claimed that ‘All vaccinations in the medical literature have been linked with the possibility of causing autism, not just the measles/mumps/rubella vaccine.’ The TV reporter concluded: ‘Choice groups are calling for greater research into the measles vaccine.’

The ABC’s Media Watch program was scathing in its criticism of this news story:

‘Choice groups’. They actually only quoted one group, which claims that it’s in favour of the public having a choice. But Meryl Dorey’s deceptively-named Australian Vaccination Network is in fact an obsessively anti-vaccination pressure group that’s immunised itself against the effect of scientific evidence.  Dorey’s claim about the medical literature linking vaccination and autism is pure, unadulterated baloney.[2]

On the Media Watch web site, there is a link to a long statement by the NSW Director of Health Protection, Dr. Jeremy McAnulty. Amongst other things, he says that:

Any link between measles vaccine and autism has been conclusively discredited by numerous in-depth studies and reviews by credible experts, including the World Health Organisation, the American Academy of Paediatrics and the UK Research Council. Statements erroneously linking measles vaccine and autism were associated with a decline in measles vaccination, which led to a measles outbreak in the UK in the past.[3]

Jonathan Holmes of Media Watch went on to say:

So why on earth, we asked WIN TV, did it include the AVN’s misleading claims in a news story about a measles outbreak?… Medical practitioners – choice groups. One opinion as valid as the other. It’s a classic example of what many – especially despairing scientists – call ‘false balance’ in the media…To put it bluntly, there’s evidence, and there’s bulldust. It’s a journalist’s job to distinguish between them, not to sit on the fence and bleat ‘balance’. Especially when people’s health is at risk.[2]

As the British Medical Journal put it in a 2011 editorial about the ‘debate’ in the UK:

The media’s insistence on giving equal weight to both the views of the anti-vaccine camp and to the overwhelming body of scientific evidence …made people think that scientists themselves were divided over the safety of the vaccine, when they were not. [4]

Other common examples of false balance in media reporting on science include man-made vs. natural climate change, evolution vs. creationism, and medicine vs. quackery. As the Understanding Science web site says:

Balanced reporting is generally considered good journalism, and balance does have its virtues. The public should be able to get information on all sides of an issue — but that doesn’t mean that all sides of the issue deserve equal weight. Science works by carefully examining the evidence supporting different hypotheses and building on those that have the most support. Journalism and policies that falsely grant all viewpoints the same scientific legitimacy effectively undo one of the main aims of science: to weigh the evidence.[1]

False balance can sometimes arise from motives similar to those behind sensationalism: media producers and editors may feel that a story portrayed as a contentious debate will be more commercially successful than a more accurate account of the issue. However, unlike most other media biases, false balance may ironically stem from a misguided attempt to avoid bias: producers and editors may confuse treating competing views fairly (that is, in proportion to their actual merits and significance) with treating them equally, giving each side the same time to present its views even when one side is known beforehand to rest on false or unreliable information. In other words, the two sides of a debate are automatically and mistakenly assumed to have equal value, regardless of their respective merits.

References

[1] ‘Beware of false balance: Are the views of the scientific community accurately portrayed?’, Understanding Science, University of California Museum of Paleontology, accessed 25 February 2014. http://undsci.berkeley.edu/article/0_0_0/sciencetoolkit_04

[2] ‘False Balance Leads To Confusion’, Media Watch, Episode 35, ABC1, 1 October 2012. http://www.abc.net.au/mediawatch/transcripts/s3601416.htm

[3] McAnulty, J. ‘Media Statement – Immunisation’, NSW Health, 28 September 2012.

[4] ‘When balance is bias’, British Medical Journal, Christmas Edition, 2011.


Buridan’s Ass

Buridan’s Ass is the name given to an apparent paradox related to the paradox of free will, although there is some debate amongst philosophers as to whether it actually is a paradox (see below).

The paradox is named after the French priest and philosopher Jean Buridan (c. 1300–1358 CE), who studied under William of Ockham. It refers to a hypothetical situation in which a donkey finds itself exactly halfway between two equally large and delicious bales of hay. There is no way of distinguishing between the two bales: they appear to be identical. Because the donkey lacks a reason or cause to choose one over the other, it cannot decide which one to eat, and so starves to death. The tale is usually taken as demonstrating that there is no free will.

The corollary to this argument is that if the donkey eats one of the bales of hay, then the donkey is making a choice.  If the donkey is making a choice, then it must have free will, because there is no causal mechanism to make it choose one bale over another. And if donkeys have free will, then so must humans.

[Image: Deliberations of Congress (Source: Wikimedia Commons)]

The paradox actually predates Buridan; it dates to antiquity, being found in Aristotle’s On the Heavens.[2] Aristotle, ridiculing the Sophist idea that the Earth is stationary simply because it is circular and any forces on it must be equal in all directions, says that this is as absurd as claiming that:

…a man, being just as hungry as thirsty, and placed in between food and drink, must necessarily remain where he is and starve to death.  — Aristotle, On the Heavens, (c.350 BCE)

The Persian Islamic scholar and philosopher Al-Ghazali (c. 1058–1111 CE) discusses the application of this paradox to human decision-making, asking whether it is possible to make a choice between equally good courses of action without grounds for preference. He takes the attitude that free will can break the stalemate.

Suppose two similar dates in front of a man, who has a strong desire for them but who is unable to take them both. Surely he will take one of them, through a quality in him, the nature of which is to differentiate between two similar things. — Abu Hamid al-Ghazali, The Incoherence of the Philosophers (c.1100CE)

Professor Michael Hauskeller of the University of Exeter takes a scientifically sceptical view of this paradox, using the donkey scenario:

If we could find a donkey which was dumb enough to starve between two piles of hay, we would have evidence against free will, at least as far as donkeys are concerned (or at least that particular donkey). But that’s not very likely. No matter how artfully we arrange the situation, a donkey will not hesitate very long, if at all, and will soon choose one of the piles of hay. He doesn’t care which, and he certainly won’t starve. However, even if we conducted thousands of experiments like this, and no donkey ever starved, we would still not have proved the existence of free will, because the reason no donkey ever starves in front of two equally attractive piles of hay may simply be that those piles aren’t really equally attractive. Perhaps in real life there aren’t any situations where the weighted reasons for a choice are equal.[1]

So Hauskeller’s suggested solution to the paradox is that, in practice, the two piles of hay are never exactly equal: the donkey detects a slight difference, which causes it to choose one pile over the other. This solution is less convincing when one considers the hypothetical possibility of two piles of hay that really are exactly equal in appearance. So it seems that we still have a problem here.

Some proponents of hard determinism have acknowledged the difficulty the scenario creates for their position, but have denied that it illustrates a true paradox, since a deterministic donkey could recognise that both choices are equally good and arbitrarily (randomly) pick one instead of starving. For example, there are deterministic machines that can generate random numbers, although there is some dispute as to whether such numbers are truly random.
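To illustrate the determinist’s reply, here is a minimal sketch in Python (the function name and seed value are hypothetical, chosen purely for illustration): a deterministic pseudo-random number generator can break the tie, even though nothing about the two options themselves favours one over the other.

```python
import random

def choose_bale(bales, seed=42):
    """Pick one of several equally attractive options.

    The outcome is fully determined by the seed (no appeal to free
    will), yet the choice does not depend on any difference between
    the options, because there is none.
    """
    rng = random.Random(seed)  # deterministic pseudo-random generator
    return rng.choice(bales)

# Two indistinguishable bales of hay: the donkey eats rather than starves.
print(choose_bale(['left bale', 'right bale']))
```

Whether such a pseudo-random tie-break counts as ‘truly random’ is, as noted above, disputed; but it does show that determinism alone need not leave the donkey paralysed.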

References

[1] Hauskeller, M. (2010) ‘Why Buridan’s Ass Doesn’t Starve’, Philosophy Now, London. http://philosophynow.org/issues/81/Why_Buridans_Ass_Doesnt_Starve

[2] Rescher, N. (2005) Cosmos and Logos: Studies in Greek Philosophy. Ontos Verlag, pp. 93–99.


Sunk cost fallacy

by Tim Harding

In economics and business decision-making, a sunk cost is a retrospective (past) cost that has already been incurred and cannot be recovered. Sunk costs are sometimes contrasted with prospective costs, which are future costs that may be incurred or changed if an action is taken.

In traditional microeconomic theory, only prospective (future) costs are relevant to an investment decision. Traditional economics proposes that economic actors should not let sunk costs influence their decisions; doing so would mean failing to assess a decision exclusively on its own merits.

On the other hand, evidence from behavioural economics suggests that this theory fails to predict real-world behaviour. Sunk costs do, in fact, influence actors’ decisions, because humans are prone to loss aversion and framing effects. In light of such cognitive quirks, it is unsurprising that people frequently fail to behave in ways that economists deem rational.

Sunk costs should not affect the rational decision-maker’s best choice. However, until a decision-maker irreversibly commits resources, a prospective cost is an avoidable future cost and is properly included in any decision-making process. For example, if one is considering pre-ordering movie tickets but has not actually purchased them yet, the cost remains avoidable. If the price of the tickets rises to more than the value one places on them, the change in prospective cost should be figured into the decision-making, and the decision re-evaluated.

Many people have strong misgivings about ‘wasting’ resources (loss aversion). Continuing with the movie ticket example: after purchasing a non-refundable ticket, many people would feel obliged to go to the movie despite not really wanting to, because doing otherwise would waste the ticket price; they feel they have passed the point of no return. Similarly, some people will not walk out of a movie they dislike, because they do not want to ‘waste’ the money they have already paid and cannot recover, i.e. the sunk cost. This is despite the fact that, if they walked out, they could spend the time doing something they much prefer.

Another common instance of this behaviour is the reluctance to sell underperforming company shares (stocks) for fear of ‘wasting’ one’s original investment, when it would make better sense to sell them and use the money to buy other shares that are likely to perform better.

This behaviour is referred to as the sunk cost fallacy. Economists would label it ‘irrational’: it is inefficient because it misallocates resources by depending on information that is irrelevant to the decision being made. Colloquially, this is known as ‘throwing good money after bad’. It could also be described as a form of preference failure, i.e. not acting in one’s own best interests.
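As a minimal sketch of the rational rule (with hypothetical figures and function names, purely for illustration), the decision can be written in Python as a comparison that deliberately ignores the sunk cost:

```python
def should_stay(remaining_enjoyment, best_alternative, ticket_price_paid=0):
    """Decide whether to keep watching a movie you dislike.

    ticket_price_paid is deliberately ignored: it is a sunk cost,
    already paid and unrecoverable whichever option is chosen.
    Only the prospective value of each option matters.
    """
    return remaining_enjoyment > best_alternative

# Hypothetical figures: a $20 ticket, $5 of expected enjoyment from
# staying, and an alternative use of the evening valued at $10.
print(should_stay(remaining_enjoyment=5, best_alternative=10, ticket_price_paid=20))
# False: the $20 is lost either way, so walking out is the better choice.
```

The sunk cost fallacy consists precisely in letting the unused ticket_price_paid parameter sway the comparison.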
