This might seem like a simple question, but the answer is not so straightforward. The Macquarie Dictionary defines a fact as ‘what has really happened or is the case; truth; reality’. This implies that facts are objective, as opposed to opinions, which are subjective. The distinction is important to scientific skepticism, which looks for objective evidence when evaluating dubious claims.
The usual test for a statement of fact is verifiability; that is, whether it can be demonstrated to correspond to either empirical observation or deductive proof (such as 2+2=4). For instance, the proposition ‘It is raining’ describes the fact that it actually is raining. The rain that falls can be objectively measured in a rain gauge – it is not just a matter of opinion.
On the other hand, an opinion is a judgement, viewpoint, or statement about matters commonly considered to be subjective, such as ‘It is raining too much’. As Plato said: ‘opinion is the medium between knowledge and ignorance’.
Philosophers generally agree that facts are objective and verifiable. However, there are two main philosophical accounts of the epistemic status of facts. The first account is equivalent in meaning to the dictionary definition – a fact is that which makes a true proposition true. In other words, facts describe reality independently of propositions. This means that there can be unknown facts yet to be discovered by science or other investigations. But we cannot verify a fact unless we know about it. So the other account is that a fact is an item of knowledge – a proposition that is true and that we are justified in believing is true. This means that we either have to accept that there can be unknown and unverifiable facts, or we must adopt the position that facts are things that we know.
Mulligan, Kevin and Correia, Fabrice, ‘Facts’, The Stanford Encyclopedia of Philosophy (Winter 2017 Edition), Edward N. Zalta (ed.)
Russell, Bertrand (1918) The Philosophy of Logical Atomism, Open Court, La Salle.
Russell, Bertrand (1950) An Inquiry Into Meaning and Truth, Allen & Unwin, London.
(An edited version of this article was published in “The Skeptic” Vol 37, No. 2, June 2017)
The claim ‘I’m entitled to my opinion’ or ‘I have a right to my opinion’ is a logical fallacy in which a person attempts to reject objections to their argument by claiming that they are entitled to their opinion. This claim is usually uttered by people in disagreement when they have hit the wall in defending their point on its merits. It is a last-ditch rhetorical device that attempts to rescue their position by defending their right to hold an opinion, no matter how ill-founded or wrong that opinion might be.
The claim exemplifies a red herring. The right to have an opinion is not what is in dispute. Whether one has a particular entitlement or right is irrelevant to whether one’s opinion is true or false. To assert the existence of the right is a failure to provide any justification for the content of the opinion. The claim also implies that all opinions are equal, which exemplifies the relativist fallacy.
The entitlement claim would be relevant only if it guaranteed the truth of your opinions. But it can’t do that, because it is an entitlement supposedly enjoyed by everybody. And people disagree. Two debaters are both entitled to their contradictory opinions about a given issue, but they can’t both be right. So insisting that you are entitled to your opinion cannot possibly give you any logical advantage in a debate.
The relativist fallacy, also known as the subjectivist fallacy, is the claim that something can be true for one person but not true for someone else. Such a claim violates the law of non-contradiction. The fallacy applies only to objective facts, or what are alleged to be objective facts, rather than to personal tastes or subjective experiences.
In classical logic, the law of non-contradiction (LNC) is the second of the three classic laws of thought. It states that contradictory statements cannot both be true in the same sense at the same time; for example, the two propositions ‘A is B’ and ‘A is not B’ are mutually exclusive.
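For readers who like to see the law stated mechanically, here is a minimal truth-table check (an illustrative sketch, not part of the original argument) confirming that ‘P and not-P’ comes out false under every truth assignment:

```python
# Law of non-contradiction: for any proposition P, the compound
# statement (P and not P) is false under every truth assignment.
for P in (True, False):
    contradiction = P and not P
    assert contradiction is False

print("¬(P ∧ ¬P) holds for every truth value of P")
```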
(An edited version of this essay was published in The Skeptic magazine, September 2015, Vol 36 No 3, p. 36, titled ‘Who Needs to Know?’ It has since been republished in Australian Doctor magazine, 30 October 2015. The essay is based on a talk presented to the Victorian Skeptics in May 2015.)
Anti-vaccination campaigner Meryl Dorey is on record as saying that we should ‘do our own research’ instead of accepting what doctors and other qualified experts tell us. Seasoned skeptics will be aware that ‘Do your own research!’ is a common retort by cranks and conspiracy theorists to those who dare to doubt their claims. It is a convenient escape hatch they use when trying to win a debate without the bothersome burden of providing their own evidence.
Of course, what they mean by this exhortation is not to do any actual scientific or medical research. It takes a bit of tertiary education in the relevant field to be able to do that. For them, ‘research’ means nothing more than googling for an hour or so on the Internet. They naively equate such googling with the years of study and experience it takes to become a qualified expert. Their message is that anybody with internet access can become an instant but unqualified expert on anything. Or worse still, that expertise doesn’t even count – all opinions are equal.
The reality is that googling is a notoriously unreliable source of information – there are sound reasons why Wikipedia is not allowed to be cited as a source in university assignments. The problem is that without expertise in the field in question, few googlers are capable of knowing which sources are reliable and which aren’t. Anything found on the internet becomes ‘knowledge’. Mere opinions become ‘facts’.
Another problem is that googlers are often unaware of the wider knowledge context of the specific pieces of information they have found on the internet. In contrast, experts are as much aware of what they don’t know as what they do know. As Professor Stephan Lewandowsky of the University of Bristol puts it:
‘Here is the catch: to know how much more there is to know requires knowledge to begin with. If you start without knowledge, you also do not know what you are missing out on.’
This paradox gives rise to a famous result in experimental psychology known as the Dunning-Kruger effect, named after David Dunning and Justin Kruger, who published the study in 1999. They found that people who lack the knowledge or skill to perform well are often unaware of this fact. This is arguably more dangerous than complete ignorance because, unlike Donald Rumsfeld, they don’t even know what they don’t know.
Professor Tom Nichols, a US national security expert, wrote last year about the ‘death of expertise’: a Google-fueled, Wikipedia-based, blog-sodden collapse of divisions between professionals and amateurs, teachers and students, knowers and wonderers – between those with any expertise in an area and those with none at all. He sees this situation as not only a rejection of knowledge, but also of the processes of knowledge acquisition – a rejection of science and other pursuits of rationality.
Nichols is particularly critical of otherwise intelligent people who are ‘doing their own research’ on the internet and second-guessing their doctors by refusing to vaccinate their children, leading to an entirely avoidable resurgence of dangerous infectious diseases such as whooping cough and measles.
So how did it all come to this sorry state of affairs? I think that there are basically four contributing factors: the blurring of facts and opinions; a misunderstanding of democracy; a misunderstanding of the Argument from Authority; and the dissipation of media accountability. I will now discuss each of these factors in turn and then outline some benefits of listening to experts.
Blurring facts and opinions
According to the Stanford Encyclopedia of Philosophy, a fact is a state of affairs that is the case. The usual test for a statement of fact is verifiability; that is, whether it can be demonstrated to correspond to experience. Scientific facts are verified by repeatable careful observation or experiment. In other words, a fact is that which makes a true statement true. For instance, the statement ‘It is raining’ describes the fact that it actually is raining. The rain that falls can be objectively measured in a rain gauge – it is not just a matter of opinion.
On the other hand, an opinion is a judgment, viewpoint, or statement about matters commonly considered to be subjective, such as ‘It is raining too much’. As Plato said: ‘opinion is the medium between knowledge and ignorance’.
The last few decades have seen the growth of a postmodernist notion that truth is culturally relative and that all opinions are equal. What’s worse is a gradual blurring of the important distinction between facts and opinions. A disturbing feature of the public debate about climate change is the confusion between science and policy. Because they conflict with some political policies, there is a tendency for the findings of climate scientists to be treated as ‘just another opinion’. This is a marked change from a few decades ago, when the findings of epidemiologists about the links between smoking and cancer were widely accepted as facts rather than opinions.
Reducing the influence of experts is sometimes mistakenly described as ‘the democratisation of ideas’. Democracy is a system of government – it is not an equality of opinions. While the right of free speech prevents governments from suppressing opinions, it does not require citizens to treat all opinions equally or even to take them into account. Equal rights do not result in equal knowledge and skills – a point Professor Brian Cox has also made.
Deakin University philosopher Dr Patrick Stokes has argued that the problem with ‘I’m entitled to my opinion’ is that it has become shorthand for ‘I can say or think whatever I like’ without justification, and for the notion that disagreement is somehow disrespectful. Stokes suggests that this attitude feeds the false equivalence between experts and non-experts that is an increasingly pernicious feature of our public discourse.
Professor Michael Clark of La Trobe University gives an example from a recent public meeting: when a participant asked a question that referred to some research, a senior public servant replied: ‘Oh, everyone has a scientific study to justify their position, there is no end to the studies you could cite, I am sure, to support your point of view.’ Clark describes this as a cynical stance in which there are no absolute truths and everyone’s opinion must be treated as equally valid. In such an intellectual framework, the findings of science can easily be dismissed as just one of many conflicting views of reality.
Misunderstanding the Argument from Authority
A common response from cranks and conspiracy theorists (and even some skeptics) to citations of expertise is ‘that’s just the argument from authority fallacy’. Such a response ignores the obvious fact that all scientific papers and other forms of academic writing are chock full of citations of experts. The notion that the written outputs of the world’s universities and scientific institutions are all based on a logical fallacy is preposterous. Anybody who thinks that has clearly not thought through the implications of what they are saying.
The Argument from Authority is often misunderstood to be a fallacy in all cases, when this is not necessarily so. The argument becomes a fallacy only when used deductively, or where there is insufficient inductive strength to support the conclusion of the argument.
The most general form of the deductive fallacy is:
Premise 1: Source A says that statement p is true.
Premise 2: Source A is authoritative.
Conclusion: Therefore, statement p is true.
Even when the source is authoritative, this argument is still deductively invalid because the premises can be true, and the conclusion false (i.e. an authoritative claim can turn out to be false). This fallacy is known as ‘Appeal to Authority’.
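The invalidity can also be shown mechanically. This small sketch (illustrative only; the variable names are mine) enumerates every truth assignment for the three statements in the schema and finds the single counterexample row in which both premises are true but the conclusion is false:

```python
from itertools import product

# Each row assigns truth values to:
#   row[0] - Premise 1: the source asserts p
#   row[1] - Premise 2: the source is authoritative
#   row[2] - Conclusion: p is true
# A deductively valid form has NO row with true premises and a false
# conclusion; this form has exactly one such row, so it is invalid.
counterexamples = [
    row for row in product([True, False], repeat=3)
    if row[0] and row[1] and not row[2]   # premises true, conclusion false
]

print(counterexamples)  # [(True, True, False)] – the form is invalid
```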
The fallacy is compounded when the source is not an authority on the relevant subject matter. This is known as Argument from false or misleading authority.
Although reliable authorities are correct in judgments related to their area of expertise more often than laypersons, they can occasionally come to the wrong judgments through error, bias or dishonesty. Thus, the argument from authority is at best a probabilistic inductive argument rather than a deductive argument for establishing facts with certainty. Nevertheless, the probability sometimes can be very high – enough to qualify as a convincing cogent argument. For example, astrophysicists tell us that black holes exist. The rest of us are in no position to either verify or refute this claim. It is rational to accept the claim as being true, unless and until the claim is shown to be false by future astrophysicists (the first of whom would probably win a Nobel Prize for doing so). An alternative explanation that astrophysicists are engaged in a worldwide conspiracy to deceive us all would be implausible and irrational.
An artist’s depiction of a black hole
As the prominent British environmental activist Mark Lynas has said ‘…if an overwhelming majority of experts say something is true, then any sensible non-expert should assume that they are probably right.’
Thus there is no fallacy entailed in arguing that the advice of an expert in his or her field should be accepted as true, at least for the time being, unless and until it is effectively refuted. A fallacy only arises when it is claimed or implied that the expert is infallible and that therefore his or her advice must be true as a deductive argument, rather than as a matter of probability. Criticisms of cogent arguments from authority can actually be a rejection of expertise, which is a fallacy of its own.
The Argument from Authority is sometimes mistakenly confused with the citation of references, when done to provide published evidence in support of the point the advocate is trying to make. In these cases, the advocate is not just appealing to the authority of the author, but providing the source of evidence so that readers can check the evidence themselves if they wish. Such citations of evidence are not only acceptable reasoning, but are necessary to avoid plagiarism.
Expert opinion can also constitute evidence and is often accepted as such by the courts. For example, if you describe your symptoms to your doctor and he or she provides an opinion that you have a certain illness, that opinion is evidence that you have that illness. It is not necessary for your doctor to cite references when giving you his or her expert opinion, let alone convince you with a cogent argument. In some cases, expert opinion can carry sufficient inductive strength on its own.
Dissipation of media accountability
I have no doubt that the benefits of the internet generally outweigh the costs. However, there are some downsides that need to be considered rather than glossed over. An obvious negative is the decline of newspapers and competent professional journalism. Specialist science or medical journalists are a rarity these days. Generalist journalists often get their science stories wrong, or engage in misleading false balance – the equating of professional expertise with amateur ignorance.
Another problem is the blurring of the distinction between journalism and blogging – and I say this as a blogger myself. Unlike bloggers, journalists are subject to professional standards and editorial control. Some bloggers are anonymous, which removes their accountability to even their own readers for the accuracy of what they write.
There is a risk that when non-experts google, they are inclined to give equal weight to information from both professional journalists and amateur bloggers, regardless of its reliability and accuracy.
Benefits of expertise
Whilst experts are human and can make mistakes, they have a pretty good batting average compared to laypersons. The advice that experts provide is far more likely to be true than advice from non-experts in the field in question. This has obvious benefits for society as a whole, for example in terms of public health and safety, environmental protection and managing the economy. There are good reasons why we don’t let amateurs design aircraft, bridges and tall buildings. But there are also some major benefits for the individual in listening to advice from experts as opposed to non-experts.
For instance, if you trust your doctor, you’re actually more likely to do better when you’re sick, according to a study recently published in General Hospital Psychiatry. This study, of 119 people with breast, cervical, intestinal or prostate cancer, found that from three months after diagnosis, those patients who did not trust their doctors were not only more distressed but also more physically disabled. They were less likely, for example, to be able to go for long walks or take care of themselves. Patients who felt anxious about being rejected and abandoned suffered the most from not trusting their doctors.
Trusting your doctor has clear health benefits. You’ll be more likely to try new drugs, follow your treatment plan (jointly agreed with your trustworthy doctor), share important medical information, take preventative measures (e.g. screening) and have better-controlled diabetes and blood pressure.
Up to half of the failures in treatment reported by patients are due to not following the regimen suggested by their doctors. This increases the risk of hospitalisation and extended ill health. Another study, at the University of California, found a small but statistically significant association between how much patients trusted their doctors and how much their symptoms improved within two weeks (allowing for different factors that could have influenced the outcome).
As Professor Michael Clark has said, people who use Dr. Google to diagnose their symptoms before visiting an actual doctor, sometimes ask to be tested for diseases they do not have, or waste time seeking a second opinion because they are convinced that their ‘research’ has led them to a correct diagnosis. If it were really that easy, would doctors have to spend all those years in medical school? Prof. Clark has also said that:
“Using Google to find the answer to Trivial Pursuit questions is not the same as researching a complex question. Experts do have skills, and one of those is the ability to use high-quality sources, up-to-date theoretical frameworks, and critical thinking based on their experience in a particular field. This is why an expert’s answers are going to be more accurate and more nuanced than a novice’s.”
Today’s Australians are, by far, the best educated cohort in our history – on paper, anyway – but this is not reflected in the quality of our political discourse. We appear to be lacking in courage, judgement, capacity to analyse and even simple curiosity, except about immediate personal needs.
Australia also has about 4.5 million graduates (nearly 20% of the population), far more than the total number of traditional blue-collar workers. Members of trade unions number about one million people: 18% of the total workforce and about 12% of the private sector.
Inevitably, these numbers will shift our political culture, but the process is occurring slowly.
Australia, like the US, UK, Canada and much of Europe, has undergone a serious decline in the quality of debate on public policy. The British journalist Robert Fisk has called this “the infantilisation of debate”.
In the era of “spin”, when a complex issue is involved, leaders do not explain. They find a mantra (“stop the boats!”) and repeat it endlessly, “staying on message”, without explanation or qualification. The word “because” seems to have fallen out of the political lexicon.
Evidence-based policies and actions should be central principles in the working of our system, and reliance on populism and sloganeering should be rejected – but in reality the reverse is often true.
Selling out science
Complex problems demand complex solutions. Examples of such problems are refugees and climate change, which cannot be reduced to parroting a few simple slogans (“turn back the boats”, “stop this toxic tax”).
“Retail politics” – sometimes called “transactional politics” – where policies are adopted not because they are right but because they can be sold, is a dangerous development and should be rejected. We must maintain confidence that major problems can be addressed – and act accordingly.
A voracious media looks for diversity and emotional engagement, weakening the capacity for reflection and serious analysis. This is compounded by the rise of social media, where users typically seek reinforcement of their views rather than being challenged by diversity.
Science and research generally are given disturbingly low priority in contemporary public life in Australia. Scientists, especially those involved with climate change or the environment, have come under unprecedented attack, especially in the media.
And the whole concept of the scientific method is discounted, even ridiculed. Gus Nossal sometimes quotes me as saying that Australia must be the only country in the world where the word “academic” is treated as pejorative.
The role of science in policy development is a sensitive issue. I spent years – decades really – bashing my head against a brick wall trying to persuade colleagues to recognise the importance, even centrality, of science policy.
Many, probably most, of my political colleagues had no interest in science as an intellectual discipline, although they depended on science for their health, nutrition, transport, entertainment and communication.
We need to revive the process of dialogue: explain, explain, explain, rejecting mere sloganeering and populism. We need evidence-based policies, but often evidence lacks the psychological carrying power generated by appeals to prejudice or fear of disadvantage (“they are robbing you…”).
Evidence vs. opinion
There is a disturbing conflict between evidence and opinion (“you have evidence, but I have strong opinions”), and political processes are more likely to be driven by opinion rather than evidence in a short political cycle.
Brian Schmidt, our Nobel Laureate in astrophysics, wrote of his experience in this regard in The Age on February 16:
As a Nobel Prize winner, I travel the world meeting all kinds of people.
Most of the policy, business and political leaders I meet immediately apologise for their lack of knowledge of science.
Except when it comes to climate science. Whenever this subject comes up, it never ceases to amaze me how each person I meet suddenly becomes an expert.
Facts are then bandied to fit an argument for or against climate change, and on all sides, misconceptions abound.
The confusion is not surprising – climate science is a very broad and complicated subject with experts working on different aspects of it worldwide.
No single person knows everything about climate change. And for the average punter, it’s hard to keep up with all the latest research and what it means.
More surprising is the supreme confidence that non-experts (scientists and non-scientists alike) have in their own understanding of the subject.
I encourage you to read Thinking, Fast and Slow, a 2011 best seller by the psychologist Daniel Kahneman who, although not an economist, won the Nobel Prize for Economic Science in 2002 for his development of “prospect theory”.
Prospect theory analyses rational and irrational factors in decision making. He demonstrates, regrettably, the extent to which people like you and me use familiar short cuts – “heuristics” – to make intuitive judgements, and discount evidence or rationality in making decisions.
This can apply whether we are purchasing something, deciding where and how to live, or taking a political stance on an issue. Kahneman became an outstanding authority on behavioural economics and social psychology.
Jonathan Haidt’s The Righteous Mind: Why Good People are Divided by Politics and Religion, from 2012, is also an important book. I think Haidt could go much further with his thesis, which states that politics and religion tend to be centred on “values”, so people can pick and choose, and can sometimes be blinded to the facts because of their moral worldview. It is clear that many people say: “I reject these particular facts because I don’t trust where they come from.”
Heuristics and confusion
Psychologists confirm that we habitually engage in the cherry-picking of evidence – we choose the bits that we are emotionally and intuitively attracted to and comfortable with.
The Cambridge political scientist, David Runciman, argues that “opinion, interest and knowledge are too divided, and no event, whether an election […] or a crisis is clear enough in its meaning to bring closure”.
For example, there is fierce opposition in some quarters to the vaccination of children and the fluoridation of water supplies to prevent dental caries, even though the empirical evidence in support of both is overwhelming. But appeals to fear can be far more powerful than arguing on the basis of hard evidence.
There has been a sustained attack from some quarters – the News Corporation papers, the Institute of Public Affairs (IPA) and the Centre for Independent Studies (CIS) to name only three – on scientific research and scientific method, even on rationality and the Enlightenment tradition.
The illusion was created that scientists are corrupt, while lobbyists are pure. One of the false assertions is that scientists who take the mainstream position are rewarded, while dissenters are punished (similar to Galileo and the Inquisition).
In Australia now, and in the US until recently, the contrary could be argued. Galileo’s work was based on observation and data – his opponents were operating from doctrine.
Scientists arguing for the mainstream view have been subject to strong attack by denialists who assert that they are quasi-religious zealots who are missionaries for a green religion.
In reality, it is the denialist/confusionist position that relies on faith: the conviction that there is a diversity of complex reasons for climate change, but that only one can be confidently rejected – the role of human activity.
There are three areas of attack against expertise and taking a long term, analytical view of the world: from the Right, the Left and the anxious Centre.
From the Right there have been systematic and well-financed attacks by lobbyists from the fossil fuels industry and electricity generators. This has been highly personal, often abusive, sometimes threatening.
The anxious Centre includes people working in particular industries and regions (such as the Hunter Valley, the La Trobe Valley and the Tasmanian forests), understandably fearful of potential job losses, without much prospect of creating new jobs. The trade union movement is deeply divided on this – as is the business community.
But from the Left, or some segments of the intellectual Left, a deconstructionist mind-set has partly undermined an evidence-based approach to policy making or problem solving.
The pluralist or deconstructionist or post-modern theory of knowledge is contemptuous of expertise, rejects the idea of hierarchies of knowledge and asserts the democratic mantra that – as with votes in elections – every opinion is of equal value, so that if you insist that the earth is flat, refuse vaccination for children or deny that HIV/AIDS is transmitted by a virus, your view should be treated with respect.
Similarly, there has been a repudiation of expertise and of taste – dismissing the idea, held by people like Harold Bloom and by me, that there is a “Western canon” which sets benchmarks. “No,” say the deconstructionists, “the paintings of Banksy, the mysterious British graffiti artist, are just as good as Raphael’s, and hip-hop performances are just as valid as Beethoven’s Opus 131.”
The Welsh geneticist Steve Jones asks an important question: if there is a division of scientific opinion, with 999 on one side, and one on the other, how should the debate be handled? Should the one dissenter be given 500 opportunities to speak?
Yet Graham Lloyd, The Australian’s environment editor – perhaps more accurately described as the anti-environment editor – trawls the web, finds obscure and unsubstantiated critiques of mainstream science, then publishes them as front page attacks on professional integrity.
Science and common-sense
There are major problems when it comes to explaining some of the issues in science, and there have been ever since science began. Some fundamental scientific discoveries seem to be counter-intuitive, challenging direct observation or our common-sense view of the world.
Common sense, and direct observation, tells us that the Earth is flat, that the sun (like the moon) rotates around the Earth and that forces don’t operate at a distance.
Aristotle, with his encyclopedic – but often erroneous – grasp of natural phenomena, was a compelling authority in support of a geocentric universe, of the seat of reason residing in the heart rather than the brain, and of the view that females were deformed males. His views were dominant for 1,500 years.
The Greek astronomer Ptolemy, following Aristotle, provided ingenious proofs in support of geocentrism. Then along came Copernicus, Galileo and Kepler who said: “Your common sense observation is wrong. The orbits of sun and moon are completely different, although they appear to be similar.” (Our use of the terms “sunrise” and “sunset” preserves the Ptolemaic paradigm.)
By the 20th Century, electronics enabled us to apply force from a distance, to do thousands of things remotely, manipulating spacecraft and satellites, or receiving signals (radio, telephony, television), setting alarms, opening garage doors and, one of the great labour saving devices, the remote switch for television.
The most obvious disjunction between science and common sense is the question: “right now, are we at rest or in motion?”
Common sense and direct observation suggest that we are at rest. But science says: wrong again. We are moving very rapidly. The Earth is spinning on its axis at 1,669 km/h at the equator, and at Melbourne’s latitude (37.8°S) at about 1,317 km/h. We are also orbiting the sun even faster, at nearly 30 km per second, or about 107,200 km/h. There is a third motion, harder to measure, as the universe expands – an expansion that is speeding up, as Brian Schmidt and his colleagues discovered.
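The rotation figures are simple arithmetic: the Earth’s circumference divided by the length of a day, scaled by the cosine of the latitude. A quick sketch (the constants are approximations, and the essay’s figures evidently use slightly different values):

```python
import math

EQUATORIAL_CIRCUMFERENCE_KM = 40_075   # approximate
DAY_HOURS = 24                         # solar day

def rotation_speed_kmh(latitude_deg: float) -> float:
    """Ground speed due to Earth's rotation at the given latitude."""
    equatorial_speed = EQUATORIAL_CIRCUMFERENCE_KM / DAY_HOURS
    # Speed falls off with the cosine of latitude: circles of latitude shrink
    # towards the poles while the rotation period stays the same.
    return equatorial_speed * math.cos(math.radians(latitude_deg))

print(round(rotation_speed_kmh(0)))     # equator:   ≈ 1670 km/h
print(round(rotation_speed_kmh(37.8)))  # Melbourne: ≈ 1319 km/h
```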
But, sitting here in Footscray, it is hard to grasp that we are in motion, kept in place by gravity. Psychology resists it. Essentially we have to accept the repudiation of common sense on trust, because somebody in a white coat says, “trust me, I’m a scientist”. I would challenge anyone to reconcile common sense with quantum theory, or to satisfactorily explain the Higgs boson or – hardest of all – to define gravity.
The factors that limit the psychological carrying power of much science – not all – include these:
its complexity, often requiring use of a language known only to initiates
outcomes are seen as too expensive
outcomes are seen as too slow
the history of science has been badly taught, often portrayed as an effortless success story, proceeding from triumph to triumph, instead of the passionate and dramatic reality.
Science at the core
Scientists and learned societies have been punching below their weight in matters of public policy, and they are careful to avoid involvement in controversies outside their disciplines – possible threats to grants being among their concerns.
Science must be at the core of our national endeavour and you are well placed to examine the evidence, evaluate it, then advocate and persuade. Our nation’s future depends on the quality of its thinking, and its leaders.
There is a wide-spread assumption by industry and government that Australia’s economic, social and technological future will be a mirror image of the past. We can be confident that this just won’t happen. We have not even begun to talk seriously about the threats and opportunities of a post-carbon economy.
I encourage you, whatever your political persuasion, or lack of it, to argue for higher recognition of the role that science must play in our future, and to drive your MP mad unless or until he or she does something about it.
Remember Archimedes and his lever. But first you have to find a fulcrum, then you push the lever.