One of the central assumptions of mainstream economics has been that people make rational choices. As a challenge to this assumption, Nobel prize-winning behavioural economist Prof. Daniel Kahneman gives an example in which some Americans were offered a choice between insurance against their own death in a terrorist attack while on a trip to Europe, and insurance that would cover death of any kind on the trip. People were willing to pay more for the former, even though ‘death of any kind’ includes ‘death in a terrorist attack’.
This is an instance of the Conjunction Fallacy, which is based on the false assumption that specific conditions are more probable than general ones. This fallacy usually stems from treating the choices as mutually exclusive alternatives, rather than recognising that one is a subset of the other.
The logical form of this fallacy is:
Premise: X is a subset of Y.
Conclusion: Therefore, X is more probable than Y.
The probability of a conjunction is never greater than the probability of its conjuncts. In other words, the probability of two things being true can never be greater than the probability of one of them being true, since in order for both to be true, each must be true. However, when people are asked to compare the probabilities of a conjunction and one of its conjuncts, they sometimes judge that the conjunction is more likely than one of its conjuncts. This seems to happen when the conjunction suggests a scenario that is more easily imagined than the conjunct alone.
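The subset rule above can be made concrete with a short simulation. This is a minimal sketch with made-up probabilities (none of these numbers come from the article): every simulated death in a terrorist attack is, by construction, also a death of some kind, so the specific event can never be counted more often than the general one.

```python
import random

random.seed(0)

# Hypothetical numbers, for illustration only.
P_ANY_DEATH = 0.001          # probability of death of any kind on the trip
P_TERROR_GIVEN_DEATH = 0.05  # fraction of those deaths due to a terrorist attack

trials = 100_000
any_death = 0
terror_death = 0
for _ in range(trials):
    if random.random() < P_ANY_DEATH:
        any_death += 1                    # death of any kind
        if random.random() < P_TERROR_GIVEN_DEATH:
            terror_death += 1             # also a death in a terrorist attack

# The specific event is a subset of the general one, so it can never
# occur more often: P(terror death) <= P(any death).
assert terror_death <= any_death
```

Whatever probabilities you plug in, the assertion holds, because the count of terrorist-attack deaths is only ever incremented inside the branch that already counted a death of any kind.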
Interestingly, Kahneman discovered in earlier experiments that statistical sophistication made little difference in the rates at which people committed the conjunction fallacy. This suggests that it is not enough to teach probability theory alone, but that people need to learn directly about the conjunction fallacy in order to counteract the strong psychological effect of imaginability.
As evolutionary scientists, we devote much of our working lives to exploring the behaviour of humans and other animals through an evolutionary lens. So it may come as a surprise that our show at this year’s Edinburgh Fringe is named Alas, Poor Darwin …?, borrowing from one of the most searing critiques of evolutionary psychology ever written. We’ve added a question mark, but still – it’s no simple tale of how our minds evolved.
Evolutionary theory is a bit like a chocolate ice cream in the hands of a two-year old: it’s going to get applied everywhere, but will anything useful be achieved in the process? The central tenets of Darwinian theory – variability, heredity and selection – are as beautiful as they are compelling. They completely revolutionised biology.
But applying these principles to the study of human behaviour has caused far more controversy. The evolutionary explanations for human behaviour that grab the headlines can often be neat; really neat – like tightly-plotted narratives in which everything works out perfectly in the end, usually with a guy getting a girl, where everything happens for a reason.
Real life rarely makes for such a neat story. We’ve all seen enough action movies to notice that the more satisfying the ending, the more plot holes you have to ignore as you walk out of the cinema. Neatness makes a good story, but it’s not enough for good science.
Ovulation meets evolution
One good example of this problem is the story of how women’s preferences for masculine male partners shift throughout the menstrual cycle in a strategic way. It goes like this: at the time of ovulation, when “good genes” are most important, women are attracted to more masculine men. For the rest of the menstrual cycle, when faithfulness and cooperation are paramount, the opposite is true (we’re glossing over some subtleties).
In a similar vein, there’s an elegant account of male violence. It says that men are more likely than women to behave aggressively everywhere in the world because in the Pleistocene epoch (roughly 2.6m to 12,000 years ago), humans had a polygynous mating system, meaning one man mating with several women. The men who succeeded in aggressive competition with other men had more partners, and therefore more children, and so more of their genes got passed on.
These stories prompt some awkward questions. For example, does a change in women’s attraction have to be directly selected for? Could it be the by-product of some other evolutionary process? Can we be sure that the preferences reported in the lab by female undergraduates in 2015 are a good proxy for the real-life choices made by women 100,000 years ago? What evidence is there that our ancestors were polygynous? What selection pressures were acting on women while the men were all busy fighting? (Women’s genes also get passed on to their children, in case anyone had forgotten.)
You begin to find that very accomplished scientists who know an awful lot about evolution and human behaviour disagree. Vociferously. And there’s a good reason for this: they’re scientists. Destruction-testing of ideas is very much in the job spec.
The reality of scientific enquiry
In our own work we don’t generally find neat, satisfying stories that are easy to tell, hard to critique, and make everything fall into place. We tend to end up with tantalising hypotheses, really interesting ideas that might be true but we haven’t quite gathered the data to nail down beyond all doubt. We find theories that are dazzling in their elegance but multitudinous in their caveats.
We find that the mind steadfastly refuses to behave like a collection of perfectly adapted units, each with a single function that afforded a clear evolutionary advantage at some weirdly specific yet curiously under-specified time during human evolutionary history. Instead the human mind seems to be full of compromises and by-products, highly flexible, and intricately intertwined with this weird thing called “human culture”.
Yet having been drawn to evolutionary science for its extraordinary elegance and having found a thousand times more questions than satisfactory answers, we persist. Because if you expand your ideas about what “evolutionary” means – if you cease looking for the neat stories and embrace the fact that it’s going to get very, very messy, you can start to get somewhere really interesting.
Culture and evolution are not opposites. Evolved doesn’t have to mean adaptation. It might or might not mean “useful under some circumstances”. (It certainly doesn’t mean – and has never meant – good or right).
Refuting one evolutionary hypothesis about human behaviour doesn’t invalidate all of them. That would be like saying that evolutionary theory is felled by the old question, “But if we evolved from monkeys, why are there still monkeys?”
Arguing about the how, when and why isn’t a sign of science denialism, nor a reason to scrap the whole line of investigation – it’s healthy disagreement and we’d like to see more of it. Being an evolutionary scientist is a bit like being Dirk Gently: you might not get where you were hoping to go, but you’ll probably end up somewhere it’s worth being.
Kate and Lewis’s show, Alas Poor Darwin …?, part of the Cabaret of Dangerous Ideas, is taking place at the Edinburgh Fringe on August 16
All first year students at the University of Technology Sydney could soon be required to take a compulsory maths course in an attempt to give them some numerical thinking skills.
The new course would be an elective next year and mandatory in 2016, with the university’s deputy vice-chancellor for education and students, Shirley Alexander, saying the aim is to give students some maths “critical thinking” skills.
This is a worthwhile goal, but what about critical thinking in general?
Most tertiary institutions have listed among their graduate attributes the ability to think critically. This seems a desirable outcome, but what exactly does it mean to think critically and how do you get students to do it?
The problem is that critical thinking is the Cheshire Cat of educational curricula – it is hinted at in all disciplines but appears fully formed in none. As soon as you push to see it in focus, it slips away.
If you ask curriculum designers exactly how critical thinking skills are developed, the answers are often vague and unhelpful for those wanting to teach it.
This is partly because of a lack of clarity about the term itself and because there are some who believe that critical thinking cannot be taught in isolation, that it can only be developed in a discipline context – after all, you have to think critically about something.
So what should any mandatory first year course in critical thinking look like? There is no single answer to that, but let me suggest a structure with four key areas:

1. argumentation
2. logic
3. psychology
4. the nature of science.
I will then explain that these four areas are bound together by a common language of thinking and a set of critical thinking values.
1. Argumentation

The most powerful framework for learning to think well in a manner that is transferable across contexts is argumentation.
Arguing, as opposed to simply disagreeing, is the process of intellectual engagement with an issue and an opponent with the intention of developing a position justified by rational analysis and inference.
Arguments have premises, those things that we take to be true for the purposes of the argument, and conclusions or end points that are arrived at by inferring from the premises.
Understanding this structure allows us to analyse the strength of an argument by assessing the likelihood that the premises are true or by examining how the conclusion follows from them.
Arguments in which the conclusion follows logically from the premises are said to be valid. Valid arguments with true premises are called sound. The definitions of invalid and unsound follow.
This gives us a language with which to frame our position and the basic structure of why it seems justified.
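The validity test described above can be mechanised for simple propositional argument forms. The sketch below (the function name and encoding are my own, not from the article) checks validity by brute force over a truth table: an argument form is valid exactly when no assignment of truth values makes all the premises true and the conclusion false.

```python
from itertools import product

def is_valid(premises, conclusion, n_vars):
    """Return True if no truth assignment makes all premises true
    while the conclusion is false (i.e. the form is valid)."""
    for values in product([True, False], repeat=n_vars):
        if all(p(*values) for p in premises) and not conclusion(*values):
            return False  # found a counterexample
    return True

# Modus ponens: from "P implies Q" and "P", infer "Q" — a valid form.
mp = is_valid([lambda p, q: (not p) or q, lambda p, q: p],
              lambda p, q: q, 2)

# Affirming the consequent: from "P implies Q" and "Q", infer "P" — invalid.
ac = is_valid([lambda p, q: (not p) or q, lambda p, q: q],
              lambda p, q: p, 2)

print(mp, ac)  # True False
```

Note that the checker only tests validity: whether the premises are actually true (and hence whether a valid argument is also sound) is a separate, empirical question.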
2. Logic

Logic is fundamental to rationality. It is difficult to see how you could value critical thinking without also embracing logic.
People generally speak of formal logic – basically the logic of deduction – and informal logic, which is often associated with induction.
Deduction is most of what goes on in mathematics or Sudoku puzzles, while induction is usually about generalising or analogising and is integral to the processes of science.
Using logic in a flawed way leads to fallacies of reasoning, famously including circular reasoning, the false cause fallacy and the appeal to popular opinion. Learning about this cognitive landscape is central to the development of effective thinking.
3. Psychology

The messy business of our psychology – how our minds actually work – is another necessary component of a solid critical thinking course.
One of the great insights of psychology over the past few decades is the realisation that thinking is not so much something we do, as something that happens to us. We are not as in control of our decision-making as we think we are.
We are masses of cognitive biases as much as we are rational beings. This does not mean we are flawed, it just means we don’t think in the nice, linear way that educators often like to think we do.
It is a mistake to think of our minds as just running decision-making algorithms – we are much more complicated than that.
How we arrive at conclusions, form beliefs and process information is organic and idiosyncratic. We are not just clinical truth-seeking reasoning machines.
Our thinking is also about our prior beliefs, our values, our biases and our desires.
4. The nature of science
It is useful to equip students with some understanding of the general tools of evaluating information that have become ubiquitous in our society. Two that come to mind are the nature of science and statistics.
Learning the differences between hypotheses, theories and laws, for example, can help people understand why science has credibility without having to teach them what a molecule is, or about Newton’s laws of motion.
Understanding some basic statistics also goes a long way to making students feel more empowered to tackle difficult or complex issues. It’s not about mastering the content, but about understanding the process.
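As one illustration of the kind of basic statistics that empowers this sort of thinking, consider the classic base-rate problem (the numbers below are invented for the sketch): even a fairly accurate test for a rare condition produces mostly false positives, which a few lines of arithmetic make obvious.

```python
# Illustrative numbers only: a condition affecting 1 in 1,000 people,
# with a test that detects it 99% of the time and gives a false
# positive 1% of the time.
prevalence = 0.001
sensitivity = 0.99      # P(positive | condition)
false_positive = 0.01   # P(positive | no condition)

# Total probability of a positive result, then Bayes' rule.
p_positive = sensitivity * prevalence + false_positive * (1 - prevalence)
p_condition_given_positive = sensitivity * prevalence / p_positive

print(round(p_condition_given_positive, 3))  # ≈ 0.09
```

Despite the test being “99% accurate”, only about 9% of people who test positive actually have the condition, because the condition is so rare. Grasping this is a matter of process, not content.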
The language of thinking
Embedded within all of this is the language of our thinking. The cognitive skills – such as inferring, analysing, evaluating, justifying, categorising and decoding – are all the things that we do with knowledge.
If we can talk to students using these terms, with a full understanding of what they mean and how they are used, then teaching thinking becomes like teaching a physical process such as a sport, in which each element can be identified, polished, refined and optimised.
In much the same way that a javelin coach can freeze a video and talk to an athlete about their foot positioning or centre of balance, a teacher of critical thinking can use the language of cognition to interrogate a student’s thinking in high resolution.
All of these potential aspects of a critical thinking course can be taught outside any discipline context. General knowledge, topical issues and media provide a mountain of grist for the cognitive mill.
General concepts of argumentation and logic are readily transferable between contexts once students are taught to recognise the deeper structures inherent in these fields and to apply them across a variety of situations.
It’s worth understanding too that a good critical thinking education is also an education in values.
Not all values are ethical in nature. In thinking well we value precision, coherence, simplicity of expression, logical structure, clarity, perseverance, honesty in representation and any number of like qualities. If schools are to teach values, why not teach the values of effective thinking?
So, let’s not assume that students will learn to think critically just by learning the methodology of their subjects. Sure, it will help, but it’s not an explicit treatment of thinking and is therefore less transferable.
A course that targets effective thinking need not detract from other subjects – in fact it should enhance performance across the board.
But ideally, such a course should not be needed if teachers of all subjects focused on the thinking of their students as well as the content they have to cover.