
The uselessness of cynicism

By Tim Harding and ChatGPT-4

Contemporary cynicism has quite a different meaning from the Classical Greek school of philosophical thought of that name. The modern meaning is an attitude of distrust toward claimed ethical and social values and a rejection of the need to be socially involved.

Nowadays, cynicism, though often mistaken for a wise and pragmatic worldview, can paradoxically become a self-defeating and unproductive attitude. At its core, cynicism is characterized by a general suspicion of others’ motives and a belief that people are primarily motivated by self-interest. While it might seem like a protective mechanism or a sign of intellectual sophistication, cynicism can lead to several negative consequences, both for individuals and for society.

Firstly, cynicism fosters a sense of helplessness and passivity. By assuming the worst in people and situations, cynics often feel that any effort to effect change is futile. This mindset can lead to a lack of engagement with important social, political, and environmental issues. When individuals adopt a cynical attitude, they are less likely to participate in community activities, vote, or engage in discussions that could lead to positive change. The collective impact of widespread cynicism can be a society that is less active, less democratic, and more apathetic towards critical issues.

Secondly, cynicism can hinder personal and professional relationships. Human connections thrive on trust, empathy, and mutual respect. A cynical approach to relationships, characterized by suspicion and a lack of trust, can erode the foundations necessary for strong, healthy relationships. In the workplace, this can manifest as a lack of teamwork and cooperation, stifling creativity and productivity. In personal relationships, it can lead to isolation and loneliness, as others may find it challenging to connect with someone who consistently doubts their intentions.

Furthermore, cynicism can be detrimental to one’s mental health. A persistent negative outlook on life can lead to increased stress, anxiety, and depression. The belief that nothing will ever improve or that all people are inherently selfish can create a sense of hopelessness, impacting an individual’s overall well-being and quality of life.

It’s important to differentiate cynicism from skepticism and critical thinking. Skepticism is a more philosophical and neutral stance where one questions or doubts accepted opinions, beliefs, or claims. Skeptics do not necessarily believe that people are inherently deceitful or self-interested; rather, they require sufficient evidence or reasoning before accepting a claim as true. Skepticism is often seen as a critical thinking tool, encouraging inquiry and the suspension of judgment in the absence of adequate evidence. Critical thinking involves analyzing and evaluating an idea or a situation to form a reasoned judgment. It is constructive, whereas cynicism is often destructively dismissive. Critical thinking leads to growth and improvement; cynicism often leads to stagnation.

In conclusion, while cynicism might appear as a shield against disappointment and deceit, it often becomes a barrier to personal growth, healthy relationships, and societal progress. Adopting a more balanced view, one that involves critical thinking but also an openness to the possibility of good in others and in situations, can lead to a more fulfilling, productive, and positive life experience. By overcoming cynicism, individuals can contribute to creating a more cooperative, trusting, and proactive society.


How do you know that what you know is true? That’s epistemology

The Conversation


Peter Ellerton, The University of Queensland

How do you know what the weather will be like tomorrow? How do you know how old the Universe is? How do you know if you are thinking rationally?

These and other questions of the “how do you know?” variety are the business of epistemology, the area of philosophy concerned with understanding the nature of knowledge and belief.

Epistemology is about understanding how we come to know that something is the case, whether it be a matter of fact such as “the Earth is warming” or a matter of value such as “people should not just be treated as means to particular ends”.

It’s even about interrogating the odd presidential tweet to determine its credibility.


Read more: Facts are not always more important than opinions: here’s why


Epistemology doesn’t just ask questions about what we should do to find things out; that is the task of all disciplines to some extent. For example, science, history and anthropology all have their own methods for finding things out.

Epistemology has the job of making those methods themselves the objects of study. It aims to understand how methods of inquiry can be seen as rational endeavours.

Epistemology, therefore, is concerned with the justification of knowledge claims.

The need for epistemology

Whatever the area in which we work, some people imagine that beliefs about the world are formed mechanically from straightforward reasoning, or that they pop into existence fully formed as a result of clear and distinct perceptions of the world.

But if the business of knowing things was so simple, we’d all agree on a bunch of things that we currently disagree about – such as how to treat each other, what value to place on the environment, and the optimal role of government in a society.

That we do not reach such an agreement means there is something wrong with that model of belief formation.


It is interesting that we individually tend to think of ourselves as clear thinkers and see those who disagree with us as misguided. We imagine that the impressions we have about the world come to us unsullied and unfiltered. We think we have the capacity to see things just as they really are, and that it is others who have confused perceptions.

As a result, we might think our job is simply to point out where other people have gone wrong in their thinking, rather than to engage in rational dialogue allowing for the possibility that we might actually be wrong.

But the lessons of philosophy, psychology and cognitive science teach us otherwise. The complex, organic processes that fashion and guide our reasoning are not so clinically pure.

Not only are we in the grip of a staggeringly complex array of cognitive biases and dispositions, but we are generally ignorant of their role in our thinking and decision-making.

Combine this ignorance with the conviction of our own epistemic superiority, and you can begin to see the magnitude of the problem. Appeals to “common sense” to overcome the friction of alternative views just won’t cut it.

We need, therefore, a systematic way of interrogating our own thinking, our models of rationality, and our own sense of what makes for a good reason. Such a framework can then serve as a more objective standard for assessing the merit of claims made in the public arena.

This is precisely the job of epistemology.

Epistemology and critical thinking

One of the clearest ways to understand critical thinking is as applied epistemology. Issues such as the nature of logical inference, why we should accept one line of reasoning over another, and how we understand the nature of evidence and its contribution to decision making, are all decidedly epistemic concerns.

Just because people use logic doesn’t mean they are using it well.

The American philosopher Harvey Siegel points out that these questions and others are essential in an education towards thinking critically.

By what criteria do we evaluate reasons? How are those criteria themselves evaluated? What is it for a belief or action to be justified? What is the relationship between justification and truth? […] these epistemological considerations are fundamental to an adequate understanding of critical thinking and should be explicitly treated in basic critical thinking courses.

To the extent that critical thinking is about analysing and evaluating methods of inquiry and assessing the credibility of resulting claims, it is an epistemic endeavour.

Engaging with deeper issues about the nature of rational persuasion can also help us to make judgements about claims even without specialist knowledge.

For example, epistemology can help clarify concepts such as “proof”, “theory”, “law” and “hypothesis” that are often poorly understood by the general public and indeed by some scientists.

In this way, epistemology serves not to adjudicate on the credibility of science, but to better understand its strengths and limitations and hence make scientific knowledge more accessible.

Epistemology and the public good

One of the enduring legacies of the Enlightenment, the intellectual movement that began in Europe during the 17th century, is a commitment to public reason. This was the idea that it’s not enough to state your position, you must also provide a rational case for why others should stand with you. In other words, to produce and prosecute an argument.


Read more: How to teach all students to think critically


This commitment provides for, or at least makes possible, an objective method of assessing claims using epistemological criteria that we can all have a say in forging.

That we test each other’s thinking and collaboratively arrive at standards of epistemic credibility lifts the art of justification beyond the limitations of individual minds, and grounds it in the collective wisdom of reflective and effective communities of inquiry.

The sincerity of one’s belief, the volume or frequency with which it is stated, or assurances to “believe me” should not be rationally persuasive by themselves.

Simple appeals to believe have no place in public life.

If a particular claim does not satisfy publicly agreed epistemological criteria, then it is the essence of scepticism to suspend belief. And it is the essence of gullibility to surrender to it.

A defence against bad thinking

There is a way to help guard against poor reasoning – ours and others’ – that draws from not only the Enlightenment but also from the long history of philosophical inquiry.

So the next time you hear a contentious claim from someone, consider how that claim can be supported if they or you were to present it to an impartial or disinterested person:

  • identify reasons that can be given in support of the claim
  • explain how your analysis, evaluation and justification of the claim and of the reasoning involved are of a standard worth someone’s intellectual investment
  • write these things down as clearly and dispassionately as possible.

In other words, make the commitment to public reasoning. And demand of others that they do so as well, stripped of emotive terms and biased framing.

If you or they cannot provide a precise and coherent chain of reasoning, or if the reasons remain tainted with clear biases, or if you give up in frustration, it’s a pretty good sign that there are other factors in play.

It is the commitment to this epistemic process, rather than any specific outcome, that is the valid ticket onto the rational playing field.

At a time when political rhetoric is riven with irrationality, when knowledge is being seen less as a means of understanding the world and more as an encumbrance that can be pushed aside if it stands in the way of wishful thinking, and when authoritarian leaders are drawing ever larger crowds, epistemology needs to matter.

Peter Ellerton, Lecturer in Critical Thinking, Director of the UQ Critical Thinking Project, The University of Queensland

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


The problem of false balance when reporting on science

The Conversation

Peter Ellerton, The University of Queensland

How do you know the people billed as science experts that you see, hear and read about in the media are really all that credible? Or have they been included just to create a perception of balance in the coverage of an issue?

It’s a problem for any media outlet, and something the BBC Trust is trying to address in its latest report on science impartiality in programming.

As part of ongoing training, staff, particularly in non-news programs, were told that impartiality is not just about including a wide range of views on an issue, as this can lead to a “false balance”. This is the process of providing a platform for people whose views do not accord with established or dominant positions simply for the sake of seeming “balanced”.

The BBC has been criticised before for “false balance” and there are reports now that certain climate change sceptics are banned from BBC News, although this is denied by the BBC.

It’s understandable that such false balance could grow from a desire to seem impartial, and particularly so since public broadcasters such as the BBC and the ABC in Australia are sensitive to claims of imbalance or bias.

Couple this with the need to negotiate the difficult ground of expert opinion, authentic balance and audience expectation, not to mention the always delicate tension between the imperatives of news and entertainment, and it hardly seems surprising that mistakes are made. An investigation this year found the ABC breached its own impartiality standards in last year’s Catalyst program on statins and heart disease.

Finding the right balance

How then can journalists decide the best way to present a scientific issue to ensure accurate representation of the views of the community of experts? Indeed, how can any of us determine if what we are seeing in the media is balanced or a misrepresentation of expert opinion?


As I have written elsewhere, it is important to not confuse the right to be heard with an imagined right to be taken seriously. If an idea fails to survive in the community of experts, its public profile should diminish in proportion to its failure to generate consensus within that community.

A common reply to this is that science isn’t about consensus, it’s about the truth. This is so, but to use a consensus as evidence of error is fallacious reasoning.

While it’s true that some presently accepted notions have in the past been peripheral, the idea that simply being against the majority view equates to holding your intellectual ground in the best tradition of the Enlightenment is ludicrous.

If all views are equal, then all views are worthless.

Were I to propose an idea free of testing or argument, I could not reasonably expect my idea to be as credible as those subject to rigorous experimentation and collaborative review. If such equality did exist then progress would be impossible, since progress is marked by the testing and rejection of ideas.

Defining an expert

In the case of science, this testing is the process of experimentation, data analysis and peer review. So if someone – scientist or otherwise – has not worked and published in an area, then they are not an expert in that area.

The first imperative for a journalist covering any story is to determine exactly in what field the issue best sits and then to seek advice from people who work and publish in that field.

Knowing how the issue fits into the broader picture of scientific investigation is very useful in determining this. It is one of the reasons that good science journalism follows from having journalists with some training in science.

Such a selection process, performed transparently, is an excellent defence against charges of bias.

Avoiding false balance

False balance can also be created by assuming that a person from outside the field (a non-expert) will somehow have a perspective that will shed light on an issue, that the real expert is too “caught up in the details” to be objective.

But suggesting that an expert is naive usually indicates an attempt at discrediting rather than truth seeking. Credibility is more about process than authority, and to be a recognised expert is to work within the process of science.

Also, if a piece of science is being criticised, we should ask if the criticism itself has been published. It’s not enough that someone with apparent authority casts doubt, as this is simply an appeal to authority – an appeal that critics of mainstream science themselves use as a warrant to reject consensus.

A second journalistic imperative would be to recognise that not all issues are binary.


The metaphor that a coin has two sides is a powerful one, and the temptation to look at both sides of an issue is naturally strong. But the metaphor also assumes an equal weighting, and that both sides present the same space for discussion.

Proof and evidence

When an issue is genuinely controversial, the burden of proof is shared between opposing views. When a view is not mainstream (say, that scientists are engaged in a conspiracy to defraud the public), the burden of proof sits with those promoting that view.

In such cases, as Christopher Hitchens succinctly put it:

What can be asserted without evidence can also be dismissed without evidence.

Attempting to dishonestly shift the burden of proof is a common device in the push to have young earth creationism taught in science classrooms.

The idea of “teaching both sides” or that students should be allowed to make up their own minds seems again like a recourse to the most basic ideas of a liberal education, but is in reality an attempt to bypass expert consensus, to offload the burden of proof rather than own it.

The fact is that for issues such as creationism, vaccination and human-induced climate change, it’s not about journalists suppressing views; it’s about quality control of information.

Stay with the issue

A classic means of muddying the waters is to employ straw man arguments, in which the point at issue is changed to one more easily defended or better suited to a particular interest. Politicians are adept at doing this, dodging hard questions with statements like “the real issue is” or “what’s important to people is”.


Deniers of climate science often change the issue from global warming to whether or not consensus is grounds for acceptance (it alone is not, of course), or focus on whether a particular person is credible rather than discuss the literature at large.

The anti-vaccine lobby talks about “choice” rather than the efficacy of health care. Young earth creationists talk about the right to express all views rather than engage with the science. Politicians talk about anything except the question they were asked.

The third imperative, therefore, is to be very clear as to what the article or interview is about and stick to that topic. Moving off topic negates the presence of the experts (the desired effect) and gives unsubstantiated claims prominence.

The impartiality checklist

The best method of dealing with cranks, conspiracy theorists, ideologues and those with a vested interest in a particular outcome is the best method for science reporting in general:

  • insist on expertise
  • recognise where the burden of proof sits
  • stay focused on the point at issue.

If the media sticks to these three simple rules when covering science issues, impartiality and balance can be justifiably asserted.

Correction: This article was amended on July 17, 2014 to include a report of the BBC’s denial that a climate change sceptic was banned from the public broadcaster.

Peter Ellerton, Lecturer in Critical Thinking, The University of Queensland

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.

 


The truth, the whole truth and … wait, how many truths are there?

The Conversation

Peter Ellerton, The University of Queensland

Calling something a “scientific truth” is a double-edged sword. On the one hand it carries a kind of epistemic (how we know) credibility, a quality assurance that a truth has been arrived at in an understandable and verifiable way.

On the other, it seems to suggest science provides one of many possible categories of truth, all of which must be equal or, at least, non-comparable. Simply put, if there’s a “scientific truth” there must be other truths out there. Right?

Let me answer this by reference to the fingernail-on-the-chalkboard phrase I’ve heard a little too often:

“But whose truth?”

If somebody uses this phrase in the context of scientific knowledge, it shows me they’ve conflated several incompatible uses of “truth” with little understanding of any of them.

As is almost always the case, clarity must come before anything else. So here is the way I see truth, shot from the hip.


While philosophers talk about the coherence or correspondence theories of truth, the rest of us have to deal with another, more immediate, division: subjective, deductive (logical) and inductive (in this case, scientific) truth.

This has to do with how we use the word and is a very practical consideration. Just about every problem a scientist or science communicator comes across in the public understanding of “truth” is a function of mixing up these three things.

Subjective truth

Subjective truth is what is true about your experience of the world. How you feel when you see the colour red, what ice-cream tastes like to you, what it’s like being with your family, all these are your experiences and yours alone.

In 1974 the philosopher Thomas Nagel published a now-famous paper about what it might be like to be a bat. He points out that even the best chiropterologist in the world, knowledgeable about the mating, eating, breeding, feeding and physiology of bats, has no more idea of what it is like to be a bat than you or me.

Similarly, I have no idea what a banana tastes like to you, because I am not you and cannot ever be in your head to feel what you feel (there are arguments regarding common physiology and hence psychology that could suggest similarities in subjective experiences, but these are presently beyond verification).

What’s more, if you tell me your favourite colour is orange, there are absolutely no grounds on which I can argue against this – even if I felt inclined. Why would I want to argue, and what would I hope to gain? What you experience is true for you, end of story.

Deductive truth

Deductive truth, on the other hand, is that contained within and defined by deductive logic. Here’s an example:

Premise 1: All Gronks are green.
Premise 2: Fred is a Gronk.
Conclusion: Fred is green.

Even if we have no idea what a Gronk is, the conclusion of this argument is true if the premises are true. If you think this isn’t the case, you’re wrong. It’s not a matter of opinion or personal taste.


If you want to argue the case, you have to step out of the logical framework in which deductive logic operates, and this invalidates rational discussion. We might be better placed using the language of deduction and just call it “valid”, but “true” will do for now.

In my classes on deductive logic we talk about truth tables, truth trees, and use “true” and “false” in every second sentence and no one bats (cough) an eyelid, because we know what we mean when we use the word.
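To see what those truth tables establish, here is a small sketch (my addition, not part of the original article) that renders the Gronk argument propositionally: let P be “Fred is a Gronk” and Q be “Fred is green”, so the first premise becomes the conditional P → Q. Brute-forcing every truth-value assignment shows the form ((P → Q) and P) → Q holds on every row, which is what logicians mean by calling the inference valid:

```python
from itertools import product

def implies(a, b):
    """Material implication: 'a -> b' is false only when a is true and b is false."""
    return (not a) or b

# Enumerate every truth-value assignment for P ("Fred is a Gronk")
# and Q ("Fred is green"), and evaluate ((P -> Q) and P) -> Q.
for p, q in product([True, False], repeat=2):
    row = implies(implies(p, q) and p, q)
    print(f"P={p!s:5} Q={q!s:5} -> {row}")
    assert row  # true on every row: the argument form is a tautology
```

Note that validity is a property of the form alone: the code never needs to know what a Gronk is, just as the article says.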

Using “true” in science, however, is problematic for much the same reason that using “prove” is problematic (and I have written about that on The Conversation before). This is a function of the nature of inductive reasoning.

Inductive truth

Induction works mostly through analogy and generalisation. Unlike deduction, it allows us to draw justified conclusions that go beyond the information contained in the premise. It is induction’s reliance on empirical observation that separates science from mathematics.

In observing one phenomenon occurring in conjunction with another – an electric current and an induced magnetic field, for instance – I generalise that this will always be so. I might even create a model, an analogy of the workings of the real world, to explain it – in this case that of particles and fields.

This then allows me to predict what future events might occur or to draw implications and create technologies, such as developing an electric motor.

And so I inductively scaffold my knowledge, using information I rely upon as a resource for further enquiry. At no stage do I arrive at deductive certainty, but I do enjoy greater degrees of confidence.
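One way to picture this growing confidence (a hypothetical illustration of mine, not a claim from the article) is Bayesian updating: each confirming observation raises the probability assigned to a generalisation without ever delivering deductive certainty. The probabilities below are invented for the sketch:

```python
def bayes_update(prior, p_obs_if_true, p_obs_if_false):
    """Posterior probability of a hypothesis after one confirming observation."""
    evidence = p_obs_if_true * prior + p_obs_if_false * (1 - prior)
    return p_obs_if_true * prior / evidence

confidence = 0.5  # start agnostic about the generalisation
for observation in range(5):
    # Assume the observation is likely if the generalisation holds (0.9)
    # and unlikely, but possible, if it doesn't (0.3).
    confidence = bayes_update(confidence, 0.9, 0.3)
    print(f"after observation {observation + 1}: {confidence:.3f}")

# Confidence climbs toward 1 but never reaches it:
# inductive support, not deductive proof.
```

The asymptote is the point: no finite run of confirmations turns an inductive generalisation into a deductive truth.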

I might even speak about things being “true”, but, apart from simple observational statements about the world, I use the term as a manner of speech only to indicate my high level of confidence.

Now, there are some philosophical hairs to split here, but my point is not to define exactly what truth is, but rather to say there are differences in how the word can be used, and that ignoring or conflating these uses leads to a misunderstanding of what science is and how it works.

For instance, the lady who said to me that it was true for her that ghosts exist was conflating a subjective truth with a truth about the external world.

I asked her if what she really meant was “it is true that I believe ghosts exist”. At first she was resistant, but when I asked her if it could be true for her that gravity is repulsive, she was obliging enough to accept my suggestion.


Such is the nature of many “it’s true for me” statements, in which the epistemic validity of a subjective experience is misleadingly extended to facts about the world.

Put simply, it smears the meaning of truth so much that the distinctions I have outlined above disappear, as if “truth” only means one thing.

This is generally done with the intent of presenting the unassailable validity of said subjective experiences as a shield for dubious claims about the external world – claiming that homeopathy works “for me”, for instance. Attacking the truth claim is then, if you accept this deceit, equivalent to questioning the genuine subjective experience.

Checkmate … unless you see how the rules have been changed.

It has been a long and painful struggle for science to rise from this cognitive quagmire, separating out subjective experience from inductive methodology. Any attempt to reunite them in the public understanding of science needs immediate attention.

Operating as it should, science doesn’t spend its time just making truth claims about the world, nor does it question the validity of subjective experience – it simply says that subjective experience is not enough to ground objective claims that anyone else should believe.

Subjective truths and scientific truths are different creatures, and while they sometimes play nicely together, their offspring are not always fertile.

So next time you are talking about truth in a deductive or scientifically inductive way and someone says “but whose truths”, tell them a hard one: it’s not all about them.

Peter Ellerton, Lecturer in Critical Thinking, The University of Queensland

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Philosophy in schools: promoting critical, creative and caring thinking

The Conversation

Laura D’Olimpio, University of Notre Dame Australia

It has been a good week for philosophy. The results of a year-long study on the benefits of teaching philosophy to primary-school-aged children have just been published in the UK, and the reports are positive. The project was delivered by SAPERE, funded by the Education Endowment Foundation, and independently evaluated by a team at Durham University.

Philosophy for Children (P4C) started in the 1970s in order to encourage children to think for themselves. The term ‘P4C’ was coined by Matthew Lipman who wanted to encourage reasonableness in citizens, and figured the best way to do so was to teach philosophical thinking skills from an early age. I have previously written about why children should study philosophy.

Supporters of P4C believe that philosophy need not be confined to the domain of the academy; rather, children from age 3 upwards (years K–12) are capable of critical, creative and caring thinking. It is argued that these thinking skills will create reasonable and democratic citizens. The current research lends support to this claim.

A few days ago the academics from Durham University wrote a piece for The Conversation. Examining the results of students who participated in one P4C class per week over the course of a school year, the researchers discovered that:

The gain in reading for the pupils on the programme… was around two months of extra progress. There was even some evidence of greater improvement in reasoning skills, with about one month’s extra progress for those children who did the philosophy programme. For both reading and reasoning, the results were more pronounced for those children eligible for free school meals.

The UK newspapers have been excited by these results, and reports on the study have appeared in The Guardian, The Times and The Independent. The results of the study were also reported by the BBC, who write that “weekly philosophy sessions in class can boost primary school pupils’ ability in maths and literacy” and then go on to note that, “crucially, they seem to work especially well for the children who are most disadvantaged.”

Teachers of P4C use various techniques, with the Community of Inquiry (CoI) as a central pedagogy. The CoI is based on democratic student led discussions where the teacher acts as a facilitator of philosophical dialogue. Participating in a CoI helps students to see themselves as belonging to a community of lifelong learners.

The CoI encourages students to ask their own questions and explore diverse opinions in an effort to reach a shared truth. This democratic process often results in pluralism, but is designed to eliminate unreasonable ideas along the way. It does this by asking participants for reasons and evidence to support all claims.

The social and behavioural improvements that accompany P4C have been noted by teachers, who see students gain confidence and tolerance when they are able to ask their own questions and hear perspectives that differ from their own. By practising listening to different ideas that they are then able to discuss critically, students become better able to deal with disagreements and conflict. Schools that have adopted a whole-school P4C approach, like Buranda primary school in Queensland, have found there is less bullying in the school playground.

In our global world, there is more need than ever before to engage with ideas critically as well as with empathy. We are facing universal moral questions such as “what should we do about climate change?” that will affect future generations. Teaching these future generations how to think critically, creatively and collaboratively will surely improve their resourcefulness and problem solving abilities. For all of these reasons, teaching philosophy to children is a brilliant idea.

Laura D’Olimpio is Lecturer in Philosophy at the University of Notre Dame Australia.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


How to teach all students to think critically

The Conversation

By Peter Ellerton, The University of Queensland

All first year students at the University of Technology Sydney could soon be required to take a compulsory maths course in an attempt to give them some numerical thinking skills.

The new course would be an elective next year and mandatory in 2016. The university’s deputy vice-chancellor for education and students, Shirley Alexander, says the aim is to give students some maths “critical thinking” skills.

This is a worthwhile goal, but what about critical thinking in general?

Most tertiary institutions have listed among their graduate attributes the ability to think critically. This seems a desirable outcome, but what exactly does it mean to think critically and how do you get students to do it?

The problem is that critical thinking is the Cheshire Cat of educational curricula – it is hinted at in all disciplines but appears fully formed in none. As soon as you push to see it in focus, it slips away.

If you ask curriculum designers exactly how critical thinking skills are developed, the answers are often vague and unhelpful for those wanting to teach it.

This is partly because of a lack of clarity about the term itself and because there are some who believe that critical thinking cannot be taught in isolation, that it can only be developed in a discipline context – after all, you have to think critically about something.

So what should any mandatory first year course in critical thinking look like? There is no single answer to that, but let me suggest a structure with four key areas:

  1. argumentation
  2. logic
  3. psychology
  4. the nature of science.

I will then explain that these four areas are bound together by a common language of thinking and a set of critical thinking values.

1. Argumentation

The most powerful framework for learning to think well in a manner that is transferable across contexts is argumentation.

Arguing, as opposed to simply disagreeing, is the process of intellectual engagement with an issue and an opponent with the intention of developing a position justified by rational analysis and inference.

Arguing is not just contradiction.

Arguments have premises, those things that we take to be true for the purposes of the argument, and conclusions or end points that are arrived at by inferring from the premises.

Understanding this structure allows us to analyse the strength of an argument by assessing the likelihood that the premises are true or by examining how the conclusion follows from them.

Arguments in which the conclusion follows logically from the premises are said to be valid. Valid arguments with true premises are called sound. The definitions of invalid and unsound follow.
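These definitions have a direct computational reading: in propositional logic, an argument is valid exactly when no assignment of truth values makes every premise true while leaving the conclusion false. The brute-force sketch below (my illustration, not from the original article) checks this by enumerating all assignments:

```python
from itertools import product

def is_valid(premises, conclusion, variables):
    """An argument is valid if no truth assignment makes
    every premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(variables)):
        env = dict(zip(variables, values))
        if all(p(env) for p in premises) and not conclusion(env):
            return False  # counterexample found: argument is invalid
    return True

# Modus ponens: "If P then Q; P; therefore Q" -- valid.
mp_premises = [lambda e: (not e["P"]) or e["Q"], lambda e: e["P"]]
print(is_valid(mp_premises, lambda e: e["Q"], ["P", "Q"]))  # True

# Affirming the consequent: "If P then Q; Q; therefore P" -- invalid.
ac_premises = [lambda e: (not e["P"]) or e["Q"], lambda e: e["Q"]]
print(is_valid(ac_premises, lambda e: e["P"], ["P", "Q"]))  # False
```

Note that validity is purely about structure: the checker never asks whether the premises are actually true, which is the extra condition soundness adds.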

This gives us a language with which to frame our position and the basic structure of why it seems justified.

2. Logic

Logic is fundamental to rationality. It is difficult to see how you could value critical thinking without also embracing logic.

People generally speak of formal logic – basically the logic of deduction – and informal logic, which includes inductive reasoning.

Deduction is most of what goes on in mathematics or Sudoku puzzles; induction is usually about generalising or analogising, and is integral to the processes of science.

Using logic in a flawed way leads to fallacies of reasoning, such as circular reasoning, the false-cause fallacy or the appeal to popular opinion. Learning about this cognitive landscape is central to the development of effective thinking.

3. Psychology

The messy business of our psychology – how our minds actually work – is another necessary component of a solid critical thinking course.

One of the great insights of psychology over the past few decades is the realisation that thinking is not so much something we do, as something that happens to us. We are not as in control of our decision-making as we think we are.

We are masses of cognitive biases as much as we are rational beings. This does not mean we are flawed, it just means we don’t think in the nice, linear way that educators often like to think we do.

It is a mistake to think of our minds as just running decision-making algorithms – we are much more complicated and idiosyncratic than this.

How we arrive at conclusions, form beliefs and process information is organic and messy. We are not just clinical truth-seeking reasoning machines.

Our thinking is also about our prior beliefs, our values, our biases and our desires.

4. The nature of science

It is useful to equip students with some understanding of the general tools of evaluating information that have become ubiquitous in our society. Two that come to mind are the nature of science and statistics.

Learning about what the differences are between hypotheses, theories and laws, for example, can help people understand why science has credibility without having to teach them what a molecule is, or about Newton’s laws of motion.

Understanding some basic statistics also goes a long way to making students feel more empowered to tackle difficult or complex issues. It’s not about mastering the content, but about understanding the process.
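A classic example of the empowering kind of basic statistics is the base-rate problem. Using hypothetical numbers (my illustration, not from the original article): a test that is 99% accurate can still be wrong most of the time when the condition it screens for is rare.

```python
# Hypothetical numbers: a condition affects 1% of a population;
# a test detects it 99% of the time but also gives a false
# positive 5% of the time.
prevalence = 0.01
sensitivity = 0.99
false_positive_rate = 0.05

# Bayes' theorem: P(condition | positive test)
p_positive = (sensitivity * prevalence
              + false_positive_rate * (1 - prevalence))
p_condition_given_positive = sensitivity * prevalence / p_positive

print(round(p_condition_given_positive, 3))  # 0.167
```

Despite the “99% accurate” test, a positive result here means only about a 17% chance of actually having the condition – exactly the kind of counter-intuitive result that statistical literacy lets students reason through rather than guess at.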

The language of thinking

Embedded within all of this is the language of our thinking. The cognitive skills – such as inferring, analysing, evaluating, justifying, categorising and decoding – are all the things that we do with knowledge.

If we can talk to students using these terms, with a full understanding of what they mean and how they are used, then teaching thinking becomes like teaching a physical process such as a sport, in which each element can be identified, polished, refined and optimised.

Critical thinking can be studied and taught in part like physical processes. (Image: Flickr/Airman Magazine, CC BY-NC)

In much the same way that a javelin coach can freeze a video and talk to an athlete about their foot positioning or centre of balance, a teacher of critical thinking can use the language of cognition to interrogate a student’s thinking in high resolution.

All of these potential aspects of a critical thinking course can be taught outside any discipline context. General knowledge, topical issues and media provide a mountain of grist for the cognitive mill.

General concepts of argumentation and logic are readily transferable between contexts once students are taught to recognise the deeper structures inherent in these fields and to apply them across a variety of situations.

Values

It’s worth understanding too that a good critical thinking education is also an education in values.

Not all values are ethical in nature. In thinking well we value precision, coherence, simplicity of expression, logical structure, clarity, perseverance, honesty in representation and any number of like qualities. If schools are to teach values, why not teach the values of effective thinking?

So, let’s not assume that students will learn to think critically just by learning the methodology of their subjects. Sure it will help, but it’s not an explicit treatment of thinking and is therefore less transferable.

A course that targets effective thinking need not detract from other subjects – in fact it should enhance performance across the board.

But ideally, such a course should not be needed if teachers of all subjects focused on the thinking of their students as well as the content they have to cover.

This article was originally published on The Conversation. (Reblogged with permission). Read the original article.


Francis Bacon on critical thinking

Francis Bacon, 1st Viscount St. Alban, QC (22 January 1561 – 9 April 1626), was an English philosopher, statesman, scientist, jurist, orator, essayist and author. He served as both Attorney General and Lord Chancellor of England. After his death, he remained extremely influential through his works, especially as a philosophical advocate and practitioner of the scientific method during the scientific revolution. (Note: Francis Bacon is not to be confused with Roger Bacon.)
