Monthly Archives: April 2015

Fifty years ago today, Menzies’ call on Vietnam changed Australia’s course

The Conversation

Nicholas Ferns, Monash University

In recent weeks, Australians have been exposed to an overwhelming amount of content in anticipation of the centenary of the Gallipoli landing. This has ranged from the thoughtful to the near-offensive. Lost in the dust of the ANZAC bandwagon has been another key anniversary of an event far less well known to Australians.

Today marks the 50th anniversary of Robert Menzies’ offer of an Australian battalion to the South Vietnamese government. Prior to the prime minister’s offer, Australia’s commitment to the Vietnamese conflict had been limited to fewer than 100 advisers. Menzies’ pledge significantly increased Australia’s involvement in the Vietnam War.


Robert Menzies meets the US defence secretary, Robert McNamara, at the Pentagon in 1964, the year before committing Australia to the escalating war. Wikimedia Commons/PHC/Ralph Seghers

What reasons were given for joining the war?

The circumstances of this offer are shrouded in controversy.

In 1965, Australia was involved in two crises in Southeast Asia, one in Vietnam and the other in Indonesia. The connection between the two was vital to Menzies’ decision to increase our involvement in Vietnam.

Having already committed a battalion to Malaysia to support resistance to the Konfrontasi policy of Indonesia’s Sukarno government, the logical next step for Menzies was to look to Vietnam. He did this with the support of his Cold War warrior and minister for external affairs, Paul Hasluck. They decided to send an Australian battalion to South Vietnam, partly to ensure continued American interest in the region.

A series of negotiations took place between Australian, American and Vietnamese representatives to secure acceptance of the Australian troops. The South Vietnamese government and the American ambassador in Saigon, Maxwell Taylor, were initially reluctant to receive more foreign troops. It seemed that the necessary South Vietnamese invitation wasn’t going to be forthcoming.

Fortunately for Menzies, the South Vietnamese government was persuaded to accept the Australian offer. A formal request was given just before Menzies made his speech in Parliament.

By this time it was late evening on Thursday, April 29. The Labor opposition leader, Arthur Calwell, and his deputy, Gough Whitlam, had already left Canberra for their home electorates. As many members of both sides had departed Canberra, Menzies made his announcement to a near-empty House of Representatives.

The reasons given to the Australian public were equally controversial.

In an oft-quoted part of his speech, Menzies claimed:

The takeover of South Vietnam would be a direct military threat to Australia and all the countries of South and South-East Asia. It must be seen as a part of a thrust by Communist China between the Indian and Pacific Oceans.

Seeking US protection from threats in Asia

The spectre of the mythical yellow hordes has been a constant theme in Australian history, used to instil xenophobia and a sense of need for a great and powerful protector. In Menzies’ case, it masked the true reasons for the offer: Australia’s fear of Sukarno’s Indonesia and the need for American security.

Around 100 Australian troops were already in Vietnam at the time of Menzies’ announcement. They were generally support and training troops, comprising the Australian Army Training Team Vietnam (AATTV). Within months Australian troop numbers had jumped to over 1000 and would peak at almost 7000 by 1969.

The Menzies government’s announcement threw Australian troops into a military “quagmire” of Washington’s making.

The dangers of forgetting

Though the troops’ service in the war is rightfully commemorated each Anzac Day, the circumstances surrounding Australia’s commitment of troops to Vietnam have been forgotten.

This collective forgetfulness is particularly relevant this year. In the current “much ado” about Gallipoli 100 years on, arguably a more significant anniversary has been forgotten. After Vietnam, Australian governments of both persuasions realised it was in the nation’s interest to engage with our “near north”.

After Vietnam, there would be no more racially based legislation, and governments began to engage with the region whether the ruling regime was communist or not. Governments have still managed to send troops into more recent Washington quagmires, but in numbers far smaller than the Vietnam commitment.

Menzies’ decision is the forgotten skeleton in the nation’s closet. This forgetfulness suggests a great deal not only about the current national “besottedness” with Gallipoli, but also concerning our collective unwillingness to confront less honourable aspects of our diplomatic and military history. With some notable exceptions, the nation’s populist commentators and the war pathos industry have used Gallipoli as a vehicle for national self-aggrandisement, despite the efforts of some academic historians to push for a more considered approach.

This country has a short attention span when it comes to its history. Well may we say lest we forget on Anzac Day. Those who served deserve that honour.

But perhaps today we should take time out from the Anzac media romp and begin to remember another, more ignominious moment in our recent history and the part played by Sir Robert Menzies. Lest we forget the lessons of the Vietnam War.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.



Filed under Reblogs

Fallacy of Inferred Insensitivity

by Damon Young
“You wrote about X, so you don’t care about Y”

I’ve a column in the Canberra Times today, ‘Impossible vanity of caring for everything at once’.

I’m criticising what I’ve called the Fallacy of Inferred Insensitivity, which often looks like this: “You wrote about X, so you don’t care about Y.”  A sample:

In some cases, those who charge authors or speakers with indifference are correct. They accurately identify some moral blind spot, which is relevant to the case at hand but has been ignored or forgotten. In other words, they rightly recognise some bias. But in most cases, the critics are in no position to make this charge reasonably: even if they are right, they lack the evidence to establish this reasonably. They do not know the mercurial substance of another’s mind. At best, they are guessing well. At worst, they are simply rehashing their own biases.

Read the rest of Damon’s blog post here.



Filed under Reblogs

Belle Gibson shows that most of us care about right and wrong

The Conversation

Nicholas Hookway, University of Tasmania

With her lies about having cancer and her willingness to cash in on the hopes of actual cancer patients, Belle Gibson – the Australian woman behind The Whole Pantry app – is indicative of our run-down, self-indulgent and narcissistic moral world, right?

From an insatiable desire for fame and attention to the shallowness and consumerism of the wellness and New Age movement, Gibson’s tale of deceit embodies all that is wrong with the modern world. Or so the thinking goes.

But there’s a different story to be told here: one that focuses not on Gibson’s shocking lie that she healed herself naturally of cancer but the overwhelming moral response to her mass deception and the social role this plays.

From a raging Twittersphere to talk-back radio to water cooler conversations, the backlash to Gibson’s deceit reminds us that we have a strong and lively moral culture.

The collective disgust works, effectively, to reinforce the values we hold dear.

Belle Gibson: a fallen god of self-improvement culture

You know the argument: Western culture has been overtaken by an “anything goes” relativism and as a result we have lost touch with foundational moral laws and principles. Without the moral anchors of the past – community, religion and tradition – we are left bobbing on the ocean without a moral compass.

The growth of self-help or “therapeutic” culture gets wrapped up in this narrative of decline. Belle Gibson, of course, had been elevated to a modern-day god of our wellness and self-improvement culture.

Gibson’s Whole Pantry brand provided an exemplary story of the power of the “self-improved”. The idea of beating cancer through a commitment to wellness and healthy eating strikes a major chord within a culture that accords so much importance to choice, wellness and self-improvement. As the motto of The Whole Pantry states: “Your Whole Life, Starts with You”.

For some, Belle Gibson’s lies are the fulfilment of New Age thought and the contemporary turn to self-improvement. JR Hennessy, in The Guardian, situated Gibson’s fall from grace within the inevitable commercialisation of the New Age and its hollow culture of self-fulfilment and faked authenticity.

On this analysis, health gurus, new age dieters, meditators, anti-vaxxers and the paleo set are lumped in one big pile of narcissism and irresponsibility.

A lively moral discourse: reminding ourselves about what matters

There may be merit in the concerns being expressed about wellness culture – especially in its more extreme faux-miracle forms. But the important lesson to draw from Gibson’s exposure is the health and vitality of morality in our culture.

Despite all the hand-wringing about moral relativism and the like, the outrage Gibson has provoked shows that we are actually very quick to point out moral wrongs and to seek to restore the balance.

A quick scan of the Twitter hashtag #bellegibson reveals a wide range of moral commentary. There are people calling for punitive action and incarceration and others pointing to her “extraordinary irresponsibility”. Still others are treading a more sympathetic line, highlighting the harm of internet shaming and Gibson’s fragile mental state.

This reaction suggests we have a keen sense of what the moral rules are and when they have been transgressed. We might not precisely agree as to what those rules are – but in our daily lives we question what constitutes the good life.

Proponents of the moral decline thesis overlook this “water-cooler” morality: how moral life is established and created in the relationships, communication and moral discourse of everyday life.

Culture as moral antibodies

Culture works here as a series of moral antibodies that seek to redress violations of shared basic moral principles and values. In the Gibson case, these are truth, justice and a sense of fairness.

As 19th-century sociologist Emile Durkheim famously argued, deviance plays a healthy role in society, working to clarify social norms, to preserve the moral boundaries of the community, and to help strengthen feelings of cohesiveness.

Belle Gibson may reveal our vulnerability to being hoodwinked by food fads and wellness warriors, but the response to her transgressions is a powerful reminder of the values we share in common – and of what happens when you violate them.

The next question our moral culture must ask itself is how healthy it is to publicly shame a vulnerable person, and what the right balance is between culpability and a sense of care and generosity towards those who have done the wrong thing.

It is arguably the latter which provides the strongest test of the health of a moral society.

See also:

The ‘hole’ in the pantry story: should Penguin have validated Belle Gibson’s cancer claims?

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.



Filed under Reblogs

We don’t need no (moral) education? Five things you should learn about ethics

The Conversation

Patrick Stokes, Deakin University

The human animal takes a remarkably long time to reach maturity. And we cram a lot of learning into that time, as well we should: the list of things we need to know by the time we hit adulthood in order to thrive – personally, economically, socially, politically – is enormous.

But what about ethical thriving? Do we need to be taught moral philosophy alongside the three Rs?

Ethics has now been introduced into New South Wales primary schools as an alternative to religious instruction, but the idea of moral philosophy as a core part of compulsory education seems unlikely to get much traction any time soon. To many ears, the phrase “moral education” has a whiff of something distastefully Victorian (the era, not the state). It suggests indoctrination into an unquestioned set of norms and principles – and in the world we find ourselves in now, there is no such set we can all agree on.

Besides, in an already crowded curriculum, do we really have time for moral philosophy? After all, most people manage to lead pretty decent lives without knowing their Sidgwick from their Scanlon or being able to spot a rule utilitarian from 50 yards.

But intractable moral problems don’t go away just because we no longer agree how to deal with them. And as recent discussions on this site help to illustrate, new problems are always arising that, one way or another, we have to deal with. As individuals and as participants in the public space, we simply can’t get out of having to think about issues of right and wrong.

Yet spend time hanging around the comments section of any news story with an ethical dimension to it (and that’s most of them), and it quickly becomes apparent that most people just aren’t familiar with the methods and frameworks of ethical reasoning that have been developed over the last two and a half thousand years. We have the tools, but we’re not equipping people with them.

So, what sort of things should we be teaching if we wanted to foster “ethical literacy”? What would count as a decent grounding in moral philosophy for the average citizen of contemporary, pluralistic societies?

What follows is in no way meant to be definitive. It’s not based on any sort of serious empirical data about people’s familiarity with ethical issues. It’s just a tentative stab (wait, can you stab tentatively?) at a list of things people should ideally know about ethics – and, based on what I see in the classroom and online, often don’t.

1. Ethics and morality are (basically) the same thing

Many people bristle at the word “morality” but are quite comfortable using the term “ethical”, and insist there’s some crucial difference between the two. For instance, some people say ethics are about external, socially imposed norms, while morality is about individual conscience. Others say ethics is concrete and practical while morality is more abstract, or is somehow linked to religion.

Out on the value theory front lines, however, there’s no clear agreed distinction, and most philosophers use the two terms more or less interchangeably. And let’s face it: if even professional philosophers refuse to make a distinction, there probably isn’t one there to be made.

2. Morality isn’t (necessarily) subjective

Every philosophy teacher probably knows the dismay of reading a decent ethics essay, only to then be told in the final paragraph that, “Of course, morality is subjective so there is no real answer”. So what have the last three pages been about then?

There seems to be a widespread assumption that the very fact that people disagree about right and wrong means there is no real fact of the matter, just individual preferences. We use the expression “value judgment” in a way that implies such judgments are fundamentally subjective.

Sure, ethical subjectivism is a perfectly respectable position with a long pedigree. But it’s not the only game in town, and it doesn’t win by default simply because we haven’t settled all moral problems. Nor does ethics lose its grip on us even if we take ourselves to be living in a universe devoid of intrinsic moral value. We can’t simply stop caring about how we should act; even subjectivists don’t suddenly turn into monsters.

3. “You shouldn’t impose your morality on others” is itself a moral position

You hear this all the time, but you can probably spot the fallacy here pretty quickly: that “shouldn’t” there is itself a moral “shouldn’t” (rather than a prudential or social “shouldn’t,” like “you shouldn’t tease bears” or “you shouldn’t swear at the Queen”). Telling other people it’s morally wrong to tell other people what’s morally wrong looks obviously flawed – so why do otherwise bright, thoughtful people still do it?

Possibly because what the speaker is assuming here is that “morality” is a domain of personal beliefs (“morals”) which we can set aside while continuing to discuss issues of how we should treat each other. In effect, the speaker is imposing one particular moral framework – liberalism – without realising it.

4. “Natural” doesn’t necessarily mean “right”

This is an easy trap to fall into. Something’s being “natural” (if it even is) doesn’t tell us that it’s actually good. Selfishness might turn out to be natural, for instance, but that doesn’t mean it’s right to be selfish.

This gets a bit more complicated when you factor in ethical naturalism or Natural Law theory, because philosophers are awful people and really don’t want to make things easy for you.

5. The big three: Consequentialism, Deontology, Virtue Ethics

There are several different ethical frameworks that moral philosophers use, but some familiarity with the three main ones – consequentialism (what’s right and wrong depends upon consequences); deontology (actions are right or wrong in themselves); and virtue ethics (act in accordance with the virtues characteristic of a good person) – is incredibly useful.

Why? Because they each manage to focus our attention on different, morally relevant features of a situation, features that we might otherwise miss.

So, that’s my tentative stab (still sounds wrong!). Do let me know in the comments what you’d add or take out.

This is part of a series on public morality in 21st century Australia. We’ll be publishing regular articles on morality on The Conversation in the coming weeks.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.



Filed under Reblogs

Opinions and free speech

Someone once said that defending an opinion by citing the right to free speech is sort of the ultimate concession – you’re saying that the most compelling thing you can say for your position is that it’s not literally illegal to express it.


Filed under cartoons

How a PhD in linguistics prepared me for motherhood

The Conversation

Annabelle Lukin

Unlike most newborns, on his arrival into the world, my newly-minted son found himself in the arms of someone well-versed in the most fiercely contested question in contemporary linguistics: is language innate?

Are babies born with grammar hard-wired into their brain? Or is language something bestowed by culture and socialisation?

The early exchanges of gaze, attention and vocalisations with my baby in his first hours, days and weeks were experienced against the melodrama of modern linguistics’ greatest schism. This happens to revolve entirely around the role of mothers and significant others in the development of a child’s language.

In the story of how language emerges in the child, as told by Noam Chomsky, nature is largely the lone hero. The child comes with a “language organ” already installed in her or his brain, as a sudden and isolated gift of evolution – out of nowhere, all-at-once, fully formed and forever unchanging.

Chomsky’s Universal Grammar

Called “Universal Grammar” (or UG), the language organ is “invariant among humans”. While the world appears to be full of many and varied languages (estimated at more than 7,000), to Chomsky this rich variation in linguistic forms and functions is, despite appearances, superficial.

Chomsky speaking in 2012 on Universal Grammar and the genetics of language.

Why? Because of the “empirical conditions on language acquisition”, by which Chomsky means the quality and quantity of the language around the infant. Chomsky has argued for more than 50 years that language must be innate because the familial and domestic discourse that surrounds infants is “degenerate”. Children simply could not learn language because the input they receive from their mothers and significant others is “impoverished”.

The ‘poverty of the stimulus’

Chomsky named this central plank in his theory “the poverty of the stimulus”. As such, Chomsky’s most famous disciple, Steven Pinker, advises parents to ignore their offspring:

Young children plainly can’t understand a word you say. So why waste your breath in soliloquies? Any sensible person would surely wait until a child has developed speech and more gratifying two-way conversations become possible.

Maternal language, Pinker continued, is all part of “the same mentality that sends yuppies to ‘learning centers’ to buy little mittens with bull’s-eyes to help their babies find their hands sooner”. It is nothing more than a collective anxiety to “keep the helpless infant from falling behind in the great race of life”.

If you believe that language is not for communication, then this makes perfect sense. The real function of language, Chomsky argues, is to converse with yourself inside your own skull.

“In any useful sense of the term,” says Chomsky, “communication is not the function of language.”

While infants and toddlers might display “language-like expressions” – like the twins of YouTube fame (below) – this behaviour, to the 20th century’s most famous linguist, is akin to a dog being trained to respond to certain commands.

Talking twins showing turn-taking and the intonation patterns of their mother-tongue.

In a new and very readable critique of Universal Grammar, Professor Vyvyan Evans writes that “despite being completely wrong” UG “is alive and kicking”.

Evans argues that the myth “has become institutionalised via retellings which are now immune to counterevidence”. The evidence and arguments against UG have come, for many years, from fields as diverse as animal communication studies, language typology, language evolution, infant communication, child language development, neuroscience and psychiatry. Evans reviews some of this evidence in his new book.

The alternative to Universal Grammar

As I waited for the birth of my son, I eschewed Australia’s best-selling book for expectant mothers in favour of the growing body of research on infant and mother communication. In this research, nature still had a crucial role: no less than giving my son his complex brain, one ready and able to learn language.

But nature bestowed on him something else besides: the capacity for “intersubjectivity”. The research shows babies are innately tuned to display their subjectivity – their tiny personalities – and to adapt or fit their displays of attention and emotion to the subjectivity of others.

Once researchers started to look – really look – at young infants, their early sociability became apparent. Professor Colwyn Trevarthen has, since the 1970s, closely observed the communicative repertoire of very young babies. He has observed their smiles, their coos of recognition, their frowns and their hand gestures. All these postures “announce, for a sympathetic other person, the infant’s state of openness to the world”. The babies’ gestures are prolific, intelligible and organised.

Colwyn Trevarthen began observing mothers and young babies in the 1970s.

Babies’ survival is tied to a capacity to establish the mutual rhythms which produce human companionship. Trevarthen writes:

Being conversational is what it takes for a young person to begin learning what other people know and do, and this is the behaviour a fond parent expects and enjoys. It is the human adaption for cultural learning.

It is, of course, our capacity for cultural learning which sets us apart from all other animals.

Babies need joyful, responsive human company

Research in countries as diverse as Scotland, Nigeria, Germany, Sweden and Japan has shown mothers speak to infants in a manner that is rhythmic, repetitive, musical and regular. Far from being “degenerate” or “impoverished”, this kind of language is maximally designed for the needs of the young baby.

Father in conversation with nine-week-old girl.

Babies need and seek “joyful, responsive human company”, with a known, loving and attentive conversational partner. These “proto-conversations” provide the foundations for infants to step into the systems and structures of their mother-tongue.

As helpless and dependent as my baby son was, I knew my little munchkin was biologically prepared to initiate and sustain the interactions through which his beautifully complex human brain could get to know the world outside him, and his place in that world.

I knew our conversations would propel him into the rich and extravagant culture around him. And that this culture would reciprocate his curiosity with its many artifacts, including the infinitely creative, collective resource that is human language.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.



Filed under Reblogs

Postmodernism disrobed

by Richard Dawkins

Intellectual Impostures
by Alan Sokal and Jean Bricmont
Profile: 1998. Pp. 274. £9.99
Published in the USA by Picador as Fashionable Nonsense in November 1998

Suppose you are an intellectual impostor with nothing to say, but with strong ambitions to succeed in academic life, collect a coterie of reverent disciples and have students around the world anoint your pages with respectful yellow highlighter. What kind of literary style would you cultivate? Not a lucid one, surely, for clarity would expose your lack of content. The chances are that you would produce something like the following:

We can clearly see that there is no bi-univocal correspondence between linear signifying links or archi-writing, depending on the author, and this multireferential, multi-dimensional machinic catalysis. The symmetry of scale, the transversality, the pathic non-discursive character of their expansion: all these dimensions remove us from the logic of the excluded middle and reinforce us in our dismissal of the ontological binarism we criticised previously.

This is a quotation from the psychoanalyst Félix Guattari, one of many fashionable French ‘intellectuals’ outed by Alan Sokal and Jean Bricmont in their splendid book Intellectual Impostures, previously published in French and now released in a completely rewritten and revised English edition. Guattari goes on indefinitely in this vein and offers, in the opinion of Sokal and Bricmont, “the most brilliant mélange of scientific, pseudo-scientific and philosophical jargon that we have ever encountered”. Guattari’s close collaborator, the late Gilles Deleuze, had a similar talent for writing:

In the first place, singularities-events correspond to heterogeneous series which are organized into a system which is neither stable nor unstable, but rather ‘metastable’, endowed with a potential energy wherein the differences between series are distributed… In the second place, singularities possess a process of auto-unification, always mobile and displaced to the extent that a paradoxical element traverses the series and makes them resonate, enveloping the corresponding singular points in a single aleatory point and all the emissions, all dice throws, in a single cast.

This calls to mind Peter Medawar’s earlier characterization of a certain type of French intellectual style (note, in passing, the contrast offered by Medawar’s own elegant and clear prose):

Style has become an object of first importance, and what a style it is! For me it has a prancing, high-stepping quality, full of self-importance; elevated indeed, but in the balletic manner, and stopping from time to time in studied attitudes, as if awaiting an outburst of applause. It has had a deplorable influence on the quality of modern thought…

The remainder of this review is available here. [Originally published in Nature, 9 July 1998, vol. 394, pp. 141-143.]



Filed under Reblogs

The Barbershop Paradox

The Barbershop paradox was proposed by Lewis Carroll in a three-page essay titled “A Logical Paradox,” which appeared in the July 1894 issue of Mind. The name comes from the ‘ornamental’ short story that Carroll uses to illustrate the paradox (although it had appeared several times in more abstract terms in his writing and correspondence before the story was published). Carroll claimed that it illustrated “a very real difficulty in the Theory of Hypotheticals” in use at the time. Modern logicians would not regard it as a paradox but simply as a logical error on the part of Carroll.

Briefly, the story runs as follows: Uncle Joe and Uncle Jim are walking to the barber shop. There are three barbers who live and work in the shop—Allen, Brown, and Carr—but not all of them are always in the shop. Carr is a good barber, and Uncle Jim is keen to be shaved by him. He knows that the shop is open, so at least one of them must be in. He also knows that Allen is a very nervous man, so that he never leaves the shop without Brown going with him. Uncle Joe insists that Carr is certain to be in, and then claims that he can prove it logically. Uncle Jim demands the proof. Uncle Joe reasons as follows.

Suppose that Carr is out. If Carr is out, then if Allen is also out Brown would have to be in, since someone must be in the shop for it to be open. However, we know that whenever Allen goes out he takes Brown with him, and thus we know as a general rule that if Allen is out, Brown is out. So if Carr is out then the statements “if Allen is out then Brown is in” and “if Allen is out then Brown is out” would both be true at the same time.

Uncle Joe notes that this seems paradoxical; the two “hypotheticals” seem “incompatible” with each other. So, by contradiction, Carr must logically be in.

However, the correct conclusion to draw from the incompatibility of the two “hypotheticals” is that what is hypothesised in them (that Allen is out) must be false under our assumption that Carr is out. Then our logic simply allows us to arrive at the conclusion “If Carr is out, then Allen must necessarily be in”.

In modern logic theory this scenario is not a paradox. The law of implication reconciles what Uncle Joe claims are incompatible hypotheticals. This law states that “if X then Y” is logically identical to “X is false or Y is true” (¬X ∨ Y). For example, given the statement “if you press the button then the light comes on”, it must be true at any given moment that either you have not pressed the button, or the light is on.

In short, what obtains is not that ¬C by itself yields a contradiction; it merely necessitates A, because under the assumption ¬C it is ¬A that actually yields the contradiction.

In this scenario, that means Carr doesn’t have to be in, but that if he isn’t in, Allen has to be in.
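To make the resolution concrete, here is a minimal brute-force check – a Python sketch, with variable names of our own choosing – that enumerates every arrangement of the three barbers consistent with the story and confirms both points: there are consistent worlds in which Carr is out, and Allen is in at every one of them.

from itertools import product

# True = the barber is in the shop, False = out.
# Constraints from the story:
#   1. The shop is open, so at least one barber is in.
#   2. Allen never leaves without Brown: "Allen out" implies "Brown out".
def consistent(allen, brown, carr):
    shop_open = allen or brown or carr
    allen_rule = allen or not brown   # ¬allen -> ¬brown is equivalent to allen ∨ ¬brown
    return shop_open and allen_rule

models = [m for m in product([True, False], repeat=3) if consistent(*m)]

# Uncle Joe claims Carr must be in. In fact there are consistent
# worlds with Carr out – but Allen is in at every one of them.
carr_out = [(a, b, c) for (a, b, c) in models if not c]
print("Consistent worlds with Carr out:", carr_out)
print("Allen in whenever Carr is out:", all(a for (a, b, c) in carr_out))

Running this prints two consistent worlds with Carr out (Allen in, Brown either in or out), which is exactly the modern verdict: ¬C forces A rather than producing a contradiction.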

A more detailed discussion of this apparent paradox may be found on Wikipedia.



Filed under Paradoxes

Islam and the media – let’s not fear open debate

The Conversation

Brian McNair, Queensland University of Technology

‘This is my weapon’.  Source: AAP/Denis Prezat

On Sunday I came across an article in the International Business Times, reporting a posthumous publication by slain Charlie Hebdo editorial director Stephane Charbonnier. The book, completed two days before the attack that killed him and nine of his colleagues, denounces the western media, politicians and those commentators who mute their criticism of Islam for fear of being accused of ‘Islamophobia’. “By what twisted logic”, Charbonnier writes, “is humor less compatible with Islam than with any other religion? … If we let it be understood that we can laugh at everything except certain aspects of Islam because Muslims are much more susceptible than the rest of the population, isn’t that discrimination?”

In the knowledge that the poor man is now dead, a victim of religiously-motivated killers, and with news of the Melbourne arrests dominating the headlines all weekend, I found the article both poignant and pertinent.

And then, driving in Brisbane on Sunday morning, I listened to an ABC radio discussion of the Melbourne anti-terrorism operation. The main theme of the discussion? Was it appropriate to interrogate Islam’s messages, or was the Victorian premier – who had that morning declared that these young men are “not people of faith. They don’t represent any culture. This is not an issue of how you pray or where you were born… this is simply evil: plain and simple” – correct to steer public attention and anxiety away from the Islamic connection?

We all understand the reasons why our politicians urge caution in addressing the issue of Islam and its interaction with democratic, secular cultures such as Australia’s. No-one wants to see moderate Muslims scapegoated or blamed for the crimes of a few extremists.

We understand that the particular interpretation of the Quran which fuels the global jihad is not shared by Muslims as a whole. There are extreme Christians too, and Hindus and even Buddhists, who advocate violence in the name of their respective deities. The history of Christianity is awash with conquest and innocent blood. The non-violent, non-extremist practitioners of these religions are not responsible for the crimes of the past, or for the actions of present-day radicals on the periphery. The same point applies to Muslims, many of whom have spoken eloquently and forcefully against jihad.

We also know that those young men and women, often from secure middle class, moderate backgrounds, who choose to join the jihad do so for many reasons other than the religious.

But for that reason, too, analysis of Islam’s vulnerability to such hijacking should not be interpreted as an attack on Muslims as a whole, or as ‘Islamophobia’. On the contrary, as Charbonnier writes, failure to scrutinise Islam in the media and elsewhere, in the same way that we should scrutinise all religions and belief systems, is itself a kind of discrimination. It patronises Islam to say that its adherents are too sensitive to be treated with the same intellectual rigour and scrutiny as, say, Christianity or Scientology.

Reluctance to draw attention to and satirise the absurdities of Islam – and all religions are absurd in their own ways – will in the end breed more public anger than it prevents. Moreover, it is an important sign of acceptance of democratic political culture that Islam’s leaders, even if they disapprove of what is said, should embrace the satirist and the heretic alike, without feeling the need for a fatwa. Christians had to swallow Life Of Brian, after all, though many church leaders called for bans. How offensive would we think it today, had bishops and cardinals called for the deaths of the Monty Python team?

So let’s be clear. Critiquing Islam in the media and elsewhere is not ‘Islamophobia’.

It’s not racism, since being a Muslim has nothing to do with ethnicity.

It’s not anti-Muslim, since many Muslims are critical of the extremists in their ranks, and ashamed of how the name of their religion has been tarnished.

It is, rather, a legitimate and increasingly necessary engagement with a uniquely (for our time) toxic variant of a belief system which, whether or not one agrees with its tenets, can easily coexist with secular society in the same way that other religions do in a multicultural society. Anything less than vigorous, sceptical media discussion of those beliefs, including its still-medieval attitudes to women and homosexuality, does moderate Muslims no favours.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.



Filed under Reblogs

Cheryl’s Birthday solution

Remember, Albert is told either May, June, July or August.

Bernard is told either 14, 15, 16, 17, 18 or 19.

Let’s go through it line by line.

Line 1) Albert: I don’t know when Cheryl’s birthday is, but I know that Bernard doesn’t know too.

All Albert knows is the month, and every month has more than one possible date, so of course he doesn’t know when her birthday is. The first part of the sentence is redundant.

The only way that Bernard could know the full date from the day alone, however, would be if Cheryl had told him 18 or 19, since of the ten date options only these day numbers appear once – as May 19 and June 18.

For Albert to know that Bernard does not know, Albert must therefore have been told July or August, since this rules out Bernard being told 18 or 19.

Line 2) Bernard: At first I don’t know when Cheryl’s birthday is, but now I know.

Bernard has deduced that Albert has either August or July. If he knows the full date, he must have been told 15, 16 or 17, since if he had been told 14 he would be none the wiser about whether the month was August or July. Each of 15, 16 and 17 only refers to one specific month, but 14 could be either month.

Line 3) Albert: Then I also know when Cheryl’s birthday is.

Albert has therefore deduced that the possible dates are July 16, August 15 and August 17. For him now to know, he must have been told July, since if he had been told August he would not know for certain which date is the birthday.

The answer, therefore, is July 16.
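The whole deduction can also be checked mechanically. Below is a short brute-force sketch in Python; the ten candidate dates are the ones from the original puzzle, and each of the three statements is applied in turn as a filter on them.

# The ten candidate dates from the original puzzle, as (month, day) pairs.
dates = [(5, 15), (5, 16), (5, 19), (6, 17), (6, 18),
         (7, 14), (7, 16), (8, 14), (8, 15), (8, 17)]

def months_with(day, pool):
    return [m for (m, d) in pool if d == day]

def days_with(month, pool):
    return [d for (m, d) in pool if m == month]

# Statement 1: Albert (told the month) knows Bernard (told the day) can't know.
# So no day in Albert's month may be unique across all ten dates.
step1 = [(m, d) for (m, d) in dates
         if all(len(months_with(x, dates)) > 1 for x in days_with(m, dates))]

# Statement 2: Bernard now knows, so his day is unique among the survivors.
step2 = [(m, d) for (m, d) in step1 if len(months_with(d, step1)) == 1]

# Statement 3: Albert now knows too, so his month is unique among what's left.
step3 = [(m, d) for (m, d) in step2 if len(days_with(m, step2)) == 1]

print(step3)  # [(7, 16)] – July 16

This reproduces the three eliminations above exactly: step1 discards May and June, step2 discards the two 14ths, and step3 discards August, leaving July 16.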


Filed under Puzzles