
Book review: The Death of Expertise

The Conversation

A new book expresses concern that the ‘average American’ has base knowledge so low that it is now plummeting to ‘aggressively wrong’.
shutterstock

Rod Lamberts, Australian National University

I have to start this review with a confession: I wanted to like this book from the moment I read the title. And I did. Tom Nichols’ The Death of Expertise: The Campaign Against Established Knowledge and Why it Matters is a motivating – if at times slightly depressing – read.

In the author’s words, his goal is to examine:

… the relationship between experts and citizens in a democracy, why that relationship is collapsing, and what all of us, citizens and experts, might do about it.

This resonates strongly with what I see playing out around the world almost every day – from the appalling state of energy politics in Australia, to the frankly bizarre condition of public debate on just about anything in the US and the UK.

Nichols’ focus is on the US, but the parallels with similar nations are myriad. He expresses a deep concern that “the average American” has base knowledge so low it has crashed through the floor of “uninformed”, passed “misinformed” on the way down, and is now plummeting to “aggressively wrong”. And this is playing out against a backdrop in which people don’t just believe “dumb things”, but actively resist any new information that might threaten these beliefs.

He doesn’t claim this situation is new, per se – just that it seems to be accelerating, and proliferating, at eye-watering speed.

Intimately entwined with this, Nichols mourns the decay of our ability to have constructive, positive public debate. He reminds us that we are increasingly in a world where disagreement is seen as a personal insult. A world where argument means conflict rather than debate, and ad hominem is the rule rather than the exception.

Again, this is not necessarily a new issue – but it is certainly a growing one.

Oxford University Press

The book covers a broad and interconnected range of topics related to its key subject matter. It considers the contrast between experts and citizens, and highlights how the antagonism between these roles has been both caused and exacerbated by the exhausting and often insult-laden nature of what passes for public conversations.

Nichols also reflects on changes in the mediating influence of journalism on the relationship between experts and “citizens”. He reminds us of the ubiquity of Google and its role in reinforcing the conflation of information, knowledge and experience.

His chapter on the contribution of higher education to the ailing relationship between experts and citizens particularly appeals to me as an academic. Two of his points here exemplify academia’s complicity in diminishing this relationship.

Nichols outlines his concern about the movement to treat students as clients, and the consequent over-reliance on the efficacy and relevance of student assessment of their professors. While not against “limited assessment”, he believes:

Evaluating teachers creates a habit of mind in which the layperson becomes accustomed to judging the expert, despite being in an obvious position of having inferior knowledge of the subject material.

Nichols also asserts this student-as-customer approach to universities is accompanied by an implicit, and also explicit, nurturing of the idea that:

Emotion is an unassailable defence against expertise, a moat of anger and resentment in which reason and knowledge quickly drown. And when students learn that emotion trumps everything else, it is a lesson they will take with them for the rest of their lives.

The pervasive attacks on experts as “elitists” in US public discourse receive little sympathy in this book (nor should they). Nichols sees these assaults as rooted not so much in ignorance as in:

… unfounded arrogance, the outrage of an increasingly narcissistic culture that cannot endure even the slightest hint of inequality of any kind.

Linked to this, he sees a confusion in the minds of many between basic notions of democracy in general, and the relationship between expertise and democracy in particular.

Democracy is, Nichols reminds us, “a condition of political equality”: one person, one vote, all of us equal in the eyes of the law. But in the US at least, he feels people:

… now think of democracy as a state of actual equality, in which every opinion is as good as any other on almost any subject under the sun. Feelings are more important than facts: if people think vaccines are harmful … then it is “undemocratic” and “elitist” to contradict them.

The danger, as he puts it, is that a temptation exists in democratic societies to become caught up in “resentful insistence on equality”, which can turn into “oppressive ignorance” if left unchecked. I find it hard to argue with him.

Nichols acknowledges that his arguments expose him to the very real danger of looking like yet another pontificating academic, bemoaning the dumbing down of society. It’s a practice common among many in academia, and one that is often code for our real complaint: that people won’t just respect our authority.

There are certainly places where a superficial reader would be tempted to accuse him of this. But to them I suggest taking the time to consider more closely the contexts in which he presents his arguments.

This book does not simply point the finger at “society” or “citizens”: there is plenty of critique of, and advice for, experts. Among many suggestions, Nichols offers four explicit recommendations.

  • First, experts should strive to be more humble.
  • Second, be ecumenical – and by this Nichols means experts should vary their information sources, especially where politics is concerned, and not fall into the same echo chamber that many others inhabit.
  • Third, be less cynical. Here he counsels against assuming people are intentionally lying, misleading or wilfully trying to cause harm with assertions and claims that clearly go against solid evidence.
  • Finally, he cautions us all to be more discriminating – to check sources scrupulously for veracity and for political motivations.

In essence, this last point admonishes experts to mindfully counteract the potent lure of confirmation bias that plagues us all.

It would be very easy for critics to cherry-pick elements of this book and present them out of context, to see Nichols as motivated by a desire to feather his own nest and reinforce his professional standing: in short, to accuse him of being an elitist. Sadly, this would be a prime example of exactly what he is decrying.

To these people, I say: read the whole book first. If it makes you uncomfortable, or even angry, consider why.

Have a conversation about it and formulate a coherent argument to refute the positions with which you disagree. Try to resist the urge to dismiss it out of hand or attack the author himself.

I fear, though, that as is common with a treatise like this, the people who might most benefit are the least likely to read it. And if they do, they will take umbrage at the minutiae, and then dismiss or attack it.

Unfortunately we haven’t worked out how to change that. But to those so inclined, reading this book should have you nodding along, comforted at least that you are not alone in your concern that the role of expertise is in peril.

Rod Lamberts, Deputy Director, Australian National Centre for Public Awareness of Science, Australian National University

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Bertrand Russell on Andrew Jackson

‘American democracy underwent a great transformation when Andrew Jackson became President. Until his time presidents had been cultivated gentlemen, mostly with a settled position as landowners. Andrew Jackson represented a rebellion against these men on the part of the pioneers and immigrants. He did not like culture and was suspicious of educated men since they understood things that puzzled him. This element of hostility to culture has persisted in American democracy ever since, and has made it difficult for America to make the best use of its experts.’

Reference

Russell, Bertrand (1961) ‘What is Democracy’ in Fact and Fiction. George Allen & Unwin. London.


Please don’t explain: Hanson 2.0 and the war on experts

The Conversation

Patrick Stokes, Deakin University

Along with Aqua’s “Barbie Girl,” Pauline Hanson has long stood as a grim reminder that the second half of the 1990s was much worse than the first half. And now, 18 years later, Hanson finds herself back in Canberra.

Hanson’s racist agenda will be a stain on the Senate just as surely as the views she represents are a stain on Australia itself. For that reason alone, her return is a cause for dismay. But it is not the only cause.

Both Hanson herself and her wider party have a vocal sideline in science denialism: the view that expert consensus on various topics is corrupted and unreliable.

Hanson has pushed the myth that vaccination causes autism, and wants a royal commission into the “corruption” of climate science, declaring that “Climate change should not be about making money for a lot of people and giving scientists money”.

At the time of writing, it’s quite possible Malcolm Roberts, who has the number two slot on the One Nation Senate ticket in Queensland, will be joining Hanson in Canberra. Roberts is a project leader of the Galileo Movement, a lobby group who deny anthropogenic climate change and insist the global scientific community and governments are corruptly hiding the truth from their publics.

Conspiracism in public life

This might seem small beer next to the potentially disastrous effects a Hansonite revival might have on Australia’s pluralist and multicultural society.

But remember: Hanson had an outsized impact on Australian politics in the 90s precisely because she gave voice to views that resonated with much of the electorate and, unlike other politicians, wasn’t quite canny enough to reach for the dog whistle. In openly using phrases like “swamped with Asians,” Hanson shifted the Overton Window until the political establishment found the only way her views could be contained was by absorbing them.

Enter Roberts, a man who honestly believes a “tight-knit cabal” made up of “some of the major banking families in the world” are advancing corrupted climate science with the aim of global domination. Such language has some very dark associations in the history of conspiracy theory. Hence Andrew Bolt disassociated himself from the Galileo Movement for peddling a view that “smacks too much of the Jewish world conspiracy theorising I’ve always loathed.”

One might think that if even an arch-denialist like Bolt can’t abide views like Roberts’, One Nation’s climate conspiracism will end up either repudiated or ignored. But then, nobody in 1996 thought “swamped with Asians” rhetoric would have such an impact on the Australian polity either.

‘Post-truth politics’?

Besides, this has been a good season globally for political expertise bashing. Perhaps the new One Nation senators will find that, in another echo of the Howard years, the times will suit them.

In the lead-up to the UK’s referendum on leaving the European Union, Tory MP and leading Leave campaigner Michael Gove declared “people in this country have had enough of experts”. Gove is now in the running to become the Prime Minister who will preside over the UK’s divorce from the EU – and quite possibly, the breakup of the United Kingdom itself.

Michael Gove says people have had enough of experts. Paul Clarke/Wikimedia Commons

Should Gove get the gig, his counterpart across the pond come January 2017 may well be one Donald Trump, a man who believes climate change is a hoax and that vaccines cause autism (and has given voice to suspicions that Obama wasn’t born in the US and that Ted Cruz’ father was involved in the Kennedy assassination).

And of course, denialism won’t be a novelty in Canberra either. Dennis Jensen won’t be there when Senator Hanson arrives, but his colleague George Christensen will be. David Leyonhjelm may no longer grace the Senate crossbenches, but thanks to him we’ll still be paying for a Commissioner to investigate Wind Turbine Syndrome complaints despite the lack of evidence for any such condition. And lest this be dismissed as a mere lefty rant, we should also note the Greens’ stance on genetically modified organisms.

All of this might be ascribed to “post-truth politics,” the condition in which political discourse is no longer constrained by norms of truth-telling. But simply insisting people tell the truth – hardly an outrageous demand – won’t help with this specific problem. To invoke the philosopher Harry Frankfurt’s ingenious distinction, post-truth politics is not fundamentally about lies, but bullshit. The liar knows the truth, and cares about it enough to conceal it. The bullshitter, by contrast, doesn’t care (and may not know) if what they say is true; they just care that you believe it. Trump, it seems fair to say, is a bullshitter. Much of the Gove-Johnson-Farage Brexit campaign was certainly built on bullshit.

But science denialists are not, or at least not necessarily, liars or bullshitters. Their beliefs are sincere. And they are shared by a great many people, who by definition won’t be persuaded by simple appeals to expert opinion because the authority of expert opinion is precisely what they deny. How should we respond to this?

Naïve Reason won’t save us

One disastrous answer would be to retreat into a naïve conception of capital-r Reason as some sort of panacea. Surprisingly smart people end up plumping for such a view. Consider this bit of utopianism from Neil deGrasse Tyson:

Even if Tyson’s being tongue-in-cheek here, this is emblematic of a fairly widespread view that if we just consult The Facts, and then simply apply the infallible techniques of Reason to these Facts, it becomes blindingly obvious precisely What Is To Be Done. This view is only slightly less naïve, and barely less self-congratulatory, than those it opposes.

You sometimes come across people who want to insist that battles over science denialism represent a conflict between “reality” and “ideology.” But there’s no direct access to “reality” – all knowledge is mediated through our existing concepts, language, and so on – and so, arguably, no non-ideological access to it either. Human knowledge doesn’t drop from the sky fully-formed and transparently validated by some infallible faculty of Reason. It’s always filtered through language, culture, politics, history, and the foibles of psychology. Producing knowledge is something humans do – and that means power relations are involved.

Distributed knowledge and trust

While anti-intellectualism and suspicion of expertise are nothing new, the problem is amplified by the very advances that make modern life what it is. Put crudely, we now know so much that nobody can know it all for themselves, and so we have to rely more and more on other people to know things for us.

Under such conditions of distributed knowledge, trust becomes ever more important. You can’t be an expert in everything, and so you have to take more and more on trust. Is human activity warming the climate? Does the MMR vaccine cause autism? Would Brexit tank the UK’s economy? These are not questions you or I can answer, assuming you or I aren’t researchers working in the relevant fields. So we have to defer to the relevant communities of experts – and that’s a problem if you’re not good with trust or deference.

The physicist Brian Cox recently said of Gove’s expertise remark that it represents the way “back to the cave.” If that’s a fate we want to avoid, we’re stuck with distributed knowledge, and the reliance on others it involves.

That being so, we need to enhance trust in the knowledge-generating social structures we depend upon. Of course, a certain proportion of people are always going to insist that scientists are secretly lying to us for profit or that doctors are incompetent or evil. The paranoid style, as Richard Hofstadter called it, will always be with us. And there will always be demagogues willing to exploit that paranoia, to turn expertise into an us-and-them conflict, or to feed resentment and flatter egos by telling people they know better than their GP or climatologists.

But such views can only gain broader traction if people are alienated from those sources of knowledge, if they see them as disconnected from and perhaps even hostile to their own lives and interests.

Technical knowledge is predominantly produced by universities, and utilised by a political class. These are institutions that are much harder to trust if university is a place that nobody like you goes to, or if nobody in the political class sounds like you. It’s much easier to see “government” as some sort of malign, alien force if you have no investment in its processes or hope of benefiting from them. Equally, when “government” means your friends and family who work in public service rather than a distant and abstract locus of force and authority, pervasive suspicion becomes harder to maintain.

Expertise denial has become a deeply corrosive feature of modern political society. It needs to be called out wherever it appears. But we also need to think about how we reduce people’s disconnection from the sources of epistemic authority. That is a far more wickedly difficult problem. It’s one we’ll still be dealing with long after Hanson’s second fifteen minutes are over. But we can’t wait until then to start.

Patrick Stokes, Senior Lecturer in Philosophy, Deakin University

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.

 


Why we need to listen to the real experts in science

The Conversation

By Michael Clarke, La Trobe University and Susan Lawler, La Trobe University

If we want to use scientific thinking to solve problems, we need people to appreciate evidence and heed expert advice.

But the Australian suspicion of authority extends to experts, and this public cynicism can be manipulated to shift the tone and direction of debates. We have seen this happen in arguments about climate change.

This goes beyond the tall poppy syndrome. Disregard for experts who have spent years studying critical issues is a dangerous default position. The ability of our society to make decisions in the public interest is handicapped when evidence and thoughtfully presented arguments are ignored.

Anyone can claim to be an expert these days. Flickr/Alan Cleaver, CC BY

So why is science not used more effectively to address critical questions? We think there are several contributing factors, including the rise of Google experts and the limited skill set of scientists themselves. We think we need non-scientists to help us communicate with and serve the public better.

At a public meeting recently, when a well-informed and feisty elderly participant asked a question that referred to some research, a senior public servant replied: “Oh, everyone has a scientific study to justify their position, there is no end to the studies you could cite, I am sure, to support your point of view.”

This is a cynical statement, implying a world in which there are no absolute truths and everyone’s opinion must be treated as equally valid. In this intellectual framework, the findings of science can be easily dismissed as one of many conflicting views of reality.

Such a viewpoint is dangerous from our point of view.

When scientists disagree with one another, as they must to ensure progress in their field, it is easy to argue that it is not possible to distinguish between conflicting hypotheses. But scientists always agree that critical thinking done well eventually leads to a better understanding and superior solutions. All opinions are not equal.

If you are flying in an airplane at 30,000 feet, you will not be content with just any scientific study about whether the wing will stay on the plane. Most people will want to put their trust in the calculations of an expert aeronautical engineer who understands the physics of stresses on the wing.

So why do we not want to trust experts in bushfire management, or climate change? Because most people are happier with experts whose conclusions fit their own ideas.

This encourages people to express their opinions, and the internet allows those opinions to get a wide viewing. This makes for interesting times, but not always effective solutions.

Google experts

The internet is filled with information and ideas. Everyone can quickly find “answers”, and this means that everyone is an “expert”.

But using Google to find the answer to Trivial Pursuit questions is not the same as researching a complex question. Experts do have skills, and one of those is the ability to use high-quality sources, up-to-date theoretical frameworks, and critical thinking based on their experience in a particular field. This is why an expert’s answers are going to be more accurate and more nuanced than a novice’s.

For example, people who use Dr Google to diagnose their symptoms before visiting an actual doctor sometimes ask to be tested for diseases they do not have, or waste time seeking a second opinion because they are convinced that their “research” has led them to a correct diagnosis. If it were really that easy, would doctors have to spend all those years in medical school?

There is another problem called the Dunning-Kruger effect, which states that “people who lack the knowledge or wisdom to perform well are often unaware of this fact”.

In other words, people who think all answers can be found on Google are likely to be unaware of the effort involved in solving complex problems, or why years of specialist training might help.

This is almost more dangerous than complete ignorance, because unlike Donald Rumsfeld, they don’t even know what they don’t know.

Easy access to huge volumes of confusing information sits very comfortably in a post-modern world. Unfortunately, the outcome is that most people are reluctant to do the intellectual hard work of sifting through competing hypotheses. So how are we to engage in robust scientific debates in such a public arena?

Science is not enough

It has been said many times that scientists need to communicate their research more broadly. The challenges are well known – peer reviewed scientific publications are necessary for our careers and time spent engaging with the public is time away from the field, our computers and laboratory benches.

Nevertheless, if we hope to influence government policy we cannot assume that the implications of our research will be understood by those who most need to know what we are doing.

Reaching out to busy bureaucrats and politicians is not something that comes naturally to scientists. To turn science into policy we need a diverse team of people with different but complementary skills who share a commitment to the task.

Skills that are not commonly found in scientists may be found in political scientists, lawyers, sociologists, public relations companies, the arts community and the media.

Forming relationships with people who can translate our findings into something that cannot be ignored may be critical to success.

Consider what we are up against: lobby groups with deep pockets have come up with brilliant assaults on the thoughtful management of our environment.

“Cutting Green Tape” or “No fuels, no fire” – these clever bits of spin threaten decades of rigorous research and policy development. This is not a failure of science, but a triumph of imagination. We have been dramatically out-manoeuvred, shown to be amateurs, in the world of presenting competing ideas.

At a recent fire forum we learned that current policy is: “Based on science, but driven by values.” This means that despite the best evidence, the values of our current society will decide when to act. This introduces another definition of truth seeking, based on who made the best argument in a political or legal process.

Science is meant to be done dispassionately and objectively, so scientists are not well equipped to participate in debates about values. This is the realm of ethicists, philosophers, artists and theologians.

But if we are passionate about applying the lessons learned from our research, we will need marketers, lobbyists, communication experts, accountants and economists. A multi-disciplinary team is required to convince society to change.

Perhaps the people with these complementary skills will be able to help break down the anti-intellectualism we face, for the benefit of all.


This is based on an address delivered by Professor Michael Clarke at the 2nd Biodiversity Forum held at the Royal Society of Victoria, Melbourne in 2014.

This article was originally published on The Conversation. (Reblogged with permission). Read the original article.


Are you a poor logician? Logically, you might never know

The Conversation

By Stephan Lewandowsky, University of Bristol and Richard Pancost, University of Bristol

This is the second article in a series, How we make decisions, which explores our decision-making processes. How well do we consider all factors involved in a decision, and what helps and what holds us back?


It is an unfortunate paradox: if you’re bad at something, you probably also lack the skills to assess your own performance. And if you don’t know much about a topic, you’re unlikely to be aware of the scope of your own ignorance.

Type any keyword into a scientific search engine and a staggering number of published articles appears. “Climate change” yields 238,000 hits; “tobacco lung cancer” returns 14,500; and even the largely unloved “Arion ater” has earned a respectable 245 publications.

Experts are keenly aware of the vastness of the knowledge landscape in their fields. Ask any scholar and they will likely acknowledge how little they know relative to what is knowable – a realisation that may date back to Confucius.

Here is the catch: to know how much more there is to know requires knowledge to begin with. If you start without knowledge, you also do not know what you are missing out on.

This paradox gives rise to a famous result in experimental psychology known as the Dunning-Kruger effect. Named after Justin Kruger and David Dunning, it refers to a study they published in 1999. They showed that the more poorly people actually performed, the more they over-estimated their own performance.

People whose logical ability was in the bottom 12% (so that 88 out of 100 people performed better than they did) judged their own performance to be among the top third of the distribution. Conversely, the outstanding logicians who outperformed 86% of their peers judged themselves to be merely in the top quarter (roughly) of the distribution, thereby underestimating their performance.
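The asymmetry described above can be made concrete with a toy calculation. The percentile figures below are approximations read off this article’s summary of the Kruger and Dunning study (bottom performers near the 12th percentile placing themselves in the top third, roughly the 68th; top performers near the 86th percentile placing themselves around the top quarter, roughly the 75th) – they are illustrative placeholders, not the study’s raw data.

```python
# Toy sketch of the Dunning-Kruger miscalibration pattern.
# Percentiles are illustrative approximations taken from the text above.
groups = {
    "bottom performers": {"actual": 12, "self_rated": 68},
    "top performers": {"actual": 86, "self_rated": 75},
}

for name, p in groups.items():
    gap = p["self_rated"] - p["actual"]  # positive = over-estimation
    verdict = "over-estimate" if gap > 0 else "under-estimate"
    print(f"{name}: actual {p['actual']}th percentile, "
          f"self-rated {p['self_rated']}th, {verdict} by {abs(gap)} points")
```

The sign of the gap flips between the two groups: the weakest performers over-estimate by a wide margin, while the strongest under-estimate by a modest one, which is the signature of the effect.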

John Cleese argues that this effect is responsible for not only Hollywood but the actions of some mainstream media.

Ignorance is associated with exaggerated confidence in one’s abilities, whereas experts are unduly tentative about their performance. This basic finding has been replicated numerous times in many different circumstances. There is very little doubt about its status as a fundamental aspect of human behaviour.

Confidence and credibility

Here is the next catch: in the eyes of others, what matters most to judge a person’s credibility is their confidence. Research into the credibility of expert witnesses has identified the expert’s projected confidence as the most important determinant in judged credibility. Nearly half of people’s judgements of credibility can be explained on the basis of how confident the expert appears — more than on the basis of any other variable.

Does this mean that the poorest-performing — and hence most over-confident — expert is believed more than the top performer whose displayed confidence may be a little more tentative? This rather discomforting possibility cannot be ruled out on the basis of existing data.

But even short of this extreme possibility, the data on confidence and expert credibility give rise to another concern. In contested arenas, such as climate change, the Dunning-Kruger effect and its flow-on consequences can distort public perceptions of the true scientific state of affairs.

To illustrate, there is an overwhelming scientific consensus that greenhouse gas emissions from our economic activities are altering the Earth’s climate. This consensus is expressed in more than 95% of the scientific literature and it is shared by a similar fraction — 97–98% — of publishing experts in the area. In the present context, it is relevant that research has found that the “relative climate expertise and scientific prominence” of the few dissenting researchers “are substantially below that of the convinced researchers”.

Guess who, then, would be expected to appear particularly confident when they are invited to expound their views on TV, owing to the media’s failure to recognise (false) balance as (actual) bias? Yes, it’s the contrarian blogger who is paired with a climate expert in “debating” climate science and who thinks that hot brick buildings contribute to global warming.

‘I’m not an expert, but…’

How should actual experts — those who publish in the peer-reviewed literature in their area of expertise — deal with the problems that arise from Dunning-Kruger, the media’s failure to recognise “balance” as bias, and the fact that the public uses projected confidence as a cue for credibility?

Speaker of the US House of Representatives John Boehner admitted earlier this year he wasn’t qualified to comment on climate change.

We suggest two steps based on research findings.

The first focuses on the fact of a pervasive scientific consensus on climate change. As one of us has shown, the public’s perception of that consensus is pivotal in determining their acceptance of the scientific facts.

When people recognise that scientists agree on the climate problem, they too accept the existence of the problem. It is for this reason that Ed Maibach and colleagues, from the Center for Climate Change Communication at George Mason University, have recently called on climate scientists to set the record straight and inform the public that there is a scientific consensus that human-caused climate change is happening.

One might object that “setting the record straight” constitutes advocacy. We do not agree; sharing knowledge is not advocacy and, by extension, neither is sharing the strong consensus behind that knowledge. In the case of climate change, it simply informs the public of a fact that is widely misrepresented in the media.

The public has a right to know that there is a scientific consensus on climate change. How the public uses that knowledge is up to them. The line to advocacy would be crossed only if scientists articulated specific policy recommendations on the basis of that consensus.

The second step to introducing accurate scientific knowledge into public debates and decision-making pertains precisely to the boundary between scientific advice and advocacy. This is a nuanced issue, but some empirical evidence in a natural-resource management context suggests that the public wants scientists to do more than just analyse data and leave policy decisions to others.

Instead, the public wants scientists to work closely with managers and others to integrate scientific results into management decisions. This opinion appears to be equally shared by all stakeholders, from scientists to managers and interest groups.

Advocacy or understanding?

In a recent article, we wrote that “the only unequivocal tool for minimising climate change uncertainty is to decrease our greenhouse gas emissions”. Does this constitute advocacy, as portrayed by some commenters?

It is not. Our statement is analogous to arguing that “the only unequivocal tool for minimising your risk of lung cancer is to quit smoking”. Both statements are true. Both identify a link between a scientific consensus and a personal or political action.

Neither statement, however, advocates any specific response. After all, a smoker may gladly accept the risk of lung cancer if the enjoyment of tobacco outweighs the spectre of premature death — but the smoker must make an informed decision based on the scientific consensus on tobacco.

Likewise, the global public may decide to continue with business as usual, gladly accepting the risk to their children and grandchildren – but they should do so in full knowledge of the risks that arise from the existing scientific consensus on climate change.

Some scientists do advocate for specific policies, especially if their careers have evolved beyond simply conducting science and if they have taken new or additional roles in policy or leadership.

Most of us, however, carefully limit our statements to scientific evidence. In those cases, it is vital that we challenge spurious accusations of advocacy, because such claims serve to marginalise the voices of experts.

Portraying the simple sharing of scientific knowledge with the public as an act of advocacy has the pernicious effect of silencing scientists or removing their expert opinion from public debate. The consequence is that scientific evidence is lost to the public and is lost to the democratic process.

But in one specific way we are advocates. We advocate that our leaders recognise and understand the evidence.

We believe that sober policy decisions on climate change cannot be made when politicians claim that they are not scientists while also erroneously claiming that there is no scientific consensus.

We advocate that our leaders are morally obligated to make and justify their decisions in light of the best available scientific, social and economic understanding.



Stephan Lewandowsky receives funding from the Royal Society, from the World University Network (WUN), and from the ‘Great Western 4’ (GW4) consortium of English universities.

Richard Pancost receives funding from RCUK, the EU and the Leverhulme Trust.

This article was originally published on The Conversation. (Republished with permission). Read the original article.
