Tag Archives: vaccines

Tribal truth fallacy

During the 2020 US presidential election campaign, one of the starkest visual differences between Trump and Biden supporters was in the wearing of masks. Most Biden supporters appeared to wear masks, whereas most Trump supporters didn’t.

Even during Trump’s announcement of his nomination of Judge Amy Barrett to the US Supreme Court, very few people in the White House Rose Garden were wearing a mask. Worse still, some of the guests were seen hugging and kissing each other. 

As a result, the White House has become a COVID-19 ‘hotspot’ or super-spreader location, with seven attendees at the Justice Amy Barrett nomination announcement testing positive for coronavirus – including President Trump and the First Lady. More people caught the virus at the White House election ‘celebration night’, and 130 Secret Service agents have also tested positive.

It is clear that at least some, if not most, Trump supporters refuse to wear masks on political grounds. They seem to associate mask wearing with what they perceive to be ‘liberal’ pro-science attitudes. It is also possible that some Biden supporters might wear masks as a form of visual political opposition to the Trump supporters. In either case, this is irrational tribal behaviour.

A similar phenomenon may be occurring in the climate change debate. Some beliefs against human causes of climate change may be genuinely (but mistakenly) held on the basis of personal interpretation of the evidence. But at least some of the far-right wing opposition is due to a perception of climate science being some sort of left-wing plot against fossil fuel industries.

The far left is not immune from such irrational tribal behaviour either. At least some of the opposition to GMOs and vaccines seems to be based on ideological opposition to large agribusinesses and the pharmaceutical industry, rather than on evidence-based health concerns.

Another example is that some atheists oppose the idea of free will simply because Christians believe in it (despite the fact that prominent atheists such as Professor Daniel Dennett also believe in free will).

The tribal truth fallacy lies not in the falsity of beliefs about mask wearing, climate change, GMOs, vaccines or free will per se; but in the basis for these beliefs being identification with one’s own tribe or opposition to a rival tribe.

6 Comments

Filed under Logical fallacies

Fall of an antivaxxer

The Doctor Who Fooled the World
Andrew Wakefield’s War on Vaccines
By Brian Deer
Scribe, A$35.00

(An edited version of these book reviews was published in The Skeptic magazine, September 2020, Vol 40 No 3)


This is an important book. Whether it’s really about “the scientific deception of our time”, as the blurb on the back cover describes it, or the result of “the most extensive investigation by a reporter into an aspect of medicine ever undertaken”, as the author describes it, history will decide. But it has to be said that the subject of the book has caused immeasurable damage to the lives of many thousands, and possibly millions, of people.

Therefore we deemed it appropriate to run two reviews – the first by your editor, and the second by noted anti-anti-vaccination campaigner Peter Bowditch, who adds a personal perspective.

Brian Deer is an experienced journalist with a list of exposés of medical fraud and malpractice in his CV. But the Wakefield investigation is probably his magnum opus. He has spent 16 years following the saga of Andrew Wakefield, “the doctor without patients”. He wrote a number of revelatory articles on Wakefield’s progress for the Sunday Times – the publication that supported him throughout, both financially and legally – made many TV, radio and public appearances, and published a significant exposé in the British Medical Journal.

Readers of this publication will be aware of Andrew Wakefield’s role in the anti-vaccination movement – the search for autistic kids who could be linked to the measles/MMR vaccine, publication in The Lancet, and Wakefield’s promotion of the ‘link’, which led to a major decline in MMR vaccinations across the world and consequent increases in cases of measles and the damage that has caused.

Deer came to this story during the anti-vaccination campaign, with Wakefield a high-profile figure riding a wave of publicity and, frankly, adulation. From 2002 until the writing of this book in 2019, Deer has followed the vicissitudes of his subject, and played a key role in exposing a range of misconduct and duplicity that eventually led to the longest-ever inquiry by the UK General Medical Council (GMC). In January 2010, the GMC judged Wakefield to be “dishonest”, “unethical” and “callous”, and on 24 May 2010, Wakefield was struck off the UK medical register. Responding to Deer’s findings, The Lancet partially retracted Wakefield’s research in February 2004 and fully retracted it in February 2010 following the GMC findings. In 2011, Deer published his findings in the BMJ with an endorsement by the editors.

Wakefield, of course, became a tragic hero of the anti-vaccination movement, a martyr to the cause, now living in the US and still promoting the supposed autism link, despite the masses of evidence against him and his claims.

Needless to say, Wakefield does not come out of it looking rosy. In fact, Deer portrays Wakefield as an opportunist, a mediocre researcher who used his personal charisma as a tool to promote himself, and who cottoned on to a ‘good thing’ and milked it – and continues to milk it – for all it’s worth.

But Deer makes clear that, despite the book’s title, there were many others who contributed to the failed theory and whose involvement made them just as guilty as Wakefield; it is just that Wakefield had the charisma and drive – if not the medical knowledge and skill – to push the case to a broader public. These others include scientific and medical associates, adulatory followers, politicians, parents, learned journals, lawyers (importantly) and, of course, a complicit media. Some of these have paid the price of their association with Wakefield.

But Deer’s coverage of the media is a bit surprising. This reviewer’s background is journalism, and seeing how the media promoted and boosted Wakefield’s scare tactics was always disappointing, to say the least! There would not have been a vaccine scare without some media putting the case in hyperbolic terms. Their role in the growth of the anti-vaccination movement is considerable and intrinsic to spreading misinformation and paranoia.

Therefore, it is interesting that Deer doesn’t spend more time on the media’s involvement. Certainly, he makes reference to a number of specific and highly partisan journalists, such as Lorraine Fraser of the Mail on Sunday, Jeremy Paxman and Susan Watts on the BBC’s Newsnight, and Matt Lauer of NBC’s Today program, but his coverage of the media is as much about their attacks on him as it is about their support for Wakefield – and, in some cases, once Deer’s work had been publicised, their attempts to gazump him with a scoop.

With that in mind, Deer’s book covers a lot of his investigation in addition to what he is investigating. This adds an element akin to a detective thriller, which takes the book along at a very readable pace. Overall, media coverage notwithstanding, this is an excellent book. The depth and detail are spot on, from well-explained scientific and medical protocols and procedures to well-told human interest elements (the parents’ responses ranging from suspect support to desperate self-blame).

It is highly likely that this is the definitive version of the Wakefield and Co saga. Now for one on the evolution of the anti-vaccination movement to complement it.

PS: The story is bookended by a couple of Australians. In the beginning was John Walker-Smith, a gastroenterologist who was instrumental in testing the children who would eventually become the basis of The Lancet paper (and who only just missed out on suffering the same professional fate as Wakefield), and at the tail end we have Elle ‘the Body’ Macpherson, super-model and consort of the superstar Wakefield. One wonders whether everyone got their just deserts.

– Reviewed by Tim Mendham

In 1996 I was commissioned to write a book about the Internet. It was to explain the Internet to people who knew nothing about it, the technology behind it, or what it could be good and bad for. There was some hysteria about the possibility of a flood of pornography filling our lounge rooms, so I actually had to research porn (it was boring!) to answer the inevitable questions in interviews. I also looked for other forms of bad information, because it was obvious even then that there would be dubious information coming down the tubes. One of the bad things I found was a group of websites spreading fear about vaccinations. I commented at the time that none of the pornography I was forced to watch was as offensive as some of these sites.

In 1999 I started paying more attention to the anti-vaccination sites and it wasn’t long before I was sneeringly told that a paper by a Dr Andrew Wakefield had been published in The Lancet (the world’s second-most prestigious and influential medical journal) which proved that the MMR vaccine caused autism. As I had experience of people citing unlikely research results in the hope that nobody would check, I read the paper for myself (I had access to the medical library at Westmead Hospital) and it proved no such thing – it only suggested there might be a link. There were several red flags on the paper, one of which was that the editors of The Lancet felt the need to include an editorial statement implying the clichés “further research is needed” and “the science is not settled”.

The biggest red flag for me came from something I had been taught about research methodology at university – the sample of subjects looked too good to be true. It seemed highly unlikely that the parents of the children had independently and randomly sought out a doctor (who didn’t see patients!) at a small and relatively unknown London hospital. I mentioned my concerns in a conference presentation in 2001. The most charitable view was that there had been some cherry-picking going on mixed with some confirmation bias. The peer review process can’t always detect outright fraud, so this was a case of “the benefit of the doubt”.

But fraud it certainly was.

Journalist Brian Deer had been investigating suspicious matters around the pharmaceutical industry for some years, and in 2003 he was approached by an editor at the Sunday Times and asked to apply his investigative skills to the Wakefield story, which by then had started to have a serious effect on vaccination levels and public health. There was enough information and doubt from within the medical profession itself to suggest that the public didn’t know all the things it should have known, and it wasn’t long before the facts started coming out – that Wakefield was paid a large amount of money to find what he wanted to find, that he had applied for a patent on a measles vaccine that would have made him very wealthy if it replaced the current vaccine, that the subjects of the study had not been randomly chosen but had been supplied by a lawyer, Richard Barr, who intended taking legal action against vaccine manufacturers, that Barr had used a loophole in the regulations to strip-mine the Legal Aid system for tens of millions of pounds (shared with Wakefield), that Wakefield and Barr both had close associations with prominent anti-vaccination campaigners, that the laboratory doing the tests for measles DNA had less credibility than a school science project … the list went on.

In 2010 Wakefield’s registration as a medical practitioner was cancelled and The Lancet retracted the 1998 paper. It took too many years, but we thought that at last it might all be over. We were wrong.

Brian Deer (described as “a lying dog of a journalist” by a leading anti-vaccination campaigner) has now written a complete history of the Wakefield saga. The book goes back some years before the notorious 1998 paper to reveal the involvement of lawyer Barr and vociferous anti-vaccination organisations, through the almost unbelievable litany of lies, corruption and fraud that surrounded Wakefield and the coordinated attempts to use his fraudulent “research” to damage the public’s perception of the safety and efficacy of vaccines, to his elevation to hero status in the anti-vaccination movement and his current incarnation as the director and producer of anti-vaccination films like the execrable “Vaxxed: From Cover-Up to Catastrophe”. (The use of the word “cover-up” in the title caused irony meters across the world to shatter, given the way that Wakefield et al had covered up his deceit. Also, when the film first came out in 2016, someone commented about the propaganda: “Leni Riefenstahl would have baulked at making something this dishonest”.)

I could summarise the book into something like those old Reader’s Digest condensed novels, but I wouldn’t know what to leave out and this review would be about 300 pages long. The book is an essential read for anyone who has followed Wakefield over the years (and even I, who have followed him very closely, found many new things to wonder and grimace at). It is essential reading for anyone who thinks that scientific and medical research can’t be corrupted by greed and self-interest or to support an agenda. And it is essential reading for anyone who thinks for a nanosecond that the anti-vaccination movement is based on any philosophy that includes honesty, ethics or morality. Strangely, the book also reinforces the claim by anti-vaccinators that all medical research is corrupt and driven by money, although they will make an exception in this case.

You need this book. Buy it! Highly recommended.

– Reviewed by Peter Bowditch

1 Comment

Filed under Book reviews

Why anti-vaxxers get it so wrong

By Tim Harding

The inability to accurately appraise one’s own knowledge is a cognitive bias known as the Dunning-Kruger Effect, first identified from social psychology experiments conducted in 1999. Dunning-Kruger effects occur when individuals’ lack of knowledge about a particular subject leads them to inaccurately gauge their expertise on that subject. Ignorance of one’s own ignorance can lead people who lack knowledge on a subject to think of themselves as more expert than those who are comparatively better informed.

A recent study published in the peer-reviewed journal Social Science and Medicine (and summarised in The Conversation) demonstrated that at least some anti-vaccination views are based on the Dunning-Kruger Effect.  The study found that 71 per cent of those who strongly endorse misinformation about the link between vaccines and autism feel that they know as much or more than medical experts about the causes of autism, compared to only 28 per cent who most strongly reject that misinformation.

The researchers found that nearly a third (30 per cent) of people who think that they know more than medical experts about the causes of autism strongly support giving parents the latitude not to vaccinate their children. By contrast, 16 per cent of those who do not think that they know more than medical professionals felt the same way.

The study also found that people who think they know more than medical experts are more likely to trust information about vaccines from non-expert sources, such as celebrities. These individuals are also more likely to support a strong role for non-experts in the process of making policies about vaccines and vaccination.

Whilst these recent research findings may not come as a surprise to seasoned skeptics, we now have empirical evidence to explain why at least some anti-vaccination views are so irrational.

1 Comment

Filed under Logical fallacies, Reblogs

Who are you calling ‘anti-science’? How science serves social and political agendas

The Conversation

Left, right, populist, elitist: there are many different ways to be anti-science. arindambanerjee/shutterstock

Darrin Durant, University of Melbourne

Florida recently passed a law which “authorizes county residents to challenge use or adoption of instructional materials” in schools. It’s been described as “anti-science” by individual scientists and USA’s National Center for Science Education.


From climate change to vaccination, genetic modification and energy security, anti-science is used as a critical phrase implying a person or group is rejecting science outright.

But it’s not that simple.

All shades of political positions are routinely ambivalent about science. Neither the right nor the left of politics is a consistent supporter or attacker of science.


Read more: Why politicians think they know better than scientists


If there is no one definition of anti-science that works across all settings, why does it matter that we know anti-science means different things to different people? The reason is that science remains a key resource in arguing for social and political change or non-change.

Knowing what counts as anti-science for distinct groups can help illuminate what people take to be the proper grounds for social and political decision-making.

Left, right, populist, elitist

First up, I’ll define some broad terms.

To be politically “left” is to be concerned about social and economic equality, sometimes cultural equality too, and usually a state big enough to protect the less fortunate and less powerful.

To be “right” is to be concerned about individual autonomy and a state small enough to let markets and personal responsibility decide fates rather than central planners.

To be “populist” involves being anti-elite, anti-pluralist (the “us vs them” view of civic relations), tending toward conspiracy theories, and displaying a preference for direct over representative democracy.

It’s also worth noting here that science can be viewed as an elite endeavour – not elitist in the two main negative senses, of being impractical or of being practised by special people somehow different in kind from the rest of us. Instead, I mean science is elitist in the more technical sense of being a professionalised body of practice.

To become a scientist is to be admitted to an elite group in society – not everyone can attend events like Science and Technology 2017 Conference held in Hofburg Palace, Austria. ctbto/flickr, CC BY

The skills and knowledge possessed by scientists are gained by social immersion in various forms of training regimens. Both those learning contexts and the resulting skills and knowledge gained are not widely participated in, nor widely distributed. The experience-based and often professionalised context of science creates a select group.

Different flavours of anti-science

To make clear the way anti-science comes in different political flavours, let me first make some general claims.

Populists of either left-wing or right-wing persuasions distrust elites, and that can be enough for populists to be at least suspicious of factual claims produced at a distance from them. Pauline Hanson, for example, said that public vaccinations are a worry and parents should do their own research, including getting a (non-existent) test of their child for negative effects.

Anti-science among the mainstream left and right wings of politics is more complex. Each shares a worry that science can be corrupted, but the left blames capitalist profiteering, and the right blames careerist attempts to distort the market.

Each also shares a worry that science can engulf politics, but the left worries that technical answers will displace deliberative politics, and the right worries that science will displace traditional values as the motor of social change.

But whereas anti-science from the left arises as a label for apprehensions about the application of science, anti-science from the right arises as a label for apprehensions about science’s raw ability to discover causal connections.

Populist left

Skeptic magazine publisher Michael Shermer thinks the populist left are anti-science by virtue of disliking genetically modified organisms (GMOs), nuclear power, fracking and vaccines. According to him, they shockingly obsess over the “purity and sanctity of air, water and especially food”.

But writer Chris Mooney is correct to reply that, taking vaccine-related scepticism as an example, Shermer has picked up on conspiracist rather than leftist beliefs.

Lacking authoritarianism, today’s populist-left disquiet with science is actually a lament that production-science tramples human values.

An example might be the Australian Vaccination Network, which claims to be neither pro- nor anti-vaccination and instead “pro-choice”. The populist left in this case pushes parental rights to the limit, presenting it as sufficient for decision-making yet under threat by larger institutions and their “foreign” ways.

Mainstream left

The mainstream-left are more ambivalent than straight anti-anything. GMOs and nuclear power are suspect? Climate science and vaccinations are promising? Leftist anti-science is more about anti-corruption and wariness that technical reasoning will supplant values debates in our democracies.

Greenpeace believes some kinds of scientific evidence, but distrusts others. takver/flickr, CC BY-SA

The Greenpeace critique of GMOs is a good example. Greenpeace appeals for independent science but suggests agro-chemical corporations are corrupting it, and it calls for ethical-political deliberation about our food supply, not just dry technical assessments of safety.

Populist right

The populist-right implies shadow governments conspire against the market and the people, as when the One Nation senator Malcolm Roberts reportedly claimed climate change science had been captured by “some of the major banking families in the world” who form a “tight-knit cabal”.

In general, the populist-right’s anti-science is just pro-conspiracist.

Mainstream right (small-state conservatives)

The mainstream-right is more complicated.

Sociologist Gordon Gauchat found that to be anti-science the political right had to score high on four dimensions:

  • religiosity
  • authoritarianism
  • distrust of government, and
  • scientific literacy (surprisingly).

They sometimes parrot the left’s allegations of corruption, but the mainstream-right and the populist-right approach corruption differently.

The mainstream-right is loath to imply a shadow world order, as that disrupts the ideology of the market. Instead, they limit the corruption implication to accusations of groupthink that distort the market (the typical example being climate scientists shutting down dissent for careerist reasons).

The mainstream-right has bigger fish to fry. Philosopher Heather Douglas has ideas about why the political right leans toward anti-science.

Douglas argues that shifts in the public-private boundary, whereby private behaviours become treated as matters of public concern, trouble the right more than the left. Social change is thus viewed more positively by progressive leftists than traditionalist right-wingers.

Douglas suggests that science routinely discovers causal relationships that prompt shifts in the public-private boundary – like finding that waste has human and biosphere effects beyond the individual. That means science is pitted directly against traditional values as one of the motors of social change.

Not every example fits Douglas’ pattern. The Australian Liberal Party has been described as undermining renewable energy and being resistant to meaningful policy action on climate change, but clearly supports vaccination. Is that because, for the right, vaccinations expand the market, and right-wingers are more comfortable with social change driven by markets?

The predatory influence science can exert over important ethical-political issues troubles both left and right-wingers.

But where the left worries about the application of science to broader issues, small-state conservatives implicitly react to the means of production that enable political application: the discovery of causal relationships. The observations and experiments that feed into community-based assessments of causality constitute the core of science, not its secondary application to social issues.

As regulatory science has grown since the 1950s, small-state conservatives watched it expand the state by showing the private could be public. Science is a well-resourced competitor among the motors of social change.

Small-state conservatives experience science as guiding social change, a function they want to preserve for traditional values. Small-state conservatives are the true heirs to anti-science.

When the historian Naomi Oreskes talks of merchants of doubt – right-wing free marketers opposed to environmental regulation – she is in my judgement talking about small-state conservatives worried that science is a motor of change outside their sphere of direct control.

What anti-science isn’t, and what it might be

In his book How to be Antiscientific, Steven Shapin argues that descriptions of science, and what ought to be done in science, vary tremendously among scientists themselves.

So you’re not anti-science if you have a preference for or against a particular method, some particular philosophy of science, or some supposed “character” of science.

Nor are you anti-science because you highlight the uncertainties, the unknowns and the conditionality of scientific knowledge. Even when you are the outsider to science. That’s called free speech in a democracy.

Where does that leave anti-science? Maybe it leaves anti-science living with small-state conservatives, who in effect cast aspersions about something that might be essential to the ideal of scientific authority having a positive and functional relationship with democracy. That is, science as a public good.

If you end up denying the relevance of science to informing or guiding democratic decision-making, because you want some value untouched by information to do that guidance work, maybe that makes you about as anti-scientific as democracies can tolerate.


Read more: Should scientists engage with pseudo-science or anti-science?


Darrin Durant, Lecturer in Science and Technology Studies, University of Melbourne

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.

Leave a comment

Filed under Reblogs

The number of new flu viruses is increasing, and could lead to a pandemic

The Conversation

C Raina MacIntyre, UNSW; Abrar Ahmad Chughtai, UNSW, and Chau Bui, UNSW

Influenza has affected humans for over 6,000 years, causing pandemics at regular intervals. During the 1918 Spanish flu, it was thought to be a bacterium, until the American physician Richard Shope identified the virus in 1931.

So how is it this pathogen has managed to stay around for so long, and why haven’t we beaten it yet? The answer is that influenza is a virus that changes rapidly and regularly.

New flu vaccines are required every year due to these changes and mutations of the virus. While all flu viruses which infect humans are similar, a pandemic virus (which is easily transmitted between humans) is significant because humans have no immunity to it, and so are vulnerable to severe infection and death. Seasonal viruses which we see year after year were once pandemic strains, but humans have now been exposed to these viruses and have some background immunity to them.

We have found that the last decade has seen an acceleration in the number of flu strains infecting humans.

Why are there so many flu strains today?

Around 100 years ago the world experienced the Spanish flu pandemic, and it took another 39 years for a novel influenza virus to emerge. It took a decade after that for the next one. Since 2011, however, we have seen seven novel and variant strains emerge. This is a very large increase compared to the past.

The reasons for this increase are unknown, but there could be many. One reason could be better diagnostics and testing; another could be changes in poultry farming and animal management practices, since influenza is a virus that affects humans, birds and many animal species; others could be changes in climate, urbanisation and other ecological influences.

But none of these factors have changed at the same rate as the emergence of new viruses has escalated. This warrants new research to unpack the relative contributions of all the different possible factors.

Another change is advances in genetic engineering tools, which make it possible to edit the genome of any living organism, including viruses. The possibility of a lab accident or deliberate release of engineered flu viruses is real. Experiments to engineer influenza viruses have been published since 2011, and remain controversial for the possible risk, compared to the relative possible benefit.

With so many more novel influenza viruses emerging and circulating, the probability of genetic mutation and emergence of a new pandemic strain is higher today than any time in the past. It’s a matter of when, not if.

What can we do to prevent a pandemic?

There’s actually already a lot being done to plan for and prevent another flu pandemic. This is both in terms of pharmaceutical drugs and vaccines, and non-pharmaceutical interventions like personal protective equipment, quarantine, border control and banning of mass gatherings in the event of an outbreak.

National pandemic plans outline interventions and the best sequence of different interventions, as well as prioritisation of these interventions. Most countries also conduct pandemic hypotheticals to test their systems and responses. But the best laid plans do not account for every possibility, and we usually encounter the unexpected.

For example, during the 2009 swine flu pandemic, the pandemic phases outlined in the Australian pandemic plan were revised to better fit the emerging situation. This highlights the need to be able to rapidly respond to changing circumstances and change strategies when required.

What about vaccines?

Vaccination is the most talked about strategy but producing a matched vaccine takes three to six months at a minimum. The pandemic would be expected to peak within about two months, so vaccines can’t be relied on until after the peak of the pandemic. Instead, we need to use antiviral medications, social distancing measures, personal protective equipment such as masks and gloves, isolation and quarantine to contain the pandemic.

Influenza vaccines are specific to strains of flu, and can be used for humans, birds or animals. However, they will only work against the specific strains the vaccine was designed for. There are no vaccines for many of the novel strains emerging all over the world.

It’s almost impossible to anticipate which specific virus will cause the next pandemic. At best we can prepare pre-pandemic vaccines which require an educated guess as to which virus may mutate into a pandemic strain, and make a vaccine against that.

A strain-specific pandemic planning strategy like this is not the best approach, as illustrated by the swine flu pandemic in 2009. From 2005 until 2009, the avian flu virus H5N1 (flu viruses are defined and named by proteins on their surface, haemagglutinin – H, and neuraminidase – N) was the major cause of bird flu, so the world focused heavily on preparing for a H5N1 pandemic and developing a H5 pre-pandemic vaccine.

However, the virus that caused the 2009 pandemic was H1N1, a completely different virus, so the pre-pandemic vaccines were no use.

A better approach is to try to prevent the emergence of new virus strains in birds and animals, and mitigate the risks once they emerge. This involves control strategies in both animal and human health sectors, surveillance and prevention efforts.

A targeted approach in global hotspots such as China, the source of the H7N9 influenza virus, and Egypt, which is experiencing a surge in H5N1 influenza, will also help.

Hotspots are generally where humans and livestock mix in close proximity, such as backyard poultry farms and live bird markets. Asia has historically been such a site. However, we sometimes see unusual outbreaks such as the bird flu outbreak in turkey farms in the USA in 2015.

Culling of birds is a commonly used method to control the risk once infection is detected, as are measures such as regulation of live bird markets and of the poultry and livestock industries. Excellent surveillance, rapid intelligence and picking up potential pandemics as they arise can make all the difference. We probably had a near-miss pandemic strain arising in Indonesia in 2006, but the remote location and early detection mitigated the risk.

C Raina MacIntyre, Professor of Infectious Diseases Epidemiology, Head of the School of Public Health and Community Medicine, UNSW; Abrar Ahmad Chughtai, Epidemiologist, UNSW, and Chau Bui, PhD candidate, UNSW

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.

Leave a comment

Filed under Reblogs

Here’s why we don’t have a vaccine for Zika (and other mosquito-borne viruses)

The Conversation

Suresh Mahalingam, Griffith University and Michael Rolph, Griffith University

As Zika fear rises, especially in the wake of the World Health Organization last night declaring a state of public health emergency, people are inevitably asking why we don’t have a vaccine to protect against the mosquito-borne virus.

Zika is generally a mild illness, causing fever, rash and joint pain, which usually resolves within seven to ten days. It was originally restricted to small outbreaks in the Pacific islands, Southeast Asia and Africa.

Due to the previously low impact of the virus and the estimated US$160-500 million it costs to develop a vaccine, Zika vaccine has not been on the radar. Other severe and potentially fatal mosquito-borne diseases such as malaria, dengue, and West Nile virus affect millions of people each year and have been a higher priority.

That has all changed with the recent “explosive” spread of Zika in the Americas and the potential link with microcephaly (reduced head size and brain damage) in babies of pregnant women who were infected.

Now we’re playing catch up on the research needed to develop vaccines. We know very little about how Zika replicates, how it causes disease, or how the immune system protects against infection.

So what is the status of Zika vaccine development? And how does this compare with the other mosquito-borne viruses that continue to have such a devastating impact on the world’s health?

Vaccine development

The ideal vaccine induces a strong response from the immune system, gives long-term protection with few doses, and causes no side effects. But developing such a vaccine quickly is rarely that simple.

Zika

It’s early days, but scientists from the Public Health Agency of Canada, the Butantan Institute in Brazil, and the US National Institutes of Health have started work on Zika vaccines. These research teams may have vaccine candidates ready for initial clinical trials towards the end of the year.

Although full regulatory approval of a successful vaccine would take many years, it could potentially be used in public health emergencies within a year.

Yellow fever

The yellow fever vaccine, developed in 1938, has been highly successful at protecting against the virus, which can cause bleeding, jaundice, kidney and liver failure and, ultimately, death. Of the 44 countries at risk of yellow fever in Africa and the Americas, 35 have incorporated yellow fever vaccines into infant immunisation programs.

It is a live vaccine, in which a “weakened” virus induces a protective immune response against subsequent infection.

The yellow fever vaccine successfully protects against the virus. UNAMID/Flickr, CC BY-NC-ND

Live vaccines generally give strong protection, but safety is a significant issue, particularly in people with a weakened immune system.

Dengue

Dengue fever is a widespread tropical disease caused by dengue virus, which is transmitted by mosquitoes. Late-stage clinical trials of dengue vaccines are underway, and a vaccine has recently been licensed for use, but so far only in Mexico.

The field is littered with promising but failed vaccines that could not provide protection against the major strains of dengue virus. Nonetheless, there is hope that one will be available more widely in the coming years.

Chikungunya

Chikungunya virus has recently emerged as a serious human pathogen, causing fever and excruciating pain in the joints that can last months.

As with Zika, chikungunya was long considered unimportant because of its limited geographic distribution. Its dramatic expansion over the past decade, particularly in Southeast Asia and the Americas, has led to mobilisation of the vast medical research capabilities of the United States in response to the threat of it becoming established there.

Chikungunya vaccine development is proceeding rapidly, with a number of vaccines entering clinical trials. Researchers have reported early successes, but we are at least several years away from getting an approved vaccine.

Malaria

The big one is malaria, which kills more than 400,000 people a year. Scientists have been working on malaria vaccines for decades.

The RTS,S vaccine, developed by GlaxoSmithKline, was successful in clinical trials and may soon be routinely used.

However, it only worked for some patient groups and provided only partial protection. Given its partial efficacy, there is debate in the medical community about the vaccine’s value.

The search continues for better vaccines.

Why is it so difficult to develop vaccines?

There is no recipe for the perfect vaccine. Despite the ever-increasing sophistication of vaccine technology, vaccine development often comes down to “suck it and see”. Many vaccines look promising in pre-clinical testing, only to fall over during the slow and expensive clinical trial process.

For many infectious diseases, we still don’t know what type of immune response is most effective in providing protection. Since vaccines work by inducing a protective immune response against infection, this can make vaccine design very difficult.

Vaccine safety is a major issue. “Live” or “attenuated” vaccines that involve a related or weakened version of the pathogen are often the most effective. But there is still the potential for these vaccines to cause disease, especially in recipients with weakened immune systems.

Vaccines go through a long process of clinical trials and assessment by regulators before they are approved for routine human use. This is a necessary process, but it sets a very high bar for approval. One of the most successful vaccines ever produced – the smallpox vaccine – is a live vaccine and would probably not have been approved by today’s regulators due to safety concerns.

Smallpox was eradicated in 1980.
Pan American Health Organization/Flickr, CC BY-ND

For dengue, there is an additional complication. People previously infected with dengue are at risk of developing much more severe disease when infected with a second, related dengue strain. Similarly, dengue vaccination could also lead to enhanced disease, rather than protection, when a person subsequently encounters the virus. This additional safety concern has markedly complicated and slowed dengue vaccine development.

Urgent priority

Zika causes mild fever in humans that on its own does not make a strong argument for a vaccine. But the possible link to microcephaly in unborn children, even though not yet definitely confirmed, makes vaccine development – and necessary funding – an urgent priority.

It’s also important to fund basic research to provide a necessary springboard for current and future vaccine development programs.

In the meantime, people in affected areas, including travellers, should take care to avoid mosquito bites by wearing long clothing and using repellents, bed nets and window screens.

Suresh Mahalingam, Principal Research Leader, Institute for Glycomics, Griffith University and Michael Rolph, Senior Research Fellow, Griffith University

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


1 Comment

Filed under Reblogs

Dumb or dumber? Jim Carrey’s anti-vax antics expose the tactics of internet cranks

The Conversation

Michael J. I. Brown, Monash University and Geraint Lewis, University of Sydney

Comedian Jim Carrey flew his anti-vaccination colours on Twitter last week, railing against new Californian laws designed to reduce the number of unvaccinated children in public schools.

“California Gov says yes to poisoning more children with mercury and aluminum in mandatory vaccines. This corporate fascist must be stopped.”

Along the way, Carrey has provided some excellent examples of the tactics used by online cranks, such as emotional escalation, errors of omission, dismissing experts and proclaiming to support science while simultaneously undermining it.

It is by teasing out these crank tactics that we can see Carrey’s tweets for what they are: well-intentioned but misguided attacks against a lifesaving practice that has been proven time and again to be safe and effective according to our very best scientific practices.

Emotional escalation

Carrey’s tweets are notable for his use of CAPSLOCK, the typed equivalent of shouting.

“A trillion dollars buys a lot of expert opinions. Will it buy you? TOXIN FREE VACCINES, A REASONABLE REQUEST!”

Emotional escalation, including yelling and insults, tend to polarise debates. Uncivil comments can also adversely impact how people interpret facts they have previously read.

Carrey’s anti-vaccination tweets also included photos of autistic boys, further escalating emotions. We feel sympathy for the boys and their families, but this is a poor substitute for statistical studies, which haven’t found any connection between vaccines and autism.

Carrey’s use of photos of autistic boys may also have backfired. One photo showed Alex Echols, who suffers from Tuberous Sclerosis, a genetic disorder that often leads to autism. Echols’ autism has nothing to do with vaccines, yet he was initially used to emotionally bolster Carrey’s arguments (Carrey has since apologised).

Mercury matters

Ideas touted by cranks are often superficially true, yet misleading. A splendid example is:

“They say mercury in fish is dangerous but forcing all of our children to be injected with mercury in thimerosol is no risk. Make sense?”

This contains an error of omission, as it actually refers to two different mercury-containing compounds. Methylmercury accumulates in animals and is dangerous when ingested. Thimerosal was once a common preservative in vaccines; it breaks down into ethylmercury, which is rapidly removed from the body.

Does Carrey’s tweet “make sense”? No.

I’m pro-science

Historian of science Michael Gordin succinctly notes:

“No one in the history of the world has ever self-identified as a pseudoscientist. There is no person who wakes up in the morning and thinks to himself, ‘I’ll just head into my pseudolaboratory and perform some pseudoexperiments to try to confirm my pseudotheories with pseudofacts.'”

Cranks often proclaim their love of science while simultaneously attacking it. Carrey tweeted:

“I repeat! I AM PRO-VACCINE/ANTI-NEUROTOXIN, as is Robert Kennedy Jr. Please read the following article and book http://bit.ly/1GLSpHf”

Carrey claims to be pro-vaccine while credulously repeating dangerous myths about their risks. He lacks expertise to evaluate studies of the efficacy and risks of vaccines (which often use similar scientific techniques), but has reached strong yet contrary opinions on these topics. How can this be reasonable?

The tactic of proclaiming support for science while simultaneously undermining it isn’t restricted to comedians. The Australian newspaper has claimed it “supports global action on climate change based on the science,” but often repeats stories sourced from the internet that reject peer-reviewed climate science.

Dismissing experts

So how does Carrey dismiss the work of thousands of medical researchers from around the globe? Very easily. Like many internet cranks, he makes unfounded accusations of scientific organisations being corrupt:

“The CDC can’t solve a problem they helped start. It’s too risky to admit they have been wrong about mercury/thimerasol. They are corrupt.”

This is a very common tactic for dismissing broad swathes of evidence. Some climate contrarians believe scientists are engaged in criminal activity.

Such claims get into conspiracy theory territory, particularly as independent groups of scientists scattered across the globe get comparable results. For example, the American Berkeley Earth team – which started off sympathetic to climate change sceptics – finds a temperature rise across Australia very similar to that found by Bureau of Meteorology scientists.

Even sympathetic media generally tone down bloggers’ claims of criminal activity. That said, it is curious that innocent activities such as data processing and analysis are sometimes referred to as (more ominous sounding) data manipulation.

A less severe variant of the corruption tactic is claiming experts have a conflict of interest, as they are paid to undertake their work. Of course, this allows one to dismiss evidence from almost any professional – be that a doctor, lawyer, psychologist or scientist – leaving only courageous internet amateurs.

Popularity

Why is anyone paying attention to Carrey when it comes to vaccines? The answer is celebrity. He is a successful actor, with almost 15 million followers on Twitter. If he says something controversial, millions of people immediately know about it.

Crank ideas, which have been rejected by the scientific community, only remain alive while they have support from the public, celebrities, millionaires or politicians. Without popularity, crank ideas wither and die.

Cranks and their supporters know they must remain popular to survive, and game the system. Cranks often badge themselves as “coalitions”, “institutes”, “networks” and “alliances.” Cranks can buy social media followers or use “follow back” accounts to give the appearance of significant support. Websites often contain myriad links to fellow cranks, which may be an attempt to game search engine rankings. So cranks may appear more significant to the public and media than they truly are.

Of course, to have celebrity support is incredibly helpful to cranks. Along with Carrey, Bill Maher, Robert F Kennedy Jr. and Jenny McCarthy have promoted the anti-vaccination cause. They have helped keep this cause alive, even though it’s at odds with medical research.

Carrey almost certainly means well. But, like many internet cranks, he doesn’t have the expertise to distinguish scientific fact from dangerous myth. The recent death of a woman from measles and the Disneyland measles outbreak highlight just how dangerous such myths can be.


Michael J. I. Brown is Associate Professor at Monash University. Geraint Lewis is Professor of Astrophysics at the University of Sydney.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.

Leave a comment

Filed under Reblogs

In the vaccine debate, science is just getting its boots on

The Conversation

Ian Musgrave

There is an old saying that a lie will be heard around the world while truth is still getting its boots on. This was brought home to me during a radio interview I did on Tuesday night in the wake of the Federal Government’s decision to remove the conscientious objection exemption for vaccination. I was astonished that in 2015, some of these pieces of misinformation are still out there, and still believed, if the passionate radio callers (and several posts in my Facebook feed) are any indication.

Here is a sample of some of the misinformation and misunderstandings I encountered on the radio show and on the internet in the past 24 hours (paraphrased slightly).

“Why should we inject our kids with polyethylene glycol/brake fluid?” We don’t. There is no ethylene glycol in our vaccines. We do have harmless traces of a completely different chemical, 2-phenoxyethanol, an antibacterial that helps keep the vaccines sterile.

“Why are we injecting our kids with formaldehyde?” Formaldehyde is used to inactivate viruses in some vaccines. After clean-up, minute traces are left, but the amount you would get from a vaccine injection is much less than is circulating naturally in your blood. Yes, your body makes formaldehyde. If you are seriously worried about formaldehyde, don’t eat apples or pears, which contain much more formaldehyde than vaccines. For details see here and here.

“Why are we injecting our kids with mercury?” We aren’t; there has been no mercury in kids’ vaccines in Australia since 2000, especially those in the vaccination schedule. Note that the amount of mercury in the Thiomersal preservative is less than what you would get from eating a can of tuna, and no one seems to be advocating a fish-free diet for kids.

“Why are we still giving kids smallpox vaccine when smallpox is extinct?” We are not. And I am astonished that anyone would think that we did, but this (paraphrased) was an actual question.

Measles vaccination conquers measles. Source: Epidemiol Rev (2002) 24 (2): 125-136. doi: 10.1093/epirev/mxf002

“But we don’t need vaccines, these diseases were already going before vaccines”. Nope, see that graph? That’s the incidence of measles in the UK before and after the vaccine; note the strong correlation between the fall in measles and the vaccine coverage of the population. Similar graphs are seen for the US and Canada (see here for the most dishonest anti-vaccination graph ever).

Australia stopped collecting data on measles incidence so there is a big gap in our data, but the incidence of the disease was higher before the vaccine than after. The same goes for pertussis (we had just had an epidemic when the vaccine was introduced), diphtheria and Haemophilus influenzae B (and if you want to claim it’s all hygiene and diet, the Hib vaccine was introduced in the ’90s, when nutrition and hygiene were at modern standards). See the Australian Academy of Science’s “science of vaccination” for graphs and details.

“There have been no deaths from measles since 2000.” This is actually a false statement about US data. 2000 was the year that endemic measles was declared eliminated in the US. In Australia, we haven’t had a measles death since 1995 – unsurprisingly, since vaccination has been so effective.

However, in the US there have been eight deaths during the epidemics caused by unvaccinated people catching measles overseas and bringing it back to the US, where it spreads mostly amongst the unvaccinated. In the US, this claim is usually linked to the heinous meme “no measles deaths since 2000, hundreds of measles deaths from the measles vaccine”. This pernicious statement is untrue: there have been no deaths due to the measles vaccine.

“What about that study that showed vaccines cause autism?” No, just no. Andrew Wakefield’s study, since retracted for unethical conduct, was so sloppy that it was meaningless, and may even be fraudulent. This unethical study has caused thousands of people to forgo measles vaccines, with kids getting caught in epidemics that should never have happened.

In the debate about our response to under vaccination, it is assumed that people refusing vaccines are making rational choices, weighing up the pros and cons of vaccination versus side effects with the best available data.

The controversial Leunig cartoon that shows a mother fleeing a barrage of syringes inadvertently sums up what it is really about.

Fear

As the talking points I’ve encountered show, people are coming up with objections that are either wildly distorted or flat out untrue but they all have one thing in common. They all directly stoke the fear that by vaccinating our children we will harm them. A rational choice is difficult to make in this environment.

That a lie can travel around the world before truth gets its boots on is never truer than in this debate. This recent article contains talking points not covered above that are either not true or wildly distorted (Fluarix does not contain foetal bovine serum, the virus for the vaccine is grown in eggs; vaccinations are not intravenous and so on). But I’ve already spent three days and over 1,000 words to cover the standard false or misleading claims and I have to stop at some point.

All the items I talked about have been dealt with long ago. But if you do an internet search for “Australian vaccine information”, three of the top five hits are vaccine denialist sites. In this age of Dr Google, sites that play on fears will trump the more sober (and boring) official sites.

My approach to vaccine refusers (the people whose decisions have been influenced by misinformation and fear, as opposed to hard core vaccine denialists) is to provide them with better and more accessible information.

This may not work as well as might naively be imagined. A study on the best way to provide accurate vaccine information to parents who had previously failed to vaccinate their children found that, although the parents’ understanding of vaccine safety improved, they were no more likely to have their children vaccinated. Some parents became even less likely to vaccinate their children.

Even in the light of this somewhat depressing knowledge, we should not stop trying to get the truth out there. One of the difficulties in communicating vaccine facts is that these may leave a gap in people’s beliefs (accounting for their reluctance to accept the facts). An approach I’ve mentioned before is replacing the gap with an alternative narrative. Whichever approach we use, we need to keep the facts front and centre.

Remember, this is not just abstract knowledge, or “cute science facts”, but information that will keep real kids out of hospital and in some cases save lives.

Truth (and science) may take time to get its boots on, but those boots were made for walking, and the journey has just begun.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs

Infections of the mind: why anti-vaxxers just ‘know’ they’re right

The Conversation

Thom Scott-Phillips, Durham University

Anti-vaccination beliefs can cause real, substantive harm, as shown by the recent outbreak of measles in the US. These developments are as shocking and distressing as their consequences are predictable. But if the consequences are so predictable, why do the beliefs persist?

It is not simply that anti-vaxxers don’t understand how vaccines work (some of them may not, but not all of them). Neither are anti-vaxxers simply resistant to all of modern medicine (I’m sure that many of them still take pain killers when they need to). So the matter is not as simple as plain stupidity. Some anti-vaxxers are not that stupid, and some stupid people are not anti-vaxxers. There is something more subtle going on.

Naïve theories

We all have what psychologists call “folk” theories, or “naïve” theories, of how the world works. You do not need to learn Newton’s laws to believe that an object will fall to the floor if there is nothing to support it. This is just something you “know” by virtue of being human. It is part of our naïve physics, and it gives us good predictions of what will happen to medium-sized objects on planet earth.

Naïve physics is not such a good guide outside of this environment. Academic physics, which deals with very large and very small objects, and with the universe beyond our own planet, often produces findings that are an affront to common sense.

A life force. Food by Shutterstock

As well as physics, we also have naïve theories about the natural world (naïve biology) and the social world (naïve psychology). An example of naïve biology is “vitalistic causality” – the intuitive belief that a vital power or life force, acquired from food and water, is what makes humans active, prevents them from being taken ill, and enables them to grow. Children have this belief from a very young age.

Naïve theories of all kinds tend to persist even in the face of contradictory arguments and evidence. Interestingly, they persist even in the minds of those who, at a more reflective level of understanding, know them to be false.

In one study, adults were asked to determine, as quickly as possible, whether a statement was scientifically true or false. These statements were either scientifically true and naïvely true (“A moving bullet loses speed”), scientifically true but naïvely false (“A moving bullet loses height”), scientifically false but naïvely true (“A moving bullet loses force”), or scientifically false and naïvely false (“A moving bullet loses weight”).

Adults with a high degree of science education got the questions right, but were significantly slower to answer when the naïve theory contradicted their scientific understanding. Scientific understanding does not replace naïve theories; it just suppresses them.

Sticky ideas

As ideas spread through a population, some stick and become common, while others do not. The science of how and why ideas spread through populations is called cultural epidemiology. More and more results in this area are showing how naïve theories play a major role in making some ideas stickier than others. Just as we have a natural biological vulnerability to some bacteria and not others, we have a natural psychological vulnerability to some ideas and not others. Some beliefs, good and bad, are just plain infectious.

Here is an example. Bloodletting persisted in the West for centuries, even though it was more often than not harmful to the patient. A recent survey of the ethnographic data showed that bloodletting has been practiced in one form or another in many unrelated cultures, across the whole world.

Paraphernalia. (Source: Peter Merholz, CC BY-SA)

A follow-up experiment showed how stories that do not originally have any mention of bloodletting (for instance, about an accidental cut) can, when repeated over and over again, become stories about bloodletting, even among individuals with no cultural experience of bloodletting.

These results cannot be explained by bloodletting’s medical efficacy (since it is harmful), or by the perceived prestige of western physicians (since many of the populations surveyed had no exposure to them). Instead, the cultural success of bloodletting is due to the fact that it chimes with our naïve biology, and in particular with our intuitive ideas of vitalistic causality.

Bloodletting is a natural response to a naïve belief that the individual’s life force has been polluted in some way, and that this pollution must be removed. Anti-vaccination beliefs are a natural complement to this: vaccinations are a potential poison that must be kept from the body at all costs.

At an intuitive, naïve level we can all identify with these beliefs. That is why they can be satirised in mainstream entertainment.

In Stanley Kubrick’s great comedy Dr. Strangelove, the American general Jack D. Ripper explains to Lionel Mandrake, a group captain in the Royal Air Force, that he only drinks “distilled water, or rainwater, and only pure grain alcohol”, because, he believes, tap water is being deliberately infected by Communists to “sap and impurify all of our precious bodily fluids”. The joke works because Ripper’s paranoia is directed at something we all recognise: the need to keep our bodies free from harmful, alien substances. Anti-vaxxers think they are doing the same.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs

How vaccines change the way we think about disease

The Conversation

By Elena Conis, Emory University

The news on the current measles outbreak contains plenty of reminders that measles causes brain damage, pneumonia, hearing loss and death. A few lone voices have spoken up to say measles isn’t that serious, including an Arizona doctor who said it’s “really just a fever and a rash” – and soon found himself under investigation by his state’s medical board.

Back in the 1960s, it wasn’t controversial to call measles benign. Though the disease killed about 400-500 Americans a year, it was considered a normal part of childhood. It was so common, in fact, that to this day, people born in the pre-measles vaccine era are considered immune. But the introduction of the measles vaccine, and efforts to promote it, fundamentally changed things. In the five decades since we’ve been immunizing against it, measles has become increasingly known as a deadly killer.

This transformation in perception, from relatively benign illness to serious disease, isn’t unique to measles. As I have discovered in my research, it’s a pattern that’s been repeated over and over again in the modern history of immunization. This is not to say that measles is in fact a mild infection, or that the risk from the virus, or from other vaccine-preventable diseases, is overestimated. The point I want to argue is that the introduction of a vaccine reframes our perception of the disease it prevents.

Vaccines change our perception of risk

How does this happen? New vaccines simultaneously drive down the number of people getting the disease and increase our awareness of its risks.

Vaccines shine a spotlight on their target infections and, in time, those infections — no matter how “common” or relatively unimportant they may have seemed before — become known for their rare and serious complications and defined by the urgency of their prevention.

A spotted vaccine delivery van labeled ‘Measles must go.’ Source: CDC

This certainly happened to measles, whose first vaccine was uneventfully released in 1963.

At the time, many parents saw measles as a common and relatively harmless part of childhood – even though it infected three to four million people a year and caused roughly 48,000 hospitalizations annually. Many doctors felt as parents did, especially when comparing measles to such worrisome disease threats as smallpox and polio. Even the head of the Centers for Disease Control described measles as a disease “of only mild severity” which caused “infrequent complications.”

But the very development of the vaccine focused new scientific attention on the disease. Within a few years, scientists had compared measles to polio — the previous decade’s public health priority — and found it a much more serious threat to children’s health. Inspired by this finding, and frustrated by the public’s lack of enthusiasm for the vaccine, federal health officials launched a national campaign to publicize measles’ dangers.

The campaign officially spread the word, for the first time, that measles was “a serious disease that sometimes causes pneumonia, deafness, encephalitis and even death.” Public figures ranging from the Surgeon General to Ann Landers announced that measles could leave children blind, deaf and mentally impaired. And the campaign employed a poster child — disabled ten-year-old Kim Fisher — to illustrate the idea that measles immunization was necessary because “one death, one brain-damaged child, or even one child who needs hospitalization is one too many,” as one campaign supporter put it.

A new picture of measles emerges

As the campaign wore on, scientists continued to study the disease more closely than ever. Doctors began to report measles cases to health departments at unprecedented rates. And together, doctors and scientists began to pay more attention to the disease’s risks than ever before. As a result, a new picture of the disease began to form: it appeared to cause more deaths than previously thought, brain damage even in mild cases, and even harm to fetuses.

As the public continued to respond to the national campaign with “general apathy,” however, health officials redoubled their efforts to publicize measles’ “dramatic aspects,” and states began passing laws requiring the vaccine for schoolchildren. Within just over a decade, the country saw an all-time low of measles cases — and the disease had solidly acquired its new reputation as a deadly infection worthy of prevention at any cost.

A measles immunization campaign poster display at the Eradicate Measles Exhibit in 1972. Source: CDC/Don Lovell

We used to think mumps and chickenpox were ‘mild’ too

In the decades that followed the introduction of the measles vaccine, vaccine makers and health officials duplicated this approach with one new vaccine after another.

Mumps, often the butt of jokes in its pre-vaccine days, was no laughing matter within a decade of its vaccine’s introduction in 1967. Hepatitis B was considered an obscure infection of little import to most Americans when its vaccine first came out in 1981, but soon after it evolved into a “cousin” of AIDS known for lurking in nail salons, piercing parlors and playgrounds.

Since the development of the chickenpox vaccine in the 1990s, the virus has been transformed in the public imagination from an innocuous if uncomfortable rite of childhood to a highly contagious infection that can cause pneumonia, sepsis and sometimes death. And in just the last decade, human papillomavirus (HPV) has morphed from a little-known sexually transmitted infection to a widely known cause of multiple forms of cancer. Each of these transformations in perception was triggered by a new vaccine.

Each new vaccine invited deliberation on how it should be used. That, in turn, focused increased scientific attention on the disease. Often, as federal health officials and other scientists accumulated new information about the disease’s risks and complications, the vaccine maker did its part to market its vaccine. As talk of each disease and its more dramatic aspects spread, public and scientific perception of the disease gradually transformed.

In the US, high vaccination rates rest on a consensus about the diseases prevented by vaccines. When doctors, health officials and, in particular, parents view a disease as serious, they view its vaccine as one worth getting.

The recent increase in the number of philosophical objectors to the measles vaccine shows that this historical consensus about the disease has eroded in recent years. But history also shows that one surefire route to consensus about a disease is fear of that disease. And fear often spreads like wildfire during disease outbreaks, much like what is happening once again with measles.

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs