Welcome to Tim Harding’s blog of writings and talks about logic, rationality, philosophy and skepticism. There are also some reblogs of some of Tim’s favourite posts by other writers, plus some of his favourite quotations and videos. This blog has a Facebook connection at The Logical Place.
There are over nine hundred posts here about all sorts of topics – please have a good look around before leaving.
If you are looking for an article about the Birth of Experimental Science recently published in The Skeptic magazine titled ‘Out of the Dark’, it is available here.
If you are looking for an article about the Dark Ages recently published in The Skeptic magazine titled ‘In the Dark’, it is available here.
If you are looking for an article about Traditional Chinese Medicine vs. Endangered Species recently published in The Skeptic magazine titled ‘Bad Medicine’, it is available here.
If you are looking for an article about the rejection of expertise published in The Skeptic magazine titled ‘Who needs to Know?’, it is available here.
If you are looking for an article about Charles Darwin published in The Skeptic magazine titled ‘Darwin’s Missing Link’, it is available here.
If you are looking for an article about the Astronomical Renaissance published in The Skeptic magazine titled ‘Rebirth of the Universe’, it is available here.
If you are looking for an article about DNA and GM foods published in The Skeptic magazine titled ‘The Good Oil’, it is available here.
If you are looking for an article about animal welfare published in The Skeptic magazine titled ‘Creature Features’, it is available here.
If you would like to submit a comment about anything written here, please read our comments policy.
The word ‘logic’ is not easy to define, because it has slightly different meanings in various applications, ranging from philosophy to mathematics to computer science. In philosophy, logic’s main concern is with the validity or cogency of arguments. The essential difference between informal logic and formal logic is that informal logic uses natural language, whereas formal logic (also known as symbolic logic) is more complex and uses mathematical symbols to overcome the frequent ambiguity or imprecision of natural language.
So what is an argument? In everyday life, we use the word ‘argument’ to mean a verbal dispute or disagreement (which is actually a clash between two or more arguments put forward by different people). This is not the way this word is usually used in philosophical logic, where arguments are those statements a person makes in the attempt to convince someone of something, or present reasons for accepting a given conclusion. In this sense, an argument consists of statements or propositions, called its premises, from which a conclusion is claimed to follow (in the case of a deductive argument) or be inferred (in the case of an inductive argument). Deductive conclusions usually begin with a word or phrase like ‘therefore’, ‘thus’, ‘so’ or ‘it follows that’.
A good argument is one that has two virtues: good form and all true premises. Arguments can be deductive, inductive or abductive. A deductive argument with valid form and true premises is said to be sound. An inductive argument based on strong evidence is said to be cogent. The term ‘good argument’ covers all three of these types.
A valid argument is a deductive argument where the conclusion necessarily follows from the premises, because of the logical structure of the argument. That is, if the premises are true, then the conclusion must also be true. Conversely, an invalid argument is one where the conclusion does not logically follow from the premises. However, the validity or invalidity of an argument must be clearly distinguished from the truth or falsity of its premises. It is possible for the conclusion of a valid argument to be true, even though one or more of its premises are false. For example, consider the following argument:
Premise 1: Napoleon was German
Premise 2: All Germans are Europeans
Conclusion: Therefore, Napoleon was European
The conclusion that Napoleon was European is true, even though Premise 1 is false. This argument is valid because of its logical structure, not because its premises and conclusion are all true (which they are not). Even if the premises and conclusion were all true, it wouldn’t necessarily mean that the argument was valid. If an argument has true premises and its form is valid, then its conclusion must be true.
Deductive logic is essentially about consistency. The rules of logic are not arbitrary, like the rules for a game of chess. They exist to avoid internal contradictions within an argument. For example, if we have an argument with the following premises:
Premise 1: Napoleon was either German or French
Premise 2: Napoleon was not German
The conclusion cannot logically be “Therefore, Napoleon was German” because that would directly contradict Premise 2. So the logical conclusion can only be: “Therefore, Napoleon was French”, not because we know that it happens to be true, but because it is the only possible conclusion if both the premises are true. This is admittedly a simple and self-evident example, but similar reasoning applies to more complex arguments where the rules of logic are not so self-evident. In summary, the rules of logic exist because breaking the rules would entail internal contradictions within the argument.
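This consistency requirement can even be checked mechanically, by enumerating every possible assignment of truth values and looking for a counterexample. Here is a minimal Python sketch of that idea (the helper `valid` and the variable labels `G` and `F` are my own, used only to illustrate the disjunctive syllogism above):

```python
from itertools import product

def valid(premises, conclusion, variables):
    """An argument form is valid iff no assignment of truth values
    makes every premise true while the conclusion is false."""
    for values in product([True, False], repeat=len(variables)):
        world = dict(zip(variables, values))
        if all(p(world) for p in premises) and not conclusion(world):
            return False  # counterexample found: the form is invalid
    return True

# Premise 1: Napoleon was either German (G) or French (F)
# Premise 2: Napoleon was not German
premises = [
    lambda w: w["G"] or w["F"],
    lambda w: not w["G"],
]
conclusion = lambda w: w["F"]  # Therefore, Napoleon was French

print(valid(premises, conclusion, ["G", "F"]))  # True: the form is valid
```

Note that the check inspects only the argument’s form: it says nothing about whether the premises are actually true, which matches the distinction between validity and truth drawn above.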
An inductive argument is one where the premises seek to supply strong evidence for (not absolute proof of) the truth of the conclusion. While the conclusion of a sound deductive argument is supposed to be certain, the conclusion of a cogent inductive argument is supposed to be probable, based upon the evidence given. An example of an inductive argument is:
Premise 1: Almost all people are taller than 26 inches
Premise 2: George is a person
Conclusion: Therefore, George is almost certainly taller than 26 inches
Whilst an inductive argument based on strong evidence can be cogent, there is some dispute amongst philosophers as to the reliability of induction as a scientific method. For example, by the problem of induction, no number of confirming observations can verify a universal generalization, such as ‘All swans are white’, yet it is logically possible to falsify it by observing a single black swan.
Abduction may be described as an “inference to the best explanation”, and whilst not as reliable as deduction or induction, it can still be a useful form of reasoning. For example, a typical abductive reasoning process used by doctors in diagnosis might be: “this set of symptoms could be caused by illnesses X, Y or Z. If I ask some more questions or conduct some tests I can rule out X and Y, so it must be Z.”
Incidentally, the doctor is the one who is doing the abduction here, not the patient. By accepting the doctor’s diagnosis, the patient is using inductive reasoning that the doctor has a sufficiently high probability of being right that it is rational to accept the diagnosis. This is actually an acceptable form of the Argument from Authority (only the deductive form is fallacious).
Hodges, W. (1977) Logic: An Introduction to Elementary Logic (2nd ed. 2001). Penguin, London.
Lemmon, E.J. (1987) Beginning Logic. Hackett Publishing Company, Indianapolis.
If you find the information on this blog useful, you might like to consider supporting us.
Rationality may be defined as the quality of being consistent with or using reason, which is further defined as the mental ability to draw inferences or conclusions from premises (the ‘if – then’ connection). The application of reason is known as reasoning; the main categories of which are deductive and inductive reasoning. A deductive argument with valid form and true premises is said to be sound. An inductive argument based on strong evidence is said to be cogent. It is rational to accept the conclusions of arguments that are sound or cogent, unless and until they are effectively refuted.
A fallacy is an error of reasoning resulting in a misconception or false conclusion. A fallacious argument can be deductively invalid or one that has insufficient inductive strength. A deductively invalid argument is one where the conclusion does not logically follow from the premises. That is, the conclusion can be false even if the premises are true. An example of an inductively weak argument is a conclusion that smoking does not cause cancer based on the anecdotal evidence of only one healthy smoker.
By accident or design, fallacies may exploit emotional triggers in the listener (e.g. appeal to emotion), or take advantage of social relationships between people (e.g. argument from authority). By definition, a belief arising from a logical fallacy is contrary to reason and is therefore irrational, even though a small number of such beliefs might possibly be true by coincidence.
According to the American Library Association, Huckleberry Finn is #14 on their list of “most banned or challenged books”, with “challenging” meaning “attempts to remove a book from a library”, and “banning” meaning “a successful removal”. Here are the first 15 books on the “top 100 list” from 2000-2009 (lists compiled by decade), in order with the most banned/challenged works at the top. The results are surprising, but I bet you can guess why some folks find these books offensive:
One would think, for instance, that Maya Angelou and Toni Morrison’s books, both superb expositions of black culture and oppression, wouldn’t be banned, but they’re banned because of their sexual content, as are many other books.
That’s not the case for Adventures of Huckleberry Finn, the great and influential novel by Mark Twain. Ernest Hemingway said this of that book: “All modern American literature comes from one book by…
In these censorious times, made even more censorious by liberals’ counter-reaction to Trump (his election has heightened fears of Islamophobia and racism among the Left), we can expect to see even more calls for bowdlerizing books or removing them from libraries. This is the case with two wonderful books—To Kill a Mockingbird and Huckleberry Finn—in some schools in Virginia. As the Guardian reports, parental complaints have led to both books being removed from school libraries.
Harper Lee’s and Mark Twain’s literary classics were removed from classrooms in Accomack County, Virginia, after a formal complaint was made by the mother of a biracial teenager. At the centre of the complaint was the use of the N-word, which appears frequently in both titles.
According to CNN, Donald Trump has tapped former neurosurgeon and present-day creationist, Seventh-Day Adventist, believer in Satan, global-warming denialist, and flat-out abortion opponent (including in cases of rape or incest) Ben Carson to be his Secretary of Housing and Urban Development. His job? To “oversee federal public housing programs and help formulate policy on homelessness and housing discrimination.”
Carson may have been a good neurosurgeon, but he has no qualifications for such a job. Plus he’s a creationist! I always worry that people who reject evolution—a mandate of Carson’s religion—won’t accept other facts they don’t like, either.
Well, this is in line with Trump’s other disastrous picks. My advice for Carson? Something he should know as a doctor: “First, do no harm.”
By the way, has Trump ever made any statements about whether he accepts evolution?
We have recently begun taking a look at Paul Feyerabend’s book Philosophy of Nature (only recently released, though he died back in 1994), which presents his ideas on the history of the different ways in which human beings have tried to make sense of the world. The second chapter is on the structure and function of myths, since mythological accounts are one of the three “forms of life” that humans have come up with in order to understand the world, and that Feyerabend explores in his book (the other two are philosophy and science).
Oh dear, I am going to be right out of step with the world’s literary elite with this one: I am supposed to be bowled over by the brilliance of Pond by Claire-Louise Bennett, but I was bored brainless by the time I got to the end of the second story chunk of text and pressed on to the obligatory page 50 only because it is polite to an author to give a book a fair go. Then *smacks forehead* I pressed on #IgnoringMyReliableInstincts because I thought that the literary elite must be right and I must be wrong and that there must surely be something special which I’ve missed. And then I gave up altogether at page 103 because really, #Life’sTooShort and reading about the quest to replace broken stove-knobs and the importance of sitting on a particular ottoman at a party is IMO just too lame, while reflections on drinking…
The Australian federal government recently announced a national plan to tackle antimicrobial resistance with an approach encompassing the health and agriculture sectors.
Antimicrobial resistance occurs when microorganisms (bacteria, viruses, fungi or protozoans) evolve to survive exposure to antimicrobials. Antibiotic resistance and so-called “superbugs” relate to bacteria and most frequently make the headlines, but antimicrobial resistance includes resistance to all antimicrobial agents, and not just those used in medicine.
It’s not just the health sector that’s responsible
We need to recognise human, animal and environmental health are linked. Optimal human health depends on good health in the other two sectors.
Recent human diseases, such as swine flu or Middle East respiratory syndrome (or MERS), have originated from animals and are shaped by activities such as land clearing, intensive farming, animal domestication and climate change.
Unveiling the links between human, animal and environmental health provides a comprehensive way to develop effective interventions.
Traditionally, antimicrobial resistance has been viewed as a problem for the medical health sector, whose responsibility it is to address it.
Problems have included inappropriate and over-prescribing of antibiotics, both in hospital, and by GPs.
However, if we view and treat antimicrobial resistance solely as a medical health problem, we neglect behavioural, environmental, agricultural, economic and social contributions.
Overuse of antimicrobials in both animals and humans contributes to antimicrobial resistance.
In farm animals, one concern internationally has been the over-use of antibiotics as “growth promoters” or to prevent illness rather than just to treat sick animals.
Every time an antimicrobial is used, resistant microorganisms preferentially grow and replace sensitive ones. Resistant microorganisms or resistance genes that arise in agricultural or domesticated animals can easily spread to, and among, other humans and animals.
Inadequately treated waste streams from households, hospitals and farms using antimicrobials release antimicrobials into the environment. This may encourage growth of resistant microorganisms. These waste streams can also release resistance genes and resistant microorganisms into the environment, where they can act as reservoirs for genetic exchange and further development of resistance.
Resistant microorganisms and resistance genes can potentially be geographically spread, and later reintroduced into animals and humans via agricultural activities and the food chain.
So interventions need to target antimicrobial use in humans and animals, as well as control the spread of resistant bacteria and resistance genes through improved waste treatment. Interventions in one sector may not improve the overall problem if any one intervention is offset by counterproductive practices in any other.
Globalisation complicates this even further. Interventions in one country may have a benefit, but be of limited value, if not taken up by other countries. This is why a global approach to antimicrobial resistance is so important.
Challenges to a unified approach
To implement the “one health” approach, we need collaborations between disciplines and across sectors. However, these collaborations are not always easy. Researchers are often entrenched in their disciplinary silos.
Researchers must have opportunities to talk to those outside their speciality, be open to new ideas, and make efforts to understand each other’s discipline-specific language. Interdisciplinary university courses play an important part in creating future leaders who can move easily between disciplines.
Funding bodies must also catch up; interdisciplinary grant applications are often reviewed by discipline-specific panels resulting in higher failure rates. This actively disincentivises research across disciplines.
Effective collaboration between life and environmental sciences, social sciences, and policy makers will be crucial for implementing successful evidence-based policy. Policy that has not been developed in consultation with affected stakeholders will most likely fail. The social sciences are important partners in identifying the facilitators and barriers to changing behaviour that can help guide policy.
In all sectors, stakeholders must be made aware of how their decisions impact other stakeholders. Finally, policy must respect the goals of stakeholders in each sector to encourage long-term change.
This tale comes from today’s Sunday Express as well as Jihad Watch (which took the story from the Express), and it’s a bit confusing. Apparently a government-funded hotline in the Netherlands has said it’s basically okay for callers (or posters) to call for the death of gays if they’re Muslims; after all, that’s what the Qur’an tells them. (Calling for the death of gays is a crime if you’re not a Muslim.)
As the Express reports:
In a shocking move, the taxpayer-funded hotline said it would not pursue a criminal complaint over horrific messages from radical Islamists because the Koran says gay people can be killed.
The disgraceful stance came to light when a member of the public complained about death threats posted to an online forum which called for homosexuals to be “burned, decapitated and slaughtered”.
Dutch MPs today reacted with horror to the revelations, demanding an immediate…
Philip Henry Napoleon Opas OBE QC was born in Melbourne on 24 February 1917 and died on 25 August 2008, aged 91. His antecedents were Portuguese and Jewish. His accountant father Joseph Opas was the first to recommend to the Victoria Police that a special company squad of accountancy-trained men be set up to combat business fraud.
Educated at Melbourne Church of England Grammar School, Opas’ schooling was cut short at the age of 15. Still a teenager, he was apprenticed to Roy Schilling as a law clerk and graduated from the University of Melbourne with a Bachelor of Laws. He enlisted with the RAAF in 1939 when the second world war broke out. Twelve months’ war service was permitted to count as six months’ articles, a rule under which Opas was the first lawyer in Australia to qualify. In 1942, while on leave from New Guinea, he was admitted to practice in the Supreme Court of Victoria as a barrister and solicitor. In 1946 after the war, he signed the Victorian Bar Roll and read with R.V. Monahan (who was later appointed to the Supreme Court bench and became Sir Robert Monahan). Opas was a junior barrister for some fifteen years before taking silk in 1958. His practice was wide and varied, ranging from constitutional matters to local government.
In 1966, Opas became defence counsel to Ronald Ryan, in a lengthy and well-publicised murder trial that was to become the defining moment in Opas’ long career. Despite the tenacious defence of his client, Ryan was eventually found guilty and executed (the last person to be hanged in Victoria) in 1967. Shortly after, the Victorian Bar Ethics Committee recommended that Opas be struck off the Bar Roll for touting. Opas was represented at a public hearing by the late Richard E McGarvie, and acquitted. Disillusioned and dispirited, he left the Bar in 1968 and went to work with CRA, ConZinc Rio Tinto before returning to the Bar in 1972.
In 1973, Opas was appointed chairman of the Environment Protection Appeals Board, and later to the Town Planning Appeals Tribunal. He held a number of similar positions specialising in local government and planning during the 1980s before retiring in 1989.
I was a scientific officer with the Environment Protection Authority from 1972 until 1982, when I was promoted to the Department of the Premier and Cabinet as Senior Policy Adviser, Natural Resources. Whilst at the EPA, I and my scientific colleagues were initially dumbfounded and then horrified to observe Philip Opas QC, during a hearing of the Environment Protection Appeals Board, place a drop of crude oil in a beaker of water, drink it and then immediately declare that crude oil in water was not pollution. In fact, I thought his behaviour was moronic and unbelievably dismissive of scientific evidence. Perhaps he suffered from the common sense fallacy. Naturally, he got a lot of cheap tabloid publicity for this silly and irresponsible stunt.
Possibly as a result of this cheap publicity, he was later appointed to the position of CEO of the City of Doncaster & Templestowe, a position for which he had almost no relevant qualifications or experience. In this case, the councillors of the City of Doncaster & Templestowe were the morons. Leaving aside whether Ronald Ryan was guilty of murder, Philip Opas QC should be admired for his tenacity in trying to save his client from being hanged. But in the non-legal aspects of his career, he appears to have been out of his depth.
A major hospital in western Sydney recently reported a number of diabetes patients were suffering from scurvy, a historical disease common in sailors on long voyages who were deprived of citrus fruit and vegetables.
Scurvy is caused by severe and chronic deficiency of vitamin C (ascorbic acid), and is in modern times extremely rare. But considering our current dietary habits and their association with lifestyle diseases such as diabetes, could scurvy be making a comeback?
What is it?
In 1747, before the protective effects of vitamin C had been identified, British physician James Lind conducted the first clinical experiment in the history of medicine. He provided oranges and lemons to a group of sailors who were showing symptoms of scurvy. They showed remarkable improvements in a short time.
However, it took more than 50 years for this evidence to be used in practice, and for the British navy to issue lemon juice to sailors.
Vitamin C is necessary for the production of collagen – a vital, structural protein in connective tissues throughout our body – and iron absorption. Because humans can’t naturally make vitamin C, it has to be provided from external sources – either fruits and vegetables or foods fortified with it.
A lack of vitamin C results in a defective formation of collagen and connective tissues, which can result in easy bruising, bleeding gums, blood spots in the skin, joint pain and delayed wound healing.
Because vitamin C is needed for iron absorption, anaemia – a deficiency in the number or quality of the red blood cells that carry oxygen – and fatigue may be present in those who are deficient. A blood test to determine vitamin C levels is used to confirm a scurvy diagnosis.
Only 7% of the population meet the guidelines for vegetables – five to six or more servings for men, depending on age, and five or more for women. Only one in 20 (5.1%) adults meet the guidelines for both fruit and vegetables.
The situation is not limited to Australia. In the United Kingdom, it has been claimed wartime diseases such as scurvy are being seen in children because of diets high in junk food, which are worse for them than rationing was 70 years ago.
An estimated 25% of British men and 16% of women on low incomes have blood vitamin C concentrations indicative of deficiency, and a further fifth of the population have levels in the depleted range. This is due, in part, to inadequate access to fresh fruit and vegetables. Similar patterns are being identified in the United States.
Some people are more at risk of scurvy than others. Those at high risk are usually elderly people who may have difficulty chewing vitamin C-rich foods, and those with a diet devoid of fresh fruits and vegetables due to low incomes, ignorance or excessively restrictive diets, for example as a result of allergies.
It is estimated that up to 50% of older adults may have a marginal or even deficient vitamin C status. This is especially true for those who live for long periods in institutions such as hospitals, and rely on in-house food for their nutrient requirements.
It’s common practice in hospital kitchens to cook vegetables for prolonged times, which reduces their vitamin C content. Hospitals also often use the cook-to-chill food service system, and vitamin C is lost from food during chilled storage after cooking. Further, patients may dislike hospital food or feel too unwell to eat enough.
Scurvy can be prevented by consuming enough vitamin C, either in the diet or as a vitamin supplement. Citrus fruits such as oranges and lemons, as well as kiwi fruit, strawberries, guava, papaya and blackcurrants, are excellent sources. Vegetables high in vitamin C include capsicum, broccoli, potatoes, cabbage, tomatoes, and spinach.
Overcooking vegetables is likely to destroy vitamin C content. This is due partly to a reaction with oxygen that renders the vitamin inactive, and partly to leaching of the vitamin into the water used for cooking. It has been shown that 10% of the vitamin C content of cabbage was lost by heat-associated destruction during cooking, while 80% was leached into the cooking water.
When cooking vegetables, don’t drop them into the water until it has boiled. This is because rapidly boiling water contains less dissolved oxygen than cold water, so less of the vitamin is inactivated by reaction with oxygen.
Losses during cooking can be reduced by at least half if vegetables are only one-quarter covered by water, rather than being completely immersed. Use of the vegetable cooking water in soups and gravies would also substantially increase the amount of vitamin C you get.
Substantial losses of vitamin C also occur during reheating of chilled food. However, the losses are dependent on the time taken to reheat, as well as the portion size of the foods. Reheating a bulk portion (2kg) of food results in an average vitamin C loss of 23%, compared with losses of 10 to 15% if individually portioned food is reheated for the same length of time.
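The loss figures quoted above compound at each step of preparation. A quick sketch (the `retained` helper is my own, and it assumes successive losses act multiplicatively on whatever vitamin C remains) shows how little can survive hospital-style cook-chill-reheat handling:

```python
def retained(*loss_fractions):
    """Fraction of vitamin C surviving a sequence of lossy steps,
    assuming each step removes its stated fraction of what remains."""
    remaining = 1.0
    for loss in loss_fractions:
        remaining *= 1.0 - loss
    return remaining

# Fully immersed cabbage: 10% destroyed by heat plus 80% leached = 90% lost.
print(round(retained(0.90), 2))        # 0.1 of the vitamin C survives cooking
# Reheating a bulk (2kg) portion of chilled food loses a further 23%.
print(round(retained(0.90, 0.23), 3))  # about 0.077 survives overall
```

On these figures, a hospital patient’s boiled-then-reheated vegetables may deliver well under a tenth of their original vitamin C, which is consistent with the article’s concern about institutional food.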
The re-emergence of scurvy is a poor reflection on the nation’s diet. So eat more fruits and vegetables, and make sure the latter aren’t overcooked.
Repeatedly, studies have found a large majority of smokers regret ever starting to smoke: 85% in this study, 90% in this four nation study. Each year, some 40% of smokers make an attempt to stop, with most relapsing within weeks.
Many fork out considerable money on pharmaceuticals along the way in the attempt to shake their smoking. Snake-oil, evidence-free quick-cure merchants advertising “laser therapy quitting” on telegraph poles happily make up to A$500 from the more gullible.
With 12.8% of Australians aged 14 and over smoking daily, and 90% of these regretting they ever started, today just 1.28% are contented smokers. Recent evidence shows 55% of young smokers now approve of plain packaging with their ghoulish, unavoidable picture warnings. Can there be any product that enjoys less consumer satisfaction and customer loyalty?
One of the most common taunts pro-smokers hurl at tobacco control advocates with great relish is the claim they are enemies of pleasure: they just can’t stand the thought or sight of people taking pleasure from smoking. Perhaps they are right. Airport smoking rooms strike me as about the most fun and pleasure you could get. The queues of non-smokers you see waiting to get in just to experience it all pretty much clinch that argument.
The picture being painted here is of elegant smokers, hand gesturing and exhaling as in Richard Klein’s Cigarettes Are Sublime, constantly pleasuring themselves in a way denied to non-smokers who have not woken up to the joys of nicotine.
But what is it that nicotine addicts like about pulling the chemical deep into their lungs some 90,000 times a year?
In 1994, the New York Times published the ratings of two of the USA’s most renowned addiction specialists, Neil Benowitz and Jack Henningfield, on the relative addictiveness of nicotine, caffeine, heroin, cocaine, alcohol and marijuana (cannabis). They rated each of these on a scale of one (most serious) to six (least serious).
Both rated nicotine higher in dependence than all the other drugs. By “dependence” they meant how difficult it is for the user to quit, the relapse rate, and the percentage of people who eventually become dependent.
Nicotine withdrawal also rated high (third behind the often discussed agonies of alcohol and heroin withdrawal). Both experts rated nicotine fourth behind cocaine, heroin and alcohol when it came to reinforcement (essentially, the pleasure given by the drug). But both rated nicotine last on intoxication, behind even caffeine.
Taking all this together, a picture emerges of nicotine dependent people regretful they started smoking, living in full knowledge of their high dependency, experiencing often unpleasant withdrawal symptoms when they have not been able to smoke for a while, and being relieved of this unpleasantness quickly when lighting up another cigarette.
Nicotine withdrawal symptoms include headache, nausea, constipation or diarrhoea, fatigue, drowsiness and insomnia, irritability, difficulty concentrating, anxiety, depressed mood, increased hunger and caloric intake and of course, constant tobacco cravings.
Smokers know from the earliest days of their addiction these feelings can disappear within a few seconds as the nicotine is rapidly transported from their lungs to their brains where dopamine is released and experienced as pleasurable.
Smokers often insist the pleasure from this release can somehow be experienced independently of the relief felt as the nicotine withdrawal symptoms rapidly dissipate.
So what is the “pleasure” being experienced here? When you have a toothache and this is relieved by a strong analgesic, your mood can quickly elevate as the codeine begins to work.
The argument that smoking and inhaling nicotine is “pleasurable” is a bit like saying being beaten up every day is something you want to continue with, because hey, it feels so good when the beating stops for a while.
Holiday periods like the upcoming Christmas break are time-honoured opportunities for smokers to make quit attempts. I used to smoke (in late school and to my mid 20s). I thought smoking was a great way to make a statement about myself that would impress those I cared to impress and irritate those I cared to irritate. But I always thought it tasted disgusting, was a stupid thing to continue and threatened to limit my early career opportunities.
I recall just drifting out of smoking, a pathway common to many ex-smokers. And like many smokers, I recall it being anything but difficult or torturous. This is one of the best kept secrets in tobacco control. While there are many smokers who struggle to quit and fail many times, there are many more who found the experience easier than they expected, sometimes far easier.
There are many more ex-smokers in Australia than smokers. The common narratives of quitting smoking ushering in the pleasures of tasting food and drink better, feeling physically better and of course the pleasure of having more disposable income can be compared with the supposed pleasures of smoking. Good luck if you are planning to quit. It’s the single most important thing you can do to improve your health.