Monthly Archives: February 2016

Did the Crusades lead to Islamic State?

The Conversation

Carole Cusack, University of Sydney

How do we account for forces and events that paved the way for the emergence of Islamic State? Our series on the jihadist group’s origins tries to address this question by looking at the interplay of historical and social forces that led to its advent.

Today, professor of religious studies Carole Cusack considers the Crusades: can we really understand anything about Islamic State by looking at its rise as the latest incarnation of a centuries-old struggle between Islam and Christianity?

In 1996, the late US political scientist Samuel P. Huntington published the book The Clash of Civilizations and the Remaking of World Order. Following the collapse of communism in 1989, he argued, conflicts would increasingly involve religion.

Islam, which Huntington claimed had been the opponent of Christianity since the seventh century, would increasingly feature in geopolitical conflict.

So, it wasn’t particularly shocking when, after the September 11 attacks on the World Trade Centre and the Pentagon, the then-US president, George W. Bush, used the term “crusade” to describe the American military response.

Framing the subsequent “war on terror” as a crusade acted as a red flag to journalists and political commentators, who could treat the events as simply the most recent stoush in a centuries-old conflict.

The Crusades (1096-1487) themselves evoke a romantic image of medieval knights, chivalry and religious high-mindedness. But representing them as wars between Christians and Muslims is a gross oversimplification and a misreading of history.

Early Islamic conquests

That there were wars between Muslims and Christians is certainly true. After the death of Abu Bakr (573-634), the Prophet Muhammad’s father-in-law and first caliph, the second Caliph Umar (583-644) sent the Islamic armies in three divisions to conquer and spread the religion of Islam.

Whole regions that were Christian fell to Islam. The Holy Land, which comprised the modern-day Palestinian territories, Israel, Lebanon, Syria and Jordan, was conquered. And Egypt was taken without even a battle in 640.

The ancient and vast Persian Empire, officially Zoroastrian in religion, had been conquered by 642. Weakened by war with the Christian Byzantine Empire, Persia was no match for the Muslim forces.

Muslim armies marched across north Africa and crossed the Straits of Gibraltar into modern Spain, eventually securing a large territory in the Iberian Peninsula, which was known as Al-Andalus (also known as Muslim Spain or Islamic Iberia).

They also marched across the Pyrenees and into France in 732, the centenary of Muhammad’s death. But they were decisively defeated at the Battle of Poitiers (also known as Battle of Tours and, by Arab sources, as Battle of the Palace of the Martyrs) by the Frankish general, Charles Martel (686-741), grandfather of the great Emperor Charlemagne.

This was seen as a Christian victory and, after Poitiers, there were no further attacks on Western Europe. The Crusades came much later.

The causes of the Crusades

The proximate causes of the First Crusade (1096-1099) include the Byzantine defeat by the Muslim Seljuk Turks at the Battle of Manzikert in 1071, a decade before Alexius Comnenus (1056-1118) was crowned emperor in 1081.

This placed the city of Constantinople at risk of conquest. So, the emperor requested that the West send knights to assist him – and he was prepared to pay.

Pope Urban II (1044-1099) preached the Crusade at the Council of Clermont in 1095. He argued that the Turks and Arabs attacked Christian territories and had “killed and captured many, and have destroyed the churches and devastated the empire”.

He also promised his audience:

All who die by the way, whether by land or by sea, or in battle against the pagans, shall have immediate remission of sins. This I grant them through the power of God with which I am invested.

This was recorded by a monk called Fulcher of Chartres, who wrote a chronicle of the First Crusade.

The four leaders of the First Crusade.
Alphonse-Marie-Adolphe de Neuville via Wikimedia Commons

Thousands answered the pope’s call and the First Crusade conquered Jerusalem in 1099. But the Crusaders’ presence in the Middle East was short-lived and the port city of Ruad, the last Christian possession, was lost in 1302/3.

Many later conflicts that were called Crusades were not actions against Muslim armies at all. In the Fourth Crusade (1202-1204), for instance, a largely Venetian Catholic army besieged Constantinople. Catholic Christians attacked Orthodox Christians, then looted the city, taking its treasures back to Venice.

Islam was not a factor in the Albigensian Crusade of 1209-1229, either. In that instance, Pope Innocent III (1160/1-1216) used the language of war against the infidel (literally “unfaithful”, meaning those without true religion) against heretics in the south of France. So, “right-thinking” Christians killed “deviant” Christians.

The end of the Middle Ages

It wasn’t all intermittent fighting. There were also periods of peace and productive relationships between Christian and Muslim rulers in the Middle Ages.

For instance, Charlemagne (742-814), also known as Charles the Great or Charles I, who united most of Western Europe during the early part of the Middle Ages, sent gifts to Harun al-Rashid (763-809), the Caliph of Baghdad. In return, he received diplomatic presents such as a chess set, an elaborate clepsydra (water clock) and an elephant.

In Spain, the culture from the early eighth century to the late 15th was known as “la Convivencia” (the co-existence), as Jews, Christians and Muslims lived in relative peace (though the level of harmony has been exaggerated). And there was an exchange of ideas in fields including mathematics, medicine and philosophy.

The Christian kingdoms of the north gradually reconquered Al-Andalus. And, in 1492, King Ferdinand (1452-1516) and Queen Isabella (1451-1504) took Granada and expelled the Jews from Spain or forced them to convert to Christianity; Spain's Muslims faced the same choice a decade later.

A clumsy view

Clearly, to speak of an “us versus them” mentality, or to frame current geopolitical conflicts as “crusades” of Christians against Muslims, or vice versa, is to misunderstand – and misuse – history.

Not all blood and guts: the Caliph of Baghdad Harun al-Rashid receives a delegation from Charlemagne.
Julius Köckert via Wikimedia Commons

Modern Westerners would find medieval Crusader knights as unappealing as they do Islamic State.

And it’s impossible to miss the fact that the immediate entry into heaven Pope Urban promised to Christian soldiers who died in battle against the infidel Muslims is conceptually identical to the martyrdom ideology of contemporary jihadists.

Reality is more complex – and more interesting – than the simple continuation of a historical struggle against the same enemy. Muslims conquered Christian territories, yes, but Christians engaged in reconquest.

There were forced conversions to both Islam and Christianity, and – very importantly – actual governments and monarchs were involved. It’s a simplistic thing to say that “Islamic State is neither Islamic nor a state”, but there’s an element of truth in it.

The most important reason we should resist the lure of the crusade tag to any fight against jihadists is that groups like Islamic State want the West to think like that.

Islamic State justified the Paris attacks of November 2015 as attacks against “the Crusader nation of France”. Osama bin Laden used the same reasoning after the September 11 attacks.

By adopting the role of Crusaders, Western nations play into Islamic State’s hands. It’s how these jihadists want the West to understand itself – as implacably opposed to Islam. But it’s not, and it never has been.

This is the sixth article in our series on the historical roots of Islamic State. Look out for more stories on the theme in the coming days.

Carole Cusack, Professor of Religious Studies, University of Sydney

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs

Open thread: free speech v private companies

Why Evolution Is True

by Grania Spingies

This is an issue that I go back and forth on. I don’t think I really know what the right answer is. Pretty much everybody here supports the idea of free expression, some of us quite vehemently. On the other hand, many of us are not disquieted by anti-discrimination laws that insist that business-owners can be penalised for refusing service on the basis of race, ethnicity, gender and sexual orientation.

I know that I don’t want to live in a society that looks like this:




But anti-discrimination laws are at odds with free speech, and therein lies the problem.

LGBT and liberal activist Peter Tatchell recently wrote that he had changed his mind about the Belfast bakery that refused to supply a cake to a same-sex couple because it espoused views they disagreed with. He argues:

This finding of political discrimination against Lee sets a worrying precedent. Northern…

View original post 388 more words

Leave a comment

Filed under Uncategorized

Collaboration: Too much of a good thing?

The Conversation

Libby Sander, Griffith University

It’s not your imagination. Involvement by managers and employees in collaborative endeavours has increased by 50% in the past two decades, according to research published in the Harvard Business Review. The study found that in many companies, meetings, phone calls and email take up to 80% of employees’ time. Collaboration is seen as a vital precursor to the production of creative ideas, problem solving and improved social capital.

In designing new workplaces, collaboration is often the holy grail against which all other office requirements are measured. Some workplaces are now so open and transparent that it is possible for a group of employees to talk face-to-face about a work problem while seated simultaneously in the office cafe, at the work station area and on a rowing machine. At Apple’s new campus in California, the design is intended to get employees to collaborate in key interaction areas, such as the restaurant. However, if an employee’s desk is at the wrong end of the building, walking to the restaurant will mean undertaking an 800 metre trip.

The focus on open workplaces is driven in part by a desire to reduce real estate costs for organisations, but also by a belief that increased interaction leads to increased collaboration. However, a study of 42,000 employees showed there was little solid evidence that open layouts improved interaction. Other research has shown that increased awareness through being able to see others doesn’t translate clearly to collaboration. The study also suggests that most office design is an experiment, and that the outcomes beyond self-report surveys are rarely tested.

Both the processes and the places where work occurs leave increasingly little room for employees to undertake the solitary work required to achieve results. Between 2008 and 2013, a survey showed that amongst knowledge workers, time spent on collaborating had decreased by 20% while time spent on focussed work requiring deep thought had increased by 13%. When employees can’t focus and think clearly they actually collaborate less and become more withdrawn.

Further, the perception that collaboration adds value and improves team productivity is likely to be overstated. The Harvard Business Review research has shown that in most cases, 20% to 35% of value-added collaborations come from only 3% to 5% of employees. Other research has shown that a single employee in a team who constantly goes above and beyond the scope of their role, can drive team performance more than the rest of the team combined.

Employees feel increasing pressure to assist others and go beyond their scope. University of Oklahoma professor Mark Bolino told the Harvard Business Review this phenomenon is called “escalating citizenship”. The result of this is increased burnout and lower satisfaction. Employees who are seen as the best source of information and most helpful collaborators score the lowest on engagement and career satisfaction.

To address this situation, organisations need to reconsider how to balance focussed and collaborative work both from a process and space design perspective. Knowing which employees are bearing the brunt of the collaborative burden is essential. Putting up your hand to take on more and more is seen as an essential prerequisite for career advancement. Alarmingly though, given the nature of collaborative helping, this extra work can often go unnoticed, leaving employees burnt out and disillusioned. The best solution to a problem may not involve having a meeting, forming a committee, or putting together a new project team.

Libby Sander, PhD Candidate, Griffith University

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


1 Comment

Filed under Reblogs

Paleo diets = weight gain

By Jane Gardner

The low-carbohydrate Paleo diet that is becoming increasingly popular as a way to lose weight may instead make you fat and even cause symptoms of pre-diabetes, according to new research.

The finding, detailed in a paper in the Nature journal Nutrition and Diabetes, has prompted a warning against fad diets which have little or no scientific evidence behind them.

Diabetes researcher Associate Professor Sof Andrikopoulos, of the University of Melbourne’s Department of Medicine, based at the Austin Hospital, was interested to learn whether the Paleo diet could benefit patients with diabetes or pre-diabetes.

His research group took two groups of overweight mice with pre-diabetes symptoms and put one group on a low-carbohydrate, high-fat (LCHF) diet similar to Paleo diets. The other group remained on their normal diet.

The LCHF mice were switched from a three per cent fat diet to a 60 per cent fat diet, while their carbohydrate intake was reduced to only 20 per cent.

Associate Professor Andrikopoulos with the mice. Picture: Paul Burston

“The hypothesis was that the Paleo diet group would gain less weight and we would see improvements in glycemic control,” Associate Professor Andrikopoulos says.

After eight weeks, the group on the LCHF diet gained more weight, their glucose intolerance worsened and their insulin levels rose. They gained 15 per cent of their body weight and their fat mass doubled from two per cent to almost four per cent.

Associate Professor Andrikopoulos says he had been expecting some weight loss, but instead was completely surprised by the extent of the weight gains.

The researchers used mice for the study, because their genetic, biological and behavioural characteristics closely resemble those of humans.

“In humans, this level of weight gain will increase blood pressure and the risk of anxiety and depression and may cause bone issues and arthritis,” Associate Professor Andrikopoulos says.

“For someone who is already overweight, this diet would increase blood sugar and insulin levels and could actually pre-dispose that person to diabetes.


“We are told to eat zero carbs and lots of fat on the Paleo diet. Our model tried to mimic that, but we didn’t see any improvements in weight or symptoms. In fact, they got worse.

“There is a very important public health message here. You need to be very careful with fad diets, always seek professional advice for weight management and always aim for diets backed by evidence.”

The moral of the story is that calories matter. If you eat more calories, you will put on more weight.

He says hype around these diets is driven by celebrity chefs, unrealistic before-and-after glossy magazine celebrity weight-loss stories, and rapid weight-loss reality TV shows.

The result is more people turning to potentially dangerous fad diets for a quick fix.

“These diets are becoming more popular because of the media and social media. Instead of scientific literature, we get endorsement from individuals who’ve lost 20 kilograms talking about it on social media,” he says.

Associate Professor Andrikopoulos says the low carbohydrate diet has been inspired by our hunter-gatherer past. Picture: Paul Burston

“But other than those special cases, we don’t have enough scientific evidence to support the use of those diets, particularly in people with pre-diabetes and diabetes.”

Every day, he says, people receive messages that increase the pressure to lose weight quickly.

“There is a lot of stigma around being overweight. Every day we’re exposed to stories about a slim celebrity who ate three cabbage leaves a day to lose weight.

“What people don’t understand is that looking good is a celebrity’s day job. They have someone to cook their food and another person telling them to exercise. The real world doesn’t work like that. There are no quick fixes.

“When you think about it, celebrity advocates of these diets are often very active and can handle an increased fat load, rather than your average Australian who is a lot less active.

“If you put someone with a sedentary lifestyle on a high-fat, low-carb diet, I bet you that person will gain weight.”

He said the low-carbohydrate diet has been inspired by our hunter-gatherer past, when we didn’t eat any processed foods. But what may be more relevant to losing weight is that we don’t do anywhere near the same amount of exercise as we did then.

Associate Professor Andrikopoulos says the Mediterranean diet is the best for people with pre-diabetes or diabetes.

“It’s backed by evidence and is a low-refined sugar diet with healthy oils and fats from fish and extra virgin olive oil, legumes and protein.”


This article was first published on Pursuit. (Reblogged by permission). Read the original article.

Leave a comment

Filed under Reblogs

Chemmart’s myDNA test offers more than it can deliver

The Conversation

Ken Harvey, Monash University and Basia Diug, Monash University

When you enter a Chemmart pharmacy, it’s hard to miss the posters and brochures promoting its “revolutionary myDNA test”. The brochure states it’s “personalised medicine”, where “your DNA results … can help guide your future health and lifestyle choices”.

It’s an enticing promise. Knowing your genes unlocks a healthier future. Right?

The myDNA advertising, which also appears on the Chemmart and myDNA websites, claims that “70% of people who take a myDNA test have a finding that could affect current or future medications”.

Chemmart myDNA brochure front page.
Freely available from Chemmart pharmacies

The promotional materials suggest taking the test if you’re using antidepressants, pain or reflux medication. The test also covers the common, but difficult to manage, blood thinner warfarin, which is prescribed to approximately 30% of patients over the age of 70 years.

Chemmart says it’s particularly relevant if you have a history of not responding to common drugs, you experience side-effects from your medication, you take multiple medications, you have children on prescribed medication, or you are pregnant or planning pregnancy.

But some of Chemmart’s claims may be misleading for consumers who lack detailed knowledge of DNA testing and may produce unrealistic expectations of the product’s effectiveness. In so doing, Chemmart appears to have breached a number of provisions of the Therapeutic Goods Advertising Code 2015.

The test

The test costs A$149 and is not covered by Medicare or private health insurance rebates. It involves a cheek swab taken by a trained pharmacist, which is sent to Australian Clinical Labs (formerly Healthscope Pathology) for DNA analysis.

The results are delivered by personal consultation with the pharmacist and are also sent to your nominated doctor. If the pharmacist recommends changes to medication, you are referred to your doctor.

As part of their training, participating pharmacists are encouraged to explain the test to local doctors, both to educate them and prepare them for pharmacist referrals or enquiries from patients.

The test identifies common variations in four genes that are involved in processing a number of drugs. For example,

  • CYP2C9 and VKORC1 affect the metabolism of the blood-thinning drug warfarin
  • CYP2D6 is involved in the metabolism of the painkillers codeine and tramadol, as well as antidepressants
  • CYP2C19 affects the metabolism of antidepressants, the newer blood thinner clopidogrel and drugs taken for reflux, such as esomeprazole.

The science

The term pharmacogenetics was first coined in 1959 but it is only recently that pharmacogenetic (PGx) tests have moved from the lab to the clinic.

Researchers have now identified inherited variation in approximately 20 genes affecting around 80 medications, which are potentially actionable in the clinic.

Genetic variations may predict, for example, that a patient is at increased risk of experiencing the side-effects of certain drugs because they metabolise it slowly and high concentrations can build up on a normal dose.

Other genetic variations may cause a particular drug to be metabolised too rapidly, so patients may need a higher dose of the drug to achieve a therapeutic effect.

But a “normal” PGx result doesn’t rule out drug-related side-effects or a failure to respond to a drug: current tests only capture known variants of known genes.

And even if the test shows gene variants that impact on a certain drug’s metabolism, this is only one of many patient characteristics and factors that influence how they respond to drugs. Others include interactions with other drugs, allergies, and kidney and liver function.

As a result, the cost-effectiveness and clinical utility of PGx tests is still uncertain.

Case study: warfarin

Warfarin is a commonly prescribed blood-thinning drug that prevents strokes in patients who have an irregular heartbeat or whose heart valves have been replaced. However, it must be monitored closely with regular clotting tests to ensure it is at the right therapeutic level to prevent bleeding or stroke.

Variations in the genes VKORC1 and CYP2C9 can either slow warfarin metabolism, thereby increasing the risk of bleeding, or increase sensitivity, which may require a lower dose to produce the required effect.

However, in practice, the clinical implementation of genetic testing for warfarin dosing has been disappointing. The current consensus guidelines by the American College of Chest Physicians actually warn against the routine use of genetic data to guide dosing because of its poor predictive value for large populations.

In Australia, GPs manage warfarin dosage via regular monitoring of the clotting test. This allows GPs to adjust the patient’s dose, taking into account all important patient risk factors.

The promotion

Chemmart claims “myDNA is a genetic test that identifies which medications will work best for you”. This overstates the role and value of this test.

Its applicability is limited to certain drugs in particular situations.

We do not believe the test is “particularly relevant” to those who “take multiple medications, have children on prescribed medication”, or “are pregnant or planning pregnancy” because of the extremely limited applicability of the test to these patient groups.

We also have problems with the claim that “the myDNA test covers 50% of the most commonly prescribed medications”. This is not in accord with data from the United States, which shows that just 7% of approved medications and 18% of outpatient prescriptions are affected by actionable pharmacogenes (genes you can test for and alter medication around). Nor is it in accord with Australian data on the top ten drugs prescribed.

We have submitted our concerns to the Therapeutic Goods Advertising Complaint Resolution Panel for an independent determination.

More research (and GP training) will be required to determine whether PGx tests improve patient care and are cost-effective. In the meantime, marketing claims should reflect the current uncertain clinical role of these tests.

The role of test providers, pharmacists and doctors

Before undertaking PGx tests, which are also available from other providers, the National Health and Medical Research Council recommends discussing their value and applicability with your GP.

Pharmacists play an important role in engaging with consumers and doctors to achieve better use of medication. But the current model of pharmacist “detailing” PGx tests has several problems. “Detailers” are not welcomed by all GPs. Some clinics have demanded the pharmacist bring a free lunch (often provided by drug reps) before they will schedule a visit.

Finally, the national prescribing service NPS Medicinewise has an important role to play in developing a pharmacogenetic education campaign. This could still involve trained pharmacists but NPS Medicinewise authority would provide greater access to GPs without the demand for lunch as the price of entry.

Ken Harvey, Adjunct Associate Professor, School of Public Health and Preventive Medicine, Monash University and Basia Diug, Senior Lecturer and Deputy Head of the Medical Education Research Quality Unit, School of Public Health and Preventive Medicine, Monash University

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs

The Authoritarian Left reaches rock bottom: The “no-platforming” of Peter Tatchell in the UK

Why Evolution Is True

Like the Russian revolutionaries after 1917, the progressive left—including atheists—are beginning to eat their own. I find it infinitely depressing to see people at each other’s throats about issues of semantics, censorship and virtue signaling, while the malfeasance of our opponents—conservatives and repressive religionists—goes unchallenged. I’m not exactly sure why this is happening, but I do know that it’s not only divisive and unseemly, but unproductive and solipsistic.

The latest ridiculous performance in this charade is the “no-platforming” of Peter Tatchell, an LGBT and liberal activist who has spent his entire adult life campaigning for gay rights, gay marriage, and for humanitarian causes like opposing the Iraq war and apartheid. (Read his Wikipedia bio to get an idea of the breadth of his activism.)

But now he’s a victim of the Authoritarian Left, suffering the death of a thousand smears for not hewing to Acceptable Behavior. He’s been “no-platformed” (i.e., subject to a student…

View original post 1,587 more words

Leave a comment

Filed under Uncategorized

Sacred cow no more: what proposed changes to negative gearing really mean

The Conversation

Danika Wright, University of Sydney

Negative gearing is set for its biggest change in decades. Once considered the sacred cow of federal government policies, both major parties are proposing changes ahead of this year’s federal election.

The Labor party has already unveiled its policy, which involves only allowing negative gearing on new houses and cutting the generous 50% capital gains tax concession to 25%.

The Government has not confirmed its stance and has attacked Labor’s policy, but has crucially left its options open.

Less than 12 months ago, even considering changes to negative gearing was considered “unpopular” at best, with then-Prime Minister Tony Abbott reassuring voters that there would be no changes to negative gearing.

Now, while neither the Liberal government nor the Labor opposition is keen to abolish negative gearing entirely, as the Greens and some economists (most notably Saul Eslake) have proposed, both sides of politics are claiming they will “make the system fairer”.

So what has prompted this change?

First, a quick refresher on negative gearing. Gearing is the use of debt (such as a mortgage) to increase the amount of money that may be invested. Negative gearing occurs when the cost of this debt (for example, the interest charged) exceeds the income from the investment (like the rent).

Why would anyone want to incur a loss? The most widely promoted reason is because this net loss may be deducted against the investor’s taxable income, lowering their overall tax bill. This is clearly going to benefit those investors with higher incomes more than those with lower incomes. And, the statistics show that those with more to gain are more likely to take advantage of it:

Proportion of tax filers with negatively geared investment properties, by income band.
Source: ‘Better Tax’, Tax Discussion Paper, Australian Government, March 2015

There is a second benefit from the property investor’s perspective – the treatment of capital gains. When a capital gain on the investment asset is realised (it is sold), the investor gets a 50% discount on the capital gains tax.
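The two tax effects described above can be sketched in a few lines of code. This is a minimal illustration with entirely hypothetical figures: the function names, dollar amounts and the 45% marginal rate are assumptions for the example, not tax advice, and the real rules carry many conditions not modelled here.

```python
def rental_loss(rent_income: float, interest_and_costs: float) -> float:
    """Net rental loss: the property is 'negatively geared' when costs exceed rent."""
    return max(interest_and_costs - rent_income, 0.0)

def tax_saving(loss: float, marginal_rate: float) -> float:
    """Reduction in tax payable from deducting the loss against other income."""
    return loss * marginal_rate

def cgt_payable(capital_gain: float, marginal_rate: float, discount: float = 0.50) -> float:
    """Capital gains tax after the concession (50% currently; 25% under Labor's proposal)."""
    return capital_gain * (1 - discount) * marginal_rate

# Hypothetical investor: $20,000 rent, $28,000 in interest and costs,
# a 45% marginal tax rate, and an eventual $100,000 capital gain on sale.
loss = rental_loss(20_000, 28_000)              # $8,000 net loss
print(round(tax_saving(loss, 0.45)))            # 3600 -- off the tax bill
print(round(cgt_payable(100_000, 0.45)))        # 22500 -- CGT with the 50% discount
print(round(cgt_payable(100_000, 0.45, 0.25)))  # 33750 -- CGT with a 25% discount
```

The sketch makes the incentive visible: the higher the investor's marginal rate, the larger the deduction is worth, which is why the benefit skews towards higher incomes; and halving the discount raises the eventual CGT bill by half as much again.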

For many years now, investment properties have been the most popular investment asset after the owner-occupied home and superannuation. Like any other investment, including shares, borrowers may take on debt to facilitate the property purchase. The benefits from negative gearing, combined with the perception of housing as a “safe” asset class with steady capital appreciation, have fuelled the flow of investor capital into the market.

But now, with growing public awareness of the operation of negative gearing, its cost through missed tax revenue, and increasing housing affordability concerns, even those benefiting from negative gearing are questioning whether it is appropriate.

Comparing the alternatives

The Coalition government has been reluctant to commit to any policy changes, instead hinting at “capping” the amount of negative gearing benefit that may be claimed. In a way, this is a reinvention of the 2011 Gillard government proposal to target just the “wealthiest” investors who own portfolios of investment properties. We might expect to hear more details from Treasurer Scott Morrison in his Wednesday Press Club address.

The Labor party, by contrast, has made the first solid move. In so doing they have made negative gearing a major election issue and will force the government into presenting a real policy proposal, likely as part of the 2016-17 Budget.

The Bill Shorten-Chris Bowen policy announced over the weekend has been in the works since April 2015. If implemented, negative gearing would apply as usual to properties purchased prior to July 2017. After this date, however, investors would only be able to claim negative gearing against newly constructed investment properties.

But will it work?

The Government’s suggested approach would effectively create a “progressive negative gearing”. The tax rate increases (because the discount decreases) as income increases. While still lacking details, tackling the problem this way more reasonably considers the broad range of individuals who own investment properties, from teachers and nurses to the country’s most wealthy.

Labor argues that the asymmetry of its gearing policy between new and existing properties will divert investment to housing construction and boost the housing supply.

However, critics of the Labor proposal argue that it will trigger a rush of investors to the existing housing market prior to the July 2017 change, in turn reigniting a house price bubble.

Neither the opposition nor its critics are entirely right, and this is why:

  • To truly benefit from a negative gearing investment strategy, the property buyer would be hoping to win on the capital gains side. For that to occur, property prices need to keep rising, yet that is exactly what the policy is hoping to contain. Savvy investors will know this and take their savvy dollars elsewhere.
  • If you still believe there will be a boom in 12 months’ time, the better strategy would be to buy now (well, technically, when or if Labor is successful at the election) in order to sell to some sucker then.

An even more critical flaw in the Labor proposal is the absence of any strategy addressing when and where this new housing supply will appear.

Fundamentally, either approach makes housing a less attractive investment. In the short term investors will look to move their capital into other asset classes, primarily the stock market, which has been volatile.

It could take decades to see the broader implications of removing the incentive to invest in housing over other assets if existing property owners continue to benefit from “grandfathering” of the current rules. Therein lies the flaw in arguing that this policy change will also help to repair the federal budget (remembering that the owner-occupier still makes their capital gains completely tax free).

Ultimately, the tide has turned politically on negative gearing. Whichever party sells their proposal best (alongside changes to superannuation and the GST) will be the election winner.

Danika Wright, Lecturer in Finance, University of Sydney

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


Leave a comment

Filed under Reblogs

The philosophy of irony and sarcasm

Footnotes to Plato

Characters in ancient Greek comedy

In Ancient Greek comedy, Eiron was a clever underdog who somehow always managed to get the better of his rival, Alazon, by sheer use of wit. The Socratic dialogues by Plato essentially represent Socrates as the philosophical equivalent of Eiron. And, of course, it is from him that we derive the term “irony,” the Greek root of which means dissimulation, feigned ignorance.

Contrast that with sarcasm. That word also has a Greek root, naturally, which meant “to tear flesh, bite the lip in rage, sneer.”

View original post 1,406 more words

Leave a comment

Filed under Uncategorized

French Kitchen, Classic Recipes for Home Cooks, by Serge Dansereau

ANZ LitLovers LitBlog

This is a beaut recipe book. French Kitchen, Classic Recipes for Home Cooks, by Canadian chef Serge Dansereau is exactly what it says it is: great French recipes for do-at-home cooking. It’s written for Australian cooks, using ingredients readily available here. (Sydney people will know the name: Serge Dansereau runs the Bathers Pavilion Restaurant and Café.)

The recipes are arranged by meals:

  • breakfasts;
  • lunch: soup; salads; sandwiches; and tarts, flans, terrines and soufflés;
  • cooking for kids (not McDonalds-type pap; the recipes are mainly small-sized versions of things that adults enjoy too);
  • dinner: fish; chicken and game; and meat; and
  • desserts and baking.

So, in ‘Breakfasts’ it starts with the classic recipe for Crêpes au citron (lemon crêpes), but a few pages further on there’s a recipe for pancakes which Dansereau says you also can thin down to make crêpes, if like him, you have a fussy family where one wants crêpes and the other…

View original post 775 more words

Leave a comment

February 14, 2016 · 8:03 pm

Gravitational waves discovered: top scientists respond

The Conversation

Keith Riles, University of Michigan; Alan Duffy, Swinburne University of Technology; Amanda Weltman, University of Cape Town; Daniel Kennefick, University of Arkansas; David Parkinson, The University of Queensland; Maria Womack, University of South Florida; Stephen Smartt, Queen’s University Belfast; Tamara Davis, The University of Queensland, and Tara Murphy, University of Sydney

One hundred years ago, Albert Einstein published his general theory of relativity, which described how gravity warps and distorts space-time.

While this theory triggered a revolution in our understanding of the universe, it made one prediction that even Einstein doubted could be confirmed: the existence of gravitational waves.

Today, a century later, we have that confirmation, with the detection of gravitational waves by the Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) detectors.

Here we collect reactions and analysis from some of the leading astronomers and astrophysicists from around the world.

Keith Riles, University of Michigan

Keith Riles explains gravitational waves.

Einstein was skeptical that gravitational waves would ever be detected because the predicted waves were so weak. Einstein was right to wonder – the signal detected on September 14, 2015 by the aLIGO interferometers caused each arm of each L-shaped detector to change by only 2 billionths of a billionth of a meter, about 400 times smaller than the radius of a proton.
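The scale of that displacement is easy to sanity-check in a few lines of Python, taking the commonly quoted proton charge radius of about 0.84 femtometres:

```python
# Sanity-check the quoted numbers: an arm-length change of
# "2 billionths of a billionth of a metre" versus a proton radius.

arm_change = 2e-18        # metres, as quoted for the September 14 signal
proton_radius = 0.84e-15  # metres (commonly quoted proton charge radius)

ratio = proton_radius / arm_change
print(f"displacement is ~{ratio:.0f}x smaller than a proton radius")

# Expressed as a dimensionless strain over aLIGO's 4 km arms:
arm_length = 4_000        # metres
strain = arm_change / arm_length
print(f"strain h ~ {strain:.0e}")
```

The ratio comes out around 420 — consistent with the “about 400 times smaller” figure above — and the implied strain of a few parts in 10²² matches the order of magnitude reported for the event.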

It may seem inconceivable to measure such tiny changes, especially with a giant apparatus like aLIGO. But the secret lies in the lasers (the real “L” in LIGO) that are projected down each arm.

Fittingly, Einstein himself indirectly helped make those lasers happen, first by explaining the photoelectric effect in terms of photons (for which he earned the Nobel Prize), and second, by creating (along with Bose) the theoretical foundation of lasers, which create coherent beams of photons, all with the same frequency and direction.

In the aLIGO arms there are nearly a trillion trillion photons per second impinging on the mirrors, all sensing the precise positions of the interferometer mirrors. It is this collective, coherent sensing that makes it possible to determine that one mirror has moved in one direction, while a mirror in the other arm has moved in a different direction. This distinctive, differential motion is what characterizes a gravitational wave, a momentary differential warp of space itself.

By normally operating aLIGO in a mode of nearly perfect cancellation of the light returning from the two arms (destructive interference), scientists can therefore detect the passage of a gravitational wave by looking for a momentary brightening of the output beam.
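As a toy illustration of that dark-fringe readout — the real aLIGO sensing scheme is far more sophisticated, with power and signal recycling and a deliberate small offset from the perfect dark fringe — a simple Michelson interferometer’s output power rises from zero as a differential arm-length change introduces a phase difference:

```python
import math

# Toy Michelson interferometer held at the dark fringe:
# output power fraction ~ sin^2(delta_phi / 2), exactly zero when
# the light returning from the two arms cancels perfectly.

WAVELENGTH = 1.064e-6  # metres; LIGO uses 1064 nm Nd:YAG laser light

def output_fraction(delta_L):
    """Fraction of input power reaching the output port for a
    differential arm-length change delta_L (metres)."""
    delta_phi = 4 * math.pi * delta_L / WAVELENGTH  # round-trip phase difference
    return math.sin(delta_phi / 2) ** 2

print(output_fraction(0.0))             # perfect cancellation: dark
print(output_fraction(WAVELENGTH / 4))  # quarter-wave offset: fully bright
```

Note that near the dark fringe the brightening is quadratic in the tiny displacement, one reason real detectors operate slightly off the perfect null and use far more elaborate readout than this sketch suggests.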

The particular pattern of brightening observed on September 14 agrees remarkably well with what Einstein’s General Theory of Relativity predicts for two massive black holes in the final moments of a death spiral. Fittingly, Einstein’s theory of photons has helped to verify Einstein’s theory of gravity, a century after its creation.

Amanda Weltman, University of Cape Town

The results are in and they are breathtaking. Almost exactly 100 years ago Einstein published “Die Feldgleichungen der Gravitation” in which he laid out a new theory of gravity, his General Theory of Relativity. Einstein not only improved on his predecessor, Newton, by explaining the unexpected orbit of the planet Mercury, but he went beyond and laid out a set of predictions that have shaken the very foundations of our understanding of the universe and our place in it. These predictions include the bending of light leading to lensed objects in the sky, the existence of black holes from which no light can escape as well as the entire framework for our modern understanding of cosmology.

NASA’s Hubble Space Telescope captured gravitational lensing of light, as predicted by Einstein.
NASA, ESA, K. Sharon (Tel Aviv University) and E. Ofek (Caltech), CC BY

Einstein’s predictions have so far all proven true, and today, the final prediction has been directly detected: that of gravitational waves, the tiniest ripples through space; the energy radiated away by two massive heavenly bodies spiralling into each other. This is the discovery of the century, and it is perhaps poetic that one of the places it is being announced is Pisa, the very place where, according to legend, more than 400 years ago, Galileo dropped two massive objects to test how matter reacts to gravity.

As we bathe in the glory of this moment it is appropriate to ask, what is next for astronomy and physics and who will bring about the next revolution? Today’s discovery will become tomorrow’s history. Advanced LIGO brings a new way of testing gravity, of explaining the universe, but it also brings about the end of an era of sorts. It is time for the next frontier, with the Square Kilometre Array project finally afoot across Africa and Australia, the global South and indeed Africa itself is poised to provide the next pulse of gravity research.

Stephen Smartt, Queen’s University Belfast

Not only is this remarkable discovery of gravitational waves an extraordinary breakthrough in physics, it is a very surprising glimpse of a massive black hole binary system, meaning two black holes that are merging together.

Black holes are dark objects with a mass beyond what is possible for neutron stars, which are a type of very compact star – about 10 km across and weighing up to two solar masses. To imagine this kind of density, think of the entire human population squeezed onto a teaspoon. Black holes are even more extreme than that. We’ve known about binary neutron stars for years, and the first detection of gravitational waves was expected to come from two neutron stars colliding.

What we know about black hole pairs so far comes from the study of the stars orbiting around them. These binary systems typically have black holes with masses five to 20 times that of the sun. But LIGO has seen two black holes with about 30 times the mass of the sun in a binary system that has finally merged. This is remarkable for several reasons. It is the first detection of two merging black holes, it is at a much greater distance than LIGO expected to find sources, and the total mass in the system is also much larger than expected.

This raises interesting questions about the stars that could have produced this system. We know massive stars die in supernovae, and most of these supernovae (probably at least 60%) produce neutron stars. The more massive stars have very large cores that collapse and are too massive to be stable neutron stars so they collapse all the way to black holes.

But a binary system with two black holes of around 30 solar masses is puzzling. We know of massive binary star systems in our own and nearby galaxies, and they have initial masses well in excess of 100 suns. But we see them losing mass through enormous radiation pressure and they are predicted, and often observed, to end their lives with masses much smaller – typically about ten times the sun.

If the LIGO object is a pair of 30 solar mass black holes, then the stars that formed it must have been at least as massive. Astronomers will be asking – how can massive stars end their lives so big and how can they create black holes so massive? As well as the gravitational wave discovery, this remarkable result will affect the rest of astronomy for some time.

Alan Duffy, Swinburne University

The detection of gravitational waves is the confirmation of Albert Einstein’s final prediction and ends a century-long search for something that even he believed would remain forever untested.

This discovery marks not the end, but rather the beginning, of an era in which we explore the universe around us with a fundamentally new sense. Touch, smell, sight and sound all use ripples in an electromagnetic field, which we call light, but now we can make use of ripples in the background field of space-time itself to “see” our surroundings. That is why this discovery is so exciting.

The Advanced Laser Interferometer Gravitational-Wave Observatory (aLIGO) measured the tiny stretching of space-time by distant colliding black holes, giving us a unique view into the most extreme objects in general relativity.

The exact “ringing” of space-time as the ripples pass through the detector tests this theory and our understanding of gravity in ways no other experiment can.

We can even probe the way galaxies grow and collide by trying to measure the gravitational waves from the even larger collisions of supermassive black holes as the galaxies they are contained in smash together.

Australia in particular is a leading nation in this search, using distant pulsars as the ruler at the Parkes telescope.

The LIGO detectors are sensitive to the minute ripples in space-time caused by the merging of two black holes.
University of Birmingham Gravitational Waves Group, Christopher Berry

Tara Murphy, University of Sydney

In addition to binary black holes, aLIGO will detect gravitational waves from other events, such as the collision of neutron stars – the dense remnants left over when massive stars collapse.

Astronomers think that two neutron stars colliding may trigger a gamma-ray burst, which we can detect with “regular” telescopes.

Simulation of neutron stars colliding. Credit: NASA

In Australia, we have been using the Murchison Widefield Array and the Australian Square Kilometre Array Pathfinder to follow up aLIGO candidates.

aLIGO is an incredibly sensitive instrument but it has very poor ability to determine where the gravitational waves are coming from. Our radio telescopes can scan large areas of sky extremely quickly, so can play a critical part in identifying the event.

This project has been like no other one I have worked on. When aLIGO identifies a candidate, it sends out a private alert to an international network of astronomers. We respond as quickly as possible with our telescopes, scanning the region the event is thought to have occurred in, to see if we can detect any electromagnetic radiation.

Everything is kept top secret – even the other people using our telescopes are not allowed to know where we are pointing them.

To make sure their complex processing pipeline was working correctly, someone in the aLIGO team inserted fake events into the process. Nobody on the team, or those of us doing follow-up, had any idea whether what we were responding to was real or one of these fake events.

We are truly in an era of big science. This incredible result has been the work of not only hundreds of aLIGO researchers and engineers, but hundreds more astronomers collaborating around the globe. We are eagerly awaiting the next aLIGO observing run, to see what else we can find.

Tamara Davis, University of Queensland

Rarely has a discovery been so eagerly anticipated.

When I was a university undergraduate, almost 20 years ago, I remember a physics lecturer telling us about the experiments trying to detect gravitational waves. It felt like the discovery was imminent, and it was one of the most exciting discoveries that could be made in physics.

Mass and energy warping the fabric of space is one of the pieces of general relativity that most captures the imagination. However, while it has enormous explanatory power, the reality of that curvature is hard to grasp or confirm.

For the last few months I’ve had to sit quietly and watch as colleagues followed up the potential gravitational wave signal. This is the one and only time in my scientific career that I wasn’t allowed to talk about a scientific discovery in progress.

But that’s because it is such a big discovery that we had to be absolutely sure about it before announcing it, lest we risk “crying wolf”.

Every last check had to be done, and of course, we didn’t know whether it was a real signal, or a signal injected by the experimenters to keep us on our toes, test the analysis and follow-up.

I work with a project called the Dark Energy Survey, and with our massive, wide-field, half-billion pixel camera on a four metre telescope in Chile, my colleagues took images trying to find the source of the gravitational waves.

The wide field of view is important, because the gravitational wave detectors aren’t very good at pinpointing the exact location of the source.

Unfortunately if it was a black hole merger, we wouldn’t expect to see any visible light.

Now that we’re in the era of detecting gravitational waves, though, we’ll be able to try again with the next one.

Maria Womack, University of South Florida

This is a momentous change for astronomy. Gravitational-wave astronomy can now truly begin, opening a new window to the universe. Normal telescopes collect light at different wavelengths, such as X-ray, ultraviolet, visible, infrared and radio, collectively referred to as electromagnetic (EM) radiation. Gravitational waves (GWs) are emitted from accelerating mass, analogous to the way electromagnetic waves are emitted from accelerating charge; in both cases, it is acceleration that generates the waves.

The most massive objects with the highest accelerations will be the first events detected. For example, Advanced LIGO, funded by the U.S. National Science Foundation, can detect binary black holes in tight, fast orbits. GWs carry away energy from the orbiting pair, which in turn causes the black holes to shrink their orbit and accelerate even more, until they merge in a violent event, which is now detectable on Earth as a whistling “chirp.”
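The runaway nature of that final plunge can be illustrated with the classic Peters (1964) merger-time formula for a circular binary, t = (5/256) c⁵a⁴ / (G³m₁m₂(m₁+m₂)). The 1,000 km starting separation below is an arbitrary illustrative choice, not a figure from the detection itself:

```python
# Time to merger for a circular compact binary losing energy to
# gravitational waves (Peters 1964). Starting separation is illustrative.

G = 6.674e-11    # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8      # speed of light, m/s
M_SUN = 1.989e30 # solar mass, kg

def merger_time(m1, m2, a):
    """Seconds until merger for masses m1, m2 (kg) at separation a (m)."""
    return (5 / 256) * C**5 * a**4 / (G**3 * m1 * m2 * (m1 + m2))

m = 30 * M_SUN                    # two ~30 solar-mass black holes
t = merger_time(m, m, 1_000_000)  # separated by 1,000 km
print(f"merger in ~{t:.2f} s")
```

At 1,000 km apart, two 30-solar-mass black holes merge in well under a second — and because the time scales as the fourth power of the separation, the spiral-in accelerates dramatically at the end, producing the rapid rise in frequency heard as the “chirp”.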

An example signal from an inspiralling gravitational wave source.
A. Stuver/LIGO, CC BY-ND

The gravitational-wave sky is completely uncharted, and new maps will be drawn that will change how we think of the universe. GWs might be detected coming from cosmic strings, hypothetical defects in the curvature of space-time. They will also be used to study what makes some massive stars explode into supernovae, and how fast the universe is expanding. Moreover, GW and traditional telescopic observing techniques can be combined to explore important questions, such as whether the graviton, the presumed particle that transmits gravity, actually has mass. If it is massless, gravitons will arrive at the same time as photons from a strong event. If gravitons have even a small mass, they will arrive second.

Daniel Kennefick, University of Arkansas

Almost 100 years ago, in February 1916, Einstein first mentioned gravitational waves in writing. Ironically it was to say that he thought they did not exist! Within a few months he changed his mind and by 1918 had published the basis of our modern theory of gravitational waves, adequate to describe them as they pass by the Earth. However his calculation does not apply to strongly gravitating systems like a binary black hole.

Albert Einstein was the original theorist who started the hunt for gravitational waves.

It was not until 1936 that Einstein returned to the problem, eventually publishing one of the earliest exact solutions describing gravitational waves. But his original sceptical attitude was carried forward by some of his former assistants into the postwar rebirth of General Relativity. In the 1950s, doubts were expressed as to whether gravitational waves could carry energy and whether binary star systems could even generate them.

One way to settle these disputes was to carry out painstaking calculations showing how the emission of gravitational waves affected the motion of the binary system. This proved a daunting challenge. Not only were the calculations long and tedious, but theorists found they needed a much more sophisticated understanding of the structure of space-time itself. Major breakthroughs included the detailed picture of the asymptotic structure of space-time, and the introduction of the concept of matched asymptotic expansions. Prior to breakthroughs such as these, many calculations got contradictory results. Some theorists even got answers that the binary system should gain, not lose, energy as a result of emitting gravitational waves!

While the work of the 1960s convinced theorists that binary star systems did emit gravitational waves, debate persisted as to whether Einstein’s 1918 formula, known as the quadrupole formula, correctly predicted the amount of energy they would radiate. This controversy lasted into the early 1980s and coincided with the discovery of the binary pulsar, a real-life system whose orbit was decaying in line with the predictions of Einstein’s formula.

In the 1990s, with the beginnings of LIGO, theorists’ focus shifted to providing even more detailed corrections to formulas such as these. Researchers use descriptions of the expected signal as templates which facilitate the extraction of the signal from LIGO’s noisy data. Since no gravitational wave signals had ever been seen before, theorists found themselves unusually relevant to the detection project – only they could provide such data analysis templates.

David Parkinson, University of Queensland

Gravitational waves can be used to provide a direct probe of the very early universe. The further away we look, the further back in time we can see. But there is a limit to how far back we can see, as the universe was initially an opaque plasma, and remained so even as late as 300,000 years after the Big Bang.

This surface, from which the cosmic microwave background is emitted, represents the furthest back any measurement of electromagnetic radiation can directly investigate.

But this plasma is no impediment for gravitational waves, which will not be absorbed by any intervening matter, but come to us directly. Gravitational waves are predicted to be generated by a number of different mechanisms in the early universe.

For example, the theory of cosmic inflation, which suggests a period of accelerated expansion moments after the Big Bang, goes on to predict not just the creation of all structure that we see in the universe, but also a spectrum of primordial gravitational waves.

It is these primordial gravitational waves that the BICEP2 experiment believed it had detected in March 2014.

BICEP2 measured the polarisation pattern of the cosmic microwave background, and reported a strong detection of the imprint of primordial gravitational waves. These results turned out in fact to be contamination by galactic dust, and not primordial gravitational waves.

But there is every reason to believe that future experiments may be able to detect these primordial gravitational waves, either directly or indirectly, and so provide a new and complementary way to understand the physics of the Big Bang.

Keith Riles, Professor of Physics, University of Michigan; Alan Duffy, Research Fellow, Swinburne University of Technology; Amanda Weltman, SARChI in Physical Cosmology, Department of Mathematics and Applied Mathematics, University of Cape Town; Daniel Kennefick, Associate Professor of Physics, University of Arkansas; David Parkinson, Researcher in astrophysics, The University of Queensland; Maria Womack, Research Professor of Physics, University of South Florida; Stephen Smartt, Professor of Physics and Mathematics, Queen’s University Belfast; Tamara Davis, Professor, The University of Queensland, and Tara Murphy, Associate Professor and ARC Future Fellow, University of Sydney

This article was originally published on The Conversation. (Reblogged by permission). Read the original article.


1 Comment

Filed under Reblogs