New statistical methods would let researchers deal with data in better, more robust ways

From The Conversation.

No matter the field, if a researcher is collecting data of any kind, at some point they will have to analyze it. And odds are they’ll turn to statistics to figure out what the data can tell them.

A wide range of disciplines – such as the social sciences, marketing, manufacturing, the pharmaceutical industry and physics – try to make inferences about a large population of individuals or things based on a relatively small sample. But many researchers are using antiquated statistical techniques that have a relatively high probability of steering them wrong. And that’s a problem if it means we’re misunderstanding how well a potential new drug works, or the effects of some treatment on a city’s water supply, for instance.

As a statistician who’s been following advances in the field, I know there are vastly improved methods for comparing groups of individuals or things, as well as understanding the association between two or more variables. These modern robust methods offer the opportunity to achieve a more accurate and more nuanced understanding of data. The trouble is that these better techniques have been slow to make inroads within the larger scientific community.

What if these mice aren’t actually representative of all the other mice out there? Cmdragon, CC BY-SA

When classic methods don’t cut it

Imagine, for instance, that researchers gather a group of 40 individuals with high cholesterol. Half take drug A, while the other half take a placebo. The researchers discover that those in the first group have a larger average decrease in their cholesterol levels. But how well do the outcomes from just 20 people reflect what would happen if thousands of adults took drug A?

Or on a more cosmic scale, consider astronomer Edwin Hubble, who measured how far 24 galaxies are from Earth and how quickly they’re moving away from us. Data from that small group let him draw up an equation that predicts a galaxy’s so-called recession velocity given its distance. But how well do Hubble’s results reflect the association among all of the millions of galaxies in the universe if they were measured?

In these and many other situations, researchers use small sample sizes simply because of the cost and general difficulty of obtaining data. Classic methods, routinely taught and used, attempt to address these issues by making two key assumptions.

First, scientists assume there’s a particular equation for each individual situation that will accurately model the probabilities associated with possible outcomes. The most commonly used equation corresponds to what’s called a normal distribution. The resulting plot of the data is bell-shaped and symmetric around some central value.

Curves based on equations that describe different symmetric data sets. Inductiveload

Second, researchers assume the amount of variation is the same for both groups they’re comparing. For example, in the drug study, cholesterol levels will vary among the millions of individuals who might take the medication. Classic techniques assume that the amount of variation among the potential drug recipients is exactly the same as the amount of variation in the placebo group.

A similar assumption is made when studying associations. Consider, for example, a study examining the relationship between age and some measure of depression. Among the millions of individuals aged 20, there will be variation among their depression scores. The same is true at age 30, 80 or any age in between. Classic methods assume that the amount of variation is the same for any two ages we might pick.

All these assumptions allow researchers to use methods that are theoretically and computationally convenient. Unfortunately, they might not yield reasonably accurate results.

While writing my book “Introduction to Robust Estimation and Hypothesis Testing,” I analyzed hundreds of journal articles and found that these methods can be unreliable. Indeed, concerns about theoretical and empirical results date back two centuries.

When the groups that researchers are comparing do not differ in any way, or there is no association, classic methods perform well. But if groups differ or there is an association – which is certainly not uncommon – classic methods may falter. Important differences and associations can be missed, and highly misleading inferences can result.

Recognizing these problems can even make things worse, if researchers try to work around the limitations of classic statistical methods using ineffective or technically invalid approaches. Transforming the data, or tossing out outliers – extreme data points that lie far from the other values – doesn’t necessarily fix the underlying issues.

A new way

Recent major advances in statistics provide substantially better methods for dealing with these shortcomings. Over the past 30 years, statisticians have solidified the mathematical foundation of these new methods. We call the resulting techniques robust, because they continue to perform well in situations where conventional methods fall down.

Conventional methods provide exact solutions when all those previously mentioned assumptions are met. But even slight violations of these assumptions can be devastating.

The new robust methods, on the other hand, provide approximate solutions when these assumptions are true, making them nearly as accurate as conventional methods. But it’s when the situation changes and the assumptions aren’t true that the new robust methods shine: They continue to give reasonably accurate solutions for a broad range of situations that cause trouble for the traditional ways.

Depression scores among older adults. The data are not symmetric, like you’d see in a normal curve. Rand Wilcox, CC BY-ND

One specific concern is the commonly occurring situation where plots of the data are not symmetric. In a study dealing with depression among older adults, for example, a plot of the data is highly asymmetric – roughly because most adults are not overly depressed.

Outliers are another common challenge. Conventional methods assume that outliers are of no practical importance. But of course that’s not always true, so outliers can be disastrous when using conventional methods. Robust methods offer a technically sound – though not obvious, based on standard training – way to deal with this issue that provides a much more accurate interpretation of the data.
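One robust estimator that handles this issue is the trimmed mean, which discards a fixed fraction of the smallest and largest observations before averaging, blunting the influence of outliers. A minimal sketch in Python, using made-up numbers rather than data from any study mentioned here:

```python
def trimmed_mean(values, proportion=0.2):
    """Drop the lowest and highest `proportion` of values, then average the rest."""
    xs = sorted(values)
    g = int(len(xs) * proportion)          # number trimmed from each tail
    kept = xs[g:len(xs) - g]
    return sum(kept) / len(kept)

# Hypothetical sample: mostly moderate values plus one extreme outlier.
sample = [4.2, 4.8, 5.1, 5.3, 5.6, 5.9, 6.0, 6.2, 6.4, 42.0]

print(sum(sample) / len(sample))   # ordinary mean: 9.15, dragged up by the 42.0
print(trimmed_mean(sample))        # 20% trimmed mean: about 5.68
```

A single wild observation nearly doubles the ordinary mean, while the trimmed mean barely moves – which is the sense in which such estimators are "robust".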

Another major advance has been the creation of bootstrap methods, which are more flexible inferential techniques. Combining bootstrap and robust methods has led to a vast array of new and improved techniques for understanding data.
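The core bootstrap idea is straightforward: resample the observed data with replacement many times, recompute the statistic of interest on each resample, and read a confidence interval off the resulting distribution. A rough percentile-bootstrap sketch for the difference in trimmed means between two groups, with purely hypothetical data (not from any study discussed above):

```python
import random

def trimmed_mean(values, proportion=0.2):
    """20% trimmed mean: drop the tails, average what remains."""
    xs = sorted(values)
    g = int(len(xs) * proportion)
    kept = xs[g:len(xs) - g]
    return sum(kept) / len(kept)

def bootstrap_ci(a, b, n_boot=5000, alpha=0.05, seed=1):
    """Percentile-bootstrap CI for the difference in trimmed means."""
    rng = random.Random(seed)
    diffs = []
    for _ in range(n_boot):
        a_star = [rng.choice(a) for _ in a]   # resample with replacement
        b_star = [rng.choice(b) for _ in b]
        diffs.append(trimmed_mean(a_star) - trimmed_mean(b_star))
    diffs.sort()
    lo = diffs[int(n_boot * alpha / 2)]
    hi = diffs[int(n_boot * (1 - alpha / 2)) - 1]
    return lo, hi

# Hypothetical cholesterol reductions: drug group vs placebo group.
drug    = [12, 15, 9, 14, 18, 11, 16, 13, 10, 17]
placebo = [5, 7, 4, 6, 9, 3, 8, 6, 5, 7]
lo, hi = bootstrap_ci(drug, placebo)
print(f"95% bootstrap CI for the difference: ({lo:.1f}, {hi:.1f})")
```

If the interval excludes zero, the groups plausibly differ – and no normality or equal-variance assumption was needed to say so.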

These modern techniques not only increase the likelihood of detecting important differences and associations, but also provide new perspectives that can deepen our understanding of what data are trying to tell us. There is no single perspective that always provides an accurate summary of data. Multiple perspectives can be crucial.

In some situations, modern methods offer little or no improvement over classic techniques. But there is vast evidence illustrating that they can substantially alter our understanding of data.

Education is the missing piece

So why haven’t these modern approaches supplanted the classic methods? Conventional wisdom holds that the old ways perform well even when underlying assumptions are false – even though that’s not so. And most researchers outside the field don’t follow the latest statistics literature that would set them straight.

There is one final hurdle that must be addressed if modern technology is to have a broad impact on our understanding of data: basic training.

Most intro stats textbooks don’t discuss the many advances and insights that have occurred over the last several decades. This perpetuates the erroneous view that, in terms of basic principles, there have been no important advances since the year 1955. Introductory books aimed at correcting this problem are available and include illustrations on how to apply modern methods with existing software.

Given the millions of dollars and the vast amount of time spent on collecting data, modernizing basic training is absolutely essential – particularly for scientists who don’t specialize in statistics. Otherwise, important discoveries will be lost and, in many instances, a deep understanding of the data will be impossible.

Author: Rand Wilcox, Professor of Statistics, University of Southern California – Dornsife College of Letters, Arts and Sciences

Treasury Yields May Fall Short of Consensus Views

From Moody’s

Once again, the 10-year Treasury yield confounds the consensus. As of early April, the consensus had predicted that the benchmark 10-year Treasury would average 2.6% during 2017’s second quarter. To the contrary, the 10-year Treasury yield has averaged a much lower 2.29% thus far in the second quarter, including a recent 2.30%. Moreover, the 10-year Treasury yield has moved in a direction opposite to what otherwise might be inferred from March 14’s hiking of the fed funds midpoint from 0.625% to 0.875%. For example, April 27’s 10-year Treasury yield of 2.30% was less than its 2.62% close of March 13, just prior to the latest Fed rate hike.

The latest decline by Treasury bond yields since March 14’s Fed rate hike stems from a slower than anticipated pace for business activity that has helped to rein in inflation expectations. March’s unexpectedly small addition of 98,000 workers to payrolls increases the risk of lower than expected household expenditures that could bring a quick end to the ongoing series of Fed rate hikes.

As inferred from the CME Group’s FedWatch tool, the fed funds futures contract assigns a negligible 4.3% probability to a Fed rate hike at the May 3 meeting of the FOMC. However, the likelihood of a rate hike soars to 70.6% at June 14’s FOMC meeting. Thus, do not be surprised if the policy statement of May 3’s FOMC meeting strongly hints at a June rate hike. Nevertheless, a June rate hike probably requires the return of at least 140,000 new jobs per month, on average, for April and May.

Unlike the Treasury bond market’s more sober view of business prospects since the March 14 rate hike, equities have rallied and the high-yield bond spread has narrowed. Incredibly, the VIX index sank to 10.6 on April 27, which was less than each of its prior month-long averages. The closest was the 10.8 of November 2006, when the high-yield bond spread averaged 330 bp. Thus, if the VIX index does not climb higher over the next several weeks, the high-yield bond spread is likely to narrow from an already thin 385 bp.

Market value of common stock nears record percent of revenues

As equity market overvaluation heightens the risk of a deep drop by share prices, Treasury bonds become a more attractive insurance policy in case the equity bubble bursts. This is especially true if the next harsh equity-market correction is triggered by a contraction of profits, as opposed to an inflation-inspired jump by interest rates.

Equities are now very richly priced relative to corporate gross-value-added, where the latter aggregates the value of the final goods and services produced by corporations. Basically, gross value added nets out the value of the intermediate materials and services from which final products are produced.

The market value of US common stock now approximates 226% of the estimated gross value added of US corporations, where the latter is a proxy for corporate revenues. During the previous cycle, the ratio peaked at the 185% of Q2-2007 and then bottomed at the 103% of Q1-2009. The ratio is now the highest since the 231% of Q1-2000. Not only was the latter a record high, but it also coincided with a cycle peak for the market value of US common stock.

All else the same, the fair value of equities should decline as bond yields increase. Thus, the overvaluation implicit to Q1-2000’s atypically high ratio of the market value of common stock to corporate gross value added was compounded by Q1-2000’s relatively steep long-term Baa industrial company bond yield of 8.28%. Even if the market value of US common stock now matched Q1-2000’s 231% of corporate gross value added, Q1-2000’s equity market appears to be much more overvalued largely because the April 26, 2017 long-term Baa industrial company bond yield of 4.65% was well under the 8.28% of Q1-2000. (Figure 1.)

Consensus implicitly foresees record-long business upturn

Be it the Blue Chip or the Bloomberg survey, the consensus long-term outlook for interest rates suggests a great deal of confidence in the longevity of the current business cycle upturn. April’s consensus projects a steady climb by fed funds and the 10-year Treasury yield into 2021, by which time the forecast looks for yearlong averages of 2.88% for the fed funds’ midpoint and 3.6% for the 10-year Treasury yield.

Thus, the consensus implicitly expects that the current economic recovery (which is about to finish its eighth year in July 2017) will reach an exceptional 12th year in 2021. The implied expectation of a record long business cycle upturn is derived from the observation that each previous recession since 1979 has prompted significant declines by both fed funds and the 10-year Treasury yield. The absence of any predicted drop by the 10-year Treasury yield’s yearlong average between now and the end of 2022 is tantamount to forecasting an economic recovery of record length. (Figure 2.)

The current record-holder among economic recoveries is the upturn of April 1991 through February 2001 that lasted about 9.75 years. In a distant second place is the upturn of December 1982 through June 1990 that covered roughly 7.5 years. If the consensus proves correct about the duration of the ongoing upturn, a seemingly overvalued equity market is far from exhausting its upside potential.

Long-term outlook on profits requires low long-term bond yields

The consensus also maintains positive views on the near- and long-term outlook for pretax profits from current production. April’s Blue Chip consensus not only expects pretax operating profits to grow by 4.9% in 2017 and by 4.2% in 2018, but March’s long-term outlook projected profits growth averaging 4.0% annually over the five years ending 2023. The realization of seven consecutive years of profits growth requires the avoidance of a possibly disruptive climb by interest rates. Thus, the 10-year Treasury might well have difficulty spending much time above 3% if, as expected, corporate gross value added’s average annual growth rate is less than 4%. (Figure 3.)

Morrison’s budget switch points at infrastructure boom

From The New Daily.

The government has bent to calls from experts and Labor by clearing away the accounting impediments to a big spend on infrastructure.

In a speech on Thursday, his last before he hands down the May 9 budget, Treasurer Scott Morrison promised to change how the budget reports the deficit.

Instead of reporting the ‘underlying cash balance’ (which counts “good and bad debt”) prominently and burying the ‘net operating balance’ (which only counts “bad debt”), Mr Morrison said he will put them side by side from now on.

“While the net operating balance has been a longstanding feature of our budget papers … it has not been in clear focus. This change will bring us into line with the states and territories, who report on versions of the net operating balance, as well as key international counterparts including New Zealand and Canada.”

In this context, “good debt” is borrowings for economy-boosting capital expenditure, such as roads and trains that reduce the time it takes to get to work, while “bad debt” is borrowing to cover the cost of defence and welfare.

As an example, in the latest MYEFO budget update, the projected underlying cash deficit for 2017-18 was $28.7 billion but the net operating deficit was only $19.2 billion.

Mr Morrison’s pledge was a marked reversal on his comments late last year when he said the government would only take on “so-called good debt” for infrastructure spending once it had brought “bad debt” under control.

The Coalition will soon, perhaps in the next six months, be forced to administratively lift the $500 billion gross debt ceiling to allow the government to keep borrowing. Nevertheless, the government will heed the calls of experts for debt-fuelled stimulus.

Various expert bodies, including the Reserve Bank, have been prodding the government to take advantage of record-low borrowing costs to renew Australia’s public infrastructure.

In his farewell address, former RBA governor Glenn Stevens said the economy would only be pulled out of its malaise if “someone, somewhere, has both the balance sheet capacity and the willingness to take on more debt and spend”.

“Let me be clear that I am not advocating an increase in deficit financing of day-to-day government spending,” Mr Stevens said.

“The case for governments being prepared to borrow for the right investment assets – long-lived assets that yield an economic return – does not extend to borrowing to pay pensions, welfare and routine government expenses, other than under the most exceptional circumstances.”

Credit ratings agencies, the International Monetary Fund and the OECD have also encouraged infrastructure spending.

And in a discussion paper last year, Labor’s shadow finance minister Jim Chalmers called for consultation on the “optimal budget presentation for intelligent investment in productivity enhancing infrastructure assets” and the idea of splitting out “spending on productive economic assets such as infrastructure from recurrent expenditure”.

Labor took a very different line on Thursday, with Shadow Treasurer Chris Bowen accusing the government of employing accounting “smoke and mirrors” to hide its economic mismanagement.

Anthony Albanese, the opposition’s infrastructure spokesman, welcomed the change but accused the government of wasting the last four years coming to the decision.

“Treasurer Scott Morrison’s declaration today that at a time of record low interest rates it makes sense to borrow for projects that boost economic productivity is precisely what Labor, the Reserve Bank and economists have been saying for years,” Mr Albanese said.

He warned the government’s “ill-advised” decision to create an infrastructure unit within the Department of Prime Minister and Cabinet, rather than relying on the independent Infrastructure Australia, risked pork barrelling.

“Creating another bureaucracy to sideline the independent adviser makes no sense. The government should have already learned that lesson from its creation of the Northern Australia Investment Facility, which was announced two years ago but has not invested in a single project.”

ABC 7:30 Does Good and Bad Debt

So the latest pivot from the Government is a focus on “good and bad debt” as an apparent key to growth, with housing affordability now becoming more of a sideshow as the realisation dawns that they cannot solve that equation. This segment discusses the issue, and includes a contribution from DFA.


Inflation number misses the housing crisis

From The New Daily.

Almost nothing is to be seen of Australia’s housing crisis in the latest inflation figures.

Wednesday’s Consumer Price Index (CPI) from the Australian Bureau of Statistics showed consumer prices rising just 0.5 per cent this quarter, and 2.1 per cent over the past 12 months.

Over the same period, house prices grew 1.4 per cent, and by 75 per cent over the past five years.

The biggest increases in the CPI were in fuel, healthcare, power and, yes, housing.

However, as pointed out recently by Commonwealth Bank senior economist Gareth Aird, this ‘housing’ figure, which accounts for 22 per cent of the CPI calculation, does not truly reflect the struggle of many Australians to get onto the property ladder.

“The CPI is a poor barometer of changes in the cost of living for people who don’t own a dwelling and aspire to purchase one,” Mr Aird wrote.

That’s because the CPI measure of ‘housing’ only counts rents, utilities and the cost of building a new dwelling. It doesn’t include the cost of the land the dwelling sits on. And it doesn’t include the interest costs of repaying a mortgage.
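The mechanics help explain why this matters: headline CPI is essentially a weighted average of component price changes, so what a component does and doesn’t count flows directly into the headline number. A toy calculation – the 22 per cent housing weight comes from the article, but the other weights and the price changes are hypothetical, not the actual ABS basket:

```python
# Illustrative CPI component weights (housing's 22% is from the article;
# the rest are hypothetical) and made-up quarterly price changes.
weights = {"housing": 0.22, "food": 0.16, "transport": 0.10, "other": 0.52}
changes = {"housing": 0.008, "food": 0.002, "transport": 0.012, "other": 0.003}

# Headline inflation is the weight-averaged sum of component changes.
headline = sum(weights[c] * changes[c] for c in weights)
print(f"headline CPI change for the quarter: {headline:.2%}")
```

Because the housing component tracks rents, utilities and building costs rather than land or mortgage interest, even rapid house price growth barely registers in that 22 per cent slice.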

If the full cost of housing was factored in, Mr Aird estimated it would add roughly 55 percentage points to headline inflation.

As mentioned above, the CPI ‘housing’ measure also doesn’t include interest charges. It used to, but they were removed in 1998 after lobbying from the Reserve Bank, which argued that rising mortgage interest rates would push up inflation, thereby requiring official cash rate rises, which would then push mortgage rates even higher, in a vicious loop.

The RBA said then that “excluding interest charges would in no way distort the outcome over the long run”.

Australia Institute senior research fellow David Richardson said if the CPI were to include land prices, the inflation rate would be dragged around too much by almost out-of-control house prices.

“Imagine if things went up 15 per cent a year in price,” Mr Richardson told The New Daily.

“Lots of contracts in Australia are indexed against the CPI. If they’re sort of fiddled then you’re talking billions and billions in consequences.”

Marcel van Kints, program manager with the Prices Branch of the ABS Macroeconomic Statistics Division, told The New Daily: “The ABS CPI aligns with international standards, an internationally respected measure of inflation.”

The ABS also published an FAQ with Wednesday’s release in which it set out its reasoning for excluding land from the CPI.

It said that housing is included in the Selected Living Cost Indexes, which are “particularly suited to assessing whether or not the disposable incomes of households have kept pace with price changes”.

Inflation outpaces wages

All of this is seeing many Australians left behind as both housing and the prices of popular consumer goods rise while wages stagnate.

Over the past 12 months, the CPI rose 2.1 per cent while wages grew by only 1.9 per cent, according to the latest ABS data.

The Australia Institute’s David Richardson said this is leaving many Australians worse off.

“I suspect that as professionals and skilled white collar workers, we’re all in the same boat,” he said.

“What we’re seeing now is a symptom of structural change that’s been creeping up on us for a long time.”

Inflation rises 0.5 per cent in the March quarter 2017

The data from the ABS today shows that the Consumer Price Index (CPI) rose 0.5 per cent in the March quarter 2017. This follows a rise of 0.5 per cent in the December quarter 2016. The CPI rose 2.1 per cent through the year to March quarter 2017. The trimmed mean was 1.9 per cent. Housing costs rose 2.5 per cent in the past year.

In original terms, Melbourne prices rose 0.9% in the quarter, highlighting the pressure on households there. As we said the other day, average CPI is understating what is happening in Victoria at the moment.

This data confirms the next RBA cash rate adjustment is more likely up than down. It also underscores the flat or falling income growth households are experiencing. More pressure, more mortgage stress, as cost-of-living rises outstrip income growth in a rising mortgage rate market.

The most significant price rises this quarter are automotive fuel (+5.7 per cent), medical and hospital services (+1.6 per cent) and new dwelling purchase by owner-occupiers (+1.0 per cent). These rises are partially offset by falls in Furnishings, household equipment and services (-1.0 per cent) and Recreation and culture (-0.7 per cent).

Vegetable prices have risen 13.1 per cent through the year to March quarter 2017. Adverse weather conditions in major growing areas over previous periods continue to impact supply for particular vegetables (potatoes, salad vegetables, cabbages and cauliflower). Offsetting these rises are price falls for capsicums and broccoli.

Top Of The Housing Cycle? – UBS

From Investor Daily.

Australian house price growth will slow to 7 per cent in 2017 before it collapses to between zero and 3 per cent in 2018, predicts UBS.

In a new housing outlook report, UBS said it is “calling the top” for Australian residential housing activity despite a surprise rebound in February approvals to 228,000.

While the “historical trigger” for a housing downturn is missing (namely, RBA interest rate hikes), mortgage rates are rising and home buyer sentiment is at a near record low, said UBS.

“Hence, we are ‘calling the top’, but stick to our forecasts for commencements to ‘correct but not collapse’ to 200,000 in 2017 and 180,000 in 2018,” said the report.

House prices are rising four times faster than incomes, noted UBS, which is unsustainable and suggests that growth has peaked.

“We see a moderation to [approximately] 7 per cent in 2017 and 0-3 per cent in 2018, amid record supply and poor affordability, with the new buyer mortgage repayment share of income spiking to a decade high,” said UBS.

The report also pointed to the March 2017 Rider Levett Bucknall residential crane count, which has more than tripled since 2013 to a record 548, but is now flat year-on-year.

Housing affordability has gone from “bad to even worse”, said UBS, with the house price to income ratio soaring to a record 6.5.

“With record low rates, repayments haven’t yet reached historical tipping points where prices fell, but would if mortgage rates rose by only [approximately] 100 basis points,” said the report.
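The "repayment share of income" logic behind that claim can be sketched with the standard amortising-loan formula. The loan size, income and interest rates below are hypothetical round numbers for illustration, not UBS’s actual inputs:

```python
def monthly_repayment(principal, annual_rate, years=30):
    """Standard amortising-loan repayment: P*r / (1 - (1 + r)**-n),
    where r is the monthly rate and n the number of monthly payments."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

# Hypothetical borrower: A$600,000 loan against A$100,000 gross income.
loan, income = 600_000, 100_000
for rate in (0.045, 0.055):            # current rate vs roughly +100bp
    share = 12 * monthly_repayment(loan, rate) / income
    print(f"rate {rate:.2%}: repayments take {share:.1%} of income")
```

Even a 100-basis-point rise lifts the repayment burden by several percentage points of income, which is why UBS flags it as the potential tipping point.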

The gross rental yield for a two-bedroom unit has fallen to a record low of less than 4 per cent, said UBS, which is now below mortgage rates of 4.25-4.50 per cent.

UBS also pointed to Australia’s household debt to GDP ratio of 123 per cent, which is one of the highest in the world.

Modelling shows how many billions in revenue the government is missing out on

From The Conversation.

The federal government could collect billions more in royalties and tax revenue if it changed the rules on debt loading and adopted alternative royalty schemes in dealing with oil and gas giants, new modelling shows.

Our modelling, funded by lobby group GetUp, found that over the four-year period from 2012 to 2015, Chevron’s average effective interest rate was 6.4%. However, it declined steadily, from 7.8% in 2012 to 5.7% in 2015.

We estimated that if Australia adopted a similar approach to Hong Kong to eliminate debt loading abuse, United States oil and gas giant Chevron would have been denied A$6.27 billion in interest deductions, potentially increasing tax revenues by A$1.89 billion over the four-year period (2012-2015).

The issue of debt loading abuse was highlighted last week when the full bench of the Federal Court unanimously dismissed Chevron’s appeal against the Australian Taxation Office (ATO), ordering the company to pay more than A$300 million.

Chevron Australia was using debt loading: relative to its equity, it carried a large amount of debt, borrowed at a high interest rate from its US subsidiary (which itself borrows at much lower rates). It did this in order to shift profits from high-tax to low-tax jurisdictions.

Under Australia’s existing “thin capitalisation” rules, there is a maximum allowable debt on which interest deductions can be claimed in a company’s tax return. Companies can exceed this debt, but the interest charges must then be at “arm’s length” – that is, at commercial rates.
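In outline, a thin-capitalisation test caps the deduction at the interest payable on the maximum allowable debt. A highly simplified sketch – the safe-harbour ratio and the figures are illustrative only, and the actual rules use adjusted average values and several alternative tests:

```python
def allowable_interest_deduction(debt, assets, interest_rate, safe_harbour=0.60):
    """Cap the interest deduction at interest on the maximum allowable debt.

    Hypothetical sketch of a thin-capitalisation safe-harbour test; the
    real Australian rules are considerably more involved."""
    max_debt = safe_harbour * assets          # maximum allowable debt
    deductible_debt = min(debt, max_debt)     # excess debt earns no deduction
    return deductible_debt * interest_rate

# Hypothetical subsidiary: A$40bn of debt against A$50bn of assets, at 6.4%.
print(allowable_interest_deduction(40e9, 50e9, 0.064))  # interest on A$30bn only
```

The point of the cap is that piling on extra related-party debt beyond the safe harbour buys no further deduction unless the pricing survives the arm’s-length test.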

Chevron’s size and financial strength allow it to negotiate very competitive (low) rates on its external borrowings and this was the main issue in the Federal Court case. As the court has now ruled on what constitutes a reasonable interest rate for inter-company loans, this benchmark will likely be adopted by the ATO.

It can now approach and enforce this benchmark in similar disputes with confidence that companies engaged in debt loading will want to settle rather than engage in a costly court battle.

What the government could save from addressing debt loading

Chevron’s tax avoidance measures meant the interest rate, adjusted for maximum allowable debt, varied only slightly from their effective rate. Our modelling showed that if the ATO had applied the thin capitalisation rules to Chevron’s accounts each year over the four-year period, it would have reduced Chevron’s interest deduction by A$461 million and potentially generated an additional tax liability of A$138 million.

We modelled a scenario where Chevron Australia’s interest deductions were limited to the group’s external interest rate, applied to its level of debt. This would have reduced the interest deduction by A$4.8 billion over the four-year period, potentially generating A$1.4 billion in additional tax revenue.

We also worked out what would happen if Australia applied the debt loading rules Hong Kong does currently. Hong Kong disallows all deductions for related-party interest payments, making abuse of the system difficult. According to the latest ATO submission to the Senate tax inquiry, investment in the extraction of Australian oil and gas is almost entirely in the form of related-party loans.

Chevron Australia’s debt is entirely made up of related-party loans. If the Hong Kong solution was operating in Australia, we found that Chevron would have been denied A$6.275 billion in interest deductions, potentially increasing tax revenues by A$1.89 billion over the four-year period.

We also looked at ExxonMobil Australia, which also has high amounts of related-party debt (98.5%), and the Hong Kong solution would have denied ExxonMobil A$2.7 billion in interest deductions, potentially increasing tax revenue by more than A$800 million for the same period.

US oil and gas company Chevron lost a Federal Court appeal against the ATO. Toru Hanai/ Reuters

Changing the PRRT for more revenue

Our report also includes an analysis of the potential for additional revenue from replacing the Petroleum Resource Rent Tax (PRRT) with resource rent systems used in the US and Canada. Oil and gas sales have increased from an average of A$5.96 billion per year between 1988 and 1991, to an annual average of A$33.3 billion between 2012 and 2015, indicating the huge growth in this sector.

We modelled what would happen if the US and the Alberta, Canada, royalty schemes were applied to Australian production volumes and realised prices, to compare returns to those achieved by the PRRT.

The US royalty scheme charges a flat percentage royalty on production volumes, priced at the well-head. The royalty rate was progressively increased in the US from 12.5% to 18.75% between 2006 and 2008.

Based on the data from Australian production volumes and realised sales prices, the US royalty scheme could have potentially raised an additional A$5.9 billion in revenue for Australia since 1988, or A$212 million per year.

However, over the period from 2010 to 2015, the additional revenues would have been almost A$2.5 billion per year. This is because of both the decline in the PRRT revenues, relative to price and volumes, and the increase in the royalty rate in the US.

However, while the US scheme would raise more than the PRRT, the Alberta royalty scheme would raise substantially more revenue than both of these schemes. The Alberta scheme is progressive in nature, meaning the royalty rate increases with the realised price, similar to income levels and personal income tax rates.
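The difference between the two designs can be sketched as follows. The 18.75% flat rate is the US figure cited above, but the progressive price brackets and rates are purely illustrative, not the actual Alberta schedule:

```python
def flat_royalty(volume, price, rate=0.1875):
    """US-style royalty: a fixed share of well-head value."""
    return rate * volume * price

# Purely illustrative (price ceiling, royalty rate) brackets; the real
# Alberta schedule is considerably more complex.
BRACKETS = ((40, 0.10), (80, 0.25), (float("inf"), 0.40))

def progressive_royalty(volume, price, brackets=BRACKETS):
    """Alberta-style sketch: the royalty rate rises with the realised price."""
    for ceiling, rate in brackets:
        if price <= ceiling:
            return rate * volume * price

for price in (30, 60, 100):            # hypothetical realised prices per unit
    print(price, flat_royalty(1000, price), progressive_royalty(1000, price))
```

At low prices the progressive scheme collects less than the flat scheme, but at high prices it collects far more – which is why it raises substantially more revenue over a period of strong realised prices.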

The Alberta scheme has been amended many times and the current scheme only started in January 2017, so the full effects of this scheme will not be evident for some time. However, based on the data from Australian production volumes and realised sales prices, we calculate the Alberta royalty scheme would have raised an additional A$103 billion in revenue since 1988, or an additional A$3.7 billion per year.

As the scheme was only implemented this year, these results may be unrealistic, but are indicative of the level of revenue that could be raised. Over the period from 2010 to 2015, the additional revenue would have been A$11.3 billion per year.

The modelling done for our report considers just two multinational corporations, their use of debt loading and the PRRT. Now we can hope for more revenue collection from many of the multinationals operating in Australia, as a result of the recent Federal Court ruling.

Critically, corporations are too often able to work within Australia's tax rules to avoid paying for operating here, arguing that they cannot develop business in Australia without tax breaks. Our modelling demonstrates that governments need to ensure corporations benefiting from the use of Australia's resources contribute as much as they do in other jurisdictions.

Authors: Roman Lanis, Associate Professor, Accounting, University of Technology Sydney; Brett Govendir, Lecturer, University of Technology Sydney; Ross McClure, PhD Candidate, casual academic, University of Technology Sydney

The man who would rid the world of GDP

From The Conversation.

The South Africa-based economist Lorenzo Fioramonti is one of the leading critics of the fact that we measure the well-being of society using a single statistic. In three books, most recently The World After GDP (2017), he argues that the economic activity captured in gross domestic product (GDP) has been the priority for policies and incentives around the world for the past few decades – with disastrous results.

At a recent guest lecture at the Scottish parliament in Edinburgh that was organised by the Carnegie UK Trust, Fioramonti told his audience that GDP is a fundamentally flawed measure of economic performance, let alone well-being.

It has been foisted on the world by rich countries, especially the US, and the political interests that they represent. Just as those who live by the sword die by the sword, any democratic government can expect to lose power if it fails to increase GDP.

Back in 1992 it was James Carville, Bill Clinton’s director of strategy, who kept repeating to the future president the phrase: “It’s the economy, stupid”. Carville knew President Bush would struggle to defend his handling of the economy. He insisted Clinton repeatedly raise weak GDP growth to show Bush was failing to lead the country.

Sure enough, it helped win the election. Case closed? Not according to Fioramonti.

Simplicity and complexities

Economists like GDP because it is a single statistic. It seems precise. But as Fioramonti pointed out on a day the Scottish parliament had been debating a small fall in Scottish GDP, the initial estimates are always subject to revision. Important variables are only available after taxes have been paid, so the most accurate figures take two or three years. By the time those are published, there probably won’t be any debates in parliament on the subject.

Then there is how to measure GDP. Most countries total all of the income that activities produce, ranging from the wages of individuals to the revenue of companies. But this can lead to all kinds of distortions. Take Ireland, for example. If a UK shopper buys a product online from a retailer domiciled in Ireland, that retailer’s income will be counted as part of Ireland’s GDP.

That would be perfectly reasonable for, say, an Irish shop with a website. But many multinationals put all their European sales through an Irish business unit for tax purposes. Consequently Irish GDP is no longer an accurate measure of the economy’s performance.

When I interviewed Fioramonti after his lecture, he quickly rejected any suggestion that you could solve these problems simply by having a better measure of GDP. This would simply continue to confuse the wealth of the nation with its income, and fail to value other factors important to our well-being such as sustainability. If an offshore drilling company is depleting the Great Barrier Reef, say, focusing on GDP merely continues to prioritise the business success over the environmental damage.

Fioramonti linked the primacy of GDP to the development of Keynesian thought and the perceived need to measure national income after the global slump of the 1930s. His characterisation of the use of GDP in political analysis reminded me of Keynes’ claim that “madmen in authority, who hear voices in the air, are distilling their frenzy from some academic scribbler of a few years back”.

But Fioramonti believes economists cannot shrug off responsibility for politicians’ use of GDP. The Keynesian economists who adopted GDP growth as a policy target after the war ignored Keynes’ own critique that monetary values cannot truly measure well-being. And when Keynesian demand management failed to achieve strong GDP growth in the 1960s and 1970s, the neoliberal economists who came to the fore compounded the problem by making that growth an even greater priority.

For Fioramonti, weaning the world off GDP is a little like playing chess: you need to win by accepting the rules and conventions of the game before you can change the game. In other words, you need to demonstrate to advocates that, as in the Irish example, GDP no longer measures well-being.

So far so compelling, but I must admit I struggled with his proposed alternative. Fioramonti envisages a “census of assets” – a 21st-century global Domesday Book that would be a record of how people value the assets they need for a good life. It would include everything from jobs to shelter to the surrounding countryside. It would use the language of sustainability and need, and what was included would be subject to a public vote.

I pressed him on how we might value and compare the multiple sources of well-being that are essential to an alternative approach. He was clear it wouldn’t be primarily about assigning monetary values to things.

You would accept that different categories would be measured in different ways and that these would all be part of the mix. Where it made sense you would monitor resource depletion, for instance reducing the value you ascribe to the Great Barrier Reef as appropriate.

All these measurements would go towards a national “performance dashboard” – in line with a concept being promoted by the Carnegie Trust. The trust shares Fioramonti’s interest in measuring well-being and incidentally sees Scotland’s efforts to score its government policies using a wide range of indicators as being at the leading edge.

Our discussion was moving rapidly away from economics towards something much broader. Fioramonti said he considers even social interactions to be vital to well-being. I certainly agreed with this, but it highlights a problem of practicality: the challenge in developing Fioramonti’s census will be balancing the easily measurable factors associated with well-being against the broader range that are arguably just as important.

It is not made easier because Fioramonti and other critics of GDP seem to value dialogue rather than statistical measurement. He talks about beating the economists at their own chess game, but he seems to have left the table after an opening gambit.

Author: Robert Mochrie, Associate Professor of Economics, Heriot-Watt University

US weekly earnings increase 4.2 percent

According to the US Bureau of Labor Statistics, weekly earnings of the nation’s 110.7 million full-time wage and salary workers were $865 (not seasonally adjusted) in the first quarter of 2017, an increase of 4.2 percent from a year earlier ($830).

From the first quarter of 2016 to the first quarter of 2017, median usual weekly earnings increased 4.2 percent for men who usually worked full time and 2.0 percent for women. In the first quarter of 2017, women who usually worked full time had median weekly earnings of $765, or 80.5 percent of the $950 median for men.

Among the major race and ethnicity groups, median weekly earnings for full-time wage and salary workers were $894 for Whites in the first quarter of 2017, $679 for Blacks or African Americans, $1,019 for Asians, and $649 for workers of Hispanic or Latino ethnicity.

These data are from the Current Population Survey.