Banks and Fintech – Where Do They Fit?

US Fed Governor Lael Brainard spoke on “Where Do Banks Fit in the Fintech Stack?” at the Northwestern Kellogg Public-Private Interface Conference, “New Developments in Consumer Finance: Research & Practice”.

In particular, she explored the different approaches banks are taking to exposing their data in a fintech context, and the regulatory implications. Smaller banks may be at a disadvantage.

Different Approaches to the Fintech Stack

Because of the high stakes, fintech firms, banks, data aggregators, consumer groups, and regulators are all still figuring out how best to do the connecting. There are a few alternative approaches in operation today, with various advantages and drawbacks.

A number of large banks have developed or are in the process of developing interfaces to allow outside developers access to their platforms under controlled conditions. Similar to Apple opening the APIs of its phones and operating systems, these financial companies are working to provide APIs to outside developers, who can then build new products on the banks’ platforms. It is worth highlighting that platform APIs generally vary in their degree of openness, even in the smartphone world. If a developer wants to use a Google Maps API to embed a map in her application, she first must create a developer account with Google, agreeing to Google’s terms and conditions. This means she will have entered a contract with the owner of the API, and the terms and conditions may differ depending on how sensitive the particular API is. Google may require only a minimum amount of information for a developer that wants to use an API to display a map. Google may, however, require more information about a developer that wants to use a different API to monitor the history of a consumer’s physical locations over the previous week. And in some cases, the competitive interests of Google and a third-party app developer may diverge over time, such that the original terms of access are no longer acceptable.
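To make the idea of tiered API access concrete, here is a minimal sketch of what a scoped token request might look like from a developer's side. The endpoint, credential names and scopes are hypothetical illustrations, not taken from the speech or from any real bank.

```python
import requests

# Hypothetical open-banking token endpoint; all names here are illustrative only.
TOKEN_URL = "https://api.examplebank.com/oauth2/token"

def request_access_token(client_id: str, client_secret: str, scopes: list[str]) -> str:
    """Ask the platform for a token limited to the requested scopes.

    A low-sensitivity scope (say, 'accounts:balances:read') might be granted
    with minimal vetting, while a high-sensitivity scope (say,
    'transactions:history:read') could come with stricter terms and conditions.
    """
    resp = requests.post(
        TOKEN_URL,
        data={
            "grant_type": "client_credentials",
            "client_id": client_id,
            "client_secret": client_secret,
            "scope": " ".join(scopes),
        },
        timeout=10,
    )
    resp.raise_for_status()
    return resp.json()["access_token"]
```

The point of the sketch is simply that the platform, not the developer, decides which scopes exist and what terms attach to each of them.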

The fact that it is possible and indeed relatively common for the API provider–the platform–to require specific controls and protections over the use of that API raises complicated issues when imported to the banking world. As banks have considered how to facilitate connectivity, the considerations include not only technical issues and the associated investment, but also the important legal questions associated with operating in a highly regulated sector. The banks’ terms of access may be determined in third-party service provider agreements that may offer different degrees of access. These may affect not only what types of protections and vetting are appropriate for different types of access over consumers’ funds and data held at a bank in order to enable the bank to fulfill its obligations for data security and other consumer protections, but also the competitive position of the bank relative to third-party developers.

There is a second broad type of approach in which many banks have entered into agreements with specialized companies that essentially act as middlemen, frequently described as “data aggregators.” These banks may lack the budgets and expertise to create their own open APIs or may not see that as a key element in their business strategies. Data aggregators collect consumer financial account data from banks, on the one hand, and then provide access to that data to fintech developers, on the other hand. Data aggregators organize the data they collect from banks and other data sources and then offer their own suite of open APIs to outside developers. By partnering with data aggregators, banks can open their systems to thousands of developers, without having to invest in creating and maintaining their own open APIs. This also allows fintech developers to build their products around the APIs of two or three data aggregators, rather than 15,000 different banks and other data sources. And, if agreements between data aggregators and banks are structured as data aggregators performing outsourced services to banks, the bank should be able to conduct the appropriate due diligence of its vendors, whose services to those banks may be subject to examination by safety and soundness regulators.

Some banks have opted for a more “closed” approach to fintech developers by entering into individual agreements with specific technology providers or data aggregators. These agreements often impose specific requirements rather than simply facilitating structured data feeds. These banks negotiate for greater control over their systems by limiting who is accessing their data–often to a specific third party’s suite of products. Likewise, many banks use these agreements to limit what types of data will be shared. For instance, banks may share information about the balances in consumers’ accounts but decline to share information about fees or other pricing. While recognizing the legitimate need for vetting of third parties for purposes of the banks fulfilling their responsibilities, including for data privacy and security, some consumer groups have suggested that the standards for vetting should be commonly agreed to and transparent to ensure that banks do not restrict access for competitive reasons and that consumers should be able to decide what data to make available to third-party fintech applications.

A third set of banks may be unable or unwilling to provide permissioned access, for reasons ranging from fears about increased competition to concerns about the cost and complexity of ensuring compliance with underlying laws and regulations. At the very least, banks may have reasonable concerns about being able to see, if not control, which third-party developers will have access to the banking data that is provided by the data aggregators. Accordingly, even banks that have previously provided structured data feeds to data aggregators may decide to limit or block access. In such cases, however, data aggregators can still move forward to collect consumer data for use by fintech developers without the permission or even potentially without the knowledge of the bank. Instead, data aggregators and fintech developers directly ask consumers to give them their online banking logins and passwords. Then, in a process commonly called “screen scraping,” data aggregators log onto banks’ online consumer websites, as if they were the actual consumers, and extract information. Some banks report that as much as 20 to 40 percent of online banking logins are attributable to data aggregators. They even assert that they have trouble distinguishing whether a computer system that is logging in multiple times a day is a consumer, a data aggregator, or a cyber attack.
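For illustration, the mechanics of screen scraping look roughly like the sketch below: the aggregator replays the consumer's own credentials against the bank's login form and parses the resulting HTML. Every URL, form field and page selector here is hypothetical.

```python
import requests
from bs4 import BeautifulSoup

# All endpoints, form fields and CSS selectors below are made up for illustration.
LOGIN_URL = "https://online.examplebank.com/login"
ACCOUNTS_URL = "https://online.examplebank.com/accounts"

def scrape_balances(username: str, password: str) -> dict[str, str]:
    """Log in as if we were the consumer and extract account balances."""
    session = requests.Session()
    session.post(LOGIN_URL, data={"user": username, "pass": password}, timeout=10)
    page = session.get(ACCOUNTS_URL, timeout=10)
    soup = BeautifulSoup(page.text, "html.parser")
    balances = {}
    for row in soup.select("tr.account-row"):
        name = row.select_one(".account-name").get_text(strip=True)
        balance = row.select_one(".account-balance").get_text(strip=True)
        balances[name] = balance
    return balances
```

From the bank's side, such a session is hard to distinguish from an ordinary customer login, which is exactly the difficulty the speech describes.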

For community banks with limited resources, the necessary investments in API technology and in negotiating and overseeing data-sharing agreements with data aggregators and third-party providers may be beyond their reach, especially as they usually rely on service providers for their core technology. Some fintech firms argue that screen scraping–which has drawn the most complaints about data security–may be the most effective tool for the customers of small community banks to access the financial apps they prefer–and thereby necessary to remain competitive until more effective broader industry solutions are developed.

Clearly, getting these connectivity questions right, including the need to manage the consumer protection risks, is critically important. It could make the difference between a world in which the fintech wave helps community banks become the platforms of the future, on the one hand, or, on the other hand, a world in which fintech instead further widens the gulf between community banks and the largest banks.

 

Don’t bet the house on a property market correction

From The New Daily.

Experts have warned against predicting that property prices have peaked just yet.

A flurry of headlines this week, generated by UBS analysts, Australian Financial Review columnists and others, warned that Sydney and possibly Melbourne prices had peaked and that we should brace for a correction.

Most were based on slower price growth in Sydney dwelling values and slight reductions in auction clearance rates compiled by CoreLogic, a property data firm.

However, CoreLogic director of research Tim Lawless cautioned against reading too much into the results (especially dwelling values, which are yet to be officially released for April) because April and May are generally weaker periods.

“Potentially there is some seasonality creeping into these numbers and that’s one of the reasons why I would probably suggest caution calling the peak right now before we see a few more months and see if the trend actually develops,” Mr Lawless told The New Daily.

“When we look at, say, a year ago or any sort of seasonality in the marketplace, yeah, we do generally see some easing in our reading around April and May.”

A further complication is that CoreLogic adjusted how it calculated dwelling values in May 2016 to account for seasonality. The result, according to Mr Lawless, is that “technically speaking, there are some challenges and complexities making a year-to-year comparison”, although he said the adjustments were “quite minor” and values could still be compared.

The change sparked a scandal last year, with the Reserve Bank ditching the company as its preferred data source after claiming it had overstated dwelling values in April and May.

Despite this, CoreLogic remains the most widely cited property data source because it reports dwelling values daily. But the most authoritative source is the Australian Bureau of Statistics, which has measured similar quarter-on-quarter falls in the past, especially between the December and June quarters. And yet the trend has been ever upwards.

IFM chief economist Dr Alex Joiner agreed we shouldn’t jump to conclusions based on the latest statistics.

“I wouldn’t suggest that anyone looks at any month-to-month data in Australia and makes firm conclusions from it,” Dr Joiner told The New Daily.

“People might want to rush to call the top, but the trends are for gradually decelerating growth, and I think that’s about right.”

But if this is not the peak, the market is “very much approaching it” because the Reserve Bank and the banks are likely to lift interest rates even as wage growth stays low, Dr Joiner said.

“When that actually decelerates price growth, whether it’s this month or later in the year, I don’t know. But we’re certainly eking out the very last stages of price growth in the property market.”

New statistical methods would let researchers deal with data in better, more robust ways

From The Conversation.

No matter the field, if a researcher is collecting data of any kind, at some point he is going to have to analyze it. And odds are he’ll turn to statistics to figure out what the data can tell him.

A wide range of disciplines – such as the social sciences, marketing, manufacturing, the pharmaceutical industry and physics – try to make inferences about a large population of individuals or things based on a relatively small sample. But many researchers are using antiquated statistical techniques that have a relatively high probability of steering them wrong. And that’s a problem if it means we’re misunderstanding how well a potential new drug works, or the effects of some treatment on a city’s water supply, for instance.

As a statistician who’s been following advances in the field, I know there are vastly improved methods for comparing groups of individuals or things, as well as understanding the association between two or more variables. These modern robust methods offer the opportunity to achieve a more accurate and more nuanced understanding of data. The trouble is that these better techniques have been slow to make inroads within the larger scientific community.

What if these mice aren’t actually representative of all the other mice out there? Cmdragon, CC BY-SA

When classic methods don’t cut it

Imagine, for instance, that researchers gather a group of 40 individuals with high cholesterol. Half take drug A, while the other half take a placebo. The researchers discover that those in the first group have a larger average decrease in their cholesterol levels. But how well do the outcomes from just 20 people reflect what would happen if thousands of adults took drug A?

Or on a more cosmic scale, consider astronomer Edwin Hubble, who measured how far 24 galaxies are from Earth and how quickly they’re moving away from us. Data from that small group let him draw up an equation that predicts a galaxy’s so-called recession velocity given its distance. But how well do Hubble’s results reflect the association among all of the millions of galaxies in the universe if they were measured?

In these and many other situations, researchers use small sample sizes simply because of the cost and general difficulty of obtaining data. Classic methods, routinely taught and used, attempt to address these issues by making two key assumptions.

First, scientists assume there’s a particular equation for each individual situation that will accurately model the probabilities associated with possible outcomes. The most commonly used equation corresponds to what’s called a normal distribution. The resulting plot of the data is bell-shaped and symmetric around some central value.
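For reference, the bell-shaped curve being described is the normal density with mean $\mu$ and standard deviation $\sigma$; the formula below is the standard one, supplied here for context rather than quoted from the article:

$$
f(x) = \frac{1}{\sigma\sqrt{2\pi}}\,\exp\!\left(-\frac{(x-\mu)^2}{2\sigma^2}\right)
$$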

Curves based on equations that describe different symmetric data sets. Inductiveload

Second, researchers assume the amount of variation is the same for both groups they’re comparing. For example, in the drug study, cholesterol levels will vary among the millions of individuals who might take the medication. Classic techniques assume that the amount of variation among the potential drug recipients is exactly the same as the amount of variation in the placebo group.
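A minimal sketch of how this assumption plays out in practice, using invented data in the spirit of the drug example above: scipy's two-sample t-test assumes equal variances by default, while Welch's version drops that assumption, and the two can disagree when the groups' spreads differ.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(0)
# Invented cholesterol reductions: the drug group is more variable than placebo
drug = rng.normal(loc=20, scale=15, size=20)
placebo = rng.normal(loc=10, scale=5, size=20)

# Classic Student's t-test: assumes both groups share the same variance
t_student, p_student = stats.ttest_ind(drug, placebo, equal_var=True)

# Welch's t-test: drops the equal-variance assumption
t_welch, p_welch = stats.ttest_ind(drug, placebo, equal_var=False)

print(p_student, p_welch)  # the p-values diverge as the variances diverge
```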

A similar assumption is made when studying associations. Consider, for example, a study examining the relationship between age and some measure of depression. Among the millions of individuals aged 20, there will be variation among their depression scores. The same is true at age 30, 80 or any age in between. Classic methods assume that the amount of variation is the same for any two ages we might pick.

All these assumptions allow researchers to use methods that are theoretically and computationally convenient. Unfortunately, they might not yield reasonably accurate results.

While writing my book “Introduction to Robust Estimation and Hypothesis Testing,” I analyzed hundreds of journal articles and found that these methods can be unreliable. Indeed, concerns about theoretical and empirical results date back two centuries.

When the groups that researchers are comparing do not differ in any way, or there is no association, classic methods perform well. But if groups differ or there is an association – which is certainly not uncommon – classic methods may falter. Important differences and associations can be missed, and highly misleading inferences can result.

Even recognizing these problems can make things worse, if researchers try to work around the limitations of classic statistical methods using ineffective or technically invalid approaches. Transforming the data, or tossing out outliers – extreme data points that lie far from the other values – doesn’t necessarily fix the underlying issues.

A new way

Recent major advances in statistics provide substantially better methods for dealing with these shortcomings. Over the past 30 years, statisticians have solidified the mathematical foundation of these new methods. We call the resulting techniques robust, because they continue to perform well in situations where conventional methods fall down.

Conventional methods provide exact solutions when all those previously mentioned assumptions are met. But even slight violations of these assumptions can be devastating.

The new robust methods, on the other hand, provide approximate solutions when these assumptions are true, making them nearly as accurate as conventional methods. But it’s when the situation changes and the assumptions aren’t true that the new robust methods shine: They continue to give reasonably accurate solutions for a broad range of situations that cause trouble for the traditional ways.

Depression scores among older adults. The data are not symmetric, like you’d see in a normal curve. Rand Wilcox, CC BY-ND

One specific concern is the commonly occurring situation where plots of the data are not symmetric. In a study dealing with depression among older adults, for example, a plot of the data is highly asymmetric – roughly because most adults are not overly depressed.

Outliers are another common challenge. Conventional methods assume that outliers are of no practical importance. But of course that’s not always true, so outliers can be disastrous when using conventional methods. Robust methods offer a technically sound – though not obvious, based on standard training – way to deal with this issue that provides a much more accurate interpretation of the data.
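One widely used robust estimator is the trimmed mean, which discards a fixed proportion of the smallest and largest values before averaging. A minimal sketch with invented data:

```python
import numpy as np
from scipy import stats

# Eight typical values plus one wild outlier
data = np.array([2.1, 2.4, 2.5, 2.7, 2.9, 3.0, 3.1, 3.3, 95.0])

print(np.mean(data))               # 13.0, dragged far away from the bulk of the data
print(stats.trim_mean(data, 0.2))  # ~2.84, the 20% trimmed mean stays representative
```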

Another major advance has been the creation of bootstrap methods, which are more flexible inferential techniques. Combining bootstrap and robust methods has led to a vast array of new and improved techniques for understanding data.
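To make the combination concrete, here is a minimal sketch of a percentile bootstrap confidence interval for a 20% trimmed mean, with invented skewed data; serious analyses would use purpose-built routines rather than this bare-bones version.

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
data = rng.lognormal(mean=0.0, sigma=1.0, size=40)  # skewed, not bell-shaped

# Resample the data with replacement and recompute the robust estimate each time
boot = np.array([
    stats.trim_mean(rng.choice(data, size=data.size, replace=True), 0.2)
    for _ in range(5000)
])

low, high = np.percentile(boot, [2.5, 97.5])
print(f"95% bootstrap CI for the 20% trimmed mean: ({low:.2f}, {high:.2f})")
```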

These modern techniques not only increase the likelihood of detecting important differences and associations, but also provide new perspectives that can deepen our understanding of what data are trying to tell us. There is no single perspective that always provides an accurate summary of data. Multiple perspectives can be crucial.

In some situations, modern methods offer little or no improvement over classic techniques. But there is vast evidence illustrating that they can substantially alter our understanding of data.

Education is the missing piece

So why haven’t these modern approaches supplanted the classic methods? Conventional wisdom holds that the old ways perform well even when underlying assumptions are false – even though that’s not so. And most researchers outside the field don’t follow the latest statistics literature that would set them straight.

There is one final hurdle that must be addressed if modern technology is to have a broad impact on our understanding of data: basic training.

Most intro stats textbooks don’t discuss the many advances and insights that have occurred over the last several decades. This perpetuates the erroneous view that, in terms of basic principles, there have been no important advances since the year 1955. Introductory books aimed at correcting this problem are available and include illustrations on how to apply modern methods with existing software.

Given the millions of dollars and the vast amount of time spent on collecting data, modernizing basic training is absolutely essential – particularly for scientists who don’t specialize in statistics. Otherwise, important discoveries will be lost and, in many instances, a deep understanding of the data will be impossible.

Author: Rand Wilcox, Professor of Statistics, University of Southern California – Dornsife College of Letters, Arts and Sciences

Sydney property prices down: CoreLogic

From The Real Estate Conversation.

CoreLogic has revealed the property market has been largely flat during the month of April, ahead of the release of its end-of-month numbers on Monday.

CoreLogic’s hedonic home value index for Australia’s top five property markets held virtually steady in the first 27 days of the month, indicating that the current cycle could be moving through its peak.

Sydney prices recorded a “subtle” decline, according to CoreLogic, a dramatic though welcome turnaround from the blistering 18.8 per cent annual growth recorded in March. The five-city aggregate also recorded an exceptionally strong result in March, rising 12.9 per cent over the year despite a 4.7 per cent annual decline in Perth prices.

Leeanne Pilkington, deputy president of the Real Estate Institute of New South Wales, says the April decline in Sydney prices was only very slight, and will vary from suburb to suburb.

“None of my agents are telling me they’re worried about prices going down,” she said.

However, Pilkington said her agents are reporting lower numbers at open houses, which means there could be less competition between buyers in the market.

“We’ve seen that [trend] with the lower clearance rate last week,” she said. Pilkington said clearance rates above 80 per cent were not sustainable, and that a modest decline in clearance rates would actually be desirable.

“We really want some stability in the market,” she said.

Pilkington said April was a holiday month, containing both Easter and ANZAC day, so the numbers for the month may not reflect the true state of the market. Auction clearance rates over the weekend will provide clearer guidance, she said.

Tim Lawless, head of research Asia Pacific with CoreLogic, attributes the flat overall result to recent regulatory changes which have led to higher mortgage rates and weaker investment demand, causing a “dampening” effect on the property market.

Prudential perspectives on the property market

Wayne Byres, APRA Chairman, spoke at CEDA’s 2017 NSW Property Market Outlook in Sydney today. Of note, he explains why the 10% investor speed limit was not reduced: because of the potential impact on the pipeline of residential construction, much of which is funded through commercial property lending!

My remarks today come, unsurprisingly, from a prudential perspective. Property prices and yields, planning rules, the role of foreign purchasers, supply constraints, and taxation arrangements are all important elements of any discussion on property market conditions, and I’m sure the other speakers today will touch on most of those issues in some shape or form. But I’ll focus on APRA’s key objective when it comes to property: making sure that standards for property lending are prudent, particularly in an environment of heightened risk.

Sound lending standards

Our recent activity in relation to residential and commercial property lending has been directed at ensuring banking institutions maintain sound lending standards. Our ultimate goal is to protect bank depositors – it is, after all, ultimately their money that banks are lending. Basic banking – accepting money from depositors and lending to sound borrowers who have good prospects of repaying their loans – is what it’s all about. Of course, banking is about risk-taking and it is inevitable that not every loan will be fully repaid, but with appropriate lending standards and sufficient diversification, the risk of losses that jeopardise the financial health of a bank – and therefore the security offered to depositors – can be reduced to a significant degree. The banking system is heavily exposed to the inevitable cycles in property markets, and our goal is to seek to make sure the system can readily withstand those cycles without undue stress.

Our mandate goes no further than that. We also have to take many influences on the property market – tax policy, interest rates, planning laws, foreign investment rules – as a given. And there are credit providers beyond APRA’s remit, so a tightening in one credit channel may just see the business flow to other providers anyway. For those reasons, there are clear limits on the influence we have. Property prices are driven by a range of local and global factors that are well beyond our control: whether prices go up or go down, we are, like King Canute, unable to hold back the tide.

Of course, that is not to ignore the fact that one determinant of property market conditions is access to credit. We acknowledge that in influencing the price and availability of credit, we do have an impact on real activity – and this may feed through to asset prices in a range of ways. But I want to emphasise that we are not setting out to control prices. Property prices will go up and they will go down (even for Sydney residential property!). It is not our job to stop them doing either of those things. Rather, our goal is to make sure that whichever way prices are moving at any particular point in time in any particular location, prudentially-regulated lenders are alert to the property cycle and making sound lending decisions. That is the best way to safeguard bank depositors and the stability of the financial system.

Residential property lending

APRA has been ratcheting up the intensity of its supervision of residential property lending over the past five or so years. Initially, this involved some fairly typical supervisory measures:

  • in 2011 and again in 2014, we sought assurances from the Boards of the larger lenders that they were actively monitoring their housing lending portfolios and credit standards;
  • in 2013, we commenced more detailed information collections on a range of housing loan risk metrics;
  • in 2014, we stress-tested the largest lenders against scenarios involving a significant housing market downturn;
  • also in 2014, we issued a Prudential Practice Guide on sound risk management practices for residential mortgage lending; and
  • we have conducted numerous hypothetical borrower exercises to assess differences in lending standards between lenders, and changes over time.

These steps are typical of the role of a prudential supervisor: focusing on the strength of the governance, risk management and financial resources supporting whatever line of business is being pursued, without being too prescriptive on how that business should be undertaken.

But from the end of 2014, we stepped into some relatively new territory by defining specific lending benchmarks, and making clear that lenders that exceeded those benchmarks risked incurring higher capital requirements to compensate for their higher risk. In particular, we established quantitative benchmarks for investor lending growth (10 per cent), and interest rate buffers within serviceability assessments (the higher of 7 per cent, or 2 per cent over the loan product rate), as a means of reversing a decline in lending standards that competition for growth and market share had generated.
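Expressed as a worked example (my restatement of the rule quoted above, not APRA's own code), the serviceability benchmark amounts to a simple floor-plus-buffer calculation:

```python
def assessment_rate(product_rate: float) -> float:
    """APRA's 2014 serviceability benchmark: assess the borrower at the
    higher of 7 per cent, or 2 percentage points over the loan product rate."""
    return max(7.0, product_rate + 2.0)

print(assessment_rate(4.49))  # 7.0  -> the 7 per cent floor binds at low rates
print(assessment_rate(5.75))  # 7.75 -> the 2 point buffer binds at higher rates
```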

I regard these recent measures as unusual, and not reflective of our preferred modus operandi. We came to the view, however, that the higher-than-normal prescription was warranted in the environment of high house prices, high household debt, low interest rates, low income growth and strong competitive pressures.  In such an environment, it is easy for borrowers to build up debt. Unfortunately, it is much harder to pay that debt back down when the environment changes. So re-establishing a sound foundation in lending standards was a sensible investment.

Since we introduced these measures in late 2014, investor lending has slowed and serviceability assessments have strengthened. But at the same time, housing prices and debt have got higher, official interest rates have fallen further and wage growth remains subdued. So we recently added an additional benchmark on the share of new lending that is occurring on an interest-only basis (30 per cent) to further reduce vulnerabilities in the system.

Each of these measures has been a tactical response to evolving conditions, designed to improve the resilience of bank balance sheets in the face of forces that might otherwise weaken them. We will monitor their effectiveness over time, and can do more or less as need be. We have also flagged that, at a more strategic level, we intend to review capital requirements for mortgage lending as part of our work on establishing ‘unquestionably strong’ capital standards, as recommended by the Financial System Inquiry (FSI).

Looking at the impact so far, I have already noted that our earlier measures have helped slow the growth in investor lending (Chart 1), and lift the quality of new lending. Serviceability tests have strengthened, although, as one would expect in a diverse market, practices still range from the quite conservative to the less so. Lenders subject to APRA’s oversight have increasingly eschewed higher risk business (often by reducing maximum loan-to-valuation ratios (Chart 2)), or charged a higher price for it.

For example, there is now a clear price differential between lending to owner-occupiers on a principal and interest basis, and lending to investors on an interest-only basis (Chart 3). And as a result of our most recent guidance to lenders, we expect some further tightening to occur.

Looked at more broadly, the most important impact has been to reduce the competitive pressure to loosen lending standards as a means of chasing market share. We are not seeking to interfere in the ability of lenders to compete on price, service standards or other aspects of the customer experience. We do, however, want to reduce the unfortunate tendency of lenders, lulled by a long period of buoyant conditions, to compete away basic underwriting standards.

Of course, lenders not regulated by APRA will still provide competitive tension in that area and it is likely that some business, particularly in the higher risk categories, will flow to these providers. That is why we also cautioned lenders who provide warehouse facilities to make sure that the business they are funding through these facilities was not growing at a materially faster rate than the lender’s own housing loan portfolio, and that lending standards for loans held within warehouses were not of a materially lower quality than would be consistent with industry-wide sound practices. We don’t want the risks we are seeking to dampen coming onto bank balance sheets through the back door.

Commercial real estate lending

For all of the current focus on residential property lending, it has been cycles in commercial real estate (CRE) that have traditionally been the cause of stress in the banking system. So we are always quite interested in trends and standards in this area of lending. And tighter conditions for residential lending will also impact on lenders’ funding of residential construction portfolios – we need to be alert to the inter-relationships between the two.

Overall, lending for commercial real estate remains a material concentration of the Australian banking system. But while commercial lending exposures of APRA-regulated lenders continue to grow in absolute terms, they have declined relative to the banking system’s capital (partly reflecting the expansion in the system’s capital base). Exposures are now well down from pre-GFC levels as a proportion of capital, albeit much of the reduction was in the immediate post-crisis years and, more recently, the relative position has been fairly steady (Chart 4).

The story has been broadly similar in most sub-portfolios, with the notable exception of land and residential development exposures (Chart 5).

These have grown strongly as the banking industry has funded the significant new construction activity that has been occurring, particularly in the capital cities of Australia’s eastern seaboard. At the end of 2015, these exposures were growing extremely rapidly at just on 30 per cent per annum, but have since slowed significantly as newer projects are now being funded at little more than the rate at which existing projects roll off (Chart 6).

(As an aside, this is one reason why we opted not to reduce the 10 per cent investor lending benchmark for residential lending recently. There is a fairly large pipeline of residential construction to be absorbed over the course of 2017, and there is little to be gained from unduly constricting that at this point in time.)

In response to the generally low interest rate environment, coupled with relatively high price growth in some parts of the commercial market, low capitalisation rates and indications that underwriting standards were under competitive pressure, we undertook a thematic review of commercial property lending over 2016. We looked at the portfolio controls and underwriting standards of a number of larger domestic banks, as well as foreign bank branches which have been picking up market share and growing their commercial real estate lending well above system growth rates.

The review found that major lenders were well aware of the need to monitor commercial property lending closely, and the need to stay attuned to current and prospective market conditions. But the review also found clear evidence of an erosion of standards due to competitive pressures – for example, lenders justifying a particular underwriting decision not on their own risk appetite and policies, but on what they understood to be the criteria being applied by a competitor. We were also keen to see genuine scrutiny and challenge of whether aspirations for growth in commercial property lending were achievable, given the position in the credit cycle, without compromising the quality of lending. This was often hampered by inadequate data, poor monitoring and incomplete portfolio controls. Lenders have been tasked to improve their capabilities in this regard.

Given the more heterogeneous nature of commercial property lending, it is more difficult to implement the sorts of benchmarks that we have applied to residential lending. But that should not be read to imply we have any less interest in the quality of commercial property lending. Our workplan certainly has further investigation of commercial property lending standards in 2017, and we will keep the need for additional guidance material under consideration.

Concluding remarks

So to sum up, property exposures – both residential and commercial – will remain a key area of focus for APRA for the foreseeable future. Sound lending standards are vital for the stability and safety of the Australian banking system, and given the high proportion of both residential mortgage and commercial property lending in loan portfolios, there will be no let up in the intensity of APRA’s scrutiny. But despite the fact that the merits of our actions are often assessed based on their expected impact on prices, that is not our goal. Prudence (not prices) is our catch cry: our objective is to ensure that, whatever the next stage of the property cycle may bring, the balance sheet of the banking system is resilient to it.

ANZ Hikes Investor Loans (Again)

From news.com.au.

ONE of the nation’s largest banks, ANZ, has lifted interest rates on home loan deals.

The bank has followed in the footsteps of rivals the Commonwealth Bank and Westpac, moving interest rates on both owner occupier and investor loans.

Some of the moves also include decreases and are effective immediately.

The moves come ahead of the Reserve Bank of Australia board meeting on Tuesday where it’s expected they will keep the cash rate on hold at 1.5 per cent.

Owner occupiers and investors signing up to interest-only fixed rate deals will be the worst hit with some hikes as high as 0.4 per cent.

On two-, four- and five-year fixed owner occupier interest-only loans, rates will rise by 0.4 per cent on the bank’s Breakfree products (one of the bank’s most popular packages).

On one of the most popular fixed loan terms, three-year owner occupier interest-only loans will rise by 0.3 per cent to 4.49 per cent, increasing repayments on a $300,000 30-year loan by $75 per month to $1123.

For investors on a three-year fixed-rate interest-only Breakfree deal, the rate will rise 0.3 per cent to 4.69 per cent, pushing up repayments by $75 per month to $1173.

For both owner occupiers and investors on principal and interest fixed rate deals, rates on nearly all these products will fall.

The three-year fixed rate owner occupier principal and interest deal will fall by 0.2 per cent to 3.99 per cent, taking repayments to $1431 and saving customers $34 per month.

On a three-year fixed rate investor principal and interest deal, the rate will fall by 0.1 per cent to 4.44 per cent.
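These repayment figures can be checked with standard formulas: interest-only repayments are simply the monthly interest on the principal, while principal and interest repayments follow the annuity formula. A quick sketch using the article's own loan size and rates:

```python
def interest_only_repayment(principal: float, annual_rate: float) -> float:
    """Monthly repayment on an interest-only loan: one month's interest."""
    return principal * annual_rate / 12

def pandi_repayment(principal: float, annual_rate: float, years: int = 30) -> float:
    """Monthly repayment on a principal and interest loan (annuity formula)."""
    r = annual_rate / 12
    n = years * 12
    return principal * r / (1 - (1 + r) ** -n)

print(f"{interest_only_repayment(300_000, 0.0449):.2f}")  # 1122.50, the quoted $1123
print(f"{interest_only_repayment(300_000, 0.0469):.2f}")  # 1172.50, the quoted $1173
print(f"{pandi_repayment(300_000, 0.0399):.2f}")          # ~1430.5, the quoted $1431
```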

An ANZ spokesman said the changes “reflect our need to closely manage our regulatory obligations, portfolio risk and the competitive environment.”

Mozo spokeswoman Kirsty Lamont said the increases by ANZ are a result of the financial regulator, the Australian Prudential Regulation Authority, limiting interest-only lending.


“It’s now more important than ever for interest only borrowers to do their homework on where to find the best rates in this current climate of tighter regulation,” she said.

“With the Federal Reserve jacking up rates in the US and inflation just scraping within the Reserve Bank target, we expect a cash rate increase in the next 12 months which means these fixed rates are unlikely to be around for a long time.”

Investor Loan Growth Outpaces Owner Occupied In March

The latest data from the RBA, the credit aggregates, shows that loan growth was strongest for investment home loans, at an annualised rate of 7.1% compared with owner occupied loans at 6.2%. Business lending fell again, and personal credit continues to fall.

The proportion of lending to business fell to 32.8% (a record low) and the proportion of home lending for investors sat at 34.9%.

Total credit grew $9.7 billion (up 0.4%), owner occupied lending rose $6.7 billion (up 0.6%), investment loans rose $2.5 billion (up 0.4%) and lending to business was up $1 billion (up 0.1%).

However, the RBA adjusts these numbers to take account of a $1.2 billion restatement between owner occupied and investment loans. Overall housing credit rose 6.5% in the past 12 months, way above income growth, so household debt is higher once again.

Comparing the RBA and APRA data, it looks like the share of non-bank investor home lending is rising. Of course these lenders are not under APRA’s regulatory control, but fall under ASIC (and they are not required to hold capital, as they are not ADIs). This is a loophole.

The RBA notes:

Following the introduction of an interest rate differential between housing loans to investors and owner-occupiers in mid-2015, a number of borrowers have changed the purpose of their existing loan; the net value of switching of loan purpose from investor to owner-occupier is estimated to have been $51 billion over the period of July 2015 to March 2017, of which $1.2 billion occurred in March 2017. These changes are reflected in the level of owner-occupier and investor credit outstanding. However, growth rates for these series have been adjusted to remove the effect of loan purpose changes.
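A minimal sketch of the adjustment the RBA describes: to measure growth that reflects new lending rather than relabelling, the net value switched out of a category is added back before the growth rate is computed. The $1.2 billion March figure is from the RBA note above; the opening balance below is illustrative.

```python
def purpose_adjusted_growth(opening: float, closing: float, net_switched_out: float) -> float:
    """Monthly growth with the effect of loan-purpose switching removed.

    net_switched_out: value reclassified out of this category during the month,
    e.g. investor loans relabelled as owner-occupier.
    """
    return (closing + net_switched_out - opening) / opening

# Illustrative: a $545bn investor book ending the month at $546.3bn after
# $1.2bn was switched out shows headline growth of only 0.24%...
print(round(100 * (546.3 - 545) / 545, 2))
# ...but underlying growth of 0.46% once switching is added back.
print(round(100 * purpose_adjusted_growth(545, 546.3, 1.2), 2))
```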

Mortgage Lending Strong in March 2017

APRA have just released their monthly banking statistics for March 2017. Overall housing lending by the banks (ADIs) rose $7.1 billion to $1.54 trillion, up 0.47%, or 7.5% over the past 12 months, way, way ahead of income growth!

Owner occupied loans grew by 0.49% to $998 billion and investment loans rose 0.43% to $545 billion. No slowdown yet, despite the recent regulatory “tightening” and interest rate rises. Investment loans are 35.3% of the total book. Housing debt will continue to climb, a worry in a low income growth environment, and unsustainable.
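Those shares and growth rates follow directly from the reported levels; a quick check using only the figures quoted above (exact results differ slightly because the published levels are rounded):

```python
owner_occupied = 998.0  # $bn, March 2017, APRA monthly banking statistics
investment = 545.0      # $bn

total = owner_occupied + investment
print(round(total / 1000, 2))              # 1.54 ($tn), the total housing book
print(round(100 * investment / total, 1))  # 35.3 (% of the book in investment loans)
print(round(100 * 7.1 / (total - 7.1), 2)) # ~0.46 (% monthly growth from the $7.1bn rise)
```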

In fact the rate of lending is ACCELERATING!

Looking at the banks’ share of loans, the big four remain in relatively similar places.

The four majors grew the fastest whilst the regional banks lost share.

Looking at the investment loan speed limits, the majors are “comfortably” below the 10% APRA limit. Some smaller players remain above.

So, the current changes to regulatory settings are not sufficient to control loan growth. Perhaps they are relying on tighter underwriting and rising mortgage rates to clip the speed, but remember many investors are negatively geared, so rising mortgage interest costs are actually borne by the taxpayer! The only thing which will slow the loan growth is if home prices start to fall.

The RBA data comes out shortly; this will give a view of all lending, including the non-bank sector (though partial, and delayed).

 

How far away is the paperless mortgage really?

From Mortgage Professional Australia.

More data and end-to-end systems are pushing us ever closer to the paperless mortgage and point-of-sale approvals, but predicting the future of technology is a risky business.


From the millennium bug to Google Glass, we’ve seen plenty of ‘game changers’ which were no such thing. Rather than take an impossibly broad view of the future, we asked our industry leaders to explain what brokers should expect over the next few years, starting with the paperless mortgage.

Glenn Lees, CEO of Connective, says the entirely paperless mortgage is “closer than ever … I think what’s driving it now is lenders understand what a competitive advantage it can be”. The barriers are simply “institutional inertia”, he says, with lenders’ risk and compliance teams “understandably nervous” about changing the application process.

NextGen.Net sales director Tony Carn is less optimistic; he believes the paperless mortgage will take some time to come about due to the current focus on credit risk. However, he says the technology is there, and mortgages are already becoming increasingly electronic thanks to e-conveyancing platform PEXA and increasing use of electronic verification by lenders.

At AFG, CIO Jaime Vogel believes that “we will end up with a significant number of applications being digital end-to-end”. The process will be similar to the gradual take-up of ApplyOnline. “As lenders understand the benefits of that innovation it’ll progressively change and we’ll find the vast majority will be digital end-to-end,” Vogel says. “We feel the technology would be relatively easily implemented in the broker process.”

Verifying with video at HashChing 

Identifying borrowers, under the Know Your Customer (KYC) guidelines, has long been a time-consuming part of the application process. It was a particular problem for online marketplace HashChing, CEO Mandeep Sodhi recalls. “The broker was seeing ‘the consumer is in Cairns, but I’ve got this great deal and I’m in Sydney’.” For brokers there was an additional problem: having to visit a bank or Australia Post outlet to get identified was causing many customers to walk away from a deal.

HashChing’s virtual online identification (VOI) technology uses a video call to compare the borrower to the photo on their Australian passport or driver’s licence, giving the broker a percentage match score. It also runs a behind-the-scenes DVS (Document Verification Service) check and tells the broker the borrower’s current location, and the video is stored for seven years in case of an enquiry from ASIC. The system is currently being trialled by 150 brokers, saving them eight hours on average, with a full rollout scheduled for 1 March.

There’s no legal barrier to video identification; the challenge is persuading lenders to accept it. While HashChing has an exclusive partnership with the South African developer of the software, E4, it is encouraging lenders to work with E4 to use VOI technology. Combined with online document collection, VOI can free brokers from the tyranny of distance, Sodhi believes. “With this technology the broker can be anywhere in Australia and the consumer can be anywhere in Australia … the geographic barrier is gone.”

Using data

In March 2016, Siobhan Hayden, then CEO of the MFAA, predicted the next evolution in mortgage broking would be driven by data scraping. Data scraping is extracting data from documents, web pages and storage vaults, which can then be put to use in a number of ways: automatically filling in forms, reducing the need to ask borrowers for documentation, and enabling more informed decisions by lenders.

Data is already changing the mortgage application process. Electronic mortgages through Bank Australia and conditional approvals via the CommBank Property app are available to customers of these banks, as the banks already have the relevant data. Data scraping across institutions is in its early stages, warns AFG’s Vogel. “There’s certainly not enough data available to make a complete and proper assessment, but we’re trying to make the best of the data which is available to use.”

Other professions are further ahead in data scraping. NextGen.Net’s Carn points out that accountants can already access information on their clients held by the ATO; giving brokers the same access would be a “very simple technology solution, but there [needs to be] a risk appetite to allow that to happen”.

At Rubik Group a current project is looking at using wealth management to provide solutions for property investors. “One of the top unmet needs is around investment property,” head of product Emily Chen says. “It’s almost personal financial management: how do I budget? How do I know when I’m ready to buy that next property?”

Already some banks offer digital ‘dashboards’ that show customers the funds available in their current, savings and super accounts. Chen suggests that property could be added to this mix, bringing in external data on property values and loan terms, creating “a total wealth view for the customer”.

What’s holding back data scraping – and, by consequence, paperless mortgages – is concern about security. Computer hacking has become international news, and 71% of Australians are concerned about having their information stolen, according to Veda’s 2016 Cybercrime and Fraud Report, with older Australians more concerned.

“A lot of brokers are not aware of how much is invested in secure data processing,” argues Carn. However, he warns that vulnerabilities remain: “We’re operating in a market that’s heavily regulated, and everyone’s aware of data security, yet we still see a lot of emailing of personal customer information, which I think is quite horrifying.”

Vogel believes younger borrowers are more accepting of their data being used. “If there is value for the customer and the opportunity to get a reduced interest rate then I’d certainly expect that a high percentage of customers would be willing to provide that information.” That is conditional, however, on those customers trusting that brokers can keep their data secure, which is why AFG is investing heavily in data security.

Brokers Under The Microscope At Senate Standing Committee

From Australian Broker.

Lender-imposed conditions that brokers write a certain number of loans per month or year to retain accreditation need to go, said Peter White, executive director of the Finance Brokers Association of Australia (FBAA).

Speaking in front of the Senate Standing Committee on Economics in a government inquiry into consumer protection in the banking, insurance and financial sector on Wednesday (26 April), White said these restrictions – called minimum volume hurdles – were reducing a broker’s ability to write loans for whatever lender they desired.

“What that creates is a very bad consumer outcome because a broker can only give guidance on loans for lenders that they’re accredited to,” he said.

These restrictions mean that while a broker may be doing the right thing for the borrower within the panel of accredited lenders they have access to, there may be another lender outside that scope which is more suitable.

“Unfortunately they can’t reach into that because they are constrained by the aggregator’s agreements and those accreditations. That’s generally restricted because they don’t have volumes to reach that lender,” he told the panel.

“Those sorts of things need to go.”

White also criticised elite broker clubs, saying he would outlaw them if he could. With club members given faster processing of their applications, this was “unreasonable” and “completely unfair” to the borrower.

“You have an innocent borrower at the backend there. He’s sitting behind a broker who may only give a specific lender one deal every three months,” he said. “That gets penalised because they don’t have the volume. It’s got nothing to do with the borrower.”

When asked about soft dollar benefits, White said that these incentives needed to become more transparent, although completely outlawing them may not be the best solution.

There was also nothing wrong with the current base model of commissions that brokers are paid today, he said, referring to the FBAA’s global research that found Australian brokers were paid below the global average. As for trail, this provided a number of positive consumer outcomes when it was introduced.

“With trail being brought into place, that was there to minimise the outcome of churn and also to provide a greater level of service to the borrower that wasn’t necessarily being provided by the banks.”

The FBAA’s research showed that taking away trail led to higher upfront commissions, a greater level of churn, and sales of additional products that may not be acceptable in the market.

Finally, White expressed his opposition to the fixed upfront commission recommended by consumer advocacy group CHOICE.

“The baseline of lending is very standard,” he said. “But it’s the knowledge and capabilities of what adds onto that to make it appropriate to that borrower’s specific needs or their lending structure. That becomes quite a significant skill set and it’s not the same.

“If you do a mum and dad home loan for example, that’s a very different transaction to doing development finance and working through feasibility studies and presales and all the research and due diligence that goes into that.”

Although these are generally larger loans, the amount of work definitely increases as well, he said.

Separately, ASIC said improved tracking of broker data is required.

Difficulties faced by lenders in compiling clear, robust data on brokers have prompted the Australian Securities & Investments Commission (ASIC) to call for improved systems that allow banks and non-banks to track and report on broker activity.

Speaking in front of a Senate Standing Committee on Economics in a government inquiry into consumer protection in the banking, insurance and financial sector on Wednesday (26 April), ASIC deputy chair Peter Kell said that collecting data for the regulator’s recent Review of Mortgage Broker Remuneration was a challenge.

The main difficulty was that some lenders could not track simple matters such as the loans originated by, and the amount of remuneration paid to, each individual broker. Certain lenders also had no way to track the soft dollar benefits offered.

“One of the recommendations we have made is that this information should be provided through a new public reporting regime of consumer outcomes,” Kell said. “[This will] require lenders to set up systems to allow them to track this [and] also provide some transparency in the market.”

Kell emphasised that these gaps in information were not the result of any unwillingness by the lenders to provide data. However, “it was apparent that the systems that some of the lenders had in place were not as robust and didn’t give them as clear a picture as I think they themselves would wish,” he told the committee.

The exercise was a “wakeup call” for some of the lenders, he said.

ASIC recommended a public reporting regime to eliminate the current inconsistencies between lenders, which have different systems, metrics and numbers.

“Having a public reporting regime is a good discipline to ensure that this data will be collected going forward,” Kell said.

However, the challenge for ASIC now is determining how to present the collated information in a form that both the industry and the public can see.