All posts by Guest Contributor


I received a call on my cell phone the other day. It was my bank calling because a transaction outside of my normal behavior pattern tripped a flag in their fraud models. “Hello!” said the friendly, automated voice, “I’m calling from [bank name] and we need to talk to you about some unusual transaction activity on your account, but before we do, I need to make sure Monica Bellflower has answered the phone. We need to ask you a few questions for security reasons to protect your account. Please hold on a moment.” At this point, the IVR (Interactive Voice Response) system invoked a Knowledge Based Authentication session that the IVR controlled. The IVR, not a call center representative, asked me the Knowledge Based Authentication questions and confirmed the answers with me. When the session was completed, I had been authenticated, and the friendly, automated voice thanked me before launching into the list of transactions to be reviewed. Only when I questioned a transaction was I transferred, immediately and with no hold time, to a human fraud account management specialist. The entire process was seamless and as smooth as butter.

Using IVR technology is not new, but using an IVR to control a Knowledge Based Authentication session is one way of controlling operational expenses: it reduces the number of people required while increasing the return on the investment made in both the Knowledge Based Authentication tool and the IVR solution. From a risk management standpoint, the use of decisioning strategies and fraud models allows for the objective review of a customer’s transactions while employing fraud best practices. After all, an IVR never hinted at an answer or helped a customer pass Knowledge Based Authentication, and an IVR didn’t get hired in a call center for the purpose of committing fraud. These technologies lend themselves well to fraud alerts and identity theft prevention programs, and also to account management activities.
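The call flow above can be sketched in a few lines. This is a purely hypothetical illustration -- the function names, question scoring, and transfer logic are my own assumptions, not a description of any bank's or vendor's actual IVR implementation:

```python
# Hypothetical sketch of an IVR-driven KBA call flow. Thresholds,
# question handling, and routing outcomes are invented for illustration.

def run_ivr_kba_session(answers_given, answer_key, pass_threshold=0.75):
    """Score a KBA session: the IVR asks each question and checks the answer."""
    correct = sum(
        1 for q, a in answers_given.items()
        if answer_key.get(q, "").lower() == a.lower()
    )
    return correct / len(answer_key) >= pass_threshold

def review_transactions(authenticated, flagged_transactions, disputed):
    """After authentication, read flagged items; transfer only on a dispute."""
    if not authenticated:
        return "transfer_to_agent"          # failed KBA: escalate immediately
    for txn in flagged_transactions:
        if txn in disputed:
            return "transfer_to_fraud_specialist"
    return "session_complete"               # all transactions confirmed by IVR
```

In the call described above, the session scored above the threshold, so the IVR moved straight to reading the flagged transactions and only transferred once a transaction was disputed.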
Experian has successfully integrated Knowledge Based Authentication with IVR as part of relationship management and/or risk management solutions. To learn more, visit the Experian website at https://www.experian.com/decision-analytics/fraud-detection.html?cat1=fraud-management&cat2=detect-and-reduce-fraud. Trust me, Knowledge Based Authentication with IVR is only the beginning. However, the rest will have to wait; right now my high-tech, automated refrigerator is calling to tell me I'm out of butter.

Published: April 20, 2010 by Guest Contributor

By: Ken Pruett I want to touch a bit on some of the third-party fraud scenarios that are often top of mind with our customers: identity theft, synthetic identities and account takeover. Identity Theft Identity theft usually occurs during the acquisition stage of the customer life cycle. Simply put, identity theft is the use of stolen identity information to fraudulently open up a new account. These accounts do not have to be just credit card related; for example, there are instances of people using others' identities to open up wireless phone and utilities accounts. Recent fraud trends show this type of fraud is on the rise again after a decrease over the past several years. A recent Experian study found that people who have better credit scores are more likely to have their identity stolen than those with very poor credit scores. It does seem logical that fraudsters would opt to steal an identity from someone with higher credit limits and available purchasing power. This type of fraud gets the majority of media attention because it is the consumer who is often the victim (as opposed to a major corporation). Fraud changes over time, and recent findings show that looking at data from a historical perspective is a good way to help prevent identity theft. For example, if you see a phone number being used by multiple parties, this could be an indicator of a fraud ring in action. Using these types of data elements can make your fraud models much more predictive and reduce your fraud referral rates. Synthetic Identities Synthetic identities are another acquisition fraud problem. This fraud is similar to identity theft, but the information used is fictitious in nature. The fraud perpetrator may be taking pieces of information from a variety of parties to create a new identity. Trade lines may be purchased from companies who act as middlemen between consumers with good credit and perpetrators who are creating new identities.
This strategy allows the fraud perpetrator to quickly create a fictitious identity that looks like a real person with an active and good credit history. Most of the trade lines will be for authorized users only. The perpetrator opens up a variety of accounts in a short period of time using the trade lines. When creditors try to collect, they can't find the account owners, because the account owners never existed. As Heather Grover mentioned in her blog, this fraud has leveled off in some areas and even decreased in others, but it is probably still worth keeping an eye on. One particular concern is that these identities are sometimes used for bust-out fraud. The best approach to predicting this type of fraud is using strong fraud models that incorporate a variety of non-credit and credit variables in the model development process. These models look beyond the basic validation and verification of identity elements (such as name, address and Social Security number) by leveraging additional attributes associated with a holistic identity -- such as inconsistent use of those identity elements. Account Takeover Another type of fraud, which occurs during the account management period of the customer life cycle, is account takeover fraud. This type of fraud occurs when an individual uses a variety of methods to take over another individual's account. This may be accomplished by changing online passwords, changing an address or even being added as an authorized user to a credit card. Some institutions have tools in place to try to prevent this, but social networking sites are making it easier to obtain personal information about many consumers. For example, a person may have been asked to provide the name of their high school as the answer to a challenge question before gaining access to a banking account.
Today, this piece of information is often readily available on social networking sites, making it easier for fraud perpetrators to defeat these types of tools. It may be more useful to use out-of-wallet, or knowledge-based authentication, challenge tools that dynamically generate questions based on credit or public record data to avoid this type of fraud.
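As a toy illustration of the historical-data check mentioned above (a phone number being used by multiple parties), here is a minimal sketch; the field names and the threshold are invented for the example:

```python
# Illustrative only: flag phone numbers reused across many distinct
# identities as a possible fraud-ring signal. "phone"/"ssn" field names
# and the min_identities threshold are assumptions for this sketch.
from collections import defaultdict

def find_shared_phones(applications, min_identities=3):
    """Return phone numbers used by at least `min_identities` different SSNs."""
    identities_by_phone = defaultdict(set)
    for app in applications:
        identities_by_phone[app["phone"]].add(app["ssn"])
    return {
        phone for phone, ids in identities_by_phone.items()
        if len(ids) >= min_identities
    }
```

A flagged phone number would not decline an application by itself; it would feed a fraud model or raise a referral for review.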

Published: April 5, 2010 by Guest Contributor

By: Wendy Greenawalt In my last few blogs, I have discussed how optimization can be leveraged to make improved decisions across an organization while considering the impact that optimizing decisions has on organizational profits, costs or other business metrics. In this entry, I would like to discuss how optimization is used to improve decisions at the point of acquisition, while minimizing costs. Determining the right account terms at inception is increasingly important due to recent regulatory legislation such as the Credit Card Act. Doing so plays a role in assessing credit risk, relationship management and increasing share of wallet. These regulations have established guidelines specific to consumer age, verification of income, teaser rates and interest rate increases. Complying with these regulations will require changes to existing processes and the creation of new toolsets to ensure organizations adhere to the guidelines. These new regulations will not only increase the costs associated with obtaining new customers but also affect long-term revenue and value, as changes in account terms will have to be carefully considered. The cost of on-boarding and servicing individual accounts continues to escalate while internal resources remain flat. Because of this, organizations of all sizes are looking for ways to improve efficiency and decisions while minimizing costs. Optimizing decisions is an ideal solution to this problem. Optimized strategy trees (trees that optimize decisioning strategies) can be easily implemented into current processes to ensure lending decisions adhere to organizational revenue, growth or cost objectives as well as regulatory requirements. Optimized strategy trees enable organizations to create executable strategies that provide on-going decisions based upon optimization conducted at a consumer level.
Optimized strategy trees outperform manually created trees because they are built using sophisticated mathematical analysis and ensure organizational objectives are adhered to. In addition, an organization can quantify the expected ROI of decisioning strategies and validate those strategies -- before implementation. This type of insight is not available without a sophisticated optimization software application. By implementing optimized strategy trees, organizations can minimize the volume of accounts that must be manually reviewed, which results in lower resource costs. In addition, account terms are determined based on organizational priorities, leading to increased revenue, retention and profitability.
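To make the idea concrete, here is a deliberately simplified sketch of consumer-level decisioning under a portfolio constraint. Real optimized strategy trees are produced by far more sophisticated mathematical solvers than this greedy rule; the offer names and figures are invented:

```python
# Toy consumer-level optimization: pick one set of account terms per
# applicant to maximize expected profit subject to a total credit-exposure
# cap. A greedy "profit per unit of exposure" rule stands in for a real
# optimizer; all numbers and offer names are illustrative assumptions.

def optimize_terms(applicants, exposure_cap):
    """Each applicant maps offer name -> (expected_profit, credit_exposure)."""
    choices, used = {}, 0.0
    for name, offers in applicants.items():
        best = max(offers.items(),
                   key=lambda kv: kv[1][0] / kv[1][1])   # profit density
        offer, (profit, exposure) = best
        if used + exposure <= exposure_cap:
            choices[name], used = offer, used + exposure
        else:
            choices[name] = "decline"                    # cap reached
    return choices
```

The output of a real solver would then be expressed as an executable tree of rules (score bands, offer terms) that production systems can apply on an ongoing basis.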

Published: April 5, 2010 by Guest Contributor

By: Wendy Greenawalt Financial institutions have placed very little focus on portfolio growth over the last few years. Recent market updates have provided little guidance on the future of the marketplace, but there seems to be a consensus that the US economic recovery will be slow compared to previous recessions. The latest economic indicators show that slow employment growth, continued property value fluctuations and lower consumer confidence will continue to influence the demand for and issuance of new credit. However, the positive aspect is that most analysts agree these indicators will improve over the next 12 to 24 months. Because of this, lenders should start thinking about updating acquisition strategies now and consider new tools that can help them reach their short- and long-term portfolio growth goals. Most financial institutions have experienced high account delinquency levels in the past few years, and these delinquencies have had a major impact on consumer credit scores. The bad news is that the pool of qualified candidates continues to shrink, so the competition for the best consumers will only increase over the next few years. Identifying target populations and improving response and booking rates will be a challenge for some time, so marketers must create smarter, more tailored offers to remain competitive and strategically grow their portfolios. Recently, new scores have been created that estimate consumer income and debt ratios when combined with consumer credit data. This data can be very valuable and, when combined with optimization, can support robust acquisition strategies. Specifically, optimization allows an organization to define product offerings, contact methods, timing and known consumer preferences, as well as organizational goals such as response rates, consumer-level profitability and product-specific growth metrics, within a software application.
The optimization software will then utilize a proven mathematical technique to identify the ideal product offering and timing to meet or exceed the defined organizational goals. The consumer-level decisions can then be executed via normal channels such as mail, email or call centers. Not only does optimization software reduce campaign development time, but it also allows marketers to quantify the effectiveness of marketing campaigns -- before execution. Today, optimization technology makes decision analytics accessible to organizations of almost any size and can provide an improvement over business-as-usual decisioning techniques. If your organization is looking for new tools to incorporate into existing acquisition processes, I would encourage you to consider optimization and the value it can bring to your organization.
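Here is a hedged sketch of what quantifying effectiveness before execution can look like at the consumer level: score every offer-and-channel combination with a response model and keep the one with the best expected value net of contact cost. The response model, margins and cost figures are assumptions for illustration, not real campaign data:

```python
# Illustrative pre-execution evaluation: enumerate (offer, channel)
# combinations per consumer and pick the best expected value. The
# response_model callable and all figures are invented for this sketch.
from itertools import product

def best_treatment(consumer, offers, channels, response_model, channel_cost):
    best, best_value = None, float("-inf")
    for offer, channel in product(offers, channels):
        p = response_model(consumer, offer, channel)       # predicted response
        value = p * offer["margin"] - channel_cost[channel]
        if value > best_value:
            best, best_value = (offer["name"], channel), value
    return best, best_value
```

Summing the winning values across the target population gives a forecast of campaign effectiveness before a single piece of mail goes out.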

Published: April 1, 2010 by Guest Contributor

By: Kari Michel Lenders want to find new customers through more informed credit risk decisions and to use new types of data relationships to cross-sell. The strategic goals of any company are to gain more customers and revenue while reducing costs on both the operating side and the credit loss side. One way to meet these goals is to improve operating efficiency in creating and managing credit attributes, which represent the building blocks of how lenders make customer decisions. Lenders face many challenges in leveraging data from multiple credit and non-credit sources (e.g., credit bureaus) and maintaining data attributes across multiple systems. Furthermore, a lack of access to raw data makes it difficult to create effective, predictive attributes. Simply managing the discrepancies between specifications and code can become a very time-consuming effort, and maintaining a common set of attributes used in many types of scorecards and decisions often becomes difficult. As a result, there is a heavy reliance on external people and technical resources to find the right tools to pull the data sources and attributes together. In an ideal situation, a lender should be able to easily access raw data elements across multiple sources and aggregate the data into meaningful attributes. Experian offers these capabilities through its Attribute Toolbox product, allowing one or more systems to access a common set of standard analytics. A set of highly predictive attributes, Premier Attributes, is also available and offers a much more effective solution for managing standard attributes across an enterprise. With these tools, lenders can decrease maintenance costs by quickly integrating data and analytics into existing business architecture to make profitable decisions.
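To illustrate what aggregating raw data elements into meaningful attributes means in practice, here is a minimal sketch. The attribute definitions and field names below are invented examples for this post, not Experian's Premier Attributes:

```python
# Illustrative only: summarize raw tradeline records into a few
# decision-ready credit attributes. Field names ("status", "type",
# "limit", "balance", "dpd") are assumptions for the example.

def build_attributes(tradelines):
    """Aggregate raw tradelines into attributes a scorecard could consume."""
    open_trades = [t for t in tradelines if t["status"] == "open"]
    revolving = [t for t in open_trades if t["type"] == "revolving"]
    total_limit = sum(t["limit"] for t in revolving)
    total_balance = sum(t["balance"] for t in revolving)
    return {
        "num_open_trades": len(open_trades),
        "revolving_utilization": (
            total_balance / total_limit if total_limit else 0.0
        ),
        "num_delinquent": sum(1 for t in tradelines if t["dpd"] >= 30),
    }
```

The point of a toolbox approach is that one definition like this is coded once and shared by every scorecard and decisioning system, instead of being re-implemented (and drifting) in each.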

Published: March 24, 2010 by Guest Contributor

By: Tom Hannagan An autonomic movement describes an action or response that occurs without conscious control. This, I fear, may be occurring at many banks right now with respect to their risk-based pricing and profit picture, for several reasons. First, the credit risk profile of existing customers is subject to continuous change over time. This was always true to some extent, but as we've seen in the latest economic recession, there can be sizeable risk-level migration if enough stress is applied. It is most obvious in the case of delinquencies and defaults, but it is also occurring with customers that have performing loans. The question is: how well are we keeping up with the behind-the-scenes changes in risk ratings and score ranges? The changes in the relative risk levels of our clients are affecting our risk-based profit picture -- and required capital allocation -- without conscious action on our part. Second, the credit risk profile of collateral categories is also subject to change over time. Again, this is not exactly news, but as we've seen in the latest real estate meltdown and the dynamics affecting the valuation of financial instruments, to name two, there can be huge changes in valuations and loss ratios. And this occurs without making one new loan. These changes in relative loss-given-default levels are affecting our risk-based expected loss levels, risk-adjusted profit and capital allocation in a rather autonomic manner. Third, aside from changes in the risk profiles of customers and collateral types, the bank's credit policy may change. The risk management analysis of expected credit losses is continuously (we presume) under examination and refinement by internal credit risk staff. It is certainly getting unprecedented attention from external regulators and auditors. These policy changes need to be reflected in the foundation logic of risk-based pricing and profit models. And that's just in the world of credit risk.
Fourth, there can also be changes in our operating cost structure, including mitigated operational risks and product volumes, that affect the allocation of risk-based non-interest expense to product groups and eventually to clients. Although it isn't the fault of our clients that our cost structure is changing, for better or worse, we nonetheless expect them to bear the burden of these expenses based on the services we provide to them. Such changes need to be updated in the risk-based profit calculations. Finally, there is the market risk piece of risk management. It is possible, if not likely, that our ALCO policies have changed due to lessons from the liquidity crisis of 2008 or the other macroeconomic events of the last two years. Deposit funds may be more highly valued, for instance. There may also be some rotation in assets away from lending. Or the level of reliance on equity capital may have materially changed. In any event, we are experiencing historically low levels for the price of risk-free (Treasury rate curve) funding, which affects the required spread and return on all other securities, including our fully-at-risk equity capital. These changes are occurring apart from customer transactions, but they definitely affect the risk-based profit picture of every existing loan or deposit account and, therefore, every customer relationship. If any, let alone all, of the above changes are not reflected in our risk-based performance analysis and reporting, and in the pricing of new or renewed services to our customers, then I believe we are experiencing autonomic changes in risk-based profitability.
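A small worked example makes the "autonomic" credit-risk effect concrete: the standard expected-loss identity, EL = PD x LGD x EAD, applied to the same loan before and after a risk-grade migration. The PD and LGD figures are illustrative, not policy values:

```python
# Minimal worked example: expected loss on the same loan before and after
# a risk-grade migration, with no new lending. PD/LGD values are invented.

def expected_loss(pd, lgd, ead):
    """Standard expected-loss identity: EL = PD x LGD x EAD."""
    return pd * lgd * ead

loan_balance = 1_000_000
el_before = expected_loss(pd=0.01, lgd=0.35, ead=loan_balance)  # original grade
el_after  = expected_loss(pd=0.03, lgd=0.45, ead=loan_balance)  # after migration
# The same loan now carries nearly four times the expected loss -- and a
# matching hit to risk-adjusted profit -- without any action by the bank.
```

If pricing and profitability reporting still reflect the original grade, the erosion happens exactly as the post describes: without conscious control.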

Published: March 24, 2010 by Guest Contributor

There seem to be two viewpoints in the market today about Knowledge Based Authentication (KBA): one positive, one negative. Depending on the corner you choose, you probably view it as either a tool to help reduce identity theft and minimize fraud losses, or a deficiency in the management of risk and the root of all evil. The opinions on both sides are pretty strong, and biases “for” and “against” run pretty deep. One of the biggest challenges in discussing Knowledge Based Authentication as part of an organization’s identity theft prevention program is the perpetual confusion between dynamic out-of-wallet questions and static “secret” questions. At this point, most people in the industry agree that static secret questions offer little consumer protection. Answers are easily guessed or easily researched, and if the questions are preference based (like “what is your favorite book?”) there is a good chance the consumer will fail the authentication session because they forgot the answers or the answers changed over time. Dynamic Knowledge Based Authentication, on the other hand, presents questions that were not selected by the consumer. Questions are generated from information known about the consumer -- concerning things the true consumer would know and a fraudster most likely wouldn’t. The questions posed during Knowledge Based Authentication sessions aren’t designed to “trick” anyone but a fraudster, though a best-in-class product should offer a number of features and options. These may allow for flexible configuration of the product and deployment at multiple points of the consumer life cycle without impacting the consumer experience. The two are as different as night and day. Do those who consider “secret questions” to be Knowledge Based Authentication consider the password portion of the user name and password process to be KBA as well?
If you want to hold to strict logic and definition, one could argue that a password meets the definition of Knowledge Based Authentication, but common sense and practical use cause us to differentiate it -- which is exactly what we should do with secret questions: differentiate them from true KBA. KBA can provide strong authentication or be part of a multifactor authentication environment without a negative impact on the consumer experience. So, for the record, when we say KBA we mean dynamic, out-of-wallet questions, the kind that are generated “on the fly” and delivered to a consumer via “pop quiz” in a real-time environment; and we think this kind of KBA does work. As part of a risk management strategy, KBA has a place within the authentication framework as a component of risk-based authentication… and risk-based authentication is what it is really all about.
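For readers who want the distinction made concrete, here is a toy sketch of dynamic, out-of-wallet question generation: questions built "on the fly" from data about the consumer, with plausible decoys, rather than answers the consumer chose in advance. The record fields and decoy lists are invented; no real product works exactly this way:

```python
# Toy dynamic KBA question generator. The consumer-record fields and
# decoy answers are invented assumptions for illustration only.
import random

def generate_question(consumer_record, rng=random):
    """Build one multiple-choice question from data about the consumer."""
    field, prompt, decoys = rng.choice([
        ("prior_street", "Which of these streets have you lived on?",
         ["Maple Ave", "Oak St", "2nd St"]),
        ("auto_lender", "Which of these holds your auto loan?",
         ["First Auto Credit", "Motor Finance Co", "CarLoan Direct"]),
    ])
    answer = consumer_record[field]
    options = decoys[:3] + [answer]
    rng.shuffle(options)            # hide the true answer's position
    return {"prompt": prompt, "options": options, "answer": answer}
```

Contrast this with a static secret question: here there is nothing for the consumer to forget and nothing for a fraudster to look up in a profile they created themselves.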

Published: March 5, 2010 by Guest Contributor

When a client is selecting questions to use, Knowledge Based Authentication is always about the underlying data -- or at least it should be. The strength of Knowledge Based Authentication questions will depend, in large part, on the strength of the data and how reliable it is. After all, if you are going to depend on Knowledge Based Authentication for part of your risk management and decisioning strategy, the data had better be accurate. I’ve heard it said within the industry that clients only want a system that works and have no interest in where the data originates. Personally, I think that opinion is wrong. I think it is closer to the truth to say there are those who would prefer that clients didn’t know where the data that supports their fraud models and Knowledge Based Authentication questions originates; and I think those people “encourage” clients not to ask. It isn’t a secret that many within the industry use public record data as the primary source for their Knowledge Based Authentication products, but what’s important to consider is just how accessible that public record information is. Think about that for a minute. If a vendor can build questions on public record data, can a fraudster find the answers in public record data via an online search? Using Knowledge Based Authentication for fraud account management is a delicate balance between customer experience/relationship management and risk management. Because it is so important, we believe in research -- reading the research of well-known and respected groups like Pew, Tower and Javelin, and doing our own. Based on our research, I know consumers prefer questions that are appropriate and relevant to their activity. In other words, if the consumer is engaged in a credit-granting activity, it may be less appropriate to ask questions centered on personal associations and relatives.
Questions should be difficult for the fraudster, but not difficult for, or perceived as inappropriate or intrusive by, the true consumer. Additionally, I think questions should be applicable to many clients and many consumers. The question set should use a mix of data sources: public, proprietary, non-credit, credit (if permissible purpose exists) and innovative. Is it appropriate to have in-depth data discussions with clients about each data source? Debatable. Is it appropriate to ensure that each client understands the questions they ask as part of Knowledge Based Authentication and where the data that supports those questions originates? Absolutely.

Published: March 2, 2010 by Guest Contributor

By: Kari Michel What is Basel II? Basel II, formally the International Convergence of Capital Measurement and Capital Standards, is a revised framework -- the second iteration of an international standard of banking regulation. The purpose of Basel II is to create an international standard that banking regulators can use when creating regulations about how much capital banks need to set aside to guard against the types of financial and operational risk banks face. Basel II ultimately implements standards to assist in maintaining a healthy financial system. The business challenge The framework for Basel II compels supervisors to ensure that banks implement credit rating techniques that represent their particular risk profile. Besides the calculation of the risk inputs -- Probability of Default (PD), Loss Given Default (LGD) and Exposure at Default (EAD) -- the final Basel accord includes the “use test” requirement: a firm must use its advanced approach widely in its business, not merely for the calculation of regulatory capital. Therefore, many financial institutions are required to make considerable changes in their approach to risk management (i.e., infrastructure, systems, processes and data requirements). Experian is a leading provider of risk management solutions -- products and services for the new Basel Capital Accord (Basel II). Experian’s approach includes consultancy, software and analytics tailored to meet the lender’s Basel II requirements.
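To make the risk inputs concrete, here is a sketch of how PD and LGD feed the Basel II corporate IRB capital requirement (maturity adjustment omitted for brevity). It follows the published IRB risk-weight function, but treat it as an illustration of the mechanics, not a compliance tool:

```python
# Sketch of the Basel II corporate IRB capital requirement per unit of
# EAD, using the published asset-correlation and 99.9% confidence form.
# Maturity adjustment is omitted; inputs are illustrative, not policy.
from math import sqrt, exp
from statistics import NormalDist

N = NormalDist()  # standard normal: N.cdf and N.inv_cdf

def irb_capital_ratio(pd, lgd):
    """Capital requirement K per unit of EAD (unexpected loss only)."""
    # Asset correlation R decreases as PD rises, per the corporate formula.
    r = (0.12 * (1 - exp(-50 * pd)) / (1 - exp(-50))
         + 0.24 * (1 - (1 - exp(-50 * pd)) / (1 - exp(-50))))
    # Conditional PD at the 99.9th percentile of the systematic factor.
    conditional_pd = N.cdf(
        (N.inv_cdf(pd) + sqrt(r) * N.inv_cdf(0.999)) / sqrt(1 - r)
    )
    return lgd * (conditional_pd - pd)   # capital covers unexpected loss

ead = 1_000_000
k = irb_capital_ratio(pd=0.02, lgd=0.45)
capital = k * ead   # dollars of capital held against this exposure
```

The "use test" means numbers like these cannot live only in a regulatory reporting silo; the same PD/LGD/EAD machinery must drive day-to-day credit decisions.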

Published: February 26, 2010 by Guest Contributor

By: Wendy Greenawalt Marketing is typically one of the largest expenses for an organization, and it is also a priority for reaching short- and long-term growth objectives. With the current economic environment continuing to be unpredictable, many organizations have reduced budgets and are focusing more on risk management and recovery activities. However, in the coming year, we expect to see improvements in the economy and organizations renewing their focus on portfolio growth. We expect that marketing campaign budgets will continue to be much lower than those allocated before the mortgage meltdown, but organizations will still be looking for gains in efficiency and responsiveness to meet business objectives. Creating optimized marketing strategies is quick and easy when leveraging optimization technology, enabling your internal resources to focus on more strategic issues. Whether your objective is to increase organizational or customer-level profit, grow specific product lines or maximize internal resources, optimization can easily identify the right solution while adhering to key business objectives. The advanced software now available enables an organization to compare multiple campaign options simultaneously while analyzing the impact of modifications on revenue, response or other business metrics. Specifically, very detailed product offer information, contact channels, timing, letter costs from multiple vendors and consumer preferences can all be incorporated into an optimization solution. Once these are defined, the complex mathematical algorithm considers every combination of all the variables -- which could number in the thousands -- at the consumer level to determine the optimal treatment that maximizes organizational goals within the stated constraints.
In addition, by optimizing decisions and incorporating them into marketing strategies, marketers can execute campaigns in a much shorter timeframe, allowing an organization to capitalize on changing market conditions and consumer behaviors. To illustrate the benefit of optimization: an Experian bankcard client was able to reduce the analytical time needed to launch programs from seven days to 90 minutes while improving net present value. In my next blog, we will discuss how organizations can cut costs when acquiring new accounts.
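A toy version of considering every combination of the variables at the consumer level: exhaustively evaluate each assignment of treatments to consumers and keep the one that maximizes expected value within a contact budget. All figures are invented, and real optimization engines use far more scalable mathematics than this brute-force enumeration:

```python
# Brute-force illustration of consumer-level campaign optimization under
# a total contact budget. Treatment names, values and costs are invented.
from itertools import product

def optimize_campaign(expected_value, cost, budget):
    """expected_value[c][t]: value of giving consumer c treatment t."""
    consumers = list(expected_value)
    treatments = list(cost)
    best_plan, best_total = None, float("-inf")
    for plan in product(treatments, repeat=len(consumers)):
        spend = sum(cost[t] for t in plan)
        if spend > budget:
            continue                      # violates the budget constraint
        total = sum(expected_value[c][t] for c, t in zip(consumers, plan))
        if total > best_total:
            best_plan, best_total = dict(zip(consumers, plan)), total
    return best_plan, best_total
```

With thousands of variables and millions of consumers, enumeration like this is impossible, which is exactly why commercial solvers and their mathematical shortcuts matter.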

Published: February 22, 2010 by Guest Contributor

By: Wendy Greenawalt The economy has changed drastically in the last few years, and most organizations have had to reduce costs across their businesses to retain profits. Determining the appropriate cost-cutting measures requires careful consideration of trade-offs while quantifying short- and long-term organizational priorities. Too often, cost reduction decisions are driven by dynamic market conditions, which mandate quick decision-making. As a result, decisions are made without a sound understanding of their true impact on organizational objectives. Optimization can be used for virtually any business problem and provides decisions based on complex mathematics. Therefore, whether you are making decisions related to outsourcing versus staffing, internal versus external project development, or specific business-unit cost savings opportunities, optimization can be applied. While some analytical requirements exist to obtain the highest business metric improvements, most organizations already have the data required to take full advantage of optimization technology. If you are using predictive models and credit attributes and have multiple actions that can be taken on an individual consumer, then most likely your organization can benefit from optimized decisioning strategies. In my next few blogs, I will discuss how optimization can be used to create better strategies across an organization, whether your focus is marketing, risk, customer management or collections.

Published: February 19, 2010 by Guest Contributor

By: Tom Hannagan While waiting on the compilation of fourth quarter banking industry results, I thought it might be interesting to relate the commercial real estate (CRE) risk management position facing commercial banks as of the third quarter. CRE risk is an important consideration in enterprise risk management and in loan pricing and profitability. The slowdown in the global economy has increased CRE credit risk through higher vacancy rates, halted development projects, and declining commercial property values. As CRE loans come up for renewal, many borrowers will find that they have equity deficits and that they are facing tightened credit standards. If a commercial property loan started life at 80 percent loan to value, and the property value has dropped 25 percent, then holding the renewal to the same loan-to-value ratio means the renewed loan balance will be down at least 25 percent, requiring a substantial net payoff from the borrower. This net cash payoff requirement would be tough to accomplish in good times and all but impossible for many borrowers in this economy. After all, the main reason for the decline in property value to begin with is its reduced cash flow performance. Following the third quarter numbers, total U.S. commercial real estate debt is generally estimated at $3.4 to $3.5 trillion. Commercial banks owned just over half of that debt, or about $1.8 trillion, according to Federal Reserve and FDIC sources. The (possibly only) good news in that total is that commercial banks owned a relatively small share of the commercial-mortgage-backed securities (CMBS) slice of CRE exposure. CMBS assets were 21 percent of total CRE credit, or $714 billion, but banks owned a total of $54 billion, which represented only 3 percent of total bank CRE assets. Unfortunately, the opposite is true for construction lending. U.S. banks, in total, had $486 to $534 billion (depending on the source) in construction and land loans, representing 27 percent to 30 percent of banks’ total CRE holdings.
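The loan-to-value arithmetic above can be sketched in a few lines. This is a minimal illustration using the round figures from the post (an 80 percent LTV loan on a property whose value falls 25 percent); the function name and the $10 million example property are ours, not from the post.

```python
# Sketch of the renewal LTV arithmetic: how much a borrower must pay
# down at renewal to keep the loan within the lender's maximum LTV.
def required_paydown(original_value: float, original_ltv: float,
                     value_decline: float, max_renewal_ltv: float) -> float:
    """Net payoff needed at renewal to stay within max_renewal_ltv."""
    original_loan = original_value * original_ltv
    new_value = original_value * (1 - value_decline)
    max_new_loan = new_value * max_renewal_ltv
    return max(original_loan - max_new_loan, 0.0)

# A $10M property financed at 80% LTV, value down 25%, renewed at 80% LTV:
paydown = required_paydown(10_000_000, 0.80, 0.25, 0.80)
print(f"Required net payoff: ${paydown:,.0f}")  # $2,000,000 (25% of the original $8M loan)
```

The renewed loan can be at most 80 percent of a property now worth 75 percent of its original value, i.e. 60 percent of the original value, which is exactly 25 percent below the original 80 percent loan.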
The true credit risk management picture is much more revealing if we cut the numbers by bank size. According to Deutsche Bank research, the largest 97 banks (those with over $10 billion in total assets) had $14.8 trillion in total assets and $1.0 trillion of the banking industry’s CRE credits. This amounts to about 7 percent of total assets for this group of larger banks. The 7,500 community banks, with aggregate assets of $2 trillion, had about $786 billion in CRE lending. This amounts to about 28 percent of total assets. That is roughly four times the level of exposure found in the larger banks. The 7 percent average credit risk exposure at the large bank group is less than their average level of equity or risk-based capital. For the banks under the $10 billion level, the 28 percent level of CRE exposure is almost three times their average equity position. The riskiest portion of CRE lending is clearly the construction and land development loans. The subtotals in this area confirm where the cumulative risk lies. Again, according to Deutsche Bank research, the largest 97 banks had $299 billion of the banking industry’s $534 billion in construction loans. Although this is 56 percent of total bank construction lending, it amounts to only 2 percent of this group’s total assets. The 7,500 community banks had aggregate construction loans of $235 billion, or about 8.5 percent of total assets. That is a bit over four times the level of exposure found in the larger banks. The 2 percent level of construction credit risk exposure at the large bank group is one-fourth of their average level of common equity. At banks under the $10 billion level, the 8.5 percent level of CRE exposure, compared to total assets, is about the same as their average equity position. According to Moody’s, banks have already taken about $90 billion in net loan losses on CRE assets through the third quarter of 2009.
That means the industry has perhaps another $150 billion in write-offs coming. This would total $240 billion in CRE credit losses for the banking industry due to this economic downturn, which would equate to 13.3 percent of the banking industry’s share of total CRE credit. With the decline in commercial property values ranging from 10 percent to 40 percent, a 13 percent loss is certainly not a worst-case scenario. Banks have ramped up their loss reserves, and although the numbers aren’t out yet, we know many banks used the fourth quarter of 2009 to further bolster their allowances for loan and lease losses (ALLL). The larger the ALLL, the safer the risk-based equity account. Risk managers are aware of all of this, and banks are very actively developing strategies to handle the refinancing requirements while positioning themselves to explain to regulators and external auditors how they are protecting shareholders. But the numbers are very daunting, and not every bank will have enough net cash flow and risk equity to cover the inevitable losses.
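The loss arithmetic above is simple enough to reproduce directly. This sketch just restates the round numbers quoted in the post ($90 billion taken, $150 billion projected, against banks' roughly $1.8 trillion share of CRE credit):

```python
# Reproduce the CRE loss projection from the round figures quoted above.
losses_taken_bn = 90        # net CRE loan losses already taken through Q3 2009
losses_projected_bn = 150   # estimated additional write-offs to come
bank_cre_share_bn = 1_800   # commercial banks' approximate share of CRE debt

total_losses_bn = losses_taken_bn + losses_projected_bn
loss_rate = total_losses_bn / bank_cre_share_bn

print(f"Total projected CRE losses: ${total_losses_bn}B")   # $240B
print(f"As a share of bank CRE credit: {loss_rate:.1%}")    # 13.3%
```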

Published: February 11, 2010 by Guest Contributor

By: Amanda Roth Last week, we discussed how pricing relative to the competition is important to ensuring sound decision practices in loan pricing and profitability.  The extreme of pricing too high for the market can obviously be detrimental to your organization.  The other extreme can be just as dangerous. Pricing solely for your own profitability, regardless of what the competition is charging in your area, raises a few potential risk management issues.  For example, the statistics may state that you can charge 5 percent in your “A” tier and still be profitable, while the competition is charging 7.5 percent for the same tier.  You may be thinking that by offering 5 percent you will attract the “best of the best” to your organization.  However, what your statistics may not show you is the risk outside of your applicant base.  If you significantly change the customers you are bringing in, does your risk increase as well, ultimately increasing the cost associated with each loan?   Increased costs will reduce or even eliminate the profitability you had expected. A second potential issue is setting expectations within the marketplace.  Consumers generally understand that when changes occur to the interest rate at the federal level, there will be changes at their local financial institution, and these changes are often very small.  By undercutting your competition by such an extreme amount, your customers may question any attempt to raise rates by more than 50 basis points, whether due to the increased costs described above or any other factors.  A safer strategy would be to charge between 6.5 percent and 7 percent, which allows you to attract some of the best customers, maintain stability within the market, and capture additional profitability while it is available.  This is definitely a winning strategy for all -- and an important consideration as you develop your portfolio risk management objectives.
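A quick sketch can show the margin the trade-off above leaves on the table when you price at the profitability floor (5 percent) instead of nearer the market (say 6.75 percent, inside the recommended 6.5 to 7 percent band). The loan size, funding cost, and loss rate here are purely illustrative assumptions, not figures from the post:

```python
# Compare net annual margin on a hypothetical "A"-tier loan at two rates.
def annual_margin(balance: float, rate: float,
                  funding_cost: float, expected_loss: float) -> float:
    """Net annual margin: interest earned minus funding and expected losses."""
    return balance * (rate - funding_cost - expected_loss)

balance = 20_000        # hypothetical loan size
funding_cost = 0.03     # hypothetical cost of funds
expected_loss = 0.015   # hypothetical annual loss rate for the tier

floor = annual_margin(balance, 0.05, funding_cost, expected_loss)
market = annual_margin(balance, 0.0675, funding_cost, expected_loss)
print(f"Margin at 5.00%: ${floor:,.2f}")
print(f"Margin at 6.75%: ${market:,.2f}")
print(f"Extra margin per loan: ${market - floor:,.2f}")
```

Under these assumptions each loan priced at 6.75 percent earns several times the margin of one priced at 5 percent, and that cushion is what absorbs the higher costs the post warns about if the applicant mix shifts.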

Published: February 5, 2010 by Guest Contributor
