By: John Straka

Unsurprisingly, Washington deficit hawks have been eyeing the “sacred cows” of tax preferences for homeownership for some time now. Policymakers might even unwind or eliminate the mortgage interest deductions and capital-gains exemptions on home appreciation that have been in place in the U.S. for many decades. There is an economic case to be made for doing this—more efficient allocation of capital, other countries have high ownership rates without such tax preferences, and so on. But if you call or email or tweet Congress, and you choose this subject, my advice is to tell them that they should wait until it’s 2005.

In other words, now—or even the next few years, most likely—is definitely not a good time to eliminate these housing tax preferences. We need to wait until it’s something like “2005”—when housing markets are much stronger again (hopefully) and state and local government finances are far from their relatively dire straits at present. If we don’t do this right, and insist on making big changes here now, then housing will take an immediate hit, and so will employment from both the housing sector and state and local governments (with further state and local service cutbacks also, due to budget shortfalls).

The reason for this, of course, is that most homeowners today have not really benefited much, and won’t, from those well-established tax preferences. Why not? Because these preferences have been in place for so long now that the economic value (expected present discounted value) of these tax savings was long ago baked into the level of home prices that most homeowners paid when they bought their homes. Take the preferences away now, and the value of homes will immediately drop, and therefore so will property tax revenues collected by local governments across the U.S. This strategy will thus further bash the state-and-local sector in order to plump up (we hope) federal tax revenues by the value of the tax preferences. Housing will become a further drag on economic growth, and so will the resulting employment losses from both construction and local government services. As a result, it’s possible that on net the federal government may actually lose revenue from making this kind of change at precisely the wrong time.

It may very well never be quite like “2005” again. But waiting for greater housing and local government strength to change long-standing housing tax preferences should make the macroeconomic impact smaller, less visible, and more easily absorbed.
The high-profile data breaches in recent months not only left millions of consumers vulnerable to the threat of identity theft and caused businesses to incur significant costs, but they also brought data security to the top of the agenda in Washington. In Congress, members of both the House and the Senate have used the recent data breaches to demonstrate the need for a uniform national data breach notification standard and increased data security standards for companies that collect consumer information. Hearings have been held on the issue, and it is expected that legislation will be introduced this summer.

At the same time, the Obama Administration continues to call for greater data security standards. The White House released its highly anticipated cybersecurity initiative in May. In addition to implementing a national data breach notification law, the proposal would require certain private companies to develop detailed plans to safeguard consumer data.

As legislation develops and advances through multiple Congressional committees, Experian will be working with allies and coalitions to ensure that the data security standards established under the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act are not superseded by new, onerous and potentially ineffective mandates.

We welcome your questions and comments below.
A surprising occurrence is happening in the consumer credit markets. Bank card issuers are back in acquisition mode, enticing consumers with cash back, airline points and other incentives to get a share of their wallet. And while new account originations are nowhere near the levels seen in 2007, recent growth in new bank card accounts has been significant: 17.6% in Q1 2011 compared with Q1 2010.

So what is accounting for this resurgence in the credit card space while the economy is still trying to find its footing and credit is supposedly still difficult to come by for the average consumer? Whether good or bad, the economic crisis of the past few years appears to have improved consumers’ debt management behavior, and card issuers have taken notice. Delinquency rates on bank cards are lower than at any time over the past five years, and compared with the start of 2009, when bank card delinquency was peaking, current performance has improved by over 40%. These figures have given bank card issuers the confidence to ease their underwriting standards and re-establish their acquisition strategies.

What’s interesting, however, is the consumer segments that are driving this new growth. When analyzed by VantageScore, new credit card accounts are growing the fastest in the VantageScore D and F tiers, with 46% and 53% increases year over year, respectively. For comparison, the VantageScore A and B tiers saw 5% and 1% increases during the same period, respectively. And although VantageScore D and F represent less than 10% of new bank card origination volume ($ limits), it is still surprising to see such a disparity in growth rates between the risk categories.

While this is a clear indication that card issuers are making credit more readily available for all consumer segments, it will be interesting to see whether the debt management lessons learned over the past few years will stick and delinquency rates will continue to remain low. If these growth rates are any indication, the card issuers are counting on it.
TRMA’s recent Summer 2011 Conference in San Francisco was another great, insightful event. Experian’s own Greg Carmean gave a presentation on the issues involved in providing credit to small-business owners. I recently interviewed Greg to get his impressions about last month’s conference.

KM: I’m speaking with Experian Program Manager Greg Carmean, who spoke at TRMA’s Summer Conference. Hi, Greg.

GC: Hi, Kathy.

KM: Greg, I know I’ve interviewed you before, but can you please remind everyone what your role is here at Experian?

GC: Sure, I’m a Program Manager on the Small Business Credit Share side. I work with small- and medium-size companies, including telecom and cable companies, to reduce credit risk and get more value from their data.

KM: Thanks, Greg. So last month, you spoke at TRMA’s Summer Conference. What did you discuss?

GC: My presentation was entitled “Beyond Consumer Credit – Providing a More Comprehensive Assessment of Small-Business Owners.” I talked about how traditional risk management tools can provide a point-in-time look at a business owner, but often fail to show the broader picture of the risk associated with all of their current and previous businesses.

KM: Why did you choose this particular topic?

GC: Well, Kathy, small business is seen as a large area of opportunity, but there can be a lot of difficulty involved in validation, especially when it comes to remote authentication and new businesses.

KM: Would you say there’s more fraud in small business than on the consumer side?

GC: Believe it or not, there is 3-4 times more fraud in small business than in consumer. Business identity theft has become a bigger issue, Tax ID verification is a common problem, and there’s a lot of concern about agents bringing in fraudulent accounts.

KM: What can be done about it?

GC: Many telecom and cable companies are beginning to adopt more aggressive, manual processes to lower the risk of fraud. Unfortunately, that usually results in lower activation.

KM: Sounds like it can be frustrating!

GC: It can be, especially for the salespeople who bring in an account and then find it’s not approved for service. Sometimes clients will pass a fraud check but not a credit check. One of the topics I touched on is better tools that more accurately identify a small-business owner’s risk across all of their current and previous businesses to alleviate some of these problems.

KM: Is there anything else telecom and cable companies should be doing?

GC: I think the best risk-mitigation tool when it comes to account acquisition is leveraging information about both the small business and its owner. As they say, knowledge is power.

KM: Definitely! Thanks again for your time today, Greg.

Share your thoughts! If you attended TRMA’s Summer Conference, and especially if you attended Greg Carmean’s session, we’d love to hear from you. Please share your thoughts by commenting on this blog post. All of us at Experian look forward to seeing you at TRMA’s Fall Conference in Dallas, Texas, on September 20 – 21, 2011.
By: Staci Baker

The Durbin Amendment, according to Wikipedia, gave the Federal Reserve the power to regulate debit card interchange fees. The amendment, which will have a profound impact on banks, merchants and anyone who holds a debit card, will take effect on October 1, 2011 rather than the originally announced July 21, 2011, which allows banks additional time to implement the new regulations.

Under the Durbin Amendment, card networks such as Visa and Mastercard will be limited to an interchange fee of 21 cents per transaction, and debit cards must be able to be processed on at least two independent networks. This will cost banks roughly $9.4 billion annually, according to CardHub.com. As stipulated in the amendment, institutions with less than $10 billion in assets are exempt from the cap.

In preparation for the Durbin Amendment, several banks have begun to impose new fees on checking accounts, end reward programs, raise minimum balance requirements, and have threatened to cap debit card transactions at $50 to $100 in order to recoup some of the earnings they are expected to lose. These new regulations will be a blow to already hurting consumers as their out-of-pocket expenses keep increasing. As you can see, the Durbin Amendment, which is meant to help consumers, will instead see the cost of lost interchange fees passed along to them in other forms. And the loss of revenue will greatly impact the bottom line of banking institutions.

Who will be the biggest winner with this new amendment - the consumer, merchants or the banks? Will banks be able to lower the cost of credit to an amount that will entice consumers away from their debit cards and to use their credit cards again? I think it is still far too soon to tell. But I think over the next few months we will see consumers use payment methods in a new way as both consumers and banks come to a middle ground that will minimize risk levels for all parties. Consumers will still need to shop, and bankers will still need their tools utilized.

What are you doing to prepare for the Durbin Amendment?
Every communication company wants to inoculate its portfolio against bad debt, late payments and painful collections. But many still use traditional generic risk models to uncover potential problems, either because they’ve always used generics or because they see their limited predictive abilities as adequate.

Generalization dilutes results

The main problem with generics, however, is how they generalize consumers’ payment behavior and delinquencies across credit cards, mortgages, auto loans and other products. They do not include payment and behavioral data focused on actual communications customers only. Moreover, their scoring methodologies can be too broad to provide the performance, lift or behavioral insights today’s providers strive to attain.

Advantages of industry-specific models

Communications-specific modeling can be more predictive if you want to know who’s more likely to prioritize their phone bill and remit promptly, and who’s not. In multiple market validations pitting an optimized industry-specific model against traditional generic products, Experian’s Tele-Risk ModelSM and Telecommunications, Energy and Cable (TEC) Risk ModelSM more accurately predicted the likelihood of future serious delinquent or derogatory payment behavior. Compared with generics, they also:

- Provided a stronger separation of good and bad accounts
- More precisely classified good vs. bad risk through improved rank ordering
- Accurately scored consumers who might otherwise have been considered unscorable by a generic score

Anatomy of a risk score

These industry risk models are built and optimized using TEC-specific data elements and sample populations, which makes them measurably more predictive for evaluating new or existing communications customers. Optimization also helps identify other potentially troublesome segments, including those that might require special handling during onboarding, “turn ons,” or managing delinquency.

Check the vital signs

To assess the health of your portfolio, ask a few simple questions:

- Does your risk model reflect unique behaviors of actual communications customers?
- Is overly generic data suppressing lift and masking hidden risk?
- Could you score more files that are currently deemed unscorable?

Unless the answer is ‘yes’ to all, your model probably needs a check-up—stat.
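For readers curious how claims like “stronger separation” and “more scorable consumers” are usually quantified, here is a minimal sketch in Python. It uses synthetic data and hypothetical score distributions (it is not Experian’s models or validation methodology) and computes the Kolmogorov-Smirnov (KS) separation statistic plus the scorable rate for a generic score versus an industry-specific score.

```python
# Minimal sketch (synthetic data; not Experian's validation methodology):
# compare how well two risk scores separate good and bad accounts, using the
# Kolmogorov-Smirnov (KS) statistic, and how many consumers each score covers.
import numpy as np
from scipy.stats import ks_2samp

rng = np.random.default_rng(7)
n = 10_000
bad = rng.random(n) < 0.08                      # 8% of accounts go seriously delinquent (synthetic)

# Hypothetical scores: higher = lower risk. The "industry" score is built to
# separate goods from bads a bit more sharply than the generic one.
generic  = np.where(bad, rng.normal(580, 60, n), rng.normal(640, 60, n))
industry = np.where(bad, rng.normal(560, 55, n), rng.normal(660, 55, n))

# Some thin-file consumers are unscorable on the generic model (synthetic assumption).
generic_scorable  = rng.random(n) > 0.12        # 88% scorable
industry_scorable = rng.random(n) > 0.04        # 96% scorable

def ks(score, scorable):
    """KS = maximum separation between the good and bad score distributions."""
    s, b = score[scorable], bad[scorable]
    return ks_2samp(s[~b], s[b]).statistic

print(f"Generic:  KS={ks(generic, generic_scorable):.3f}, scorable={generic_scorable.mean():.0%}")
print(f"Industry: KS={ks(industry, industry_scorable):.3f}, scorable={industry_scorable.mean():.0%}")
```

A higher KS on the same population, together with a higher scorable rate, is one simple way a validation would express the kind of lift described above.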
Lately there has been a lot of press about breaches and hacking of user credentials. I thought it might be a good time to pause and distinguish between authentication credentials and identity elements.

Identity elements are generally those bits of metadata related to an individual: things like name, address, date of birth, Social Security Number, height, eye color, etc. Identity elements are typically used as one part of the authentication process to verify an individual’s identity. Credentials are typically the keys to a system that are granted after someone’s identity elements have been authenticated. Credentials then stand in place of the identity elements and are used to access systems.

When credentials are compromised, there is risk of account takeover by fraudsters with malicious intent. That’s why it’s a good idea to layer in risk-based authentication techniques along with credential access for all businesses. But for financial institutions, the case is clear: a multi-layered approach is a necessity. You only need to review the FFIEC Guidance on Authentication in an Internet Banking Environment to confirm this fact.

Boiled down to its essence, the latest guidance issued by the FFIEC is rather simple. Essentially it asks U.S. financial institutions to mitigate risk using a variety of processes and technologies, employed in a layered approach. More specifically, it asks those businesses to move beyond simple device identification — such as IP address checks, static cookies and challenge questions derived from customer enrollment information — to more complex device intelligence and more complex out-of-wallet identity verification procedures.

In the world of online security, experience is critical. Layered together, Experian’s authentication capabilities (including device intelligence from 41st Parameter, out-of-wallet questions and analytics) offer a more comprehensive approach to meeting and exceeding the FFIEC’s most recent guidance. More importantly, they offer an effective and efficient means of mitigating risk in online environments, ensure a positive customer experience, and have been market-tested in the most challenging financial services applications.
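To make the “layered approach” concrete, here is a minimal sketch of a layered authentication decision. The signal names, thresholds and step-up logic are hypothetical illustrations of the concept only; they are not the actual behavior of Experian’s or 41st Parameter’s products, nor a restatement of the FFIEC guidance.

```python
# Illustrative sketch only: hypothetical thresholds and signal names.
# The idea of a layered approach is that each layer contributes evidence, and
# weaker sessions are "stepped up" to stronger verification rather than the
# decision resting on any single check.
from dataclasses import dataclass

@dataclass
class SessionSignals:
    device_risk: float         # 0 (known, clean device) .. 1 (high-risk device)
    identity_verified: bool    # identity elements matched during verification
    oow_questions_passed: int  # out-of-wallet questions answered correctly
    oow_questions_asked: int

def authentication_decision(s: SessionSignals) -> str:
    # Layer 1: device intelligence. A very risky device fails outright.
    if s.device_risk > 0.9:
        return "deny"
    # Layer 2: identity element verification.
    if not s.identity_verified:
        return "step_up"             # e.g., request additional documentation
    # Layer 3: out-of-wallet (knowledge-based) verification for riskier sessions.
    if s.device_risk > 0.5:
        if s.oow_questions_asked == 0:
            return "step_up"         # trigger an out-of-wallet quiz
        if s.oow_questions_passed / s.oow_questions_asked < 0.75:
            return "deny"
    return "allow"

print(authentication_decision(SessionSignals(0.2, True, 0, 0)))   # allow
print(authentication_decision(SessionSignals(0.6, True, 0, 0)))   # step_up
print(authentication_decision(SessionSignals(0.7, True, 2, 4)))   # deny
```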
Like their utility counterparts, communications providers routinely participate in federally subsidized assistance programs that discount installation or monthly service for qualified low-income customers. But, as utilities have found, certain challenges must be considered when mining this segment for new growth opportunities, including:

- Thwarting scammers who use falsified income data and/or multiple IDs to game the system and double up on discounts
- Equipping internal teams to efficiently process the potential mountain of program applications and recertification paperwork

The right tool for the job

Experian’s Financial Assistance CheckerSM product is a powerful scoring tool that indicates whether consumers may qualify for low-income assistance programs (such as Lifeline and Link Up). Originally designed for (and currently used by) utilities, Financial Assistance Checker offers risk-reduction and resource utilization efficiencies that also benefit communications providers.

Automation saves time

For example, Financial Assistance Checker may be used to help qualify specific individuals among new and existing low-income program participants, as well as others who may qualify but have not yet enrolled. The solution also helps automate labor-intensive manual reviews, making the process less costly and more efficient. Some companies have reduced manual intervention by up to 50% by using financial assistance scores to automatically re-certify current enrollees.

Strengthen your overall game plan

Experian’s Financial Assistance Checker may be used to:

- Produce a score that aids in effective decisions
- Reduce the number of manually reviewed applications
- Facilitate more efficient resource allocation
- Mitigate fraud risk by rejecting unqualified applicants

Cautionary caveat

Financial Assistance Checker is derived exclusively from Experian’s credit data without demographic factors. While it’s good at qualifying applicants and customers, it may not be used as a basis for adverse action or removal from a program — only to determine eligibility for low-income assistance.

Today, acquisition is the name of the game. If your growth strategy calls for leveraging subsidized segments, consider adding Experian’s Financial Assistance Checker product to your starting lineup. After all, the best offense could just be a strong defense.

Link & Learn

This link takes you to a short but informative video about Lifeline and Link Up. See the FCC’s online Lifeline and Link Up program overview here. Hot off the government press! Click to see the FCC’s 6/21/11 report on Lifeline and Link Up Reform and Modernization.
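As a rough illustration of the automated re-certification idea described under “Automation saves time,” here is a minimal sketch. The threshold, score values and account IDs are invented for illustration and are not Financial Assistance Checker’s actual logic; consistent with the caveat above, a low score only routes an enrollee to the normal manual review, never to removal from the program.

```python
# Hypothetical sketch of score-driven recertification. The threshold and field
# names are invented; the score is used only to confirm continued eligibility,
# so a low score routes to manual review rather than any adverse action.
def recertification_decision(assistance_score: int, auto_threshold: int = 700) -> str:
    """Return how to handle a current low-income program enrollee."""
    if assistance_score >= auto_threshold:
        return "auto_recertify"       # likely still qualifies; skip the manual review
    return "manual_review"            # gather income documentation as usual

enrollees = {"A-1001": 770, "A-1002": 640, "A-1003": 715}   # hypothetical accounts and scores
for account, score in enrollees.items():
    print(account, recertification_decision(score))
```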
By: Kennis Wong

On the surface, it’s not difficult to define existing account fraud. Obviously, it is fraud perpetrated against an existing account. But the way I see it, existing account fraud can be broken down into four types.

The first type is account takeover fraud, which is what most organizations think of as the de facto existing account fraud. This is when a real consumer uses his or her own identity to open a legitimate account, but the account later gets taken over by an identity fraudster. The idea is that when the account was first established, it was created by the rightful person. But somewhere along the way, the account and identity information were compromised. The fraudster uses the compromised information to engineer their way into the account.

The second type is impersonation. Impersonation is somewhat similar to account takeover in the sense that it also misuses the victim’s account. But the difference is that impersonation involves only one or a few misuses of the account. Examples are fraudulent use of a credit card or a wire transfer.

These are the obvious categories. But I think we should also think about these other categories. My definition of existing account fraud also includes a third type – identity fraud that was undetected during application. In other words, an account is established based on a stolen identity. Many organizations call this “new account fraud”, which I don’t have a problem with. But I think it’s really also existing account fraud, because – is this an existing account? The answer is yes. Is this fraud? Absolutely. It’s not that difficult, is it?

Similarly, I am including first-party fraud in existing account fraud as well. A consumer can use his or her own identity to open an account with the intention to default after the account is established. An example is bust-out fraud.

You can see that this is an expanded definition of existing account fraud, because my focus is on detection. No matter at what point or how identity fraud comes in, it becomes an account in your organization, and that is where we need to discover the fraud. But at the end of the day, it’s not too important how we categorize or name the fraud - whether it’s application fraud, existing account fraud, first-party fraud or third-party fraud - as long as organizations understand them enough and have a good way to detect them.

Read more blog posts on existing account fraud.
By: Kari Michel

Strategic default has been a hot topic in the media as far back as 2009, and it will continue to be, as this problem won’t really go away until home prices climb and stay there.

Terry Stockman (not his real name) earns a handsome income, maintains a high credit score and owns several residential properties. They include the Southern California home where he has lived since 2007. Terry is now angling to buy the foreclosed home across the street. What’s so unusual about this? Terry hasn’t made a mortgage payment on his own home for more than six months. With prices now at 2003 levels, his house is worth only about one-half of what he paid for it. Although he isn’t paying his mortgage loan, Terry is current with his other debt payments. Terry is a strategic defaulter — and he isn’t alone.

By the end of 2008, a record 1 in 5 mortgages at least 60 days past due was a strategic default. Since 2008, strategic defaults have fallen below that percentage in every quarter through the second quarter of 2010, the most recent quarter for which figures are available. However, the percentages are still high: 16% in the last quarter of 2009 and 17% in the second quarter of last year. Get more details from our 2011 Strategic Default Report.

What does this mean for lenders? Mortgage lenders need to be able to identify strategic defaulters in order to best employ their resources and set different strategies for consumers who have defaulted on their loans. Specifically designed indicators help lenders identify suspected strategic default behavior as early as possible and can be used to prioritize account management or collections workflow queues for better treatment strategies. They also can be used in prospecting and account acquisition strategies to better understand payment behavior prior to extending an offer.

Here is a white paper I thought you might find helpful.
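To make the idea of “specifically designed indicators” concrete, here is a deliberately simplified heuristic in Python. It is not Experian’s actual strategic default indicator; it just flags the pattern described above: a seriously delinquent mortgage, other trades kept current, and a home worth less than the loan.

```python
# Simplified heuristic only, not Experian's actual strategic-default indicators.
# It captures the pattern in the Terry example: the mortgage is seriously
# delinquent while other trades stay current, and the home is underwater.
from dataclasses import dataclass

@dataclass
class Borrower:
    mortgage_days_past_due: int
    other_trades_current: bool     # credit cards, auto loans, etc. paid on time
    home_value: float
    mortgage_balance: float

def looks_like_strategic_default(b: Borrower) -> bool:
    underwater = b.home_value < b.mortgage_balance
    return b.mortgage_days_past_due >= 60 and b.other_trades_current and underwater

# "Terry": six-plus months behind on the mortgage, current elsewhere, underwater.
terry = Borrower(mortgage_days_past_due=180, other_trades_current=True,
                 home_value=300_000, mortgage_balance=600_000)
print(looks_like_strategic_default(terry))   # True
```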
When the Consumer Financial Protection Bureau (CFPB) takes authority on July 21, debt collectors and communications companies should pay close attention. If the CFPB has its way, the rules may be changing.

Old laws, new technologies

The rules governing consumer communications for debt collection haven’t seen a major update since they were written in 1977. While the FTC has enforcement power in this area, it can’t write rules—Congress must provide direction. Consequently, the rules guiding the debt collection industry have evolved based on decisions by the courts.

In the meantime, technology has outpaced the law. Debt collectors have taken advantage of the latest available methods of communication, such as cell phones, autodialers and email, while the compliance requirements have largely remained murky. At the same time, complaints about debt collection practices to the FTC continue to rise. While the number is relatively low compared to the amount of overall activity, the FTC receives more complaints about debt collectors than any other industry. The agency has also raised concerns about how new communication tools, such as Facebook and Twitter, will impact the future of debt collection.

Priorities for the CFPB

While mortgages, credit cards and payday loans will be the early priorities for the CFPB, high on the list of to-do items will be updating the laws governing consumer communications for debt collection. Under the Dodd-Frank Act, the CFPB will be responsible not only for enforcing the Fair Debt Collection Practices Act (FDCPA), but it will also have a new ability to write the rules. This raises new issues, such as how new regulations will affect how debt collection companies can contact consumers.

Even as lenders and communications companies have expressed concern about the CFPB writing the rules, the hope is that the agency will create a more predictable legal structure that covers new technologies and reduces the uncertainty around compliance. Faced with the prospect of clarifying the compliance requirements around debt collection, the ACA (Association of Credit and Collection Professionals) has started to get in front of the CFPB by putting together its own blueprint.

Will the CFPB be ready by July 21?

Over the last year, the CFPB has been busy building an organizational structure but still lacks a leader appointed by the President and confirmed by the Senate. (Elizabeth Warren is currently the unofficial director.) Without a permanent director in place, the agency will be unable to gain full regulatory authority on July 21 – the date set by the Treasury Department. Until then, the CFPB will be able to enforce existing laws but will be unable to write new regulations.

Despite the political uncertainty, debt collectors and communications firms still need to be prepared. One way is to ensure you’re following industry best practices established by the ACA. To help you be ready for any outcome, we’ll continue to follow this issue and keep you apprised of the CFPB’s direction. Let us know your thoughts and concerns in the comment section. Or feel free to contact your Experian rep directly with any questions you may have.

Helpful links:
- Association of Credit and Collection Professionals
- Fair Debt Collection Practices Act (PDF)
- Consumer Financial Protection Bureau (CFPB)
This is the third and final post in an interview between Experian’s Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. In this post Dr. Turner discusses mandatory credit-information sharing for communications companies, and the value of engaging and educating state regulators.

_____________________________

Does it make sense for the FTC to mandate carriers to report?

Credit information sharing in the United States is a voluntary system under the Fair Credit Reporting Act (FCRA). Mandating information sharing would break precedent with this successful, decades-old regime, and could result in less rather than more information being shared, as it shifts from being a business matter to a compliance issue.

Additionally, the voluntary nature of credit reporting allows data furnishers and credit bureaus to modify reporting in response to concerns. For example, in reaction to high utility bills as a result of severe weather, a utility provider may wish to report only delinquencies 60 days or more past due. Similarly, a credit bureau may not wish to load data it feels is of questionable quality. A voluntary system allows for these flexible modifications in reporting.

Further, under existing federal law, those media and communications firms that decide they want to fully report payment data to one or more national credit bureaus are free to do so. In short, there is simply no need for the FTC to mandate that communications and media companies report payment data to credit bureaus, nor would there be any immediate benefit in so doing.

How much of the decision is based on the influence of the state PUC or other legislative groups?

Credit information sharing is federally regulated by the Fair Credit Reporting Act (FCRA). The FCRA preempts state regulators, and as such, a media or communications firm that wants to fully report may do so regardless of the preferences of the state PUC or PSC.

PERC realizes the importance of maintaining good relations with oversight agencies. We recommend that companies communicate the fact that they fully report payment data to a PUC or PSC and engage in proactive outreach to educate state regulators on the value of credit reporting customer payment data. There have been notable cases of success in this regard.

Currently, just four states (CA, OH, NJ and TX) have partial prohibitions regarding the onward transfer of utility customer payment data to third parties, and none of these provisions envisioned credit reporting when drafted. Instead, most are add-ons to federal privacy legislation. Only one state (CA) has restrictions on the onward transfer of media and communications customer payment data, and again this has nothing to do with credit reporting.

Agree, disagree or comment

Whether you agree with Dr. Turner’s assertions or not, we’d love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry. Click here to learn more about current and pending legislation that impacts communications providers.
By: John Straka

The U.S. housing market remains relatively weak, but it’s probably not as weak as you think. To what extent are home prices really falling again?

Differing Findings

Most recent media coverage of the “double dip in home prices” has centered on declines in the popular Case-Shiller price index; however, the data entering into this index is reported with a lag (the just-released April index reflects data for February-April) and with some limitations. CoreLogic publishes a more up-to-date index value that earlier this month showed a small increase, and more importantly, CoreLogic also produces an index that excludes distressed sales. This non-distressed index has shown larger recent price increases, and it shows increases over the last 12 months in 20 states. Others basing their evidence on realtors’ listing data have concluded that there was some double dip last year, but prices have actually been rising now for several months (see Altos). These disparate findings belie overly simplistic media coverage, and they stress that “the housing market” is not one single market, of course, but a wide distribution of differing outcomes in very many local neighborhood home markets across the nation. (For a pointed view of this, see Charron.)

Improved Data Sources

Experian is now working with Collateral Analytics, the leading source of the most granular and timely home market analytics and information, built from nationwide local market data, and the best automated valuation model (AVM) provider based on these and other data. (Their AVM leads in accuracy and geographic coverage in most large lender and third-party AVM tests.) While acknowledging their popularity, value, and progress, Collateral Analytics President Dr. Michael Sklarz questions the traditional dominance of repeat-sales home price indexes (from Case-Shiller etc.). Repeat-sales data typically includes only around 20 to 30 percent of the total home sales taking place. Collateral Analytics instead studies the full market distribution of home sales and market data and uses their detailed data to construct hedonic price indexes that control for changing home characteristics. This approach provides a “constant quality” claim similar to repeat-sales indexes—without throwing away a high percentage of the market observations. Collateral Analytics indexes also cover over 16,000 zip codes, considerably more than others.

Regular vs. Distressed Property Sales

Nationwide, some well-known problem states, areas and neighborhoods continue to fare worse than most others in today’s environment, and this skewed national distribution of markets is not well described by overall averages. Indeed, on closer inspection, the recent media-touted gloomy picture of home prices that are “falling again” or that “continue to fall” is a distorted view for many local home markets, where prices have been rising a little or even more, or at least remaining flat or stable. Nationwide or MSA averages that include distressed-property sales (as Case-Shiller tends to do) can be misleading for most markets. The reason for this is that distressed-property sales, while given much prominence in recent years and lowering overall home-price averages, have affected but not dominated most local home markets. The reporting of continued heavy price discounts (twenty percent or significantly more) for distressed sales in most areas is a positive sign of market normality.
It typically takes a significantly large buildup of distressed property sales in a local area or neighborhood home market to pull down regular property sale prices to their level. For normal or regular home valuation, distressed sales are typically discounted due to their “fire sale” nature, “as is” sales, and property neglect or damage. This means that the non-distressed or regular home price trends are most relevant for most homes in most neighborhoods. Several examples are shown below. As suggested in these price-per-living-area charts, regular (non-distressed) home-sale prices have fared considerably better in the housing downturn than the more widely reported overall indexes that combine regular and distressed sales(1).

Regular-Sale and Combined Home Prices in $ Per Square Foot of Living Area and Distress Sales as a Pct of Total Sales

In Los Angeles, combined sale prices fell 46 percent peak-to-trough and are now 16 percent above the trough, while regular sale prices fell by considerably less, 33 percent, and are now 3 percent above the trough. Distressed sales as a percent of total sales peaked at 52 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2, where it has largely remained (this improvement occurred before the general “robo-signer” process concerns slowed down industry foreclosures). L.A. home prices per square foot have remained largely stable for the past two years, with some increase in distressed-sale prices in 2009. Market prices in this area most recently have tended to remain essentially flat—weak, but not declining anew, with some upward pressure from investors and bargain hunters (previously helped by tax credits before they expired). Double-Dip: No.

In Washington DC, single-family home prices per square foot have been in a saw-tooth seasonal pattern, with two drops of 15-20% followed by sizable rebounds in spring sales prices. The current combined regular & REO average price is 17 percent below its peak but 13 percent above its trough, while the regular-sale average price is just 12 percent below the peak and 10 percent above its trough. Distressed sales have been comparatively low, but rising slowly to a peak of a little over 20 percent in 2010, with some slight improvement recently to the high teens. Single-family prices in DC have remained comparatively strong; however, more of the homes in DC are actually condos, and condo prices have not been quite as strong, with the market data showing mixed signals but with the average price per square foot remaining essentially flat. Double-Dip: No.

In the Miami area, the combined average home price per square foot fell by 48 percent peak to trough and is now just 1 percent above the 2009:Q2 trough. The regular-sale average price already experienced an earlier double-dip, falling by 32 percent to 2009:Q2, then stabilizing for a couple of quarters before falling another 9 percent relative to the peak; since 2010:Q3 this average has been choppy but basically flat, now 3 percent above that second trough. Prices in Miami have been among the weakest in large metro areas, but average prices have been largely flat for the past year, without any sharp new double dip. Distressed sales as a percent of the total peaked at 53 percent in 2009:Q1, but then fell to a little under 30 percent by 2010:Q2; since then there has been some return to a higher distress share, in the mid to upper 30s (but all of these figures are about 10 percentage points lower for condos). New Double-Dip: No.
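Before turning to the remaining metro examples, a brief aside on the arithmetic: the peak-to-trough and above-trough percentages quoted for each area follow directly from the quarterly price-per-square-foot series. Here is a minimal sketch, using a hypothetical series shaped roughly like the Los Angeles combined index (the numbers are illustrative, not Collateral Analytics data).

```python
# How the "fell X% peak-to-trough, now Y% above the trough" figures follow from
# a quarterly price-per-square-foot series. The series below is hypothetical,
# shaped roughly like the Los Angeles combined (regular + distressed) index.
def peak_trough_stats(prices):
    peak = max(prices)
    trough = min(prices[prices.index(peak):])   # lowest value after the peak
    current = prices[-1]
    decline  = (trough - peak) / peak           # peak-to-trough change
    recovery = (current - trough) / trough      # change since the trough
    return decline, recovery

la_combined = [430, 415, 370, 320, 280, 248, 232, 240, 255, 262, 268, 270]  # $/sq ft, hypothetical
decline, recovery = peak_trough_stats(la_combined)
print(f"Peak-to-trough: {decline:.0%}, now {recovery:.0%} above trough")
# -> roughly "-46%" and "16%", matching the combined-sale figures cited for L.A.
```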
The Dallas area has seen some of the strongest prices in the nation. The combined price per square foot had an earlier peak and fell by 31 percent peak to trough, but it is now 33 percent above the trough. The regular-sale average price fell briefly by 22 percent peak to trough, but it has since risen by 32 percent from the 2009:Q1 trough to where it is now 3 percent above the peak. The increases have occurred in a saw-tooth seasonal pattern with spring prices the highest, but prices here have been rising considerably. Distressed sales as a percent of the total peaked at 22 percent in 2009:Q1 but have largely fallen since and now stand at just 11 percent. Double-Dip: No.

Here You Can See 47 More Examples of Where Double-Dips Are and Are Not:
» Pacific West
» Southwest
» Mountain West
» Midwest
» Northeast
» Mid Atlantic
» Southeast

To summarize this information and gain a little more insight into the general area conditions for most homes and individuals in the U.S., we can add up the number of homes and the total population across the counties examined. To be sure, this information is not a rigorous random sample across homes, but I have tried to include and show the details of both stronger and weaker metro-area counties throughout the U.S. As shown in the tables below, the information used here has covered 51 metro-area counties, comprising over 15 million homes and a total population of nearly 75 million individuals(2). These results may be regarded as suggestive of findings from a more thoroughgoing study.

Based on these reviews of the market price averages and other data, my assessment is that a little over half of the counties examined are not currently or recently experiencing a double-dip in home prices. Moreover, these counties, where home prices appear to be at least flat or relatively stronger, encompass almost two-thirds (65%) of the total affected U.S. population examined, and nearly three-fifths (58%) of the total properties covered by the data studied.

Conclusion

This is, on balance, good news. But there are remaining concerns. One is the continued high, or more recently rising, shares of distressed sales in many markets, and the “shadow inventory” of distressed sales now being held up in the current foreclosure pipeline. But it is also interesting to see that many of the reductions in the distressed-property shares of total sales in high-stress areas occurred before the foreclosure processing slowdowns. Another interesting observation is that most of the recent double-dips in prices have been relatively mild compared to the previous original peak-to-trough meltdown.

While, to be sure, there are plenty of reasons to remain uncertain and cautious about U.S. home prices, home markets in general do vary considerably, with significant elements of improvement and strength as well as the continuing weaknesses. Despite many reports today about “the beleaguered housing market,” there really is no such thing … not unless the report is referring to a very specific local market. There definitely are double dips in many areas, and reasons for continuing overall concern. But the best available evidence suggests that there are actually double-dip markets (most of them relatively moderate), stable markets, and stronger markets, with markets affecting a majority of homes and individuals actually in the stable and stronger categories.
Note: In the next installment, we’ll look at some more granular micro-market data to explore in greater depth the extensive variety of home-price outcomes and market conditions in weak pockets and strong pockets across various local areas and home markets. This will highlight the importance of having very good information, at sub-county and even sub-zip code levels, on local-neighborhood home markets.

Source of Home Price and Market Information: Collateral Analytics HomePriceTrends. I thank Michael Sklarz for providing the extensive information for this report and for comments, and I thank Stacy Schulman for assistance in this posting.

__________________

(1) Based on analysis by Collateral Analytics, price/living sq ft is a useful, simple “hedonic” measure which typically controls for around 70 percent or more of the changing characteristics in a housing stock and home sale mix. Patterns in home prices without dividing by the square footage are generally similar, but not always.

(2) The property inventory counts are from Collateral Analytics, while the population estimates are from the 2010 U.S. Census.
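As a companion to footnote (1) and the hedonic-index discussion under “Improved Data Sources,” here is a minimal sketch of how a hedonic price index can be estimated: regress log price on home characteristics plus quarter dummies, and read a constant-quality index off the quarter coefficients. The data are synthetic and the specification is deliberately simple; this is not Collateral Analytics’ methodology.

```python
# Minimal hedonic-index sketch on synthetic data (not Collateral Analytics'
# methodology). Regress log price on home characteristics plus quarter dummies;
# the exponentiated quarter coefficients trace a constant-quality price index.
import numpy as np

rng = np.random.default_rng(0)
n, quarters = 5_000, 8
sqft = rng.uniform(900, 3500, n)
beds = rng.integers(2, 6, n)
qtr  = rng.integers(0, quarters, n)
true_index = np.array([1.00, 0.97, 0.92, 0.90, 0.91, 0.93, 0.94, 0.95])  # hidden market trend

log_price = (4.0 + 0.8 * np.log(sqft) + 0.05 * beds
             + np.log(true_index)[qtr] + rng.normal(0, 0.1, n))

# Design matrix: log(sqft), beds, and one dummy per quarter (no intercept,
# so each quarter dummy absorbs that quarter's price level).
X = np.column_stack([np.log(sqft), beds] + [(qtr == q).astype(float) for q in range(quarters)])
coef, *_ = np.linalg.lstsq(X, log_price, rcond=None)

index = np.exp(coef[2:])
index /= index[0]                       # normalize the first quarter to 1.0
print(np.round(index, 3))               # recovers something close to true_index
```

The same idea extends to richer characteristic sets and finer geographies; dividing price by square footage, as in the charts above, is the one-variable shortcut described in footnote (1).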
This is the second in a three-part interview between Experian’s Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. Dr. Turner is a prominent expert on credit access, credit reporting and scoring, information policy, and economic development. Mr. Whitfield is the Director of Marketing for Experian’s Telecommunications, Energy and Cable practice. In this post Dr. Turner explains how full-file credit reporting actually benefits consumers and why many communications providers haven’t yet embraced it.

_____________________________

Why is full-file credit reporting good for communications customers?

Approximately 54 million Americans either have no credit report or have too little information in their credit reports to generate a credit score. Most of these “thin-file/no-file” persons are financially excluded, and many of them are media and communications customers. By having their payment data fully reported to a credit bureau and included in their credit reports, many will be able to access affordable sources of mainstream credit for the first time; others will be helped by repairing their damaged credit. In this way, consumers will save by not relying on high-cost lenders to have their credit needs met.

Why don’t providers embrace reporting like other major industries/lenders?

A major reason is inertia—providers haven’t done it before and are not sure how they would benefit from change. Just recently, PERC released a major study highlighting the business case for fully reporting customer payment data to one or more nationwide credit bureaus. This includes customer survey results, peer survey results and case studies. The results all point to tremendous upside from fully reporting payment data, with only manageable downsides—including the need for external communications with customers and regulators.

Misperceptions and misunderstandings

Another significant reason is regulator misperceptions and misunderstandings. State public service and public utility commissions (PSCs and PUCs) aren’t experts in credit reporting or the regulatory framework around credit-information sharing. Many mistakenly believe the data is unregulated and can be used for marketing. Not wanting to contribute to an increase in commercial mail and telemarketing calls, some regulators have a knee-jerk reaction when the topic of credit reporting is raised by an interested media, communications or utility company. PERC has been working to educate regulators and has had success in its outreach efforts. PERC can also be a resource to firms interested in full-file reporting in their direct communications with regulators.

Part 3: Wednesday, June 29

Next, in the concluding post of this interview with PERC founder, president and CEO Dr. Michael Turner, the doctor discusses mandatory credit-information sharing for communications companies, and the value of engaging and educating state regulators.

Agree, disagree or comment

Whether you agree with Dr. Turner’s assertions or not, we’d love to hear from you. So please, take a moment to share your thoughts about full-file credit reporting in the communications industry.
This is the first in a three-part interview between Experian’s Tom Whitfield and Dr. Michael Turner, founder, president and CEO of the Policy and Economic Research Council (PERC)—a non-partisan, non-profit policy institute devoted to research, public education, and outreach on public and economic policy matters. Dr. Turner is a prominent expert on credit access, credit reporting and scoring, information policy, and economic development. Mr. Whitfield is the Director of Marketing for Experian’s Telecommunications, Energy and Cable practice. In this post Dr. Turner discusses how communications providers and their customers can both benefit from full-file credit reporting. Comments, suggestions and differing viewpoints are welcome.

_____________________________

Why is full reporting to the bureaus so critical for communications providers?

PERC’s research has found at least three good business reasons for media and communications companies to consider this practice:

1) Improved cash flow. In a survey of nearly 1,000 heads of household (those with primary bill-paying responsibility), media and communications payments ranked below payments that were fully reported to credit bureaus. When asked how credit reporting would impact bill payment prioritization, half of all respondents indicated they would be “much more likely” or “more likely” to pay their media and communications bills on time. Such an outcome would represent a significant cash flow improvement. In fact, case study results substantiate this, and demonstrate further benefits from reduced delinquencies and charge-offs.

2) Cost savings. In a survey of media, communications and utility companies, the perceived costs of reporting payments to a bureau were, in fact, substantially greater than actual costs incurred, and perceived benefits significantly lower than actual benefits. In most cases, the actual benefits reported by firms fully reporting payment data to one or more nationwide credit bureaus were multiples higher than the actual costs, which were reported as being modest as a ratio of IT and customer service expenditures.

3) More customer loyalty, less churn. In a competitive deregulated environment, telling customers about the benefits of fully reporting payment data (building a good credit history, reducing costs of credit and insurance, increasing credit access and credit limits, improving chances of qualifying for an apartment rental or job) could result in increased loyalty and less churn.

How do providers stand to benefit from reporting?

Providers benefit because fully reporting payment data to a nationwide credit bureau for inclusion in credit reports actually changes customer behavior. Reporting negative-only data doesn’t affect customers in the same way and, in the vast majority of cases, does not affect payment behavior at all, as consumers are entirely unaware of the reporting or see it as a “black list.” By communicating the many customer benefits of fully reporting payment data to a credit bureau for inclusion in a credit report, the provider benefits from improved cash flow, reduced charge-offs, and improved customer loyalty.

Part 2: Monday, June 27

In Part 2 of this interview, Dr. Turner explains how full-file credit reporting actually benefits consumers and why many communications providers haven’t yet embraced it. The primary reason uncovered in PERC’s research may surprise you, so be sure to come back for Part 2.

Agree, disagree or comment

Whether you agree with Dr. Turner’s assertions or not, we’d love to hear from you.
So please, take a moment to share your thoughts about full-file credit reporting in the communications industry.