
With the most recent guidance issued by the Federal Financial Institutions Examination Council (FFIEC), there is renewed conversation about knowledge based authentication. I think this is a good thing. It brings back to the forefront some of the things we have discussed for a while, like the difference between secret questions and dynamic knowledge based authentication, or the importance of risk based authentication. What does the new FFIEC guidance say about KBA? Acknowledging that many institutions use challenge questions, the FFIEC guidance highlights that how challenge questions are implemented can greatly affect their efficacy. Chances are you already know this. Of greater importance, though, is that the FFIEC guidelines caution against less sophisticated systems and against information that can be easily guessed or obtained from an Internet search, given the amount of information now publicly available. The guidelines call for questions that “do not rely on information that is often publicly available,” recommending instead a broad range of data assets on which to base questions. This is an area knowledge based authentication users should review carefully. It is perfectly appropriate to ask, “Does my KBA provider rely on data that is publicly sourced?” If you aren’t sure, ask for and review data sources. At a minimum, you want to look for the following in your KBA provider:

· Questions! Diverse questions from broad data categories, including credit and noncredit assets
· Consumer question performance as one of the elements within an overall risk-based decisioning policy
· Robust performance monitoring: monitor against established key performance indicators, and do it often
· A process to rotate questions and adjust access parameters and velocity limits. Keep fraudsters guessing!
· Use of the resources that are available to you
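Treating question performance as one element within an overall risk-based decisioning policy, rather than a pass/fail gate, can be sketched as follows. The signals, weights and thresholds here are illustrative assumptions only, not any provider's actual policy:

```python
# Illustrative sketch: KBA question performance blended with other risk
# signals into a single decision. All weights and cutoffs are hypothetical.

def decide(kba_correct: int, kba_total: int,
           device_risk: float, velocity_exceeded: bool) -> str:
    """Return 'approve', 'review', or 'deny' from blended risk signals."""
    kba_score = kba_correct / kba_total          # fraction answered correctly
    risk = (1 - kba_score) * 0.5 + device_risk * 0.5
    if velocity_exceeded:                        # too many recent attempts
        risk += 0.25
    if risk < 0.25:
        return "approve"
    if risk < 0.55:
        return "review"                          # step up to another layer
    return "deny"

print(decide(4, 4, 0.1, False))  # strong KBA, clean device -> approve
print(decide(2, 4, 0.6, True))   # weak KBA, risky device, probing -> deny
```

Because KBA is only one input, a consumer who fumbles one question on a trusted device can still pass, while a perfect answer set from a high-risk device is escalated, which is the layered posture the guidance calls for.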
Experian has compiled information that you might find helpful: www.experian.com/ffiec Finally, I think the release of the new FFIEC guidelines may have made some people wonder if this is the end of KBA. I think the answer is a resounding “No.” Not only do the FFIEC guidelines support the continued use of knowledge based authentication, but recent research also suggests that KBA is the authentication tool consumers identify as most effective. Where I would urge caution is when research doesn’t distinguish between “secret questions” and dynamic knowledge based authentication, which, as we all know, are very different.

Published: October 4, 2011 by Guest Contributor

By: Mike Horrocks Have you ever been struck by a turtle, or even better, burnt by water skis that were on fire? If you are like me, these are not accidents that I think will ever happen to me, and I'm not concerned that my family doctor didn't do a rotation in medical school to specialize in treating them. On October 1, 2013, however, doctors and hospitals across the U.S. will have the ability to identify, log, bill, and track those accidents and thousands of other very specific medical events. In fact, the list will jump from the current 18,000 medical codes to 140,000 medical codes. Some people hail this as a great step toward the management of all types of medical conditions, whereas others view it as an introduction of noise into a medical system that is already overburdened. What does this have to do with credit risk management, you ask? When I look at the amount of financial and non-financial data that the credit industry has available to understand the risk of our consumer or business clients, I wonder where we are in the range of “take two aspirins and call me in the morning” to “[the accident] occurred inside a chicken coop” (code: Y9272). Are we only identifying a risky consumer after they have defaulted on a loan? Or are we trying to find a pattern in the consumer's purchases at a coffee house that would correlate with some other data point to indicate risk when the moon is full? The answer is somewhere in between, and it will be different for each institution. Let’s start with what is known to be predictive when it comes to monitoring our portfolios - data and analytics, coupled with portfolio risk monitoring to minimize risk exposure - and then expand that over time. Click here for a recent case study that demonstrates this quite successfully with one of our clients. Next steps could include adding in analytics and/or triggers to identify certain risks more specifically.
When it comes to risk, incorporating attributes or a solid set of triggers that identify risk early and can drill down to specific events - combined with technology that streamlines portfolio management processes, whether you have an existing system in place or are in search of a migration - will give you better insight into the risk profile of your consumers. Think about where your organization lies on the spectrum. If you are already monitoring your portfolio with some of these solutions, consider the next logical step to improve the process: is it more data, advanced analytics using that data, a combination of both, or perhaps a better system for monitoring the risk more closely? Wherever you are, don’t let your institution end up needing the financial equivalent of the new medical codes W2202XA, W2202XD, and W2202XS (injuries resulting from walking into a lamppost once, twice, and sequentially).

Published: September 19, 2011 by Guest Contributor

Our guest blogger this week is Tom Bowers, Managing Director, Security Constructs LLC – a security architecture, data leakage prevention and global enterprise information consulting firm. The rash of large-scale data breaches in the news this year raises many questions, one of which is this: how do hackers select their victims? The answer: research. Hackers do their homework; in fact, an actual hack typically takes place only after many hours of first studying the target. Here’s an inside look at a hacker in action: Using search queries through such resources as Google and job sites, the hacker creates an initial map of the target’s vulnerabilities. For example, job sites can offer a wealth of information such as hardware and software platform usage, including specific versions and their use within the enterprise. The hacker fills out the map with a complete intelligence database on your company, perhaps using public sources such as government databases, financial filings and court records. Attackers want to understand such details as how much you spend on security each year, other breaches you’ve suffered, and whether you’re using LDAP or federated authentication systems. The hacker then tries to identify the person in charge of your security efforts. As they research your Chief Security Officer or Chief Information Security Officer (whom they report to, conferences attended, talks given, media interviews, etc.), hackers can get a sense of whether this person is a political player or a security architect, and can infer the target’s philosophical stance on security and where they’re spending time and attention within the enterprise. Next, hackers look for business partners, strategic customers and suppliers used by the target. Sometimes it may be easier to attack a smaller business partner than the target itself.
Once again, this information comes from basic search engine queries; attackers use job sites and corporate career sites to build a basic map of the target’s network. Once assembled, all of this information offers a list of potential and likely egress points within the target. While there is little you can do to prevent hackers from researching your company, you can reduce the threat this poses by conducting the same research yourself. Though the process is a bit tedious to learn, it is free to use; you are simply conducting competitive intelligence on your own enterprise. By reviewing your own information, you can draw conclusions similar to the attackers’, allowing you to strengthen those areas of your business that may be at risk. For example, if you want to understand which of your web portals may be exposed to hackers, use the following search term in Google: “site:yourcompanyname.com -www.yourcompanyname.com” This query specifies that you want to see everything on your site except WWW sites. Web portals do not typically start with WWW, so this query will surface results like “eportal.yourcompanyname.com” and “ecomm.yourcompanyname.com.” Portals are a great place to start, as they usually require associated user names and passwords; this means that a database is storing those credentials, which is a potential goldmine for attackers. You can set up a Google Alert to constantly watch for new portals; simply type in your query, select how often you want updates, and Google will send you an alert every time a new portal shows up in its results. Knowledge is power. The more you know about your own business, the better you can protect it from becoming prey to hacker-hawks circling in cyberspace. Download our free Data Breach Response Guide
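If you run this self-audit across several domains, it helps to generate the queries consistently. A minimal sketch, assuming the exclusion syntax shown above (adjust for the search engine you actually use):

```python
# Sketch: build one portal-discovery query per owned domain, following
# the "everything except www" pattern described in the post.

def portal_queries(domains):
    """Return a 'site: minus www' search query for each domain."""
    return [f"site:{d} -www.{d}" for d in domains]

for q in portal_queries(["yourcompanyname.com", "yourothersite.com"]):
    print(q)   # paste each into a search engine or a Google Alert
```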

Published: September 6, 2011 by Michael Bruemmer

By: Mike Horrocks Let’s all admit it, who would not want to be Warren Buffett for a day? While soaking in the tub, the “Sage of Omaha” came up with the idea to purchase shares of Bank of America and managed to close the deal in under 24 hours (and also make $357 million in one day thanks to an uptick in the stock). Clearly investor opinions differ when picking investments, so what did Buffett see that was worth taking that large a risk? In interviews Buffett simply states that he saw the fundamentals of a good bank (once they fix a few things) that will return his investment many times over. He has also said that he came to this conclusion based on years of seeing opportunities where others only see risk. So what does that have to do with risk management? First, ask yourself, as you look at your portfolio of customers, which ones you are “short-selling” and risk losing, and which customers you are investing in and expecting Buffett-like returns from in the future. Second, ask yourself how you are making that “investment” decision on your customers. And lastly, ask yourself how confident you are in that decision. If you’re not employing some mode of segmentation on your portfolio today, stop and make that happen as soon as you are done reading this blog. You know what a good customer looks like, or looked like once upon a time. Admit to yourself that not every customer looks as good as they did before 2008, and while you are not “settling,” be open-minded about who you would want as a customer in the future. Amazingly, Buffett did not have Bank of America CEO Brian Moynihan’s phone number when he wanted to make the deal. This is where you are head and shoulders above Garot’s Steak House’s favorite customer. You have deposit information, loan activity and performance history, credit data, and even the phone numbers of your customers. This gives you plenty of data and solutions to build that profile of what a good customer looks like, thereby knowing whom to invest in.
The next part is the hardest. How confident are you in your decision? Confident enough to put your money on it? For example, my wife invested in Bank of America the day before Warren put in his $5 billion. She saw some of the same signs in the bank that he did. However, the fact that I am writing this blog is an indicator that she clearly did not invest at the scale that Warren did. But what is stopping you from going all in and investing in your customers’ future? If the fundamentals of your customer segmentation are sound, any investment today in your customers will come back to you in loyalty and profits in the future. So, at the risk of conjuring up a mental image, take the last lesson from Warren Buffett’s tub-soaking investment process: get up and invest in those customers who are perhaps risky today yet sound tomorrow, or run the risk of future profits going down the drain.

Published: August 30, 2011 by Guest Contributor

By: Kari Michel The way medical debts are treated in scores may change with the introduction of the Medical Debt Responsibility Act in June 2011. The Medical Debt Responsibility Act would require the three national credit bureaus to expunge medical collection records of $2,500 or less from files within 45 days of their being paid or settled. The bill is co-sponsored by Representatives Heath Shuler (D-N.C.), Don Manzullo (R-Ill.) and Ralph M. Hall (R-Texas). As a general rule, expunging predictive information is not in the best interest of consumers or credit grantors -- both benefit when credit reports and scores are as accurate and predictive as possible. If any type of debt information proven to be predictive is expunged, consumers risk exposure to inappropriate credit products, as they may appear more financially equipped to handle new debt than they truly are. Medical debts are never taken into consideration by VantageScore® Solutions LLC if the debt is known to be reported by a medical facility. When a medical debt is outsourced to a third-party collection agency, it is treated the same as other debts in collection. Collection accounts of less than $250, or ones that have been settled, have less impact on a consumer’s VantageScore® credit score. With or without the medical debt collection information, the VantageScore® credit score model remains highly predictive.

Published: August 29, 2011 by Guest Contributor

By: Mike Horrocks The realities of the new economy and the credit crisis are driving businesses and financial institutions to better integrate new data and analytical techniques into operational decision systems. Adjusting credit risk processes in the wake of new regulations, while also increasing profits and customer loyalty, will require a new brand of decision management systems to accelerate more precise customer decisions. A Webinar scheduled for Thursday will show you how blending business rules, data and analytics inside a continuous-loop decisioning process can empower your organization to control marketing, acquisition and account management activities to minimize risk exposure while ensuring portfolio growth. Topics include:

· What the process is, and the key building blocks for operating one over time
· Why the process can improve customer decisions
· How analytical techniques can be embedded in the change control process (including data-driven strategy design or optimization)

If you're interested, there is still time to register for the Webinar. And if you just want to see a great video, check out this intro.

Published: August 24, 2011 by Guest Contributor

With the raising of the U.S. debt ceiling and its recent ramifications consuming the headlines over the past month, I began to wonder what would happen if the general credit consumer made a similar argument to their lender. Something along the lines of, “Can you please increase my credit line (although I am maxed out)? I promise to reduce my spending in the future!” While novel, probably not possible. In fact, just the opposite typically occurs when an individual borrows up to their personal “debt ceiling.” When the ratio of the credit an individual utilizes to the credit available to them rises above a certain percentage, it can adversely affect their credit score, in turn affecting their ability to secure additional credit. This percentage, known as the utilization rate, is one of several factors considered in an individual’s credit score calculation. For example, the utilization rate makes up approximately 23% of an individual’s calculated VantageScore® credit score. The good news is that consumers as a whole have been reducing their utilization rates on revolving credit products such as credit cards and home equity lines of credit (HELOCs) to the lowest levels in over two years. Bankcard and HELOC utilization are down to 20.3% and 49.8%, respectively, according to the Q2 2011 Experian – Oliver Wyman Market Intelligence Reports. In addition to lowering their utilization rates, consumers are also doing a better job of managing their current debt, resulting in multi-year lows for delinquency rates, as mentioned in my previous blog post. By lowering their utilization and delinquency rates, consumers are viewed as less of a credit risk and become more attractive to lenders for new products and increased credit limits. Perhaps the government could learn a lesson or two from today’s credit consumer.
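The utilization rate itself is simple arithmetic: balance in use divided by credit available. A quick sketch, using figures that mirror the bankcard average cited above:

```python
# Revolving utilization: share of available credit currently in use.

def utilization_rate(balance: float, credit_limit: float) -> float:
    """Percent of available revolving credit currently drawn."""
    return 100.0 * balance / credit_limit

# A cardholder carrying $2,030 against a $10,000 limit sits right at the
# 20.3% average bankcard utilization mentioned in the post.
rate = utilization_rate(2030, 10000)
print(f"{rate:.1f}%")   # 20.3%
```

Paying the balance down, or obtaining a higher limit, lowers this ratio, which is why both deleveraging and credit line increases can help a score.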

Published: August 23, 2011 by Alan Ikemura

Consumer credit card debt has dipped to levels not seen since 2006, and the memory of pre-recession spending habits continues to get hazier with each passing day. In May, revolving credit card balances totaled over $790 billion, down $180 billion from mid-2008 peak levels. Debit and prepaid volume accounted for 44%, or nearly half, of all plastic spending, growing substantially from 35% in 2005 and 23% a decade ago. Although month-to-month tracking suggests some noise in the trends, as illustrated by the slight uptick in credit card debt from April to May, the changes we are seeing are not at all temporary. What we are experiencing is a combination of many factors, including the aftermath of recession tightening, changes in the level of comfort with financing non-essential purchases, the “new boomer” population entering the workforce in greater numbers, and the diligent efforts of Gen Xers to improve the general household wallet composition. How do card issuers shift existing strategies? Baby boomers are entering that comfortable stage of life where incomes are higher and expenses begin to trail off as the last child is put through college and mortgage payments are predominantly applied toward principal. This group worries more about retirement investments and depressed home values, and as such, they demand high value for their spending. Rewards-based credit continues to resonate well with this group. Thirty years ago, baby boomers watched their parents use cash, money orders and teller checks to manage finances, but today’s population has access to many more options and is highly educated. As such, this group demands value for its business, and a constant review of competitive offerings and development of new, relevant rewards products are needed to sustain market share. The younger generation is focused on technology.
Debit and prepaid products accessible through mobile apps are more widely accepted by this group, unlike ten to fifteen years ago, when multiple credit cards with four-figure credit limits each were provided to college students on a large scale. Today’s new boomer is educated on the risks of using credit, while at the same time parents are apt to absorb more of their children’s monthly expenses. Servicing this segment's needs, while helping them establish a solid credit history, will result in long-term penetration of a growing segment. The recent CARD Act and subsequent amendments have taken a bite out of revenue previously used to offset the increased risk and related costs that allowed card issuers to service the near-prime sector. However, we are seeing a trend of new lenders getting into the credit card game while existing issuers slowly start to evaluate the next tier. After six quarters of consistent credit card delinquency declines, we are seeing slow signs of relief. The average VantageScore for new card originations increased by 8 points from the end of 2008 into early 2010, driven by credit tightening actions, and has started to come back down slowly in recent months. What next? What all of this means is that card issuers have to be more sophisticated with risk management and marketing practices. The ability to define segments through the use of alternate data sources and access channels is critical to the ongoing capture of market share and profitable usage. First, the segmentation will need to identify the “who” and the “what”: who wants which products, how much credit a consumer is eligible for, and what rate, terms and rewards structure will be required to achieve desired profit and risk levels, particularly as the economy continues to teeter between further downturn and, at best, slow growth.
By incorporating new modeling and data intelligence techniques, we are helping sophisticated lenders cherry-pick the non-super-prime prospects and offering guidance on aligning products that best balance risk and reward dynamics for each group. If done right, card issuers will continue to serve a diverse universe of segments and generate profitable growth.

Published: August 22, 2011 by Guest Contributor

As I’m sure you are aware, the Federal Financial Institutions Examination Council (FFIEC) recently released its "Supplement to Authentication in an Internet Banking Environment," guiding financial institutions to mitigate risk using a variety of processes and technologies as part of a multi-layered approach. In light of this updated mandate, businesses need to move beyond simple challenge-and-response questions to more complex out-of-wallet authentication. Additionally, those incorporating device identification should look to more sophisticated technologies well beyond traditional IP address verification alone. Recently, I contributed to an article on how these new guidelines might affect your institution. Check it out here, in full: http://ffiec.bankinfosecurity.com/articles.php?art_id=3932 For more on what the FFIEC guidelines mean to you, check out these resources, which also give you access to a recent Webinar.

Published: August 19, 2011 by Keir Breitenfeld

What happens when once-desirable models begin to show their age? Not the willowy, glamorous types that prowl high-fashion catwalks, but rather the aging scoring models you use to predict risk and rank-order various consumer segments. Keeping a fresh face on these models can return big dividends, in the form of lower risk, accurate scoring and higher-quality customers. In this post, we provide an overview of custom attributes and present the benefits of overlaying current scoring models with them. We also suggest specific steps communications companies can take to improve the results of an aging or underperforming model. The beauty of custom attributes Attributes are highly predictive variables derived from raw data. Custom attributes, like those you’ve created in house or obtained from third parties, can provide deeper insights into specific behaviors, characteristics and trends. Overlaying your scoring model with custom attributes can further optimize its performance and improve lift. Often, the older the model, the greater the potential for improvement. Seal it with a KS Identifying and integrating the most predictive attributes can add power to your overlay, including the ability to accurately rank-order consumers. Overlaying also increases the separation of “goods and bads” (referred to as “KS,” for the Kolmogorov–Smirnov statistic) for a model within a particular industry or sub-segment. Not surprisingly, the most predictive attributes vary greatly between industries and sub-segments, mainly due to behavioral differences among their populations. Getting started The first step in improving an underperforming model is choosing a data partner—one with proven expertise in multivariate statistical methods and models for the communications industry. Next, you’ll compile an unbiased sample of consumers, a reject inference sample and a list of attributes derived from sources you deem most appropriate.
Attributes are usually narrowed to 10 or fewer from the larger list, based on predictiveness. Predefined, custom or do-it-yourself Your list could include attributes your company has developed over time, or those obtained from other sources, such as Experian Premier AttributesSM (more than 800 predefined consumer-related choices) or Trend ViewSM attributes. Relationship, income/capacity, loan-to-value and other external data may also be overlaid. Attribute ToolboxTM Should you choose to design and create your own list of custom attributes, Experian’s Attribute ToolboxTM offers a platform for development and deployment of attributes from multiple sources (customer data or third-party data identified by you). Testing a rejuvenated model The revised model is tested on both your unbiased and reject inference samples to confirm and evaluate any additional lift induced by the newly overlaid attributes. After completing your analysis and due diligence, the attributes are installed into production. Initial testing, in a live environment, can be performed for three to twelve months, depending on the segment (prescreen, collections, fraud, non-pay, etc.), outcome or behavior your model seeks to predict. This measured, deliberate approach is considered more conservative, compared with turning new attributes on right away. Depending on the model’s purpose, improvements can be immediate or more tempered. However, the end result of overlaying attributes is usually better accuracy and performance. Make your model super again If your scoring model is starting to show its age, consider overlaying it with high-quality predefined or custom attributes. Because in communications, risk prevention is always in vogue. To learn more about improving your model, contact your Experian representative. To read other recent posts related to scoring, click here.
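The KS separation discussed in this post can be computed directly: it is the maximum gap between the cumulative score distributions of known "goods" and known "bads". A minimal sketch, with made-up scores for illustration:

```python
# Sketch: Kolmogorov-Smirnov (KS) separation between goods and bads.
# A higher KS means the score rank-orders the two groups more cleanly.

def ks_statistic(good_scores, bad_scores):
    """Max gap between the two empirical CDFs across all score cutoffs."""
    cutoffs = sorted(set(good_scores) | set(bad_scores))
    best = 0.0
    for c in cutoffs:
        cdf_good = sum(s <= c for s in good_scores) / len(good_scores)
        cdf_bad = sum(s <= c for s in bad_scores) / len(bad_scores)
        best = max(best, abs(cdf_bad - cdf_good))
    return best

# Bads cluster at low scores, goods at high scores -> strong separation.
ks = ks_statistic(good_scores=[680, 700, 720, 740],
                  bad_scores=[540, 560, 580, 600])
print(ks)   # 1.0: perfectly separated in this toy example
```

Overlaying predictive attributes aims to widen exactly this gap on your own unbiased and reject inference samples; in practice KS is well below 1.0 and is evaluated per segment.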

Published: August 19, 2011 by Guest Contributor

The following article was originally posted on August 15, 2011 by Mike Myers on the Experian Business Credit Blog. Last time we talked about how credit policies are like a plant grown from a seed. They need regular review and attention, just like the plants in your garden, to really bloom. A credit policy is simply a consistent guideline to follow when decisioning accounts, reviewing accounts, collecting and setting terms. Opening accounts is just the first step. Here are a few key items to consider in reviewing accounts: How many of your approved accounts are paying you late? What is their average days beyond terms? How much credit have they been extended? What attributes of these late-paying accounts can predict future payment behavior? I recently worked with a client to create an automated credit policy that consistently reviews accounts based on predictive credit attributes, public records and exception rules using the batch account review decisioning tools within BusinessIQ. The credit team now feels like they are proactively managing their accounts instead of just reacting to them. A solid credit policy focuses not only on opening accounts but also on regular account review, which can help you reduce your overall risk.
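An automated review rule of the kind described above can be sketched as a simple function over a few account attributes. The attribute names and thresholds here are hypothetical illustrations, not BusinessIQ's actual fields or logic:

```python
# Illustrative sketch of an automated account-review rule: payment
# behavior plus public records drive a consistent action. Thresholds
# are hypothetical and would come from your own credit policy.

def review_account(avg_days_beyond_terms: float,
                   pct_late_payments: float,
                   has_public_record: bool,
                   credit_extended: float) -> str:
    """Flag an account for action based on behavior and exposure."""
    if has_public_record or avg_days_beyond_terms > 30:
        return "reduce-terms"       # derogatory record or chronic slow pay
    if pct_late_payments > 0.25 and credit_extended > 50_000:
        return "manual-review"      # large exposure plus frequent lateness
    return "maintain"

print(review_account(45, 0.10, False, 10_000))   # reduce-terms
```

Running a rule like this in batch across the whole portfolio is what turns account review from a reactive chore into the proactive process the post describes.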

Published: August 18, 2011 by Guest Contributor

By: Staci Baker In my last post about the Dodd-Frank Act, I described the new regulatory bodies created by the Act. In this post, I will concentrate on how the Act will affect community banks. The Dodd-Frank Act comprises over 3,000 pages of proposed and final rules and regulations set forth by the Consumer Financial Protection Bureau (CFPB). For any bank, managing such a massive amount of regulation is a challenge, but for a mid-sized bank with fewer employees, it can be overwhelming. The Act has far-reaching unintended consequences for community banks. According to the American Bankers Association, five provisions are particularly troubling for community banks:

1. Risk retention
2. Higher capital requirements and narrower qualifications for capital
3. The SEC’s municipal advisors rule
4. Derivatives rules
5. Doubling the size of the Deposit Insurance Fund (DIF)

In order to meet the new regulatory requirements, community banks will need to hire additional compliance staff to review the new rules and regulations, as well as to ensure they are implemented on schedule. This means the additional cost of outside lawyers, which will affect the resources the bank has available for staff, and for its customers and the community. Community banks will also feel the burden of losing interchange fee income. Small banks are exempt from the new rules; however, the market will follow the lowest-priced product, which will mean another loss of revenue for the banks. As you can see, community banks will be greatly affected by the Dodd-Frank Act. The increased regulation will mean a loss of revenue, increased oversight, additional outside staffing (fewer internal resources) and new reporting requirements. If you are a community bank, how do you plan on overcoming some of these obstacles?

Published: August 15, 2011 by Guest Contributor

It’s time to focus on growth again. In 2010, credit marketers focused on testing new acquisition strategies. In 2011, credit marketers are implementing learnings from those tests. As consumer lending becomes more competitive, lenders are strategically implementing procedures to grow portfolios by expanding their marketable universe. The new universe of prospective customers is moving steadily beyond prime to a variety of near-prime segments outside of the marketing spectrum that lenders have targeted for the past three years. Many credit marketers have moved beyond testing based on new regulatory requirements and have started to market to slightly riskier populations. From testing lower-scoring segments to identifying strategies for unbanked/underbanked consumers, the breadth of methods that lenders are using to acquire new accounts has expanded. Portfolio growth strategies encompass internal process enhancements, product diversification, and precise underwriting and account management techniques that utilize new data assets and analytics to mitigate risk and identify the most profitable target populations. Experian® can help you identify best practices for growth and develop customized strategies that best suit your acquisition objectives. Whether your needs include internal methods to expand your marketable universe (i.e., marketing outside of your current footprint or offers to multiple individuals in a household) or changes to policies for external expansion strategies (i.e., near-prime market sizing or targeting new prospects based on triggered events), Experian has the expertise to help you achieve desired results. For more information on the new acquisition strategies and expanding your marketing universe, leave a comment below or call 1 888 414 1120.

Published: August 9, 2011 by Guest Contributor

By: John Straka Unsurprisingly, Washington deficit hawks have been eyeing the “sacred cows” of tax preferences for homeownership for some time now. Policymakers might even unwind or eliminate the mortgage interest deductions and capital-gains exemptions on home appreciation that have been in place in the U.S. for many decades. There is an economic case to be made for doing this—more efficient allocation of capital, other countries have high ownership rates without such tax preferences, and so on. But if you call or email or tweet Congress, and you choose this subject, my advice is to tell them that they should wait until it’s 2005. In other words, now—or even the next few years, most likely—is definitely not a good time to eliminate these housing tax preferences. We need to wait until it’s something like “2005”—when housing markets are much stronger again (hopefully) and state and local government finances are far from their present dire straits. If we don’t do this right, and insist on making big changes here now, then housing will take an immediate hit, and so will employment from both the housing sector and state and local governments (with further state and local service cutbacks, too, due to budget shortfalls). The reason for this, of course, is that most homeowners today have not really benefited much, and won’t, from those well-established tax preferences. Why not? Because these preferences have been in place for so long that their economic value (the expected present discounted value of the tax savings) was long ago baked into the level of home prices that most homeowners paid when they bought their homes. Take the preferences away now, and the value of homes will immediately drop, and therefore so will the property tax revenues collected by local governments across the U.S. This strategy would thus further bash the state-and-local sector in order to plump up (we hope) federal tax revenues by the value of the tax preferences.
Housing will become a further drag on economic growth, and so will the resulting employment losses from both construction and local government services. As a result, it’s possible that on net the federal government may actually lose revenue from making this kind of change at precisely the wrong time. It may very well never be quite like “2005” again. But waiting for greater housing and local government strength to change long-standing housing tax preferences should make the macroeconomic impact smaller, less visible, and more easily absorbed.
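The capitalization argument above rests on a present-discounted-value calculation: a long-lived annual tax saving is worth a lump sum up front, and that lump sum gets built into the price a buyer is willing to pay. A sketch with purely illustrative numbers (not an estimate of the actual deduction's worth):

```python
# Sketch: present value of a level annual tax saving. If buyers expect
# the saving to persist, roughly this amount is capitalized into the
# home's price, which is why repeal would hit prices immediately.

def present_value(annual_saving: float, discount_rate: float, years: int) -> float:
    """PV of a level annual saving received at the end of each year."""
    return sum(annual_saving / (1 + discount_rate) ** t
               for t in range(1, years + 1))

# A hypothetical $3,000/yr deduction benefit, discounted at 5% over 30
# years, is worth roughly $46,000 up front.
pv = present_value(3000, 0.05, 30)
print(round(pv))
```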

Published: August 9, 2011 by Guest Contributor

The high-profile data breaches of recent months not only left millions of consumers vulnerable to the threat of identity theft and caused businesses to incur significant costs, but also brought data security to the top of the agenda in Washington. In Congress, members of both the House and the Senate have used the recent data breaches to demonstrate the need for a uniform national data breach notification standard and increased data security standards for companies that collect consumer information. Hearings have been held on the issue, and it is expected that legislation will be introduced this summer. At the same time, the Obama Administration continues to call for greater data security standards. The White House released its highly anticipated cybersecurity initiative in May. In addition to implementing a national data breach notification law, the proposal would require certain private companies to develop detailed plans to safeguard consumer data. As legislation develops and advances through multiple Congressional committees, Experian will be working with allies and coalitions to ensure that the data security standards established under the Gramm-Leach-Bliley Act and the Fair Credit Reporting Act are not superseded by new, onerous and potentially ineffective mandates. We welcome your questions and comments below.

Published: August 4, 2011 by Guest Contributor
