
Many compliance regulations, such as the Red Flags Rule, the USA PATRIOT Act, and ESIGN, require specific identity elements to be verified and specific high-risk conditions to be detected. However, there is still much variance in how individual institutions reconcile referrals generated from the detection of high-risk conditions and/or the absence of identity element verification. With this in mind, risk-based authentication (defined in this context as the “holistic assessment of a consumer and transaction with the end goal of applying the right authentication and decisioning treatment at the right time”) offers institutions a viable strategy for balancing the following competing pressures: compliance – the need to ensure each transaction is approved only when compliance requirements are met; approval rates – the need to meet business goals in booking new accounts and facilitating existing account transactions; and risk mitigation – the need to minimize fraud exposure at the account and transaction level. A flexibly designed risk-based authentication strategy incorporates a robust breadth of data assets, detailed results, granular information, targeted analytics and automated decisioning. This allows an institution to strike a balance (or at least something close to it) among remaining compliant, approving the vast majority of applications or customer transactions, and minimizing fraud and credit risk exposure. Sole reliance on a binary assessment of the presence or absence of high-risk conditions and identity element verifications will, more often than not, create an operational process that is overburdened by manual referral queues, and it will leave an unnecessary proportion of viable consumers unable to be serviced by your business.
Use of analytically sound risk assessments and objective and consistent decisioning strategies will provide opportunities to calibrate your process to meet today’s pressures and adjust to tomorrow’s as well.

Published: January 10, 2011 by Keir Breitenfeld

When we think about fraud prevention, naturally we think about minimizing fraud at application. We want to ensure that the identity used on an application truly belongs to the person applying for credit and is not a stolen identity. But the reality is that some fraudsters do successfully get through the defenses at application. In fact, according to Javelin’s 2011 Identity Fraud Survey Report, 2.5 million accounts were opened fraudulently using stolen identities in 2010, costing lenders and consumers $17 billion. And these numbers do not even include other existing account fraud like account takeover and impersonation (limited misuse of an account, such as credit/debit cards and balance transfers). This type of existing account fraud affected 5.5 million accounts in 2010, costing another $20 billion. So although it may seem like a no-brainer, it’s worth emphasizing that we need to continue to detect fraud for both new and established accounts. Existing account fraud is unlikely to go away any time soon. Lending activities have changed significantly in the last couple of years. Origination volume in 2010 was still less than half of the 2008 volume, and booked accounts have become riskier. In this type of environment, when regular consumers are having a hard time getting new credit, fraudsters are having a hard time getting credit too. So naturally they will switch their focus to something more profitable, like account takeover. Does your organization have the appropriate tools and decisioning strategy to fight existing account fraud?

Published: January 10, 2011 by Matt Ehrlich

Cell phone use on the rise

A Wikipedia list of cell phone usage by country showed that, as of December 2009, the U.S. had nearly 286 million cell phones in use. In parallel, a recent National Center for Health Statistics study found that one in every seven homes surveyed received all or almost all of their calls on cell phones, even though they had a landline. The study further indicated that one in four homes in the U.S. relied solely on cell phones, with no landline at all, during the last half of 2009, and the number of households in this category has since increased 1.8 percent.

Implications for communications companies

The increasing use of cell phones, coupled with the decreasing use of landlines, raises some very important concerns for communications companies:

The physical address on file may not be accurate, since consumers can keep the same number as they jump providers. The increased use of pre-paid cell phones compounds the growing issue that contact numbers are not a consistent means of reaching the consumer. These two issues make locating cell phone-only customers for purposes of cross-selling and/or collections an enormous challenge. It would certainly make everyone’s job easier if cell phone providers were willing to share their customer data with a directory assistance provider. The problem is that doing so exposes them to attacks from their competition, and since provider churn rate concerns are at an all-time high, can you really blame them?

Identifying potentially risky customers among cell phone-only consumers becomes more difficult. Perfectly good customers may no longer use a landline.

From a marketing point of view, calling cell phones for a sales pitch is not allowed. How, then, do you reach your prospects?

What concerns you?

Certainly, this list is by no means complete. The concerns above warrant further discussion in future blog posts.
I want to know what concerns you most when it comes to the rise in cell phone-only consumers. This feedback will allow me to gear future posts to better address your concerns.

Published: January 10, 2011 by Guest Contributor

By: Andrew Gulledge

One of the quickest and easiest ways to reduce fraud in your portfolio is to incorporate question weighting into your out-of-wallet question strategy. To continue using knowledge based authentication without question weighting is to assign a point value of 100 points to each question. This is somewhat arbitrary (and a bit sloppy) when we know that certain questions consistently perform better than others. So if fraudsters get 3 easier questions right and 1 harder question wrong, they will have an easier time passing your authentication process without question weighting. If, on the other hand, you adopt question weighting as part of your overall risk based authentication approach, that same fraudster would score much worse on the same KBA session. The 1 question they got wrong would have cost them a lot of points, and the 3 easier questions they got right wouldn’t have given them as many points. Question weighting based on known fraud trends is more punitive for the fraudsters. Let’s say the easier questions were worth 50 points each, and the harder question was worth 150 points. Without question weighting, the fraudster would have scored 75% (300 out of 400 points). With question weighting, the fraudster would have scored 50% (150 out of 300 points). Your decisioning strategy might well have failed him with a score of 50, but passed him with a score of 75. Question weighting will often kick the fraudsters into the fail regions of your decisioning strategy, which is exactly what risk based authentication is all about. Consult with your fraud account management representative to see if you are making the most out of your KBA experience with the intelligent use of question weighting. It is a no-brainer way to improve your overall fraud prevention, even if you keep your overall pass rate the same. Question weighting is an easy way to squeeze more value out of your knowledge based authentication tool.
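The arithmetic above is easy to sketch in a few lines of Python. The function name and data layout below are mine, and the weights (50/50/50/150) come straight from the example in the post – this is an illustrative sketch, not any vendor's actual scoring model.

```python
# Minimal sketch of KBA question weighting (illustrative only; the
# weights come from the example above, not from a production model).

def kba_score(responses):
    """responses: list of (weight, answered_correctly) pairs.
    Returns the session score as a percentage of possible points."""
    earned = sum(weight for weight, correct in responses if correct)
    possible = sum(weight for weight, _ in responses)
    return 100 * earned / possible

# The fraudster above: 3 easier questions right, 1 harder question wrong.
unweighted = kba_score([(100, True), (100, True), (100, True), (100, False)])
weighted = kba_score([(50, True), (50, True), (50, True), (150, False)])

print(unweighted)  # 75.0 -- would pass a cutoff of, say, 60
print(weighted)    # 50.0 -- fails that same cutoff
```

The same four answers produce a failing score once the harder question carries more of the weight, which is the whole point of weighting questions by known fraud trends.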

Published: October 20, 2010 by Guest Contributor

By: Andrew Gulledge

The intelligent use of question weighting in KBA should be a no-brainer for anyone using out-of-wallet questions. Here’s the deal: some authentication questions consistently give fraudsters a harder time than other questions. Why not capitalize on that knowledge? Question weighting is where each question type has a certain number of points associated with it. So a question that fraudsters have an easier time with might be worth only 50 points, while a question that fraudsters often struggle with might be worth 150 points. The KBA score ends up being the total points correct divided by the total possible points. The point is to make the entire KBA session more punitive for the bad guys. Fraud analytics are absolutely essential to the intelligent use of question weighting. While fraud prevention vendors should have recommended question weights as part of their fraud best practices, if you can provide us with as many examples as possible of known fraud that went through the out-of-wallet questions, we can refine the best practice question weighting model to work better for your specific population. Even if we keep your pass rate the same, we can lower your fraud rate. On the other hand, we can raise your pass rate while keeping the fraud rate consistent. So whether your aim is to reduce your false positive rate (i.e., pass more of the good consumers), to reduce your fraud rate (i.e., fail more of the fraudsters), or some combination of the two, question weighting will help you get there.

Published: October 19, 2010 by Guest Contributor

Quite a scary new (although in some ways old) form of identity theft has been in the headlines recently. Here’s a link to the article, which talks about how children’s dormant Social Security numbers are being found and sold by companies online under the guise of CPNs – aka credit profile numbers or credit protection numbers. Using deceased, “found”, or otherwise illicitly obtained Social Security numbers is not something new. Experian’s identity verification tools – like any good ones – check against the Social Security Administration’s list of numbers reported as deceased and verify that the submitted number is in an SSA valid issue range. But the two things I find most troubling here are: One, the sellers have found a way around the law by not calling them Social Security numbers and calling them CPNs instead. That seems ludicrous! But, in fact, the article goes on to state that “Because the numbers exist in a legal gray area, federal investigators have not figured out a way to prosecute the people involved.” Two, because of the anonymity and the ability to quickly set up and abandon “shop,” the online marketplace is the perfect venue for both buyer and seller to connect with minimal risk of being caught. What can we as consumers and businesses take away from this? As consumers, we’re reminded to be ever vigilant about the disclosure of not only OUR Social Security numbers but those of our family members as well. For businesses, it’s a reminder to take advantage of additional identity verification and fraud prediction tools, such as Experian’s Precise ID, Knowledge IQ, and BizID, when making credit decisions or opening accounts, rather than relying solely on consumer credit scores.

Published: September 10, 2010 by Matt Ehrlich

Working with clients in the financial sector means keeping an eye toward compliance and regulations like the Gramm-Leach-Bliley Act (GLB), the Fair Credit Reporting Act (FCRA) and the Fair and Accurate Credit Transactions Act (FACTA). It doesn’t really matter what kind of product it is: if a client is a financial institution (FI) of some kind, one of these three pieces of legislation is probably going to apply. The good part is, these clients know it and typically have staff dedicated to these functions. In my experience, where most clients need help is in understanding which regulations apply or what might be allowed under each. The truth is, a product designed to minimize fraud, like knowledge based authentication, will function the same whether using FCRA-regulated or non-FCRA-regulated data. The differences will be in the fraud models used with the product, the decisioning strategies set up, the questions asked and the data sources of those questions. Under GLB it is acceptable to use fraud analytics for detection purposes, as fraud detection is an approved GLB exception. However, under FCRA rules, fraud detection is not a recognized permissible purpose (for accessing a consumer’s data). Instead, written instructions (of the consumer) may be used as the permissible purpose, or another permissible purpose permitted under FCRA, such as legitimate business need due to risk of financial loss. Fraud best practices dictate engaging with clients, and their compliance teams, to ensure the correct product has been selected based on client fraud trends and client needs. A risk based authentication approach, using all available data and appropriately decisioning on that data, whether or not it includes out-of-wallet questions, provides the most efficient management of risk for clients and the best experience for consumers.

Published: September 10, 2010 by Guest Contributor

By: Tom Hannagan An article in American Banker* today discusses how many community banks are now discouraging new deposit gathering. We have seen many headlines in the past couple of years about how banks are not lending. Loan origination has been trending downward for many months. Now, they aren’t seeking deposits either. You would think this is the ultimate way to lower risk, but that’s not necessarily so. There are many different reasons why banks have or may be reducing their balance sheets. Tighter credit standards and relatively low loan demand are chief among them. This is largely a reaction, on the part of banks and borrowers, to the economic contraction and painfully slow recovery. The softness in real estate is still a large overhanging problem – for consumers, businesses, governments and the banks. Banks are still working on loss provisioning in an attempt to deal with the embedded credit risk from the last recession. Even though they may be shrinking, or very slowly growing, their loan portfolio, all of the forward risk management considerations are still there. That is true for the lending business and for managing the overall balance sheet. Most apparent among all these considerations is that the entire existing loan portfolio is steadily coming up for renewal consideration. That is as much of an opportunity for reconsidering a loan’s risk and return characteristics as is considering a new loan. It is also an opportunity to review the relationship management strategy, including the value of other relationship services or the time to sell new services to that client. All these sales situations involve risk and return considerations. Not least among them are the deposit services – existing and potential – associated with the relationship. The main point in the American Banker article was that banks can have trouble putting new deposit funds to work profitably. That makes sense. Deposits involve operating risk and operating costs.
The costs include both fixed and variable costs. There are four or five major types of deposits. Each of them has a very different operating cost profile, balance behavior and level of interest expense. They also involve market risk, in that their loyalty or likely duration varies. So, it is important to take both the risk and return factors of new/renewed loans into account AND the risk and return factors of new/existing deposit balances into account as part of ongoing relationship management – and the bank’s resulting balance sheet direction. This is a lot to consider. A good risk-based profitability regimen is as critical as ever. *American Banker, Tuesday, July 27, 2010, “In Cash Glut, Banks Try to Discourage New Deposits,” by Paul Davis.

Published: July 28, 2010 by Guest Contributor

US interest rates are at historically low levels, and while many Americans are taking advantage of the low rates and refinancing their mortgages, a great many more are struggling to find jobs and are unable to take advantage of the rate-friendly lending environment. This market, however, continues to be complex as lenders try to competitively price products while balancing dynamic consumer risk levels and multiple product options and minimizing the cost of acquisition. As a result, lenders need to implement advanced risk-based pricing strategies that balance the uncertain risk profiles of consumers while closely monitoring long-term profitability, as re-pricing may not be an option given recent regulatory guidelines. Risk-based pricing has been a hot topic recently with the Credit Card Act and the Risk-Based Pricing Rule regulation and its pending deadline. For lenders who have not performed a new applicant scorecard validation or detailed portfolio analysis in the last few years, now is the time to review pricing strategies and portfolio mix. This analysis will aid in maintaining an acceptable risk level as the portfolio evolves with new consumers and risk tiers, while ensuring short- and long-term profitability and ongoing regulatory compliance. At its core, risk-based pricing is a methodology used to determine what interest rate should be charged to a consumer based on the inherent risk and profitability present within a defined pricing tier. By utilizing risk-based pricing, organizations can ensure the overall portfolio is profitable while providing competitive rates to each unique portfolio segment. Consistent review and strategy modification are crucial to success in today’s lending environment. Competition for the lowest-risk consumers will continue to increase as qualified candidate pools shrink given the slow economic recovery.
By reviewing your portfolio on a regular basis and monitoring portfolio pricing strategies closely an organization can achieve portfolio growth and revenue objectives while monitoring population stability, portfolio performance and future losses.
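As a concrete, deliberately simplified sketch of the tier-based methodology described above, the snippet below maps a credit score to a rate through pricing tiers. The tier boundaries and APRs are invented for illustration only; real tiers come out of scorecard validations and profitability analysis, not a hard-coded table.

```python
# Hypothetical risk-based pricing lookup. Tier floors and APRs are
# invented for illustration; they are not actual lender pricing.
PRICING_TIERS = [
    (740, 0.049),  # super prime
    (680, 0.064),  # prime
    (620, 0.089),  # near prime
    (300, 0.129),  # subprime
]

def rate_for_score(score):
    """Return the APR of the best (highest) tier the score qualifies for."""
    for floor, apr in PRICING_TIERS:
        if score >= floor:
            return apr
    raise ValueError("score below the lowest pricing tier")

print(rate_for_score(755))  # 0.049
print(rate_for_score(650))  # 0.089
```

The point of the structure is the one made in the post: each segment is priced to its own risk, so the low-risk tiers stay competitive while the portfolio as a whole remains profitable.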

Published: July 10, 2010 by Guest Contributor

In case you’ve never heard of it, a Babel fish is a small translator that allows its carrier to understand anything said in any language. Alta Vista popularized the name, but I believe Douglas Adams, author of The Hitchhiker’s Guide to the Galaxy, should be given credit for coining the term. So, what does a Babel fish have to do with Knowledge Based Authentication? Knowledge Based Authentication is always about the data – I have said this before. There is one universal truth: data doesn’t lie. However, that doesn’t mean it is easy to understand what the data is saying. It is a bit like a foreign language. You may have taken classes, and you can read the language or carry on a passable conversation, but that doesn’t mean it’s a good idea to enter into a contract – at least, not without an attorney who speaks the language, or your very own Babel fish. Setting up the best Knowledge Based Authentication configuration for risk management of your line of business can sometimes seem like that contract in a foreign language. There are many decisions to be made, and the number of questions to present and which questions to ask is often the easy part. To truly get the most out of fraud models, it is necessary to consider where the score cuts used with your Knowledge Based Authentication session will be set and what methodology will be used to invoke the Knowledge Based Authentication session: objective score performance, manual review and decision, etc. It is also important to consider the kind of fraud you might be seeing. This is where it is helpful to have your very own Babel fish – one designed specifically for fraud trends, fraud data, fraud models and Knowledge Based Authentication. If your vendor doesn’t offer you a Babel fish, ask for one.
Yours could have one of many titles, but you will know this person when you speak with them, for their level of understanding of not only your business but, more importantly, your data and what it means.  Sometimes the Babel fish will work in Consulting, sometimes in Product Management, sometimes in Analytics – the important thing is that there are fraud-specific experts available to you. Think about that for a minute.  Business today is a delicate balance between customer experience/relationship management and risk management.  If your vendor can’t offer you a Babel fish, tell them you have fish to fry – elsewhere.  

Published: June 10, 2010 by Guest Contributor

Well, here we are about two weeks from the Federal Trade Commission’s June 1, 2010 Red Flags Rule enforcement date. While this date has been a bit of a moving target for the past year or so, I believe this one will stick. It appears that the new reality is one in which individual trade associations and advocacy groups will, one by one, seek relief from enforcement and related penalties post-June 1. Here’s why I say that: The American Bar Association has already filed suit against the FTC, and in October 2009, the U.S. District Court for the District of Columbia ruled that the Red Flags Rule is not applicable to attorneys engaged in the practice of law. While an appeal of this case is still pending, in mid-March the U.S. District Court for the District of Columbia issued another order declaring that the FTC should postpone enforcement of the Red Flags Rule “with respect to members of the American Institute of Certified Public Accountants” engaged in practice for 90 days after the U.S. Court of Appeals for the District of Columbia renders an opinion in the American Bar Association’s case against the FTC. Slippery slope here. Is this what we can expect for the foreseeable future? A rather ambiguous guideline that leaves openings for specific categories of “covered entities” to seek exemption? The seemingly innocuous element of the definition of “creditor” that includes “businesses or organizations that regularly defer payment for goods or services or provide goods or services and bill customers later” is causing havoc among peripheral industries like healthcare and other professional services. Those of you in banking are locked in for sure, but it ought to be an interesting year as the outliers fight to make sense of it all while they figure out what their identity theft prevention programs should or shouldn’t be.

Published: May 13, 2010 by Keir Breitenfeld

A common request for information we receive pertains to shifts in credit score trends. While broader changes in consumer migration are well documented – increases in foreclosure and default have negatively impacted consumer scores for a group of consumers – little analysis exists on the more granular changes between the score tiers. For this blog, I conducted a brief analysis of consumers who held at least one mortgage and viewed the changes in their score tier distributions over the past three years to see if there was more that could be learned from a closer look. The findings were quite interesting. As you can see in the chart below, the shifts within different VantageScore® credit score tiers show two major phases. First, the changes from 2007 to 2008 reflect the decline in the number of consumers in VantageScore® credit score tiers B, C, and D, and the increase in the number of consumers in VantageScore® credit score tier F. This is consistent with the housing crisis and economic issues at that time. Also notable at this time is the increase in the VantageScore® credit score tier A proportion. Loan origination trends show that lenders continued to supply credit to these consumers in this period, and the number of consumers considered ‘super prime’ grew. The second phase occurs between 2008 and 2010, where there is a period of stabilization for many of the middle-tier consumers, but a dramatic decline in the number of previously growing super-prime consumers. The chart shows the decline in the proportion of this high-scoring tier and the resulting growth of the next highest tier, which inherited many of the downward-shifting consumers. I find this analysis intriguing since it highlights the recent patterns within the super-prime and prime consumer segments and adds some new perspective to the management of risk across the score ranges, not just the problematic subprime population that has garnered so much attention.
As for the true causes of this change – is unemployment to blame, or declining housing prices? Obviously, a deeper study into the changes at the top of the score range is necessary to assess the true credit risk, but what is clear is that changes are not consistent across the score spectrum, and further analyses must consider the uniqueness of each consumer.

Published: April 27, 2010 by Kelly Kent

By: Wendy Greenawalt

Optimization has become somewhat of a buzzword lately, used to solve all sorts of problems. This got me thinking about what optimizing decisions really means to me. In pondering the question, I decided to start at the beginning and think about what optimization stands for. For me, it is an unbiased mathematical way to determine the most advantageous solution to a problem given all the options and variables. At its simplest, optimization is a tool that synthesizes data and can be applied to everyday problems, such as determining the best route to take when running errands. Everyone is pressed for time these days, and finding a few extra minutes, or a few extra dollars left in our bank account at the end of the month, is appealing. The first step in determining my ideal route was to identify the different route options, including toll roads, and to factor in the total miles driven, travel time and cost associated with each option. In addition, I incorporated limitations such as required stops, avoid Main Street, don’t visit the grocery store before lunch, and must be back home as quickly as possible. Optimization is a way to take all of these limitations and objectives and simultaneously compare all possible combinations and outcomes to determine the ideal option to maximize a goal, which in this case was to be home as quickly as possible. While this is by its nature a very simple example, optimizing decisions can be applied at home and in business in very imaginative and effective ways. Business is catching on, and optimization is finding its way into more and more businesses to save time and money, which provides a competitive advantage. I encourage all of you to think about optimization in a new way and explore the opportunities where it can be applied to provide improvements over business-as-usual, as well as to improve your quality of life.
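The errand-route example above can be written as a tiny brute-force optimization: enumerate every ordering of the stops, discard the orderings that violate a constraint, and keep the fastest. The stops, travel times, and constraints below are invented stand-ins for the ones in the post.

```python
from itertools import permutations

# Invented, symmetric travel times in minutes between locations.
TRAVEL = {
    ("home", "bank"): 10, ("home", "grocery"): 15, ("home", "post"): 8,
    ("bank", "grocery"): 12, ("bank", "post"): 7, ("grocery", "post"): 9,
}

def minutes(a, b):
    """Look up the travel time in either direction."""
    return TRAVEL.get((a, b)) or TRAVEL[(b, a)]

def best_route(stops):
    """Exhaustively compare every ordering; return (route, total minutes)."""
    best, best_time = None, float("inf")
    for order in permutations(stops):
        if order[0] == "grocery":   # constraint: no grocery store first
            continue
        route = ("home",) + order + ("home",)  # must end back at home
        total = sum(minutes(a, b) for a, b in zip(route, route[1:]))
        if total < best_time:
            best, best_time = route, total
    return best, best_time

route, total = best_route(["bank", "grocery", "post"])
print(route, total)
```

Real decisioning optimization works the same way in spirit, but uses mathematical programming rather than exhaustive enumeration, since business problems have far too many combinations to list one by one.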

Published: April 20, 2010 by Guest Contributor

Recently, the Commerce Department reported that consumer spending levels continued to rise in February, increasing for the fifth straight month*, while flat income levels drove savings levels lower. At the same time, media outlets such as Fox Business reported that the consumer “shopping cart”** showed price increases for the fourth straight month. Somewhat in opposition to this market trend, the Q4 2009 Experian-Oliver Wyman Market Intelligence Reports reveal that the average level of credit card debt per consumer decreased overall, with an increase in only one score band. In the Q4 reports, the score band that demonstrated balance increases was VantageScore® credit score A – the super prime consumer – whose average balance went up $30 to $1,739. In this time of economic challenge and pressure on household incomes, it’s interesting to see that the lower-scoring consumers display the characteristics of improved credit management and deleveraging, while at the same time consumers with credit scores in the low-risk tiers may be showing signs of increased expenses and deteriorated savings. Recent delinquency trends support that low-risk consumers are deteriorating in performance for some product vintages. Even more interestingly, Chris Low, Chief Economist at FTN Financial in New York, was quoted as saying, “I guess the big takeaway is that consumers are comfortably consuming again. We have positive numbers five months in a row since October, which I guess is a good sign.” I suggest that more analysis needs to be applied to the details of these figures to determine whether consumers really are ‘comfortable’ with their spending, or whether this is just a broad assumption that is masking the uncomfortable realities that lie within.

Published: April 8, 2010 by Kelly Kent

In the past few days I’ve read several articles discussing how lenders are taking various actions to reduce their exposure to toxic mortgages – some, like Bank of America, are launching new principal reduction programs.* Others (including Bank of America) are using existing incentive programs to fast-track the approval of short sales to stem their losses and place existing real-estate assets with stronger borrowers. Given the range of options available to lenders, there are significant decisions to make regarding the creditworthiness of existing consumers and which treatment strategies are best for each borrower; these decisions are important for assessing credit risk, loan origination strategies, and loan pricing and profitability. Experian analysis has uncovered the attributes of borrowers with various borrowing behaviors – strategic defaulters, cash-flow managers, and distressed borrowers – each of whom requires a unique treatment strategy. The value of credit attributes and predictive risk scores, like Experian Premier Attributes and the VantageScore® credit score, has never been higher to lenders. Firms like Bank of America are relying on credit delinquency attributes to segment eligible borrowers for their programs, and should also consider that more extensive use of attributes can further sub-segment their clients based on the total consumer credit profile. Consumers who are late on mortgage payments, yet current on other loans, may be likely to re-default, whereas some consumers may merely need financial planning advice and enhanced money management skills. As lenders develop new methods to manage portfolio risk and deal with toxic assets in their portfolios, they should also continue to seek new and innovative analytics, including optimization, to make the best decisions for their customers and their business. * LA Times, March 25, 2010, ‘Bank of America to reduce mortgage principal for some borrowers’

Published: April 2, 2010 by Kelly Kent
