As we prepare to attend next week's FS-ISAC & BITS Summit, we know the financial services industry is abuzz about massive losses from ever-evolving attack vectors, including DDoS, malware, data breaches and synthetic identities. Chief among them is the recent $200 million (and counting) in losses tied to a sophisticated card fraud scheme involving thousands of fraudulent applications submitted over several years using synthetic identities. While the massive scale and effectiveness of the attack seem to suggest a novel approach or a gap in existing fraud prevention controls, the fact is that many of the perpetrators could have been detected at account opening, long before they had an opportunity to cause financial losses.

Synthetic identities have been a headache for financial institutions for years, but only recently have criminal rings begun to exploit this attack vector at such a large scale. The greatest challenge with synthetic identities is that traditional account opening processes focus on identity verification compliance with the USA PATRIOT Act and FACT Act Red Flags guidance, risk management using credit bureau scores, and fraud detection using known fraudulent data points. A synthetic identity ring simply sidesteps those controls by using new false identities created with data that may appear legitimate, belong to individuals with no established credit history, or be slightly manipulated elements of data from individuals with excellent credit scores. The goal is to avoid detection by "blending in" with the thousands of credit card, bank account, and loan applications submitted each day where individuals do not have a credit history, where minor typos cause identity verification false positives, or where addresses and other personal data do not align with credit reports.
Small business accounts are an even easier target, as third-party data sources to verify their authenticity are sparse, even though the financial stakes are higher with large lines of credit, multiple signers, and complex (sometimes international) transactions. Detecting these tactics is nearly impossible in a channel where anonymity is king – and many rings have become experts at gaming the system, especially as institutions continue to migrate the bulk of their originations to the online channel and the account opening process becomes increasingly faceless. While the solutions described above play a critical role in meeting compliance and risk management objectives, they often fall short when it comes to detecting synthetic identities. Identity verification vendors were quick to point the finger at lapses in financial institutions' internal and third-party behavioral and transactional monitoring solutions when the recent $200 million attack hit the headlines, but these same providers' failure to deploy device intelligence alongside traditional controls likely allowed the fraudulent accounts to be opened in the first place. With synthetic identities, elements of legitimate, creditworthy consumers are often paired with invalid or fictitious applicant data, so fraud investigators cannot rely on simply verifying data against a credit report or public data source. In many cases, the device used to submit an application may be the only common element that links otherwise seemingly unrelated applications. Several financial institutions have already demonstrated success in leveraging device intelligence along with a powerful risk engine and integrated link analysis tools to pinpoint these complex attacks. In fact, one example alone spanned hundreds of applications and represented millions of dollars in fraud savings at a top bank.
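The linking step that device intelligence enables can be sketched in a few lines. The snippet below is a purely illustrative toy, not any vendor's product: the field names (`app_id`, `applicant_name`, `device_id`) are hypothetical, and a real device fingerprint aggregates dozens of hardware and browser signals rather than a single ID. It shows the core idea: one device submitting applications under multiple identities is a classic synthetic-ring signal.

```python
from collections import defaultdict

def link_applications_by_device(applications):
    """Group account applications by device fingerprint and flag
    devices tied to more than one distinct applicant identity.

    Field names here are illustrative; a production system would
    use a richer device-intelligence fingerprint.
    """
    by_device = defaultdict(list)
    for app in applications:
        by_device[app["device_id"]].append(app)

    suspicious = {}
    for device_id, apps in by_device.items():
        names = {a["applicant_name"] for a in apps}
        if len(names) > 1:  # one device, many "identities"
            suspicious[device_id] = sorted(names)
    return suspicious

apps = [
    {"app_id": 1, "applicant_name": "A. Smith", "device_id": "dev-42"},
    {"app_id": 2, "applicant_name": "B. Jones", "device_id": "dev-42"},
    {"app_id": 3, "applicant_name": "C. Lee",   "device_id": "dev-07"},
]
print(link_applications_by_device(apps))
# {'dev-42': ['A. Smith', 'B. Jones']}
```

Because no single application looks anomalous on its own, it is the cross-application link (the shared device) that surfaces the ring – which is why this check belongs at account opening rather than in post-booking transaction monitoring.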
The recent synthetic ring comprising over 7,000 false identities and 25,000 fraudulent cards may be an extreme example of the potential scope of this problem; however, the attack vector will only continue to grow until device intelligence becomes an integrated component of all online account opening decisions across the industry. Even though most institutions are satisfying Red Flags guidance, organizations failing to institute advanced account opening controls such as complex device intelligence can expect to see more attacks and will likely struggle with higher monetary losses from accounts that never should have been booked.
Findings from Experian's latest State of the Automotive Finance Market analysis showed the average loan term for a new vehicle jumped to an all-time high of 65 months in Q4 2012, up from 63 months in Q4 2011. More consumers are also opting for leases, with the lease share of new auto financing increasing to 24.79 percent, up from 10.45 percent in Q4 2011.
Outsourcing can be risky business. The Ponemon Institute reports that 65 percent of companies that outsourced work to a vendor have had a data breach involving consumer data, and 64 percent say it has happened more than once. The study, Securing Outsourced Consumer Data, sponsored by Experian® Data Breach Resolution, also found that the most common causes of breaches were negligence and lost or stolen devices. Despite the gravity of these errors, only 38 percent of businesses asked their vendor to fix the problems that led to the breach, and surprisingly, 56 percent of the companies learned about the data breach accidentally instead of through security protocols and control procedures.

These findings come from a survey of 748 people in a supervisory (or higher) job who work in vendor management at companies that share or transfer consumer data, mainly for marketing, finance and outsourced IT operations, including cloud services and payment processing. The survey also polled the vendors, and 57 percent of them reported that they, in turn, outsourced work to a third party. Twenty-three percent of vendors could not tell how often data loss happened, a sign that they don't have proper procedures and policies in place to know when incidents occur. When asked about their data breach notification practices, only 16 percent of vendors said they immediately notified their client after the breach investigation, and 25 percent said they don't even tell clients about breaches of data.

Keeping all work and information in house is not feasible in today's multi-corporate companies, and outsourcing is a business reality; however, all parties have a responsibility to protect the sensitive and confidential data that is entrusted to them. When outsourcing consumer data to vendors, here are a few guidelines companies should follow to safeguard the information:

1. Hold vendors to the same security standards as your own in-house security policies and practices.
2. Make sure the vendor has appropriate security and controls procedures in place to monitor potential threats.
3. Audit the vendor's security and privacy practices, and make sure the vendor is legally obligated in your contract to fix data problems should a breach occur, including notifying consumers.
4. Monitor the security and privacy practices of the vendors you work with, especially if you share consumer data with them.
5. Require background checks for vendor employees who have access to confidential information.

The goal of this study was to better understand what companies are doing to protect the consumer data they outsource and where improvements could be made to ensure privacy and security when sharing private information with third parties. The solution seems to be that all parties must first agree that data privacy and protection is paramount and then work toward the mutual goal of achieving responsible privacy and security practices. Download the Securing Outsourced Consumer Data report
The most recent release of the S&P/Experian Consumer Credit Default Indices showed national credit default rates decreased in February. The national composite* moved from 1.63 percent in January to 1.55 percent in February. First mortgage and bankcard default rates followed a similar pattern. These trends are consistent with other economic news, such as improvements in employment and continuing gains in housing.
By: Maria Moynihan

A recently released staff report prepared for the House Oversight and Government Reform Committee revealed that nearly 17,000 efficiency and process improvement recommendations made by agency Inspectors General remained pending as of 2012 and, in combination, could have saved more than $67 billion in wasteful government spending. At the same time, the 2013 Identity Fraud Report released in February 2013 by Javelin Strategy & Research indicates that in 2012, identity fraud incidents increased by more than one million victims and fraudsters stole more than $21 billion, the highest amount since 2009. Fraudsters know where process inefficiencies lie, and government agencies can no longer delay the implementation of much-needed system improvements.

There are several service providers and integrators in the public sector offering options and tools to choose from. Specifically, identity management tools exist that can authenticate a person's identity online and in real time, verify an address, validate income and assets, and provide a full view of a constituent, so funds go to those who need them most and stay out of the hands of fraudsters or those who are otherwise not eligible. There is a better way to validate and authenticate individuals or businesses as part of a constituent review process, and time is of the essence. By simply incorporating third-party data and analytics into established infrastructure, agencies can immediately gain improved insight for efficient decision making. Experian recently sponsored the FCW Executive Briefing on Detecting and Preventing Wasteful and Improper Payments. Click here to view the keynote presentation, or stay tuned as I share more on this pressing issue.
Using a more inclusive scoring model such as the new VantageScore® 3.0, lenders can score up to 30 million consumers who are labeled "unscoreable" by traditional models. Nearly 25 percent of these consumers are prime or near-prime credit quality.
By: Maria Moynihan

State and Federal agencies are tasked with overseeing the integration of new Health Insurance Exchanges, and with that responsibility comes the effort of managing information updates, ensuring smooth data transfer and implementing proper security measures. The migration process for HIEs is no simple undertaking, but with these three steps, agencies can plan for a smooth transition:

Step 1: Ensure all current contact information is accurate with the aid of a back-end cleansing tool. Back-end tools clean and enhance existing address records and can help agencies maintain the validity of records over time.

Step 2: Identify and remove duplicate records. Duplicate identification is a critical component of any successful database migration; by removing existing duplicates and preventing the creation of new ones, constituents are prevented from opening multiple cases, reducing the opportunity for fraud.

Step 3: Validate contact data as it is captured. This step is especially important as information gets captured across multiple touch points and portals. Contact record validation and authentication is a best practice for any database or system gateway.

Agencies, and those particularly responsible for the successful launch of HIEs, are expected to leverage advanced technology, data and sophisticated tools to improve efficiencies, quality of care and patient safety. Without accurate, standardized and verified contact information, none of that is possible. Access the full Health Insurance Exchange Toolkit by clicking here.
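The cleansing and duplicate-identification steps above can be sketched as follows. This is a minimal toy, not any specific back-end tool: real cleansing products apply certified address hygiene, phonetic matching and fuzzy linkage, while this sketch only normalizes whitespace, case and a couple of street abbreviations before grouping records by a canonical key. All field names are illustrative.

```python
import re

def normalize_record(record):
    """Canonicalize a contact record into a matching key.

    Hypothetical fields: 'name', 'address'. Collapses whitespace,
    uppercases, and folds two common street-suffix variants.
    """
    name = re.sub(r"\s+", " ", record["name"]).strip().upper()
    addr = re.sub(r"\s+", " ", record["address"]).strip().upper()
    addr = addr.replace("STREET", "ST").replace("AVENUE", "AVE")
    return (name, addr)

def find_duplicates(records):
    """Return groups of record ids that normalize to the same key."""
    seen = {}
    for rec in records:
        seen.setdefault(normalize_record(rec), []).append(rec["id"])
    return [ids for ids in seen.values() if len(ids) > 1]

records = [
    {"id": 1, "name": "Jane Doe",  "address": "12 Main Street"},
    {"id": 2, "name": "JANE  DOE", "address": "12 Main St"},
    {"id": 3, "name": "John Roe",  "address": "9 Oak Avenue"},
]
print(find_duplicates(records))  # [[1, 2]]
```

Running the same normalization at capture time (Step 3) is what prevents new duplicates from entering the system in the first place, rather than cleaning them up after migration.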
While the overall average VantageScore® for consumers in Q4 2012 was 748, the average score can vary greatly by specific loan product. For example, the average VantageScore for consumers with a home equity line of credit is 864, which is the highest average score for all products, reflecting tighter lending requirements. Student loans have the lowest average VantageScore of 695.
Spending on debit and prepaid cards in the United States topped $2 trillion in 2011, with 75 percent of this purchase volume being non-ATM transactions. The evolution of marketing knowledge and tactics for the U.S. debit card market can be applied to other countries migrating payment from cash to noncash transactions.
The Experian/Moody's Analytics Small Business Credit Index tumbled in Q4 2012, falling 6.8 points to 97.3 from 104.1 in the previous quarter. This is the second consecutive quarterly decline and is the index's lowest reading since Q3 2011. The drop in the index was driven primarily by a rise in delinquent balances as a slowdown in personal income growth pulled retail sales lower.
This post is in response to the recent Bankinter story of NFC payments at the point of sale without requiring a secure element (SE) – and the lack of any real detail around how it plans to achieve that goal. I am not privy to Bankinter's plan to dis-intermediate the SE, but as I know a wee bit about how NFC works, I thought a post would help clear up any ambiguity about how Card Emulation and Host Card Emulation differ – upsides, challenges, the whole lot.

Back in December of 2012, Verizon responded to an FCC complaint over its continued blocking of Google Wallet on the Verizon network. The gist of Verizon's response was that Google Wallet differs from PayPal, Square and other wallet aggregators in its reliance on the phone's secure element – a piece of proprietary hardware – and that this lies behind Verizon's decision to bar Google Wallet from operating on its devices and network. Verizon went on to write that Google is free to offer a modified version of Google Wallet that does not require integration with the secure element.

Now, software card emulation was not born out of that gridlock. It had always been supported by both NXP and Broadcom chipsets at the driver level. Among operating systems, BlackBerry OS supports it by default. With Android, however, application support did not manifest despite interest from the developer community. Google chose not to expose this capability via the API from Android 2.3.4 onward – which may have had to do with opting to focus its developer efforts elsewhere, or may have been due to carrier intervention. What very few knew is that a startup called SimplyTapp had already been toiling away at turning the switch back on – since late 2011.

Host Card What? But first – let's talk a bit about Card Emulation and how Host Card Emulation (or SE in the cloud) differs in its approach.
In the case of Google Wallet, Card Emulation routes communication from an external contactless terminal reader directly to the embedded secure element, disallowing visibility by the operating system completely. Only the secure element and the NFC controller are involved. Card Emulation is supported by all merchant contactless terminals, and in this mode the phone appears to the reader as a contactless smart card. Google Wallet, Isis and other NFC mobile wallets rely on card emulation to transfer payment credentials to the PoS. The downsides are that payment apps are limited by SE capacity (72KB on the original embedded SE in the Nexus S), SE access is slower, and provisioning credentials to the SE is a complex, brittle process involving multiple TSMs, multiple carriers (in the case of Isis) and multiple SE types and handsets.

Host Card Emulation (or Software Card Emulation) differs in that instead of routing communications received by the NFC controller to the secure element, it delivers them to the NFC service manager – allowing the commands to be processed by applications installed on the phone. That approach breaks the dependency on the secure element by letting credentials be stored anywhere – in application memory, in the trusted execution environment (TEE) or in the cloud. The benefits are apparent; a few are worth noting:

- NFC returns to being a communication standard, enabling any wallet to use it to talk to a PoS – without getting mired in contracts with issuers, carriers and TSMs.
- No more complex SE card provisioning to worry about.
- Multiple NFC payment wallets can live on the phone without worrying about SE storage size or compartmentalizing.
- No need to pay the piper – in this case, the carrier – for over-the-air SE provisioning and lifecycle management. Card issuers would be ecstatic.
However, this is no panacea: software card emulation is not exposed to applications by Android, and the host card emulation patches that have been submitted (by SimplyTapp) have not yet been merged into the main Android branch – and are therefore not available to you and me unless we root our phones. Which is where SimplyTapp comes in. SimplyTapp appealed to an early segment of Android enthusiasts who abhorred being told what functionality they are allowed to enable on their phones – by Google, carriers or anyone else. Anyone who dared to root an NFC phone supported by CyanogenMod and install the CyanogenMod firmware was rewarded by being able to use both SimplyTapp and Google Wallet to pay via NFC – the former with credentials stored in the cloud, the latter within the embedded SE.

So how does this work? SimplyTapp created a Host Card Emulation patch that resolves the conflicts that could arise from having two competing applications (SimplyTapp and Google Wallet) registered for the same NFC event from the external contactless reader. It does so by ensuring that upon receiving the event, if the SimplyTapp app is open in the foreground (on screen), the communication is routed to it; if not, it gets routed to Google Wallet. This allows consumers to use both apps harmoniously on the same phone (take that, Isis and Google Wallet!). SimplyTapp today works on any NFC phone supported by CyanogenMod. Apart from SimplyTapp, Inside Secure is working on a similar initiative, as reported here.

You get a wallet! And you get a wallet! Everyone gets a wallet! Well, not quite. What are the downsides to this approach? For one, if you wish to scale beyond the enthusiasts, you need Google, the platform owner, to step up and make this available to all without having to root our phones. For that to happen, it must update the NFC service manager to expose Host Card Emulation for the NXP and Broadcom chipsets.
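The foreground-wins routing rule described above can be sketched in pseudocode-style Python. To be clear, this is a conceptual illustration of the dispatch logic, not the actual patch: the handler names and the `route_apdu` function are hypothetical, and the real patch operates inside the Android NFC service manager on raw APDUs rather than Python callables.

```python
def route_apdu(apdu, foreground_app, installed_handlers):
    """Dispatch an incoming contactless APDU the way the SimplyTapp
    patch is described to work: if the cloud-credential app is on
    screen, it wins; otherwise fall through to the SE-backed wallet.

    All names here are illustrative, not real Android identifiers.
    """
    if foreground_app in installed_handlers:
        handler = foreground_app          # e.g. SimplyTapp on screen
    else:
        handler = "GoogleWallet"          # default: embedded-SE wallet
    return handler, installed_handlers[handler](apdu)

handlers = {
    "SimplyTapp":   lambda apdu: "cloud credentials respond",
    "GoogleWallet": lambda apdu: "embedded SE responds",
}
# A SELECT-style APDU arriving while SimplyTapp is in the foreground:
print(route_apdu(b"\x00\xa4\x04\x00", "SimplyTapp", handlers)[1])
# cloud credentials respond
# The same APDU while an unrelated app (e.g. Maps) is in the foreground:
print(route_apdu(b"\x00\xa4\x04\x00", "Maps", handlers)[1])
# embedded SE responds
```

The design choice worth noting is that the tie-break uses screen focus rather than configuration, which is what lets two wallets registered for the same NFC event coexist without either one having to win permanently.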
And if Google is not on board with the idea, then you need to find an OEM, a handset manufacturer or an Amazon ready to distribute your amended libraries. Further, you can expect carriers to fight this move, as they find their investment in and control around the secure element threatened. With the marked clout they enjoy with OEMs and handset manufacturers by way of subsidies, they can influence the outcome. Some wonder how it is that BlackBerry OS continues to support Host Card Emulation without carrier intervention. The short answer may be that BlackBerry is such a marginal player these days that this was overlooked or ignored.

The limitations do not stop there. The process of using cloud-based credentials in an EMV or contactless transaction has not been certified yet. There is obviously interest, and it probably will happen at some point – but nothing yet. Debit cards may come first, owing to the ease of certification. Further, closed-loop cards will probably be ahead of the curve compared to open-loop cards. More about that later. *Update: Latency is another issue when credentials are stored in the cloud – especially when NFC payments were called out last year as not quick enough for transit.*

So for all those who pine for the death of secure elements but swear fealty to NFC, there is hope. But don't set your alarm yet. So what will Google do? Let's consider for a moment that Google is down with this. If so, does that represent a fork in the road for Google Wallet? Will the wallet application leverage HCE on phones with inaccessible secure elements, while defaulting to the secure element on phones where it has access? If so, it risks confusing consumers. Further, enabling HCE lets other wallets adopt the same route. It will break the dependency on the secure element, but it will also open the floodgates to all the other wallets that now want to play. It would seem like a pyrrhic victory for Google.
All those who despised proximity payments (I am looking at you, PayPal and Square!) will see their road to contactless clear and come calling. As the platform owner, Google will have no choice but to grin and bear it. But on a positive note, this will further level the playing field for all wallets and put the case for contactless back front and center. Will Google let this happen? Those who look at Google's history of tight-fisted control over the embedded SE are bound to cite precedent and stay cynical. But when it comes down to it, I believe Google will do the right thing for the broader Android community. Even on the question of not relinquishing control over the embedded SE in the devices it issued, Google put the interests of the consumer first: it felt that, all things considered, it was not ready to allow wanton and unfettered access to the SE. Google was at one point even talking about allowing developers to write their own "card emulation" applets and download them to the SE.

Broadcom also has an upcoming quad-combo chip, the BCM43341, that manages to wrap NFC, Bluetooth 4.0, Wi-Fi and FM radio onto a single die. Further, the BCM43341 supports multiple secure elements. Now, I also hear Broadcom happens to be a major chip supplier to a fruit company. What do you think?

This content was originally posted to Cherian's personal blog at DropLabs.
Experian Automotive's Q4 2012 credit trends analysis found that 60-day delinquencies rose from 0.72 percent in Q4 2011 to 0.74 percent in Q4 2012. It was the first time in three years that 60-day delinquencies experienced a year-over-year increase.
Big news last week, with Chase entering into a 10-year expanded partnership with Visa to create a 'differentiated experience' for its merchants and consumers. I would warn anyone thinking "offers and deals" when they hear "differentiated experience" – because I believe we are running low on merchants who have a perennial interest in offering endless discounts to their clientele. I cringe every time someone waxes poetic about offers and deals driving mobile payment adoption – because I have yet to meet a merchant who wanted to offer a discount to everyone who shopped. There is an art and a science to discounting, and merchants want to identify customers who are price sensitive and develop appropriate strategies to increase stickiness and build incremental value. It's as if everyone everywhere is throwing everything and the kitchen sink at making things stick.

On one end are the payment worshippers, for whom the act of payment is the centerpiece – the tap, the wave, the scan. We pore over the customer experience at the till, believing that if we make it easier for customers to redeem coupons, they will choose us over the swipe. But what about the majority of transactions, where no coupon is presented and we swipe because it's simply the easiest, safest and most boring thing to do? Look at the Braintree/Venmo model, where payment is but a necessary evil – pushed so far behind the curtain that the customer spends nary a thought on her funding source of choice. Consumers are issuer-agnostic to a fault – a model propounded by Square's Wallet. After all, when the interaction is tokenized, when a name or an image can stand in for a piece of plastic, what use is there for an issuer's brand?

So what are issuers doing? Those that have a processing and acquiring arm are increasingly looking at creative transaction-routing strategies for transactions where the issuer has a direct relationship with both the merchant and the consumer.
This type of selective routing enables the issuer to negotiate pricing directly with the merchant – thereby encouraging the merchant to incent its customers to pay using the card issued by that same issuer. For this strategy to succeed, issuers need to both sign up merchants directly and encourage their customers to spend at those merchants using their credit and debit cards. FIs continue to believe that they can channel customers to their chosen brands, but "transactional data doth not maketh the man" – and I continue to be underwhelmed by issuer efforts in this space. Visa ending its ban on retailer discounts for specific issuer cards this week must be viewed in this context, as it fuels rumors that other issuers are looking at the private payment network option – with merchants explicitly preferring their cards over competitors'. The wild, wild west, indeed. This drives processors either to cut deals directly with issuers or to move far deeper into the merchants' hands. This is where the Braintree/Venmo model can come into play – where the merchant, aided by an innovative processor who can scale, can replicate the same model in the physical world. We have already seen what Chase Paymentech plans to do. There aren't many that can pull off something similar.

Finally, what about Affirm, the new startup from Max Levchin? I have my reservations about the viability of a Klarna-type approach in the US, where credit card penetration among customers is high. Since Affirm will require customers to choose it as a payment option over other funding sources – PayPal, credit cards and others – there has to be a compelling reason for a customer to choose Affirm. And at least in the US, where we are card-entrenched and every day we make it easier for customers to use their cards (look at Braintree or Stripe), it's a tough value proposition for Affirm. Share your opinions below.

This is a re-post from Cherian's personal blog at DropLabs.
According to a recent Ponemon Institute study, 65 percent of study participants say their organization has had a data breach in the past two years involving consumer data outsourced to a third party. Most of these are preventable, as employee negligence accounts for 45 percent of data breaches and lost or stolen devices account for 40 percent.
Last January, I published an article in the Credit Union Journal covering the trend among banks to return to portfolio growth. Over the year, the desire to return to portfolio growth and maximize customer relationships has continued to be a strong focus, especially in mature credit markets such as the US and Canada. Let's revisit this topic, start to dive deeper into the challenges we've seen, explore the core fundamentals for setting customer lending limits, and share a few best practices for creating successful cross-sell lending strategies.

Historically, credit unions and banks have driven portfolio growth with aggressive outbound marketing offers designed to attract new customers and members through loan acquisitions. These offers were typically aligned to a particular product, with no strategy alignment between multiple divisions within the organization. Further, when existing customers submitted a new request for credit, they were treated the same as incoming new customers, with no reference to the overall value of the existing relationship. Today, however, financial institutions are looking to create more value from existing customer relationships to drive sustained portfolio growth by increasing customer retention, loyalty and wallet share.

Let's consider this idea further. By identifying the needs of existing customers and matching them to individual credit risk and affordability, effective cross-sell strategies can achieve portfolio growth while simultaneously increasing customer satisfaction and promoting loyalty. The need to optimize customer touch points and provide the best possible customer experience is paramount to future performance, as measured by market share and long-term customer profitability. By also responding rapidly to changing customer credit needs, you can further build trust, increase wallet share and profitably grow your loan portfolios.
In the simplest sense, the more of your products a customer uses, the less likely the customer is to leave you for the competition. With these objectives in mind, financial organizations are turning toward the practice of setting holistic, customer-level credit lending parameters. These parameters are often referred to as umbrella, or customer, lending limits.

The challenges

Although the benefits of enhancing existing relationships are clear, a number of challenges raise some important questions:

- How do you balance the competing objectives of portfolio loan growth and managing future losses?
- How do you know how much your customer can afford?
- How do you ensure that customers have access to the products they need, when they need them?
- What is the appropriate communication method to position the offer?

Few credit unions or banks have lending strategies that differentiate between new and existing customers. In most cases, new credit requests are processed identically for both customer groups. The problem with this approach is that it fails to capture and use the power of existing customer data, which inevitably leads to suboptimal decisions. Similarly, financial institutions frequently send inconsistent lending messages to their clients. The following scenarios can arise when institutions fail to look across all relationships to support their core lending and collections processes:

- A customer is refused additional credit on the facility of their choice while simultaneously being offered an increase in their credit line on another.
- A customer is extended credit on a new facility while being seriously delinquent on another.
- A customer receives marketing solicitations for three different products from the same institution, in the same week, through three different channels.
Essentials for customer lending limits and successful cross-selling

By evaluating existing customers on a periodic (monthly) basis, financial institutions can holistically assess each customer's existing exposure, risk and affordability. By setting customer-level lending limits in accordance with these parameters, core lending processes can be made more efficient, with superior results and enhanced customer satisfaction. This approach can be extended to a fast-track application process for existing high-value, low-risk customers. Traditionally, business processes have not identified loan applications from such individuals for preferential treatment. The core fundamentals for setting holistic customer (umbrella) lending limits include:

- The accurate evaluation of credit and default risk
- The calculation of additional lending capacity and affordability
- Appropriate product offerings for cross-sell
- Operational deployment

Follow my blog series over the next few months as we explore the core fundamentals for setting customer lending limits and share a few best practices for creating successful cross-sell lending strategies.
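To make the umbrella-limit idea concrete, here is a deliberately simplified sketch. The formula, the 36-month horizon and the linear risk discount are all invented for illustration; a real limit strategy would be built from validated risk models, policy rules and regulatory constraints, not a one-liner. It only shows the shape of the calculation: scale affordability down as risk rises, then net off the customer's existing exposure.

```python
def umbrella_limit(affordability, risk_score, current_exposure):
    """Toy customer-level (umbrella) lending limit.

    affordability: monthly repayment capacity the customer can service
    risk_score: default-likelihood style score in [0, 1] (higher = riskier)
    current_exposure: total outstanding balances across all products

    Gross capacity assumes a hypothetical 36-month horizon, discounted
    linearly by risk; existing exposure is then netted off, floored at 0.
    """
    gross_capacity = affordability * 36 * (1.0 - risk_score)
    return max(0.0, gross_capacity - current_exposure)

# Low-risk customer with headroom for additional credit:
print(umbrella_limit(affordability=500, risk_score=0.05, current_exposure=5000))
# High-risk customer already over capacity -> no additional lending:
print(umbrella_limit(affordability=500, risk_score=0.60, current_exposure=8000))
```

The point of computing one limit at the customer level, rather than per product, is that every product decision (a new card, a line increase, a fast-track approval) can then be checked against the same ceiling, which is what prevents the inconsistent-message scenarios described above.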