By: Maria Moynihan

Crime prevention and awareness techniques are changing, and data, analytics and technology are making a difference. While law enforcement departments continue to face issues related to data – outdated information, an inability to share data across departments, and difficulty consolidating data for analysis – a new trend is emerging: agencies are leveraging outside data sources and analytic expertise to better report on crimes, consolidate information, predict patterns of behavior and ultimately locate criminals.

One best practice being implemented by law enforcement agencies is to skip trace an individual much as a debt collector would. The technique uses historic address information and individual connections to track a person to his or her current location. See the full write-up from CollectionsandCreditRisk.com for how this works.

Another great example of effective use of data in investigations can be seen in this video, where one Experian client, Intellaegis of El Dorado Hills, CA, recently worked with local law enforcement to follow the digital data footprints of a particular suspect, finding her in just five minutes of searching.

And yet another example of improved data gathering, handling and sharing for crime prevention and awareness can be found on a site one of my neighbors just made me aware of – www.crimemapping.com. Information is consolidated across departments for greater insight into the crimes happening within a neighborhood, offering the general public a more comprehensive resource on local area crime activity.

Clearly, data, analytics and technology are making a positive impact on law enforcement processes and investigations. What is your public safety organization doing to evolve and better protect and serve the public?
When I wrote about Host Card Emulation back in March, it provoked much debate around whether this capability would die on the cutting-room floor or be meaningfully integrated into a future Android iteration. And now that it has been, this post is an attempt to look forward, even though much of it is speculative. But I will provide some perspective from a number of conversations I had in the last week with Networks, Issuers, TSMs, Merchants, Platform Owners and EMV practitioners, along with some insight into perceptions, impacts and the road ahead for NFC. And I will provide some context as to why HCE matters to each of these players. First – if you haven't read my previous post on HCE – this would be a good time to do so.

Media has unfortunately focused yet again on the controversy in light of the KitKat HCE announcement – on the end-run around Carriers rather than the upside this brings to those who were previously disincentivized to consider NFC. What they all seem to have missed is that HCE allows for the following: it reduces the gap between merchants and card issuance, brings the topic of closed-loop and contactless into focus, and, more tactically, allows for an easy deployment scenario that does not require merchants to change the software inside the terminal. I hope those three things do not get lost in translation.

Google: Being a Platform Owner for once

The Android team deserves much credit for enabling support for Host Card Emulation in KitKat. Beyond the case for platform support – something Blackberry already had – there were both altruistic and selfish reasons for going this route. The former had to do with throwing open another door that would invite third-party developers to build on an open NFC stack – while firmly shutting other ones (read the criticism from Ars that Android is quickly becoming closed source, partly through its Play services approach). It was time it acted like a platform owner.
And being one entailed democratizing access to tap-and-pay. Selfish – because of the more than 200M Android devices that shipped with NFC support, only a fraction are tap-and-pay worthy. It had become absurd that one must enquire about Carrier, Platform, Issuer and Device support before installing an NFC payment app, much less using it. Talk about fragmentation. This was a problem only Google could begin to fix – by removing the absurd limitations put in place in the name of security, but which in truth existed because of profit, control and convenience.

Google's role hardly ends here. Today, Host Card Emulation – by definition alone – is reserved as a technical topic. Out of the gate, much needs to be done to educate Issuers and Merchants as to why this matters. For retailers – used to much cynicism in matters relating to NFC – Host Card Emulation offers an opportunity to develop and deploy a closed-loop contactless scheme using the retailer's preferred payment sources – private label, debit and credit, in that order.

HCE to Merchants: Friend or Foe?

In my opinion, merchants stand to benefit most from HCE – which is another reason why Google really embraced this concept. Despite certain benefits for Issuers, who can provision cards without having to pay the piper, Google had its eyes set on expanding the offline footprint for Google Wallet, and to do so successfully it needed to focus on the merchant value prop while dialing back on what retailers once called the "data donation agreement". Where merchants primarily struggle today in mobile is not in replicating the plastic model – it is in creating a brand-new loyalty platform where the customer sets a payment source and forgets it, preferably one that's preferred by the merchant – for example a private label card or debit. Except, no open-loop wallet had actually centered itself around this premise so far.
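Mechanically, what HCE changes is simple: the OS routes command APDUs arriving over NFC to ordinary app code (Android's HostApduService), which answers in software instead of from a hardware secure element. As a rough sketch of that request/response loop – written in Python for brevity rather than against the actual Android API, and using a made-up AID for a hypothetical retailer's closed-loop scheme:

```python
# Toy model of the HCE request/response loop. The terminal first sends a
# SELECT-by-AID command; if the app recognizes the AID, it answers in
# software and handles subsequent commands. AID and values are invented.

LOYALTY_AID = bytes.fromhex("F0394148148100")  # hypothetical closed-loop AID

SW_OK = b"\x90\x00"                 # success status word
SW_NOT_FOUND = b"\x6a\x82"          # AID not recognized
SW_INS_NOT_SUPPORTED = b"\x6d\x00"  # unknown instruction

def process_command_apdu(apdu: bytes) -> bytes:
    """Rough analogue of Android's HostApduService.processCommandApdu()."""
    if apdu[:4] == bytes.fromhex("00A40400"):      # SELECT by AID header
        aid_len = apdu[4]
        aid = apdu[5:5 + aid_len]
        return SW_OK if aid == LOYALTY_AID else SW_NOT_FOUND
    return SW_INS_NOT_SUPPORTED

# A terminal selecting the hypothetical loyalty applet:
select_cmd = bytes.fromhex("00A40400") + bytes([len(LOYALTY_AID)]) + LOYALTY_AID
response = process_command_apdu(select_cmd)        # -> b"\x90\x00"
```

The point worth noticing is on the other side of the wire: the terminal just sees a contactless card respond, which is why this deployment path requires no software change inside the terminal.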
Google Wallet launched with Citi, then reverted to a negative-margin strategy – charging the merchant card-present rates while paying the Issuers card-not-present rates. It wasn't ideal, as merchants did not want Google anywhere near the transaction value chain. Meanwhile, it gave Google quite the heartburn to see Apple succeeding with Passbook – requiring merchants to give nothing back in return for leveraging it to deliver geo-targeted offers and loyalty. This silent takedown must have forced Google's hand in getting serious about building a complete offer, loyalty and payment scheme that is collaborative (HCE support was a collaborative effort introduced by SimplyTapp) and merchant-friendly. I believe HCE support now represents a serious effort to help merchants commercialize a closed-loop advantage in contactless without requiring software changes inside the terminal. Contactless was out of bounds for merchants till now. Not anymore.

Having fielded a number of calls from retailers as to what this means, I will distill retailer reactions down to this: measured optimism, casual pessimism and "network" cynicism. Retailers have always looked at EMV and terminalization as a head-fake for NFC – further laying down the tracks for another three decades of control around pricing and what they see as anti-competitive behavior. Though HCE is in no way tethered to NFC (it is agnostic of the communication method), its current close association with NFC means merchants see the conversation as a non-starter until there is a constructive dialogue with the networks. At the same time, merchants are cautiously optimistic about the future of HCE, provided there is a standards body that gives them equal footing with Platform Owners, Issuers and networks to dictate its scope and future. As the platform owner, Google should work with the merchant body, networks, issuers and other stakeholders to see this through.
It was no surprise that those I talked to all agreed on one thing: that Carriers really should have no role to play in this framework.

TSMs/SE Providers: Where to from here?

The nine-party model is dead, or will be very soon. The SE rental model had already been shown to be unsustainable – and now, with HCE, it is simply wasteful. TSMs had been focused outside of the US for the last several years, as the lack of meaningful commercial launches meant the US market would simply not bring scale for many years. And with Google shifting away from using a Secure Element in its flagship Nexus models, the writing was already on the wall. TSMs will look to extend their capabilities into non-traditional partnerships (Gemalto/MCX) and into non-hardware scenarios (competing with cloud SE providers like SimplyTapp in the HCE model). Bell ID is one such example – and quite likely the only example right now.

Networks: Certify or Not?

What does Host Card Emulation mean to V/MA? It is no secret that the networks had more than toyed with the idea of software card emulation these last couple of years, realizing the rapidly shrinking runway for NFC. The focus for the networks should now be to certify the new approach as a legitimate way to store and transfer credentials. It's interesting to hear how our neighbors to the north have reacted to this news. There is still ambiguity among Canadian issuers and networks as to what this means – including debates as to whether an onboard SE is still required for secure storage. That ambiguity will not dissipate till V/MA step in and do their part.
I must quote an EMV payments consultant from the north who wrote to me this week: “My boss calls the TSM model “traditional” and I remind him in NFC payments there is no tradition… I think for some people the Global Platform standards with the TSM smack in the middle are like a comfort food – you know what you are getting and it feels secure (with 1000′s of pages of documentation how could they not be!)” That should give GP and TSMs some comfort.

Device Support for HCE: What does that look like?

Google does not report sales figures for the Nexus 4, Nexus 5, or the Google Play editions of the Samsung Galaxy S4 and HTC One – the four devices slated to receive KitKat over the next few weeks (apart from the Nexus tablets). So if I were to venture a guess, I would say approximately 20M devices in total with NFC capability will support Host Card Emulation soon. That may not seem like much, but it's a strong base. There is also a possibility that post-Galaxy Nexus devices from Samsung may leapfrog 4.3 and go directly to KitKat. If that happens – just based on reported sales volumes for the Galaxy S3 and S4 – that would be a total of 100M devices with NFC support. What that means for Samsung's revenue model around the SE – Samsung has an embedded SE from Oberthur in the S3 and S4 devices, for which it hopes to charge rent to Visa and others – is unclear at this point.

Issuers: Isis alternative or more?

For those issuers who passed on Isis, or those who were scorned by Isis, this enables them to outfit their current mobile assets with a payment feature. I wrote about the absurdity of a contactless transaction where the consumer has to close his merchant or banking app and switch to Isis to tap-and-pay – instead of equipping merchant/bank apps with a tap-and-pay feature.
HCE means a lot more for private label Issuers – who have a very inspired base of merchants looking to bridge the gap between private label cards and mobile – and who now have an alternative to the clumsy, costly and complex orchestrations previously required for provisioning cards, replaced with an easy integration and cheaper deployment. More about that later.

Finally, Carriers & Isis: Fight or Flight?

Godspeed.
In the 1970s, it took an average of 18 days before a decision could be made on a credit card application. Credit decisioning has come a long way since then; today, we have the ability to make decisions faster than it takes to ring up a customer in person at the point of sale. Enabling real-time credit decisions helps retail and online merchants lay a platform for customer loyalty while incentivizing an increased customer basket size. While the benefits are clear, customers still are required to be at predetermined endpoints, such as:

At the receiving end of a prescreened credit offer in the mail
At a merchant point of sale applying for retail credit
In front of a personal computer

The trends clearly show that customers are moving away from these predetermined touch-points: they find mailed credit offers antiquated, spend even less time at a retail point of sale as they increasingly prefer to shop online, and are exchanging personal computers for tablets and smartphones. Despite remaining under 6 percent of retail spending, e-commerce sales for Q2 2013 were reportedly up 18.5 percent from Q2 2012 – the largest year-over-year increase since Q4 2007, before the 2008 financial crisis. Fueled by a shift from personal computers to connected devices and the continuing maturation of e-commerce and m-commerce platforms, this trend is only expected to grow stronger. To reflect this shift, marketers need to be asking themselves how to apportion their budgets and energies to digital while executing broader marketing strategies that may also include traditional channels. Generally, traditional card acquisition methods have failed to respond to these behavioral shifts and, as a whole, retail banking was unprepared to handle the disintermediation of traditional products in favor of the convenience mobile offers.
Now that the world of banking is finding its feet in the mobile space, accessibility to credit must also adapt to be on the customer's terms, unencumbered by historical notions around customer and credit risk. Download this white paper to learn how credit and retail private-label issuers can provide an optimal customer experience in emerging channels such as mobile without sacrificing risk-mitigation strategies – leading to increased conversions and satisfied customers. It demonstrates strategies employed by credit and retail private-label issuers who have already made the shift from paper and point of sale to digital, and it provides recommendations that can serve as a business case and/or a road map.
By: Zach Smith

On September 13, the Consumer Financial Protection Bureau (CFPB) announced final amendments to the mortgage rules it issued earlier this year. The CFPB first issued the final mortgage rules in January 2013 and then released subsequent amendments in June. The final amendments also make some additional clarifications and revisions in response to concerns raised by stakeholders. The final modifications announced by the CFPB in September include:

Amending the prohibition on certain servicing activities during the first 120 days of a delinquency to allow the delivery of certain notices required under state law that may provide beneficial information about legal aid, counseling, or other resources.

Detailing the procedures that servicers should follow when they fail to identify or inform a borrower about missing information from loss mitigation applications, as well as revisions to simplify the offer of short-term forbearance plans to borrowers suffering temporary hardships.

Clarifying best practices for informing borrowers about the address for error resolution documents.

Exempting all small creditors, including those not operating predominantly in rural or underserved areas, from the ban on high-cost mortgages featuring balloon payments. This exemption will continue for the next two years while the CFPB re-examines the definitions of "rural" and "underserved."

Explaining the "financing" of credit insurance premiums to make clear that premiums are considered "financed" when a lender allows payments to be deferred past the month in which they are due.

Clarifying the circumstances under which a bank's teller or other administrative staff is considered a "loan originator," and the instances in which manufactured-housing employees may be classified as originators under the rules.

Clarifying and revising the definition of points and fees for purposes of the qualified mortgage cap on points and fees and the high-cost mortgage points and fees threshold.
Revising the effective dates of many loan originator compensation rules from January 10, 2014 to January 1, 2014.

While the industry continues to advocate for an extension of the effective date to provide additional time to implement the necessary compliance requirements, the CFPB insists that both lenders and mortgage servicers have had ample time to comply with the rules. Most recently, in testimony before the House Financial Services Committee, CFPB Director Richard Cordray stated that "most of the institutions have told us that they will be in compliance" and said he didn't foresee further delays.

Related Research

Experian's Global Consulting Practice released a recent white paper, CCAR: Getting to the Real Objective, that suggests how banks, reviewers and examiners can best actively manage CCAR's objectives with a clear dual strategy that includes both short-term and longer-term goals for stress-testing, modeling and system improvements. Download the paper to understand how CCAR is not a redundant set of regulatory compliance exercises; its effects on risk management include some demanding paradigm shifts from traditional approaches. The paper also reviews the macroeconomic facts around the Great Recession, revealing useful insights for bank extreme-risk scenario development, econometric modeling and stress simulations.

Related Posts

Where Business Models Worked, and Didn't, and Are Most Needed Now in Mortgages
Now That the CFPB Has Arrived, What's First on Its Agenda
Can the CFPB Bring Debt Collection Laws into the 21st Century
Billions of dollars are being issued in fraudulent refunds at the state and federal level, and most of this fraud involves identity theft. A typical scheme: fraudsters acquire the Personal Identifying Information (PII) of a deceased individual, buy it from someone not filing, or otherwise steal it from legitimate sources like a doctor's office. The PII is then used to fill out tax returns, add fraudulent income information and request bogus deductions. Additional forms of tax refund fraud include:

Direct consumer tax refund fraud – using the real PII of US citizens to file fraudulent tax returns and claim bogus deductions, thereby increasing refund amounts.

EITC (Earned Income Tax Credit)/ACC (Additional Childcare Credit) fraud – usually perpetrated with the assistance of a tax preparer, claiming improper cash payments and/or deductions for non-existent children.

Tax preparer fraud – tax preparers purposefully submitting false information on tax returns or filing false returns for clients.

Underreporting of income on tax filings.

Taking multiple homestead exemptions for tax credit.

Because this fraud most often occurs as an early filing using fraudulent or stolen PII, the individual consumer is at risk for long-term identity issues. Several factors exacerbate the tax refund fraud problem:

The majority of returns that request refunds are now filed online (83% of all federal filings in 2012 were online), and if you file online there is no need to submit a W-2 form with the filing. If your employment information cannot be pulled into the forms by your tax software, you can fill it in manually. The accuracy of the employer and wage information on which deductions are based is verified only after the refund is issued.

Refunds directly deposited – filers now have the option to have their refunds deposited into a bank account for faster receipt.
Once these funds are deposited and withdrawn, there is no way to trace where they have gone.

Refunds provided on debit cards – filers can request their refund in the form of a debit card. This is an even bigger problem than bank account deposits because once a card is issued, there is no way to trace who uses it and for what purpose.

So what do you need to look for when reviewing tax fraud prevention tools?

Look for a provider that has experience working with state and federal government agencies. Proven expertise in this domain is critical; experience here means the provider has cleared the disciplined review process the government requires of businesses it works with.

Look for providers with relevant certifications for authentication services, such as the Kantara Identity Assurance Framework for levels of identity assurance.

Look for providers that can authenticate users by verifying the device they're using to access your applications. With over 80% of tax filings occurring online, it is critical that any identity-proofing strategy also allow for verifying the source or device used to access these applications.

Since tax fraudsters don't limit their use of stolen IDs to tax fraud – they may also use them to perpetrate other financial crimes such as opening lines of credit – you need to be looking at all avenues of fraudulent activity.

If fraud is detected and stopped, consider using a provider that can offer post-fraud mitigation processes for your customers/potential victims.

Getting tax refunds and other government benefits into the right hands is important to everyone involved. Since tax refund fraud detection is a moving target, it's buyer beware if you hitch your detection efforts to a provider that has not proven its expertise in this unique space.
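To make the detection side concrete, here is a toy rule-based screen over the signals described above – unverifiable wages, suspiciously early filing, and many refunds funneled to one deposit account or debit card. The field names and thresholds are invented for illustration; real providers layer far more sophisticated identity and device checks on top of rules like these.

```python
def refund_risk_flags(filing, refund_counts_by_account):
    """Return a list of toy fraud indicators for a single tax filing.
    All thresholds are illustrative assumptions, not real IRS/vendor rules."""
    flags = []
    # Wages that can't yet be verified against employer data (no W-2 attached)
    if filing.get("filed_online") and not filing.get("w2_attached"):
        flags.append("unverified_wages")
    # Very early filing is a common pattern for stolen-PII returns
    if filing.get("filing_week_of_year", 52) <= 3:
        flags.append("early_filing")
    # Many refunds routed to one bank account or prepaid debit card
    if refund_counts_by_account.get(filing["deposit_account"], 0) >= 3:
        flags.append("shared_deposit_destination")
    return flags

suspect = {"filed_online": True, "w2_attached": False,
           "filing_week_of_year": 2, "deposit_account": "ACCT-123"}
flags = refund_risk_flags(suspect, {"ACCT-123": 5})
# flags -> ["unverified_wages", "early_filing", "shared_deposit_destination"]
```

A filing that trips several of these would be routed to identity proofing or manual review rather than auto-paid, which is exactly the gap the post describes: today the wage data is verified only after the refund goes out.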
TL;DR: Read on for how Touch ID is made possible via ARM's TrustZone/TEE, and why this matters in the context of Apple's coming identity framework. I also explain why primary/co-processor combos are here to stay. I believe that eventually Touch ID has a payments angle – focusing on e-commerce before retail. Carriers will weep over a lost opportunity, while through Touch ID we have front-row seats to Apple's enterprise strategy, its payment strategy and, beyond all, the future direction of its computing platform.

I had shared my take on a possible Apple biometric solution in January of this year, based on its Authentec acquisition. I came pretty close, except for the suggestion that NFC was likely to be included. (Sigh.) My take then was that, though it was a bit early to play fast and loose with Apple predictions, the Authentec acquisition should rear its head sometime in the near future (2013, considering Apple's manufacturing lead times), and that a biometric solution packaged neatly with an NFC chip and secure element could address three factors that have held back customer adoption of biometrics:

Ubiquity of readers
Issues around secure local storage and retrieval of biometric data
Standardization in accessing and communicating said data

An on-chip secure solution to store biometric data – in the phone's secure element – can address qualms around a central database of biometric data open to all sorts of malicious attacks. Standard methods to store and retrieve credentials stored in the SE will apply here as well.

Why didn't Apple open up Touch ID to third-party developers? Apple expects a short bumpy climb ahead for Touch ID before it stabilizes, as early users begin to use it. By keeping its use limited to authenticating to the device and to iTunes, it can tightly control potential issues as they arise. If Touch ID had launched with third-party apps and were buggy, it's likely that customers would be confused about where to report issues and whom to blame.
That's not to say that it won't open up Touch ID outside of Apple. I believe it will provide fettered access based on the type of app and the type of action that follows user authentication. Banking, payment, productivity, social sharing and shopping apps should come first. Your fart apps? Probably never. Apple could also allow users to set their preferences (for app categories, based on the user's current location, etc.) such that biometric authentication is required for risky transactions but not for routine ones. If you are at home and buying an app for a buck, don't ask to authenticate. But if you were initiating a money transfer, then you would. Even better – pair biometrics with your PIN for better security. Chip and PIN? So passé.

Digital Signatures, iPads and DRM 2.0:

It won't be long before an iPad shows up in the wild sporting Touch ID. And with Blackberry's much-awaited and celebrated demise in the enterprise, Apple will be waiting on the sidelines – now with capabilities that allow digital signatures to become ubiquitous and simple, on email, contracts or anything worth putting a signature on. Apple has already made its iWork productivity apps (Pages, Numbers, Keynote), iMovie and iPhoto free for new iOS devices activated with iOS 7. Apple, with a core fan base that includes photographers, designers and other creative types, can now further enable iPads and iPhones to become content creation devices, with the ability to attribute any digital content back to its creator by a set of biometric keys. Imagine a new way to digitally create and sign content, to freely share, without worrying about attribution. Further, Apple's existing DRM frameworks are strengthened by the ability to tag downloaded digital content with your own set of biometric keys. Forget disallowing the sharing of content – Apple now has a way to create a secondary marketplace for its customers to resell or loan digital content, and drive incremental revenue for itself and content owners.
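The per-category, per-context policy floated above – no prompt for a one-dollar app bought at home, a mandatory prompt for a money transfer – boils down to a small decision function. The categories and thresholds below are my assumptions for illustration, not Apple's actual rules:

```python
# Hypothetical step-up authentication policy: when to demand a biometric.
HIGH_RISK_ACTIONS = {"money_transfer", "banking", "payment"}

def requires_biometric(action: str, amount_usd: float,
                       at_trusted_location: bool) -> bool:
    """Decide whether to step up to biometric auth for a transaction.
    Categories, amounts and the trusted-location idea are illustrative."""
    if action in HIGH_RISK_ACTIONS:
        return True                        # risky actions always prompt
    if at_trusted_location and amount_usd <= 1.00:
        return False                       # a buck at home: skip the prompt
    return amount_usd > 20.00              # elsewhere, prompt for larger sums
```

Pairing the biometric with a PIN, as suggested above, would simply be a second factor layered onto the branches that return True.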
Conclaves blowing smoke:

In a day and age where we forego the device for storing credentials – whether due to convenience or ease of implementation – Apple opted for an on-device answer to where to store a user's biometric keys. There is a reason it did so, beyond the obvious brouhaha that would have resulted had it chosen to store these keys in the cloud. Keys inside the device; signed content on the cloud. Best of both worlds. Biometric keys need to be held locally so that authentication requires no round trip and therefore imposes no latency. Apple would have chosen local storage (ARM's SecurCore) as a matter of customer experience – consider what would happen if the customer had no internet access. There is also the obvious concern that a centralized biometric keystore would be in the crosshairs of every malicious entity. By decentralizing it, Apple made it infinitely more difficult to scale an attack or a potential vulnerability.

More than the A7, the trojan in Apple's announcement was the M7 chip – referred to as the motion co-processor. I believe the M7 does more than just measure motion data.

M7 – A security co-processor?

I am positing that Apple is using ARM's TrustZone foundation, and that it may be using the A7 or the new M7 co-processor for storing these keys and handling the secure backend processing required. Horace Dediu of Asymco had called into question why Apple opted for the M7 and suggested that it may have a yet-unstated use. I believe the M7 is not just a motion co-processor; it is also a security co-processor. I am guessing the M7 is based on the Cortex-M series of processors and offloads much of this secure backend logic from the primary A7 processor, and the keys themselves are likely stored there on the M7.
The Cortex-M4 chip has capabilities that sound very similar to what Apple announced around the M7 – a very low-power chip built to integrate sensor output and wake up only when something interesting happens. We should know soon. This type of combo – splitting functions to be offloaded to different cores – allows each core to focus on the function it is supposed to perform. I suspect Android will not be far behind in adopting this approach, with each core focusing on one or more specific layers of the Android software stack. Back at Google I/O 2013, Google announced three new location APIs (including the fused location provider) that enable location tracking without the traditional heavy battery consumption. It looks to me like Android decoupled these so that we will soon see processor cores that focus on these functions specifically.

I am fairly confident that Apple has opted for ARM's TrustZone/TEE. Implementation details of TrustZone are proprietary and therefore not public. Apple could have made revisions to the A7 chip spec and co-opted its own. But using TrustZone/TEE and SecurCore allows Apple to adopt existing standards around accessing and communicating biometric data. Apple is fully aware of the need to mature iOS into a trusted enterprise computing platform – to address the lack of low-end x86 devices that have hardware security platform tech – and this is a significant step toward that future.

What does Touch ID mean to Payments?

Apple's plans for Touch ID kick off with iTunes purchase authorizations. Beyond that, as iTunes continues to grow into a media-store behemoth, Touch ID has the potential to drive fraud risk down for Apple – and to further allow it to drive down risk as it batches up payment transactions to reduce interchange exposure. It's quite likely that, à la Walmart, Apple has negotiated rate reductions – but now it can assume more risk on the front end because it is able to vouch for the authenticity of these transactions.
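The batching point deserves a quick illustration. Card fees typically combine a percentage with a fixed per-transaction fee, so aggregating many small purchases into one charge eliminates all but one of the fixed fees. The rates below are placeholders, not Apple's negotiated pricing:

```python
def card_fees(amounts, pct=0.029, fixed=0.30):
    """Total fee for charging each amount separately, under a toy
    percentage-plus-fixed fee schedule (rates are illustrative)."""
    return sum(a * pct + fixed for a in amounts)

purchases = [0.99, 1.99, 0.99, 4.99]          # a week of small iTunes buys
separate = card_fees(purchases)                # four fixed fees paid
batched = card_fees([sum(purchases)])          # one aggregated charge
savings = separate - batched                   # three fixed fees avoided
```

At these toy rates the percentage component is identical either way, so batching saves exactly the three avoided fixed fees – which is why aggregation matters most for catalogs dominated by 99-cent transactions.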
As they say, the customer can no longer plead the fifth on those late-night weekend drunken purchase binges. Along with payment aggregation, or via iTunes gift cards, Apple now has another mechanism to reduce its interchange and risk exposure. Now imagine if Apple were to extend this capability beyond iTunes purchases and allow app developers to process in-app purchases of physical goods or real-world experiences through iTunes in return for better blended rates (instead of Paypal's 4% + $0.30). Heck, Apple could opt for short-term lending if it can effectively answer the question of identity – as it can with Touch ID. It's Paypal's Bill Me Later on steroids. Effectively: for a company like Apple, which has seriously toyed with the idea of a software SIM and a "real-time wireless provider marketplace" – where carriers bid against each other to provide you voice, messaging and data access for the day, and your phone picks the most optimal carrier – how far is that notion from picking the cheapest rate across networks for funneling your payment transactions? Based on the level of authentication provided or other known attributes – such as merchant type, location, fraud risk, customer payment history – iTunes can select across a variety of payment options to pick the one that is optimal for the app developer and for itself.

And finally, who had the most to lose with Apple's Touch ID? Carriers. I wrote about this before as well; here's what I wrote then (edited for brevity): Does it mean that Carriers have no meaningful role to play in commerce? Au contraire. They do. But it's around fraud and authentication. It's around identity. … But they seem to be stuck imitating Google in figuring out a play at the front end of the purchase funnel, to become a consumer brand (Isis).
The last thing they want to do is leave it to Apple to figure out the identity-management question, which the latter seems best equipped to answer by way of scale, the control it exerts over its ecosystem, a vertical integration strategy that allows it to fold biometrics meaningfully into its lineup, and the ability to start with its own services to offer customer value. So there had to have been much weeping and moaning and gnashing of teeth on the Carrier front with this launch. Carriers have been so focused on carving out a place in payments that they lost track of what's important: once you have solved authentication, payments is nothing but accounting. I didn't say that; Ross Anderson said it at the Kansas City Fed.

What about NFC? I don't have a bloody clue. Maybe iPhone 6?

This is a re-post from Cherian's original blog post "Smoke is rising from Apple's Conclave"
By: Matt Sifferlen I recently read interesting articles on the Knowledge@Wharton and CNNMoney sites covering the land grab that's taking place among financial services startups that are trying to use a consumer's social media activity and data to make lending decisions. Each of these companies are looking at ways to take the mountains of social media data that sites such as Twitter, Facebook, and LinkedIn generate in order to create new and improved algorithms that will help lenders target potential creditworthy individuals. What are they looking at specifically? Some criteria could be: History of typing in ALL CAPS or all lower case letters Frequent usage of inappropriate comments Number of senior level connections on LinkedIn The quantity of posts containing cats or annoying self-portraits (aka "selfies") Okay, I made that last one up. The point is that these companies are scouring through the data that individuals are creating on social sites and trying to find useful ways to slice and dice it in order to evaluate and target consumers better. On the consumer banking side of the house, there are benefits for tracking down individuals for marketing and collections purposes. A simple search could yield a person's Facebook, Twitter, or LinkedIn profile. The behaviorial information can then be leveraged as a part of more targeted multi-channel and contact strategies. On the commercial banking side, utilizing social site info can help to supplement any traditional underwriting practices. Reviewing the history of a company's reviews on Yelp or Angie's List could share some insight into how a business is perceived and reveal whether there is any meaningful trend in the level of negative feedback being posted or potential growth outlook of the company. There are some challenges involved with leveraging social media data for these purposes. 1. Easily manipulated information 2. Irrelevant information that doesn't represent actual likes, thoughts or relevant behaviors 3. 
Regulations. From a fraud perspective, most online information can easily and frequently be manipulated, creating a constantly moving target for these providers to monitor and link to the right customer. Fake Facebook and Twitter pages, false connections and referrals on LinkedIn, and fabricated positive online reviews of a business can all be accomplished in a matter of minutes. And commercial fraudsters are likely creating false business social media accounts today for shelf company fraud schemes they plan on hatching months or years down the road. As B2B review websites continue to make it easier to get customers signed up to use their services, the downside is that even more unusable information will be created, since there are fewer and fewer hurdles for commercial fraudsters to clear – particularly on sites that offer their services for free. For now, the larger lenders are more likely to utilize alternative data sources that are third-party validated, like rent and utility payment histories, while continuing to rely on tools that protect against fraud schemes. It will be interesting to see what new credit and non-credit data become common practice in the future as lenders continue their efforts to find more useful data to power their credit and marketing decisions.
Isis has had a slew of announcements – about an impending national rollout and further assertion by both Amex and Chase of their intent to continue their partnership. Surprisingly (or not) Capital One has stayed mum about its plans, and neither Barclays nor Discover has shown any interest. And much ink has been spilled over how resolute (and isolationist) Isis has been – including here on this blog. So does the launch reflect a maturity in the JV to tackle a national rollout, or is it being forced to show its hand? Wait... I have more questions. What about the missing partner? I have no reason to believe that Capital One will continue its relationship with Isis – I doubt they learnt anything new from the Isis pilot, apart from the excruciatingly difficult orchestration required to balance multiple TSMs, Carriers, Handsets and the Secure Element. Further, there are no new FI launch partners – no BofA, no Wells Fargo, no Citi – each of which is capable of paying the upfront cost to be on Isis. But even for those who can afford it, the requisite capital and operating expenditures stemming from a national rollout should give pause when weighed against the lift Isis can provide to incremental revenue via Isis wallet consumers. This is the biggest qualm for Issuers today – that Isis has all the capability to drive distribution and do secure provisioning, but none of the capacity to drive incremental card revenue. And Isis opting to profit from simply delivering merchant offers based on store proximity – with no visibility into past payment behavior and no transactional marketing capabilities – is hardly different from or better than what FourSquare already does for Amex, for example. So why bother? *Updated* There is also a total misalignment of objectives between Isis and its Issuing partners around customer acquisition. Isis charges for provisioning credentials to the wallet regardless of how many transactions may follow.
So Isis has an incentive to push its wallet to everyone with a phone, even if that person never completes a contactless transaction. Whereas its Issuers have an incentive to get the most bang for the buck by targeting the folks most likely to use a smartphone to pay after activation. See a problem? *End Update* How much more runway does Isis have? This is the question that has been around longest. How much more capital are Isis’s parents willing to plow into the JV before they come calling? The rumored quarter-of-a-billion pot holds enough to power a national rollout, but is it enough to sustain that momentum post-launch? If those $100 Amazon gift cards they were handing out in Austin/SLC to boost consumer adoption (a final push just prior to reporting overall usage numbers) were any indication, Isis needs to invest in a smarter go-to-market strategy. It wouldn’t be surprising if Isis had to go back to its parents for mo’ money so that it can continue to run – while standing absolutely still. Who has a recognizable brand – Isis or Amex/Chase? Isis once boasted about buying a billion impressions in its pilot markets across various marketing channels. I shudder to think of the ROI on that ad spend – especially when all the ads in the world could not help if a customer still had to get a new phone, or get a new SIM by visiting the carrier store, to do what a plastic card does effortlessly. Its FI partners (Chase, Amex and Capital One) have so far kept any Isis branding out of their ads, and I doubt that will change. After all, why would Amex and Chase, who collectively spent about $4.2B on advertising last year, care about giving Isis any visibility, when a Chase or an Amex customer still has to fire up an Isis app to use a Chase or an Amex card? Why would Amex and Chase dilute their brands by including Isis messaging – when they themselves are pitted against each other inside the wallet?
For some inexplicable reason, Isis made a conscious decision to become a consumer brand instead of a white-label identity, provisioning and payment platform. (And for all the faults attributable to Google – they are a consumer brand, and yet look at all the trouble they had making their payments efforts scale.) I believe that until Isis displays a willingness to let its Issuing partners play front and center, any support they in turn provide to Isis is bound to be non-committal. Have you counted the point-of-sale registers? MCX has proved to be the sand in mobile payments’ gears since its announcement. It has had “quite the intended” effect of killing any kind of forward movement on in-store payment initiatives that required a conventional point-of-sale upgrade. Contactless upgrades at the point-of-sale, which have long been tied to the EMV roadmap, have had a series of setbacks, not the least of which are the continuing ambiguity around Issuer readiness, merchant apathy, and roadblocks such as the recent ruling. More so, the ruling injected more ambiguity into how proximity payments would function, which payment apps must be supported for the same debit or credit transaction, etc. With retailers, Isis brings nothing new that others are unable to claim, and in fact it brings even less – as there is no new context outside of store-customer proximity that it can bring to deliver discounts and coupons to customer prospects. And it’s cringeworthy when someone claims to “help” retailers drive incremental traffic to stores simply because they are able to pair context and proximity among other factors. These claims are hugely suspect due to how limited their “contexts” are – no one can blend intent, past behavior, location and other factors better than Google, and even they churned out an inferior product called Google Offers. Transactional data is uniquely valuable – but Banks have been negligent in their role to do anything meaningful with it. But I digress.
Coming full circle: will we ever see proximity payments realized in a way that does not include the SE? The UICC-based Secure Element model has been the approach favored by Carriers, as it allows for device portability while letting them exert control over the proximity payments ecosystem. We have seen deviations from the norm – in the form of Bankinter, and the recent RBC/BellID secure cloud – which reject the notion of an onboard Secure Element and opt to replace it with credentials in the TEE, in memory, or in the cloud. There is much interest around this topic, but predicting which way this will turn out is difficult owing to where the power to effect change resides – in the hands of OEMs, ecosystem owners, Carriers, etc. And don’t forget that the Networks need to subscribe to this notion of credentials outside the SE as well. But what about an Isis wallet that decouples itself from NFC/SE? Google has toyed with such an approach, but it clearly has the assets (Gmail, Android et al.) to build itself a long runway. What about an Isis that exists outside of NFC/SE? Well – why would you need Isis then? To be fair, such an approach would pale against MCX or Paydiant or a number of other wallets, and offer even fewer reasons for merchants to adopt. Paydiant offers both a better point-of-sale integration and a quicker QR capture – which Isis will struggle to match. It’s abundantly clear: take away the SE, and just as easily the Carrier value proposition collapses on its own like a pack of cards. That’s one risky bet. What are your thoughts about the future of Isis? I am on Twitter here, if you wish to connect. And you can find me on LinkedIn here. This is a re-post from Cherian's original blog post "Isis: A JV at odds."
By: Maria Moynihan Government organizations that handle debt collection have similar business challenges regardless of agency focus and mission. Let’s face it, debtors can be elusive. They are often hard to find and even more difficult to collect from when information and processes are lacking. To accelerate debt recovery, governments must focus on optimization – particularly, streamlining how resources get used in the debt collection process. While the perception may be that it’s difficult to implement change given limited budgets, staffing constraints or archaic systems, minimal investment in improved data, tools and technology can make a big difference. Governments most often cite the following as their top concerns in debt collection: difficulty in finding debtors to collect on late tax submissions, fines or fees; difficulty in prioritizing collection activities – outbound letters, phone calls, and added steps in decisioning; and difficulty in incorporating new tools or technology to reduce backlogs or accelerate current processes. By simply utilizing right-party contact data and tools for improved decisioning, agencies can immediately expose the areas of greatest possible ROI. Credit and demographic data elements like address, income models, assets, and past payment behavior can all be brought together to create a holistic view of an individual or business at a point in time or over time. Collections tools for improved monitoring, segmentation and scoring can be incorporated into current systems to improve resource allotment. Staffing can then be better allocated to focus not only on which accounts to pursue by size, but by likelihood to make contact and receive payment. Find additional best practices to optimize debt recovery in this guide to Maximizing Revenue Potential in the Public Sector. Be sure to check out our other blog posts on debt collection.
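The prioritization logic described above – weighing balance size against likelihood of contact and ability to pay – can be sketched in a few lines. This is an illustrative sketch only; the field names, weights and thresholds are hypothetical, not part of any Experian product.

```python
# Illustrative sketch: rank collection accounts by blending contactability
# and ability-to-pay signals with the amount owed. All field names and
# weights are invented for this example.

def priority_score(account):
    """Blend likelihood-to-collect signals, then weight by balance."""
    score = 0.0
    if account["address_verified"]:          # right-party contact data on file
        score += 0.3
    if account["recent_payment_activity"]:   # past payment behavior
        score += 0.4
    # Crude ability-to-pay proxy from an income model, capped at 1.0.
    score += min(account["income_estimate"] / 100_000, 1.0) * 0.3
    return score * account["balance"]        # expected value of working it

accounts = [
    {"id": 1, "balance": 500, "address_verified": True,
     "recent_payment_activity": True, "income_estimate": 60_000},
    {"id": 2, "balance": 5000, "address_verified": False,
     "recent_payment_activity": False, "income_estimate": 20_000},
]

# Work the highest expected-value accounts first.
work_queue = sorted(accounts, key=priority_score, reverse=True)
```

Note how the smaller balance can outrank the larger one once contactability is factored in: quantity and value are not the same thing.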
The desire to return to portfolio growth is a clear trend in mature credit markets, such as the US and Canada. Historically, credit unions and banks have driven portfolio growth with aggressive outbound marketing offers designed to attract new customers and members through loan acquisitions. These offers were typically aligned to a particular product, with no strategy alignment between multiple divisions within the organization. Further, when existing customers submitted a new request for credit, they were treated the same as incoming new customers, with no reference to the overall value of the existing relationship. Today, however, financial institutions are looking to create more value from existing customer relationships to drive sustained portfolio growth by increasing customer retention, loyalty and wallet share. Let’s consider this idea further. By identifying the needs of existing customers and matching them to individual credit risk and affordability, effective cross-sell strategies can ensure that portfolio growth is achieved while simultaneously increasing customer satisfaction and promoting loyalty. The need to optimize customer touch-points and provide the best possible customer experience is paramount to future performance, as measured by market share and long-term customer profitability. By also responding rapidly to changing customer credit needs, you can further build trust, increase wallet share and profitably grow your loan portfolios. In the simplest sense, the more of your products a customer uses, the less likely the customer is to leave you for the competition. With these objectives in mind, financial organizations are turning towards the practice of setting holistic, customer-level credit lending parameters. These parameters are often referred to as umbrella limits, or customer lending limits.
The challenges Although the benefits of enhancing existing relationships are clear, there are a number of challenges that raise some important questions: · How do you balance the competing objectives of portfolio loan growth and managing future losses? · How do you know how much your customer can afford? · How do you ensure that customers have access to the products they need, when they need them? · What is the appropriate communication method to position the offer? Few credit unions or banks have lending strategies that differentiate between new and existing customers. In most cases, new credit requests are processed identically for both customer groups. The problem with this approach is that it fails to capture and use the power of existing customer data, which will inevitably lead to suboptimal decisions. Similarly, financial institutions frequently provide inconsistent lending messages to their clients. The following scenarios can arise when institutions fail to look across all relationships to support their core lending and collections processes: 1. A customer is refused additional credit on the facility of their choice, whilst simultaneously being offered an increase in their credit line on another. 2. A customer is extended credit on a new facility whilst being seriously delinquent on another. 3. A customer receives marketing solicitations for three different products from the same institution, in the same week, through three different channels. Essentials for customer lending limits and successful cross-selling By evaluating existing customers on a periodic (monthly) basis, financial institutions can holistically assess the customer’s existing exposure, risk and affordability. By setting customer-level lending limits in accordance with these parameters, core lending processes can be made more efficient, with superior results and enhanced customer satisfaction.
This approach can be extended to offer a fast-track application process for existing relationships with high-value, low-risk customers. Traditionally, business processes have not singled out loan applications from such individuals for preferential treatment. The core fundamentals of the approach necessary for setting holistic customer lending (umbrella) limits include: · The accurate evaluation of credit and default risk · The calculation of additional lending capacity and affordability · Appropriate product offerings for cross-sell · Operational deployment Follow my blog series over the next few months as we explore the essentials for customer lending limits and successful cross-selling.
There are two core fundamentals of evaluating loan loss performance to consider when generating organic portfolio growth through the setting of customer lending limits. Neither can be discussed without first considering what defines a “customer.” Definition of a customer The approach used to define a customer is critical for successful customer management and is directly correlated to how joint accounts are managed. Definitions may vary by how joint accounts are allocated and used in risk evaluation. It is important to acknowledge: legal restrictions on data usage related to joint account holders throughout the relationship; the impact on predictive model performance and reporting where there are two financially linked individuals with differently assigned exposures; and the complexities of multiple relationships with customers within the same household – consumer and small business. Typical customer definitions used by financial services organizations: Checking account holders: This definition groups together accounts that are “fed” by the same checking account. If an individual holds two checking accounts, then she will be treated as two different and unique customers. Physical persons: Joint accounts are allocated to each individual. If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, the joint accounts would be allocated to both Mr. Jones and Ms. Smith. Consistent entities: If Mr. Jones has sole accounts and holds joint accounts with Ms. Smith, who also has sole accounts, then three “customers” are defined: Jones, Jones & Smith, and Smith. Financially-linked individuals: Whereas consistent entities are considered three separate customers, financially-linked individuals would be considered one customer: “Mr. Jones & Ms. Smith.” When multiple and complex relationships exist, taking a pragmatic approach and defining your customers as financially linked will lead to a better evaluation of predicted loan performance.
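The “financially-linked individuals” definition is, in effect, a connected-components problem: any two people who share a joint account belong to the same customer. A minimal sketch using union-find, with illustrative names only:

```python
# Sketch of the "financially-linked individuals" customer definition:
# individuals who share a joint account are merged into one customer.

def link_customers(individuals, joint_accounts):
    """Group individuals into customers via the joint accounts they share."""
    parent = {p: p for p in individuals}

    def find(p):
        while parent[p] != p:
            parent[p] = parent[parent[p]]   # path compression
            p = parent[p]
        return p

    def union(a, b):
        parent[find(a)] = find(b)

    for holders in joint_accounts:          # each joint account links its holders
        first, *rest = holders
        for other in rest:
            union(first, other)

    groups = {}
    for p in individuals:
        groups.setdefault(find(p), set()).add(p)
    return list(groups.values())

# Mr. Jones and Ms. Smith hold a joint account, so they form one customer;
# Brown has no links and remains a customer on his own.
customers = link_customers(["Jones", "Smith", "Brown"], [["Jones", "Smith"]])
```

A unique customer reference number would then be assigned per group, which is the prerequisite for the customer-level risk grade discussed later.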
Evaluation of credit and default risk Most financial institutions calculate a loan default probability on a periodic (monthly) basis for existing loans, in the form of either a custom behavior score or a generic risk score supplied by a credit bureau. For new loan requests, financial institutions often calculate an application risk score, sometimes used in conjunction with a credit bureau score, often in a matrix-based decision. This approach falls short for new credit requests where the presence and nature of an existing relationship is not factored into the decision. In most cases, customers with existing relationships are treated in an identical manner to new applicants with no relationship – the power and value of the organization’s internal data goes overlooked, and customer satisfaction and profits suffer as a result. One way to overcome this challenge is to use a Strength of Relationship (SOR) indicator. Strength of Relationship (SOR) indicator The Strength of Relationship (SOR) indicator is a single-digit value used to define the nature of the customer’s relationship with the financial institution. Traditional approaches to assigning a SOR are based upon the following factors: the existence of a primary banking relationship (salary deposits); the number of transactional products held (DDA, credit cards); the volume of transactions; the number of loan products held; and length of time with the bank. The SOR plays a critical role in the calculation of customer-level risk grades and strategies, and is used to point us to the data that will be the most predictive for each customer. Typically, the stronger the relationship, the more we know about our customer, and the more robust our predictive models of consumer behavior will be. The more information we have on our customer, the more our models will lean towards internal data as the primary source.
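As a rough illustration, the factors above could be combined into a single-digit SOR like this. The weights and thresholds here are invented for the sketch; a real deployment would derive them empirically from portfolio data:

```python
# Hypothetical Strength of Relationship (SOR) indicator built from the
# factors listed above. All point values and cutoffs are illustrative.

def sor_indicator(customer):
    points = 0
    if customer["salary_deposits"]:               # primary banking relationship
        points += 3
    points += min(customer["transactional_products"], 2)  # DDA, credit cards
    if customer["monthly_transactions"] >= 20:            # transaction volume
        points += 1
    points += min(customer["loan_products"], 2)           # loan products held
    if customer["years_with_bank"] >= 5:                  # length of relationship
        points += 1
    return min(points, 9)    # single-digit value: 0 (weak) to 9 (strong)

strong = sor_indicator({"salary_deposits": True, "transactional_products": 3,
                        "monthly_transactions": 45, "loan_products": 2,
                        "years_with_bank": 8})
# A high SOR would steer the models toward internal data; a low SOR would
# trigger augmentation with credit bureau attributes.
```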
For weaker relationships, internal data alone may not be robust enough to calculate customer-level limits, and there will be a greater dependency on augmenting internal data with external third-party data (credit bureau attributes). As such, the SOR can be used as a tool to select the type and frequency of external data purchases. Customer Risk Grade (CRG) A customer-level risk grade or behavior score is a periodic (monthly) statistical assessment of the default risk of an existing customer. This probability relies on the assumption that past performance is the best possible indicator of future performance. The predictive model is calibrated to provide the probability (or odds) that an individual will incur a “default” on one or more of their accounts. The customer risk grade requires a common definition of a customer across the enterprise, which in turn requires a methodology for treating joint accounts. A unique customer reference number is assigned to those customers defined as “financially-linked individuals.” Account behavior is aggregated on a monthly basis, and this information is subsequently combined with information from savings accounts and third-party sources to formulate the customer view. Using historical customer information, the behavior score can accurately differentiate between good and bad credit risks. The behavior score is often translated into a Customer Risk Grade (CRG). The purpose of the CRG is to simplify the behavior score for operational purposes, making it easier for individuals outside the credit-risk function to interpret a grade than a mathematical probability. Different methods of evaluating credit risk will yield different results, and an important aspect of setting customer exposure thresholds is the ability to perform analytical tests of different strategies in a controlled environment. In my next post, I’ll dive deeper into adaptive control, champion/challenger techniques and strategy design fundamentals.
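The translation from a behavior score to a CRG is, at heart, a banding exercise: a model probability is mapped to a small set of grades that non-specialists can act on. A hedged sketch, with hypothetical band boundaries:

```python
# Illustrative mapping of a behavior score (expressed here as a default
# probability) to a Customer Risk Grade. Band boundaries are hypothetical;
# in practice they are calibrated to the institution's risk appetite.

def customer_risk_grade(default_probability):
    """Translate a model probability into a letter grade for operations."""
    bands = [
        (0.02, "A"),    # lowest default risk
        (0.05, "B"),
        (0.10, "C"),
        (0.20, "D"),
    ]
    for ceiling, grade in bands:
        if default_probability <= ceiling:
            return grade
    return "E"          # highest default risk

low_risk = customer_risk_grade(0.01)     # "A"
high_risk = customer_risk_grade(0.15)    # "D"
```

An ops team can route an “A” into a fast-track cross-sell offer and an “E” into collections review without ever handling the underlying probability.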
Related content: White paper: Improving decisions across the Customer Life Cycle
Contact information such as phone numbers and addresses is fundamental to being able to reach a debtor, but knowing when to reach out is also a crucial factor in the success or failure of getting payment. As referenced in the chart below, when a consumer enters the debtor life cycle, they often avoid talking with you about the debt because they do not have the ability to pay. When the debtor begins to recover financially, you want to be sure you are among the first to reach out to them so you can be the first to be paid. According to Don Taylor, President of Automated Collection Services, they have seen a lift of more than 12% in consumers with trigger hits entering repayment – and this on an aged portfolio that had already been actively worked by debt collection staff. Monitoring for a few key changes on the credit profiles of debtors provides the passive monitoring needed to tell you the optimal time to reach back out to the consumer for payment. Experian compiled several recent collection studies and found that a debtor paying off an account that was previously past due provided a 710% increase in the average payment. Positive improvement on a consumer’s credit profile is one of those vital indicators that the consumer is beginning to recover financially and could have the will – and ability – to pay bad debts. The collection industry is not like the big warehouse stores – quantity and value do not always go hand in hand in the debt collection industry. Targeting the high-value credit events that are proven to increase collection amounts is the key to value, and Experian has the expertise, analytics and data to help you collect in the most effective manner. Be sure to check out our other debt collection blog posts to learn how to recover debt more quickly and efficiently.
By: Joel Pruis Times are definitely different in the banking world today. Regulations, competition from other areas, specialized lenders and different lending methods have combined to produce the competitive landscape we have today. One area that is significantly different today – and for the better – is the availability of data. Data from our core accounting systems, data from our loan origination systems, data from the credit bureaus for consumers and for businesses. You name it, there is likely a data source that at least touches on the area, if not provides full coverage. But what are we doing with all this data? How are we using it to improve our business model in the banking environment? Does it even factor into the equation when we are making tactical or strategic decisions affecting our business? Unfortunately, I see too often that business decisions are being made based upon anecdotal evidence without considering the actual data. Let’s take, for example, Major League Baseball. How many statistics have been gathered on baseball? I remember as a boy keeping the stats while attending a Detroit Tigers game – writing down the lineup and what happened when each player was up to bat: strikes, balls, hits, outs, etc. A lot of stats, but were they the right stats? How did these stats correlate to whether the team won or lost? Does the performance in one game translate into predictable performance over an entire season for a player or a team? Obviously one game does not determine an entire season, but how often do we reference a single event as the basis for a strategic decision? How often do we make decisions based upon traditional methods without questioning why? Do we even reference traditional stats when making strategic decisions? Or do we make decisions based upon other factors, as the scouts of the Oakland A’s were doing in the movie Moneyball? In one scene of the movie, Billy Beane, general manager of the A’s, asks his team of scouts to define the problem they are trying to solve.
The responses are all very subjective in nature and relate only to how to replace “talented” players lost to contract negotiations, etc. Nowhere in this scene do any of the scouts provide any true stats for who they want to pursue to replace the players they just lost. Everything the scouts talk about relates to singular assessments of traits that have not been demonstrated to correlate with a team making the playoffs, let alone winning a single game. The scouts, with all of their experience, focus on the player’s swing, ability to throw, running speed, etc. At one point the scouts even talk about the appearance of the players’ girlfriends! But what if we changed how we looked at the sport of baseball? What if we modified the stats used to compile a team and to determine how much to pay for an individual player? The movie Moneyball highlights this assessment of the conventional stats and their impact or correlation to a team actually winning games and, more importantly, the overall regular season. Bill James is given credit in the movie for developing the methodology ultimately used by the Oakland A’s, a methodology also referred to as sabermetrics. In another scene, Peter Brand explains how baseball is stuck in the old style of thinking. The traditional perspective is to buy ‘players’. In viewing baseball as buying players, the traditional baseball industry has created a model/profile of what a successful or valuable player is. Buy the right talent, and then hopefully the team will win. Instead, Brand changes the buy from players to buying wins. Buying wins requires buying runs – in other words, buy enough average runs per game and you should outscore your opponents and win enough games to take your division. But why does that mean we would have to change the way we look at individual players? Doesn’t a high batting average have some correlation to the number of runs scored?
Don’t RBIs (runs batted in) have some level of correlation to runs? I’m sure there is some correlation, but as you start to look at the entire team, or at the construction of the lineup for any given game, do these stats/metrics have the best correlation – the one that leads to greater predictability of a win or, more specifically, predictability of a winning season? Similarly, regardless of how we as bankers have made strategic decisions in the past, it is clear that we have to first figure out exactly what it is we are trying to solve, what we are trying to accomplish. We have the buzz words, the traditional responses, the non-specific high-level descriptions that ultimately leave us with no specific direction. Ultimately it allows us to just continue the business-as-usual approach and hope for the best. In the next few blogs, we will continue to use the movie Moneyball as the backdrop for how we need to stir things up, identify exactly what it is we are trying to solve, and figure out how best to approach the solution.
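The sabermetric question – which team-level stats actually correlate with wins – is straightforward to test once the data is in hand. A toy sketch with invented season totals (a real analysis would pull actual league data and compare many candidate stats):

```python
# Back-of-the-envelope check of the Moneyball idea: measure how strongly a
# team-level stat correlates with wins. The season totals below are
# fabricated for illustration only.

from math import sqrt

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sqrt(sum((x - mx) ** 2 for x in xs))
    sy = sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)

# Hypothetical season totals for five teams.
runs_per_game = [4.9, 4.5, 4.2, 3.9, 3.6]
wins = [96, 88, 81, 74, 68]

r = pearson(runs_per_game, wins)
# For this toy data, r is close to 1.0 - the point being that you pick
# players to maximize the stat with the strongest tie to winning, not the
# stat tradition says matters.
```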
By: Maria Moynihan Cybersecurity, identity management and fraud are common and prevalent challenges across both the public and private sectors. Industries as diverse as credit card issuers, retail banking, telecom service providers and eCommerce merchants are faced with fraud threats ranging from first-party fraud and commercial fraud to identity theft. If you think the problem isn’t all that bad, the statistics speak for themselves: fraud accounts for 19% of the $600 billion to $800 billion in waste in the U.S. healthcare system annually; medical identity theft makes up about 3% of the 8.3 million overall victims of identity theft; in 2011, there were 431 million adult victims of cybercrime in 24 countries; and in fiscal year 2012, the IRS’ specialized identity theft unit saw a 78% spike from the prior year in the number of ID theft cases submitted. The public sector can easily apply the same best practices found in the private sector for ID verification, fraud detection and risk mitigation. Here are four surefire ways to get ahead of the problem: implement a risk-based authentication process in citizen enrollment and account management programs; include the right depth and breadth of data through public and private sources to best identity-proof businesses or citizens; offer real-time identity verification while ensuring security and privacy of information; and provide a Knowledge Based Authentication (KBA) software solution that asks applicants approved random questions based on “out-of-wallet” data. What fraud protection tactics has your organization implemented? See what industry experts suggest as best practices for fraud protection, and stay tuned as I share more on this topic in future posts. You can view past Public Sector blog posts here.
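To make the KBA suggestion concrete, here is a minimal sketch of an out-of-wallet question flow: questions are generated from records a fraudster is unlikely to hold, a random subset is asked, and a pass threshold is applied. The record fields, questions and threshold are all hypothetical.

```python
# Minimal Knowledge Based Authentication (KBA) sketch. Applicants answer
# randomly chosen questions derived from "out-of-wallet" data. All record
# fields and question wording are invented for this example.

import random

def build_questions(record):
    """Generate (prompt, expected answer) pairs from out-of-wallet data."""
    return [
        ("Which county did you live in during 2009?", record["county_2009"]),
        ("What was the approximate amount of your last auto loan payment?",
         record["auto_payment"]),
        ("Which lender holds your mortgage?", record["mortgage_lender"]),
    ]

def authenticate(record, answers, num_questions=2, required_correct=2):
    """Ask a random subset of questions; pass if enough answers match."""
    questions = random.sample(build_questions(record), num_questions)
    correct = sum(1 for prompt, expected in questions
                  if answers.get(prompt) == expected)
    return correct >= required_correct

record = {"county_2009": "Orange", "auto_payment": "$300-$400",
          "mortgage_lender": "Acme Bank"}
answers = {
    "Which county did you live in during 2009?": "Orange",
    "What was the approximate amount of your last auto loan payment?": "$300-$400",
    "Which lender holds your mortgage?": "Acme Bank",
}
authenticate(record, answers)   # True when the sampled answers match
```

Randomizing which questions are asked is what makes replaying a previously observed session less useful to a fraudster.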
By: Maria Moynihan Reduced budgets, quickly evolving technologies, a weakened economy and resource constraints are clearly impacting the Public Sector, but it’s not all doom and gloom. As always, with new challenges come new opportunities. Government agencies must still effectively run programs, optimize processes and find growth in revenue streams. Below you will find the top 5 business challenges facing the Public Sector and municipal utilities today, and ways to overcome them: 1. Difficulty finding debtors When asked to name the top challenge to their debt collection processes, governments most often point to the difficulty of locating debtors whose whereabouts don’t in fact match the information on hand. Skip tracing with right-party contact data is key to finding people or businesses for collections, and there are several cost-effective ways to do this – either through industry-leading tools or by tapping into available sources like voter registration information. 2. Difficulty in prioritizing debt collection efforts When resources are limited, it is critical to focus efforts not only by size, but by the likelihood of making contact and reaching debtors with an ability to pay. Credit and demographic data elements like income, assets, past payment behavior, and age can all be brought together to better identify the areas of greatest ROI. 3. Lack of available data By simply incorporating third-party data and analytics into an established infrastructure, agencies can immediately gain improved insight for efficient decision making. Leverage on-hand data sources to improve understanding of individuals or businesses. 4. Difficulty in incorporating tools to improve debt recovery Governments too often attempt to reduce backlogs by simply trying to accelerate processes that are suboptimal to start with. This is both expensive and unlikely to produce the desired result.
In the case of debt collection, success is driven by the tools and processes that allow for refined monitoring, segmentation and prioritization of accounts for improved decisioning. 5. Difficulty in deciding whether to outsource or continue to collect internally While outsourcing to debt collection agencies is always an option, it may not be the most resourceful one – or, in some cases, even necessary. Agencies need to weigh cost against value for each effort, and often the most effective strategy is to perform minimal efforts internally and to outsource older or skip accounts to third-party agencies. What is your agency’s biggest business challenge? See what industry experts suggest as best practices for Public Sector collections, or download Experian’s guide to Maximizing Revenue Potential in the Public Sector to learn more.