"Building a better mousetrap merely results in smarter mice" – Charles Darwin Credit card issuers generally have a good handle on fraud. They manage it to under 10bps (i.e., losses of $0.10 or less per $100 of transactions) on transactions made with a "dumb" plastic card lacking any additional context. So issuers wishing for Apple Pay fraud to fall between 2 and 3bps was not out of character, considering the protections Apple and the networks put in place to keep fraud away, including issuer support during provisioning, NFC, tokenization, a tamper-proof Secure Element and Touch ID. But fraud seems to have followed a different trajectory here. About a month post-launch, fraud appears to have come to Apple Pay (in one case as high as 600bps for an issuer I cannot name). Though what follows was written in the context of Apple Pay, much of it translates to any other competitor, irrespective of origin, scale, intent, or patron saint. Apple Pay and the Yellow Path: All card issuers participating in Apple Pay are required to build a "Yellow Path" for when card provisioning into Apple Pay requires additional bank verification. Implementation of the Yellow Path and the corresponding customer experience has varied by card issuer. Today, depending on your issuer, you could be directed to a call center, asked to authenticate via the bank's mobile app, or put through some other 2FA verification entirely. As one might expect, each carries a different level of success and friction. The few banks that opted to authenticate via their mobile apps provided a far easier and more customer-friendly provisioning experience, whereas those that opted for call center verification traded efficiency for friction, and by most reports the corresponding experience has been subpar. In fact, Apple initially marked the Yellow Path as optional for card issuers, which meant that only a couple of issuers directed much focus at it.
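The basis-point figures above are easy to make concrete. A minimal illustration (one basis point is 0.01 percent of transaction volume; the rates come from the paragraph above):

```python
# One basis point (bp) = 0.01% of transaction volume, so a 10bps fraud
# rate means $0.10 lost per $100 transacted.

def loss_per_100(bps: float) -> float:
    """Dollars lost per $100 of transactions at a given bps fraud rate."""
    return 100 * bps / 10_000

print(loss_per_100(10))   # plastic-card baseline: 0.1
print(loss_per_100(3))    # issuers' hoped-for Apple Pay ceiling: 0.03
print(loss_per_100(600))  # the reported worst case: 6.0
```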
Apple reversed its decision and made it mandatory less than a month before launch, which left issuers scrambling to build this support. Why any bank would consider this optional is beyond me. Either way, card issuers' implementations of the Apple Pay Yellow Path have proved inadequate, as I am willing to bet that most of the fraud in Apple Pay came through stolen identities. For all the paranoia around elevating your phone to be the container for all your credit cards, fraud in Apple Pay has taken more traditional and unsophisticated forms. No, iPhones weren't stolen and then used for unauthorized purchases, Touch ID was not compromised, and credentials weren't ripped out of Apple's tamper-proof Secure Element; nor did we see the much-feared but rarely attempted MITM attacks (capturing an NFC transmission and relaying it at a different terminal). Instead, fraudsters bought stolen consumer identities complete with credit card information and convinced both software and manual checks that they were legitimate customers. Fraud on Apple Pay is somewhat unique in that the Apple Pay setup is one of the first things one would do upon getting an iPhone 6. At that point the device has little to no history or context with the bank. Further, the customer most likely hasn't had time to install the bank's app or log in. It is no wonder, then, that a number of banks defaulted to "call our call center" as the Yellow Path. In an earlier post on ISIS (Softcard) I wrote about how the vast retail network, coupled with visibility into customer identity, positioned carriers as trusted partners for banks to do secure provisioning. But ISIS had other (as yet unrealized) aspirations. For all the focus on protecting transactions and plastic, e.g., via EMV and tokenization, issuance and provisioning remain the soft underbelly: underprotected and easily compromised.
And this should concern us all, because a chain is only as strong as its weakest link, and those with malice are almost always the first to find it. Fraud in Apple Pay will, in time, come to be managed, but the fact that easily available PII can waylay best-in-class protection should give us all pause. Make sure to download our fraud prevention whitepaper to gain more insight on how you can prepare your business. This post originally appeared here.
This season’s peak week, the Wednesday before Thanksgiving through the Tuesday after Cyber Monday, had an 18 percent increase in email volume, an 11 percent rise in transactions and a 7 percent increase in email revenue in comparison to peak week 2013. Cyber Monday provided 27 percent of total peak week revenue followed by Black Friday, which accounted for 18 percent of revenue. Marketers can design more successful holiday campaigns by staying on top of the latest email trends. View the December Holiday Hot Sheet
Experian's most recent State of Credit report analyzed the average credit scores for more than 100 metropolitan statistical areas (MSAs).
This is the second post in a three-part series. Imagine a traveler arriving in a never-before-visited culture. The opportunity is the new sights, cuisine and cultural experiences. Among the risks are never-before-encountered pathogens and the uncertain strength of the overall health services infrastructure. In a similar vein, all too frequently we see the following conflict within our client institutions. The internal demands of an ever more competitive landscape drive businesses to seek more data; improved ease of accessibility and manipulation of data; and acceleration in creating new attributes to support more complex analytic solutions. At the same time, requirements for good governance and heightened regulatory oversight drive improved documentation and controlled access, all with improved monitoring and documented, tested controls. As always, the traveler/businessman must respond to the environment, and the best medicine is to be well-informed of both the perils and the opportunities. The good news is that we have seen many institutions invest significantly in their audit and compliance functions over the past several years. This has provided the lender with both better insight into its current risk ecosystem and an improved skill set to continue to refine that insight. The opportunity is for the lender to leverage this new strength. For many lenders, this investment largely has been a response to broadening regulatory oversight: ensuring there are proper protocols in place to confirm adherence to relevant rules and regulations and to identify issues of disparate impact. A list of the more high-profile regulations would include:

Equal Credit Opportunity Act (ECOA) — to facilitate enforcement of fair lending laws and enable communities, governmental entities and creditors to identify business and community development needs and opportunities of women-owned, minority-owned and small businesses.
Home Mortgage Disclosure Act (HMDA) — to require mortgage lenders to collect and report additional data fields.
Truth in Lending Act (TILA) — to prohibit abusive or unfair lending practices that promote disparities among consumers of equal creditworthiness but of different race, ethnicity, gender or age.
Consumer Financial Protection Bureau (CFPB) — evolving rules and regulations with a focus on perceptions of fairness and value through transparency and consumer education.
Gramm-Leach-Bliley Act (GLBA) — requires companies to give consumers privacy notices that explain the institutions' information-sharing practices. In turn, consumers have the right to limit some, but not all, sharing of their information.
Fair Debt Collection Practices Act (FDCPA) — provides guidelines for collection agencies seeking to collect legitimate debts while providing protections and remedies for debtors.

Recently, most lenders have focused their audit/compliance activities on the analytics, models and policies used to treat consumer/client accounts and relationships. This focus is understandable, since it is these analytics and models that are central to the portfolio performance forecasts and Comprehensive Capital Analysis and Review (CCAR)–mandated stress-test exercises that have received greater emphasis in responding to recent regulatory demands. Thus far at many lenders, this same rigor has not yet been applied to the data itself, which is the core component of these policies and frequently complex analytics. The strength of both the individual consumer–level treatments and the portfolio-level forecasts is negatively impacted if the data underlying these treatments is compromised. This data/attribute usage ecosystem demands clarity and consistency in attribute definition and extraction, and in new attribute design, implementation in models and treatments, validation and audit.
When a lender determines a need to enhance its data governance infrastructure, Experian® is a resource to be considered. Experian has this data governance discipline within its corporate DNA, and for good reason. Experian receives large and small files daily from tens of thousands of data providers. To be sure the data is of high quality and does not contaminate the legacy data, rigorous audits of each file received are conducted and detailed reports are generated on issues of quality and exceptions. This information is shared with the data provider for a cycle of continuous improvement. To further enhance the predictive insights of the data, Experian then develops new attributes and complex analytics leveraging the base and developed attributes for analytic tools. This data and the analytic tools are then utilized by thousands of authorized users/lenders, who manage broad-ranging relationships with millions of individuals and small businesses. These individuals and businesses in turn have the right to challenge Experian on errors both perceived and actual. This demanding cycle underscores the value of the data and of our rigorous data governance infrastructure. This very same process occurs at many lenders' sites. Certainly, a similar level of data integrity, born of a comprehensive data governance process, is warranted there as well. In the next and final blog in this series, we will explore how a disciplined business review of an institution's data governance process is conducted. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.
41st Parameter, a part of Experian, surveyed 250 marketers to understand the relationship between omnichannel retailing, fraud prevention and the holiday shopping season. The findings show that few marketers understand the full effect of fraud-prevention systems on their activities: 60% of marketers were unsure of the cost of fraud to their organization. The survey also indicated that 40% of marketers said their organization had been targeted by hackers or cybercriminals. Download the Holiday Marketing Fraud Survey: http://snip.ly/JoyF With holiday shopping in full stride, 35% of businesses said they planned to increase their digital spend for the 2014 holiday season. Furthermore, Experian Marketing Services reported that during 2014, 80% of marketers planned on running cross-channel marketing campaigns. As marketers integrate more channels into their campaigns, new challenges emerge for fraud-risk managers, who face continuous pressure to adopt new approaches. Here are three steps to help marketers and risk managers maintain a frictionless experience for customers:

1. Marketers should communicate their plans early to the fraud-risk team, especially if they are planning to target a new or unexpected audience. Making this part of the process will reduce the chances that risk management will stop or inhibit customers.
2. Ensure that marketers understand what the risk-management department is doing with respect to fraud detection. Chances are risk managers are waiting to tell you.
3. Marketers shouldn't assume that fraud won't affect their business; they should talk to their risk-management division to learn how much fraud truly costs the company. Then they can understand what they need to do to make sure their marketing efforts are not thwarted.
"Marketers spend a great deal of time and money bringing in new customers and increasing sales, especially this time of year, and in too many cases, those efforts are negated in the name of fraud prevention," said David Britton, vice president of industry solutions, 41st Parameter. "Marketers can help an organization's bottom line by working with their fraud-risk department to prevent bad transactions from occurring while maintaining a seamless customer experience. Reducing fraud is important and protecting the customer experience is a necessity." Few marketers understand the resulting impact of transactions declined because of suspected fraud, and this is even more pronounced among small businesses, with 70% saying they were unsure of fraud's impact. Fifty percent of midsized-business marketers and 67% of large-enterprise marketers were unsure of the impact of fraud as well. An uncoordinated approach to new customer acquisition can result in lost revenue affecting the entire organization. For example, the industry average for card-not-present declines is 15%. However, one to three percent of those declined transactions turn out to be valid, equating to $1.2 billion in lost revenue annually. Wrongfully declined transactions become only more costly as the growth of cross-channel marketing increases and a push toward omnichannel retailing pressures marketers to find new customers. "Many businesses loosen their fraud detection measures during peak times because they don't have the tools to manually review potentially risky orders during the higher-volume holiday shopping period," said Britton. "Criminals look to capitalize on this and exploit these gaps in any way possible, taking an omnifraud approach to maximizing their chances of success.
Striking the right balance between sales enablement and fraud prevention is the key to maximizing growth for any business at all times of the year.” Download Experian’s fraud prevention report to learn more about how businesses can address these new marketing challenges.
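The false-decline arithmetic cited above (a 15 percent card-not-present decline rate, with 1 to 3 percent of declines actually valid) can be sketched as follows. The volume figure is an assumption chosen only to illustrate the scale, not an Experian statistic:

```python
# Illustrative false-decline math. The $800B CNP volume is an assumed
# figure; the decline and false-positive rates come from the text above.

def false_decline_loss(volume: float, decline_rate: float,
                       false_positive_share: float) -> float:
    """Revenue lost to valid orders wrongly declined as fraud."""
    return volume * decline_rate * false_positive_share

annual_cnp_volume = 800e9   # assumed annual card-not-present sales ($)
decline_rate = 0.15         # industry-average CNP decline rate
for share in (0.01, 0.03):  # 1-3% of declines are actually valid
    lost = false_decline_loss(annual_cnp_volume, decline_rate, share)
    print(f"{share:.0%} of declines valid -> ${lost / 1e9:.1f}B lost")
```

At the assumed volume, a 1 percent false-positive share reproduces the $1.2 billion annual loss quoted above.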
Opening a new consumer checking account in the 21st century should be simple and easy to understand, right? Unfortunately, not all banks have 21st-century systems or processes, reflecting the fact that negotiable order of withdrawal (NOW) accounts, or checking accounts, were introduced decades ago and often required the consumer to open the account in person. A lot has changed, and consumers demand simpler, more transparent account opening processes, with product choices that match their needs at a price they are willing to pay. Financial institutions that leverage modernized technology capabilities and relevant decision information have the best chance to deliver consumer-friendly experiences that meet those expectations. It is obvious to consumers when we in the financial services industry get it right and when we don't. The process of opening a checking account should be easily understood by consumers and should offer appropriate product choices that aren't "one size fits all." Banks with more advanced core-banking systems, incorporating relevant and compliant decision data and transparent, consumer-friendly approval processes, have a huge opportunity to differentiate themselves from competitors. The reality is that banking deposit management organizations throughout the United States continue to evolve their check screening strategies, technology and processes. This is done in an effort to keep up with evolving expectations from consumer advocacy regulators such as the Consumer Financial Protection Bureau (CFPB), and it is designed to improve the transparency of checking account screening for new accounts for an increased number of consumers.
The CFPB advocates that financial institutions adopt new checking account decision processes and procedures that maintain sound management practices for mitigating fraud and risk expense while improving consumer transparency and increasing access to basic consumer financial instruments. Bank shareholders demand that these accounts be extended to consumers profitably. The CFPB recognizes that checking accounts are a basic financial product used by almost all consumers, but it has expressed concern that checking account screening processes may prevent access for some consumers and may be too opaque about the reasons a consumer may be denied an account. The gap between the expectations of the CFPB and shareholders and bank deposit management organizations' current products and procedures is not as wide as it may seem. Closing the gap involves deploying a more holistic approach to checking account screening, utilizing 21st-century technology and decision capabilities. Core banking technology and checking products developed decades ago leave banks struggling to enact much-needed improvements for consumers. The CFPB recognizes that many financial institutions rely on reports provided by specialty consumer reporting agencies (CRAs) to decision approval for new customers. CRAs specialize in checking account screening and provide financial institutions with consumer information that is helpful in determining whether a consumer should be approved. Information such as the consumer's check-writing and account history, including closed accounts or bounced checks, is an important factor in determining eligibility for the new account. Financial institutions are also allowed to screen consumers for credit risk when deciding whether to open a consumer checking account, because many consumers opt in to overdraft functionality attached to the checking account.
Richard Cordray, the CFPB Director, clarified the regulatory agency's position on how consumers are treated in checking account screening processes in his prepared remarks at a forum on this topic in October 2014: "The Consumer Bureau has three areas of concern. First, we are concerned about the information accuracy of these reports. Second, we are concerned about people's ability to access these reports and dispute any incorrect information they may find. Third, we are concerned about the ways in which these reports are being used." The CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:

Increase the accuracy of data used from CRAs
Identify how institutions can incorporate risk-screening tools while not excluding potential accountholders unnecessarily
Ensure consumers are aware and notified of the information used to decision the account opening process
Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs

Implementing these steps shouldn't be too difficult for deposit management organizations as long as they fully leverage software such as Experian's PowerCurve customized for deposit account origination and relevant decision information such as Experian's Precise ID Platform and VantageScore® credit score, combined with consumer product offerings developed within the bank and offered in an environment that is real-time where possible and considers the consumer's needs. Enhancing checking account screening procedures by taking into account the consumer's life stage, affordability considerations, unique risk profile and financial needs will satisfy the expectations of consumers, regulators and financial institution shareholders.
Financial institutions that use technology and data wisely can reduce expenses for their organizations by efficiently managing fraud, risk and operating costs within the checking account screening process while also delighting consumers. Regulatory agencies are often delighted when consumers are happy. Shareholders are delighted when regulators and consumers are happy. Reengineering checking account opening processes for the modern age results in a win-win-win for consumers, regulators and financial institutions. Discover how an Experian Global Consultant can help you with your banking deposit management needs.
By: John Robertson Capital is the lifeblood of financial institutions and has come under greater scrutiny since the global credit crisis. How one manages capital is primarily driven by how well one manages risk. Using economic capital to measure profitability enhances risk management efforts by providing a common indicator for risk. It provides pricing metrics such as RAROC (risk-adjusted return on capital) and economic value added, which include expected and unexpected losses and consequently broaden the evaluation of capital adequacy in relation to the bank's overall risk profile. The first accounts of economic capital date back to the ancient Phoenicians, who took rudimentary tallies of frequency and severity of illnesses among rural farmers to gain an intuition of expected losses in productivity. These calculations were advanced by correlations with predictions of climate change, political outbreak and birth rate change. The primary value of economic capital is its application to decision-making and overall risk management. Economic capital is a measure of risk, not of capital held. It represents the amount of money needed to secure survival in a worst-case scenario; it is a buffer against unexpected shocks in market values. Economic capital measures risk using economic realities rather than accounting and regulatory rules, which can be misleading. The concept of economic capital differs from regulatory capital in that regulatory capital is the mandatory capital regulators require to be maintained, while economic capital is the best estimate of required capital that financial institutions use internally to manage their own risk and to allocate the cost of maintaining regulatory capital among different units within the organization.
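A rough sketch of how the metrics above fit together. The inputs, and the simplified formulas (expected loss as PD × LGD × EAD; economic capital supplied as a given buffer), are illustrative assumptions, not a prescribed methodology:

```python
# Minimal RAROC illustration. All figures are invented for the example.

def expected_loss(pd: float, lgd: float, ead: float) -> float:
    """Expected loss = probability of default x loss given default x exposure."""
    return pd * lgd * ead

def raroc(revenue: float, expenses: float, el: float,
          economic_capital: float) -> float:
    """Risk-adjusted return on capital: risk-adjusted income over economic capital."""
    return (revenue - expenses - el) / economic_capital

ead = 1_000_000  # exposure at default ($)
el = expected_loss(pd=0.02, lgd=0.45, ead=ead)  # $9,000 expected loss
print(raroc(revenue=60_000, expenses=20_000, el=el, economic_capital=80_000))
```

Pricing can then be judged by whether the resulting RAROC clears the institution's hurdle rate.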
The allocation of economic capital to support credit risk begins with inputs similar to those used to derive expected losses but considers other factors to determine unexpected losses, such as credit concentrations and default correlations among borrowers. Economic capital credit risk modeling measures the incremental risk a transaction adds to a portfolio rather than the absolute level of risk associated with an individual transaction. In a previous blog I restated a phrase I had heard long ago: "Margins will narrow forever." How well you manage your capital will help you extend "forever." Has your institution started using these types of risk measures? The Phoenicians did. Learn more about our credit risk solutions.
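The portfolio view described above can be illustrated with a toy two-loan example. The loan parameters and the pairwise default correlation are assumptions, and real economic capital models are far richer, but the sketch shows why a transaction's risk contribution depends on correlations, not just its standalone risk:

```python
import math

def standalone_ul(pd: float, lgd: float, ead: float) -> float:
    """Standalone unexpected loss (loss std dev, deterministic LGD)."""
    return ead * lgd * math.sqrt(pd * (1 - pd))

def portfolio_ul(uls: list, corr: list) -> float:
    """Portfolio unexpected loss given a pairwise correlation matrix."""
    n = len(uls)
    return math.sqrt(sum(corr[i][j] * uls[i] * uls[j]
                         for i in range(n) for j in range(n)))

uls = [standalone_ul(0.02, 0.45, 1_000_000),
       standalone_ul(0.01, 0.40, 2_000_000)]
corr = [[1.0, 0.2], [0.2, 1.0]]  # assumed default correlation
total = portfolio_ul(uls, corr)

# Each loan's risk contribution: its correlation-weighted share of
# portfolio UL (contributions sum back to the portfolio total).
contributions = [uls[i] * sum(corr[i][j] * uls[j] for j in range(2)) / total
                 for i in range(2)]
print(total, contributions)
```

Because the correlation is below 1, the portfolio UL is less than the sum of the standalone ULs: the diversification benefit that drives incremental-risk pricing.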
Originally contributed by: Bill Britto Smart meters have made possible new services for customers, such as automated budget assistance and bill management tools, energy use notifications, and "smart pricing" and demand response programs. It is estimated that more than 50 million smart meters had been deployed as of July 2014. Utilities and customers alike are benefiting from these deployments. It is now obvious that the world of utilities is changing, and companies are beginning to cater more to their customers by offering them tools to keep their energy costs lower. For example, several companies offer prepay to some of their customers who do not have bank accounts. For many of those "unbanked" customers, prepay could be the only way to sign up for utility services. Understanding the value of prospects, and the need to automate decisions to achieve higher revenue and curb losses, is imperative for the utility. It is here that a decisioning solution like PowerCurve OnDemand can make a real difference for utility customers by providing decision strategies that adapt to market dynamics and to business and economic environments. Imagine what a best-in-class decision solution can do by identifying what matters most about consumers and businesses and by leveraging internal and external data assets to replace complexity with cost efficiency. Solutions like PowerCurve OnDemand deliver the power and speed to market to respond to changing customer demands, driving profitability and growing customer lifetime value: good for business and good for customers.
A new co-marketing agreement pairing MainStreet Technologies' (MST) Loan Loss Analyzer product with Experian Decision Analytics' Baker Hill Advisor® product will provide the banking industry with a comprehensive, automated loan-management offering. The combined products give banks greater confidence in loan management and loan-pricing calculations. Experian Decision Analytics' Baker Hill Advisor product supports banks' commercial and small-business loan operations comprehensively, from procuring new loans through collections. MST's Loan Loss Analyzer streamlines the estimation and documentation of the Allowance for Loan and Lease Losses (ALLL), the bank's most critical quarterly calculation. The MST product automates the most acute processes required of community bankers in managing their commercial and small-business loan portfolios. Both systems are data-driven, configurable and designed to accommodate existing bank processes. The products already work together effectively for community banks of varying asset sizes, adding efficiency and accuracy while addressing today's increasingly complex regulatory requirements. "Experian's Baker Hill Advisor product-development priorities have always been driven by our user community. Changes in regulatory and accounting requirements have our clients looking for a sophisticated ALLL system. Working with MainStreet, we can refer our clients to an industry-leading ALLL platform," said John Watts, Experian Decision Analytics director of product management. "The sharing of data between our organizations creates an environment where strategic ALLL calculations are more robust and tactical lending decisions can be made with more confidence. It provides clients a complete service at every point within the organization." "Bankers, including many using our Loan Loss Analyzer, have used Experian's Baker Hill® software to manage their commercial loan programs for more than three decades," said Dalton T. Sirmans, CEO and MST president.
"Bankers who choose to implement Experian's Baker Hill Advisor and the MST Loan Loss Analyzer will be automating their loan management, tracking, reporting and documentation in the most comprehensive, user-friendly and feature-rich manner available." For more information on MainStreet Technologies, please visit http://www.mainstreet-tech.com/banking. For more information on Baker Hill, visit http://ex.pn/BakerHill
This is the first post in a three-part series. You've probably heard the adage "There is a little poison in every medication," typically attributed to Paracelsus (1493–1541), the father of toxicology. The trick, of course, is to prescribe the correct balance of agents to improve the patient while doing the least harm. One might think of data governance in a similar manner. A well-disciplined and well-executed data governance regimen provides significant improvements to the organization. Conversely, an overly restrictive, poorly designed or ineffectively monitored data governance ecosystem can result in significant harm: less-than-optimal models and scorecards, inaccurate reporting, imprecise portfolio outcome forecasts and poor regulatory reports, subsequently resulting in significant remediation investment and loss of reputation. In this blog series, we will address the issues and best practices associated with the broad mandate of data governance. In its simplest definition, data governance is the management of the availability, usability, integrity and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures and a plan to execute those procedures. On quick reflection, though, effective data governance is not simple at all. After all, data is ubiquitous, is becoming ever more available, encompasses aspects of our digital lives not envisioned as little as 15 years ago and is constantly changing as people's behavior changes. To add another level of complexity, regulatory oversight is becoming more pervasive, as regulations passed since the Great Recession have become more intrusive, granular and demanding. When addressing issues of data governance, lenders, service providers and insurers find themselves trying to incorporate a wide range of issues. Some of these are time-tested best practices, while others were never previously considered.
Here is a reasonable checklist of data governance concerns to consider:

Who owns the data governance responsibility within the organization?
Is the data governance group seen as an impediment to change, or is it a ready part of the change management culture?
Is the backup and retrieval discipline — redundancy and recovery — well-planned and periodically tested?
How agile/flexible is the governance structure with respect to new data sources?
How does the governance structure document and reconcile similar data across multiple providers?
Are there appropriate and documented approvals and consents from the data provider(s) for all disclosures?
Are systemic access and modification controls and reporting fully deployed and monitored for periodic refinement?
Does the monitoring of data integrity, persistence and entitled access enable a quick-fix culture where issues are identified and resolved at the source of the problem rather than settled by downstream processes?
Are all data sources, including those that are proprietary, fully documented and subject to systemic accuracy/integrity reporting?
Once obtained, how is the data stored and protected, in both definition and accessibility?
How do we alter data and leverage the modified outcome?
Are there reasonable audits and tracking of downstream reporting?
In the event of a data breach, does the organization have well-documented protocols and notification thresholds in place?
How recently and to what extent have all data retrieval, manipulation, usage and protection policies and processes been audited?
Are there scheduled, periodic reports made to the institution's board on issues of data governance?

Certainly, many institutions have most of these aspects covered. However, "most" is imprecise medicine, and ill effects are certain to follow. As Paracelsus stated, "The doctor can have a stronger impact on the patient than any drug." As in medical services, for data governance initiatives those impacts can be beneficial or harmful.
In our next blog, we'll discuss observations of client data governance gaps and lessons learned in evaluating existing data governance ecosystems. Make sure to read the Compliance as a Differentiator perspective paper for deeper insight on regulations affecting financial institutions and how you can prepare your business. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.
By: Ori Eisen This article originally appeared on WIRED. When I started 41st Parameter more than a decade ago, I had a sense of what fraud was all about. I'd spent several years dealing with fraud at VeriSign and American Express. As I considered the problem, I realized that fraud was something that could never be fully prevented. It's a dispiriting thing to accept that committed criminals will always find some way to get through even the toughest defenses. Dispiriting, but not defeating. The reason I chose to dedicate my life to stopping online fraud is that I saw where the money was going. Once you follow the money and see how it is used, you can't "un-know." The money ends up supporting criminal activities around the globe; it is not used to buy grandma a gift. Over the past 10 years the nature of fraud has become more sophisticated and systematized. Gone are the days of the lone-wolf hacker seeing what he could get away with. Today, those days seem almost simple. Not that I should be saying it, but fraud and the people who perpetrated it had a cavalier air about them, a bravado. It was as if they were saying, in the words of my good friend Frank Abagnale, "catch me if you can." They learned to mimic the behaviors and clone the devices of legitimate users. This allowed them to have a field day, attacking all sorts of businesses and siphoning away their ill-gotten gains. We learned too. We learned to look hard and close at the devices that attempted to access an account. We looked at things that no one knew could be seen. We learned to recognize all of the little parameters that together represented a device. We learned to notice when even one of them was off. The days of those early fraudsters have faded. New forces are at work to perpetrate fraud on an industrial scale. Criminal enterprises have arisen. Specializations have emerged.
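The parameter-level scrutiny described above can be caricatured in a few lines. The attributes, values and threshold here are invented for illustration and bear no relation to 41st Parameter's actual techniques:

```python
# Toy sketch of device-parameter matching: compare an incoming device's
# attributes against the profile on file and flag unexpected drift.

KNOWN_DEVICE = {
    "user_agent": "Mozilla/5.0 (iPhone; CPU iPhone OS 8_1)",
    "screen": "1334x750",
    "timezone": "UTC-5",
    "language": "en-US",
    "fonts_hash": "a1b2c3",
}

def mismatches(candidate: dict) -> list:
    """Parameters that differ from the profile on file."""
    return [k for k, v in KNOWN_DEVICE.items() if candidate.get(k) != v]

def looks_suspicious(candidate: dict, tolerance: int = 1) -> bool:
    # Even a small number of unexpected changes warrants a closer look.
    return len(mismatches(candidate)) > tolerance

cloned = dict(KNOWN_DEVICE, timezone="UTC+3", language="ru-RU")
print(mismatches(cloned))        # ['timezone', 'language']
print(looks_suspicious(cloned))  # True
```

Real systems weight parameters by stability (a timezone changes more often than a font list) rather than counting raw mismatches.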
Brute force attacks, social engineering, sophisticated malware – all these tools, and so many more – are being applied every day to cracking various security systems. The criminal underworld is awash in credentials, which are being used to create accounts, take over accounts and commit fraudulent transactions. The impact is massive. Every year, billions of dollars are lost due to cybercrime. Aside from the direct monetary losses, customers lose faith in brands and businesses, resources need to be allocated to reviewing suspect transactions, and creativity and energy are squandered chasing down new risks and threats. To make life just a little simpler, I operate from the assumption that every account, every user name and every password has been compromised. As I said at the start, fraud isn't something that can be prevented. By hook or by crook (and mainly by crook), fraudsters are finding cracks they can slip through; it's bound to happen. By watching carefully, we can see when they slip up, and stop them from getting away with their intended crimes. If the earliest days of fraud saw impacts on individuals, and fraud today is impacting enterprises, the future of fraud is far more sinister. We're already seeing hints of fraud's dark future. Stories are swirling around the recent Wall Street hack. The President and his security team were watching warily, wondering if it was the result of state-sponsored activity. Rather than just hurting businesses or their customers, we're on the brink (if we haven't crossed it already) of fraud being used to destabilize economies. If that doesn't keep you up at night, I don't know what will. Think about it: in less than a decade we have gone from fraud being an isolated irritant (not that it wasn't a problem) to being viewed as a potential, if clandestine, weapon. The stakes are no longer the funds in an account or even the well-being of a business. Today – and certainly tomorrow – the stakes will be higher. 
Fraudsters – terrorists, really – will look for ways to nudge economies toward the abyss. Sadly, the ability of fraudsters to infiltrate legitimate accounts and networks will never be fully stifled. The options available to them are just too broad for every hole to be plugged. What we can do is recognize when they've made it through our defenses and prevent them from taking action. It's the same approach we've always had: they may get in, while we do everything possible to prevent them from doing harm. In an ideal world the bad guys would never get through in the first place; but we don't live in an ideal world. In the real world they're going to get in. Knowing this isn't easy. It isn't comforting or comfortable. But in the real world there are real actions we can take to protect the things that matter – your money, your data and your sense of security. We learned how to fight fraud in the past; we are fighting it with new technologies today; and we will continue to apply insights and new approaches to protect our future. Download our Perspective Paper to learn about a number of factors that are contributing to the evolving fraud landscape.
Through all the rather "invented conflict" of MCX vs Apple Pay by the tech media these last few weeks – very little diligence was done on why merchants have come to reject NFC (near field communication) as the standard of choice. Maybe I can provide some color here – both as to why merchants have traditionally viewed this channel with suspicion leading up to CurrentC choosing QR, and why I believe it's time for merchants to give up hating on a radio. Why do merchants hate NFC? Traditionally, any contactless usage in stores stemmed from international travelers, fragmented mobile NFC rollouts and a cornucopia of failed products using a variety of form factors – all of which were effectively a contactless chip card with some plastic around it. What merchant support existed tended to be in the QSR space – the biggest being McDonald's – and those merchants saw little to no volume to justify the upgrade costs. Magstripe, on the other hand, was a form factor that was far more accessible. It was cheap to manufacture, provisioning was a snap, and distribution depended primarily on USPS. Retailers used the form factor themselves for gift cards, prepaid and private label. In contrast, contactless adds complexity to all three – production, provisioning and distribution. If it's a contactless card – all three can still follow pretty much the norm – as cards require no customization or changes post-production. Mobile NFC was an entirely different beast, depending on a litany of stakeholders in the value chain: hardware (OEM and chipset support, the NFC controller, the Secure Element), OS support for the NFC stack, services (Trusted Service Managers of each flavor – SE vs SP), carriers (in the case of OTA provisioning) – and the list goes on. The NFC ecosystem truly deters new entrants by its complexity and costs. Next – there was much ambiguity as to what NFC/contactless could come to represent at the point of sale. 
NFC was pitched to merchants as an open standard that could ferry any type of credential – both credit and debit. But even though merchants prefer debit, the true price of a debit transaction varies depending on which set of rails carries it – PIN debit vs signature debit. The lack of any PIN debit network in the contactless paradigm made merchants' fears real: that all debit transactions through NFC would be carried over the more costly signature debit route (favoring V/MA), and that a shift from magstripe to contactless would mean the end of another cost advantage – merchants' ability to steer transactions towards cheaper rails. The 13 or so PIN debit networks are missing from Apple Pay – an absence that weighed heavily in merchants' decision to be suspicious of it. Maybe even more important for the merchant – since it has little to do with payment – loyalty was a component that NFC inadequately addressed. NFC was effective as a secure communications channel – but wholly inadequate when it came to transferring loyalty credentials, coupons and the other things that justify why merchants would invest in a new technology in the first place. The contactless standards for moving non-payment information centered around ISO 18092 – which had fragmented acceptance in the retail space and still struggled with a rather constricted pipe. NFC was simply useful as a payments standard; when it came to loyalty, the "invented a decade ago" standard was wholly inadequate to do anything meaningful at the point of sale. If the merchant must wrestle with new ways to do loyalty – should they go back in time to enable payments, or should they jerry-rig payments to be wrapped into loyalty? What looks better to a merchant? Sending a loyalty token along with the payment credential (via ISO 18092), OR encapsulating a payment token (as a QR code) inside the Starbucks loyalty app? I would guess – the latter. 
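To make that packaging choice concrete, here is a minimal sketch of the two payloads – in Python, with entirely invented field names and token values (no real wallet, network or loyalty-scheme format is being reproduced here):

```python
import json

# Option A: NFC-style – the merchant's loyalty token rides alongside the
# payment credential, inside a payload the payment enabler controls.
nfc_payload = {
    "payment_token": "tok_4111_example",   # invented token value
    "loyalty_token": "loy_12345_example",  # merchant's token, handed to the enabler
}

# Option B: QR-style – the merchant's loyalty app is the outer envelope,
# and the payment token is just one field encapsulated inside it.
qr_payload = json.dumps({
    "loyalty_member": "loy_12345_example",
    "offers": ["coupon_10_off"],           # merchant-defined, no one's permission needed
    "payment_token": "tok_4111_example",   # payment wrapped inside loyalty
})

decoded = json.loads(qr_payload)
print(decoded["payment_token"])
```

The structural point is who owns the envelope: in Option B the merchant can add offers, stored value or any other field without negotiating with the payment enabler – which is why the Starbucks model looks better to a merchant.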
Even more so because, in the scenario of accepting a loyalty token alongside an NFC payment, you are trusting the payment enabler (Apple, Google, Networks, Banks) with your loyalty token. Why would you? The reverse makes sense for a merchant. Finally – traditional NFC payments (before Host Card Emulation in Android), apart from being needlessly complex, mandated that all communication between the NFC-capable device and the point-of-sale terminal be limited to the Secure Element that hosts the credential and the payment applets. Which means that if you did not pay your way into the Secure Element (which, for the most part, only issuers did) then you had no play. What's a merchant to do? So if you are a merchant – you are starting off at a disadvantage – as those terminologies and relationships are alien to you. Merchants did not own the credential – unless it was prepaid or private label – and even then, the economics wouldn't make sense to put those in a Secure Element. Further, merchants had no control over the issuer's choice of credential in the Secure Element – which tended to be mostly credit. It was then no surprise that merchants largely avoided this channel – and then gradually started to look at it with suspicion around the same time banks and networks began to pre-ordain NFC as the next stage in payment acceptance evolution. Retailers, who by then had been embroiled in a number of legal skirmishes on the interchange front, saw this move as the next land grab. If merchants could not cost-effectively compete in this new channel – then credit was most likely to become the most prevalent payment option within it. This suspicion was further reinforced with the launch of GoogleWallet, ISIS and now Apple Pay. Each of these wrapped existing rails, maintained the status quo and allowed issuers and networks to bridge the gap from plastic to a new modality (smartphones) while changing little else. This is no mere paranoia. 
Merchants fear that issuers and networks will ultimately use the security and convenience proffered through this channel as an excuse to raise rates again. Or squeeze out the cheaper alternatives – as they did in defaulting to signature debit over PIN debit for contactless. As consumers learn a new behavior (tap and pay), merchants fear that magstripe will be eclipsed and a high-cost alternative will take root. How is it fair that to access their customers' funds – our money – one has to go through toll gates that are incentivized to charge higher prices? The fact that there are few to no alternatives between using cash and using a bank-issued instrument to pay for things should worry us as consumers. As long as merchants are complacent about the costs in place for them to access our money – there won't be much of an incentive for banks to find quicker and cheaper ways to move money in and out of the system as a whole. I digress. So the costs and complexities I pointed to before, which existed in the NFC payments ecosystem, served not only to keep retailers out, but also impacted issuers' ability to scale NFC payments. These costs materialized into higher-interchange cards being the default when these initiatives took flight – partly because the issuer was already losing money, and had little interest in enabling debit as a payment choice. GoogleWallet itself had to resort to a bit of "negative margin strategy" to allow debit cards to be used within it. ISIS had little to no clout, nor any interest in pushing issuers to pick debit. All of which must have been quite vexing for an observant merchant. Furthermore, just as digital and mobile offer newer ways to interact with consumers – they also portend a new reality: new ecosystems are taking shape across that landscape. And these ecosystems are hardly open – Facebook, Twitter, Google, Apple – each with its own toll gates as well. 
Finally – a retail payments friend told me recently that merchants view the plethora of software, systems and services that encapsulate cross-channel commerce as a form of "Retailer OS". And if payment acceptance devices are end-points into that closed ecosystem of systems and software – they are rightfully hesitant to hand over those keys to the networks and banks. The last thing they want to do is let someone else control those toll gates. It makes sense, and ironically – it has a parallel in the iOS ecosystem. Apple's MFi program is an example of an ecosystem owner choosing to secure those end-points – especially when they are manufactured by a third party. This is why Apple exacts a toll and mandates that third-party iOS accessory manufacturers include an Apple IC to securely connect and communicate with an iOS device. If Apple can mandate that, then why should a retailer have no say over the end-points through which payments occur in its own retail ecosystem? Too late to write about how the retailer view of NFC must evolve – in the face of an open standard, aided by Host Card Emulation – but that's gotta be another post. Another time. See you all in Vegas. Make sure to join the Experian #MobilePayChat on Twitter this Tuesday at 12:15 p.m. PT during the Money2020 conference: http://ex.pn/Money2020. If you are attending the event please stop by our booth #218. This post originally appeared here.
By: John Robertson I began this blog series asking the question "How can banks offer such low rates?", exploring the relationship of pricing in such a low-rate environment. I outlined a simplistic view of loan pricing as:

   Interest Income
 + Non-Interest Income
 – Cost of Funds
 – Non-Interest Expense
 – Risk Expense
 = Income before Tax

Along those lines, I outlined how perplexing it is to think that, at some of these current levels, banks could possibly make any money. I suggested these offerings must be loss leaders with the anticipation of more business in the future or, possibly, additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short-funding the loans with the excess cash on their balance sheets. I did stumble across another possibility while proving out an old theory, which was very revealing. The old theory, stated by a professor many years ago, was "Margins will continue to narrow… forever." We've certainly seen that in the consumer world. In pursuit of proof of this theory I went to the trusty UBPR and looked at the net interest margin results from 2011 until today for two peer groups (insured commercial banks from $300 million to $1 billion, and insured commercial banks greater than $3 billion). What I found was that, in fact, margins have narrowed anywhere from 10 to 20 basis points for those two groups during that span, even though non-interest expense stayed relatively flat. Not wanting to stop there, I started looking at one of the biggest players individually and found an interesting difference in their C&I portfolio. Their non-interest expense number was comparable to the others, as was their cost of funds, but the swing component was non-interest income. One line item on the UBPR's income statement is Overhead (i.e. non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing their loans due to their fee income generation capabilities. 
They are not just looking at spread but at contribution as well, to ensure they meet their stated goals. So why do banks hesitate to ask for a fee if a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
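The pricing identity above is simple enough to sketch in a few lines of Python. The figures below are purely illustrative – invented numbers, not drawn from any bank's UBPR:

```python
def income_before_tax(interest_income, non_interest_income,
                      cost_of_funds, non_interest_expense, risk_expense):
    """Simplistic loan pricing view: income components minus expense components."""
    return (interest_income + non_interest_income
            - cost_of_funds - non_interest_expense - risk_expense)

# Two hypothetical banks pricing the same loan at the same thin spread.
# Bank A has weak fee income; Bank B's fee engine is the swing component.
bank_a = income_before_tax(40_000, 2_000, 12_000, 22_000, 6_000)
bank_b = income_before_tax(40_000, 12_000, 12_000, 22_000, 6_000)
print(bank_a, bank_b)  # 2000 12000
```

Identical spread, identical overhead and risk expense – yet the fee-income bank clears six times the pretax contribution, which is exactly the kind of strategic advantage described above.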
By: Mike Horrocks I am at the Risk Management Association's annual conference in DC, and I feel like I am back where my banking career began. One of the key topics here is how important the risk rating grade is, and what impact the right or wrong risk rating grade can have on the bank. It is amazing to me how a risk rating is often a shot in the dark at some institutions, or can vary with the training of one risk manager versus another. For example, you could have a commercial credit with fantastic debt service coverage tied to a terrible piece of collateral, and that risk rating grade will range anywhere from prime-type credit (cash flow is king and the loan will never default – so why concern ourselves with collateral?) to low subprime (do we really want that kind of collateral dragging us down, or in our OREO portfolio?), to anywhere in between. Banks need to define the attributes of a risk rating grade and consistently apply that grade. Failing to do so will lead to a poor risk rating grade distorting ALLL calculations (with either an over-allocation or not enough), which then rolls into loan pricing (making you too costly, or not priced enough to match the risk). The other thing I hear consistently is "we don't have the right solutions or resources to complete a project like this." Fortunately there is help. A bank should never feel like it has to do this alone. I recall how it was all hands on deck when I first set out to get the right loan grading and loan pricing in place at the first super-regional bank I worked at – and that was without all the compliance pressure of today. So take a pause and look at your loan grading approach – is it passing or failing your needs? If it is not passing, take some time to read up on the topic, perhaps find a tutor (or business partner you can trust) and form a study group of your best bankers. 
This is one grade that needs to be at the top of the class. Looking forward to more from RMA 2014!
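One way to make a grade "consistently applied" is to codify its attributes in a rubric rather than leave them to each risk manager's training. A purely illustrative sketch – the thresholds, inputs and 1-5 scale here are invented for the example, not any institution's actual grading policy:

```python
def risk_rating(dscr, ltv):
    """Map debt service coverage ratio (DSCR) and loan-to-value (LTV)
    to a 1-5 grade: 1 = prime-type credit, 5 = subprime.

    The point of an explicit rubric is that two analysts given the
    same numbers always land on the same grade.
    """
    if dscr >= 1.50 and ltv <= 0.65:
        return 1  # strong cash flow AND strong collateral
    if dscr >= 1.25 and ltv <= 0.80:
        return 2
    if dscr >= 1.10:
        return 3  # cash flow carries it despite weaker collateral
    if ltv <= 0.65:
        return 4  # collateral-dependent credit
    return 5      # weak on both attributes

# The "fantastic debt service coverage, terrible collateral" credit
# from the post lands in the middle – not prime, not subprime:
print(risk_rating(dscr=2.0, ltv=1.10))  # 3
```

With the rubric written down, the same credit can no longer grade anywhere from prime to subprime depending on who reviews it – and the ALLL and pricing that flow from the grade stay consistent too.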
The ubiquity of mobile devices provides financial services marketers with an effective way to distribute targeted, customized messages that appeal to a single shopper — a marketing segment of one.