41st Parameter, a part of Experian, surveyed 250 marketers to understand the relationship between omnichannel retailing, fraud prevention and the holiday shopping season. The findings show that few marketers understand the full impact of fraud-prevention systems on their activities: 60% of marketers were unsure of the cost of fraud to their organization. The survey also indicated that 40% of marketers said their organization had been targeted by hackers or cybercriminals. Download the Holiday Marketing Fraud Survey: http://snip.ly/JoyF

With holiday shopping in full stride, 35% of businesses said they planned to increase their digital spend for the 2014 holiday season. Furthermore, Experian Marketing Services reported that during 2014, 80% of marketers planned on running cross-channel marketing campaigns. As marketers integrate more channels into their campaigns, new challenges emerge for fraud-risk managers who face continuous pressure to adopt new approaches. Here are three steps to help marketers and risk managers maintain a frictionless experience for customers:

- Marketers should communicate their plans early to the fraud-risk team, especially if they are planning to target a new or unexpected audience. Making this part of the process will reduce the chances that risk management will stop or inhibit customers.
- Ensure that marketers understand what the risk-management department is doing with respect to fraud detection. Chances are risk managers are waiting to tell you.
- Marketers shouldn’t assume that fraud won’t affect their business; they should talk to their risk-management division to learn how much fraud truly costs their company. Then they can understand what they need to do to make sure that their marketing efforts are not thwarted.

“Marketers spend a great deal of time and money bringing in new customers and increasing sales, especially this time of year, and in too many cases, those efforts are negated in the name of fraud prevention,” said David Britton, vice president of industry solutions, 41st Parameter. “Marketers can help an organization’s bottom line by working with their fraud-risk department to prevent bad transactions from occurring while maintaining a seamless customer experience. Reducing fraud is important, and protecting the customer experience is a necessity.”

Few marketers understand the impact of transactions declined because of suspected fraud, and this is even more pronounced among small businesses, with 70% saying they were unsure of fraud’s impact. Fifty percent of mid-sized business marketers and 67% of large-enterprise marketers were unsure of the impact of fraud as well. An uncoordinated approach to new customer acquisition can result in lost revenue affecting the entire organization. For example, the industry average for card-not-present declines is 15%. However, one to three percent of those declined transactions turn out to be valid, equating to $1.2 billion in lost revenue annually (see the rough calculation at the end of this post). Wrongfully declined transactions can be costly as the growth of cross-channel marketing increases and a push towards omnichannel retailing pressures marketers to find new customers.

“Many businesses loosen their fraud detection measures during peak times because they don’t have the tools to review potentially risky orders manually during the higher-volume holiday shopping period,” said Britton. “Criminals look to capitalize on this and exploit these gaps in any way possible, taking an omnifraud approach to maximizing their chances of success. 
Striking the right balance between sales enablement and fraud prevention is the key to maximizing growth for any business at all times of the year.” Download Experian’s fraud prevention report to learn more about how businesses can address these new marketing challenges.
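To put those decline figures in perspective, here is a rough, back-of-the-envelope sketch in Python. The 15% decline rate and the one to three percent false-positive range come from the survey discussion above; the card-not-present volume used below is a purely hypothetical figure, chosen only so the midpoint of the range lines up with the $1.2 billion industry estimate cited in the post.

```python
# Rough illustration of how wrongful declines translate into lost revenue.
# The 15% decline rate and 1-3% false-positive share come from the post;
# the $400B annual CNP volume is a hypothetical figure for illustration only.

def false_decline_loss(cnp_volume, decline_rate=0.15, false_positive_share=0.02):
    """Estimate revenue lost to wrongly declined card-not-present orders."""
    declined = cnp_volume * decline_rate                  # orders rejected as suspected fraud
    wrongly_declined = declined * false_positive_share    # valid orders caught in the net
    return wrongly_declined

assumed_cnp_volume = 400e9  # hypothetical annual CNP sales, in dollars
for share in (0.01, 0.02, 0.03):  # the post's one-to-three-percent range
    loss = false_decline_loss(assumed_cnp_volume, false_positive_share=share)
    print(f"False-positive share {share:.0%}: ~${loss / 1e9:.2f}B lost annually")
```

Under these assumptions the range works out to roughly $0.6 billion to $1.8 billion a year, with the two percent midpoint reproducing the $1.2 billion figure quoted above.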
Opening a new consumer checking account in the 21st century should be simple and easy for the customer to understand, right? Unfortunately, not all banks have 21st century systems or processes, reflecting the fact that negotiable order of withdrawal (NOW) accounts, or checking accounts, were introduced decades ago and often required the consumer to open the account in person. A lot has changed, and consumers demand simpler, more transparent account opening processes with product choices that match their needs at a price they’re willing to pay. Financial institutions that leverage modernized technology capabilities and relevant decision information have the best chance to deliver consumer-friendly experiences that meet consumer expectations. It is obvious to consumers when we in the financial services industry get it right and when we don’t. The process to open a checking account should be easily understood by consumers and provide them with appropriate product choices that aren’t “one size fits all”. Banks with more advanced core-banking systems incorporating relevant and compliant decision data and transparent, consumer-friendly approval processes have a huge opportunity to differentiate themselves positively from competitors.

The reality is that banking deposit management organizations throughout the United States continue to evolve check screening strategies, technology and processes. This is done in an effort to keep up with evolving regulatory expectations from consumer advocacy regulatory bodies such as the Consumer Financial Protection Bureau (CFPB) and to improve the transparency of checking account screening for a greater number of consumers. The CFPB advocates that financial institutions adopt new checking account decision processes and procedures that maintain sound management practices related to mitigating fraud and risk expense while improving consumer transparency and increasing access to basic consumer financial instruments. Bank shareholders demand that these accounts be extended to consumers profitably. The CFPB recognizes that checking accounts are a basic financial product used by almost all consumers, but it has expressed concerns that checking account screening processes may prevent access for some consumers and may be too opaque with respect to the reasons why a consumer may be denied an account.

The gap between the expectations of the CFPB, shareholders and bank deposit management organizations’ current products and procedures is not as wide as it may seem. The solution to closing the gap includes deploying a more holistic approach to checking account screening processes utilizing 21st century technology and decision capabilities. Core banking technology and checking products developed decades ago leave banks struggling to enact much needed improvements for consumers. The CFPB recognizes that many financial institutions rely on reports used for checking account screening that are provided by specialty consumer reporting agencies (CRAs) to decision approval for new customers. CRAs specialize in checking account screening and provide financial institutions with consumer information that is helpful in determining whether a consumer should be approved. Information such as the consumer’s check-writing and account history (for example, closed accounts or bounced checks) is an important factor in determining eligibility for the new account. 
Financial institutions are also allowed to screen consumers to assess whether they may be a credit risk when deciding whether to open a consumer checking account, because many consumers opt in for overdraft functionality attached to the checking account. Richard Cordray, the CFPB Director, clarified the regulatory agency’s position on how consumers are treated in checking account screening processes in his prepared remarks at a forum on this topic in October 2014: “The Consumer Bureau has three areas of concern. First, we are concerned about the information accuracy of these reports. Second, we are concerned about people’s ability to access these reports and dispute any incorrect information they may find. Third, we are concerned about the ways in which these reports are being used.”

The CFPB suggests four items it believes will improve financial institutions’ checking account screening policies and practices:

- Increase the accuracy of data used from CRAs
- Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily
- Ensure consumers are aware and notified of information used to decision the account opening process
- Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs

Implementing these steps shouldn’t be too difficult for deposit management organizations as long as they fully leverage software such as Experian’s PowerCurve customized for deposit account origination and relevant decision information such as Experian’s Precise ID Platform and VantageScore® credit score, combined with consumer product offerings developed within the bank and offered in an environment that is real-time where possible and considers the consumer’s needs. Enhancing checking account screening procedures by taking into account a consumer’s life stage, affordability considerations, unique risk profile and financial needs will satisfy the expectations of consumers, regulators and financial institution shareholders. Financial institutions that use technology and data wisely can reduce expenses for their organizations by efficiently managing fraud, risk and operating costs within the checking account screening process while also delighting consumers. Regulatory agencies are often delighted when consumers are happy. Shareholders are delighted when regulators and consumers are happy. Reengineering checking account opening processes for the modern age results in a win-win-win for consumers, regulators and financial institutions. Discover how an Experian Global Consultant can help you with your banking deposit management needs.
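As a rough illustration of the more holistic screening decision described in this post, here is a minimal Python sketch that routes an applicant toward a suitable product rather than issuing a flat approve or decline. The inputs, thresholds and product names are hypothetical stand-ins for illustration; they are not actual PowerCurve, Precise ID or VantageScore® logic.

```python
# Hypothetical sketch of a checking-account screening decision that routes
# consumers toward a suitable product instead of a simple approve/decline.
# Field names, thresholds and product names are illustrative assumptions only.

from dataclasses import dataclass

@dataclass
class Applicant:
    identity_verified: bool      # e.g., outcome of an identity-verification check
    prior_account_abuse: bool    # e.g., unpaid closed accounts reported by a CRA
    risk_score: int              # generic risk score; higher means lower risk
    wants_overdraft: bool

def screen_checking_account(app: Applicant) -> str:
    if not app.identity_verified:
        return "refer for manual review"          # possible identity fraud
    if app.prior_account_abuse and app.risk_score < 520:
        return "offer second-chance account"      # basic, no-overdraft product
    if app.wants_overdraft and app.risk_score < 600:
        return "approve without overdraft"        # credit-style screen applies to overdraft only
    return "approve standard checking"

print(screen_checking_account(Applicant(True, True, 480, True)))
```

The point of the sketch is the routing: the consumer who would otherwise be declined outright is steered to a product that fits their risk profile, which is closer to the transparency the CFPB is asking for.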
By: John Robertson

Capital is the life-blood of financial institutions and has come under greater scrutiny since the global credit crisis. How one manages capital is primarily driven by how well one manages risk. The use of economic capital in measuring profitability enhances risk management efforts by providing a common indicator for risk. It provides pricing metrics such as RAROC (risk-adjusted return on capital) and economic value added, which include expected and unexpected losses, consequently broadening the evaluation of the adequacy of capital in relation to the bank's overall risk profile. The first accounts of economic capital date back to the ancient Phoenicians, who took rudimentary tallies of frequency and severity of illnesses among rural farmers to gain an intuition of expected losses in productivity. These calculations were advanced by correlations with predictions of climate change, political outbreak and birth rate change.

The primary value of economic capital is its application to decision-making and overall risk management. Economic capital is a measure of risk, not of capital held. It represents the amount of money needed to secure survival in a worst-case scenario; it is a buffer against unexpected shocks in market values. Economic capital measures risk using economic realities rather than accounting and regulatory rules, which can be misleading. The concept of economic capital differs from regulatory capital in the sense that regulatory capital is the mandatory capital the regulators require to be maintained, while economic capital is the best estimate of required capital that financial institutions use internally to manage their own risk and to allocate the cost of maintaining regulatory capital among different units within the organization.

The allocation of economic capital to support credit risk begins with similar inputs to derive expected losses but considers other factors to determine unexpected losses, such as credit concentrations and default correlations among borrowers. Economic capital credit risk modeling measures the incremental risk that a transaction adds to a portfolio rather than the absolute level of risk associated with an individual transaction. In a previous blog I restated a phrase I had heard long ago: “Margins will narrow forever.” How well you manage your capital will help you extend “forever.” Has your institution started using these types of risk measures? The Phoenicians did. Learn more about our credit risk solutions.
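To make the RAROC metric mentioned above concrete, here is a minimal sketch in Python of a commonly cited form of the calculation: risk-adjusted return (revenue less expenses and expected losses, plus a benefit on the capital held) divided by allocated economic capital. The loan figures and the capital benefit rate below are hypothetical, and institutions define the components differently in practice.

```python
# Illustrative RAROC (risk-adjusted return on capital) calculation.
# Expected loss covers anticipated defaults; economic capital is sized
# to absorb unexpected losses. All figures are hypothetical.

def raroc(net_interest_income, fee_income, operating_expense,
          expected_loss, economic_capital, capital_benefit_rate=0.02):
    risk_adjusted_return = (net_interest_income + fee_income
                            - operating_expense - expected_loss
                            + economic_capital * capital_benefit_rate)
    return risk_adjusted_return / economic_capital

# Example: a hypothetical $1,000,000 commercial loan
result = raroc(net_interest_income=30_000, fee_income=5_000,
               operating_expense=12_000, expected_loss=8_000,
               economic_capital=60_000)
print(f"RAROC: {result:.1%}")   # -> RAROC: 27.0%
```

A transaction is typically considered value-adding when its RAROC exceeds the institution's hurdle rate, which is how the measure broadens the pricing conversation beyond spread alone.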
Originally contributed by: Bill Britto

Smart meters have made possible new services for customers, such as automated budget assistance and bill management tools, energy use notifications, and "smart pricing" and demand response programs. It is estimated that more than 50 million smart meters have been deployed as of July 2014. Utilities and customers alike are benefiting from these smart meter deployments. It is now obvious the world of utilities is changing, and companies are beginning to cater more to their customers by offering them tools to keep their energy costs lower. For example, several companies offer prepay to some of their customers who do not have bank accounts. For many of those "unbanked" customers, prepay could be the only way to sign up for a utility service. Understanding the value of prospects and the need to automate decisions to achieve higher revenue and curb losses is imperative for the utility. It is here where a decisioning solution, like PowerCurve OnDemand, can make a real difference for utility customers by providing modified decision strategies based on market dynamics, business and economic environments. Imagine what a best-in-class decision solution can do by identifying what matters most about consumers and businesses and by leveraging internal and external data assets to replace complexity with cost efficiency. Solutions like PowerCurve OnDemand deliver the power and speed-to-market to respond to changing customer demands, driving profitability and growing customer lifetime value - good for business and good for customers.
A new comarketing agreement for MainStreet Technologies’ (MST) Loan Loss Analyzer product with Experian Decision Analytics’ Baker Hill Advisor® product will provide the banking industry with a comprehensive, automated loan-management offering. The combined products provide banks greater confidence for loan management and loan-pricing calculations. Experian Decision Analytics Baker Hill Advisor product supports banks’ commercial and small-business loan operations comprehensively, from procuring new loans through collections. MST’s Loan Loss Analyzer streamlines the estimation and documentation of the Allowance for Loan and Lease Losses (ALLL), the bank’s most critical quarterly calculation. The MST product automates the most acute processes required of community bankers in managing their commercial and small-business loan portfolios. Both systems are data-driven, configurable and designed to accommodate existing bank processes. The products already effectively work together for community banks of varying asset sizes, adding efficiencies and accuracy while addressing today’s increasingly complex regulatory requirements. “Experian’s Baker Hill Advisor product-development priorities have always been driven by our user community. Changes in regulatory and accounting requirements have our clients looking for a sophisticated ALLL system. Working with MainStreet, we can refer our clients to an industry-leading ALLL platform,” said John Watts, Experian Decision Analytics director of product management. “The sharing of data between our organizations creates an environment where strategic ALLL calculations are more robust and tactical lending decisions can be made with more confidence. It provides clients a complete service at every point within the organization.” “Bankers, including many using our Loan Loss Analyzer, have used Experian’s Baker Hill® software to manage their commercial loan programs for more than three decades,” said Dalton T. Sirmans, CEO and MST president. “Bankers who choose to implement Experian’s Baker Hill Advisor and the MST Loan Loss Analyzer will be automating their loan management, tracking, reporting and documentation in the most comprehensive, user-friendly and feature-rich manner available.” For more information on MainStreet Technologies, please visit http://www.mainstreet-tech.com/banking For more information on Baker Hill, visit http://ex.pn/BakerHill
This is the first post in a three-part series. You’ve probably heard the adage “There is a little poison in every medication,” which typically is attributed to Paracelsus (1493–1541), the father of toxicology. The trick, of course, is to prescribe the correct balance of agents to improve the patient while doing the least harm. One might think of data governance in a similar manner. A well-disciplined and well-executed data governance regimen provides significant improvements to the organization. Conversely, an overly restrictive, poorly designed or ineffectively monitored data governance ecosystem can result in significant harm: less than optimal models and scorecards, inaccurate reporting, imprecise portfolio outcome forecasts and poor regulatory reports, subsequently resulting in significant investment and loss of reputation. In this blog series, we will address the issues and best practices associated with the broad mandate of data governance.

In its simplest definition, data governance is the management of the availability, usability, integrity and security of the data employed in an enterprise. A sound data governance program includes a governing body or council, a defined set of procedures and a plan to execute those procedures. Well, upon quick reflection, effective data governance is not simple at all. After all, data is ubiquitous, is becoming more available, encompasses aspects of our digital lives not envisioned as little as 15 years ago and is constantly changing as people’s behavior changes. To add another level of complexity, regulatory oversight is becoming more pervasive as regulations passed since the Great Recession have become more intrusive, granular and demanding. When addressing issues of data governance, lenders, service providers and insurers find themselves trying to incorporate a wide range of issues. Some of these are time-tested best practices, while others previously were never considered. Here is a reasonable checklist of data governance concerns to consider:

- Who owns the data governance responsibility within the organization?
- Is the data governance group seen as an impediment to change, or is it a ready part of the change management culture?
- Is the backup and retrieval discipline — redundancy and recovery — well-planned and periodically tested?
- How agile/flexible is the governance structure to new data sources?
- How does the governance structure document and reconcile similar data across multiple providers?
- Are there appropriate and documented approvals and consents from the data provider(s) for all disclosures?
- Are systemic access and modification controls and reporting fully deployed and monitored for periodic refinement?
- Does the monitoring of data integrity, persistence and entitled access enable a quick-fix culture where issues are identified and resolved at the source of the problem and not settled by downstream processes?
- Are all data sources, including those that are proprietary, fully documented and subject to systemic accuracy/integrity reporting?
- Once obtained, how is the data stored and protected in both definition and accessibility?
- How do we alter data and leverage the modified outcome?
- Are there reasonable audits and tracking of downstream reporting?
- In the event of a data breach, does the organization have well-documented protocols and notification thresholds in place?
- How recently and to what extent have all data retrieval, manipulation, usage and protection policies and processes been audited? 
- Are there scheduled and periodic reports made to the institution board on issues of data governance?

Certainly, many institutions have most of these aspects covered. However, “most” is imprecise medicine, and ill effects are certain to follow. As Paracelsus stated, “The doctor can have a stronger impact on the patient than any drug.” As in medical services, for data governance initiatives those impacts can be beneficial or harmful. In our next blog, we’ll discuss observations of client data governance gaps and lessons learned in evaluating the existing data governance ecosystem. Make sure to read the Compliance as a Differentiator perspective paper for deeper insight on regulations affecting financial institutions and how you can prepare your business. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.
By: Ori Eisen

This article originally appeared on WIRED. When I started 41st Parameter more than a decade ago, I had a sense of what fraud was all about. I’d spent several years dealing with fraud while at VeriSign and American Express. As I considered the problem, I realized that fraud was something that could never be fully prevented. It’s a dispiriting thing to accept that committed criminals will always find some way to get through even the toughest defenses. Dispiriting, but not defeating. The reason I chose to dedicate my life to stopping online fraud is because I saw where the money was going. Once you follow the money and you see how it is used, you can’t “un-know.” The money ends up supporting criminal activities around the globe – not used to buy grandma a gift.

Over the past 10 years the nature of fraud has become more sophisticated and systematized. Gone are the days of the lone wolf hacker seeing what they could get away with. Today, those days seem almost simple. Not that I should be saying it, but fraud and the people who perpetrated it had a cavalier air about them, a bravado. It was as if they were saying, in the words of my good friend Frank Abagnale, “catch me if you can.” They learned to mimic the behaviors and clone the devices of legitimate users. This allowed them to have a field day, attacking all sorts of businesses and syphoning away their ill-gotten gains.

We learned too. We learned to look hard and close at the devices that attempted to access an account. We looked at things that no one knew could be seen. We learned to recognize all of the little parameters that together represented a device. We learned to notice when even one of them was off. The days of those early fraudsters have faded. New forces are at work to perpetrate fraud on an industrial scale. Criminal enterprises have arisen. Specializations have emerged. Brute force attacks, social engineering, sophisticated malware – all these tools, and so many more – are being applied every day to cracking various security systems. The criminal underworld is awash in credentials, which are being used to create accounts, take over accounts and commit fraudulent transactions.

The impact is massive. Every year, billions of dollars are lost due to cybercrime. Aside from the direct monetary losses, customers lose faith in brands and businesses, resources need to be allocated to reviewing suspect transactions, and creativity and energy are squandered trying to chase down new risks and threats. To make life just a little simpler, I operate from the assumption that every account, every user name and every password has been compromised. As I said at the start, fraud isn’t something that can be prevented. By hook or by crook (and mainly by crook), fraudsters are finding cracks they can slip through; it’s bound to happen. By watching carefully, we can see when they slip up and stop them from getting away with their intended crimes.

If the earliest days of fraud saw impacts on individuals, and fraud today is impacting enterprises, the future of fraud is far more sinister. We’re already seeing hints of fraud’s dark future. Stories are swirling around the recent Wall Street hack. The President and his security team were watching warily, wondering if this was the result of a state-sponsored activity. Rather than just hurting businesses or their customers, we’re on the brink (if we haven’t crossed it already) of fraud being used to destabilize economies. If that doesn’t keep you up at night I don’t know what will. 
Think about it: in less than a decade we have gone from fraud being an isolated irritant (not that it wasn’t a problem) to being viewed as a potential, if clandestine, weapon. The stakes are no longer the funds in an account or even the well being of a business. Today – and certainly tomorrow – the stakes will be higher. Fraudsters, terrorists really, will look for ways to nudge economies toward the abyss. Sadly, the ability of fraudsters to infiltrate legitimate accounts and networks will never be fully stifled. The options available to them are just too broad for every hole to be plugged. What we can do is recognize when they’ve made it through our defenses and prevent them from taking action. It’s the same approach we’ve always had: they may get in while we do everything possible to prevent them from doing harm. In an ideal world bad guys would never get through in the first place; but we don’t live in an ideal world. In the real world they’re going to get in. Knowing this isn’t easy. It isn’t comforting or comfortable. But in the real world there are real actions we can take to protect the things that matter – your money, your data and your sense of security. We learned how to fight fraud in the past, we are fighting it with new technologies today and we will continue to apply insights and new approaches to protect our future. Download our Perspective Paper to learn about a number of factors that are contributing to the evolving fraud landscape.
Through all the rather “invented conflict” of MCX vs Apple Pay by the tech media these last few weeks – very little diligence was done on why merchants have come to reject NFC (near field communication) as the standard of choice. Maybe I can provide some color here – both as to why merchants have traditionally viewed this channel with suspicion leading up to CurrentC choosing QR, and why I believe it’s time for merchants to give up hating on a radio.

Why do merchants hate NFC? Traditionally, any contactless usage in stores stems from international travelers, fragmented mobile NFC rollouts and a cornucopia of failed products using a variety of form factors – all of which were effectively a contactless chip card with some plastic around it. What merchant support existed tended to be in the QSR space – the biggest being McDonald's – and those merchants saw little to no volume to justify the upgrade costs. Magstripe, on the other hand, was a form factor that was more accessible. It was cheap to manufacture, provisioning was a snap, and distribution depended primarily on USPS. Retailers used the form factor themselves for gift cards, prepaid and private label. In contrast, complexity varies in contactless for all three – production, provisioning and distribution. If it’s a contactless card – all three can still follow pretty much the norm – as they require no customization or changes post-production. Mobile NFC was an entirely different beast. The litany of stakeholders in the value chain runs from hardware – OEM and chipset support, the NFC controller, the Secure Element – to the OS support for the NFC stack, to services such as Trusted Service Managers of each flavor (SE vs SP), to the carriers (in the case of OTA provisioning) – and the list goes on. The NFC ecosystem truly deters new entrants by its complexity and costs.

Next – there was much ambiguity as to what NFC/contactless could come to represent at the point of sale. Merchants saw an open standard that could ferry over any type of credential – both credit and debit. Even though merchants prefer debit, the true price of a debit transaction varies depending on which set of rails carries the transaction – PIN debit vs signature debit. And the lack of any PIN debit networks around the contactless paradigm made merchants’ fears real – that all debit transactions through NFC would be carried over the more costly signature debit route (favoring V/MA), and that a shift from magstripe to contactless would mean the end of another cost advantage merchants had to steer transactions towards cheaper rails. The 13 or so PIN debit networks are missing from Apple Pay – and it’s an absence that weighed heavily in merchants’ decision to be suspicious of it.

Maybe even more important for the merchant – since it has little to do with payment – loyalty was a component that was inadequately addressed via NFC. NFC was effective as a secure communications channel – but was wholly inadequate when it came to transferring loyalty credentials, coupons and other things that justify why merchants would invest in a new technology in the first place. The contactless standards to move non-payment information centered around ISO 18092 – which had fragmented acceptance in the retail space and still struggled with a rather constricted pipe. NFC was simply useful as a payments standard, and when it came to loyalty – the “invented a decade ago” standard is wholly inadequate to do anything meaningful at the point of sale. 
If the merchant must wrestle with new ways to do loyalty – then should they go back in time to enable payments, or should they jerry-rig payments to be wrapped into loyalty? What looks better to a merchant? Sending a loyalty token along with the payment credential (via ISO 18092), OR encapsulating a payment token (as a QR code) inside the Starbucks loyalty app? I would guess – the latter (a rough sketch of what such a QR payload could look like appears at the end of this post). Even more so because in the scenario of accepting a loyalty token alongside an NFC payment – you are trusting the payment enabler (Apple, Google, networks, banks) with your loyalty token. Why would you? The reverse makes sense for a merchant.

Finally – traditional NFC payments (before Host Card Emulation in Android), apart from being needlessly complex, mandated that all communication between the NFC-capable device and the point-of-sale terminal be limited to the Secure Element that hosts the credential and the payment applets. Which means if you did not pay your way into the Secure Element (mostly possible only if you are an issuer), then you have no play. What’s a merchant to do? So if you are a merchant – you are starting off with a disadvantage – as those terminologies and relationships are alien to you. Merchants did not own the credential – unless it was prepaid or private label – and even then, the economics wouldn’t make sense to put those in a Secure Element. Further, merchants had no control over the issuer’s choice of credential in the Secure Element – which tended to be mostly credit.

It was then no surprise that merchants largely avoided this channel – and then gradually started to look at it with suspicion around the same time banks and networks began to pre-ordain NFC as the next stage in payment acceptance evolution. Retailers, who by then had been legally embroiled in a number of skirmishes on the interchange front, saw this move as the next land grab. If merchants could not cost-effectively compete in this new channel – then credit was most likely to become the most prevalent payment option within it. This suspicion was further reinforced with the launch of Google Wallet, ISIS and now Apple Pay. Each of these wrapped existing rails, maintained the status quo and allowed issuers and networks to bridge the gap from plastic to a new modality (smartphones) while changing little else. This is no mere paranoia. Merchants fear that issuers and networks will ultimately use the security and convenience proffered through this channel as an excuse to raise rates again. Or squeeze out the cheaper alternatives – as they did with defaulting to signature debit over PIN debit for contactless. As consumers learn a new behavior (tap and pay), merchants fear that magstripe will be eclipsed and a high-cost alternative will take root.

How is it fair that to access their customers’ funds – our money – one has to go through toll gates that are incentivized to charge higher prices? The fact that there are little to no alternatives between using cash or using a bank-issued instrument to pay for things should worry us as consumers. As long as merchants are complacent about the costs in place for them to access our money – there won’t be much of an incentive for banks to find quicker and cheaper ways to move money – in and out of the system as a whole. I digress. So the costs and complexities that I pointed to before, that existed in the NFC payments ecosystem, served not only to keep retailers out, but also impacted issuers’ ability to scale NFC payments. 
These costs materialized into higher-interchange cards for the issuer when these initiatives took flight – partly because the issuer was losing money already, and had little interest then in enabling debit as a payments choice. Google Wallet itself had to resort to a bit of “negative margin strategy” to allow debit cards to be used within it. ISIS had little to no clout, nor any interest in pushing issuers to pick debit. All of which must have been quite vexing for an observant merchant. Furthermore, just as digital and mobile offer newer ways to interact with consumers – they also portend a new reality – that new ecosystems are taking shape across that landscape. And these ecosystems are hardly open – Facebook, Twitter, Google, Apple – and they have their own toll gates as well.

Finally – a retail payments friend told me recently that merchants view the plethora of software, systems and services that encapsulate cross-channel commerce as a form of “Retailer OS”. And if payment acceptance devices are end-points into that closed ecosystem of systems and software – they are rightfully hesitant in handing over those keys to the networks and banks. The last thing they want to do is let someone else control those toll gates. And it makes sense, and ironically – it has a parallel in the iOS ecosystem. Apple’s MFi program is an example of an ecosystem owner choosing to secure those end-points – especially when they are manufactured by a third party. This is why Apple exacts a toll and mandates that third-party iOS accessory manufacturers must include an Apple IC to securely connect and communicate with an iOS device. If Apple can mandate that, then why is it that a retailer should have no say over the end-points through which payments occur in its own retail ecosystem? It’s too late to write about how the retailer view of NFC must evolve – in the face of an open standard, aided by Host Card Emulation – but that’s gotta be another post. Another time. See you all in Vegas. Make sure to join the Experian #MobilePayChat on Twitter this Tuesday at 12:15 p.m. PT during the Money2020 conference: http://ex.pn/Money2020. If you are attending the event please stop by our booth #218. This post originally appeared here.
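As a footnote to the loyalty discussion above, here is a minimal sketch of the merchant-centric option referenced earlier: encapsulating a payment token inside the merchant’s own loyalty payload and rendering it as a QR code. The field names, token format and encoding below are purely illustrative assumptions; no actual merchant app, wallet or network format is implied.

```python
# Hypothetical example of wrapping a payment token inside a merchant loyalty
# payload (the QR approach), rather than sending a loyalty token alongside an
# NFC payment. Field names and values are illustrative only.

import base64
import json

def build_checkout_payload(loyalty_id, payment_token, offer_ids):
    payload = {
        "loyalty_id": loyalty_id,        # merchant-owned credential stays with the merchant
        "payment_token": payment_token,  # network/single-use token, not the real card number
        "offers": offer_ids,             # coupons redeemed in the same scan
    }
    # Encode the payload so the loyalty app can render it as a QR code
    return base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()

qr_content = build_checkout_payload("LOY-4821-77", "tok_9f3a61c2", ["COFFEE10"])
print(qr_content)  # the string the app would display as a QR code for the register to scan
```

The design point is simply who holds what: the merchant keeps ownership of the loyalty credential and the customer relationship, while the payment token rides along as one field among many.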
By: John Robertson

I began this blog series asking the question “How can banks offer such low rates?” and exploring the relationship of pricing in the current rate environment. I outlined a simplistic view of loan pricing as:

  Interest Income
+ Non-Interest Income
- Cost of Funds
- Non-Interest Expense
- Risk Expense
= Income before Tax

Along those lines, I outlined how perplexing it is to think that at some of these current levels, banks could possibly make any money. I suggested these offerings must be loss leaders with the anticipation of more business in the future or possibly additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short funding the loans with the excess cash on their balance sheets. I did stumble across another possibility while proving out an old theory, which was very revealing. The old theory, stated by a professor many years ago, was “Margins will continue to narrow…. Forever.” We’ve certainly seen that in the consumer world. In pursuit of proof of this theory I went to the trusty UBPR and looked at the net interest margin results from 2011 until today for two peer groups (insured commercial banks from $300 million to $1 billion and insured commercial banks greater than $3 billion). What I found was that, in fact, margins have narrowed anywhere from 10 to 20 basis points for those two groups during that span even though non-interest expense stayed relatively flat.

Not wanting to stop there, I started looking at one of the biggest players individually and found an interesting difference in their C&I portfolio. Their non-interest expense number was comparable to the others, as was their cost of funds, but the swing component was non-interest income. One line item on the UBPR’s income statement is Overhead (i.e., non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing their loans due to their fee income generation capabilities. They are not just looking at spread but at contribution as well to ensure they meet their stated goals. So why do banks hesitate to ask for a fee if a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
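To tie the components above together, here is a minimal sketch in Python of the pre-tax income calculation for a single loan, following the simple pricing view in this post and including the non-interest (fee) income line it highlights. All balances, rates and expense figures are hypothetical.

```python
# Illustrative pre-tax income for a single loan, following the simple
# pricing view in the post. All inputs are hypothetical.

def pretax_income(balance, rate, fee_income, cost_of_funds_rate,
                  noninterest_expense, expected_loss_rate):
    interest_income = balance * rate
    cost_of_funds = balance * cost_of_funds_rate
    risk_expense = balance * expected_loss_rate
    return (interest_income + fee_income
            - cost_of_funds - noninterest_expense - risk_expense)

# A hypothetical $2,000,000 C&I loan priced at 3.25% with $7,500 of annual fee income
income = pretax_income(balance=2_000_000, rate=0.0325, fee_income=7_500,
                       cost_of_funds_rate=0.0075, noninterest_expense=18_000,
                       expected_loss_rate=0.0040)
print(f"Pre-tax income: ${income:,.0f}")   # -> Pre-tax income: $31,500
```

Re-running the same numbers with the fee income set to zero shows how thin the remaining spread becomes, which is exactly the contribution point the post makes.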
By: Mike Horrocks

I am at the Risk Management Association’s annual conference in DC, and I feel like I am back where my banking career began. One of the key topics here is how important the risk rating grade is and what impact a right or wrong risk rating grade can have on the bank. It is amazing to me how a risk rating is often a shot in the dark at some institutions, or can vary with the training of one risk manager versus another. For example, you could have a commercial credit with fantastic debt service coverage tied to a terrible piece of collateral, and that risk rating grade will range anywhere from a prime-type credit (cash flow is king and the loan will never default – so why concern ourselves with collateral?) to low subprime (do we really want that kind of collateral dragging us down or in our OREO portfolio?), to anywhere in between. Banks need to define the attributes of a risk rating grade and consistently apply that grade. Failing to do so will lead to that poor risk rating grade distorting ALLL calculations (with either an over-allocation or not enough) and then rolling into the loan pricing (making you too costly, or not priced high enough to match the risk).

The other thing I hear consistently is that we don’t have the right solutions or resources to complete a project like this. Fortunately there is help. A bank should never feel like it has to do this alone. I recall how it was all hands on deck when I first started out to make sure we were getting the right loan grading and loan pricing in place at the first super-regional bank I worked at – and that was without all the compliance pressure of today. So take a pause and look at your loan grading approach – is it passing or failing your needs? If it is not passing, take some time to read up on the topic, perhaps find a tutor (or business partner you can trust) and form a study group of your best bankers. This is one grade that needs to be at the top of the class. Looking forward to more from RMA 2014!
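As a simple illustration of defining risk rating attributes and applying them consistently, here is a hypothetical Python sketch that combines debt service coverage and collateral quality into a single grade. The scale and cutoffs are invented purely for illustration and are not a recommended grading policy.

```python
# Hypothetical dual-factor risk rating: debt service coverage drives the base
# grade, and collateral quality notches it up or down. Scale of 1 (best) to
# 8 (worst); every cutoff below is an illustrative assumption.

def risk_rating(dscr, collateral_quality):
    # Base grade from the debt service coverage ratio
    if dscr >= 1.50:
        grade = 2
    elif dscr >= 1.25:
        grade = 3
    elif dscr >= 1.10:
        grade = 4
    else:
        grade = 6

    # Defined notching for collateral, so two analysts land on the same answer
    adjustment = {"strong": -1, "acceptable": 0, "weak": 1}[collateral_quality]
    return min(max(grade + adjustment, 1), 8)

print(risk_rating(1.60, "weak"))    # strong cash flow, poor collateral -> 3
print(risk_rating(1.05, "strong"))  # thin coverage, strong collateral  -> 5
```

The value of writing the attributes down this way is consistency: the same credit no longer swings from prime to subprime depending on who happens to grade it, and the ALLL and pricing that flow from the grade stay defensible.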
The ubiquity of mobile devices provides financial services marketers with an effective way to distribute targeted, customized messages that appeal to a single shopper — a marketing segment of one.
By: Joel Pruis

I have just completed the first of two presentations on Model Risk Governance at the RMA Annual Conference. The focus of the presentation was compliance with the Model Risk Governance guidance at smaller asset-sized financial institutions. The big theme across all of the attendees at the first session was the need for resources to execute on Model Risk Governance. Such resources are scarce at the smaller asset-sized institutions, forcing the need for, and use of, external vendors to assist in the development and ongoing validation of any models in use. With that said, the one area that cannot be outsourced is the model risk governance responsibility of the financial institution itself. While resources are few, we have to look for existing roles within the organization to support model risk governance, such as:

- Internal Audit - reviewing process, inputs, consistency
- Loan Review - accuracy, consistency, thresholds, etc.
- Compliance - data usage, pricing consistency, etc.

Start gathering your governance team at your organization and begin the effort around model risk governance! Discover how an Experian business consultant can help with your Model Risk Governance strategies and processes. Also, if you are interested in gaining deeper insight on regulations affecting financial institutions and how to prepare your business, download Experian’s Compliance as a Differentiator perspective paper.
Experian hosted the Future of Fraud event this week in New York City, where Ori Eisen and Frank Abagnale hosted clients and prospects and highlighted the need for innovative fraud solutions to stay ahead of the constant threat of online fraud. Afterward, Ori and Frank appeared on Bloomberg TV, interviewed by Trish Regan, to discuss how retailers can handle fraud prevention. Ori and Frank emphasized that data is good, especially when combined with analytics, and that both are a requirement for businesses working to prevent fraud now and in the future. "Data is good. The only way that you deal with a lot of this cyber(crime) is through data analytics. You have to know who I am dealing with. I have to know it is you and authenticate that it is you that wants to make this transaction." - Frank Abagnale on Bloomberg TV

Charles Chung recently detailed how utilizing data for good can protect the customer experience while providing businesses a panoramic view to ensure data security and compliance and to mitigate fraud risk. Ultimately, this view helps businesses build greater consumer confidence and create a more positive customer experience, which is the first, and most important, prong in the fraud balance. Learn more on how Experian is using big data.
More than 10 years ago I spoke about a trend at the time towards an underutilization of the information being managed by companies. I referred to this trend as “data skepticism.” Companies weren’t investing the time and resources needed to harvest the most valuable asset they had – data. Today the volume and variety of data is only increasing, as is the necessity to successfully analyze any relevant information to unlock its significant value. Big data can mean big opportunities for businesses and consumers. Businesses get a deeper understanding of their customers’ attitudes and preferences to make every interaction with them more relevant, secure and profitable. Consumers receive greater value through more personalized services from retailers, banks and other businesses. Recently Experian North American CEO Craig Boundy wrote about that value, stating, “Data is Good… Analytics Make it Great.”

The good we do with big data today in handling threats posed by fraudsters is the result of a risk-based approach that prevents fraud by combining data and analytics. Within Experian Decision Analytics our data decisioning capabilities unlock that value to ultimately provide better products and services for consumers. The same expertise, accurate and broad-reaching data assets, targeted analytics, knowledge-based authentication, and predictive decisioning policies used by our clients for risk-based decisioning have been used by Experian to become a global leader in fraud and identity solutions. The industrialization of fraud continues to grow, with an estimated 10,000 fraud rings in the U.S. alone and more than 2 billion unique records exposed as a result of data breaches in 2014. Experian continues to bring together new fraud platforms to help the industry better manage fraud risk. Our 41st Parameter technology has been able to detect over 90% of all fraud attacks against our clients and reduce their operational costs to fight fraud. Combining data and analytics assets can detect fraud, but more importantly, it can also detect the good customers so legitimate transactions are not blocked.

Gartner reported that by 2020, 40% of enterprises will be storing information from security events to analyze and uncover unusual patterns. Big data uncovers remarkable insights we can act on to shape the future of our fraud prevention efforts, and it can also mitigate the financial losses associated with a breach. In the end we need more data, not less, to keep up with fraudsters. Experian is hosting Future of Fraud and Identity events in New York and San Francisco discussing current fraud trends and how to prevent cyber-attacks, aimed at helping the industry. The past skepticism no longer holds true, as companies are realizing that data combined with advanced analytics can give them the insight they need to prevent fraud in the future. Learn more on how Experian is conquering the world of big data.
If rumors hold true, Apple Pay will launch in a week. Five of my last six posts have covered Apple’s likely and actual strategy in payments & commerce, and the rich tapestry of control, convenience, user experience, security and applied cryptography that constitutes the backdrop. What follows is a summation of my views – with a couple of observations from having seen the Apple Pay payment experience up close. About three years ago – I published a similar commentary on Google Wallet that, for kicks, you can find here. I hope what follows is a balanced perspective, as I try to cut through some FUD, provide some commentary on the payment experience, and offer up some predictions that are worth the price you pay to read my blog.

First the criticism. Apple Pay doesn’t go far enough: Fair. But you seem to misunderstand Apple’s intentions here. Apple did not set out to make a mobile wallet. Apple Pay sits within Passbook – which in itself is a wrapper of rewards and loyalty cards issued by third parties. Similarly – Apple Pay is a wrapper of payment cards issued by third parties. Even the branding disappears once you provision your cards – when you are at the point of sale and your iPhone 6 is in proximity to the reader (or enters the magnetic field created by the reader) – the screen turns on and your default payment card is displayed. One does not need to launch an app or fiddle around with Apple Pay. And for that matter, it’s even more limited than you think. Apple’s choice to leave the Passbook-driven Apple Pay experience as threadbare as possible seems intentional – forcing consumers to interact more with their bank apps than with Passbook for any rich interaction. In fact, the transaction detail displayed on the back of the payment card you use is limited – but you can launch the bank app to view and do a lot more. Similarly – the bank app can prompt a transaction alert that the consumer can select to view more detail as well. Counter to what has been publicized – Apple can – if it chooses to – view transaction detail including consumer info, but it only retains anonymized info on its servers. The contrast is apparent with Google – where (during early Google Wallet days) issuers dangled the same anonymized transaction info to appease Google – in return for participation in the wallet.

If your tap doesn’t work – will you blame Apple? Some claim that any transaction failures – such as a non-working reader – will cause consumers to blame Apple. This does not hold water simply because Apple does not get in between the consumer, his chosen card and the merchant during payment. It provides the framework to trigger and communicate a payment credential – and then quietly gets out of the way. This is where Google stumbled – by wanting to become the perennial fly on the wall. And so if for whatever reason the transaction fails, the consumer sees no Apple branding at which to direct their blame. (I draw a contrast later on below with Samsung and LoopPay.)

Apple Pay is not secure: Laughable and pure FUD. This article references a UBS note arguing that Apple Pay is insecure compared to a pure cloud-based solution such as the yet-to-be-launched MCX. 
This is due to a total misunderstanding of not just Apple Pay – but the hardware/software platform it sits within (and I am not just talking about the benefits of a TouchID, network tokenization, issuer cryptogram, Secure Element based approach), including the full weight of security measures that have been baked into iOS and the underlying hardware, which come together to offer the best container for payments. And against all that backdrop of applied cryptography, Apple still sought to overlay its payments approach over an existing framework. So that, when it comes to risk – it leans away from the consumer and towards a bank that understands how to manage risk. That’s the biggest disparity between these two approaches – Apple Pay and MCX – that Apple built a secure wrapper around an existing payments hierarchy while the latter seeks to disrupt that status quo.

Let the games begin: Consumers should get ready for an ad blitz from each of the launch partners of Apple Pay over the next few weeks. I expect we will also see these efforts concentrated around pockets of activation – because setting up Apple Pay is the next step after entering your Apple ID during activation. And for that reason – each of those launch partners understands the importance of reminding consumers why their card should be top of mind. There is also a subtle but important difference between the top-of-wallet card (or default card) for payment in Apple Pay and its predecessors (Google Wallet for example). Changing your default card was an easy task – and wholly encapsulated – within the Google Wallet app. Whereas in Apple Pay – changing your default card is buried under Settings, and I suspect that once you choose your default card, you are unlikely to bother with it again. And here’s how quick the payment interaction is within Apple Pay (it takes under 3 seconds): Bring your phone into proximity of the reader. The screen turns on. Passbook is triggered and your default card is displayed. You place your finger and authenticate using TouchID. A beep notes the transaction is completed. You can flip the card to view limited transaction detail. Yes – you could swipe down and choose another card to pay. But that is unlikely. I remember how LevelUp used very much the same strategy to sign up banks – stating that over 90% of its customers never change their default card inside LevelUp. This will be a blatant land grab over the next few months – as tens of millions of new iPhones are activated. According to what Apple has told its launch partners – they do expect over 95% of activations to add at least one card.

What does this mean for banks who won’t be ready in 2014 or haven’t yet signed up? As I said before – there will be a long tail of reduced utility – as we get into community banks and credit unions. The risk is amplified because Apple Pay is the only way to enable payments in iOS that uses Apple’s secure infrastructure – and NFC. For those still debating whether it was a shotgun wedding, Apple’s approach had five main highlights that appealed to a bank:

- Utilizing an approach that was bank-friendly (and friendly to the status quo): NFC
- Securing the transaction beyond the prerequisites of EMV contactless – via network tokenization & TouchID
- Apple’s preference to stay entirely an enabler – facilitating a secure container infrastructure to host bank-issued credentials 
- Compressing the stack: further shortening the payment authorization required of the consumer by removing the need for PIN entry, and not introducing any new parties into the transaction flow that could have introduced delays, costs or complexity in the roundtrip
- A clear description of the costs to participate – free is ambiguous. Free leads to much angst as to what the true cost of participation really is (remember Google Wallet?). Banks prefer clarity here – even if it means 15bps in credit.

As I wrote above, Apple opting to color strictly inside the lines forces the banks to shoulder much of the responsibility in dealing with the ‘before’ and ‘after’ of payment. Most of the bank partners will be updating or activating parts of their mobile app to start interacting with Passbook/Apple Pay. Much of that interaction will use existing hooks into Passbook – and provide richer transaction detail and context within the app. This is an area of differentiation for the future – because those banks who lack the investment, talent and commitment to build a redeeming mobile services approach will struggle to differentiate on retail footprint alone. And as smarter banks build entirely digital products for an entirely digital audience – the generic approaches will struggle, and I expect at some point that this will drive bank consolidation at the low end. On the other hand – if you are an issuer, the ‘before’ and ‘after’ of payments that you are able to control, and the richer story you are able to weave, along with offline incentives – can aid in recapture.

The conspicuous and continued absence of Google: So whither Android? Uniformity in payments for Android is as fragmented as the ecosystem itself. Android must now look to Apple for lessons in consistency. For example, how Apple uses the same payment credential that is stored in the Secure Element for both in-person retail transactions and in-app payments. It may look trivial – but when you consider that Apple came dangerously close (and justifiably so) in its attempt to obtain parity between those two payment scenarios from a rate economics point of view from issuers – Android flailing around without a coherent strategy is inexcusable. I will say this again: Google Wallet requires a reboot. And word from within Google is that a reboot may not imply a singular or even a cohesive approach. Google needs to swallow its pride and look to converge the Android payments and commerce experience across channels, similar to iOS. Any delay or inaction risks a growing apathy from merchants who must decide which platform is worth building or focusing for.

Risk vs reward is already skewed in favor of iOS: Even if Apple was not convincing enough in its attempt to ask for card-present rates for its in-app transactions – it may have managed to shift liability to the issuer, similar to 3DS and VBV – and that in itself poses an imbalance in favor of iOS. For a retail app in iOS – there is now an incentive to utilize Apple Pay and iOS instead of all the other competing payment providers (PayPal for example, or Google Wallet), because transactional risk shifts to the issuer if my consumer authenticates via TouchID and uses a card stored in Apple Pay. I now have both an incentive to prefer iOS over Android and an opportunity to compress my funnel – much of my imperative to collect data during the purchase was an attempt to quantify fraud risk – and the need for that goes out of the window if the customer chooses Apple Pay. 
This is huge, and the repercussions go beyond Android – into CNP fraud, CRM and loyalty. Networks, tokens and new end-points (e.g. LoopPay): The absence of uniformity in Android has provided a window of opportunity for others – regardless of how fragmented these approaches may be. Networks will parlay the success of tokenization in Apple Pay into Android as well, soon. A prime example: LoopPay. If, as rumors go, Samsung goes through with baking LoopPay into its flagship S6, and Visa’s investment translates into Loop using Visa tokenization – Loop may find the ubiquity it is looking for – on both ends. I don’t necessarily see the value accrued to Samsung for launching a risky play here: specifically because of the impact of putting Loop’s circuitry within the S6. Any transaction failure in this case will be attributed to Samsung – not to Loop, or the merchant, or the bank. That’s a risky move – and I hope – a well thought out one. I have some thoughts on how the Visa tokenization approach may solve for some of the challenges that LoopPay faces on merchant EMV terminals – and I will share those later.

The return of the comeback: Reliance on networks for tokenization does allay some of the challenges faced by payment wrappers like Loop, Coin etc. – but they all focus on the last mile, and tokenization does little more for them than kick the can down the road and delay the inevitable a little while longer. The ones that benefit most are the networks themselves – who now have wide acceptance of their tokenization service, with themselves firmly entrenched in the middle. Even though the EMVCo tokenization standard made no assumptions regarding the role of a Token Service Provider – and in fact issuers or third parties could each play the role sufficiently well – networks have left no room for ambiguity here. With their role as a TSP – networks have more to gain from legitimizing more end-points than ever before – because these translate to more token traffic and subsequently incremental revenue – transactional and additional managed services costs (OBO – “on behalf of” service costs incurred by a card issuer or wallet provider). A simplified sketch of the TSP role appears at the end of this post. It has never been a better time to be a network. I must say – a whiplash effect for all of us who called for their demise with the Chase-VisaNet deal.

So my predictions for Apple Pay a week before its launch: We will see a substantial take-up and provisioning of cards into Passbook over the next year. Easy in-app purchases will act as the carrot for consumers. Apple Pay will be a quick affair at the point of sale: When I tried it a few weeks ago – it took all of 3 seconds. A comparable swipe with a PIN (which is what Apple Pay equates to) took up to 10. A dip with an EMV card took 23 seconds on a good day. I am sure this is not the last time we will be measuring things. The substantial take-up on in-app transactions will drive signups: Consumers will sign up because Apple’s array of in-app partners will include the likes of Delta – and any airline that shortens the whole ticket buying experience to a simple TouchID authentication has my money. Apple Pay will cause MCX to fragment: Even though I expect the initial take-up to be driven more on the in-app side vs in-store, as more merchants switch to Apple Pay for in-app, consumers will expect consistency in that approach across those merchants. 
We will see some high-profile desertions – driven partly by the fact that MCX asks for absolute fealty from its constituents, and in a rapidly changing and converging commerce landscape – that’s just a tall ask. In the near term, Android will stumble: The question is whether Google can reclaim and steady its own strategy, or whether it will spin off another costly experiment in chasing commerce and payments. The former will require it to be pragmatic and bring ecosystem capabilities up to par – and that’s a tall ask when you lack the capacity for vertical integration that Apple has. And from the looks of it – Samsung is all over the place at the moment. Again – not confidence-inducing. ISIS/SoftCard will get squeezed out of breath: SoftCard and the GSMA can’t help but insert themselves into the Apple Pay narrative by hoping that the existence of a second NFC controller on the iPhone 6 validates/favors their SIM-based Secure Element approach and indirectly offers SoftCard/GSMA constituents a pathway to Apple Pay. If that didn’t make a lick of sense – it’s like saying ‘I’m happy about my neighbor’s Tesla because he plugs it into my electric socket’. Discover how an Experian business consultant can help you strengthen your credit and risk management strategies and processes: http://ex.pn/DA_GCP This post originally appeared here.
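For readers less familiar with the token service provider (TSP) role referenced above, here is a minimal, purely illustrative sketch of what network tokenization does at a high level: it maps a real card number to a device-bound token and releases it only when a fresh, valid cryptogram accompanies the transaction. This is not the EMVCo specification or any network's actual implementation; the class, field names and the HMAC-based cryptogram are assumptions made for illustration.

```python
# Minimal, illustrative token service provider (TSP): issues a device-bound
# token for a PAN and detokenizes only when a valid per-transaction cryptogram
# accompanies the token. Not an implementation of the EMVCo spec.

import hashlib
import hmac
import secrets

class TokenServiceProvider:
    def __init__(self):
        self._vault = {}  # token -> (PAN, device-specific key)

    def provision(self, pan, device_id):
        """Issue a token and a device key (held in the device's secure element)."""
        token = "tok_" + secrets.token_hex(8)
        device_key = secrets.token_bytes(16)
        self._vault[token] = (pan, device_key)
        return token, device_key

    def detokenize(self, token, tx_data, cryptogram):
        """Return the PAN only if the cryptogram matches this device and transaction."""
        pan, device_key = self._vault[token]
        expected = hmac.new(device_key, tx_data.encode(), hashlib.sha256).hexdigest()
        return pan if hmac.compare_digest(expected, cryptogram) else None

tsp = TokenServiceProvider()
token, key = tsp.provision("4111111111111111", "device-123")
tx = "merchant=acme;amount=25.00;nonce=8842"
cryptogram = hmac.new(key, tx.encode(), hashlib.sha256).hexdigest()  # computed on-device
print(tsp.detokenize(token, tx, cryptogram))  # prints the PAN only if the cryptogram checks out
```

The sketch shows why the party sitting in the token vault matters: whoever holds the mapping and validates the cryptogram sees every transaction, which is exactly the position the networks have secured for themselves as TSPs.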