By: Mike Horrocks
Managing your portfolio can be a long and arduous process that ties up your internal resources, but without it you take on additional risk and potential losses. The key is to use loan automation to pull together data in a meaningful manner and move from a reactive to a proactive process that can:
- Address the overall risks and opportunities within your loan portfolio
- Get a complete view of the credit and operational risk associated with a credit relationship or portfolio segment
- Monitor and track actionable steps by leveraging both internal and external data
Watch how to avoid the five most common mistakes in loan portfolio management to help you not only reduce overall risk but also identify cross-sell and upsell opportunities. With a more automated process, your lending staff can focus on bringing in new business rather than reacting to delinquencies and following up on credit issues.
There are two sides to every coin, and in banking the question is often: do you want to chase the depositor of that coin, or lend it out? The Federal Reserve's decision to hold interest rates at record lows since the economic downturn gave U.S. banks' loan portfolios a nice boost from 2010-2011, but the subsequent actions and banking environment resulted in deposit growth outpacing loans, leading to a marked reduction in loan-to-deposit ratios across banks since 2011. In fact, there is currently almost $1.30 in deposits for every $1 in loans. This, in turn, has manifested itself as a reduction in net interest margins for all U.S. banks over the last three years – a situation unlikely to improve until the Fed hikes interest rates. Additionally, banks have found that while they are now holding on to more of these deposits, additional regulations – the CFPB looking to evaluate account origination processes, Basel III liquidity concerns, CCAR, and CIP and KYC requirements – have all made the burden of holding these deposits more costly. In fact, the CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:
- Increase the accuracy of data used from CRAs
- Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily
- Ensure consumers are aware and notified of information used to decision the account opening process
- Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs
Lastly, to add to this already challenging environment, technology has switched the channel of choice to the smartphone and has introduced a barrage of risks associated with identity authentication – as well as operational opportunities. As leaders in retail banking and in addressing the needs of your customers, I would like to extend an invitation on behalf of Experian for you to participate in our latest survey on the changing landscape of DDA opportunities. How are regulations changing your product set, what role does mobile play now and in the future, and what are your top priorities for 2015 and beyond? These are just a few of the insights we would like to gain from experts such as you. To access our survey, please click here. Our brief survey should take no more than seven minutes to complete, and your insights will be highly valued as we look to better support you and your organization's demand deposit product needs. Our survey period will close in three weeks, so please respond now. As a sign of our appreciation for your insights, we will send all participants an anonymized aggregation of the responses so that you can see how others view the retail banking marketplace. So take advantage of this chance to learn from your peers, participate in this industry study, and don't leave your strategy to the flip of a coin.
This is the second post in a three-part series. Imagine the circumstances of a traveler coming to a never before visited culture. The opportunity is the new sights, cuisine and cultural experiences. Among the risks are pathogens not previously encountered and the unknown strength of the overall health services infrastructure. In a similar vein, all too frequently we see the following conflict within our client institutions. The internal demands of an ever-increasing competitive landscape drive businesses to seek more data; improved ease of accessibility and manipulation of data; and acceleration in creating new attributes supporting more complex analytic solutions. At the same time, requirements for good governance and heightened regulatory oversight are driving for improved documentation and controlled access, all with improved monitoring and documented and tested controls. As always, the traveler/businessman must respond to the environment, and the best medicine is to be well-informed of both the perils and the opportunities. The good news is that we have seen many institutions invest significantly in their audit and compliance functions over the past several years. This has provided the lender with both better insights into its current risk ecosystem and the improved skill set to continue to refine those insights. The opportunity is for the lender to leverage this new strength. For many lenders, this investment largely has been in response to broadening regulatory oversight to ensure there are proper protocols in place to confirm adherence to relevant rules and regulations and to identify issues of disparate impact. A list of the more high-profile regulations would include:
- Equal Credit Opportunity Act (ECOA) — to facilitate enforcement of fair lending laws and enable communities, governmental entities and creditors to identify business and community development needs and opportunities of women-owned, minority-owned and small businesses.
- Home Mortgage Disclosure Act (HMDA) — to require mortgage lenders to collect and report additional data fields.
- Truth in Lending Act (TILA) — to prohibit abusive or unfair lending practices that promote disparities among consumers of equal creditworthiness but of different race, ethnicity, gender or age.
- Consumer Financial Protection Bureau (CFPB) — evolving rules and regulations with a focus on perceptions of fairness and value through transparency and consumer education.
- Gramm-Leach-Bliley Act (GLBA) — requires companies to give consumers privacy notices that explain the institutions' information-sharing practices. In turn, consumers have the right to limit some, but not all, sharing of their information.
- Fair Debt Collection Practices Act (FDCPA) — provides guidelines for collection agencies that are seeking to collect legitimate debts while providing protections and remedies for debtors.
Recently, most lenders have focused their audit/compliance activities on the analytics, models and policies used to treat consumer/client accounts and relationships. This focus is understandable, since it is these analytics and models that are central to the portfolio performance forecasts and Comprehensive Capital Analysis and Review (CCAR)–mandated stress-test exercises that have received greater emphasis in responding to recent regulatory demands. Thus far at many lenders, this same rigor has not yet been applied to the data itself, which is the core component of these policies and their frequently complex analytics.
The strength of both the individual consumer–level treatments and the portfolio-level forecasts is negatively impacted if the data underlying these treatments is compromised. This data/attribute usage ecosystem demands clarity and consistency in attribute definition and extraction, as well as in new attribute design, implementation into models and treatments, validation and audit. When a lender determines there is a need to enhance its data governance infrastructure, Experian® is a resource to be considered. Experian has this data governance discipline within our corporate DNA — and for good reason. Experian receives large and small files on a daily basis from tens of thousands of data providers. To be sure the data is of high quality and does not contaminate the legacy data, rigorous audits of each file received are conducted and detailed reports are generated on issues of quality and exceptions. This information is shared with the data provider for a cycle of continuous improvement. To further enhance the predictive insights of the data, Experian then develops new attributes and complex analytics leveraging the base and developed attributes for analytic tools. This data and the analytic tools are then utilized by thousands of authorized users/lenders, who manage broad-ranging relationships with millions of individuals and small businesses. These individuals and businesses in turn have the right to come back to Experian to dispute errors both perceived and actual. This demanding cycle underscores the value of the data and the value of our rigorous data governance infrastructure. This very same process occurs at many lender sites. Certainly, a similar level of data integrity born from a comprehensive data governance process also is warranted. In the next and final blog in this series, we will explore how a disciplined business review of an institution's data governance process is conducted. Discover how a proven partner with rich experience in data governance, such as Experian, can provide the support your company needs to ensure a rigorous data governance ecosystem. Do more than comply. Succeed with an effective data governance program.
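The audit-and-exception cycle described above can be pictured as a simple rule-driven check run against every inbound file. The sketch below is only a minimal illustration of that idea in Python, not Experian's actual tooling; the file layout, field names and rules are hypothetical.

```python
import csv
from collections import Counter

# Hypothetical required fields for an inbound contributor file.
REQUIRED_FIELDS = ["account_id", "balance", "status_date"]

def audit_file(path):
    """Scan a delimited file and return a simple exception report:
    row count plus counts of missing or unparseable values."""
    exceptions = Counter()
    rows = 0
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            rows += 1
            for field in REQUIRED_FIELDS:
                if not (row.get(field) or "").strip():
                    exceptions[f"missing_{field}"] += 1
            balance = (row.get("balance") or "").strip()
            if balance:
                try:
                    float(balance)
                except ValueError:
                    exceptions["unparseable_balance"] += 1
    return {"rows": rows, "exceptions": dict(exceptions)}

# A report like this would be shared back with the data provider
# so the next submission improves:
# print(audit_file("contributor_file.csv"))
```

An exception summary of this kind is what feeds the "cycle of continuous improvement" with the data provider, and the same checks can be rerun whenever attributes or file layouts change.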
Opening a new consumer checking account in the 21st century should be simple and easy to understand for the customer, right? Unfortunately, not all banks have 21st-century systems or processes, reflecting the fact that negotiable order of withdrawal (NOW) accounts, or checking accounts, were introduced decades ago within financial institutions and often required the consumer to open the account in person. A lot has changed, and consumers demand simpler, more transparent account opening processes with product choices that match their needs at a price they're willing to pay. Financial institutions that leverage modernized technology capabilities and relevant decision information have the best chance to deliver consumer-friendly experiences that meet consumer expectations. It is obvious to consumers when we in the financial services industry get it right and when we don't. The process to open a checking account should be easily understood by consumers and provide them with appropriate product choices that aren't "one size fits all." Banks with more advanced core banking systems incorporating relevant and compliant decision data and transparent, consumer-friendly approval processes have a huge opportunity to differentiate themselves positively from competitors. The reality is that banking deposit management organizations throughout the United States continue to evolve checking account screening strategies, technology and processes. This is done in an effort to keep up with evolving regulatory expectations from consumer advocacy regulatory bodies such as the Consumer Financial Protection Bureau (CFPB) and to improve the transparency of checking account screening for a greater number of consumers. The CFPB advocates that financial institutions adopt new checking account decision processes and procedures that maintain sound management practices related to mitigating fraud and risk expense while improving consumer transparency and increasing access to basic consumer financial instruments. Bank shareholders demand that these accounts be extended to consumers profitably. The CFPB recognizes that checking accounts are a basic financial product used by almost all consumers, but it has expressed concerns that checking account screening processes may prevent access for some consumers and may be too opaque with respect to the reasons why a consumer may be denied an account. The gap between the expectations of the CFPB and shareholders and deposit management organizations' current products and procedures is not as wide as it may seem. The solution to closing the gap includes deploying a more holistic approach to checking account screening processes utilizing 21st-century technology and decision capabilities. Core banking technology and checking products developed decades ago leave banks struggling to enact much-needed improvements for consumers. The CFPB recognizes that many financial institutions rely on reports used for checking account screening that are provided by specialty consumer reporting agencies (CRAs) to decision approval for new customers. CRAs specialize in checking account screening and provide financial institutions with consumer information that is helpful in determining whether a consumer should be approved. Information such as the consumer's check-writing and account history (for example, closed accounts or bounced checks) is an important factor in determining eligibility for the new account.
Financial institutions are also allowed to screen consumers to assess whether they may be a credit risk when deciding whether to open a consumer checking account, because many consumers opt in to overdraft functionality attached to the checking account. Richard Cordray, the CFPB Director, clarified the regulatory agency's position on how consumers are treated in checking account screening processes in his prepared remarks at a forum on this topic in October 2014. "The Consumer Bureau has three areas of concern. First, we are concerned about the information accuracy of these reports. Second, we are concerned about people's ability to access these reports and dispute any incorrect information they may find. Third, we are concerned about the ways in which these reports are being used." The CFPB suggests four items it believes will improve financial institutions' checking account screening policies and practices:
- Increase the accuracy of data used from CRAs
- Identify how institutions can incorporate risk screening tools while not excluding potential accountholders unnecessarily
- Ensure consumers are aware and notified of information used to decision the account opening process
- Ensure consumers are informed of what account options exist and how they can access products that align with their individual needs
Implementing these steps shouldn't be too difficult for deposit management organizations, provided they fully leverage software such as Experian's PowerCurve customized for deposit account origination and relevant decision information such as Experian's Precise ID platform and VantageScore® credit score, combined with consumer product offerings developed within the bank and offered in an environment that is real-time where possible and considers the consumer's needs. Enhancing checking account screening procedures by taking into account consumers' life stage, affordability considerations, unique risk profiles and financial needs will satisfy the expectations of consumers, regulators and financial institution shareholders. Financial institutions that use technology and data wisely can reduce expenses for their organizations by efficiently managing fraud, risk and operating costs within the checking account screening process while also delighting consumers. Regulatory agencies are often delighted when consumers are happy. Shareholders are delighted when regulators and consumers are happy. Reengineering checking account opening processes for the modern age results in a win-win-win for consumers, regulators and financial institutions. Discover how an Experian Global Consultant can help you with your banking deposit management needs.
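A more holistic screening decision of the kind described in this post might combine an identity check, CRA account history and a risk score to route an applicant to an appropriate product rather than issuing a flat decline. The Python sketch below is purely illustrative; the inputs, score cutoff and product names are assumptions, not Experian's or any bank's actual decision policy.

```python
from dataclasses import dataclass

@dataclass
class Applicant:
    id_verified: bool          # outcome of an identity/fraud check
    prior_account_abuse: bool  # e.g., unpaid closed accounts reported by a CRA
    risk_score: int            # a generic credit/deposit risk score

def screen_checking_applicant(app: Applicant) -> str:
    """Route the applicant to a product rather than a simple approve/decline."""
    if not app.id_verified:
        return "refer_for_manual_identity_review"
    if app.prior_account_abuse:
        # Offer a basic, low-risk account (no overdraft) instead of declining outright.
        return "offer_second_chance_account"
    if app.risk_score >= 620:  # hypothetical cutoff
        return "approve_standard_account_with_overdraft_option"
    return "approve_basic_account_no_overdraft"

print(screen_checking_applicant(Applicant(True, False, 700)))
```

Routing marginal applicants to a basic account instead of declining them is one way to line up with the CFPB's access and transparency goals while still managing fraud and risk expense.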
Originally contributed by: Bill Britto
Smart meters have made possible new services for customers, such as automated budget assistance and bill management tools, energy use notifications, and "smart pricing" and demand response programs. It is estimated that more than 50 million smart meters have been deployed as of July 2014. Utilities and customers alike are benefiting from these smart meter deployments. It is now obvious the world of utilities is changing, and companies are beginning to cater more to their customers by offering them tools to keep their energy costs lower. For example, several companies offer prepay to some of their customers who do not have bank accounts. For many of those "unbanked" customers, prepay could be the only way to sign up for utility services. Understanding the value of prospects and the need to automate decisions to achieve higher revenue and curb losses is imperative for the utility. It is here that a decisioning solution like PowerCurve OnDemand can make a real difference for utility customers by providing modified decision strategies based on market dynamics and business and economic environments. Imagine what a best-in-class decision solution can do by identifying what matters most about consumers and businesses and by leveraging internal and external data assets to replace complexity with cost efficiency. Solutions like PowerCurve OnDemand deliver the power and speed-to-market to respond to changing customer demands, driving profitability and growing customer lifetime value – good for business and good for customers.
A new co-marketing agreement for MainStreet Technologies' (MST) Loan Loss Analyzer product with Experian Decision Analytics' Baker Hill Advisor® product will provide the banking industry with a comprehensive, automated loan-management offering. The combined products provide banks greater confidence for loan management and loan-pricing calculations. Experian Decision Analytics' Baker Hill Advisor product supports banks' commercial and small-business loan operations comprehensively, from procuring new loans through collections. MST's Loan Loss Analyzer streamlines the estimation and documentation of the Allowance for Loan and Lease Losses (ALLL), the bank's most critical quarterly calculation. The MST product automates the most acute processes required of community bankers in managing their commercial and small-business loan portfolios. Both systems are data-driven, configurable and designed to accommodate existing bank processes. The products already work together effectively for community banks of varying asset sizes, adding efficiency and accuracy while addressing today's increasingly complex regulatory requirements. "Experian's Baker Hill Advisor product-development priorities have always been driven by our user community. Changes in regulatory and accounting requirements have our clients looking for a sophisticated ALLL system. Working with MainStreet, we can refer our clients to an industry-leading ALLL platform," said John Watts, Experian Decision Analytics director of product management. "The sharing of data between our organizations creates an environment where strategic ALLL calculations are more robust and tactical lending decisions can be made with more confidence. It provides clients a complete service at every point within the organization." "Bankers, including many using our Loan Loss Analyzer, have used Experian's Baker Hill® software to manage their commercial loan programs for more than three decades," said Dalton T. Sirmans, CEO and president of MST. "Bankers who choose to implement Experian's Baker Hill Advisor and the MST Loan Loss Analyzer will be automating their loan management, tracking, reporting and documentation in the most comprehensive, user-friendly and feature-rich manner available." For more information on MainStreet Technologies, please visit http://www.mainstreet-tech.com/banking For more information on Baker Hill, visit http://ex.pn/BakerHill
By: John Robertson
I began this blog series asking the question "How can banks offer such low rates?" and exploring the relationship of pricing in a rate environment that has yet to normalize. I outlined a simplistic view of loan pricing as:
  Interest Income
+ Non-Interest Income
- Cost of Funds
- Non-Interest Expense
- Risk Expense
= Income before Tax
Along those lines, I outlined how perplexing it is to think that at some of these current levels banks could possibly make any money. I suggested these offerings must be loss leaders with the anticipation of more business in the future or, possibly, additional deposits to maintain a hold on the relationship over time. Or, I shudder to think, banks could be short funding the loans with the excess cash on their balance sheets. I did stumble across another possibility while proving out an old theory, which was very revealing. The old theory, stated by a professor many years ago, was "Margins will continue to narrow ... forever." We've certainly seen that in the consumer world. In pursuit of proof of this theory I went to the trusty UBPR and looked at the net interest margin results from 2011 until today for two peer groups (insured commercial banks from $300 million to $1 billion and insured commercial banks greater than $3 billion). What I found was that, in fact, margins have narrowed anywhere from 10 to 20 basis points for those two groups during that span even though non-interest expense stayed relatively flat. Not wanting to stop there, I started looking at one of the biggest players individually and found an interesting difference in their C&I portfolio. Their non-interest expense number was comparable to the others, as was their cost of funds, but the swing component was non-interest income. One line item on the UBPR's income statement is overhead (i.e., non-interest expense) minus non-interest income (NII). This bank had a strategic advantage when pricing their loans due to their fee income generation capabilities. They are not just looking at spread but at contribution as well to ensure they meet their stated goals. So why do banks hesitate to ask for a fee if a customer wants a certain rate? Someone seems to have figured it out. Your thoughts?
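To make the arithmetic concrete, here is a toy calculation of the pricing equation above. The loan size and percentages are made-up figures for illustration only; they are not from the post, but they show how a modest amount of fee (non-interest) income can rescue an otherwise thin spread.

```python
def income_before_tax(interest_income, non_interest_income,
                      cost_of_funds, non_interest_expense, risk_expense):
    """Simplistic loan pricing view: interest + fees - funding - overhead - risk."""
    return (interest_income + non_interest_income
            - cost_of_funds - non_interest_expense - risk_expense)

loan = 1_000_000  # hypothetical C&I loan balance

# Spread-only pricing: 3.50% coupon, no fees.
spread_only = income_before_tax(
    interest_income=0.035 * loan,
    non_interest_income=0.0,
    cost_of_funds=0.010 * loan,
    non_interest_expense=0.020 * loan,
    risk_expense=0.004 * loan)

# The same loan with 50 basis points of fee income.
with_fees = income_before_tax(
    interest_income=0.035 * loan,
    non_interest_income=0.005 * loan,
    cost_of_funds=0.010 * loan,
    non_interest_expense=0.020 * loan,
    risk_expense=0.004 * loan)

print(spread_only, with_fees)  # 1000.0 vs. 6000.0
```

On these assumed numbers the spread alone contributes only about 10 basis points before tax, while the added fee income lifts the contribution to roughly 60 basis points, which is the "swing component" the post describes.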
More than 10 years ago, I spoke about a trend toward underutilization of the information being managed by companies. I referred to this trend as "data skepticism." Companies weren't investing the time and resources needed to harvest the most valuable asset they had – data. Today the volume and variety of data are only increasing, as is the necessity to successfully analyze any relevant information to unlock its significant value. Big data can mean big opportunities for businesses and consumers. Businesses get a deeper understanding of their customers' attitudes and preferences to make every interaction with them more relevant, secure and profitable. Consumers receive greater value through more personalized services from retailers, banks and other businesses. Recently, Experian North American CEO Craig Boundy wrote about that value, stating, "Data is Good… Analytics Make it Great." The good we do with big data today in handling threats posed by fraudsters is the result of a risk-based approach that prevents fraud by combining data and analytics. Within Experian Decision Analytics, our data decisioning capabilities unlock that value to ultimately provide better products and services for consumers. The same expertise, accurate and broad-reaching data assets, targeted analytics, knowledge-based authentication, and predictive decisioning policies used by our clients for risk-based decisioning have been used by Experian to become a global leader in fraud and identity solutions. The industrialization of fraud continues to grow, with an estimated 10,000 fraud rings in the U.S. alone and more than 2 billion unique records exposed as a result of data breaches in 2014. Experian continues to bring together new fraud platforms to help the industry better manage fraud risk. Our 41st Parameter technology has been able to detect over 90% of all fraud attacks against our clients and reduce their operational costs to fight fraud. Combining data and analytics assets can detect fraud, but more importantly, it can also detect the good customers so legitimate transactions are not blocked. Gartner reported that by 2020, 40% of enterprises will be storing information from security events to analyze and uncover unusual patterns. Big data uncovers remarkable insights that inform the future of our fraud prevention efforts and can also mitigate the financial losses associated with a breach. In the end we need more data, not less, to keep up with fraudsters. Experian is hosting Future of Fraud and Identity events in New York and San Francisco to discuss current fraud trends and how to prevent cyber-attacks, with the aim of helping the industry. The past skepticism no longer holds true, as companies are realizing that data combined with advanced analytics can give them the insight they need to prevent fraud in the future. Learn more about how Experian is conquering the world of big data.
If rumors hold true, Apple Pay will launch in a week. Five of my last six posts have covered Apple's likely and actual strategy in payments and commerce, and the rich tapestry of control, convenience, user experience, security and applied cryptography that constitutes the backdrop. What follows is a summation of my views – with a couple of observations from having seen the Apple Pay payment experience up close. About three years ago I published a similar commentary on Google Wallet that, for kicks, you can find here. I hope what follows is a balanced perspective, as I try to cut through some FUD, provide some commentary on the payment experience, and offer up some predictions that are worth the price you pay to read my blog. First, the criticism. Apple Pay doesn't go far enough: Fair. But you seem to misunderstand Apple's intentions here. Apple did not set out to make a mobile wallet. Apple Pay sits within Passbook – which in itself is a wrapper of rewards and loyalty cards issued by third parties. Similarly, Apple Pay is a wrapper of payment cards issued by third parties. Even the branding disappears once you provision your cards – when you are at the point of sale and your iPhone 6 is in proximity to the reader (or enters the magnetic field created by the reader), the screen turns on and your default payment card is displayed. One does not need to launch an app or fiddle around with Apple Pay. And for that matter, it's even more limited than you think. Apple's choice to leave the Passbook-driven Apple Pay experience as threadbare as possible seems an intentional choice to force consumers to interact with their bank apps rather than Passbook for any and all rich interaction. In fact, the transaction detail displayed on the back of the payment card you use is limited – but you can launch the bank app to view and do a lot more. Similarly, the bank app can prompt a transaction alert that the consumer can select to view more detail as well. Counter to what has been publicized, Apple can – if it chooses to – view transaction detail including consumer info, but it only retains anonymized info on its servers. The contrast is apparent with Google – where (during early Google Wallet days) issuers dangled the same anonymized transaction info to appease Google in return for participation in the wallet. If your tap doesn't work, will you blame Apple? Some claim that any transaction failure – such as a non-working reader – will cause consumers to blame Apple. This does not hold water, simply because Apple does not get in between the consumer, his chosen card and the merchant during payment. It provides the framework to trigger and communicate a payment credential – and then quietly gets out of the way. This is where Google stumbled – by wanting to become the perennial fly on the wall. And so if for whatever reason the transaction fails, the consumer sees no Apple branding at which to direct blame. (I draw a contrast later on below with Samsung and LoopPay.) Apple Pay is not secure: Laughable and pure FUD. This article references a UBS note arguing that Apple Pay is insecure compared to a pure cloud-based solution such as the yet-to-be-launched MCX.
This is due to a total misunderstanding of not just Apple Pay but the hardware/software platform it sits within (and I am not just talking about the benefits of a TouchID, network tokenization, issuer cryptogram, Secure Element–based approach), including the full weight of security measures that have been baked into iOS and the underlying hardware, which come together to offer the best container for payments. And against all that backdrop of applied cryptography, Apple still sought to overlay its payments approach over an existing framework – so that when it comes to risk, it leans away from the consumer and toward a bank that understands how to manage risk. That's the biggest disparity between these two approaches – Apple Pay and MCX: Apple built a secure wrapper around an existing payments hierarchy, while the latter seeks to disrupt that status quo. Let the games begin: Consumers should get ready for an ad blitz from each of the launch partners of Apple Pay over the next few weeks. I expect we will also see these efforts concentrated around pockets of activation – because setting up Apple Pay is the next step after entering your Apple ID during activation. And for that reason, each of those launch partners understands the importance of reminding consumers why their card should be top of mind. There is also a subtle but important difference between the top-of-wallet card (or default card) for payment in Apple Pay and its predecessors (Google Wallet, for example). Changing your default card was an easy task – and wholly encapsulated – within the Google Wallet app. Whereas in Apple Pay, changing your default card is buried under Settings, and I suspect that once you choose your default card, you are unlikely to bother with it again. And here's how quick the payment interaction is within Apple Pay (it takes under three seconds): bring your phone into proximity of the reader; the screen turns on; Passbook is triggered and your default card is displayed; you place your finger and authenticate using TouchID; a beep notes the transaction is completed; you can flip the card to view limited transaction detail. Yes, you could swipe down and choose another card to pay. But that is unlikely. I remember how LevelUp used very much the same strategy to sign up banks – stating that over 90% of its customers never change their default card inside LevelUp. This will be a blatant land grab over the next few months – as tens of millions of new iPhones are activated. According to what Apple has told its launch partners, it expects over 95% of activations to add at least one card. What does this mean for banks that won't be ready in 2014 or haven't yet signed up? As I said before, there will be a long tail of reduced utility as we get into community banks and credit unions. The risk is amplified because Apple Pay is the only way to enable payments in iOS that uses Apple's secure infrastructure – and NFC. For those still debating whether it was a shotgun wedding, Apple's approach had five main highlights that appealed to a bank:
- Utilizing an approach that was friendly to banks (and to the status quo): NFC
- Securing the transaction beyond the prerequisites of EMV contactless – via network tokenization and TouchID
- Apple's preference to stay entirely an enabler – facilitating a secure container infrastructure to host bank-issued credentials
- Compressing the stack: further shortening the payment authorization required of the consumer by removing the need for PIN entry, and not introducing any new parties into the transaction flow that could have added delays, costs or complexity to the roundtrip
- A clear description of the costs to participate: "free" is ambiguous, and free leads to much angst as to what the true cost of participation really is (remember Google Wallet?). Banks prefer clarity here – even if it means 15bps in credit.
As I wrote above, Apple opting to color strictly inside the lines forces the banks to shoulder much of the responsibility in dealing with the 'before' and 'after' of payment. Most of the bank partners will be updating or activating parts of their mobile apps to start interacting with Passbook/Apple Pay. Much of that interaction will use existing hooks into Passbook – and provide richer transaction detail and context within the app. This is an area of differentiation for the future – because those banks that lack the investment, talent and commitment to build a redeeming mobile services approach will struggle to differentiate on retail footprint alone. And as smarter banks build entirely digital products for an entirely digital audience, the generic approaches will struggle, and I expect that at some point this will drive bank consolidation at the low end. On the other hand, if you are an issuer, the 'before' and 'after' of payments that you are able to control, and the richer story you are able to weave along with offline incentives, can aid in recapture. The conspicuous and continued absence of Google: So whither Android? Payments on Android are as fragmented as the ecosystem itself. Android must now look at Apple for lessons in consistency – for example, how Apple uses the same payment credential that is stored in the Secure Element for both in-person retail transactions and in-app payments. It may look trivial, but when you consider that Apple came dangerously close (and justifiably so) in its attempt to obtain parity between those two payment scenarios from a rate-economics point of view with issuers, Android flailing around without a coherent strategy is inexcusable. I will say this again: Google Wallet requires a reboot. And word from within Google is that a reboot may not imply a singular or even a cohesive approach. Google needs to swallow its pride and look to converge the Android payments and commerce experience across channels, similar to iOS. Any delay or inaction risks a growing apathy from merchants who must decide which platform is worth building for or focusing on. Risk vs. reward is already skewed in favor of iOS: Even if Apple was not convincing enough in its attempt to ask for card-present rates for its in-app transactions, it may have managed to shift liability to the issuer, similar to 3DS and VbV – and that in itself poses an imbalance in favor of iOS. For a retail app in iOS, there is now an incentive to utilize Apple Pay and iOS instead of all the other competing payment providers (PayPal, for example, or Google Wallet), because transactional risk shifts to the issuer if my consumer authenticates via TouchID and uses a card stored in Apple Pay. I now have both an incentive to prefer iOS over Android and an opportunity to compress my funnel – much of my imperative to collect data during the purchase was an attempt to quantify fraud risk, and the need for that goes out the window if the customer chooses Apple Pay.
This is huge, and the repercussions go beyond Android – into CNP fraud, CRM and loyalty. Networks, tokens and new end points (e.g., LoopPay): The absence of uniformity in Android has provided a window of opportunity for others – regardless of how fragmented these approaches may be. Networks will parlay the success of tokenization in Apple Pay into Android as well, and soon. A prime example: LoopPay. If, as rumors go, Samsung goes through with baking LoopPay into its flagship S6, and Visa's investment translates into Loop using Visa tokenization, Loop may find the ubiquity it is looking for – on both ends. I don't necessarily see the value accrued to Samsung in launching a risky play here, specifically because of the impact of putting Loop's circuitry within the S6. Any transaction failure in this case will be attributed to Samsung – not to Loop, or the merchant, or the bank. That's a risky move – and, I hope, a well-thought-out one. I have some thoughts on how the Visa tokenization approach may solve some of the challenges that LoopPay faces on merchant EMV terminals – and I will share those later. The return of the comeback: Reliance on networks for tokenization does allay some of the challenges faced by payment wrappers like Loop, Coin, etc. – but they all focus on the last mile, and tokenization does little more for them than kicking the can down the road and delaying the inevitable a little while longer. The ones that benefit most are the networks themselves, who now have wide acceptance of their tokenization service – with themselves firmly entrenched in the middle. Even though the EMVCo tokenization standard made no assumptions regarding the role of a Token Service Provider – and in fact issuers or third parties could each play the role sufficiently well – networks have left no room for ambiguity here. With their role as a TSP, networks have more to gain from legitimizing more end points than ever before, because these translate to more token traffic and subsequently incremental revenue – transactional fees and additional managed-services costs (OBO – on-behalf-of service costs incurred by a card issuer or wallet provider). It has never been a better time to be a network. I must say – a whiplash effect for all of us who called for their demise with the Chase-VisaNet deal. So here are my predictions for Apple Pay a week before its launch: We will see a substantial take-up and provisioning of cards into Passbook over the next year. Easy in-app purchases will act as the carrot for consumers. Apple Pay will be a quick affair at the point of sale: when I tried it a few weeks ago, it took all of three seconds. A comparable swipe with a PIN (which is what Apple Pay equates to) took up to 10 seconds. A dip with an EMV card took 23 seconds on a good day. I am sure this is not the last time we will be measuring things. The substantial take-up of in-app transactions will drive signups: consumers will sign up because Apple's array of in-app partners will include the likes of Delta – and any airline that shortens the whole ticket-buying experience to a simple TouchID authentication has my money. Apple Pay will cause MCX to fragment: even though I expect the initial take-up to be driven more on the in-app side than in-store, as more merchants switch to Apple Pay for in-app, consumers will expect consistency in that approach across those merchants.
We will see some high-profile desertions, driven partly by the fact that MCX asks for absolute fealty from its constituents, and in a rapidly changing and converging commerce landscape that's just a tall ask. In the near term, Android will stumble: the question is whether Google can reclaim and steady its own strategy, or whether it will spin off another costly experiment in chasing commerce and payments. The former will require it to be pragmatic and bring ecosystem capabilities up to par – and that's a tall ask when you lack the capacity for vertical integration that Apple has. And from the looks of it, Samsung is all over the place at the moment. Again, not confidence-inducing. ISIS/Softcard will get squeezed out of breath: Softcard and the GSMA can't help but insert themselves into the Apple Pay narrative by hoping that the existence of a second NFC controller in the iPhone 6 validates and favors their SIM-based Secure Element approach and indirectly offers Softcard/GSMA constituents a pathway to Apple Pay. If that didn't make a lick of sense, it's like saying "I'm happy about my neighbor's Tesla because he plugs it in to my electric socket." Discover how an Experian business consultant can help you strengthen your credit and risk management strategies and processes: http://ex.pn/DA_GCP This post originally appeared here.
By: Maria Moynihan
Mobile devices are everywhere, and landlines and computer desktops are becoming things of the past. A recent American Marketing Association post mentioned that there already are more than 1 billion smartphones and more than 150 million tablets worldwide. As growth in mobile devices continues, so do expectations around convenience, access to mobile-friendly sites and apps, and security. What is your agency doing to get ahead of this trend? Allocating resources toward mobile device access and improved customer service is inevitable, and, arguably, investment and shifts in one of these areas ultimately will affect the other. As ease of access to information and services improves online or via mobile app, secure logons, identity theft safeguards and authentication measures must all follow suit. Industry best practices in network security call for advancements in:
- Authenticating users and their devices at the point of entry
- Detecting new and emerging fraud schemes in processes
- Developing seamless cross-checks of individuals across channels
Click here to see what leading information service providers like Experian are doing to help address fraud across devices. There is a way to confidently authenticate individuals without affecting their overall user experience. Embrace the change.
In a recent webinar, we addressed how both the growing diversity of technology used for online transactions and the many different types of access can make authentication complicated. Technology is ever-changing and is continually reshaping the way we live. This leaves our industry to question how device intelligence factors into both the problem and the solution surrounding diverse technologies in the online transaction space. Industry experts Cherian Abraham from the Experian Decision Analytics team and David Britton from 41st Parameter, a part of Experian, weighed in on the discussion.
Putting It All Into Context
Britton harkened back to a simpler time of authentication practices. In the early days of the web, user names and passwords were the only tools people had to authenticate online identities. Eventually, this led organizations to begin streamlining the process. "They did things like using cookies or placing files onto a computer so that the computer would be 'known' to the business," said Britton. However, those original methods are now struggling to fit into the modern-day authentication puzzle. "The challenge has been that for both privacy reasons and for the advancements of technology we have actually moved to a more privacy-centric environment where those types of things have fallen away in terms of their efficacy. For example, cookies are often easily deleted by simply browsing incognito. So as a result there's been a counter move approach to how to authenticate online," said Britton.
New Technology – A Quick Fix?
Don't be fooled. Newer technologies cannot necessarily provide an easy alternative or simply incorporate older authentication methods. Britton referenced how the advent of mobile has actually made recognizing the consumer behind the device, the behavior of the machine and the data that the consumer is presenting even more complex. Additionally, rudimentary methods of authentication don't work well in the mobile environment. On the other hand, newer technologies and the mobile environment force a more layered approach to authentication methods. "There is a better way and the better way is to look at a variety of other inspirations beyond user names and passwords before vindicating the customer. This is all the more evident when you get to newer channels such as mobile where consumer expectations are so different and you cannot rely on the customer having to answer a long stream of characters and letters such as a user name or a password," said Abraham. Britton weighed in as well on device intelligence and the layered approach. "Our whole philosophy around this has been that if you can recognize aspects of the device in the form of device intelligence – we're able to actually leverage that information without crossing the boundaries of good privacy management. Furthermore, we are then able to say we recognize the attributes of the device and can recognize the device as that person is attempting to come back into an environment," said Britton. He emphasized how being able to help companies understand who might be on the other end of the device has made a world of difference. This increasingly points to how authentication will continue to evolve in a multi-device, multi-screen and multi-channel environment. For more information and access to the full webinar, stay tuned for additional #fraudlifecycle posts.
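The "recognize the attributes of the device" idea can be sketched as a simple overlap score between the device presenting itself now and a device previously seen on the account. The Python below is a toy illustration of that layered, device-intelligence approach, not 41st Parameter's actual method; the attributes and weights are assumptions made for the example.

```python
# Hypothetical weights: how strongly each matching attribute suggests "same device."
ATTRIBUTE_WEIGHTS = {
    "user_agent": 0.20,
    "screen_resolution": 0.15,
    "timezone": 0.15,
    "fonts_hash": 0.25,
    "ip_network": 0.25,
}

def device_match_score(known_device: dict, observed_device: dict) -> float:
    """Return a 0..1 similarity score by summing the weights of matching attributes."""
    return sum(weight for attr, weight in ATTRIBUTE_WEIGHTS.items()
               if known_device.get(attr) == observed_device.get(attr))

known = {"user_agent": "Safari/8", "screen_resolution": "1136x640",
         "timezone": "-5", "fonts_hash": "a91f", "ip_network": "203.0.113"}
observed = dict(known, ip_network="198.51.100")  # same device, different network

print(device_match_score(known, observed))  # ~0.75: likely the returning customer
```

A score like this would feed one layer of a decision, alongside behavioral and transactional signals, rather than replacing user names and passwords outright.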
Fraud is not a point-in-time problem, and data breaches should not be considered isolated attacks that break through network defenses to abscond with credentials. In fact, data breaches are just the first stage of a rather complex lifecycle that begins with a vulnerability, advances through several stages of validation and surveillance, and culminates with a fraudulent transaction or monetary theft. Cyber criminals are sophisticated and have a growing arsenal of weapons at their disposal to infect individual and corporate systems and capture account information: phishing, SMiShing and vishing attacks, malware, and the like are all attempts to thwart security and access protected information. Criminal tactics have even evolved to include physical-world approaches, like infiltrating physical call centers via social engineering attacks aimed at unsuspecting representatives. These and similar efforts are all part of the constant quest to identify and exploit weaknesses in order to stage and commit financial crimes. There are some companies that claim malware detection is the silver bullet to preventing fraud. This is simply not the case. The issue is that malware is only one method by which fraudsters may obtain credentials. The seemingly endless supply of pristine identity and account data in the criminal underground means that detecting that a user's system has been compromised is akin to closing the barn door after the horse has bolted. That is, malware can be an indicator that an account has been compromised, but it does not help identify the subsequent usage of the stolen credentials by the criminals, regardless of how the credentials were compromised. Compromised data is first validated by the seller as one of their "value adds" to the criminal underground and typically again by the buyer. Validation usually involves logging into an account to ensure that the credentials work as expected, and it allows for a much higher "validated" price point. Once the credentials and/or account have been validated, cyber criminals can turn their attention to surveillance. Remember, by the time one realizes that credential information has been exposed, cyber criminal rings have captured the information they need – such as usernames, passwords, challenge responses and even token or session IDs – and have added it to their underground data repositories. With traditional online authentication controls, it is nearly impossible to detect the initial fraudulent login that uses ill-gotten credentials. That is why it is critical to operate from the assumption that all account credentials have been compromised when designing an online authentication control scheme.
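Designing controls on the assumption that every credential may already be compromised means a correct password alone never clears a login; it has to be weighed together with other signals. The sketch below is one generic way to express that idea in Python; the signals, weights and thresholds are illustrative assumptions, not any particular product's logic.

```python
def login_decision(password_ok: bool, device_recognized: bool,
                   geo_velocity_anomaly: bool, new_payee_added: bool) -> str:
    """Score a login under the assumption that the credentials may be stolen."""
    if not password_ok:
        return "deny"
    risk = 0
    if not device_recognized:
        risk += 2   # stolen credentials usually arrive on an unfamiliar device
    if geo_velocity_anomaly:
        risk += 2   # "impossible travel" between recent sessions
    if new_payee_added:
        risk += 1   # post-takeover surveillance and cash-out behavior
    if risk >= 3:
        return "step_up_authentication"
    if risk >= 1:
        return "allow_with_monitoring"
    return "allow"

# Correct password, but an unknown device plus an impossible-travel pattern:
print(login_decision(True, False, True, False))  # step_up_authentication
```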
By: Maria Moynihan
As consumers, we expect service, don't we? When service or convenience lessens or is taken away from us altogether, we struggle to comprehend it. As a recent example, I went to the pharmacy the other day and learned that I couldn't pick up my prescription since the pharmacists were out to lunch. "Who takes lunch anymore?" I thought, but then I realized that too often organizations limit their much-needed services as a cost-saving measure. Government is no different. City governments, for instance, may reduce operating hours or slash services to better balance budgets, especially when collectables are maxed out with little movement. For many agencies, reducing services is the easiest way to offset costs. Yet municipalities can also offset revenue deficits by optimizing their current collections processes and engaging in new methods of revenue generation. Why, then, isn't revenue optimization and modernization considered more often as a means to offset costs? Some may simply be unsure of how to approach it or unaware of the tools that exist to help. For agencies challenged with collections, there is an option for revenue assurance. With the right data, analytics and technologies, agencies can maximize collection efforts and take advantage of their past-due fines and fees to:
- Turn stale debt into a new source of revenue by determining the value of their entire debt portfolio and evaluating options for a stale-assets sale
- Reduce delinquencies by better assessing constituents and businesses at the point of transaction and collecting outstanding debt before new services are rendered
- Minimize current debt by segmenting and prioritizing collection efforts through finding and contacting debtors and gauging their capacity to pay
- Improve future accounts receivable streams by identifying the best collectable debt for outsourcing
What is your agency doing to offset costs and balance budgets better? See what industry experts suggest as best practices for collections, and generate more revenue to keep services fully in place for your constituents.
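The segmenting-and-prioritizing step in the list above can be sketched as ranking past-due accounts by expected recovery, that is, balance weighted by an estimated capacity or propensity to pay. The example below is a minimal Python illustration with made-up accounts and scores; it is not a prescribed scoring model.

```python
# Hypothetical past-due accounts: (account_id, balance_owed, propensity_to_pay 0..1)
accounts = [
    ("A-101", 1200.00, 0.10),
    ("A-102",  250.00, 0.85),
    ("A-103", 4800.00, 0.40),
]

# Rank by expected recovery so limited collection effort goes where it pays off most.
ranked = sorted(accounts, key=lambda a: a[1] * a[2], reverse=True)

for acct_id, balance, propensity in ranked:
    print(f"{acct_id}: expected recovery ${balance * propensity:,.2f}")
# A-103 ($1,920.00) ranks ahead of A-102 ($212.50) and A-101 ($120.00)
```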
By: Mike Horrocks
A recent industry survey called out that the number one reason lenders were dissatisfied or willing to go to another financial institution (and take their book of business with them) was not compensation. While compensation is often thought of as the number one driver for this kind of change in your bench of lenders, it had much more to do with being able to serve customers efficiently. One of the key reasons lenders were unhappy was that they were stuck in a workflow and decisioning process in which the lender could not close loans on time, putting stress on the loan officer's relationships and destroying borrower confidence. Thinking of my own experiences as a commercial lender and my interactions with the private bankers, branch managers and lenders that served every kind of customer, I would absolutely have to agree with this study. Nothing is more disheartening than working on bringing in a client and then having the process fail to deliver a response in the time my clients are expecting or that the competition is achieving. Automation in the process is the key. While lenders will still need to be engaged in the process and pay attention to the relationship, much of their time can be refocused on other parts of the business. This leads to benefits such as:
- Protecting the back office and the consistency of booking and servicing loans
- Ensuring that the risk appetite is consistent for the institution for every deal
- Growing a portfolio of loans that can and will adhere to sound portfolio management techniques
So how is your process supporting lenders? Are you automating to help in areas that give you a competitive advantage – with robust credit scores, decision strategies or risk management solutions that help close deals quickly – or are you requiring a process that keeps lenders from bringing more customers (and profits) in the door? Henry Ford is credited with saying, "Coming together is a beginning. Keeping together is progress. Working together is success." Take a closer look at your lending process. Do you have the tools that help bring your lenders, your customers and your organization together? If you don't, you may be losing some of your best talent for loan production at a time when you can least afford it.
Cherian Abraham, our mobile commerce and payments consultant, recently wrote about the future of mobile banking in light of the Apple Pay news out this week. The article below originally appeared in American Banker and is an edited version of his blog post. Editor's note: A version of this post originally appeared on Drop Labs. Depending on who you ask, the launch of Apple Pay was either exciting or uninspiring. The truth is far more complicated — particularly in terms of how it will impact the dynamics of Apple's relationship with banks. I would venture that most of the financial institutions on stage at the launch of Apple Pay earlier this week have mixed feelings about their partnership. They have had to sacrifice a lot of the room for negotiation that banks have retained with other wallet players such as Google Wallet and Softcard (the company formerly known as Isis). If you are an Apple Pay launch partner, having your credential or token on Apple Pay does not mean that you get to extend that credential into your own mobile banking app or wallet. For example, Bank A, with its credentials stored on Apple Pay, cannot leverage those credentials so that its own mobile banking app can use them to enable direct payments. Banks will have to accept that their credentials will be locked to Apple Pay until deletion. No bank wants its brand to be overshadowed by Apple, nor do banks want smartphone users to close their app and open up a different wallet to make a payment. But this was not up for debate with Apple, which wants to tightly control the payment experience. This should be a cause for concern for Apple Pay partner banks, for whom enabling payments outside of Apple Pay in iOS is now off the table. Banks' only hope of having an integrated payment experience is to focus on Android, which supports host card emulation technology. HCE uses software to emulate a contactless smart card and communicate with near-field communication readers. I would expect a lot of banks to revisit Android and HCE in the upcoming months. That goes double for the institutions that were not chosen to partner with Apple, along with retailers who have not rejected contactless payments as a modality in stores. Given that Apple will reportedly collect fees from its partner banks when customers execute transactions on the mobile wallet, all banks should be thinking about ways to make their presence on other Apple offerings more lucrative. If I were them, I would begin segmenting customers who hold one of iTunes' 500 million active accounts to see which ones are affluent spenders and which cards have higher interest rates, then implement targeted customer incentive strategies to move Apple users to higher-rate cards. I would use the same tactic to convince customers to replace debit cards on file with iTunes with credit cards. But the big takeaway is that from here on out, banks can only gain incremental value from iOS. If they want to create a unified payment system that customers can use as part of their existing banking relationships, they'll have to focus on Android. Should that happen, I doubt that Apple could prevent such moves from diluting its merchant value proposition. But such moves on the part of issuers are hardly long-term strategies to incentivize frequent usage, merchant participation and overall customer value. To learn more about how Experian can help you with your mobile banking needs, please visit: http://ex.pn/1t3zCSJ?INTCMP=DA_Blog_Post091214