Cyber Attacks Can Be Costly – Is Cyber Insurance the Answer?

By Adam Hamm, Managing Director
Risk & Compliance




The WannaCry malware attack in mid-May focused the attention of corporations around the world on escalating cyber threats. Our Flash Report released immediately after the attack noted that it marked a new and unsettling aggressiveness on the part of cyber criminals: No previous assault matched the breadth of impact of WannaCry, which affected hospitals, corporations and government offices in more than 150 countries around the world.

The cost of getting businesses up and running after the attack was expected to run into billions of dollars. Additionally, some organizations could face lawsuits over their failure to patch the previously disclosed Windows vulnerability that the criminals exploited.

In fact, news on May 23 that Target Corp. had agreed to pay $18.5 million to settle state and financial institution claims stemming from an enormous data breach should have warranted as much corporate attention as the WannaCry event. Hackers stole data from up to 40 million credit and debit cards belonging to the retailer’s shoppers during the holiday season in 2013, and the company disclosed that the total cost of its cyber security failure had amounted to $202 million so far. A settlement stemming from a consumer class action has yet to be finalized.

The grave consequences of weak cyber security – from business disruptions to the expense of repairs and lawsuit payouts – may lead some to believe organizations are scrambling to make cyber liability insurance part and parcel of their IT security protocols. Yet, according to recent surveys, roughly half of U.S. firms don’t have cyber risk insurance, and more than 25 percent of executives without a policy say they have no plans to add one. Among the companies that have insurance, only 16 percent reported that they have policies that cover all liabilities.

There are reasons many companies are reluctant to purchase cyber liability insurance or beef up existing policies, and the two main ones are cost and complexity. Certainly, insurers can make their policies clearer and easier for customers to compare. And it may very well be the prohibitive cost of cyber insurance that is causing some companies hit by ransomware attacks to try to recoup their losses using kidnapping, ransom and extortion policies originally acquired to protect workers in dangerous locations.

Even so, a cyber liability insurance policy is a prudent course of action in most cases. Although it should never be a substitute for strong cybersecurity defenses, it can spell the difference between a severely affected and fairly unscathed bottom line in the aftermath of an attack. Before committing to a policy, however, it is important that management teams and their insurance brokers discuss three pivotal issues:

  • What kind of cyber liability insurance policy does the company need? Does it need a first-party policy to cover the cost of retrieving data critical to the operation, or does the company possess consumer information that requires protection against third-party lawsuits? Does it need both?
  • What amount of coverage does the company want to obtain? This figure will depend on a number of factors, including the size of the company and the type of coverage it needs. When sizing coverage for third-party risk, for example, settlements like Target’s could provide useful benchmarks.
  • What is the premium an organization is willing to pay? A number of variables should be used to determine this figure, including a company’s earnings, the size of the IT budget, and the operations or data at risk.

Once a company has answered these questions, it can begin to shop for cyber liability insurance. As part of the process, the management team needs to fully understand what the policies cover. But perhaps most importantly, organizations need to understand what the policies don’t cover, which will ultimately indicate whether the policy is worth the expenditure.

Given the sophistication and prevalence of successful data breaches, it is now more important than ever for companies to analyze whether a cyber liability insurance policy should be a part of their overall cyber strategy.

Financial Firm Auditors: Are You Ready to Audit Under CECL?



By Charles Soranno, Managing Director
Financial Reporting Compliance and Internal Audit

and Benjamin Shiu, Director, Model Risk Management


Amid widespread concern that Generally Accepted Accounting Principles (GAAP) are inadequate when it comes to advising investors on deteriorating credit quality, the Financial Accounting Standards Board (FASB) has issued a new methodology. The new standard, known as Current Expected Credit Loss, or CECL, uses data analytics to forecast expected losses based on internal and external trends, as well as borrower-specific information. In its simplest form, CECL replaces the old standard of actual or “incurred” loss with a forward-looking estimate of “expected loss” over the foreseeable future. (See our analysis of its anticipated impact.)

The standard was originally scheduled to become effective for public companies in December 2018, but that deadline has been pushed back to December 2020, with private companies to follow a year later.

CECL represents a significant change with far-reaching implications for loss reserves. And yet, just one in ten affected companies has made any significant effort to assess the potential impact and prepare for the change.

Protiviti conducted a webinar recently aimed at internal auditors trying to get the ball rolling at their organizations. As is often the case, the webinar generated more questions than we were able to address during the live session. We want to address some of the additional questions here.

Q: Isn’t the “foreseeable future” loss prediction based on “historical losses” as well? It’s hard to see how CECL offers any real improvement if the underlying data is essentially the same.

A: The forecast into the foreseeable future can combine historical loss experience with management judgment informed by the most current information.

For the forecasting based on historical losses, data is essential, and that is why CECL implementation will require companies to retain a variety of historical data over a much longer time horizon and analyze it against external information, such as FICO scores, loan-to-value and debt-to-income ratios, and debt service coverage. Internal audit will need to provide assurance on data completeness. With a longer time horizon and a greater variety of historical data, the CECL model should be able to better estimate losses under different foreseeable future scenarios. Most companies already have such data saved. Even those that don’t can start saving data now and still have more than three years of historical data to work with by 2020.

For the forecasting based on management judgment, unlike the incurred loss model, the CECL model explicitly requires management to take into account the current information and identify the future scenarios for loss estimation.
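To make the contrast with the incurred-loss model concrete, here is a minimal sketch of one loss-rate approach: average historical charge-off rates per portfolio segment, adjusted by a forward-looking management overlay. All segment names, rates, balances and adjustment factors are hypothetical, and this is an illustration of the concept rather than a prescribed CECL methodology.

```python
# Minimal sketch: a loss-rate approach to an expected-loss estimate.
# Historical annual loss rates per segment are averaged, then adjusted
# for management's forward-looking view. All figures are hypothetical.

historical_loss_rates = {             # annual charge-off rates by year
    "consumer_auto": [0.012, 0.015, 0.011, 0.014],
    "commercial_re": [0.006, 0.009, 0.007, 0.008],
}

forward_adjustment = {                # management judgment overlay
    "consumer_auto": 0.002,           # e.g., softening used-car market
    "commercial_re": -0.001,          # e.g., improving occupancy trends
}

amortized_cost = {                    # current balances by segment
    "consumer_auto": 50_000_000,
    "commercial_re": 80_000_000,
}

def expected_credit_loss(segment):
    rates = historical_loss_rates[segment]
    base_rate = sum(rates) / len(rates)                 # historical average
    adjusted = base_rate + forward_adjustment[segment]  # forward-looking rate
    return amortized_cost[segment] * adjusted

for seg in amortized_cost:
    print(f"{seg}: {expected_credit_loss(seg):,.0f}")
```

Even in this toy form, the two inputs auditors must test are visible: the historical data feeding the base rate, and the judgmental overlay, which must be articulated and supported by evidence.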

Q: With the implementation of CECL, will there also be a corresponding allowance for loan and lease losses (ALLL) requirement on the lending institution?

A: Yes. Regulators published a Joint Statement on CECL on June 17, 2016. More guidance on ALLL is expected in the future, but the June 17 statement is available now as a starting point.

Q: Isn’t stress modeling sometimes subjective even when using a third party?

A: Not necessarily. Third-party vendors typically use industry-level data to develop their models, and these models then serve as objective benchmarks against which institutional assets can be evaluated.

Q: What is going to be expected of internal auditors under CECL? Will we be expected to audit the ALLL process and controls over the model, or will we be expected to perform full model validation as well?

A: Both would be expected. Right now, internal auditors should be talking to management to ensure there is transparency into the portfolio and the credit quality evaluation process. There should be clear lines of reporting and communication to the board, and internal audit must remain close to the process throughout to ensure that the model is being applied, and that the model itself is valid as a predictor of credit losses in the foreseeable future.

As we discussed during the webinar, and at the highest level, processes, data sources and accounting will be changing under the CECL guidance. Whenever processes change, internal controls must be reassessed to make sure that no new critical risks have been created and that all critical risk areas have adequate controls in place.

Once in place, the controls must be tested by internal audit. For example, here are some critical concerns:

  • Data, process and judgments – Internal audit must collect and test company loss experience and other past events. Some of the processes will require judgment; those judgments must be articulated and supported by evidence. Forecasts on factors that affect collectability, whether internal or third-party, must be validated and back-tested.
  • Other models – For some institutions, Asset Liability Management (ALM) and DFAST/CCAR models, because they incorporate effective lifetime and credit risk assessment, may be utilized (or modified) for CECL estimates as well. However, these models are used for regulatory and management purposes, not as a source of disclosures in financial statements.
  • Documenting processes and controls – Documenting processes and controls will be a major undertaking. Ideally, areas of control weakness in the new processes should be identified as the processes are being developed, not after the fact.
  • New skill sets – Many internal audit departments may require new skills in data and modeling. Adequate budget must be provided for staff and training.
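On the validation point above, back-testing a loss forecast can be as simple as comparing predicted loss rates with realized ones over past periods. The figures and the acceptance threshold below are hypothetical, shown only to illustrate the mechanics:

```python
# Minimal back-test sketch: compare forecast loss rates with realized
# losses over past periods. The rates and the tolerance are hypothetical,
# chosen only for illustration.

forecast = [0.013, 0.014, 0.012, 0.015]   # model's predicted loss rates
realized = [0.012, 0.016, 0.013, 0.014]   # actual observed loss rates

errors = [abs(f - r) for f, r in zip(forecast, realized)]
mae = sum(errors) / len(errors)           # mean absolute error

TOLERANCE = 0.0025                        # hypothetical acceptance threshold
status = "acceptable" if mae <= TOLERANCE else "investigate"
print(f"MAE: {mae:.5f} -> {status}")
```

In practice the tolerance, the error metric and the look-back window would all be set and documented by management, and internal audit would test both the calculation and the follow-up when a model breaches the threshold.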

Q: Do you advise firms to develop benchmarking CECL models?

A: It may not be necessary to develop a complete benchmarking model. Nevertheless, during the development process, after a variety of alternative approaches, data and assumptions have been considered, a benchmarking model may well emerge as a byproduct of verifying the performance of the primary model.

The bottom line is that the time for the internal audit function to develop key CECL-related objectives is now. What auditors have to audit has changed significantly. Much of the underlying data involves subjectivity, and auditors must ensure that subjectivity is reduced. In addition, auditors have to raise their skill competency by deepening their understanding of modeling and data analytics. To provide assurance, auditors must become confident in their ability to analyze credit risk. The archived webinar is a good first step.

Jeff Marsh of Protiviti’s Risk and Compliance practice co-presented the webinar and contributed to the development of this content.

The Importance of Data Lineage for AML Systems

By Vishal Ranjane, Managing Director
Risk and Compliance




Financial organizations have long embraced the advantages that information technology offers, and many are looking forward to larger digitalization initiatives to gain market advantage. Customers appreciate the convenience of digital offerings, while firms enjoy the reduction in operating costs that information technology enables. Of course, in the multifaceted, highly regulated environment in which financial institutions operate, mastering the complexity of this digital future is both rewarding and risky.

In any financial firm’s application landscape, data flows from system to system. In an ideal world, key data gathered at the front end (customer-facing systems) makes it to the back-end systems without hitches. In reality, in the application architecture of almost any financial institution, systems are sometimes imperfectly integrated, often as a result of multiple acquisitions, and data does not always make the journey from system to system without some amount of attrition or change. However, banks and other financial institutions that handle customer data must be able to demonstrate that the information which originates upstream, in customer-facing systems, is the same information found in the bank’s risk and compliance systems downstream. This is where data lineage becomes important.

Data lineage tells the complete story of how data within an organization was produced, consumed, and manipulated by the organization’s applications. It traces the data’s movement through systems.

Once, it was sufficient to demonstrate to regulators that the right policies were in place, that the right procedures were followed, and the right reports were generated and reviewed to protect against threats like fraud and money laundering. Now, financial institutions must be able to demonstrate to regulators that they are using complete and accurate data to monitor for these activities.

Asserting data legitimacy

An organization asserts de facto data legitimacy when it relies on the integrity of its data for key reporting or decision-making activities, such as those involved with risk and compliance solutions. It is imperative that data from upstream systems of record or points of capture arrives in these downstream risk and compliance systems in a manner that does not materially alter or obscure the content received from the system of record or point of capture.

De facto data legitimacy claims are an area of focus for regulatory authorities, which require that these claims be documented and proven. The recent Part 504 regulation by the State of New York Department of Financial Services emphasizes the importance of data lineage in an AML context, stating that a covered institution must not only identify all data sources that contain data relevant to its transaction monitoring and watchlist filtering programs, but must also validate the integrity, accuracy and quality of that data to ensure that an accurate and complete set of data flows into these programs. In addition, the regulation specifically notes data mapping as a key component of end-to-end pre- and post-implementation testing of transaction monitoring and watchlist filtering programs.

Going back to the firm’s application landscape, upstream data – data entered initially by the customer, for example – may not survive the journey downstream, and facts about the transaction may be lost with each hop from system to system. Can an auditor know if a particular transaction was made with a teller, a wire, or via an ATM, for example? Was a deposit made by check or cash?

Data lineage documentation can be done using a variety of tools ranging from simple to sophisticated. In smaller, less complex systems, simple spreadsheets and diagramming tools may suffice, while large financial institutions may deploy vendor toolsets to automate tedious and error-prone capture and documentation activities.
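As a simple illustration of what such documentation captures, each lineage entry can record where a field originates, where it lands and how it changes along the way, paired with a basic completeness check on record counts. All system names, field names and counts below are hypothetical:

```python
# Minimal sketch of data lineage documentation: each hop records where a
# field originates, where it lands, and any transformation applied.
# System names, field names and record counts are hypothetical.

lineage = [
    {"field": "txn_channel", "source": "teller_app",   "target": "core_banking",
     "transform": "code mapped: 'T' -> 'TELLER'"},
    {"field": "txn_channel", "source": "core_banking", "target": "aml_monitoring",
     "transform": "passed through unchanged"},
]

def trace(field):
    """Return the ordered chain of systems a field passes through."""
    hops = [h for h in lineage if h["field"] == field]
    return [hops[0]["source"]] + [h["target"] for h in hops]

# Simple completeness check: record counts should survive each hop.
counts = {"teller_app": 10_000, "core_banking": 10_000, "aml_monitoring": 9_950}

def attrition(chain):
    """Flag hops where records were lost between systems."""
    return [(a, b, counts[a] - counts[b])
            for a, b in zip(chain, chain[1:]) if counts[a] != counts[b]]

chain = trace("txn_channel")
print(chain)
print(attrition(chain))
```

Here the check would flag 50 records lost between the core banking system and the AML monitoring system, exactly the kind of silent attrition a regulator would expect the institution to detect and explain.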

Data lineage as part of data governance

Establishing the data lineage should, of course, be more than just an exercise in documenting what’s already in place. Performing this level of analysis and uncovering previously unknown silent errors or gaps in the data being used to manage AML risks and generate reports should lead to increased accuracy and confidence in the reports and management information presented to senior management, internal audit and regulators. An additional benefit is getting better insights into customer behavior – a value for any business.

Establishing data lineage is only the start. To be sustainable over the long run, such an initiative needs to be part of a larger data governance program that is firm-wide and involves all departments and functions. Data governance efforts are viewed well by regulators, who increasingly put pressure on financial institutions to formally document business processes, data controls and source-to-target mapping, and to defend all activities around data management. A Protiviti white paper, “AML and Data Governance: How Well Do You KYD?,” provides more information and may be of relevance to your company.

Benjamin Kelly of Protiviti’s Regulatory Risk and Compliance practice contributed to this content.

States Champion Regulatory Streamlining; CFPB Remains Focused on Consumer Loan Servicing and Fair Lending

By Carol Beaumier, Executive Vice President and Managing Director
Regulatory Compliance Practice




While regulatory relief remains a topic within the Beltway, the Conference of State Bank Supervisors (CSBS), the nationwide organization of financial regulators from all 50 states, the District of Columbia, Guam, Puerto Rico and the U.S. Virgin Islands, has already taken action to streamline the multistate regulatory oversight framework for one group of its regulated entities – money services businesses (MSBs). In April, the CSBS launched the Money Services Business Call Report (MSB Call Report), which will allow MSBs to submit a single periodic financial form and other activity reports rather than deal with state-specific reporting requirements in varying formats. The MSB Call Report includes a Financial Condition Report, Transaction Activity Report, Permissible Investment Report and (to be added in the fourth quarter of 2017) a Transaction Destination Country Report. The initial report was due by May 15, 2017. While individual states need to opt into this reporting, the move is nonetheless a step in the right direction for the MSB community.

Among the topics on the agenda of the Consumer Financial Protection Bureau (CFPB) are mortgage servicing rights for consumers and fair lending. The CFPB’s 2016 final rule amending certain provisions of Regulation X (Real Estate Settlement Procedures Act) and Regulation Z (Truth in Lending) will be effective in October 2017. The rule requires a series of modifications to the procedures and technology platforms used by mortgage servicers. These modifications affect, among other things, key definitions (successors in interest, delinquency), lender-placed insurance, loss mitigation, communications with borrowers in bankruptcy, and periodic statements and coupon books. With the effective date less than six months away, mortgage servicers need to understand and be prepared to implement all of the required changes.

The 2016 CFPB Fair Lending Report, published in April, signals the agency’s fair lending priorities for 2017. These include identification of redlining activities; mortgage and student loan servicing issues based on race, ethnicity, sex or age; and fair lending challenges faced by women-owned and minority-owned businesses. Lenders engaged in mortgage and student loan servicing and small business lending activities should consider stepping up their monitoring and testing of these areas in preparation for upcoming CFPB examinations.

Learn more about these developments in our May issue of Compliance Insights, available here, and review our monthly recap of compliance developments on the same site.

Retailers, Tech Firms and Financial Services Providers: It’s Time to Shape the Future of Mobile Payments — Are You Ready?

By Gordon Tucker, Managing Director, Technology, Media and Communications Industry Leader; Rick Childs, Managing Director, Consumer Products and Services Industry Leader; and Jason Goldberg, Director, Financial Services Business Performance Improvement


The global mobile payments market is projected to reach US$780 billion by the end of 2017, according to research firm TrendForce. That figure seems impressive until you consider that the ability to pay for goods and services with a mobile device has been a reality for years. It’s been nearly a decade since Starbucks, one of the biggest mobile payments success stories to date, launched its app and rewards program. And recent research by the Mobile Ecosystem Forum found that one-fifth of global consumers have made a mobile payment in-store. Given the exponential growth in smart device innovation and adoption over the past decade and consumers’ inherent desire for convenience and speed when making a purchase, it is logical to think that the mobile channel would dominate as the avenue for payments by now. It’s where we’re headed, to be sure. But some formidable obstacles have been impeding the growth of the industry, such as:

  • Persistent concerns about fraud, privacy and security: Even though most consumers are aware of “digital wallets” — apps on smartphones that store credit card information and facilitate mobile payments — many remain wary of the risks. Fraud has been a problem, with weak authentication practices and identity theft at the root of many incidents — including those involving well-known brands like Apple Pay and Samsung Pay.

Consumers also worry about how companies are collecting and using data, including purchasing history and even geolocation. How and if that sensitive information is being protected from hackers is yet another concern. Tokenization helps to secure valuable transaction data, but data stored in digital wallets or merchants’ payment systems may still be vulnerable. Also, new entrants to the market may lack the security sophistication needed to protect sensitive data from compromise.

  • Bad timing: When solutions like Apple Pay, Google Wallet and Android Pay were being rolled out by mobile manufacturers and tech providers a few years ago, EMV chip card technology was also hitting the market. Retailers were initially confused, and frustrated, about whether to adopt mobile payments or EMV chip card technology. Most prioritized the latter. Now, adoption of that technology is near-universal in retail, even though EMV chip card transactions are slower than mobile payments or even traditional credit card payments.
  • Lack of a consistent experience: Merchants of all types have been racing to launch their own digital wallets. But it is unlikely that many will achieve long-term success with their ventures because consumers are already overwhelmed by choice in the market. Plus, these offerings are diverse, which means the mobile payments experience for consumers also varies. That works against efforts by retailers and the mobile payments industry to engage consumers and convince them to pay with their smart devices at every opportunity. And there’s another ingredient for mobile payments success that not all retailers can capture: A key reason that apps from brands like Starbucks, Taco Bell and Domino’s are so popular is that consumers do business with these retailers frequently — sometimes daily.
  • The fact that old habits die hard: One more dynamic that’s working against mobile payment adoption is the simple fact that it’s still easier and faster, in most cases, for consumers to pay for goods and services with cash, debit card or credit card. They’re comfortable with these methods, so they’re in no hurry to change. And many businesses that offer mobile payment options fail to do enough to incentivize consumers to make the switch — for example, they don’t provide compelling rewards to customers who use their app frequently.
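The tokenization mentioned earlier can be sketched in a few lines: the card number is replaced at capture with a random token, and only a secured vault can map the token back. This is a simplified conceptual sketch; production schemes add format-preserving tokens, hardware security modules and strict access controls, all omitted here for clarity.

```python
import secrets

# Simplified tokenization sketch: the merchant stores only the token;
# the mapping back to the card number lives in a separate, secured vault.
# A real system would use format-preserving tokens, HSM-backed storage
# and strict access control -- this is a conceptual illustration only.

_vault = {}  # token -> card number; in practice a hardened service

def tokenize(card_number):
    token = secrets.token_hex(8)   # random stand-in, not derivable from the PAN
    _vault[token] = card_number
    return token

def detokenize(token):
    return _vault[token]           # vault-only operation, tightly controlled

token = tokenize("4111111111111111")
assert token != "4111111111111111"               # merchant never holds the PAN
assert detokenize(token) == "4111111111111111"   # vault can still settle
```

The point of the design is that a breach of the merchant’s systems yields only tokens, which are worthless without access to the vault.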

A Growing Swell of Expectations From Consumers

The picture is not all bleak. There are other strong trends in motion that will help to drive mobile payments innovation as well as consumer adoption and use of these solutions. Here are some of the dynamics to watch:

  • New shopping trends will help mobile payments grow — a lot. Showrooming — where consumers examine merchandise in a traditional brick-and-mortar retail store or another offline setting and then buy it online, sometimes at a lower price — is just one example. It’s a retail experience that’s made for mobile — and it’s expanding as large e-commerce players like Amazon and Microsoft get in the game. Retailers can use mobile payment apps to incentivize shoppers to buy items in the store by offering discounts, special rewards or free delivery.
  • Mobile shopping apps are becoming more experiential for consumers. The core purpose of a mobile payment service is to facilitate transactions, of course, but that’s not enough to engage a consumer. Mobile shopping apps are evolving to help customers discover and research products before they are at the store and then help them locate those products while they’re in the store. These apps can also store shoppers’ receipts, gift cards and shopping lists; present discounts and coupons; enable comparison shopping; make the checkout process simple and fast; and more. Look for customer loyalty programs to evolve, as well; for instance, using data insights, a retailer could offer individualized incentives to mobile shoppers and reward them for specific behaviors.
  • A friction-free experience is becoming an expectation, fast. Mobile payments success hinges on creating a simple, seamless, value-adding and branded customer experience. Leading players in the person-to-person (P2P) payments space are setting the standard for the frictionless consumer experience — and winning over mobile-minded millennials. Recent research from Bank of America found that 62 percent of millennials use a P2P service.

Entrants in the P2P space are also focusing on the back end, trying to simplify operations and bake in security wherever possible without undermining the consumer experience. Good infrastructure that supports a secure and seamless customer experience is essential to the future of mobile payments. In the coming months on the blog, we’ll be exploring topics that retailers, technology companies and financial services providers, specifically, should consider when developing their mobile payments strategy. These topics include operational effectiveness, risk and compliance issues, technology strategy, and security and data privacy. Each of the industries mentioned above has an important role to play in helping to shape the evolution of the mobile payments industry. It will be through their collaboration, cooperation and innovation that the mobile payments experience can become what businesses and consumers alike envision it can — and should — be.

Cyber Risk Management: No More Quiet Backrooms


By Carol Beaumier, Executive Vice President and Managing Director
Regulatory Compliance Practice




Last month, in New York City, Protiviti hosted a gathering of scores of financial service industry representatives to discuss the recently enacted New York Department of Financial Services’ (DFS) Part 500, Cybersecurity Requirements For Financial Services Companies. Similar in design to the previously enacted DFS Part 504, Transaction Monitoring and Filtering Program Requirements and Certifications, Part 500 requires DFS-regulated covered entities (including banking organizations, insurance companies, money services businesses and others) to develop and maintain effective cybersecurity programs and to certify annually to the DFS that they are meeting the requirements of the regulation.

The attendees – chief information security officers, chief compliance officers, chief counsels, internal auditors and other senior executives of banks and insurance companies – engaged in a lively discussion with a panel of cyber experts about the challenges of managing cyber risk and were especially honored to hear directly from DFS Superintendent of Banking Maria Vullo, who shared the reasons her agency felt it necessary to adopt this regulation, as well as her compliance expectations.

Superintendent Vullo said that “as cyber-attacks are increasing across the globe, laws and regulations are not just appropriate, they are necessary. Government must be in the game, looking ahead to help prevent misconduct.” The need for a proactive partnership between government and industry to do more to prevent and learn from cyber attacks was a strong theme throughout the Superintendent’s comments. While she recognized that many covered entities have multiple regulators, all of whom may have different expectations regarding cyber risk management, the Superintendent stated her firmly held belief that to do nothing, in the hopes of achieving a uniform regulatory approach in the U.S., was simply not an option for the DFS, and she encouraged other regulators to adopt the DFS model. From a governance perspective, the Superintendent was very clear that industry responsibility for cyber risk management rests squarely at the feet of boards of directors and senior management.

In designing Part 500, the Superintendent said that DFS’s goal was to develop “a roadmap – minimum safeguards for cybersecurity – which leave room for innovation.”  The agency’s focus will be on the outcome, recognizing that different risk profiles will require different responses. Superintendent Vullo signaled a willingness to work with the industry and share leading practices toward the common goal of strengthening the industry’s cyber resilience and said that “where we see clear cooperation and good faith effort, our response will be tempered even where there is need for improvement.”

While the DFS is still developing its cyber framework and examination program, comments from the Superintendent and from the expert panel pointed to several key takeaways from the session, beyond the need for support from the top of the organization:

  • Until there is a uniform regulatory standard, organizations – especially large, complex multinational organizations – will still need to address varying expectations and different areas of focus as they develop or enhance their cyber programs.
  • A rigorous, customized risk assessment should be the cornerstone of the cybersecurity program, and it will be important for covered institutions to step back and revisit their risk assessment process and output to ensure that it is providing the appropriate foundation for building the program.
  • While many organizations would immediately turn to IT to build the cyber program, it is very important to involve the business – e.g., materiality should be defined at the business level, since IT may see the risk differently. To be effective, cyber professionals must understand the business.
  • Third-party risk management issues, which are a very complex challenge for many organizations, are critically important to the cyber compliance effort.
  • While some of the control requirements (multifactor authentication and encryption or reasonable substitutes for these) are not required immediately, the time to start thinking about them is now since implementation will take time.
  • Communication across the organization will be critical to the success of the program.

One of our expert panelists aptly summed up the feeling in the room when he reflected that, at the beginning of his career, IT people sat in a backroom and no one much cared what they did so long as things kept working; but as technology gradually became a business enabler, the attendant risks to the business could not be ignored. Cyber is one of those risks on which every institution and every regulator is now focused. There are no more quiet backrooms for the IT, business and risk professionals charged with protecting their organizations against cyber attacks; they are now front and center in the battle to protect their organizations, their customers and the market against growing cyber threats.





In the UK, 2017-2018 Priorities for Financial Services Firms Published

By Bernadine Reese, Managing Director
Risk and Compliance, UK




The UK Financial Conduct Authority (FCA) has issued its annual business plan for fiscal year 2017-2018. The FCA is the conduct regulator for 56,000 financial services firms and financial markets in the UK and the prudential regulator for over 18,000 of those firms. Its annual business plan and mission statement give firms and consumers greater clarity about how the regulator intends to prioritize its interventions in financial markets over the next 12 months.

The plan sets out the FCA’s cross-sector and individual sector priorities for the next 12 months. It identifies the following cross-sector priorities: culture and governance, financial crime and anti-money laundering (AML), promoting competition and innovation, technological change and resilience, treatment of existing customers, and consumer vulnerability and access.

The main individual sector priorities focus on the need to continue with the implementation of the Markets in Financial Instruments Directive (MiFID II); improving competition in all areas of financial services; supporting the implementation of ring-fencing in retail banking; and assessing the developing market for automated advice models (robo-advice) in the retail investment market.

A fundamental part of the plan is the risk outlook, which identifies key trends and emerging risks that help form the regulator’s priorities for the coming year. Technological change, cybercrime and resilience are noted as major risks. However, many of the largest risks detailed in the FCA’s risk outlook are external: international events, demographic changes, the course of the UK economy, and the impact of the UK’s decision to leave the European Union (EU), commonly known as Brexit.

We recently published a Flash Report that lays out the specifics and reasoning behind each of these priorities. Financial firms in the UK are advised to familiarize themselves with the report so they can determine where to focus their compliance efforts and better understand the regulator’s expectations.