Tackling Healthcare’s Growing Cybersecurity Crisis Starts With a Proper Risk Assessment

By David Stanton, Director
IT Security and Privacy, Healthcare and Life Sciences industry


As electronic medical records continue to evolve into the de facto standard, healthcare organizations are reaping cost reductions and other business and economic benefits. These benefits stem from advanced storage methods, fluid application data sharing and real-time, business-relevant analytics. But this progress has its downside, in the form of heightened attention from cyber criminals.

In 2014, healthcare organizations accounted for approximately 25 percent of all reported data breaches – the highest percentage of any industry sector. Even more cyber intrusions are expected in the coming years because of the growing demand for protected health information on the black market. Patient medical records – often exploited for medical identity theft, fraudulent insurance claims, and fraudulent prescriptions for drugs and expensive medical equipment – can be more valuable to cyber criminals than credit or debit card numbers, which can be cancelled and reissued easily. In 2013, complete health insurance credentials sold for US$20 apiece – approximately 20 times more than the value of a U.S. credit card number with a security code. (See the latest issue of PreView, Protiviti’s newsletter on emerging risks, for more on this troubling trend.)

In the face of this growing threat, what should healthcare leaders do right now? The first step toward protecting patient information is effective risk assessment. A legitimate security framework, such as the National Institute of Standards and Technology (NIST) Framework for Improving Critical Infrastructure Cybersecurity, is a good benchmark from which to assess an organization’s cybersecurity capabilities. Though the use of the framework is voluntary, we support its risk-based approach to managing cybersecurity risk.

A good portion of healthcare organizations could use improvement in the area of cybersecurity risk assessment. In a Protiviti survey of healthcare leaders on cybersecurity risk and the audit process, only slightly more than half (53 percent) of respondents said they address cybersecurity as part of their audit plan, and nearly half of those acknowledged that internal audit does not evaluate the organization’s cybersecurity program against the NIST framework.

Why the inaction? One reason is perhaps a false sense of security. Healthcare organizations traditionally have placed a strong focus on HIPAA compliance, which requires completion of a risk assessment but does not call for best-practice execution of security controls or adversarial resiliency. Yet organizations continue to treat the HIPAA assessment as a comprehensive risk assessment – potentially leaving themselves exposed to cybersecurity risk.

The availability of cyber insurance may also be contributing to healthcare organizations’ slow adoption of cyber risk assessments and their lack of urgency around implementing the basic security hygiene common in other industries (e.g., patch management, encryption, asset management, system hardening and monitoring controls). But times are changing: Insurance providers are becoming more prescriptive about which security controls, technologies and processes must be in place to demonstrate proper due diligence, and they can reject a claim outright if preventive measures weren’t implemented before the incident occurred. Cyber insurance also does not compensate for the reputational black eye caused by consumers’ perception of negligence in protecting their information.

The bottom line is this: Healthcare organizations must act now to reduce their cyber risk exposure. Initiating proper risk discussions certainly doesn’t guarantee the avoidance of a breach, or eliminate the risks completely. But it does prepare the organization to conduct five critical functions: identify, protect, detect, respond and – in the case of an incident – recover. The framework and assistance for conducting these functions are available – it’s a matter of taking the first step.

Revenue Recognition Webinar Series: Industry Considerations and Cross-Functional Implications

By Chris Wright, Managing Director
Leader of Protiviti’s Finance Remediation and Reporting Compliance practice


By now, regular readers of this blog should be well aware that new Financial Accounting Standards Board (FASB) revenue recognition rules will apply to reporting periods beginning after December 15, 2018 — with adoption allowed a year earlier for those who are ready. Now is the time for companies to consider the potential effects of this change and to run diagnostic exercises to determine how much work will be required to adapt their policies, procedures and controls to the new rules in time for their chosen or mandated due date.

Protiviti launched the Revenue Recognition webinar series in November of last year, working through the six elements of infrastructure and delineating the probable impacts of the transition process in each. The final installment — Industry Considerations and Cross-Functional Implications — was held on July 23.

Chris Wright, Managing Director and leader of our Finance Remediation and Reporting Compliance practice, wraps up our post-webinar Q&A series by answering some of the top questions posed during the live session.

Q: How will the new revenue recognition standards affect internal controls and how will that affect Sarbanes-Oxley (SOX) audits?

A: Internal and SOX audits, which test controls, will be impacted in a downstream manner through changes in accounting policy, which is likely to be affected by the new rules, and more so in some industries than others. Industries that rely on long-term and percentage-of-completion contracts — construction and aerospace, for example, and anyone who is manufacturing for the defense industry — are particularly likely to see substantial changes. If the company has to recognize revenue sooner, or based on different indicators, that will have to be baked into a new accounting policy. If that policy changes, then what people do at their desktops — in the accounting organization, the operations and logistics areas, in treasury and tax, and in HR, where they compute the commissions, etc. — may also change. And of course, whenever you change processes, you have to assess whether the controls you had in place before address any additional risks that come from that change. This is where the flow-through effect of the new rules could move all the way into the domains of internal and SOX audits.

Q: Does this new standard do away with percentage-of-completion accounting for long-term contracts?

A: As an academic matter, yes. As a practical matter, however, its effects may not go away completely. All generally accepted accounting principles regarding revenue recognition are replaced by the new standard. The rules on percentage-of-completion accounting have been around for 35 years. Companies have gotten used to them. What’s really going to be a challenge is to separate and account for multiple margins and deliverables – the delivery of one plane, one tank, or one building — within a single contract. One contract might have separate streams with different margins from quarter to quarter, or year to year.

Down the road, it’s not inconceivable that companies may not only change accounting policies as a result of the new standard, but also change their pricing and their approach to accounting. That’s why we recommend a cross-functional view of the new rules. If the initial diagnostic has determined a need for substantial change, it is important to assemble a team with a full view of all upstream and downstream impacts. Without assessing the gap between the current rules and the new ones, there is a potential to over- or underestimate the complexity of the changes — we need to get past guessing. The diagnostic assessment needs to start at the treetops, get to a granular level, and happen sooner rather than later.

Q: What are the biggest issues facing manufacturers whose standard “free-on-board” (FOB) terms are FOB shipping point, but that have some FOB destination customers?

A: As a practical matter, the new rules shouldn’t affect a company’s policy regarding the point at which ownership of goods transfers to the recipient. What needs to be clear is the terms, and that there are no further performance obligations; that clarity must run through policies, procedures and controls. From a cross-functional perspective, it is important to test this process all the way through, and the sales force needs to be educated so that what they tell customers matches the terms in the company’s contracts — the terms that are the basis for the company’s new accounting policy.

Q: What should internal audit do to prepare for the changes?

A: Chief audit executives (CAEs) are in a unique position as liaisons between the audit committee and management. The audit committee will likely want to weigh in on things like prospective versus retroactive reporting and early adoption. CAEs need to make sure that these items are on the audit committee agenda and that they are being addressed. On the management side, internal audit needs to at least attend diagnostic and subsequent project management meetings, and ideally should be represented as a fully participating voice in the project management organization. Although internal auditors should not be writing accounting policy, they play a big role in making sure revenue recognition issues, particularly the cross-functional implications — both upstream and downstream — are considered and addressed. We’re seeing that plans for testing the consistent application of the new rules may require a different skill set than some companies have committed to, and that they need to move from junior internal auditors with checklists to more senior personnel with more developed critical thinking capabilities. Otherwise, how can the CAE expect the internal audit function to be effective in challenging senior accounting officers as they apply a very different accounting approach to such a critical area as revenue recognition?

The new revenue recognition standard is an important and complex issue that could have process, policy and control implications throughout your organization. To help you and your organization navigate this change, we have established the microsite protiviti.com/revenuerecognition, with links to all five of our recorded webinars, and additional thought leadership on this topic.

More Resources Are Required to Master Third-Party Risks

By Rocco Grillo
Managing Director, IT Risk


As corporate boards, auditors and regulators increase their scrutiny of vulnerabilities associated with third parties, vendor risk management (VRM) – and particularly the danger of lost or compromised data through third-party service providers – remains cause for concern at most organizations. This is what Protiviti’s most recent VRM benchmarking survey revealed. The survey, conducted in partnership with the Shared Assessments Program, collected feedback from directors and senior management at more than 450 organizations across a broad spectrum of industries. The overarching conclusion: a lack of perceived improvement year over year.

In 2014, Protiviti began working with the Shared Assessments Program, a consortium of financial institutions, Big Four accounting firms and third-party risk management leaders in insurance, brokerage, healthcare, retail and telecommunications, to gauge internal perception of third-party risk management, using Shared Assessments’ proprietary VRM maturity model. The model is a COSO-like framework with 126 detailed components grouped into eight high-level criteria, and is designed to assess an organization’s ability to recognize and remediate third-party vendor risks on a scale of 0 to 5, with 5 being a fully evolved state of continuous improvement.
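
As a rough illustration of how scores in a model like this might roll up, the sketch below (with hypothetical criterion names and ratings, not Shared Assessments' actual components) averages component ratings into criterion scores and then into a single overall maturity rating:

```python
# Illustrative maturity-score rollup: component ratings (0-5) are
# averaged into high-level criteria, then into an overall score.
# Criterion names and ratings are made up for illustration only.
criteria = {
    "Program Governance": [3, 2, 4],    # component ratings on the 0-5 scale
    "Contracts": [2, 3],
    "Monitoring and Review": [1, 2, 2],
}

def average(scores):
    return sum(scores) / len(scores)

criterion_scores = {name: average(scores) for name, scores in criteria.items()}
overall = average(list(criterion_scores.values()))

print(f"Overall maturity: {overall:.1f} of 5")
```

With these invented ratings the overall score lands near 2.4 of 5, which happens to sit in the same range as the survey averages discussed below.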

In our 2015 survey report, we grouped responses according to the respondent’s level of responsibility: chief executive, vice president and manager. For 2015, average responses by category ranged from 2.4 at the C-level to 2.8 for managers. In 2014, the range was 2.3 to 2.8. The average response for vice presidents fell in the middle of this range. Clearly, not a lot of change here.

There are many ways these results could be interpreted. Personally, I’d like to believe the flat results are due to progress, offset by increased expectation. In other words: Vendor risk management practices are improving, but not enough to affect perception in the face of increasing scrutiny and rising expectations. I prefer this “glass half full” approach; you may think differently. In either case, the points below, drawn from the survey, hold true:

  • VRM programs require more substantive advances – Regulatory agencies, most notably the U.S. Office of the Comptroller of the Currency, have asserted that “average” risk management no longer suffices. Organizations must make the mindset, organizational culture and behavioral changes required to meet and exceed rising expectations.
  • Cybersecurity threats are a prominent challenge – High-profile data breaches, often involving millions of customer records and personally identifiable information, are being reported with greater frequency. Strengthening cybersecurity is a top priority, and third-party data security is critical to this effort.
  • Financial services organizations are leading the way – The financial services industry was the first to establish a Coordinating Council for Critical Infrastructure Protection in response to federal pressure in 1998. VRM practices in this sector remain significantly ahead of those in other data-vulnerable industries, including healthcare and insurance.
  • The number and intensity of vendor risks, and cybersecurity threats in particular, are increasing – From 2009 to 2014, the number of cybersecurity incidents increased at an average annual rate of 66 percent.
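
That last growth figure compounds quickly. A quick back-of-the-envelope calculation (the 100-incident base is an arbitrary index of my own, and I assume the 66 percent rate compounds annually) shows the cumulative effect:

```python
# Back-of-the-envelope: what a 66% average annual growth rate implies
# over the 2009-2014 period cited above. The base is an arbitrary index.
base_incidents = 100          # hypothetical 2009 incident count (index value)
annual_growth = 0.66          # 66 percent average annual increase

count = base_incidents
for year in range(2010, 2015):  # five annual steps, 2009 -> 2014
    count *= 1 + annual_growth

print(f"2014 index: {count:.0f} (~{count / base_incidents:.1f}x the 2009 level)")
```

Five years of 66 percent annual growth multiplies the starting figure roughly twelvefold, which puts the "kept pace at best" conclusion below in sobering perspective.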

Regardless of how you interpret the results of our 2015 survey, the message is clear: VRM remediation efforts to date have, at best, kept pace with increasing threats and scrutiny. Organizations need to accelerate their efforts and increase the quantity and quality of resources devoted to this critical governance issue.

I recommend taking a look at the study and related video and podcast here. A VRM self-assessment tool is also available at the link.

Virtual Reality Check: Managing the Internet of Things

Last year, at SAP’s giant Sapphire Now user conference in Orlando, the Internet of Things (IoT) was a hot topic. Don’t feel bad if you haven’t heard that term or have trouble distinguishing the IoT from the regular old Internet. New tech terms are proliferating as fast as, well… things on the Internet do.

Here’s the scoop. The regular old Internet connects people electronically, using computers or portable devices. The IoT is things connecting electronically to people, or other things. (Check out this terrific Internet of Things graphic.) We’re talking sensor data here – your electronic water meter, home monitoring devices, smart appliances, navigation apps, Fitbits, as well as industrial controls, robotic sensors, inventory control tags, and other industrial technology – all of which are transforming how we live and work. The interconnection of these embedded devices is expected to usher in a new era of automation, smart objects and data sources – the possibilities are almost limitless as the IoT reshapes the Internet of tomorrow.

In 2011, connectivity giant Cisco published a report predicting that the number of devices connected to the Internet would increase from a 2003 base of 500 million, to more than 50 billion – or 6.5 times the world population – by 2020. Others have since dialed that estimate back to 25 billion. Still, that’s a lot of devices. Think it will have an impact?
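
For the curious, the arithmetic behind those projections is easy to check. The device counts come from the figures above; the 2020 world-population estimate of roughly 7.6 billion is my own assumption for the ratio:

```python
# Sanity check on the connected-device projections cited above.
# Device counts are from the text; the population figure is an assumption.
devices_2003 = 500e6      # 500 million connected devices (2003 base)
devices_2020 = 50e9       # Cisco's original 2020 projection
world_pop_2020 = 7.6e9    # assumed world-population estimate

growth_factor = devices_2020 / devices_2003          # 100x in 17 years
devices_per_person = devices_2020 / world_pop_2020   # roughly 6.6

print(f"{growth_factor:.0f}x growth; ~{devices_per_person:.1f} devices per person")
```

Even the dialed-back 25 billion estimate still works out to several connected devices for every person on the planet.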

This virtual universe is expanding at a pace almost too rapid to fathom: Cisco estimated that in 2011, the Internet traffic of just 20 average households exceeded the entire global Internet traffic of 2008. That in just three years! Hard to believe.

The transformational possibilities of the IoT are staggering, creating opportunities to reengineer industrial processes and revolutionize the retail customer experience while improving the efficiency and effectiveness of business processes, leading to new business models. IoT is enabling companies in almost every industry to connect and monitor their assets from virtually anywhere, improving the way these assets are used and managed. The potential bottom-line impact of this massive connectedness is hard to ignore in many industries, including automotive, aviation, energy, farming, firefighting, healthcare, trading operations, and transportation and logistics, to name a few.

For me, pondering a change of this scope and velocity is impossible without the sound of alarm bells. Imagine such a rate of proliferation in, say, public health, where a single virus could quickly spread through human contact. Electronic viruses can spread farther and faster, raising the bar for detection and containment. So the security challenges of the IoT are significant.

But I don’t mean to rain on anybody’s parade. In fact, I think the IoT can be used to mitigate risk as much as it creates risk. For example, it can be used to shed light on trends and behaviors that were previously a best guess. It can be used by exchanges to watch for trading anomalies caused by automated trading. It can hone marketing strategies and vastly improve companies’ agility and response time to emerging risks.

To manage the IoT, we must harness big data by analyzing and understanding the stories that data tells us, and capitalizing on that knowledge. The challenge lies in determining what stories are relevant to the business and how to support those stories with the least possible surplus, so we don’t create data for the sake of data and get lost in minutiae.

As noted earlier, security challenges must also be addressed. A slew of new connected devices means a slew of new potential penetration points for hackers and cyberattacks. How effectively organizations manage the IoT will depend on how well they manage to create order and extract value out of the data, while maintaining the security of the expanding information infrastructure. That’s a big job. If only there were a device to track its progress!

Jim

Emerging Trends in Financial Services IA Analytics: A Benchmarking Study Overview

By Barbi Goldstein
Managing Director, Protiviti’s Internal Audit & Financial Advisory practice


For decades, internal audit (IA) departments at financial services industry (FSI) organizations have relied on data analytics to support their work. With the growing availability of data, the value of this practice has increased significantly. Increasingly, IA departments are seeking to develop a forward-looking analytics capability rather than just scrutinizing data in support of individual audits. To achieve this goal, IA functions inside the largest FSI companies are striving to operate independently, access data when and where they need it, and conduct their own analysis, rather than rely on the business units that generate the data.

The demand for enhanced analytics capabilities is being driven by a variety of factors, most notably IA’s growing role in supporting regulatory compliance needs and monitoring, and intensifying pressure to gain better insights for improved risk management. Organizations’ growing reliance on big data and big data tools is further escalating the need for sophisticated data analysis within IA.

It is no surprise, then, that knowledge and use of data analytics tools rate as top priorities for organizations to address, according to the responses of FSI participants in Protiviti’s 2015 Internal Audit Capabilities and Needs Survey. Respondents identified the following areas among their top five audit process improvement priorities in the coming year:

  • Data analysis tools: Statistical analysis
  • Computer-assisted audit tools
  • Continuous auditing

The findings spurred us to develop a separate benchmarking study involving the IA departments at some of the largest financial institutions – we wanted to learn how they are advancing their analytics capabilities and get a glimpse at their priorities in this area.

The study’s questions touched on a number of topics, including staffing levels specific to analytics, types of analytics tools used, and key challenges. The study was distributed to a select group of the largest U.S. financial institutions, including 13 of the top 25 U.S. banks and two of the top five U.S. insurers. Among the most significant findings:

  • IA functions treat analytics as a high priority: 87 percent of FSI IA functions report that they have a dedicated data analytics/information management group within internal audit.
  • Analytics are evolving to provide a more risk-based approach to internal audit: The vast majority (86 percent) of IA analytics functions employ continuous monitoring – to some degree. Typically, this practice is used to plan individual audits, monitor key risk indicators and support risk assessments.
  • There is a significant opportunity to expand continuous monitoring capabilities: Ninety percent of those who use continuous monitoring say that their monitoring is currently focused on specific areas where there are known risk issues. Less than half of participants currently monitor key risk indicators; fewer monitor indicators of fraud risk.
  • Analytics departments appear intent on having access to business data when they need it: A majority of participants indicated that IA has access to the business data it needs within its own data warehouse or a similar environment. As demand for continuous monitoring grows, so will the need for greater flexibility in accessing needed data.
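
To make the continuous monitoring idea concrete, here is a minimal sketch of a key risk indicator (KRI) check. Every threshold, metric name and observed value is a hypothetical illustration, not any institution's actual methodology:

```python
# Minimal continuous-monitoring sketch: compare observed KRI values
# against thresholds and surface the breaches. All names and numbers
# here are invented for illustration.
kri_thresholds = {
    "duplicate_payments_pct": 0.5,   # % of payments flagged as duplicates
    "failed_logins_per_day": 200,    # authentication failures per day
    "stale_vendor_records_pct": 10,  # % of vendor records overdue for review
}

def check_kris(observed: dict) -> list:
    """Return the names of KRIs whose observed value breaches the threshold."""
    return [name for name, limit in kri_thresholds.items()
            if observed.get(name, 0) > limit]

observed = {"duplicate_payments_pct": 0.8,
            "failed_logins_per_day": 45,
            "stale_vendor_records_pct": 12}

print(check_kris(observed))  # breached indicators feed risk assessments
```

In practice such checks would run on a schedule against live business data, with breaches routed into risk assessments and the audit plan rather than printed to a console.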

Of course, with greater analytics ambition come new challenges. Among them: identifying where data resides, confidentiality and privacy issues, and the ability to combine data from multiple systems and/or environments for analysis. By collaborating and coordinating with key stakeholders and management, however, IA can overcome these obstacles and leverage analytics to monitor the business in the most risk-relevant manner.

Access our full report and analysis of the benchmarking study here.

Internal Auditing Around the World – Insights From This Year’s Edition

By Brian Christensen
Leader of Protiviti’s Internal Audit and Financial Advisory practice


Now, more than at any time in history, internal auditors are viewed by audit committees and management less as police and more as trusted advisers, strategic partners and consultants. Partly due to the fallout from the 2008 financial crisis, the position is valued more than ever before. Management now looks to leverage internal audit as a strategic resource, recognizing that internal auditors’ broad and deep perspective of operations, risks and potential opportunities can help inform business decision-making.

In the latest edition of its annual Internal Auditing Around the World, Protiviti takes a look at the state of global internal audit practice. We find that many internal audit departments, along with their organizations, are in the midst of significant change and transformation – a period of reinvention. Internal audit teams are rising to the call to become strategic partners to the business – a role many have been working to achieve for years – while remaining careful not to compromise their independence and objectivity.

Here are some of the highlights from this year’s edition, according to top practitioners:

On continuous improvement:

“Auditing is about driving improvement and enhancement for the good of all shareholders,” said David Barry, director of internal audit for the Australian wealth management company AMP Limited. “Our aim is to make risk management less nebulous and easier to manage.”

On strategic advice and consulting:

“I think acting as a consultant to the business is the new frontier for the internal auditor,” said Marco Petracchini, senior vice president and director of internal audit for Eni, a multinational integrated energy company. “(We) have very broad knowledge of processes and risk so we can make a tremendous contribution to our colleagues beyond normal audit activities.”

On the importance of embracing change:

“I strongly believe that if we, as auditors, do not evolve and change, we will soon become obsolete,” said Harsh Mohan, senior vice president of audit, compliance and risk for Etihad Airways. “Ninety percent of the job I did ten years ago has been automated.”

On becoming an alarm bell for high risks:

“We had to change the mindset and behavior within internal audit,” said Peter Sneyers, chief auditor for Euroclear, one of the world’s largest providers of domestic and cross-border settlement services for bond, equity, exchange-traded funds and mutual fund transactions. “We had to ask more ‘so what’ questions, to focus on impacts and consequences, and to understand that we are not paid by the number of issues we find, but by the value we create.”

On building a culture of excellence:

“We cannot just think we understand the business; we have to know that we do,” said Stephen Frimpong, vice president of internal audit at Kimberly-Clark. “We have to shape the audit plan to make sure we deliver impact and drive results.”

Starting to see a pattern here?

High performers set high standards and are not afraid to change. They hold themselves accountable to those standards with metrics and outcomes judged not by volume, but by the value created for the organization.

These are exciting times to be an internal auditor. The profession continues to rise to the ever-expanding demands created by the complexities of managing risks, monitoring controls, improving corporate governance and capitalizing on opportunities in international markets, and in our highly popular Internal Auditing Around the World series we will continue to track its growth and evolution. This series should be of great interest to internal audit professionals, as well as CEOs, CFOs and boards of directors worldwide.

To KYD or Not to KYD: It Is Hardly a Question

By Matt McGivern and Shaheen Dil,
Managing Directors

Protiviti’s Data and Analytics practice


The importance of “know your customer,” or KYC, activities to any AML compliance program is well known. A much less known – but equally crucial – component of an AML program is “know your data,” or KYD, which feeds into KYC and other AML compliance modules.

To run their AML compliance programs, financial firms use a variety of software to review customers, analyze transactions to identify suspicious activities and provide analytical and research capabilities to support suspicious activity reports (SARs). Both SARs and KYC rely on the quality and accessibility of data, which requires knowledge of that data – where it resides, who uses it, what actions are performed on it, etc. While over-stretched AML departments may not want to hear that they now need to be more proficient in data management, KYD activities are necessary and can drive efficiencies inside these departments through better data governance.

Due to the way they grow, financial institutions often are burdened with siloed organizational and technical infrastructure, with redundant and difficult-to-integrate systems and data stores. This creates a particular challenge for AML compliance heads, who have to make sense of disparate data that flows into the AML system from a variety of sources.

A recently published Protiviti point-of-view paper, AML and Data Governance: How well do you KYD?, sets out how firms can benefit from putting in place an effective data governance program to alleviate this problem. The paper covers the main challenges firms face with regard to data management in the context of an AML program and summarizes the main steps needed to create an effective data governance function as follows:

  • Institute and enforce effective master- and reference-data management programs
  • Institute enforceable enterprisewide data governance strategy and processes
  • Be proactive in assigning data ownership and monitoring of data quality
  • Create a centralized repository for metadata
  • Support big data initiatives
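
As a concrete (and deliberately simplified) illustration of the centralized metadata repository step, the sketch below records where each data element lives, who owns it, and when its quality was last checked. All element, system and owner names are hypothetical:

```python
# Illustrative metadata catalog: each entry records the system of record,
# the accountable owner, and the last data-quality check for one data
# element. Names are invented for illustration only.
from dataclasses import dataclass

@dataclass
class MetadataEntry:
    element: str                       # logical data element, e.g. customer ID
    source_system: str                 # system of record
    owner: str                         # accountable data owner
    last_quality_check: str = "never"  # date of last quality review

catalog = {}  # element name -> MetadataEntry

def register(entry: MetadataEntry) -> None:
    """Add or replace an element's metadata in the central catalog."""
    catalog[entry.element] = entry

register(MetadataEntry("customer_id", "CRM", "Data Governance Office"))
register(MetadataEntry("wire_amount", "Payments Hub", "AML Operations"))

print(sorted(catalog))
```

A real repository would of course live in a governed platform rather than an in-memory dictionary, but the principle is the same: one authoritative place that answers "where is this data, and who is responsible for it?"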

Financial institutions that take these steps to build better data governance will not only be better equipped in their AML efforts; they will also be more likely to achieve good standing with regulators, who look favorably on firms that demonstrate data governance efforts.

Case in point: A Protiviti team, while working on a customer repository project at one of our clients, uncovered substantial data integrity and completeness issues across core systems supporting transaction monitoring at the organization. Regulators severely criticized the bank following an AML compliance program examination – a criticism that could have been avoided if effective data governance practices had been put in place. The firm engaged Protiviti to help expedite remediation of the data issues and formulate an effective and proactive data governance resolution to avoid an enforcement action.

We highly recommend reading this paper to gain a clear understanding of how critical KYD is to the long-term success of your AML program. Regulatory scrutiny around AML compliance has intensified after a series of high-profile lapses – so making data governance a priority seems like a prudent approach for financial firms.