Cyber Risk Quantification – Takeaways From the FAIR Conference


Protiviti was once again a sponsor of the 2019 FAIR Conference, hosted by the FAIR Institute, which took place on September 24-25 in National Harbor, MD. Protiviti cyber risk quantification experts Andrew Retrum and Vince Dasta were at the conference to meet with other FAIR experts and answer questions from attendees. For those who missed them, we present below a brief interview with Vince Dasta, Associate Director with our Security and Privacy practice, discussing some of the takeaways, the growing popularity of the FAIR methodology and its applicability to operational resilience.

FAIRCON, September 24-25, 2019
Vince Dasta Interview [transcript]

Kevin Donahue: Hi, this is Kevin Donahue with Protiviti, welcoming you to a new edition of Powerful Insights. I’m happy to be talking today with Vince Dasta. Vince is an Associate Director with our firm’s Security and Privacy practice. Vince attended the 2019 FAIR Conference last week in Maryland, and I wanted him to share some of the highlights from that event with us. Vince, thanks for jumping on with me.

Vince Dasta: Yes. Thanks, Kevin.

Kevin Donahue: Vince, as I mentioned, I know you were at the FAIR Conference. I know that quantifying these cyber risks and addressing these issues is a big topic for financial services firms, and for organizations in other industries as well. With that backdrop, what were some of the top takeaways and items being discussed at the conference that you noted?

Vince Dasta: Sure. As you mentioned, Kevin, the FAIR Conference is an annual event. This is the third year that I’ve been involved, and the second through Protiviti’s sponsorship, and I’ll tell you this: The first thing that you notice is the growth and the maturing of the community. Two years ago, the conference was maybe 100 people in a hotel conference room. Last year, it was 200 people at Carnegie Mellon, and the level of participation was huge. This year, it was 450 people in a large convention center. The interest and the enthusiasm of the people who were there is pretty unbelievable.

As far as the key themes, what’s interesting is that a couple of years ago, the conversation was around, “Can we quantify cyber risk, or risk in general?” The debate and the questions being asked were, “Is it even possible?” Then, from there, I think it evolved and became more about, “Should we quantify risk? What are we doing with this? Is this valuable to do?” This year, the key messaging was around, “How do you now operationalize this and make it a component of your risk management program?” The novelty has worn off: this isn’t just a one-time activity that you do, and it isn’t an end in itself. It’s a means to an end in managing risk effectively and making informed decisions. The level of adoption within the industry has grown so much that people have figured out how to do it, by and large. There are still a lot of people at varying levels of maturity, but people have figured out how to do this now. Now, the question is, “How do we integrate this into all of our other risk assessment and risk management practices? How do we make this a sustainable piece of how we do business, instead of a one-off novelty or special thing that we do sometimes?”

Kevin Donahue: That’s great insight. Thanks, Vince. Let me ask you: there have been a lot of discussions happening around this concept of operational resilience with financial services institutions. How does the quantification of cyber risk tie into operational resilience?

Vince Dasta: Yes, it’s a great use case. One of the things that we’ve seen on the resilience side is that, conceptually, almost everybody gets it. The idea of managing resilience is relatively straightforward. The challenge comes when you start to peel back the onion a little bit and look at the details: “Now, how do we actually do this? How do we do this in a way that’s not overly subjective and qualitative?” There are a couple of valuable use cases that we’re seeing where risk quantification meets resilience, one of those being impact tolerance. That’s an interesting concept that I think is very valuable in understanding, “What is our tolerance for these resilience-type incidents? How much downtime of a critical business service can we weather as a firm or as an organization before we’ve crossed some red line of irreparable harm or insolvency within our product line?” I think FAIR is a great way to do that, because what it lets you do is talk about the probabilities of sustaining certain financial losses. We can do that in a scoped scenario, so that it’s very easy to understand what we’re talking about, what we’re not talking about, and what’s the likelihood of reaching this scenario.

An example of that: we helped a bank work through this problem for the outage of a service. Rather than come up with a single answer like “your impact tolerance is four days,” we recognized that there’s a lot of uncertainty around that type of information. It’s very hard to tell, and not all outages and issues are created equal. What we’re able to do, using FAIR and embracing that uncertainty, is talk about the probabilities of certain things happening. At four days, what’s the probability of losing a certain amount of money? We can do that in a way that’s very tangible but also captures the uncertainty that’s there, and it ultimately lets the business make those kinds of decisions in an informed way, rather than just from their gut or off the top of their head using mental models. I think that’s something that’s really valuable, and we haven’t found another way to do this without using these types of quantitative risk analysis tools.

Kevin Donahue: That’s a great rundown. Thanks, Vince. I appreciate you jumping on the phone with me today. I want to remind our audience that we’ve published a white paper on this topic called “Measuring Cyber Risks Quantitatively – Eliminating the Guesswork.” You can find it on the Protiviti website.

[End of transcript]
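To make Vince’s impact-tolerance example a bit more concrete, here is a minimal Monte Carlo sketch in Python of the kind of analysis he describes: estimating the probability that losses from an outage of a given length exceed a tolerance threshold. The dollar figures, the distributions and the loss_exceedance_probability helper are hypothetical placeholders for illustration only; a real FAIR analysis would calibrate these ranges with subject-matter experts and actual loss data.

```python
# Illustrative sketch only: hypothetical figures and distributions, not
# Protiviti's model or a client's data. Estimates P(total loss > tolerance)
# for an outage of a critical business service, FAIR-style, via Monte Carlo.
import numpy as np

rng = np.random.default_rng(seed=42)
TRIALS = 100_000

def loss_exceedance_probability(outage_days: float, tolerance: float) -> float:
    """Estimate the probability that total losses exceed the tolerance."""
    # Hypothetical per-day revenue loss, modeled as a lognormal range
    # (median around $200K per day, skewed toward larger losses).
    daily_loss = rng.lognormal(mean=np.log(200_000), sigma=0.5, size=TRIALS)

    # Hypothetical secondary losses (customer attrition, remediation, fines)
    # that only materialize on longer outages, modeled as a triangular range.
    secondary = np.where(
        outage_days > 2,
        rng.triangular(left=0, mode=500_000, right=3_000_000, size=TRIALS),
        0.0,
    )

    total_loss = daily_loss * outage_days + secondary
    return float(np.mean(total_loss > tolerance))

# Example: how likely is an outage of each length to breach a $2.5M "red line"?
for days in (1, 2, 4, 7):
    p = loss_exceedance_probability(days, tolerance=2_500_000)
    print(f"{days}-day outage: P(loss > $2.5M) is roughly {p:.1%}")
```

Framing the output as a probability of breaching a loss threshold at each outage duration, rather than a single “four days” answer, is the kind of decision-ready view Vince describes: the uncertainty is preserved, but the business can still set an impact tolerance in an informed way.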
