Transcript of Getronics' John Pironti on Leading-Edge Risk Management Practices

Richard Swart: Hi, this is Richard Swart with Information Security Media Group, publishers of BankInfoSecurity.com and CUInfoSecurity.com. Today I’ll be speaking with John Pironti, chief information risk strategist with Getronics. How are you today, John?

John Pironti: I’m great. How are you doing today, sir?

Swart: Let’s start by talking about risk management, but rather than the traditional issues of information and business impact analysis, I was wondering: is there some fundamental question or fundamental process that banking and finance executives should start with when they begin thinking about risk management?

Pironti: There actually is. As we start looking at risk management, and more specifically information risk management, which is really where we’re focusing our attention and the work I’m doing, one of the first things we ask ourselves is what problem we are trying to solve. To what degree are we trying to solve a problem? To what degree are we trying to protect the information? Once we understand those basic principles, then we should go through a process that we call Threat and Vulnerability Analysis. What this does is allow you to go through a logical, process-oriented activity that helps an organization understand what truly is a threat to its information infrastructure and information assets versus what generically exists in the community, because there’s so much out there that could potentially affect somebody if they believe that something could happen. There are so many attacks, so many concepts, so many different possibilities. It’s important to focus down in a business-logic sense and say ‘what is it I want to protect? To what extent do I want to protect it? And how am I going to go about that?’

Swart: Can you talk about why it’s important to include a process analysis as part of that Threat and Vulnerability Assessment?

Pironti: Sure. One of the most important things you can do when you start a Threat and Vulnerability Assessment is to begin with business process mapping. What business process mapping will do for you is give you an incredible amount of intelligence about how your business operates and how the business processes you’re going to protect, and their associated data, actually work. Once you’ve done the business process analysis, you then move into a logical and physical asset inventory. One of the things I often find in organizations is that data has become so pervasive, and there’s so much out there among partners, vendors and even internal infrastructure, employees and devices, that most organizations really don’t have a good grasp of where their information is. You can’t protect something you don’t know about, and you can’t protect something you can’t find. The business process analysis, or business process mapping, will give you visual representations of where your data is and how it flows through a business process. Once you start building a map that says this data flows through these devices, these solutions, these vendors, these partners, then you can make appropriate risk management and Threat and Vulnerability Assessment decisions.
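To make that idea concrete, here is a minimal sketch, assuming a simple Python representation of a business process map with its data flows. It is not Pironti's own methodology, and all process, system and vendor names in it are hypothetical examples; the point is only that once the map is captured as data, "where does this information actually live or travel?" becomes a question you can query.

```python
# Minimal sketch of a business process map captured as data.
# Process names, systems, vendors and data types are hypothetical examples.
from dataclasses import dataclass, field


@dataclass
class DataFlow:
    source: str              # system, vendor or team the data leaves
    destination: str         # where the data goes next
    data_types: list[str]    # e.g. ["ssn", "account_number"]


@dataclass
class BusinessProcess:
    name: str
    owner: str                                          # business leader accountable for the process
    systems: list[str] = field(default_factory=list)    # logical/physical assets involved
    vendors: list[str] = field(default_factory=list)    # third parties that touch the data
    flows: list[DataFlow] = field(default_factory=list)


def locations_of(data_type: str, processes: list[BusinessProcess]) -> set[str]:
    """Answer 'where does this data live or travel?' from the process map."""
    places: set[str] = set()
    for proc in processes:
        for flow in proc.flows:
            if data_type in flow.data_types:
                places.update({flow.source, flow.destination})
    return places


# Hypothetical example: a loan-origination process that sends SSNs to a credit bureau.
loan_origination = BusinessProcess(
    name="Loan origination",
    owner="VP Consumer Lending",
    systems=["loan-app-portal", "core-banking"],
    vendors=["credit-bureau"],
    flows=[
        DataFlow("loan-app-portal", "core-banking", ["ssn", "income"]),
        DataFlow("core-banking", "credit-bureau", ["ssn"]),
    ],
)

print(locations_of("ssn", [loan_origination]))
# -> the portal, the core system and the credit bureau all handle SSNs
```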

Swart: What’s the difference between business process mapping and just a data flow diagram?

Pironti: Business process mapping really speaks to the business process. It focuses on the business leader and says, ‘tell me how your business operates, not just through technology but through people, process and procedure as well.’ So it differs in the sense that you’re really focusing on the actual activities, the personnel, the contractual requirements and the regulatory requirements, everything that goes along with an associated business process, versus a data flow diagram, which is really just going to show you one dimension of information.

Swart: Now why is it harder for many organizations to focus on data instead of the technology as they conduct these assessments?

Pironti: Great question. The reality is that technology is easy. Technology is easily comprehended and easily understood, and it can solve simpler problems. Technology tends to be a reactive measure: we have identified a potential threat that we’re worried about, or in most cases have already been affected by, and we decide that we’re going to incorporate some technology solution to try to assist us. Technology is a great thing once we’ve understood the processes, policies and procedures we want it to help enact, but you can’t start with technology. If you start with technology, you’re bound to fail, and we’ve seen this over and over again in data breaches, data leakage, hacking situations and criminal activities. But it is always easier for an executive team or an information specialist to go and propose a piece of technology than to invest the time and effort it takes to actually sit down and do business-logic activities, business process mapping, Threat and Vulnerability Analysis and a data-focused approach.

Swart: What are the best practices that a financial institution should focus on if it has to re-engineer its risk management process after an incident?

Pironti: Well, I don’t believe I can tell you what best practices exist. I can tell you leading practices, but I can’t tell you what the best practice is for an organization, because I can give you a series of things that I think are best practices and someone can stand next to me with the same level of credentials and experience and have a completely contradictory set. So I’m willing to give you some leading practices that I often see as I work with banks and financial institutions on a worldwide basis.

The first place we start is the idea of having a structured approach to information security. It’s actually setting up a real information security program and changing the mindset from a reactive, technology-oriented concept to a proactive one, where we start doing things like Threat and Vulnerability Analysis, we build the framework or backbone of information security with our policies, procedures, guidelines and standards, and then we can start working on awareness.

The number-one challenge we find today in information security capabilities within financial institutions across the world is the lack of general awareness within the organization, both of the capabilities of the information security organization and of information security as a concept. Many financial institutions spend a lot of time doing yearly planning and training and then produce a one-hour PowerPoint training session or a one-hour in-person session where someone has to sign off to say ‘I’ve been trained for the year,’ but that’s not necessarily effective.

One of the things we often talk about with awareness training is the concept of different audiences and populations that you need to address. For instance, we have in the workforce today in the United States two distinct populations. We have an older population that was taught by individuals standing in front of a room with a chalkboard, in a lecture style, and we have a younger generation coming through now that’s used to learning from computers and electronic learning tools. So when we’re doing awareness training, trying to drive home concepts about information security and raise the knowledge of the population, we need to address both populations individually. We can’t assume the same learning tools will work for everybody, and this is often overlooked. We also have to simplify these processes, because there are cultural and language considerations that are often overlooked as well. For instance, we need to use more pictures, colors and graphics to simplify things so that any individual can understand them, from any country, any culture, any population.

Swart: That’s great information. Let’s change our focus a little bit, though, and go back to the fundamental process of managing the security function in organizations. What role would you say IT governance plays in effective risk management?

Pironti: I think IT governance is essential to effective risk management. What governance does is establish boundaries and key performance indicators that we can judge ourselves by, and put metrics and measures in place that allow us to understand how we’re doing, where we want to go, how well we want to do, and whether we’re in alignment with business processes and business activities. The most important thing we can teach information security organizations and individuals is that their role is really to provide information to decision-makers so they can make decisions, rather than making the decisions themselves. So what governance gives us is a structure, a process orientation and a framework that helps facilitate gathering the information, analyzing it, and understanding what should be communicated, when, to whom, how and why.

Swart: One of the risks that’s getting a lot of attention right now is third parties. Many organizations, especially the larger financial organizations, are using third parties for a lot of their data infrastructure and data processing. What are some key questions that banks and financial institutions should be asking these vendors as they conduct a risk assessment?

Pironti: That’s a great question. I work with a lot of vendors that serve banks and are very challenged by the fact that they’re getting lots of questions, lots of audits and lots of questionnaires from banks trying to figure out what they’re doing. And banks and financial institutions are doing this for all the right reasons: they have started outsourcing business processes and business activities, but now that the data is no longer in their direct control, they want to make sure it is dealt with in accordance with their rules, restrictions, policies, guidelines, procedures and standards. So what we often prescribe for the financial world is that it’s important to educate your vendors and third parties about your expectations for how they will deal with your data. What are your policies? What are your capabilities? What are your standards? What are your guidelines? How would you like them to deal with your information? What are you asking them to step up to? At the same time, financial institutions need a way to do constant monitoring. Right now we have a broken process in most cases: we touch these vendors and third parties on a yearly or bi-yearly basis, which means we have long periods or windows of time where that data can be at risk and we’re not aware of it. So one of the things we often work with financial institutions to understand is how to set up ongoing relationships, ongoing monitoring infrastructure, and metrics and measures to ensure that your data is being protected in the fashion you’d like it to be protected and dealt with in the way you’d like it to be dealt with.
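As a small illustration of that shift from yearly audits toward ongoing monitoring, here is a minimal sketch, not a prescribed tool or Pironti's own process, that tracks each third party's last review date and attested controls and flags anything stale or non-compliant. The vendor names, control names and quarterly cadence are hypothetical assumptions.

```python
# Minimal sketch of ongoing third-party monitoring: flag vendors whose review
# is overdue or whose attested controls have gaps. All names are hypothetical.
from datetime import date, timedelta

REVIEW_INTERVAL = timedelta(days=90)  # assumed quarterly cadence instead of an annual touch

vendors = [
    {"name": "statement-printer", "last_review": date(2007, 1, 15),
     "controls_attested": {"encryption_at_rest": True, "background_checks": True}},
    {"name": "card-processor", "last_review": date(2007, 6, 1),
     "controls_attested": {"encryption_at_rest": True, "background_checks": False}},
]


def stale_or_noncompliant(vendor_list, today):
    """Return (name, overdue?, missing controls) for vendors needing attention."""
    findings = []
    for v in vendor_list:
        overdue = today - v["last_review"] > REVIEW_INTERVAL
        gaps = [c for c, ok in v["controls_attested"].items() if not ok]
        if overdue or gaps:
            findings.append((v["name"], overdue, gaps))
    return findings


for name, overdue, gaps in stale_or_noncompliant(vendors, date(2007, 7, 1)):
    print(name, "review overdue" if overdue else "review current", "missing:", gaps)
```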

Swart: One last question. Let’s talk about the TJX case, not specifically that case, but sort of the implications of what we’ve learned from it. Data leakage is becoming a prominent issue right now. What are some of the leading practices that organizations and financial groups should be following in order to minimize their risk in data leakage?

Pironti: Sure. The most important thing in fighting data leakage is to start with the asset inventory, the logical asset inventory: truly understanding where your data is, having an accurate identification of that data, and then classifying it. By classifying the data you can decide what controls need to be in place for what kinds of data. Not all data is created equal, and if we take that premise, we need to set up different levels of protection and capability for sensitive and confidential data than we would for typical marketing data or publicly available data. So, to stop things like data leakage, or data being transported outside the boundaries of the organization in ways we do not like, we need to make sure (1) we understand where the data is, (2) we understand the sensitivity levels associated with it, and (3) we understand the business processes that may be driving that activity. In cases like TJX and the other data breach and data leakage cases we’ve been experiencing, some of these things are happening not necessarily because a malicious business process was built, but because someone took advantage of how business processes operate. Attackers take advantage of the fact that different business processes require data to be spread into different environments that may not be completely under the control of the organization. They look at laptops, they look at scanning devices, eavesdropping technology, key-loggers -- things of this nature. They’re taking advantage of the fact that there is not a conscious understanding of where data is, how it should be used, and whether or not it should be allowed to be in certain places. That’s often the fundamental question that’s asked once we’ve built the asset inventory: ‘does all that sensitive data have to be in all those places, or can we consolidate it?’ We can create data vaulting concepts, where we consolidate data points into common, centralized databases and data stores, do central pulls from those data stores, and put appropriate controls around those individual data stores to help mitigate data leakage and data loss.
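To illustrate how classification can drive controls, here is a minimal sketch, assuming a simple Python mapping from classification labels to required controls and a small inventory of data stores. It is not TJX-specific and not Pironti's own framework; the classification labels, control names and store names are hypothetical examples, but it shows how 'does all that sensitive data have to be in all those places?' can be turned into a per-store gap report.

```python
# Minimal sketch: required controls per classification, checked against an
# inventory of data stores. All labels, controls and store names are hypothetical.

REQUIRED_CONTROLS = {
    "public":       set(),
    "internal":     {"access_control"},
    "confidential": {"access_control", "encryption_at_rest", "logging"},
    "restricted":   {"access_control", "encryption_at_rest", "logging", "vaulting"},
}

data_stores = [
    {"name": "marketing-share", "classification": "public", "controls": set()},
    {"name": "pos-transaction-log", "classification": "restricted",
     "controls": {"access_control", "logging"}},  # cardholder data sitting outside the vault
]


def control_gaps(stores):
    """For each store, list the required controls that are missing for its classification."""
    return {s["name"]: REQUIRED_CONTROLS[s["classification"]] - s["controls"]
            for s in stores}


print(control_gaps(data_stores))
# The hypothetical POS log is flagged as missing encryption at rest and vaulting.
```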

Swart: Well, John, thank you for your information today. It’s been very helpful. I’m sure listeners will benefit--

Pironti: It’s a real pleasure, sir. It’s a real pleasure. Thank you so much for the time.

Swart: Well, thank you for listening to another podcast of the Information Security Media Group. To listen to a selection of other podcasts or to find other educational content regarding information security for the banking and finance community, you can visit www.bankinfosecurity.com or www.cuinfosecurity.com.


About the Author

Richard Swart

Editorial Contributor

Richard Swart is a contributing writer for BankInfoSecurity.com and CUInfoSecurity.com. Swart is currently pursuing a Ph.D. in Management Information Systems at Utah State University. His areas of expertise include information security program management, including all aspects of governance, risk management and auditing. Recently, he led a nationwide research project comparing the information security competencies and skills needed in industry with those taught in academic programs. Swart routinely interviews banking regulators, industry leaders and other information security practitioners for BankInfoSecurity.com and CUInfoSecurity.com.



