Logs Paint Picture of Menacing Insider

A Practical Approach to Diminishing the Insider Threat

The cost of defending against the insider threat comes down to the amount of risk an organization is willing to accept, according to Randy Trzeciak and George Silowash of Carnegie Mellon University's CERT Insider Threat Center.

"How much risk the organization is willing to accept could determine what solutions [or] strategies you implement, and each one of those may have varying types of costs," says Silowash, co-author of the Common Sense Guide to Mitigating Insider Threats, in an interview with Information Security Media Group [transcript below].

A good starting point for organizations looking to identify their risk is to establish baseline behavior for their end users, says Trzeciak, senior member of CMU's CERT technical staff. And that requires investing in behavior analysis technologies.

Unlike outsiders, Trzeciak says, insiders have authorized access to IT systems, and identifying those who could place the organization at risk is "a little difficult. ... The challenge for the organization is to determine what's normal, compared to what's abnormal, and alert when those things are happening."

In an interview, Silowash and Trzeciak:

  • Provide examples of the 19 practices found in the updated guide that organizations should implement to prevent and detect the insider threat;
  • Discuss the costs associated with protecting against the insider threat; and
  • Explain how the latest guide differs from the three earlier versions of the publication.

Silowash is a cybersecurity threat and incident analyst at CERT, working with the threat technical solutions and standards team. He previously worked as an information systems security officer for the U.S. Department of Justice's National Drug Intelligence Center. He is also an adjunct professor in Norwich University's information assurance program.

Trzeciak is a senior member of the CERT technical staff. He also is an adjunct professor at Carnegie Mellon's H. John Heinz III School of Public Policy and Management. Previously, Trzeciak managed the management information systems team in the Software Engineering Institute's information technology department.

Insider Threat Guide

ERIC CHABROW: What's the main takeaway of the guide and who do you see as its main audience?

GEORGE SILOWASH: The main takeaway of the guide is that insider threats can affect many different organizations, from small businesses up to enterprises and government organizations. People need to be aware of the different types of things that insiders may do to compromise the security of the organization's systems and data. The audience is basically anybody from the small business up to the enterprise level, and specifically we target five different groups within an organization: human resources, legal, physical security, data owners and information technology teams. But the effort isn't limited to those specific areas; the whole organization must work together to help mitigate insider threats.

CHABROW: Can you give a little example of what these case studies are?

SILOWASH: We've been collecting data on reported insider threat cases across different industries, as well as from government. These case studies are what we build the Common Sense Guide off of. We have over 700 cases in our database of insiders, and we use them to build profiles of insiders and figure out what can be done inside an organization to help mitigate that threat. There are many different types of cases, ranging from intellectual property theft to IT sabotage.

Latest Edition's New Features

CHABROW: The guide is in its fourth edition. What would a reader find new in the latest edition he or she would not have found in earlier editions?

SILOWASH: We made a number of enhancements in the fourth edition of the guide. For example, we've added references to other best practices, such as the CERT Resilience Management Model, ISO 27002 and NIST 800-53. These are other standards that organizations may be following, and the guide supplements them. We also added some quick wins and high-impact solutions. These are things we think organizations, from small businesses to large companies, can implement to quickly address some of the insider threats that may exist within the organization.

CHABROW: Can you give an example or two of those?

SILOWASH: For example, our first best practice is to consider threats from insiders and business partners in enterprise-wide risk assessments. Among the quick wins and high-impact solutions, one that applies to all types of organizations is to have all employees, contractors and trusted business partners sign nondisclosure agreements upon hiring and at termination of employment or contracts. Large organizations have more resources and more funding, so their budgets allow them to implement additional mitigation strategies. They might be able to prohibit personal items in secure areas - for example, cell phones being taken into a call center, where an insider could potentially use the capabilities on that phone to capture data off of screens. You might want to have employees leave their phones outside of secure areas. But that might not be acceptable to all types of organizations. It all comes down to your risk acceptance and mitigation strategies.

CHABROW: You talk about the nondisclosure forms. Is that in itself a deterrent in that people would think twice about maybe taking some information with them when they leave an organization? Or is that more for the legal protections, or both?

SILOWASH: It could be for both. It could act as a deterrent. We've mostly seen it - at least in the cases that I've reviewed - as a legal protection for the organization.

Limiting the Use of Devices

CHABROW: Limiting certain devices in secure areas - is that a hard policy to implement? Do you have to do more than just ask people not to bring them in?

SILOWASH: Given the nature of these devices - they're so small nowadays - it's very easy to conceal a phone in a pocket or bag, so it could be a difficult policy to implement. That's partly why we broke out the quick wins and high-impact solutions for two different sizes of organizations: all organizations, from small businesses up to enterprises, and then large organizations that might have the budget to implement, say, a metal detector to screen employees for concealed items before they enter a secure area.

Cost of Mitigating Insider Threats

CHABROW: Does defending against the insider threat cost a lot of money compared to other forms of IT security?

RANDY TRZECIAK: In terms of the cost of protection strategies, the insider threat problem certainly makes it a little difficult to identify the people who may pose more risk, given that these individuals have authorized access to the organization's systems and are doing what they normally do on a day-to-day basis. The cost of trying to prevent or detect an insider from harming a critical asset really depends on whether an organization is able to determine what anomalous behavior is, particularly when you start looking at IT systems. There are a number of tools and technologies that can determine what baseline behavior is on these IT systems. It's up to the organization to identify what deviates from those normal activities on a day-to-day basis. Again, given that these insiders have authorized access, the challenge for the organization is to determine what's normal, compared to what's abnormal, and alert when those things are happening.
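As an illustration of the baselining idea Trzeciak describes, the minimal Python sketch below computes a per-user baseline from a hypothetical CSV of daily activity counts and flags days that deviate sharply from it. The file name, the column names (user, date, files_copied, after_hours_logins) and the three-sigma threshold are illustrative assumptions, not anything prescribed by the CERT guide.

```python
# Minimal sketch of per-user baselining on a hypothetical activity log.
# Columns and threshold are assumptions made for illustration only.
import csv
from collections import defaultdict
from statistics import mean, stdev

def load_activity(path):
    """Group daily activity rows by user."""
    per_user = defaultdict(list)
    with open(path, newline="") as f:
        for row in csv.DictReader(f):
            per_user[row["user"]].append(
                (row["date"], int(row["files_copied"]), int(row["after_hours_logins"]))
            )
    return per_user

def flag_anomalies(per_user, sigma=3.0):
    """Flag days where a user's file-copy volume deviates far from that user's own baseline."""
    alerts = []
    for user, days in per_user.items():
        copies = [c for _, c, _ in days]
        if len(copies) < 2:
            continue  # not enough history to establish a baseline
        mu, sd = mean(copies), stdev(copies)
        for date, count, _ in days:
            if sd > 0 and (count - mu) / sd > sigma:
                alerts.append((user, date, count, round(mu, 1)))
    return alerts

if __name__ == "__main__":
    # Hypothetical input file with columns: user,date,files_copied,after_hours_logins
    for user, date, count, baseline in flag_anomalies(load_activity("activity.csv")):
        print(f"{date}: {user} copied {count} files (personal baseline ~{baseline})")
```

Commercial behavior-analysis tools build much richer baselines across many signals, but the core comparison - each user's current activity against that user's own history - is the same.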

SILOWASH: It also comes down to the organization's risk acceptance. How much risk the organization is willing to accept could determine what solutions or strategies you implement, and each one of those may have varying types of costs. There might be more cost associated with implementing hardware and software to help mitigate the insider threat. I think it all boils down to the organization's tolerance for risk.

Best Practices

CHABROW: The latest edition describes 19 practices that organizations should implement to prevent and detect insider threats. Please describe several of these practices.

SILOWASH: Best practice number six is to know your assets. This is one that I'm a big fan of. Organizations have to know what they must protect. They have to know what types of systems they have. They also have to know the type of data that they're protecting. They need to know where that data lives on their systems, and they need to know its sensitivity level. Does it contain personally identifiable information or protected health information? Understanding your data will help you determine what types of mitigation strategies you have to implement.
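To make the "know your assets" practice concrete, here is a minimal sketch of a data-inventory pass: it walks a directory tree and records files that appear to contain PII, using two illustrative patterns (SSN-like and email-like strings). The ./shared path and the patterns are assumptions for the example; a real inventory would cover far more data types, storage locations and sensitivity levels.

```python
# Minimal "know your assets" sketch: a rough inventory of files that appear to hold PII.
import os
import re

# Illustrative patterns only; a real scanner would use far more robust detection.
PII_PATTERNS = {
    "ssn": re.compile(r"\b\d{3}-\d{2}-\d{4}\b"),
    "email": re.compile(r"\b[\w.+-]+@[\w-]+\.[\w.-]+\b"),
}

def classify_file(path):
    """Return the names of the PII patterns found in a text file."""
    try:
        with open(path, "r", errors="ignore") as f:
            text = f.read()
    except OSError:
        return set()
    return {name for name, pattern in PII_PATTERNS.items() if pattern.search(text)}

def build_inventory(root):
    """Map each file under root to the sensitive data types it appears to hold."""
    inventory = {}
    for dirpath, _, filenames in os.walk(root):
        for fname in filenames:
            path = os.path.join(dirpath, fname)
            hits = classify_file(path)
            if hits:
                inventory[path] = sorted(hits)
    return inventory

if __name__ == "__main__":
    # "./shared" is a stand-in for wherever the organization keeps its data.
    for path, kinds in build_inventory("./shared").items():
        print(f"{path}: {', '.join(kinds)}")
```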

One of the other best practices is number 12: use a log correlation engine or security information and event management (SIEM) system to log, monitor and audit employee actions. Logs help tell a story. No single event is going to be an indicator of a malicious insider. You need to record as many logs as possible to help paint the picture of an insider - or the picture of normal network behavior and activity within the organization. You need as many events as possible to describe what's going on in the organization and which employee actions may be indicative of an insider.
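The log-correlation point - that no single event tells the story - can be sketched in a few lines. The example below merges events from hypothetical sources (VPN, file server, email) into a per-user timeline and flags users who combine an after-hours login with bulk data movement. The event format, source names and the rule itself are assumptions made for illustration, not features of any particular SIEM product.

```python
# Minimal sketch of log correlation: combine weak signals from several sources per user.
from collections import defaultdict
from datetime import datetime

# Hypothetical events: (timestamp, user, source, action)
EVENTS = [
    ("2013-01-07 23:10", "jdoe", "vpn", "login"),
    ("2013-01-07 23:25", "jdoe", "fileserver", "bulk_download"),
    ("2013-01-07 23:40", "jdoe", "email", "external_attachment"),
    ("2013-01-08 09:05", "asmith", "fileserver", "bulk_download"),
]

def correlate(events):
    """Build a per-user timeline so signals from different logs can be read together."""
    timeline = defaultdict(list)
    for ts, user, source, action in events:
        timeline[user].append((datetime.strptime(ts, "%Y-%m-%d %H:%M"), source, action))
    for user in timeline:
        timeline[user].sort()
    return timeline

def review(timeline):
    """Flag users whose combined activity pairs an after-hours login with bulk data movement."""
    for user, evts in timeline.items():
        after_hours = any(src == "vpn" and t.hour >= 22 for t, src, _ in evts)
        bulk_moves = any(act in ("bulk_download", "external_attachment") for _, _, act in evts)
        if after_hours and bulk_moves:
            print(f"Review {user}:")
            for t, src, act in evts:
                print(f"  {t:%Y-%m-%d %H:%M}  {src}: {act}")

if __name__ == "__main__":
    review(correlate(EVENTS))
```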

One of the other ones we added - it's one of the new practices - is best practice number 16: develop a formalized insider threat program. Best practice 16 describes how to start an insider threat program within the organization and outlines some of the different teams that are involved. You can't have just one team that's essentially siloed from the rest of the organization. It involves individuals from across the organization, and they have to work together to determine what data points they're going to collect and what information to feed into the insider threat program to help mitigate the risk of insiders within the company.

CHABROW: I gather those different constituencies within the organization align with the six groups that the guide focuses on: human resources, legal, physical security, data owners, information technology and software engineering. The guide maps the relevant groups to each practice. Can you give us an example or two of how that works?

SILOWASH: Each group has its own responsibilities within the organization and a different role in preventing and detecting insider threats. For example, data owners need to understand the data that's within the organization and where it lives. They also need to understand what systems are processing that data. They need to be involved in the various practices identified in the Common Sense Guide.

Governing Insider Threat

CHABROW: How should insider threat be governed within an organization, since you have six core groups here that have to deal with it?

SILOWASH: Management ultimately needs to set the strategy for mitigating insider threat. They need to establish the policies and procedures, and they need to set the example across the organization. But when we're talking specifically about an insider threat team, it's more of an isolated group of individuals who have to treat an incident, or the potential for an incident, very carefully and confidentially. You want to keep the group of individuals involved in an insider threat program limited. The team consists of people from across the organization, but the nature of their work is sensitive and needs to be kept confidential.

CHABROW: Any final thoughts you have?

SILOWASH: The Common Sense Guide is a great tool for organizations, both large and small, just to start thinking about mitigating insider threats and to help them determine what they need to do to help lower their risk of a potential malicious insider causing harm to the organization. I think it boils down to risk management, understanding the risk and implementing mitigation strategies to reduce that risk.



