Some organizations still have concerns about sharing too much data and threat intelligence to help thwart attacks. But EMC's Kathleen Moriarty says those concerns are often misguided. It's not so much about revealing intellectual property, she says, as it is about lacking a skilled workforce and sufficient resources to understand the data.
For larger organizations, that concern seems to dissipate because they have the bandwidth to take in and disseminate information.
Moriarty, who serves as EMC's global lead security architect, is the author of a new report on information-sharing weaknesses that cut across industry sectors.
"The larger organizations have the skilled resources," she says in an interview with Information Security Media Group [transcript below]. "They've worked through their legal processes. They know what they can share and they also know what's useful to share and what's useful to receive."
But for organizations that aren't able to handle the duties around information sharing, one alternative is leveraging vendors, Moriarty says.
"That's part of the problem here in how we should [utilize] vendors and the larger ecosystem to have a more effective sharing model, extending how we define information sharing," she says.
During this interview, Moriarty discusses:
- Why defining what data is needed to detect cyberthreats is the first step;
- How working with organizations such as the Anti-Phishing Working Group can greatly enhance the quality of the data that is shared;
- Why organizations struggle to share the right information.
At EMC, Moriarty works on technology strategy and industry standards for information security across several international standards bodies. She has been the primary author of multiple white papers and standards, including RFC6545, Real-time Inter-network Defense, and is co-chair of the Internet Engineering Task Force's Managed Incident Lightweight Exchange working group. Moriarty previously served as the head of IT security at MIT Lincoln Laboratory and was the director of information security at FactSet Research Systems.
Ineffective Information Sharing
TRACY KITTEN: Why has information sharing been ineffective?
KATHLEEN MORIARTY: I think we've had a mix of both effective and ineffective sharing to date, and I think there are a lot of opportunities for improvement. The report focuses on a few key examples, like the APWG, who have really thought through their use cases and thought through what information is useful to share with whom, and who use a mix of sharing and proprietary methods to address problems.
But then, if you move to other sharing initiatives where it's very broad and the data isn't necessarily directed at the users, but just meant to be useful tips to enact within sharing programs, you run into a number of problems. The first is that an organization has to be large enough to have the resources to participate. If you think about a security program, you're going to worry first about having good policies, then good defenses, before you can get to having incident response handlers and on-staff forensic experts. The ability to have those resources is a big hurdle. Then the current sharing mechanisms require a lot of manual processing to figure out if the data shared is even useful to your organization before you can apply controls. Then, having the resources to figure out whether you have data that's useful to share with others is also a challenge. I think those are some of the hurdles we're facing.
Sharing Info within Organizations
KITTEN: Just for the sake of clarity, when we talk about information sharing, we're not just talking about cross-industry information sharing. We're talking about information sharing that even occurs within an organization itself among departments, right?
MORIARTY: That's right, and that's a great point. One of the challenges larger organizations face is that they may have a central place where they receive data and then need to disseminate it out to different business units. The problems they may run into start when they're receiving this data. As we mentioned, it comes in formats that still lend themselves to manual processing. So first they have that challenge: How do they aggregate the data? How do they consume it in a way that lets them process it and send it out to the various business units?
Within the financial sector, there are a number of large organizations that have to do just that. They process the data and then figure out, "How do I get it to each of the business units? It's coming in different formats. Do they just forward that e-mail? Do they forward that comma-separated file? If it's in an automated format, is there a way for them to transform it from the system they're using to process their data over to a system of that business unit?"
Then from that point, they might be receiving the same type of data from several sources, so the problem they'll run into with these different formats is: How do they know which source is the best source for the data? How do they know which one they should pay attention to, and how do they prioritize the data they're getting from a particular source? Part of the problem is that you're receiving data in different formats, and you don't necessarily have systems that can communicate the value back and forth, even within an organization, so that you understand a feed is worthwhile to continue receiving. Reporting has been a problem, as has the ability to track and understand the value of data once it's used within the organization.
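The format-transformation problem Moriarty describes can be sketched in a few lines. This is a hypothetical illustration only: the feed layouts and field names below are invented, and the point is simply that indicators arriving as a comma-separated file and as a key-value e-mail body can be normalized into one internal record shape that every business unit consumes.

```python
import csv
import io

def from_csv_feed(text):
    """Parse a hypothetical comma-separated indicator feed
    with columns: indicator, type, first_seen."""
    rows = csv.DictReader(io.StringIO(text))
    return [{"indicator": r["indicator"],
             "type": r["type"],
             "source": "csv-feed"} for r in rows]

def from_email_feed(text):
    """Parse a hypothetical 'Key: value' e-mail alert body."""
    record = {}
    for line in text.splitlines():
        if ":" in line:
            key, _, value = line.partition(":")
            record[key.strip().lower()] = value.strip()
    return [{"indicator": record["indicator"],
             "type": record["type"],
             "source": "email-feed"}]

# Both feeds normalize to the same internal shape, so downstream
# business units consume one format regardless of origin.
csv_text = "indicator,type,first_seen\n198.51.100.7,ip,2013-05-01\n"
mail_text = "Indicator: phish.example.net\nType: domain\n"
merged = from_csv_feed(csv_text) + from_email_feed(mail_text)
```

In practice the hard part is not the parsing but, as she notes, tracking which source is worth continuing to receive; the `source` tag on each record is the minimal hook for that kind of per-feed accounting.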
Does Info Sharing Enhance Cybersecurity?
KITTEN: Banking institutions have made strides to enhance information sharing, but they still question whether information sharing is really enhancing cybersecurity initiatives. Do you think this is a common misconception or does it relate to some of the manual processes and hurdles that you just discussed?
MORIARTY: I think it's a mix. Look at the examples in the paper with M3AAWG [Messaging, Malware, Mobile Anti-Abuse Working Group] and the APWG. They have great examples where they're able to aggregate data but then share it out in a very directed and actionable way. With APWG, for instance, you're aggregating data on phishing attacks, and then vendors, partners and members are able to access this large data pool to perform different actions. The browser vendors use these sources and other sources to create their block lists in browsers. If you think about it, it may not be labeled as sharing by most folks, because they're receiving updated browser lists and organizations don't have to do anything for that to happen, but it's having an extremely broad impact. Organizations' end-users are benefiting from that sharing, and the aggregation is happening through standards.
However, the pushing-out of browser block lists is happening within those vendor ecosystems to the browser providers, which is just fine because it's very effective and it's having a really broad impact with only a small number of skilled resources having to actually do something. Then, if you turn to the M3AAWG examples, where you have e-mail operators working together, they're actually stopping spam at the source. ... APWG does some similar things, where they work with law enforcement and members to take down malware distribution sites or command-and-control servers, and that's fed back into their pools of data so that your block lists are not stale; they can be updated from a centralized source, and you don't have the problem of the same data circulating around, where you would potentially have stale information.
You also asked about the misconception of sharing information and it not being effective. I think this is true in some cases. If you look at models where you're just sharing broadly within siloed groups of peers, and you might be sharing with multiple groups of peers, you could be running into the same data source multiple times, having to process it and figure out what's useful or not. You might not be aware of what problems have been addressed already. In those cases, I think it's a challenge, and I can see the point of folks who find it ineffective.
Sharing Too Much Info
KITTEN: Historically, information sharing has been somewhat taboo as many organizations have been reluctant to share too much. Do you think this perception still hinders information sharing today?
MORIARTY: In some respects, yes. However, I think large organizations have the skilled resources. They've worked through their legal processes. They know what they can share, and they also know what's useful to share and what's useful to receive. Most of the time, when I ask an organization how many different data types they actually exchange, it's typically a very small number. The highest number I've received back has been 12 and the lowest is three, so it's typically a small set of information that's shared, but that can be very useful information to disseminate out to other parties.
I think the bigger challenge is the skilled resources. If you're large enough to have the resources to participate in these sharing circles, you may be able to consume data. But you may not have the skilled resources to analyze your own environment and figure out what data is useful to share; for that, you might need pretty skilled forensic experts in networking, malware and other fields. I think that's part of the problem here in how we should leverage vendors and the larger ecosystem to have a more effective sharing model: extending how we define information sharing, not just as what people are dubbing as free exchange between different organizations, but creating models that integrate with vendors and service providers for effective sharing and mitigation, or even stopping particular attack types.
KITTEN: Can you explain more thoroughly what you mean by standards for information sharing?
MORIARTY: In terms of it being a hindrance, I think some folks are jumping to a conclusion about what to use before they've mapped out their full use cases. If you take the time to map out a use case and determine what's useful to share with whom, that's a very important first step in the process of sharing information, because you may come up with very different models than just sharing your information quite broadly.
Then, when you get into standards: as co-chair of the MILE working group, some of our focus within the IETF is to make sure that we wind up with interoperable solutions through standards. What that means, just for the interface, is that if I'm implementing a sharing solution, that interface has to be able to package up data and then send it to another party, who may have a different implementation, and when they un-package that data it means exactly the same thing. Standards don't have to be applied to the database; they can be used that way, but that's not necessary for interoperability and could inhibit innovation in some ways. Standards really should be left just to the interface and should be based on requirements, and there are many, many standards out there for sharing information.
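The interoperability property she describes, packaging data, sending it, and having a different implementation un-package it with identical meaning, amounts to a round-trip guarantee at the interface. A minimal sketch of that idea, using JSON purely as a stand-in wire format (the record fields here are invented, not any IETF schema):

```python
import json

def package(incident):
    """Serialize an incident record to a byte-level wire format.
    JSON stands in here for a standardized exchange format."""
    return json.dumps(incident, sort_keys=True).encode("utf-8")

def unpackage(wire_bytes):
    """A second, independent implementation decodes the same bytes
    and must recover exactly the same meaning."""
    return json.loads(wire_bytes.decode("utf-8"))

# Hypothetical incident record; the round trip preserves its meaning.
incident = {"id": "INC-42", "category": "phishing",
            "indicators": ["phish.example.net"]}
assert unpackage(package(incident)) == incident
```

The point of standardizing only the interface, as she argues, is that each party's internal storage and tooling can differ freely as long as this round trip holds.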
For instance, I brought up APWG. They use IODEF, which came out of IETF work, and then you have M3AAWG, which developed the Abuse Reporting Format, also from the IETF. Then you have efforts like STIX out of DHS and MITRE, and those specifications are being adopted by FS-ISAC, for instance. They have different purposes, and that's okay. I think folks have spent a bit too much time worrying about what gets used where and whether we can aggregate down to a single format, and that's not as important a question to ask as, "What do I need for my use cases?" Groups that take a look at what it is they actually need to share and focus on their use cases tend to be pretty successful in achieving an end-game. Efforts like FS-ISAC are interested in a much broader range of data types that they would like to represent, so they've looked to STIX to help solve their problems. I think the conversations around standards have hindered us a little bit, and we need to move beyond those and just make sure that each group is evaluating its requirements and using what's necessary to meet its needs for effective sharing. But [it's] really thinking through those use cases, what do I share with whom, to have a broad impact.
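To make one of those formats concrete: IODEF (the Incident Object Description Exchange Format, RFC 5070, later revised as RFC 7970) is an XML schema for exchanging incident reports. The fragment below is a deliberately simplified, hypothetical sketch of an IODEF-style document built with the standard library; a conformant document carries many more required elements, and the identifier values here are invented.

```python
import xml.etree.ElementTree as ET

# A minimal IODEF-style document, loosely modeled on RFC 5070.
# This is an illustrative sketch, not a schema-valid report.
NS = "urn:ietf:params:xml:ns:iodef-1.0"  # IODEF v1 namespace

doc = ET.Element(f"{{{NS}}}IODEF-Document", version="1.00")
incident = ET.SubElement(doc, f"{{{NS}}}Incident", purpose="reporting")
incident_id = ET.SubElement(incident, f"{{{NS}}}IncidentID",
                            name="csirt.example.com")
incident_id.text = "INC-2013-042"  # hypothetical local identifier

xml_bytes = ET.tostring(doc)  # the bytes a peer would un-package
```

Because the element names and namespace are fixed by the specification, any receiving implementation can parse these bytes and recover the same incident, which is the interface-level interoperability the standards are meant to deliver.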
Role of Government
KITTEN: What about the role that government plays in mandating information sharing?
MORIARTY: Right now, the government has just focused on its outward sharing, and I think that's very good. Mandating sharing that would be bi-directional would be a bit of a challenge, and I think it comes down to that resource issue as well. It's not easy for each organization to have the level of resources necessary to digest information and figure out what's useful from their environment to share. Another challenge is that when you receive information from one source, you can't leak it to another source; otherwise, you could get shunned. These types of challenges can be tackled by larger organizations, but if you move to small and medium-size organizations, it becomes much tougher for them just from a skilled-resource perspective.
Although there are many challenges, I'm optimistic that we can make a difference and that sharing can be effective. The motivation behind writing this paper was really to challenge people to think about their use cases and to think about sharing as effectively as they can. Are there ways within your use cases to share data in a directed way to have a broad impact? If we move toward models where we think of sharing as more of an ecosystem, involving not just sharing between organizations but also with vendors and with law enforcement in a fuller ecosystem, I think we'll achieve an end-game where we actually improve our information security posture as a whole.