WEBVTT 1 00:00:07.260 --> 00:00:09.780 Anna Delaney: Hello, and welcome to the ISMG Editors' Panel. I'm 2 00:00:09.780 --> 00:00:12.930 Anna Delaney, and on this week's episode, we're asking the tough 3 00:00:12.930 --> 00:00:17.190 questions. What does it take to be a CISO in 2024 amidst rising 4 00:00:17.190 --> 00:00:19.620 risks? How do we tackle the looming challenge of 5 00:00:19.620 --> 00:00:23.940 disinformation? And are we ready to rethink identity management 6 00:00:24.060 --> 00:00:26.820 in the face of relentless breaches? Well, we're not 7 00:00:26.820 --> 00:00:30.030 answering these questions alone. Fortunately, joining us is a 8 00:00:30.030 --> 00:00:32.700 distinguished guest with an impressive background as a 9 00:00:32.700 --> 00:00:36.210 lawyer, CISO and former federal prosecutor with the U.S. 10 00:00:36.210 --> 00:00:39.960 Department of Justice, currently serving as the CEO of Ukraine 11 00:00:39.960 --> 00:00:43.350 Friends. He's also held prominent roles as CSO at 12 00:00:43.350 --> 00:00:47.100 Facebook, Uber and Cloudflare, as well as an Associate General 13 00:00:47.100 --> 00:00:50.730 Counsel at PayPal. He is, of course, Joe Sullivan. Joe, 14 00:00:50.730 --> 00:00:53.640 welcome to the ISMG Editors' Panel. It's an honor to have you 15 00:00:53.640 --> 00:00:54.090 with us. 16 00:00:54.780 --> 00:00:56.850 Joe Sullivan: Oh, thank you for having me on. I'm excited to be 17 00:00:56.850 --> 00:00:57.150 here. 18 00:00:57.870 --> 00:01:00.360 Anna Delaney: And of course, we also have excellent regulars. 19 00:01:00.360 --> 00:01:03.600 Tom Field, senior vice president of editorial, and Mathew 20 00:01:03.600 --> 00:01:06.720 Schwartz, executive editor of DataBreachToday and Europe. Good 21 00:01:06.720 --> 00:01:07.320 to see you all. 22 00:01:08.100 --> 00:01:08.820 Tom Field: Thanks for having us. 23 00:01:09.630 --> 00:01:11.100 Mathew Schwartz: 2024 storms on. 24 00:01:11.310 --> 00:01:14.790 Anna Delaney: Oh, yes. Speaking of storms, Joe, why don't you 25 00:01:14.790 --> 00:01:17.550 tell us where you are in your virtual world? I hope you've got 26 00:01:17.550 --> 00:01:18.300 your life jacket. 27 00:01:18.960 --> 00:01:21.570 Joe Sullivan: Yeah, I'm coming to you live from the Andrea Gail 28 00:01:21.570 --> 00:01:24.990 , which is a boat that's going out to catch some fish. Hopefully we 29 00:01:25.020 --> 00:01:28.860 make it back alive. But it's risky whenever you go out on 30 00:01:28.860 --> 00:01:33.030 the sea. We shipped out of Gloucester, Massachusetts, which 31 00:01:33.030 --> 00:01:38.070 is a town that a lot of my family calls home, and I'm excited to 32 00:01:38.070 --> 00:01:40.470 represent their beautiful place. 33 00:01:40.530 --> 00:01:42.780 Tom Field: Beautiful. Next time you're in the States, Anna. 34 00:01:43.140 --> 00:01:45.180 Anna Delaney: Yes, I was thinking, is Gloucester maybe in 35 00:01:45.390 --> 00:01:50.010 the U.K.? Anyway, not the one in the U.K. But it sets the scene very, very 36 00:01:50.040 --> 00:01:53.190 nicely for cybersecurity. I think all those risks are there. 37 00:01:53.400 --> 00:01:55.050 Tom, tell us about your backdrop. 38 00:01:55.200 --> 00:01:57.630 Tom Field: Yeah, not nearly as dramatic.
This is just a view of 39 00:01:57.630 --> 00:02:00.810 Charlotte when I was in a roundtable discussion last week, 40 00:02:00.990 --> 00:02:03.540 and we're up on the 21st floor, which is the highest I'd been in 41 00:02:03.540 --> 00:02:05.820 Charlotte outside of an airplane, and just had an 42 00:02:05.820 --> 00:02:08.760 opportunity to take a nice photo with a view and said, yep, got to 43 00:02:08.760 --> 00:02:10.020 work it into the next Editors' Panel. 44 00:02:10.590 --> 00:02:13.560 Anna Delaney: Definitely works. Mat, are you keeping safe? 45 00:02:13.950 --> 00:02:16.110 Mathew Schwartz: Yes. Stormy conditions, slightly less 46 00:02:16.110 --> 00:02:19.680 dramatic than Gloucester, thankfully, here in Dundee, 47 00:02:19.680 --> 00:02:23.700 Scotland, lots of rain, and wandering the streets as I do 48 00:02:23.700 --> 00:02:25.470 looking for interesting reflections. 49 00:02:26.440 --> 00:02:28.900 Anna Delaney: Well, I've taken myself out into the sun, the 50 00:02:28.900 --> 00:02:32.440 virtual sun. So in the U.K., London, it's a bit chilly and 51 00:02:32.440 --> 00:02:36.700 gloomy on a February day. So I am yearning for a taste of the 52 00:02:36.700 --> 00:02:40.210 California sun in homage to our guest. Joe, I think you're in 53 00:02:40.210 --> 00:02:41.920 California, having done my research, right? 54 00:02:42.880 --> 00:02:43.630 Joe Sullivan: That is correct. 55 00:02:43.660 --> 00:02:44.230 Anna Delaney: Okay. 56 00:02:44.620 --> 00:02:46.240 Tom Field: This is not the right week to be talking about 57 00:02:46.240 --> 00:02:47.200 California sun. 58 00:02:48.640 --> 00:02:53.500 Anna Delaney: Maybe not. This was taken in perhaps June 2022. 59 00:02:53.530 --> 00:02:58.660 But I'm just trying to bask in the Californian vibe. So, Joe, 60 00:02:58.690 --> 00:03:00.820 as you know, we have plenty of questions for you. So I'm going 61 00:03:00.820 --> 00:03:04.090 to pass the torch over to Tom, to start things off. 62 00:03:05.860 --> 00:03:11.050 Tom Field: Yes, what does it take to be a CISO in 2024, given 63 00:03:11.050 --> 00:03:15.460 the inherent challenges, and who wants the job, given the 64 00:03:15.460 --> 00:03:17.050 inherent challenges and the risks? 65 00:03:19.250 --> 00:03:21.380 Joe Sullivan: That's a good question. A lot of CISOs are 66 00:03:21.380 --> 00:03:24.500 asking that question themselves. I actually spent last week at a 67 00:03:24.530 --> 00:03:29.210 CISO retreat with a bunch of top security executives. And it was 68 00:03:29.210 --> 00:03:34.430 one of the topics, for sure. But I think that as much as there's 69 00:03:35.150 --> 00:03:39.680 fear about the risk, there is excitement about the growth of 70 00:03:39.680 --> 00:03:44.240 the role. The world is changing, and so are expectations of how 71 00:03:44.240 --> 00:03:47.420 companies and organizations should deliver security. The 72 00:03:47.420 --> 00:03:50.180 expectations are growing dramatically, and so that's a 73 00:03:50.180 --> 00:03:52.790 good thing for security leaders, because forever, they've been 74 00:03:52.790 --> 00:03:56.090 begging for the resources to do the job well. They've forever 75 00:03:56.090 --> 00:03:59.540 been begging to be on the exec team, to be able to talk to the 76 00:03:59.540 --> 00:04:02.210 board, to be able to talk about risk together with the other 77 00:04:02.210 --> 00:04:05.480 executives, and they're finally getting that chance. So on the 78 00:04:05.480 --> 00:04:08.990 one hand, there's the fear that comes with that responsibility.
79 00:04:09.560 --> 00:04:13.040 But on the other hand, there's the opportunity to grow in the digital world right now. 80 00:04:14.060 --> 00:04:17.630 Responsibility inside organizations is expanding 81 00:04:17.630 --> 00:04:20.270 around digital risk. You know, 20 years ago, it was just 82 00:04:20.270 --> 00:04:23.810 information security. And then 10 years ago, expectations were 83 00:04:23.810 --> 00:04:25.610 that these roles were going to step up and get involved in 84 00:04:25.610 --> 00:04:30.950 privacy operations. Five years ago, you know, the company was 85 00:04:30.950 --> 00:04:34.070 moving on to the blockchain, and somebody had to manage the risk 86 00:04:34.070 --> 00:04:38.570 of that. And now every company is deploying AI, and somebody 87 00:04:38.570 --> 00:04:42.320 has to manage the risk associated with that. And so who 88 00:04:42.320 --> 00:04:46.160 better than the CISO to step up and say, look at all of our 89 00:04:46.160 --> 00:04:51.080 risks in an integrated way. No risk function should stand alone 90 00:04:51.080 --> 00:04:53.570 inside an organization; we have to think about risks in an 91 00:04:53.570 --> 00:04:57.620 integrated way. And I see CISOs jumping to that opportunity. 92 00:04:58.730 --> 00:05:00.740 Tom Field: So we live in a world today; we've got zero days, we've 93 00:05:00.740 --> 00:05:04.130 got nation-state activity, ransomware, election threats. 94 00:05:04.400 --> 00:05:08.210 What are the threats that give you the greatest concern today 95 00:05:08.300 --> 00:05:09.410 as a security leader? 96 00:05:09.000 --> 00:05:11.929 Joe Sullivan: Well, I spend about half my time consulting 97 00:05:11.997 --> 00:05:15.335 and advising companies and CISOs. And based on my 98 00:05:15.403 --> 00:05:19.219 conversations with them, ransomware is still number one. 99 00:05:19.287 --> 00:05:23.579 We talk about a lot of different things. But the reality is the 100 00:05:23.647 --> 00:05:27.257 ransomware attacks are so aggressive, so consistently 101 00:05:27.326 --> 00:05:31.413 never ending and hitting every industry. And so as a result, 102 00:05:31.481 --> 00:05:35.296 for the last few years, the boards and the CTO have been 104 00:05:35.365 --> 00:05:39.043 asking the security executive, how are we dealing with 105 00:05:39.112 --> 00:05:42.790 ransomware? And everybody's invested a lot of time and 106 00:05:42.859 --> 00:05:46.742 energy, and there are lots of products that claim to help 107 00:05:46.810 --> 00:05:51.102 solve your ransomware problems. But there's no, you know, silver 108 00:05:51.170 --> 00:05:55.462 bullet in this situation. It's about really good hygiene across 109 00:05:55.530 --> 00:05:59.754 the board. And so even with all of that effort, I think I just 110 00:05:59.822 --> 00:06:03.773 saw an article that suggested that last year was the worst 111 00:06:03.842 --> 00:06:07.657 year in terms of ransomware payments in history. So even 112 00:06:07.725 --> 00:06:11.267 with all that effort, the problem is not going away, 113 00:06:11.336 --> 00:06:15.559 because for the bad guys, the economic opportunity is too big. 114 00:06:15.628 --> 00:06:19.920 And so a lot of people are still very worried about ransomware. 103 00:05:32.550 --> 00:06:22.050 Tom Field: Let me ask you about another threat, which is disinformation.
115 00:06:22.110 --> 00:06:24.480 As you know, the World Economic Forum has called it the top 116 00:06:24.480 --> 00:06:27.630 threat of 2024, and the headlines just the other day 117 00:06:27.930 --> 00:06:30.720 showed an organization in Asia that had been scammed 118 00:06:30.750 --> 00:06:35.490 considerably by a deepfake. How do we even begin to get our 119 00:06:35.490 --> 00:06:38.880 defensive arms around this disinformation challenge? 120 00:06:40.160 --> 00:06:42.800 Joe Sullivan: Yeah, I had the good fortune to be in Davos for 121 00:06:42.800 --> 00:06:46.250 the World Economic Forum a couple of weeks ago, and I was 122 00:06:46.250 --> 00:06:50.270 there doing some lobbying for support for Ukraine 123 00:06:50.270 --> 00:06:53.450 in the war situation, on the humanitarian side, and also 124 00:06:54.140 --> 00:06:57.770 involved in a lot of different cybersecurity conversations. And 125 00:06:57.770 --> 00:07:03.230 yeah, it was a big topic of conversation. If the WEF says 126 00:07:03.620 --> 00:07:07.790 misinformation is the number one global risk, bar none, it's going 127 00:07:07.790 --> 00:07:10.700 to lead to conversation. But the interesting thing, the question 128 00:07:10.700 --> 00:07:13.520 I kept asking everyone, when I would walk up to people inside 129 00:07:13.520 --> 00:07:16.580 companies, I would say, who's responsible for managing 130 00:07:16.580 --> 00:07:20.300 misinformation at your organization? And 90% of the 131 00:07:20.300 --> 00:07:24.200 people I asked didn't have an answer. So, you know, at 132 00:07:24.200 --> 00:07:26.690 least we know who's responsible inside the organization for 133 00:07:26.690 --> 00:07:32.450 stopping ransomware. We don't have any kind of discipline or 134 00:07:32.450 --> 00:07:36.740 organized approach across industry, across government and 135 00:07:36.740 --> 00:07:39.860 industry, and, you know, the 136 00:07:39.860 --> 00:07:42.170 companies that are building the next generation of security 137 00:07:42.200 --> 00:07:46.880 products. It's kind of like a gray zone of lack of 138 00:07:46.880 --> 00:07:50.630 responsibility right now. But it's the public that seems to be 139 00:07:50.630 --> 00:07:53.390 most upset about it. And so someone's going to have to step 140 00:07:53.390 --> 00:07:55.010 up and do something about it soon. 141 00:07:55.880 --> 00:07:57.770 Tom Field: Very good. Joe, I appreciate your time. Appreciate 142 00:07:57.770 --> 00:08:00.140 your answers. And I'll pass you off now to my colleague, Mathew. 143 00:08:01.700 --> 00:08:04.550 Mathew Schwartz: Hey, Joe! Great to see you. I've got some 144 00:08:04.580 --> 00:08:08.900 identity questions for you. And this is a topic which I think 145 00:08:09.110 --> 00:08:12.890 can seem really, really complicated, especially when you 146 00:08:12.890 --> 00:08:16.580 see some of the really big players in this space, the 147 00:08:16.580 --> 00:08:21.320 identity providers, the IDPs, continuing to get popped, the 148 00:08:21.320 --> 00:08:25.850 likes of Microsoft and Okta. And if they can't even keep 149 00:08:25.850 --> 00:08:30.500 themselves secure, or their customers secure, do we need to 150 00:08:30.500 --> 00:08:35.660 be rethinking, do you think, how we are approaching this identity 151 00:08:35.660 --> 00:08:37.190 management platform question? 152 00:08:38.600 --> 00:08:41.900 Joe Sullivan: I do think we need to re-examine it.
It's a topic 153 00:08:41.900 --> 00:08:45.740 that I've spent a lot of time digging into. And statistically, 154 00:08:45.800 --> 00:08:48.830 you can find 100 different reports that say something like 155 00:08:48.860 --> 00:08:52.580 80% of all compromised organizations involve the abuse 156 00:08:52.580 --> 00:08:56.930 of a trusted identity inside the organization. So whether it's 157 00:08:56.930 --> 00:08:59.480 breaking in through the front door, you know, straight 158 00:08:59.480 --> 00:09:02.930 through the identity and access controls, or lack of controls 159 00:09:02.960 --> 00:09:06.080 that exist, or they're coming in, you know, through a 160 00:09:06.080 --> 00:09:09.500 different kind of vulnerability. As soon as they get a beachhead 161 00:09:09.500 --> 00:09:11.870 inside your organization, they're typically going to try 162 00:09:11.870 --> 00:09:14.450 and take over an identity and use the privileged access that 163 00:09:14.450 --> 00:09:18.950 comes with it. And so identity is probably the number one area 164 00:09:18.950 --> 00:09:26.660 of investment for CISOs in 2024. And we're all thinking about 165 00:09:26.750 --> 00:09:29.120 where do we place that investment. And one of the 166 00:09:29.450 --> 00:09:33.050 topics that I think is starting to come up is maybe we 167 00:09:33.050 --> 00:09:37.520 shouldn't consider our IDP, you know, the 168 00:09:37.520 --> 00:09:40.550 software that manages people getting in 169 00:09:40.550 --> 00:09:45.950 and out of our enterprise, to actually be a security product. So 170 00:09:45.950 --> 00:09:49.760 for example, we don't, in most organizations, just rely on our 171 00:09:49.760 --> 00:09:54.020 email provider to do all the security of email. Large 172 00:09:54.020 --> 00:09:57.890 organizations have at least one dedicated security product 173 00:09:57.890 --> 00:10:00.830 focused on email security; many organizations have two or three 174 00:10:00.830 --> 00:10:03.410 layers there. It's kind of shocking that we don't have 175 00:10:03.410 --> 00:10:07.730 those layers of controls and protection behind our IDPs. And 176 00:10:07.730 --> 00:10:11.270 I think we'll see more startups and security companies starting 177 00:10:11.270 --> 00:10:15.920 to offer products that stand behind the IDP and/or kind of 178 00:10:15.920 --> 00:10:19.430 scrutinize what the IDP is doing for the security organization. 179 00:10:20.960 --> 00:10:22.400 Mathew Schwartz: That's fascinating. There are so many 180 00:10:22.430 --> 00:10:25.070 assumptions, I think, that come along with using an identity 181 00:10:25.070 --> 00:10:28.160 provider platform, like you said. Some of the other 182 00:10:28.160 --> 00:10:32.960 assumptions we have are things like multifactor authentication, 183 00:10:33.170 --> 00:10:38.750 for example, people who say, I've enabled MFA or 2FA. And you 184 00:10:38.750 --> 00:10:43.220 see these sorts of defenses. While they're essential, right, 185 00:10:43.220 --> 00:10:46.640 everybody should be using multifactor where they can. But 186 00:10:46.640 --> 00:10:49.790 sometimes they get routed around, sometimes really via 187 00:10:49.790 --> 00:10:54.590 simple-seeming attacks, you know, MFA push notifications, 188 00:10:54.620 --> 00:10:58.070 for example. You keep getting them, keep getting them, eventually 189 00:10:58.070 --> 00:11:02.030 a target accepts them.
And I know that you've written about 190 00:11:02.030 --> 00:11:05.540 how a lot of people don't actually understand, under the 191 00:11:05.540 --> 00:11:10.490 hood, how authentication works, how maybe it's supposed to work, 192 00:11:10.490 --> 00:11:13.040 I guess we have protocols and frameworks, but then actually 193 00:11:13.040 --> 00:11:17.330 how it does work in the case of, say, FIDO2 in practice, or it 194 00:11:17.330 --> 00:11:21.890 might work in unexpected ways. This sounds like such a 195 00:11:21.890 --> 00:11:25.100 challenge when you're trying to get better defenses in place. 196 00:11:25.130 --> 00:11:29.270 What should security teams be thinking? How can they upskill 197 00:11:29.270 --> 00:11:32.150 themselves? What sorts of assumptions would you caution 199 00:11:32.150 --> 00:11:32.870 against? 198 00:11:32.000 --> 00:11:34.769 Joe Sullivan: Yeah, that's a great question. So I've spent a 200 00:11:34.829 --> 00:11:38.561 long time in my career thinking about how do I best secure the 201 00:11:38.621 --> 00:11:42.233 identities of our, you know, insiders. Over a decade ago, we 202 00:11:42.293 --> 00:11:46.206 started rolling out hard keys as a second factor at Facebook. You 203 00:11:46.266 --> 00:11:49.999 had to have a hard key to get into our production environment, 204 00:11:50.059 --> 00:11:53.671 to touch code at all. And it's funny, that was only a decade 205 00:11:53.731 --> 00:11:57.042 ago; that was kind of an environment that was pre-SaaS, 206 00:11:57.102 --> 00:12:00.473 if you will, pre-cloud. And in that environment, when we 207 00:12:00.533 --> 00:12:04.206 deployed the hard key as the second factor, we knew that you 208 00:12:04.266 --> 00:12:07.456 had to physically have that hard key to get into that 209 00:12:07.516 --> 00:12:11.008 environment, because it was behind a network perimeter; it 210 00:12:11.068 --> 00:12:14.800 was kind of like an old school environment. Now, I've actually 211 00:12:14.861 --> 00:12:18.653 helped a couple of companies in the last year roll out hard keys 212 00:12:18.713 --> 00:12:22.265 as their only second factor of authentication on the theory 214 00:12:22.325 --> 00:12:25.696 that, okay, if we have the hard key, that solves all our 215 00:12:25.757 --> 00:12:29.549 problems, you can't be phished, etc. But what we've seen in the 216 00:12:29.609 --> 00:12:33.221 last year is a rise in a very specific type of attack that I 217 00:12:33.281 --> 00:12:36.412 don't think enough people understand. So when you go 218 00:12:36.472 --> 00:12:40.204 through an identity provider, you can configure certain things 219 00:12:40.264 --> 00:12:43.395 like how long should my authentication session last? 220 00:12:43.455 --> 00:12:47.368 Should it be bound to a specific browser? Should it be bound to a 221 00:12:47.428 --> 00:12:51.281 specific device? Like, there are all kinds of things that you can 222 00:12:51.341 --> 00:12:55.013 work with your identity provider to put in there. So you know, 223 00:12:55.073 --> 00:12:58.625 when I go through, say, an Okta, and I log into a corporate 224 00:12:58.685 --> 00:13:01.695 environment, Okta drops a cookie, or what we call an 225 00:13:01.755 --> 00:13:05.186 identity token. And that token will very specifically put 226 00:13:05.247 --> 00:13:08.858 limits in place that Okta, say, designated.
The problem we're 227 00:13:08.919 --> 00:13:12.410 seeing now is that most of the time your identity provider 228 00:13:12.470 --> 00:13:16.082 stands in front of a bunch of SaaS apps and on-premises apps 229 00:13:16.142 --> 00:13:19.935 and the like, and it's not the identity provider that drops the 230 00:13:19.995 --> 00:13:23.667 ultimate token that determines whether you will get access to 231 00:13:23.727 --> 00:13:27.400 that environment. It's the SaaS app itself. So Okta will drop 232 00:13:27.460 --> 00:13:31.132 its own token saying, okay, you have to re-authenticate every 233 00:13:31.192 --> 00:13:34.623 four hours, because that's a corporate policy. But on the 234 00:13:34.684 --> 00:13:38.476 back end, the SaaS app that you just got authenticated to might 235 00:13:38.536 --> 00:13:42.268 drop a separate token that says, we think we're going to allow 236 00:13:42.329 --> 00:13:46.001 everybody to stay authenticated for a month because our users 237 00:13:46.061 --> 00:13:49.793 get frustrated when they have to re-login. And so that simple, 238 00:13:49.853 --> 00:13:53.285 like, handoff where your identity provider tells the SaaS 239 00:13:53.345 --> 00:13:56.897 app what the rules are for your authentication seems to be 240 00:13:56.957 --> 00:14:00.448 broken. We see, too many times, that the SaaS app provider 241 00:14:00.509 --> 00:14:03.880 might charge you that single sign-on tax, but they don't 242 00:14:03.940 --> 00:14:07.672 actually go and configure the token that they're going to drop 243 00:14:07.732 --> 00:14:11.103 to honor what's in the IDP's instructions. And so all an 244 00:14:11.164 --> 00:14:15.077 attacker has to do is get on the device. And then they can go and 245 00:14:15.137 --> 00:14:18.989 steal that unencrypted token and take it, and maybe that token is 246 00:14:19.050 --> 00:14:22.541 long lived. Or maybe it's supposed to be device bound, but 247 00:14:22.601 --> 00:14:26.153 it's not. And so that's where we're seeing repeated attacks 248 00:14:26.213 --> 00:14:29.945 against the same environment. Because the bad guys, one of the 249 00:14:30.006 --> 00:14:33.136 ways they can establish persistence right now is to 250 00:14:33.196 --> 00:14:36.627 grab that second token. And they'll grab all those tokens 251 00:14:36.688 --> 00:14:40.179 and they'll experiment and see which SaaS provider kind of 252 00:14:40.239 --> 00:14:43.310 screwed up and left the security team in the lurch. 213 00:12:20.690 --> 00:14:46.970 Mathew Schwartz: Fascinating. Thank you. I think you're going 253 00:14:46.970 --> 00:14:51.470 to be hearing a lot more about this. People are so innovative 254 00:14:51.470 --> 00:14:56.000 and experimental. And this stuff just keeps surprising us. That's 255 00:14:56.000 --> 00:14:58.490 great. Thank you so much. Speaking of surprises, I know 256 00:14:58.490 --> 00:14:59.990 Anna's got some questions for you. 257 00:15:00.830 --> 00:15:02.450 Anna Delaney: Got a couple of surprises. That's brilliant 258 00:15:02.450 --> 00:15:05.930 stuff so far. Thank you. Well, Joe, we can't overlook the topic 259 00:15:06.140 --> 00:15:10.550 of AI in this discussion. So what shows promise and gives you 260 00:15:10.550 --> 00:15:14.480 hope, specifically, where do you see AI supporting the defenders 261 00:15:14.480 --> 00:15:17.090 today? And what are your aspirations for its future 262 00:15:17.090 --> 00:15:17.690 impact? 263 00:15:19.010 --> 00:15:20.990 Joe Sullivan: So, you know, on the one hand, everybody's 264 00:15:20.990 --> 00:15:24.560 freaking out about the risks of AI.
But on the other hand, we're 265 00:15:24.560 --> 00:15:27.710 all talking about them right now, in the early days of 266 00:15:27.710 --> 00:15:30.680 deployment, which is a lot better than in a lot of other 267 00:15:30.680 --> 00:15:33.440 kind of like technology revolutions of the last 25 268 00:15:33.470 --> 00:15:37.100 years. It's so much better that we're having public-private 269 00:15:37.100 --> 00:15:40.760 discussions, we're having large organizations think about how to 270 00:15:40.760 --> 00:15:43.850 deploy and manage it, we're having lots of investment on the 271 00:15:43.850 --> 00:15:46.430 security side. I can't keep track of all the security 272 00:15:46.430 --> 00:15:49.190 startups that have launched in the last year or two or have 273 00:15:49.190 --> 00:15:53.090 pivoted in the last year or two to be something related to AI 274 00:15:53.090 --> 00:15:56.450 security. When you think about AI security, there are two kinds 275 00:15:56.450 --> 00:16:02.240 of topics. One is how do we secure the AI itself? And 276 00:16:02.420 --> 00:16:05.720 there's a ton of conversation about that. But then the second 277 00:16:05.720 --> 00:16:09.320 question is, how do we use AI for security? And that's where 278 00:16:09.320 --> 00:16:12.050 I'm seeing some really cool stuff right now. In fact, just 279 00:16:12.050 --> 00:16:15.560 yesterday, I paired up one of the startups that's doing AI for 280 00:16:15.560 --> 00:16:19.880 security with a company where I had helped hire their first 281 00:16:19.910 --> 00:16:24.350 security executive, and that security leader, he's a one-man 282 00:16:24.350 --> 00:16:27.800 band at that company right now, you know, he's hired one person, 283 00:16:28.040 --> 00:16:31.400 but he's got a company that's, you know, growing quickly and 284 00:16:31.430 --> 00:16:37.640 has lots of risks. And we paired the startup security company 285 00:16:37.640 --> 00:16:42.980 with this other company, because the AI for security opportunity 286 00:16:42.980 --> 00:16:46.850 is huge. A small security team will be able to keep an eye on 287 00:16:46.850 --> 00:16:51.860 many more things. Like, if you're a one-person security team, you 288 00:16:51.860 --> 00:16:54.800 often are like, okay, today, I will go in and review all the 289 00:16:54.800 --> 00:16:57.680 identity and access logs and make sure that that's all going 290 00:16:57.680 --> 00:17:01.190 well. And then the next day, I'll go review, you know, I need 291 00:17:01.190 --> 00:17:04.370 to do another sweep to make sure that our employees haven't, you 292 00:17:04.370 --> 00:17:07.100 know, connected to a bunch of extensions that are downloading 293 00:17:07.100 --> 00:17:10.130 our data. And then the next day I need to go. So it's like a 294 00:17:10.130 --> 00:17:14.420 constant, too many things to keep track of, and having an AI 295 00:17:14.420 --> 00:17:17.540 assistant that's looking at all your security tools and telling 296 00:17:17.540 --> 00:17:21.350 you what is the most important risk to go jump on today, or is 297 00:17:21.350 --> 00:17:25.070 even automating some decision-making in some lower-risk 298 00:17:25.070 --> 00:17:30.110 areas. That is a real tool for small security organizations 299 00:17:30.110 --> 00:17:33.260 that can help them scale really quickly.
And then the second 300 00:17:33.350 --> 00:17:37.730 reason that AI for security is so exciting is most of what we 301 00:17:37.730 --> 00:17:41.780 tend to do in security, we focus on prevention, but we also 302 00:17:41.780 --> 00:17:44.930 assume that we're going to get compromised. And so we all 303 00:17:44.930 --> 00:17:47.390 invest in these large detection and response efforts where we 304 00:17:47.390 --> 00:17:50.510 collect a ton of logs, and they sit in these giant data 305 00:17:50.510 --> 00:17:54.080 warehouses. And we hope that our security detection tools will 306 00:17:54.080 --> 00:17:57.740 find the needle in the haystack. Well, AI is really going to help 307 00:17:57.740 --> 00:18:01.130 in that respect, because it's going to be much faster at 308 00:18:01.130 --> 00:18:04.610 processing the data and much better at identifying anomalies 309 00:18:04.640 --> 00:18:07.670 over time, as we train it. So there are some really exciting 310 00:18:07.670 --> 00:18:13.070 opportunities for defenders to be able to use AI to be much 311 00:18:13.070 --> 00:18:13.790 more effective. 312 00:18:15.070 --> 00:18:16.960 Anna Delaney: Well, very encouraging. That's great. And 313 00:18:17.170 --> 00:18:20.200 before we wrap up, I'd really like to ask you about information 314 00:18:20.200 --> 00:18:23.980 sharing. I know you've long advocated for threat intel 315 00:18:23.980 --> 00:18:27.880 sharing to bolster our defenses. However, recent reports suggest 316 00:18:27.880 --> 00:18:31.810 that CISA is encountering hurdles with its cybersecurity 317 00:18:31.810 --> 00:18:34.690 initiatives, raising concerns, of course, about the 318 00:18:34.870 --> 00:18:38.920 politicization of government cyber efforts. So given this 319 00:18:38.920 --> 00:18:42.460 backdrop, what are your insights on the current state of private 320 00:18:42.460 --> 00:18:46.660 sector sharing and public-private initiatives, and are we 321 00:18:46.660 --> 00:18:49.210 at a level that meets your expectations as a security 322 00:18:49.210 --> 00:18:49.720 professional? 323 00:18:50.890 --> 00:18:52.630 Joe Sullivan: Well, anybody who's been in security as long 324 00:18:52.630 --> 00:18:56.110 as I have has seen this conversation going on forever. 325 00:18:57.010 --> 00:19:00.190 The reality of the internet is that most of the internet's 326 00:19:00.190 --> 00:19:03.550 technology and data is sitting in the hands of private 327 00:19:03.550 --> 00:19:07.900 organizations. So historically, we always counted on our 328 00:19:07.900 --> 00:19:10.840 governments to keep us safe. That's, yeah, one of the 329 00:19:10.840 --> 00:19:13.540 core reasons that people, I don't know, thousands or millions of 330 00:19:13.540 --> 00:19:16.570 years ago, bound together in communities was for the common 331 00:19:16.570 --> 00:19:21.940 defense. For the first time ever, the cyber world creates a really 332 00:19:21.940 --> 00:19:25.210 difficult problem for governments in that they don't 333 00:19:25.210 --> 00:19:28.450 have visibility into what's happening unless somebody in the 334 00:19:28.450 --> 00:19:32.950 private sector cooperates. But why haven't we gotten better? 335 00:19:32.950 --> 00:19:38.680 Because people have this concern about privacy, and that's real.
And 336 00:19:38.680 --> 00:19:41.920 there's the idea that the government would have incredibly more 337 00:19:41.920 --> 00:19:44.920 access and visibility into all of our private lives if we just 338 00:19:44.920 --> 00:19:47.500 gave the government unfettered access to all that data. So 339 00:19:47.500 --> 00:19:50.560 there's this fundamental tension that exists in cybersecurity 340 00:19:50.740 --> 00:19:53.560 that doesn't exist anywhere else. So we've all been fighting 341 00:19:53.560 --> 00:19:57.100 for years to try and get better and better at collaboration. And 342 00:19:57.490 --> 00:20:01.780 it's two steps forward, one step back. And look, as much as 343 00:20:01.780 --> 00:20:04.990 there's criticism of CISA for, you know, not being perfect 344 00:20:04.990 --> 00:20:08.560 right now, they've taken us many steps forward in the last few 345 00:20:08.560 --> 00:20:13.330 years in terms of that collaboration. And so, like, we 346 00:20:13.330 --> 00:20:16.180 have to nitpick, we have to criticize, because we all want 347 00:20:16.210 --> 00:20:19.150 to do better. But we shouldn't just throw in the towel, because 348 00:20:19.150 --> 00:20:22.900 it's not working perfectly yet. It's going to be a collaborative, 349 00:20:22.900 --> 00:20:26.560 dynamic process, so that we maintain the right balance of 350 00:20:26.560 --> 00:20:34.030 privacy and security. And no one person can sit in all those 351 00:20:34.030 --> 00:20:37.750 spots and see a perfect solution. So we just got to keep 352 00:20:37.750 --> 00:20:41.020 pushing for it. Because at the end of the day, we always say in 353 00:20:41.020 --> 00:20:43.600 security, the bad guys all share all the information, and the 354 00:20:43.600 --> 00:20:48.310 good guys are at a deficit. I do think that the push towards more 355 00:20:48.310 --> 00:20:52.300 transparency for companies is a very good thing, because 356 00:20:53.590 --> 00:20:57.880 if companies, by default, are afraid to share, because it'll 357 00:20:58.030 --> 00:21:01.720 hurt their brand or reputation, then the rest of us don't get 358 00:21:01.720 --> 00:21:06.700 the information we need to be able to learn the lessons from 359 00:21:06.700 --> 00:21:10.990 their failures. So I hope that we move towards that. You know, we're 360 00:21:10.990 --> 00:21:14.320 seeing a lot of sticks right now, in terms of trying to force 361 00:21:14.470 --> 00:21:18.580 organizations to be more collaborative. But we still want 362 00:21:18.580 --> 00:21:22.420 to see more carrots too. Like, when I was on President Obama's 363 00:21:22.420 --> 00:21:25.240 cyber commission in 2016, one of the things we talked about a lot 364 00:21:25.240 --> 00:21:27.940 was how do we give immunity to organizations that come forward 365 00:21:27.940 --> 00:21:31.720 quickly? How do we reward them for wanting to cooperate and 366 00:21:31.720 --> 00:21:34.030 share the intelligence information that would help all 367 00:21:34.030 --> 00:21:37.390 of us in our collective defense? And I hope we can keep having 368 00:21:37.390 --> 00:21:39.970 those conversations about, like, let's do carrots too, not just 369 00:21:39.970 --> 00:21:40.450 sticks. 370 00:21:41.530 --> 00:21:43.420 Anna Delaney: We're all for carrots here. More carrots, 371 00:21:43.420 --> 00:21:47.320 please. This is great. These are really, really good insights. I 372 00:21:47.320 --> 00:21:50.140 mean, thank you so much, Joe. We've got one final question. 373 00:21:50.140 --> 00:21:52.750 We're going to give you a break for a moment.
But it's just for 374 00:21:52.750 --> 00:21:56.170 fun. I'd like you to pick a sport and demonstrate how it can 375 00:21:56.170 --> 00:21:59.200 apply to cybersecurity. Are there any parallels between the 376 00:21:59.200 --> 00:22:03.130 sport and navigating the complexities of cyberspace? 377 00:22:03.600 --> 00:22:05.550 Tom Field: It reminds me of a story. The first time I visited 378 00:22:05.550 --> 00:22:09.810 India years ago, I sat down with one of my hosts. And he said, 379 00:22:09.810 --> 00:22:12.720 can you explain American football to me? And I said, can 380 00:22:12.720 --> 00:22:16.230 you explain cricket to me? Both games are fascinating to watch, 381 00:22:16.260 --> 00:22:19.230 with millions of supporters worldwide, but for someone that 382 00:22:19.230 --> 00:22:22.740 doesn't understand them, the rules are extremely complex. And 383 00:22:22.740 --> 00:22:25.500 the two of us sat down with napkins and drew what we could 384 00:22:25.500 --> 00:22:28.260 share with each other. And so I'd bring both those sports in 385 00:22:28.260 --> 00:22:31.350 there. Because it's something, from the outside, you can enjoy 386 00:22:31.350 --> 00:22:34.230 watching. You can sort of get your hands around it, but to really 387 00:22:34.230 --> 00:22:36.780 understand the intricacies, it takes some work. 388 00:22:38.640 --> 00:22:40.650 Anna Delaney: Very well said! Mat, what are you going to throw 389 00:22:40.650 --> 00:22:41.010 at us? 390 00:22:41.780 --> 00:22:43.730 Mathew Schwartz: What am I going to throw at you? Thank you. That's 391 00:22:43.730 --> 00:22:48.260 not a lovely image. I'm going to be coming running toward you. 392 00:22:48.500 --> 00:22:52.490 Because maybe like 15 years ago, I decided I was going to learn 393 00:22:52.490 --> 00:22:56.660 how to run, like as in jogging, not just escaping from things. 394 00:22:56.690 --> 00:22:59.390 And I thought it would be a good thing to do because I was 395 00:22:59.390 --> 00:23:03.470 moving to Scotland. And I am so often on my computer, I needed 396 00:23:03.470 --> 00:23:08.480 to get out. And so I found the Couch to 5K program, which I 397 00:23:08.480 --> 00:23:11.570 think a lot of people have done, where over eight weeks, you 398 00:23:11.570 --> 00:23:16.550 learn how to go from perhaps a sedentary, couch-bound lifestyle, 399 00:23:16.760 --> 00:23:20.930 to being able to run five kilometers without stopping. And 400 00:23:20.930 --> 00:23:23.960 the thing that I love about that, which is counterintuitive, 401 00:23:24.260 --> 00:23:30.740 is that it's interval training. So you don't just run and just 402 00:23:30.740 --> 00:23:34.760 see how far you get. But you run and walk and run and walk, and 403 00:23:34.760 --> 00:23:37.220 over the eight weeks, you start running more than you're 404 00:23:37.220 --> 00:23:40.970 walking. And I think that's a good thing to keep in mind when 405 00:23:40.970 --> 00:23:46.310 you're trying to master complex, difficult things. Especially 406 00:23:46.310 --> 00:23:47.360 like cybersecurity. 407 00:23:47.510 --> 00:23:50.390 Anna Delaney: Yeah, a lot of wisdom there. And you've run 408 00:23:50.390 --> 00:23:56.930 across the globe now, so you run further than 5K, I think. It 409 00:23:56.930 --> 00:23:57.710 all began ... 410 00:23:57.770 --> 00:24:00.140 Mathew Schwartz: All the way around the globe. Yeah, maybe 412 00:24:00.140 --> 00:24:00.560 soon. 411 00:24:00.000 --> 00:24:03.300 Anna Delaney: Okay, good. Well, did you know that there's such a
413 00:24:03.300 --> 00:24:10.440 thing as extreme ironing? It is classified as an extreme sport 414 00:24:10.680 --> 00:24:13.770 in which people iron clothes in unconventional places and 415 00:24:13.770 --> 00:24:18.810 extreme locations such as mountain peaks or underwater or 416 00:24:18.810 --> 00:24:22.740 whilst skydiving. So with that in mind, I think cybersecurity 417 00:24:22.740 --> 00:24:25.560 professionals often find themselves in unconventional, 418 00:24:25.560 --> 00:24:29.190 perhaps unexpected situations. And they've got to employ 419 00:24:29.190 --> 00:24:32.880 creative ways of thinking about protecting the crown 420 00:24:32.880 --> 00:24:36.510 jewels, and I think thinking outside the box is an expression 421 00:24:36.510 --> 00:24:39.060 which can be applied to both. 422 00:24:39.600 --> 00:24:40.410 Tom Field: Extreme ironing! 423 00:24:42.360 --> 00:24:45.240 Anna Delaney: Yes. I'm not doing it anytime soon, but that did 424 00:24:45.930 --> 00:24:47.550 catch my interest. Joe? 425 00:24:48.750 --> 00:24:52.860 Joe Sullivan: So I'm going to say skiing. And I fell in love 426 00:24:52.860 --> 00:24:55.410 with skiing about a decade ago, and it's something that I always 427 00:24:55.410 --> 00:24:58.740 try and find some time to do. And I was thinking about why do 428 00:24:58.740 --> 00:25:03.450 I like skiing? It's a risk-reward situation. It's so much 429 00:25:03.450 --> 00:25:08.010 like what we do in security. It's risky, because if you go 430 00:25:08.010 --> 00:25:12.420 too fast, or you go faster than your skills would 431 00:25:12.420 --> 00:25:17.370 allow, you get really hurt. But it's a rewarding experience. 432 00:25:17.370 --> 00:25:21.300 Because when you do it with other people, it's more fun. And 433 00:25:21.330 --> 00:25:25.080 when you practice, and you focus and you're disciplined about it, 434 00:25:25.260 --> 00:25:29.820 you get better at it. And so to me, anything where you're 435 00:25:29.820 --> 00:25:34.950 looking at taking risks, and figuring out how to navigate 436 00:25:34.950 --> 00:25:38.670 them, it's like my day job. But at the same time, when you're 437 00:25:38.670 --> 00:25:41.250 out skiing, one of the nice things about it is you can't 438 00:25:41.250 --> 00:25:43.590 think about anything else; you have to think in the moment. So 439 00:25:45.120 --> 00:25:47.940 if you've had a lot going on, getting out there, you know, 440 00:25:47.970 --> 00:25:50.100 it's like going for a run or something like that. It just 441 00:25:50.100 --> 00:25:53.640 clears your head. And I think all of us in the risk business 442 00:25:53.640 --> 00:25:55.230 need to clear our heads every so often. 443 00:25:56.170 --> 00:25:56.740 Tom Field: Very well said! 444 00:25:58.540 --> 00:26:01.300 Anna Delaney: We approve; that is a great, great example. So, 445 00:26:01.660 --> 00:26:03.670 Joe, thank you so much for playing along with that. It's 446 00:26:03.700 --> 00:26:06.850 been absolutely wonderful having you grace the ISMG studios. And 447 00:26:07.030 --> 00:26:10.300 thank you so much for your perspectives and expertise. We 448 00:26:10.330 --> 00:26:13.330 hope that you will return; we'd be honored to have you back. 449 00:26:14.740 --> 00:26:16.240 Joe Sullivan: Thank you for having me. I'd be happy to come 450 00:26:16.240 --> 00:26:16.600 back. 451 00:26:17.710 --> 00:26:20.500 Anna Delaney: And that's a wrap. Thank you so much. And thank you 452 00:26:20.500 --> 00:26:22.210 so much for watching. Until next time!