Anna Delaney: Hello, I'm Anna Delaney, and for this week's ISMG Editors' Panel, we are live at the ISMG London Summit 2023. And what an event it's been. I am joined by my colleagues Mat, Akshaya and Tony. Great to be with each other at the end of this brilliant day. Mat, I know you moderated a few panels, on AI and on budgeting, and you hosted a roundtable. What can you share with us? What were some of the highlights?

Mathew Schwartz: Well, the highlight for me was the budgeting secrets of our cybersecurity all-stars. That was a really fascinating discussion. I said how fun it was going to be, and I think there was laughter from the front of the audience, but we had them cowed by the end. It was a great discussion about how you think about budgeting. It is such a challenge for security teams, because you need to be strategic, but, unfortunately, sometimes you're going to need to be reactive when you've got a breach. There are vendors to manage, there's people and hiring, of course, and trying to keep them skilled. A really great discussion, as was the one we had on AI and machine learning, and what security teams and CISOs need to be doing to stay current on these sorts of things. And, just to fast-forward to the end, how they need to understand it, communicate what the possibilities are, and promise only what those capabilities might deliver, at least today, and just keep an eye on it. Because obviously, it's not just a buzzword; it's been widely used.

Anna Delaney: And they are, or were, very much navigating. They are navigating through the space, which was the title of the session. But I loved how refreshingly honest they were about where they are, the frameworks they are following, and how they're using this as an opportunity to share the responsibility model with everybody ... security is everybody's responsibility. This is kind of an opportunity, perhaps, with AI.
Mathew Schwartz: It's the latest in a long line of technologies that have been widely adopted, possibly with minimal to no forethought, in a manner which, if we're lucky, has got some kind of loose security framework, monitoring and oversight. Meaning, we've been here before with practically every major technology adoption that we've ever seen. It gets adopted. The business says, we need this in order to make money. And the security team says, great, please let us help you. So it's just the latest. But I think CISOs are in a much better position these days to rapidly deploy, if you will, and to say, look, we need to use this in a careful manner; we're here to help you.

Anna Delaney: Well said. Akshaya, you had a bit of a bird's-eye view of the event, and you were going to a few sessions and reporting on them live; the articles are already up on our website. So how was it for you? Anything that stood out?

Akshaya Asokan: Yep. This was the first event I've attended in London, so I was very excited. It was packed and it was buzzing. The first session was that of Google Cloud chaos coordinator John Stone. He spoke at length about AI security, Google's Secure AI Framework - SAIF - and how companies can deploy it. All eyes were on him, and everybody was taking notes. He did talk about how there is a tendency among practitioners to jump onto buzzwords like prompt injection and hallucinations, and to worry about algorithmic risks, before getting the basics right. He said the focus must be on getting the basics of security right, which is your basic cyber hygiene practices: patching, looking for bugs in your source code and all of that. So I found that very fascinating. And he had a very in-depth presentation on how companies can deploy AI in their solutions. So that was very interesting.

Mathew Schwartz: A lot of enthusiasm and excitement for that.

Anna Delaney: A lot of questions.

Akshaya Asokan: And a lot of questions. So that was good, very informative. And then there was the session that Mat moderated on AI regulation, with Andy Chakraborty and ...

Anna Delaney: Ian Thornton-Trump ...
Akshaya Asokan: Yeah, Ian Thornton-Trump, and their discussion about governance, and how privacy-focused governance is pushing companies to opt for AI solutions trained on private data, as opposed to public data, because that can invite fines under the EU's impending AI Act and other AI regulations that will soon come about in the U.S.

Mathew Schwartz: Or it can spill your secrets, if you train a public model and it regurgitates them for somebody else, which we've seen; I think they cited Samsung as a recent example. So yeah, it was interesting. They were relatively bullish, I thought, on private AI: that they could get the model and then train it, but keep the training inside. I thought there would be some hesitancy there, but they were talking about how they might want to adopt that.

Akshaya Asokan: Yeah. I found that very fascinating, especially for the financial sector, where there is a lot of PII - personally identifiable information - and where they have to be extremely careful. So that was very interesting and new to me.

Anna Delaney: Great discussion. So, Tony, you were the master and commander of the solution room today. Tell us about it.

Tony Morbin: I thought that was really, really interesting, and people were pretty enthusiastic. But the solutions that came out were not particularly a surprise. In fact, I would go back to Baden-Powell, who with the Scouts had it right with his motto, Be Prepared. That really sums it up. Particularly, obviously, playbooks: making sure that they are practiced, making sure that they're well communicated, and that they're updated to deal with changing circumstances.

Mathew Schwartz: So what were you solutioning? Because I was stuck in a roundtable; otherwise, I would have liked to have attended the session. But it was incident response, right?

Tony Morbin: It was incident response. So it was running through a scenario where you're a global logistics company that has been hacked, and it's suspected that it's ransomware. One of the issues that came up was, is it really ransomware or not?
I was a bit shocked to see the totally negative attitude of: no, we wouldn't bother communicating with the police, because law enforcement doesn't do anything, and they might even restrict us, because they'll want to retain evidence and so on, and it might actually slow us down. Some other questions had come up slightly earlier, such as, should we pay a ransom? And it was very much: it depends on your circumstances; you have to do what is best for you, even though the recommendation is always not to pay the ransom if you can at all avoid it. So in that sense it was predictable. But there was also: don't panic, don't go running around finger-pointing. Make sure that you go through this slowly and methodically, and think about who you should contact. Does your CEO have your phone number, so that he can contact you if all your systems are down?

Mathew Schwartz: Or if ransomware attackers are monitoring your ...

Tony Morbin: Because they're monitoring everything, and you don't know what they've already got, so maybe you need another channel of communication, separate from your normal systems. So all that was there, but it really was about getting it all in the playbook, including your response to regulators: when you need to respond, who needs to respond, and what everybody's responsibilities are. All those priorities should be in your playbook. And do practice them. Because if you don't practice, well, unfortunately, the other comment was that you only really learn if you've actually been through an attack. So all those who've been through the real thing are suddenly a lot better at response, because they have now identified their failings. Before that, they didn't know who to deputize to, or who had what authority in the actual circumstance of having been attacked. So there is that emotional response, and you need to get that out of the way, work through it step by step, and maybe look at other people in your sector, how they have dealt with it, and steal their playbooks, because they will have gone through the same thing.

Anna Delaney: Very good. So was there a particular speaker that stood out for you?
Tony Morbin: A particular speaker? It's difficult to put one person above the others, because it's a bit unfair. I know Angus from Mastercard was saying how, obviously, one of the other issues with paying ransoms is that you might not even have the wherewithal to pay: if the ransom has to be paid in cryptocurrency and you don't have a system in place to pay it, if that was what you decided. And it was also asked, is there a checklist you can run through as to whether or not you should pay a ransom? And frankly, no, there isn't, because it's going to depend on every circumstance.

Anna Delaney: Akshaya, a particular conversation or speaker or theme?

Mathew Schwartz: You've already highlighted a couple.

Akshaya Asokan: John Stone. I found him very interesting, very informative, with lots of new information nuggets.

Mathew Schwartz: I want to highlight Don Gibson for the closing keynote that he delivered. He had a wonderful ... I saw people taking pictures, and he said, contact me, I'll send you my slide deck, I'm not precious. Just looking at his lessons learned from a life in security: what burnout looks like, some lessons learned from responses, what that does to you, and how you need to think about getting through that sort of thing. It was really well received. He had a great example as well, in the panel, when we were talking about budgeting. He looks at recent incidents and circulates those, maybe monthly: he picks an incident, gives it to his executive team and the board members, and says, here's something, usually from our industry, here's how we would have fared, and here's some thinking that I have on the matter. I would also highlight, as I suspect you might, the opening speaker, Helen Rabe from the BBC, who was great. I got to interview her. Wonderful details and insights, and a great way to start the day.

Anna Delaney: Yeah, on leadership and how emotional intelligence plays a big role in how she navigates as a leader. So that was excellent. I also loved the Navigating Executive Liability panel and our good friend Jonathan Armstrong, partner at Cordery Compliance.
It was great to have that legal aspect, and a lot of questions were directed to him. I think it was Quentyn Taylor, CISO for EMEA at Canon, who said: actually, we've been fighting for a seat at the board; it's time to grow up and take responsibility. There was a great conversation there. And Jonathan was warning leaders to take note of these examples, former TSB CIO Carlos Abarca and former Uber CSO Joe Sullivan, because these are trends; they're not just flukes, not fluke cases. He likened these cases to the medieval tradition of having heads paraded on spikes. I loved that. I think he was the best anecdote-maker of the session. He also said the supplier breach has become a competitive sport, almost like an Olympic game. So I loved that. But what a great, great day. I've had a lot of fun. And if there was one word, just very quickly, to encapsulate the day, what word or image comes to mind?

Mathew Schwartz: Don had a phrase; what was it? As an American, the nature of well-being, something like that. I will just say self-care. I thought there was a great note being sounded, not to be touchy-feely, but about how all of these things are empowering. It's not fear, uncertainty and doubt; it is to equip you so that when you're in these situations, be they ransomware incident response or something to do with your career or whatever, you're in a better position. And I'd like to think that everybody is in a better position after today.

Anna Delaney: Akshaya?

Akshaya Asokan: Large language model, for the enthusiasm that it created. So yeah, definitely.

Tony Morbin: I'll probably, unfortunately, use two words, which are emotional intelligence, again picking up on Helen's comment about how emotional intelligence allows you to articulate risk better to the board. And, on the flip side of taking responsibility, being able to articulate the risk without necessarily saying, just because I've spotted this here doesn't mean I have to take responsibility for it; it could be somebody else's responsibility. So you don't go in in a confrontational way. You explain it, and then: whose risk is it?
So yeah, that whole emotional intelligence, which follows up on your area as well, and the whole not panicking and dealing with things calmly and rationally.

Anna Delaney: Very good. For me, it was engage. I had so many questions from the audience; I loved that. And it felt ... we came back to live events maybe last year, and I think this event felt more like pre-pandemic times. Everybody's just happy to be here, happy to connect. Energy and enthusiasm. So great. Well, I've had fun. Thank you so much for this wrap-up. I hope you've enjoyed it. And thank you so much for watching.