WEBVTT 1 00:00:07.080 --> 00:00:09.540 Anna Delaney: Hello and welcome to this very special edition of 2 00:00:09.540 --> 00:00:12.690 the ISMG Editors' Panel. I'm Anna Delaney, and we have a 3 00:00:12.690 --> 00:00:16.200 packed agenda ahead today, including a look at generative 4 00:00:16.200 --> 00:00:19.230 AI, emerging threats and the development of technology 5 00:00:19.230 --> 00:00:22.410 guardrails, as well as preparing the financial sector for 6 00:00:22.470 --> 00:00:25.980 third-party regulation. As I said, it's a special episode 7 00:00:25.980 --> 00:00:29.760 because we are joined by none other than Troy Leach, chief 8 00:00:29.760 --> 00:00:33.000 strategy officer of the Cloud Security Alliance, who of course 9 00:00:33.000 --> 00:00:36.780 previously led the PCI Security Standards Council. Troy really 10 00:00:36.780 --> 00:00:37.800 good to have you back. 11 00:00:38.670 --> 00:00:40.860 Troy Leach: Thank you. Special edition, I'm feeling very 12 00:00:40.860 --> 00:00:41.760 honored. Thank you. 13 00:00:42.450 --> 00:00:45.210 Anna Delaney: Absolutely. Also with us today are my stellar 14 00:00:45.210 --> 00:00:47.580 colleagues, Mathew Schwartz, executive editor of 15 00:00:47.610 --> 00:00:50.610 DataBreachToday in Europe, and Michael Novinson, managing 16 00:00:50.610 --> 00:00:52.530 editor for ISMG business. 17 00:00:53.100 --> 00:00:53.610 Mathew Schwartz: Hello! 18 00:00:54.390 --> 00:00:57.360 Anna Delaney: Really good to see you both. So, Troy, as you know, 19 00:00:57.390 --> 00:01:00.540 we like to start our discussion off with a bit of a nosiest, 20 00:01:00.630 --> 00:01:03.810 where you are in the world today? Where are you virtually? 21 00:01:04.920 --> 00:01:08.160 Troy Leach: Well, you know, virtually, I'm back in Rome 22 00:01:08.550 --> 00:01:12.150 where I was this winter, although I did just return from 23 00:01:12.150 --> 00:01:15.930 Turks and Caicos. This is my version of a tan. It's really 24 00:01:15.930 --> 00:01:22.020 just less degrees of pale. But, I do always like to go to Rome 25 00:01:22.020 --> 00:01:24.510 probably be back there in about six months. 26 00:01:25.230 --> 00:01:27.510 Anna Delaney: Very nice. It's always a good idea, isn't it? 27 00:01:28.350 --> 00:01:31.650 Contrast, Michael, I think it's Vegas? 28 00:01:31.000 --> 00:01:34.300 Michael Novinson: I think you're right, where else can you get 29 00:01:34.300 --> 00:01:37.480 the canals of Venice, the Eiffel Tower, the Planet Hollywood 30 00:01:37.480 --> 00:01:41.980 balloon - all in one city, Las Vegas, Nevada. Black Hat, which 31 00:01:42.220 --> 00:01:45.910 myself and Tom Field - our esteemed boss - will be at next 32 00:01:45.910 --> 00:01:48.370 week. So looking forward to having some conversations there 33 00:01:48.370 --> 00:01:50.680 and getting caught up on the latest and greatest in the 34 00:01:50.680 --> 00:01:51.580 threat landscape. 35 00:01:51.600 --> 00:01:54.150 Anna Delaney: Matt, what a beautiful view behind you. Tell 36 00:01:54.150 --> 00:01:54.540 us more. 37 00:01:54.870 --> 00:01:57.660 Mathew Schwartz: Thank you. So this is Dundee where I live in 38 00:01:57.660 --> 00:02:01.800 Scotland. And I was out the other day with one of my cameras 39 00:02:01.800 --> 00:02:06.540 and a lens that loves sunshine and clouds and foliage and just 40 00:02:06.960 --> 00:02:11.220 playing around. This is a little office building, building is a 41 00:02:11.220 --> 00:02:15.600 wrong word, but office block, I guess, constructed of the 42 00:02:15.630 --> 00:02:19.500 containers that go in the back of trucks, lorries that go on to 43 00:02:19.500 --> 00:02:23.640 ships. So it's a container structure that's been built. I 44 00:02:23.640 --> 00:02:25.920 don't know if they are intending to build more. It's called 45 00:02:25.920 --> 00:02:29.370 District 10. But this one has the number one on it. So that 46 00:02:29.370 --> 00:02:31.050 has been an open question in my mind. 47 00:02:31.860 --> 00:02:34.530 Anna Delaney: Very cool, indeed. Well, in honor of our special 48 00:02:34.530 --> 00:02:38.700 guest, I transported myself back to Phoenix, Arizona, and among 49 00:02:38.700 --> 00:02:42.870 the cacti, where we were for ISMG's summit last November. 50 00:02:43.020 --> 00:02:45.630 Well, Troy, we have a few questions for you. I'm going to 51 00:02:45.630 --> 00:02:48.060 hand over to Matt at this point to start proceedings. 52 00:02:48.840 --> 00:02:51.780 Mathew Schwartz: Excellent. Well, great to have you on Troy. 53 00:02:51.780 --> 00:02:55.020 As always, thank you. And I know this is going to catch you by 54 00:02:55.020 --> 00:02:59.580 surprise. But I'd like to talk regulations. And just to change 55 00:02:59.580 --> 00:03:02.100 things up a little bit. I'm not going to start with PCI. 56 00:03:02.280 --> 00:03:04.230 Instead, I'd like to start because I'm over here in 57 00:03:04.230 --> 00:03:09.480 Scotland, with the EU's Digital Operational Resilience Act, or 58 00:03:09.480 --> 00:03:13.860 DORA, which is going to be going into effect at the beginning of 59 00:03:13.890 --> 00:03:19.530 2025. What do you think the industry needs to be doing now 60 00:03:19.800 --> 00:03:20.790 to prep for DORA? 61 00:03:22.320 --> 00:03:26.460 Troy Leach: Yeah, you know, DORA is something and obviously cloud 62 00:03:26.460 --> 00:03:31.830 security lines has a lot of engagement in Europe, offices in 63 00:03:31.830 --> 00:03:37.080 Berlin. But speaking a lot with the chapter in Dublin recently 64 00:03:37.080 --> 00:03:42.720 about this and others is real concern understanding of what is 65 00:03:42.720 --> 00:03:46.380 trying to be accomplished by the European supervisory 66 00:03:46.380 --> 00:03:49.950 authorities. They're trying to find ways to make sure that 67 00:03:49.950 --> 00:03:54.420 there's good contingency and resiliency for things that are 68 00:03:54.420 --> 00:03:58.050 really critical infrastructure and finding ways for them to 69 00:03:58.320 --> 00:04:01.110 protect specially with some of these organizations, they may 70 00:04:01.110 --> 00:04:05.490 have quite a bit of facilities and employees local to a 71 00:04:05.490 --> 00:04:09.360 country, but they may be headquartered in another region 72 00:04:09.360 --> 00:04:13.020 of the world. And that's raised a lot of concerns about how 73 00:04:13.020 --> 00:04:16.980 critical infrastructure could be impacted. So I think, you know, 74 00:04:16.980 --> 00:04:21.180 the technical requirements have not been solidified. But right 75 00:04:21.180 --> 00:04:23.970 now they're looking and it's going to be the beginning of 76 00:04:23.970 --> 00:04:28.440 first quarter of 2025, that this will go into effect. And there's 77 00:04:28.440 --> 00:04:31.320 going to be multiple parts. And there's going to be layers of 78 00:04:31.320 --> 00:04:35.130 complexity here. But I think for people that are looking at this, 79 00:04:35.250 --> 00:04:38.700 you know, they should be monitoring for the technical 80 00:04:38.730 --> 00:04:41.850 requirements for ICTs, that's information and communication 81 00:04:41.850 --> 00:04:45.510 technology companies, and further expectations from what 82 00:04:45.510 --> 00:04:52.050 these ESAs are. So that's the EBA, EIOPA, ESMA, and they 83 00:04:52.050 --> 00:04:54.810 possibly could have other additional layers of 84 00:04:54.810 --> 00:05:00.360 expectations for these ICT service providers and cloud 85 00:05:00.360 --> 00:05:03.090 service providers would be, especially the hyperscalers, 86 00:05:03.090 --> 00:05:09.540 like mature GCP, AWS, those will be the ones that probably are 87 00:05:09.540 --> 00:05:12.990 going to have to work with their existing financial service 88 00:05:13.260 --> 00:05:18.270 customers and understand what these changes really mean. And 89 00:05:18.270 --> 00:05:22.380 so if you are using these services today, you know, 90 00:05:22.380 --> 00:05:24.840 there's they're going to be the important questions to be asked 91 00:05:24.840 --> 00:05:28.140 is, are we going to be classified as critical? And the 92 00:05:28.140 --> 00:05:30.840 likely answer is, yes, if you are one of those four mentioned 93 00:05:30.840 --> 00:05:35.370 companies, and all of these associated expectations that are 94 00:05:35.370 --> 00:05:39.060 going to be new. You know, there's incident reporting, we 95 00:05:39.060 --> 00:05:44.190 just saw new regulation come out from the SEC here in the United 96 00:05:44.190 --> 00:05:48.690 States, very similar, about expectations around what that 97 00:05:48.690 --> 00:05:51.780 reporting of an incident should be. And what a third party that 98 00:05:51.780 --> 00:05:55.350 has an incident, how should a financial service report that I 99 00:05:55.350 --> 00:05:58.140 think identifying some of the business functions is going to 100 00:05:58.140 --> 00:06:02.580 be probably the first need, and what really has people concerned 101 00:06:02.580 --> 00:06:05.160 because they just don't know what it's going to all entail 102 00:06:05.160 --> 00:06:11.250 yet, is these threat-led pen tests, or TLPTs, and these TLPTs 103 00:06:12.060 --> 00:06:16.470 are going to be supposed to be independent penetration test of 104 00:06:16.470 --> 00:06:20.160 these cloud service providers. And so I think there needs to be 105 00:06:20.190 --> 00:06:23.940 a conversation that's happening now, in preparation for this, 106 00:06:24.180 --> 00:06:27.360 evaluate what the current contract allowances are, and 107 00:06:27.390 --> 00:06:31.590 expectations for how they plan to comply with the inspections, 108 00:06:31.590 --> 00:06:35.400 with the audits, with potential data forensic investigations. 109 00:06:35.640 --> 00:06:39.120 All of this is going to be part of DORA, and those conversations 110 00:06:39.120 --> 00:06:43.320 to start now, because it's going to take a while to make 111 00:06:43.320 --> 00:06:45.990 modifications to what the current expectations are. 112 00:06:47.400 --> 00:06:50.040 Mathew Schwartz: It sounds, yes, sounds complicated and not yet 113 00:06:50.040 --> 00:06:53.880 locked down. So I guess the closer you get to the deadline, 114 00:06:53.880 --> 00:06:56.580 without knowing all the details, it gets more difficult. Well, I 115 00:06:56.580 --> 00:07:03.840 want to transition briefly here to PCI DSS version 4.0. So the 116 00:07:03.870 --> 00:07:07.860 payment card industry has got an update to its well-known data 117 00:07:07.860 --> 00:07:12.780 security standards set to take effect in March 2025, I believe. 118 00:07:13.110 --> 00:07:18.510 So I know that you're not part of the council anymore. But when 119 00:07:18.510 --> 00:07:21.390 you look at this, where do you think we are in the PCI 120 00:07:21.390 --> 00:07:24.330 transition? And as we move toward version 4.0, are there 121 00:07:24.330 --> 00:07:28.950 any surprises or challenges that you're seeing or hearing about 122 00:07:28.950 --> 00:07:30.420 that people should be aware of? 123 00:07:31.740 --> 00:07:35.250 Troy Leach: Yeah, and I think if organizations are thinking this 124 00:07:35.250 --> 00:07:40.410 is just another year of PCI assessments going forward, I 125 00:07:40.410 --> 00:07:44.700 think they should be aware that in this newest version, it truly 126 00:07:44.700 --> 00:07:51.300 was a significant revisement of the standard. And, there's been 127 00:07:51.450 --> 00:07:55.080 fortunately, unlike DORA, we already know exactly what the 128 00:07:55.080 --> 00:07:59.160 expectations are, for within the requirements. And these 129 00:07:59.160 --> 00:08:03.030 requirements go into effect, the standard goes into effect March 130 00:08:03.060 --> 00:08:08.460 of 2024. And, then there are certain requirements that are 131 00:08:08.460 --> 00:08:11.430 seen as possibly a bigger lift, they might need additional 132 00:08:11.430 --> 00:08:14.760 budgetary planning, or just looking at the methodology 133 00:08:14.760 --> 00:08:18.240 that's being used, they're really new requirements, they 134 00:08:18.270 --> 00:08:20.970 typically are given a further sunrise date. And so these new 135 00:08:20.970 --> 00:08:23.730 sunrise requirements are going to - and there's about two dozen 136 00:08:23.730 --> 00:08:28.230 or so - are going to go in effect in March of 2025. But 137 00:08:28.230 --> 00:08:33.660 overall, I heard someone say its North of 60 new requirements for 138 00:08:33.690 --> 00:08:37.500 version 4.0 of the standard. And I'll give you just a couple of 139 00:08:37.530 --> 00:08:43.050 areas that I think probably be the biggest and going back very 140 00:08:43.050 --> 00:08:47.070 similar to DORA, looking at cloud service providers, and in 141 00:08:47.070 --> 00:08:49.890 general what they call multi-tenant service providers. 142 00:08:50.130 --> 00:08:53.190 And if you look at the prior version, there was no reference 143 00:08:53.190 --> 00:08:56.970 to cloud. Now there's mentioned CloudHSM, I think cloud was 144 00:08:56.970 --> 00:09:02.430 referenced several times in the now 300 plus page document. And 145 00:09:02.430 --> 00:09:05.490 so there's going to be this expectation very similar to DORA 146 00:09:05.670 --> 00:09:09.210 that there's going to be this external pen test of these cloud 147 00:09:09.210 --> 00:09:13.020 service providers and having an understanding how that will come 148 00:09:13.020 --> 00:09:17.340 into a place. I think another area of interest is for 149 00:09:17.340 --> 00:09:20.910 merchants, the retail community especially, is they're expanding 150 00:09:20.910 --> 00:09:26.640 the scope of what previously looking at the consumer browser 151 00:09:26.730 --> 00:09:30.810 and recognizing that over the years, we got really good at 152 00:09:30.810 --> 00:09:35.010 securing the retail environment and in that payment environment, 153 00:09:35.040 --> 00:09:38.460 e-commerce merchants, but we started to see the e-skimming 154 00:09:38.460 --> 00:09:41.880 attacks, right? We saw British Airways and Virgin and all these 155 00:09:42.120 --> 00:09:45.450 Ticketmaster and all these companies that, you know, their 156 00:09:45.450 --> 00:09:50.010 source code is coming from a 100 plus locations. And so criminals 157 00:09:50.010 --> 00:09:52.170 are saying, you know, it's really easy for me to 158 00:09:52.200 --> 00:09:55.920 compromise, easier I should say to compromise a source 159 00:09:55.920 --> 00:09:59.580 repository of code, and then I can be scraping all this 160 00:09:59.580 --> 00:10:03.390 information. Because it's gotten so sophisticated, that even if I 161 00:10:03.390 --> 00:10:06.540 type in my credit card number, and I ah, you know, I shouldn't 162 00:10:06.540 --> 00:10:09.330 do this, I don't want to do it this way. And I erase it, even 163 00:10:09.330 --> 00:10:13.710 though I haven't hit submit, they're still capturing exactly 164 00:10:13.710 --> 00:10:17.550 what I keyed in to, and all that payment information. So there's 165 00:10:17.550 --> 00:10:21.270 two new requirements, one's 643 - it's around managing the 166 00:10:21.270 --> 00:10:23.970 payment pages, the scripts that are loaded and executed in the 167 00:10:23.970 --> 00:10:26.880 customer's browser. And then there's a monitoring aspect of a 168 00:10:26.880 --> 00:10:31.860 mechanism that now needs to exist to detect changes and 169 00:10:31.860 --> 00:10:35.850 modifications to the consumer sides, the HTTP headers that 170 00:10:35.850 --> 00:10:38.670 you're receiving and from the payment pages. So those are 171 00:10:38.670 --> 00:10:42.930 going to be some significant changes. I think there's also a 172 00:10:42.930 --> 00:10:47.730 focus on phishing attacks. And recognize, we talked about 173 00:10:47.760 --> 00:10:51.000 artificial intelligence in the past and how that's becoming 174 00:10:51.000 --> 00:10:55.230 these deep fakes and ability to scrape someone's voice. You 175 00:10:55.230 --> 00:10:57.840 know, these phishing attacks are going to become much more 176 00:10:57.870 --> 00:11:01.920 complicated for organizations to manage. So there's going to be 177 00:11:01.920 --> 00:11:05.880 some new requirements around that. Obviously, they've 178 00:11:05.910 --> 00:11:08.670 expanded where encryption, so previously sensitive 179 00:11:08.670 --> 00:11:13.440 authentication data, and how you encrypt that vulnerability scans 180 00:11:13.440 --> 00:11:16.590 that have authenticated scanning now to demonstrate these 181 00:11:16.590 --> 00:11:21.930 scannings, scans are efficient and working correctly. And then 182 00:11:21.930 --> 00:11:25.860 just for service providers, documenting the scope and the 183 00:11:25.860 --> 00:11:30.270 the correct, where what is the payment data environment every 184 00:11:30.270 --> 00:11:33.660 six months, not every 12 months anymore, every six months. So 185 00:11:33.660 --> 00:11:36.900 those are just examples of some of the bigger changes that are 186 00:11:36.900 --> 00:11:40.410 coming that organizations really need to plan about how they're 187 00:11:40.410 --> 00:11:43.950 going to address it, it's not just going to be asking the GRC 188 00:11:43.950 --> 00:11:46.650 team to rinse and repeat what they've done in the past, 189 00:11:46.860 --> 00:11:50.490 they're really going to have to identify and strategize for how 190 00:11:50.490 --> 00:11:52.140 they'll address those new requirements. 191 00:11:52.770 --> 00:11:54.390 Mathew Schwartz: Fascinating. I mean, with all the digital 192 00:11:54.390 --> 00:11:56.850 skimming, e-skimmikng attacks we've been seeing, like you were 193 00:11:56.850 --> 00:12:00.270 talking about the Magecart style attacks, great to hear that we 194 00:12:00.270 --> 00:12:02.400 have some of this stuff coming down the pipe to help, 195 00:12:02.550 --> 00:12:06.630 hopefully, lock this sort of thing from happening. I know 196 00:12:06.630 --> 00:12:08.610 Michael's got some more questions on his front as well. 197 00:12:08.610 --> 00:12:10.890 So I want to hand over to Michael from Vegas. 198 00:12:11.460 --> 00:12:15.180 Michael Novinson: Absolutely. And do appreciate the continued 199 00:12:15.180 --> 00:12:18.240 time here. Want to ask you to gaze into the crystal ball here, 200 00:12:18.240 --> 00:12:21.750 look ahead to 2024, get a sense from you of what organizations 201 00:12:21.750 --> 00:12:25.470 are prioritizing in their security budgets as a result of 202 00:12:25.470 --> 00:12:29.850 the impending changes to PCI as well as the impending DORA 203 00:12:29.880 --> 00:12:30.900 regulations? 204 00:12:31.920 --> 00:12:35.580 Troy Leach: That's a really good question. I might be a little 205 00:12:35.580 --> 00:12:40.020 jaded having run, you know, PCI Council and being involved in 206 00:12:40.020 --> 00:12:44.730 standards development last 20 years or so. But I would say 207 00:12:44.730 --> 00:12:49.500 that history suggests that sometimes that while CISO may 208 00:12:49.500 --> 00:12:53.070 want to plan for that, sometimes the budgets don't get approved 209 00:12:53.070 --> 00:12:55.200 until there's an actual pain point. Like we've actually 210 00:12:55.200 --> 00:12:59.610 missed a requirement in the standard. So we may be a couple 211 00:12:59.610 --> 00:13:03.720 of years from that planning may not be - it should be - but it 212 00:13:03.720 --> 00:13:10.320 may not be in the 2024. But what I think is, and also say that in 213 00:13:10.320 --> 00:13:11.970 this year that you know, we're not driven by the fear of 214 00:13:11.970 --> 00:13:16.140 regulation. I think what people are driven by is the fear of 215 00:13:16.140 --> 00:13:20.790 obsolescence. And that's with AI. So you saw, CEOs are wanting 216 00:13:20.790 --> 00:13:24.900 to invest in AI to remain competitive. And so they're 217 00:13:24.900 --> 00:13:28.620 giving the budget and that focus of their CISOs, to say, we want 218 00:13:28.620 --> 00:13:31.020 to facilitate these new opportunities. And we want you 219 00:13:31.020 --> 00:13:33.900 to make it secure immediately for us, which is hard because 220 00:13:33.900 --> 00:13:37.320 the it's such an emerging technology. But I think that's 221 00:13:37.320 --> 00:13:40.140 where the security budgets or part of it is going to be 222 00:13:40.140 --> 00:13:46.290 focused is build us a secure internal AI platform that we can 223 00:13:46.290 --> 00:13:48.900 actually share sensitive information about our company 224 00:13:49.020 --> 00:13:52.350 and not fear that we're going to lose it to competitors. 225 00:13:53.640 --> 00:13:55.230 Michael Novinson: Troy, you've mentioned this third-party 226 00:13:55.230 --> 00:13:58.410 penetration testing requirements coming down the pipe - both in 227 00:13:58.410 --> 00:14:02.010 DORA as well as some enhancements on the PCI side. 228 00:14:02.250 --> 00:14:04.920 What do you feel cloud providers should do to prepare for that? 229 00:14:06.300 --> 00:14:11.190 Troy Leach: Well, I think it's a lot of education of what it 230 00:14:11.190 --> 00:14:15.120 means to them. You know, many of these larger cloud service 231 00:14:15.120 --> 00:14:18.930 providers, they're so large that sometimes they're working in 232 00:14:18.930 --> 00:14:22.290 silos themselves and different teams are trying to accomplish 233 00:14:22.290 --> 00:14:27.780 the same goals. But I think it's just first of all conducting a 234 00:14:27.780 --> 00:14:30.930 gap analysis of their own readiness of the client's 235 00:14:30.930 --> 00:14:34.650 workloads and knowing that they are going to be receiving these 236 00:14:34.650 --> 00:14:39.780 expectations, and valuate their current service agreements. I 237 00:14:39.780 --> 00:14:42.900 think there's going, many of these are in place for a long 238 00:14:42.900 --> 00:14:47.100 time. We see the U.S. Treasury, this year, talked about the 239 00:14:47.100 --> 00:14:50.070 state of financial services adoption of a cloud and what 240 00:14:50.070 --> 00:14:53.700 their biggest concerns was the existing contracts and 241 00:14:53.700 --> 00:14:57.630 agreements and that's something they want to evaluate further. I 242 00:14:57.630 --> 00:15:01.530 think CSPs should also, you know, identify mechanisms of how 243 00:15:01.530 --> 00:15:06.090 they can reduce the duplicity of these audits. So if I'm a major 244 00:15:06.120 --> 00:15:10.020 hyperscaler, and I have 1,000s upon 1,000s of organizations 245 00:15:10.020 --> 00:15:14.100 that are managing some form of regulated data, handling each 246 00:15:14.100 --> 00:15:19.560 one separately is going to be really, you know, costly to me. 247 00:15:19.560 --> 00:15:23.790 So trying to find ways that you can have multi-party acceptance 248 00:15:23.820 --> 00:15:26.910 of the activities you're doing for your, you're trying to 249 00:15:26.910 --> 00:15:30.030 adhere to your customers' compliance and regulatory needs. 250 00:15:30.030 --> 00:15:34.200 But you don't want to just repeat that same activity, when 251 00:15:34.200 --> 00:15:38.100 there's no real benefit of doing so 10s of 1,000s of times. I 252 00:15:38.100 --> 00:15:41.610 think that's one way of looking at those efficiencies, and then 253 00:15:41.610 --> 00:15:45.240 just really starting, they already do this, but performing 254 00:15:45.360 --> 00:15:49.590 resiliency testing and incident response reporting, making sure 255 00:15:49.590 --> 00:15:53.190 that, you know, running tests now to say, if we do have an 256 00:15:53.190 --> 00:15:56.550 incident, how do we report it to the SEC? How do we report it to 257 00:15:56.580 --> 00:16:00.060 a European supervisory authority? How, you know, 258 00:16:00.060 --> 00:16:04.440 looking at do we have those processes in place for anywhere 259 00:16:04.440 --> 00:16:09.210 where we might be called into question as part of stakeholder 260 00:16:09.240 --> 00:16:11.190 in the regulatory obligations? 261 00:16:12.390 --> 00:16:14.100 Michael Novinson: Certainly so important here. Entering the 262 00:16:14.100 --> 00:16:16.470 homestretch, I'm gonna turn it over to our closer, Anna 263 00:16:16.470 --> 00:16:16.980 Delaney. 264 00:16:18.300 --> 00:16:20.550 Anna Delaney: Thanks so much. Well, it's already been 265 00:16:20.550 --> 00:16:25.500 mentioned today, AI clearly is the buzzword of 2023. So Troy, 266 00:16:25.500 --> 00:16:28.950 I'd love your perspective as to how this technology gets used in 267 00:16:28.950 --> 00:16:32.370 heavily regulated financial services firms. Where do you see 268 00:16:32.370 --> 00:16:34.380 the opportunities as well as the risks? 269 00:16:35.950 --> 00:16:39.670 Troy Leach: Well, opportunities about and I really do think this 270 00:16:39.670 --> 00:16:44.530 is more than an annual buzzword for once. You know, there's some 271 00:16:44.530 --> 00:16:48.940 real advancements happening with large language models. Your 272 00:16:48.940 --> 00:16:51.850 biggest security advancements will be this faster, more 273 00:16:51.850 --> 00:16:55.360 efficient software design. Now, we can actually make SecDevOps 274 00:16:55.870 --> 00:16:59.890 work in real time. And you're starting to rather than just 275 00:16:59.890 --> 00:17:03.280 trying to teach and coach a software developer and try to 276 00:17:03.280 --> 00:17:07.390 encourage them to think about security, as they designed, it 277 00:17:07.390 --> 00:17:11.710 will be built in and be done in nanoseconds compared to what it 278 00:17:11.710 --> 00:17:15.940 was done previously. I also think pen testing, you know, and 279 00:17:15.940 --> 00:17:20.470 looking at this dynamic, regular way of finding new exploits, and 280 00:17:20.470 --> 00:17:25.030 then that's going to be a real advantage of AI. And then 281 00:17:25.030 --> 00:17:28.330 probably, you know, finding, just having reasonable 282 00:17:28.330 --> 00:17:32.320 expectations around which has been hard in the past for the 283 00:17:32.320 --> 00:17:35.740 software bill of materials, I think AI is really going to have 284 00:17:35.770 --> 00:17:40.930 a way for us to create a good dynamic inventory that is 285 00:17:40.930 --> 00:17:45.220 realistic. And then, you know, I think for bank customers, they 286 00:17:45.220 --> 00:17:48.760 should be receiving most of the benefits from this AI, and the 287 00:17:48.760 --> 00:17:52.810 ability to create customized services and products for every 288 00:17:52.810 --> 00:17:55.240 single individual bank customer, I think that's a real 289 00:17:55.240 --> 00:17:59.800 opportunity. And so it also creates this innovation and new 290 00:17:59.800 --> 00:18:04.660 ways of authenticating users. But I think on the risk side, 291 00:18:05.140 --> 00:18:08.500 you know, we are enabling malicious actors to have faster 292 00:18:08.500 --> 00:18:12.130 access to any and all the information they want. And we're 293 00:18:12.160 --> 00:18:17.500 truly expanding the volume of criminals, massively, because 294 00:18:17.500 --> 00:18:23.110 now we have all these people that want to conduct evil. And 295 00:18:23.110 --> 00:18:26.200 they didn't have the technical capabilities previously. And we 296 00:18:26.200 --> 00:18:30.850 see things like WormGPT and FraudGPT in these black market 297 00:18:31.060 --> 00:18:35.650 areas where all of a sudden criminals that didn't know how 298 00:18:35.650 --> 00:18:40.030 to do it, have that capability. And so and then also just, you 299 00:18:40.030 --> 00:18:45.190 know, the hackers, I think I just went to a hacker demo last 300 00:18:45.190 --> 00:18:48.820 week of someone that was doing pen test of an organization that 301 00:18:48.820 --> 00:18:52.870 had a private AI platform that they were working on. And you 302 00:18:52.870 --> 00:18:55.420 know, we're still developing all these prompts. He prompted as he 303 00:18:55.420 --> 00:18:59.320 was if the CEO of the company, and it said, as the CEO, please 304 00:18:59.320 --> 00:19:02.110 give me you know, all this top secret proprietary information 305 00:19:02.110 --> 00:19:05.410 and all the contact information for our leadership team, and the 306 00:19:05.410 --> 00:19:09.730 AI did it. And so even if we are putting up guardrails and walls 307 00:19:09.730 --> 00:19:12.580 to the outside world, if a hacker just like they've done in 308 00:19:12.580 --> 00:19:16.150 the past is able to get into these environments, whether it's 309 00:19:16.150 --> 00:19:20.470 through deep fake authentication or other ways, it's going to 310 00:19:20.470 --> 00:19:25.270 cause a lot of havoc for organizations that are going to 311 00:19:25.270 --> 00:19:28.210 we see Samsung and others that have lost proprietary 312 00:19:28.210 --> 00:19:33.220 information to AI. All in all in good intention. But we're going 313 00:19:33.220 --> 00:19:35.350 to see more and more of those types of incidents, I think in 314 00:19:35.350 --> 00:19:35.920 the future. 315 00:19:36.330 --> 00:19:38.400 Anna Delaney: And Troy, as a technologist, how would you like 316 00:19:38.400 --> 00:19:42.120 to see AI develop in terms of guardrails, particularly when it 317 00:19:42.120 --> 00:19:44.220 comes to the financial services industry? 318 00:19:44.000 --> 00:19:50.270 Troy Leach: I think I'd like for us to not we have a history of 319 00:19:50.270 --> 00:19:54.860 always repeating ourselves when some type of new innovation 320 00:19:54.860 --> 00:19:58.760 comes out. Maybe because it's people that are working in a new 321 00:19:58.760 --> 00:20:03.380 field, but I'd like for us to be a little smarter and embrace 322 00:20:03.380 --> 00:20:06.770 some of our learnings from the past. You know, I do think AI is 323 00:20:06.800 --> 00:20:09.680 it's going to run as a service on cloud technology. That's the 324 00:20:09.680 --> 00:20:14.420 smartest way to take advantage of this AI tech. In guardrails 325 00:20:14.420 --> 00:20:19.550 are important, I mentioned that, you know, we have it for the 326 00:20:19.550 --> 00:20:23.390 good side. But we have these WormGPT, FraudGPT and other dark 327 00:20:23.390 --> 00:20:25.640 web tools that are coming to market, they're going to 328 00:20:25.730 --> 00:20:29.390 accelerate that potential threat. And I think user 329 00:20:29.390 --> 00:20:32.300 education is going to be huge and important. So we're going 330 00:20:32.300 --> 00:20:37.910 to, you know, we have all this driver's license for driving a 331 00:20:37.910 --> 00:20:41.960 car responsibly, I think there might be a need for us to have 332 00:20:41.960 --> 00:20:47.060 some type of qualifications to have access to the prompting, 333 00:20:47.090 --> 00:20:51.350 although I will say prompt engineers are becoming a very 334 00:20:51.350 --> 00:20:55.880 fast growing part of the job market. And it's very lucrative 335 00:20:55.880 --> 00:20:59.420 for the moment because there's not too many of them. But we're 336 00:20:59.420 --> 00:21:03.260 going to see new innovation that's being created as part of 337 00:21:03.260 --> 00:21:08.090 this and I think user education would be the biggest part I hope 338 00:21:08.090 --> 00:21:09.080 develops very quickly. 339 00:21:10.080 --> 00:21:12.060 Anna Delaney: Very good. Well, this has been useful, thorough 340 00:21:12.060 --> 00:21:15.000 insight Troy. We really appreciate it. We have one final 341 00:21:15.000 --> 00:21:18.750 question for you. Just for fun, continuing with the generative 342 00:21:18.750 --> 00:21:23.430 AI buzz or not buzz, as you said, can you share a fun or 343 00:21:23.430 --> 00:21:29.850 interesting or quirky ChatGPT or related anecdote. Michael do you 344 00:21:29.850 --> 00:21:32.130 want to go first, then Matt, and hand over to Troy. 345 00:21:35.220 --> 00:21:39.360 Michael Novinson: Absolutely. So Vice did perhaps the most wise 346 00:21:39.360 --> 00:21:42.990 piece ever where they have one of their writers turned over 347 00:21:42.990 --> 00:21:46.440 their life to ChatGPT for 72 hours and let it make all the 348 00:21:46.440 --> 00:21:48.930 decisions around how it structured it, how the writer 349 00:21:48.930 --> 00:21:52.980 would structure their day. So it was an interesting process. And 350 00:21:53.580 --> 00:21:57.360 what happened was the chatbot kept saying to prioritize work 351 00:21:57.360 --> 00:22:00.690 to other obligations over the things that his spouse was 352 00:22:00.690 --> 00:22:03.060 asking him to do. So that made the spouse angry. So then he had 353 00:22:03.060 --> 00:22:06.750 to keep asking ChatGPT to write him apologies for his to his 354 00:22:06.750 --> 00:22:10.110 spouse, which started exactly what you think a robot 355 00:22:10.110 --> 00:22:13.260 apologizing would sound like just very formulaic. Needless to 356 00:22:13.260 --> 00:22:16.560 say, the spouse didn't feel the apologies were very sincere. So 357 00:22:16.560 --> 00:22:19.860 probably not great for either relationships or for time 358 00:22:19.860 --> 00:22:20.970 management at this point. 359 00:22:22.080 --> 00:22:25.380 Anna Delaney: It's a great one. Matt? 360 00:22:25.830 --> 00:22:32.040 Mathew Schwartz: Love it. So long time ago, I lived in Paris, 361 00:22:32.040 --> 00:22:34.560 which was wonderful. And I was working on Paris guidebooks 362 00:22:34.560 --> 00:22:36.990 while I lived there. And one of my colleagues on these 363 00:22:36.990 --> 00:22:40.650 guidebooks, Heather Stimmler, still lives in Paris, still does 364 00:22:40.650 --> 00:22:47.100 guidebooks, also does tours. And so she thought she would see if 365 00:22:47.250 --> 00:22:53.280 ChatGPT was good at creating a Paris itinerary for a first-time 366 00:22:53.340 --> 00:22:58.710 visitor. And I'll just quote her headline: AI fails miserably at 367 00:22:58.710 --> 00:23:03.870 creating a Paris itinerary. She found that when even with the 368 00:23:03.870 --> 00:23:07.860 simplest of prompts, trying to get it to the right answer, 369 00:23:08.580 --> 00:23:11.340 there were factual errors. There were navigation errors, there 370 00:23:11.340 --> 00:23:15.030 was poor sequencing of sites, there is inefficient use of 371 00:23:15.030 --> 00:23:18.900 time. The results she said were actually worse than I expected, 372 00:23:19.140 --> 00:23:23.640 to say the least. And this is fascinating, I think, because I 373 00:23:23.640 --> 00:23:25.410 have a hard time sometimes telling it when a restaurant 374 00:23:25.410 --> 00:23:29.400 down the streets open, I get conflicting advice from the 375 00:23:29.400 --> 00:23:33.120 various sites that purport to have reliable information about 376 00:23:33.120 --> 00:23:38.220 this. Imagine if you automate this now at scale. So it's just 377 00:23:38.220 --> 00:23:41.580 fascinating ChatGPT might be good for some things, but if 378 00:23:41.580 --> 00:23:44.880 you're visiting a foreign city, I would not trust it. 379 00:23:46.260 --> 00:23:48.180 Anna Delaney: Fascinating indeed, I thought surprising 380 00:23:48.180 --> 00:23:53.280 actually. Thought it'd be better at this point. Troy, what have 381 00:23:53.280 --> 00:23:53.850 you got to share? 382 00:23:55.140 --> 00:23:58.800 Troy Leach: Well, you know, I don't have much. I do think 383 00:23:58.800 --> 00:24:03.450 we're at this point, we saw autocorrect, you know, change 384 00:24:03.450 --> 00:24:06.750 your license and have a lot of funny stories. I do like the 385 00:24:06.810 --> 00:24:12.510 prompting and if I'm ever having a down moment, I like to ask 386 00:24:13.140 --> 00:24:17.430 ChatGPT to respond to me with an answer in a certain voice, 387 00:24:17.430 --> 00:24:23.820 whether it's a 1920s mobster or some famous celebrity. I think 388 00:24:23.850 --> 00:24:28.920 those are some of the best. It actually is phenomenal how that 389 00:24:28.920 --> 00:24:33.270 actually works out. But what I did and this really just took a 390 00:24:33.270 --> 00:24:38.490 couple of moments is I said you know what we're missing somebody 391 00:24:38.490 --> 00:24:44.430 on this and so I thought that I could while he's not here, 392 00:24:44.670 --> 00:24:50.790 possibly replicate Tom Field's voice and literally within 10 393 00:24:50.790 --> 00:24:55.230 minutes of effort, and I'm not very good at this yet. I thought 394 00:24:55.230 --> 00:25:01.620 I did a pretty darn good job. I'll give you just a sample of 395 00:25:01.620 --> 00:25:05.040 what the AI thinks, Tom, should sound like. 396 00:25:09.270 --> 00:25:11.550 Tom Field: I really appreciate the chance to come on and talk 397 00:25:11.550 --> 00:25:17.430 with you today. August 2, 2023, Anna, Mike, Matt, and especially 398 00:25:17.430 --> 00:25:20.730 my favorite guest of all time, Troy Leach. Really sorry, I 399 00:25:20.730 --> 00:25:23.730 missed this opportunity. For Information Security Media 400 00:25:23.730 --> 00:25:24.960 Group, I'm Tom Field. 401 00:25:25.110 --> 00:25:30.660 Troy Leach: So I'm actually flattered, you know, because Tom 402 00:25:30.660 --> 00:25:35.820 has so many guests over the years. So that I appreciate 403 00:25:35.820 --> 00:25:36.150 that. 404 00:25:36.330 --> 00:25:37.830 Mathew Schwartz: Doesn't say that sort of thing lightly, 405 00:25:37.830 --> 00:25:39.240 Troy, that's an alarm. 406 00:25:40.980 --> 00:25:43.500 Troy Leach: But that's where we're at is you know that 407 00:25:43.500 --> 00:25:47.490 spoofing is. So I think there's a lot of fun that we can have 408 00:25:47.490 --> 00:25:49.590 with it as long as it's in the right hands. 409 00:25:49.860 --> 00:25:52.680 Anna Delaney: Well, Tom would love that. I would just end with 410 00:25:52.680 --> 00:25:57.180 my ChatGPT story. So I asked ChatGPT, tell me about Troy 411 00:25:57.180 --> 00:26:01.590 Leach of the Cloud Security Alliance. And, you know, not 412 00:26:01.590 --> 00:26:05.670 completely correct as of my last update, it said, September 2021, 413 00:26:05.670 --> 00:26:09.000 Troy Leach was the chief technology officer of the Cloud 414 00:26:09.000 --> 00:26:12.600 Security Alliance. So it hasn't quite updated that. But, it goes 415 00:26:12.600 --> 00:26:15.450 on to say Troy Leach played a crucial role in shaping the 416 00:26:15.450 --> 00:26:20.010 CSA's mission and initiatives. And my favorite line. Under his 417 00:26:20.010 --> 00:26:23.520 leadership, the CSA has worked to create a safer and more 418 00:26:23.520 --> 00:26:27.480 secure cloud computing environment for businesses and 419 00:26:27.480 --> 00:26:29.970 individuals alike. So congratulations. 420 00:26:30.330 --> 00:26:33.360 Troy Leach: That's pretty impressive. Because yesterday 421 00:26:33.360 --> 00:26:40.530 was my one-year anniversary. And it's a 15-year-old company. So I 422 00:26:40.530 --> 00:26:45.570 think, AI may have given me a lot more credit than is 423 00:26:45.570 --> 00:26:50.880 deserved. But I think that's where we're at is I give it a, 424 00:26:51.450 --> 00:26:55.170 you know, I treat it as a summer intern. You know, you can take 425 00:26:55.170 --> 00:26:58.350 it, it will do the tasks that you're asked, but sometimes you 426 00:26:58.350 --> 00:27:01.860 probably double check that work before you send it and submit it 427 00:27:01.860 --> 00:27:02.640 to the boss. 428 00:27:03.300 --> 00:27:06.060 Anna Delaney: Wise words, well. We celebrate you, Troy, and this 429 00:27:06.060 --> 00:27:09.150 has been an immense pleasure. Thank you so much for joining us 430 00:27:09.150 --> 00:27:10.500 and for your rich insight. 431 00:27:11.760 --> 00:27:12.870 Troy Leach: Thank you very much. 432 00:27:13.020 --> 00:27:13.830 Mathew Schwartz: Thanks, Troy. 433 00:27:14.220 --> 00:27:14.790 Michael Novinson: Thank you, Troy. 434 00:27:15.720 --> 00:27:16.470 Troy Leach: Thank you, gentlemen. 435 00:27:16.710 --> 00:27:18.990 Anna Delaney: Thank you so much for watching. Until next time.