Anna Delaney: Hello, I'm Anna Delaney with Information Security Media Group. I'm very pleased to be joined by Valerie Abend, global cyber strategy lead at Accenture Security. Great to see you, Valerie.

Valerie Abend: It's great to see you, Anna. I appreciate it.

Anna Delaney: So let's talk about a few interesting trends to start us off: next-generation hacking techniques. There's an interesting trend at the moment: the number of ransoms paid by organizations is on the decline, which is positive news. But we know that the criminals are always innovating. How are they responding to this? And what do you see in terms of how their techniques, their tactics are changing, adapting?

Valerie Abend: I think it's really smart to think about that, because they're certainly not going to stop innovating to accomplish their objectives, right? And in difficult times - and let's be honest, it's not an easy time; a lot of people are feeling distressed right now, have maybe lost their job, or are in fear of losing their job - that is a really important moment for social engineering. And the bad guys know it. And so the rates of fraud are actually going to increase, because it's easier to prey on these kinds of individuals at this moment.
So I fully suspect that they are changing their tactics specifically to accomplish their financial objectives by going after people through their fear.

Anna Delaney: There's been lots of discussion at the moment about generative AI tools, such as ChatGPT, and how the criminals can target large enterprises, as well as small enterprises and small organizations, with those tools. How do you see them potentially weaponizing these tools?

Valerie Abend: Well, generative AI is the topic all of the time. I actually just recently had a conversation with a large product company that also works in transportation. And we were really thinking through how do you leverage generative AI and the power here to enable your business, but do it in a responsible way - bringing together the chief data officer and the chief information security officer for a responsibility framework, and thinking through this together so that you can leverage these capabilities. But the bad guys are moving faster. And that's the balance of the challenge there. Because while we sit down and think about, "Hey, let's sandbox this, let's kind of dip our toe," they don't have to wait for any responsibility framework to actually go after and use these capabilities.
And so there's no question in my mind that they are thinking of new ways of leveraging it, some of which haven't even been thought of by our teams yet.

Anna Delaney: So where are the opportunities for large enterprises on the defender side? Can you share some use cases?

Valerie Abend: Absolutely. I think the one use case everybody's talking about is how to help security operations centers actually leverage these capabilities, because there's so much data coming in. I actually think one of the interesting use cases might be around identity and access management, both internally for your own employees as well as externally with your customers. How do we use large datasets and these learning models to solve some of the point-in-time access that's really important to run your business from a service availability perspective, but make sure it's only for a limited period of time, and that it's revoked in a timely fashion? I think there's a lot of opportunity that isn't yet being explored in identity and access management that we can look to.

Anna Delaney: What do you recommend enterprises do right now, in terms of adopting these tools? Where do they start?
Valerie Abend: Yeah, I think it's important that you actually have a framework, that you're very thoughtful about it, that you're meeting with the business, coming up with those business use cases, and working in responsible ways - but not waiting. I just think that when we see technology innovation, it's really important to harness its potential. And this is a great opportunity for all of us.

Anna Delaney: So let's talk about regulatory changes. There's lots happening in the landscape, and it can be difficult for multinationals to know what applies to them, what responsibilities they have, how to ingest these changes. What challenges do you see them face at the moment when it comes to regulation?

Valerie Abend: So regulation is sort of an interesting one. I'm actually a former regulator myself. And I get a lot of questions from our clients, particularly in the governance space - not just from chief information security officers and their teams, but all the way up to the C-suite and the board - specifically about this issue. And honestly, a lot of them want to see regulatory harmonization. They're seeing so much complexity, it feels like it's taking away resources and attention from doing the job that they need to do. But there's a balance here.
And we're talking about, in many cases, critical infrastructure - 90% or more of which is owned and operated by the private sector, and most of which actually isn't regulated. I think there's a lot of attention and focus on what is. But someone once said to me, when I was a regulator, "Valerie, why can't you just use what we do? Why do you have to do something specific - in my case, for the banking industry? Isn't it a stack on stack on stack?" I said, "No, the application monitoring and the controls in a nuclear power plant aren't the same as for your wholesale payment system." And it is important that while you might have a harmonized baseline approach on certain aspects, you then take tailored industry approaches that are specific to the risks of those businesses. And so there really is a very strong place here for industry-tailored regulation.

Anna Delaney: Are there other lessons learned from that time as a regulator for organizations now, in terms of harmonizing those efforts?

Valerie Abend: There are a number of them. The first is, I think a lot of the private sector wants to come to the table, in a conversation with the regulators. The challenge is that different regimes around the world can handle that conversation, even legally, in different ways.
Some allow for open conversation, some don't. And you have to be sensitive to that. But when you approach that conversation, it's important to understand that the regulator is there not just with a job to do, but that job is written in statute. And they have an obligation; they are actually overseen, in many cases, by a legislative branch to fulfill that mission. And it's important that they actually show that they're fulfilling that obligation. And so helping them understand how best to fulfill that obligation, and being a partner at the table in a way that the legal construct allows, is the most productive way to work on that issue.

Anna Delaney: Well, let's look at the SEC and the changes they're implementing. They're imposing stricter rules about cyber expertise at the board level. What are the changes we need to know about?

Valerie Abend: So, many folks have seen the SEC weigh in at different points in time - 2011, 2018.
And then they published an 125 00:06:47.310 --> 00:06:53.130 interim for comment, proposed guidance, which would increase 126 00:06:53.130 --> 00:06:56.100 the amount of transparency and reporting that boards would have 127 00:06:56.100 --> 00:06:58.650 to do and companies would have to do around how they're 128 00:06:58.650 --> 00:07:02.490 handling cybersecurity, both incidents as well as various 129 00:07:02.490 --> 00:07:05.670 governance issues, also including expertise on the 130 00:07:05.670 --> 00:07:09.120 board. And I think this is really important, I actually 131 00:07:09.120 --> 00:07:12.060 think it's important to create transparency, unintended 132 00:07:12.060 --> 00:07:14.520 consequences have to be looked at. That's why you have a 133 00:07:14.520 --> 00:07:18.690 comment period. But I think a lot of companies are struggling 134 00:07:18.930 --> 00:07:22.680 with what constitutes cyber expertise on a board. I have a 135 00:07:22.680 --> 00:07:25.440 lot of colleagues who are chief information security officers - 136 00:07:25.440 --> 00:07:28.440 very bright, very good at operations. But they don't know 137 00:07:28.440 --> 00:07:31.920 governance. They're not really understanding how the governance 138 00:07:31.920 --> 00:07:36.150 model works. And by the way, if they are on boards, they have to 139 00:07:36.150 --> 00:07:39.330 expand beyond just cyber, they have to look at all the other 140 00:07:39.330 --> 00:07:42.630 aspects of that fiduciary responsibility. And then I have 141 00:07:42.630 --> 00:07:45.330 other friends, colleagues who are board members on big 142 00:07:45.360 --> 00:07:48.210 publicly held companies. And they are desperate for that 143 00:07:48.210 --> 00:07:50.640 knowledge, but don't want to appear as though they don't know 144 00:07:50.640 --> 00:07:54.420 what they're talking about. 
And so I really hope that we kind of have constructive conversations, and that we bring these groups closer together to actually enable that better. I think a lot of people are armored up in that room, maybe too curated in how they're having that conversation. And we can do better.

Anna Delaney: Now, I know you work with government to help bridge the gap. So tell us more about that.

Valerie Abend: Well, I think we're working with government, with key leaders across the private sector. And that's a really important place for a large global enterprise like Accenture, with the footprint that we have. Because when we make a change, when we lead with that change, we're not just credible, but we have real practical hands-on knowledge about why it works that way. So it's everything from the analyst all the way to the boardroom. I think a lot of times I see fancy PowerPoints, but that's not going to make the change. You really have to meet people where they're at, draw them into the conversation and give them practical approaches. And that's the gap that we're bridging between the private sector, critical infrastructure, policymakers and government leaders - to make that conversation better.
Anna Delaney: How do you think these changes are going to impact the industry as a whole?

Valerie Abend: Well, I think it's a really exciting time. You know, there are always going to be challenges, but I like to talk about what you can do about those challenges. I think, as an industry as a whole, honestly, there are going to be some changes around how we build things securely from the start. And that's where everyone wants to get: how do we not just sort of hold people accountable, but actually get them excited and knowledgeable about doing it? One thing that I've noticed is that we've not done a great job - and need to do a better job - of actually explaining what the tailored and specific accountabilities are for every single member of the C-suite. Think about it: if you're the chief marketing officer, digital trust in your brand is really important. So how do we help you understand what that accountability looks like from a cyber perspective? And then empower and enable not just you but your entire team, so that when you're held accountable as a chief marketing officer on that, it's actually real for you - and then taking that and moving to the chief human risk officer, moving to the chief financial officer, etc.
So I'm really 187 00:10:05.310 --> 00:10:06.570 excited about that change. 188 00:10:07.440 --> 00:10:09.810 Anna Delaney: Well, I know that Accenture has news to announce 189 00:10:09.810 --> 00:10:12.150 this week. Can you spill the beans? 190 00:10:12.180 --> 00:10:14.880 Valerie Abend: Yeah, it's a big RSA for us. And I'm really 191 00:10:14.880 --> 00:10:19.170 excited about it. At RSA, we will be announcing a new 192 00:10:19.170 --> 00:10:23.730 partnership with Google. And that partnership will actually 193 00:10:23.730 --> 00:10:27.090 help us leverage Google's chronicle capability, which is 194 00:10:27.090 --> 00:10:30.000 all about data analytics. And I think they're known for that 195 00:10:30.000 --> 00:10:34.620 data analytics capability to further empower our managed 196 00:10:34.620 --> 00:10:37.830 detection and response capability. And I'm really 197 00:10:37.830 --> 00:10:41.760 excited about that, the future of using large language models 198 00:10:41.880 --> 00:10:46.980 to really expand and make the speed of what we do in that MxDR 199 00:10:46.980 --> 00:10:51.390 capability come alive for all of our clients in a really big way. 200 00:10:51.690 --> 00:10:54.660 Additionally, I think most people know that Google acquired 201 00:10:54.660 --> 00:10:58.650 Mandiant, and Accenture is partnering with Mandiant on 202 00:10:58.650 --> 00:11:01.440 crisis management response, incident response and threat 203 00:11:01.440 --> 00:11:03.930 intelligence. And I think the power of all this coming 204 00:11:03.930 --> 00:11:05.880 together, it's going to really be a game changer for our 205 00:11:05.880 --> 00:11:10.560 clients. The second thing we're announcing is in terms of our 206 00:11:10.560 --> 00:11:13.710 partnership with Palo Alto Networks. And this is exciting 207 00:11:13.710 --> 00:11:18.210 as well, because so much of what has changed is really that 208 00:11:18.210 --> 00:11:22.920 hybrid remote work environment. 
And you're talking about so many devices all throughout the world. And it's really hard to understand how to secure all of that. People talk about SASE and securing these edge capabilities. And we're really bringing new intellectual capability to that around diagnostics - really cutting through the noise of all of that signal to identify where your weakest points are, and what you need to do to make sure you're focused on closing those vulnerabilities fastest.

Anna Delaney: All this news certainly reflects the theme of the event, 'Stronger Together'. So Valerie, this has been excellent. Thank you so much for sharing your expertise.

Valerie Abend: Thank you for the opportunity, and I really enjoyed it. Thanks.

Anna Delaney: Thank you so much for watching. For ISMG, I'm Anna Delaney.