WEBVTT 1 00:00:07.080 --> 00:00:09.480 Anna Delaney: Welcome to the ISMG Editors' Panel. I'm Anna 2 00:00:09.480 --> 00:00:12.360 Delaney, and today we're discussing the future of U.S. 3 00:00:12.360 --> 00:00:16.650 federal cybersecurity, privacy legislation, AI integration and 4 00:00:16.650 --> 00:00:19.830 recent developments from CISA. The question is how will these 5 00:00:19.830 --> 00:00:23.220 initiatives stand strong in the ever-changing U.S. political 6 00:00:23.220 --> 00:00:26.040 climate? Well, joining us to provide insights into the 7 00:00:26.040 --> 00:00:28.770 current state of federal security is Grant Schneider, 8 00:00:28.830 --> 00:00:33.690 senior director of cybersecurity services at Venable LLP. Grant, 9 00:00:33.720 --> 00:00:35.700 always a pleasure to see you. Great to have you back. 10 00:00:36.260 --> 00:00:39.530 Grant Schneider: Anna, it is a pleasure to be here. And privacy 11 00:00:39.530 --> 00:00:44.630 sounds so much cooler when it's in the U.K. than in the 12 00:00:44.630 --> 00:00:46.850 U.S. So love the way you pronounce it. 13 00:00:47.410 --> 00:00:52.420 Anna Delaney: Bring this right. And completing the team are Tom 14 00:00:52.420 --> 00:00:55.300 Field, senior vice president of editorial, and Mathew Schwartz, 15 00:00:55.330 --> 00:00:57.760 executive editor of DataBreachToday in Europe. Great 16 00:00:57.760 --> 00:00:58.300 to see you. 17 00:00:59.470 --> 00:01:01.450 Tom Field: Thanks for having us. If you like bigger recruiting 18 00:01:01.450 --> 00:01:02.590 areas, Grant as well. 19 00:01:03.980 --> 00:01:06.770 Anna Delaney: Grant, there you go. So, Grant, where are you? 20 00:01:06.770 --> 00:01:09.050 You're off a mountain, I think, or a few mountains. 21 00:01:09.980 --> 00:01:14.420 Grant Schneider: I am. I'm in Utah. I recently did a ski 22 00:01:14.420 --> 00:01:19.160 vacation in Utah.
And, you know, as I was mentioning, I think 23 00:01:19.160 --> 00:01:23.300 it's, you know, not sure what our vacation opportunities are 24 00:01:23.300 --> 00:01:26.810 going to be like between now and the elections this fall 25 00:01:26.810 --> 00:01:30.950 here in the U.S. I think it's going to be busy. So Congress is 26 00:01:30.980 --> 00:01:33.470 starting to do a few things, which is interesting, and the 27 00:01:33.470 --> 00:01:37.130 administration keeps doing things. So busy times ahead. 28 00:01:37.420 --> 00:01:41.080 Anna Delaney: Yeah, take those breathers while you can. Tom, 30 00:01:41.080 --> 00:01:41.920 you got company. 29 00:01:41.000 --> 00:01:44.548 Tom Field: There's a story here. I was supposed to be in New York 31 00:01:44.613 --> 00:01:48.687 last Thursday. And so of course, I planned to fly from my home in 32 00:01:48.753 --> 00:01:52.630 Maine down to New York. But weather came in, looked bad. So 33 00:01:52.695 --> 00:01:56.638 I decided I'd go a day early. So I show up at the airport on 34 00:01:56.704 --> 00:02:00.515 Wednesday, and the weather has canceled that flight. 35 00:02:00.580 --> 00:02:04.194 I ended up having to rent a van. And I took every other 36 00:02:04.260 --> 00:02:08.334 disenfranchised passenger from the airport and drove them down 37 00:02:08.399 --> 00:02:12.670 to Boston to meet our connecting flights. So I had a firefighter, a 38 00:02:12.736 --> 00:02:16.613 scientist, and a salesperson, and a colleague from the U.K., 39 00:02:16.679 --> 00:02:20.095 and we all filled this van and actually had a pretty 40 00:02:20.161 --> 00:02:21.410 entertaining drive. 41 00:02:23.330 --> 00:02:26.990 Anna Delaney: Can't top that. And did you make some money that 42 00:02:26.990 --> 00:02:28.670 day? Did you collect some tips? 43 00:02:28.000 --> 00:02:32.530 Tom Field: I would say no, I did not make any money.
Nor did I 44 00:02:32.530 --> 00:02:35.410 make the most annoying sound in the world as Jim Carrey did in 45 00:02:35.410 --> 00:02:38.800 the movie Dumb and Dumber. But if you have to drive for three 46 00:02:38.800 --> 00:02:40.330 hours, that's a pleasant way to do it. 47 00:02:41.450 --> 00:02:44.420 Anna Delaney: That's your good deed, definitely, for the month; 48 00:02:44.900 --> 00:02:46.940 well done. Mat, top that. 49 00:02:48.140 --> 00:02:50.120 Mathew Schwartz: Yeah, no, there's no heartwarming anything 50 00:02:50.120 --> 00:02:53.690 here, Anna. This is a cold, brutal, windy day in Glasgow, 51 00:02:53.960 --> 00:02:57.320 where I was tempted to warm my frozen heart with a little 52 00:02:58.130 --> 00:03:03.230 street graffiti photography. So just a little bit of a bicycle 53 00:03:03.230 --> 00:03:06.800 there, you can see off in the distance. 54 00:03:10.620 --> 00:03:13.470 Anna Delaney: You find art in every corner of the world. So 55 00:03:13.800 --> 00:03:16.830 love it. Love the colors. Well, I'm sharing a taste of Bruges, the 56 00:03:16.830 --> 00:03:19.950 medieval treasure in Belgium, often called the Venice of the 57 00:03:19.950 --> 00:03:23.370 North. It's famous, of course, for its beautiful canals and 58 00:03:23.370 --> 00:03:26.250 architecture, as you can see, but did you know it also has 59 00:03:26.280 --> 00:03:34.560 museums solely dedicated to beer and french fries? Wow, indeed. 60 00:03:34.590 --> 00:03:37.530 So Grant, you know what's coming your way. Tom, why don't you 61 00:03:37.680 --> 00:03:38.370 take it away? 62 00:03:39.120 --> 00:03:41.070 Tom Field: Well, I want to make plans for vacation over there. 63 00:03:41.070 --> 00:03:45.000 But that being the case, Grant, you mentioned it's going to be a 64 00:03:45.000 --> 00:03:49.500 busy year. Now already, we can see that cybersecurity is no 65 00:03:49.500 --> 00:03:54.000 longer a bipartisan issue.
It causes discord wherever it's 66 00:03:54.000 --> 00:03:57.810 raised; now it's become very partisan. How do you feel that 67 00:03:57.810 --> 00:04:02.100 the President's budget and cybersecurity agenda will be 68 00:04:02.100 --> 00:04:04.890 received by this Congress in an election year? 69 00:04:06.880 --> 00:04:10.300 Grant Schneider: Oh, I am hopeful that cybersecurity is 70 00:04:10.300 --> 00:04:16.810 maybe still bipartisan, I think. However, I definitely agree that 71 00:04:16.810 --> 00:04:20.590 we're shifting, you know, we're shifting away. I'm hoping we can 72 00:04:20.590 --> 00:04:24.700 keep the non-bipartisan part, if you will, kind of narrowly focused, 73 00:04:24.700 --> 00:04:29.110 because certainly disinformation has, I think, driven the 74 00:04:29.110 --> 00:04:33.010 non-bipartisan piece. Election security is definitely not a 75 00:04:33.010 --> 00:04:39.250 bipartisan thing. And both of those have kind of tainted CISA 76 00:04:39.250 --> 00:04:43.360 with Congress; and where CISA was very much, you know, getting 77 00:04:43.360 --> 00:04:46.930 more money in appropriations than the President had asked for 78 00:04:46.930 --> 00:04:50.560 over the last several years - significantly more money 79 00:04:50.560 --> 00:04:55.030 appropriated by Congress - you know, I don't think that will be 80 00:04:55.030 --> 00:04:57.700 the case, or I would be surprised if that were the case, this year. 81 00:04:57.700 --> 00:05:01.540 So the President's budget just came out. It's a healthy ask of 82 00:05:01.570 --> 00:05:07.360 $3 billion for CISA, and, not counting DoD, $13 83 00:05:07.360 --> 00:05:10.090 billion in cyber across the board, which is a billion-dollar 84 00:05:10.090 --> 00:05:14.260 increase. A billion-dollar increase really isn't that big 85 00:05:14.260 --> 00:05:18.070 when you talk about a $7 trillion budget that came out.
86 00:05:18.370 --> 00:05:22.420 And I think for CISA, while a $3 billion request, that's actually 87 00:05:22.420 --> 00:05:26.110 less money than was appropriated in the past, because of 88 00:05:26.110 --> 00:05:30.280 those additional increases from Congress. And, again, I think 89 00:05:30.550 --> 00:05:33.550 it's less likely that that's going to happen this year, to 90 00:05:33.550 --> 00:05:37.300 exactly your point, because, again, I'm hopeful that we can 91 00:05:37.300 --> 00:05:39.670 still call cybersecurity bipartisan, because we need it 92 00:05:39.670 --> 00:05:44.230 to be. But the things that start getting added into that 93 00:05:44.230 --> 00:05:48.820 ecosystem are getting somewhat partisan. And, you know, here in 94 00:05:48.820 --> 00:05:52.000 the U.S., we are adding partisan politics to almost everything at 95 00:05:52.000 --> 00:05:52.480 this point. 96 00:05:52.960 --> 00:05:57.010 Tom Field: Exactly. Well, along those same lines, Grant, we 97 00:05:57.010 --> 00:06:01.270 haven't been able to pass any meaningful federal privacy 98 00:06:01.510 --> 00:06:05.860 legislation over the past decade. What hope is there for 99 00:06:05.860 --> 00:06:06.700 AI now? 100 00:06:09.210 --> 00:06:11.820 Grant Schneider: It's going to be hard. You know, there 101 00:06:12.210 --> 00:06:15.720 are/have been tons and tons of hearings on the Hill on AI. I 102 00:06:15.720 --> 00:06:21.540 think AI has crept into all of our ... everyone's personal life 103 00:06:21.540 --> 00:06:25.320 and all of our professional lives, I would say, as well. It 104 00:06:25.320 --> 00:06:30.480 is the new buzzword in this space, and it's what everyone is 105 00:06:30.480 --> 00:06:34.080 talking about. And so Congress is talking about it as well. But 106 00:06:34.230 --> 00:06:38.220 I think it's going to be hard because what is AI is still 107 00:06:38.220 --> 00:06:41.430 super, super nuanced, right?
If each of us 108 00:06:41.430 --> 00:06:44.880 were to go into a closed room and describe artificial intelligence 109 00:06:45.720 --> 00:06:49.380 to someone or write it down, we'd probably come up with four - 110 00:06:49.620 --> 00:06:52.350 you know, there'd be some Venn diagram overlap, but 111 00:06:52.350 --> 00:06:55.440 probably four different perspectives and points of view, 112 00:06:55.650 --> 00:06:58.560 because it is so personal, where you're coming from and 113 00:06:58.560 --> 00:07:01.140 where you kind of sit. So I think it's going to be a 114 00:07:01.140 --> 00:07:04.350 challenge for Congress to really be able to coalesce. I think, you 115 00:07:04.350 --> 00:07:07.350 know, on the upside, there's strong agreement that they need 116 00:07:07.350 --> 00:07:10.950 to do something. But, you know, to your analogy on privacy 117 00:07:10.950 --> 00:07:13.320 legislation, there's been strong agreement to do something around 118 00:07:13.320 --> 00:07:18.960 privacy for years and years. It's just what to do and how to do 119 00:07:18.960 --> 00:07:21.630 it, and what the incentive structure is going to look like, 120 00:07:21.750 --> 00:07:24.090 and what the regulatory framework is going to look like - 121 00:07:24.450 --> 00:07:28.050 there's just not consensus on that. And I think we're much 122 00:07:28.050 --> 00:07:32.520 further away on AI than we are on privacy right now. 123 00:07:33.230 --> 00:07:36.149 Tom Field: We may get a hint of things to come even today. As 124 00:07:36.207 --> 00:07:39.886 we're sitting here speaking, I know the House has just passed a 125 00:07:39.944 --> 00:07:43.622 resolution banning TikTok in the U.S., and this is going to the 126 00:07:43.681 --> 00:07:47.126 Senate; the Senate may not agree. It may be a sign of things to 127 00:07:47.184 --> 00:07:49.520 come. Meanwhile, I'll pass it on to Mat. 128 00:07:49.000 --> 00:07:54.670 Mathew Schwartz: Yes. Thank you. Well, great to have you back.
I 129 00:07:54.670 --> 00:07:59.170 know that we've spoken before about the secure software 130 00:07:59.200 --> 00:08:03.460 development attestation form - these governments and their 131 00:08:03.460 --> 00:08:08.740 long names for things. But back to CISA. CISA on Monday 132 00:08:08.740 --> 00:08:14.080 released this form for the first time, which it bills as taking a 133 00:08:14.080 --> 00:08:18.460 major step in the implementation of its requirement that 134 00:08:18.460 --> 00:08:23.800 producers of software used by the federal government attest to 135 00:08:23.800 --> 00:08:27.430 the adoption of secure development practices. So with 136 00:08:27.430 --> 00:08:32.230 this attestation form, do you think this is a good step in the 137 00:08:32.230 --> 00:08:36.400 right direction on the road to more secure development 138 00:08:36.400 --> 00:08:40.180 practices? And do you think it's going to deliver? 139 00:08:41.800 --> 00:08:44.573 Grant Schneider: So I would say this form has already gotten a 140 00:08:44.635 --> 00:08:48.457 lot of attention from software developers, right? They gave us 141 00:08:48.518 --> 00:08:51.909 a draft of it about a year ago, they took some industry 142 00:08:51.970 --> 00:08:55.854 comments, they showed us another preview of it. And now we have 143 00:08:55.916 --> 00:08:59.737 the final form. So this is the form, it's out. This is the one 144 00:08:59.799 --> 00:09:03.498 that people need to attest to. And a couple of things that, 145 00:09:03.559 --> 00:09:07.196 you know, they did in this revision: they accepted some 146 00:09:07.258 --> 00:09:11.141 changes from industry.
The last version wanted it to be signed 147 00:09:11.203 --> 00:09:15.025 by the CEO - the chief executive officer - or the chief operating 148 00:09:15.086 --> 00:09:18.415 officer; this one allows for the CEO to designate 149 00:09:18.477 --> 00:09:21.990 someone, as long as they can still bind the company, to 150 00:09:22.052 --> 00:09:25.504 actually attest to the form, which I think is a positive 151 00:09:25.565 --> 00:09:29.264 step. It's going to make it easier for companies to get this 152 00:09:29.326 --> 00:09:33.024 form done and get it completed. That said, it's still 153 00:09:33.086 --> 00:09:35.736 going to bring a lot of attention inside of 154 00:09:35.798 --> 00:09:39.127 organizations. And I think that's going to be good for 155 00:09:39.188 --> 00:09:42.948 software development. I do think that this is going to have a 156 00:09:43.010 --> 00:09:46.462 positive effect because software developers' legal 157 00:09:46.524 --> 00:09:50.099 departments are really going to want to understand: are we 158 00:09:50.161 --> 00:09:53.797 actually compliant? Like, signing a form is easy, but are we 159 00:09:53.859 --> 00:09:57.496 actually compliant? Where's the body of evidence that we're 160 00:09:57.558 --> 00:10:01.133 going to present to the person who has to sign this? And I 161 00:10:01.194 --> 00:10:04.955 think the people that are going to have to sign this, whether 162 00:10:05.016 --> 00:10:08.592 it's the CEO or someone they designate, are going to ask to 163 00:10:08.653 --> 00:10:12.475 really understand where the organization is at.
You know, we've 164 00:10:12.537 --> 00:10:16.358 seen the criminal cases against the Uber CISO and 165 00:10:16.420 --> 00:10:20.242 the SolarWinds CISO; you know, clearly people in security 166 00:10:20.304 --> 00:10:24.125 positions and people that are going to sign this type of form 167 00:10:24.187 --> 00:10:28.132 understand that it's not just a corporate liability that they're 168 00:10:28.194 --> 00:10:32.139 signing up for; it's actually a personal liability. And in fact, 169 00:10:32.200 --> 00:10:35.776 this form basically says explicitly that the Department of 170 00:10:35.837 --> 00:10:39.536 Justice could use the False Claims Act to come after someone 171 00:10:39.597 --> 00:10:43.481 who, you know, is signing this form in a way that's fraudulent. 172 00:10:43.543 --> 00:10:47.364 And the False Claims Act has a lot of potential penalties - big 173 00:10:47.426 --> 00:10:50.570 financial penalties on companies - but it also has a 174 00:10:50.631 --> 00:10:54.515 criminal statute; like, this is a criminal statute that could be 175 00:10:54.576 --> 00:10:58.152 used. So I think it's going to get a lot of attention. You 176 00:10:58.213 --> 00:11:01.542 know, this is tied to NIST's Secure Software Development 177 00:11:01.604 --> 00:11:05.487 Framework. It's not explicitly the framework, but it is tied to 178 00:11:05.549 --> 00:11:09.371 that. And I think, you know, the administration has done a good 179 00:11:09.432 --> 00:11:12.884 job of identifying some core tenets and core elements of 180 00:11:12.946 --> 00:11:16.583 things software producers should be focused on. And I think 181 00:11:16.644 --> 00:11:20.343 people are going to get behind it. I hope to see consistency 182 00:11:20.405 --> 00:11:24.288 with this, right? We don't need four other versions of this, 183 00:11:24.350 --> 00:11:27.986 whether from the U.S. or from other countries.
So I'm hoping 184 00:11:28.048 --> 00:11:31.808 we can get some consistency and coalesce around this now that 185 00:11:31.870 --> 00:11:32.980 it's actually out. 186 00:11:34.470 --> 00:11:36.630 Mathew Schwartz: Fantastic. Anything that lights a fire with 187 00:11:36.630 --> 00:11:41.100 secure development is wonderful. It's overdue, like meaningful 188 00:11:41.100 --> 00:11:45.120 privacy legislation in the U.S. Another thing that we have been 189 00:11:45.120 --> 00:11:48.390 working toward for a long time is threat intelligence 190 00:11:48.420 --> 00:11:52.680 sharing, and CISA, I know, has a really big mandate when it comes 191 00:11:52.680 --> 00:11:55.380 to all the different areas of cybersecurity it's meant to 192 00:11:55.380 --> 00:11:59.100 meaningfully influence. But there were some hearings last 193 00:11:59.100 --> 00:12:03.120 month about CISA's Joint Cyber Defense Collaborative, and 194 00:12:03.390 --> 00:12:09.030 multiple witnesses testifying said that it's not 195 00:12:09.030 --> 00:12:12.720 really doing what they would have hoped, from an information 196 00:12:12.720 --> 00:12:15.300 sharing standpoint and public-private-wise. And then 197 00:12:15.330 --> 00:12:18.510 recently, also, there was a GAO watchdog report noting that 198 00:12:18.780 --> 00:12:23.370 CISA needs more people, especially with OT skills and 199 00:12:23.370 --> 00:12:28.740 expertise. And so - I think I can call you a Beltway 200 00:12:28.740 --> 00:12:33.360 insider, certainly more inside than I am - is this a normal 201 00:12:33.360 --> 00:12:36.780 course of events, when you have an organization like CISA that's 202 00:12:36.810 --> 00:12:41.190 gotten all this additional responsibility? It's still 203 00:12:41.880 --> 00:12:46.170 possibly on its shakedown cruise. Are these normal kinds of 204 00:12:46.170 --> 00:12:50.040 growing pains that we should expect to be hearing about? Or is it 205 00:12:50.040 --> 00:12:51.570 possibly more?
206 00:12:51.000 --> 00:12:53.999 Grant Schneider: So a couple of thoughts on that - one, to your 208 00:12:54.064 --> 00:12:57.847 point, CISA has grown a ton, right? We just said it's a $3 209 00:12:57.912 --> 00:13:02.085 billion organization right now. Maybe three years ago, 210 00:13:02.150 --> 00:13:05.737 it was a billion-dollar organization in budget. So CISA 211 00:13:05.802 --> 00:13:09.389 has grown a ton. During that growth, and even before the 212 00:13:09.454 --> 00:13:13.106 growth in budget, CISA has taken on just an innumerable 213 00:13:13.171 --> 00:13:17.279 number of new requirements and new initiatives - JCDC being one 214 00:13:17.344 --> 00:13:21.387 of them - that started out pretty narrowly focused, right? 215 00:13:21.453 --> 00:13:25.431 It was kind of almost ... it came across to industry as, you 216 00:13:25.496 --> 00:13:29.734 know, we're going to have a club and some people get to be in the 217 00:13:29.800 --> 00:13:33.843 club, and we're going to share more information with them. And 218 00:13:33.908 --> 00:13:37.951 then it rapidly expanded for, I think, some pragmatic reasons, 219 00:13:38.016 --> 00:13:41.864 but also because you can't just have a club when you're in the 220 00:13:41.929 --> 00:13:45.711 government. And so what we've seen, though, is, I think, that the 221 00:13:45.776 --> 00:13:49.428 focus of something like JCDC that maybe started out more 222 00:13:49.493 --> 00:13:52.884 narrow has become very broad. And when we talk about 223 00:13:52.949 --> 00:13:57.057 information sharing - when I was in government, my question was, 224 00:13:57.123 --> 00:14:00.383 you know, it's such an easy buzzword: we need more 225 00:14:00.448 --> 00:14:04.361 information sharing, the world will be great if we just share 226 00:14:04.426 --> 00:14:07.817 more threat data. I always wanted to know, like, what 227 00:14:07.882 --> 00:14:12.121 information ... if you could get your hands on any information in 228 00:14:12.186 --> 00:14:15.903 the world, even if it doesn't exist, but you could have it, 229 00:14:15.968 --> 00:14:19.751 like, what would it be? How would you want to get it? What 230 00:14:19.816 --> 00:14:23.924 would the timeline need to be for it to be relevant to you? And I just 231 00:14:23.989 --> 00:14:27.837 think on information sharing, they've got to get a lot more 233 00:14:27.902 --> 00:14:31.749 granular about what they're sharing, when they're expecting 234 00:14:31.815 --> 00:14:35.727 to share it, and then who's going to do what with it, right? 235 00:14:35.792 --> 00:14:39.901 What are we expecting industry to do? What are we expecting the 236 00:14:39.966 --> 00:14:43.944 government to do? So I think there's a lot of just refinement 237 00:14:44.009 --> 00:14:47.726 on the information sharing piece. On the manpower issues, 238 00:14:47.791 --> 00:14:51.704 you know, unemployment is low right now. And it continues to 239 00:14:51.769 --> 00:14:55.942 be very low, and, you know, employees have a lot of choices 240 00:14:56.008 --> 00:15:00.116 in the job market. And CISA has their cyber pay. So they have a 241 00:15:00.181 --> 00:15:04.028 system where they can actually hire - the Department of 242 00:15:04.094 --> 00:15:08.202 Homeland Security actually has it, and CISA can participate. And 243 00:15:08.267 --> 00:15:12.049 so they have opportunities to pay more money, but it's not 244 00:15:12.114 --> 00:15:16.158 always just about money. There's also: does the talent exist? 245 00:15:16.223 --> 00:15:20.331 Right? Are the people out there? And, you know, not to get 246 00:15:20.396 --> 00:15:24.244 back to AI, but the President's budget has a big initiative 247 00:15:24.309 --> 00:15:28.156 about a surge in hiring of AI talent. I don't know where that 248 00:15:28.221 --> 00:15:32.199 talent is that they're going to hire from, right? And I think 249 00:15:32.264 --> 00:15:36.112 CISA runs into some of those same challenges. So yes, these 250 00:15:36.177 --> 00:15:40.220 are growing pains. They're not unexpected to me. That said, we 251 00:15:40.285 --> 00:15:44.068 need CISA to get there faster, just as a nation and for our 252 00:15:44.133 --> 00:15:47.785 ability to defend. So CISA is going to have to make some 253 00:15:47.850 --> 00:15:51.371 prioritization decisions continuously and decide where 254 00:15:51.436 --> 00:15:55.610 they're going to invest and which of these initiatives they're really 255 00:15:55.675 --> 00:15:59.718 going to drive home, because I don't think they 256 00:15:59.783 --> 00:16:01.740 can do everything all at once. 207 00:12:51.000 --> 00:14:24.780 Mathew Schwartz: High expectations, high need for delivery. Great, thank you; this 232 00:14:24.900 --> 00:16:10.950 is fascinating insight. I'm going to hand you over to Anna. 257 00:16:11.760 --> 00:16:14.190 Anna Delaney: Lots of education there. Thank you. So my 258 00:16:14.190 --> 00:16:17.520 questions revolve around FISMA reform - as if there weren't 259 00:16:17.520 --> 00:16:21.270 enough long words. The U.S. House Oversight and Accountability 260 00:16:21.270 --> 00:16:23.970 Committee passed the Federal Information Security 261 00:16:24.000 --> 00:16:27.990 Modernization Act of 2023 earlier this month, which 262 00:16:28.110 --> 00:16:31.530 essentially shows lawmakers' renewed efforts to update the 263 00:16:31.530 --> 00:16:34.680 main law governing cybersecurity in the federal government. 264 00:16:35.340 --> 00:16:38.070 Grant, what are your initial thoughts, takeaways on the Act 265 00:16:38.100 --> 00:16:39.270 passing out of Committee? 266 00:16:41.280 --> 00:16:44.970 Grant Schneider: Big step forward. Right?
The Congress has 267 00:16:44.970 --> 00:16:49.920 been working on a FISMA update for the last two sessions now - 268 00:16:50.310 --> 00:16:54.150 this session and the two previous, at the very least. I 269 00:16:54.150 --> 00:16:57.870 think there had been some holdups in language between 270 00:16:57.870 --> 00:17:00.000 what the House wanted, the Senate wanted, and what the 271 00:17:00.000 --> 00:17:04.860 administration wanted. My understanding is that the House 272 00:17:04.860 --> 00:17:11.310 really felt that they are in good agreement on the text 273 00:17:11.340 --> 00:17:15.660 between the House and the Senate. So the Homeland Security 274 00:17:16.170 --> 00:17:19.230 and Governmental Affairs Committee in the Senate was the other 275 00:17:19.230 --> 00:17:22.440 committee that has jurisdiction on FISMA and needs 276 00:17:22.440 --> 00:17:25.500 to push this forward. And I think Senator Peters is 277 00:17:25.500 --> 00:17:27.900 supportive - my understanding is he's supportive of the language 278 00:17:27.900 --> 00:17:30.780 that came through the House. So definitely a big step forward. 279 00:17:31.050 --> 00:17:34.830 I'm not sure what the path forward will be for it. But 280 00:17:34.830 --> 00:17:38.280 definitely, you know, a step forward and maybe some 281 00:17:38.280 --> 00:17:41.490 bipartisan opportunities on cybersecurity in getting FISMA out 282 00:17:41.490 --> 00:17:47.040 there, which would be helpful. And, you know, there's just a 283 00:17:47.040 --> 00:17:50.940 lot they need to add in FISMA. I think, as you mentioned, 2014 is 284 00:17:50.940 --> 00:17:54.810 when it was last updated. We've got a National Cyber Director, 285 00:17:54.900 --> 00:17:58.110 we've got CISA - we've got a lot of changes that really need 286 00:17:58.110 --> 00:18:01.410 to be incorporated, and a lot of that got reflected inside of 287 00:18:02.250 --> 00:18:03.210 that legislation.
288 00:18:04.260 --> 00:18:06.000 Anna Delaney: And there were a couple of interesting features, 289 00:18:06.030 --> 00:18:09.420 notably the inclusion of the role of the CISO at the Office 290 00:18:09.420 --> 00:18:12.990 of Management and Budget and assigning reporting duties for 291 00:18:12.990 --> 00:18:16.320 cyberattacks and incidents to agencies. So how do you see 292 00:18:16.320 --> 00:18:19.680 these provisions impacting federal cybersecurity practices? 293 00:18:21.120 --> 00:18:26.340 Grant Schneider: I think the recognition of the federal CISO 294 00:18:26.340 --> 00:18:32.160 role is great. I've got a decidedly biased opinion on this, but I 295 00:18:32.160 --> 00:18:36.630 think it's great. Elevating it to be a presidential appointment 296 00:18:37.050 --> 00:18:41.880 is really helpful. And it's helpful for that person inside 297 00:18:41.880 --> 00:18:46.530 of OMB, in working with the interagency and certainly 298 00:18:46.530 --> 00:18:49.380 the broader Executive Office of the President. So I think that's 299 00:18:49.380 --> 00:18:54.180 definitely a plus. The fact that they also went the step of, you 300 00:18:54.180 --> 00:18:57.780 know, codifying - or potentially, if this gets signed into law, 301 00:18:57.780 --> 00:19:01.620 would be codifying - the dual-hatted role that Chris DeRusha, 302 00:19:01.620 --> 00:19:05.010 the current federal CISO, has as being a part of the National 303 00:19:05.010 --> 00:19:08.370 Cyber Director - or the Office of the National Cyber Director - I 304 00:19:08.370 --> 00:19:11.490 think that's also great, because I think that, you know, having 305 00:19:11.490 --> 00:19:15.450 that connectivity between the Office of Management and Budget 306 00:19:15.450 --> 00:19:19.020 and the National Cyber Director on federal cybersecurity is 307 00:19:19.020 --> 00:19:23.430 really, really important. So I think that that's definitely, 308 00:19:23.460 --> 00:19:26.640 definitely going to be helpful.
And I think, you know, there's 309 00:19:26.640 --> 00:19:29.550 also a whole bunch of, like, inside baseball, if you will, 310 00:19:29.550 --> 00:19:33.150 really focused on inside-the-government roles and 311 00:19:33.150 --> 00:19:36.090 responsibilities. You know, there's a lot of collaboration 312 00:19:36.090 --> 00:19:39.690 and coordination in that bill right now. You know, they did 313 00:19:39.720 --> 00:19:43.260 assign some responsibilities to agencies, which is great. You 314 00:19:43.260 --> 00:19:48.180 know, if I have a concern, it's that, you know, in many areas, 315 00:19:48.180 --> 00:19:50.610 we haven't generated more clarity. We've sort of added 316 00:19:50.610 --> 00:19:54.630 more cooks to the kitchen. And cyber - it's a team sport. We 317 00:19:54.630 --> 00:19:56.640 need a lot of cooks in the kitchen, but someone's got to be 318 00:19:56.640 --> 00:20:00.720 the head chef. And I don't know that we've really gone far 319 00:20:00.720 --> 00:20:04.950 enough on who the head chef is in every single instance. And 320 00:20:04.950 --> 00:20:08.700 some of that doesn't necessarily need to be codified in law. But 321 00:20:08.700 --> 00:20:12.210 I do think it's something that needs to be figured out, perhaps 322 00:20:12.210 --> 00:20:15.870 by the administration via some other policy directive. 323 00:20:17.370 --> 00:20:19.830 Anna Delaney: Excellent. Grant, lots of useful takeaways there. 324 00:20:19.830 --> 00:20:23.100 So thank you very much. Well, before I wrap, I have one final 325 00:20:23.100 --> 00:20:26.220 question for you all, just for fun. If you could attend a 326 00:20:26.220 --> 00:20:30.000 cybersecurity-themed costume party, of which there are many, I 327 00:20:30.000 --> 00:20:33.630 know, what outfit would you wear to represent your favorite 328 00:20:33.630 --> 00:20:36.870 cybersecurity concept or technology? And Tom, you're in 329 00:20:36.870 --> 00:20:37.140 there.
330 00:20:38.610 --> 00:20:42.480 Tom Field: Going back to an oldie but a goodie: Zero, my 331 00:20:42.480 --> 00:20:46.050 hero. And we're going to represent - what else - zero trust. 332 00:20:46.260 --> 00:20:50.010 Anna Delaney: Excellent. Mat, what are you going as? 333 00:20:50.760 --> 00:20:53.490 Mathew Schwartz: Like a massive nerd, because I'd dress up as a 334 00:20:53.490 --> 00:20:58.560 hardware security key. Probably a YubiKey - although, as the 335 00:20:58.560 --> 00:21:01.800 BBC says, other options are available. Just because 336 00:21:02.010 --> 00:21:07.080 two-factor authentication is so essential, and hardware security 337 00:21:07.080 --> 00:21:11.610 keys are the gold standard, probably, as opposed to 338 00:21:11.610 --> 00:21:15.090 authentication apps, which I'm also partial to. That'd be my ... 339 00:21:16.320 --> 00:21:18.660 Anna Delaney: So would you have a key around your neck, or how 340 00:21:18.660 --> 00:21:20.970 would you ... would you be an actual physical key? 341 00:21:22.110 --> 00:21:23.910 Mathew Schwartz: Probably an actual physical key, you know, 342 00:21:23.910 --> 00:21:26.970 with, like, the end that you kind of plug in - that'd be up here. 343 00:21:26.970 --> 00:21:27.480 And then ... 344 00:21:28.050 --> 00:21:29.940 Tom Field: You'd be uber key, the YubiKey. 345 00:21:33.230 --> 00:21:36.140 Anna Delaney: Well, I'd go as a strong password. So I'd be 346 00:21:36.140 --> 00:21:39.800 dressed in the classic strongwoman outfit from the circus - red 347 00:21:39.800 --> 00:21:42.800 and white striped T-shirt, and then the word password 348 00:21:43.010 --> 00:21:45.710 emblazoned on the front. And then, you know, a mixture of 349 00:21:45.740 --> 00:21:48.560 letters and numbers and special characters on the back. So I'm 350 00:21:48.560 --> 00:21:51.800 just embodying the challenge of strong password management with 351 00:21:51.800 --> 00:21:56.600 circus flair. Grant, up on your mountain, what would you go as?
352 00:21:57.560 --> 00:22:00.380 Grant Schneider: So I think Mat and I need to go together on 353 00:22:00.380 --> 00:22:03.620 this one, because I had the same thoughts as him. And I thought 354 00:22:03.620 --> 00:22:06.710 about being an authentication app. But I wanted to be a 355 00:22:06.710 --> 00:22:10.790 hardware token. But maybe we team up - like, one of us is a 356 00:22:10.790 --> 00:22:14.390 mobile device, and one of us is the hardware token, so that you 357 00:22:14.390 --> 00:22:18.380 can really see the multifactor authentication that 358 00:22:18.380 --> 00:22:21.980 you need. So I think mine requires a partner. 359 00:22:22.910 --> 00:22:25.310 Mathew Schwartz: I like that challenge. We could do, like, an 360 00:22:25.310 --> 00:22:27.920 authenticator app and, you know, have six digits that change every 361 00:22:27.920 --> 00:22:29.180 60 seconds or whatever. 362 00:22:29.720 --> 00:22:32.030 Tom Field: You know, RSA is coming. We can do some different 363 00:22:32.030 --> 00:22:32.960 programming this year. 364 00:22:34.820 --> 00:22:36.440 Anna Delaney: This is going to be one great party - that's 365 00:22:36.440 --> 00:22:40.160 all I can say. Grant, your insights have been invaluable, 366 00:22:40.160 --> 00:22:43.010 as always. We really appreciate the knowledge and perspective 367 00:22:43.040 --> 00:22:44.330 that you've shared. So thank you. 368 00:22:45.590 --> 00:22:47.420 Grant Schneider: Well, I'm just sad we didn't get to hear more 369 00:22:47.420 --> 00:22:49.910 about the beer museum, because I thought when we got to you, 370 00:22:49.910 --> 00:22:52.550 Anna, that that's where you were going to ask about. But, you know, 371 00:22:52.640 --> 00:22:53.240 next time. 372 00:22:53.390 --> 00:22:56.870 Anna Delaney: There's actually a crate from Homer Simpson on 373 00:22:56.870 --> 00:23:02.840 the front of the museum. So I'll save that for next time. Thank you 374 00:23:02.840 --> 00:23:05.720 for joining us.
We've had a lot of fun, and it's been really 375 00:23:05.720 --> 00:23:06.230 informative. 376 00:23:06.920 --> 00:23:07.250 Tom Field: Thank you. 377 00:23:08.360 --> 00:23:08.990 Grant Schneider: Thank you all. 378 00:23:09.260 --> 00:23:11.270 Anna Delaney: Thanks so much for watching. Until next time.