Anna Delaney: Hello, I'm Anna Delaney, and this is the ISMG Editors' Panel, where colleagues on the editorial team join me to break down and discuss some of the top cybersecurity news stories of the week. The stars of this week's episode are Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Mathew Schwartz, executive editor of DataBreachToday and Europe. Wonderful to see you all.

Mathew Schwartz: Great to be here. Great to break down cybersecurity, Anna.

Anna Delaney: Breaking down, yes. So, Mathew, start us off, because you're with Dennis the Menace, are you?

Mathew Schwartz: I am with Dennis the Menace. Americans might be confused, because in America there's a different Dennis the Menace, which, as far as I'm aware, came about independently of the one in Britain. The original came from DC Thomson here in Dundee, and it's still running in a comic called the Beano, beloved by children in Britain for decades. It's also featured on a legal street art graffiti wall down by the bus station, where I like to hang out sometimes.

Tom Field: Okay, let's talk more about this hanging out by the bus station. Save that for another time, Anna.

Anna Delaney: Suparna, that's an awesome view behind you.

Suparna Goswami: Yes, the background is a city landscape in Sri Lanka. I plan to visit the island nation at the end of December, and obviously I'll have a better picture as a backdrop once I go there and take some beautiful pictures of the island.

Anna Delaney: We look forward to that. Tom, you and I are sporting Christmas themes, I think.

Tom Field: I'm just happy to be home. I think I was on the road absolutely every week from early September until a week or so ago, and for most weeks in the spring and the summer before that. It's just nice to be home in front of my own hearth and fireplace. And there you go.

Anna Delaney: Did you cook the turkey over the hearth?

Tom Field: I didn't. I am married to a chef. I was allowed to eat the turkey. Anything before that, I had nothing to do with.
Anna Delaney: You chose well there. So, I'm joining you live from Stockholm to moderate a roundtable this evening. And last night, I went for a stroll around the city center and found a friendly mood and a dose of Christmas spirit. There's a lot of glow, and it's all lit up - very, very pretty. Tom, as we approach the new year, you've been conducting some excellent interviews with some of our global ISMG contributors, one of whom is, of course, the creator of zero trust - John Kindervag. How did that conversation go?

Tom Field: Oh, indeed. Yeah, we talk to John a couple of times a year, and we always sort of talk about the state of the union in terms of zero trust: where are organizations challenged? What progress are you seeing? So we had that conversation recently, and it will be up on our sites before long. And John made the point that, in the wake of President Biden's executive order in 2021 and further guidance that came out this year, zero trust has emerged. He always used to like to say that zero trust was like Fight Club - and the first rule of Fight Club is you don't talk about Fight Club. That was zero trust. That's changed. But more than just people talking about it, organizations are embracing it. They've got measurable goals, and they're making measurable progress. And so I asked him the usual question: what's the state of the union with zero trust? And he told me about that. And I said, where are organizations challenged now? We know they're making progress. We know that they are working ahead - where are they challenged then? I'll share with you just a clip of this interview, where he talks about what he's seeing as he travels globally now.
John Kindervag: The biggest problem is that they think they have to do it all at once - I mean, the same thing that I just mentioned. They think, in the old days, you used to have to do everything all at once. And zero trust changes the paradigm. It inverts everything that you ever thought. And so it was designed so you do one protect surface at a time. I need to protect this credit card database - that's one project. The next project is my HR system. The next project is the elevators of a hospital - I found out that's a really important thing, and it's controlled by computers and needs to be protected. So you do each one of those things as a zero trust project, and eventually you're done. And then, of course, everybody thinks that everything has to be protected in the same way. And I tell people all the time, there are a lot of things that you have that don't need very much protection, if any. I see people spending millions of dollars to protect, say, websites where the only thing on them is information that they're trying to give away to customers.

Anna Delaney: Yes. I mean, it's been a positive year for zero trust. I think we've seen this week that the DoD is working on a full transition to zero trust by the end of fiscal year 2027. However, what I'm still hearing is that many organizations are still stalling at the implementation level. So the awareness is there - they know it's a good idea - but there's just a lack of cybersecurity skills as well as IT skills. Is that what you're hearing, Tom?

Tom Field: There's a lack of resources as well. But I think we've made a key transition from organizations becoming aware of exactly what zero trust is. And we've gone through the phase of vendors knocking at the door wearing zero trust clothing and saying, "Trust us, we've got your zero trust solution." Organizations have gotten wise to that, and I think they've set realistic objectives for themselves. And as John says, they're starting to realize what exactly their protect surface is, and how to approach it in meaningful, practical ways. So I think it has been a turning point. And I think 2023 will be a significant year, because there are lots of different regulatory regimes now that are promoting zero trust as a strategy, and I suspect that the cyber insurance industry is going to get behind that as well. So I think we will see measurable progress in 2023.

Anna Delaney: That's great news. Well, we look forward to watching the interview. That's not been published yet.

Tom Field: Very soon, very soon.

Anna Delaney: Very soon. We look forward to that. Suparna, India has finally released its new draft Digital Personal Data Protection Bill. So what does the latest version of the bill mean for CISOs?
Suparna Goswami: Yes, Anna. As you said, after much deliberation and waiting, India finally came out with a fresh draft of the Data Protection Bill, which it now terms the Digital Personal Data Protection Bill. I'm not sure of the aim behind the change of name - whether the law will be applicable to businesses that are not digital - so I'm not sure of that logic. But coming back to some of the highlights of the bill and its impact on CISOs: I did this interview with an advocate of the Supreme Court, and she gave a very nice, elaborate answer on what impact the draft will have on CISOs. First, the draft has proposed to do away with the distinction between personal data and sensitive personal data. So everything comes under personal data now. Whether it will make life simple for CISOs or complicated is a question. Earlier, the draft defined what is personal data and what is sensitive personal data, and how you have to compartmentalize and store it. Essentially, personal data could move out of the country; sensitive personal data had to be kept in India. Now everything comes under personal data, so any personal information available online will come under that purview. In a way, it will make life easy for CISOs. But she thinks - and I've spoken to a few CISOs as well - that it can be bad for businesses, because everything is personal data now, and you have to have top-notch security for everything. In terms of compliance, it can be good for CISOs to implement, because it just makes things simple. The second point is that the draft has also done away with data localization. Now, this was such a big issue in the previous draft bill - there were protests by global companies - and I think the current draft is probably aiming to please the global giants. There is no such requirement now, at least in the present draft bill. However, multiple data centers have been set up in India over the past few years, so I'm not sure if they will protest now and force a change in the requirements. So we have to wait and watch. Also, there is no certainty whether this rule will override individual sectoral rules for storing data in a certain way. So now they have done away with data localization, but will it also be applicable to the banking sector?
For example, the RBI mandates banks to store sensitive data on-premises and not in the cloud. So will this new rule override that? Again, we have to wait and watch. Another interesting thing they have added is that now you can transfer data easily to some of the friendly states. Friendly states are a few countries the government will list, to which data can be transferred easily. Now, this again makes things easy for CISOs whose companies operate across the globe - it makes life simple for them. Data erasure: now individuals can ask companies to erase data once the purpose is fulfilled. I think GDPR as well as CCPA also talk about this. Now data principals will have the right of data erasure, and I asked the advocate what CISOs need to do now. For companies, this essentially means that they have to have this in the design phase itself, where they are aware which data has to be erased as a request comes in. They have to keep a record of that, so tomorrow, if a person says, "My record has not been deleted," you can actually show the record. So they have to keep a record, and this has to come in the design phase itself. And finally, there is deemed consent. It has been said that once you take consent to use the data, you don't have to go back to the person again to use it for a different purpose. So once you have taken consent, that's it - you don't have to go back to the person again and again to take consent to use the data. And I think this, again, will make life easy for CISOs. So overall, I think, if you look at it from a privacy point of view, a lot of leeway has been given. But from the point of view of CISOs and organizations, it has made life simpler. But the privacy practitioners, or the people who were on the previous committee, are obviously not very happy with the amount of leeway that has been given to the companies.

Anna Delaney: Great! Thorough overview, Suparna. So what's next? Is this set in stone, or are there likely to be further changes in the coming year?
Suparna Goswami: So yes, they have asked the public to comment on it and suggest changes. But hopefully - because we have been waiting so many years for this draft bill - it will pass parliament and become law soon. But yes, there is a lot of protest from privacy practitioners out there. So let's see. But they are saying that by December, they'll come out with a new version of the bill.

Anna Delaney: We'll be speaking about that, then. Thanks, Suparna. Matt, Twitter is in the news again this week. What's the latest?

Mathew Schwartz: I know - are we having Twitter news fatigue yet? I mean, show of hands; I think probably everybody, right? We can go around the room. Two hands from here. But on the cybersecurity front, some interesting news - new news - that doesn't have to do with anyone's management style or layoffs, although the layoffs could have an impact. We'll maybe get back to that in a second. So last week, a security researcher said that he had learned about a new breach - new to him, new to us - involving 17 million Twitter account holders' details. For a bit of background, this seems to relate to a breach that Twitter acknowledged in July. It warned that a feature that would allow you to find other users using their email or their phone number had been abused by attackers. This was an opt-in feature, where if you said, "Okay, yes, let other people find me using my email or phone, no problem," they could do so. Twitter learned in January that this was a bug. What happened was a researcher came in and filed a report with Twitter's bug bounty program - so far, so good - and said, "Did you know that if I want, I can use the API that you've created to force match a whole bunch of different user accounts?" So they got this heads-up in January. They didn't make a public notification, though, because they had that typical kind of kludge of, "We had no indication that the feature had been abused." Well, the feature was abused, and somebody was selling 5.4 million account holders' details. This came to light in July, which led Twitter to unroll the timeline. It said: it seems like this attack happened last December, we learned about the bug in January, there was no indication that anyone had abused this, so we didn't tell anybody.
But we're telling you now, because especially for owners of pseudonymous accounts, who were trying to hide their identity, this would have been a great way - not their words - for nation-state attackers or others to unmask them. So if there's some account that's deeply critical of a certain government, and they want to figure out who this person is - whether they're inside the country or otherwise reachable - this feature would have been useful to them. So we have these 5.4 million, plus another million and change of suspended accounts, although that was sold privately, with the 5.4 million-person database getting sold in July. And kudos to Bleeping Computer - they've done some reporting on this. They spoke to the administrator of Breach Forums, where this information was for sale. The administrator said that about three people bought it for less than the asking price, which wasn't all that much, and also said that there was this other suspended-accounts list that got circulated privately. But the admin came forward to speak to them again to say: this other breach of 17 million, that wasn't me. So what we have here is a breach from last December, but we also have one or more breaches by other attackers, who also seem to have been abusing this API feature to amass account information on Twitter users. So it's a fascinating breach story - data breaches don't seem to be going away, do they? This happened to Twitter before the change in management, and I think it begs the question of who's steering the security ship right now. Could there be more of these types of things happening? How long will it take for them to come to light? It's also a cautionary lesson for any social network, or anybody else building these types of features: "be able to find me using a certain type of detail" - is that maybe revealing more than you should be? And finally, when these sorts of bug reports come to light, it would have been nice to have seen Twitter be a bit more proactive, because anybody who was affected by this was affected - well, it could have been from June 2021, because that is when Twitter added this feature that it later came to call a flaw. So it could have been a long time that somebody was unmasked - more than a year before Twitter gave them the heads-up that they may have been unmasked in this manner.
Anna Delaney: Excellent. And, Matt, do we know how big Twitter's security team is at the moment, following the resignations and firings?

Tom Field: You could count it on one hand.

Mathew Schwartz: I have no facts to offer you, Anna. I did reach out to Twitter, using an email address that used to work for the press department. I received no response. Take that as you will.

Anna Delaney: So how harmful could this potentially be to Musk and Twitter?

Mathew Schwartz: That's a great question. I mean, I think Twitter is potentially facing a world of regulatory pain. They have some consent decrees in place; they need to be doing certain things, and they need to be showing that their security program is robust. There have been a lot of people exiting the company - have they been hiring replacements in a timely manner and ensuring that things are treated with the responsibility they should be treated with? We know this week that they have stopped attempting to combat misinformation or disinformation about COVID-19, for example. Is this just the first of many cracks that we're seeing publicly that indicate things are going really haywire inside the company? I couldn't possibly say.

Anna Delaney: Now, you spend a lot of time on Twitter personally. Do you see a lot of people leaving the platform?

Mathew Schwartz: It feels a lot thinner in terms of content than it used to. I know a lot of people are going to Mastodon. I think a lot of journalists are waiting to see what happens. Mastodon is not as usable as Twitter. There's no obvious Twitter killer yet - I'm sure that Silicon Valley is attempting to turn one out. I don't know if Twitter will turn the corner. I've worked with online communities all my career, and they're very, very easy to tear down and very hard to build up. And it's just such a shame, I think, that this has happened, because it has been a really good resource for a lot of people.
Suparna Goswami: In fact, I asked a local cop here who is into fraud investigation - financial crime investigation - whether they plan to move away from WhatsApp and Twitter. In India, we have an Indian version of Twitter backed by the government, called Koo. But he said WhatsApp and Twitter have become de facto - there's no way we can do away with them, because that's where we find the maximum traction.

Mathew Schwartz: Exactly - maximum uptake. Many communities say that as well. If a town has a flood warning, Twitter has been their biggest channel for getting that sort of information out. But of course, it's run by a private company; they can do whatever they want. Again, it's just a shame, I think.

Anna Delaney: And Matt, have you been using Mastodon? Have you jumped ship? I know you're waiting to see what happens, but have you been exploring what that platform is like?

Mathew Schwartz: I've been exploring - there are only so many hours in the day. I mean, a lot of the cybersecurity community seems to have gone to it, but they tend to be more technically astute, more technically able and interested. They don't mind being early adopters. Whereas I'd count myself in the other group, which thinks: how much frustration is this going to cause? If I wait a month or so, might there be a nicer front end that makes all of this a lot easier to use? So we will see.

Anna Delaney: Very true, well said. And finally, then: I am aware that not everybody celebrates Christmas, but as it's December this week, many will be opening their advent calendars. What would your cybersecurity-themed advent calendar be?

Tom Field: It's the dwell time advent calendar. I think it's one that can shrink every year. That's the goal, anyway.

Anna Delaney: I really like that. Suparna?

Suparna Goswami: Yes, I am not that creative. But I'd look each day expecting to find a new way hackers have discovered to carry out fraud. And I also hope I'd find the names of financial institutions that have discovered a novel way to effectively secure themselves and their customers by doing proper patch management and getting the basics figured out. We all know how basic cybersecurity hygiene is sorely lacking. So maybe that.
Tom Field: I mean, that's a two-year calendar, not a one-month one.

Anna Delaney: Matt?

Mathew Schwartz: Nothing beats chocolate. But with that caveat, I was thinking along the lines of an incident response calendar or a tabletop exercise calendar. I don't know that you need 24 different disaster recovery plans, but think about the things that could happen, and make sure that you have a plan in place - one that you've hopefully practiced - for dealing with them. So you can pull that binder off the shelf if the worst happens: a ransomware attack, for example, or some kind of disaster recovery scenario where your servers have gone down because of a tornado or a hurricane or something like that. So I don't know, maybe that's a little disaster focused. But if you need 24 disaster recovery plans, have them, practice them, keep them refined.

Tom Field: If Die Hard can be a Christmas movie, then that can be an advent calendar.

Mathew Schwartz: Thank you very much. I completely agree. Die Hard is one of my favorite Christmas films.

Anna Delaney: Well, as I'm in Stockholm, it would have to be an ABBA-themed calendar, I think. And behind each door, you'd have a cybersecurity or privacy quote taken from one of our interviews this year, and they'd have to be sung to the music of ABBA.

Tom Field: Do I understand you're right next door to the ABBA museum?

Anna Delaney: Yes, I think it's actually part of the hotel. So I'll be checking that out later. Be prepared for a new background next week.

Mathew Schwartz: I'll prepare for a karaoke Christmas next week, Anna.

Anna Delaney: In time, in time, maybe. Tom, Suparna, Matt, this has been an absolute pleasure. Thank you so much. Thanks so much for watching. Until next time.