Mathew Schwartz: Hi, I'm Mathew Schwartz with Information Security Media Group. Welcome to our latest ISMG Editors' Panel, where I'm joined by other ISMG editors to discuss the hot cybersecurity news of the day. This week, it's my pleasure to welcome Marianne Kolbasuk McGee, the executive editor in charge of ISMG's healthcare coverage, and, making his debut on the Editors' Panel, Cal Harrison, our editorial director. Cal, Marianne, great to have you here today.

Marianne McGee: Thanks, Matt.

Cal Harrison: Thanks, Matt. Good to see you.

Mathew Schwartz: So, where are we hailing from? Marianne?

Marianne McGee: I'm at the Boston Common. I made a trip into Boston a couple of weekends ago with my husband. We hadn't been in the city for a while, but it was pretty nice weather.

Mathew Schwartz: Anything going on in Boston?

Marianne McGee: We didn't see that, actually. But I guess you never know what you're going to catch, and we did take these photos.

Mathew Schwartz: The urban density, yeah, so it's throwing stuff up for you. Well, I suspect a slight opposite on the urban density front as we go to you, Cal. Where are you hailing from?

Cal Harrison: Yeah, totally different from Boston. I am downtown, as a matter of fact: downtown Ridgeway, South Carolina, population 300. And behind me is the world's smallest police station. That's one of our best-known attractions. It used to be a well house for the horses back in the 1890s, they believe, and it was remodeled during the Great Depression using federal money to create a one-room police station. This is where they would basically lock up the drunks on Saturday night so that they could haul them to the county jail in the morning, and there was one police officer who pretty much would monitor things on Main Street and stay warm by the cookstove inside in the cold weather. I thought this was really appropriate because it kind of speaks to a simpler time in law enforcement.
You know, things are a lot more complex these days, not only for the frontline police officers, but certainly for all of the people in cybersecurity investigation.

Mathew Schwartz: Things are a lot more complex. Cybercrime crosses borders, which we're going to get into in a moment. And I'll just say, I am at a beach in Scotland. The last few days I've managed to get out; it's been really warm here in the UK. I know it doesn't look like a beach, but I thought this made a nice backdrop: some grass at the edge of the North Sea, basically by the dunes. But to your point, Cal: federal money, policing, and also how fast it seems things have changed. That brings me to the first story I'd like to discuss today, which is: Marianne, I know you've been covering the feds having seized about $500,000 worth of cryptocurrency earned by attackers wielding Maui ransomware, who have been hitting, or had hit, healthcare organizations and other businesses. What's the latest?

Marianne McGee: As you said, that's absolutely so. Actually, during a speech at Fordham University this week, the Justice Department's Deputy Attorney General Lisa Monaco disclosed that the DOJ recently seized, as you said, a half million dollars in payments that were made to North-Korean-government-backed hackers involved in so-called Maui ransomware attacks on at least two US healthcare entities as well as several other organizations. Monaco said that the attack victims included a Kansas medical center and a Colorado healthcare provider, neither of which she identified by name. Just a couple of weeks ago, the FBI, US Treasury Department and CISA issued a joint advisory to the healthcare sector about Maui threats. The Maui ransomware gets its name from the name of the executable file used to maliciously encrypt victims' files. The DOJ says that in the May 2021 attack on the Kansas medical center, North Korean cyber actors encrypted the hospital's servers used to store patient data and to operate critical equipment. The attackers left a note demanding a ransom, and they threatened to double that ransom within 48 hours, Monaco said.
The hospital's leadership made the difficult choice, as many hospital leadership teams do these days, to pay about $100,000 in Bitcoin to the attackers, because without access to the patient data, the medical center's doctors and nurses would have been severely hampered in providing critical care, the DOJ notes. But the hospital also notified the FBI, which then worked with federal prosecutors to investigate the attack. That's when they discovered that the incident involved Maui, a ransomware variant that they had not seen until then. Investigators traced the ransomware payments made by the Kansas medical center through the blockchain, and then the FBI identified China-based money launderers, who they say regularly assist North Korean hackers in cashing out ransom payments into fiat currency. Meanwhile, a recent study by security firm Sophos found that healthcare is the sector most likely to pay a ransom. The average ransom paid by healthcare victims was $197,000, according to Sophos, and that was the lowest among all the sectors examined. Healthcare sector entities are often willing to pay attackers in hopes that their IT systems and data won't be inaccessible for long, given that clinicians heavily rely on the data and systems for patient care. But of course, as we've seen in other industries as well, many hospitals are getting better in terms of having backups ready to go in case of a ransomware incident. So cybercriminals are now shifting heavily to attacks that also involve data exfiltration, demanding ransoms in exchange for not publicly releasing stolen information. And I'm sure this is something you've seen all the time too, Matt, with your reporting in other industries.

Mathew Schwartz: Oh, definitely. I mean, here in the UK, we just had an alert from the government's lead cybersecurity agency, and also from the privacy watchdog, saying: if you pay because attackers have promised to not leak stolen data in return for a ransom payment, we're not going to look upon you any more favorably if you weren't doing what you should have been doing. So you might pay to try to clean up the mess, but we don't care if you screwed something up. That's kind of the message there.
But so many interesting points come from this: the fact that it is still a business decision whether or not you want to pay, as long as you're not paying a sanctioned entity. I don't know if we might soon see Maui added to the sanctions list, because it does trace to North Korea. It's fascinating, though, that the North Koreans have been getting into ransomware. It seems they'll do anything that generates a profit, but I think for a long time the thinking was the profits weren't good enough. It certainly seems, more recently, with the alerts that we've been seeing from CISA and others, that the North Koreans are definitely getting into ransomware.

Marianne McGee: It seems that way. You know, the Chinese, Russians and Iranians, I guess, have also been threats to healthcare. But yeah, the Maui alert that was issued a few weeks ago is one of the first ones I remember, at least in recent times, that had to do with North Korean hackers.

Mathew Schwartz: Yeah, another big takeaway for me here is that the hospital told the FBI it had paid the ransom. We saw that also with Colonial Pipeline in May 2021. This is something authorities have been urging victims to do: if they do choose to pay, let the FBI know, preferably as quickly as possible, because, as you noted in the example that you sketched out, they were able to trace this on the blockchain. Everyone thinks it's anonymous, but it's a public ledger. Sometimes they can put the pieces together, like you said, follow it to the money launderers, and sometimes get the money back or get better intelligence on these activities. So perhaps for the next victim, or the victim after that, they can better get the money back, shut down these wallets, that sort of thing. So there's a huge case there for, even when victims do pay, making sure they alert law enforcement so that law enforcement can potentially help them still.
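[Editor's note: For readers curious why "follow the money" works on a supposedly anonymous cryptocurrency, here is a minimal, purely hypothetical sketch of the idea. The addresses and amounts are invented, and real blockchain tracing by the FBI and analytics firms is far more involved (address clustering, mixers, exchange attribution), but the principle is the same: every transaction is public, so investigators can walk the graph from a known ransom address to downstream wallets.]

```python
# Illustrative toy example only: a "public ledger" as a list of transactions.
# All addresses and amounts are hypothetical.
from collections import deque

LEDGER = [
    # (sender, receiver, amount in BTC)
    ("victim_wallet",  "ransom_addr_1",      5.0),
    ("ransom_addr_1",  "intermediary_a",     3.0),
    ("ransom_addr_1",  "intermediary_b",     2.0),
    ("intermediary_a", "exchange_deposit_x", 3.0),
    ("intermediary_b", "exchange_deposit_y", 2.0),
]

def trace_funds(start_address: str) -> set[str]:
    """Breadth-first walk of outgoing transactions from a known ransom
    address, collecting every downstream address the funds touch."""
    seen, queue = set(), deque([start_address])
    while queue:
        addr = queue.popleft()
        for sender, receiver, _amount in LEDGER:
            if sender == addr and receiver not in seen:
                seen.add(receiver)
                queue.append(receiver)
    return seen

print(trace_funds("ransom_addr_1"))
# e.g. {'intermediary_a', 'intermediary_b', 'exchange_deposit_x', 'exchange_deposit_y'}
```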
Marianne McGee: Yeah, Matt, I think the feds are really pushing for that transparency, whether you pay or you don't pay, which was a breach reporting rule that, I don't know if it's in effect formally now, with critical infrastructure entities being required to report to federal authorities within X number of hours if they did pay a ransom. And if they got hit but didn't pay, they still have to report it within another short amount of time. So definitely the feds want this information, regardless of whether you've paid or not. I think it just helps them try to trace down where these attacks are coming from.

Mathew Schwartz: And triangulate what's happening. There's a law that got passed in March in the States, but I think a lot of the detail about who counts as critical infrastructure, and who has to notify authorities, is still getting worked out. Hopefully, like you say, they'll get to the point where at least people are required to say if they've paid, possibly privately to the government, possibly with that information made public, but certainly it would help the FBI chase some of these groups.

Cal Harrison: And something Marianne had just pointed out to me yesterday is that these hospitals are under a huge potential liability for patients who could be affected and could actually die as a result of ransomware. Right, Marianne?

Marianne McGee: Yeah, that's one of the theories: you have these systems down and patients are not getting the care that they need. It does affect the medical devices, some of which are life supporting. There's been at least one lawsuit so far, sort of a malpractice lawsuit, tied to a ransomware attack at an Alabama hospital a few years ago, where a baby eventually died of complications because the baby was born during this ransomware attack.
So there's a lot of research going into this too, about the impact of ransomware attacks on patient care, even in the longer term: the impact on patients who might have been at a hospital that had some sort of disruption during the time that they were there. How do they fare once they leave the hospital? Do they survive? Do they have other illnesses or other problems? That's something that's really being looked at carefully.

Mathew Schwartz: Not just the initial attack disrupting things, but the knock-on effects of these hacks, perhaps disrupting patient care, like you say, with attackers preferring healthcare because healthcare can hardly afford to suffer or experience downtime. This is an astute move by criminals, unfortunately, at the expense of the rest of us. Everything about ransomware seems to be at the expense of the rest of us.

Cal Harrison: Unfortunately.

Mathew Schwartz: Same old, same old, yes. Well, ransomware: always a hot topic. Lots going on here, lots of criminal innovation. Thank you, Marianne, for that update. Another hot topic, and I saw this at the RSA Conference last month in San Francisco, is zero trust. Cal, I know that you've been putting together a special report on zero trust. What do you think? High degree of buzz these days?

Cal Harrison: Yes, it is. It is buzzword compliant with almost all of the vendors at RSA. And actually, if you go out on Google and just type in "zero trust solutions," you'll get pages and pages; 60-plus vendors are offering zero trust solutions today. The thing that's amazing is that just a couple of years ago, most people were saying: zero trust? It's kind of a buzzword; is it real? Are people really going to invest the amount of money that it's going to take? And of course, the big change happened about a year ago, when President Biden signed the executive order requiring the federal government to adopt a zero trust architecture. There's been quite a bit of movement since then, and just as you say, at RSA almost everybody in the booth area had their own zero trust solutions.
I think we did 150 interviews, and I'd say a good chunk of those people were talking about zero trust in those interviews, as you may recall.

Mathew Schwartz: Absolutely. It was a huge topic. And we talked to John Kindervag, the creator, the founder of the zero trust concept. One of the things he was emphasizing was, as you say, yes, it's been around for a while, but with COVID-19 and remote working and the rush to the cloud, the rush to digitization by so many organizations, the concept has really been embraced. But then, I did note that if you ask someone, what is zero trust, what does it mean to you? I think at a basic level, it's an approach where all users and all applications need to be continuously authenticated, authorized and validated. I don't know if you would agree with that base definition, because I definitely heard some variation on what you should or shouldn't be doing if you were calling yourself zero trust compliant. Or maybe compliance is even the wrong word. If you were a devotee, I suppose.

Cal Harrison: Yeah. And it really comes down to the architecture. As Kindervag and others pointed out, it's not just something you can plug and play. You can't just get a zero trust solution off the shelf, plug it into your system, and expect it to give you the assurance that you just described. I think what you were saying is exactly right: the concept is least privilege, and constantly authenticating the user, the application and the device. And this is causing a huge disruption. That's why we're seeing this rush to zero trust, because it's potentially disrupting all of the traditional network and VPN vendors who had sort of built a whole industry around the trust model, which is basically: you get in the network, you're authenticated, and that means you're trusted. Zero trust basically deals with the reality that you may not be who you say you are once you get into the network. That's primarily what's happening.
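[Editor's note: As a companion to the definition discussed above, here is a minimal, hypothetical sketch of the "never trust, always verify" idea: a deny-by-default policy decision that re-checks the user, device and application on every request. The names and policy entries are invented for illustration; this is not any vendor's product or a complete architecture.]

```python
# Illustrative only: a bare-bones policy decision in the zero trust spirit.
# Every request is evaluated on user identity, device posture and application
# context, with least-privilege rules; being "inside the network" grants nothing.
from dataclasses import dataclass

@dataclass
class Request:
    user: str
    device_compliant: bool   # e.g., patched, disk-encrypted, EDR running
    mfa_verified: bool
    application: str
    action: str              # "read" or "write"

# Hypothetical least-privilege policy: who may do what, on which application.
POLICY = {
    ("alice", "ehr-system", "read"):  True,
    ("alice", "ehr-system", "write"): False,
    ("bob",   "billing",    "read"):  True,
}

def authorize(req: Request) -> bool:
    """Deny by default; allow only when identity, device posture and an
    explicit policy entry all check out. Re-run on every request."""
    if not (req.mfa_verified and req.device_compliant):
        return False
    return POLICY.get((req.user, req.application, req.action), False)

print(authorize(Request("alice", True, True, "ehr-system", "read")))   # True
print(authorize(Request("alice", True, True, "ehr-system", "write")))  # False: not in policy
print(authorize(Request("bob", False, True, "billing", "read")))       # False: device out of compliance
```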
Mathew Schwartz: You could be on-premises, you could be in the cloud. How does that handshake, or handoff, or constant validation happen? There's a ton of nuance there.

Cal Harrison: Absolutely. And what we're seeing in the vendor community is that, for example, at RSA, the folks at Zscaler were pointing out that they have a policy decision engine, which can be accessed as software as a service, so that you can basically outsource that portion of the zero trust architecture to them. But you also need to plug in identity and access management, endpoint protection, data loss prevention and so on, to make it a working model. At the same conference, Palo Alto Networks, which is a bastion of network defense suppliers, announced its Zero Trust Network Access 2.0, and they're saying that they have a fully functioning platform with all of the components that you need in order to be zero trust compliant. The problem that Kindervag and folks such as his Forrester associate, Chase Cunningham, point out is that there really isn't a platform per se that you can just plug in and start using. You need to build the zero trust architecture from scratch. Kindervag actually provided a five-step implementation model: starting with identifying what you're trying to protect and what your attack surface is; mapping the transaction flows that will get through that attack surface to your data and applications; building the architecture itself; fine-tuning that; and then finally, of course, continuously monitoring the safety of your systems, users and data. So it's not an overnight thing. It's something that, he was saying, people need to start working on now in order to have something that's adequate in the next three to five years, for example.

Mathew Schwartz: Well, and it's unique to every environment as well, because, of course, what you have in terms of infrastructure, and precisely what you want to do with it, is going to vary depending on the organization itself. And I know healthcare has also been at least dipping its toes into the zero trust fervor.
Have you been seeing much on this front, Marianne?

Marianne McGee: Well, we just finished our healthcare conference in New York last week, and zero trust was certainly a bit of a buzz there as well, especially in the context of the pandemic and all the changes that it brought in terms of telemedicine, remote workers, temporary people being brought in to help with the surges, remote monitoring devices, and medical devices in general. There's always a lot going on in healthcare anyway, and I think the pandemic just kind of added to that pile of pressures. But yeah, with zero trust, there are some CISOs in healthcare that have been sort of working on this for a while, and then I think others are kind of watching to see what other organizations are doing in terms of those infrastructures and that sort of approach. It's definitely something that the healthcare sector, if they're not actively trying something, they're certainly keeping their eye on.

Cal Harrison: Yeah. A couple of things of note from Chase Cunningham, who, again, is a zero trust expert along with Kindervag. He points out that the industry so far has done a good job on identity and access management, where there are some good solutions out there, and on the policy decision engine, where they've made some great strides. That's going to be really important, if you think about it, to sort of continuously authenticate users, so that it's not a big pain to be able to access your applications and data, because you don't want to implement something that's going to slow everybody down and cause a roadblock. One thing that he mentioned, which I think is noteworthy, is that the other piece is on the data security side: you need to make sure that you have a solution that's going to protect against indiscriminate access to data, and it needs to be tied to a policy. In his opinion, the current data loss prevention tools that we've been using for years are really not quite up to snuff, and you're going to have to be looking at a new solution for that.
The other kind of footnote from Chase, which I thought was really interesting, is that he says he personally could build a zero trust solution today using just open-source software. He said that it's not for the faint of heart, though: you obviously would have to have a lot of skills and do special coding, and of course you'd be responsible for keeping the integration up and making sure that your open-source software can't be hacked. But I thought it was an interesting point: you could build zero trust today, and it wouldn't cost you a penny, and you wouldn't have to go through a software vendor. Obviously, the many software vendors will give you a different point of view, and there certainly is some value to the fact that they're continuously updating their products and making sure that they're viable. Other big players in this field would be the systems integrators, the folks who will make sure that there's interoperability and integration between these various software tools, assuming organizations use multiple vendors. And then, kind of on the horizon, are the managed services providers, because this is a big headache. I mean, this is talking about re-architecting your entire enterprise security environment. So at some point, especially for a large global organization, it just may be too complex, and you might just say, "Well, I'm just going to hand this off and let it be somebody else's headache." So a lot of opportunities, obviously, are being created by all of this.

Mathew Schwartz: A lot of opportunities, like you say, for systems integrators, for software developers, for consultants to help people get over the finish line, or, I guess with these projects, to keep getting over the finish line, adding capabilities. Fascinating stuff. Big buzzword. I'm interested to see if it will be the buzzword at RSA 2023. Everybody, stay tuned, I'm sure. So, Cal, thanks so much for that overview of the latest on zero trust.
And as the Editors' Panel draws to a close, or draws closer to a close, my final bonus question, where we like to do something to just shake things out at the end here, would be to ask each of you: I know zero trust can be quirky, I know ransomware can be quirky, but is there an especially quirky story on the cybersecurity front that either of you has been tracking recently?

Marianne McGee: Sure. If you want to pick up on that ransomware theme, I would say the ongoing investigation, which is also being done on the side here by our colleague Jeremy Kirk, with a shout-out to Jeremy and his Ransomware Files podcast. He's been digging into the infamous Dr. Zagala, a Venezuelan cardiologist who has been charged by US prosecutors with creating and selling ransomware that's been used in many attacks on organizations across the world. A doctor doing this is sort of astounding to me, but it's an interesting story.

Mathew Schwartz: Extra unusual, yeah, with the allegedly evil Dr. Ransomware. Yes, right. Excellent. What about you, Cal?

Cal Harrison: Well, for me, I've been tracking the Honda key fob hack, which first sort of went public several weeks ago. A fairly unknown researcher, Kevin2600, finally published a report saying that he was able to use sort of off-the-shelf hardware to capture the code going from the key fob to a number of Honda models, several different models from 2012 all the way up to 2022, and he was able to demonstrate how he could hack into the system, unlock the doors and start the car. If you think about it, it is the thing of movies, where the evil corporation will hack into somebody's car and grab control of the steering wheel and drive off.

Mathew Schwartz: You see the robber dressed all in black with a little black box, and they press the button and the car goes "tik tik."

Cal Harrison: Yeah. But what's neat about it is that, while this has actually been shown to happen in the past, this is probably the first big one that's come up, I'd say, in more modern vehicles.
But the thing that's kind of quirky about this is that the researcher actually contacted Honda and tried to report the vulnerability, and they referred him to the customer service line, to, like, report a problem with a seatbelt or something like that.

Mathew Schwartz: Marianne could give you an hour-long dissertation on that, probably.

Cal Harrison: Yeah, he couldn't get through to anybody to actually report the vulnerability. So finally, he goes live with it. And of course, they call Honda, and Honda says this is completely unsubstantiated: the test was obviously not performed correctly, this couldn't happen. Then, of course, in the meantime, several other people just took the same methodology, hacked into their own Hondas and published it online. So finally, Honda came out last week and said, yes, actually, it is a vulnerability. They have a rolling code system in their key fobs that is supposed to prevent this, but unfortunately, it was not able to prevent it. And there is a CVE on this, but the other interesting thing is that there's actually no fix, there's no patch for it. So I'm not sure exactly what Honda's going to do, but they finally did come clean.

Mathew Schwartz: Maybe they'll end up with some lawsuits, if history is any guide. Well, thank you, Cal. Sounds like unwelcome news for Honda. No doubt we'll be tracking that story as they attempt to get their act together on it. A quirky story for me, quickly: there was an arrest recently in Thailand, as documented by the Bangkok Post. A gentleman has apparently confessed that he was stealing phones from Banana IT shops. He's been called a Robin Hood man in underpants, because he would apparently only wear his underwear when he broke into shops, because it made him feel normal, apparently, is what he said. But the wrinkle here is he would steal these phones and give them to the poor. And so the police have been trying to track him for years, but because he wasn't functioning as a reseller, or a chokepoint, I suppose, and he was giving these phones away, they couldn't figure it out.
But eventually they got some CCTV and they tracked it back, and he said the reason he targeted Banana IT shops was because he liked bananas and the color yellow. So, there's a quirky cybercrime story to help seal off what's been an in-depth, occasionally quirky, and always entertaining series of vignettes from the both of you. Thank you very much, Marianne and Cal, for your insights this week on our Editors' Panel.

Marianne McGee: My pleasure.

Cal Harrison: You're welcome. Good to talk to you, Matt.

Mathew Schwartz: Thank you for joining us. I'm Mathew Schwartz with ISMG. We'll catch you next time.
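[Editor's note: For readers curious about the rolling code system mentioned in the Honda key fob discussion above, here is a minimal, hypothetical sketch of how such schemes are generally meant to defeat replayed button presses. It is purely illustrative, uses an invented shared secret, and is not Honda's actual implementation; the reported vulnerability involved cars accepting previously captured codes, which a working scheme along these lines is designed to reject.]

```python
# Illustrative only: a generic rolling-code check of the kind remote key fobs
# use to resist replay attacks. The fob and the car share a secret and a
# counter; each press sends a fresh code, and the car only accepts codes
# derived from counter values beyond the last one it has already seen.
import hmac, hashlib

SECRET = b"hypothetical-shared-fob-secret"
WINDOW = 16   # tolerate a few button presses made while out of the car's range

def fob_code(counter: int) -> bytes:
    return hmac.new(SECRET, counter.to_bytes(4, "big"), hashlib.sha256).digest()

def car_accepts(code: bytes, last_seen: int) -> tuple[bool, int]:
    """Accept only codes derived from a counter ahead of the last accepted one."""
    for candidate in range(last_seen + 1, last_seen + 1 + WINDOW):
        if hmac.compare_digest(code, fob_code(candidate)):
            return True, candidate          # advance the stored counter
    return False, last_seen                 # stale (replayed) or forged: reject

last = 41
ok, last = car_accepts(fob_code(42), last)         # legitimate press: accepted
replayed, last = car_accepts(fob_code(42), last)   # captured and replayed: rejected
print(ok, replayed)                                # True False
```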