Michael Novinson: Hello, this is Michael Novinson with Information Security Media Group, speaking with Taylor Lehmann. He is the director at the Office of the CISO for Google Cloud. Morning, Taylor, how are you?

Taylor Lehmann: Good, thanks. Good morning.

Michael Novinson: Let's get into healthcare cybersecurity here. Starting at a high level, what are some of the biggest challenges organizations face when it comes to securing healthcare-type environments?

Taylor Lehmann: Well, I think it's important to frame healthcare as an industry that comes in many different flavors. You have hospitals, you have health plans, you have life sciences companies, and all sorts of variations in between: some that do research, pharmaceuticals, manufacturing, technology that delivers care and treats people. All of these industries, or sub-industries, have different sets of issues. And all of them are populated by companies with different levels of capability. The big pharma and tech companies tend to face specific types of threats, protect specific types of technologies and have specific capabilities aligned to protect them, whereas a hospital has a different sort of technology.
So each of them is a little different. But broadly, I would say protecting patient health data is required by law almost everywhere in the world, and it has value from a variety of perspectives. People want it for its confidentiality and, as we've learned more and more, the availability of that information is super important and has value as well. Intellectual property, especially when it comes to research and new technology to treat patients and populations, to make them healthy but also to sustain and thrive, is super valuable to a variety of threat actors around the world. And beyond the data and the intellectual property, I'd say there's the question of simply allowing these industries to operate and innovate: individuals have found ways to threaten the safe operation of organizations of all types through things like ransomware and targeted attacks on infrastructure, to affect their ability to run, treat patients and thrive as businesses. So it's an interesting set of dynamics, and each industry has its own unique issues.
But confidentiality, intellectual property theft and availability of systems tend to be where most people are focused, both on attacking and on securing.

Michael Novinson: Let's double-click for a moment on the patient data issue. I want to get a sense from you of some of the newer, emerging challenges associated with safeguarding patient data. What are some threats you're seeing that are making that more difficult?

Taylor Lehmann: Well, there's HIPAA, one of the earlier rules, but now every country around the world has its own flavor of it. Those rules were originally focused on the confidentiality of this information. For years, people said, well, as long as it's safe and private, then I'm compliant. And I think two things have changed. One is that compliance is not necessarily the goal when it comes to protecting data. This data has value that I don't think the original rule makers could have predicted when they set the rules, so what I need to do to be compliant differs greatly from what I need to do to be secure in this day and age.
And I would say what's new, or recently new, is the focus on the availability and integrity of that information, which some would say was contemplated in the original rules. I would say probably not, but let's go with it. Integrity and availability are important, and here's why. Availability of patient data basically makes sure that if you show up at the hospital, you can get care, and that's threatened by things like ransomware. If you can't get care because your data can't be accessed, your ability to leave healthy is challenged. That is something that, as I mentioned before, has value from a threat actor's perspective; it will compel payment and motivate different people to do different things. The more interesting thing, especially with the advent of machine learning and artificial intelligence, is integrity of data. Integrity has always been important.
But now, when we start talking about adopting new technologies to streamline care delivery, to make research move faster, to bring different parties together and to increase the reliance we have on information like protected health information, that integrity piece becomes really important, not just for care delivery but for the next phase of innovation in healthcare. And so we're starting to see attacks on integrity. Because data is being used downstream to build machine learning models that then affect how care is delivered, how research is done or how new products are developed, you can insert yourself very early in the supply chain of those products and eventually produce whatever outcomes you might be looking for. So integrity is a big thing right now.

Michael Novinson: What are some of the unique challenges associated with safeguarding IoT and OT devices in a healthcare setting?

Taylor Lehmann: Age, and the complexity and variety of the IoT and OT, are probably the two biggest. Your ability to adopt a uniform security strategy with respect to that technology is basically impossible.
In my travels, what I've noticed is that probably the only similarity across individual pieces of IoT and OT is this: if a device is network connected, you have a network connection you can use to build a security strategy from; if it's not, you have far fewer options. So when teams approach IoT and OT, it requires a different approach from how you might secure corporate infrastructure, where there are more standards and more commonality. IoT and OT require a heavy emphasis on threat modeling, a heavy emphasis on understanding what tech you actually have and how to inventory and find it, and a heavy emphasis on being able to at least understand what's going on on the device to determine good from bad. Those are really challenging problems that require a lot of time.

Michael Novinson: And from a supply chain perspective, what are some of the biggest obstacles healthcare organizations have to deal with to ensure the safe operation of their supply chain?

Taylor Lehmann: Well, I think all industries are challenged with this.
We talk a lot about this at Google, and in my role, when I'm working with CISOs, the message is: look, your security program, with any technology, depends heavily on your supply chain, and on a variety of factors: obviously the quality of what you're buying, the tendencies of the vendors, and their ability to maintain and manage things. Those are hard things to vet until you actually have something running in your infrastructure, so those are lessons that are difficult to learn until you've learned them a little too late. But most importantly, and I think IoT and OT are uniquely challenged by this, the equipment is really expensive, which means it's subject to less frequent technology refresh than, say, a server, a storage array or a cloud service. Those have a lifetime of one to three years and are relatively easy to replace. An MRI machine or lab equipment lives for 10 or 15 years, sometimes even longer, past its end of life. And to me, one of the most important indicators of your ability to protect yourself is how frequently you refresh technology, and that technology is not frequently refreshed.
Michael Novinson: And what are the security implications of the fact that healthcare technology is often used past its end of life?

Taylor Lehmann: Well, there are a variety of issues: the quality of the products these devices produce, and safety in terms of their use, since some of these devices are actually implanted in people's bodies and keep them alive. I don't know about you, but I wouldn't want a piece of medical equipment that was touching me and keeping me alive to be 20 years out of date and vulnerable. Now, that's an extreme example, but examples like that are out there. You asked earlier about supply chain and the risks around it; to me, that means really considering the relevance and importance of this stuff, the life safety issues, and not just life safety but the criticality of an organization's systems, how they flow, what operations they support and how important they are.
I'm not sure this stuff has always gotten the same level of understanding, and therefore awareness of the risks it carries and the ability to mitigate those risks, within what I'd call a typical or unsophisticated security program, which is what many organizations have to run and use to try to understand and secure it.

Michael Novinson: Let me ask you, finally, what should healthcare organizations be expecting from upcoming regulations?

Taylor Lehmann: I mean, I think it depends. There's been a lot of talk around what the U.S. government is doing in the federal space, or excuse me, in the financial space, around equipping boards with more security expertise, with the belief that that's going to compel organizations to get better at cyber. You're seeing those rules in Europe too, in different flavors, but it's eventually the same idea: in order to make progress against cyber risks, we have to engage the business more and more, and there's no better way to do that than through those who hold the business accountable, not only to themselves but to shareholders.
So we're seeing a lot of that, and it's something Google Cloud has been doing a lot of, through some of our board horizons reporting and some of the things we've recently published: trying to equip boards with the information and knowledge they need to ask the right security questions, not only of the security team but of the others on the board, and also to ensure that the information being brought to them is effective and helps them, in their role as leaders and overseers of an organization, do that job effectively. So there's a lot there. What the FDA has recently done in finalizing guidance around medical device safety and security is really positive. I know there are some deadlines later this year where, effectively, and this is a summarization, organizations that sell regulated medical equipment will need to meet security standards in order to sell their products, which is new. It's something the industry has been pushing for, for a really long time, because end-of-life software and end-of-life hardware are not secure.
And nobody really believes that should be allowed to remain true of equipment that's being bought and used these days. So, on top of what's already been done to secure medical devices, adding a regulatory angle I think is super positive. We're also hearing about and seeing, mostly in Europe, issues around digital sovereignty, data residency, protecting the information of individuals and ensuring that information about those individuals stays within those countries. That includes health data, but it also includes a broader array of data. Organizations operating in those countries, and obviously health data is, in a sense, personal data as well, have to meet those regulations too. So you're starting to see further progression of the implementation of security rules in Europe, and different approaches and strategies that need to be adopted in this day and age to comply with how the rules now work, which not only say, hey, you need to be careful about who you share data with and how it's secured.
You also need to make sure it never leaves certain areas, which is an interesting problem, especially if you're working in cloud and other distributed technologies, where it's not always been easy to figure that out.

Michael Novinson: Interesting stuff. Taylor, thank you so much for the time.

Taylor Lehmann: Thank you.

Michael Novinson: We've been speaking with Taylor Lehmann. He is the director at the Office of the CISO for Google Cloud. For Information Security Media Group, this is Michael Novinson. Have a nice day.