Tom Field: Hi there, I'm Tom Field, senior vice president of editorial with Information Security Media Group. It's my privilege to be talking today with Anna Westelius. She's the director of security engineering with Netflix. Anna, thanks so much for taking time to speak with me.

Anna Westelius: Thank you for having me.

Tom Field: So let's start here. You are new to the ISMG studio.

Anna Westelius: Yeah.

Tom Field: Talk a bit about your career and how you arrived in this role today.

Anna Westelius: Sure, absolutely. I come from a bit of an untraditional background; I am self-taught in cybersecurity. When I grew up, we didn't have cybersecurity programs at universities or anything, so I taught myself. I ended up running a couple of companies myself, mostly to make money: you have to get by, get more computer parts. Then I started working at larger security firms, and I ended up essentially running an MSSP out of Stockholm for a long time. We split that off as a subsidiary, and it ended up being acquired by a company here in California.

So I've been here for about eight years, doing a variety of security research and engineering leadership roles at a variety of companies, both on the security vendor side and in big corporations. And now I am fortunate enough to lead the security engineering team at Netflix.

Tom Field: How long have you been there?

Anna Westelius: Two years.

Tom Field: Excellent. Now, you were speaking at RSA Conference, and the topic was "Construction Time Again: A Lesson in Paving Paths for Security." What's the premise of this presentation?

Anna Westelius: It's about figuring out how to build sustainable security programs for scale. A lot of the time, especially in application security, we do a lot of manual reviews and assessments, and we spend a lot of time repeating manual work. What we've done at Netflix instead is let the fire burn a little bit and make significant, high-leverage infrastructure investments that give us more leverage over time than doing those manual reviews. So this is really a template for others to take and figure out which high-leverage, high-impact security product investments they can make to help their security programs scale.

Tom Field: Now, Netflix, you're pretty famous for some, what would we say, non-traditional security testing? Is that fair to say?

Anna Westelius: Yeah, that's fair.

Tom Field: How would you describe that? You sort of blow things up.

Anna Westelius: We do, a little bit. We do a lot of security chaos engineering and things like that. It's been a big part of our strategy. Historically, we've had, I guess, the fortune of being an entertainment company, so our threat model is different than it would be for, say, a bank.

Tom Field: Yes.

Anna Westelius: And so with that, we've had a lot of time and effort to spend on these high-leverage technology investments instead of doing some traditional security.

Tom Field: Talk to me about the challenges organizations face when they do want to scale up their programs.

Anna Westelius: I think for us in particular, one of the challenges is the diversity of our portfolio now. We used to be a software company, and we're now a Hollywood studio. We're a gaming company.

Tom Field: It used to be a DVD company.

Anna Westelius: Yeah, exactly. And so now we're doing a lot of different things. And people in those areas, I think, understand security very differently.

Tom Field: Sure.

Anna Westelius: So a lot of our effort is translation, but also scale as the company grows. We have more than 7,000 internally developed applications that support our ecosystem, and managing security for all of that is quite extensive.

Tom Field: When you talk about paving roads for security, what are you paving? Are they cow paths?

Anna Westelius: Not really. I would say that paving roads is more conceptual: it's making developers' lives as easy as possible instead of telling them what to do. In certain areas, you might have somebody wanting to do something fairly esoteric, like working in a language we're not supporting, and that's fine. But the easiest, best-supported path is also the most secure one. With that, we can be almost entirely hands-off as a security engineering team, which is great, and it lets us focus on the more difficult things.

Tom Field: Now, there's a natural tension between security and developers that many organizations deal with, and this has become a Mars and Venus issue. How have you bridged that?

Anna Westelius: It does.
And we were very intentional about essentially being the security team that doesn't say no. That's put us in trouble a couple of times, I think, but it has really put our relationship with our developers at a high level. They seek us out for guidance, which I think is amazing, and we owe that to all the people on the ground who've been working on building those relationships over time. But we avoid that tension as much as we can.

Tom Field: So talk to me about how you create shared accountability and the relationships that will support that.

Anna Westelius: It's really about setting shared goals. We meet with all of our cross-functional partners on a recurring basis. We set goals together, and we build toward them collectively. So if they have a big infrastructure initiative that we would like to invest in, we build security controls where they are, as opposed to finding our own novel way that they then have to discover and learn.

Tom Field: Is this something you can pick up and move with other departments as well? Or is this unique to security and development?

Anna Westelius: No, we work with the entirety of the business. And to my point earlier, people in Hollywood have a very different relationship with security. So with that, we're trying to meet them where they are, in their technologies and their understanding of those workflows. We are doing a lot of work in Google's G Suite, for example, because that's where we have our documentation and that's where a lot of the Hollywood people work. So we make sure that we build security controls that they don't have to understand, but that meet them where they are.

Tom Field: OK, so you find opportunities, you build the relationships, and you're able to ultimately reduce security risks. So ...

Anna Westelius: Yeah, we build the right things, and then we measure the right things. I think metrics are really important. In a lot of cases with security controls, people think about coverage more than they think about actual utilization of those controls. So it's really important to understand how you've reduced security risk with the capabilities that you're putting in place.

Tom Field: Can you give me some specificity?
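The coverage-versus-utilization distinction lends itself to a small illustration. The sketch below is hypothetical (the asset names, record fields, and numbers are invented for this example, not Netflix tooling): it shows how a control can score high on coverage while part of the fleet never actually enforces a meaningful policy.

```python
# Toy coverage vs. utilization metrics for a security control.
# All names and data here are hypothetical, not any real tooling.

def control_metrics(assets):
    """Return (coverage, utilization) fractions for a list of asset records."""
    # Coverage: the share of assets that have the control deployed at all.
    deployed = [a for a in assets if a["control_deployed"]]
    coverage = len(deployed) / len(assets)
    # Utilization: of those, the share actually enforcing a real policy.
    enforcing = [a for a in deployed if a["policy_enforced"]]
    utilization = len(enforcing) / len(deployed) if deployed else 0.0
    return coverage, utilization

assets = [
    {"name": "billing-api",  "control_deployed": True,  "policy_enforced": True},
    {"name": "legacy-batch", "control_deployed": True,  "policy_enforced": False},
    {"name": "ml-sandbox",   "control_deployed": False, "policy_enforced": False},
    {"name": "edge-proxy",   "control_deployed": True,  "policy_enforced": True},
]

coverage, utilization = control_metrics(assets)
print(f"coverage={coverage:.2f} utilization={utilization:.2f}")
# prints: coverage=0.75 utilization=0.67
# Coverage alone looks healthy (3 of 4 assets), but only 2 of those 3
# actually enforce a policy, which is the risk reduction that matters.
```

The point of separating the two numbers is that a dashboard reporting only coverage would miss the `legacy-batch` case entirely.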
Give me some examples of what you've done and how you've been able to build these bridges.

Anna Westelius: For example, look at things like access control: it's one of our most impactful controls for security, along with vulnerability management and incident response. When you implement it, you obviously have to make sure that people are actually putting the right policies in place, as opposed to just putting anything in place, because a lot of the time, if you investigate those policies, they might not necessarily be correct.

Tom Field: For you, what are the security risks of greatest importance today?

Anna Westelius: That is a really hard question, and I don't know if I could really answer it.

Tom Field: Because you've got content, certainly, that people want to pirate. You've got a service that some would like to disrupt.

Anna Westelius: Yes. And I think it depends so much on what sort of organization you're running; everybody's threat model is so different. Again, we're an entertainment company, not a bank, so our investment is going to look very different than, say, PayPal's or Wells Fargo's.

Tom Field: Absolutely. How has this shaped your career?
Anna Westelius: What is the priority? I think it's had me learn a lot of new things. You pivot between different technologies, because that is what's current right now, and you have to learn very quickly where the real risks are. And as a security industry, I think we're still not very good at handling the same core issues we've been dealing with for years and years. Look at every breach we've had in the past several months: it's all credentials in a variety of ways, MFA, extortion, things like that. So I think it's about balancing the shininess of new technology with fixing these core issues at scale.

Tom Field: So, as someone who, as you say, came up through a non-traditional path in security, you see the opportunities around you now, and there's great opportunity for people coming in. What would your advice be to a younger Anna establishing a career today?

Anna Westelius: Go for the things that motivate you. As security professionals, we oftentimes find ourselves in this uphill battle: you're pushing the rock uphill, and you can't fix everything. So you really need to be motivated and excited by the work that you're doing to counterbalance, not the sadness, but the impact of that continuous effort.

Tom Field: What motivates you?

Anna Westelius: Supporting people. The more senior I've gotten in my career, the more it's become about making the people doing the work, who are way smarter than I am, really successful. Seeing them be able to have that impact is really what motivates me.

Tom Field: When you give this presentation, "Paving Paths for Security," what are the types of questions you get from people? What's the feedback?

Anna Westelius: A lot of people just want to know how to implement the tools that we have. Again, we have a fantastic developer productivity organization that's built a lot of the foundations we're securing on top of, and we make that available. We have a strong OSS culture at Netflix, so we've open sourced a lot of our tooling. A lot of people just ask, "How can I plug and play the thing you're doing?" That oftentimes doesn't work, because their environment is different; they use a different tech stack or whatever. But oftentimes that's the first question: How do I do this?

Tom Field: Well, I appreciate you taking time to share insight with us today, Anna. Thanks so much.

Anna Westelius: Thank you.

Tom Field: Again, I've been talking with Anna Westelius. She is the director of security engineering with Netflix. And for Information Security Media Group, I'm Tom Field. Thank you for giving us your time and attention today.