Anna Delaney: Hello, this is the ISMG Editors' Panel, where ISMG editors gather to evaluate the week's top cybersecurity and technology stories and figure out what they will mean. I'm Anna Delaney, director of productions at ISMG. And I'm joined by colleagues Tom Field, senior vice president of editorial; Mathew Schwartz, executive editor of DataBreachToday and Europe; and Michael Novinson, managing editor of ISMG business. Very good to see you all.

Tom Field: Thanks for having us over.

Mathew Schwartz: Nice to be back.

Michael Novinson: Nice to see you.

Anna Delaney: Well, Tom, you have light shining, radiating. Tell us more.

Tom Field: I wanted to set the tone here for this discussion. I thought this might be a good look. Just a couple of weeks ago, I was flying out of my local airport in Augusta, Maine, and it happened to be about the time the sun was coming up. It's one of those moments where you look out the window, you just see it, and you know it's the moment, and then you think: there, I've got a virtual background for the next Editors' Panel. You know that feeling.

Anna Delaney: I do, indeed. Always on the lookout. Mathew, so from the sky to the upside-down world. Tell us more.

Mathew Schwartz: Yeah, I'm literally in the gutter here, Anna, on a street in Dundee where the rain seems to never stop. It's actually been really arresting, visually. I've launched a project on reflections, just because we have had so much water these last couple of months that there have been some really lovely reflections, especially around dusk, which is where we are here.

Tom Field: I was going to suggest that this was the Stranger Things version of Dundee, but apparently not.

Mathew Schwartz: It's always the upside-down world here.

Anna Delaney: That's a reflection in the water.

Mathew Schwartz: Yeah, this is a reflection. So if you were to pan upwards, there would actually be the building up there.

Anna Delaney: Excellent. So clear. Brilliant. And, Michael, you've just got to explain.

Michael Novinson: Of course. Coming to you from Orange, Connecticut, site of the Pez Museum. It's been a Pez factory for quite a while.
They opened it as a Pez Museum back in 2011. My family was passing through on the way home from Thanksgiving and figured, why not take a little stop there to learn about the history of Pez and all the different dispensers. You can see all the Pez elves, make your own customized Pez, and somehow I came home with three separate Pez dispensers. So not a bad use of $10, and not a bad way to break up driving through the Connecticut highways.

Anna Delaney: A museum for everything, it seems.

Michael Novinson: Yes.

Anna Delaney: Well, I'm in the beautiful town of Corleone in Sicily. And if you've watched The Godfather series, you may remember the character Vito Corleone, played by Marlon Brando and inspired by this town, because in previous years it was known for its association with the mafia. It's an image that the residents are very keen to dispel. But regardless, it's very much worth a visit if you are in Italy.

Tom Field: You have new connections, Anna?

Anna Delaney: Working on them. Well, Michael, let's start with you this week, because it's fair to say that the week leading up to the Thanksgiving holiday in the U.S. was dominated by one story, and that was the OpenAI and Sam Altman saga. You were working, it has to be said, around the clock on the continually changing story. Now the drama has settled somewhat, at least for now. I'd love to know where we are right now, but I'm sure that will include a recap of what happened and how the story evolved.

Michael Novinson: Absolutely, Anna, and thank you for the opportunity. So let's briefly address how we got here, and then we can get into what things are going to look like going forward. We are talking about OpenAI, the nonprofit organization with a for-profit wing that created ChatGPT, which it publicly launched approximately a year ago. Three of its co-founders, including Sam Altman, have been with the company since it was founded. So on the Friday afternoon before Thanksgiving, with a lot of Americans heading out for a long Thanksgiving break, a news release drops at 3:30 that afternoon saying that there's been a CEO change.
Sam Altman has been removed, specifically because he wasn't completely candid with the board, and the board had lost confidence in his ability to oversee the company. It completely blindsided everyone, which is hard to do nowadays in the tech world, and it led to a ton of drama. There were multiple efforts to reinstate him. The company went through multiple interim CEOs: the first interim, who was their CTO, essentially moved aside because she wanted Altman back. So they brought in another interim CEO, the former CEO of Twilio, who eventually put pressure on the board, saying that unless it could produce documentation of what Sam Altman had done wrong, he was also going to resign. So after 106 hours and multiple false starts, and threats that Altman and most of the OpenAI staff would even go to Microsoft, they did finally come to a resolution. Sam Altman is back as CEO. Greg Brockman, the co-founder who was president and had been removed as chair of the board, is back as president. Ilya Sutskever, the third co-founder and chief scientist over there, who was the one who informed Altman of the firing, for now appears to still be at the company, not on the board, but as chief scientist. And there's just a lot of uncertainty about where things go from here.

So you really had two camps within OpenAI. Recent reporting has fleshed out that you had the folks who were focused on commercializing, who really figured that if OpenAI takes the lead here, we can develop AI in a safe and responsible manner, and we want to try to get our products into the hands of as many people as possible. That's a camp that includes Sam Altman and Greg Brockman, and perhaps some members of the board who had left in 2023. And then you had a more altruistic wing of the board, and maybe Sutskever was aligned with them, who were concerned that the company was moving too fast and doing too much from a development standpoint, that it really needed to focus on research and on making sure that AI was working for the benefit of humanity, and who wanted the company to slow down a little bit. Those are the folks who essentially forced Altman out. It's unclear how much of it was ideological and how much of it was personal.
But here's where we stand now. Essentially, in order to break this impasse, there's a three-member board, and it's not really clear whether any of them are in the AI-skeptic camp or the AI-acceleration camp. You don't have any OpenAI employees on the board. You have Bret Taylor, the former co-CEO of Salesforce, well-known and well-liked in Silicon Valley; he's chairing the board for now. You have Larry Summers, who is really from the public policy world: treasury secretary under Clinton, president of Harvard University in the 2000s, so more of a public policy than a technology background. And then you have the CEO of Quora, who is the one holdover from the previous board to the current board. He was part of the board that fired Altman, but he remains on the board.

The question from here, internally to OpenAI, is really how the board gets from three members today to nine members, which is where indications are the board wants to end up. Recent reporting has suggested that the board is still going to be a nonprofit board focused on developing AI for the good of humanity, not on the interests of shareholders or investors. So signals today are that Microsoft will not be getting a voting board seat, and other investors such as Sequoia and Thrive Capital will not be getting voting representation on the board; it will be different from a traditional for-profit board. Whether they will have a non-voting observer is to be determined. In the Sam Altman firing, Microsoft was notified one minute beforehand, and given that they own 49% of the for-profit entity and have invested $13 billion, I don't think anybody wants a repeat of that going forward. So yeah, I think there are just a lot of questions about who populates the board. It seems like if anybody is firmly in one of the camps around AI, they're not going to be a good fit, but obviously you want people with subject matter expertise.
There are also signs that they are conducting an internal investigation into some of the claims made by the board. Once that investigation concludes, does Altman get a board seat back? Given that he's the CEO of the company, normally CEOs do sit on their own board; even if they're not the chair, they're usually a director. In terms of the external piece of it, OpenAI is still the clubhouse leader. They're aligned with Microsoft, ChatGPT was really first to market with generative AI, and by most reporting it has a significant edge over Bard or Anthropic or some of the others who are taking this on. But over the past several days, some larger enterprises have kicked the tires on Anthropic or Bard, some of the alternatives that they wouldn't have considered otherwise. Will some of those customers migrate over, for a better price or because something took them by surprise? Possibly. And then I think the other piece here is really around those third parties who are building products, either on OpenAI or on other platforms. Do they want to be so focused on building on just OpenAI? Or is that seen as too risky nowadays, given the corporate structure? Do they want to diversify a bit and have some capabilities on Anthropic or around Bard, so that if OpenAI goes in a direction they don't like, their company doesn't go kaput? So those are all important things to watch in the weeks and months ahead.

Tom Field: May I ask you a question, Michael? You and I have been back and forth on this a lot since the weekend this all was coming down. Where were the adults in the room?

Michael Novinson: It's a good question. It is interesting, because we're trying to develop AI that can predict and reason the way humans do, and it does seem like the board didn't really think many steps ahead. They made an accusation in public that Altman wasn't completely candid, which again is extremely unusual. I've seen a lot of CEO departure press releases: they want to spend more time with their family, they're dealing with health issues, almost never anything of an accusatory nature. So of course, they say this.
And then lots of people in private, company employees, Microsoft, Sequoia and other investors, go to them and say, okay, we want evidence: when was he not completely candid? Of course, it's not that they have to put this out in public, but at least in private they should share with people some examples of when this happened. And by all accounts, they didn't produce any. There was some talk that he was actually so deceptive that they weren't able to document his deception, because he was so tricky. But after a while, that's not going to hold up. So in terms of your question, Tom, about adults in the room: it was a really unusual board. At the time, yes, you had the CEO of Quora, you had somebody who had run the security research team at Georgetown, and then you had a philanthropist, the spouse of a well-known American actor. These were not people who generally sit on major corporate boards, which usually have C-suite figures from major multinational corporations. So I just don't think they anticipated the level of scrutiny and pushback. And if they had, they might have engaged with legal counsel and with crisis communications prior to making that announcement.

Anna Delaney: There is this recurring phrase: lack of transparency. Do we know whether OpenAI plans to take steps to rebuild trust among investors, its employees and the public?

Michael Novinson: Yeah, I certainly think that's important. I think it would have been more important if it had been a different leadership team and you were having to explain to customers why these new folks are in charge. Given that Altman and Brockman and Sutskever and the founding team are there, and there was not a mass resignation of employees (of course, some may have chosen a different path), I think there's less reassurance needed. But I think all eyes are really on who fills out the rest of this board. Do they seem to align more with Altman? Do they seem to align more with some of the AI skeptics? Are they folks from outside the AI sector entirely, because anybody who's been in AI is too hot to touch, given that they have a point of view on this?
So I think that's going to be awesome to watch. And the one other piece to watch here, too, is Microsoft. They've had a very partner-centric AI strategy, really leaning heavily on OpenAI to power their ambitions in this space. They've already talked about an AI research team, so do they start to develop more capabilities internally? Do they feel it's too risky to rely solely on a partnership? Do they need to have native capabilities that they have direct control over? That, to me, will also be an interesting piece to watch.

Anna Delaney: So many questions. Well, thank you so much, Michael, for now, on this story. Well, Tom, you have spoken with a Gartner Research analyst, Avivah Litan, more generally about generative AI trends. So what can you share?

Tom Field: Well, maybe it came from this, because all this was going on. Avivah is one of the foremost analysts at Gartner following the AI space, and risk in particular. So the conversation started out a week ago with her observations of this drama, and then we sat down after things resolved and talked about its potential impacts. And by the way, it coincides with the first anniversary, the public birthday, of ChatGPT, which is this week, so it was a timely opportunity to sit and talk about the current state and future of generative AI. But I did ask her specifically, after all that went down in the 106 hours that Michael so thoroughly documented, what impact she expects to see in this nascent industry. So I want to share with you an excerpt of our conversation.

Avivah Litan: I think companies and organizations of any stripe that use AI will be more reluctant again to put their eggs in with one vendor, and will be looking for solutions that buffer them from the centralized powers, if you will. So there is a layer of middleware services emerging with AI that basically ensures that companies keep their own intellectual property, the data stays with them, and they're independent of the backend. So if one player fails, they can just move their logic to another player.
So from a tech industry point of view, those solutions are going to become much more imperative now. There's a great selling point: do you want to be locked into OpenAI? The answer is no. Then use our middleware software, which will keep you safe and secure, and you will own your intellectual property and independence. So I think that's how the industry has changed from a technology perspective. And from a psyche perspective, people realize how fragile this whole scene is, where the fate of AI is in the hands of boards and CEOs and individuals. Not that they're bad individuals, but they've got too much power. And to me, that's scarier than anything.

Tom Field: She's spot on. There's too much power. And that's scarier than anything.

Anna Delaney: And so following the saga at OpenAI, Tom, has your perspective shifted in any way when it comes to how this tech evolves, but also how it's managed, how it's governed?

Tom Field: You know, we've said consistently for the past year that we've never seen a technology enter the market the way generative AI has. In all my years of journalism, I've never seen a story, a drama, evolve the way the Sam Altman one did over the course of five days a week or so ago. It raises some significant questions. This isn't how investment properties behave. This isn't how boards behave. Everything here is new to the Silicon Valley scene. I made the joke about where were the adults in the room; we need some more adults in the room. This is a technology that needs guardrails, not just within the customer organizations, but within the provider organizations. So I don't want to use the term wake-up call, but I hope this makes somebody wake up and realize that we do need more guidance going forward for organizations such as these, because the power is great. If Spider-Man taught us anything, it's that with great power comes great responsibility, and good boards.

Mathew Schwartz: Yeah, and also, move fast and break things has been a Silicon Valley mantra, and that really got out of control here, as Michael and you have so eloquently stated.

Anna Delaney: Well, Mat, moving on to your story.
There's been a 303 00:16:57.420 --> 00:17:00.360 gain in the fight to stop ransomware gangs or at least 304 00:17:00.360 --> 00:17:03.660 make their lives more difficult as police have busted high-profile 305 00:17:03.660 --> 00:17:07.020 ransomware gang suspects in Ukraine. Tell us about it. 306 00:17:07.570 --> 00:17:09.850 Mathew Schwartz: Yes. So some good news on the ransomware 307 00:17:09.850 --> 00:17:15.070 front. Last week, police in Ukraine, Ukrainian cybercrime 308 00:17:15.070 --> 00:17:19.930 police, backed by law enforcement officials or 309 00:17:19.930 --> 00:17:23.410 investigators, I guess, from multiple other countries, 310 00:17:23.530 --> 00:17:28.960 including I believe the FBI, the US Secret Service, also Norway, 311 00:17:29.020 --> 00:17:34.960 Germany, busted some ransomware suspects, including the alleged 312 00:17:34.990 --> 00:17:40.930 ringleader of a group that launched in 2020. And it's not 313 00:17:40.930 --> 00:17:43.420 clear to me if this is a stand-alone group that developed 314 00:17:43.420 --> 00:17:46.990 its own ransomware or if they procure the ransomware. 315 00:17:47.380 --> 00:17:50.830 Authorities said they definitely improved the ransomware. So 316 00:17:50.980 --> 00:17:57.280 we're talking about strains like LockerGoga, Dharma, MegaCortex, 317 00:17:57.700 --> 00:18:02.440 Hive. Now Hive I definitely know for sure was, well, I'm pretty 318 00:18:02.440 --> 00:18:04.450 sure it was developed by another group and a lot of people 319 00:18:04.450 --> 00:18:09.190 participated in it. But this particular group that's been 320 00:18:09.220 --> 00:18:14.980 rolled up, has been charged with hitting some really big victims 321 00:18:15.310 --> 00:18:22.390 - 1,800 victims or more across over 70 countries. Since 2018, 322 00:18:23.110 --> 00:18:26.770 operating from Ukraine, and apparently operating not just 323 00:18:26.770 --> 00:18:32.080 from 2018, but after Russia launched its all-out invasion of 324 00:18:32.080 --> 00:18:38.020 Ukraine in February of 2022. One of the really big victims 325 00:18:38.200 --> 00:18:44.440 ascribed to the group is Norsk Hydro, the Norwegian, louver aluminum 326 00:18:44.500 --> 00:18:49.360 giant, which led to Norwegian authorities getting closely 327 00:18:49.360 --> 00:18:54.850 involved in this investigation. The group in 2019, same year as 328 00:18:54.850 --> 00:19:00.610 Norsk Hydro, also allegedly hit a chemical company in the 329 00:19:00.610 --> 00:19:06.040 Netherlands, owned by a U.S. firm and demanded a ransom, then 330 00:19:06.040 --> 00:19:12.730 worth $1.3 million. So start to multiply this by the 1,800 331 00:19:12.730 --> 00:19:15.880 victims or more. The ransom demands wouldn't have been the 332 00:19:15.880 --> 00:19:19.720 same, the ransom payments, who knows what they were, but you're 333 00:19:19.720 --> 00:19:25.000 looking at potentially a lot of illicit revenue for this group 334 00:19:25.090 --> 00:19:29.560 operating from Ukraine. This is the first time we've heard about 335 00:19:29.560 --> 00:19:32.740 this group. About a year ago, there was a first round of 336 00:19:32.740 --> 00:19:36.700 arrests of 12, what were described as high-value targets 337 00:19:36.850 --> 00:19:42.850 in Ukraine and Switzerland. 
Shortly thereafter, authorities released free decryptors for at least some of the files that would have been encrypted by this group, working with the Romanian cybersecurity firm Bitdefender. The Swiss authorities said they developed decryptors for LockerGoga and MegaCortex ransomware. And so now, here a year later, we're seeing what looks to be the application of digital forensic evidence that was seized in that initial raid, based on an investigation that originally launched in 2019. This additional evidence has allowed them to identify who they suspect the ringleader is, along with a handful of accomplices. So it appears to be very slow justice, I won't say a rolling up, because it appears the group has been defunct since the end of last year, but very slow justice catching up with these alleged, massive-pain-causing, ransomware-wielding criminals operating from Ukraine.

Anna Delaney: Excellent, Mat. And so, as you said, the operation against this cybercriminal group started in 2019. What challenges did law enforcement face throughout that time when it comes to tracking and dismantling these highly sophisticated operations across all sorts of countries? But also, do you think the war in Ukraine with Russia has had an impact at all in hindering or even helping law enforcement?

Mathew Schwartz: The short answer is, I have no idea. I've fired off questions about this to the FBI and others, even with the decryptors that came out last year, just saying, look, is this group responsible for developing LockerGoga? Or were they procuring it from somewhere else? And what I heard back was: no comment. And I suppose because this investigation is continuing, and authorities have said that it's still continuing, we're not going to get that kind of detail. That's the stuff that maybe will come out if these people go to court, as in a trial by jury, as opposed to reaching some kind of a plea agreement. Definitely, though, I would think that having this full-fledged war happening must complicate things.
And so kudos to authorities in Ukraine for continuing to chase down the suspects, continuing to press forward with this investigation, working very closely with partners backed by Europol, which has been coordinating and offering intelligence, and Eurojust, which has also been coordinating and helping with all this. So it's a big investigation, with lots of going back and forth, people from Ukraine at Europol and vice versa. So yeah, big kudos. In the middle of a massive war, this can't be easy, but it's great that they are continuing to press on these criminals. I hope we see more of this.

Tom Field: Follow-up question. When you hear yourself talking about these ransomware criminal groups, do you ever stop to think that it sounds like you could be talking about the Transformers?

Mathew Schwartz: Give me a little bit more to work with there, Tom.

Tom Field: Rattle off these names again.

Mathew Schwartz: Oh, LockerGoga and MegaCortex.

Tom Field: Why can't they be Transformers?

Mathew Schwartz: Well, I mean, there's a sociology treatise or whatever to be written about the fact that a lot of these groups are men in their early 20s, with a lot of time on their hands, talking a lot of smack. There's this soap opera of "my crypto-locking code is worse than yours." There's lots of adolescent-level drama, so I'm not surprised that you're taking away a certain, I don't know, I won't say youthful flavor, but less mature, perhaps, language being used around a lot of this stuff.

Tom Field: That would be my beat. Yes.

Anna Delaney: That sets up my final fun question very nicely. If we were to transport a historical figure into the digital age with today's technology, which one do you think would excel as the most formidable hacker, and why?

Tom Field: Benjamin Franklin. He invented the lightning rod, he invented the Franklin stove and bifocals, created the first library, was the first postmaster of the U.S., and is notable for so many innovations and inventions. I think he would do a terrific job in today's era. I would look forward to reading his Poor Richard's AI Almanack.

Anna Delaney: Excellent choice. Mathew?
Mathew Schwartz: Yeah, Benjamin Franklin is a really good one. I would transport, just for fun or whatever, Charles Ponzi, who has not been with us since 1949, although his name lives on. He certainly didn't invent the concept of robbing Peter to pay Paul, but he perhaps industrialized it on a scale we'd never seen before, where you get people to invest and you pay off the early investors with the later investors, promising them returns that you just can't deliver.

Anna Delaney: Very good. Ponzi, great choice. And Michael?

Michael Novinson: I was thinking of Napoleon. I mean, my goodness, in the pre-cyber world, he escaped Elba, escaped an island, with hundreds of people, commandeered a ship and took over. He told the military that they were welcome to shoot him, and they just let him take over as leader, at least for another couple of months, until all the other countries kicked him out again. But my goodness, if he had our cyber tools and cyber weaponry in his hands, imagine what he could do, given how effective he was at social engineering in a non-cyber, non-digital world.

Anna Delaney: That's a good one, and very topical. Well, I was going for the 16th-century astrologer and physician Nostradamus. I think he'd serve very well as a modern-day digital oracle, who would have the ability to perceive vulnerabilities, conduct predictive cyberthreat analysis and be a master at staying hidden in the digital realm, and hopefully be able to use AI to predict future developments. And I also want to ask him what he thinks about it all, and how soon we would see AGI. So plenty of questions for him.

Mathew Schwartz: Didn't he predict AI, Anna?

Anna Delaney: Did he?

Mathew Schwartz: And cybercrime, everything.

Tom Field: Everything else put together. It will be a hell of a dinner party.

Anna Delaney: Yes, this will be a great one. Drinks on me. Okay. Thank you so much, Tom, Michael, Mathew. Always a pleasure. Fantastic!

Michael Novinson: Thank you, Anna.

Mathew Schwartz: Arrivederci, Anna.

Anna Delaney: Arrivederci, ciao. Thanks so much for watching. Until next time.