Anna Delaney: Hi, and thanks for joining us for the ISMG Editors' Panel. I'm Anna Delaney, and this is a weekly editorial analysis of the top trending stories in cybersecurity. It's great to be joined today by friendly faces: Tom Field, senior vice president of editorial; Marianne Kolbasuk McGee, executive editor of HealthcareInfoSecurity; and Mathew Schwartz, executive editor of DataBreachToday and Europe. Very good to see you all.

Tom Field: Nice to be seen.

Marianne McGee: Thanks for having us.

Mathew Schwartz: Great to be here! Trick or treat!

Anna Delaney: Trick or treat, indeed! Tom, I know this is not the happiest story to start with, but tell us about your background.

Tom Field: Oh, you can see the Lewiston Strong photo behind me. You saw the news last week about the shooting here. I live 30 miles from where last week's shootings occurred, and it was a uniquely frightening week for me, for my family, for friends and for the greater community. For those who don't know, Lewiston is an old mill town that until last week was probably best known as the city where boxer Muhammad Ali staged his first title defense back in 1965.
And now it's forever going to be known as the site of a senseless mass shooting, the deaths of innocent people and the tearing apart of countless lives. I'll end by saying that for 48 hours we lived in isolation; we lived in fear. Our schools, stores and restaurants were all closed, helicopters flew overhead, we heard conflicting news about the number of shooters and victims as well as the progress of the manhunt, and Halloween activities were canceled. And honestly, we had to answer our children's questions that we'd never heard before. So that's why I used this background today.

Anna Delaney: Unimaginable, and our hearts are with you all. Mathew?

Mathew Schwartz: Well, there's a little bit of ancient history here. I was in Aberdeen, north of where I live in Scotland. This is a statue of Robert I, also known as Robert the Bruce, who was king of the Scots from 1306 until his death in 1329, and he led Scotland during the First War of Scottish Independence against England. Here we are.

Anna Delaney: We're always getting ourselves into trouble with you, aren't we? Marianne? And trick or treat, indeed, to Mat's comment.
Marianne McGee: Yeah, I'm just sitting near a farm stand that has some interesting white and orange pumpkins. That's all I've got.

Anna Delaney: You've got the best pumpkins! That's all good. Well, I'm standing by part of a mosaic artwork in a subway station in Brooklyn, New York. I think I shared part of this very artwork last year, but that was depicting the moon, and this is obviously the sun. The idea of the work is to bring the sun, the sky and the moon to the underground, so I thought it was quite uplifting; it's quite lovely.

Tom Field: Very good! Well done!

Anna Delaney: Well, Tom, you've recorded another interview in a series that meets with Israeli cybersecurity leaders and looks at the cybersecurity scene there. We had a very moving and important conversation about building and rebuilding business resilience last week. So why don't you tell us about this latest conversation?

Tom Field: Yeah, well, I've done a series of these. I think I had recorded one when last we spoke, and I've done three more since. And honestly, it's not disconnected from what I talked about up front.
Because having gone through the experience of what was happening in Maine last week while conducting these interviews, there was a sense of kinship. For me, what happened in Maine was personal. I was born in Lewiston, I still have family members who live in the area, and my wife works in the community. We both first heard about the tragedy from her family and friends who were locked down that evening, fearful for their lives. And it was surreal to drive by the very hospital where I was born, to see international news crews doing their stand-ups, and to know that at that time we were at the epicenter of the biggest news story in the world. So it gave me, just briefly, a sense of what our friends and colleagues in Israel have experienced over the past three to four weeks. And as I interviewed these security and technology leaders, I had a better empathy for the trauma their society is amidst right now. Not to the same level by any degree; I don't pretend to know what it's like to live in a warzone and to experience lives turned upside down to the degree that they have. But I do know what it's like to live in fear and to answer questions you've never been asked before.
So I want to share with you an excerpt of one of the interviews, where I asked the security leader, in this case Chen Shmilo, from Israel, what his message is to the world, and we'll share the excerpt of what he shared with me.

Chen Shmilo: The story of the Israeli high-tech is resilience, agility and boldness. We, the Israeli people, are resolved not just to beat Hamas-ISIS, but also to keep our ecosystem thriving. I know, and this is the message to my employees, to the board members, to different tech ecosystem players who are our partners: Israel will keep being the flagship of daringness and out-of-the-box thinking. We will have to work together with domestic but also international players to benefit from the knowledge and spirit that have been uplifted and upgraded during the war. Personally, I know that we, as the 8200 Alumni Association, are going to expand the scope of the activities from ideation to post-acceleration programs by strengthening the 8200 hub and recruiting new partners.
And even more specifically, a month before the war started, we were already working and preparing our 8200 global first batch to New York to present magnificent Israeli startups who are ready to scale to the U.S. market, which, unfortunately, had to be stopped. But I cannot wait to renew the 8200 global first batch, come to New York City, present the Israeli innovation, create new partnerships and keep the Israeli economy thriving.

Tom Field: A message of resiliency. And one thing I'll point out: I ask these leaders consistently, what is your message to your employees, to your customers, to your global partners? I want to share mine. The crisis has passed, the news crews have certainly moved on, the headlines are focused on different topics, but lives and families are forever shattered. And a rural community that once thought "never here" now knows completely differently. It's not better, just different. We're going to heal, we're going to sing, we're going to dance, we're going to play, we're going to celebrate Halloween this very evening. But we will never, ever forget what we've experienced.
And we pray that our children and our children's children never see such an experience again. So, for me, it's been a very moving week.

Anna Delaney: Yeah, I'm sure. And he mentioned collaboration there. Do you get a sense that there is stronger collaboration, perhaps, than we're used to in our cybersecurity companies, whether in the U.K. or the U.S.? Is there much collaboration among companies in the Israeli cybersecurity ecosystem that maybe we could learn from?

Tom Field: Well, it comes from crisis, doesn't it? I think we're seeing in Israel something similar to what Marianne, Mat and I saw in the U.S. after the terrorist attacks of September 11, back in 2001: the competitive walls come down, and you'll see rivals work together for a common cause. That's happening in Israel right now. I have no doubt that in some ways the Israelis will emerge stronger from tragedy. So yes, you are seeing collaboration that you wouldn't have seen two months ago.

Anna Delaney: Thanks, Tom. We look forward to watching the interview on our sites later this week. That's great.

Tom Field: Thank you very much.

Anna Delaney: So, Mathew, time for more ransomware trends. What's been happening?
Mathew Schwartz: Well, it's more ransomware all the time. And, unfortunately, the latest look at ransomware that we've seen suggests that ransomware is continuing to increase. Now, that might not sound surprising, but we have seen rises and falls, and it's never clear which way it might be going. Unfortunately, though, we have seen two record-breaking months this year so far, including September, which is the last month for which we currently have data. Specifically, researchers who look at ransomware groups have been counting the number of victims that get posted to data leak sites. I like to call this "public displays of infection" because it's ransomware groups being really adolescent, listing all the victims that they claim haven't paid. And these are not perfect numbers to go by, either. It's not clear how many victims don't get listed, and it's also not clear how many victims did pay to avoid being listed. But if we look at how many victims have gotten listed, in September there were 514 victims, which, like I said, is, unfortunately, a record. We have some well-known names in the listings.
If you look at the top 10 groups listing the most victims, there's LockBit, for example, which counted 72 victims - again, victims who didn't pay; we have no idea how many did pay. Estimates are that possibly about 34% of victims do pay a ransom, which is part of what perpetuates the ransomware business model for these attackers and keeps them coming back for more. We have some established groups at play, and we also have some newcomers. There are a few groups you may have heard of: Cactus is one; RansomedVC is another, which recently posted a very high-profile victim in the form of Sony; another group is called 3AM; and another is called CiphBit. I mention them only to show that there is a continuing influx - I don't want to say of new talent, but at least of new players. All of the groups I've mentioned practice double extortion, which we've been seeing for years: a group steals, or claims to have stolen, data before crypto-locking systems, and then threatens to leak that data in order to get a ransom payment. They'll also demand a separate ransom for a decrypter. So they're trying to monetize these attacks in any way possible. So, not great news there. We're also seeing a lot of ongoing ransomware attacks targeting vulnerabilities that have just come to light.
There have been a lot of ransomware groups really quick off the mark in the last few months, looking for vulnerabilities. For example, there's a new Cisco vulnerability, which has been patched, that needs very careful handling to ensure that you've wiped your credentials from the memory of the device. And ransomware attackers are hitting these sorts of things quicker and quicker. We have also seen Cl0p this year hitting secure file transfer software, most recently Progress Software's MOVEit software. Marianne has been reporting on this extensively, as have I. The group managed to steal a lot of data. They didn't crypto-lock the systems, but they've been using extortion, threatening to leak the data, and earned an estimated $75 million to $100 million very early in the attacks, which they unleashed at the end of May, although the count of affected individuals and organizations is still coming to light. So, if I had to sum up the whole ransomware picture at the moment, it's not looking great. And a lot of experts are forecasting that it's going to get worse before the end of the year before it gets better.

Anna Delaney: What can I say? Fantastic news? Not at all.
But you've kind of summarized the evolution on the criminal side. What about the defender side? What are the most notable changes in the evolution of our defenses? And what's really working?

Mathew Schwartz: That's a great question, and the defenses have been getting better and better. I mean, all of this is not happening in a vacuum. Defenders have been responding, governments have been helping to improve cybersecurity resilience, and you have law firms and incident response firms who are helping victims practice. And the ones who really recover well are the ones who have practiced before they get hit, because it's a long, slow, painful process. But it's less long and slow and painful if you have reviewed how you're going to need to react when this happens, and you've looked at the kinds of defenses you have in place. So you know, in a worst-case scenario, whether you need to wipe all of your systems and restore from backups. If that's your worst case, that's actually not a bad case, because it means you're not having to pay your attackers for the promise, which they may or may not honor, of a decryption tool. So defense has definitely been getting better; I'm hearing that again and again from security experts.
I think another shift we're seeing is a cautionary note. So, yes, I've been talking about LockBit and some of the other big groups that we see active right now. But we shouldn't get too tied up on attribution. A lot of these groups are using the same tactics. Some of them are startlingly simple: it's remote access tools that they're guessing the password for, and they're getting in. That shouldn't be happening; that is happening. A lot of the attackers are affiliates who work with different groups. So maybe they're with LockBit today, and they're with Cactus tomorrow. It's less useful in some respects to look at these as standalone groups, and more useful to see them as just a bunch of individuals, like a drug cartel, who are doing something, doing it very well at the expense of others. And there should really be more of a focus on defense. We're seeing it, but I think there needs to be even more focus. Don't get too caught up in who these specific attackers are; look at what they're doing and how they're getting into networks, and push back on it.

Anna Delaney: Very good. Well said. Thanks, Mat.
Okay, Marianne, retail giant Costco is facing legal challenges over its alleged unauthorized disclosure of sensitive customer information to third parties. So, what's the story?

Marianne McGee: Sure, yeah. Costco is the latest company to face proposed class-action data privacy litigation involving its alleged use of online tracking pixels to scrape health and personal information about visitors to its websites and to transmit this information, allegedly, to third-party social media and marketing companies. Now, the litigation against Costco is certainly not the first class-action lawsuit we've seen filed against organizations accused of doing similar things with their website-tracking technology. But in the healthcare space, the cases against Costco are interesting and a little unusual, because Costco is really not thought of as a major player in the healthcare space. The company is best known for selling toiletries, appliances, TVs, automotive supplies, tires, furniture, office supplies and bulk-sized groceries.
But the two lawsuits filed against the company, both in the same federal court in Washington state, allege that customers who went on Costco's website to refill prescriptions, or to seek information about immunizations from Costco pharmacies, did not have knowledge of, or give their consent for, their sensitive information being scraped and shared with companies like Meta, Google and others. Now, the lawsuits make similar claims, including that Costco allegedly disclosed various identifiers, including IP addresses of individuals, which are considered protected health information under HIPAA, and that the warehouse giant also violated FTC regulations as well as federal and state wiretapping and other laws. So we have that going on. But meanwhile, in the background, you have the FTC and HHS also, in recent months, warning of potential enforcement actions against hospitals and telehealth companies over their use of tracking tools without the knowledge or consent of patients and consumers. And in terms of some of the other lawsuits we've seen filed against organizations like hospitals, some of that litigation has been settled.
And in the meantime, you also have Meta facing a large consolidated class-action lawsuit that's working its way through the federal courts. So it'll be interesting to see what happens with the Costco lawsuits. Right now, Costco hasn't yet responded to my request for comment on the litigation. I don't really quite know how this is going to shake out for the company. If it's bad publicity, will they settle? What's the defense there? The other thing I was going to mention, not related to the Costco saga, is that I'm also keeping my eye on how President Biden's executive order on AI this week, which was very thoroughly covered by our new colleague, Chris Riotta, will shake out for the healthcare sector and the Department of Health and Human Services. I've not dug too deeply into the executive order yet, but it looks like HHS is directed to establish a safety program to receive reports of, and then act to remedy, harms or unsafe healthcare practices involving AI. What exactly that means, I'm not sure yet.
I've made requests for FDA and other leaders to weigh in on that. As we know, for medical devices right now, for instance, or for drug development involving the use of AI tools, FDA is the regulator that oversees that. So it'll be interesting to see how this executive order plays out in terms of what sort of portal, perhaps, the Department of Health and Human Services will create for receiving public complaints about AI in healthcare and about unsafe or discriminatory practices. I don't know yet, but it'll be interesting to see how much oomph, I guess, this E.O. has for healthcare.

Anna Delaney: Yeah, something to watch for sure. And we've got our own AI summit in the U.K. as well, so we'll be sharing some key takeaways from there. But in terms of data privacy, Marianne, what should security professionals take away from these lawsuits and the growing scrutiny of companies using tracking technologies on their websites?
Marianne McGee: Well, when it comes to the healthcare space, both the FTC and HHS have offered similar guidance: you have to be very clear to your patients or consumers on your website that you're collecting certain information and potentially sharing it with other companies, and you have to get their consent. And also, under HHS and HIPAA, companies that are using, for instance, Meta Pixel or Google Analytics have to get business associate agreements from those vendors, saying that the vendors are taking certain actions to protect patients' information. And one of the complaints I hear often is that some of these large vendors, the Metas and the Googles, are hesitant to sign BAAs in many cases. So it's kind of a challenge.

Anna Delaney: Yeah, sure. Thanks, Marianne. Thanks for that update. And, finally, just for fun: if you could have an AI cybersecurity sidekick with any fictional character's personality, who would you choose and why? Go for it, Tom.

Tom Field: Well, I want someone that has good investigative abilities. That's what I want to use it for.
So, to make a 1980s television reference that you probably won't get, I'm going to have Tom Selleck as my partner, and I'm going to call it Magnum AI.

Anna Delaney: Well, I know Tom Selleck. He's great. Marianne?

Marianne McGee: Mine is also a reference from a TV show, but this one, I think, was mostly 1970s: M*A*S*H's Radar O'Reilly.

Tom Field: Oh!

Marianne McGee: Yeah, the kind-hearted, very dependable assistant to the commander of this Mobile Army Surgical Hospital. The setting was the Korean War, and Radar had a knack for hearing things and seeing things and knowing things before everyone else realized they were happening. So that'd be good.

Tom Field: Here come the choppers!

Marianne McGee: Right.

Anna Delaney: Mat?

Mathew Schwartz: I'm thinking I should have gone with KITT from Knight Rider, but I didn't. I know! This is great, you guys! Wow! You totally nailed this. So, you said fictional, and so I started thinking about books, and just to play the science fiction card, I love the work of Iain M. Banks.
And one of the repeat characters in his books are the ships, which are omniscient and keep people around a little bit for the fun of it, and also so they don't go insane, because if you were omniscient and thought about things all the time, you'd need a little comedic relief. So, you may or may not know Banks, but one of the great things he does with his ships is give them these amazing names. He's a Scot, and I think that comes through in the ship names, things like "Mistake Not," "Frank Exchange of Views" - that's a warship - and "Beats Working." In the books, and I'm maybe not giving the flavor of it, they're often very dry, somewhat egotistical. They think they're amazing, because they are, but there's this sort of doesn't-take-life-too-seriously approach to things. I don't know how it would sound, but certainly on the page it looks nice!

Anna Delaney: Brilliant. I think you'll all go further than me. I've chosen a dog, a very famous dog, from The Wizard of Oz: Toto. We're not in Kansas anymore, and I think that he'll be great! Loyal, resourceful, a great companion, brave, with a bit of humor there.

Tom Field: And he'll hunt down that witch for you.

Anna Delaney: Exactly!
Well, Tom, Mat, Marianne, this has been an immense pleasure. Thank you so much.

Mathew Schwartz: Thanks, Anna.

Marianne McGee: Thanks, Anna.

Anna Delaney: And thanks so much for watching! Until next time.