Anna Delaney: Thanks for joining us for the ISMG Editors' Panel. I'm Anna Delaney, and this is where ISMG editors meet on a weekly basis to reflect upon and analyse the top security news events and trends. I'm joined today by my brilliant colleagues: Tom Field, senior vice president of editorial; Marianne Kolbasuk McGee, executive editor for HealthcareInfoSecurity; and Michael Novinson, managing editor for ISMG Business. Excellent to see you all.

Tom Field: Excellent to be back. Thanks for having me.

Marianne Kolbasuk McGee: Yeah, thanks.

Anna Delaney: So Tom, I was saying wonderful colors on our screens today. Why don't you start us off, because we've got Barbie in the house.

Tom Field: It's a Barbie world. You heard that right. Marianne and I were in New York this week; we had a healthcare security event in Times Square, and the view from our venue into Times Square was all about the promotion of the Barbie movie. So yes, it was the colors that attracted me, and I thought, that's a virtual background waiting to happen right there.

Anna Delaney: Well done. Marianne, you're in New York, I suppose.

Marianne Kolbasuk McGee: Yeah. Well, I didn't take this photo this week. Actually, my daughter took this a while ago and sent it to me. When I was planning for this morning's session, I was like, "Oh, God, I didn't take any photos."

Tom Field: Super busy.

Marianne Kolbasuk McGee: I went with my sister and her husband, and I didn't have any skyline shots. So this was on my camera from a while back.

Anna Delaney: Marianne, the ISMG Editors' Panel should always be on your mind. So remember, for next time. Michael, you're always bringing out the inner child in us. Where are you today?

Michael Novinson: Yes, well, I have an outer child to take care of. So I am coming to you from Springfield, Massachusetts, at the Dr. Seuss Museum. Had a long weekend last weekend; did some camping up in the Adirondacks, saw some extended family on the way home, passed through Western Massachusetts, and had never taken my daughter to the Dr. Seuss Museum. So I figured, why not stop for an hour or two and spend some time there? Very fun museum, lots of things to play with.
She enjoyed the Dr. Seuss books, she enjoyed reading; they had beanbag chairs. So it's a good way to pass the time, even if your kid is not quite old enough to read.

Anna Delaney: Well, this week I present to you a beautiful, scenic view from France, where I escaped to last week. It was just lovely to catch my breath for a couple of days in these stunning surroundings.

Tom Field: Smells and sounds and looks so exotic.

Anna Delaney: It's alright. There was lots of rosé and great conversations and sunshine and water. So it was all right; back to reality. Well, Marianne and Tom, you are, of course, both fresh from New York, having hosted and curated ISMG's U.S. Healthcare Summit, an annual event. Marianne, maybe start with you. How did it go? What were the hot topics?

Marianne Kolbasuk McGee: It went well, and I think our attendees were pretty well-engaged. One of the highlights this year, as in years past, was our medical device panel discussion. Every year, Dr. Suzanne Schwartz very graciously agrees to present to our audience the latest regulatory efforts of the FDA. This year, she went over a new refuse-to-accept policy that is already in effect. Basically, what happened was that last December, as part of an omnibus funding bill, the FDA was granted expanded authority over medical device cybersecurity. So part of their new duties at the FDA is to more closely vet new products that get submitted to them, any sort of internet-connected device or software, anything like that. And this new refuse-to-accept policy that went into effect in March basically allows the FDA to reject, to just send back from the get-go, any new device submissions that are lacking certain cybersecurity details, such as the manufacturer's plan for coordinated disclosure of vulnerabilities, a software bill of materials, and so on and so forth. Now, this policy went into effect in March, but right now the FDA is doing some hand-holding. So if a device maker submits their device application and it is lacking some of the cyber details the FDA is requiring, the FDA will let them know, "Hey, you've got to do this, you've got to do that," and then you can resubmit it. But beginning on October 1, the hand-holding will stop. So device makers have got to get their act together and know exactly what it is that the FDA will be expecting. And also, before that happens on October 1, the FDA will issue final guidance on premarket cybersecurity for medical devices. They had issued some draft guidance back in April 2022, so now this is going to be finalized. And part of the FDA's responsibilities, with this new authority that Congress gave them, is that they have to regularly update these guidance materials.
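A software bill of materials, one of the items the FDA now expects in submissions, is essentially a machine-readable inventory of the software components inside a device. As a rough, illustrative sketch only, with invented component names and versions for a hypothetical connected infusion pump, a minimal CycloneDX-style SBOM could be assembled like this in Python:

import json

# Illustrative sketch of a CycloneDX-style software bill of materials for a
# hypothetical connected infusion pump. Component names and versions are made
# up; a real premarket submission would enumerate the device's actual
# software dependencies.
sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.5",
    "version": 1,
    "components": [
        {"type": "operating-system", "name": "embedded-linux", "version": "5.10.120"},
        {"type": "library", "name": "openssl", "version": "3.0.9"},
        {"type": "application", "name": "pump-controller-firmware", "version": "2.4.1"},
    ],
}

# Print the SBOM as JSON, the form in which it would accompany a submission.
print(json.dumps(sbom, indent=2))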
And then we had other government speakers at our event as well. We had someone from HHS OCR giving us an update on some of the HIPAA rulemaking that's going on and the trends they're seeing with the breaches that are reported: hacking incidents, ransomware, so on and so forth. And then our keynoter was Nitin Natarajan, the deputy director of CISA. He actually has a background in healthcare, so he was the perfect fit for this keynote, because he's not just a security guy; he knows healthcare very well, he was a paramedic at one point. He gave us the lay of the landscape with the cyberthreats that are facing the healthcare sector. So it was well-rounded in terms of the regulatory issues that everyone worries about in healthcare, because it's such a regulated industry, but then we also had CISOs from private healthcare entities. And maybe Tom wants to talk about that.

Anna Delaney: That was a great overview. Tom, anything to add?

Tom Field: No, she covered it. I do, actually. I want to emphasize the word Summit, because I think that really distinguishes what we do from other conferences. It's a smaller group, we're talking scores of people, not thousands of people, and there's the opportunity for people to have meaningful conversations. I heard this from sponsors as well as attendees and speakers: that they had the chance to really have meaningful conversations with one another, not get lost in a bigger event such as HIMSS or RSA. So it underscores what we're doing.
The topics are terrific, the speakers are of excellent pedigree, as Marianne was pointing out, but they really get a chance beyond the stage to have conversations with one another and come away with some new ideas. Highlights for me: I really enjoyed the business email compromise interactive exercise that we did. We did it after lunch this time; we got groups together and gave them a business email compromise scenario, and they had to answer some tough questions about how to respond, what one would do differently, when or if to involve law enforcement, and responsibilities going forward. And we had this with the guidance of the local office of the FBI, led by supervisory special agent Michael DeNicola. He and I sat down and talked a little bit about BEC, some of the latest trends and some of the scary numbers of reported incidents. And then he and his agents went from table to table participating in the conversation. Afterwards, we brought some participants up on stage to share some of what they talked about. Understanding that phishing, business email compromise and socially engineered schemes in general are big for the healthcare community in particular, this was a vital exercise, and I think everybody got a lot out of it. So kudos to Raquel Sanchez and the CyberEdBoard team for bringing this together. I think it's a terrific addition to the summit. Another session I want to call out, and Marianne started hinting at it, was that we had a couple of good, talkative CISOs. We had John Frushour, the CISO of New York-Presbyterian, and we had Anahi Santiago, the CISO of ChristianaCare. They sat down and had an open conversation with one another about one of Mr. Novinson's favorite topics, generative AI, and it was nice to hear them. It wasn't exactly point-counterpoint, but they covered the broad perspective of, "Okay, generative AI is in our enterprise. Now, what do we do about it? Do we block it? Do we put particular guardrails around it? What can we do to understand and address the risks as well as the potential of the tools that are available to us now?" Excellent conversation.
I think we did come away with some good ideas for how to put appropriate guardrails around this, and not just put blinders on and say we're going to block this and attempt to put unreasonable controls around tools that are so broadly available. For me, one of the highlights of that entire session came afterwards, when the session ended and people left the stage. John Frushour, in particular, gathered quite a group around him. And our colleague, Raquel Sanchez, came up to me and pointed out, this is what happens when you mention on stage that you've got a $13 billion security budget.

Anna Delaney: Rarity. Fantastic, but I'd love to know more about how they're thinking of generative AI in terms of use cases to help them in defense. Did they talk about that? Are they thinking about that? Strategizing for that?

Tom Field: A bit, but a lot of this right now is just understanding what's available, understanding how it's being used in the organization, and realizing that you have got some of the most critical of critical data within your organization that's got to be protected appropriately, no matter how it's being used in the generative AI tools. I think that John from New York-Presbyterian has got some particularly progressive views on this. Marianne, perhaps we should have him on for a private conversation about what the organization is doing, and looking to do, going forward. But I came away encouraged. You know, I've talked to a lot of people in financial services, and initially there was a big, "We've got to block this, we've got to put regulations around this, we've got to hold this down." I didn't see that so much with our friends from healthcare. I think that represents, to some degree, the maturity of the executives in these positions, but I also think it represents the difference in the conversation about generative AI in July 2023 versus January of 2023. Now, what really excites me is that Michael and I are going to Black Hat in August, and I expect the conversations to be rich with details of how organizations are starting to harness these tools.

Anna Delaney: Yeah, I can't wait. What about emerging trends or challenges? Were there any that came up, as of right now, in July 2023?
Marianne, as well?

Tom Field: Yeah, Marianne, I'm going to defer to you, because beyond the use of generative AI and what organizations are trying to do to protect themselves from ransomware and socially engineered schemes, I didn't come away with much that would tell me there's something there that we hadn't talked about prior.

Marianne Kolbasuk McGee: Yeah, and I would agree. I think everyone is always worried about what's coming that we don't know about, or that we're not ready for. And again, the deputy director of CISA emphasized how these ransomware attackers, and DDoS attackers for that matter, this year in particular are going after all sorts of targets. It's not just the big guys; it's the little doctor practices and clinics, and also pharmaceutical companies. But especially the smaller organizations are just not prepared to deal with the things that we already know about, let alone the things that we don't know about. And one of the things that he was emphasizing is the importance of information sharing. If you're not going to be sharing your own intelligence information, then you'd better be paying attention to what's out there that we can inform you about; just stay on your toes.

Tom Field: I will offer this. Theresa Lanowitz of AT&T shared a recent report that they had done on cybersecurity threat trends, industries in general but some healthcare in particular. Going around our little screen here, what do you think the number one cybersecurity threat to healthcare might be these days? Michael?

Michael Novinson: Put me on the spot here. I'm gonna say ransomware.

Tom Field: Okay. Anna?

Anna Delaney: Medical devices, IoT, poor security on those devices.

Tom Field: Marianne, I think you were out of the room. You might not have caught this. I'd love to hear from you.

Marianne Kolbasuk McGee: Phishing?

Tom Field: DDoS.

Marianne Kolbasuk McGee: Okay, yeah. All right. Like I said earlier, there was this big surge in DDoS attacks, I think it was in January.
And it wasn't even because the attackers wanted a ransom. They just wanted to disrupt.

Tom Field: So that was one of my takeaways.

Anna Delaney: Did they say that that is due to the Russia-Ukraine war? Because we've seen lots of DDoS attacks erupt from the war.

Tom Field: Well, there's a very low barrier to being able to launch a DDoS attack, and often the ransomware attempts that Michael was talking about come with a DDoS component. So I think it's just become a weapon that is very easy to wield these days, and it can have the disruptive results that the adversaries are looking for.

Anna Delaney: And you've both hosted this summit for a few years now. So what encouraged you this time, compared to other years?

Marianne Kolbasuk McGee: Well, I don't know about you, Tom, but again, I go to some of these larger shows like HIMSS, where there are thousands of people, but it's not all security, it's all sorts of IT. I think what impressed me with our Summit was that, because it's a smaller sort of setting, people tend to be more honest, also on stage, about their concerns and what they're doing, because there's just a small audience there and people are engaged.

Tom Field: To give you an example: the report I was telling you about, that Theresa Lanowitz of AT&T presented. The attendees instantly challenged her on the makeup of the healthcare respondent group, the numbers in the breakdown, what healthcare providers versus insurers had to say. They had good and tough questions. I think this is something you get here. People don't bring canned presentations to our stages. And the attendees don't come in and feel like they're separate from this. They're very engaged and eager to take the microphone, or just stand up and ask good and tough questions. That's why we call these summits; we want that level of dialogue. That's what impressed me.

Marianne Kolbasuk McGee: Yeah, they kind of remind me of when you have a really favorite college professor or high school teacher and the kids are really engaged with the conversation; that's sort of the feel some of the sessions had, I thought.

Anna Delaney: Yeah, wonderful to hear.
I know that when we host healthcare roundtables, they do tend to attract a cerebral bunch, and they're great thinkers. So glad it was valuable. Sounds great. Well, Michael, moving on to your story. Following the news that Microsoft and U.S. officials confirmed a threat actor based in China had hacked the Outlook email accounts of some 25 organizations, including U.S. government agencies and European governments, Microsoft has now decided to give all customers access to expanded logging capabilities. So tell us more about what's behind this decision.

Michael Novinson: Absolutely. This has been a conversation for years, and really, at its core, it is an ethical, moral question of what security features and tools are a public good. Obviously, these are provided by private sector organizations whose goal is profit maximization, and a typical way of selling security capabilities is on a tiered basis, so that organizations can decide how much security they want and how much they are willing to pay. The federal government has been more muscular in recent years about saying that there are certain items that need to be part of the baseline, bundled in at no cost. One example in recent years has been multifactor authentication; Jen Easterly at CISA has been very clear that you can't have a tiered solution where it's, "Okay, well, you get a username and password, and that's at the base level. And oh, you want a second factor? Well, then you have to pay more for that second factor," when we all know how important MFA is to security today. In the case of Microsoft, they have some unique challenges, because they also are a technology provider. And with this secure by design, secure by default push, there's been a lot of pressure from the federal government for manufacturers to build security into their products, which is a unique situation Microsoft faces versus some of their pure-play security competitors. So they can't be like, "Oh, we'll give you email for free. But oh, you want to secure your email? Then you have to pay for that." So what we're dealing with here is a years-old issue around logging data.
And this actually came up a lot after the SolarWinds attack, where the Russian Foreign Intelligence Service took advantage of Microsoft's technology. They didn't compromise anyone through Microsoft, but they used Active Directory and Azure Active Directory to propagate themselves, to move laterally and to expand their presence. And it was very difficult for victim organizations to see what had happened unless they had a premium-level license. Brad Smith, who's the president of Microsoft, was in front of the Senate and the House in February of 2021, and he got grilled by a whole lot of knowledgeable folks in the cybersecurity space. Jim Langevin of Rhode Island grilled him, as did Bennie Thompson. And people were really upset that, given that Microsoft's technologies had been taken advantage of, the victims couldn't actually see what had happened to them unless they had paid Microsoft a bunch of extra money beforehand. So let's fast forward to the present day. As you said, in the last week we did have this campaign attributed to China, where Outlook email was compromised at some 25 organizations, including the State Department and the Commerce Department, including Commerce Secretary Gina Raimondo, as well as some governments in Europe and some private sector organizations. And given how technically sophisticated the attack was, victim organizations couldn't actually see what was happening unless they had this premium-level license, which is an E5 license in the private sector and a G5 license for public sector organizations. So you had CISA putting out guidance saying, look at these premium logs, but you could only do that if you had a premium license. You had Microsoft alerting U.S. cybersecurity providers, "Hey, it looks like one of your customers has been compromised," but the managed security service provider couldn't actually see any of that because they didn't have that license. And this has really been the straw that broke the camel's back. You have Ron Wyden, the senator out of Oregon, who has just been hammering Microsoft on this. He has been for a while, but in particular in the wake of this.
So yesterday, Microsoft finally capitulated. They said that they were now going to include more logging capabilities with their G3 or E3 standard-level license. To give you the details: that includes detailed logs around email access, which would have been very helpful here, as well as 30 other types of log data, and the default retention period on logs at the standard level is going to go from 90 to 180 days. So, a useful first step. I think there's definitely still more that CISA wants to see from Microsoft, and there's definitely more that some of the folks in the private sector want to see. And it's not that they're throwing the tiering out; they're not getting rid of the tiering around logs. You're still getting all kinds of goodies if you pay for that E5 or G5 license that you don't get at the E3 or G3 level, things like longer default retention periods, automatic support for importing log data, and intelligent insights, which help determine the scope of potential compromise. Those features are still only available at the premium level. But I think the feeling is that, from a visibility standpoint, they needed to step up their game for standard-level Microsoft license holders.
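For context, the email-access telemetry at issue here surfaces as Exchange mailbox audit records such as MailItemsAccessed, the kind of log type Microsoft is now extending to standard-license customers. The following is a rough sketch only, assuming a hypothetical Azure AD app registration with the ActivityFeed.Read permission, a pre-acquired bearer token, and an Audit.Exchange subscription already started for the tenant; the tenant ID, token and time window below are placeholders, not real values:

import requests

TENANT_ID = "00000000-0000-0000-0000-000000000000"            # placeholder tenant ID
TOKEN = "<bearer token from a client-credentials flow>"       # placeholder token
BASE = f"https://manage.office.com/api/v1.0/{TENANT_ID}/activity/feed"
HEADERS = {"Authorization": f"Bearer {TOKEN}"}

# List the available audit content blobs for the Exchange workload in a time window.
params = {
    "contentType": "Audit.Exchange",
    "startTime": "2023-07-19T00:00:00Z",
    "endTime": "2023-07-20T00:00:00Z",
}
listing = requests.get(f"{BASE}/subscriptions/content", headers=HEADERS, params=params)
listing.raise_for_status()

# Each blob URI returns a JSON array of audit records; keep the email-access events.
for blob in listing.json():
    for record in requests.get(blob["contentUri"], headers=HEADERS).json():
        if record.get("Operation") == "MailItemsAccessed":
            print(record.get("CreationTime"), record.get("UserId"))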
Anna Delaney: Yeah, that was really thorough, Michael, thank you. And I guess maybe it's the same for you, Tom: whenever we host a roundtable on cloud and have discussions, transparency always comes up. And I always feel that security leaders are still not sure whether they trust cloud providers. So I think this is a good reminder that even the cloud is vulnerable.

Tom Field: It's one of the drivers for multi-cloud environments, because CISOs don't want to have all their eggs in a single provider's basket. It's a quandary, because on one hand they don't want to be tied to one particular vendor, but then they decry the lack of visibility across different vendor environments. So that's the state of cloud security today.

Anna Delaney: And as you say, Michael, the premium service is not going away. But I think this is a good move.

Michael Novinson: Yeah. And I think there's certainly a feeling that if Microsoft wouldn't do this voluntarily, they were going to be forced into doing it, or at the very least, that there would be more congressional hearings where the executives would be subpoenaed and grilled by members of Congress. So I think there's a sense of: either you take this action on your own, or we might try to make you do it.

Anna Delaney: Yeah, I really like the analogy of seatbelts and airbags. I mean, every car should have those regardless. Okay, well, finally, obviously sad news this week, as we learned of the death of a hacker legend, Kevin Mitnick. And I think it has come as a bit of a shock to us all. So in his honor, I'd love to hear your favorite Mitnick story, memory or moment. What comes to mind?

Tom Field: You can't overstate the impact that he's had on our sector. And I understand that he committed crimes and spent time paying for those crimes. But I've come to know him more in recent years as the chief hacking officer of KnowBe4, and I know that his influence on cybersecurity education over the past decade or so has been immense. But I think I'm going to credit him with much of what we understand about socially engineered schemes. That was where he made his fame, in social engineering. And I think that so much of what happens today still goes back to basic, socially engineered schemes. I'm reminded of a quote that he once offered us, which was that it is far easier to manipulate humans than it is technology. That was true, and he said it is true today and it's going to be true forever. And it's something that we should pay attention to as we think about Kevin Mitnick and his impact on our field. For as much as we know about socially engineered schemes, as much as we've learned, as much as he's taught us, there's still much more to pay attention to, because we're falling victim to these things every day. And that's something that he brought to our attention.

Anna Delaney: Yeah, absolutely. I think his books are worth a reread. Michael?

Michael Novinson: A trip down memory lane for me. I'd actually covered a keynote that he delivered back in October of 2017,
at a conference hosted by Continuum, the remote monitoring and management vendor that is now part of ConnectWise. He was talking there about the need for organizations to move away from information security manuals that read like the Las Vegas penal code. He was saying companies should have brochures with lots of images and less text, that delve into specific topics like choosing a good password. And one of the quotes he had in the keynote was: if it's boring and uninteresting, nobody's going to read it. And certainly, if there's one thing that's true about Mitnick's life, it was never boring.

Marianne Kolbasuk McGee: I'd just say, overall, that he was able to turn his life around after starting out on the criminal side of hacking, and not only made a positive impact on the sector but built a brand-new career helping other companies. So I think that's commendable.

Anna Delaney: Yeah, absolutely. Going from the world's most wanted hacker to a really important voice in the defender community. I really liked the story of when he was 16 and he hacked into a McDonald's drive-thru, where he pretended to be a McDonald's employee taking customers' orders. There's a real playful attitude there, of hacking and curiosity and breaking systems. But I think what he's done has been very positive. Yes, he was the FBI's most wanted, but he taught us lessons, indeed, and he said that anything out there is vulnerable to attack, given enough time and resources. So, wise words.

Tom Field: And all the best to his widow and to his unborn child.

Anna Delaney: A brilliant mind and man. So thank you, everyone. This has been a lovely conversation. Thanks for your insight. Thank you so much. Until next time.