Anna Delaney: Hello and welcome to the ISMG Editors' Panel. I'm Anna Delaney, and this is where I sit down with my fellow editorial teammates and discuss the latest in the world of cybersecurity. And I am pleased to say our excellent Managing Editor of Security and Technology, Jeremy Kirk, is joining us for the first time this week. We also have the brilliant Dan Gunderman, Staff Writer on the News Desk, and the one and only, exceptional Tom Field, Senior Vice President of Editorial. Very good to see you all. Jeremy, especially you.

Tom Field: All-time listener, first-time caller, Jeremy Kirk.

Jeremy Kirk: Thanks for having me. Or accommodating me.

Anna Delaney: Jeremy, talk to us about your background. Where are you?

Jeremy Kirk: I'm in Sydney. So I'm actually not where this photograph was taken, but I'm pretty close. This was taken last weekend in Darling Harbour. If you're familiar with Sydney, it's a nice, touristy sort of place that a lot of people go. My wife and I had gone to see Hamilton, actually. We were walking out later in the evening, and I took this wonderful photo.

Anna Delaney: You enjoyed it?

Tom Field: I'm glad you had the opportunity to be out.

Jeremy Kirk: It was great. It was great. It was awesome.

Anna Delaney: So Dan, you're out in the snow?

Dan Gunderman: Yeah, you know, Tom and I seem to have the matching theme of a tundra in recent weeks, so this is my version. But oddly enough, I actually have a ski area in the family that's operated by my aunt and uncle in the lower Hudson Valley in New York, and that's often where I spend a good chunk of my weekends. That's where I was last weekend, and I think it was maybe 3°F and felt like negative 10. So that was a downside. But altogether, winters are fun.

Anna Delaney: So you're a good person to know when I come to the States. And Tom, a beautiful backdrop of your house, perhaps?

Tom Field: This is my Sydney. This is my little town. This is my little harbor, with the one restaurant that I can go to on the weekend that's open for breakfast and for lunch. Yes.

Anna Delaney: Very good. And I thought I'd spice things up this week. Behind me are barrels of Armenian brandy, no less.
And this was taken at a brandy factory. Of course, you're given a selection of brandies to taste, which is always a bit dangerous. But, Tom, you have been busy recording for the upcoming Zero Trust Summit. Is it fair to say that zero trust has graduated beyond the buzzword label?

Tom Field: Yes. You could say two years ago, when we were at the RSA Conference, the last one we attended live before the pandemic, that zero trust was the T-shirt everybody took home from the event. And it really has taken root. Look at all the conversations we had over the past year about organizations embracing zero trust; the US federal government embraced it. And just this year, two memos have gone out from various government agencies about zero trust and about implementation. So it has taken root, and the event we have upcoming is terrific. I've sat in on some keynote sessions, some panels and some individual sessions, and we've got people such as John Kindervag, the creator of zero trust. He's the chair of the event and handpicked many of the people participating, including Chase Cunningham, who he refers to as Dr. Zero Trust, and other people, retired General Greg Touhill among them. So some terrific people are participating in this event. And I think you and I both have had the opportunity to be involved in some of these sessions. It's enlightening.

Anna Delaney: So what are you going to share with us? You've got some footage?

Tom Field: It's interesting. Yesterday I did a panel, and it was with John Kindervag, who is the godfather, the soul, the father of zero trust. Also on it was Grant Schneider, the former government official; he's one of our editorial advisors and knows government inside and out. The other was E.P. Mathew, who is the Deputy CISO of the Defense Intelligence Agency (DIA). And so I asked each participant upfront to talk to me about what role zero trust plays in their work today. I want to share with you the response that E.P. Mathew gave.

E.P. Mathew: DIA is, you know, the responsible party for JWICS, the Joint Worldwide Intelligence Communications System, which is our top-secret network that runs sensitive compartmented information.
So from a mission perspective, this is how we connect to the IC and DOD, and this is how we share data. Right? So when you think of the killing of OBL, Osama bin Laden: the iconic picture that you see, with the President and his national security team huddled in the White House on one side and images of Pakistan on the other, that was transmitted through JWICS. So against that backdrop, we in the IC and DOD have a great interest in securing our infrastructure and our networks, and hence the migration toward a zero trust architecture.

Tom Field: How do you turn to the other panelists, such as John Kindervag and Grant Schneider, and say "compete with this," after E.P. has just brought Osama bin Laden into the conversation about defense needing zero trust?

Anna Delaney: And you just have to leave it at that.

Tom Field: They did a great job, mind you. And I think you'll enjoy the summit. I encourage everyone to register for this. It's a terrific event with great conversations. But I have never started a panel discussion with quite that level of urgency.

Anna Delaney: Tom, I don't think I've ever moderated a roundtable on the topic of zero trust where attendees all said, "Wow, this is really easy to implement." Were there any specific takeaways that panelists shared on the implementation of zero trust?

Tom Field: Well, it turns out that one of the panels is on myths and misconceptions about the zero trust architecture. And two of the myths that John Kindervag really tackles upfront are that it's hard and that it's expensive. So we spent a fair amount of time talking about that. You know, there's a lot of marketing to undo to be able to spread the true gospel here. And I hope that this summit does a pretty good job of getting the conversation going. And as you know, ISMG has some plans later in the year, when we're going to be taking the zero trust message on the road very thoroughly.

Anna Delaney: Alright, good. Well, we look forward to the summit. Thanks, Tom.

Tom Field: Great event. Thank you.

Anna Delaney: So Jeremy, you have been plowing away and crafting a superb podcast series called The Ransomware Files.
I think you've just completed episode four, which is on our site. Could you tell us more?

Jeremy Kirk: Yeah, that's right. That episode focused on Maersk, the big shipping giant that was hit by NotPetya, which was ransomware that was actually destructive malware. So I thought, "Well, should this actually fit? It's called The Ransomware Files, but this isn't actually ransomware." But what happened to Maersk, and to all the other companies affected by NotPetya, was much the same as ransomware, in the sense that they had to recover their systems. And it was devastating. The effects were essentially the same. The backstory was a lot of mystery and intrigue: Russian-engineered malware that was aimed at probably just Ukraine, but then got completely out of hand. The story focuses on two people who worked in Maersk's identity and access management, and they were the ones who almost bore the brunt. Because in a lot of companies, when attackers encrypt your Active Directory and your domain controllers and those go down, nothing else works, right? That's one of the first things that has to be restored after an attack. So they lost all their Active Directory. One of the people I interviewed, Bharat Halai, was head of identity and access management. When he saw the landscape, it was, "Oh, we don't have any copy of Active Directory," meaning they would have to start over and reestablish an account for every person who worked at Maersk, which would be 80,000 people. It's just one of those unconquerable kinds of things. I mean, they would have gotten to it eventually. But he had this idea. He told everybody in his tech offices: call all the sites where we run anything and see if they have a copy of Active Directory. And sure enough, they had one copy, in Lagos, Nigeria, that was uncorrupted, because the site had had an internet outage, and NotPetya couldn't get to something that was completely offline. And so that saved them. They had one of their people in Ghana go to Lagos and pick it up, fly it to London, and then take it out to Maidenhead. So it's a pretty incredible story. It had been recounted before I did the podcast, but it is one of those really interesting stories of recovery. And both of those people had all kinds of good tips for what to do in a ransomware attack, which is devastating, and they had some quite interesting and emotional perspectives on it as well. So yeah, it's an episode of The Ransomware Files that doesn't deal with ransomware, but with something pretty close to it, at least.
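The recovery turned on a simple property: a copy that is offline when a wiper detonates is the one that survives, and the freshest surviving copy minimizes how much of the directory is lost. A minimal sketch of that selection logic in Python; the sites, fields and dates here are hypothetical, purely to illustrate the idea:

```python
from dataclasses import dataclass
from datetime import datetime
from typing import Optional

@dataclass
class ADBackupCopy:
    site: str                  # hypothetical site label, e.g., "Lagos"
    last_replicated: datetime  # when this copy last synced with production AD
    online_at_attack: bool     # an online copy is assumed encrypted/wiped
    integrity_ok: bool         # passed consistency checks after the attack

def pick_recovery_copy(copies: list[ADBackupCopy]) -> Optional[ADBackupCopy]:
    """Return the freshest copy the malware could not have reached."""
    survivors = [c for c in copies
                 if not c.online_at_attack and c.integrity_ok]
    # The freshest surviving copy minimizes accounts/groups lost since last sync.
    return max(survivors, key=lambda c: c.last_replicated, default=None)

copies = [
    ADBackupCopy("London", datetime(2017, 6, 27, 9, 0), True, False),
    ADBackupCopy("Lagos",  datetime(2017, 6, 26, 22, 0), False, True),  # offline: outage
]
print(pick_recovery_copy(copies).site)  # -> Lagos
```

In Maersk's case the "selection" was a phone call around the world, but the criteria were exactly these: untouched by the malware, intact, and as recent as possible.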
Anna Delaney: One of the people you interviewed, I think, was Gavin Ashton, and what he said about Active Directory, and about how AD is secured, was a good takeaway. Care to elaborate?

Jeremy Kirk: Yeah, absolutely. They had moved to something called Azure password sync, password hash synchronization to Azure AD. It basically meant that even if their on-premises Active Directory was knocked out, they were still able to authenticate, because Azure AD is kind of like the cloud version of Active Directory that Microsoft runs. So people were still able to get into their Office 365. That meant that even though you'd had this absolutely devastating attack, people were still able to log into services. And that's great, because one of the big challenges a lot of organizations have is that they can't communicate. Literally everything is out. They can't use email. Their phones, and they may have IP phones, might be all screwed up too. So they're asking, "What's your phone number, so we can WhatsApp?" That's what a lot of organizations end up going to: consumer tools, just to be able to talk to each other. So that was one thing, too. And generally, his view is that the more you can move away from these centralized repositories of information, the better. You won't get rid of all your on-premises software; there's always going to be a 10 or 20% that hangs around for reasons of compatibility or legacy, or because your specific application that runs in the aluminum smelter will only run on this particular version or whatever. But as much as you can get rid of the old stuff and move to more secure, cloud-based stuff, as Tom was saying about zero trust, that sort of thing is all very good for defending against ransomware. So yeah, a lot of good tips.
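The value of that synchronization is that authentication no longer has a single point of failure. A rough sketch of the failover behavior, with hypothetical class and field names; this illustrates the concept only, not Microsoft's actual API or sign-in flow:

```python
class DirectoryUnavailable(Exception):
    """Raised when the on-premises domain controllers cannot be reached."""

class OnPremAD:
    """Stand-in for on-premises Active Directory (down after the attack)."""
    def __init__(self, available: bool):
        self.available = available

    def authenticate(self, user: str, password_hash: str) -> bool:
        if not self.available:
            raise DirectoryUnavailable("domain controllers are down")
        return True  # pretend the healthy-path check succeeded

class CloudDirectory:
    """Stand-in for a cloud directory holding previously synced password hashes."""
    def __init__(self, synced_hashes: dict[str, str]):
        self.synced_hashes = synced_hashes

    def authenticate(self, user: str, password_hash: str) -> bool:
        return self.synced_hashes.get(user) == password_hash

def login(user: str, password_hash: str,
          on_prem: OnPremAD, cloud: CloudDirectory) -> bool:
    try:
        return on_prem.authenticate(user, password_hash)
    except DirectoryUnavailable:
        # Hashes were synced before the attack, so cloud services
        # (mail, chat, docs) stay reachable while AD is rebuilt.
        return cloud.authenticate(user, password_hash)

cloud = CloudDirectory({"alice": "h1"})
print(login("alice", "h1", OnPremAD(available=False), cloud))  # True: cloud path
```

The architectural point, as Jeremy describes Ashton's view: the fewer workflows that depend on one central, on-premises system, the more of the business keeps running on the worst day.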
Anna Delaney: So Jeremy, I implore everyone to listen to the podcast series. What's the next one on?

Jeremy Kirk: There's good stuff coming up. There's really good stuff coming up. I can't really talk about it, but I'll say it's going to involve REvil, which is probably the biggest. You know, we've had a lot of things happening with REvil lately: Russia arrested a bunch of people and has charged a bunch of people related to it. REvil was central to some of the largest ransomware attacks in the US, and it's been a focus for law enforcement as well. So not only has it been a devastating and highly profitable gang, a lot of people are also in hot pursuit of it. And there's this question right now, with what's happening in Ukraine: is all this tied together? Is Russia arresting these ransomware folks so it can get a little bit of slack on what it's doing in Ukraine? A lot of people are speculating that there might be some sort of quid pro quo there. Or is this completely unconnected? We don't really know. With diplomatic stuff, maybe years later it'll come out what was actually happening, based on recollections. But it's interesting, because all this stuff does kind of tie together and probably is related in some tangential ways.

Tom Field: I'm going to join in. Anyone that hasn't sampled The Ransomware Files, please do so. It's a terrific listen. You get a lot from it, and it's entertaining as well. So thank you, Jeremy.

Jeremy Kirk: Thank you.
Anna Delaney: So speaking of Ukraine, you set this up very nicely, Jeremy. Dan, you have been reporting on the story all week. Cyber espionage.

Dan Gunderman: Yes, absolutely. And again, great segue there, Jeremy. So it's no doubt been a busy week, and start to 2022, for US diplomats and the nation's cybersecurity leaders. Russian President Vladimir Putin for weeks has hinted at a potential full-scale invasion of Ukraine after amassing some 100,000 troops along the country's eastern border. The prospect of kinetic war, however, isn't the only fear among experts right now, who say that any physical campaign will no doubt be enabled or deepened by a direct cyberattack, likely on Ukrainian infrastructure. So with tensions flaring and diplomatic rhetoric worsening, cybersecurity experts warn that Ukraine may face the wrath of Russian state-backed hackers. The Department of Homeland Security here in the US told network defenders to anticipate a similar strike should President Joe Biden intervene in the crisis. Putin, meanwhile, has moved to prevent Ukraine's entry into NATO, a move that has drawn widespread criticism around the world. And then several events in recent days that we've been tracking have also teased a longer play here. In fact, on Monday, Symantec's Threat Hunter team outlined a specific Russian cyberespionage campaign conducted against an unnamed Ukrainian network in 2021. Symantec calls the group in question Shuckworm, and the researchers say it has leveraged phishing emails to inject remote access tools for reconnaissance and possible data exfiltration. Symantec says the spies are linked to Russia's Federal Security Service, or FSB, which is the main successor to the USSR's KGB. They tracked one incident that occurred between July and August of 2021, but similar attacks dating back to 2014 reportedly number in the thousands. Then on Tuesday, a senior Biden administration official confirmed that the US had dispatched its top cybersecurity official to Europe to continue talks on the Russian cyber situation. That was Deputy National Security Advisor for Cyber and Emerging Technology Anne Neuberger, who was set to meet with EU officials along with representatives from NATO.
In a statement on Tuesday announcing Neuberger's trip, the same senior administration official said, "Across all of these engagements, our focus is on ensuring that the US and our allies and partners are prepared for any cyber-related contingency and prepared to respond in the current political environment." Tensions also spilled over to the UN on Tuesday, when the top US ambassador denounced Russia's mobilization, and the top Russian ambassador in turn criticized the US for stirring tensions and almost encouraging the headlines that we've been seeing of late. So, to boil this down: cybersecurity experts here in the US have said in recent days that network defenders truly must not let their guard down, and must understand the TTPs of what these Russian actors are doing and may attempt in the weeks ahead. And I've spoken with a couple of experts who have particularly pointed out potential cyberattacks on the US software supply chain or on contractors servicing the federal government, among others. As Russia is in this sort of staging phase, there's certainly a fear among top US officials here. So no doubt an area to watch in the days and weeks ahead. A lot to unpack.

Anna Delaney: Yeah, lots going on, and uncertainty, as Jeremy pointed out earlier. And again, I reckon there's been some good advice from the US government and the UK about resilience, back to basics and patching. Do we know anything? Or is there any guidance that's specific, or even unique, to these Russian cyber operations?

Dan Gunderman: Well, the first thing that comes to my mind is the DHS bulletin that went out to law enforcement agencies last week, which outlines what CISA has seen in terms of some of the cyberespionage campaigns and what they expect out of what could be the first direct military offensive on Ukraine or US critical infrastructure. So obviously, there's been a big push within the Biden administration in recent months, since the executive order last year, to shore up critical infrastructure. So obviously, we've seen continued dialogue around that.
And then, of course, Anne Neuberger, in addition to her trip to the EU, has been pretty outspoken about the need to shore up critical infrastructure. And I spoke to a couple of other folks who said that whether Russia decides to cross the use-of-force threshold against the US is a really complex situation, and a lot of folks don't believe that Putin will. As Jeremy mentioned in his section, we're just not sure what the antics are at this point. And like he said, sometimes these diplomatic endeavors take years to untangle. So strap in, and we'll see what happens. It's certainly alarming.

Anna Delaney: Yeah.

Jeremy Kirk: I thought it was pretty interesting, too, that there was wiper malware that was deployed; Microsoft wrote about this a couple of weeks ago. And literally, that's what NotPetya was. And you know, US government prosecutors indicted six Russian GRU officers for that, for NotPetya. And literally right when this Ukraine stuff is kicking off, we see wiper malware deployed again, which I thought was quite coincidental and also suspicious, really.

Anna Delaney: Oh, I was going to ask you.

Tom Field: And to come full circle, should I point out that zero trust would mitigate much of this?

Anna Delaney: Exactly. Actually, Tom, are the discussions around Ukraine and Russia weaving their way into your discussions around zero trust in the panels?

Tom Field: No, it hasn't come up in those discussions. Other discussions, I mean, you can't help but talk about it. But no, not in the zero trust discussions.

Anna Delaney: Jeremy, I'm interested to know what the parallels are with 2017; you mentioned a few there. Have we seen techniques change in any way, or even advance?

Jeremy Kirk: Ah, you know, I don't know. I think part of the puzzle in all of this is trying to figure out who is doing what and why. And, you know, we heard that possibly Belarus was involved in some stuff, too. And I think that's part of the thing:
the difficult part of discovery with, you know, aggressive deployment of malware is: who is doing it? Why are they doing it? Is that actually who is doing it, or are these just false flags? There's always a lot flapping in the breeze. You get samples of malware you can look at and go, "Well, it kind of looks like this one." But of course, anybody can copy malware and do what they want with it. So trying to attribute that back to some sort of political motivation or political aim is really quite difficult. I think oftentimes you never really get an answer. It's just a lot of guesswork. You'd probably have to talk to the actual people who deployed it, eventually, to see: why did you do this? What was your boss or tasker telling you to do? If even they knew that, too.

Tom Field: You're saying stay tuned for a future episode?

Jeremy Kirk: Yes.

Anna Delaney: Yeah, absolutely. Stay tuned. Well, thank you very much. So, finally, they say you learn something new every day. What is something you have learned this week in the world of cybersecurity?

Tom Field: Well, I'll tell you, I learned the other day that in your town, one can be escorted out of Parliament for calling the Prime Minister a liar. I didn't know that.

Anna Delaney: Yeah. Which is amazing in many ways.

Tom Field: But I will say, relevant to cybersecurity: I hosted a roundtable discussion yesterday, a continuation of a discussion that you've hosted, with Axonius, on asset management and asset tracking. Mine was a healthcare event. And I was surprised, given the maturity of the attendees in the roundtable and the organizations they come from, by the challenges they still have just being able to track the assets connecting to their networks. Now, I understand digital transformation has added a new degree of difficulty. You've got people working from places they didn't before, on devices they didn't use before. But beyond that, there's the third-, fourth-, fifth-party risk: who's connecting to your partners' networks? Asset management is the Wild West, and I was a little surprised at just how wild that West is today.
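At its core, the problem Tom describes is a set-difference check: compare what the inventory says you own against what the network actually sees, then chase the mismatches in both directions. A toy sketch; the device names and inventory are made up for illustration and don't reflect any vendor's data model:

```python
# Hypothetical data: what the inventory says you own vs. what the network sees.
authorized_assets = {
    "ws-0141": "nurse workstation",
    "mri-02":  "imaging system",
    "srv-ad1": "domain controller",
}

observed_on_network = {"ws-0141", "mri-02", "srv-ad1",
                       "byod-7c2f",      # employee's personal device
                       "vendor-gw-9"}    # third-party partner gateway

unknown = observed_on_network - authorized_assets.keys()
missing = authorized_assets.keys() - observed_on_network

for device in sorted(unknown):
    print(f"untracked device on network: {device}")   # investigate or block
for device in sorted(missing):
    print(f"inventoried but silent: {device}")        # stale record or outage?
```

The hard part in practice is not the comparison but keeping both sides of it current, which is exactly the gap the roundtable attendees were describing.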
Anna Delaney: Alright, good. Jeremy?

Jeremy Kirk: Sure. Well, I learned, I think yesterday, that there's kind of an ISAC (Information Sharing and Analysis Center) for schools. It's called K12 SIX. I was unaware this organization existed. And it's basically trying to help schools deal with things like ransomware, which is just an absolutely huge problem. Schools are particularly unsuited to defending themselves against ransomware, or any sort of attack, for a variety of reasons. So, yeah, I'm going to do a video interview, I think, with this group and learn a little bit more about how they're trying to increase the resiliency of schools.

Anna Delaney: Nice.

Tom Field: Huge issue in the US. Huge issue.

Anna Delaney: Yeah. Dan?

Dan Gunderman: Yeah, well, Anna, I would just say, it's not news to anyone, but something that's been reaffirmed to me of late, covering the government beat, is the really overwhelming impact of the SolarWinds breach. I just sat in on a congressional hearing this morning around the FISMA modernization efforts, with Congress pushing through a bill that would entail sweeping cybersecurity updates across the board for federal agencies. And in the dialogue the lawmakers had today, and in what I've seen in recent weeks, SolarWinds really has been the wake-up call driving so much of this change and these efforts at the White House and in Congress. Log4j did come up in that specific hearing as well. So obviously, you're starting to see the longer-term effects of that. But I would just say: I keep being reminded of the impact of SolarWinds across the federal agencies in 2020, and we're still seeing such an impact from that.

Tom Field: And you'll appreciate this: in a conversation earlier today with Octavia Howell, the CISO of Equifax Canada, she was referring to Log4j as "the COVID of zero-days."

Anna Delaney: Quite a title. So I learned something new on Jeremy's podcast: "sausage factory." Correct me if I'm wrong, but essentially it's a diagram the security team at Maersk used to show the deployment of their tiered access model. If I'm correct?

Jeremy Kirk: Yeah, I don't think you'll see that in any sort of formal documentation anywhere, but that was what they used. That's basically for moving: how do you take infected systems, redeploy them, but also improve their access controls too? And they had to do this for so many systems. A lot of that just involved moving stuff from Windows 7 to Windows 10, which seems to be a big recurring theme around ransomware: well, we've been wiped out; we're not going to go back to the old one, we'll go up to the new one. So it has prompted a lot of migration like that, too. But yeah, it was quite a funny bit, and Gavin, on his blog, had a diagram of it as well. So yeah, good stuff.
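A "sausage factory" in this sense is just a rebuild pipeline: compromised machines go in one end and come out the other reimaged and slotted into a tier of the access model. A hypothetical sketch of such a pipeline; the tier names follow the commonly described admin-tiering pattern (identity systems, servers, workstations), and none of this is Maersk's actual tooling:

```python
from enum import Enum

class Tier(Enum):
    T0 = "domain controllers / identity"   # most privileged, most protected
    T1 = "servers and applications"
    T2 = "user workstations"

def rebuild(host: str, tier: Tier) -> dict:
    """One pass through the 'sausage factory': wipe, reimage, re-tier."""
    steps = [
        "wipe disk (assume compromised)",
        "reimage with current OS baseline",   # e.g., Windows 7 -> Windows 10
        f"join domain into {tier.name} scope",
        "apply tier access policy",           # admins of one tier cannot
        "restrict cross-tier logons",         # log on to another tier
    ]
    return {"host": host, "tier": tier.name, "steps": steps}

for host, tier in [("dc-01", Tier.T0), ("app-17", Tier.T1), ("ws-0141", Tier.T2)]:
    print(rebuild(host, tier))
```

The design point is that recovery and hardening happen in the same pass: every machine that comes out of the pipeline is not just clean but harder to take down the same way twice.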
Anna Delaney: Yes. So listen to the podcast to find out more and learn more about these wonderful phrases. But that is all we have time for. Tom, Dan, Jeremy, thank you very much. Let's do this again soon.

Tom Field: Jeremy, it really is a pleasure to see you. Thanks so much for joining us.

Jeremy Kirk: Anytime.

Anna Delaney: Thank you so much for watching. Until next time.