Anna Delaney: Hi there, and welcome to Proof of Concept, the ISMG talk show where we'll discuss today's and tomorrow's cybersecurity challenges with experts in the field, and try to figure out how we can potentially solve them. We are your hosts. I'm Anna Delaney, director of productions at Information Security Media Group.

Tom Field: I'm Tom Field, senior vice president of Editorial, also with Information Security Media Group. Thanks for being with us today. Anna, it's such a pleasure.

Anna Delaney: Always. Tom, you've had a relatively busy summer so far, moderating all these in-person roundtables and events or summits. Tell us, what has been the highlight so far? I know you were at the Government Summit last week.

Tom Field: The highlight so far is that this is the first Monday in a month I've been home. That's a highlight. You're right. It's been roundtables in Chicago and in Charlotte, an event last Monday in Washington DC, and a roundtable again in New York City, so I'm full and flush with information from all of our speakers and sponsors and guests. What would you like to know?

Anna Delaney: I want to know about the Government Summit. So what was the highlight for you? Main takeaway?

Tom Field: Meeting Grant Schneider?

Anna Delaney: Yeah.

Tom Field: Beyond that, it was a terrific event. We had excellent attendance from the public and private sector in the Greater Washington DC area. On our stages, we had every alphabet agency you can imagine, from CISA to the FBI to the Secret Service to the NSA, and terrific conversations about topics that I'm sure we'll discuss today, including zero trust and the federal government's progress in meeting the requirements of President Biden's executive order. We talked about nation-state adversaries and threats post Russia's invasion of Ukraine, and we talked a lot. I would say an ongoing theme was public-private partnership, and how this time, we're actually ready for it. And organizations on both sides are clamoring for it, not just in terms of what you can give me but what I can give you. So I would say it was very encouraging.

Anna Delaney: And was there a particular hot topic in terms of what attendees were asking?
Tom Field: One attendee came up to me midday, as the theme of public-private partnership was emerging. He was a well-placed CISO at an organization, and he came up with "I have a question for you. I don't want to ask it. But I want you to." And the question was, "If Donald Trump gets elected president again in another couple of years, what happens to any momentum that's going forward on public-private partnerships?" And so, of course, I did the responsible thing. I took the question, I phrased it appropriately, and I handed it off to Grant Schneider to ask his next panel.

Anna Delaney: It's interesting how he didn't want to ask the question. So that was a bold, professional move, Tom.

Tom Field: And then, because he's worked in government for so long, Grant Schneider turned that around. He said, "I have a question from Tom Field, and I want to pose it to the panel."

Anna Delaney: And did anything encourage you, Tom, from these events?

Tom Field: Yes, I would say what encouraged me was the conversations that we were having. We were talking about progress on zero trust, and we were talking about active initiatives for public-private partnership. And look, that's not just a buzzword. That's not just something we talk about because it's a popular thing to say. We're talking about organizations being able to share and receive real-time threat information, so they can respond proactively and not just validate: yes, we get attacked that way; yes, these are the indications or indicators of compromise that we see. This is something that's for the health of all organizations, public and private. It's very encouraging to see stakeholders in all aspects of that partnership stepping up and talking about what they're doing and what can be done better. So, I take that away from it.

Anna Delaney: Positive, indeed, and here's to a calm August. So, welcoming the man of the moment, Mr. Grant Schneider, our good friend, senior director for cybersecurity services at Venable LLP and former federal chief information security officer. Great to see you as always, Grant, and well done on the event.

Grant Schneider: Great to be here. It was a fantastic event.
And I really thought Tom wanted to own that question, so I thought I was just giving him the appropriate credit he was looking for.

Anna Delaney: Very good, indeed. So Grant, I have a question for you. The first-ever Cyber Safety Review Board report has landed, and generally, it's been considered an excellent deep dive on the Log4j event. What are your initial thoughts? Did it meet expectations? Did you learn anything new?

Grant Schneider: I mean, it was the first, so expectations were probably all over the board for people. I think it was really good, and it's a great idea to have the board. I'm definitely supportive of our ability as a nation to really take a deep dive and look at significant incidents that affect, you know, particularly this one, which affected the federal government, affected critical infrastructure, and affected all sorts of organizations. And so, I think the board did a great job of taking a deep dive, looking at the various impacted individuals, as well as how this came about, or how it was disclosed, and what the challenges were. And I think this was a good case study because of the challenges with Log4j, on just how it's implemented at different agencies, the difficulty, or, you know, perhaps even just not knowing whether you have it in your ecosystem, or where it might be in your ecosystem, let alone how to track it down and do the mitigation. So, I thought it was really good for that. As for the parts around enhanced cyber hygiene and, you know, some of the future-looking aspects, I think it'd be interesting to see, as we have future reports and future incidents, that a lot of those are just core fundamentals that I could almost see being largely replicated in almost any future incident, because there are so many basic hygiene things that organizations just need to do, and those haven't really changed over time.

Anna Delaney: Sure. And some have actually criticized the general, you know, broadness of its recommendations, as you say: good cyber hygiene, build a safer, better software ecosystem, and investments in the future. Is that criticism fair?

Grant Schneider: I mean, it's certainly understandable.
And I think the question is, you know, if you're the board, I don't know how you can put out a report without making those recommendations, because they are things that, you know, they're messages that need to be said over and over again. And hopefully, each time they're said, a few more organizations adopt them. That said, I also understand the criticism, which is: we could have written that report at any time in the last decade, and in fact, we probably have written a lot of those same things at any time in the last decade. So I understand the tension there. Personally, I think, you know, those parts of the report could be, maybe, a smaller chunk of the report, a little more refined. But again, I don't know how you don't cover it. I really think it's a necessary message to be told after really any incident.

Anna Delaney: Do you think the general nature of these recommendations just shows how far we have to go to make critical software safer?

Grant Schneider: Well, yes, we have a lot to do. And we continue to expand our threat surface by connecting more and more devices and interconnecting more and more devices, and we get a lot of productivity and functionality and enjoyment, you know, in the entertainment systems, from that interconnection, but we pay a price in risk. And, you know, the people that are developing those capabilities and systems just aren't always thinking about cybersecurity. That's often not their first objective. Their first objective is meeting some mission requirements. And so, you know, it gets more and more into the fundamentals of the education and training of individuals that are doing software development. Really, everyone's number one job almost needs to be cybersecurity as things are being developed, in order for us to get ahead of this. We can't bolt cybersecurity cleanly on at the end. We really need to be doing it at the beginning. But a lot of that is fundamentals and basics that aren't always exciting, right? It's not always exciting. I have a friend who said years ago, and I've said this before, that, you know, cybersecurity is like working in a brewery. It sounds really cool.
I'm in a brewery and they're making, you know, fabulous beer and doing interesting things. And if you've ever brewed beer, or worked in a brewery, it's about cleaning stuff. It's about sanitization and making sure that nothing bad gets in, and that you're cleaning everything, and it's really about the basics. And cybersecurity is that way. It's really about the basics. And the basics aren't always exciting. They're always necessary, though.

Anna Delaney: So, there was an interesting takeaway that the board applauded Alibaba for following recognized practices for coordinated vulnerability disclosure for Log4j, but is concerned about the Chinese government's vulnerability disclosure rules, which compel researchers to tell the government about vulnerabilities within two days of discovery. So, the worry is that the PRC government could gain early access to serious exploitable vulnerabilities before they're patched. What are your thoughts? It's an interesting point. Is this a concern of yours as well?

Grant Schneider: Yeah, it's definitely a concern of mine. It's a concern of mine that China has that law. It's a concern about how it could be implemented, how it could be leveraged maliciously. It's also, I think, a fair warning, because we've seen somewhat similar drafts or proposals in the US as well to have early, you know, vulnerability reporting to the government. And, you know, I would caution the US government and our lawmakers: we can't put something on the books in the US that we have concerns about in China. And it's easy for us to say, "but we're going to use it for good," right, "we just want to be able to mitigate, and we want to mitigate critical infrastructure and federal in advance." And I understand that, and I want us to be able to do that. At the same time, if it becomes a precedent, if we have, you know, undisclosed, unmitigated vulnerabilities being reported to governments, then, you know, less friendly governments are going to follow suit. They're going to have the same laws, and they're going to point at us and say, "we're doing the same thing as the US." So, I think it's a bit of a warning for us. It's definitely a concern with China, though, having that on the books.
And I also applaud Alibaba for following appropriately what, you know, I would at least call international norms and what we expect from the cybersecurity industry.

Anna Delaney: Yeah. And complex. As always, Grant, this has been brilliant. Thank you very much for your thoughts on this. Really appreciate it. And, Tom, it's over to you.

Tom Field: Excellent. Well, I have the opportunity now, and the privilege really, to introduce a frequent guest here. He is called the father of zero trust. I call him the godfather, and some people even know him as the senior vice president of cybersecurity strategy with ON2IT. John Kindervag, it's such a pleasure to see you again.

John Kindervag: Hey, great to see you again, Tom.

Tom Field: So, I understand the state of Texas has released you some; you've been out and back out on the road, out in the wild. What's it like to be back traveling?

John Kindervag: Oh, it's, well, it's insane. I spent all of June and July on the road. So we saw each other. You, I, Anna, Grant, I saw somewhere in there, maybe RSA. So, I've seen everybody in person within the last two months. And it's good. The world needs people to see each other face to face. There's just demand for that.

Tom Field: Oh, I agree with you 100%. Now, I know that you spent significant time in Europe. Tell us, how is the conversation about zero trust any different in Europe now than what you've seen in the US? And is it top of mind for the people you met?

John Kindervag: Well, you know, I like to say the world is flat because TCP/IP made it flat, right? We're all directly connected to the world's most malicious actors. So, I think the people in Europe have faced the same challenges. And as we've talked before, you and I, several times, the presidential executive order gave people in the US the freedom to think, "Okay, I might could do the zero trust stuff. It's okay, because, you know, the incentive structure has changed." Well, I found that that resonated all the way over to Europe. So, there was a lot of discussion about the presidential executive order that happened in May of 2021. And they're very aware of that.
And so, it seems to have changed the incentive structure over in Europe as well.

Tom Field: Do you see any other government mandate that has the stature of the executive order?

John Kindervag: Well, there are different governmental compliance or regulatory efforts in almost every country, and they're a bit of the stick, right? The executive order is much more of a carrot for other countries, because they're not mandated to do it according to a law, like you would have to follow it if you were a US federal government agency, but it's all about incentives and feeling good about doing what you're doing, right? So, they now feel like it's okay to talk about zero trust. It's no longer the first rule of Fight Club, right? As I've joked so many times, it's okay to talk about it. And then that emboldens people to want to talk about it and want to do it, and that's very gratifying.

Tom Field: In your lifetime, John, you're going to see the GDPR of zero trust.

John Kindervag: Maybe. I don't know what's going to happen in my lifetime. I've been around for so long that a lot of things have happened that I never would have imagined happening.

Tom Field: Well, I'm going to give you your Chinese flavor of this conversation now. As you know, cybersecurity researchers have found some severe software vulnerabilities in a popular Chinese-made automotive GPS tracker. It's used in 169 countries and poses a potential danger to highway safety, national security and supply chains. CISA, in fact, said in a statement recently that it was not aware of any active exploitation of the vulnerabilities. First question for you: are we inflating the problem here?

John Kindervag: No, absolutely not. We're not inflating it. I mean, it's just having an awareness of it. Because the first time that somebody cuts off the fuel line to some truck on a highway and it crashes and kills people, and we find out that it was caused by, you know, this particular vulnerability, everybody will want to know why we weren't aware of it. So, you have to disclose it early and often.
Secondly, it speaks to the fundamental problem, which is that when people design things, they aren't thinking about everything that could possibly go wrong. And they're not incentivized to think about what could possibly go wrong. And therefore, a lot of things can go wrong. And especially in an industry like this, the manufacturer of this device maybe knew that there was a problem, maybe knew they could add more security, but I'll bet it's a very low-margin sale. So, they didn't want to add something that would cost an extra 25 cents, which to you and me means nothing. But if you're selling millions and millions of something, 25 cents an item adds up pretty darn quickly. So this is a great example of one of the fundamental problems we have: the disconnect between the people who make the technology and the people who have to secure it, and the lack of incentives to build technology that has security controls in it, so that we can do something to protect it.

Tom Field: Good point, the sprint is always prioritized ahead of the security. How do we deal with this?

John Kindervag: You know, you have to have something that makes people want to do things the right way, right? I mean, I've struggled with this my whole career. My first experience was with morphine infusion pumps, trying to talk manufacturers into putting some security controls in there for security people, because you had very little, and they just didn't care, right? "It's not going to benefit us." Well, what about the poor patient? "Not our problem." So you've got to make it everybody's problem. And I don't know if you do that with legislation. I don't know if you do that with some other sort of governance. But there has to be something, you know. People will sometimes do the right thing. But more often, they have to be dragged, kicking and screaming, to be forced to do the right thing. And that's what we need to do.

Tom Field: Full circle. We're back to the GDPR of zero trust in your life.

John Kindervag: Maybe so.

Tom Field: With that, let me bring Anna and Grant back. Anna, please?

Grant Schneider: Well, I mean, I think the first aspect is just culture in general, which I think is, you know, about organizational culture, where you're trying to build a community inside of any organization, and that goes for your security organization, that goes for, you know, the entirety of your organization. Also, you know, we learn so much from each other just by interacting that we are not even aware of, right? And I think that some of that is lost when you can't have teams that get together. I think also, with your teams, you know, when you're in person, everyone's in person. And so, you're getting to share with everyone, as opposed to sharing with your favorite person at work, who maybe you do call 10 times a day or 10 times a week, and you're able to have that interaction. So, I think it's a challenge across the board for just learning about an organization, learning about your craft and growing in your field. And, you know, as John said, humans are meant to see one another in person and face to face.

Anna Delaney: John, what about the specific security concerns for you?

John Kindervag: Well, I'm not very concerned about the security problems, because I think we've solved that. We proved during the pandemic that we could securely deliver access to corporate resources, no matter where anybody was. And that's actually one of the big incentives or big drivers for the growth of zero trust: the pandemic. So, the security things aren't a big issue. You're not more secure if you're in the office than out of the office. But you might learn a few things, right? But in general, people don't want to go back to the office, because for their day-to-day work, they don't need to be in an office, which is the downside of the technology we enabled during the pandemic. Because we made it so seamless to work away from the office, you think, "why should I go into the office?" And a lot of companies are seeing that people, when they're told to come back to the office, will just find another job where they don't have to go into an office. So, you have to ask the question: why do you need people in the office?
And sometimes, there are compelling reasons to have people in the office, and I think when those things occur, then people are willing to go, right? So, have an event, have a special meeting, but not every day. But if you try to pretend that you need them to go to the office, and the real reason is, "well, I'm paying for the darn real estate, and I want to get my money's worth," then that's not a compelling reason for the average employee. They don't really care about that particular problem you have.

Anna Delaney: Tom, are you hearing the same?

Tom Field: I am. And I think that they've nailed it. What we're in is a cultural shift right now. And I think that the focus of this culture shift is on the tension between those that want people to return to a central office and those that want to maintain their independence. And the focus is on that discussion. Meanwhile, the adversaries are taking advantage of divide and conquer. They've got people in disparate offices and now are able to focus social engineering efforts on individuals, whereas it used to be, if you saw a suspicious email or something came through with an attachment, you could look up or go into the next office and say, "Hey, take a look at this, how does this look to you?" That's gone now. And I think that there are more people that are falling victim to some of these social engineering schemes because of it, or they're being taken advantage of by external actors and becoming unwitting insider risks, which is a whole other issue to deal with. That's the conversation I wish we were having more, and not just about where you happen to set your laptop down today.

John Kindervag: And that conversation is easily solved within the zero trust realm, where we say, what are we trying to protect, right? So, again, if you have a perimeter defense, it's not going to work, whether people come into the office or not. And maybe you're right, maybe if there's a problem, somebody could, you know, go to the next cube and ask somebody, but they might not know anyway. So we have to, and every company has ways to report, you know, suspected phishing emails and all that kind of stuff.
And they all have security awareness training, and whether that's useful or not, I think, is still a question. But if we can really protect the digital assets, the data, the assets, the services, then we're, you know, we're going to enable remote work, we're going to enable on-premises work, right? I mean, you're even seeing changes in interior design. Like, I would go to places and this whole area used to be cubicles, and now it's, you know, a multifunctional workspace with pool tables, because, well, they've kind of acquiesced to the fact that people are only going to come in occasionally. But I do think, you know, the upside is it gives really thoughtful leaders opportunities to create reasons for people to want to come into the office, you know, whether it's a special event. You know, when I was in Europe, I had large crowds in these offices that had come in to see my presentation. So that became a special event, and there's some food and all that kind of stuff. People still go out and do a lot of this business in pubs and bars at night. So, you know, that's still, that hasn't slowed down. So there are other ways to accomplish the same goal, as opposed to, you know, what was that movie, The Hudsucker Proxy, kind of world of corporate America. You know, I think that we got too far into that.

Tom Field: This is why he's the godfather.

John Kindervag: It's a circle, right? Remember that?

Anna Delaney: Yeah. We're social animals, after all. So this has been a brilliant conversation. Thank you very much, thoroughly, gentlemen. I really enjoyed this. John Kindervag and Grant Schneider, thank you so much for your time and insight.

Tom Field: Gentlemen, thank you. Pleasure to be with you. Anna, as always.

Anna Delaney: It's been a pleasure. Thank you.

Grant Schneider: Thank you so much. Really appreciate it.