Anna Delaney: Hello and welcome to this special identity security edition of the ISMG Editors' Panel. I'm Anna Delaney, and this week, joining us to discuss identity and access management trends, solutions and much more is our great friend, the distinguished Jeremy Grant, managing director - technology business strategy at Venable LLP. The band also includes Tom Field, senior vice president of editorial, and Mathew Schwartz, executive editor of DataBreachToday and Europe. Very good to see you all. Jeremy, welcome back.

Jeremy Grant: Good to be here.

Anna Delaney: It's been a while. How are you doing?

Jeremy Grant: I've been well, thanks. How have you been?

Anna Delaney: Very good. We are all happy - happy to see you. And where are you today, Jeremy?

Jeremy Grant: Oh, well, I'm in my office, but this is not the view from it. Two months ago, I was in Taipei for a week, and this was a scene from their big lantern festival; they were celebrating the Year of the Rabbit. This bunny must have been 50 or 60 feet, rotating around, shooting lasers out, with video screens for eyes that were looking all around in kind of crazy ways. It was good fun. Definitely my favorite bunny that I have seen in '23.

Anna Delaney: Sounds a bit eerie.
But what an incredible opportunity to be there at that time. Tom, you're out in the skies again.

Tom Field: Surprise - high above Houston. But I'll tell you, I saw something last week I have never seen in decades of air travel. Saturday, taking off in a rainstorm, I looked out my window, and right beside me was this gorgeous rainbow. Now, I'm not sure where the pot of gold was - if that was the end, or that was the end - I'm pretty sure the pot of gold was not in Houston. But the rainbow impressed me, so I took this picture. I said, this is for the next Editors' Panel.

Anna Delaney: The gold is in the shot, indeed. Mathew, lovely light display going on there as well.

Mathew Schwartz: Thank you. Yeah, this is Edinburgh last weekend. I managed to get away for a little while; this is the Balmoral Hotel. I think it used to be called the North British Hotel back in the day, but it's been rebranded since then, and it's a beautiful landmark on Princes Street in the New Town.

Anna Delaney: Gorgeous architecture. Well, I am at the Chelsea Flower Show. This was taken last year; spring is in the air here, so I thought I'd just share this nice view of the flowers. It's an annual retreat for green-fingered folks - unlike me, but mostly green-fingered folks.
So Jeremy, as you know, we have a few questions for you. At this point, I'll hand over to Tom for his opening move.

Tom Field: Excellent. Jeremy, pleasure to see you today. We've talked a lot over the past three years, and as we come into the spring of 2023, it's a good time for reflection. So I'm going to ask you: after three-plus years of digital transformation, how would you say all of this has changed the way business and security leaders view identity today, both as a strength and as a vulnerability?

Jeremy Grant: Well, three years ago - just as a reminder for everybody - was when we were just starting our lockdown for COVID. And so I think when we talk about three years of digital transformation, a lot of it - for purposes of level setting - was forced, "holy crap, we've got to hurry something out quickly" digital transformation, as opposed to the kind of "hey, let's come up with a multi-year plan and put something in place." And so, with that, I'd say the results are mixed.
I think for some companies who have embraced identity - and perhaps even doubled down on it, gone back to some of the things they rolled out a few years ago and really made some firmer investments - identity is definitely a strength. They feel comfortable about their security; they're not necessarily particularly worried about identity-centric attacks on their systems, for the simple reason that they've closed off a lot of the most commonly exploited vulnerabilities. And to the extent that they have good identity and access management systems for their customer-facing websites, they're able to put more high-value, high-risk applications online in a way that might be hard if you didn't have that sort of infrastructure. From a vulnerability perspective, I think the sad thing there is we're not seeing that many organizations who have done this. And so we're still seeing, year after year, people taking advantage of the same - what I would call two or three - deficiencies in digital identity infrastructure to steal lots of money and data.
And so it continues: on a monthly basis, there's another breach, and it's like, "oh, well, I think I've read about this one before," because it's the same bad actors - or their cousins or nephews or whatever - who are basically using the same attack vectors to get in. And I think we have a pretty good idea around how these things happen and why they're happening, but we're not necessarily seeing a lot of urgency in terms of trying to put solutions in place that can stop them.

Tom Field: Now, that's well said. For a follow-up question, we turn to Matt.

Mathew Schwartz: Great to see you again; thanks for being here. I really appreciate your identity insights. One of the things I've been following recently is the Improving Digital Identity Act in the U.S., which I believe recently advanced out of committee and is going to be going to the Senate for full consideration. The legislation is aiming to establish a government-wide effort to develop secure methods for government agencies to be dealing with identity. So talk us through this, if you will: what would this bill be doing that isn't already being done?
Jeremy Grant: So I think the big issue around the Improving Digital Identity Act is really focusing on how we can bring digital identity infrastructure to consumers. And to the extent that it's looking to coordinate an effort across different government agencies, what it's really looking at, at its core, is this: we talked before with Tom about three years of digital transformation - let's expand it to, say, 30. Going back to the early 90s, when we first started going online, I often talk about a famous cartoon that was published - it'll be 30 years ago in July - where two dogs are on the internet, and one dog turns to the other and says, "On the internet, nobody knows you're a dog." We're still dealing with that problem today. In fact, it's actually gotten a lot worse than it was 30 years ago, when the cartoon was published. I've also pointed out those dogs are sadly a blessed memory because of dog years. But the problem is still with us.
A big issue that we're dealing with - and I think, really, what this legislation is looking to deal with - is that, at least in the U.S., we don't have a national ID, but we do have a number of what I would call nationally recognized, authoritative identity systems: be it the birth certificate I got from the county I was born in, the driver's license I get from the state that I live in now, or things like my passport and my Social Security number that I get from the federal government. So federal, state and local are all issuing nationally recognized, authoritative documents, and all of them are stuck in the paper and plastic world. And so, when we talked before about all of the bad things that keep happening with identity-centric attacks, particularly when it comes to stealing consumer information, one of the challenges we have is that, as we look to put more and more things online, we have this identity gap between these legacy things that are stuck in the paper and plastic world and what we have in the digital - what we need in the digital, I should say.
And so at the core of the Improving Digital Identity Act is really directing the White House to pull together key issuers from the federal, state and local level, along with private sector stakeholders, privacy and civil liberties advocates, and others that you want at the table, to try and figure out a coordinated approach for how you close that gap - so that people can actually have digital counterparts to the documents they have that work in the paper and plastic world but do not work in digital.

Mathew Schwartz: Great stuff. Thank you, Jeremy. Going to hand over to Anna - tag team.

Anna Delaney: Well, thank you so much. So on this program, we often talk about the state of passwordless security, and we often have positive discussions about the progress toward a password-free future. However, I often talk with CISOs who still think this is a fantasy scenario. And I'm sure you're aware that in the EU, the use of biometrics is still challenged - it has its doubters. So I'm wondering what it's going to take to turn this around. We recall that last year, Google, Apple and Microsoft adopted FIDO, of course, but we haven't heard much since. So do these companies still have a lot of work to do?
Jeremy Grant: I mean, I'm very bullish these days on the ability to go passwordless. So, this picture I mentioned - I was in Taipei for FIDO's winter plenary. And I will say, given the progress I was hearing there from members, as well as a lot of their customers, in terms of the ability to go passwordless, I really think, as this year goes on, you're going to see more and more companies, particularly in the consumer space, replacing passwords with passkeys: the idea of a multi-device credential that can essentially be synced across different devices. You're already seeing a number of big consumer brands - PayPal has probably been the most notable - who have already started to launch it. And so the idea that the default for consumer applications is that you're going to need to ask people to create a password the first time they sign up for a service - I think that is going to go away. Now, these things take time. Part of what we saw in 2022 was an announcement from the big tech platforms to all embrace this concept of multi-device passkeys.
There was a little bit of a gap between their announcement of their intent to support it and then rolling out support in newer platforms; that has largely happened right now, particularly on Apple and Google platforms, and Microsoft's got a lot in the works as well. So it is happening, and you're starting to see things roll out. On the enterprise side, it's a little bit different, just in that the requirements might be a little different; sometimes the regulatory requirements also vary between countries. But I'm seeing a lot of folks who are able to go truly passwordless - say, embracing things like FIDO security keys, which can be rolled out with or without biometrics. And as you mentioned, biometrics - we have this in the U.S. as well - present some issues if you're actually creating and storing central repositories of biometrics. The FIDO model is all leveraging match-on-device, which is actually good for privacy and good for regulatory compliance, in that there really isn't any risk of some big biometric database being compromised, or of people reverse engineering things. It's all just stored securely on your device, and you can then use that to unlock a cryptographic key. So we're not there yet.
And one thing I've seen from my years in this space is there's a gap between the announcement of a new standard and when it gets adopted. So I think it'll take some time. But the flip side is, if I'm a CISO who's actually interested in going passwordless, there are some options out there right now that were not there a year ago. And I would suggest that people do a deep dive, look at what's actually happened over the last 9-12 months, and take advantage of some of these new tools that are now out there and widely supported, particularly in people's consumer devices.

Anna Delaney: Excellent. Well, Tom, back to you.

Tom Field: I want to talk to you about MFA fatigue. It's a term that arose in 2022, we hear it consistently these days, and it's a legitimate concern. But my question for you is: how do we counter the narrative, which might suggest that MFA is a bad thing? I don't think that's the intent, but it could be interpreted that way.

Jeremy Grant: No, I think the big message is: all MFA is not the same.
And it's a little frustrating, because it seems like right around the time that we actually got most people to understand the importance of MFA, we saw a shift in the kinds of attacks we were seeing, where some legacy MFA types - be they based on one-time passcodes or on push notifications - suddenly became susceptible to phishing attacks. And so the idea of "it can all be compromised, so don't bother" - that's a really dumb idea. One, even the most basic MFA is still going to block over 99% of attacks. The real issue comes down to a very determined attacker who's actually trying to target you: well, then, just like they can phish you in a way that gets you to hand over your 12- or 14-character password, they can phish a one-time passcode. Or, as we've seen with MFA fatigue attacks, if it's based on a push notification, they can potentially trick you into pushing approve, and then somebody's owned you on the backend. The good news is, you don't have to use that MFA. I mean, going back to what we were talking about before with passkeys and FIDO - FIDO can't be phished. There's no shared secret on either end; there's nothing that you're going to do to trick somebody.
It's basically - whether you're using a security key or an embedded authenticator that's built into the platform - using asymmetric public key cryptography. So this is why we've seen the White House and CISA, and over in Europe the NCSC - the Netherlands' National Cyber Security Centre - and a lot of other governments across the globe flagging this for several years: MFA can be compromised, so make sure you're using an MFA that can actually stand up to phishing attacks. To me, it's a pretty straightforward solution. There's a great case study out there from Cloudflare, who - last summer, when MFA fatigue attacks were coming around - basically put out a blog post that, I may get this slightly wrong, so if anybody from Cloudflare is watching, please don't get too upset, but the basic gist of it was: "Hey, that same attack that was victimizing a lot of other companies hit us as well, and our folks actually got tricked into clicking on a link. But because we were using FIDO security keys, the attack died right there. They couldn't get in." So the technology is there; we know how to stop it.
And it's, I would say, a little frustrating to have people keep throwing their hands up and saying, "Well, now MFA can be compromised, what are we supposed to do?" when there's a lot of really clear evidence out there that suggests what kind of authentication you should be using that can stand up to those attacks. And increasingly, we're seeing regulators point to this as well. I mentioned the White House and its zero trust strategy, which covers government sites. But if you're a financial services firm, the CFPB - the Consumer Financial Protection Bureau - put out guidance last September that said you should be using phishing-resistant MFA, and they pointed to the FIDO standards. Or you look at the recent Drizly settlement from the Federal Trade Commission, where they had a pretty consequential breach: one of the things put into the settlement agreement was that they should be using phishing-resistant MFA. Which, I'll say - sitting in a law firm, where we're often advising companies on how to avoid getting in trouble with the FTC - that's a great settlement to point to. Don't be like those guys, if you do this and something happens.
Well, one, it probably won't happen - the bad thing - but if it does, you can at least say to the regulators that you were using the best stuff out there. So I feel like the tools are out there right now. But there's definitely a lot of confusion in the market still - back to that point I made before: a new standard emerges, it's great, but there's a lag in adoption between when it emerges and when we actually get it used more ubiquitously. But we can definitely stop MFA fatigue attacks.

Tom Field: Excellent insight. Thank you, Jeremy. With that, I want to pass this back to Matt.

Mathew Schwartz: Hi again. I love how we have all these burning identity issues; we just throw them at you, and you just knock them back. This is really great. As a reporter, one of the things I've been tracking is the questionable, if you will, activity with login.gov. So, very briefly: on March 29, the U.S. General Services Administration's inspector general appeared before a House subcommittee to detail how GSA "misled customers on login.gov's compliance with digital identity standards," according to the IG's report that came out last month. So the IG alleges GSA was billing its customers - as in, U.S.
government agencies - for a multifactor authentication login platform meeting the standards required by NIST, but which actually was failing to meet the NIST requirements. What's your take on this? Is this a cultural issue? Are there broader, more cautionary takeaways? You were talking about MFA getting a bad rap; it seems like somebody else is getting a bad rap here, too, about the quality of identity being provided.

Jeremy Grant: Yeah, I mean, look, it was a pretty depressing report to read - the inspector general's report that came out a couple of weeks before - and then the hearing that you mentioned last week. I mean, this was the House Oversight Committee, which is not necessarily known as a place where there's a lot of bipartisan cooperation. But this was a very bipartisan hearing, in terms of Democrats and Republicans all basically outraged about what had happened. To dive into it: login.gov is a system that the U.S. General Services Administration has been building. They must have started close to 10 years ago - 2013/2014 onward. And talking about MFA versus identity proofing, just to level set: what login.gov has been is a single sign-on, account management and authentication service.
So the idea being that, as an American, I can go to four or five government websites and sign in with the same username and password, and then there's an MFA layer on top of it. In fact, login.gov was one of the very first consumer-facing sites to implement FIDO. So actually, their MFA components are pretty sound. Where they got in trouble is on the account creation process. I'm not worried about authentication - how do I log back in - but for those applications where it's actually pretty important to prove I'm really Jeremy Grant, and this particular Jeremy Grant, identity proofing is important. And there, the standard that agencies have been expected to meet for quite some time is identity assurance level 2, as defined by NIST in its digital identity guidelines. So where GSA got in trouble is they never actually had a solution for identity proofing that met this identity assurance level 2 (IAL2). Further, somewhere on the path to becoming IAL2 compliant, they decided they actually weren't going to bother, because they weren't comfortable with the use of biometrics in the context of a one-to-one selfie match - where you'd, like, take a picture of your driver's license and then take a selfie on the phone that would match to it.
But they didn't tell anybody. They didn't tell the agencies. They kept telling the agencies they were complying with IAL2. This went on for about a year, and agencies were paying for it. And then, when it was discovered, as you might imagine, some things kind of hit the fan. Now, GSA, to its credit, when they discovered this, went to the inspector general in their agency and said, "Something happened here; we actually think you need to dive into this." The report details what happened. Some people lost their jobs over it. I mean, it was not a very happy ending. There's a new team there now that is hopefully in a position to do a little bit better. But yeah, a lot of what we heard in the hearing last week was frustration with the agency. The whole idea of having the government develop this as a shared service is so that you have a single platform that agencies can trust - and if agencies can't trust it, then there are some broader issues there.

Mathew Schwartz: Fantastic. It'll be an interesting story to follow. Hopefully, they're going to get everything sorted out sooner rather than later, because obviously we need it. Thank you, Jeremy.
356 00:19:57.960 --> 00:20:00.390 Jeremy Grant: I will say, tying back to your original question 357 00:20:00.390 --> 00:20:02.880 about the Improving Digital Identity Act, I mean, this gets 358 00:20:02.880 --> 00:20:07.140 into - and I actually wrote an op-ed on this a few weeks ago - 359 00:20:07.380 --> 00:20:11.190 a bigger question: is the GSA even in a 360 00:20:11.190 --> 00:20:14.550 position to be able to do identity proofing at scale for 361 00:20:14.550 --> 00:20:17.280 millions of Americans? They're not an agency that is in the 362 00:20:17.280 --> 00:20:20.910 identity business, at least when it comes to trying to figure out 363 00:20:20.910 --> 00:20:24.450 who's who. And back to the point I made about driver's license 364 00:20:24.450 --> 00:20:27.420 bureaus, state vital records bureaus, passports, Social 365 00:20:27.420 --> 00:20:29.940 Security numbers: we have nationally recognized, 366 00:20:29.940 --> 00:20:33.540 authoritative identity systems that are out there today. The 367 00:20:33.540 --> 00:20:37.740 best way, from my perspective, to help an agency like GSA or any 368 00:20:37.740 --> 00:20:40.770 other federal agency actually know who's who online is to 369 00:20:40.770 --> 00:20:42.990 come up with digital counterparts to those 370 00:20:42.990 --> 00:20:45.240 authoritative systems, rather than try and build some new 371 00:20:45.240 --> 00:20:48.480 system that essentially creates, I guess, what I would call the 372 00:20:48.480 --> 00:20:51.810 digital equivalent of the DMV for people to go through again. 373 00:20:52.020 --> 00:20:54.900 So I already went through a crappy process at the DMV a few 374 00:20:54.900 --> 00:20:57.360 years ago. I'd like to reuse that rather than have to go 375 00:20:57.360 --> 00:21:00.810 through a new version of that at the GSA, just so I can engage in 376 00:21:00.810 --> 00:21:04.950 something involving a federal service.
That's a way to, I 377 00:21:04.950 --> 00:21:08.100 think, get to not only better security, but also streamline 378 00:21:08.100 --> 00:21:10.470 the user experience so that I don't have to spend 15 or 20 379 00:21:10.470 --> 00:21:14.610 minutes proving who I am. So from my perspective, the 380 00:21:14.610 --> 00:21:16.650 approach the Improving Digital Identity Act 381 00:21:16.650 --> 00:21:19.500 moving through Congress is pushing would actually help 382 00:21:19.680 --> 00:21:22.170 solve this problem for GSA, so they don't have to build their 383 00:21:22.170 --> 00:21:24.600 own system. They can just leverage other tools that are 384 00:21:24.600 --> 00:21:26.970 out there that tie back to authoritative sources already. 385 00:21:27.660 --> 00:21:30.300 Mathew Schwartz: Yeah. Not rebuilding things from scratch, 386 00:21:30.390 --> 00:21:33.930 maintaining usability. I know usability is a big question that 387 00:21:33.930 --> 00:21:35.490 Anna has, as well. 388 00:21:36.300 --> 00:21:44.880 Anna Delaney: Definitely. Nicely set up. I'm glad MFA fatigue 389 00:21:44.880 --> 00:21:47.640 attacks have been raised, because this is my next topic. I 390 00:21:47.640 --> 00:21:51.330 recently moderated a roundtable on that very topic: tackling MFA 391 00:21:51.330 --> 00:21:55.050 fatigue attacks. And it became abundantly clear early on in the 392 00:21:55.050 --> 00:21:58.020 conversation that most organizations still have a lot 393 00:21:58.020 --> 00:22:02.010 of work to do when it comes to deploying MFA where it's needed. 394 00:22:02.010 --> 00:22:04.590 Actually, somebody said, "Well, this is a bit of a luxury, even 395 00:22:04.590 --> 00:22:08.100 talking about MFA fatigue attacks." So what's your advice 396 00:22:08.100 --> 00:22:12.450 to help companies get to a place where MFA is deployed everywhere 397 00:22:12.630 --> 00:22:13.500 they want it to be?
398 00:22:14.340 --> 00:22:18.330 Jeremy Grant: Well, I think part of it is: break down where 399 00:22:18.330 --> 00:22:20.790 your risks are, break down the different use cases and the 400 00:22:20.790 --> 00:22:23.040 populations of people and, you know, focus on those that are 401 00:22:23.040 --> 00:22:25.650 highest priority. So look, I mean, I see a lot of 402 00:22:25.650 --> 00:22:27.900 organizations where they're starting with something like 403 00:22:27.900 --> 00:22:32.070 FIDO security keys for privileged users, and to 404 00:22:32.070 --> 00:22:37.530 lock down remote access - the places where things could most 405 00:22:37.530 --> 00:22:39.480 likely go wrong, and where there are potentially the most 406 00:22:39.480 --> 00:22:42.660 consequences. And then, from there, look to roll things out 407 00:22:43.830 --> 00:22:47.370 to other users, to the extent that you can. But you 408 00:22:47.370 --> 00:22:50.730 basically don't need to tackle everything all at once. You can 409 00:22:50.730 --> 00:22:54.540 really break things into parts. So, to be clear, this is 410 00:22:54.540 --> 00:22:56.670 actually where I get excited about passwordless with things 411 00:22:56.670 --> 00:23:01.020 like FIDO standards: we can actually deliver something that 412 00:23:01.050 --> 00:23:03.600 is not only much more secure, but also has a much better user 413 00:23:03.600 --> 00:23:06.480 experience. Nobody's going to yell at you if you eliminate 414 00:23:06.480 --> 00:23:08.370 passwords from your organization. In fact, they'll 415 00:23:08.370 --> 00:23:12.570 probably carry you around on their shoulders and yell hurrah, 416 00:23:12.570 --> 00:23:16.380 hurrah, because that's actually a great thing that you can do.
417 00:23:18.060 --> 00:23:20.220 And I think there are certainly some good stories out 418 00:23:20.220 --> 00:23:22.620 there around how organizations have eliminated passwords, and it 419 00:23:22.620 --> 00:23:24.840 turns out that people are much happier with the IT department. 420 00:23:25.320 --> 00:23:29.190 But look, you can't necessarily focus on every user right away. 421 00:23:29.190 --> 00:23:31.860 There are always going to be what I would call edge cases, corner 422 00:23:31.860 --> 00:23:34.980 cases, where something that you want to roll out for most people 423 00:23:34.980 --> 00:23:40.890 might not work. So I love my YubiKey and I'll carry it with 424 00:23:40.890 --> 00:23:44.910 me till the day I die, or who knows how long that will be. But 425 00:23:46.140 --> 00:23:48.360 there may be other solutions that work for other people as 426 00:23:48.360 --> 00:23:51.150 well. I think the most important thing is to at least be able to 427 00:23:51.150 --> 00:23:54.660 lock down those highest-risk accounts and highest-risk use 428 00:23:54.660 --> 00:23:56.850 cases, and then look to push other solutions out from 429 00:23:56.850 --> 00:23:57.120 there. 430 00:23:58.890 --> 00:24:00.900 Anna Delaney: Very good. Well, this has been insightful, as 431 00:24:00.900 --> 00:24:03.480 always, and educational. Jeremy, thank you so much. Just one more 432 00:24:03.480 --> 00:24:07.950 question before we log off. I'll ask Tom and Matt first, 433 00:24:07.950 --> 00:24:10.350 perhaps give you a break. But I know, Jeremy, that you're a bit of 434 00:24:10.350 --> 00:24:14.100 a song lyrics buff, so this shouldn't be too tricky for you. 435 00:24:14.310 --> 00:24:18.000 If I were to ask you to write a song on the theme of identity 436 00:24:18.000 --> 00:24:22.080 security, what would the title of that song be? Maybe Tom and 437 00:24:22.080 --> 00:24:23.790 Matt first? Go for it.
438 00:24:24.390 --> 00:24:27.600 Tom Field: You can tell everybody that this is your 439 00:24:27.600 --> 00:24:31.170 song. Your Song by Elton John. 440 00:24:32.010 --> 00:24:35.850 Anna Delaney: Beautifully sung as well. Elton was a bit of a 441 00:24:35.850 --> 00:24:36.600 treat in the house. 442 00:24:40.230 --> 00:24:42.150 Mathew Schwartz: Well, someone's got to pick up that 443 00:24:42.180 --> 00:24:47.490 baton and carry it forward. So I was thinking: do we 444 00:24:47.490 --> 00:24:51.510 go with an established tune, like Don't Go Breaking My 445 00:24:51.510 --> 00:24:59.430 Heart? Or do we go with something about giant kittens 446 00:24:59.430 --> 00:25:02.160 with lasers shooting out of their heads, like Two-Factor 447 00:25:02.160 --> 00:25:07.080 Reactor? I can see that being a real disco hit. But I thought 448 00:25:07.080 --> 00:25:10.080 I'd go with something a little more R&B, something a little 449 00:25:10.080 --> 00:25:15.390 more contemplative. Since You - that's the capital letter YOU - 450 00:25:15.420 --> 00:25:16.650 Since YOU Authenticated. 451 00:25:17.670 --> 00:25:23.370 Anna Delaney: Oh, that's really good. Very impressed. Jeremy? 452 00:25:25.410 --> 00:25:28.200 Jeremy Grant: Well, I'm not singing, but I was thinking, if I 453 00:25:28.200 --> 00:25:31.200 was writing a song about the state of identity security, it would 454 00:25:31.200 --> 00:25:34.500 be called Why Do the Same Dumb Things Keep Happening to Me? 455 00:25:40.740 --> 00:25:42.360 Mathew Schwartz: I am sure there's a pickup truck and a dog 456 00:25:42.360 --> 00:25:42.930 in that one. 457 00:25:44.850 --> 00:25:48.420 Jeremy Grant: And a bottle of whiskey. But I really do feel 458 00:25:48.420 --> 00:25:53.130 like when it comes to identity security, we have to joke about 459 00:25:53.130 --> 00:25:56.370 this, because this is where I've spent a lot of my job - a lot 460 00:25:56.370 --> 00:25:59.190 of my career - and I'd love to solve this stuff.
It is the same dumb 461 00:25:59.190 --> 00:26:04.020 stuff happening year after year. Most of it is tied to things like 462 00:26:04.020 --> 00:26:06.510 compromised passwords and people not implementing MFA, 463 00:26:06.510 --> 00:26:10.350 or the right kind of MFA. And, like, we know what the attackers 464 00:26:10.350 --> 00:26:12.630 are doing. We know how to solve this. And yet, we're singing 465 00:26:12.630 --> 00:26:18.210 this same sad song year after year. Most of the time you see 466 00:26:18.210 --> 00:26:20.400 these attacks, and the company comes out and goes, "Oh, it was a 467 00:26:20.400 --> 00:26:24.420 very sophisticated attack using cutting-edge techniques from a 468 00:26:24.420 --> 00:26:27.060 hostile nation state." And I'm like, well, that sounds really 469 00:26:27.060 --> 00:26:29.040 defensible. How could you protect against that? And then 470 00:26:29.040 --> 00:26:31.980 it's like, "Oh, yeah, we had a compromised password," and then 471 00:26:31.980 --> 00:26:34.080 they came in and escalated privilege. And I'm like, I've 472 00:26:34.080 --> 00:26:37.260 only seen that in about 300 incidents over the last 10 473 00:26:37.290 --> 00:26:41.310 years. So why do the same dumb things keep happening to me? It 474 00:26:41.310 --> 00:26:43.680 would be nice if we solved this, and I could go do something else 475 00:26:43.680 --> 00:26:47.220 with the back half of my career. But I'll always still be happy 476 00:26:47.220 --> 00:26:50.820 to come and hang out with the ISMG crew to talk about this. 477 00:26:50.850 --> 00:26:53.070 Anna Delaney: Oh, you could be a lyricist. I mean, that's 478 00:26:53.430 --> 00:26:57.690 definitely a viable job for you - or second career, shall we say. Well, I was 479 00:26:57.690 --> 00:27:01.470 going to say Always on My Mind, because, I mean, this stuff is 480 00:27:01.470 --> 00:27:06.810 always on our minds. Right? Jeremy, this has been a real 481 00:27:06.810 --> 00:27:09.660 pleasure for all of us.
So thank you so much for joining the ISMG 482 00:27:09.660 --> 00:27:10.380 Editors' Panel. 483 00:27:10.920 --> 00:27:11.910 Jeremy Grant: Thank you. Appreciate it. 484 00:27:12.150 --> 00:27:12.690 Mathew Schwartz: Thank you. 485 00:27:14.010 --> 00:27:15.900 Anna Delaney: And thank you so much for watching. Until next 486 00:27:15.900 --> 00:27:16.290 time.