WEBVTT 1 00:00:00.030 --> 00:00:02.970 Anna Delaney: Hello, I'm Anna Delaney and welcome to the ISMG 2 00:00:02.970 --> 00:00:05.760 Editors' Panel. And this week, we are joined by a very good 3 00:00:05.760 --> 00:00:09.180 friend of ISMG. And that is, of course, the excellent Jeremy 4 00:00:09.180 --> 00:00:12.030 Grant, managing director of technology business strategy at 5 00:00:12.030 --> 00:00:15.990 Venable, also coordinator of the Better Identity Coalition. And 6 00:00:15.990 --> 00:00:19.770 also with us, familiar faces, Tom Field, senior vice president 7 00:00:19.770 --> 00:00:22.830 of editorial, and Matthew Schwartz, executive editor of 8 00:00:22.830 --> 00:00:26.160 DataBreachToday and Europe. Welcome, Jeremy, delighted to 9 00:00:26.160 --> 00:00:26.610 have you join us. 10 00:00:27.240 --> 00:00:28.440 Jeremy Grant: Thanks. Good to be here with you. 11 00:00:28.000 --> 00:00:31.720 Anna Delaney: So Jeremy, where are you? I'm going to take a 12 00:00:31.720 --> 00:00:33.280 guess. Maybe, Lisbon. 13 00:00:33.880 --> 00:00:35.920 Jeremy Grant: I'm in Lisbon, which is where I should be two 14 00:00:35.920 --> 00:00:40.420 weeks from today, if all goes well. Although I am most 15 00:00:40.420 --> 00:00:43.900 definitely not where Matthew is. We were just talking before this 16 00:00:43.900 --> 00:00:50.320 started. RSA was the super-spreader event that has ... I'm 17 00:00:50.320 --> 00:00:54.040 doing this with COVID now. So assuming I recover, I will be 18 00:00:54.040 --> 00:00:58.330 there in two weeks and definitely not at RSA. 19 00:00:59.140 --> 00:01:02.320 Anna Delaney: Well, wishing you a speedy recovery, Jeremy, and 20 00:01:02.320 --> 00:01:04.630 it was great to meet you in person as well. That was definitely ... 21 00:01:04.000 --> 00:01:05.830 Jeremy Grant: Most definitely! Long overdue. 23 00:01:07.210 --> 00:01:10.780 Anna Delaney: So Matthew, you are, perhaps, in San Francisco? 24 00:01:11.830 --> 00:01:15.610 Matthew Schwartz: Yes. I am flashing back to San Francisco. 25 00:01:15.610 --> 00:01:19.720 So, you know, just to Jeremy's point, we had worried that 2020 26 00:01:19.930 --> 00:01:25.120 would be the kind of RSA doomsday scenario. Sorry, not 27 00:01:25.120 --> 00:01:29.110 RSA, COVID. But it seems like there's a fair number of 28 00:01:29.110 --> 00:01:32.140 people coming out of the event this year, unfortunately, who 29 00:01:33.070 --> 00:01:37.000 did catch it this time. So far, so good for me, but fingers 30 00:01:37.000 --> 00:01:40.300 crossed. And it's a lovely event. It was wonderful to see 31 00:01:40.300 --> 00:01:43.930 people again in person, to get to meet Jeremy, and just to have 32 00:01:43.990 --> 00:01:45.100 a bit of a chat with everybody. 33 00:01:45.670 --> 00:01:48.580 Jeremy Grant: Hearing about all the people who've got it, there's a 34 00:01:49.180 --> 00:01:53.440 famous line from the rapper Nas, where he says, "I don't sleep 35 00:01:53.440 --> 00:01:56.620 because sleep is the cousin of death." And I'm thinking RSA is 36 00:01:56.620 --> 00:01:59.830 the cousin of death this year, given all the people who have 37 00:01:59.830 --> 00:02:02.920 emerged with disease and pestilence coming out of it. 38 00:02:02.000 --> 00:02:06.110 Matthew Schwartz: That's not the "Transform" tagline they were 39 00:02:06.110 --> 00:02:06.770 going with, Jeremy. 40 00:02:08.870 --> 00:02:10.670 Jeremy Grant: Cousin of death. It's going to be the theme next 41 00:02:10.670 --> 00:02:11.030 year.
42 00:02:12.740 --> 00:02:19.790 Anna Delaney: So, Tom, are you in Maine? Is that where you are? 43 00:02:19.000 --> 00:02:23.410 Tom Field: I am. That's because I had COVID pre-RSA. And I was 44 00:02:23.410 --> 00:02:25.900 unable to travel this year, because I didn't want to be the 45 00:02:25.900 --> 00:02:28.840 super-spreader. So I stayed home. And while you folks were 46 00:02:28.840 --> 00:02:32.140 opening your studio in San Francisco last week, I stepped 47 00:02:32.140 --> 00:02:35.170 out on my front porch and captured this sunrise. So yes, 48 00:02:35.170 --> 00:02:37.990 I'm in Maine. Will be here for a while. Jeremy, I'm here to 49 00:02:37.990 --> 00:02:42.460 encourage you. I came through in a good five to eight days, and I 50 00:02:42.460 --> 00:02:45.130 think you'll be fine too. And you'll be in Portugal soon. 51 00:02:45.570 --> 00:02:47.160 Jeremy Grant: I'm looking forward to that. Thanks. So, 52 00:02:47.460 --> 00:02:48.960 feeling well enough to be with you today. 53 00:02:49.440 --> 00:02:51.390 Anna Delaney: Take lots of rest. But of course, Tom, you were 54 00:02:51.390 --> 00:02:54.750 there in spirit because you were helping us so much on the 55 00:02:54.750 --> 00:02:58.080 ground. So we appreciate that. Well, guess where I am? 56 00:02:59.160 --> 00:03:01.650 Tom Field: I have an idea. But I'll keep that to myself. 57 00:03:01.870 --> 00:03:04.480 Anna Delaney: Well, after four intense days in the studios at 58 00:03:04.480 --> 00:03:08.290 RSA, before catching my plane back to London, I thought, I need 59 00:03:08.320 --> 00:03:12.010 air and I need nature. So this is what I found. I guess it's 60 00:03:12.010 --> 00:03:15.940 along the coast, around Muir Beach. But that was a 61 00:03:15.940 --> 00:03:16.900 glorious finale. 62 00:03:16.000 --> 00:03:16.900 Tom Field: Sausalito? 63 00:03:18.700 --> 00:03:22.030 Anna Delaney: Yeah, around there, sort of driving around. So, 64 00:03:23.080 --> 00:03:27.820 forgive my lack of geography here, and specifics. But anyway, 65 00:03:28.360 --> 00:03:33.400 I loved it. Jeremy, back to RSA for a moment, even though you 66 00:03:33.400 --> 00:03:36.880 probably don't want to talk about it for a while now. But 67 00:03:36.970 --> 00:03:40.510 from an identity perspective, what were the highlights and 68 00:03:40.510 --> 00:03:42.070 observations for you, really? 69 00:03:42.460 --> 00:03:45.190 Jeremy Grant: Beyond meeting you in person and talking identity? 70 00:03:45.310 --> 00:03:47.050 Tom Field: Close the conversation right there. 71 00:03:47.240 --> 00:03:50.960 Jeremy Grant: Yes. I mean, I'll say, you know, RSA, look, it's 72 00:03:50.960 --> 00:03:53.960 become a conference where I go out for the week, but I actually 73 00:03:53.960 --> 00:03:56.450 rarely go to the conference itself. I did speak this year. And I think 74 00:03:56.450 --> 00:04:00.140 the highlight for me was that panel: I got to moderate a great 75 00:04:00.140 --> 00:04:03.470 panel with Lisa Lee from Microsoft and Dr. Rébecca 76 00:04:03.740 --> 00:04:07.430 Kleinberger, who's a researcher with the MIT Media Lab, looking 77 00:04:07.430 --> 00:04:10.880 at voice, talking about voice both as a substitute for the 78 00:04:10.880 --> 00:04:14.090 keyboard but also as an authentication tool, and sort of 79 00:04:14.090 --> 00:04:17.240 looking ahead to all of the security concerns around it.
80 00:04:18.380 --> 00:04:21.320 I'll say it's not particularly rosy in terms of security when you're 81 00:04:21.320 --> 00:04:24.680 looking at voice, but we were looking ahead to, say, 82 00:04:24.680 --> 00:04:27.740 applications in the metaverse where voice will be the 83 00:04:27.740 --> 00:04:31.220 keyboard. You're not going to be typing while you're wearing, you 84 00:04:31.220 --> 00:04:34.160 know, a headset, for example. And you know, what might that mean? 85 00:04:34.190 --> 00:04:36.800 So really good discussion, good audience participation and 86 00:04:36.800 --> 00:04:41.750 questions. And you know, that was a real fun panel to dive 87 00:04:41.750 --> 00:04:43.430 into for 45 minutes. 88 00:04:43.000 --> 00:04:46.240 Anna Delaney: And Jeremy, just before passing the baton to Tom, 89 00:04:46.450 --> 00:04:48.370 did you learn anything new this year? 90 00:04:50.010 --> 00:04:52.680 Jeremy Grant: I think the main thing I learned was just how 91 00:04:52.680 --> 00:04:56.490 little I knew about voice. And Dr. Kleinberger, she's done some 92 00:04:56.490 --> 00:04:58.920 TED talks online, I'll say, if you want to look her up. You know, she's not 93 00:04:58.920 --> 00:05:02.220 a security researcher. She's a voice researcher, and I think she 94 00:05:02.220 --> 00:05:06.690 looks at the topic from just a bunch of different dimensions 95 00:05:06.690 --> 00:05:08.640 that most of us have never thought about, including that, 96 00:05:08.670 --> 00:05:11.430 apparently, to be making any of the sounds I'm making right now, 97 00:05:11.430 --> 00:05:14.820 I'm coordinating, what did she say, more than 100 muscles in my 98 00:05:14.820 --> 00:05:20.100 body all at the same time. So, fun fact, you can break that out 99 00:05:20.100 --> 00:05:23.100 at a party sometime and see how close people can get to the 100 00:05:23.100 --> 00:05:23.550 number. 101 00:05:24.360 --> 00:05:26.880 Anna Delaney: I'll keep that one. Tom, over to you. 103 00:05:26.510 --> 00:05:53.000 Tom Field: You know, given what Jeremy has just told us, I've had a heck of a workout already today. I'm feeling good. Jeremy, I want to ask you a couple of questions that we've discussed in previous conversations. But given that you've just come back from the cybersecurity Mardi Gras, you might have some different perspectives. One is about passwordless. Given the conversations you've been able to have, the presentations you've seen, how would you now describe our current state of passwordless? 102 00:05:26.000 --> 00:06:02.891 Jeremy Grant: So I would say the other highlight, that I didn't get from RSA itself, was the big FIDO seminar on that Tuesday, which was four hours talking all about the new announcement around passkeys and multidevice credentials, and how you're basically seeing all the platforms, you know, really double down on expanding their commitment to the FIDO standards, to finally kill the password. So I think I used the phrase, talk to my hand. There's an old Parliament record, "Standing on the Verge of Getting It On."
And 119 00:06:02.962 --> 00:06:07.124 that's kind of how I would describe things right now. We've 120 00:06:07.194 --> 00:06:11.144 been talking about this for years, there's been a lot of 121 00:06:11.215 --> 00:06:15.165 hype, a lot of interesting individual solutions that are 122 00:06:15.236 --> 00:06:19.680 out there, but most of them are proprietary. Now you've got Apple, 123 00:06:19.750 --> 00:06:24.194 Google and Microsoft, you know, who basically collectively make 124 00:06:24.265 --> 00:06:28.144 the operating systems on everything we're using, and, you 125 00:06:28.215 --> 00:06:32.729 know, a lot of the devices as well, saying, "this is how we're going 126 00:06:32.800 --> 00:06:36.820 to do this going forward, building off of FIDO standards, 127 00:06:36.891 --> 00:06:40.841 and, you know, leveraging some new innovations around the 128 00:06:40.912 --> 00:06:45.356 passkey concept." I think, you know, this now means in the next 129 00:06:45.426 --> 00:06:49.870 18 to 24 months, you'll start to see consumer experiences where 130 00:06:49.941 --> 00:06:53.891 the default is, rather than create a password, you'll be 131 00:06:53.961 --> 00:06:58.194 asked to create a passkey that will instead be synced across 132 00:06:58.264 --> 00:07:02.638 your devices, and you'll never have to use the password again. 133 00:07:02.708 --> 00:07:04.190 It's pretty exciting. 134 00:07:04.000 --> 00:07:06.940 Tom Field: I'm really glad you didn't describe the state as the 135 00:07:06.940 --> 00:07:10.660 cousin of death. Thank you. Jeremy, a few weeks ago, we 136 00:07:10.660 --> 00:07:14.200 spoke immediately after Microsoft and Apple and Google 137 00:07:14.200 --> 00:07:17.650 came together in support of the FIDO protocol. So a few weeks 138 00:07:17.650 --> 00:07:20.950 later, can you describe the significance of it, then? What, 139 00:07:20.950 --> 00:07:26.050 if any, domino effect do you now see, now that we have this 140 00:07:26.170 --> 00:07:26.860 endorsement? 141 00:07:28.310 --> 00:07:30.530 Jeremy Grant: I think it's, you know, dominoes is probably a 142 00:07:30.530 --> 00:07:32.900 good analogy. I think what you're seeing is a lot of people 143 00:07:32.900 --> 00:07:35.540 are sort of looking at this and trying to, you know, parse the 144 00:07:35.540 --> 00:07:38.810 announcement to understand what's underneath it all. But 145 00:07:38.810 --> 00:07:41.150 you know, there's been really good buy-in, I would say, across 146 00:07:41.150 --> 00:07:43.700 the board. You know, I was excited when the announcement 147 00:07:43.700 --> 00:07:45.830 came out a few weeks ago that Jen Easterly, who's the head of 148 00:07:45.830 --> 00:07:48.740 CISA, contributed a quote to the press release. She's been on a 149 00:07:48.740 --> 00:07:55.550 real crusade based on ... Boston's "More Than a Feeling," adapted 150 00:07:55.550 --> 00:07:59.030 to, you know, promote more than a password. Bob Lord from 151 00:07:59.030 --> 00:08:02.750 CISA came and spoke at the FIDO seminar at RSA. I think, you 152 00:08:02.750 --> 00:08:06.050 know, we're getting inputs on the FIDO side from, 153 00:08:06.080 --> 00:08:08.060 you know, other governments as well, looking at this and kind 154 00:08:08.060 --> 00:08:10.910 of looking into this and saying, "this is actually a really good 155 00:08:10.910 --> 00:08:13.160 advancement forward."
And I think, for a lot of big players in 156 00:08:13.880 --> 00:08:17.060 the private sector as well, you know, I would say that the 157 00:08:17.060 --> 00:08:19.790 mood is one of excitement. Not that there aren't questions, I 158 00:08:19.790 --> 00:08:23.180 think, I mean, you know, underneath the core announcement 159 00:08:23.180 --> 00:08:25.160 that everybody's rallying together, there's also some 160 00:08:25.160 --> 00:08:28.910 change in terms of how cryptographic keys for login are 161 00:08:28.910 --> 00:08:32.600 going to be used. You know, namely, I think that the big 162 00:08:32.660 --> 00:08:36.020 trade-off on the FIDO announcement has always been 163 00:08:37.790 --> 00:08:40.790 this one: when you look at, you know, authentication based off 164 00:08:40.790 --> 00:08:43.550 cryptographic keys, which is what FIDO is, well, what happens 165 00:08:43.550 --> 00:08:46.280 if you have several devices? And what you're seeing now is the 166 00:08:46.280 --> 00:08:49.010 platforms are going to find ways to securely sync the 167 00:08:49.310 --> 00:08:52.790 credentials across the devices, which makes it a lot easier for 168 00:08:52.790 --> 00:08:55.820 your everyday user to actually go passwordless. And, you know, 169 00:08:55.820 --> 00:08:59.000 I think what I'm seeing is a lot of heads nodding, saying, "okay, 170 00:08:59.000 --> 00:09:01.520 there's some security versus usability trade-offs here, but 171 00:09:01.520 --> 00:09:05.870 the security benefits are far outweighing" ... I'm sorry, the 172 00:09:06.050 --> 00:09:09.230 security benefits, because it's so much more usable, far 173 00:09:09.230 --> 00:09:11.390 outweigh any security downsides from this. And so, a 174 00:09:11.390 --> 00:09:12.350 lot of good excitement. 175 00:09:12.780 --> 00:09:14.880 Tom Field: Very good! I'm going to stick with the music theme, 176 00:09:15.300 --> 00:09:19.200 pass this on to my colleague, Matthew, and I will invoke the 177 00:09:19.200 --> 00:09:21.300 Grateful Dead as we keep on "Truckin'." 178 00:09:21.350 --> 00:09:21.830 Jeremy Grant: Oh, my! 179 00:09:25.110 --> 00:09:27.810 Matthew Schwartz: Picking up the musical baton, in the immortal 180 00:09:27.810 --> 00:09:31.710 words of Richard Marx, "Right Here Waiting" is how we have 181 00:09:31.710 --> 00:09:35.610 been for such a long time when it comes to a federal data 182 00:09:35.670 --> 00:09:40.350 privacy law. Now, this week, we had Representative Cathy 183 00:09:40.350 --> 00:09:45.030 McMorris Rodgers, from Washington, saying she sees the 184 00:09:45.030 --> 00:09:48.120 best opportunity we've had to pass a federal data privacy law 185 00:09:48.150 --> 00:09:52.980 in decades happening now. We've got bipartisan legislation that 186 00:09:52.980 --> 00:09:55.710 would bolster consumers' privacy rights and is gaining 187 00:09:55.860 --> 00:09:59.820 momentum. So some of the tech industry, the tech giants, 188 00:10:00.030 --> 00:10:02.760 advertisers think the protections afforded to 189 00:10:02.760 --> 00:10:05.610 consumers in the current draft of the bill might be overly 190 00:10:05.610 --> 00:10:11.550 broad, a typical kind of tension there. But after decades of 191 00:10:11.640 --> 00:10:15.270 lawmakers at the federal level failing to pass such a privacy 192 00:10:15.270 --> 00:10:18.840 bill, do you think it could now be in reach? 193 00:10:19.920 --> 00:10:23.670 Jeremy Grant: So we've covered, let's see, Richard Marx, Nas, 194 00:10:23.910 --> 00:10:26.550 Parliament, and the Grateful Dead.
So I used to run a 195 00:10:26.550 --> 00:10:29.040 freeform radio station back in Ann Arbor, but I'm not sure I ever 196 00:10:29.040 --> 00:10:34.380 had a set that had all of that. So I will not say privacy is 197 00:10:34.380 --> 00:10:37.710 "Standing on the Verge of Getting It On," like Parliament, 198 00:10:37.740 --> 00:10:42.990 like passwordless. But there has been some real movement the last 199 00:10:42.990 --> 00:10:45.120 couple of weeks in Washington, and I think it's worth paying 200 00:10:45.120 --> 00:10:49.800 attention to. In that you now do have, and in fact, I've got a copy 201 00:10:49.800 --> 00:10:52.410 of the discussion draft right here, because I've been combing 202 00:10:52.410 --> 00:10:54.840 through it for, you know, identity and cybersecurity 203 00:10:55.140 --> 00:10:58.830 elements, a bipartisan privacy bill that, you know, has been 204 00:10:58.830 --> 00:11:03.060 introduced, or will be introduced, by what people are 205 00:11:03.060 --> 00:11:05.430 calling three of the four corners. So, you know, being based in 206 00:11:05.430 --> 00:11:09.600 Washington, D.C., to dive into, you know, Congressional lingo 207 00:11:09.600 --> 00:11:12.960 for a bit: You've got the chairman and ranking member of 208 00:11:12.960 --> 00:11:15.000 the Senate Commerce Committee and the chairman and ranking 209 00:11:15.000 --> 00:11:17.520 member of the House Energy and Commerce Committee, and those are 210 00:11:17.520 --> 00:11:19.860 the committees that really have jurisdiction to write a privacy 211 00:11:19.860 --> 00:11:23.790 bill. On the Senate side, the ranking member, the lead 212 00:11:23.790 --> 00:11:26.940 Republican Roger Wicker, along with, as you mentioned, Cathy 213 00:11:26.940 --> 00:11:29.190 McMorris Rodgers and Frank Pallone, the Republican and 214 00:11:29.190 --> 00:11:32.400 Democrat, have all come together around that draft I was 215 00:11:32.400 --> 00:11:35.370 holding up, which is called the American Data Privacy and 216 00:11:35.370 --> 00:11:39.810 Protection Act. So the fact that you have a bipartisan bill 217 00:11:39.810 --> 00:11:41.940 that's in the House and Senate, even if you don't have the 218 00:11:41.940 --> 00:11:45.810 support of the Senate Commerce chair, Maria Cantwell, it's 219 00:11:45.810 --> 00:11:48.210 pretty notable. And I think a lot of people are paying 220 00:11:48.210 --> 00:11:50.760 attention. There was a big House hearing yesterday where they 221 00:11:50.760 --> 00:11:55.230 dove into the bill and got a lot of feedback. It's in an 222 00:11:55.230 --> 00:11:59.130 interesting spot, in that, as you pointed out, a lot of folks 223 00:11:59.130 --> 00:12:02.460 in industry, in the advertising space, are saying it's terrible. 224 00:12:02.610 --> 00:12:05.790 The ACLU is also saying it's terrible. So you know, I started 225 00:12:05.790 --> 00:12:09.060 my career as a Senate staffer in the '90s. One of the rules of 226 00:12:09.060 --> 00:12:12.570 thumb was, if you have a bill and everybody hates it on all 227 00:12:12.570 --> 00:12:17.070 sides, pass the bill, because that means that you've actually 228 00:12:17.070 --> 00:12:20.190 done something right. If nobody's really happy with it, 229 00:12:20.190 --> 00:12:24.210 that might be the right policy. I'll say, going through the bill, 230 00:12:24.330 --> 00:12:27.600 it's not ready to pass. It needs a lot of work. There's a lot of 231 00:12:27.600 --> 00:12:33.450 problems with it right now.
But at a high level, the concepts 232 00:12:33.450 --> 00:12:35.670 they're laying out, if, you know, you're actually drawing this 233 00:12:35.670 --> 00:12:37.830 fire from, you know, different parties, maybe you're onto 234 00:12:37.830 --> 00:12:41.850 something. And so, I do think that there's just not enough 235 00:12:41.850 --> 00:12:44.580 time between now and the elections. I mean, Congress, for 236 00:12:44.580 --> 00:12:46.980 all purposes, will shut down in mid-September, getting ready 237 00:12:46.980 --> 00:12:50.490 so people can campaign for the midterm elections. But what 238 00:12:50.490 --> 00:12:54.180 you're seeing here, even if the House and Senate flip, and you 239 00:12:54.180 --> 00:12:58.560 have Republicans in charge next year, is an outline of something 240 00:12:58.560 --> 00:13:01.320 that could be a bill that could advance. And so I think, you 241 00:13:01.320 --> 00:13:05.400 know, people are certainly more excited in DC, everybody's 242 00:13:05.400 --> 00:13:07.890 paying attention to this draft, taking it much more seriously 243 00:13:07.890 --> 00:13:10.740 than anything I've seen in the last few years, because they 244 00:13:10.740 --> 00:13:13.680 realize that this could be the starting point for a 245 00:13:13.680 --> 00:13:14.910 comprehensive privacy bill. 246 00:13:15.890 --> 00:13:18.080 Matthew Schwartz: I was struck by the last Congressional 247 00:13:18.080 --> 00:13:22.730 session. I think we had at least a few dozen bills that touched 248 00:13:22.730 --> 00:13:26.120 in some way on privacy. And there's been a real shift, even 249 00:13:26.120 --> 00:13:29.000 if some of them, or a lot of them, most of them, aren't 250 00:13:29.000 --> 00:13:32.600 passing. It seems like there's a groundswell toward data privacy 251 00:13:32.600 --> 00:13:34.220 rights, hopefully, coming in at some point. 252 00:13:34.660 --> 00:13:36.850 Jeremy Grant: Yeah, I mean, I'll say a big challenge on this has 253 00:13:36.850 --> 00:13:39.940 always been what to do with California, particularly when, 254 00:13:40.750 --> 00:13:43.240 you know, you're trying to get a bill passed, you know, by the 255 00:13:43.240 --> 00:13:46.840 House, you need Democrats to do it. I think one in four members 256 00:13:46.840 --> 00:13:49.180 of the Democratic caucus is from California, and you want them to 257 00:13:49.180 --> 00:13:53.410 vote to preempt the laws already in their state. So what this new 258 00:13:53.410 --> 00:13:55.930 bill would do is actually carve out California, but also carve 259 00:13:55.930 --> 00:13:59.050 out the Illinois privacy law that has gotten a lot of 260 00:13:59.050 --> 00:14:02.650 attention. And so there'd be a couple of carve-outs, but not a 261 00:14:02.650 --> 00:14:05.650 lot. It's, you know, again, from a policy perspective, a little 262 00:14:05.650 --> 00:14:08.050 hokey, but when you start to look at the political compromise 263 00:14:08.050 --> 00:14:11.560 it might take to pass a federal law, you know, that's an 264 00:14:11.560 --> 00:14:12.610 interesting approach. 265 00:14:13.800 --> 00:14:15.630 Matthew Schwartz: Wonderful. Well, one more question for you. 266 00:14:15.630 --> 00:14:18.180 And I can't think of a song transition, although maybe it'll 267 00:14:18.780 --> 00:14:22.980 come to me. But you were talking about voice, back to the future, 268 00:14:22.980 --> 00:14:27.210 right? What other new identity technologies are you tracking?
269 00:14:27.240 --> 00:14:31.050 New is old, old is new, whatever, are you tracking for 270 00:14:31.050 --> 00:14:33.270 this year? Next year? Where do you think we'll see some more 271 00:14:33.270 --> 00:14:33.930 innovation? 272 00:14:34.310 --> 00:14:35.840 Tom Field: Marvin Gaye; "What's Going On?" 273 00:14:35.000 --> 00:14:37.673 Jeremy Grant: There we go. I was going to go with Men at Work; 275 00:14:37.728 --> 00:14:41.181 "Who Can It Be Now?" Excellent identity song if there ever was 276 00:14:41.237 --> 00:14:44.801 one. So in terms of new stuff, I mean, I'll say, you know, there 277 00:14:44.856 --> 00:14:48.253 are some companies that are, you know, looking at some things 278 00:14:48.309 --> 00:14:51.094 around what I call next-generation authentication, 279 00:14:51.149 --> 00:14:54.324 you know, where do we go beyond FIDO, that are pretty 280 00:14:54.379 --> 00:14:57.721 interesting, but very early stage. But I actually think 2022 281 00:14:57.776 --> 00:15:01.006 and 2023, you know, are going to be focused on, again, the 282 00:15:01.062 --> 00:15:04.515 deployment of FIDO and passkeys and the further refinements of 283 00:15:04.570 --> 00:15:07.522 that concept. I really think passwordless is going to 284 00:15:07.578 --> 00:15:10.696 transform a lot of things. And then the second is on the 285 00:15:10.752 --> 00:15:14.037 identity verification side, a lot of attention. In fact, to 286 00:15:14.093 --> 00:15:17.602 give a preview, I'll be at the Identiverse conference in Denver 287 00:15:17.657 --> 00:15:20.999 next week, leading a panel looking at bias issues in face 288 00:15:21.054 --> 00:15:23.672 recognition and how it's impacting the identity 289 00:15:23.727 --> 00:15:27.124 verification market. You know, companies that can demonstrate 290 00:15:27.180 --> 00:15:30.410 that they actually work well across every age group, every 291 00:15:30.466 --> 00:15:33.918 ethnicity, every sex ... you know, with all of the concerns we've 292 00:15:33.974 --> 00:15:37.427 seen from Congress and in the press and advocates and just the 293 00:15:37.482 --> 00:15:40.490 general public this year around that, I think identity 294 00:15:40.545 --> 00:15:43.942 verification technologies that can demonstrate that they work 295 00:15:43.998 --> 00:15:46.894 equitably across, you know, every different type of 296 00:15:46.950 --> 00:15:50.180 population are going to be a really big deal in this space. 274 00:14:35.000 --> 00:15:53.210 Matthew Schwartz: In the immortal words of The Who, "Who are you?" 298 00:15:55.070 --> 00:15:57.920 Jeremy Grant: Yeah, not a great identity song. One of the best 299 00:15:57.980 --> 00:16:00.590 identity events of my life was when I was at an identity 300 00:16:00.590 --> 00:16:03.410 standards meeting in Phoenix, and we realized The Who was in 301 00:16:03.410 --> 00:16:05.360 town, and so of course, we all had to go see them. 302 00:16:07.310 --> 00:16:08.000 Matthew Schwartz: Happy ending. 303 00:16:08.660 --> 00:16:11.690 Anna Delaney: Well, I'm going to ruin this medley, I think, with 304 00:16:11.690 --> 00:16:15.350 no song, but maybe you can help me out. So Jeremy, you told me 305 00:16:15.350 --> 00:16:18.710 earlier today that you've done some analysis of the new privacy 306 00:16:18.710 --> 00:16:22.190 law. Any identity issues that you want to highlight? 307 00:16:23.170 --> 00:16:28.780 Jeremy Grant: Yeah. Sorry, the COVID's acting up a little.
Got 308 00:16:28.780 --> 00:16:31.300 to the mute button for the second cough. So going through 309 00:16:31.300 --> 00:16:34.630 the bill, I think there's a few things that stand out. And you 310 00:16:34.630 --> 00:16:36.790 know, we're actually, within the Better Identity Coalition, 311 00:16:36.790 --> 00:16:39.100 looking through it and looking to reach out to the sponsors to 312 00:16:39.100 --> 00:16:42.010 talk about what I would call some perfecting amendments. 313 00:16:42.370 --> 00:16:44.020 Because I think there's some things that they're looking to 314 00:16:44.020 --> 00:16:46.990 do in there that maybe weren't drafted as carefully as they 315 00:16:46.990 --> 00:16:52.210 could be. You know, one that stood out ... it essentially 316 00:16:52.210 --> 00:16:54.160 says, you know, there's a bunch of things that are just flat-out 317 00:16:54.160 --> 00:16:56.950 restricted, and it puts limits on the collection, processing or 318 00:16:56.950 --> 00:17:00.610 transferring of Social Security numbers, except when necessary 319 00:17:00.610 --> 00:17:04.060 to do a few things, including authentication. And I read that 320 00:17:04.060 --> 00:17:06.190 and was a little horrified, because you never want to use an 321 00:17:06.190 --> 00:17:09.580 SSN for authentication. Authentication requires a secret, 322 00:17:09.580 --> 00:17:11.860 and the Social Security number stopped being a secret a long 323 00:17:11.860 --> 00:17:15.370 time ago. You do want to use it for identity verification, or if 324 00:17:15.370 --> 00:17:18.310 you need to have a use case where you need to essentially 325 00:17:18.310 --> 00:17:22.120 resolve an individual to a unique identity. So there's, you know, 326 00:17:22.120 --> 00:17:26.020 what, probably 35,000 Tom Fields; only one, hopefully, has 327 00:17:26.020 --> 00:17:29.380 his SSN, and I need to use it for that. But I never want to 328 00:17:29.380 --> 00:17:31.960 ask Tom for the last four, because the Russians have that, 329 00:17:31.960 --> 00:17:34.690 and the Chinese have that, and a lot of other people can get it 330 00:17:34.690 --> 00:17:38.260 on the dark web for about 87 cents. So I, you know, think 331 00:17:38.590 --> 00:17:40.990 getting the right terminology in there, in terms of where you 332 00:17:40.990 --> 00:17:44.080 do and don't want to use the SSN, matters. You know, our point, you know, 333 00:17:44.080 --> 00:17:47.230 back to the sponsors was, do you actually want to codify the use 334 00:17:47.230 --> 00:17:49.900 of the SSN for authentication? That seems like it's a terrible 335 00:17:49.900 --> 00:17:56.020 idea. Beyond that, they've got some restrictions, just in terms 336 00:17:56.020 --> 00:17:59.170 of biometric information, where again, they call out a bunch of 337 00:17:59.170 --> 00:18:01.900 exceptions where you can use it, but identity verification is not 338 00:18:01.900 --> 00:18:05.440 one of them. You know, they say you can't be transferring 339 00:18:05.440 --> 00:18:08.710 passwords, that that's more sensitive information. But you know, we 340 00:18:08.710 --> 00:18:10.660 want to point out to them, it's not just passwords anymore; 341 00:18:10.660 --> 00:18:13.420 there's other authenticators as well. And so you really want to, 342 00:18:13.450 --> 00:18:17.380 you know, keep that information also protected.
And then I think 343 00:18:17.380 --> 00:18:20.980 there's a bigger set of issues, which is, you know, this bill, 344 00:18:20.980 --> 00:18:24.670 like GDPR, like the California laws, would give individuals 345 00:18:24.910 --> 00:18:27.760 data ownership and control, essentially, the ability to, you 346 00:18:27.760 --> 00:18:31.570 know, go to a company. I could go to ISMG and say, "I want to 347 00:18:31.570 --> 00:18:33.940 see what information you have on me," or "I don't like that, I 348 00:18:33.940 --> 00:18:36.160 want to correct it, I want you to delete it, I want you to move 349 00:18:36.160 --> 00:18:40.510 it somewhere else." Well, if ISMG doesn't really know that 350 00:18:40.510 --> 00:18:45.190 it's me making that request, that's a great attack vector for 351 00:18:45.220 --> 00:18:47.740 an adversary to actually come in and, you know, steal that 352 00:18:47.740 --> 00:18:49.840 information. In fact, there was a great paper at Black Hat a 353 00:18:49.840 --> 00:18:52.330 couple of years ago, where a couple of researchers showed how they 354 00:18:52.330 --> 00:18:55.240 were able to steal information by making GDPR requests to 355 00:18:55.240 --> 00:18:58.240 companies. So you know, just to show that this isn't just 356 00:18:58.570 --> 00:19:04.030 theoretical. And, you know, so the sponsors of the bill did a 357 00:19:04.030 --> 00:19:07.780 good job, I think, in pointing out, well, if your company can't 358 00:19:07.780 --> 00:19:11.140 actually verify the identity, you don't have to submit that 359 00:19:11.140 --> 00:19:17.080 data. And you know, we want to point out to them, "look, that's 360 00:19:17.080 --> 00:19:19.870 good, because identity verification is hard." But if 361 00:19:19.870 --> 00:19:22.990 you really want to recognize this ideal that, you know, we 362 00:19:22.990 --> 00:19:25.840 actually can give people true control over their data and 363 00:19:25.840 --> 00:19:28.630 access to it, well, what you really need is some identity 364 00:19:28.630 --> 00:19:31.300 infrastructure to enable that. And so, you know, this is where 365 00:19:31.300 --> 00:19:33.340 it gets back to some of the core work of the Better Identity 366 00:19:33.340 --> 00:19:37.300 Coalition. If we had these investments on the government 367 00:19:37.300 --> 00:19:40.930 side that can help to close the gap between the nationally 368 00:19:40.930 --> 00:19:43.060 recognized, authoritative credentials that government 369 00:19:43.060 --> 00:19:45.580 issues today, which work in the physical world, and this gap 370 00:19:45.580 --> 00:19:48.400 we have in the digital world, it would be easier than ever to actually 371 00:19:48.400 --> 00:19:51.250 give people control over their data. And so, you know, I think 372 00:19:51.250 --> 00:19:55.030 we want to weigh in to suggest that, as part of passing the 373 00:19:55.030 --> 00:19:58.000 privacy bill, they don't ignore the identity layer of this, 374 00:19:58.000 --> 00:20:00.070 because otherwise you don't really get to deliver that 375 00:20:00.070 --> 00:20:00.940 benefit to people. 376 00:20:02.220 --> 00:20:05.640 Anna Delaney: Very comprehensive answer, Jeremy. Thank you. So, 377 00:20:05.850 --> 00:20:10.980 lastly, finally, a quick-fire question. I'm going to give you 378 00:20:10.980 --> 00:20:13.590 a break, Jeremy, for a moment, because you've been talking 379 00:20:13.620 --> 00:20:18.540 quite a bit. What's the biggest lie sold or told in 380 00:20:18.540 --> 00:20:20.550 cybersecurity? Tom, go for it.
381 00:20:21.180 --> 00:20:24.180 Tom Field: "We genuinely care for our customers' security and 382 00:20:24.180 --> 00:20:28.590 privacy." If you did, we wouldn't be having this conversation. 383 00:20:28.000 --> 00:20:33.970 Matthew Schwartz: And I always love it when a company comes out 384 00:20:33.970 --> 00:20:36.670 and says, "we have bulletproof security that is impossible 385 00:20:36.670 --> 00:20:40.510 to hack." If you ever want an invitation for hackers to come 386 00:20:40.510 --> 00:20:43.450 and prove you wrong, them's fightin' words. 387 00:20:45.130 --> 00:20:47.470 Jeremy Grant: I'd say it's "if you buy our product" (and there's a 388 00:20:47.470 --> 00:20:50.380 lot of these scenarios on the show floor) "you're going to stop the 389 00:20:50.380 --> 00:20:53.380 attackers. But just our product, none of the other products." 390 00:20:55.840 --> 00:20:57.640 Anna Delaney: You've been in this industry a while, haven't you? 391 00:20:57.790 --> 00:21:01.660 Well, I also think "humans are the weakest link" is another one, 392 00:21:01.870 --> 00:21:05.530 which is great because, well, this wasn't designed for humans. 393 00:21:05.530 --> 00:21:10.930 So there! But this has been very enjoyable. Jeremy, thank you so 394 00:21:10.930 --> 00:21:13.360 much for your insight. We always feel better informed and 395 00:21:13.360 --> 00:21:15.430 educated. So we appreciate it. 396 00:21:16.660 --> 00:21:18.130 Jeremy Grant: Great talking again, virtually, if not in 397 00:21:18.130 --> 00:21:19.030 person. Thank you. 398 00:21:19.510 --> 00:21:20.170 Matthew Schwartz: Thanks, Jeremy. 399 00:21:21.010 --> 00:21:23.200 Anna Delaney: Thank you so much for watching. Until next time!