Anna Delaney: Hello, welcome to the ISMG Editors' Panel, live at RSA Conference 2023. This is day three, and I'm joined by my colleagues, Mathew Schwartz and Michael Novinson. How's it going so far, gentlemen?

Mathew Schwartz: It's excellent. Great to be here.

Michael Novinson: Enjoying the show - lots of great conversations.

Anna Delaney: So let's recap day two, first of all. What were the highlights for you, Michael?

Michael Novinson: One of the highlights for me was interacting with Jay Chaudhry, the CEO of Zscaler. We talked a lot about zero trust - not just theoretically, but how to put it into action: what CIOs are doing, what board members are doing, what the federal government is doing, and how organizations can get started on that zero trust journey. But really, the topic du jour of day two, as well as of the conference as a whole, is generative AI. From the standpoint of venture capitalists, how can they make money off of it? What does it look like to secure AI data, AI algorithms and AI models, and where can they get started with that? And what are some of the cyber risks and cyber opportunities around generative AI? So for me, those are the two big themes from day two.

Anna Delaney: Do you think there have been some concrete conversations, from what you've heard? How much of it is buzzy buzzy?

Michael Novinson: Definitely some buzzy buzziness, certainly. And this actually was a day one conversation with Nikesh Arora, the CEO of Palo Alto Networks: there is this pressure to be first, and everybody wants to say, "Oh, we've embedded ChatGPT into our product." Really, what does that mean if you've done it in four weeks? Have you really just put an interface on the front of your product where you can ask questions in natural language? Sure, that's nice and requires less coding, but is that really adding value? Is that really taking that quantum leap forward? And I think some of the folks - certainly Palo Alto Networks and Zscaler - are biding their time.
They're really trying to figure out: how can we use generative AI to really increase the efficacy of detection - not just to interact with a bot, but to make our technology better at doing what it does? And what are some of the new things we have to think about protecting as a result of what adversaries are able to do with generative AI?

Mathew Schwartz: Yeah, that was one of the big topics with Cisco when I spoke to them today - or yesterday, I should say. They were looking at how you can use AI, how attackers can use AI, and also how you need to use AI to secure AI, or some flavor thereof, in the evolving AI sphere, I guess.

Anna Delaney: What were the highlights for you, Mathew?

Mathew Schwartz: So, day two - fascinating stuff. I got to speak with Hugh Thompson. A lot of people know him because he shows up on stage every year; he has for a long time - for 15 years he has been the programming director for RSA, helping lead the charge when it comes to sifting through all of the speaking applications they get. And he highlighted three things: generative AI - thank you, Michael; software composition, so getting into things like the software bill of materials - what does that even mean, and then when you have one, how do you begin to try to apply it? Huge issues there. And then also just all of the dynamic changes, as he said, in cyber. Everything is so dynamic. And that was a big theme I heard around AI - around how it is being used and how it could be used. I mentioned the Cisco discussion I had. That was excellent. That was with Jeetu Patel and Tom Gillis. They had done an excellent keynote where they were talking about trying to find new ways of thinking, identifying the flaws with the old and seeing where you go forward. There's a great line from Tom in his keynote: he's talking about trying to find a synchronized symphony of security defenses - i.e., trying to get everything talking to each other in a good way. So Cisco obviously has some thoughts about how they think you should do that, as do some other people, but wonderful, fascinating discussions. And then I'm just going to pivot into the Cryptographers' Panel.
Because ChatGPT was a big theme there as well, as you would expect it to be, because they rigorously chase down whatever the hype is, whatever the buzzwords are - I think with an eye toward: can we puncture this thin concept, if it is a thin concept? So I always really appreciate the deep thoughts that they bring to some of the big issues of the day. What are those big issues of the day, Matt? Well, I'm glad you asked. Chatbots were a big one, as we've been discussing. I love Adi Shamir. He is the "S" in the RSA cryptosystem - a small claim to fame there, especially with a cryptography audience. But he said that a year ago, when it came to AI, he thought there was going to be lots of potential good application on the defensive side and minimal offensive application. And I don't think I've ever heard him say this before: he said, "I've completely changed my thinking over the past year," and he is extremely concerned about a tool that can sound human and what you can do with that at scale, given the gullibility of humans when it comes to things like, I don't know, our elections, or a Nigerian prince telling me that my million dollars has finally arrived. So that was one of the big fascinating things. And the other one was, I'd say, the decline and fall of blockchain. It wasn't declared dead, and certainly blockchain and cryptocurrencies are two different things. But where people have been bullish on blockchain in the past, they are a lot less bullish now. Some of the people on the Cryptographers' Panel have never been bullish on it, just to be clear. They've said: look at all the options, and if blockchain is the one - and it won't be - then do blockchain. So there's a bit of nuance there. Maybe not when it comes to blockchain, but just great discussion.

Anna Delaney: Fantastic. So Michael, were there any surprises?

Michael Novinson: I think so, in terms of some of the conversations I had around critical infrastructure - really the evolution in that space, which I think historically has been a larger-organization focus, and now trying to figure out how to bring it down to the municipal level - municipal water, municipal electric - as well as the simultaneous challenge.
If you're a critical infrastructure organization, you still have to balance that with securing your classic IT systems. And how can you do that in concert, in collaboration with one another? So certainly some dialogue there. And I think as well, there were some conversations about - really just a sense that we don't really know what we're doing with ChatGPT yet. It's fun to play with; it can sing us songs and write us poems, and that's fine, and that's consumer. But what does it actually mean in the context of cyber? We've had the privilege of engaging with some very smart people, and they really don't know yet. So I think there's a lot that remains to be learned.

Mathew Schwartz: Anna, how about you? Highlights?

Anna Delaney: Talking with you. Well, it was great to speak with Joe Carson, of course, about gamification. It's interesting with security: how do you get through to people and make it fun? And he gave some really cool thoughts about how to apply gamification in organizations. So more on that later. So were there any surprises for you?

Mathew Schwartz: Surprises for me - I think the decline and fall of blockchain maybe was one of them. But the nuance around quantum computers was a surprise, just in terms of: do people need to be worried? One of the takeaways from the Cryptographers' Panel was, if you've got secrets that need to be secret still in 30 years, in 50 years, then you should be looking really closely into what you do with that. And maybe encrypting it is not the answer - you know, I don't know about locking it up in a safe or something. But again, Adi Shamir was saying that 99.99% of things - and he said, probably add some more nines on there - that we have stored in data format are things like "do you want to have lunch tomorrow, and where?" or "we have this secret new product we're developing, but it's going to be out in a year," and all of a sudden, that's not secret anymore. So he was bringing a lot of nuance to the concerns that people rightly have about whether we will have cryptosystems that are quantum resistant. That is a concern. The NSA is saying it is. NIST is saying that is a concern. So we need to listen to those voices. But he said, is it a concern for you?
And what do you do about it? And I think that, for the moment, most people are not going to need to do something about it. So it's fascinating and fun to follow, but the sky is not falling.

Anna Delaney: Well, let's look at today. What are the highlights for you? What are you looking forward to? Who are you speaking with? Are there any sessions you're attending? So, Michael, take it away.

Michael Novinson: Absolutely. Well, I'm really excited: I'm going to be speaking with Michael Sentonas, second in command at CrowdStrike, about some of the plays they're making with Google, and some of the plays they are making to broaden that XDR, identity and cloud ecosystem. As well as getting to speak with Rob Lee at Dragos about what they're doing around critical infrastructure - really looking at some of the risks that organizations have, and then how companies can get started on their critical infrastructure journey. Two conversations I'm very excited for today.

Mathew Schwartz: So, if I have to limit myself to two conversations, I'm going to pick Brian Honan. He's a great resource for us here at ISMG. I'm speaking with him, and we're going to be, I think, rounding up the last five years of GDPR.

Michael Novinson: Two-three minutes.

Mathew Schwartz: Just a couple of minutes. And then also, I'm looking forward to speaking with Jon DiMaggio. He's a ransomware researcher. He's promised me some insights into, I think, some identities and some groups perhaps that haven't come to light before - or, we'll see. So check back with me in 24 hours' time and hopefully we will know a little bit more about ransomware than we did before.

Michael Novinson: Yeah. Great. What are you most excited for today?

Anna Delaney: Speaking again with you tomorrow - and today. No, there's plenty going on day by day here. So that's a wrap for us. Thank you so much for watching. Stay tuned for our updates tomorrow. I'm Anna Delaney for ISMG.