Anna Delaney: Hello, I'm Anna Delaney with ISMG. I am with my colleagues, Mathew Schwartz and Tony Morbin, at the ISMG studios after a very successful couple of days live at Black Hat Europe 2023. It's been great, hasn't it?

Mathew Schwartz: It's wonderful to be back in London, yes.

Anna Delaney: So, you've attended some of the sessions, Mathew. What were the highlights for you?

Mathew Schwartz: I'll emphasize the keynotes, and maybe I'll preface that by saying I was here last year, when the event was still recovering from the pandemic, and it felt a little tentative to me. This year, there's been a palpable buzz in the air. I think there were a lot more people than last year, and there was a lot of energy. We had great keynotes last year, and they had great keynotes this year. It started right off the blocks with Ollie Whitehouse of the National Cyber Security Centre - a wonderful talk about current challenges, what needs to be done and what his hopes are, charging the audience with a bit of a mission about things they could do, including, maybe not a surprise, engaging with the government, finding out where there are problems, highlighting those problems and trying to build consensus solutions for them. This morning, we had a wonderful keynote from Joe Sullivan, the former CSO of Facebook, Uber and Cloudflare. His Uber tenure is what got him convicted of a data breach cover-up, and he took the stage to share some lessons learned from the case against him - again, with a bit of a call to action to the cybersecurity audience, saying: choose your own destiny, choose your own future here. You can't just be technologists - you've got security in the title, you've got chief security officer, when you're a CISO or CSO, in the title. We need to transform that from someone who's seen as shepherding technology into someone who is truly driving the board - not just doing presentations in front of the board, but arguing with the board when necessary and helping drive the strategic direction of a company. So, two excellent keynotes that really engaged with the audience and said: here's where we are, here's where I think we need to be tomorrow - are you with me?

Anna Delaney: Fantastic, great overview.
And, Tony, you had a few interviews here in the studios - any that stuck out for you?

Tony Morbin: I would say Joe Sullivan, again - I thought he was excellent. We were really discussing the whole area of responsibility for the CISO. He was saying he's had a couple of people asking him, "Do I really want to be a CISO?" - both people who are looking at becoming a CISO and actual CISOs. And that's on the back of not only his own experience, but also the recent case against the SolarWinds CISO and the new NIS2 directive, which is bringing in personal responsibility and liability. I think the issue that he had, which reflects his experience at Uber, was: you're being given this responsibility, but are you being given the authority to actually deal with it? It's all very well now being at the top table, and we do want management to be held responsible for these things. But does that mean just giving it to the CISO? Or should the CEO, or maybe some other members of the board, be there alongside them? I think that really was the issue. It's all well and good having responsibility, but you've got to have the authority to act on it - and you shouldn't be accused of being negligent if you asked for controls and were denied them.

Mathew Schwartz: Yeah, Jeff Moss, the founder and creator of the Black Hat conference, who serves as the emcee, introduces the keynotes and does a closing Locknote each day, had some very interesting things to say about how he sees the appetite changing. For example, in the United States, people are sick of data breaches, and governments are sick of data breaches. He thinks very shortly we're going to see a regulatory focus, finally, on this. Congressional staffers and lawmakers are increasingly digital natives now - well, most of them. Then we've also got the cryptocurrency discussions that have been happening, and he sees much more of an appetite there, based on the attempts to regulate cryptocurrency, and perhaps the discussions already happening around AI - not necessarily in an advanced state, but it's happening.
And so he foresees greater regulation looking at outcomes - for example, finally stopping data breaches - perhaps with some tough measures to make that happen. And when that happens, CISOs are going to have to have more power, or companies are going to be in big trouble.

Tony Morbin: Absolutely. Long term, I think the regulation is going to be there, and we are going to get government and the private sector reaching the level they should be at. But I think it could be very rough along the way, and there may be a few, you know, sacrificial lambs.

Anna Delaney: Were there any surprises that came up for you?

Mathew Schwartz: I would say, from a research standpoint and going to some of the briefings at the conference, there were no sessions outright about ransomware, which has been a huge topic in recent years. And there was only one session on artificial intelligence, and that was AI being used to train side-channel attacks - tactics that we've seen before, but the AI was doing it in a very useful, if you're an attacker, kind of way. So I think it was interesting that we didn't see more of that; I think we will. I don't always know how the research comes in - there are fads, there are trends. I talked to some of the members of the review board, and they told me they're not seeing much on the application security front, which is a very big surprise to them. So it's hard to know what to reverse engineer from all this, or how to deduce what is and isn't happening. But the lack of AI was a surprise for me. How about you, Tony - did generative AI come up a little bit?

Tony Morbin: Well, it wasn't so much the lack of AI - AI did come up in every discussion, but it came up as just a fact. Whereas at the last ISMG event that we were at in London, there was a real buzz of excitement about how AI had, sort of, transformed the industry. It's very quickly become business as usual: AI, oh, it's just automation a little further down the road from where we were with machine learning. Plus, we already had AI anyway, and now generative AI and large language models are available. So it was interesting that it has just become the norm, a part of automation, and the hype has lessened. But it's still there.
Mathew Schwartz: Yeah, again, Jeff Moss had a great point: "What do we mean when we say AI?" Certainly, I had a conversation with a researcher who was talking about virtual kidnapping, where maybe you SIM swap a target - a child - knock them offline, phone the parent and say: I've got your kid; you're going to transfer me some horrible sum of money, or bad things are going to happen as soon as I hang up this call. And it's all virtual - thankfully, it's not real - but it's horrible! He was saying that with AI - by which he meant the ability to spoof voice, and perhaps video - you have these horrible techniques that can easily be brought to bear to create more industrialized approaches to fraud and whatnot. So that was the voice and video aspect of it. Moss was talking about how you also have AI as a probability engine - for coding, for example, or all sorts of applications. That is also AI. And his big takeaway was: we need to mean what we say when we talk about AI, because if we're trying to regulate it, these are very different kinds of areas, and we don't want to confuse them or unnecessarily water down the discussion. So we need a lot more nuance for what's being lumped into AI now.

Tony Morbin: I'll just say one more thing on AI. It was quite interesting - Bugcrowd, you know, the crowdsourced ethical hackers, were saying that 94% of the attackers they see are actually using AI now. So it's just become part of the repertoire.

Anna Delaney: And I was talking to the CISO of Zscaler, who was saying that, actually, most organizations are embracing AI - they're happy to do that. But they also - nearly 100% of them - know the risks, and they are sort of fearful of this technology. So you get that dichotomy there.

Mathew Schwartz: Well, we saw that at the summit we held in London as well. We had CISOs grappling with: What does this mean? Where do I need to put guardrails inside my organization? For example, if it's for insurance, we hope that we're training our system with ethical data, but we need to look at all these things before we perhaps do something that might get us in trouble as well.
Anna Delaney: I also spoke with researchers here in the studios about IoT devices and OT devices that have vulnerabilities - the importance, yes, again, of security by design, testing - continuous testing - and monitoring these products. So, good reminders: do the basics!

Mathew Schwartz: Do the basics! Yes, one of the discussions I had was on quantum computing, and there's concern about what we're going to need to do. Research was presented here to help organizations figure out where they have cryptography now, so that if they do need to move to quantum-safe algorithms, they have a better idea of what to swap out. But it identified a more pressing concern: if something happens to your current cryptography library, even before we get to quantum, you want the ability to switch that out, because we've seen unexpected events occur previously in the cybersecurity realm, and having an increased amount of agility there would be super helpful.

Anna Delaney: Brilliant insights! Thank you so much, gentlemen. You've already got a couple of articles online, and we've got some videos going to be published soon. A great couple of days' work. Excellent.

Mathew Schwartz: Nonstop! It's been good fun. Thank you.

Tony Morbin: And what was your biggest surprise?

Anna Delaney: Some of the biggest surprises I discovered here came from research by Omdia about decision-makers. Apparently, Europe is faring better when it comes to its overall security posture compared to other regions of the world, which I thought was very interesting. But - surprise, surprise - staffing shortages and skills challenges are still up there, and ransomware remains a big concern for European companies.

Mathew Schwartz: Thank you for that ransomware quote! I felt, after all of the focus on crypto-locking malware in the last few years, they could've used just a little bit - a little sprinkle - here at the conference. But sadly, that didn't come to pass.

Anna Delaney: Well, it'll weave its way back in, somehow.

Mathew Schwartz: It'll weave its way back in, as it always seems to! Ransomware, insidious and all.

Anna Delaney: Well, brilliant work, gentlemen. Fascinating times, and stay tuned for more interviews from ISMG.
Mathew Schwartz: Definitely. Yeah, thank you - it's been a pleasure. So, thanks.

Anna Delaney: And thank you so much for being with us. For ISMG, I'm Anna Delaney.