Anna Delaney: Hello, thanks for joining us for the ISMG Editors' Panel. I'm Anna Delaney, and this is a weekly conversation between ISMG journalists on the cybersecurity topics that matter the most right now. Our star players this week include Tom Field, senior vice president of editorial; Suparna Goswami, associate editor at ISMG Asia; and Rashmi Ramesh, senior subeditor for ISMG's global news desk. Great to see you all. Lots of color this week. Rashmi, tell us more.
Rashmi Ramesh: I'm at an open-air concert, where the headliner was an Indian music artist. So, I went to a concert after two years - three years, actually. This is confetti raining down on us during the concert, which we cleaned up afterward - very responsible. But it was just a great night.
Anna Delaney: Yeah, I bet it felt excellent to be at that. A concert in person is a great feeling, isn't it?
Rashmi Ramesh: Absolutely.
Anna Delaney: Suparna, bringing back memories of the World Cup.
Suparna Goswami: Oh, yes. You can make out it's the FIFA World Cup. I'm also wearing blue, similar to what Argentina wears. It was an unbelievable final, with Argentina finally lifting the trophy after 36 years. And, of course, Messi playing so well is just the icing on the cake. The World Cup trophy was the one thing missing from his record books, and he got it - not that not winning it would have made him any less of a player. But still, it feels special just to see him with that World Cup trophy in hand.
Tom Field: I had no idea you were such a football fan.
Suparna Goswami: Oh, I'm generally a sports enthusiast.
Anna Delaney: I knew the cricket; I didn't know the football. It was a wonderful match to watch, actually. Both teams played very, very well. Tom, more snow?
Tom Field: Similar to Rashmi's confetti, this came pouring down from the sky last weekend, and about a foot of it fell. Heavy snow, so it took down trees, which took down power lines, and I spent much of last weekend without electricity and heat, putting a lot of wood into the woodstoves to keep spirits bright. This is a view out one of the windows, by the way. You can see how heavy the snow is on the branches.
Anna Delaney: Yeah, beautiful to look at. I am at Hampton Court Palace, where King Henry VIII used to live. And this week they put on a light show for Christmas, so there were various light installations scattered throughout the grounds of the palace. It was quite impressive to take in with a cup of mulled wine as well, of course.
Tom Field: Did you go there?
Anna Delaney: Yeah, yeah, it's local to where my family lives. Speaking of lights, Tom, I believe pharmaceutical company Johnson & Johnson is losing one of its leading lights.
Tom Field: They are losing a bright light. Last time that you and I spoke, we talked about some of the interviews that we really enjoyed over the past year, and my choice was Marene Allison. She's the CISO of Johnson & Johnson - just an illustrious career. She was in the first female class at West Point Military Academy. She's had some high-profile CISO jobs in her career. She even spent time doing undercover drug busts with her husband in New Jersey for the FBI, which is an entirely different story I hope to get someday. But she is retiring at the end of this year from her role at Johnson & Johnson. She has been a CISO for 30 years - not just at Johnson & Johnson, but a CISO for 30 years - and she's moving on to do other things. And so, we did have a chance to catch up this week. Interestingly, of all the things that we talked about, the point that resonated most with me was when I asked her what it was in her career that really inspired her and let her know: yes, this is the moment, this is why I'm doing what I'm doing. So I want to share an excerpt so you can hear her response to my question.
Marene Allison: You know, I have three that stand out. The first one was understanding voice over IP and what it was going to mean. I was the head of global security at Avaya, the telecommunications company, and they asked me to step into what was literally a CISO role, or IT security role, and asked me to run the security operations center for the World Cup in Korea and Japan in 2002. We were using open source and figuring out what we were going to do. You know, the friendlies were scanning us, the unfriendlies were scanning us, and we were trying to figure out how we were going to support that. That was a defining moment. Tom, that was when I knew I had been bitten by the IT bug, and I could never change who I was and what I wanted the trajectory of my career to be.
Tom Field: Here we are, some 20 years later, and she's just wrapping up that illustrious career. I don't think, by any means, she's going to stop. I know she's looking at board opportunities and wants to be active in the community, wants to be active as a mentor - she's just stepping down from the day to day.
Anna Delaney: It all started at the World Cup.
Tom Field: Little did I know that I was playing that extra. But here we go. 'Tis the season of giving.
Anna Delaney: So, Tom, she didn't reveal any details about her former career doing undercover drug busts?
Tom Field: That is promised over a drink sometime at RSA Conference, or someplace where we can get together. She did leave some hints about her future. She says that she is speed dating right now with all of her opportunities, and that she's a bit overwhelmed, but she's going to start winnowing down, disappointing a few suitors and delighting some others.
Anna Delaney: I can imagine she's spoilt for choice. Sure. So, Suparna, you've been quizzing experts on zero trust and the importance of knowing what your crown jewels are. What was discussed?
Suparna Goswami: Sure, Anna. I have been interacting with practitioners closely, thanks to the many roundtable discussions we at ISMG have with them. One common theme that emerged from my discussions with practitioners at roundtables was how to apply a zero trust strategy to your crown jewels. But the important question is: how do companies decide on the crown jewels? To begin with, how do you know what exactly is a crown jewel? So I spoke with Dr. Chase Cunningham, who is called the doctor of zero trust; Maureen Rosado from BT; as well as Patrick English, who is a zero trust consultant with Ztsolutions. And I asked how he helps organizations understand what exactly their crown jewels are. He said a very interesting thing: unless companies are in that critical position, they will never understand their crown jewels.
So they have to deal with the reality of compromise to know what really is valuable to them, and be in that uncomfortable position. So it is all about having that attack simulation exercise. And no, it's not about red teaming, where you know this is a simulation going on - it's about being in that situation as if it's actually happening. That's when, he said, the company actually says, "Okay, this is what I want to protect." But my question was: for every business unit, its data will be the most critical. So how does an organization, overall, make that decision about what is critical? For instance, if I'm from the HR department, for me the employee data will be the most critical, and if I'm a supply chain manager, it might be what I manufacture. So how do we decide? Here, Maureen said a very interesting thing. One part is obviously what was already mentioned - have those exercises. And the other part is that the security team really needs to sit with the business and have those important conversations on what exactly your crown jewels are and how, in the overall picture, they matter. Those conversations are really important - we might not give them that kind of importance, but they really matter to complement the technology. Another interesting point, mentioned by Patrick, which I really liked, was that organizations often think that X, Y, Z is the crown jewel, but it actually is not. For instance, a lot of them think that servers are the crown jewels, but it is not actually the server - it's the data on the server that is the crown jewel. Those are the nuances they need to understand. I was also curious to know if they are seeing companies invest in zero trust strategies and want the zero trust strategy to be applied to everything. I wasn't very surprised - maybe I knew the answer. It was yes. You know, many try to move too fast and fail. And then management comes after them - why have you invested in zero trust, our investments have gone to the dogs - and they drop the whole idea of zero trust. But that's not how it works. That's what Chase said: you must know what you're aiming for. The aim is to reduce the risk of compromise and apply zero trust in areas where compromise is most likely to happen, and then work your way up from that. Not everything needs to be under zero trust. We need to accept the fact that some things can never come under zero trust. So the idea is to secure, in the best way possible, what matters most to you, and then live with the reality that there will be no perfect defense. It's not that zero trust means zero attacks - there will be no perfect defense. And controls need to be applied in areas where they make sense. One interesting anecdote he mentioned: if users on the internet are a problem for you, then browser isolation is a solution. But then you need to tell the users why you are doing this - you're doing X to achieve Y - and that will give you a good user experience. You need to educate the users as well, so that they are part of your entire plan and they can support you. So the gist is essentially: begin your journey today, do not wait, because innovation is happening every day and we have new organizations coming up with new tools. If your organization doesn't start with zero trust now, you will be way behind. So just start with it. Start small, but start.
Anna Delaney: Very important, really interesting insights, Suparna. I know, as you say, you've been speaking to many security practitioners this year at roundtables and summits. How do you think the conversation around zero trust has changed the most, specifically this year?
Suparna Goswami: See, as I said, I've been interacting with practitioners, and a lot of my roundtables have been around zero trust. They have certain apprehensions about where to start from, so not everybody I have interacted with has started the journey. As I said, they have certain apprehensions, but they know the importance of zero trust. And for those who have begun the journey, it is all about these nuanced things: how to apply zero trust to critical assets, or how to apply zero trust in the cloud. How is it different from applying zero trust on-premises? Or how do you apply zero trust on the network? The conversation around a data-centric zero trust approach has just about started; it hasn't matured enough.
But I think it is identity that still dominates the conversation. Now, though, I'm seeing vendors talking about a data-centric approach to zero trust. I think that conversation has only just begun. But yes, they all know the importance, although not everybody, at least in this region, has begun the zero trust process.
Anna Delaney: Well, no doubt we'll be discussing zero trust on a daily basis next year. Thanks, Suparna. Rashmi, lots has been happening in the crypto space this year, keeping you on your toes. I believe you've been discussing the application of banking standards to regulate blockchain-based digital assets with two of our global ISMG contributors, Ari Redbord and Troy Leach. What insights did you glean?
Rashmi Ramesh: So, I want to set some context before we begin. Ironically, regulation is the buzzword in the blockchain-based digital assets industry these past few months - the past year, rather. I say blockchain-based digital assets and not just crypto, because it involves a whole lot of other assets outside of crypto. You have your central bank digital currencies, you have stablecoins, you have non-fungible tokens and a bunch of other assets. And this call for regulation has only become stronger, with situations like the Terra-Luna crash, and FTX, and hundreds of other smaller incidents that most people haven't even heard of. Now, this question of applying standards meant for one industry to another comes up because the logic is that it's all money. But it's a lot deeper than that, and a lot more nuanced than that. The best parallel that we could draw is between the payment card industry before PCI DSS came into being and the current crypto industry. I spoke to Troy Leach, like you said, who helped establish and lead the PCI Security Standards Council, and he said that there's actually quite a bit of similarity between the two, especially when it comes to elements of cybersecurity, because they're both dependent on the same types of technology and business controls, and they ideally should have the same types of checks and oversight. Now, the PCI standards also came about after major data breaches and fraud in the payments space became a universal pain point. And the crypto industry, he said, is likely at the same point in the history of payments right now. So there is an obvious question, right? Why can't you just, you know, pick up the PCI DSS standard and apply it to crypto as it is? And several cybersecurity professionals who have nothing to do with crypto payments have brought this up in conversation as well. But it also has an obvious answer. Not all digital assets are centralized. Not all crypto transactions follow the same processes. Not all tokens function the same way - not even all cryptocurrencies behave the same way. So a shovelware model does not really make sense. What does make sense are two key things that mostly come from experts like Ari and Troy, who know a thing or two about regulation. One is that, yes, certain banking standards are applicable to some forms of digital assets in some instances, like I mentioned earlier. Centralized financial institutions are the best example. They're easier to regulate than most other institutions in the space; they already have regulations in place. For one, you can ensure that the structure of the centralized financial institution is right - that it has an independent board, that it has a CFO, that it has a security officer (and I'm sure you know what I'm referring to) - and that all of these functions actually have qualified people heading them. So put KYC in place, have a travel rule in place - the issue is compliance, not really a lack of regulation in this instance. That needs to be addressed. The second is that there are standards out there meant specifically for cryptocurrency. One is the cryptocurrency security standard, which is designed to augment information security best practices and also complement things like PCI DSS under this framework. So that was my long answer. The short answer is yes, you can apply some aspects of banking standards to blockchain-based digital assets, especially the principles behind them. But it cannot be a cut and paste.
Anna Delaney: Excellent analysis, Rashmi. So what do you think has been the most important crypto event or incident of 2022 that will really have an impact on developments next year?
Rashmi Ramesh: Easy answer: FTX.
If you had asked me a couple of months ago, I would probably have said, I don't know, one of the bridge hacks - but it's now definitely FTX. It has brought to light so many gaps in just how this ecosystem functions, the rampant fraud that goes on and the hacks that happen every day. So, definitely FTX.
Tom Field: Good, Rashmi. FTX is to crypto what the Target breach was to retail a decade ago.
Anna Delaney: Yeah, that comparison has come up a lot, hasn't it? Well, finally, moving on. Tom, can you guess my final question?
Tom Field: Let's see - in the past two weeks, we've done the ghost of Christmas past and the ghost of Christmas present. We're looking ahead.
Anna Delaney: We're looking into the future - the ghost of cybersecurity future. It's quite difficult to name a person, because I haven't been able to go round schools and quiz the future CISOs. But I'm going to extend this: maybe there's an initiative, an organization or even a technology that you're placing hope in to help mitigate or even solve tomorrow's cybersecurity challenges. Who is it going to be? What's it going to be?
Tom Field: I hadn't thought of this before, but I could actually just give you a name - you'd have no way to validate it - of a child in school that I think is going to be the future. You know, I think of this year as sort of a bookend. At the beginning of the year, I spoke with Dawn Cappelli, who at the time was retiring from her CISO role at Rockwell Automation. She went on to take a role with Dragos, but regardless of that, she was stepping down from a position she'd held for a long time. And then I ended the year talking to Marene Allison, who we just saw, who is stepping down as the CISO of J&J. Both of these are women who have had extensive careers, who have made a mark, who have been educators, who have been mentors; they have a following, and they're stepping down. For me, the ghost of cybersecurity future is the person stepping into leadership now who doesn't know a time when there wasn't an internet, who came of age with cloud applications and APIs and a hybrid workforce, and who is going to step in without the legacy of knowledge and context that we all had before in a non-digital world. I think there's a terrific opportunity with this next generation of cybersecurity leaders, and I'm encouraged. I don't know who the person is going to be, but I know there are people out there stepping into these roles right now. And I look forward to being able to support them, bring them into our network, and help them to spread the word and maybe share some education as well. That's what I look toward.
Anna Delaney: Excellent, excellent answer. Rashmi?
Rashmi Ramesh: Well, you cannot think about cybersecurity and the work that's happened on the government side over the past year or so without acknowledging two incredible women. One would be Jen Easterly of CISA, and the other is Australian Cyber Security Minister Clare O'Neil. They have done some amazing work in the past few months, and I cannot wait to see how they transform this industry over the next few years.
Anna Delaney: And there's a boldness, isn't there? Not afraid to call things out as they are. No BS.
Tom Field: I think Jen Easterly has been a nominated ghost in every one of these conversations. She should get some special award.
Anna Delaney: Omnipresent, omnipotent. Suparna?
Suparna Goswami: I'll not name a person, but maybe something I'm very hopeful about: SBOMs, which President Biden's executive order called for in May 2021. While every organization, I feel, uses the same open-source components - including niche organizations - they scan for vulnerabilities and analyze the risk and everything, but they do it in silos. SBOMs will provide a common infrastructure and data exchange format, which I think will go a long way in helping companies; it essentially clarifies both the identity of components and the material inputs beneath them, across supply chains. So I think SBOMs - I'm just waiting to see when India will take that step, or when countries in Asia will take that step and make it mandatory.
Anna Delaney: So, Tom, remember, you started the year at the Editors' Panel saying this would be the year of the SBOM. Suparna, we're full circle.
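[Editor's note: As a rough illustration of the "common infrastructure and data exchange format" Suparna refers to, here is a minimal sketch of what a CycloneDX-style software bill of materials could look like, assembled with plain Python. The application name, component names, versions and package URLs are hypothetical examples, not taken from the discussion; in practice SBOMs are generated by build or dependency-scanning tooling rather than written by hand.]

```python
# Minimal, hypothetical CycloneDX-style SBOM built with only the Python
# standard library and printed as JSON.
import json
import uuid
from datetime import datetime, timezone

sbom = {
    "bomFormat": "CycloneDX",
    "specVersion": "1.4",
    "serialNumber": f"urn:uuid:{uuid.uuid4()}",
    "version": 1,
    "metadata": {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        # The software this SBOM describes (hypothetical name and version).
        "component": {"type": "application", "name": "example-app", "version": "2.3.1"},
    },
    "components": [
        # Hypothetical open-source dependencies, each identified by a package
        # URL (purl) so different organizations can exchange and scan the same
        # inventory instead of rebuilding it in silos.
        {
            "type": "library",
            "name": "openssl",
            "version": "3.0.7",
            "purl": "pkg:generic/openssl@3.0.7",
        },
        {
            "type": "library",
            "name": "log4j-core",
            "version": "2.17.1",
            "purl": "pkg:maven/org.apache.logging.log4j/log4j-core@2.17.1",
        },
    ],
}

print(json.dumps(sbom, indent=2))
```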
Tom Field: It didn't take off quite as fast as I had hoped it would, but I know we've had a lot of conversations about it, and movie buffs will appreciate that I will join the bandwagon and look at 2023 as the year we learned to love the SBOM.
Anna Delaney: Well, I'm going along with what Tom said about our future talent. We've got CyberFirst in the U.K., which really tries to nurture a diverse range of talented young people into a cybersecurity career. And then we also have the CyberFirst Girls Competition, which aims to develop talented young women into CISOs such as Marene Allison and Dawn Cappelli. So there is hope.
Tom Field: It's encouraging. You've got a generation of leaders that has never had to think about whether you should or should not use a personal device for work.
Anna Delaney: Totally. It's mind-blowing. Well, Tom, Suparna, Rashmi, this has been an absolute pleasure and a joy. Thank you very much.
Suparna Goswami: Thank you. Merry Christmas.
Anna Delaney: Yes, absolutely. Thank you for watching. Happy Christmas.