Mathew Schwartz: Hi, I'm Mathew Schwartz with Information Security Media Group, and it's my pleasure to welcome Ron Gula, president and co-founder of Gula Tech Adventures, to the ISMG studio. Ron, great to have you here today.

Ron Gula: Thanks for having me here today.

Mathew Schwartz: It's exciting times. We're here at RSA. You've been described as an investor, a philanthropist and someone who appreciates a bit of policy work. What does RSA have in store for you?

Ron Gula: We come to RSA for a wide variety of reasons. A lot of the policymakers are here, and we get to interact with them. We're doing our million-dollar grant this afternoon; it's a competitive grant for neurodiversity this year. And we have about 10 portfolio companies on the vendor floor right now that we've got to go see and participate in events with.

Mathew Schwartz: Okay, tell me a little more. So you have a number of companies, all cyber, I'm guessing. Tell me a little bit more about what you're doing.

Ron Gula: When my wife, Cyndi, and I came out of Tenable Network Security, we started Gula Tech Adventures with the intent of investing in founders. We're stage agnostic, but we also invested in some funds. So right now we have about 30 companies that do a wide variety of cybersecurity work, some pushing the bounds of innovation, and some just bringing cyber hygiene to markets that haven't had it yet.

Mathew Schwartz: What excites you from a capability standpoint? You're investing, and you've been in the industry for a long time as well. What is really getting you excited about where we're going and where we could go?

Ron Gula: I still think we have a long way to go with cybersecurity. And one of the things we've realized is that when you invest in a cybersecurity company, you can give them guidance, some roadmap advice, even some capital to invest, and get them to that next level. And we realized that we can do the same thing with nonprofits. A lot of our nonprofits, we treat like companies. We have them compete around specific cybersecurity social topics; this year, we're doing neurodiversity: autism, ADHD.
And what we've seen is that when those grants are awarded, the recipients go to the next level. The next level for a company could be getting acquired or getting further investment. The next level for a nonprofit could be getting investment from DHS, grants from a university, or grants from a state. We're excited to be part of all of that.

Mathew Schwartz: So you've got grants, which, as you say, can be equivalent to investing in companies. What's interesting to me about this approach is you're finding people who are doing things that you like, and giving them the opportunity, I guess, to do more of it, and to do it better in a more sustainable fashion.

Ron Gula: Yeah. After running Tenable for a good 15, 16 years, I thought I'd seen everything. Now, with all these different companies we're in, all the different funds we're involved in, all these different nonprofits, and the access we have to all the different policymakers, we still have a lot of work to do. So we're very excited about being able to help and give back. But one of the things we want to do, besides just creating great technology and great people, is change the conversation. We'd like to broaden the word cybersecurity to something called data care. This gives you personal responsibility at the board level, and also makes it a little bit more inspirational to those outside of cybersecurity, especially minorities, who might not be drawn to the military terminology we have in cybersecurity. And we just think we need more people. And that, of course, is called data care.

Mathew Schwartz: I live in Britain, and it's been interesting to attend events aimed at promoting what's still typically referred to as cybersecurity, and to hear from younger people just entering the field how they were drawn in by other people's stories, or by the use of more inclusive language: people explaining what it means, and people going, oh wow, that is exciting. But the terminology, prior to any potential rebranding, hasn't been lighting any fires.

Ron Gula: Well, we don't want to take anybody's cybersecurity budget away. That's one of the objections that's out there.
But we've got a lot of experience speaking to the younger generation about going into this career field, as well as to boards. And I've got to tell you, when you talk to a board, and they're trying to decide whether they've taken on enough risk to run the company and keep it safe from the Russians, keep the oil flowing, and that sort of stuff, it's really refreshing to say: look, it's really all about the data. Everybody in cybersecurity remembers confidentiality, integrity and availability. They're not talking about systems, they're not talking about your phones; they're talking about the data that's in there. So it gets to the heart of what's important for our industry.

Mathew Schwartz: Speaking of important, you're doing a lot of work with neurodiversity. Tell me more about what you are doing at the RSA Conference in particular.

Ron Gula: This is our fifth grant. I'm a couple of years ahead of schedule; I thought we would run a couple of competitive grants and then approach RSA. We got introduced to RSA by Aaron Turner, who has a long history with the conference. Britta Glade, who is involved with RSA, is on our advisory board for making the grant. And we've been doing the grants at RSA: during COVID, it was virtual; last year, it was kind of blended; this is the first year that we're fully back. So at 12 o'clock today, we are awarding a million dollars to a total of five recipients: first place, second place and third place (we actually have a tie for third place), plus some runners-up. It's all different types of cybersecurity nonprofits that do different things with autism, ADHD and other types of learning. Some of them are pure cyber, like workforce development to get people into different jobs. Some are doing things like taking people with autism and trying to get them involved with national security think tanks and organizations. We don't go for volume, and we don't judge on just one measure of quality. We look for a variety of cybersecurity nonprofits, and we're just happy to be able to help these people out and get them to the next level.
Mathew Schwartz: And so this is your way, as someone with deep experience and expertise in the industry, of looking at the challenge, perhaps, of integrating or interfacing with policy. So I guess this is more of a bottom-up attempt to shape things the way they need to go.

Ron Gula: The way I look at it is that we have a responsibility. Cybersecurity was good to us. We're taking the results of our efforts at Tenable and trying to give back. And it's more than just money. It's trying to talk to the policymakers, trying to help people who are trying to change things. Specifically with neurodiversity, I've had so many people in our industry pull me aside and say: hey look, my son, my niece, my nephew has autism, ADHD, that sort of thing, and they want to go into IT, they want to go into cybersecurity. And just like any type of social issue, there's a lot of unspoken: hey, can you say that word? Can you not say that word? Can I ask for help? How do we have that conversation? That's the one thing I've learned, because this is the fifth grant we've done. We've done increasing African-American engagement with cybersecurity, we've done increasing board work, and we've done raising more general awareness of data care in cybersecurity. Every time we've done one of those grants, we've come away learning a lot more. We're happy to help, but then we can also share what we're learning with those policymakers.

Mathew Schwartz: So another message, if I may: never underestimate the impact of giving people language to speak about things. With cybersecurity, as we know, the language is often the very first hurdle for people when they get into it: how to talk about all the various aspects of it. Some of those aspects can be quite interesting and engaging to people from outside cybersecurity, but they might not know how to crack that nut.

Ron Gula: Words mean things. And if you're uncomfortable talking about these topics, because of what you see in the news, or because you're afraid to offend somebody, it's really hard to take that first step.
And then those on the outside of cybersecurity might interpret that as being cold or not welcoming, which is, again, about the words; it's why we want to call it data care. It's also one of the reasons we're doing these grants, making them competitive, and making them about more than just winning a little bit more funding for your nonprofit: it's to really give these nonprofits more exposure to people here at the RSA Conference, and to people in cyber in general.

Mathew Schwartz: Where to from here, if it's not too early to ask? You have outlined a number of excellent initiatives, all tied to your foundation, I believe. Where do you see other areas in which we need help, and that you might try to improve?

Ron Gula: Specifically for Gula Tech Adventures: Adventures is the venture capital arm, and the foundation is the nonprofit arm. It actually was just me and my wife when we started; we've got some great people working with us now. We're really good partners with a number of other investors. You had Alberto Yépez from Forgepoint on; we work really well with them. So we're looking to double down and do more. Right now I'm filming this interview in 2023, and the market is at a crossroads. There are a lot of economic pressures on startups that are traditionally funded by venture capital, so we're watching that really closely. What is the role of funding? What's the role of a good product? We're trying to make sure that the companies we're working with have great exits and a great strategy to help defend the country for the next couple of years. It's the same thing on the nonprofit side. How do you grow a nonprofit? How do you get funding? What's your relationship with the government? Can we go international? Those are all things that we're looking at.

Mathew Schwartz: So you're keeping a close eye on that. As you say, there have been some hiccups in the market lately, and we do need to make sure that startups have access to the capital they require, and the guidance as well.

Ron Gula: Absolutely.
You have to worry, if you have a cyber widget and there are 20 companies doing the same kind of cyber widget: what's your strategy? Should you differentiate? Should you be more competitive? That's classic venture capital 101, so we're tracking things like that. But we're also tracking the new AI and quantum encryption. We're tracking all the different things you can do with the cloud, and how this impacts even things like the privacy and availability of what you and I do on a daily basis.

Mathew Schwartz: Any other issues you're keeping close track of? I mean, quantum is so huge. AI and ML, there's lots of discussion about that at RSA; I think you've got to have that in your marketing tagline, no matter what. Any other areas?

Ron Gula: As a nation, we just released the National Cybersecurity Strategy. Everything in it is good and moving in the right direction. So the question now is: how is it going to be implemented? The government really wants vendors, especially the large vendors, to do more. What does that mean?

Mathew Schwartz: They were clear about that in the strategy.

Ron Gula: Yeah. What does that mean to you? What does that mean to somebody who's running a Windows operating system? What does that mean to a cloud operator? What does it mean for privacy? I've seen this before. When I was at NSA, the government wanted to push the Clipper chip, which would have given us really good security, if you trusted the government. Now there's similar language. And if you look back at when Microsoft was going up against the DOJ over being a monopoly, the government is now basically asking Microsoft to be more of a monopoly. So it's kind of interesting to look at this across the last 20, 25 years and see where we're going. I view cyber policy as a combination of privacy, availability, integrity, all of that. But the question is: from whose point of view? What's the government's role? If it's going to protect you and give you safe harbor in exchange for seeing your source code, or who your customers are, it's going to be an interesting time. I'm just glad we're having the debate out in the open now.
I'm just glad we're having the debate 231 00:10:39.678 --> 00:10:40.950 kind of out in the open now. 232 00:10:40.980 --> 00:10:42.979 Mathew Schwartz: I mean, as you say, words matter. It's hugely 233 00:10:43.024 --> 00:10:45.512 significant. I think the government's come out and said, 234 00:10:45.556 --> 00:10:48.223 these things are important. We don't always necessarily know 235 00:10:48.267 --> 00:10:51.022 how to get there. But here is where we would like to get to. I 236 00:10:51.066 --> 00:10:52.800 think it's been a notable step forward. 237 00:10:53.130 --> 00:10:55.530 Ron Gula: One other thing, I'm really happy about the current 238 00:10:55.577 --> 00:10:58.496 state. Well, you know, we don't want to see violence, we don't 239 00:10:58.543 --> 00:11:01.462 want to see, you know, having to do intelligence and what not. 240 00:11:01.509 --> 00:11:04.428 But I'm very excited that our intelligence community and Cyber 241 00:11:04.475 --> 00:11:07.393 Command is actually getting some credit for doing some work to 242 00:11:07.440 --> 00:11:10.077 keep our country safe. And that involves offensive cyber 243 00:11:10.124 --> 00:11:12.948 actions. And that's something that we were very, not talking 244 00:11:12.995 --> 00:11:15.773 about much over the last 20 years except in the occasional, 245 00:11:15.820 --> 00:11:18.597 sensational book or things like that. But now it's almost a 246 00:11:18.644 --> 00:11:20.010 daily thing in our news feed. 247 00:11:20.190 --> 00:11:22.155 Mathew Schwartz: Yep. You had the Hollywood version and now we 248 00:11:22.199 --> 00:11:24.819 have the real version, always a little more nuanced than the 249 00:11:24.863 --> 00:11:27.396 movies. Well, Ron, it's been a pleasure to speak with you. 250 00:11:27.440 --> 00:11:29.100 Thanks for being in our studios today. 251 00:11:29.160 --> 00:11:30.780 Ron Gula: Thanks for the opportunity. I hope you have a 252 00:11:30.780 --> 00:11:31.380 great RSA. 253 00:11:31.410 --> 00:11:33.930 Mathew Schwartz: You too. Thank you. I'm Mathew Schwartz with 254 00:11:33.960 --> 00:11:35.880 ISMG. Thanks for joining us.