WE'RE IN!

Bringing Humanity to Cybersecurity with Lea Kissner, CISO of LinkedIn

Episode Summary

Lea Kissner, CISO of LinkedIn, describes the dangers of perverse metrics, the importance of phishing-resistant technologies, and the ongoing challenge of recruiting and retaining top talent in the field. Lea also explains how they deal with complex privacy issues at scale every day.

Episode Notes

Lea Kissner, CISO of LinkedIn, describes the dangers of perverse metrics, the importance of phishing-resistant technologies, and the ongoing challenge of recruiting and retaining top talent in the field. Lea also explains how they deal with complex privacy issues at scale every day. Lea and Blake also touch on LinkedIn's efforts to balance security with user privacy preferences, and the evolving threat landscape posed by AI.

Find Lea on LinkedIn

Find Blake on LinkedIn

Follow WE'RE IN!

 

Episode Transcription

[00:00:00] Blake: Hello and welcome to We're In!, a podcast that gets inside the brightest minds in cybersecurity. I'm thrilled to be joined today by none other than Lea Kissner, Chief Information Security Officer at LinkedIn. Lea, thanks so much for joining me.

[00:00:12] Lea: Thanks for having me.

[00:00:14] Blake: So we'll jump right into it. You started your career in experimental robotics.

[00:00:20] How exactly did you come to move into the cybersecurity field?

[00:00:23] Lea: Well, for one, I was doing a lot of soldering in very poorly ventilated research labs, and I decided that it was probably a good idea to do something else. So I looked around. I was an undergrad at the time, and I'd gotten to do a bunch of cool things: I got to work on a Mars rover, I got to work on reconfigurable modular robotics.

[00:00:41] I looked around, I was taking all math and CS theory classes, and I was like, what do I do with this? And cryptography uses the most beautiful math, just absolutely gorgeous. I would not have believed you at the time if you told me how much time I was gonna spend on global geopolitics as part of my job.

[00:01:00] Blake: I can imagine. Also, brief aside, you mentioned cryptography. Crypto, does it stand for cryptography or cryptocurrency? That's the age-old debate.

[00:01:08] Lea: Cryptography. Cryptocurrencies are currencies, right? They are not all cryptographic applications, but I will make a deal. We both get off the term and we leave it to the cryptozoology people.

[00:01:21] Blake: The cryptozoology people, outta left field. Okay, well played. I'm on board with that. I think you can unite the two, or more, crypto communities around that. So I wanted to ask, speaking of your mathematical background and the analysis that you bring to these roles, you've mentioned before a

[00:01:41] "perverse metric," a term that I was unfamiliar with before I started doing some research for this podcast. It's one where, quote, "you can make the metric move in the right direction, whichever one you want, but in the real world, the outcome isn't what you want," end quote. How do you avoid perverse metrics, which I imagine could be quite perilous for a CISO role, and how do you actually find data that carries real weight?

[00:02:06] Lea: Yeah. And perverse metrics are incredibly common in security. They are everywhere, because it's so difficult to measure anything in this world, and we have so many adversaries. But to figure out if you've got a perverse metric, um, assume that there is an all-powerful but very grumpy being who's gonna make your metric do whatever you want.

[00:02:27] But in the worst way possible. So you want your traffic to go up? Bam, DDoS attack, right? So instead you wanna look for indicators that align with real-world safety and resilience. So for example, running phishing tests on employees and then training them into submission, that's an approach that a lot of folks take.

[00:02:49] We actually have some really solid research data at this point that says that doesn't increase your resilience to actual phishing, but what it does do is piss everybody off at the security team really badly. So instead, I like to move to phishing-resistant authentication, like, uh, YubiKeys or passkeys, and measure, okay, where do we not have the technical protections, rather than try to stop people from being people.
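
To make that concrete, here is a minimal sketch of the kind of coverage measurement Lea describes: instead of scoring people on phishing tests, score accounts on whether a phishing-resistant authenticator is in place. The inventory, usernames, and method labels below are hypothetical; in practice this data would come from your identity provider.

```python
# Minimal sketch of a coverage metric for phishing-resistant auth.
# The account inventory below is hypothetical stand-in data.

accounts = [
    {"user": "avery",  "methods": {"passkey", "totp"}},
    {"user": "bao",    "methods": {"sms"}},
    {"user": "carmen", "methods": {"security_key"}},
    {"user": "dev",    "methods": {"password_only"}},
]

# Methods backed by FIDO2/WebAuthn resist credential phishing.
PHISHING_RESISTANT = {"passkey", "security_key"}

# The metric: where do we NOT have the technical protection?
gaps = [a["user"] for a in accounts if not (a["methods"] & PHISHING_RESISTANT)]
coverage = 1 - len(gaps) / len(accounts)

print(f"phishing-resistant coverage: {coverage:.0%}")  # 50%
print(f"accounts still needing a technical fix: {gaps}")  # ['bao', 'dev']
```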

[00:03:17] Blake: Right. And I imagine that's still challenging, though, because just speaking from my limited knowledge of our own security environment here at Synack, there might be compliance issues at stake, right? You gotta do some of those phishing exercises, you gotta clear some of the hoops. Against that backdrop, how do you recruit and retain the talent you need to sustain LinkedIn's security mission, you know, so that you're not pissing people off, as you said, and making them leave?

[00:03:44] Lea: So because we're always facing new attackers with new techniques, 'cause people have way too much free time, apparently, we need people who can figure out new things. So we're looking for people who can figure things out holistically. Like, a great engineer understands not just the security, but the infrastructure and the software and how things connect.

[00:04:05] So when we show up, we are competent and we are helpful. So for example, um, at my last job, uh, they were like, okay, well, we need to cut costs on our systems. It turned out that the most impactful cost-cutting projects were all security projects. Every single one of them was just, yeah, it turns out we were making your infrastructure so much better for security reasons that it also works better and it also costs less.

[00:04:37] So you look for people who can, uh, understand why other people care about things, reason through ambiguity, and really care about the people who are using our product. And that also means that we invest a lot in teaching folks new things and cross-team mobility, right? So if we have somebody move into our team from SRE, we are the winners.

[00:04:58] We just got a bunch of skills that we wouldn't necessarily have. But also, when we have a security person move out into product engineering, we have somebody who knows security on that product. So everybody wins in this scenario.

[00:05:13] Blake: Speaking of product, I'd be curious to hear how you interact with the product team. Do you ever get pulled into conversations about, you know, how could we make this feature work in a secure way? Or do you have to slam the door on certain projects if they present some, you know, security risks that you've identified?

[00:05:27] Lea: So we try and be in there really, really early so that we can find ways to make things work and solve the problems for the people who we really serve. I have been at former employers, especially in some conversations, where we look at it and we go, oh, that's not gonna work out well. And, you know, not everybody is thrilled about that, but they're really excited that I'm finding it out and not the New York Times.

[00:05:53] Blake: Fair, fair point. And I feel like that's such a common thread, right? This push and pull between security, privacy, and, like, exciting product features, or things that people might want on paper but then don't really think about the cybersecurity or privacy implications of. Now, earlier in your career, you were a principal engineer at Google and eventually took on the title of Global Lead of Privacy Technology.

[00:06:17] I liken Google to the final boss of privacy at scale. Everybody uses it. I mean, maybe more people are using AI now, but that's a different story. What can you tell me about that experience?

[00:06:27] Lea: So, okay, I'm gonna back up one step there, right. I think that if you wanna build something great, it has to really respect the people who are using it and the people who are affected by it who aren't even using it. Right? And security, privacy, trust and safety, those are all part of it. If you build a feature that doesn't do well by people, it's not a good feature.

[00:06:49] So when you're working with, especially with really good product people, there's less tension than you would think because we're all trying to figure out how to build something that works really well. Now Google, not boring, right? Privacy and engineering and security overlap very, very heavily because in all of these, we're trying to make sure that we have a system that does the right thing for the people who use it, even in a world where things don't always work properly.

[00:07:17] Um, but even the simplest-sounding promise that you're making to people can be very hard to keep at scale. So, for example, have you ever tried to make sure that data is deleted from a large-scale system? It's way more complicated than it sounds, right? Because people are like, okay, I go to the database and I delete it, and then it's gone, right?

[00:07:38] Blake: Right, right.

[00:07:40] Lea: No, because there are caches for speed all over the place, or there's computations that take literally days and have little bits of data all mixed up in them. There are multiple copies of data, 'cause nobody wants to lose your email or your baby pictures. On top of that, you have just, like, 17 million systems going on in a large company, and you need to make sure you're cleaning it out of all of these different places.

[00:08:07] And when you delete data out of a database, databases keep extra copies of your data in them, just because that's part of how the database works. So you end up with this incredibly complex cross between a cat-wrangling problem and a technical challenge.
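
A toy illustration of the point: deleting the row from the primary database still leaves behind the caches, derived copies, and retained copies Lea mentions. The store names and record here are hypothetical and drastically simplified; real deletion pipelines track propagation asynchronously at far larger scale.

```python
# Toy model: one user record fanned out across several systems,
# showing why a single DELETE against the primary isn't enough.

stores = {
    "primary_db":    {"user:42": "profile data"},
    "edge_cache":    {"user:42": "profile data"},    # cached for speed
    "search_index":  {"user:42": "derived tokens"},  # derived copy
    "backup_2024w1": {"user:42": "profile data"},    # retained copy
}

def delete_everywhere(key: str) -> list[str]:
    """Delete key from every known store; return stores still holding it."""
    for store in stores.values():
        store.pop(key, None)
    return [name for name, store in stores.items() if key in store]

# Deleting only from the primary leaves three live copies behind:
stores["primary_db"].pop("user:42")
leftovers = [name for name, store in stores.items() if "user:42" in store]
print(leftovers)  # ['edge_cache', 'search_index', 'backup_2024w1']

# Verified deletion means sweeping every store, then re-checking:
print(delete_everywhere("user:42"))  # []
```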

[00:08:26] Blake: Right. Not to mention legal lurking over your shoulder, I'm sure, being like, okay, is this actually ironclad? Is what you're doing really, really deleting this? So it's not quite as simple as just dragging a file to the trash bin on your Mac, it sounds like.

[00:08:38] Lea: I've had the luck to work with a bunch of amazing lawyers, and the ones I've worked with for a while, they appreciate an amazing dashboard, 'cause they're like, I know this is so complicated, you know, can we go and check that it's still working all the time? Which, with my SRE hat on, I agree, we should be checking this stuff all of the time.

[00:09:02] And it's also really tricky because there are a lot of compliance frameworks. They're changing really rapidly, nobody really knows what's going on with the new ones, and we're trying to work our way through it. And people are also different. What it means to have good privacy is very different for different people, and I think we all tend to go, well, what works for me?

[00:09:25] But that may not be the same thing that works for you. So part of what I did at Google was spending a lot of time thinking, um, with engineers, with lawyers, with civil society people, with regulators. Not so much with regulators, but, uh, what does it actually mean to have a system that does a really good job by privacy?

[00:09:48] And how do you, like, how do you really make something work for people in a world that's very complicated, right? I had a conversation with somebody, uh, where they're like, well, can we delete the data instantly? And I had to explain that the speed of light actually says no.
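
The speed-of-light point checks out with back-of-the-envelope arithmetic: light in optical fiber travels at roughly two-thirds of c, so a delete signal to a distant replica has a hard physical floor on latency. The distance below is an arbitrary example, not a real deployment figure.

```python
# Light in optical fiber moves at roughly 2/3 of c, about 200,000 km/s.
fiber_speed_km_per_s = 200_000
distance_km = 10_000  # hypothetical cross-continent replica

one_way_ms = distance_km / fiber_speed_km_per_s * 1000
print(f"minimum one-way propagation: {one_way_ms:.0f} ms")        # 50 ms
print(f"minimum round trip for an ack: {2 * one_way_ms:.0f} ms")  # 100 ms
```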

[00:10:05] Blake: I like that technicality, but absolutely correct. Even in the best case, that's not gonna be instant. There's still gonna be that response time and that lag of however many hundreds of thousands of miles a second it is. So, applying some of this to LinkedIn. You know, there are some really interesting LinkedIn features.

[00:10:19] You mentioned everybody kind of has their own definition of privacy, their own, like, privacy and security appetites, right? In a way. And I'm thinking of features like the recent Verified on LinkedIn partner program, or the ability to show that you're open to work without tipping off your direct colleagues in your organization.

[00:10:35] So, like, shutting off some of that. Obviously that's not gonna fit everybody's privacy appetite, but you wanna make sure these features are being implemented in a way that people trust and use them and get the most out of them. How do you strike that balance when everybody's personal feelings about the matter are gonna be so widely varied?

[00:10:52] Lea: Yeah, it's really tricky, and it requires a lot of sitting down, especially really, really early in the design, and thinking through: where are the trade-offs? How do we serve different people? And not every feature is right for every person, which is why we wanna make sure that people can make the decisions that they need to make for themselves, because they know their situation better than I ever possibly could.

[00:11:16] Blake: Yeah, that's a really good point. And before heading up security at LinkedIn, you were over at Lacework, and I wanted to highlight something that you wrote there that I found really interesting, quote: "The role of the CISO isn't just about protecting assets, it's about caring for the organization's people."

[00:11:33] End quote. I've heard plenty of CISOs talk about securing the human element. I mean, we even touched on it earlier a little bit, uh, but not so many actually use the word "caring." What's behind that word choice?

[00:11:46] Lea: So security is very, very deeply human. So yeah, we defend infrastructure. I spent a lot of time as an infrastructure engineer, but our mission is to protect people: their jobs, their data, their trust. And if you treat security like a purely technical function or as a series of checkboxes, you're going to miss that, right? You're not solving problems unless you're solving them for real people, and you really do have to feel the impact of that to make good decisions about it. And also, care means creating a culture where people feel safe to report issues, to ask questions, to learn, to grow. And I think that comes down to: everyone has something to teach and something to learn. So you have to have that culture of empathy and responsibility. It's not a mechanical function.

[00:12:42] Blake: That's a really good point, and so important, and can't be overlooked in building today's security cultures. Speaking of which, I know you also pointed out that two in three security leaders report their teams lack the necessary skills to protect against modern threats. Thinking of AI here a little bit, I know we've gotten this far in the conversation and have only kind of flicked at AI. How is that changing the skillset needed? How is that changing the game, uh, for you and your teams?

[00:13:10] Lea: So in terms of skillset, like I said, I wanna hire people who can go figure new things out. So we're still looking for that, we're still looking for that skillset. It's another tool to put in the, uh, put in the toolbox, both for us and for our attackers, right? Um, I'm lucky enough to have been at companies where people have been throwing AI-based attacks at us for years and years, so this is a little bit less of a surprise to me.

[00:13:40] They're better now, but what they are is more democratized. You're gonna get a broader range of actors with better attacks. And there are some places where it's changing the game for defenders. Some of that is things like being able to answer questions. Uh, I work with so many people who care a lot about security, but they don't know everything.

[00:14:04] I don't know everything about their team, they don't know everything about mine. There's a reason why we make teams to solve problems, and so we're able to answer their questions a lot faster. Um, and we also can use it to try to find patterns in data better, uh, which is good when we have a lot of extremely inventive, uh, attackers.

[00:14:30] Some of that is everything from, how do we do account recovery safely in a world where folks are showing up with deepfakes to try to fool the IT help desk, to just, uh, bread-and-butter detection and response, kind of expanding into more techniques to find signals.

[00:14:51] Blake: I think about that even with video recordings like this one, right? People can use this and potentially leverage it. It's so alarming, and definitely seems to change the game for the threat landscape. There we go. Yeah, it's tough.

[00:15:03] Lea: I'm not a deepfake, I swear I'm not a deepfake.

[00:15:07] Blake: But a deepfake would say that, right? That's the problem. Uh, no, I also wanted to circle back quickly to something that you mentioned earlier, which I think is so important in security: some of the benefits and, uh, efficiencies that can be realized, different outcomes that can be achieved through your role as CISO and through an effective security function, other than just avoiding breaches.

[00:15:31] Right. Uh, I don't know if you have any additional thoughts to share about perhaps how peer leaders could even articulate that so that security isn't viewed as just a cost center.

[00:15:42] Lea: I'm showing up in a way where I believe that, for example, 90% of the things you do for your systems to just stay up and work, they're the same things that security wants. They're exactly the same things. Not just that they overlap, they are exactly the same thing. System dependency, uh, control, it's the same thing as an access control system

[00:16:04] if you design it well. I have had so many conversations with my SRE buddies where they don't like people touching production. I don't like people touching production, right? So if we come in with this mindset of, we have the same problems for slightly different reasons,

[00:16:25] we can go and just build better systems, and it becomes, uh, a lot easier for everybody. But also the security team becomes very popular, because we show up and help them solve problems that they're maybe not equipped to, or don't have the time to, solve on their own.

[00:16:42] Blake: Yeah. What's that old rule, the 1-10-100 rule? I always wonder if that's actually apocryphal or true, that fixing a security bug in production costs however many more times than addressing it earlier on. But anyway, that's neither here nor there. Uh, I wanted to ask a question that we ask of all of our We're In! guests, actually, but it feels especially appropriate given your day job, which is: what's something that we wouldn't know about you, Lea, just by looking at your LinkedIn profile?

[00:17:11] Lea: Okay, so I like to go backpacking. Um, to the point where I have my own dehydrator, freeze dryer, I make all of my own food. And, uh, I've gone a little down the ultralight path. So my backpack, fully loaded, without food and water in it, is 10 and a half pounds.

[00:17:32] Blake: Wow.

[00:17:34] Lea: I am too lazy to haul a 40 pound backpack up a hill. It's just not happening.

[00:17:39] Blake: I just went backpacking in Big Sur last month and it was beautiful. I promise you, we did a competition. My backpack was like 40 pounds. It was not the most efficient. I like that. That feels very apropos for our era as well, right? Like, finding the efficiency, drawing out any little ounce that you can squeeze out of that. That's impressive. 10 and a half, you said? 10 and a half.

[00:18:00] Lea: 10 and a half pounds. Not with, uh, without.

[00:18:02] Blake: That's lighter than my cat. Lea, it's great talking with you. Thank you so much for joining me. And, uh, for listeners, give Lea a follow on LinkedIn, of all places; I think you won't have any trouble finding Lea there. And really appreciate you joining me on We're In. Thanks so much.

[00:18:18] Lea: Thank you so much, Blake.