Beau Woods knows firsthand how every moment counts when it comes to medical cybersecurity. He launched his career in a hospital, where it wasn’t always possible for doctors to punch in complex passwords or spare a second thought for cybersecurity. Beau went on to found I Am the Cavalry, a group of cyber ambassadors dedicated to improving the security of devices ranging from pacemakers to connected door locks. In his current role as senior advisor for the Cybersecurity and Infrastructure Security Agency, Beau helps fill gaps in U.S. cyber defenses by boosting organizations that may not have the resources or knowledge needed to secure critical connected equipment like insulin pumps. “If you can get ahead of things and help them to build better procurement processes, help them to identify more securable technologies that have better business models, that will have greater longevity, then you can stop the flow of inbound, insecurable devices and – over the next decade or two – eventually that cyber hygiene tide line can rise,” he said in this episode of WE’RE IN!
----------
Here are a few more reasons to tune in:
* Learn Beau’s tips for making cybersecurity issues more engaging, from gamification to building empathy
* Hear about his unconventional career path from psychology to security
* Build awareness of the state of healthcare cybersecurity and CISA’s role in government
----------
Links:
* https://www.cisa.gov/
* https://iamthecavalry.org/
* https://www.synack.com/
* https://readme.security/
Beau: [00:00:00] Alright, my name is Beau Woods and my title is senior advisor.
Jeremiah: Awesome, thanks. Hey Beau, welcome to the show. Really appreciate you joining us; thanks so much for your time. During DEF CON this last year, you got to participate in several things, one of which I had the pleasure of sitting in on: the Do No Harm panel. I was wondering if you could tell us a little bit about that and how that experience was for you.
Beau: Yeah, sure. So Do No Harm is a longstanding event out at hacker summer camp. We started it, I think, in 2013 or 2014, and it began as just an informal meetup that happened in my room out at BSides. As it grew over time, we expanded it. The first time we did it at DEF CON, it was a nighttime lounge event, and we had like two or three hundred people; we actually had to get a bigger room. They had cleared the Packet Hacking Village out, so we jumped in there; they were gracious enough to let us have their space. Jeff Moss, the founder of DEF CON, came by, and Representative Langevin, a member of Congress we had brought out for a different thing, came in.
We got a great reception, so we just kept doing them, getting bigger, becoming more fun, and we've got kind of a rotating cast of characters that we bring in. But at its core, Do No Harm is an attempt to bring together people from the hacker community who are interested in healthcare, who work in or around healthcare, or who have healthcare concerns. Not necessarily just "I've got a device in my body," although we've got several hackers who have implanted or wearable medical devices like insulin pumps or ICDs, but people who really see healthcare and public health as a bigger mission to serve as part of what they do in security.
So the panel is always a mix of hackers, doctors, hospital administrators, patients, and people who work on medical devices or in healthcare. We've had folks from the Food and Drug Administration and HHS anchoring it.
Jeremiah: There was something interesting you said there, too, with Congress and some of the HHS representatives you mentioned. I saw that you work on a project called DC to DEF CON as well. Is that part of that flow?
Beau: So DC to DEF CON is something we started a few years ago with where I was working at the time, the Atlantic Council, which is a Washington, DC-based think tank. We said, what would be the craziest thing we could do? Maybe it's to get a couple of members of Congress to come out to DEF CON with us and just immerse themselves in hacker culture for the weekend.
And so Representative Langevin and Representative Hurd came out, and we did a variety of events and activities with them. We had them in conversations for basically a full day with members of the hacker community, brought in a bunch of people who are really interested in policy, and had them sit down and talk.
Jeremiah: That's awesome.
Beau: They did a full talk out there and an interactive Q&A session. And when they got back, Representative Langevin, on the floor of Congress, gave kind of an impassioned plea in one of the hearings for why Congress and the government should look to hackers as leaders of the cybersecurity community.
I live in DC, so I'm allowed to say cyber without drinking.
Jeremiah: I'm just outside as well. So
Beau: So that other members of Congress could understand that hackers are not a monolithic thing, right? That security researchers can be a net benefit and a net gain for the country and for national security, not just something to be feared.
Bella: And do you think experiences like that are really starting to add up? I feel like I hear a lot of small experiences like this, where folks in government are exposed to, you know, quote-unquote hackers, and then it sort of changes their minds. But I feel like I've been hearing these stories for a while. Do you think we're finally starting to see a broader shift now, more than just these one-offs?
Beau: Yeah. If you go back 20 or 30 years, the relationship between government and the hacker community was tense.
Jeremiah: It is.
Beau: And yet it was not a static thing, right? If you go back and look at the cast of characters for the very first DEF CON, one of them was the FBI agent who was in charge of tracking down legitimate criminals and bringing them to justice. The DEF CON crew recruited her to come out and give a talk about that, because they said, look, we need to work together in this society in order to make it safer for the things we all have.
And so there's been an undercurrent of this kind of collaboration going on. Even through the tough times, hackers and the government have been working together to influence the way certain law enforcement activities are carried out, to be a helping hand in informing congressional staffers and members of Congress, and to team up with agencies, the White House, and governments around the world to really help raise the tide line and to distinguish between positive, good-faith work and the things we truly want to weed out of our society.
Bella: Yeah, that's really interesting. It's interesting to think about the distance we've come so far on that. So before we get too deep into what you're working on, I wanted to ask you a question. I noticed that you have a degree in psychology from Georgia Tech.
When I read that, I thought it was really interesting, because I imagine there are probably a lot of interesting ways to look at cybersecurity and at hackers from a psychology perspective. Is that something that has influenced your career in cyber?
Beau: Every day, probably, in various different ways. So you're right, I have a degree in psychology. I started out in college thinking I was going to be the best computer scientist in the world; I was going to learn all the things and be great at it. Then I got in and realized I was way behind the power curve. I was kind of technical and I liked computers, but everybody else was way more technical than me, so I said, all right, I'm going to be behind my entire career if I go down this path in just this one way. So I started taking the basic classes, ended up really enjoying the knowledge, the education, and the capabilities that psychology courses brought me, and decided to major in that.
Now, when I graduated, I took a non-traditional path for a psychology graduate, which usually means a master's and university work. I ended up getting a job at a hospital, and really quickly on that job my psychology training kicked in and allowed me to see things a little bit differently. Working with doctors and nurses, where every second counts, if you ask them to have a really long, complex password and change it all the time, they'll look at you kind of funny. Fortunately, my boss at the time invited me to go hang out in the emergency department and just watch the workflow.
What I saw was doctors and nurses going up to these workstations all the time. They would pop between four or five or six, depending on which one was closer to the patient. If they'd had to put in a complicated password every time they jumped onto a computer, they would have had a lot of issues. If they forgot their password or needed to reset it, that's a 15-minute cycle: 15 minutes out of somebody's day when they can't be helping patients, in a place where seconds count.
So it gave me an appreciation and an ability to see things through a different lens and start to build some empathy with the position other people are in, and then apply that to myself: what can I do as an engineer to build something that works better for them, not just whatever the standard practice might be for doing security in a financial services organization, or in manufacturing, or in aerospace and aviation? Every place has its different trade-offs and considerations. How can we bring a better set of security tools in a way that's psychologically acceptable? That's one of the original principles that helped launch information security back in the seventies: the idea that things have to be acceptable to people so that they'll actually use them and not try to circumvent them.
Bella: Yeah, that's really interesting.
Jeremiah: Just real quick, I had a question on that, around authentication, because that resonates with me. I have past experience inside the government and in government contracting, and I come from the red team operations space myself. I've always been a supporter of passwords, right? Super complex passwords that help you authenticate. And that particular problem is not one I'd thought of. As you were talking, it resonated with me, because these are critical times for patients, and any delay between when a doctor can see them and when they can get on the computer systems they need to treat them is critical.
From my perspective, are there current solutions to this authentication delay problem that are in effect today?
Beau: Yeah. I mean, there are some really high-tech approaches and some really low-tech approaches. Low-tech is you put as few barriers between the doctor and the patient as possible, and for certain computer-connected capabilities, that may mean no password. The protection you fall back on is that it's in a physically separate environment where there are other, more dangerous things a bad actor could use if they really wanted to. Or it may be things like more seamless, frictionless authentication: a proximity card instead of a password, or single sign-on throughout, so that logging into the computer gets you into one, two, three, five clinical applications at a time and brings up the connections.
So thinking through what that workflow needs to be in order to serve the mission of the organization, whatever that mission is, and the requirements they have, the things that keep them going: I think that's really key and critical to building any part of a security system, whether it's authentication or intrusion detection or antivirus.
Jeremiah: So, by prox card. I don't necessarily know exactly what you mean by that, but in my mind I'm holding up a credit card that happens to have a wireless authentication piece attached to it. Any time I go up to a wireless-enabled payment system, I can just kind of swipe my card, it authenticates using what's embedded inside the card, and then it authorizes and charges my account. Are we talking about something like that?
Beau: Yeah, something like that, or even easier. So, you know, I won't hold the front side up to the camera, but this is my DHS identity card, my CISA identity card. It's got a chip, kind of like your credit card: you plug it into the computer and it reads it. In addition to my PIN, which is something I know and never changes, this gives me the ability to really quickly and easily authenticate. There are also systems where it's a prox card, like badging into a building: you get close to it and you can uniquely identify yourself. There are some challenges with some of these approaches, but by and large those can be overcome with a little better engineering. We just have to apply ourselves to that.
And, you know, I would take a different perspective from you on loving passwords. I don't love passwords at all; I kind of hate them. When you had very little memory space and very few inputs, passwords were a simple approach to securing authentication. But we tell our kids the story of Ali Baba and the Forty Thieves when they're two to five years old, right? It's about bad password management. We've been doing this for a couple of thousand years and getting passwords wrong. Why would we think that introducing more complexity and more technology would all of a sudden fix that fundamental underlying thing? Because, again, going back to psychology, passwords tend to be harder to manage than something you might have on you.
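The badge-plus-token pattern Beau describes can be sketched in a few lines. This is a minimal illustration, not a clinical system: the badge IDs, user names, and session lifetime below are invented for the example.

```python
import secrets
import time
from typing import Dict, Optional, Tuple

# Hypothetical badge-to-clinician mapping; in a real deployment this
# would live in a directory service, not in code.
BADGE_DIRECTORY = {"badge-4421": "dr.patel", "badge-7830": "nurse.kim"}

SESSION_TTL_SECONDS = 300  # short-lived, so an unattended workstation logs out

_sessions: Dict[str, Tuple[str, float]] = {}  # token -> (user, expiry)


def tap_badge(badge_id: str) -> Optional[str]:
    """Exchange a badge tap for a short-lived session token; no password typed."""
    user = BADGE_DIRECTORY.get(badge_id)
    if user is None:
        return None
    token = secrets.token_urlsafe(16)
    _sessions[token] = (user, time.monotonic() + SESSION_TTL_SECONDS)
    return token


def current_user(token: str) -> Optional[str]:
    """Resolve a token to a user, treating expired sessions as logged out."""
    entry = _sessions.get(token)
    if entry is None:
        return None
    user, expiry = entry
    if time.monotonic() > expiry:
        del _sessions[token]
        return None
    return user
```

Every clinical application on the workstation can resolve the same token, which is the single sign-on behavior described above; walking away until the TTL expires ends the session.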
Jeremiah: It's not always about the particular technology or tool you're focusing on; behind those things, it's a human. So it's human values, I think, that come to the core of a lot of these things: how do we solve them with a humanistic approach? That's interesting.
Bella: I think this anecdote of experiencing first- or second-hand what it's like to work as a doctor in the ER, and understanding the needs around the technology, is really important. I always find it important to talk about the practical sense of security, and stories like this remind me why it's really important to get folks from all different backgrounds into this industry.
You know, I have very strong feelings about some of the gatekeeping that happens in this industry, about requiring certain formal education and whatnot. It's stories like this that remind me we need everyone, with all different backgrounds, so that we can approach real problems with the background to know the requirements.
Beau: Yeah. And a quick little story. I think it was 2020, at the RSA Conference. RSA has sandbox spaces, little self-contained subject matter areas where you can go in and learn about things. In some ways it's a respite from the RSA vendor floor, where you're just kind of walking a gauntlet to get to the place you're trying to go and getting a lot of sales content thrown at you. These are much more community-led, community-driven. One of the things they recognized was that there wasn't any content like this for supply chain, and supply chain was increasingly becoming prominent. So the folks who run the RSA sandboxes said, "Beau, why don't you come and create something for us?"
Now, supply chain security at that time was not really sexy; it wasn't the thing on the tip of everybody's lips. So the challenge was, how do you make supply chain security interesting? You can try to scare people. That's one way to go about it, and it's the typical way we try to bring people into something and get them interested. But we took a different approach. We said, if we can bring people in and get them to have fun and enjoy this, and learn things as a byproduct, then that would be really cool.
Yeah, we're gamifying it. So we went a hundred percent games. What we found is that we created something that really drew people in and caused them to stick around for a while, and then it sent them back to their offices and their businesses thinking about supply chain security differently and understanding that they had to team up with their procurement officers.
Procurement is the heart of supply chain. When you're buying things, you can either try to secure them once you get them, or you can build better selection criteria for what you buy so that it's more securable, or ideally do both. So how do you work with your supply chain people, your procurement people? How do you work with the CEO, the CIO, your legal team, all of these other constituents? We actually built some of that into a game where we had three roles: CIO, CEO, and CISO. They had three different win conditions, and we had them compete with each other. You can win alongside each other, but you can also win at the cost of someone else.
What was really fun and interesting was to see CISOs just light up: "Oh, this is what a CEO values, and this is why they've made some of those decisions." It was like a role-playing game. We put them on decision paths where they had a little bit of cognitive dissonance, right? They could go down the path they would take as a CISO, or the path they would take as a CEO, and then they became a lot more aware of the kinds of trade-offs and decisions a CEO might be facing. That was very widely liked, and it was one of the reasons I got recruited to come into CISA: to bring some of that private sector knowledge and experience, that creativity, imagination, and playfulness, into CISA to help out and kind of transform the culture within the organization.
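The game mechanic Beau describes, three roles with three different win conditions, can be captured in a tiny sketch. The role names come from the episode; the metrics and thresholds are invented for illustration.

```python
from dataclasses import dataclass
from typing import Callable, Dict, List


@dataclass
class GameState:
    # Toy metrics a round of play might move up or down.
    revenue: int = 0    # what the CEO is optimizing for
    uptime: int = 0     # what the CIO is optimizing for
    incidents: int = 0  # what the CISO is trying to keep down


# Each role "wins" against a different condition, so players can win
# together or at one another's expense: the tension described above.
WIN_CONDITIONS: Dict[str, Callable[[GameState], bool]] = {
    "CEO": lambda s: s.revenue >= 10,
    "CIO": lambda s: s.uptime >= 8,
    "CISO": lambda s: s.incidents <= 2,
}


def winners(state: GameState) -> List[str]:
    """Return every role whose win condition the final state satisfies."""
    return [role for role, cond in WIN_CONDITIONS.items() if cond(state)]
```

A state of high revenue, low uptime, and few incidents lets the CEO and CISO win while the CIO loses, which is exactly the kind of outcome that makes players feel the other roles' trade-offs.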
Jeremiah: That's interesting. I've certainly always been a fan of gamifying things, and I've seen huge value in it from an experience perspective.
I was wondering if you could talk a little more about your healthcare work. As you said, you've gotten the opportunity to work on or look at medical devices and the medical industry, and you're currently working on some medical device and other kinds of IoT research. Is that correct?
Beau: Yeah. One of my first jobs in the infosec field, as I alluded to earlier, was at a hospital, so I've got a bit of a background in medical and healthcare. A few years ago, some friends of mine and I started an initiative called I Am the Cavalry, which several of the listeners might have heard of. It's essentially an attempt to change the way we do things in security so that we get more on the side of solving problems rather than pointing out problems, and build with non-traditional teammates to come to common objectives more quickly. That led to some work with the FDA, with medical device makers, and with the Biohacking Village device lab, which I helped found and stand up, and it kind of brought everybody together.
When I joined CISA, it was during COVID, and we pivoted really quickly from generalized private sector engagement into building a COVID task force to help everyone across the entire healthcare ecosystem get a little bit better, a little bit smarter, a little bit more safe. There were a number of things we did there, including some pioneering work across CISA and across government to identify the top supply chain organizations that could have an impact on vaccine delivery and healthcare delivery during the pandemic.
That included building out new capabilities and ways of working within CISA to engage those communities where they were, rather than hoping they would come to us. We built out some new guidance documents, like the Bad Practices list and Get Your Stuff Off Search, things that were a little more healthcare-targeted and healthcare-focused, or at least inspired by healthcare, even though everybody has some of the same common issues. It kind of started with that. And we worked really closely across the healthcare ecosystem: not just hospitals, not just pharmaceutical manufacturers, not just medical device makers, but bringing in doctors and talking very regularly with different parts of HHS, that is, the U.S. Department of Health and Human Services, to ensure that we've got a whole-of-industry, whole-of-community, whole-of-nation, whole-of-world approach to really solving some of these common issues.
Bella: You mentioned meeting companies where they currently are, rather than expecting them to come to you. What do you mean by that approach, and why is it important specifically within healthcare?
Beau: Yeah. So the traditional government approach is, if we put it out there, they will come. At CISA we've tried to change that dynamic, because we recognize that our role is a little different from other government agencies': we have to be a little more proactive. We're tasked with partnering with the private sector, with going out and meeting them where they are. So we've actually got a really large field team out in the regions. We have ten regions across the country and hundreds of people out in those areas, and we can send them out to go talk to people in the region. They know the zeitgeist in those areas; they know what's going on, so they can identify things a little better. We do things like have those regional teams host meetings and small groups in the area. We've also made a big push, particularly in healthcare, particularly during the pandemic, to reach out to other organizations that are already doing work in this area.
Bella: And it sounds related to a big trend right now that I hear about a lot in security. There are a bunch of different buzzwords for it, but basically it's to move away from identifying the problems that already exist and move toward coming up with solutions beforehand. I think that particularly in healthcare, it's got to be really important to shift away from "hey, look at all this stuff that's broken" and move into that more proactive approach.
Beau: Yeah, it's hugely important. I mean, if you look at the lifetime of a medical device, let's take an infusion pump. It's like a digital IV, and it's probably the most common medical device in hospitals. That device will be able to save patients' lives for 20-plus years. Now, it might not be what we would traditionally think of as securable for 20-plus years, because the operating system will age, the firmware will age, the software on it will age, but it will be clinically relevant for a couple of decades before you have to replace it.
So if you go to a medical director of a hospital and say there's a small chance that, if this series of events happens, it could result in worse patient conditions one day, the alternative they see is: well, let me take a bunch of my budget that I would put into personal protective equipment, or bringing on new nursing staff, or new training on life-saving procedures, and use that money instead to replace this still clinically relevant device. It doesn't serve them. That's what you see a lot in healthcare: they're looking at those trade-offs, and they're not willing to replace systems that still work to serve their mission.
On the other hand, if you can get ahead of things and help them build better procurement processes, help them identify more securable technologies that have better business models and greater longevity, then you can stop the flow of inbound, insecurable devices, and over the next decade or two, eventually, that cyber hygiene tide line can rise. You can equip them to be better able to defend, so that when they do have budget for security, they can pretty quickly put it toward those things.
Jeremiah: How do you address some of the concerns that others in the federal space have raised around agility and the ability to shift left? There's a quote out there from an individual who stated that our capacity in cyber, from both a federal and DoD perspective, compared to the commercial sector, is that of a kindergartner. I only ask because, from a process perspective, what are some of the things inhibiting us from moving in a more agile fashion?
Beau: So I've worked in the private sector most of my career, I've worked a couple of stints in government, and I've worked closely with government agencies and with folks in Congress.
And I think there are misperceptions on both sides about the capabilities the other has. Government has a lot of incredible capabilities to bring to bear in cybersecurity; they've got a lot of different levers they can pull that the private sector just can't. At the same time, the private sector has a lot of levers that the government just can't pull. They've got different motivations, different incentive structures, right? Where the private sector is generally profit-motivated and trying to serve the most profitable markets, government has to look across all of society and say, how do we ensure that we take care of the needs of everyone at the same time? So they've just got different spaces, which doesn't mean there's not room to learn from each other. There are definitely things government can learn from the private sector, and things the private sector can take direction on from government.
And there are things the government can do to encourage and nudge greater improvement across the private sector. If you look in terms of raw capacity, what is the best of the best, I think the private sector has a lot of just amazing capabilities; there's no doubt about that. But if you look across the entire ecosystem, what you tend to see is that even though the best capabilities exist, not every organization is taking advantage of them, or can. There are organizations we call target-rich and cyber-poor. These are organizations that maybe can't afford a large team; some private sector companies spend hundreds of millions of dollars a year on security. If you go to a company that's making a couple of million dollars a year and tell them to spend $400 million a year, they look at you like you're crazy. It just doesn't even make sense to think about it that way. And I think that's where government and CISA have a great role to play: to help bolster those organizations that maybe can't afford to spend a ton of money, that maybe are kind of lost to begin with.
We also put out a lot of guidance. When a private sector company says something, it gets a certain amount of attention from a certain type of group. When the government says something, it gets different types of attention from different groups.
So one example is the bad practices that we put out now, the bad practices document. Uh, everybody has best practices or good practices. There's hundreds of different lists that you can look out. A scissor has its own set of good practices, effective practices. Um, but the bad practices is meant to help prioritize.
These are the things that we know reliably fail or that reliably allow adversaries to, uh, gain a foothold in an organization or cause some type of harm. So even if we can't do all of the hundreds of things are in the best practice. Can we just avoid doing these three things or five things right now, the list is at three, but there are some candidates to grow that.
And they're generally unambiguous. They're not technically complex things to do, and in most cases they're not expensive. They're just things that we know work really, really well. So let's put that out. Then there's the Known Exploited Vulnerabilities (KEV) catalog that we put out. Of all the vulnerabilities you might find in your systems, and there are tens of thousands published every year, some with deep supply chain roots, these are the ones that are actively exploited by adversaries for some gain on their part and some harm to organizations. Can we just start with patching those and make sure they don't exist, or at least don't exist on our perimeter?
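[Editor's note: the prioritization Beau describes can be sketched in a few lines of Python. The catalog and inventory below are illustrative samples shaped roughly like CISA's published KEV JSON feed, not real data; in practice you would load the actual feed.]

```python
# Sketch: prioritize patching by intersecting a scan inventory with a
# known-exploited-vulnerabilities catalog. Sample data is illustrative.

def kev_priorities(inventory, kev_catalog):
    """Return the subset of scanned CVEs that are known-exploited,
    sorted so they can be patched first."""
    kev_ids = {entry["cveID"] for entry in kev_catalog["vulnerabilities"]}
    return sorted(cve for cve in inventory if cve in kev_ids)

# Illustrative sample shaped like the KEV feed
kev_catalog = {
    "vulnerabilities": [
        {"cveID": "CVE-2021-44228"},  # e.g. Log4Shell
        {"cveID": "CVE-2020-1472"},   # e.g. Zerologon
    ]
}
inventory = ["CVE-2021-44228", "CVE-2023-0001", "CVE-2020-1472"]
print(kev_priorities(inventory, kev_catalog))
# → ['CVE-2020-1472', 'CVE-2021-44228']
```

Instead of triaging everything a scanner reports, this starts with the short list of CVEs adversaries are actually using.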
We've also got a document, Get Your Stuff Off Search. The idea behind that is that a lot of organizations don't know their internet-facing footprint out there. So how can we help those smaller organizations get better at that?
Bella: I think these kinds of conversations are what I like most about talking about cybersecurity. Not just what's bad and scary out there, but, okay, how do we actually make use of all that information?
And it reminds me of something. I have a background as a penetration tester, and I've gotten to interface with a lot of customers about how to secure their stuff. Something I consistently get to teach or discuss with customers is the idea of risk-focused, or risk-informed, security.
Customers, organizations, really everyone, can be so overwhelmed by these massive lists of all the good security stuff you could be doing. It's so important to cut that down and say, okay, sure, there's everything you could be doing, but let's talk about the actual risk posed to your organization based on the things you have: the assets, the technology, the infrastructure, whatever.
I think that mindset is maybe lacking, or is something we should just talk about more within this industry.
Beau: Yeah, I think you're absolutely right. I also have a background in penetration testing, some blue teaming, also some red teaming, and I had to transition into talking to my clients and customers about risk. Because the risk from one type of vulnerability, let's say a CVE with a CVSS score of 10, might be significantly less than the risk from what would be a CVSS 5 in a different system that's part of a different value chain for them and part of a different business line. It forces you to understand their perspective a little bit better.
One of the things I used to love doing, and I saw probably 40 to 50 different clients a year, was just going in and learning their different business models. There are a bunch of different things that people do out there, a bunch of different ways to make the world a better place. And if you can learn from those individuals and organizations what they do and how they do it, you're a lot better at doing what you're supposed to be doing for them, which is to help manage risk rather than eliminate it. One of the things a mentor of mine used to say is that business is in the business of taking risk. So if you're trying to eliminate risk from what they're doing, you're not helping them.
Jeremiah: Yeah. I mean, innovation is inevitably born with risk. How are you going to innovate, how are you going to grow, how are you going to bring value to the marketplace and expand on current products, if you don't take risk? So I totally get that. You mentioned something really interesting that I want to quickly circle back on, which is the CVSS ranking. That's a guideline; it's not necessarily meant to be absolute. If you find something that's a CVSS 10, then sure, it might be a CVSS 10, but if you've got current mitigations and defense in depth in place, that CVSS 10 might drop to a CVSS 5, who knows?
It just depends on your specific environment. And I really liked that you said you spent some time getting to know the business processes, because that directly relates to what these things mean from a risk perspective for the organization. Separately, a lot of individuals don't pay much attention to, say, CVSS 3.5s and CVSS 4s; once it gets to a 5 and beyond, they start looking at it. But you can chain things together that are 3.5s and 4s to get to what's effectively a 10, especially if you can get that remote code execution, which is interesting in and of itself. I don't know what you think about that.
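[Editor's note: the contextual adjustment Jeremiah describes can be sketched as a simple heuristic. The weights below are illustrative only; they are not the official CVSS environmental-score formula.]

```python
# Sketch: the same base CVSS score can map to very different effective
# risk depending on asset criticality and compensating controls.
# Weighting scheme is an illustrative heuristic, not the CVSS spec.

def effective_risk(base_score, asset_criticality, mitigated=False):
    """Scale a base CVSS score by how critical the affected asset is
    (0.0-1.0), and halve it if compensating controls are in place."""
    risk = base_score * asset_criticality
    if mitigated:
        risk *= 0.5
    return round(min(risk, 10.0), 1)

# A CVSS 10 on a segmented, mitigated system can rank below a CVSS 5
# on a business-critical one:
print(effective_risk(10.0, 0.6, mitigated=True))  # → 3.0
print(effective_risk(5.0, 1.0))                   # → 5.0
```

The point is the ordering, not the exact numbers: context can invert the raw CVSS ranking.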
Beau: Yeah, absolutely. And different architectural designs will give you different results. What I'm thinking about here is that there are a lot of medical devices out there built on the principle of separating the network-connectable piece from the patient care delivery piece.
So there's telemetry that potentially goes out, and there's some ability to communicate with the device to get that telemetry, so that, for instance, you can report it back to a nursing station. But the part that interfaces with patients doesn't connect to the network, or isn't supposed to connect to the network at any time.
This is something I see all the time in healthcare and in other spaces as well. Sure, you got remote code execution on that thing, but that doesn't really matter. You're going to be able to get some telemetry data that doesn't have patient information in it. You can't shut the whole device down and cause harm.
So what's the real risk there? It's being able to interrupt the flow of information, and you increase costs because a nurse has to come up and check every once in a while. But it's also fairly fixable, because you can just reboot the device and it comes back with a static image. There's no persistence.
That type of capability, by the way, was not put in for security reasons. Solid engineering teams have been doing that isolation, separation, and containment for many, many years, just so that things don't accidentally fail, or so that when one system fails there's a backup process, or something that's more robust.
You find those types of engineering everywhere, and they have a material impact on what the true risk of a finding is, beyond just a CVSS score.
Jeremiah: So with these risks and the complexities of the dynamic environments that you work in, and that Bella and I have seen in the past, is there a place for more cybersecurity regulation to force, if you will, organizations to comply with certain things with regard to breaches or implementation of certain security measures?
Beau: Yeah, so CISA is not a regulatory organization. It's our charter to work with the private sector, as well as government, in a more collaborative way. I think there's a lot we can do from that standpoint to improve cybersecurity, information security, and really the safety and resilience of our nation and our national critical functions, that doesn't involve regulatory requirements at all.
I would love to see more people from our community, from the hacker community, the security researcher community, come and work with government, whether it's from a position inside, whether it's working together on the outside and advising, whatever the role is. I think there's a lot of strength in being able to team up with government agencies, with congressional groups, with state, local, territorial, or tribal government, or even going international, the local government wherever you happen to be. There are some really, really good programs to bring folks like me, non-traditional hires, into roles at Health and Human Services and the White House.
Jeremiah: Well, I think that's one of the first steps the government is taking, right? Bringing folks in like you and wanting to partner with influencers in the industry, like the hacking community and those who are having an impact and giving great advice. I think that's kind of step one in trying to change the culture and do what's right.
Bella: The last question I wanted to ask, just reflecting on a lot of the stuff we've talked about today: we've discussed this idea of understanding risk and, I want to say, a smart way to approach security, which sounds really cheesy, but it's fine.
I'm wondering if you have any last thoughts or advice for folks in the security industry, ideas on how to better perform our jobs and responsibilities in security in that smart, risk-informed way.
Beau: Yeah, I'd say, no matter what you're doing, ask more questions than you give out answers. None of us is an expert in everything, even in our very narrow fields. And if you can better understand and empathize with the people you're working with, the people you're trying to help, you'll do a much better job, and you'll come away from the experience knowing a lot more and having built better rapport with the folks you're working with.
So, you know, empathy is an underdeveloped skill in our community. I'd like to see us all try to develop it a little more, both in ourselves and by encouraging it in our peers and the others we work with.
Jeremiah: One last question from our side, something we ask everyone who comes on the show: what's something people wouldn't be able to identify about you based off of your LinkedIn profile?
Beau: Well, now that I've updated my LinkedIn profile, they can identify the dyed Mohawk. Let's see... that's such a great question. How about this: I was taught how to throw a curveball by a professional Major League Baseball player.
Bella: Nice.
Jeremiah: Well, that's cool. That's unusual. Thank you so much for your time. We really appreciate it.