WE'RE IN!

Anthony Newman on cyberthreats to higher education

Episode Summary

In this episode of WE’RE IN!, Anthony Newman, executive director of the Research and Education Networks Information Sharing and Analysis Center (REN-ISAC), highlights the need to protect research infrastructure in higher education, deal with credential dumps, and monitor the dark web for potential threats. He also discusses the challenges faced in higher education, such as securing a diverse range of resources, navigating risks posed by a litany of third-party vendors, and recovering quickly from breaches.

Episode Notes


Anthony also digs into the impact of AI in the cybersecurity landscape, emphasizing the need for trust and the potential benefits of automation. 

Listen to hear more about:

Episode Transcription

[00:00:00] Blake: Welcome, Anthony. Thank you so much for joining me on the podcast. Really appreciate your time.

[00:00:03] Anthony: Yeah, thank you for having me. 

[00:00:05] Blake: Anthony, it's so great to have you on the show. So, uh, last summer you were appointed executive director of the Research and Education Networks Information Sharing and Analysis Center. Uh, what can you tell me about that role?

[00:00:18] Anthony: My role is really focused on leading the organization. So if you think of like a CEO, it's, it's a lot of that. So it's fiscal responsibility, managing staff, making sure our services are kind of what our members expect. And, you know, growing the membership or customer base while also, you know, providing kind of best of breed services. That's a lot of what the role consists of today.

[00:00:40] And so those services all revolve around the establishment of a trust community. Uh, performing some threat intelligence. We also serve as a CSIRT, and so because of that we do a lot around credential dumps and kind of monitoring the dark web for those types of things, both for members and non-members. So yeah, a lot of it is around the threat intelligence component. In the last five years or so, out of the 20 that we've been around, we've been doing a lot more of that.

[00:01:13] Blake: And so the REN-ISAC, you referenced the members. I guess these institutions can be found across Five Eyes nations. What constitutes a typical member, and what sort of security challenges do they face?

[00:01:26] Anthony: So it's interesting. I mean, it's higher ed and research focused, again, across Five Eyes. We have considered expanding that recently. That was something that I proposed when I came in. I said, look, CISA is able to work with foreign nations, in some cases even ones that we would not typically work with.

[00:01:45] So if they can do it, we could probably figure out a way to do it as well. So we are exploring that, but our members range from really small education institutions, community colleges, all the way up to the largest research institutions that the world looks to for Department of Defense research, things like that. Not only for the U.S., but for other countries. And so, it's that gamut of really small to large.

[00:02:15] Blake: And you have, as I understand it, a background yourself at a large research institution. Can you tell me a little bit about your role as Chief Information Security Officer prior to joining the REN-ISAC?

[00:02:29] Anthony: Sure, yeah, so I was the CISO for Purdue University. They're an R1 institution in Indiana. They, you know, primarily are, uh, I shouldn't say primarily, people at Purdue would hurt me, but they're an engineering school first and foremost. They're a land-grant university. They do a lot in the ag space, and they have some other great colleges, but from an outside research perspective, and even from a patent perspective, I think 70-plus percent of their patents are in the engineering world, and so they're really heavy into that space.

[00:03:01] They're growing in the secured research area. So you think about, like, classified research, they're growing pretty strongly in that area, even since I left. Uh, and just recently they hired a former FBI agent to run their research security. So, yeah, they are heavy in that space, and a lot of my job was protecting all of that infrastructure. I had a very large team that helped me accomplish that. 

[00:03:27] Blake: How do you go about securing an R1 research institution's data? That sounds like it encompasses quite a lot, whether ag, engineering, security-sensitive, maybe even classified research. That's a lot.

[00:03:39] Anthony: Yeah, it is. Uh, you know, it's really difficult, and I'd say that we adopted, well before my time, the idea that we probably had threat actors already on the network or in the environment, and that's a pretty common focus in those types of institutions. So it was more around recoverability, segmentation, so if there was any sort of breach or compromise, really limiting the space in which it could have a negative impact.

[00:04:07] And then being able to recover quickly. As you know, universities are not just research, they're not just education, they're also kind of a public-private partner, and so you get a lot of non-student, non-staff, non-faculty people that all of a sudden need accounts. So you take a school like Purdue: you will have a million-plus identities, people logging in and logging out of various resources in any given year.

[00:04:32] And so, it's very complex. I always compared it to, you know, a Fortune 500 company. So you take a company like GE. And you compare it, and it's, it's, there are some pretty, pretty easy comparisons to make, both from a revenue standpoint and from a staffing and needs perspective.

[00:04:49] Blake: That makes a lot of sense. And honestly, I've heard a lot of organizations say, you know, bring-your-own-device policies can be quite challenging to work around and still secure your networks. And I would imagine that a higher education institution is a bring-your-own-device mothership, right? Everybody's got their own devices. There's no centralized provisioning for those million people that you just referenced; nobody's sending out corporate-issue laptops at Purdue. How can you improve organizational security while still navigating that and letting people use their own devices?

[00:05:22] Anthony: Yeah, I mean, it's exactly what you said. So a school like Purdue, I think IU's in a similar boat. A lot of those large, uh, R1 institutions where you have a heavy on campus presence, you know, students, especially those first, second year students, they're bringing Xboxes, iPads, laptops all to school and expecting to not just perform their studies, but also live and play and communicate with friends and family.

[00:05:48] So, you know, the biggest focus that worked for us, and I think works for most institutions, is ideally a line in the sand of protected resources versus kind of open network availability, or direct internet access, if that's possible. Um, when that's not possible, and in some institutions it's not, then that's where segmentation and a strong authentication practice come into play. So, the idea that as you escalate into various resources, you might require additional forms of authentication to access those. Um, that's a bit more complicated, but in infrastructures where it's all kind of tied together, where you can't isolate or you can't segment, that's what's necessary to protect it.
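The escalating-authentication idea Anthony describes can be sketched as a simple policy check. This is a minimal illustration only: the tier names, factor labels, and mappings are invented for the example and are not any institution's real configuration or product API.

```python
# Hypothetical sketch of step-up authentication: higher-sensitivity
# resources demand more authentication factors before access is granted.
# Tier numbers and factor names are illustrative, not a real product's API.

RESOURCE_TIERS = {
    "campus_wifi": 1,       # low-sensitivity, open access
    "student_portal": 2,    # password plus a second factor
    "research_enclave": 3,  # password, second factor, and hardware key
}

FACTORS_REQUIRED = {
    1: {"password"},
    2: {"password", "totp"},
    3: {"password", "totp", "hardware_key"},
}

def access_allowed(resource: str, presented_factors: set[str]) -> bool:
    """Return True if the session's factors satisfy the resource's tier."""
    tier = RESOURCE_TIERS.get(resource, 3)  # unknown resources get the strictest tier
    return FACTORS_REQUIRED[tier] <= presented_factors  # subset check

print(access_allowed("student_portal", {"password", "totp"}))    # True
print(access_allowed("research_enclave", {"password", "totp"}))  # False: hardware key missing
```

The key design point is the one Anthony makes: escalation is per resource, so a compromised password alone never reaches the most sensitive systems.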

[00:06:31] And then again, recoverability. That's what the name of the game is, especially with ransomware, that's the only way to move forward today.

[00:06:39] Blake: Right, right. And the ransomware threat is particularly acute these days, it seems like, from both a healthcare and, lately, an education perspective. I remember last year, I think the National Student Clearinghouse reported that some 900 colleges and universities were affected by the infamous MOVEit software zero-day vulnerability, which just absolutely hammered thousands of organizations writ large, but especially the education sector. What do you remember from those days? Were you at Purdue when that played out? Was that a challenge?

[00:07:10] Anthony: I was. Yeah, I was. I mean, you think about the difference between higher ed and corporate is corporate, in most cases, the company buys and controls the environment. At higher ed, especially if you take an institution as old as Purdue, it's not that way. It's moving towards that direction, but it's definitely not that way today.

[00:07:31] And so, um, you have software installed that IT may not know about, even in a tightly controlled environment where procurement prevents unauthorized purchases from moving forward. All it takes is a faculty member or a staff member saying, well, I make a decent amount of money, I'm just going to go pay for this myself because my team needs it, and I have install permissions, and then it's game over.

[00:07:58] And so, yeah, those are very challenging. What we're seeing in industry today is a lot stronger policies, a lot more push towards faculty and staff taking ownership if there is a ransomware attack. So, for instance, if it's tracked back to an individual who did something they shouldn't have done, consequences up to and including being terminated, and possible lawsuits.

[00:08:20] I mean, I think all of those are on the radar today with higher ed. Ransomware is probably the largest area that we're seeing from an attack perspective, all still coming through the same mechanisms we've seen for the last 10 years, though. So we're still seeing phishing being an entry point, broad admin access, being able to kind of pivot from that initial email to installing and taking control of something.

[00:08:46] Blake: Yeah, based on the headlines, ransomware is everywhere, of course, but we probably can't forget about some of the more advanced, persistent, nation-state-level threats that have also put some education sector institutions under scrutiny, particularly given what you mentioned earlier: some of the more government-funded research, some of the more sensitive data that can be collected and processed in a university setting. How do you go about countering suspected state-sponsored cyber espionage in a university context?

[00:09:17] Anthony: Sure. Yeah, so that was probably the biggest shock to me when I moved into the CISO role at Purdue: how often our perimeter was being hit by APTs, or what we perceived as APTs. We didn't do a lot of attribution because we didn't have a lot of time, but we definitely saw the IP ranges matching to some of those countries that were of particular threat.

[00:09:40] So it's a mix of, you know, understanding the threats there and reporting that up the chain, so making sure that, in our case, the president, the board of trustees, and the CIO are all aware. When it moves beyond that, there's an interesting component for Purdue University in particular. I don't think you mentioned, you know, China, the country itself, but the president was born in China.

[00:10:04] And so it adds, you know, an additional threat. And then you add the type of research. So if you Google the CHIPS Act and you see the schools involved in it, Purdue is one of the top schools involved. And so you look at what's happening with AI and, you know, the future of technology, Purdue's at the forefront of that. And so foreign adversaries want to get their hands on any research that they can. I will say, from a classified perspective, as it stands today at most universities, I won't speak for any specific one, most universities in a classified environment still operate in an enclave-type mentality, so those systems do not touch the internet or have any direct connection. Any attack on them would actually have to be a kind of local attack; it wouldn't be coming through a phishing example. And so we can rest assured that most of our nation's top research is still pretty well protected. Obviously as things change and progress, I'm positive there will be a desire to have access to those resources wherever you are, and that just adds additional threat.

[00:11:16] Blake: Yeah, and I wanted to circle back to something you mentioned, the CHIPS and Science Act. What is the significance of that vis-a-vis the threat, for listeners who might not be familiar with some of the work that these research institutions are undertaking?

[00:11:34] Anthony: Yeah, so the majority of it is around, you know, protecting, I'd say, the entire supply chain, if you look at it that way, of that type of technology. AI will probably be a topic we talk about, but what most people don't realize is how impactful the last 50 years of silicon chips have essentially been to that. And, you know, what used to be a video card maker, NVIDIA, how closely tied they are now to the future state of AI. It goes back and forth, but a lot of the new technologies we see in the world actually come from researchers that have, you know, designed, tested, failed, designed, tested, and then ultimately they hit a success and then they release it.

[00:12:26] It's waned a little bit in the last probably 30 years, where we see more private industry doing some of that, but researchers are still heavily invested in it, and the CHIPS and Science Act really is around protecting that. In fact, as part of that, NSF had a proposal, or solicitation rather, to figure out a way to secure, or protect, research security in ways that, you know, probably require a lot of work.

[00:12:52] And some federal funding. And so that just went through in the fall. I think they just talked to Congress recently about it and were asked about it. And so I expect, if not already, an announcement should be coming out of who will be awarded that. And that will lead to, you know, even further protection of research security across the U.S.

[00:13:11] Blake: And I looked it up. I realized, of course, with any legislation worth its salt, there has to be a fun acronym, and CHIPS stands for Creating Helpful Incentives to Produce Semiconductors. That's quite the acronym there. And you talk about AI and some of the cutting-edge research that's actually being undertaken at these research universities. What's next there? Like, how is that impacting your work, both at the ISAC and where you see AI trending in the education sector?

[00:13:41] Anthony: Yeah, I'll just say I'm glad I'm not a CISO right now, because when there are incidents, we work at them at a different level than maybe a school or research institution would. AI is amazing. I think, you know, I've got kids and I've talked to them often about how amazing it is, but also, you know, that's why I don't let them have social media. That's why, even if I were to have a video or a picture, I'd blur out their images, because it's a very, very big risk that is not going away by any means at all.

[00:14:20] Um, so you think about, you know, today, we might have two-factor solutions. In fact, I just talked to an executive at a school recently. This was just a few days ago, and it was a quick conversation with someone I knew. I texted them, said, hey, I want to run something past you, and they texted back. They called later in the evening.

[00:14:39] That entire conversation, because it was probably a 30-second conversation, could easily have been faked. You know, I have a handful of videos out there of me that are probably public, and with just that information, how I talk and my tone, anybody could be fooled into thinking that they were on the phone with me. Add to that, spoofing a cell phone is pretty simple, even with all of the security practices. Then you start thinking about, you know, two-factor and how that works.

[00:15:09] Well, two-factor is great. But if two-factor isn't there, or if it's something that we trust, like a phone call, or a FaceTime video that, you know, many of us that have iPhones think, well, that's secure, even that could be mimicked in a way where, using AI, it looks and sounds like me, but it's not me. And that's what was in the movies 10 years ago. I think those are the things that are concerning. So, you know, at an institution like Purdue, we would have phishing examples where students would get an email pretending to be a faculty member, saying this is so-and-so from this class. Take a chemistry course: virtually every freshman takes a chemistry course, so it's a pretty good bet that that faculty member might plausibly reach out to a student, asking them to go buy a gift card and they'll reimburse them the next time they're in class. And we had students actually do that. And so you take that and you move it forward to now, where they can have a phone call with that faculty member. And faculty members' videos are all over the place publicly. It would be very simple to mimic someone using AI, and even with some of the weird things that happen in AI, like hands and feet movement, you could probably get past that. Like even our video today, right, had some technical challenges. And so, you know, you get past that by just saying there's a problem, I'm going to turn off video and just have audio. So it's very scary.

[00:16:40] Blake: I was going to say, concerning to me, uh, dear listener, Anthony is off video and now I'm second-guessing whether we're actually talking to an AI rendition of Anthony Newman or not. But no, obviously it's had a dramatic impact on the potential for the threat landscape to really evolve in some quite significant ways, as you laid out.

[00:16:59] On the flip side, how is AI potentially helping cyber defenders actually parry the next generation of threats?

[00:17:09] Anthony: Yeah, so I think that's why I talk to my kids about the positive side of it. Um, I think from a threat perspective, you know, when someone might be looking for hashes or things like that, all of a sudden now, if you have it in an automated way where you have AI doing that work, you no longer need an analyst to search for

[00:17:27] those clues, especially if you train it over time. You know, that AI could actually detect anomalies that are somewhat actionable pretty quickly. I don't think any technology is really there yet commercially; I think they're getting there. Um, but I think that's what is going to be necessary to move forward in a way where we can trust

[00:17:56] not only AI, but just technology in general. Uh, but I think that the key for me is, it's not replacing those analyst jobs, it's making their jobs a lot simpler, and allowing them to do the hard work of thinking and reasoning versus, you know, taking some of those large language models and some of the data that they can ingest. You know, to me, that's kind of the dumb side of AI, but it does it really, really well. So, yeah, I think we'll see. I think we're still probably a year away, but then again, I thought the video was going to be a couple years away, and they just put out a release, I think, a week or two ago that was pretty good, so it's moving a lot faster than I think anyone recognizes.

[00:18:40] Blake: I saw those videos as well. I believe it's, uh, is it Sora, the engine that's producing those? There's some sort of really impressive array of smooth movement, not the herky-jerky stop-motion animation style you might expect. Yeah, I agree with you. Surprising and incredible what's being done.

[00:19:00] To bring it back to the Research and Education Networks ISAC, you know, how might this affect some of the services like the Computer Security Incident Response Team that you alluded to earlier? And, kind of pivoting to how those services are used generally, how often do REN-ISAC members really engage with them, and how are they using them now? And I'm referring both to the CSIRT and to, for instance, the Higher Education Community Vendor Assessment Tool, which is another that the REN-ISAC has available.

[00:19:32] Anthony: A few things. For our members, regardless of their level, you know, their size, one of the things that we do around the CSIRT is, if we identify that there's been a credential dump or something like that, we're doing a lot of that manually today. And so, going kind of back to AI, relying on some of that over the next year or two to automate it is something I've tasked my team with. I didn't specifically task them with AI, but I just said, look, we should automate this if we can, because we don't get paid for it; we do it because it's what we should do as a leader in cyber and higher ed. Um, but then what we do in addition for our members is, if there is a credential dump, many schools will actually want access to that, and we provide that to our members, whereas for non-members we don't, and that's primarily based around the trust.
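The credential-dump triage Anthony says his team wants to automate can be sketched in a few lines: match leaked accounts against the domains of affected institutions so each one can be notified. This is a hedged illustration only; the domains, dump format, and function name are all made up, and REN-ISAC's actual tooling is not described in detail here.

```python
# Minimal sketch of credential-dump triage: given a dump of
# "email:password" lines, group the leaked accounts by institution
# domain so each affected member can be notified. All data is invented.

from collections import defaultdict

MEMBER_DOMAINS = {"example-u.edu", "college.example.edu"}  # hypothetical members

def triage_dump(dump_lines):
    """Map each member domain to the set of leaked emails seen for it."""
    hits = defaultdict(set)
    for line in dump_lines:
        email, _, _password = line.partition(":")   # never store the password
        _, _, domain = email.rpartition("@")
        if domain.lower() in MEMBER_DOMAINS:
            hits[domain.lower()].add(email.lower())
    return dict(hits)

dump = ["alice@example-u.edu:hunter2",
        "bob@other.com:pass",
        "carol@College.example.edu:letmein"]
print(triage_dump(dump))
```

A real pipeline would add source tracking and secure notification, but the core matching step is this simple, which is why it is a natural first candidate for automation.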

[00:20:26] So, you know, one of the things that we do that most ISACs do, and not everyone likes this, I'm actually one that's on the fence about this approach, I will say: we require vetting for anyone who's going to join. So, unlike a business that might just sell a service because someone wants to pay, if a school approaches us and they want access, we will take the person we work with, which we call a management rep, and put it out to our community. So our community is 760-plus institutions with almost 4,000 members. Um, and we basically say, this person is joining; are there essentially any grievances that would prevent them from joining? And we've had a few that would prevent someone from joining. But it's all around trust, because we do discuss things like indicators of compromise, for instance, or we might share, hey, we see this IP address hitting our firewall constantly. And so because of those types of sharing, we do require vetting. That's also why we don't share, you know, some of what we see maybe on the dark web or out in the wild with non-members, because we don't have that level of trust established. Um, you know, on the HECVAT side, so that's the vendor assessment tool you mentioned, we were one of the groups that helped create it.

[00:21:40] Um, along with Internet2 and Educause. I'm sure there were other partners; this was well before my time. And I will say it's a great tool. It's probably used dozens of times every week; we don't track that closely. What I will say is, um, it's ready for its next kind of wave of development. So, you think about any vendor, Cinec as an example, any vendor who wants to partner and do work with a higher ed institution, and that higher ed institution says, hey, we have this form, go fill it out. That's the extent of what HECVAT does today. They go fill it out, they submit it, and then it's up to the institution, some person at that institution, to read through it and confirm they answered everything. What we don't know is, did they answer everything truthfully?

[00:22:31] And that's where I say it's really ready for the next wave: are they actually attesting to anything such that they could be held liable if something bad happens? 

[00:22:40] Yeah. I guess SolarWinds is actually a prime example of that, where you see a third-party vendor having some issues. I think there were, we've now found out since then, other, you know, potential challenges the company had. But, you know, ultimately, had SolarWinds filled something like that out, and they didn't talk about any of the downstream partners that they might have been using,

[00:23:05] you know, the question is, well, what value does that have? And so, HECVAT, I think, is ready for its next wave. Uh, we've had some conversations with Internet2 and Educause. We'll still host it for free, so it is a free and open-source tool available to anyone, whether you're a member or not. You can go to our website and get it.

[00:23:21] Uh, and it's a great starting point. I would say it's not a great finishing point. So, I would love to see a company come along and take that further. The problem is it's not easily monetizable; that's the main challenge. It's very difficult to come up with a way to make money from it, but it's a great tool, and it helps especially smaller institutions make a lot of progress, where maybe they only have one or two security staff, or even fewer; they have IT who are not dedicated security, who have to, you know, figure out, can we trust this company? Twenty years ago, when I would pay a company to do something, or at that time it was me recommending to my boss to pay a company to do something, no one asked about trust, like, can we trust this company? It was like, hey, they'll provide a good product and we should buy this.

[00:24:10] No one really considered, you know, by installing this, are there any backdoors that we're installing that we just don't even know about, because none of us are skilled or talented enough in the space to know what that looks like. And today I think it's a very different world, where you have to ask those questions.

[00:24:26] Blake: Yeah, and the trust piece also plays into a point you mentioned earlier of vetting the members that are even going to be part of the ISAC and discussing potentially sensitive security-related information. You know, one sort of philosophical point about these information sharing and analysis centers: I sometimes joke you almost need an ISAC for ISACs, because there are quite a few for various sectors. But how do you get the information when maybe a pertinent threat facing the education sector crops up in, like, the financial services sector, or the electricity sector? How do you go about fitting into that broader ecosystem of security sharing? 

[00:25:01] How do you as an ISAC do that? Like, what sort of threat feeds can you get? Can you, are you in communication with other ISACs? Do you go through the federal government for some of that? Is it like, it just strikes me as such a difficult thing to tease out when everybody rightfully, arguably, wants to keep some of this threat information pretty close to their chest.

[00:25:19] Anthony: Yeah, so that's a great point. I'm glad you brought that up; it was something I had hoped we would get to. So, yeah, to answer the first part of the question, yes, we partner with everyone you mentioned. We are part of the National Council of ISACs, and I think the majority of the ISACs are part of that, if not all. We meet regularly, and we also go to Capitol Hill, uh, as a group to encourage Congress and other folks to support us.

[00:25:49] Today, REN-ISAC does not receive any federal funding, but other ISACs do, and so some of them actually sit under DoD or CISA. And so we have various feeds from all of those groups, as well as some open-source ones that anyone could go ahead and get, as well as some paid threat feeds.

[00:26:09] And then what our analysts do today is, so there's a lot of crossover, obviously, between all of those, and so you might see the same kind of threat from multiple sources. You might also see articles, you know, on Dark Reading or websites like that. Our analysts will actually scour the web for each one of those and identify the best source.

[00:26:28] And we provide those in, like, a daily report to our members. That's probably our most sought-after service, which is interesting to me because it's a lot of busy work, but it's also, like, really good. Uh, and so you think, 3,500 members; we provide really great services in other categories that automate things that firewalls can do, like blocking various, you know, potential threats.

[00:26:52] It's like, yeah, that's great, but we have a very small percentage of schools take advantage of that, whereas our daily watch, everyone loves. It's almost like the TL;DR. It just saves a lot of time and it gives you a snapshot of, here's what you need to worry about today.
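The cross-feed work the analysts do by hand implies a simple core operation: the same advisory arrives from several overlapping feeds, and the daily report should carry each item once, from the best source. A rough sketch of that deduplication step, with invented feed names and an invented source ranking (REN-ISAC's actual process and sources are not specified here):

```python
# Sketch of cross-feed deduplication for a daily threat report: keep one
# entry per indicator, preferring the higher-ranked source when the same
# item appears in multiple feeds. Feed names and ranks are invented.

SOURCE_RANK = {"paid_feed": 0, "gov_feed": 1, "open_source": 2}  # lower = preferred

def dedupe(items):
    """items: list of (indicator, source) pairs; return best source per indicator."""
    best = {}
    for indicator, source in items:
        if indicator not in best or SOURCE_RANK[source] < SOURCE_RANK[best[indicator]]:
            best[indicator] = source
    return best

feed = [("CVE-2023-34362", "open_source"),   # same advisory from two feeds
        ("CVE-2023-34362", "paid_feed"),
        ("203.0.113.7", "gov_feed")]
print(dedupe(feed))
```

In practice the human judgment Anthony describes, picking the genuinely best write-up rather than a ranked source, is the part that resists full automation; the mechanical dedup above is only the first pass.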

[00:27:05] As far as, you know, our trust and how we handle all of that: today, I would say we're absolutely critical infrastructure. Twenty years ago, there was a question whether education should be. And I think that the trust that we've established is by far the thing that really sets us apart from, say, a paid service.

[00:27:29] You know, there are lots of paid services. For instance, if you're a Cisco shop, you can pay Cisco for their software. They have a variety of threat tools, but their threat analysis, assessments, you know, all of that, of course, feed into their hardware. And so, um, you could pay for that.

[00:27:44] You're going to pay hundreds of thousands of dollars for that. And yes, it automates it, but if you've dealt with any of that, it's very expensive. You have to hire skilled staff, and many schools not only can't afford the licensing for that, they can't afford to pay the staff for it.

[00:27:59] In our case, our max price today is less than $3,500 a year. And it gives you that threat information. It does require a human to go through it and read it, but that's still far more affordable than any other kind of for-profit service I'm aware of. So, to me, that's what sets us apart.

[00:28:21] You add the ability to ask random questions of 3,500 people, and I think that's a value-add as well, especially for people who maybe are younger or getting started. I think it's a great area as well.

[00:28:33] Blake: Just as long as you don't start a REN-ISAC reply-all chain. I feel like that could get messy fast; I've seen some of those spiral out of control at universities before. But it's interesting you mentioned that discussion over whether education even is critical infrastructure, and I know it's been firmly established. I would just kind of reinforce that from some of my own experience, in my past life, reporting on energy and environmental issues. I remember often encountering universities that were among the few places in the U.S. with significant nuclear capabilities, essentially, and I'm not talking about nuclear weapons, but research into nuclear power, or nuclear medicine, in various highly regulated environments. It wasn't something that was really on my radar, but then I realized, from both the cyber defense and threat perspective, like, oh yeah, okay, this is actually pretty critical. There's a lot going on there.

[00:29:28] Anthony: Yeah, for sure. I think, going back to what we talked about, that a lot of what we see in the commercial space actually is generated from higher ed first. And so, you know, nuclear as an example is one, but even things that are just cool, right?

[00:29:44] I think Purdue is involved in one of these, but a couple years ago there was the whitest paint and the blackest paint, where one absorbs sun energy and one reflects it, and there are tons of applications for that. Put it on your roof, for instance, and all of a sudden you have an extremely energy-efficient roof. But then you go kind of to the darker side of that: Purdue, for instance, has a nuclear reactor on campus. It's right there, and it's great that they're doing research, but that's a threat. And it's not hidden, it's well publicized; in fact, there are websites dedicated to it. Think about the energy sector as an example: they don't share a lot of that information. But research institutes love sharing that information, and many schools today don't have dedicated staff focusing on that.

[00:30:35] So there's a lot of crossover between our ISAC and other ISACs. There was a very recent issue, and if you happen to take any prescriptions, you're probably aware of it, with UnitedHealthcare and lots of pharmacies downstream. We obviously have many institutions with healthcare centers involved, and so we were aware of that right away.

[00:30:59] And we're involved in kind of helping schools navigate that. Again, because of where we cross over, we handle a diverse range of threats. If you're in, say, the water ISAC or the aviation ISAC, you deal with one spectrum; I'd say higher ed probably deals with the largest spectrum, because we cross over into many areas.

[00:31:25] Blake: That makes a lot of sense. And, you know, part of your work, it sounds like, as executive director, is building some alliances, opening up some of these lines of communication. What can you tell me? What's, what's next for you? What's next on the horizon?

[00:31:37] Anthony: A couple things. One, we've been around 20 years, and I think it was 2018 when REN-ISAC started doing what they call peer assessments. So we perform general security assessments for schools, and we use peers, meaning members of our REN-ISAC who have been vetted, and we pay them a small stipend for essentially doing a general assessment.

[00:32:02] Take a school, we'll just say IU as an example. If they were to pay an outside party to do a general assessment, they might pay anywhere from $250,000 to, say, half a million dollars, and primarily it follows NIST 800-171. So Google NIST 800-171, ask all those questions of the institution, and you could charge half a million dollars.

[00:32:24] That's pretty much what it is. I've been through them, and they're boring. They do produce a great report that gives you all the risks, but they're very boring, and you follow a script. Prior to me, they said we should do this, and we charge a lot less than that. It's mostly a cost-recovery service, but it helps higher eds.

[00:32:45] And when I came, I said we should do more of that. So if you're familiar with CMMC, a lot of schools are going after that, and most schools don't want to fail that first assessment, where it's somewhat public. So we're exploring offering a pre-assessment. It'll be kind of an inexpensive way to dip your toe into the CMMC world.

[00:33:05] We'll tell you where we think you would fall if you were to have an outside party assess you, and you could fix those things before paying that outside party to come in. We're also doing more pen testing; that's been growing over the last year and a half or so, and I think it'll continue to grow as schools recognize the value of it.

[00:33:27] And we're also talking about physical security. That's something we haven't done in the past, but it's often what we're asked about when we have open forums. I will say, I think that will be a niche. Most R1 institutions, as an example, have a local police department that sits on their campus

[00:33:49] and also handles their local and physical security. I see a place for that for the small and medium institutions, the R2s and schools around that same size; maybe they don't have that same ability. As for growing membership, right now we have 760-ish members.

[00:34:07] There are over 4,000 institutions in the U.S. specifically, and so I tasked the team with growing that. We're trying to get more partnerships as well so we can keep our costs low. We did raise our fees to about $3,200 for the largest institutions, and again, that's not that much money for those large institutions.

[00:34:24] For smaller institutions, though, it's very expensive. And so we're looking to grow partnerships. It's a lot of that kind of stuff for me, which is, again, why I said earlier I'm glad I'm not a CISO, because my job's pretty easy now compared to what they have to deal with.

[00:34:42] Blake: Well, and an issue that you alluded to is the difficulty of finding qualified cybersecurity talent to actually accomplish some of these things. For instance, say you have a higher ed institution, even an R1 school, that undergoes one of these maturity assessments you're talking about, and the assessors just tear it apart, right? They say, you've got to work on X, you've got to work on Y, you've got to work on Z. Does REN-ISAC kind of step back there and say, good luck? Or what happens next, right?

[00:35:16] Anthony: Yeah, that's a good point. So today we don't do any of the follow-up work. I'd say we're capable of it, and that's something we could explore. If we did it, we'd probably use that same peer model, so we probably wouldn't hire staff dedicated to that. We do point them in the right direction, though.

[00:35:32] So, for instance, I will say many R1s are already capable of moving forward with that. It's the R2s, and I don't want to say "below," because that gives the connotation that they're worse. It's just the less-staffed schools, community colleges, R2s, and such. I think those schools have a harder time, and we do give guidance.

[00:35:53] We don't do the work, typically. One thing to note, and you're aware of this, but listeners may not be: REN-ISAC is actually hosted at Indiana University. And so, because of that, we abide by Indiana University policy. We're employees of Indiana University, we get benefits, and so we also have the risk modeling of Indiana University.

[00:36:18] So if you think about that, I'm a higher ed employee, and me taking on the potential risk of trying to fix a threat at a school, saying this is fixed and giving it kind of a sign-off, might add risk to Indiana University.

[00:36:36] Blake: That makes,

[00:36:36] Anthony: That's not been shared with me, but that's my take, which is why I will probably tiptoe very carefully around REN-ISAC doing that. But that's where partners come in, right? I think we have lots of partners that would be capable of doing a lot of that work, and giving more advisory-level guidance is really, I think, what we'd be after.

[00:36:56] Blake: Of course. And when you talk about pursuing more members among that 4,000-plus pool, that's not going to scale either if REN-ISAC can't swoop in and solve everyone's problems, lovely as that would be. But yeah, it's been really great having you on the show and gleaning some of these insights on the higher ed sector and some of the threats.

[00:37:16] There is one question that we ask all of our guests, and it has to do with your LinkedIn. Or rather: what's something that we wouldn't know about you just by looking at your LinkedIn profile, Anthony?

[00:37:27] Anthony: What can I share that won't get exploited?

[00:37:29] Blake: Yeah, right. The next phishing email will be even,

[00:37:32] Anthony: Yeah, yeah. You know, I love people. That's probably the thing, and you can maybe glean some of that from LinkedIn. But yeah, I love getting to know people. I love building relationships with people. I'm pretty easygoing, even when I disagree with people. And so, yeah, I think that's what life's all about, to me. That's probably the biggest aspect.

[00:37:56] Blake: All right. well, this is, this has been really great. Uh, again, appreciate you coming on.

[00:38:00] Anthony: Yeah, thank you for having me.