Nationalize Cloudflare? Berkeley Researcher Nick Merrill on Making it a Public Utility

Episode Summary

In this episode, Nick Merrill, a research fellow at the UC Berkeley Center for Long-Term Cybersecurity, makes a cybersecurity case for nationalizing major CDNs such as Cloudflare, issues some stark warnings about the dangers of machine learning, and digs into why stereotypical images of hackers in hoodies don't help anyone. His viewpoints are sobering, if not controversial, and worth hearing for anyone who cares about the future of the global internet.

Episode Notes

In this episode, Nick Merrill, a research fellow at the UC Berkeley Center for Long-Term Cybersecurity, makes a cybersecurity case for nationalizing major CDNs such as Cloudflare, issues some stark warnings about the dangers of machine learning, and digs into why stereotypical images of hackers in hoodies don't help anyone. His viewpoints are sobering, if not controversial, and worth hearing for anyone who cares about the future of the global internet.


Why you should listen:

* Get a fresh perspective on some of the biggest risks to the global web: unchecked algorithmic bias, the risk of attacks on massive CDNs, and the growing internet fragmentation.

* Consider some of the boldest ideas from one of the sharpest thinkers when it comes to how policymakers can make fundamental changes to protect the internet.

* Hear Nick’s take on why art matters in cybersecurity -- and why stereotypical images of hackers in hoodies harm the public’s perceptions of information security. 

* Learn more about Fairness, Accountability and Transparency in Machine Learning and the growing movement to look more critically at the hidden algorithms that control the internet and much of technology today. 

* Consider how ransomware takedowns and other large-scale cyberattacks such as Colonial Pipeline erode public trust in technology.

* Get a better understanding of why diversity in the cybersecurity industry matters when it comes to identifying real-world threats.


Key Quotes:

* “That power over the internet is like a huge strategic asset for the U.S. It's analogous to controlling global trade.”

* “Imagine a Stuxnet level attack on Cloudflare.”

* “I would nationalize Cloudflare. I would make it like a national publicly-run utility company.”

* “This word ‘hacker’ got so diluted. It means different things to different people. And it became this totally useless way for describing what's actually happening in security.” 

* “The future of cybersecurity … is the future of machine learning.”

* “The real risk of ransomware is just that it freaks people out.” 


Related Links:

* Synack.com

* https://nickmerrill.substack.com/about

* iSchool (Berkeley) Bio

* https://www.synack.com/lp/enterprise-security-testing-101

* https://cltc.berkeley.edu/

* https://daylight.berkeley.edu/

* https://www.codedbias.com/

* https://www.fatml.org/

Episode Transcription

Jeremiah Roe: [00:00:00] Bella, how are you doing today?

Bella DeShantz: [00:00:24] Hey, Jeremiah, you know, it's a, it's a Friday. So all in all, I'm doing pretty well. Huh?

Jeremiah Roe: [00:00:35] Doing good. I would also like to introduce Dr. Nick Merrill. How's your day going so far?

Nick Merrill: [00:01:01] Yeah, so far so good.

Bella DeShantz: [00:01:44] We're super excited to have you here on the show. So, I've heard you reference your approach to cybersecurity as rooted in science and technology. Could you tell us a little bit more about what that means and how it might differ from others' perspectives?

Nick Merrill: [00:06:49] So I think what I probably said there was that it's rooted in science and technology studies, also known as STS. Rather than what we think about as kind of the true, [00:07:00] real story, I worry about how science is actually done. So, you know, a great example here: Einstein famously did not want to believe in quantum mechanics.

He said, "God doesn't play dice with the universe." There is a social story behind every kind of science, every kind of technology. What we see today, you know, I think it's popularly imagined that these very weird kind of billionaires in Silicon Valley are romping around and building these things, and they have a very particular kind of political idea about how the world's going to work.

So this is an example of the kinds of things that people think about. And, you know, taking this STS kind of perspective to cybersecurity, one of the questions I'm asking is, who is doing cybersecurity, right? You have this weird confluence of hackers, which is where I came from before I did, you know, all of this PhD stuff.

And then you have people from the military, you have people who are industry career professionals. And, you know, these people all come at stuff from different angles. Of course they all have their own weird mix of blind [00:08:00] spots. There are some things that maybe none of them catch, and what we're trying to understand is, you know, when does this matter, and what can we do to shift the way that cybersecurity is done to make it work more effectively for different people.

So, you know, a great example I love to use here: my colleague Diana Freed at Cornell did these really amazing studies where she found that a lot of women who were in shelters for domestic violence were being, you know, stalked or harassed using technical means.

And, you know, the threat vectors that were being exploited here were really legitimate threat vectors. I mean, these were exploits. But the security teams at these companies didn't catch them. And the question becomes, why did they miss this type of threat so systematically?

And the answer, it turns out, is that security tends to focus on anonymous attackers who don't know their victim in real life.

Bella DeShantz: [00:09:34] I think the point that you brought up, or what I heard at least, is that the way that we look at these issues, or the assumptions that we make about how cybersecurity happens, can force us to make mistakes and to miss huge issues.

Nick Merrill: [00:09:51] We want to catch these issues as much as we can. And obviously we don't want to be in this position of, like, finger-wagging [00:10:00] or, you know, saying, oh, look at these stupid software developers. That's not helpful. That's not what we're trying to do. We're really trying to play a constructive role here, saying, you know, these are the things that are getting missed.

Here's what we can do to make this a little bit better. And, you know, our ultimate outputs are, um, you know, practical security practices, and kind of talking to policymakers in a way that helps rather than confuses them further.

Bella DeShantz: [00:10:30] You mentioned that you kind of came from a background as a hacker, and now you do, I think you said, this PhD stuff. But I was wondering if you would tell us a little bit more about your quote-unquote hacker background.

Nick Merrill: [00:10:44] "Hacker" means so many different things to different people. I came from this community that, you know, was based here in Oakland at the time and has since kind of split up a bit, but they were really rooted in this kind of Unix philosophy, and applied that [00:11:00] to the Node ecosystem at the time, and JavaScript.

And, you know, a lot of this was actually before React or any of these kinds of declarative ways of, um, you know, building modern web apps, but people were still really into Web 2.0 and all of that stuff. And, you know, if you remember, at that time people were just, like, modifying web pages with Ajax requests, and it was just kind of a crazy time where you could send a weird URL with some queries in it and it totally rewrites what's on the page.

It was just this wild, wild west time, and it's still the wild west, but, you know, that was my hobby at that time. Then I started a PhD and did all this other stuff, and, you know, I started learning about STS and learning about design research and all these things.

And I was thinking, oh my God, you know, because no one is applying this stuff to security, but it's so primed for this, because it's this weird amalgamation of freaky people and also some really, really normal people. And it affects everybody, right?

Bella DeShantz: [00:13:07] I was talking to someone recently who was interested in learning, like, you know, quote, how to hack. A question that I get a lot, and I think it's funny, because there's a million different ways. But the thing that I was talking to them about was, like, for me, I feel like the way that hacking works is: look at something and think about how the developers didn't expect you to use it.

I feel like that's sort of what most of us that say we have been hackers at any point, like, that's what we did, right? I think that takes it back to the truest form, or the truest sense of the word.

Nick Merrill: [00:14:22] Yeah. We actually, a while ago, wrote this piece called "Don't Call It a Hack." And the idea was kind of talking to journalists, saying, you know, don't say that this is a hack. Call it a cyberattack, call it a compromise, call it a data breach. And don't call them a hacker; call them a criminal, call them a spy, whatever you want.

Use more specific language than this, because, you know, like you said, this word "hacker" got so diluted. It means different things to different people. And it became this totally useless way for describing what's actually happening in security. And we were really serious about, you know, trying to get this to journalists.

Don't do this, don't do this, don't do this; use [00:15:00] these terms, use these terms. And amazingly, over time, we actually did see that behavior change. I'm sure it's different everywhere, but in major news outlets, when people were covering the Colonial Pipeline ransomware breach, they used terms like "ransomware" and "cybercriminal" instead of, like, you know, "the pipeline was hacked." And this gives you so much more, you know, if you're an everyday person, you pick up the newspaper, this gives you so much more to reason with. It gives you so much more analytical purchase on what's actually happening in the world than the word "hack."

Jeremiah Roe: [00:15:45] You mentioned something of interest, obviously it's completely pertinent now, which is the Colonial Pipeline attack. So what's your particular take on that, and why do you think that so much attention has been paid there?

Nick Merrill: [00:16:15] The one thing that is interesting to me about this attack, and it sounds like it's the same thing that's interesting to you, is: why this one? You know, right before Colonial Pipeline there was SolarWinds, which was really widely covered. And after that there was the Microsoft Exchange breach, and it seems to me very likely that Chinese intelligence got literally every email.

Right? From every government agency that uses Exchange. This is huge. It wasn't even covered in the news as much, right? Why this one? I think it's just that people felt this one, you know. The price of gas went up, and also maybe something about America and its oil. But I think that, you know, people really, really felt this in their everyday life, and, you know, that's scary.

And it freaked people out. And [00:17:00] remember, the gas prices didn't go up because of the outage. The gas prices went up because people panicked. And to me, that speaks to the real risk of ransomware in general, which is just that it freaks people out. Right? And this pipeline was really big, but, you know, I can imagine something even scarier happening, frankly.

I'm sure we all can. And, you know, when that happens, I think we're really going to be facing this kind of widespread-panic-type threat. At some point, the panic becomes its own threat vector. And I think that it woke policymakers up. It certainly woke up, you know, the executive branch of the U.S. government to say, this is a national security issue, for real, you know, these privatized companies that have lax standards.

Jeremiah Roe: [00:18:00] You talked about essentially breaking the public trust. Maybe you could explain a little bit more as to what that means.

Nick Merrill: [00:18:42] Yeah, I mean, disinformation, misinformation, whatever you want to call it. Fake news is a great example here. To me, breaking public trust, it's hard to define, right? But I think one way to think about it is that there's this kind of aspect of social cohesion, [00:19:00] trust in kind of everyday life. And this is not just about kind of trusting your neighbors, although that's an important aspect of it too.

It's also about trusting that life is going to be predictable. You know, I'm going to go to the grocery store, the prices are going to be about the same. You're free from kind of fear of random violence. All of these things go together, right? And it's this big, big picture, I think, of what makes that happen, what makes the good society and what makes the good life. And cybersecurity touches so many things.

[00:20:00] And the errors cascade in this way that can be really unpredictable, really scary to the average person. I was thinking, when Fastly went down earlier this week, right? A big CDN goes down, and there were so many outages, so many things. It's like, Amazon is out, as a Fastly customer. And it's like, well, are they a Fastly customer, or is something upstream of their build process a Fastly customer?

Jeremiah Roe: [00:20:22] It took down a good portion of the internet, right?

Nick Merrill: [00:20:27] That's scary. Imagine that your credit card isn't working, and you try to go online to see the news, to see why it isn't working, and that doesn't work either.

Jeremiah Roe: [00:20:39] Talk a little bit about something that you've, you know, coined. It's sorta "cash rules everything around me," which is the acronym, uh, CREAM, that you gave. Which, I literally laughed out loud. That was hilarious.

Nick Merrill: [00:20:57] I got funding from the Internet Society to do some of this work. We're interested in internet centralization. You know, I think that it's widely understood that the internet is kind of, you know, owned by no one. At least that's what we're taught.

It's this global network. And I think there's a lot of fear right now about different countries, like Russia and China, making their own nationalized internets. And sometimes people call this internet fragmentation, or the splinternet. There's a lot of anecdote and fear, especially among policymakers, around this idea.

And about maybe 2019, we started to say, okay, you know, we need some measurement here. We need to really be able to discuss what's happening at a specific level, kind of the direction and velocity of changes to the internet over time. There are a lot of things I can say here, but the one that is really relevant to today's discussion is that, as far as threats to the global internet, forget China, and forget Russia, [00:22:00] and forget even Amazon.

It's the CDNs, these caching services, that are so, so centralized that, you know, a failure in even, as we saw last week, the second-largest CDN, with 5% of the market share, causes significant chaos, right? And this one was an outage. This was just an oopsie. They fixed it within, you know, an hour or something like that.

And so imagine instead that this isn't an accident, right? That Russia very successfully, maybe, spear-phishes or otherwise compromises critical systems at Fastly. And they're, like, really, for real, down. Down, like, all the way down. Not just 85% of the network; 100% of the network is down. This is the second-largest CDN.

That's not even Cloudflare. That would be really chaotic, right? Like, no-one's-buying-anything-on-Amazon chaotic. And what I think is salient about this, right, is that's going to hit the news big time. This last one was a front-page story. So this is going to be, you [00:23:00] know, on the order of a huge earthquake or natural-disaster-type story.

Decentralization was really important to the internet back in the day. It was really a key, key design feature. And I think, as much as we can probably all complain about big tech monopolies, the internet's basic infrastructure pretty much is decentralized still, except for CDNs, except for caching services.

And because they kind of live between users and the things users query, it's this bottleneck, you know. It's the bouncer of the internet. It's the security guard of the internet that lets the traffic come through from the general public.

And if that gets compromised, the global internet, for the average everyday person who's typing URLs into their URL bar, is down. The internet is broken. And when I think about, you know, Russia, I don't know if you remember, but maybe 2017, they did this drill where they disconnected themselves from the [00:24:00] global internet.

And the reason is that they don't want this dependence on this huge infrastructure that's highly centralized, and highly centralized in the United States. Because if the U.S., you know, had an executive order or something, or some kind of law that said, hey, CDNs, don't serve any traffic to a Russian-origin IP address,

then you've just created a tremendous firewall around the entire country of Russia, just from the United States. That's really, really fundamentally how the internet was not supposed to work. There's a lot of talk about antitrust in the U.S. CDNs are a place where you could go in and break them up.

Right. You really could break up this market. And I think that there would be really great security reasons to do so.

Bella DeShantz: [00:25:10] Why do you think it is, though, that people don't notice this, CDNs being centralized? And, like, you know, you mentioned that people are so worried about Google having a monopoly. People are worried about, like, oh, China having their own separate internet. People notice these things. Why aren't they noticing CDNs?

Nick Merrill: [00:25:28] You know, I don't really have a super good reason for this, but one thing I've observed, and this actually comes to me from a colleague, Frederick Tuesday, she's based in Paris, and she had this great observation.

Unfortunately, the paper's only in French. Her claim here is that there was all this talk about internet [00:26:00] fragmentation, and it served this very specific goal from the U.S. State Department's side.

And you see this also: the U.S. State Department's really into internet freedom, right? And maybe there were altruistic people there, but the other observation on the other side of this, and, you know, this is something we have data for, is that the U.S. basically runs the game.

Right. The U.S. owns the internet. And when we want to take something down, as the United States, what do they do? They go to ICANN, or they go to, you know, some other registrar, like GoDaddy, and they say, hey, you know, give us this domain name. And that power over the internet is like a huge strategic asset for them.

It's analogous to controlling global trade. What we're seeing with Russia and China is that they're saying, hey, you know, we don't actually need to be part of this. We can do our own thing. And certainly China's done it quite effectively, in their way.

And, you know, I think that freaks the U.S. State Department out, because it threatens this really amazing strategic advantage that the U.S. has.

Jeremiah Roe: [00:27:55] So there's the CDN aspect, which is, you know, if you type a URL inside the address bar of your browser and you click enter, you know, what kind of happens in the background?

Well, this is kind of what Dr. Merrill is speaking about there, that sort of background stuff. What happens is, it's going to do some lookups for where you need to go, which then point to IP addresses, and then it's going to send you to that location. Now, if these things go down, then, unless you know the actual IP of these locations, you're not getting there. Is that right?

Nick Merrill: [00:28:43] Yeah, totally. And you know, in some cases, nobody actually knows the IP address of Amazon except for Cloudflare. And the whole reason that Amazon uses Cloudflare is to make sure that that's the case, because you really do want protection from the general public. You know, I [00:29:00] want to really underline this: CDNs, and people like Cloudflare, they provide an absolutely essential service.

The problem, though, of course, is that they've become centralized to such a degree that they're central points of failure for the entire global internet. And also, you know, they're not regulated, even though they really are more like kind of a [00:30:00] public utility than they are like a business.
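Nick's single-point-of-failure argument can be sketched in a few lines of code. This is a toy model, not any real CDN's design; all class names and hostnames below are hypothetical. The point is only structural: when a shared caching layer sits between the public and every origin, one outage takes everything behind it offline at once, even though the origins themselves are healthy.

```python
# Toy model of a CDN as the single path between users and origin servers.
# Hypothetical illustration only; no real CDN works exactly like this.

class Origin:
    """An origin server that is, by design, only reachable via the CDN."""
    def __init__(self, content):
        self.content = content

class CDN:
    """Caching layer that sits between the general public and the origins."""
    def __init__(self, origins):
        self.origins = origins   # hostname -> Origin
        self.cache = {}          # hostname -> cached content
        self.up = True

    def fetch(self, hostname):
        if not self.up:
            # The origin may be perfectly healthy, but nobody can reach it.
            raise ConnectionError(f"{hostname}: unreachable (CDN outage)")
        if hostname not in self.cache:
            self.cache[hostname] = self.origins[hostname].content
        return self.cache[hostname]

cdn = CDN({"shop.example": Origin("storefront"),
           "news.example": Origin("headlines")})

print(cdn.fetch("shop.example"))   # served via the cache

cdn.up = False                     # a Fastly-style outage, or an attack
try:
    cdn.fetch("news.example")
except ConnectionError as e:
    print(e)                       # every site behind the CDN fails at once
```

In the real internet the "cache" is thousands of edge servers run by one operator, but the structural point survives: traffic for many independent sites converges on one company's infrastructure, so one failure cascades everywhere at once.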

Jeremiah Roe: [00:30:07] So I've been in the security industry for, you know, a while now, and obviously have had the pleasure of working with CDNs on multiple things, especially with conducting sort of penetration tests and red team operations, um, in a past life. And I've never actually thought of it this way. In my personal opinion, this is exceedingly interesting, and sort of these little explosions are kind of going off in my head from, you know, what an attacker could potentially do here.

Nick Merrill: [00:30:39] Yeah. I mean, they would fuck you up. Like, imagine a Stuxnet-level attack on Cloudflare. I mean, seriously, like, nobody's credit card would work, and you wouldn't be able to look up why.

From a policy perspective, you know, if you could, what would you recommend that Congress do in this kind of situation?

Nick Merrill: [00:31:39] I would nationalize Cloudflare. I would make it like a national, publicly run utility company. And I would say, listen, you know, there are advantages to using Cloudflare. We're going to run this with absolutely the highest possible security.

You know, we're treating this as important as, like, a nuclear facility. [00:32:00] And then, you know, I think that that's never going to happen in the U.S. I think maybe if I recommended that to Canada, you know, maybe there would actually be some more interest.

We're kind of on the [00:35:00] topic of how we could do this better with CDNs, but I think this also applies to kind of general areas of cybersecurity: being prepared for ransomware attacks, and cybersecurity preparedness in general. How do you think we can get the government and also the private sector to kind of work together on this, in everyone's best interest?

Nick Merrill: [00:35:46] At the Center for Long-Term Cybersecurity, we think about the future a lot. It's a huge part of what we do. [00:36:00] And there are better and worse ways to think about the future.

A lot of the time, when we think about the future, people tend toward the utopia and the dystopia, and most of the futures that you actually observe are really mundane. And so we try to imagine these mundane futures, where, you know, some change has unexpected effects. People use it in unexpected ways.

And so, as far as, you know, getting industry and government to work together, we've certainly seen the government try to do things like GDPR. And then what our tools allow us to do is say, okay, how are different stakeholders actually going to use or apply GDPR? Unfortunately, there's no good prescription here for saying, oh, well, here's the formula, you know, here's how you get companies and industry to work together. And, you know, governments turn over every two to four years, and corporate culture changes rapidly.

And under all of that, the technological landscape is shifting under our feet. What we at the Center for Long-Term Cybersecurity are trying to do is take in these proposals, think about how [00:38:00] they would actually play out, you know, over 5, 10, 15 years or whatever, and say, here are some scenarios. And do you really want to do this now?

And of course we try to make recommendations as well, and flag things. And again, you know, my colleague Jess Cussins, she goes to Sacramento to talk to lawmakers about machine learning.

Jeremiah Roe: [00:38:21] And it's a continuous argument there too, right? Because of that changeover every two to four years from the lawmakers' perspective. So you just have to continuously make that argument over and over and over.

Nick Merrill: [00:38:32] And also, you have to give kind of these metaphors for helping policymakers understand. So one of the things we do is that we run an arts contest. We fund artists, like studio artists, to make cybersecurity art. And when I explain why this is important, I want to remind you.

I'm sure we all remember when Ted Stevens said that the internet was a series of tubes. Obviously he got a lot of, let's say, feedback on that comment. But a question that, you know, I might ask myself is: why tubes? Why did he pick tubes?

And when you look at Wired magazine, around 2005, 2006 or so, when they talk about the internet, it's like a bunch of tubes. It's imagery of tubes. There are ones and zeros; you're inside of a tube.

Um, there's, like, a city that's connected by see-through tubes. It's tubes. And the idea here is that imagery really, really affects the way people think about things. Art matters. Art provides us, like, a reference point, or an anchor, and the reference points and anchors we have around cybersecurity are really bad. Hackers? It's like, there's a hacker in a hoodie. He sits in, like, a dark room. He's typing on, like, a Windows laptop, right? And what does that [00:40:00] represent? Like, what does that tell you about the way the world works, and what are we not looking at? We don't need to Hollywood-ify this as much as we've been doing, you know?

Bella DeShantz: [00:41:33] What is the problem with the general public having this incorrect, you know, visual [00:42:00] representation of this?

Nick Merrill: [00:42:02] People make decisions about cybersecurity every second of every day. That's, for me, like, a fundamental perspective that I have. And, you know, mostly it's implicit. They open their phones, they log into their Google account from their iPhone. Um, they don't want to turn on two-factor, 'cause that shit's annoying, you know.

Jeremiah Roe: [00:42:23] Okay. 

Nick Merrill: [00:42:23] Or, you know, they check their email account on their phone, and they haven't enabled two-factor on that email account, right? These are all just really mundane ways. And if you go through your day and think about, oh my God, I just implicitly made a decision about cybersecurity, you'll see what I mean pretty quickly.

And people make bad decisions a lot of the time. I think that security professionals basically believe that. Um, I also think designers make bad decisions and provide bad defaults. I think that imagery and stories are a wonderful way of helping people, and we want to kind of get those stories out there in the least harmful way possible. So art is something that people interact with voluntarily. You know, for most people, I think a lot of their meaningful favorite things in life are pieces of art: movies, TV shows, music, something else.

And we try to hook into that. 

We talked a little bit a while ago, um, specifically when you were talking about, uh, ransomware, about this idea of these attacks and these issues, like, freaking people out.

And we talked a little bit about that, you know, public trust issue. How could we potentially change our relationship between, like, cybersecurity and the general public to improve public trust, or change this lens?

Nick Merrill: [00:45:46] When I try to prioritize this stuff, it's like, what keeps me up at night, for sure.

Right. I talked about the cache thing; that keeps me up at night, for real. Another thing that really keeps me up at night is this idea of machine learning bias, which broadly is the idea: you get this, you know, machine learning AI algorithm, and, uh, you know, a typical formulation would be that it learns from some past data.

Let's say it learns about, uh, you know, hiring decisions that hiring managers have made. And then it learns, uh, bias from that data. So, you know, if they're biased against women, you get an algorithm that's biased against women. And there are more subtle formulations here, where, you know, there's just some absence in the data.

There aren't enough faces of Black people, and so it is bad, you know, at identifying or categorizing Black people, or does so in a harmful way. Um, and there are even more subtle formulations than that. But broadly defined, you know, these issues around bias or fairness, uh, you know, I just call them machine learning failures.

These, for real, keep me up at night.

And the reason they keep me up is that it is nobody's job to find them. Nobody is being trained to go and identify machine learning failures. You've got a lot of people, and I work at Berkeley, I've seen a lot of PhD students, let alone undergrads, go all the way through their education, and they go work at a big company, you know, a big tech company that you know.

And they work on machine learning algorithms there, and they know about fairness. They know about bias. But if you put an algorithm in front of them and you say, hey, is this biased against Black people? They would not know how to answer that question. That is the scariest thing that you could possibly know about the world.

If you have gotten yourself into the kind of same mindset I have: machine learning is really going to be everywhere. People are well attuned to the kind of Terminator threat model, if you want to call it [00:48:00] that, and they're not well attuned to the racist-algorithm threat model. Or if they are, they're not putting nearly as much work into it as they should be.
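The hiring example Nick gives can be made concrete with a tiny, deliberately crude sketch. The data, the groups, and the "model" below are all hypothetical; a real system would be far more complex, but the failure mode is the same: a model fit to biased historical decisions simply reproduces that bias.

```python
# Toy illustration of learned hiring bias (hypothetical data, not a real system):
# the "model" learns each group's historical hire rate and copies it forward.

from collections import defaultdict

# Past decisions as (group, hired?) pairs. The historical process favored "A".
history = [("A", True), ("A", True), ("A", True), ("A", False),
           ("B", False), ("B", False), ("B", False), ("B", True)]

def train(decisions):
    hires, totals = defaultdict(int), defaultdict(int)
    for group, hired in decisions:
        totals[group] += 1
        hires[group] += hired
    # "Model": recommend hiring iff the group's past hire rate beats 50%.
    rates = {g: hires[g] / totals[g] for g in totals}
    return lambda group: rates[group] > 0.5

model = train(history)
print(model("A"))  # True:  the bias in the data becomes bias in the model
print(model("B"))  # False: equally qualified candidates, different outcome
```

The model never sees the word "bias"; it just faithfully fits the data, which is exactly why, as Nick says, someone has to be trained to go looking for these failures. They do not announce themselves.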

Bella DeShantz: [00:49:04] I just watched a movie on Netflix not too long ago, called Coded Bias, that, like, went into this issue with machine learning. And I remember watching it and being so excited about the experts talking about this issue that was highlighted in the movie. They were almost all women.

Nick Merrill: [00:49:24] Yeah, I love that one.

Bella DeShantz: [00:49:25] I loved that movie. 

Nick Merrill: [00:49:26] Yeah. 

Bella DeShantz: [00:49:27] But it was, like, all these women experts talking about this issue. And I was like, yes, look at these amazing women in the field, and a lot of women of color. And I was so jazzed. And then there was a point partway through the movie where I realized, and I'm making an assumption here, but I realized that maybe that's because those are the people that noticed the issue, because it affected them.

Nick Merrill: [00:49:57] That's [00:50:00] Joy Buolamwini. And, you know, I know a lot of people who were interviewed in that. There's a community called FAT ML. It's, like, fairness, accountability, and transparency in machine learning. They run this conference and stuff. And it's this field that's just, like, dominated by women, and women of color in particular.

Cybersecurity is headed in this direction of FAT ML. I mean, to my mind, the future of cybersecurity broadly is the future of machine learning. And the only tools we have for thinking about that right now are fairness, accountability, transparency, the stuff that's going on in the FAT ML community.

What I would love to see is to have younger students of color, you know, at the high school level or even below, say: look at this, look at how amazing this stuff is.

Look at what's happening. And look, you know, these are people who look like you who are doing this work. And trying to make that into a funnel, where, you know, I think a lot of students who come from disadvantaged backgrounds, who are disproportionately students of color, don't see security as something that matters to them.

It protects rich people and their money. And historically, that has been what cybersecurity has done. But in the future, cybersecurity is going to protect, you know, people who are at the margins, technically and in society. And I think, at least at CLTC, we all want to see cybersecurity lean into that identity and lean into these battles and struggles that are just over the horizon, as a means and an end. The means, of course, is increasing diversity in the cybersecurity workforce, and the end is protecting against this threat, which, again, really seriously keeps me up.

Jeremiah Roe: [00:52:41] I would like to shift gears for a moment to your dissertation on mind reading and telepathy. I was wondering if you could speak a little bit about that. Maybe, what are three things that we should know about mind reading and telepathy?

Nick Merrill: [00:53:40] First, I so regret titling my dissertation that, because all I get are emails from people who think they're telepathic with their dog and want me to prove it for them. That's thing number one. Okay. Thing number two is that what I was interested in here was this idea people have that computers can read them.

And people believe this, right? People believe that computers can read their minds. The question became: okay, how and why? And the answer is that people think computers can know things about them that they cannot know. Like, if you put a brain-scanning device on someone, you can't actually know what they're thinking. But they undervalue the risks that really matter.

So, you know, if you have a GPS on someone, you can tell if they're depressed. You can tell if they suffer from clinical depression, with pretty high accuracy, with just GPS trace data. People don't think about that, right? So this is highlighting this mismatch between what people believe and what can actually be done, and how it can actually be done.

There's this big, big security issue. It's this big security problem. And then the third thing is, basically, designers and engineers are this weird group of people who are really steeped in science fiction. And they absolutely believe that computers can read your mind, and also that that's a good thing, and they want to do it.

And that was the other thing I was studying in that dissertation: how there are kind of these two separate issues going on, two separate streams of dialogue, and trying to make sense of them.

Jeremiah Roe: [00:54:59] And then one last question before we, you know, jump: what is one thing we wouldn't know about you, or be able to tell about you, from your LinkedIn?

Nick Merrill: [00:55:18] Wow, what a good question. Probably the number one thing missing from my LinkedIn profile is that I'm a big enthusiast of a drink called kava, K-A-V-A, that's traditionally from the South Pacific. It doesn't feature on my LinkedIn profile, but if you get to know me at all outside of a professional context, it's probably the first thing you learn.

Jeremiah Roe: [00:55:40] Uh, listeners wanted to get in touch or wanted to read some of your material or wanted to kind of follow you. How would they go about doing that? 

Nick Merrill: [00:55:54] Follow me on Substack: nickmerrill.substack.com.

Jeremiah Roe: [00:55:58] Perfect. Awesome. It's certainly been a pleasure.

Bella DeShantz: [00:56:08] Awesome. Thank you so much. 

Nick Merrill: [00:56:09] I love talking, and you let me do that. So this was a lot of fun.