Earlier this year, the Electronic Frontier Foundation named Matt Mitchell, founder of CryptoHarlem, one of its 2021 Pioneer Award winners for his groundbreaking work to protect Black communities from surveillance. In this episode, Matt talks about what led him to apply his hacking skills to social justice causes and how that led to his role today as a Technology Fellow for the BUILD program at the Ford Foundation. Matt also discusses what Twitch can do to safeguard creators and the steps anyone can take to better protect themselves online.
--------
Why you should listen:
* Hear from a hacker working on the frontlines of today’s most important racial justice issues.
* Better understand the state of digital surveillance in Black communities.
* Hear about what steps platforms such as Twitch can take to better protect creators.
* Learn the three things everyone online should do to better protect themselves on the internet.
* Discover where “Mr. Robot” placed an elusive CryptoHarlem Easter egg.
--------
Key Quotes:
* “It's really about taking the skill that we have and applying it toward something bigger than yourself.”
* “Under the lens of a surveyor, who’s always looking for wrongs, you’ll find what you’re looking for all the time.”
* “We sometimes confuse public safety with surveillance.”
* “I'm pretty realistic. If you look at the number of cyberattacks that came from sticky notes on personal computers, it’s zero. But don’t put a sticky note on the nuclear codes.”
--------
Related Links:
* https://www.cryptoharlem.com/
* https://www.fordfoundation.org/
[00:00:00] Jeremiah Roe: Matt, thank you so much for joining us. Welcome to the show. And, uh, this is my super awesome cohost, uh, Bella DeShantz-Cook.
[00:00:13] Bella DeShantz: Hey, Jeremiah. Thank you for that very kind introduction. It's really great to see both of you today. Um, I'm really excited to get to know you a little bit more, uh, I think we can just jump right in if that's cool with you. Can you tell me exactly how a hacker ends up working at the Ford Foundation?
[00:00:35] Matt Mitchell: They didn't know. So now my job's over, so thanks a lot. You heard it here first. Sorry for it. Yeah, no, no. Um, in all seriousness, well, the Ford Foundation, for people who are listening who don't know, they're an institutional philanthropy. They've been around for a very long time. Started by that homie who invented the car, Henry Ford, and his son Edsel.
So it was a long time ago. [00:01:00] Um, and, um, what they do today is they support the fight against inequality. That's all they do. That's their mission statement, and it's pretty simple. And how do they do that? They do that by bringing people together, um, allowing different minds and different movements and different things to strategize.
Um, they do that through financial support, uh, 500 million US dollars a year, uh, to nonprofits around the world. And, um, you know, they do that through, you know, the different things that they create, research, et cetera, for the field, for all people. And philanthropy is kind of like the, like, social justice, positive nonprofit, but like working at a bank at the same time.
Um, and they have their own concerns when it comes to cybersecurity, but they have an amazing team there that keeps them safe. But what about the grantee partners? What about the people we support out in the field? And there's a lot of reasons why that becomes really difficult, [00:02:00] mostly because they're so different and you have no influence, power, or control over them, nor should you.
So what I do as a tech fellow is I bring tech, you know, just drop that bouillon cube, that is the flavor of tech, into this old kind of industry, uh, or older industry, I should say. And, um, there's a lot of tech fellows doing different things, but as a hacker and someone who's an expert on cybersecurity, I focus on how do we keep our grantee partners safe?
What are tools and resources we can build and create for them and the world? And so, that's my day job. It's pretty cool.
[00:02:32] Jeremiah Roe: I was kind of perusing through, um, through your website, uh, which is cryptoharlem.com, by the way. And as I was going through there, I noticed a little snippet from, uh, Mr. Robot.
[00:02:55] Matt Mitchell: Yeah. Yeah, it's true.
[00:02:57] Jeremiah Roe: That is [00:03:00] sick. That is so cool. I never even picked that up before. And I've seen like every one of those episodes, probably 15 times, and that's a little Easter egg that I just never picked up. So for those of you listening: season three, episode one. Watch for it.
[00:03:16] Matt Mitchell: Yeah, I mean, I didn't even see it. You know what I'm saying? Like, um, I'm lucky. Like, I'm pretty humble, but I'm lucky to have been around for a while and met some really interesting people and worked on some really serious things, you know? And, um, yeah, like, they had called me, cause they film in New York, you know, I'm based in New York City, you know, CryptoHarlem.
And they were like, hey man, you got to come through, we're filming, you know? And I was like, oh, that'd be so dope. I said, but, um, I'm not in the country. So they're like, yeah, well, we can't hold production back, so you can't be in the episode. I was like, wait, wait, wait, I'll come back. You know? So, uh, as soon as I got back, I called them and I was like, yo, let's go.
Do you have any more [00:04:00] episodes? They're like, oh yeah, we're filming still, but we film in LA now. So they did like a west coast, east coast thing, and New York production had wrapped. So they were like, we'll figure it out. We'll figure it out. We'll find a way to, you know, pay you a little homage, a little shine.
Uh, so then I watched the first episode like everyone else does, when it, you know, when it premieres, and, uh, they wrote me like, yo, did you see it? Did you get it? And I was like, huh? And I already know that they drop all these little things in there. I didn't even see it. And, uh, I watched it frame by frame.
I was like, what are they talking about? And then I caught it.
[00:04:37] Jeremiah Roe: That was so smooth. Yeah, it was really great. I loved it. And looking at that photo, I'm like, oh my gosh. So I'm actually going to go back, probably tonight, and watch it again so I can pick that out myself. Speaking of cybersecurity, though, I'm really interested to hear about how you got into cybersecurity and then sort of working at places like, you know, the New York Times and CNN.
I mean, to me [00:05:00] that's a very sort of interesting career path. How did all that come about?
[00:05:04] Matt Mitchell: I was really into how things work and what's inside of things. Right. And I used to take my toys apart and all that kind of thing.
And you know that book, like, How Things Work? It's like the little illustrations for kids. Like, that's me, you know what I'm saying? Like, we had one computer for my whole school, and it was a really old computer, uh, back in the days, like in the eighties, you know what I'm saying? Cause I'm older.
Now everyone's got like 3000 computers in their pocket, but back then it was one computer. I didn't understand how it worked. I became obsessed with it. I asked my folks, yo, get me a computer. [00:06:00] They were like, we're immigrants. We can't afford a computer, but we can get you a magazine subscription about computers. So I'm like, I'm watching other people enjoy their computers.
Right. I'm seeing code that I can't enter into my invisible computer, but it sparked something in me. And I would do anything, you know? Like, I would beg, borrow anything to, like, get on a computer keyboard. I even drew a little cardboard computer keyboard type thing and typed for fun on it. You know, like I was Rocky in a montage, training with Apollo, you know what I'm saying?
So eventually, I didn't realize it, but I just developed these skills. And that led to me thinking, okay, I want to make something, you know. First I didn't have any computer, so I had to get on a computer somehow. Then eventually I built my own computer. I didn't have any software. How do you get software? So, you know, people I knew were like, oh, I'll tell you where, you [00:07:00] know, I know there's ways and warez to get it, you know what I'm saying?
So I just developed actual computer hacking skills. 2600 is based in Long Island.
That's where I grew up. Um, you know, when I was a kid. And so I developed actual computer hacking skills, and, um, I just needed to find a way to use them. I used my computer skills to help activists and nonprofits back then as a teenager. And I got my [00:08:00] first job a little bit later. Um, and my first job was working at a big corporation. And they were like, yeah, you're just going to be rolling out machines on desktops, whatever.
I was like, oh, that's easy. But then, you know, a little bit later they had a meeting and they were like, now we're gonna tell you what your real job is. And I'm like, my real job? Oh, this is weird. Right. And then they were like, your real job is basically corporate surveillance of all employees and staff, and reporting on people. Which is a thing.
Every company, if you work for one, has someone protecting the company's interests, and, you know, insider threats are a thing. Sometimes they overinvest in that team until they become the problem. Right. They become the danger, and I was working for that squad. So, you know, with my background, I was like, no, this is bad. I've got to tell people about this. But nobody back then, it's like, it would be very strange if someone talked to you about all this stuff.
So, um, it just kicked me off on this thing where I must use these skills to raise awareness about how bad things are. Um, you know, like, my parents were involved in, like, conflict and war and all that other stuff. And we know the harms that come, you know, when good intentions go bad.
And that's what this is an example of. So I just committed my whole life mission to pushing back against all that stuff.
[00:09:43] Bella DeShantz: Uh, I've heard a little bit about, uh, a few years ago you were really active in, um, hosting crypto parties. Uh, [00:10:00] and you did a bunch of those in Harlem, and I'm wondering if you can tell us a little bit about, uh, what are those, how do they work, who do they benefit, uh, and if they're still going on.
[00:10:08] Matt Mitchell: Yeah, yeah, yeah, of course. So, you know, what is a crypto party? Props to Asher Wolf, who, without her, there'd be no crypto parties, you know. So, you know, she's this woman from Australia who was just like, hey, I'm reading about this, uh, Snowden thing, and I'm concerned about invasive listening and electronics.
And I want to know more. Like, I don't understand. Like, what can I do to protect myself? And of course the internet was being the horrible internet. Boom: shut up, hey newb, you don't know what you're talking about, whatever. And, um, that's like the obvious wrong answer, from mostly people who have no idea what they're talking about, honestly, but that's beside the point. So the idea of crypto parties was, like, you know, hey, what if I just got other people who are curious like me, you know, like Asher, and just sat down in a living room and talked about stuff we [00:11:00] learned? We looked it up together, and that's how the whole crypto party thing was born, for real, for real.
And, um, they're all around the world.
[00:11:59] Bella DeShantz: I watched a video, um, about the crypto parties. It was really interesting. And there was a moment in the video where you talked about how, uh, in Harlem, at least, to get folks interested in the meetings, you were, like, handing out flyers on the streets and going to leaders within the community in person and spreading the word, rather than spreading the word on Twitter and Facebook and other sites where I often see events talked about.
Um, and I'm wondering if you could explain the importance of that approach.
[00:12:30] Matt Mitchell: There are crypto parties for all different types of people, but CryptoHarlem is for Black folks. And if you're part of a marginalized community, you have the dominant culture that you're a part of, you know. Like, it's kind of like, you maybe speak a different language, their native tongue, at home, but then perfect school English or whatever.
Right? So you have the dominant culture that you live in and must survive in, and then you have this other culture. And for Black folks [00:13:00] and for all marginalized communities, that exists and it's not online. There's no Google for that. And if you want to reach the real people, it's all through networks of trust. It's all through these different institutions that mean something in that culture. And that's why there's the crypto party that I would do in Harlem.
Um, it was three hours long. And to go back and circle back to that question, yes, we're still doing it, but now it's online. That's why I literally just got off one, like, an hour ago before this. Um, now it's on Fridays at one o'clock on Twitch instead of at a community center at Malcolm X and Martin Luther King Boulevards, where they intersect, which is really cool, but it wasn't like by planning or anything.
Um, and, you know, when you have a crypto party and you're reaching your community, you need to go to the barber shops and the beauty salons and the churches and the mosques and the, you know, in some communities it's the soccer fields where the parents are, and whatever, you know. [00:14:00] And, um, we were really serious about that mission.
So the crypto party was three hours long, and I spent three hours on the streets, just walking around, just talking to people. Um, you know, just talking, just having a conversation. Like, hey, do you have a phone? Do you ever have thoughts or questions about how it works? And do you have any idea what negative things you might be bringing in by bringing this technology into your home, into your life, et cetera?
And did you know that there's a place you can go to just get definitive answers and be with like-minded people, et cetera?
[00:15:32] Jeremiah Roe: Yeah, no, I think that's great. I love that. I think that's hugely impactful, and more so, to your point, like, look, if you want to get out there and make change, you've got to make change.
And so I just love that. But I'd like to circle back around to this. A lot of your work revolves around the intersection between sort of technology and social [00:17:00] justice, right? And so I'd really like to talk about the role of hackers, or people working in the cybersecurity space.
And to that point, like, look, we have a special skill set that many others just do not have, frankly. And so with that, how can we apply it to obtaining justice and ensuring it's for everyone?
[00:17:24] Matt Mitchell: We do have a special skill set. It's kind of like we're time-traveling wizards, and you go back to a day where everyone leaves their things unlocked and their doors are unlocked and, you know, very simple times. Right. Um, but you know, there's evil wizards, evil time travelers, about to hit that village or that town or whatever.
Right. But then you're like, I get paid nine to five to make stars show up here, so that's what I do, wave my fingers, you know? So that just doesn't seem right. That doesn't really make sense, but that's what we're [00:18:00] doing. And, um, you know, I'm a big comic book nerd. You know, my pops always said, like, I don't care what you read, as long as you read. You have to read every day, like, insane pages. Right. So I would read like Marvel Comics and DC and all that good stuff. And, you know, Stan Lee, you know, like, with great power comes great responsibility and all that other stuff.
[00:18:22] Jeremiah Roe: That's true.
[00:18:23] Matt Mitchell: It's really true. Yeah, it's really true. And you can destroy the planet with the things that we know, with the skills that we have.
It doesn't take that much. Like, um, you know, industrial control, water filtration systems, nuclear reactors. Um, even someone's private thoughts and feelings that could be exposed and ruin their career. Right. Um, all of those things, our weird hobbies, our strange interests, it just happens to be the glue that maintains all things.
Right. Um, and not to go [00:19:00] into the George Lucas, the Force, you know, but it's similar. So you must ask yourself, like, yeah, chase the bag, you know, definitely secure that, get that money. But we're at this point, you got to ask yourself, how many hours do I spend working for that money?
And then how many of those hours in my year do I spend protecting what I care about? I mean, it could be a conversation with a family member or a friend about a password manager. It could be, you know, again, whatever community you care about, [00:20:00] whatever. You know, the great thing about social justice is it takes so many forms and looks so different.
So if you're a person of faith, you're like, oh, what about my religious community? Right? If you're, uh, you know, someone who's just a fan of a certain thing, it's like, oh, I like that baseball team. Well, what's their cybersecurity look like? You know? So it's really about taking the skill that we have and applying it in any way toward something bigger than yourself.
And it's no disrespect. Like, yeah, you deserve to make money to put shirts on your back and food in your stomach or whatever. Right. But we're at a point where it's bigger than that. And we're all going to be sitting around a fireplace, wondering what happened, where there's no computers, and it's like the Mad Max post-apocalyptic future.
Uh, and we'll say, oh, I should have done something, but it's too late. So, you know, now's the time. And it's actually not that hard. Like, there are people who are just like, look, I'm looking for anyone to work at my hospital. We have an issue with cybersecurity. [00:21:00] We want to get ahead of ransomware. It's like, it doesn't mean you got to quit your gig.
Maybe you just say, hey, look, I don't want the job, but I could talk to you for like an hour a month. You know, something like that. Or, hey, like, you know, there's a shelter for, um, refugees or survivors of domestic abuse, and we have questions. It's like, you know what? Just take it upon yourself to ask:
would you take an unsolicited one-hour professional opinion?
[00:21:27] Jeremiah Roe: That's great. I personally have never thought of that. I mean, I've done those things, uh, for some of my local community things that I attend. Like, I'll notice certain things where I'm like, oh, that's really bad, I've got to say something, and I go up and I say something. But I've never thought of, like, hey, in addition to me saying something, let me say: would you be interested in just sort of a free, like, session, just so I could sit down and help educate your staff, yourself, and those who are maybe providing technical [00:22:00] services for your organization?
I'd never thought of that.
[00:22:03] Bella DeShantz: I think there's also, like, this element of, like, something that I've done in the last few years is just sort of, uh, trying to make cybersecurity and personal online security part of normal, like, typical conversations that I have with my family and friends. Because something that I've realized is that so many of my family and friends, and, like, just people in my community in general, they're afraid of their interactions online, or they see things that are scary and they don't know what to do or who to ask.
And sometimes they're afraid to ask me because they're like, oh, I have this really stupid question about how the internet works. Um, and I think kind of, you know, helping where we can, but also creating these communities of, like, hey, I know a little bit about this, let's talk about it. This is as normal as talking about our jobs, our hobbies, our interests.
[00:22:52] Jeremiah Roe: An open dialogue.
[00:22:53] Bella DeShantz: It helps.
[00:22:55] Matt Mitchell: And also, yeah, I mean, there's nothing special about us, [00:23:00] you know? Like, we have hubris and conceit and we're really into this skill set, but it's very teachable. And there was a day before you knew the most awesome thing you know, and if you keep going back, you were once a beginner too. So I think we forget that. Um, you have to hold on to that a little bit if you are going to talk to someone who doesn't come from this world, or you will freak them out. Because we've seen the headlines, right? More frequently, we're seeing all these things about hacks and all, but we understand those headlines. Imagine you didn't know anything about this invisible boogie monster of bacteria living on your hand, let's say, right?
And it makes you suddenly sick. And, you know, you're from a culture that never had Western medicine, and someone from Doctors Without Borders or something is talking to you. And they're like, yeah, there's these invisible monsters on your hand that made everyone sick, and all you got to do is dip your hands in this normal water but with this stuff called soap. You'd be like, what is [00:24:00] going on? I can't handle this. So it's also about, um, being able to break it down and be patient and understanding. Right. But I love that you do that.
[00:24:09] Bella DeShantz: Yeah. Thank you. I really liked that you brought that up. It's that culture of, like, uh, openness and learning and group learning. Because, like you said, as soon as you start coming at a problem with, like, I know all this stuff, and, oh my goodness, y'all don't know anything, people are not going to be in a position to learn. Um, that's interesting to think about.
[00:24:32] Matt Mitchell: Yeah. You know, your doctor doesn't, like, start with incurable diseases. No, they talk about, hey, let's talk about health and things you can do to have a better life, step by step, right?
[00:24:43] Jeremiah Roe: So, to that point of helping people better understand: what, in your opinion, Matt, are three actions that people can take to protect themselves and their privacy online?
[00:24:54] Matt Mitchell: Uh, one of them is, I used to be the director of digital safety and privacy at this [00:25:00] NGO called Tactical Tech. And they made this thing called datadetox.org. So you go to d-a-t-a-d-e-t-o-x dot org, and it's just simple. And I'll, uh, you know, I'll tell you my personal three, though, but, um, one of them would just be, like, if you really want to double down, just hit that website.
But, um, I think the first thing you can do is just use a password manager, right? Everything in your life needs to have a unique password. If you look at what the average person does, watching streaming video or, you know, interfacing with email or social media, those things have to have completely unrelated passwords. Because hackers, when we get leaks, you know, we get breaches and we see these passwords, we're human beings.
You know, your password is a sports team, 1, 2, 3, uh, another sports team, 4, 5, 6. We get the pattern. We got this, right? So it's better to have a computer make your password up, and not just be a locker for your passwords, but generate them. [00:26:00] And there's free ones out there too, so there's no reason not to have one.
Like, they're all basically the same. Um, there's some minor differences by company on, you know, who funds them and what their motivation is or whatever. And some of them offer a lot of free services to different communities, right? Like, whether you're a journalist or you work at a nonprofit or you're involved in anything pro-democracy, you know, there's ways you can get discounts.
Sometimes you've got to email sales, if you can't find it by search engine, or just be real and be like, look, you know, $8 a month is too much for me, how could we get this down? Most of them, I've helped people do this, they actually write back and hook them up with a code or something.
It's real. So, password manager. If I only have one thing I could ever tell someone, it's that, because the breach world is real. I mean, I just wade through those bins, and so many pastes and so many, um, password leaks. There's nothing you can do to stop a password leak. It's going to happen. So you might as well make every password unique.
[00:26:58] Jeremiah Roe: Not the sticky note on your [00:27:00] computer?
[00:27:00] Matt Mitchell: Well, the sticky note on your computer's got a lot going for it, actually. You know, like, that sticky note, I can't see it if I'm a hacker. It's not electronic. And if you have a book in your house with just the names of the sites, the passwords, and the usernames, and they have nothing to do with each other, you've won. That's a password manager.
I'm cool with that. Like, you know, I'm pretty realistic. If you look at the number of cyberattacks that came from sticky notes on personal computers: zero. Maybe if it's work and you've got, you know, you got a spy at work or something, industrial espionage, different story. Don't put a sticky note on the nuclear codes.
[00:27:33] Jeremiah Roe: Hashtag insider threat.
[00:27:35] Matt Mitchell: Exactly, exactly.
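Matt's first tip, in code: a password manager doesn't store a password you invented, it invents one for you from a cryptographically secure random source. This is only an illustrative sketch of that idea (the `generate_password` name and the 20-character default are made up for the example, not any product's actual code):

```python
import secrets
import string

def generate_password(length: int = 20) -> str:
    """Make a random password the way a password manager's generator would."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    # secrets (not random) draws from the OS's secure entropy source
    return "".join(secrets.choice(alphabet) for _ in range(length))

# One unique password per account, so a leak at one site never exposes another
for account in ["email", "streaming", "social media"]:
    print(account, "->", generate_password())
```

The point of the sketch is Matt's "we get the pattern" warning: a machine-generated string has no sports team and no `123` for a hacker to extrapolate from one breach to the next.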
But, uh, yeah. Yeah. So I would say that's number one. Number two, outside of the password manager, if I was just telling someone what to do, it would again be locking down that account further with two-factor, you know, the second factor. Um, I would just say, if you don't know what it is, just learn about it.
There's a website called 2fa.directory. Instead of dot com, it's dot directory. It used to be called something else, but they lost the domain. So it's 2fa dot directory. And it's just a list of, hey, banks, every bank that has any kind of two-factor. And if one doesn't, you can advocate for it by clicking on a button.
Right. So, you know, it just makes life really easy. So it's the number two, the letter F, the letter A, dot directory. Boom, you go in, you type Facebook, and you just click on the docs. That's the documentation on two-factor that Facebook writes. Because every website has this, because they know it's the number one way accounts get taken over on their site.
So they have to create whole documentation on it, and they have to create steps to do it, and they have to spend tons of money and engineering time on making it. But they don't make it part of onboarding, because they're taught that the more steps, the more people drop off. So they don't make it part of when you make your account.
So you gotta go in, you gotta type in your thing, you gotta read and set it up. And this is kind of like the search engine for all of them. So I would say 2FA, um, [00:29:00] like, get a second factor. If you're not doing it at all, get a text code. If you're getting a text code, don't stop there, switch to an app, some kind of third-party app.
If you've already got third-party apps, mess around with little hardware tokens and keys and stuff. There's levels to this game, and there's always a next one for you. You know what I'm saying? It's like a CrossFit gym.
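For the curious: the "app" second factor Matt recommends is usually TOTP (RFC 6238). The site and your authenticator app share a secret, and each independently derives a short code from that secret plus the current 30-second time window. A rough standard-library sketch of that derivation (the `totp` function is written for illustration, and the secret below is a throwaway demo value, not a real account's):

```python
import base64
import hmac
import struct
import time

def totp(secret_b32: str, interval: int = 30, digits: int = 6) -> str:
    """Compute an RFC 6238 time-based one-time code, like an authenticator app."""
    key = base64.b32decode(secret_b32, casefold=True)
    counter = int(time.time()) // interval            # which 30-second window we're in
    msg = struct.pack(">Q", counter)                  # 8-byte big-endian counter
    digest = hmac.new(key, msg, "sha1").digest()      # HMAC-SHA1 per the RFC
    offset = digest[-1] & 0x0F                        # dynamic truncation (RFC 4226)
    code = struct.unpack(">I", digest[offset:offset + 4])[0] & 0x7FFFFFFF
    return str(code % 10 ** digits).zfill(digits)

print(totp("JBSWY3DPEHPK3PXP"))  # demo secret; prints the current six-digit code
```

Because the code depends on the current time window, a stolen password alone is useless a minute later, which is why sites push this so hard even though, as Matt notes, they rarely put it in onboarding.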
[00:29:18] Jeremiah Roe: So, privacy by design, kind of like your shirt there.
[00:29:21] Matt Mitchell: Yeah. Yeah. You know, like, props to Calyx. You can't hear the shirt, but, um, yeah. I mean, it's a non-profit that does a lot of great work, uh, privacy by design, and started by the first person who owned an ISP that ever pushed back against a national security letter. Everyone else just rolls over and gives up your information.
But, um, you know, Nick from Calyx was the first person to say, hey, this doesn't sound right, I'm going to talk to the FBI about this, and turned it all around, which is dope. Um, and you said three. So let me think of what would be the third thing to secure. I think the third thing would be to Google yourself in private browsing mode or incognito mode or [00:30:00] whatever the mode is on your browser.
That removes all the information that you give it, um, that it already knows about you, and it just gives you the default search results. And see if there's anything there, like your home phone number or home address, that you just don't feel comfortable sharing. There are steps you can take to remove your personally identifiable information from the internet.
First of all, any site that lists you has a way to opt out, even if they don't advertise it, right? Otherwise they wouldn't exist. But if it was too easy, that site would have a database of zero entries. You know what I'm saying? So they gotta make that money. So you could just directly ask the site: how do I opt out of this?
Right. Um, and it's usually a random, convoluted, but not that hard process. Like, sometimes it's call this phone number, press one for this, press two for that. For other ones, it's mail us your, uh, your license, you know, which you should probably redact some things from, and then we'll prove that it's you, and then we'll remove you.
They all have some kind of way. Um, if you don't [00:31:00] want to do all those steps, there is, um, a blog post, uh, and a GitHub entry called, uh, excuse my French, but it's the big-ass list of, um, data brokers, um, by my friend, who's a reporter and writer at Consumer Reports now. But, you know, she wrote this, and it's just all the things that you need to manually go through to remove yourself.
If you did one a day, you'd be doing it for years, though.
[00:31:47] Bella DeShantz: A lot of the work that you have done, uh, has focused on surveillance, particularly surveillance in communities of color. [00:32:00] Um, and I know there are, unfortunately, a lot of folks who kind of don't understand the danger and, uh, think, like, well, you know, you wouldn't have a problem with surveillance if you weren't doing anything wrong.
Uh, and I'm wondering if you can talk about, you know, that mindset, and also how surveillance can lead to real harm and create a culture of fear and intimidation for people.
[00:32:24] Matt Mitchell: Um, yeah, I mean, I'm not a superhero, you know what I'm saying? Like, I used to do this for the corporate side, and I'm Black, so I just do the same things. It's the intersection of my existence. Um, we sometimes confuse public safety with, um, surveillance, right? And there should be the public, that's us, me, you, everyone around us.
There should be some baseline of security for people and some norms there. Right. And how do we create those norms as a society? Well, we decide. There are some societies [00:33:00] where those norms are created through barbed wire and bullets, right? There are some societies where those norms are created by, like, look, that's just not how we are as a people.
Like, we must do this. Think about the kindness, think about the humanity of the other. So those are different ways to go at it. Surveillance is a military tool. It's not normal. It's in the barbed-wire-and-bullets camp of finding out if something is wrong. And communities that live on the margins, they're pushed out there, and they're pushed out there by this, like, centrifugal force.
And in every society, as I used to travel the world, you know, teaching in conflict zones and teaching NGOs and doing a lot of other work with my hacking skills to help people, I was like, oh, it's every group. Sometimes I can't even tell. I'm like, how do you hate this group? It's so hard for me to even see the difference between you and them. They don't [00:34:00] appear any different than you. But he's like, no, you can tell by their last name, or by this license plate we gave them, or something. I'm like, wait, that's so strange. So, um, but there's always that group, and I'm like, well, how do you know that they're up to no good? And it's like, well, they must be.
And I'm like, well, do you ever talk to them? And it's like, we don't need to talk to them. We watch them all the time, 24/7, and we're looking, and we're gonna find something wrong. So, um, I would say, with CryptoHarlem, I often say: looking at the Black community and the many levels of surveillance in the Black community, I would invite it to all communities. I would say, let's not take it away from the Black community. Let's just make it even, for all communities. And then, on that day, everyone would be able to have a very frank conversation about how this is not the way. Right. Um, but because it's normalized in certain communities, it's just the way it's always been. They're born into it. And beyond maybe one generation that resists a little bit, whoever grows up in something that's normalized, they don't know any different. You know, I've been [00:35:00] to countries where I'm like, does it bother you that this is the way it is here? And they're like, that's all I know. And you can tell me stories of it being another way, but those might as well be fairy tales.
Right. And so, um, I would just say, with this level of surveillance, anyone would say, no, this is too much, and it's not worth any kind of perceived public safety win.

[00:35:30] Jeremiah Roe: I can guarantee that if somebody watches me long enough, they're going to see me doing something that meets their definition of wrong.
[00:35:36] Matt Mitchell: That's a lot of people and a lot of wrongs, right? And my wrong might not be your wrong. But if we have a hundred of us looking, one of us will be like, yeah, that's weird, and that's unusual; it's not part of a pattern that I'm used to in my existence. Why is he, you know, rocking that awesome beard and looking so cool, or whatever, you know? So I think that's what it is: it's a high level of scrutiny. And under that [00:36:00] high level of scrutiny, weird things, or apparently weird things, show up. On Tuesdays at three, he's always doing this with his ear. You're like, am I? Yeah, I didn't realize I do that. And if you look at societies, like, you know, I know it's
[00:36:15] Jeremiah Roe: Have you been watching me?
[00:36:17] Matt Mitchell: You know, if you look at the Gestapo, if you look at any historical surveillance units and you go back (a lot of those are open records now), you look at what they were writing and what they were finding and what they were seeing, and what happened to those people. You're like, these were completely, you know, just everyday, regular things. But under the lens of a surveyor who's always looking for wrongs, you'll find what you're looking for, all the time. That's one thing I've learned from my former work. Right.
[00:36:45] Jeremiah Roe: So with the organizations that are getting it right, or, I guess, at least asking the right questions: what do they do to make sure that technology isn't used to further marginalize, or [00:37:00] unfairly categorize and harm, people of color or other groups?
[00:37:06] Matt Mitchell: I usually start a conversation on this with some pretty chill examples. For example, there are soap dispensers where, if you put brown hands in front of them, no soap comes out, because the sensor just wasn't trained to see that hand as a human hand. It just thinks, is this a shadow or something? And it doesn't dump the soap. Right? So let's talk about, say, that product. How did that happen? Well, somehow it made it all the way to market, got sold and installed, and no one who had brown hands ever tested it. So there's an issue there, where you have to ask: am I creating a product for all people or just some people, maybe even accidentally? So maybe I should have examples of all people on the consumer side, and on the quality assurance side, and on the testing side, and then on my team that's making this. You know, they're excited, we're fixing soap, you know; [00:38:00] someone had to make that, and they were really into it, whatever that industry is, I don't know. They have competitors and stuff. But if any of them had brown hands, they would have found this on day zero.
Right? So maybe your team just doesn't have the diversity that you think it has. Oftentimes, when I talk about diversity in staffing, people will say, well, we feel that there's a need, and we understand why it's the right thing to do. And I'm like, no, no, it's not just the right thing to do. It's actually business-aligned. It's going to be good for your business, for your products. And, you know, I always say underestimated. When I talk about these groups, I'll be like, look, obviously we have underrepresented groups in your industry, but they're also underestimated in your industry.
There are experts who you'd have to beg to work here, people who are experts at this, who are also diverse candidates. Right. But you have to actually think: well, where are diverse candidates? [00:39:00] You know, there was a very famous moment where, I think it was Wells Fargo or someone, but it could be any bank really, during an accidental hot-mic moment, this person was like, look, I believe in diversity (this was at a staff meeting), but it's so hard to find diverse candidates. It's so hard to find candidates that actually meet the level that we're looking for. And oftentimes it's not the candidate; it's where you're looking for them. Right? Like, I have a hard time finding, I don't know, flying animals in the ocean. Right? So you have to make sure where you're looking, and then you just look toward the sky, and they're all there. So it's not literally saturated, there's not an endless supply, but there's definitely more than you can hire.
There's definitely more than you could ever hire, and ever need, for any particular industry or company. Right. So I would say: am I doing it right? You have to ask yourself, how is this product hurting people? How could this product hurt people? Because I believe, and I love [00:40:00] technology, right, I believe all technology has a good and a bad side. They say it's like a double-edged sword, or
[00:40:04] Jeremiah Roe: I agree.
[00:40:05] Matt Mitchell: And, you know, like nuclear power. I always say it can run a city for pennies or turn the city to ashes. Right. So how is our thing able to be at its worst? If you hired one speculative sci-fi person on your team, they would Black Mirror that thing, and they would find it for you. But you'd have to ask them: how could this hurt people, and what are we doing to ensure that it doesn't encroach on civil liberties, on rights, on marginalized people, on different communities, on different passport holders or citizens, et cetera? And then, what are we doing to ensure it doesn't? Are we lessening some of those harms? That's doing it right. Otherwise you're just accidentally doing it right, when you should be doing it intentionally.
[00:40:48] Bella DeShantz: So you sort of touched on this problem of algorithmic bias, which, I feel like, in the last few years has really become more front and center [00:41:00] in tech; a lot more people are seeing it and talking about it. And you also touched on a solution: that while creating these products, there should be more diversity in the room. But what is actually happening now to fix that problem? What is happening now to create that diversity in the early stages of products? And also, we have so many things with algorithmic bias that are already being used by everybody. How do we fix the things that are already out there?
[00:41:37] Matt Mitchell: Let's look at algorithmic bias. So, first of all, what is an algorithm? It's not some huge, mysterious thing. It's just basic rules, right? You know, you might use a streaming service, and you type in something, and the search doesn't just dump out every literal thing that could come out. It filters it: well, let's [00:42:00] show them all the results, but catered toward what they tend to watch, by genre or category, language, length, whatever. Right? That might be an algorithm. It's just a set of rules. And the bias in those algorithms comes in when we start trying to train these rules to have the best outcomes.
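That "just a set of rules" framing can be shown in a few lines. This is a toy sketch: the catalog, the viewer profile, and the two ranking rules are all invented for illustration, not any real service's algorithm.

```python
# Toy illustration: an "algorithm" as a plain set of rules.
# The catalog and viewer profile below are invented for illustration.

CATALOG = [
    {"title": "Space Docs", "genre": "documentary", "language": "en", "minutes": 45},
    {"title": "Ghost Manor", "genre": "horror", "language": "en", "minutes": 95},
    {"title": "Cocina Viva", "genre": "cooking", "language": "es", "minutes": 30},
    {"title": "Deep Sea", "genre": "documentary", "language": "en", "minutes": 50},
]

def search(query_genre, viewer):
    """Return every matching title, re-ranked toward what the viewer tends to watch."""
    results = [item for item in CATALOG if item["genre"] == query_genre]

    def score(item):
        # Rule 1: prefer the viewer's usual language.
        lang_bonus = 1 if item["language"] == viewer["language"] else 0
        # Rule 2: prefer runtimes close to the viewer's typical session length.
        length_penalty = abs(item["minutes"] - viewer["typical_minutes"])
        return (-lang_bonus, length_penalty)  # sort: language match first, closest length next

    return sorted(results, key=score)

viewer = {"language": "en", "typical_minutes": 50}
print([item["title"] for item in search("documentary", viewer)])  # → ['Deep Sea', 'Space Docs']
```

Nothing here is mysterious, which is Matt's point: the bias risk appears when these hand-written rules get replaced by rules learned from training data.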
And, um, that training data goes wrong or is incorrect. And to find that problem, we need to think adversarially. We have to attack the algorithm, and you have to be smart enough to do that. Right. You have to know enough to dismantle it and say: where do these outcomes come from? Because it looks like it works. It always looks good. How do we know that this isn't leaning one way or another? And there are many examples. There was a company that was like, we want to do better and do more blind hiring, which is good for diversity.
And we want to be [00:43:00] less biased. So we're going to have a computer help us go through stacks of resumes and find the best candidates. And the computer just ruled out all women, basically. And it did this by noticing whether you had a hobby that a high percentile of women tended to have, or you were part of a club, or you were part of a program; things that even we as human beings wouldn't realize, like, really, 90% of the candidates who have this are women? But the computer was fed a corpus of data, a large body of data it's trained on. And that data is all the resumes that had ever been entered for this job, back when maybe women couldn't even apply for that job, back when maybe you could say, I don't like the way this person looks, or I don't like their outfit, and I'm not going to hire her. So all of that hidden bias seeped in, and we ended up with a robot that was more biased than a human, right? And those [00:44:00] are examples of the problems and the difficulties. So how do we solve it? We solve it by having people on the outside who can look for this stuff, which means algorithms have to be open.
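The "attack the algorithm" idea above can be sketched in miniature: submit paired inputs that differ in exactly one feature and compare the decisions. Everything here is hypothetical; `hidden_screener` stands in for a proprietary model whose internal rules the tester pretends not to know.

```python
# Probing a black box: flip one feature at a time and watch how the
# decision changes. The screener is an invented stand-in for a
# proprietary model we can only query, not read.

def hidden_screener(resume):
    """Pretend proprietary model: callers only see accept/reject."""
    score = resume["years_experience"] * 2
    if "women's chess club" in resume["activities"]:  # hidden learned bias
        score -= 5
    return score >= 10

def probe(base_resume, feature, values):
    """Resubmit the same resume with one feature varied, recording each outcome."""
    outcomes = {}
    for value in values:
        candidate = dict(base_resume)
        candidate[feature] = value
        outcomes[value] = hidden_screener(candidate)
    return outcomes

base = {"years_experience": 6, "activities": ""}
result = probe(base, "activities", ["chess club", "women's chess club"])
print(result)  # → {'chess club': True, "women's chess club": False}
```

The paired test exposes the hidden rule: identical experience, opposite decisions, so the activity field (a proxy for gender) must be driving the outcome.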
Right? Most algorithms are proprietary. The good thing is, if you have an adversarial approach, you can expose most algorithms, without knowing what the algorithm is, by just throwing things at it. You can learn what the rules are; not quickly, actually, but you can learn them. Right. And then you can test those rules and assumptions, and we can reverse it. Now, if you hire a woman, it's just built in: she's going to be like, well, why is my resume never showing up? Right? So you're accidentally ahead of the curve, saving yourselves many thousands or tens of thousands of hours, by bringing this mindset, this living embodiment of what you're trying to protect against, which is sexism, into the room. Um, if we open our algorithms to experts... like, you know, I was a judge on this Twitter thing, where [00:45:00] Twitter, the social media platform (a news platform as well), said: you know what, we're going to take one of our algorithms and open it up to, like, a bug-bounty type of program.
Right. And we're going to have hackers at DEF CON (and anywhere else, really, but we'll announce it at DEF CON) spend a couple of days messing around and see what they can find in an algorithm that's already in use, that we've already checked, that we'd already found some problems with. And they found tons of stuff that hadn't been caught. And that's amazing. Right. And that's how we solve it. And some of the people were like, hey, I'm looking for stuff that affects me, because maybe I wear a head covering, and I noticed that it's biased against head coverings; this was the image-cropping thing.
But other times people are just like, you know, I'm an expert, and I just want to have a swing at it, and I believe in what is right, and I love technology. That's fair. So those are a couple of different ways to come at it.

[00:45:53] Jeremiah Roe: And for those that are listening to this podcast and want to [00:46:00] contribute, one of the immediate ways that I could think of (and Bella will keep me honest here) is that you can take an open API and run it through a fuzzing mechanism. And for those that don't quite know what fuzzing means: it's taking a compiled wordlist, untold numbers of entries, almost like a dictionary, and feeding it through, say, the word dress, or the word red, or the word resume, or the word flowers, or whatever. And you can fuzz that through the API that connects directly into these algorithms to see where they might, well, weigh one thing more than something else.

[00:46:48] Matt Mitchell: I think the whole bug bounty thing is a great way of opening algorithms up. A lot of times the rule sets would allow it, or they'd appreciate you looking at some algorithms. Right. There are tons of bug bounty communities out there, and it allows you to just get access to code. And you also can see what other people are finding. In the AI space it's heavily academic, no offense to the professors and researchers out there, but there's a newer wave of people who are just like, hey, I'm doing this with the hacker mindset, the bug bounty mindset, to improve the AI, the APIs. Right. Um, I think that makes a lot of sense.
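A minimal sketch of the fuzzing approach Jeremiah describes, under the assumption that the system exposes some scoring endpoint: run a wordlist through it and flag inputs whose outcomes diverge. `score_resume` here is an invented stand-in, not a real API.

```python
# Wordlist fuzzing against a scoring endpoint: score every term in the
# list and flag the outliers. score_resume is a hypothetical stand-in
# for an API call; the hidden penalty is planted so the fuzzer has
# something to find.

import statistics

def score_resume(text):
    """Stand-in for an API that scores free-text input."""
    score = 50 + 5 * text.count("engineer")
    if "softball" in text:  # hidden penalty on a demographic-correlated hobby
        score -= 20
    return score

def fuzz(wordlist, template="engineer who enjoys {}"):
    """Score every word in the list and report scores that diverge from the mean."""
    scores = {word: score_resume(template.format(word)) for word in wordlist}
    mean = statistics.mean(scores.values())
    outliers = {w: s for w, s in scores.items() if abs(s - mean) > 10}
    return scores, outliers

wordlist = ["chess", "softball", "hiking", "painting"]
scores, outliers = fuzz(wordlist)
print(outliers)  # → {'softball': 35}
```

A real run would use a far larger wordlist and a live API, but the shape is the same: identical templates, one varying term, and statistics over the responses to surface what the system is quietly weighting.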
[00:47:41] Bella DeShantz: I don't know how involved in the Twitch community or platform you are, but lately there has been a big issue with, [00:48:00] basically, what I see as a cybersecurity issue: there is a lack of protection for creators on the platform, and folks have weaponized the platform to target, yeah, exactly, mostly Black creators. Um, and I'm wondering, what's your take on that? How do we pressure this company to step up and create a platform that is safer for their own users?
[00:48:40] Matt Mitchell: I mean, it's pretty weird. You know, imagine there's this product you can buy, like a bag of chips, and they changed the formula, and you keep eating it. You know what I'm saying? So they're so lucky that this community sticks by them, because there are all these other platforms they could be using. And yeah, they make money there; that's where their audience is. But [00:49:00] if everyone just left tomorrow, there'd be no more Twitch. Right. It'd be like, you know, Tumblr lost like 50% of its users because they made a rule. Boom. Right. Um, things like that. Like OnlyFans almost lost all their users, so they decided, oh, maybe we should back away from something that we decided on. Right. But, I'm sorry, I live in New York, so there's nothing I can do about the full New York soundtrack.
[00:49:23] Jeremiah Roe: That's making it lifelike.
[00:49:26] Matt Mitchell: Exactly. It's real, you know. But at the end of the day, this isn't just Twitch. Twitch is just the latest manifestation of this. The internet is a horrible place. Right. You know, there's a book, The Internet of Garbage; I'd definitely recommend it. It talks about bias and abuse online, and how the internet can be a tough and horrible place. For example, a kindergarten playground is a horrible place. Bullies pick on the group that they think they can get away with picking on the most. Right. And on the internet, that's usually marginalized communities. So [00:50:00] if you identify as a woman, you self-identify as a woman, you're, like, the number one community that gets targeted on the internet.
If you're Black on the internet, get ready. So when the bullies come, they target you first, right? And there's a lot of gaslighting, it's-not-really-happening. But once you realize it is happening, you band together. And that's where this boycott of Twitch, which I think was yesterday or the day before, happened, where the creators were just like, you know what, we're just not streaming today. And you'll see it in the stats; you'll see it in the numbers. And imagine we did it for more than a day. That's one way, right? But it's strange, because the amount of time it takes to get a press release through all the different eyeballs that have to sign off on it before it can go public, versus the amount of time it takes to have a hackathon where your engineers just sit down and come up with a clumsy solution... it just shows a lack of care. A lack of care.
[00:50:42] Bella DeShantz: And that's what's been really wild to me. So I use Twitter as a platform, so I have at least some insight into this. What's been super wild to me is that my Twitter feed is full of all of these just regular people who know stuff about software, who are developing, like, [00:51:00] intermediary tools for creators to use, to help offer protection. And they're doing this over a weekend hackathon, right? They're just putting something together. Why isn't Twitch?
[00:51:11] Matt Mitchell: Yeah, it's pretty easy. I mean, not to turn this into a Twitch conversation, but it's a good examination of what's going on across all tech as an industry. There was an issue where people were playing music in their streams that wasn't cleared, right? And so you'd get a DMCA takedown notice for this. And, you know, with YouTube, it's quite a sophisticated system: you get a message, and it's like, this part of your track, of your stream, has music that appears not to be cleared. You can defend against it; you can say, oh, I'll take that part out. And they have all these tools built in to avoid what would be them getting in trouble with the larger music industry, motion picture industry, et cetera.
Right. And Twitch's handling of it [00:52:00] exposed something to me about their engineering setup and how they're designed. If you follow that, you're like, huh. Right. And that is something they must do under US law, where they can end up not just losing the creators (and those people get sued or whatever), but they can lose their whole platform by not actually doing what they need to do to protect against piracy. If that's how they handle that, it does not give me hope or faith in what they will do to protect the very people who give them a paycheck and allow them to have a platform. It's completely user-generated content, and these rules have been developed by other user-generated content platforms that have been around for decades, whole generations; there are people who weren't born when YouTube came out.
Right. So just follow the basic norms of the industry. And what will happen, I think, is directly affected [00:53:00] people will solve their own problems. You know, if I could sit down with some Python, make a sloppy natural language processing tool, find all the old raids that happened, pull the chats from them, see what the commonalities are, and do some auto-blocking with a basic algorithm... probably, you know, I could poop that out in a couple of days. It won't be perfect and beautiful, but it's better than what most people have, people who are not technical and are just sitting there, cannon fodder, right? So we must do something. And it's not just this platform. All platforms will learn this lesson, because people will leave, and good people will make tools to help those folks who are directly affected, to protect their mental health and more.
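The "sloppy NLP tool" Matt describes might look roughly like this: pull chat logs from past raids, find the messages' common fingerprint terms, and block new messages that match. The sample logs and thresholds are invented for illustration, and a real tool would use Twitch's actual chat APIs rather than hard-coded strings.

```python
# Rough sketch of a raid auto-blocker: learn recurring terms from past
# raid chat logs, then flag new messages containing several of them.
# RAID_LOGS and the thresholds are invented for illustration.

from collections import Counter

RAID_LOGS = [
    "buy followers at spamsite example",
    "spamsite example best deals buy followers",
    "great stream today",          # one normal message mixed in
    "buy followers now spamsite",
]

def build_blocklist(logs, min_count=3):
    """Terms that recur across raid messages become the block fingerprint."""
    counts = Counter(word for msg in logs for word in msg.split())
    return {word for word, n in counts.items() if n >= min_count}

def should_block(message, blocklist, threshold=2):
    """Block when a message contains several fingerprint terms at once."""
    hits = sum(1 for word in message.split() if word in blocklist)
    return hits >= threshold

blocklist = build_blocklist(RAID_LOGS)
print(should_block("buy followers fast spamsite", blocklist))  # → True
print(should_block("loved the speedrun today", blocklist))     # → False
```

Requiring multiple fingerprint hits (rather than any single word) is what keeps the normal message from being swept up; as Matt says, it won't be perfect, but it raises the floor for non-technical streamers.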
[00:53:37] Bella DeShantz: Yeah. I'll be candid and say I haven't spent a ton of time in my life on the internet until sort of recently; I'm relatively new to cybersecurity, comparatively speaking. And this was one of the first instances for me where I'm looking at a company just kind of fail at protecting its users from a cybersecurity standpoint. And [00:54:00] then also seeing, I feel like, a protest that happened entirely online, on the first. And, I dunno, I am looking at this, like you said, as something that's happening not just on Twitch but throughout the industry. And I'm curious to see how users can pressure for change, particularly when it's, like you said, less about US law and more about safety, and even comfort.
[00:54:30] Matt Mitchell: Yeah, well, I have a long list of businesses and companies, from gaming with Blizzard to, you know, dating with Grindr, where I try to push them a little bit, with whatever standing I have, in a public way. And I try to privately reach out to engineers, heads of tech, whatever. Like, hey, what is wrong? Is it a financial thing? What does the roadmap look like? Are there some things going on in your industry? What can we do to help you [00:55:00] fix this? This is a problem. But you have to care, right? That's the first step: you have to actually care. You can't force someone.
[00:55:06] Jeremiah Roe: I one hundred percent agree with that. I think it starts with caring. And to that point, I'm going to make a weird, hard left. So, to the caring aspect: where can people hear more from you, stay up to date with your initiatives, and show that they care?
[00:55:22] Matt Mitchell: Well, you know, you can listen to the sirens outside. They're not for me, I hope. Well, you could bail me out if you...

[00:55:30] Jeremiah Roe: Those are people caring.
[00:55:32] Matt Mitchell: Yeah. I mean, I would say, first of all, it's just listening to streams like this. So thank you for even hearing my words and taking the time. We're all, you know, rich or poor; you don't get the time back. Twenty-four hours is it, and you just spent some of that listening to this and learning about this. That means the most to me. Um, I would just say, learn more about these issues, whether it's watching a documentary like Coded Bias, which is on Netflix and anyone can watch it, or reading books like [00:56:00] Dark Matters by Simone Browne, which is a research book. It's so well written, though. It talks about the history of surveillance of Black folks. It's academic, it's very even-keeled, it's extremely well researched and fact-checked. But most of all, just find a way to give back with your skills to something that you feel is good. That's all I'm doing; I just crank it up to 11.
Right. So I would just say, do that same thing, you know; even spend one hour doing that. If you want to hear more about me, cryptoharlem.com is the website for CryptoHarlem. We have a thing there where you can donate time and just try to help us with something. Or you can watch our live stream on Twitch, or wherever we move it to if Twitch doesn't wake up. And I'm on Twitter; I'm Gemini Matt on Twitter, and I've got open DMs. Uh, I'm an identical twin; that's not obvious, and that's not on my LinkedIn profile or Twitter profile. Most other things about me you could guess, because they're just nerd tropes. Like, I have a glass case of figurines of different things, I've got a stack of challenge coins, or whatever. But I think, you know, a lot of people throw around the word hacker. I'm a real hacker.
You know what I'm saying? You show me some software or hardware, I can make it go faster or slower. I can really do that. So I think that's not something that's evident there. Um, I don't think that's really a common thing, but I don't think it has to be useful anymore. It used to need to be useful, but now I think it doesn't have to be. But yeah, that's that. I don't know, I'm trying to think of cool stuff, but that's it. I'm vegan; I believe in animal rights. You don't have to, but I don't eat those things. And, um, that's it. Let those little cartoon bunnies hop away.
[00:58:52] Jeremiah Roe: Matt, thank you so much. I had such a blast speaking with you and
[00:58:57] Matt Mitchell: This has been a blast. Thank you.
[00:58:59] Jeremiah Roe: I [00:59:00] learned a ton, you know, and so I just can't thank you enough for joining. Thank you.
[00:59:05] Bella DeShantz: Yeah, thanks for, uh, thanks for ranting about Twitch with me. I appreciated that.
[00:59:08] Matt Mitchell: Yeah. I mean, I'm here for it, you know. I appreciate y'all. But when we see injustice, we must speak up.