WE'RE IN!

Cory Doctorow and the Infosec Apocalypse

Episode Summary

Cory Doctorow, activist, journalist, and author who wrote the influential Little Brother cyberpunk series, gets into some big issues like surveillance capitalism and his work with the Electronic Frontier Foundation. He doesn’t hold anything back.

Episode Notes

Cory Doctorow, activist, journalist, and author who wrote the influential Little Brother cyberpunk series, gets into some big issues like surveillance capitalism and his work with the Electronic Frontier Foundation. He doesn’t hold anything back.

--------

Why you should listen:

* Hear from one of the smartest and most engaged technologists today on how technology can be used both for malicious purposes and for good.

* Consider how bias can be built into code and have real-world implications. 

* Listen to Cory’s view on tech monopolies and his proposals for reversing their power over users and the internet more broadly.

* Better understand why independent security research might seem counterintuitive to many people. 

* Hear the author of one of the most influential cyberpunk series discuss the origins of his latest book, Attack Surface.

--------

Key Quotes:

* “Wishful thinking isn’t going to solve real-world technical security issues.”

* “It’s so important that we build safeguards against our own frailty.”

* “Tech has become a kind of dangerous monoculture ... technologically dangerous because a breach or a defect in a system has consequences for hundreds of millions, if not billions of users.”

* “Monopoly is a really bad tool for protecting privacy because monopoly only protects privacy where privacy is in the interests of the monopolist.”

* "We should hold everyone to account for being good privacy actors by having a privacy law -- a real, no fooling privacy law."

* "One of the things that we need to take consideration of is that the security apocalypse is here. It's just not evenly distributed."

--------

Related Links:

* Synack.com 

* https://www.linkedin.com/company/synack-inc-

* https://twitter.com/synack

* https://craphound.com/

* https://pluralistic.net/

* https://twitter.com/doctorow

Episode Transcription

Jeremiah Roe: [00:00:00] Hey, how's it going, Cory? 

Cory Doctorow: [00:00:10] Very well, thank you. 

Jeremiah Roe: [00:00:11] Nice. Thank you so much for joining us. We certainly appreciate it. Bella, how's your day going? 

Bella DeShantz: [00:00:16] You know, not too bad. We're recording on a Friday, which is always nice. 

Jeremiah Roe: [00:00:23] I'm pretty stoked. Let's go. 

Bella DeShantz: [00:00:50] We're really excited to have you on the show. There's a lot that we could talk about with you, but since this is a cybersecurity podcast, we're going to pretty much stick to [00:01:00] that. What attracted you to this realm in the first place?

Cory Doctorow: [00:01:14] Well, I've been involved with computers all my life. You know, as I often tell people who, uh, ask me how to get involved in the industry: if you don't have the self-discipline and foresight and gumption to be born in 1971, I just can't help you. Uh, but. 

Jeremiah Roe: [00:01:30] You're done. 

Cory Doctorow: [00:01:32] Well, that very fortuitous birth meant that when I was six, not only did Star Wars come out, but my dad brought home a teletype terminal and an acoustic coupler connected to the mainframe at the university where he was training to be a computer scientist.

And, uh, we got an Apple II Plus in 1979 and a modem in 1980. And I ended up dropping out of university to be a CD-ROM programmer for Voyager, and then, [00:02:00] uh, went into Gopher development and then web development, and then did a startup, and then moved to San Francisco and fell in with the Electronic Frontier Foundation and quit my startup to work for them, and moved to Europe to be their European director. And all the while I was writing science fiction as well, and was very, um, you know, kind of enmeshed in all of these policy questions, as well as these technological questions.

Um, my dad, in addition to being a computer scientist, was very political. And so I grew up in the, you know, the anti-nuclear-proliferation movement and the trade union movement. And so I was very interested in questions of equity and, you know, the survival of our species and the correct way to think about technology and who should be in charge of it, and how.

And, you know, I think that there are a lot of people who think about what technology does, and there's a smaller number of people who think about who it does it for and who it does it to. But merging those two, a kind of good understanding of what technology does [00:03:00] with a good understanding of who it does it for and who it does it to, that's, uh, I think, a distinctive perspective that maybe I was early to. Today, I think, it's more widespread, thankfully, 'cause it is the kind of major issue of our moment.

Bella DeShantz: [00:03:15] Yeah, it reminds me of the, I feel like I hear this sort of cliche question, where people who are creating technology often ask, can we do this, and not, should we do this? Um, so it's kind of cool. That's a little bit related to what you're asking.

Cory Doctorow: [00:03:32] A little bit. It's not quite right, though. Um, so that's, I think, a quote from Jurassic Park originally, right? You were so busy thinking about whether you could do it that you never asked yourself whether you should. I think that you could have two identical technologies that, depending on who used them and who they were being used against, uh, would give you a completely different kind of ethical outlook.

So for example, [00:04:00] take predictive policing tools. Why don't we train a machine learning system on arrest records, and then use that to predict where crimes will be? And this is a terrible way to predict crime, because arrests are not a good proxy for where crime occurs. Arrests are a good proxy for where the police look for crime, because you only find crime where you're looking for it.

And so if there is discriminatory crime prevention, then all of this discrimination gets turned into a kind of faster, more intense version of discrimination with a kind of veneer of empirical facewash. Now, one of the people who's done the best work on this is a guy named Patrick Ball, who runs a nonprofit called the Human Rights Data Analysis Group.

And this is a group that does very rigorous statistical work, usually for, uh, human rights tribunals, war crimes tribunals, and truth and reconciliation proceedings, where they will take the fragmentary evidence from acts of genocide and use statistical models to [00:05:00] extrapolate hidden acts of genocide: things that aren't in the record, but that are likely to be true.

So for example, they used this method to infer the location of a bunch of mass graves, narco mass graves, in Mexico that were later found, right? It's a pretty cool thing they do. I mean, it's awful, but it's really science and math for good. And so they did this with predictive policing.

They took the PredPol model. PredPol's like the big predictive policing firm. And they took the data from Berkeley and they ran PredPol on it. And they said, where will the crime from the year following this year be, uh, specifically the drug crime? The reason they were looking at drug crime from the following year is that the following year was the year that the, um, the NIH did its triennial gold-standard survey of American drug use, which is the single best and most reliable picture of who's using drugs in America.

And so they had what PredPol thought would be the drug usage [00:06:00] pattern, and then they had the actual drug usage pattern. And what they saw was that everyone uses drugs, but PredPol thinks that all the drugs are to be found in neighborhoods where poor and racialized people live. And that shows you, right, that as a tool for telling the cops who to arrest, PredPol and its statistical modeling is terrible.

However, what if, the year before you took the policing data, you had done some intervention to try and remove bias from policing? Like you'd given the police some implicit bias training, or some other form of technical or social intervention that was supposed to reduce the degree to which there was bias in policing.

How would you know whether it worked? Well, you could do exactly the same thing. You could take the same tool, the same data, and you could do the same operation with it, and you could see whether the cops went out and arrested people in a way that was not fair, [00:07:00] right, in a way that was, like, empirically unfair, because we have the prediction data and we have the real data.

And so we can evaluate whether or not the tool was any good. So this isn't a question about whether you should do it. It's a question about who should do it and how it should be used. And, you know, I think that there's a lot of scope for doing things that are ill-advised if used badly, and then thinking very hard about how to use them.
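The evaluation Doctorow describes, comparing where a model expects drug crime against independently measured drug use, can be sketched in a few lines. This is a minimal, hypothetical illustration only: the neighborhood names and numbers are invented, and this is not PredPol's software or the Human Rights Data Analysis Group's actual code.

```python
# A minimal sketch of the audit described above: compare a model's predicted
# share of drug crime per neighborhood with a survey-based estimate of actual
# drug use. All names and figures are hypothetical.

predicted_share = {   # model's predicted share of drug crime, by neighborhood
    "flatlands": 0.70,
    "hills": 0.05,
    "downtown": 0.25,
}

survey_share = {      # survey-estimated share of actual drug use, by neighborhood
    "flatlands": 0.35,
    "hills": 0.30,
    "downtown": 0.35,
}

def bias_report(predicted: dict, actual: dict) -> None:
    """Print how far the model's attention deviates from measured behavior."""
    for area in sorted(predicted):
        gap = predicted[area] - actual[area]
        print(f"{area:10s} predicted={predicted[area]:.2f} "
              f"survey={actual[area]:.2f} over-policing gap={gap:+.2f}")

bias_report(predicted_share, survey_share)
```

The point of the sketch is the comparison itself: the same data and the same model can serve as an audit of bias rather than a dispatch tool for arrests.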

Like, I believe in pen testing and proof of concept, even ransomware, right? And I think it's perfectly fine to write a ransomware proof of concept as a way of evaluating, like, does an anti-ransomware countermeasure work? Are you safe from ransomware? Right? Like, those are really good tools.

And also, you know, as a practical matter, if you want to comprehensively answer the argument you're having with a vendor, where the vendor says, nah, it's impossible, no ransomware could ever infect our systems, you can prove they're wrong by infecting their systems with ransomware, in a controlled environment, with their consent, in [00:08:00] a way that shows that they're wrong, and that people who rely on their assurances are living in a fool's paradise. That's a very, very positive intervention to make. And again, it's not what you're doing, it's who's doing it and why, and for whom.

Jeremiah Roe: [00:08:13] I would actually really agree with that, in the stance that if we're taking a look at just the pure humanistic approach, and then we're also taking a look at, you know, the ones and zeros and the digital aspect, you can have the exact same thing, right? You take two individuals: one is malicious and the other one is good, and you're going based off of their foundation of moralistic beliefs.

And so each person can use a tool in two different ways. And the same aspect goes with, you know, the ones and zeros in technology, and the way it's coded and put together, and whether or not that individual had bias, and if that bias is reflected inside the code in some way. And then you can extrapolate that out further to really showcase what the differences in identification can be, because if some person believes that identification, to your [00:09:00] point, can be towards finding a crime inside of a particular area, they're going to be focusing on that area, because they constantly go to that area, and to them, that's where crime belongs.

It's an interesting thing, because that can be applied to cybersecurity as well, in a number of facets. 

Cory Doctorow: [00:09:54] I think that's right. And I think that, um, one of the things that gets underappreciated when we think about [00:10:00] this kind of work is the extent to which it cuts both ways. It is often, but not always, the case that this stuff discriminates against people who lack social power. There are times in which people who have more social power get the shitty end of the stick in this kind of modeling.

So for example, if you are, uh, experiencing domestic violence or intimate partner violence, uh, you are more likely to have an intervention that takes you out of that situation if you live in an apartment or a semi-detached home; and if you live in a fully detached home, you are less likely to have any kind of intervention from the outside.

Can you guess why? 

Jeremiah Roe: [00:10:40] That makes sense. Yeah. You've got the neighbors, you've got folks who are paying attention. 

Cory Doctorow: [00:10:44] The neighbors. Yeah, exactly. Right. Your stately home at the end of a, uh, high road doesn't have a neighbor to phone the cops and say, the guy next door is beating up his wife. And so, you know, it's [00:11:00] really important to understand that the data reflects all kinds of bias and all kinds of distortions, and that those distortions, uh, can be accounted for, or at least can be understood, and you can be careful about them, but denying them doesn't help. Right?

And there's a similar parallel to InfoSec, which is that denial and obscurity, you know, looking for your keys under the lamp post instead of where you dropped them, that doesn't help, right? Like, oh, I know how to secure the front end but not the back end, so I'm just going to secure the front end and make it as secure as possible. 

Jeremiah Roe: [00:18:09] You've got several things that have come out recently. Uh, one of those is Attack Surface. Now, inside of Attack Surface (and for those listening, if you haven't gotten that book yet, please do), Cory discusses some exceedingly large themes: things like surveillance, privacy, you know, stalkerware, civil liberties, the things that keep you safe. So, Cory, I was wondering if you could sort of break down the thought process when you were deciding to focus on this book? 

Cory Doctorow: [00:18:43] Well, to understand that, I have to explain a little more about Little Brother and Homeland, its sequel, which are books about young people facing an encroaching security state in the U.S. in the early 2000s and the 2010s, as computers are being used more and more to surveil and [00:19:00] control people, notionally in the name of public safety and security, but more often to allow powerful people to do bad things with impunity.

And those books really inspired a lot of people to get on the right side of technology, which is really wonderful, but not everyone got on the right side of technology. And I wanted to think about how those people might come back. And in particular, I was inspired by Edward Snowden, who wrote the introduction to the reissue of Little Brother and Homeland, and who took a copy of Homeland with him when he went on the run from Hong Kong.

If you watch the documentary, you can see him packing his go bag when they're bugging out of the hotel, and one of the things he grabs is his copy of Homeland. And I wanted to write about what it takes to kind of look in the mirror one day and say, I am not the person I thought I would be, and that's going to change now.

I'm going to have a reckoning with what I did. And so, uh, Attack Surface is a book about a young woman named Masha, who appears as a kind of frenemy of the hero of the first two books; who works first for the DHS, and then for a beltway bandit on a forward operating base in Iraq, and then for a private security outfit, like NSO Group or Blue Coat or Palantir, who are supplying malware [00:21:00] as a service to the world's worst dictators; who goes back to Oakland, where she came from, and discovers that her high-school best friend, who's now a Black Lives Matter activist, is being targeted by the same malware that she spent her career developing, literally tools that she herself made.

And she has to reckon with the meaning of her life's work and decide what she's going to do about the fact that this is greatly upsetting. Is she going to compartmentalize it and rationalize it the way she compartmentalized and rationalized so many decisions up until now? Or is she going to change her mind?

And if she does change her mind and change her life, what does she say to the people whose misery she helped create? And what does she say to them when, instead of greeting her as a hero who's joined the fight, they ask her where she's been so long and why it took her so long? And that was, um, a really powerful exercise to go through.

Jeremiah Roe: [00:22:25] That particular sort of scenario really reminds me of a famous quote from Oppenheimer, the "Now I am become death, the destroyer of worlds." It's from his own reckoning with, uh, a tool or a thing that he created, which, you know, we're all aware of, which, you know, can cause great damage out there, you know? 

Cory Doctorow: [00:22:51] Could render us extinct. Yeah. And you know, I think that Oppenheimer's an interesting character. For people who don't know, he ran the Manhattan Project and was the [00:23:00] father of the atom bomb. I mean, there are lots of people who can make that claim, but none in quite the same way that Oppenheimer could, because in addition to being a brilliant physicist, he was also a brilliant administrator.

And there really weren't a lot of people who had both skill sets, and it's quite likely that we wouldn't have gotten the bomb in the form we got it in, in the timescale that we got it in, used in the way that it was used, if there hadn't been someone who was both a kind of once-in-a-generation administrator and a once-in-a-generation physicist who could oversee such a project.

And, you know, one of the things that is striking about the Oppenheimer story is how different it would be if he had had this realization a couple of years earlier, instead of turning away from the first mushroom cloud as it rose over the nuclear test site at Los Alamos and saying, uh, you know what, um, I'm sorry I made that thing, but now it's in the world.

What if he'd said, I won't build that thing? And, you know, he regretted it forever. You know, the, um, he got thrown out of the White House, having been invited to lunch with the president to celebrate his achievement, for spending that lunch [00:24:00] talking about how scared and angry he was about nuclear proliferation.

Bella DeShantz: [00:24:24] A lot of the readers of your books work in cybersecurity, and the protagonist of Attack Surface works in cybersecurity. I'm wondering, is this book specifically meant to reach folks in this industry? Is it somewhat of a call to action?

Cory Doctorow: [00:24:43] Absolutely. But [00:25:00] also, I wanted to create a story that would help people who are outside of InfoSec understand what the moral conundrum and character of the work in InfoSec is, 'cause it's a very opaque trade. People struggle with this. I once got in a cab after a security conference in Chicago called THOTCON, T-H-O-T. It's a rather unfortunate name, uh, that predates the slang usage of "thot" and goes back to, um, old Chicago slang for Chicago, based on a numeric letter substitution for its area code that goes back to phone phreaks. And I got in a cab to go back to the hotel after THOTCON, and the cab driver was like, what is THOTCON, and why are you at THOTCON, and what does this have to do with my understanding of the word "thot"? Which, you know, was funny.

But then when I explained what THOTCON was, he was absolutely appalled. He was like, why should people ever develop knowledge about defects in widely used systems? [00:26:00] And I think that this is widely reflective of a social consensus that we swim upstream against when we try to do the important work of security auditing and other forms of security work, because there is such a, uh, kind of counterintuitive nature to independent security research.

And when you realize the basis for independent security research, which is effectively that large vendors lie, and also don't look very hard at their own security, and can't be trusted when they tell you that this thing that you've entrusted your life to is sound, and that, in fact, you are taking your life into your own hands every time you use a technology because the vendor's promises are empty, that's such a terrifying realization that people really have a lot of cognitive armor against it. And I really wanted people to understand the value of independent security research, the significance of independent security research.

And to [00:27:00] understand that, you have to understand that there's a whole group of people (criminals, state actors, unscrupulous cyber-arms vendors) who work the other side of the street, so that, you know, even if the hackers never reveal the defects in a web server or some other widely used system, those defects will still be uncovered and weaponized by people who are very unscrupulous.

And really, the only people who won't know about those defects are not the bad guys; it'll be the people who rely on those systems.

Bella DeShantz: [00:27:28] My work prior to working at Synack was as a penetration tester, and I focused on web application security. Something that I've always done since working in this field is do what I can to get other folks interested in the field. And I remember I was talking with some younger women who were interested in cybersecurity, and one of them asked me this question: do you ever feel bad doing it?

Like, hacking websites. And the question just totally threw me off, because I had [00:28:00] never thought of it that way. You know, because it is exclusively for good; the whole point of pen testing is that the result is, okay, now fix it. Like, let me help you fix this. Let's make the world more secure. And I've only ever thought of it that way.

And it was so interesting to sort of step back for a moment and think about how other people perceive what I do, and if that's a problem.

Cory Doctorow: [00:28:24] This is a place where a lot of mischief can occur: when you're doing something whose benefits are a long way off from its costs, even if they outweigh those costs. Uh, you know, there's obviously short-term pain in disclosure of a defect. Even if you do a managed disclosure, the world's most managed disclosure still puts some people at risk.

Um, and you know, there are a lot of people who, for self-serving reasons, would just like those actions never to occur. Not because they're worried about that short-term pain, but because they're worried about their own [00:29:00] kind of long-term parochial interest: being able to make more money and get out before anyone holds them accountable for the way that they cheaped out on security or other technology choices.

And, um, that kind of manufactured doubt, where you go around and demonize people who do security research, who point out defects, who tell uncomfortable and inconvenient truths, it can work really well. And, you know, one of the things that I think narrative can do, that fiction can do, is really take you through what would otherwise be a very abstract argument and really show you what it might feel like to live in the counterfactual world, where we do prohibit people from doing that uncomfortable work.

Bella DeShantz: [00:29:43] I also remember instances with customers, when I tell them the unfortunate, but also sometimes exciting, news of, like, hey, I found this really cool vulnerability. [00:30:00] Here's what we need to do to fix it. Um, I've had instances of customers being like, ah, you know what, could you change this? You have it listed as high severity, can you change it to medium? Because that way we'll have more time to fix it. And it's the same sort of idea of, like, wait, what's the point here? What do we think is the reason that we're doing this? What's happening to this information once we've brought it to light?

Um, I don't know. I think there's a lot of perception that can be changed, and should be.

Cory Doctorow: [00:30:30] I think that's a hundred percent right. I think that, um, thoughts and wishful thinking are no substitute for better code and an informed, uh, marketplace. I understand why people who are in the heat of the moment, who are worrying about their bonus, or about their company's survival, or about their own reputation, might make a compromise that, if someone else were doing it, they would condemn, but in their own case they think of as okay.

[00:31:00] And that's one of the reasons that I think it's so important that we build, um, safeguards against our own frailty into the way that we do this stuff. Uh, I talk a lot, in technology contexts, about something called the Ulysses pact, which is a term from behavioral economics that takes its name from the story of Ulysses, who was, you know, an early Greek hacker who was going through siren-infested waters.

Where, if you heard the song of the sirens, you would jump into the sea and be dragged to your death. And so the sailors had a kind of standard countermeasure, which was to fill their ears with wax. And Ulysses, being a hacker, wanted to hear the sirens but not die. And so he asked his sailors to tie him to the mast, so that when he heard the siren song, he wouldn't be able to jump into the sea.

And what he was doing was taking a circumstance in the future where he knew he would be weak, and he was taking a countermeasure against that weakness at a moment of strength. And we're really familiar with Ulysses pacts in the real world. Like, oftentimes you will do something like throw out all the [00:32:00] Oreos when you go on a diet, because you know that if you have to go to the grocery store in the morning to gobble a whole sleeve of Oreos when your blood sugar crashes, then you might not, you know, the barrier might be enough to save you from yourself.

In the same way, we do things like apply irrevocable licenses to our code, where we, you know, use free software licenses or Creative Commons licenses that we can't rescind. Not even if we get to a juncture where our investors say, you must make that code proprietary or I'm going to pull the plug on your firm, and all those people who quit their jobs and put their kids' college funds at risk to work for you are going to end up on the street.

You can say, look, I'll make the code proprietary, but I can't make everybody else make the code proprietary. We'll just be the only people in the world who have a proprietary version of it, and everyone else is going to have the GPL version. And, um, that kind of Ulysses pact, that kind of defense of yourself against yourself, [00:33:00] is often present in security contexts as well, where people make binding promises to act in certain ways when they receive bug reports, or use fiduciary proxies to receive bug reports, who say, in six months this bug report will go public, regardless of whether or not you've patched it, therefore you can't do anything to lean on the researcher who found the bug to stop them from disclosing.
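As a small illustration of that kind of security Ulysses pact, here is a hypothetical sketch (not any particular program's real disclosure policy): the publication date is fixed the moment a report arrives, and there is deliberately no mechanism to extend it.

```python
# Hypothetical sketch of a binding disclosure deadline: once a report is
# received, the publish date is set once and never extended. The class name
# and the 180-day window are illustrative assumptions only.
from datetime import date, timedelta

DISCLOSURE_WINDOW = timedelta(days=180)  # "in six months this goes public"

class BugReport:
    def __init__(self, title: str, received: date):
        self.title = title
        self.received = received
        self.publish_on = received + DISCLOSURE_WINDOW  # set once, not negotiable

    def is_public(self, today: date) -> bool:
        """The report goes public on the deadline whether or not a patch shipped."""
        return today >= self.publish_on

report = BugReport("auth bypass in example gadget", received=date(2021, 8, 1))
print(report.publish_on)                   # 2022-01-28
print(report.is_public(date(2022, 2, 1)))  # True
```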

And that kind of defense against your own frailty is really important.

Bella DeShantz: [00:34:05] Right, because humans, unfortunately, you know, we can't always trust ourselves not to look for the easy option. That's how we work.

Cory Doctorow: [00:34:12] Sure. And also not to be blind to your own mistakes, because we are often blind to our own mistakes. That's why we get a second pair of eyes on our stuff before we publish it, because the hardest kind of mistake to spot is your own.

Bella DeShantz: [00:34:26] You'll be speaking at DEF CON virtually this year. Uh, that will have happened by the time this episode airs, but I want to talk a little bit about your presentation, Privacy Without Monopoly. Uh, can you break that down for us? What does that mean?

Cory Doctorow: [00:34:42] It's actually based on a really great paper that I co-wrote with my colleague Bennett Cyphers. And yes, we do have someone at EFF whose surname is Cyphers, which is really cool. Um, 

Jeremiah Roe: [00:34:53] is the 

Cory Doctorow: [00:34:54] Yeah, it's really great. He's one of our staff technologists. Bennett, I should say, did all the heavy lifting on this Privacy Without Monopoly paper. And to understand where the paper is coming from, you have to understand why people are talking about de-monopolizing tech, and how they're talking about de-monopolizing tech. So you may remember that there once was a time when the web wasn't just five giant websites filled with screenshots of text from the other four. So how did we get to this place, where tech has become a kind of dangerous monoculture? Both technologically dangerous, because, you know, a breach or a defect in a system has consequences for hundreds of millions, if not billions of users, but also, um, socially dangerous, because the choices that everyday mediocrities who are no better or worse than you or me, like Mark Zuckerberg, make ripple out to billions of people's [00:36:00] lives.

Right? So one of the ways that tech got so monopolized is by exploiting something called the network effect. A network effect is present when a technology gets better as more people use it. So, for example, you might get a Facebook account because there are a bunch of people there you want to talk to. And once you have a Facebook account, other people might join because they want to talk to you. And those network effects are very powerful, and they drive growth.

Uh, but when we talk about tech monopoly, we often include network effects but neglect another idea from economics, which is switching costs: what you have to give up to leave Facebook or some other big platform. You know that if you leave [00:37:00] Facebook, then all the people who are there lose contact with you, and vice versa, unless you have some other out-of-band means of contact.

And certainly the communities and the discussions that you're part of, the customers that you speak to, uh, and so on, you kiss all of those people goodbye. And for that reason, people often endure a lot of pain on Facebook, because the pain, the cost, is outweighed by the benefit that they get from being there.

But the thing is that that cost is manufactured. There is nothing intrinsic about Facebook that says that if you leave Facebook, you can't send messages to the people who are left behind.

And the higher the switching cost is for Facebook, the more pain you'll endure before you leave. The good news is that interoperability offers us a way to migrate people away from Facebook, to change what the switching costs are. And there are a bunch of proposals for interoperability.

So some of them are what we at EFF call competitive compatibility, or adversarial interoperability, which is, uh, legalizing activities like writing bots that scrape people's Facebook [00:39:00] inboxes and import them into third-party services, with their consent and their credentials, so that you can read and respond to your Facebook messages from, uh, off of Facebook. Others are mandates.

So the ACCESS Act, which was introduced first in the Senate in 2020 and has been reintroduced in the House this year, would require Facebook to open up APIs that third parties could use to talk to it. And the European Union has proposed a similar mandate; the British, um, Competition and Markets Authority, as well as regulators in Australia and India and elsewhere, have all, uh, mooted some form of mandated interoperability.

And one of the things that people say when you talk about interop is that this is potentially a huge privacy problem. Indeed, if you watch the markup in the House Judiciary Committee for the ACCESS Act, Zoe Lofgren, among others, said, like, what if the Chinese government is one of the interoperators that can hoover data out of Facebook?

Uh, Facebook may invade your privacy, but it also protects your privacy from these other people who might be worse than Facebook. And this is what prompted Bennett and me to write this paper: to talk about how monopoly is a really bad tool for protecting privacy.

Because monopoly only protects privacy where privacy is in the interests of the monopolist, and monopolists don't have the same interests as their users. The idea that we can defend privacy by trusting monopolies is a pretty incomplete version of how you could protect people. A much better version would be to have a privacy law that spelled out what was and wasn't permitted with our data. And so that's the gist of the argument: that what we should have, rather than a system where we deputize firms that are really bad at privacy to defend our privacy, is that instead we should weaken these firms so that their failings on privacy are not the [00:44:00] last word, so that if a company isn't defending your privacy, you can go somewhere else without paying too high a cost. And then we should hold everyone to account for being good privacy actors by having a privacy law, a real, no-fooling privacy law. 

Jeremiah Roe: [00:44:13] So, to that point that you made about privacy... is there any way to deal with this other than legislative action?

Cory Doctorow: [00:45:24] Lawrence Lessig has this framework for how social change happens, for how social outcomes are determined, where he says that it's all determined by four forces. There's code: what's technologically possible. There's law: what's lawfully permissible. There's markets: what's profitable. And there's norms: what's socially acceptable. And they all work together. And I think what we can say is that there are times when you can't think of a single way to solve a problem, say, with code, and it may be that the way that you open space for a code-based solution is with a law. So, for [00:46:00] example, if we pass a law that requires firms to respect Do Not Track, uh, headers in HTTP requests, then we can write code to audit compliance, and we can use crowdfunded law firms to extract large sums of money from companies that don't obey DNT directives. And so you can bring markets and code and law together. And, you know, if we start shutting down and demonizing these companies that don't respect our privacy, then we also make a normative shift: those companies become bad actors.
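A minimal sketch of the kind of compliance audit mentioned above: send a request with the Do Not Track header set and flag responses that still try to set cookies. The DNT header itself is real; the target sites and the "any Set-Cookie means tracking" heuristic are simplifying assumptions for illustration, not a legal test of compliance.

```python
# Rough sketch only: request each page with "DNT: 1" and report whether the
# response still sets cookies. The site list and the cookie heuristic are
# illustrative assumptions, not a real enforcement standard.
import urllib.request

SITES = ["https://example.com", "https://example.org"]  # hypothetical targets

def sets_cookies_despite_dnt(url: str) -> bool:
    """Return True if the site sets cookies even when Do Not Track is requested."""
    req = urllib.request.Request(url, headers={"DNT": "1"})
    with urllib.request.urlopen(req, timeout=10) as resp:
        cookies = resp.headers.get_all("Set-Cookie") or []
    return len(cookies) > 0

for site in SITES:
    verdict = "sets cookies despite DNT" if sets_cookies_despite_dnt(site) else "no cookies set"
    print(f"{site}: {verdict}")
```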

People don't want to work for them. Um, people won't stick up for them when they go to Congress, or when they go to court and say, hey, I'm being persecuted for something that's totally legitimate. People will say, no, that's not legitimate, and we can tell, because we have this social consensus that what you're doing is wrong.

And so I think that you're right that law is a piece of it, but I actually think there is a role for toolsmiths. I think we need to get the law out of the way of toolsmiths to a certain degree. We have to create what I've been calling an interoperator's [00:47:00] defense, where modifying a product or service for a legitimate purpose, such as accessibility, security improvement, otherwise lawful features that users have requested or desire, and so on, means you should have an absolute defense against claims under patent and copyright, including anti-circumvention, and under cybersecurity laws like the Computer Fraud and Abuse Act, and so on, so that people can move with a certain degree of, of surety to give users the tools that they need to liberate themselves.

Bella DeShantz: [00:47:31] We've talked a little bit about how we fix privacy problems: privacy law, things like that. How can we fix, how should we fix, the cybersecurity problem?

Cory Doctorow: [00:47:48] Gosh. I mean, I wish I knew. Clearly, it's a really dire situation. 

Jeremiah Roe: [00:47:54] It's a huge question too, based off your experience, you know, to Bella's point. 

Cory Doctorow: [00:48:00] I think what we need to do is break the problem down into two pieces, right? So, when you're at the bottom of the hole, stop digging. What can we do to stop making the security problems worse, and then how can we fix them, are two separate questions. So one of the reasons security problems are bad now is because of structural impediments to disclosing them and remediating them.

So you can't push an update to an iOS device that Apple hasn't signed. You can't push an update to an embedded system in a car that the manufacturer hasn't signed. And that would be great if Apple and the carmakers were perfect. They're not. And the cases that they consider when they think about their security model are incomplete, and they might treat your case as an edge case.

And that's much more likely to be true if you are a marginalized user than if you're a user who already enjoys a lot of remedies, privileges, and [00:49:00] third-party, you know, boosts and power-ups that let you navigate the world. One of the things that we need to take into consideration is that the security apocalypse is here; it's just not evenly distributed. And so when we make disclosure lawful, and when we put remediation into the hands of users as well as vendors, we allow the people who are getting the hardest version of the InfoSec apocalypse, uh, to derive the most benefit.

Um, so, you know, if we can get the bad stuff out of the way, maybe we can think of some good stuff. 

Jeremiah Roe: [00:51:26] We could certainly speak with you for hours.

I think I personally could. How can our listeners reach out or find more information about you? And could you share something with us that you wouldn't necessarily know from your LinkedIn profile, something that you wouldn't put on there, but that's very true about who you are? 

Cory Doctorow: [00:52:13] Oh, wow. Um, well, in terms of how to find me, I, uh, for a long time, I was the co-editor of Boing Boing. I still co-own the site, but I don't edit it anymore. I left it in early 2020, and these days I'm doing my own little non-commercial cross-media extravaganza called pluralistic.net. And you can read it as a series of daily Twitter threads, or as Mastodon threads,

or as blog posts, or as full-text RSS, or as a plain-text email delivered, uh, with no embedded links, exclusively by Mailman, uh, with no HTML, so that there's no security issues. In terms of stuff about me that people don't know, or that's kind of not widely understood, I guess, you know, um, the thing that I don't talk about an awful lot in public is my family.

Uh, you know, I'm a dad and a husband, and, uh, I'm very busy, very proud of who I'm married to and who I'm the father of. Uh, but it's a domain where talking about those things in detail, uh, requires, uh, um, more mutual consent than is practical, right? And so, you know, if you just read my blog posts and my writing, you wouldn't know how important those things are to me, how much of my day and how much of my thoughts revolve around those two other people in my family.

Um, but, uh, it really is [00:54:00] as important as any of the other things that I do in my life. 

Jeremiah Roe: [00:54:03] Awesome. Thank you so much, Cory. We certainly appreciate your time. 

Cory Doctorow: [00:54:11] Oh, my pleasure. Thanks for having me on.

Bella DeShantz: [00:54:13] Thank you so much for sharing with us the things that are important. [00:55:00]