WE'RE IN!

Micah Hoffman Breaks Down OSINT, the Dark Web and Beer Apps

Episode Summary

In this episode, Micah Hoffman talks about his career in Open Source Intelligence (OSINT) and the value it has for investigations, cybersecurity and understanding how information is weaponized. He also gets into strategies for safeguarding personal privacy in the face of increasing digital surveillance. This episode will have you thinking twice about what you post on social media!

Episode Notes

In this episode, Micah Hoffman talks about his career in Open Source Intelligence (OSINT) and the value it has for investigations, cybersecurity and understanding how information is weaponized. He also gets into strategies for safeguarding personal privacy in the face of increasing digital surveillance. This episode will have you thinking twice about what you post on social media!


Why you should listen:
* Hear from one of the leading Open Source Intelligence researchers working today.
* Learn about the value of OSINT for offensive and defensive cybersecurity.
* Get a better understanding of all the privacy risks from fitness trackers, apps, shopping online and social media.  

Key quotes:

* "OSINT is a reconnaissance skill. It's all about that preparation work that needs to be done before you do anything in cyber, whether it's attacking or defending."  

* "Once things are on the internet -- or once things are even collected, not necessarily on the internet -- you've lost control of it."
* "The reality is that we give up our privacy every single time we use an app, every single time we choose to purchase something."

Links:

* https://www.spotlight-infosec.com/

* https://osintcurio.us/

* https://www.synack.com/

Episode Transcription

[00:00:00] Jeremiah Roe: Welcome to the show. Micah, thank you so much for joining us today. Um, I'd like to introduce you to my, uh, exceptional co-host, Bella. How are you doing?

[00:00:32] Bella DeShantz Cook: I'm pretty good. Jeremiah, thank you for asking. I'm definitely looking forward to this conversation today. I know very little about what we're going to talk about, so, uh, I'll get to learn some.

[00:00:42] Jeremiah Roe: Yeah, Micah, uh, uh, Micah and I actually used to, for those that don't know, Micah and I used to work together. Uh, we previously worked together over at another government contractor and, um, I used to work directly for Micah, funny enough. Um, and, uh, one of the, one of the better bosses that I've had, uh, throughout the years, [00:01:00] uh, always challenging individuals to do better, show impact and, um, make it worthwhile for leadership to understand, uh, from, uh, both an operational perspective as well as a business impact perspective.

So again, thanks so much for joining, Micah.

[00:01:18] Micah Hoffman: Well, thank you for that and for, uh, asking me to be here. Thank you.

[00:01:22] Jeremiah Roe: Oh, certainly my pleasure. Um, I know, um, Micah, since you, since you've left, uh, previously when, when we worked together, you've, you've, uh, dived really deep into a lot of the passions that you've had, um, in the process. And one of those is of course, open source intelligence gathering. And so, um, from there, you've transitioned into starting your own company called Spotlight InfoSec, and Spotlight InfoSec, that, that mainly revolves around open source intelligence gathering, or OSINT.

And I was just kind of wondering if you could, uh, talk broadly about what OSINT is, [00:02:00] um, and how cybersecurity professionals and adversaries use it for both defensive and offensive purposes.

[00:02:07] Micah Hoffman: Sure. Open source intelligence, or, as you said, OSINT, is a reconnaissance skill. It's all about that preparation work that needs to be done before you do anything in cyber, whether it's attacking or defending, or figuring out what has happened on somebody's, uh, mobile device if you're doing digital forensics. I love it because it's truly an investigation.

You are trying to piece together different pieces of publicly available information and figuring out what they all mean. I remember back when I was doing pen tests for that government contractor that you and I worked with. And, uh, I remember one time I was doing a penetration test and I was testing a web app.

And part of our process was just Google the name of the website. And I, I just Googled [00:03:00] it and I pulled back, uh, a PDF help document that said, hey, um, if you want to log into this website, use a username like this and a password like that. And wouldn't you know it, I just typed those exact credentials in.

And I got, I logged right into the website and I was like, wow, this is so powerful. Who needs, like, hacking when I can just log right in? So it was pretty neat at that time. And I knew that that's where I wanted to be.

[00:04:42] Bella DeShantz Cook: Uh, just literally Google searching for, uh, a customer that you're trying to do a penetration test for, finding some information and going from there. Um, are there any, like, is there any, do you find yourself having to kind of clarify exactly what OSINT is? Um, like do [00:05:00] people think of it literally just as just Googling? Like, is there more to it than that?

[00:05:06] Micah Hoffman: Absolutely, but it really depends on who I'm talking to. You know, when I talk to my family and friends about what I do, I, I usually just simplify it and say, um, I find things about people online, about IP addresses and computers, or about businesses. And that generally helps out. Um, of course we do use search engines, and of course there's a good percentage of people on the internet that think we just do fancy Googling, but that's not all of the things that we do.

Uh, so yeah, a lot of it is explaining that just because the resources are online, and just because they're freely accessible, doesn't necessarily mean that everybody is doing open source intelligence, or doing OSINT, or doing recon, because there's, there's so many different nuances to what sites you visit, how you collect the data, how you [00:06:00] store it, and how you interpret and analyze the data, that are all very important pieces of that overall OSINT process.

[00:06:07] Bella DeShantz Cook: And like on the, on the other side of misunderstandings about OSINT, do you ever have to clarify, uh, clarify to folks that it's not malicious? I know, you know, I have a background in penetration testing, and from folks who didn't know penetration testing, I would constantly get these questions of like, isn't what you're doing bad?

And I know, like, I know that there are people who think OSINT is just, it's like it's spying and it's, it's, it's bad. How do you clarify that for folks who kind of don't really understand that?

[00:06:39] Micah Hoffman: We do it very similarly to how you might do it for penetration testing. When you're doing pen testing, you know, there is a line that you will not cross without the rules of engagement with your customer, without your customer's permission. And so in essence, we have the same thing, except it's a little bit different for us, as a lot of the [00:07:00] data that we have, or that we collect, is online and it's publicly accessible in our area.

Um, we have not only laws governing what we can and cannot collect about people or companies, we also have ethics, which plays a big part in what we do. So, uh, here's an example, you know, um, using breach data. When some attacker breaks into somebody's website and steals that data and releases it online, that data is publicly accessible, but it's stolen data, and it's data that has sensitive information in it or about other people.

And some customers don't want us to use that stolen data in our work. Some customers don't mind it, because they know that attackers are monetizing or, or, uh, using that data for bad things. So, uh, we, I think we draw the line kind of where ethics are and, and rules of engagement, much like cyber.[00:08:00]

[00:08:00] Bella DeShantz Cook: And what is so similar to, you know, like you said, I, I am hearing a lot of similarities between OSINT work and penetration testing. And I think, uh, for me, working as a pen tester, a big way that I was able to explain it is like, there's this really positive end goal, right? We, we submit our vulnerabilities and then they're fixed or managed or something.

What's the kind of end goal with OSINT?

[00:08:26] Micah Hoffman: It really depends on who your customer is and what they asked you to do. Uh, so in my OSINT consulting practice, and I know that's a very consulting answer, right? Uh, but in my OSINT consulting practice, you know, I might be doing some work where somebody is looking to find out if their spouse or partner is cheating on them.

So we usually start with some kind of question or goal, and then we, we gather the data and analyze it. But some of the work that I and my colleagues have done, it can be anything from, uh, [00:09:00] researching businesses, uh, doing competitive intelligence or due diligence, trying to find missing people, uh, to aid law enforcement, to, uh, to researching IPs and domains to see who's attacking who, and maybe get a little bit of attribution. But to go back to your original question, it all is very much guided by that overall mission question of what do you, what does a successful engagement look like for you?

Um, but in the OSINT world, I'm more of an advisor than an action taker, you know. In OSINT and recon, we gather all the things, we analyze them, and then we recommend, we suggest. In the pentest world, um, and in the cyber world in general, you, you do, you do some of that. Like with pen testing, you'll find the holes in applications and systems, and then you'll say, here's what I found and here's what I suggest you do.

So, um, it is very similar to cyber work.[00:10:00]

[00:10:00] Jeremiah Roe: In my, in my current role, um, utilizing a lot of these methodologies that Micah is talking about, um, there was a client that we were working with that requested we conduct an investigation on them prior to, uh, conducting penetration testing. Um, and I was put in charge of doing that. And again, utilizing some of the various tools that, that Micah has, uh, had a direct hand in, and methodologies that he's speaking about, we developed this portfolio, um, as to the attack surface for the, uh, for the organization, as well as, you know, um, sort of other meta information.

So. Uh, things about personnel, things that we could track and, and connect to other little bits of data that we're able to pull. And in whole, you form this much more clear picture as to what's happening with your organization, with executives, with individuals, with system admins, uh, et cetera. And you can create this big picture that [00:11:00] attackers will generally utilize towards an attack.

Um, long story short, when we pass this information over to, uh, this potential client, uh, they, they were so freaked out by the information that we shared with them, that it took them three months to get through the data before they cleaned some stuff up in house. And then we were able to engage. So, uh, it can be a powerful tool.

[00:11:23] Micah Hoffman: Oh, absolutely. And, and I think one of the things that you just mentioned is, is extremely important, that many times people don't realize what is online and being revealed about their organizations, themselves, their activities and their families. And this is not only here in the United States, it's international as well.

Yes, in the United States we share a lot more, the government shares a lot more, and companies share a lot more about us. But even internationally, there are a lot of ways that we can collect information about people, uh, businesses and computers.

[00:11:57] Bella DeShantz Cook: I know you mentioned, uh, one of the aspects [00:12:00] of your work involves working with law enforcement. Um, are you working with them on a lot of criminal investigations? And can you tell us a little bit about what that's like as much as you can say? Of course.

[00:12:13] Micah Hoffman: Actually, I don't directly work with law enforcement. Um, I usually am, um, retained by a lawyer that's working on a certain case, uh, the, uh, defense or the prosecution, asking me to gather certain data and present it to them. And then they deal with law enforcement. However, I do work with the NCPTF, the National Child Protection Task Force.

And with them, I do interface with law enforcement a little bit, uh, getting images, uh, getting other pieces of data to help them locate missing people or people that are being trafficked. And it's, uh, it's wonderful. They're a great group of people. Uh, the, the people that I've worked with are top notch, and I've always been impressed [00:13:00] with their dedication towards resolving the case, uh, whether it's a missing person or a trafficked person.

And I've always been impressed with, uh, how they, how they are willing to share whatever they were allowed to share and accept the help. Um, uh, Jeremiah, I bet you and Bella, you two have probably encountered those clients, you know, who are going to benefit from your services. You know that all of this is online and there's high risk.

And when you tell them, they're really hesitant, reluctant to accept your well-meaning, even paid-for advice, because it goes against what they want. Um, with the law enforcement that I've worked with, they are extremely grateful for the assistance.

[00:13:43] Bella DeShantz Cook: Yeah, I'm glad that you brought that up, because I actually wanted to hear a little bit more about that. I saw that you work with the, uh, like you said, the NCPTF, the National Child Protection Task Force. Um, and, uh, I think that's one of the areas where I actually have heard a little bit more, uh, OSINT [00:14:00] talk, of how, how beneficial it was, or where it can be, for this kind of endeavor.

Um, would you tell us a little bit more about, um, how OSINT is related to this, uh, this task force, um, and what you do?

[00:14:13] Micah Hoffman: Sure. So the National Child Protection Task Force, or NCPTF, and other organizations similar to it, are groups that aim to assist law enforcement and other groups with finding missing people. Um, as it was explained to me, when somebody goes missing, and that person has been groomed by a sexual predator or child predator, one of the things that that person that's grooming them will tell them is, when you leave the house to come meet me so we can do whatever,

I want you to leave a note saying this, like, I've run away, instead of, I'm meeting this person. Because the language that is [00:15:00] used in some of these, these notes and letters to parents and loved ones changes how law enforcement can or cannot interact with it. Um, with the NCPTF, what we are is a task force of, uh, volunteers that get together, that pool our knowledge and our resources to work on some of these more challenging cases, uh, cases that are not actively being worked on because either they're too old or there's no additional data points to pivot on, and we generate additional leads and provide that to law enforcement. And then they, uh, will go and, and do whatever it is that they do to, to continue working on the case.

Um, and there've been a lot of successes, both in the NCPTF world, as well as in, uh, many of the other, uh, well-meaning nonprofits that are, that are helping out in, uh, these OSINT-for-good types of situations. [00:16:00]

[00:16:00] Jeremiah Roe: So with information being so freely available online and accessible by almost anybody, um, that has access to the internet, um, are there any kinds of data, um, that people should be more careful about posting online than, say, other kinds of data? And given your particular background, I would imagine that you specifically tailor the types of data that you reveal, um, so that others can,

you know, um, obtain information based off of the data you want them to see, as opposed to the information you don't want them to see.

[00:16:41] Micah Hoffman: Yeah, that is an extremely deep question, Jeremiah. There, uh, I mean, there are so many different aspects to what you just said. Uh, first off, you have the concepts of what I want people to see versus what they are able to find out. Um, so there's that intention: in some of [00:17:00] our social media, we choose to make it public.

You know, if I make my Instagram public instead of private, you know, I am allowing more people to, to view my social media and my interactions. But there's also that intention of what I post on those platforms, and the intention of what username I posted under, what account I posted under. And so there's, there's a lot of places where I may have access

to privacy controls to control what I, what I put out there on the public internet. But also it's a matter of choosing, do I want to, or do I not want to? So it's not just, does the platform allow me to, but it's, should I do this or should I not? And that's another level of, of, um, privacy that, that I choose to implement.

So yeah, I don't have social media on certain things. I don't do a lot of things. Uh, I'll give you an example of that, less in the OSINT world and just more in the privacy realm. I [00:18:00] just got, uh, an Apple Watch, and I was like, ooh, I'm going to track my fitness. And so I was like, cool. You know, Apple, you know, they protect your stuff, you know, tracking it, whatever.

And, and so I, I, I let it track my steps, and I let it track when I stood and when I didn't stand. I didn't do the sleep thing, cause that was just freaky. So I went ahead and did that for about three or four days. And then I looked at all of the things, and in my Apple, uh, in my iPhone, it shows, you know, the minute that I got up and stood, how many flights of stairs I went up, at what date, at what time, when I was walking, when I was running, when I was moving.

And I started thinking, with just this metadata about what I've done, that's a lot of data that I'm not comfortable with. And I turned that off.

[00:18:51] Jeremiah Roe: I'd I'd like to hover on that for a moment, if we can. Uh, because I know specifically that you've done a particular talk a few times around, [00:19:00] um, uh, various kinds of fitness applications without sort of naming names. Um, I know that, uh, it's a particularly interesting talk and I was wondering if you could maybe speak a bit more as to what you're able to identify with. 

[00:19:14] Micah Hoffman: Sure. The, the thing I like about OSINT is that the more you know about web technologies, the more you know about how the internet is connected and websites are put together, that, that core underlying knowledge that is not directly cyber, not directly OSINT, the more you can grab off of websites and the deeper you can go.

So, uh, there was one time when I went for a bicycle ride with one of my, my friends, and she sent me a link to an exercise app's results page for the activity that we went on. And I looked at the page, and I was amazed that this data about where we went and how fast we went and who she went with, um, wasn't protected at all.

And, [00:20:00] uh, so the talk, Running Away From Security, that I did back in 2015, uh, was one of those talks where I dived into the world of exercise applications. And, um, kind of circling back to what we, what we just talked about: even though I might choose to restrict who has access to my data on certain platforms, the platforms themselves may not have the privacy controls in place to keep people from, uh, retrieving my data.

And I'll give you an example without naming names. On this exercise application platform, you could turn on all of the privacy things to not reveal your last name, to not reveal the city you are from, and implement all those things and feel good about it. But if somebody just guessed the number of your ride, your walk, your, your run, they could still pull up all [00:21:00] that data.

And then, based upon the geographic location of where you started and ended your run, your walk, your bicycle ride, probably figure out where you, um, worked or lived. Um, and, and I think that's another one of the, the skills that I love in OSINT, is that one, there's kind of detective work.

This thing leads to that thing, leads to that thing. Um, so finding, hey, this exercise app is not protecting data, is one thing, but understanding how to collect it en masse, analyze it en masse, and figure out what to do, those are, uh, some of the other interesting things that I love doing within OSINT.

[00:21:40] Jeremiah Roe: So with those applications and or social media platforms that might be overly verbose in sharing the information that you don't want to have shared. Um, are there, are there any, any specific apps or again, social media platforms that you just will not use for that very purpose? 

[00:21:59] Micah Hoffman: [00:22:00] So the, the answer is: Micah Hoffman personally, absolutely. There's no way I'm putting certain mobile apps on my phone, uh, for certain platforms. Um, but Micah Hoffman professionally, I absolutely do put those apps on my phone. Because OSINT, uh, and privacy are, are, um, inversely correlated. So as your privacy goes up, necessarily, the things that we can find out about you on the internet go down, and as your privacy goes down, we can find more. Those apps that we put on our phones, those social media platforms that may take a little bit more of our data, or may not protect it,

those present opportunities for people to find out risky things about us, or just other things that I don't want out there. So yes, personally, I'm not using certain platforms, and I won't even put those apps on my phone.

[00:22:53] Jeremiah Roe: Yeah. Uh, so just as an example, um, there are companies that are very prevalent [00:23:00] in the news today, and I, I think we can freely say, you know, there are concerns around such platforms such as Facebook or, or Google even, right. Um, around privacy concerns and information that they share and allow to be accessed by other users, um, with that particularly in mind.

Um, are there, are there any sorts of, of, um, are there any things that you would recommend towards privacy concerns that individuals have with regard to the apps that they're currently using?

[00:23:35] Micah Hoffman: That also is a huge, huge issue. Um, because it, it, it really depends on who you are and what you're doing. Um, the first thing that I always suggest is, you have to understand what's out there. And this goes back to what we were talking about in the beginning of the podcast, that, that if you don't know what data is out there, that's either being shared by the social media apps or Google, or just shared by [00:24:00] your church or temple or mosque, then you can't take steps to remove, remediate, or otherwise, uh, um, manage that data.

So what I like telling people to do is go to whatever social media platform you're using and download your data. You know, Twitter, Instagram, Facebook, even Google has these Takeout things. Download your archive, download all that stuff. And it doesn't matter if you're going to go and remove your account or whatever, but just see what they collect on you.

I'll tell you this. When I downloaded my Twitter data, I saw that Twitter had collected and stored every single direct message I ever sent, every single like, when what friend or colleague, uh, joined Twitter, and how I, I followed them or they followed me.

And same thing for Facebook; they'll give you the facial recognition [00:25:00] ID code for your face that they use to detect your image in other people's photos. Just seeing that vast amount of data that these companies collect about us absolutely changed my behavior. And, uh, that's, that's one of the first things that I suggest to people.

But aside from that, I mean, after you understand what they have, then you need to decide what your personal risk model is. You know, what are you comfortable with people knowing online? And, um, each one of us has a different level of that. Um, when my kids were growing up, I talked to them about this, and they actually helped me with some of my, not casework, but some of the things that I was doing for my classes, and I would show them things.

I'm like, hey, look, we can see inside this person's house and all of the toys that they have, and look, here's a bill on their desk, because one of their kids is doing some kind of social media streaming and we can pause the video or whatever. So [00:26:00] I think education is an important part of reducing your online exposure, your online information, as well.
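
For listeners who want to try the download-your-data step Micah describes, here is a minimal sketch of poking through a Twitter archive with Python, not something from the episode. It assumes the extracted archive has a data/ folder with files such as direct-messages.js and like.js in Twitter's "window.YTD.<name>.part0 = [...]" wrapper; the exact file names and layout vary by platform and change over time.

```python
# Minimal sketch: count what a downloaded Twitter archive holds.
# Assumes files like data/direct-messages.js exist and use the
# "window.YTD.<name>.part0 = [ ... ]" JavaScript wrapper; adjust paths as needed.
import json
from pathlib import Path

def load_ytd(path: Path) -> list:
    """Strip the JavaScript assignment prefix and parse the JSON payload."""
    text = path.read_text(encoding="utf-8")
    return json.loads(text[text.index("["):])  # payload starts at the first bracket

archive = Path("twitter-archive/data")  # hypothetical extracted archive folder
for name in ["direct-messages.js", "like.js", "follower.js", "following.js"]:
    f = archive / name
    if f.exists():
        print(f"{name}: {len(load_ytd(f))} records retained by the platform")
    else:
        print(f"{name}: not present in this archive")
```

The point is not the script itself; it is seeing, in concrete numbers, how much of your own history the platform has kept.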

[00:26:08] Bella DeShantz Cook: I have sort of, like, an interesting thought slash follow-up on that that you'll have some perspective on, I feel like. You know, my, my understanding of social media currently, um, it seems like there's a lot of emphasis, or like, it seems like a lot of people are embracing this idea of, like, being transparent online and sharing with your friends and followers

exactly what's going on, and, you know, these open conversations about what's going on in your life. Which, cool. But they, they do come at a cost. At least, at least, I think that's what all of us on this call probably believe. How do you, like, how can we better think about that balance of having maybe honest conversations online, sometimes, in some spaces, but also still recognizing the cost in privacy that is associated with it? [00:27:00]

[00:27:00] Micah Hoffman: Well, the, the way that I found is most useful is to aggregate data and then show that entire profile, or show what that shows. For instance, I had a colleague, um, and I'm not sure if, uh, if you've seen this, this talk I gave or this tool I released, I had a colleague that was sharing his drinking behavior online through some social media, uh, app.

That is, uh, I think I can name the app. It's called Untappd. And it's a simple app. You put it on your phone, you join with a social account, and then whenever you drink a beer or you buy a beer, you take a picture of it. You say, oh, this beer tasted of oats and honey, and I drank it in this location at this date and time with these people.

And you can rate the beer. And when this, this colleague of mine said, hey, you know, I just earned a badge for drinking the most porters in [00:28:00] Singapore, um, I replied to him, like, why are you telling me your drinking behavior? Aren't you, like, aren't you scared that somebody is going to do something with it?

Cause it was just one thing. It was just a single badge that the site had given him. And he said, well, you know, what's the worst that somebody can do? I was like, all right, game on. So let's figure out the worst, right? I mean, Jeremiah, you're, you're smiling here, but that's exactly what I'm thinking: well, what is the worst that somebody can do?

So I created a Python script to harvest all of their drinking activity, so then we could analyze it. This tool that I created can analyze the days of the week, the hours of the day, the, the days of the month when somebody drinks, where they drink, shows a nice little heat map of the locations they've checked in, who their friends are.

And when I showed that to my colleague and said, well, here's the worst, I had a whole profile of when you're drinking, where you're drinking, who you're drinking with, where you visit, when you visit it. That made [00:29:00] my colleague change his profile from public to private.
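
To make the aggregation Micah describes concrete, here is a hedged sketch, not his actual tool, that takes check-in timestamps you have already exported (for example, from your own account's data download) and builds a weekday-by-hour profile. Even a handful of innocuous timestamps starts to reveal a routine.

```python
# Minimal sketch: turn already-exported check-in timestamps into a routine profile.
# Input format is an assumption: one ISO-8601 timestamp per check-in.
from collections import Counter
from datetime import datetime

checkins = [
    "2021-06-04T17:32:00",  # hypothetical sample data
    "2021-06-11T17:45:00",
    "2021-06-18T18:02:00",
]

counts = Counter()
for stamp in checkins:
    dt = datetime.fromisoformat(stamp)
    counts[(dt.strftime("%A"), dt.hour)] += 1  # bucket by (weekday, hour)

# The busiest buckets are the "routine" an observer would see.
for (day, hour), n in counts.most_common(5):
    print(f"{day} {hour:02d}:00  ->  {n} check-ins")
```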

[00:29:05] Bella DeShantz Cook: It seems like that's the information people are missing that like, okay. Maybe sharing a picture of a beer seems really innocuous and maybe it is, but the way that that data can be aggregated and the way that you can find patterns, I think that's the, that's the part that really frightens me.

[00:29:20] Micah Hoffman: Yeah. And

[00:29:22] Jeremiah Roe: I think there's two. Uh, so sorry, go ahead.

[00:29:22] Micah Hoffman: No, I was going to say, just to confirm what Bella just said. Yeah. I mean, in, in the profiles that you all create when you're doing your recon, one of the things you do is you don't just, you know, find one social media profile, you find them all. Then you look at what media is being shared, and, and, and you go deeper, and it's that aggregation of all of those things

that really is the compelling thing that can change behavior.

[00:29:47] Jeremiah Roe: To take that a step further, with the heat map that Micah had created for his friend, you know, utilizing, um, typical human behavior from a psychological perspective, and I think Micah will appreciate that. [00:30:00] Reasons aside, um, there's, uh, there's a particular, um, thing that you can develop. Uh, typically people are going to focus around either their place of work or their home.

And when they start visiting these locations and you collect all this different data, you can, uh, utilize different sorts of algorithms to showcase the likely location based off of distance traveled, amount of times visited, um, information that can pinpoint where they potentially live. Separately, um, if this information is shared with an employer, right, you may form a separate kind of opinion, from an employer's perspective, that they may be concerned with, such as, is this person at risk, if they work in the government, for disclosing information because they've got a drinking problem? Or are they, you know, there's so many different kinds of things that could be taken from that.

And that can also be abused by somebody who may or may not have malicious intent towards this person.

[00:30:56] Micah Hoffman: Well, I mean, the, the simple analysis of [00:31:00] when they're logging their beers. The tool only looks at when they're logging the beers. So if I drink a beer now, and then in five hours or two days later I say, oh, I did this, I'm only getting when the beer's logged. But you know, if you look at when people are logging beers, you might be able to see that, hey, you know, you work from 9:00 AM till 5:00 PM, but all your beers are drunk between two and four.

What is happening there? And you're right, I mean, that, that could be something that's very embarrassing. But let me take it a step even further, Jeremiah. I know we're getting all into this, but there, there's actually two steps here, and this is all within this one app. When people, um, when people share information, think about their human behavior.

People that really love this application, uh, they will travel around the world and they will log in or check into places and say, oh, I drank a beer here. I do demos where I take a single [00:32:00] person and then look at their last 25 beers that they've logged. And you can see this person went to this airport, had a beer.

Then they went to this Hilton, this Hilton hotel or Hampton Inn, and then they walked to the local pub two blocks away from the Hampton. And then they got on a plane, because they went to the airport, they went to another city, stayed in another place, had a drink at the bar, and you could see this repeated behavior.

We can track people around the world. Just to make it even super scary: uh, I had a student one time, when I showed this to them in my class, I said, uh, you know, look, we can do this to track people. And he said, well, you know, I work across the street from, I work at a, um, a secret location, and across the street is a restaurant or a bar or something.

Could you watch who drinks at that bar to find patterns? I said, ooh, yes. And I made another tool. [00:33:00] And what it does is it just watches places, and then every day it records who logged beers there. And since November of 2019, oh, we're coming up on two years, two years, I have been, uh, recording, uh, every day, who logs beers at over 75 international airports.

So we can track people as they're traveling around the world. We can track when the pandemic hit, and the decrease in airline activity, as evidenced by fewer people drinking in airports, and more.
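
As a rough illustration of the airport example, a sketch along these lines (again, an assumption, not Micah's actual tool) could turn a daily log of check-in counts at a venue into a monthly trend, which is how a drop like the 2020 travel slump would show up. The CSV layout here is hypothetical.

```python
# Minimal sketch: monthly trend from a daily check-in log.
# Assumed CSV layout: date,venue,checkins  (e.g. 2020-03-15,Example Airport,4)
import csv
from collections import defaultdict

monthly = defaultdict(int)
with open("airport_checkins.csv", newline="") as f:  # hypothetical log file
    for row in csv.DictReader(f):
        monthly[row["date"][:7]] += int(row["checkins"])  # key by YYYY-MM

for month in sorted(monthly):
    print(f"{month}  {monthly[month]:5d}  " + "#" * (monthly[month] // 10))  # crude text chart
```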

[00:33:36] Jeremiah Roe: Not to mention, to see when people are or are not at home.

[00:33:42] MIcah Hoffman: Yeah, exactly. 

[00:33:44] Bella DeShantz Cook: That's the thing. Like, I distinctly remember being a little bit younger and having an Instagram account and being so excited to get to share all my cool stuff that I do with all my friends. Uh, I'm very inactive on Instagram now. Um, and I remember going on vacation once and my [00:34:00] dad being like, don't post any photos while you're gone.

And I was like, oh, Dad, please. Like, I need to show off to my friends. I have to, you know, they have to see all my cool vacation photos. And he was like, take all the photos, post them when you get home. Uh, because if you post them while you're gone, people will know that you're not home. And that was, like, the first time, I think, in my life, I realized,

well, people can really, like, make connections based on the information that you're putting online. And I think, you know, that's one data point, but as, as we have so many more areas where we're putting information, and maybe, like, maybe other people know about, don't post vacation photos on Instagram, but maybe they don't think about their beer.

Uh, it's, it is really an interesting thing to think about, how much information you're giving away just from those small little snippets.

[00:34:47] Micah Hoffman: Oh, absolutely. And I mean, there used to be a website out there, it was a proof-of-concept website called Please Rob Me, and that's exactly what it did, Bella. It was, it was out there to say, look, this person just checked in here [00:35:00] or did this, they're probably not at home. And the site quickly went offline because of the creepy factor, of course.

But, but the point is, is exactly what you, what you're talking about. Now. I know people in the cybersecurity world who are normally very cybersecurity aware and, and focused on, you know, good cyber hygiene that posts incredibly detailed, personal things online. And if you just look at their Instagram, Twitter, Reddit profiles across a month or a year, you can see a lot about what they do, where they do it.

It doesn't make me sad anymore. It used to make me sad, and I used to feel like, you know, Captain Privacy, I have to save them and tell them this is wrong. But the reality is that everybody has their own personal risk level and risk tolerance. And, Bella, you changed your behaviors, probably, once you became a pen tester and understood how [00:36:00] people use these things, right?

[00:36:02] Bella DeShantz Cook: Oh, yes, very much so. So, we've talked a lot about, like, all of the different ways that we can, you know, inadvertently or even, you know, fully aware, uh, put all this information online. Um, and we've talked about the kind of negative consequences. Um, say that you have, you know, maybe someone listening right now, or someone like me right now,

Maybe you've identified some area where you've put more information than you're comfortable with online. Um, what steps can you take to, to kind of scrub that data or, or, or maybe improve your habits going forward? Okay.

[00:36:38] Micah Hoffman: So the first thing is figuring out what's out there, and for a fee, I will do that for you. No, I just had to throw that in. No, the first thing is figuring out what's out there. And one of the best ways to do that is just look around the application. Uh, all of the social media, usually it's social media. Over in the United States,

we also have these people search [00:37:00] websites, people finding websites, because companies can buy and sell our personal data. So depending upon where the data is, you take different actions. So if it's on social media, you can take steps to protect your data. On some platforms, you can take steps to remove and delete that data.

If it's on something like a people search website, you can, on many of those sites, remove your data or ask them to not publicly show that data. They'll still keep it, they just won't show it to people. Um, but that's more of a game of whack-a-mole, because you ask it to be removed from one website and then another one pops up, and they're like, you didn't ask me to remove it from findpeople.com, you asked peoplefind.com. Um, so that's generally not great.

Dot com you asked from people fine.com. Um, so that's generally not great. Um, and then the other way that you can do it is instead of trying to remember. Data, uh, put misinformation out there about yourself. Uh, so I have, uh, colleagues that have [00:38:00] very unique names and instead of trying to remove all the places where their name exists, they just create extra social media accounts and then post other things like they like turtles and whatever fried green tomatoes, uh, instead of the real things.

Yeah. So, so that's another way of, of defeating, of, uh, changing things.

[00:38:20] Bella DeShantz Cook: I've actually never 

[00:38:21] Micah Hoffman: You have to do that, you have to do that over and over and over again, because, just like a pen test, what you get is at a point in time, right? So if you remove everything today, Bella, tomorrow, the next day, the race that you ran in will publish your name and your, your age and your gender and your time. And there's more and more, more stuff out there. Sorry, I interrupted.

[00:38:43] Bella DeShantz Cook: No, no, it's, it's all great. I, I've never heard that tip about adding information. So, I have a relatively unique last name, and I always am frustrated by my friends who are like, you cannot find them on the internet, because they have the most generic names. Uh, I love my name, but also [00:39:00] in that one aspect I'm jealous of, of more generic names.

And I've never heard that tip of just adding fake information. And, uh, that actually sounds a little bit fun.

[00:39:10] Micah Hoffman: Yeah. Yeah, absolutely. And I mean, in the United States here, it goes a little bit deeper than just making social media accounts for other versions of you with other ideas and other likes and things. Uh, here in the United States, you have to think about how these data aggregators, like the people finding sites, are collecting the data that they then share on the people finding sites.

And one of the ways is, you know, based upon your real retail use. So if you're buying things from online retailers, or using your loyalty card to get discounts somewhere, all of that advertising and marketing and demographic data gets aggregated and then released online. So instead of you getting packages from Amazon to your name, Bella, at your address, maybe [00:40:00] start getting them to John Doe or Jamie Smith or something like that at your address.

So you're the only one getting it, but just changing your name sometimes on some of these e-retailers can help.

[00:40:12] Bella DeShantz Cook: really smart. I think. Thank you. 

[00:40:15] Micah Hoffman: Yeah. And it's a simple thing. It really is a simple thing.

[00:40:18] Bella DeShantz Cook: Are there, wait, so, um, so sorry, I keep having more follow-ups, I'm just so interested in all of this. Um, are there any other, you mentioned online retailers and, and, like, uh, reward systems and things like that. I see those as areas where you're giving information on yourself actively, like, intentionally.

So are there any other weird areas that folks might not think about?

Sorry to put you on the spot.

[00:40:45] MIcah Hoffman: it be. I mean, that is, that is a deeper question. The answer is yes, of course, anything and everything you do on a smartphone is collected. Um, and it's not just, yeah, exactly. Jeremy, Jeremiah, it's it, everything [00:41:00] in everything, everything, and anything you do on your smartphone is not collected, but there's telemetry telling Google or apple where you are, what you're doing.

You're using the AirTags, using the iWatches, you're doing, um, uh, whatever, all of that's being collected as well. That, that, I don't mean to sound, you know, conspiracy theory, but, but yeah, I mean, there's huge data out there about tracking people: where you go, how long you stand in front of certain, uh, displays, uh, where your devices are headed, and what you do online, and tying that online purchasing, online social media presence together with, you live at this location, together with other people that live at that location.

That's big business, at least over here in the United States, because of our more lax privacy laws. Um, so what I try to do is, is be intentional about where I buy things [00:42:00] from, what I buy, what I do. And some of the things that, that we can do to protect ourselves is not using credit cards, which is kind of a scary thing, because now everything is tied to a credit card. But there's a really neat site out there,

and I make no money off of this, but privacy.com, over here in the United States, will allow you to tie an account to your bank account, and then it will issue you virtual credit cards. Now, I know that some credit card companies have these virtual credit card numbers, but this goes beyond that, because privacy.com, while they know who you are paying, um, they generate unique numbers.

So now your data can't necessarily be aggregated, because you're not using the same, uh, credit card number on every single site, but also your bank, or the, the credit card issuer, doesn't get that as well. So if I use my Amex everywhere, American Express knows I shop [00:43:00] here and I shop there and I bought this. Well, with privacy.com,

they, uh, they reduce the amount of things that the, my credit cards and my bank know. So that can help out as well. But that's only in certain countries, because of financial things. So there are a whole bunch of neat things that we can do to try to reduce what's online, or make it harder for people to find.

[00:43:23] Jeremiah Roe: So I know we've primarily been discussing, um, kind of over here in the states. Yeah. And what's possible here, um, due to our privacy laws and, or lack thereof. Um, what about say over in Europe, um, from an open source intelligence gathering perspective and or information, you know, uh, honeypot, is there such a thing really over in Europe and if not, uh, why not? 

[00:43:50] Micah Hoffman: So you mentioned, Jeremiah, the GDPR, or you referred to the GDPR privacy rules. Um, those did make a big impact [00:44:00] initially on what was being shared online, uh, in the cybersecurity and pen test worlds. You probably remember back in May of 2018, when those privacy rules went into effect, all of the WHOIS data got masked.

And so we no longer could quickly and simply and reliably find out who owned what domains, because some of that data might've been from EU citizens. The reality over the last three years, though, is that

And you have to think about the places where people go to share that those details about themselves. So, you know, social media becomes a big player when we're dealing with some, uh, some of the searching [00:45:00] techniques that we use for people in the EU or in Asia and the Pacific regions in middle east as well.

So we look more for maybe what people are intentionally sharing, instead of what's being shared about them, in those regions.

[00:45:13] Jeremiah Roe: Yeah. Oh, that's interesting. Um, I think that kind of, uh, flows nicely into this next question that I have, which is, um, and I guess to set this question up, I'm just going to briefly touch on something that I think you alluded to a moment ago, which was about the cell phones. And there was a recent study that was done.

And I think there was, this was done with, uh, with a journalist actually, um, where they went into DC and they turned their phone off, and then they traveled around to a number of locations, and they saw that they were still able to monitor and track where the phone had gone throughout that timeframe, even though the phone was turned off.

And so, uh, this may be something that folks aren't quite aware of. Um, when you think you've turned your phone off, it's not really off. [00:46:00] Um, and so, um, uh, to, to that point and yeah, 

[00:46:07] Micah Hoffman: Yeah. So off is not off. And, and, um, in the old days, what we used to do is just take out the battery, but you can't take out

[00:46:19] Jeremiah Roe: take out the battery now. Yeah. Uh, and, and, and this all comes down to, you know, permissions and what the phones are, uh, you know, monitoring on the backend, and staying connected from a geolocation perspective, and that information is still being shared. And so, um, to your point about Europe, and, and this also applies in the U.S. as well, um, now I'm going to layer in, too, you know, with regard to the deep web and the dark web: um, what types of research can individuals identify about other individuals in those locations, and what can you, you know, what [00:47:00] kinds of info can you find there that you can't find elsewhere?

[00:47:03] Micah Hoffman: Um, so, there's a lot of things that you could find in the dark web that you can't necessarily find on the surface web, but I'll tell you this: the distinction between surface, deep, and dark web is really being blurred. It has really been blurred in the past X number of years, because, um, you probably have seen people's reviews on the surface web for their favorite, uh, dark web marketplace, or, this site has been taken down.

Heck, you can even Google for some dark web marketplaces on the surface web. And because there's software out there like Tor2web, which will take a request on the surface web, go into the dark web, pull that content up to the surface web and serve it to people on the surface web, Google and Bing and Yandex have sometimes indexed dark websites.

We also have dark web search engines, both on the surface web [00:48:00] and in the dark web, whether it's I2P or Tor. So the question of, of, of what's down there that's not up here on the surface web, it's really some things, but not that much. And it really depends also on the dark network that you're on. For instance, things like Freenet, which is more of a distributed file sharing service.

It is, uh, used to share a lot more, um, exploitation videos of children, of women, of other people that are being abused and trafficked, because it's, it's impossible to get those images and those videos off of that platform, just by how it's set up. Uh, but a lot of places nowadays, and you might've seen this in, in your work, Jeremiah,

a lot of the dark web marketplaces that we used to have to operate in, in the, in the dark web with special software, now they're just moving up to the surface web into some bulletproof hosting places. So, you know, they pay somebody in Iran [00:49:00] some money, or someone in Russia some money, and they can do whatever they want.

They can sell whatever they want on the surface web without fear of prosecution.

[00:49:08] Bella DeShantz Cook: wild to me. I feel like I've, you know, I've I work in cybersecurity. I, I, I think I know a decent amount about cyber security security, but the, all of the, uh, the, I don't know, less savory. Of anything in tech related, I just know nothing about I'm. So like lawful good that I don't even know anything.

And any, any time we start talking about the dark web, I'm just like, what on earth? How does this work?

[00:49:35] Jeremiah Roe: So, so to put it into perspective, Micah, if, if you could. Now that you can kind of access these things on the surface web, through these, um, Bulletproof hosting companies and, and, uh, requests that are made, uh, with the Tor protocol in mind or the onion routing protocol in mind. Um, and those things are brought back up prior to these things kind of happening.

um, what would you typically see on, you know, the dark [00:50:00] web that you wouldn't necessarily see on the surface web? Just sort of as an example.

[00:50:07] Micah Hoffman: So it depends on where you are on the dark web. And I will say this, um, much like what you were just saying, Bella, in the cybersecurity world, there are so many different pieces, there are so many different specializations, that even though you may be a badass pen tester, you may not know much at all about, uh, digital forensics or other things.

Uh, for me, I'm not an expert in the dark web and all of the badness that's out there, uh, cause the world is very analogous, there's so much to do. Um, but I mean, there are forums on there where people are selling credentials to credit cards, or selling credit cards that have been, um, scraped from, uh, different ATMs.

There are credentials from breaches that are being sold. There's, uh, people that are being trafficked, images and videos [00:51:00] of anything and everything that you could want, as well as some, some nice things. I mean, the dark web is not all bad. Yes, there's a lot of bad activity that happens there, illegal, immoral and unethical things.

But the dark web is also a very powerful place for people that are being oppressed, people that maybe have alternative lifestyles in countries where their views, if they were expressed on the surface web, they might be criminally prosecuted or even, uh, jailed or anything like that. So, so the dark web, while it can be used to find breaches, to find credit card numbers, to buy drugs,

some people are actually using it for okay things, too.

[00:51:44] Jeremiah Roe: So, so leading kind of into that, with information that can be available and, or purchase sold, bought, um, there, there are particular companies that are out there that can. They S they claim that they can help, help you to remove your data. And so [00:52:00] what, I'm just kind of curious, what is your personal opinion of those and, um, is that leading to more of a privacy economy where only wealthier people have privacy on.

[00:52:13] Micah Hoffman: It is hard to remove your data from the internet. Um, the techniques that I talked about earlier with Bella, about, about finding out what's out there, then trying to remove it, or adding additional pieces to maybe downplay some of those less savory things that might be out there, um, those are some techniques that we use a lot.

And to your point, yeah, I absolutely think that it is, at least over here in the United States, it is, uh, it is something that, it's not that richer people have access to more, but it takes a lot of time and effort to reduce and monitor what's being posted about you online, especially the more public that you are.

Uh, [00:53:00] obviously musicians and politicians and other people that have massive online presences, they have a lot of things that they might say and do that they regret, and there's ways that you can deal with that, uh, but it all costs. Now, I do not have experience with any of these privacy-focused companies that say, hey, for X amount of dollars, we'll remove this.

I do have some personal experience, uh, going through a, uh, remove-yourself-from-the-internet kind of checklist. And I'll tell you this, that it is, it is something that is extremely labor intensive. I tried to remove myself from a whole bunch of those people search engines that we talked about earlier, and by opting out, or by saying, I want you to remove this data,

I coincidentally started getting a whole bunch more phone calls that my car's extended warranty was expiring, and more spam to my email addresses. So I don't know if they're like, yeah, we'll remove you, but thanks for [00:54:00] confirming that this is good, valid data now. Um, so, so it's a full-time job, too. And so it goes back to really what Bella was saying earlier, that

what I realize is that once things are on the internet, or once things are even collected, not necessarily on the internet, you've lost control of it. So I really try to make sure that only the things that I want to share about myself are given to the credit card companies, are given to the different online social media places, so that I can protect my identity before it goes to these other places.

[00:54:37] Bella DeShantz Cook: Do you think that there is a version of the future where we would all have more control over our own data and how it's used online? Or I guess also what would, what would be. Required for that version of the future.

[00:54:55] Micah Hoffman: I, I, I would like that. [00:55:00] I would love a world where I, Micah Hoffman, have the rights to my personal property, my intellectual property, where I have rights to my biological property. But the reality is that we give up our privacy every single time we use an app, every single time we choose to purchase something with, or without, you know, credit cards.

Uh, every time we appear in public, we lose our privacy, because there are cameras out there recording what we're doing, and there are Bluetooth sensors that are tracking you as you walk through cities without your consent. And so I do not see any type of future where that would be in our hands, Bella.

Unfortunately. Uh, what I do see is futures where more people are educated about some of these things, where they start getting smart about how to counter some of these things, and, uh, we, we reduce the amount of [00:56:00] data that's online. But, you know, just like me turning off that step counter and the stand counter and when I'm sleeping and whatever, just like me turning that off, it really comes down to people's personal preferences.

And does the benefit outweigh the cost? If I'm having heart problems, having my iWatch check my heart rate and then let me know, hey, it's irregular right now, is a valid service, and I might opt in for that. Um, but it's a personal, risk-based decision.

[00:56:35] Jeremiah Roe: Yeah, I think that, um, you made a really great point there, Mike, and that, um, personal privacy data, privacy is all dependent upon the level of innovation that you would like to take advantage of from a marketplace perspective. 

[00:56:52] Micah Hoffman: Absolutely. Uh, you know, we have services that are nothing but brokers for other services, and when they're, [00:57:00] when they're doing that, they are collecting data on us, just like we talked about collecting drinking data on people. I just went on vacation out to a certain part of the United States.

And you know, my credit card company knew when I checked into the garage when I, uh, bought something in that airport when I went here. And while that data is a little bit more protected than just general social media data, it's being collected by a company that's outside of my control. 

[00:59:01] Jeremiah Roe: And then sort of one final question, um, aside from the tailored information that you already have on LinkedIn, um, or what is, 

[00:59:10] MIcah Hoffman: notice. 

[00:59:11] Jeremiah Roe: what is some of the information, uh, or what is one thing that we wouldn't know about you, that, that you feel comfortable with sharing that wouldn't be available on your LinkedIn? 

[00:59:22] Micah Hoffman: One thing that wouldn't be on my LinkedIn. Um, I have no hair on my head. I'm bald. There you go. See, I don't put that on my LinkedIn, Jeremiah.

[00:59:36] Jeremiah Roe: It's very 

[00:59:37] Micah Hoffman: There you go. Trying to make me reveal private things after spending an hour talking about how I don't reveal private things. Well played.

[00:59:46] Bella DeShantz Cook: Uh, which takes a completely different turn after you, like you said, after you've spent an hour talking about not sharing private information. Yeah.

[00:59:55] Micah Hoffman: Right,

[00:59:56] Jeremiah Roe: Yeah.

I go to the local pub on Tuesdays. I like to get pizza [01:00:00] and a specific beer, uh, and I'll be there at 5:00 PM every, every Tuesday. So just come on by that's 

[01:00:06] Micah Hoffman: Turning my activity, uh, my beers, back on.

[01:00:08] Bella DeShantz Cook: Yeah, this was really,

[01:00:09] Jeremiah Roe: Thank you so much, Micah.

[01:00:10] Bella DeShantz Cook: a walk without my iPhone. No, without my, without my smartwatch, just totally untethered.