Chinese firms drop US and Israeli cybersecurity software - 2026-01-19
S6:E3


Corey Ham:

What up?

Wade Wells:

Alright. So it's an A24 film, right, which already tells you a lot if you're

Corey Ham:

into film. What does it tell us? Does it mean it's hipster?

Wade Wells:

What does that mean? I guess that too. Yeah. But they're like a little bit more artistic, a little bit more weirder on their subjects, right?

Corey Ham:

Okay.

Wade Wells:

And so what happens is, like, Justin Long goes to this guy's house. Okay. And the guy kidnaps Justin Long.

Corey Ham:

Is that the guy who's like, I'm a Mac and I'm a PC?

Ralph May:

Yeah. It is. Wow. Yeah. Yeah.

Wade Wells:

Dude, man, talk about a callback, man.

Faan Rossouw:

He's been around for a long

Ralph May:

time. Yeah. Die Hard 4. Turned into... no, a

Corey Ham:

his best work was I'm a Mac.

Wade Wells:

Anyway, Die Hard 4. I liked him in Zack and Miri Make a Porno, but that's

Faan Rossouw:

And it's directed by Kevin Smith as well who's a sign of Yeah.

Wade Wells:

Yeah. I didn't realize he directed that. It's a gnarly movie. I would not watch it as a date night or with any children in the room whatsoever.

Corey Ham:

Unless you unless you have good taste. Unless your date has good taste.

Wade Wells:

Yeah, I guess.

Faan Rossouw:

It's kinda like I believe human centipede in that realm. Very

Wade Wells:

very on their yeah. Did it come out? No, it did not. Maybe it did come around around the same time too. I don't

Corey Ham:

know if like So human seem to so long or

Wade Wells:

long. I never watched the human centipede. I was like, I'm good.

Faan Rossouw:

Like, I heard enough people. The South Park

Wade Wells:

The South Park one? Yeah.

Corey Ham:

I want their version. I want the centipede human: a bunch of centipedes that, like, combine together to make a human. That makes perfect sense. That sounds like Voltron. Yeah.

Wade Wells:

Version. Voltron. Sounds like Oogie Boogie from Nightmare Before Christmas.

Corey Ham:

The comments in Discord are just like, oh, we're dropping straight into human centipede. I see it's I see it's a Monday.

Ralph May:

Yep. That kind of Monday.

Corey Ham:

This is the pre show. Okay? This is not recorded. We're allowed to say whatever we want except for, you know, we have our consequences. Let's go ahead and talk about that.

Wade Wells:

Oh, it is recorded. I forgot about that.

Faan Rossouw:

The the South Park episode is very memorable memorable too because there's some there's a documentary, and I forgot what it's called. Maybe it's like five days to air or something like that. But yeah. It's about how South Park used to just go from concept to final episode in, five days, and that was the one that they were creating in in that thing. And just to go through the arc of them, like, tearing their hair out, like, oh my god.

Faan Rossouw:

We're not gonna do it. At the end, it's like, the FedEx dude is there to get the reel, and they're, you know, it's like everything works out perfectly each time.

Wade Wells:

Yeah. I used

Corey Ham:

to I don't know how you should live with that production schedule, but also it does let them get crazy up to date, like, way more current events focused stuff, which is pretty funny, I think, in general.

Faan Rossouw:

Yeah. Oh, look who it seems a little bit more structured now.

Wade Wells:

They release it diff their release is different. They they make more money. They just decide Yeah. It's definitely I I haven't watched since like Tegrity Farms. If you're if you're a South Park watcher, you'll know that.

Wade Wells:

I haven't watched since they moved to Tegrity Farms. It's been that

Faan Rossouw:

long but Oh, you guys sold Tegrity Farms now.

Wade Wells:

Yeah. They did? Wow. Okay.

Faan Rossouw:

It's up there.

Corey Ham:

Okay. Sorry. But people are saying in the comments that the pre-show banter is included on the podcast. Is that true? I thought it was post-show only.

Ryan Poirier:

No. I include

John Strand:

that. Alternative, if it

Alex Minster "Belouve":

Well, that explains

Corey Ham:

If it

Alex Minster "Belouve":

needs some HR meetings.

Corey Ham:

That explains a lot of of messages, uncomfortable messages I've received.

John Strand:

Yep. So if it think pre show banter

Ryan Poirier:

If it's worthy if it's worthy of the podcast, I leave it in.

Corey Ham:

I think we should take it out because then how are we gonna talk about human centipede in the first three minutes of a podcast and not have it get demonetized?

John Strand:

On? What the hell?

Wade Wells:

Yeah. You missed that one.

Corey Ham:

Ralph's like, I'm not even gonna say anything. It's 01/19/2026. We made it. It's today. We're here for a news podcast and I'm not the only one here.

Corey Ham:

Luckily, I'm not just soloing it today. We have John Strand himself, who appears to be backstage at the Emmys? Or the Golden Globes?

Ralph May:

I'm not.

John Strand:

I'm in my I'm in my son's closet. Oh,

Wade Wells:

now you just got a curtain. Okay. We see

Corey Ham:

Oh, nice. So you're still trapped in the closet, but you've added noise-canceling closet foam?

John Strand:

No. No. I don't have noise-canceling closet foam. I have his clothes on the hanger that I'm talking directly into, which actually is fantastic noise dampening.

Corey Ham:

It is. Yeah. So Alright. So that yeah. I mean, just for the record, we do have an office in Sturgis, South Dakota, and John has chosen instead to work in a closet.

Corey Ham:

So if that tells you anything psychological about him, I don't know. We have Ryan, of course, making us sound good and look terrible. We have Ralph, who's here to catch some gators and tell us how he does it. The key is to catch them alive. Right?

Ralph May:

Yeah. Yeah. No. That's why you take them alive and you put them in another pond and then you get called again. It's like, it's a business making business type of

Wade Wells:

I know.

Corey Ham:

We have Mary Ellen who does a ton of work behind the scenes to make our new show happen, find articles, organize them. We have a lot of audience members that do that as well, but Mary Ellen's like the number one article poster.

Faan Rossouw:

Thank

Corey Ham:

you. We have Belouve back again for another trip around the podcast sun, I guess. Yes. However we wanna call it. And then we have Faan

Corey Ham:

Faan is here. I cannot pronounce his last name, but I'm sure he can introduce himself. He's here to talk about his workshop later this week. You wanna plug that?

Faan Rossouw:

That was my cue. That was it. Yeah. Go. Hey, everyone.

Faan Rossouw:

Yeah. So on Friday I'm doing a four-hour workshop with Antisyphon. It's kind of the third in the series that I've done on malware development. And in this one, we're basically going to start with a basic C2 framework, and then we're going to add a whole command handling system to it.

Faan Rossouw:

And we're gonna implement a reflective shellcode loader, and it's gonna culminate in us popping calc.exe on a poor, hapless Windows endpoint. So

Wade Wells:

That is my

Corey Ham:

favorite C2, calc.exe.

Faan Rossouw:

Oh, yeah.

Corey Ham:

I'll detect it. And are you in a closet or no?

Faan Rossouw:

I'm in an office, but it's a very tiny office. It actually does double as my closet; all my clothes are in it. But I'm not sticking to it. It's a side of marketing.

Corey Ham:

John's like, interesting. I should hire this person.

John Strand:

But I I wonder if

Faan Rossouw:

it was like yeah.

John Strand:

At what point does the closet become an office and the office become a closet? I I that's

Ralph May:

confusing. That's actually what I'm in. This is just a bunch of closets that turned into an office.

Corey Ham:

I feel like

John Strand:

I'm doing this whole running a computer security company thing wrong. Like, I feel like I should be posting pictures of myself with supercars or I should be posting pictures of myself flying first class or with nicer watches than the Garmin that

Corey Ham:

I have. You had that option and instead you chose the closet.

John Strand:

We did. Out of all of that stuff, I chose the closet and a folding table with questionable stains from beer pong on it. I feel like I'm missing something here. So we'll see, so.

Corey Ham:

With the magic of venture capital, you can lose it all. So anyway, we all last person, we have Wade who's here. I don't know what dendrologist means. Does that mean like you cut people's hair?

Faan Rossouw:

I have to

Corey Ham:

ask my question. Or do you is that like a tree thing? I don't know what it means. Explain.

Wade Wells:

It's a scientist who specializes in dendrology, duh. No. It's the study of trees and other woody plants, hence the log the logging reference.

Corey Ham:

I see. I see. Nice.

John Strand:

I get it.

Corey Ham:

I like it. I like it. Alright. So what news what what news do we wanna get into first? John, is there anything on your radar that you just want to I drop as a hot

John Strand:

just came off teaching for four hours. My brain is mush.

Corey Ham:

I'll I'll Alright.

John Strand:

I'll throw in crappy hot takes. Like if someone else wants to bring up a news story, we can talk about it. I think that's fine.

Corey Ham:

Yeah. I have a few good ones. I mean, one of the first, which is just kind of a funny one, and I don't think we need to spend too much time on it, is that China, or I guess Beijing, the government, has advised Chinese firms to stop using Israeli and US cybersecurity software. So it's like the UNO reverse card. Yeah.

Ralph May:

Or geopolitics. Exactly. Yeah.

Corey Ham:

But it does make sense. I mean, I think they're specifically focused around like Palo Alto Networks, CrowdStrike.

Wade Wells:

They kinda like that. It's definitely a blanket ban on, like, all of them. I think it's, like, in the middle of the article. They're just, like, pretty much all cybersecurity vendors everywhere. The funny part is, like, Fortinet's on that, and I'm, like, you guys are the ones hacking it, though.

Corey Ham:

I thought they Don't you need these vendors, though? Yeah. Well, okay. So here's a question. Like, is this the equivalent of, like, if we banned buying stuff from China, you just can't buy anything? Like, are there actual good, top, leading cybersecurity products that don't originate from those two countries?

Corey Ham:

Because it seems like 99% of them are from either Israel or the

John Strand:

A lot of the stuff that China has in this realm is shit that's IP they've stolen from these companies anyway. So I don't I don't

Corey Ham:

Well, like, in Silicon Valley, Jian-Yang has, like, the Chinese version of all this. It's like Chinese CrowdStrike.

John Strand:

And I'm not ripping on it and saying it's a bad thing. I mean, way to go, China. But honestly, I'm surprised that this wasn't a thing sooner. I I'm just I'm just shocked. Right?

John Strand:

Like not necessarily from a security perspective, just from an like an economy perspective. Like basically, China is one great big huge venture like venture funding capital company. Right? They're just worth you know billions and billions and billions of dollars. And like why they wouldn't be forcing people to use Chinese company software because it helps build their own economy is something that's just surprising to me anyway.

John Strand:

And I wouldn't be surprised if that's more at the heart of this than it is the security issues with these products.

Wade Wells:

I was curious which companies are actually in China. The article lists the ones being affected by this move, which is, I believe, the ones who actually had stuff in China. I think it was, like, Broadcom has a whole bunch of Chinese stuff, Fortinet, and Check Point, which

Corey Ham:

great. So you're saying that yeah. Well, okay. So I mean, of course, their stock prices have dropped, etcetera. Someone in chat brought up an interesting point, which is, like, what percentage of the Great Firewall, how much of it is US-based firewall technology?

Corey Ham:

Right? It's all

Wade Wells:

open source technology. Just ripped

Corey Ham:

What is it? pfSense?

Wade Wells:

Like Exactly. It's iptables, actually.

Corey Ham:

iptables? Oh my god.

John Strand:

Don't rip on iptables. Okay.

Wade Wells:

Actually, they use the free version of Active Countermeasures on the whole firewall.

Corey Ham:

Hold on. If you're a real hipster, iptables is the crappier, newer product. ebtables is the real product that you wanna be using.

John Strand:

Are we gonna do netfilter? Are we gonna go back

Corey Ham:

to that? Yeah. Oh my god. Anyway, I I mean, we don't need to spend too much time on this article. I think the main conclusion is the same as John said, which is how is this not already a thing?

Corey Ham:

Because having a US company with that kind of access for, like, an EDR like CrowdStrike, it's basically a C2. Right? They have full access to all the data they can

John Strand:

There's, like, a Malware of the Day article coming up for CrowdStrike. So yeah.

Wade Wells:

Most most these vendors don't even sell in China though anyways. Right? I think 90% of them, they're like, they won't sell to China which is awesome.

Corey Ham:

No. Because they don't want their IP to get stolen. Right?

John Strand:

Yeah. This might be this might be in the article, but I'm wondering what about US based companies that have offices in China? Do they have to switch out from these vendors?

Corey Ham:

And That's a good one.

John Strand:

So I don't know

Wade Wells:

I I doubt that because usually, I know the the Chinese companies that have bases in US have completely sectioned off networks where they don't they're they're not supposed to allow like there's a clear wall where only Chinese are here

Corey Ham:

Like a national guest Wi-Fi?

Wade Wells:

Yeah. Pretty much. Yeah. From what I was told, like, even the Chinese workers, like, usually ask, like, hey, can we get access to this? And they're, like, no.

Wade Wells:

You can't have access to, like

Faan Rossouw:

You can't.

Wade Wells:

Legally, like, we cannot give you access. So

Corey Ham:

The last joke I wanna make: some say that China is soon gonna switch to using a Huntress trial license for their EDR.

Wade Wells:

Just don't have a funny

Ralph May:

bracket just so I mean, but that's something they would do. Right? They would just

Wade Wells:

I found it funny even China's getting off Broadcom. Right? Just Oh.

Ralph May:

Ouch. The risk was totally getting bad. It always was bad.

Corey Ham:

Yeah. There's a decent amount of convictions we can talk about. We can just, like, quickly run through them, or, you know, pending court cases. A lot of people are getting caught this week. First of all, a Tennessee man pleaded guilty to hacking the Supreme Court electronic case filing system.

Corey Ham:

This happened, I think, late last year. I remember when we talked about this. Mhmm. And yeah, unfortunately, or fortunately, whichever way you look at it, it was someone based in Springfield. They intentionally accessed a computer without authorization on twenty-five different days.

Corey Ham:

What they accessed has not been disclosed, so we're not 100% sure. But like Wade brought up in the pre-show, the Supreme Court's proceedings and a lot of that stuff is private. That's kind of different; typically in the US, the court system is wide open to the public, but with the Supreme Court that isn't the case. So who knows what they accessed or how much they're gonna get in trouble for it. But it seems like the court system might be a little biased.

Corey Ham:

I wonder if they'll get off because they'll say that there's no there's a conflict of interest for the whole court system wanting to protect itself.

John Strand:

Did they ever talk about what the vulnerability was that he was taking advantage of?

Wade Wells:

There is there is another link to the article. He was only 22 as well, if you do

Corey Ham:

the math. We don't have at least in my from what from the court documents, is all we're pulling from right now. I don't think there's it's basically just accessing a computer without authorization.

Wade Wells:

Someone probably left default cred somewhere.

John Strand:

It's a really, really super short document.

Corey Ham:

Yeah. I don't know like, this is literally a one page document. So there's not a whole lot of details in there. Yeah. But, yeah.

Corey Ham:

There's also I

Alex Minster "Belouve":

I mean, talking about the Tennessee Supreme Court, I thought it was interesting that one of their solutions was, like, we're just gonna move everything to paper. Rather than doing cybersecurity, it really needs to be handled by paper instead, just to not be

Corey Ham:

hacked. I don't need a phone anymore.

Ralph May:

We're just gonna send notes.

Wade Wells:

We're gonna get a there's gonna be an article later on for dumpster diving. Man dumpster dives

Ralph May:

and finds.

Corey Ham:

Exactly. Papers articles. Paper's so much worse. There's no chain of custody. It's like who touched this paper?

Corey Ham:

A lot of people. Alright. Yeah. Hey.

John Strand:

This is wild. These court documents are like one to three pages. I'm just trying to find anything about this. Like it's just kind of crazy.

Corey Ham:

Continuing with people getting in trouble, a hacker got seven years in prison for breaching the Rotterdam and Antwerp ports. That was a 44-year-old Dutch man; the arrest happened in 2021. He was convicted in 2022, appealed multiple times, and now has finally been convicted. It's seven years. I don't think we covered that breach, but yeah, basically it was, like, a media drop.

Corey Ham:

So it was a USB stick containing malware. Nice. And yeah, so he was like, you know, the world's worst pen tester, I guess, or best pen tester depending on how you look at USB

John Strand:

dead drops, classy. Classy.

Wade Wells:

Yeah. I'm just surprised anyone still falls for those. Right? I don't know why.

Corey Ham:

The interesting thing with this one, and I hope there's, like, a Netflix documentary about this at some point, is that it says he was also convicted of specifically hacking the computers to facilitate drug trafficking, and he imported 210 kilograms of cocaine. So, more than most pen testers need.

Wade Wells:

That needs to be on the top of the article. I felt like that's more impressive, right, than anything else. Like my first thought was like, why is he trying to actually hack these if he's just some regular dude? But that totally makes sense. Wow.

Corey Ham:

I hope there's, like, a dramatization or a documentary about it, because it would be really interesting to see, on a technical level, how you actually go from breach to importing cocaine. Like Yeah. Is it like when they go to scan, the container just comes back as not cocaine? Like, I don't know.

Faan Rossouw:

Maybe he just, like, changed the record to say it was already inspected.

Corey Ham:

Yeah. That could work. That that would that's a good way to think about it. It's like this container has been pre pre cleared or something like that. I don't know.

Corey Ham:

It's probably less interesting than I'm making it sound in my head, but it could be

Faan Rossouw:

interesting. Interesting.

Corey Ham:

Any other convictions?

Wade Wells:

There's one from om.nl. The article is actually in Polish.

Corey Ham:

Okay. Why is it in Polish if it's .nl? I'm very

Wade Wells:

Oh, never mind. I don't I don't know. I totally screwed that up. Was I was reading another article. For the same breach?

Wade Wells:

I think it's a different breach. A 30-year-old man under international surveillance. I see. This one sounds like someone just testing out malware. Oh, my gosh.

Corey Ham:

I'd say this is .nl, so it's written in Dutch.

Wade Wells:

Written in Dutch, my bad.

Corey Ham:

No shade for not being able to identify Dutch versus Polish. I don't think I could either, I'm just going based on the TLDs. But yeah, that's that's an interesting one. What did he do? I'm confused.

Corey Ham:

He enabled criminals to test and develop malware?

Wade Wells:

I'm wondering

Corey Ham:

Like he just ran VirusTotal basically?

Wade Wells:

I'm guessing he probably started up a fake company, went and bought anti malware from big vendors and then allowed people to test malware there. That would be

Corey Ham:

nice. Is that illegal? Yeah. Is that illegal? John, we have to shut down this security company, like, right now.

John Strand:

Right now. Because reasons.

Ralph May:

Unrelated to the article,

Corey Ham:

actually. Unrelated to the article. I gotta go make a call.

John Strand:

Yeah. That's in other news. Yeah.

Corey Ham:

I don't know. Speaking of trade Yeah. I mean, I

Ralph May:

Oh, sorry. I was I was gonna say No. You're good. I was gonna say, speaking of Segue.

Corey Ham:

Take it away.

Ralph May:

Shut stuff down. It sounds like the US government's trying to get rid of NIPRNet. Right?

Corey Ham:

No. Yeah. So for those who don't

Ralph May:

know, NIPRNet is essentially, like, the non-secret or non-secure network, even though that's not actually true, or just the non-encrypted one, but that's not actually true either. I don't know how to describe it. It's the network you can go browse Google on. So in the government, they separate out classified networks.

Ralph May:

So you have your NIPRNet, which is, like, you know, in essence connected to the Internet that we all know. Then you have SIPRNet, which is for information that's classified Secret. It's actually separate everything: separate routers, separate switches, separate cables, separate everything. And it has, like, a physical separation between the two.

Ralph May:

And then finally, there's Top Secret, which is also the same exact thing, all different equipment. But in the article, they're talking about actually moving away from using NIPRNet at all and just having commercial Internet. So instead of having a separate network the government maintains for you to access things like Google or your unclassified email, you would just go onto the Internet like

John Strand:

you normally would. So I have a lot of problems with this. I think that part of this originated from the idea that NIPRNet is kind of neglected and kind of a train wreck.

Corey Ham:

I'm sure it is.

John Strand:

At least last I looked. Right? It's just, if you look at all of the, like, classified networks, if you're looking at, like, SIPRNet and JWICS and then CWAN and GWAN. Right? Because you have different WANs available for government people and for contractors as well, because it gets into all kinds of weird things, like, you know, how does somebody from Lockheed Martin do their time sheet when they're in a classified facility and they're not in a Lockheed Martin facility? It gets complicated.

Faan Rossouw:

It's like 95%.

John Strand:

If you're looking at, like, NIPRNet, I feel like there's been a lot of problems with NIPRNet. And honestly, the right way to handle that was to try to, you know, put some security around it, try to come up with some standards, and do that. But the hard part about NIPRNet is, like, everybody's fighting over it. Right? Like, even if you look at GWAN, CWAN, SIPR, and JWICS.

John Strand:

Right? Where you do those delineations for a handoff from, like, the FBI to the CIA to the NSA to the NRO to DHS to all of these government agencies, it's really hard to say who is definitively in charge of it. And even if you do, the different groups wanna maintain responsibility for their own network segments. And I feel like they chose the wrong answer. Right?

John Strand:

That it's like, you know what? F it. Everyone's gonna do their own ISP. How's that sound? Right?

Corey Ham:

And It's a huge missed opportunity to do a branding deal with like TrumpNet or something and be like, I use the best Internet. The government it's good. It's military grade.

Wade Wells:

Why did you put that into the Oh

Corey Ham:

my god. It could be any, like, any large ISP, but it would be funnier if it was just a completely made up one. You know, it's still NIPRNet, but they just rebrand it. It's all simulation. Anyway, go back to the actual real discussion.

John Strand:

So I feel like this is just a way of, like, somebody clearly high up in the government got put in charge of trying to deal with the situation, and they clearly just threw their hands up in the air and said, screw it. We're just gonna get rid of it. Everyone hates it.

Corey Ham:

I love that, though. Told

John Strand:

what to do. You're all gonna have your own ISPs. You're all gonna pop out on your own. You're all responsible for your own crap. Good luck.

John Strand:

And I'm willing to bet the meeting was like, yeah, and if you guys don't take care of your own connections, then bad things are going to be coming to you. Yeah. Eric just put in MilSpec, quality at the lowest bidder. Yeah.

John Strand:

Absolutely.

Ralph May:

So, I mean, this even comes down to classified networks too. Because, I mean, essentially, classified networks were dedicated lines. Right? Like I described. But eventually, they moved to carrier-grade delineations between those.

Ralph May:

Right? So essentially, they could ride commercial fiber. Right? And that is the bulk encryptors that they have, which are classified encryptors. Well, I wouldn't say classified.

Ralph May:

They're they're a part of classified programs to send encrypted data over transit fiber. Right? That that also has normal Internet. Okay? So, I mean, this was something that they they, like, worked in to get that to happen as well.

Ralph May:

And So, I mean, like Mhmm. Yeah.

John Strand:

And here's the big problem with getting rid of NIPRNet. Right? So if you look at a lot of standards, there's SABI and there's TSABI, Secret and Below Interoperability and Top Secret and Below Interoperability, where there are standards for how you integrate a classified network with an unclassified network, and a higher classified network like JWICS down to, like, SIPRNet. Right? It's not

Ralph May:

a drive? It's a network drive.

John Strand:

If you're looking at how these things are connected, like, one of the things you said, I'm gonna push back on: people are like, you know, SIPRNet and JWICS are completely isolated, and they're not. They're effing not. Okay?

Ralph May:

Yeah.

John Strand:

There's lots of different guards that are basically designed for transferring data from the low side to the high side, or the unclassified side into the classified side. Right? Because you're gonna be pulling that open source intelligence in for ingest and analysis on the other side. And this is one of those concerns that I have with just getting rid of NIPRNet and these standards that exist: NIPRNet itself was just a train wreck years ago when I looked at it. I'm sure it got much better.

John Strand:

Oh, a train wreck. Right? And I don't see how having everyone set up their own unclassified network pops, that can possibly be tied into classified networks, is somehow going to be a better approach. Like I said, it clearly looks to me like someone got put in charge of this, looked at how difficult it was, threw their hands up in the air, flipped the card table over, and is walking away at this point, which I'm not gonna disagree with, because that may be the best option going forward.

Corey Ham:

Because Oh, yeah. No. I when when clients do this, when we're doing a pen test and I'm like they're just like, alright. Be honest with me. How bad is it?

Corey Ham:

And I'm like, I would just completely decommission this entire system. And then they actually do it. I'm like, nice. Because, you know, a lot of clients try to limp something like this along for years, and the retesting is always like, still vulnerable. Sorry.

Corey Ham:

And they're like, oh, the developer's on it. And I'm like, nope. Sorry. And then it's like two years of that, and then eventually they're like, nope. Ditch it.

John Strand:

You get to the point where they talk to their lawyers, and they're like, we can't get the developers or the systems administrators to upgrade this. If we get breached, what's our liability? And the lawyers look at the past three pen test reports and go, bad. It's real bad.

Ralph May:

Like,

John Strand:

if you haven't fixed this in three years and you've been told for three years it's bad, that's all on you.

Mary Ellen Kennel:

And, like, in the article too, it's also a usability issue. Like, the one person sort of saying, you know, I'm Secretary of the Army and I can't print. I can't use Teams. You know, I can't do... So it sounds like it also just doesn't function very well.

Corey Ham:

Well But that's the whole point of I your high

John Strand:

You know, I read that too. And yeah, like I said, welcome to working in a classified environment. It's just that

Corey Ham:

But I do like the fact that every executive complains about not being able to print. That's universal.

John Strand:

Feel that pain. So when are they just gonna get rid of printers? There you go.

Ralph May:

Oh. Good idea. Then we don't need them.

Corey Ham:

Okay. John, stop rocking the boat. Listen, we already got rid of Nipper. That project's gonna take ten years to complete. By then, we can talk about printers.

Corey Ham:

Alright?

John Strand:

One thing at a time, pump the brakes.

Corey Ham:

One thing at a time. Alright. What else is going on? There's some potential minor geopolitics in Europe. This is kind of an update to a previous story where, essentially, the former Polish justice minister, I'm not gonna try to pronounce his name, it's way out of my wheelhouse.

Corey Ham:

But he was convicted of embezzling money, or is facing charges for embezzling money. And then I guess he was like, hey, I got a buddy in Hungary who can take me in, is that cool? And somehow that went through, and so he's now a Polish fugitive in Hungary, which, like, generally you don't go fugitive in the same European bloc that you're already in, I feel like.

Wade Wells:

Hungary has a history of this, though. The Hungarian guy who let him in is in the same party, for one thing.

Ralph May:

Right.

Wade Wells:

And then he also got caught for the same thing, and they let somebody else in for the same thing. Pretty much running spyware, throwing Pegasus on people's phones.

Corey Ham:

So Hungary is basically running, like, asylum as a service. It's like, pay us. If you buy us a yacht or whatever, you can come stay. It's kind of like Switzerland. We're neutral, but we will take all your money. So come bank with us.

John Strand:

We're that neutral.

Corey Ham:

We're we're neutral. We're not that neutral. Yeah. That's a good one. In AI space, interesting, I guess.

Corey Ham:

There's

John Strand:

Which one is this now?

Corey Ham:

This is the Grok stuff. Oh, Grok. Okay. So this is kinda gross. You know, if you want this to be a children's podcast, first of all, what are you doing here? Second of all, you might wanna skip this article.

Corey Ham:

But essentially, the California AG, the attorney general, is investigating Grok, which is Elon Musk's AI tool, for non-consensual deepfakes. Apparently, Grok has something called a spicy mode which has the functionality to

Ralph May:

Of course it does.

Corey Ham:

Which of course it does, I guess. It's super creepy. I no other AI has this to my knowledge, right? I mean, no

Ralph May:

other person's this creepy.

Corey Ham:

No other AI that's like claims to be a reputable company has this capability, right? That I like ChatGPT doesn't have this, Copilot doesn't have this, Claude doesn't have this. But essentially, people can use it, I guess, to generate deep fakes of other people or like have it undress someone and it gets gross and there's like kid stuff involved as well. So

John Strand:

I thought they didn't fix it. They just put it behind a paywall.

Ralph May:

Well, maybe

Wade Wells:

I remember that. I remember them talking about that. I I wanna say even OpenAI talked about that making like an adults only version of all their AI products to do things like that.

Corey Ham:

I think that's coming soon. Apparently, Elon Musk says that it's never generated anything illegal ever.

Faan Rossouw:

Never. Ever. By his definition, I believe, I believe. It

Corey Ham:

will refuse it will refuse to generate anything illegal. We know AIs, they're super hard to jailbreak. You don't just Yes.

John Strand:

Yeah. I remember the first child porn case that I ever worked. It was actually a guy who was drawing child porn art. And they basically were like, yep. That's not illegal.

Corey Ham:

And their

John Strand:

whole take was, well, it's not illegal because there's no victim. It's just drawings. Right? And, you know, I guess I can kind of somewhat understand like the twisted logic to get to that point. But I can't help but think that that's how Elon Musk is trying to trying to cut this is like, yes, it's generating these images, but they're not images of real children.

John Strand:

Right?

Corey Ham:

That would actually make more sense than what he's doing. What he's actually doing is claiming it doesn't generate the images, when there's very clear proof that it does generate them.

John Strand:

Yeah. I know. Because in the lower right-hand corner, it says Grok, you know, powered by Grok.

Ralph May:

Go ahead and watermark all of them.

Corey Ham:

And and I just Oh, the watermark makes it legal. I think that's how it works.

John Strand:

And someone joked, they're like, well, now Grok's in DoD networks. So that's good to know.

Ralph May:

Oh, yeah. That's true. That that Yeah.

Corey Ham:

There's another Yeah. Yeah.

Ralph May:

That it's a smarter AI. It's definitely more It's a better Yeah. It's

John Strand:

It's the Grok of war. Not the Grok of guns.

Ralph May:

Yes. The grok of war.

Wade Wells:

That has got

Corey Ham:

The other thing I

Ralph May:

was gonna say about these large language models doing, you know, malicious or not-great things is that a lot of the open source models are catching up to the frontier models. Right? The frontier models will always be ahead, but it's getting to the point where it's, like, good enough. Right? And so we'll probably see more of that in different spaces, where it's not one of the big four frontier AI models that's doing this, but, you know, some other smaller company that has set up one of the other open source models that's good enough to do, you know

Corey Ham:

So, okay. This is kinda gross, so I don't wanna dig into it too much, and I almost don't wanna ask this question, but is there necessarily a legal precedent for, like, the prompts themselves? Would those be illegal? Would the images be illegal? Like

John Strand:

Honestly, if you really wanna dig down to where the legality is, and this is the thing that, like, keeps me awake sometimes at night: anytime these AI models have created this type of illegal image, they were trained on something. What data did they get for the training? And that's the thing that really bothers me, because you can come out and say, well, they're not real kids, see? But it was trained on some data somewhere. And that ultimate data that's underneath the hood, this is something, if we had someone that was gonna prosecute this, that should be the question that they're asking.

John Strand:

So if it's generating illegal images, the question that should be immediately asked is how in the hell did the AI model learn? What did it train? What was the dataset that it was trained on to generate that? And that's Well,

Corey Ham:

And we have in the past talked about researchers who found There was a researcher who was trying to train his own AI, downloaded an image dataset, and then his Google account got locked because it said there was CSAM in the image training set that he was using.

Wade Wells:

And Mhmm.

Corey Ham:

So basically, we know based on that that there are CSAM abuse images in large training datasets. So I guess

John Strand:

is saying CSAM found

Corey Ham:

How do yeah. So okay. How do we what is the like, if we if we were the California AG or whoever and we were designing a set of rules around this, what would you design? Would you say that the prompts are illegal? Should this be illegal?

Corey Ham:

Like, because on some level you don't wanna police AI too hard, but also it feels wrong to be like, oh you can just type whatever you want in the AI and it's legal. Like, hey AI, how do I hide a body in my backyard? Like, should that be legal? Like, I don't know.

Faan Rossouw:

Is it is it illegal to Google that?

Corey Ham:

It it's not illegal, but it could be used against you in court. Right?

Faan Rossouw:

I think it happens quite a lot too. I've seen many cases where they're like.

Wade Wells:

We we've already seen that starting to happen. Right? Like one of the Yeah. Forest fires that happened in California, the guy asked ChatGPT like, hey, would I be liable for this? Right.

Wade Wells:

Mhmm. It said, yeah, you would be, so

Mary Ellen Kennel:

And there are certain key phrases. It's been a long time now. But back when I did forensics, like, there are certain key phrases that if you are looking for that, you can put them in and that it's like there's like secret phrases that those types of folks know and they know to put them in and it's really disgusting and it's it's like it's like a secret like language. Like, you wouldn't even know it but like when you put it in, it'll get you what you want. It's weird.

Corey Ham:

Mhmm. Mhmm. So I guess James Randolph brings up a really interesting point in Discord which is arguably, you do also need some training data that is CSAM so that you can train CSAM prevention models. Right? Like that's an interesting angle.

John Strand:

That's been done for years. Right? The FBI, there's this function, Mary Ellen, it's KFF, known file functionality. Is that it?

John Strand:

I wanna make sure I got it right.

Mary Ellen Kennel:

Yeah.

John Strand:

So the FBI, we're gonna use them just because they have a monster, monster, monster database of child pornography. Now, it used to be, whenever you were working these cases, and this is right around when I started working some of these cases, you as a forensicator would go through and see something. And as soon as you saw something, hands off the keyboard, you're calling in an agent. And then the agent would look at it, and then they would go through the rest of it, and then they would make a determination. They would do lookups.

John Strand:

The problem with that is that it psychologically destroys you over time. So what they developed was a monster hash database. This is, like, the earlier incarnation of it. They created this monster hash database of known child pornography images. Right?

John Strand:

So an agent wouldn't have to actually sit there and look at it. And if you're working on a forensics team, you wouldn't have to look at it either. It would just go through, it would hit the hashes in the KFF, and then it would come back and say, these 23 are known CSAM. Now, eventually you would have to have somebody look at it, but you wouldn't have to go through absolutely everything on someone's computer system. Now, as that progressed over time, they started developing better image recognition models, because the people that do this stuff are wicked, wicked smart.

John Strand:

They would go through and scramble files, they could scramble hashes, they would do all of that stuff to try to get around this. Then they developed better models that could actually look at the images and correlate them back to known images, because the hash would be changed. And that gets into a number of really cool image recognition utilities. Now, somebody's bringing this up: there are absolutely AI models, not connected to the internet I hope, that are literally just being fed huge farms of all of that previous child porn data, and then basically you can still have the forensics tools do that analysis and it can give you a determination. Now, that is a legitimate tool with a legitimate purpose in life for law enforcement. Right?

John Strand:

That's different than what I was talking about. Whenever we have these AI models that are public, where you can ask it to do this and it does it, that means somewhere in that training data it got a hold of some bad data that it should have never gotten a hold of. And I think that's where the illegality is, and that's where there should be some law enforcement research basically saying, how in the hell did your AI model even learn about this in the first place?
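
To make the KFF-style triage John describes concrete, here is a minimal sketch in Python. It is not the FBI's actual tooling; the evidence path, the hash-list file, and its one-digest-per-line format are assumptions. The idea is just: hash every file in an evidence directory and flag only the ones whose digests appear in a known-file list, so an analyst reviews hits instead of the whole drive.

    import hashlib
    from pathlib import Path

    def load_known_hashes(hash_list_path):
        # One lowercase hex digest per line (the file format here is an assumption).
        with open(hash_list_path) as f:
            return {line.strip().lower() for line in f if line.strip()}

    def sha256_of(path, chunk_size=1024 * 1024):
        # Hash in chunks so large evidence files don't exhaust memory.
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(chunk_size), b""):
                digest.update(chunk)
        return digest.hexdigest()

    def flag_known_files(evidence_dir, known_hashes):
        # Return only the files whose hashes hit the known-file list.
        hits = []
        for path in Path(evidence_dir).rglob("*"):
            if path.is_file() and sha256_of(path) in known_hashes:
                hits.append(path)
        return hits

    if __name__ == "__main__":
        known = load_known_hashes("known_hashes.txt")   # hypothetical input file
        for hit in flag_known_files("./evidence", known):
            print(f"KNOWN FILE HIT: {hit}")

Real forensic suites layer fuzzy and perceptual matching on top of this, as John notes, precisely because exact hashes break the moment a file is altered.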

Faan Rossouw:

But I am curious, though. Is that always true, or is there, like, a level of inference where it can deduce that that's what it's looking at? What I mean is, let's say I make something up, like, I have a picture of a parrot in the style of M.C. Escher. Right?

Faan Rossouw:

And it can recognize that, but it didn't really train on a picture of a parrot in the style of M.C. Escher. It trained on a parrot and on M.C. Escher. So it could, like, infer that's what it was. I'm just kinda playing devil's advocate out loud here, and obviously not trying to challenge it or be an apologist. Yeah.

John Strand:

I can answer that. The way the system is trained is any of the images that it's trained on are known real victims of abuse. The reason Yeah. Why but the reason why

Faan Rossouw:

What's that? I'm talking about this this not the one the forensics use, but the one, let's say, like, let's say Grok that's able to generate it. I I'm curious. Does it mean it had to be part of its training set, or can it take two separate concepts and combine them to produce that?

John Strand:

That's the question. And I think that that's the thing that should be investigated. Yeah.

Corey Ham:

Yeah. And it's also worth noting that there are some states that have laws against deepfakes or are are making laws against deepfakes. I think Florida has one. So, like, I don't even I mean, California is the big, you know, the thin end of the wedge here. Like, this is probably gonna go a lot further.

Corey Ham:

We know AI is tied up in all kinds of legislation right now in general, but I don't know. Even if it's not child stuff, even if it's full adult stuff, it probably still shouldn't be allowed like, hey, let me make anyone Like, that shouldn't be a thing that companies are providing as a service in

Alex Minster "Belouve":

my opinion. Shouldn't be a thing. I mean, there's Yeah. There's ads for it though. And I

Corey Ham:

think Oh, yeah.

Alex Minster "Belouve":

For Grok keeping this in the mix, they may be looking at it and saying there are competitors that are doing this, so rather than have everybody leave X in order to go get that stuff done somewhere else, they're like, we're just gonna not bother and let you do it here.

John Strand:

But that's one of the things, Alex, where there has to be a conversation, it's like Yeah. I think we're okay losing that market share. Let let's let's lose that market share.

Corey Ham:

How how yeah. How many people are really gonna pay specific I mean, guess I don't wanna know that answer. Way too many. Yeah.

Alex Minster "Belouve":

I'm not certain if X is okay with losing market share, or just, like, yeah, whatever.

Corey Ham:

Yeah. That's our thing. Our thing is abusive images, okay? No one else can have that market. I know.

Corey Ham:

I think

Alex Minster "Belouve":

apathy apathy reigns at X.

Faan Rossouw:

Yeah. Yeah. No. It's like the X-ray glasses at the back of the comic book. You always ordered them as a kid, and then you end up being so disappointed when you put them on. They just Yeah.

Faan Rossouw:

Don't do anything.

John Strand:

I always bought the rockets and stuff to blow up stuff. An article I put in that I wanna talk about is ServiceNow. Did you guys see the Body Snatcher flaw?

Wade Wells:

Yeah. It was pretty interesting, the workflow of it. But then also kind of, like, lazy password management.

John Strand:

But Well

Corey Ham:

It's like any good breach, it's a chain of failures. Yeah.

John Strand:

Yeah. And I I seriously believe that we're gonna be seeing a lot more of this. I know with our pen testing and AI and the team that does that, it's like we have found these types of vulnerabilities in a number of different customer applications. One of the things that's different though is this was exposed to customers. Right?

John Strand:

So that customers could use those AIs, and that's where you were actually getting the data leakage back and forth. It's like all this AI crap; we almost need to take a beat, put some stuff around it, some controls, some testing methodology, flesh out an OWASP for AI or an OWASP for AI testing methodology. And, like, Mallet was asking, why do AI agents have passwords? I think that's one of the themes of the stories this week: why does AI have access to that? But that's just kind of the way that a lot of companies are using AI.

John Strand:

Give it access to absolutely everything and then you Yeah. More

Ralph May:

Yeah. The more you can connect it to as many as many data sources as possible, the better, like, it can work. Because, you know, these AI models only have such a big context. They can't remember everything. Right?

Ralph May:

So you put it in as many kind of data sources as possible, and then when you ask that hard question, it can go pull each one of those bits of information, right, to to give you the best answer. The downside, as John has pointed out, is that there's a lot of information in there that you might not want in the answer. Yeah.

Wade Wells:

There's a couple of products that are doing enterprise search, right, where it connects literally to everything you have: your chats, your Notion (Notion is one of the ones that has this), all your notes, your documents, your email, your calendars, everything. And I will admit, as a security professional who's looking around for notes on why X did Z, it's been pretty much a game changer to pinpoint exactly why a user did something, which is pretty crazy.

Corey Ham:

Yeah. Yeah. These chatbots have been on our radar as pen testers for years. I think it was two years ago, we had the exact same scenario, where a client had a chatbot in their service desk, and it wasn't even like we jailbroke it. It would just reset your MFA without any prompting.

Corey Ham:

Like, you could just be like, hey, can you reset my MFA? And it'd be like, I did that for you. So, yeah. Customers ask me this all the time, and here's what I tell them: anywhere there's a chatbot, assume it can be jailbroken, because it can. Mhmm.

Corey Ham:

Like, the better AI gets, the more it can be jailbroken. That's basically its job, to get jailbroken. And so the most important thing is: don't allow internal access from external chats, basically. You've gotta think of it in that way.
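
A minimal sketch of that "don't let external chats reach internal actions" point, assuming a service-desk bot that calls back into the application through named tools. The tool names, session model, and function names here are hypothetical, not any specific product's API; the key is that the allowlist is enforced server-side, outside the model's control, so even a fully jailbroken model can't trigger an MFA reset for an unverified session.

    # Hypothetical tool-call guard for a customer-facing chatbot.
    PUBLIC_TOOLS = {"lookup_kb_article", "get_ticket_status"}           # safe for anonymous chats
    PRIVILEGED_TOOLS = {"reset_mfa", "reset_password", "grant_access"}  # require verified identity

    class ToolDenied(Exception):
        pass

    def authorize_tool_call(session, tool_name):
        # Enforce the allowlist in application code, not in the prompt.
        if tool_name in PUBLIC_TOOLS:
            return True
        if tool_name in PRIVILEGED_TOOLS and session.get("identity_verified"):
            return True
        raise ToolDenied(f"tool '{tool_name}' not allowed for this session")

    def handle_model_tool_request(session, tool_name, arguments):
        # The model can ask for anything; the application decides what actually runs.
        authorize_tool_call(session, tool_name)
        return f"executed {tool_name} with {arguments}"  # dispatch to the real implementation here

    # An anonymous web chat asking for an MFA reset is refused,
    # no matter how persuasive the prompt was.
    anon_session = {"identity_verified": False}
    try:
        handle_model_tool_request(anon_session, "reset_mfa", {"user": "corey"})
    except ToolDenied as err:
        print("refused:", err)

The design choice is the same one Corey describes: treat every prompt as hostile and put the authorization decision somewhere the language model can't talk its way past.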

Faan Rossouw:

I think another interesting paradigm now is just kind of the agents, where it's, like, locally running on your machine and you're giving it access to your environment. Obviously, some people are taking care to put it in a sandbox. I don't know if any of you have run Claude Code, but in the beginning it's very restrictive. It'll ask you permission for every single bash command it wants to run, but then you kinda get irritated. So you're like, okay.

Faan Rossouw:

You go into user settings, and you add ls and cat and all the nondestructive

Ralph May:

Jesus take the wheel, and then next thing you know. Yeah. But now, so now,

Faan Rossouw:

they released Claude Cowork, which is, like, taking that idea, because a lot of people use Claude Code, but not for coding. Just for, like, hey, can you tidy up all the files in this folder, and stuff like that. So now, I guess, not to make it sound too pejorative, but for normies, they made Claude Cowork. So it's got a nice GUI, and it's got, you know, little buttons like, hey.

Faan Rossouw:

Can you organize my to-do list and sort my emails? But I think a lot of people using that don't have the understanding that the data they're having it interact with locally is being sent back to a data center every single time. You know, it leaves their system.

Corey Ham:

No. For sure. Yeah. I mean, I think that's an interesting the other direction of flow. Like, this article was talking about how someone talked a chatbot into doing something it shouldn't have done, a company was hosting the chatbot, but that same thing applies to a user of a company using a third party chatbot and all that information is leaking.

Corey Ham:

Like that's Yeah. Crazy.

Alex Minster "Belouve":

And I've seen where, like, I've dealt with vendors that have put chatbots into stuff, and a lot of times they'll have the aspect of, well, it's only meant to do these certain things, and please don't beat it up. And I'm here to say that, like, "please don't" is not a security control. Like saying, please don't try to jailbreak it. It's supposed to be very simple, and it's supposed to just kinda, you know, help you navigate, find things in the menu. To where it's sort of like, well, if your interface is hard to use, adding an AI to it isn't going to fix that.

Ralph May:

The other thing too: while these chat assistants and large language models, foundational models and all this other stuff, can and should be expected to be able to save every single thing you say, we also have to assume they can't save every single thing you say. Okay? There's just not enough space to do all of that. They don't make any money off let me rephrase that. They want to save certain things, maybe, and it is possible they could be using that to sell to advertisers or other things like that.

Ralph May:

I'm not saying that can't happen. What I am saying is they can't store everything. They don't have if they could, we'd have the smartest agents already because they would already have the memory to remember everything I've said.

Corey Ham:

From the corporate security space, right?

Wade Wells:

Think about it: most of the contracts these corporations are signing say you cannot train your model off what we're saying, or you can't even log what we're providing. I'm interested to see, with one of these big chatbot or AI providers, are they actually telling the truth, or do they have back-end logs

Ralph May:

at the end of the day? Doesn't matter. People think that they're, like, some special snowflake and they're not gonna be able to solve some business problem because they didn't have that information. Maybe they don't need a credential that's sensitive inside of there. I get that.

Ralph May:

But they're it's gonna continue to make models that are smarter whether you give them the data or not. It's irrelevant to the equation.

Corey Ham:

I have a No. A product for you

Wade Wells:

that'll manage your credentials for your AI for you. I'll tell I'll tell you about it later.

Corey Ham:

Oh. Okay. So, like, it is funny, though, to think about. Like, if these AIs are trained on, like, normies, quote unquote, and you go to the AI, you're like, hey, AI, I need to find something, and it, like, opens up Ask Jeeves and types it in. You're like, no.

Corey Ham:

I didn't want it trained on the grandma dataset. I wanted it trained on the the Gen Z dataset, and then it just goes to TikTok and like types in whatever topic you're looking for. It's like, no, not that either. Stop. I I wanted like the professional businessman version.

Corey Ham:

And then it's like, oh, yeah. I I don't know. It is funny to think about though, you know, enterprise isn't that special, but and for every company that's not okay with their data being shared, there's probably at least one that is like, for free, I'll share everything we have, because I can't afford to pay for a private model or whatever.

Faan Rossouw:

Yeah. Yeah.

Corey Ham:

So So yeah. Anyway, John's broken. John, what's going on?

John Strand:

I do wanna call out one thing. So this is a recommendation to AppOmni for their article. It's a really amazing technical article that Aaron Costello put together. Really, really, really well done. But it's missing something incredibly fundamental to this entire thing.

John Strand:

You're using this for marketing and I'm talking directly to the people at AppOmni.

Corey Ham:

What article is this?

John Strand:

This is the AppOmni Body Snatcher thing, the agentic AI article. Okay. The one we were talking about for ServiceNow. And one of the things I wanna call out for this particular article, for Body Snatcher, once again, technically well written, shows a high level of competency. The one thing that is missing: you have a disclosure timeline where it was disclosed to ServiceNow 10/23/2025, and then ServiceNow remediated it and sent emails to customers on October 30.

John Strand:

Other than that, there's nothing in this article that talks about the collaboration, working with ServiceNow to get this remediated. And if you're a company that's gonna be specializing in doing this type of work, you really wanna highlight how you worked with ServiceNow. And it's entirely possible that ServiceNow was just mean and they weren't great to work with. And saying nothing is better than saying something bad, and I can understand that.

John Strand:

But anytime you're doing these types of disclosure or coordinated disclosure timelines, it's really important to highlight like how did you work with ServiceNow rather than just dropping a couple of dates into it. So one little recommendation, but other than that the article was incredibly well researched and incredibly well done.

Corey Ham:

Yeah. And that's a really good call out. If AI firms, or, you know, companies that are implementing AI products, are looking for a vendor, you're gonna be top of mind. So having that communication there is pretty good.

John Strand:

Yeah. Because if I was if I was another like company like ServiceNow, you wanna work with these guys but this article doesn't tell me what it's like to work with you. So

Corey Ham:

The other thing, just kind of a thought that popped into my head: if I was a CISO or some other executive, I would want, like, a list of all the agents we've approved into our corporate LLM or whatever. Because, like, how do you even limit the blast radius? Let's say there's a breach of your AI system. Do you even know what data could be impacted? How many things, if you allow people to self-service integrate? Like, oh, well, one sales guy integrated Salesforce, so that's in the blast zone.

Corey Ham:

And also one marketing person integrated Jira or whatever, so that's in the blast zone. It's like your token sprawl could get so massive. I would almost want a list of like what agents we've approved.

Ralph May:

How do you find out if an employee has used an LLM in some way and sent that data off to solve some small problem at their job? How do you know that?

Corey Ham:

Web proxy logs, I guess?

Wade Wells:

No. No. Yeah.

Ralph May:

I mean, you do this outside the app too. Right? Like copy and paste. Like, you know it's being monitored. So you put it on your phone or something.

Ralph May:

Like, I mean, it You

Corey Ham:

have your phone recording the screen

Faan Rossouw:

and Yeah.

Corey Ham:

Transcribing backwards. Yeah. Well, you can't protect anything from

Faan Rossouw:

a software.

Ralph May:

You can't protect everything. I mean, yeah, everyone loves to sell you the product that says there's no way you can copy this thing. Every single time we did a DLP test, we got that data out.

Corey Ham:

I think you you

Wade Wells:

You can always get it out. That's the thing with DLP. You will always be able to get the data out; it's just a question of how much trouble it's gonna take to get it out. And I have seen people fired over copying and pasting code into AI browsers or into an AI prompt

Faan Rossouw:

Yeah.

Wade Wells:

Via DLP tools. Like, you

Faan Rossouw:

You better have proof.

Ralph May:

Yeah. Yeah. You better have the proof. Yeah. To bring that forward.

Ralph May:

Right? Like, it can't just be... But my other question to you is, how many companies do you believe are doing it at that high of a level and can afford that kind of tooling to make sure that's what's happening, etcetera?

Wade Wells:

Every company that has DTEX can see that. I'll tell you that right now.
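
Corey's "web proxy logs" answer and the DLP tooling Wade mentions both come down to telemetry you already collect. Here is a minimal sketch, assuming you can export proxy logs as a CSV with "user" and "host" columns; the column names and the endpoint list are assumptions for illustration, not any real product's schema.

```python
# A rough, hypothetical sketch of hunting for LLM usage in web proxy logs.
# Column names ("user", "host") and the hostname list are assumptions;
# adjust both to whatever your proxy actually exports.
import csv
from collections import defaultdict

AI_HOSTS = {
    "chat.openai.com",
    "api.openai.com",
    "claude.ai",
    "gemini.google.com",
}


def llm_users(proxy_log_path: str) -> dict:
    """Map each user to the set of AI hosts they contacted."""
    hits = defaultdict(set)
    with open(proxy_log_path, newline="") as f:
        for row in csv.DictReader(f):
            host = row.get("host", "").lower()
            if host in AI_HOSTS:
                hits[row.get("user", "unknown")].add(host)
    return dict(hits)


if __name__ == "__main__":
    for user, hosts in llm_users("proxy.csv").items():
        print(user, "->", ", ".join(sorted(hosts)))
```

It obviously misses the phone-pointed-at-the-screen case Ralph brings up; it only sees the traffic that actually goes through the proxy.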

Faan Rossouw:

But I think Wade makes a good point, because the general sense now is that the models are all converging and becoming commoditized. So why go through all that extra effort and risk to use, say, Gemini instead of Opus 4.5? You know? I don't know, but it doesn't

Wade Wells:

really need a new... You have to give them the ability to use that tool so they don't go find one on their own.

Faan Rossouw:

Yeah.

Wade Wells:

You're they're not allowed to. Right? And prevent

Ralph May:

Spotify effect. Right? Make it easier for me to use it than it is to steal it. Because Mhmm.

Ralph May:

My point with you, Wade, is that you're gonna spend all of this money to try to stop something that is inevitable anyways. Right? Like, yeah. You're just putting the hurt on your employees

Wade Wells:

But

Ralph May:

and spending a lot of money to do it, just to prove the point that they're gonna do it anyways some other way.

Wade Wells:

couldn't we argue that's just cybersecurity anyway, though?

Corey Ham:

Yes, exactly. Right? Right? Like, at the end of the day,

Wade Wells:

it's all governance, it's risk. Right? And we try to mitigate the risk as much

Ralph May:

as we can. Yeah. One thing I'd argue about cybersecurity and business: the business is about making money. Cybersecurity is just trying to prevent that data leakage, that security piece. But the business function, how it makes money, that's top priority, man.

Ralph May:

It's top priority over the security piece. Right? Because you have to make money or you don't have a business to do cybersecurity for, right?

Corey Ham:

So Yeah.

Ralph May:

You have to balance those two things. That's what I'm trying to say. I'm not saying forget

Wade Wells:

Yeah. I get it.

Corey Ham:

It's the same thing as... yeah, it's the same thing as why we have insecure conditional access policies. Because the CEO wants email on his phone.

Ralph May:

Yeah. And you're gonna

Corey Ham:

fire him?

Ralph May:

You're gonna get rid of him

Wade Wells:

for that?

Corey Ham:

The best thing

Wade Wells:

you can do is put it on the risk registry. Right?

Ralph May:

Yeah. I said so. You know, like

Wade Wells:

I've been in that exact argument, and the one thing I've learned in cybersecurity as a defender is you rarely win an argument with sales, depending on what

Corey Ham:

you're talking about. Depending on how your company is aligned.

Wade Wells:

Very very, yes. But most of the time, the sales people are going to try to do as much as they can with as little as they can.

Faan Rossouw:

For sure.

Wade Wells:

But you always try to not block them; provide them an avenue. Sure. Like, push them this way

John Strand:

and not

Wade Wells:

not just stop them.

Ralph May:

Yeah.

Corey Ham:

Alright. So we have five minutes left. I think it's a good time to announce the CTF winners this week. I guess there's two CTFs. The first CTF is the Antisyphon training CTF.

Corey Ham:

The first place prize goes to v l v l v v l v l v l (good name), who won a year of on-demand access to Antisyphon training, any course you like, which is pretty amazing. And then, aka a Saza.

John Strand:

It's Ace Aza.

Corey Ham:

Ace Aza has won one training course of their choice, which is super exciting. And then there's a second CTF somehow. Black Hills InfoSec also had a CTF, and Localized Chaos took first prize with one year of access to on-demand training. And then skill404, a.k.a. Not Found, won one training course of their choice. That's amazing.

Corey Ham:

And also, Tom3842.

John Strand:

They shared.

Corey Ham:

Oh, there were two winners? I thought that was the same person.

John Strand:

I thought it was two separate people.

Corey Ham:

We have lots of winners this week. Everyone else, you're not a winner. Sorry. No touching for you. Very good.

John Strand:

Localized Chaos. Thanks for having a name that we

Corey Ham:

clearly Yeah.

John Strand:

can tell what your name is. We appreciate that. So

Corey Ham:

Yeah. Yeah. Thank you for that. And then I guess the last thing I wanna do before we get back into chicken articles would be to, Faan, plug your workshop. Why should I show up?

Corey Ham:

I'm a budding malware developer and I wanna learn how to make malware. What are you gonna teach me?

Faan Rossouw:

Yeah. I mean, I would first off say that I don't really think it's just for malware developers. Like, I primarily identify as a threat hunter. So why am I into offsec tooling? Well, I read about a cool threat, and it's not like I can go download every malware sample to run the threat myself and develop detection vectors.

Faan Rossouw:

Once you know how to create a tool, you're no longer limited to the tools you can access or buy. Everything opens up for you. Any idea that you have, like, oh, I wonder if I could do something like Sunburst kinda did, but use a novel DNS record type instead of a TXT record. Well, go ahead. Go do that.

Faan Rossouw:

So being able to create your own tools empowers you completely, whether from the offensive point of view or the defensive point of view. So, yeah, that's what I would say.
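
On the defensive side of that same idea, here is a minimal sketch of the kind of hunt you might build once you understand the technique: counting rare DNS record types per source host, the sort of residue a Sunburst-style channel over a non-TXT record might leave behind. The log schema ("src" and "qtype" columns in a CSV) and the threshold are assumptions for illustration, not any real sensor's format.

```python
# Hypothetical hunting sketch: flag hosts issuing many queries for uncommon
# DNS record types, which could indicate a covert channel over DNS.
# The CSV schema and the threshold are assumptions; adapt to your DNS logs.
import csv
from collections import Counter, defaultdict

COMMON_QTYPES = {"A", "AAAA", "CNAME", "PTR", "MX", "NS", "SOA"}


def rare_qtype_counts(dns_log_path: str) -> dict:
    """Count rare DNS record types queried by each source host."""
    per_src = defaultdict(Counter)
    with open(dns_log_path, newline="") as f:
        for row in csv.DictReader(f):
            qtype = row.get("qtype", "").upper()
            if qtype and qtype not in COMMON_QTYPES:
                per_src[row.get("src", "unknown")][qtype] += 1
    return dict(per_src)


if __name__ == "__main__":
    for src, counts in rare_qtype_counts("dns.csv").items():
        if sum(counts.values()) > 50:  # arbitrary hunting threshold
            print(src, dict(counts))
```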

Corey Ham:

That's awesome. Nice. Yeah. I mean, honestly, it's the same thing as why I recommend that pen testers learn how SOCs work. Because you gotta know your enemy.

Corey Ham:

Right? Like, whether you're a threat hunter or a pen tester, you gotta know how the other side works. You will learn something, I guarantee you. You will pick up tricks and tips from blue teamers, and blue teamers, you will pick up things from red teamers. The best part of this industry is when we work together.

Wade Wells:

Speaking of Know Your Enemy, were we gonna pitch the new orange book?

Corey Ham:

I don't know what that is.

John Strand:

Oh oh oh, the new orange book, the survival guide.

Wade Wells:

The new survival guides. So there's a new Black Hills survival guide, the orange book. You can order it now. It is IR-based. The reason I remembered is because the article I wrote in it is called Know Your Enemy, and it's about threat actor profiling.

Wade Wells:

But there's some great stuff. I'll throw the link in Discord, or go check it out. You can order it on the web store.

John Strand:

I'd look at

Corey Ham:

The Spearphish General Store.

John Strand:

But, yeah, the orange book. So each of the survival guides is gonna have a different color, and the color is going to have a theme in honor of the TCSEC Rainbow Series. So be on the lookout for that. You should just go sign up for Rekka. We give you all kinds of cool stuff.

John Strand:

So please check it out. Are you ready?

Wade Wells:

Oh, we're gonna do

Corey Ham:

the chicken news. Right? Are there actually chicken articles? I

Wade Wells:

can't... There is a chicken article, but it's literally the dude just commenting on KFC real quick.

Corey Ham:

Oh, yeah. I might have deleted that article, but it is funny. Let's talk about it. Okay. This is a New Zealand article.

Corey Ham:

So for those who don't live in New Zealand, you might be a little confused. But

Wade Wells:

Was that... I don't think it was a New Zealand one.

Corey Ham:

Was there... There was another one? Yeah.

Wade Wells:

There was one about the dude pen testing a medical company, medical records.

Corey Ham:

Yeah. That's it's New Zealand. Yeah.

John Strand:

But still,

Corey Ham:

it's fine. It's chicken. We'll allow it. I'll post the link. I'm posting it in Discord.

Corey Ham:

So here's the... the headline is hilarious. Basically, a security researcher has claimed that the KFC app is more secure than Manage My Health, which, for context, is, I guess, a New Zealand-based version of MyChart or something. I don't know. But basically, there have been a bunch of headlines where Manage My Health was hacked. Apparently, there were multiple healthcare breaches in New Zealand, and the person's name is Callum McMeneman, who's a web standards consultant who's worked on government website security.

Corey Ham:

He told a news outlet, you know, I found this vulnerability, I reported it, no one cared. So yeah, the very quotable thing he said is that KFC is more secure than Manage My Health. His justification for this was that in the KFC app, when you order chicken, there is mandatory two-factor auth.

Wade Wells:

Let's let's John KFC.

John Strand:

That's auth to KFC. Right?

Corey Ham:

Right? Dude, I cannot believe they required two factor to order chicken. That seems

Faan Rossouw:

That's the weirdest part my credit card. Account.

Corey Ham:

That is the weirdest part.

Ralph May:

Not to say that's not good security, because it is. Right? But don't you think it slows down the process of getting the chicken? Like, you're trying to get to the chicken sale here. Like, yeah.

Faan Rossouw:

Or some people just check out. They're like, man,

Ralph May:

I'll just order... No, I'm not doing two-factor.

Faan Rossouw:

I'm going Red Rooster now, dude.

John Strand:

The Colonel's about quality. The Colonel doesn't cut corners on security.

Corey Ham:

Yeah. Why isn't it 13 factors? 13 factors and spices, or is it 11? Is it 13 or 11? I don't

Ralph May:

know. Don't want work security back then.

Corey Ham:

I don't know how many it is.

John Strand:

So, did you ever see this? This is years ago, where somebody discovered... okay, so what is it? It's like 13 herbs and spices. Right?

Faan Rossouw:

Yeah. 11. And somebody found out that if you looked at

John Strand:

the Twitter account for KFC, it basically followed a bunch of dudes named Herb and the Spice Girls. And no, it gets better. It gets better. The dude found that out and he tweeted it, and I guess it had been like that for years.

John Strand:

He goes, I just realized that KFC literally only follows, like, 11 accounts: guys named Herb and the Spice Girls. Right? Yeah. KFC commissioned an art piece of the guy riding on the back of Colonel Sanders, like a backpack, pointing off into the distance, with Colonel Sanders and him going off into the woods together. So I have mad respect for KFC and their marketing ability, but we gotta wrap it up.

John Strand:

Alright.

Corey Ham:

Yeah. Let's wrap it up.

Mary Ellen Kennel:

Well, everyone, testing. Feather. Feather.

John Strand:

Multi feather.

Wade Wells:

Feather. She was sitting on that

John Strand:

for a while. Alright, guys.

Corey Ham:

Alright. On that terrible dad joke, it's time to end.


Creators and Guests

Corey Ham
Host
Corey Ham
Corey Ham has been with Black Hills Information Security (BHIS) since 2021 delivering red teaming and OSINT services. Currently, Corey leads the ANTISOC team at BHIS, providing subscription-based continuous red teaming to BHIS clients. Outside of his time at BHIS, you can find him out in the woods or up on a mountain somewhere.
John Strand
Host
John Strand
John Strand has both consulted and taught hundreds of organizations in the areas of security, regulatory compliance, and penetration testing. He is a coveted speaker and much loved SANS teacher. John is a contributor to the industry-shaping Penetration Testing Execution Standard and 20 Critical Controls frameworks.
Ralph May
Host
Ralph May
Ralph is a U.S. Army veteran and former DoD contractor who supported the United States Special Operations Command (USSOCOM) with information security challenges and threat actor simulations. Over the past decade, he has provided offensive security services at Optiv Security and Black Hills Information Security (BHIS) across various industries. His expertise spans network, physical, and wireless penetration testing, social engineering, and advanced adversarial emulation through red and purple team assessments. Ralph has developed several tools, including Bitor (set to release in January 2025) and Warhorse, which enhance efficiency in penetration testing infrastructure and operations. He has spoken at numerous conferences, including DEF CON, Black Hat, Hack Miami, B-Sides Tampa, and Hack Space Con.
Wade Wells
Host
Wade Wells
Wade Wells has been working in cybersecurity for a decade, focusing on detection engineering, threat intelligence, and defensive operations. Wade currently works as a Lead Detection Engineer at 1Password, where he helps build and mature scalable detection programs. Outside of his day-to-day work, Wade is deeply involved in the security community through teaching, mentoring, podcasting, and running local events.
Alex Minster
Guest
Alex Minster "Belouve"
Alex Minster is a cybersecurity professional with a passion for Open-Source Intelligence (OSINT), and a desire to use his technical skills to make a meaningful impact on society. With nearly twenty years of experience in cybersecurity, and a current role in Threat Intelligence for a global financial corporation, Alex remains very active in numerous cybersecurity groups including DC608 and Black Hills Information Security. Beyond his professional accomplishments, Alex is an avid old-school gamer who enjoys arcades, retro gaming, and tabletop games. He brings his passion for adventure and his commitment to helping others to everything he does, both in and out of his professional career.
Faan Rossouw
Guest
Faan Rossouw
I’m a security researcher focused on the intersection of threat hunting and agentic AI. I do research at Active Countermeasures and instruct at AntiSyphon, teaching threat hunting and offensive security tooling. I’m also currently building aionsec.ai, an open-source platform to make elite threat hunting accessible to everyone.
MaryEllen
Guest
MaryEllen
MaryEllen Kennel has held numerous roles in CyberSecurity, and is currently ranked top 1% in MetaCTF. MaryEllen has spoken at several conferences, including Magnet Forensics, KringleCon, and most recently, Wild West Hackin’ Fest in Deadwood, SD. MaryEllen grew up Mennonite, and treasures spending time with family.
Ryan Poirier
Producer
Ryan Poirier
Ryan Poirier began his time at Black Hills Information Security (BHIS) as the Video Producer and Editor in August 2020. Ryan polishes and perfects every webcast, podcast, and workshop on the BHIS, ACM, and WWHF YouTube Channels. Prior to Ryan’s time at BHIS, he worked for one of the largest public schools in the United States, conducting their video production and live broadcasting. He joined the BHIS team because he felt like it would be a great group of people to work with, and he couldn’t pass up the perfect next step in his career. Outside of his time with BHIS, Ryan does freelance photography, attends Cars & Coffee events, and expands his knowledge of audio and videos.