Cyber Crime Junkies
An entertaining and sarcastic podcast telling dramatic stories about cyber and AI that actually help people and organizations protect themselves online and stop cybercrime.
Find all content at www.CyberCrimeJunkies.com and videos on YouTube & Rumble @CyberCrimeJunkiesPodcast
Dive deeper with our newsletter on LinkedIn and Substack. THE CHAOS BRIEF.
META on TRIAL for Teen Harm. Secrets Exposed. What To Know.
Mark Zuckerberg took the stand, as did the head of Instagram, in a landmark trial. Their own internal documents are damning, and the allegations could change the face of social media forever. This is social media's tobacco lawsuit moment.
Chapters
00:00 Meta's Legal Battle: A New Era for Big Tech Accountability
02:45 The Impact of Social Media on Youth Mental Health
05:37 The Role of Internal Documents in the Trial
08:53 The Consequences of Negligence: Real Stories of Harm
11:44 Facial Recognition and Privacy Concerns
14:42 The Trial's Implications for Future Regulations
17:31 Resources for Parents and Victims of Cybercrime
Growth without Interruption. Get peace of mind. Stay Competitive-Get NetGain. Contact NetGain today at 844-777-6278 or reach out online at www.NETGAINIT.com
🔥New Exclusive Offers for our Listeners! 🔥
- 1. Remove Your Data Online Today! Try OPTERY Risk Free. Sign up here https://get.optery.com/DMauro-CyberCrimeJunkies
- 2. Or Turn it over to the Pros at DELETE ME and get 20% Off! Remove your data with 24/7 data broker monitoring. 🔥Sign up here and Get 20% off DELETE ME
- 3. 🔥Experience The Best AI Translation, Audio Reader & Voice Cloning! Try Eleven Labs Today risk free: https://try.elevenlabs.io/gla58o32c6hq
Dive Deeper:
🔗 Website: https://cybercrimejunkies.com
📰 Chaos Newsletter: https://open.substack.com/pub/chaosbrief
✅ LinkedIn: https://www.linkedin.com/in/daviddmauro/
📸 Instagram: https://www.instagram.com/cybercrimejunkies/
===========================================================
Ever wonder why Meta employees literally called themselves "drug pushers" in their own emails?
Right now, Zuckerberg's sitting in a Los Angeles courtroom trying to convince a jury that Instagram *accidentally* became a killing machine for teenagers.
Spoiler: The internal documents say otherwise.
500 school districts got tired of waiting for Big Tech to fix itself — so they sued. TikTok and Snapchat saw the evidence and settled immediately. Meta? They're fighting. Because apparently explaining why 16 hours a day on Instagram "isn't addiction" is a hill worth dying on.
Oh, and while they're defending dead teenagers in court, they just launched facial recognition glasses so strangers can dox you on the street in seconds. Same company that paid *billions* in facial recognition settlements three years ago.
This isn't about screen time. It's about whether billion-dollar companies answer to anyone when kids die using their product.
**New episode of Cyber Crime Junkies is live.** We break down the trial, the internal emails, and the FBI data Meta didn't want you to see.
New edition of The Chaos Brief dives into more details.
Because organized crime doesn't just wear tracksuits and gold chains anymore — sometimes they wear hoodies and say things publicly like "bro, we're so innovative, check out our new fugly glasses."
This is the trial of a modern generation. It may be one that leads to positive change.
speaker-0 (00:00.206)
So what happens when a company's own employees call themselves drug pushers and their product a drug, and then their CEO has to explain that to a jury? We're about to find out. Meta employees called themselves drug pushers. Now their CEO has to explain that to a jury in a Los Angeles courtroom, in one of 1,500 suits pending. Meta finally has to defend why they engineered Instagram to addict traumatized
kids. Internal documents, FBI data showing a 300% spike in sextortion deaths, and a decision to boost ad revenue at the expense of children and their mental health. Instead of fixing any of it, they're launching facial recognition software in stupid-looking glasses nobody asked for, so strangers can dox you in public in seconds. Ryan Last was 17.
In just a few hours after being contacted on Instagram by cyber criminals posing as a girl his age, he got sextorted, and he was dead. He's not alone. There are many. The allegations say Meta had the tools to stop it. They chose ad revenue. 500 school districts said they've had enough. Big Tech won't fix itself. So they sued, and armed themselves with a mountain of evidence. TikTok and Snapchat
got brought into that suit and settled immediately after seeing the evidence. Meta's fighting. This isn't about screen time or what you think good parenting is. It's about whether billion-dollar companies answer to anyone when kids die using their product. Stay with me. This gets worse before it gets better, but we will share available resources parents and kids can use toward the end. This is Cyber Crime Junkies.
speaker-0 (02:09.11)
So get this: right now, at this very moment, Meta is sitting in a Los Angeles courtroom trying to convince a jury they didn't deliberately addict children to their platform. It's the first time in history that Big Tech is having its tobacco industry moment. The trial started February 9th, and bombshells started dropping right away. Opening statements revealed internal emails where Meta employees literally called Instagram a drug
and themselves pushers, after they conducted their own studies, and reviewed independent ones, on the platform's negative mental health effects on teens, as well as the constant fraudulent ads and the access to and use of the platform by cyber criminals conducting things like romance scams and sextortion attacks, leading to the death of many teens. And the company is about to put Mark Zuckerberg on the stand
to explain why, hey, it's not our problem and it's all supposedly fine. Now, this is totally different from every other "Big Tech is bad" story you've ever heard. Over 500 school districts this time decided they've had enough. They're done waiting for these companies to fix themselves, and they all got together and brought suit. There's over 1,500 different suits
pending right now. And the courts are allowing it to go to jury trial, where the verdict can be in the billions. For the first time ever, Meta has to defend their business model in front of a jury that's not smart enough to get out of jury duty, who can actually hold them accountable. This isn't just about kids scrolling too much or feeling bad about their selfies.
Look, in 2022, a 17-year-old kid named Ryan Last from San Jose took his own life within just a few short hours after he got targeted on Instagram. Hours earlier, he had been targeted in a sextortion scam. Someone pretended to be a girl his age, manipulated him into sending a photo, then demanded thousands of dollars or they'd send it to everyone he knew. Ryan didn't have thousands of dollars.
speaker-0 (04:28.642)
He was 17. A few hours later, he was gone. Ryan's parents later found out Meta's own internal research showed Instagram was making body image issues worse for one in three teenage girls. The case, all of these cases, allege that Meta knew, that Meta had the data, and that Meta had the ability to stop it. And they did nothing.
And just when you think maybe, finally, accountability might actually work, Meta announces their new AI-powered Ray-Ban smart glasses will now include facial recognition. The stupid glasses that nobody asked for and very few people are buying, they're now going to entice you with AI facial recognition. You know, the thing that lets anyone point glasses at you on the street and instantly
pull your name, address, phone number, and everything else about you without your knowledge or consent. Two Harvard students already proved that this works. They built a system using Meta's glasses that can dox strangers in real time in public. Yeah, that's going to help. And what's Meta's brilliant response to that? We didn't build that specific feature, so it's not our problem. We're just going to connect the dots
for everyone to use. But wait, it gets better. This is the same company that recently paid billions of dollars in privacy settlements over what? Over facial recognition and facial data collection. The same company that shut down facial recognition back in 2021, claiming privacy concerns. And now they're bringing it back. But in glasses you wear on your face that look ridiculous.
This is a governance pattern. Internal privacy review processes reportedly got relaxed. Risk teams were given less influence. Product speed became prioritized over safety. So look, the technology didn't suddenly become safer. It's just that their risk tolerance simply increased. Because here's what nobody's saying loud enough: your data isn't just information. It's power. It's your sovereignty.
speaker-0 (06:55.158)
your right to choose what you want and live life free. It's an inalienable right you hold from the moment of birth. When you hand it over, you hand away control of your life; your choice and your freedom disintegrate. This trial is expected to run six to eight weeks. Mark Zuckerberg is scheduled to testify. So is the head of Instagram. And the outcome of this single case could determine
how 1,500 other lawsuits against these platforms play out. TikTok and Snapchat were originally part of this trial. Both of them settled right before it started. Undisclosed amounts. They saw the internal documents the plaintiffs had and decided no way is a jury worth this risk. But Meta, Meta knows better. They decided to fight. So now we've got school counselors drowning in anxiety cases,
parents watching their kids unravel, smart glasses that can track and dox you on the street, and a billion-dollar company saying "we take safety very seriously" while their internal emails call their own employees drug pushers. This isn't about screen time. It's about whether we're finally going to hold this company accountable for building systems that absolutely harm people.
They strip away their freedom all while they're pretending they didn't see it coming. This is the opportunity for a jury system and a court system to allow citizens to take back control. So here's what's actually happening. Meta didn't accidentally create harm. They engineered it. They measured it and they kept it going anyway.
Plaintiffs' counsel Mark Lanier walked into that Los Angeles courtroom on February 9th with something Meta probably wished didn't exist: the internal documents. And not just a few. Thousands and thousands of pages of them. And what they show is damning. Meta conducted a study called Project Mist, where they surveyed a thousand teenagers and their parents. The findings? Kids who experienced trauma or adverse life events were the most vulnerable
speaker-0 (09:12.802)
to Instagram addiction, and parental controls were found to be absolutely worthless. Almost no difference whatsoever. The moment a kid got hooked, parents were locked out. Meta knew this. They had the data, and they kept the features anyway. Look, there are so many lawsuits pending. We have a link in the description. Check it out, for listeners only.
We'll put a link in the show notes of the audio podcast, but you've got to check this out. They've compiled thousands and thousands of documents, including the actual studies that were conducted both independently and inside Meta. The internal communications where Meta employees call Instagram a drug, because of the way the algorithm is designed, and then refer to themselves as pushers, that isn't a joke.
What they're talking about is literally the neurological effect the platform is having on the kids they're studying. It's not metaphorical. It's literal. These weren't outsiders criticizing the company. They were people inside Meta saying the quiet part out loud to each other. And nothing affects a jury more than blowing those emails up, plastering them on the wall, and asking people about them, because that is where the truth actually
comes out. And it's not just on Instagram. I mean, look, Ryan Last's death wasn't an outlier. According to the FBI, there was a 300% increase in financially motivated sextortion cases in the past few years, all targeting minors. At least a dozen kids died by suicide after being targeted in similar scams this past year alone. It's unbelievable, because Instagram had the tools to stop it. But the scams
followed the same pattern. Fake accounts that looked real, private direct messages, algorithmic targeting that made it easy to find vulnerable kids. Meta could have built barriers, but they chose ad revenue. And meanwhile, while Meta's dealing with dead teenagers and internal research showing their platform was destroying mental health, they were also planning their next big move. Facial recognition in their s-
speaker-0 (11:34.862)
stupid friggin' looking smart glasses. This isn't new technology for Meta. If you recall, they already paid billions of dollars in settlements, not millions, billions of dollars in settlements, over collecting facial data without consent. Back in 2021, they shut down facial recognition on Facebook, claiming it was about privacy concerns. They made a big public show of deleting faceprint data. Everyone thought maybe they learned something.
They didn't. They just waited. Now they're putting facial recognition into the glorious Ray-Ban smart glasses that make you look fantastic, by the way. The same glasses that two Harvard students used to build a real-time doxing system that scans strangers' faces, identifies people instantly, and pulls their home addresses and phone numbers in just seconds. You can follow somebody home without them knowing you even took their picture. And Meta's position is still the same.
We didn't build that exact feature, so we're not responsible for what people do with these wonderful tools that we create. This is organized crime logic and thinking, more Godfather than high-tech Mr. Robot. Build the infrastructure, create plausible deniability, and profit while the body count rises. So I asked Dr. Sergio Sanchez, who is a
cybersecurity expert, former Apple executive, as well as somebody that's worked at Activision designing some of the most popular games that most kids play. He's got some additional insight. So let's hear what he has to say.
And here is why this really matters today, especially for parents. The reason Meta can do this now is because their internal governance changed. Reports indicate privacy review processes got relaxed. Risk teams now have less power. Product speed is prioritized. The technology didn't get safer. The company just decided the risk was worth it.
speaker-1 (13:46.786)
The trial has Meta's CEO scheduled to testify. Adam Mosseri, head of Instagram, already took the stand and argued that
social media use is not a clinical addiction. Meanwhile, Meta's own internal study literally measured addiction rates in teenagers and found parental controls useless. The outcome of this trial could put Meta on the hook for tens of billions of dollars across 1,500 similar cases. That's why TikTok and Snapchat settled.
They saw the evidence and decided a jury wasn't a gamble worth taking. Meta looked at the same evidence and decided to fight, because they believe they can win, or because the cost of losing is still cheaper than actually changing how they operate.
Excellent. So what does this actually mean? It means we've been watching a company run the same playbook that organized crime uses: build the system, deny responsibility, settle when caught, then do it again with a different product. As far as the evidence presented so far at trial goes, if we were in the jury, it's not looking good for Meta at all. Meta isn't a technology company that accidentally caused harm. They're a surveillance and behavioral modification company that uses social connection
as bait. In an email shown to the court, Mark Zuckerberg himself demanded that employees increase user time spent on the platform by 12 percent to meet business goals. Not user well-being, not safety. Time spent on the screen on their platform. That's the metric. That's what they optimize for.
speaker-0 (15:57.57)
The internal documents prove they knew Instagram was addictive to traumatized kids. They knew parental controls were useless, and they called themselves drug pushers in their own emails. Plaintiffs' counsel Mark Lanier stood in that courtroom with children's blocks spelling out A, B, and C. A for addicting, B for brains, and C for children. That's what this case is actually about.
Now they're putting facial recognition in glasses people wear in public, after paying billions to settle facial recognition claims and dismantling their facial recognition department. See, they're doing it again because their internal risk tolerance changed, not because the technology got any safer since a few years ago. And 17-year-old Ryan didn't
die because he made a mistake. He died because Instagram made it unbelievably and ridiculously easy for predators to find him, target him, and harm him, all while feeding and capitalizing on fraudulent ads and making parental controls useless. The result was they isolated him from his parents' oversight. Meta had every signal, every data point, every internal study telling them this would happen.
That's the way it seems to us. And look, Ryan wasn't alone. Julianna Arnold, founder of Parents Rise, lost her daughter two weeks after her birthday to online harm. A man approached her unsolicited on Instagram. She sat in that courtroom and listened to Instagram's head claim that they designed for safety above all else. But that head also simultaneously
had the balls to testify that 16 hours a day on Instagram is not addiction. Let that sink in: 16 hours a day on just Instagram isn't addiction. That's not addicting people at all. That's a you problem, right? That's a problem for children who don't have developed brains and aren't mature enough
speaker-0 (18:22.338)
to handle that. Think about that for a second.
For parents and school leaders, here's what you actually need to do. Stop trusting Meta to protect your kids. They won't. The internal communications show that they don't. A 2015 internal email literally shows Mark Zuckerberg made teens the top priority for the company in H1 2017 as a key driver of growth and profit. Not safety. Growth.
Enable every control available, but understand these controls were designed to fail. The plaintiff in this case, known only as KGM, started watching YouTube at age six. She got Instagram on her own at age 11 by circumventing parental controls. Snapchat at age 13. TikTok at age 14. Her mother tried using third-party software to block access. It didn't work.
The moment she got hooked, her mom was locked out. So have explicit conversations about sextortion, romance scams and the like. Have those now because shame and isolation are what these scams exploit. Push your school districts to adopt policies that assume platforms are adversarial, not neutral, because that's what the evidence shows. And if your school hasn't joined the lawsuits yet, ask why.
because 500 districts from across the country looked at the data and decided legal action was the only language Meta understands. In June, a trial begins in Oakland representing those school districts. Over 40 state attorneys general have also filed suits against Meta. In fact, 29 state attorneys general just filed a motion demanding a California judge force Meta to remove all accounts belonging to users under 13,
speaker-0 (20:28.536)
delete all the data collected from those users, and disable addictive design features like infinite scroll and autoplay. They call Meta's recent teen account feature little more than a public relations measure offering minimal real protection. For business leaders, understand this: the facial recognition in Meta's glasses isn't a consumer privacy issue. It's an operational security nightmare.
Your employees can be identified, tracked, and targeted just by walking into a coffee shop. Social engineering just got a massive upgrade. If you're not updating your security awareness training to include physical surveillance and smart glasses, you're already behind. This trial is going to force Meta to either prove they didn't know their products were harmful, or admit they knew and didn't care. Internal emails already answered that question.
Former Meta employees who quit and became whistleblowers are also expected to testify. The jury is hearing from people who built these systems and then walked away because they couldn't live with what they created. And if you want to see the evidence yourself, the actual internal documents, the emails where employees call themselves pushers, the studies Meta never wanted you to see: researchers at NYU compiled everything at
Metasinternationalresearch.org. That's Meta, then with an S, then internationalresearch.org. We'll have a link to it in the podcast. You can download all of the documentation and see everything. There's 31 internal studies, thousands of pages, and it's all public. We'll go through some of the most damning ones in future episodes as this trial continues. Remember, Meta paid billions before and kept going.
They'll probably pay billions again if required to, and they'll still keep going. The question is whether this time anything actually changes. But remember, now they want to put facial recognition on your face to track everyone you see. The jury gets to decide if what Meta did was legal. But you get to decide if what they're doing is acceptable. OK, as promised, here are some resources that
speaker-0 (22:50.606)
People, parents, children, and guardians can leverage these. It goes without saying that open communication is key: remind your children that should anything happen, they can come to you immediately, no matter what happened, in a no-judgment zone. Sextortion attacks are designed to create extreme panic and fear, with criminals using shame and threats to pressure victims into making immediate, permanent decisions for what is a temporary,
reversible problem. The time between initial contact and tragic outcome can be incredibly fast, sometimes less than 30 minutes or a few hours. If you're a victim or know someone who is, you can connect with people who can support you by calling or texting 988 anytime in the United States and Canada. In the United Kingdom, you can call 111. These services are free, confidential,
and available 24/7. If you are a victim or know someone who is, do not pay the criminals or send more content or money. This only leads to further demands. Do not engage with the scammer. Block them immediately. Talk to someone you trust, such as a parent, a trusted adult, or a mental health professional. Many victims feel they cannot tell anyone, but reaching out is absolutely
the most critical step. Report the incident immediately to official channels. There's the National Center for Missing and Exploited Children, the NCMEC, with their CyberTipline at report.cybertip.org. That's report.cybertip.org. The FBI can be reached via their Internet Crime Complaint Center, the IC3 as it's called, or a local field office.
Google those numbers and the websites. You can also check out the Take It Down tool from the NCMEC. If explicit images have been shared, there is a free tool that helps you remove them from online platforms. Take It Down is a free, confidential service by the National Center for Missing and Exploited Children designed to remove or prevent
speaker-0 (25:16.162)
the online spread of nude, partially nude, or sexually explicit images, videos, and the like taken of individuals when they were under the age of 18. Remember: you are the victim, and you have done nothing wrong. Help is available, and the situation will pass. Your life is absolutely precious. This is Cyber Crime Junkies.
Podcasts we love
Check out these other fine podcasts recommended by us, not an algorithm.
Breaching the Boardroom
NetGain Technologies, LLC
Detrás de la pantalla
Dr. Sergio E. Sanchez, el Dr. Qubit.