Why TVO is covering a highly sensitive topic

TVO is covering the highly sensitive topics of child sexual exploitation, online luring, and AI deepfakes across multiple programs and platforms. Through our coverage we hope to bring attention to the staggering human toll these activities take, mostly on children, and through this to drive change.

Paul was a kid when his mother’s boyfriend started grooming him. “It was already a broken home, so it’s easy to infiltrate that,” Paul says. “He was giving us gifts, you know, Christmas gifts, things, birthday gifts, giving my mother money when she needed stuff like this. And he was like, Oh, I’m going on a trip to an amusement park, you know. Do you want to go?”

Those trips to the amusement park started the summer Paul turned eight. So did the sexual abuse. The trips stopped. But the sexual abuse went on for six years, until Paul was 14.

His mother’s boyfriend videotaped the abuse to sell to buyers on the dark web.

Tens of millions of images a day

Paul is not his real name. But his story, which he agreed to share on the TVO podcast series Arachnid: Hunting the web’s darkest secrets, is very real. And it happens time and again. Over and over and over, to children here in Canada and around the world.

Tens of millions of images of child sexual abuse material (CSAM) appear on global online platforms every day. Abusers make money off this illegal content, and, since little is being done to stop them, they create more and more of it.

Images are uploaded to the anonymity of the dark web. But it’s not just there: CSAM can be found on the same online platforms we all use — Facebook, Instagram, X, and TikTok.

Some of the most sought-after images depict crimes that happened decades ago, to people like Paul. Victims often suffer emotionally knowing the images are on the internet and available to anyone who goes searching.

But the seemingly endless demand for CSAM means abuse and exploitation of new victims happens every day.

That exploitation takes many forms. In Paul’s case, he knew what was happening to him. But sometimes victims don’t know there are sexually exploitative images of them out in the world, because those images aren’t real.

They are the victims of AI “undressing” technology that produces so-called deepfakes. The overwhelming majority of targets of these manipulated pictures and videos are women and girls.

Toronto teen Iris (not her real name) was 15 when it happened to her.

She agreed to talk to us for an episode of The Thread with Nam Kiwanuka. We concealed her identity for her safety.

A picture of Iris in a swimsuit was altered using AI to create a naked image.

“It’s incredible how real it looks,” she says. “I still worry about the pictures getting out. I don’t know if they are actually gone. Pictures can’t really be gone. Digital footprint is a thing.”

According to the Canadian Security Intelligence Service (CSIS), 90 per cent of deepfakes online are nonconsensual pornographic clips of women. And in October 2022 there were 57 million hits for deepfake porn on Google alone.

Iris and three other Toronto teens TVO talked to went to the police to file deepfake complaints, and we found several more teens who had done the same. No charges were laid. The teens we talked to were told there was not much that could be done.

And that’s not uncommon, because laws offer little protection to victims.

The exploitation of children online is an epidemic. We know these are disturbing topics. They are hard to read, watch, and listen to. But the epidemic is only getting worse by the day. Exploitation has spread from the darkest corners of the internet to the seemingly innocent world of children’s games. TVO Today is pulling back the curtain on the dangers to bring attention to this harm.

In recent months we have released three programs covering this topic from different perspectives:

  • Arachnid: Hunting the web’s darkest secrets is a powerful six-part podcast uncovering the global spread of child sexual abuse images and the urgent fight to eliminate them. Arachnid is a collaboration between TVO, the Toronto Star, and the Investigative Journalism Bureau.

  • Dangerous Games: Roblox and the Metaverse Exposed is a TVO documentary that digs into the hidden dangers in online games like Roblox.

  • AI: Deepfake Reality is an episode of The Thread with Nam Kiwanuka that explores how AI deepfakes target women and girls in disturbing new ways.

"Still, right now, there are kids being groomed online.”

Paul and Iris’s stories are about abusive and exploitative images uploaded to the web.

But it can happen the other way around, with innocent online activity turning to sexual exploitation in the real world. It’s called luring.

It happened on Roblox, a multibillion-dollar gaming platform meant for kids. An online gamer using the name Doctor Rofatnik, or Doc, groomed a 15-year-old Indiana girl while gaming with her on Roblox. He then lured her to his home in New Jersey, sending an Uber to pick her up.

Despite a desperate search for the girl, “Doc,” whose real name was Arnold Castillo, kept her hidden in a small room for eight days, sexually assaulting her and controlling her movements in and out of the room.

In the months before Castillo escalated to criminal sexual activity with a minor, three young female players reported Doc’s luring to Roblox. But even though they captured screenshots of Doc’s private chats saying things like “You’re 12, I suspect you to be a little slow on the upbringing, but soon I’ll corrupt you beyond your wildest dreams,” Roblox paid little heed to the alarms they raised.

One of the young gamers who reported the luring still doesn’t feel safe. “I want people to start taking responsibility. Is it the platform? Is it the lawmakers? Is it the people who make the games?” Janae asks in the TVO Original documentary Dangerous Games: Roblox and the Metaverse Exposed. “Everyone thinks this is just the internet. We have to take this home with us. Still, right now, there are kids being groomed online.”

In August 2023 Arnold Castillo was sentenced to 15 years in federal prison after pleading guilty to transportation of a minor with intent to engage in criminal sexual activity and coercion and enticement of a minor.

“I just don’t think they care”

Whether the images are real or deepfakes, all CSAM has one thing in common: it is distributed on online platforms. Platforms run by big-tech giants like Meta and X, and owned by billionaires like Mark Zuckerberg and Elon Musk.

But despite their power, big-tech billionaires have done little to halt the stream of CSAM online, according to Hany Farid, professor of computer science at UC Berkeley and chief science officer at GetReal, a firm specializing in image analysis and digital forensics.

“The bottom line is they don't want to do it,” Farid told us for the podcast Arachnid: Hunting the web’s darkest secrets. “They don't want to create liability. They don't want to be responsible. I just don't think they care. There's no money to be made. I think it's really what it comes down to.”

Julie Inman Grant is the Australian eSafety commissioner, with the challenging job of creating safe online experiences. She told us the big tech companies aren’t doing nearly enough to combat the issue. Inman Grant knows the uphill battle firsthand: she used to be behind the scenes as Twitter’s director of public policy and corporate social responsibility for Australia, New Zealand, and Southeast Asia.

“They have the capability. They have the technology. They have the know-how. They have the brainpower. They most certainly have the vast financial resources. No one is really holding them to account.”

Roblox, X, and Meta have denied allegations that their platforms are not doing enough to combat CSAM or online luring. In responses to TVO, they say they are taking active steps to combat abuse.

“Everyone here will agree this conduct is abhorrent.”

On Wednesday, January 31, 2024, Democratic Senator Dick Durbin opened the US Senate Judiciary Committee hearing on “Big Tech and the Online Child Sexual Exploitation Crisis” by saying its purpose was to find a pathway to keep kids safe from sexual exploitation online and to eliminate the production of CSAM, “which can haunt victims for their entire lives and in some cases take their lives.”

“Everyone here will agree this conduct is abhorrent,” Durbin said.

But as the often-testy hearing progressed, it seemed that big tech’s actions to eliminate CSAM were overshadowed by arguments for privacy protection.

In one such exchange, Senator Ted Cruz asked Zuckerberg about the process for when a user ignores a message warning about potential CSAM. “What did you do next when someone clicked ‘you may be getting child sexual abuse images’ and they click ‘see results anyway’? What was your next step?” the senator asked. “You said you might be wrong. Did anyone examine whether it was, in fact, child sexual abuse material? Did anyone report that user, or did anyone go and try to protect that child? What did you do next?”

“Senator, we take down anything that we think is sexual abuse material on the service,” Zuckerberg responded.

Senior policy analyst Angus Lockhart of the Dais, a public policy think tank at Toronto Metropolitan University, says there’s little motivation for big tech to change, so governments need to step in. “The incentives that exist for tech companies are to profit, to sell your data,” Lockhart says. “It tends to come down to the government making and intervening in these platforms.”

But those interventions aren’t happening. At least not often enough or fast enough to stop the endless flow of child exploitation online.

According to the United States National Center for Missing & Exploited Children's CyberTipline, in 2023, there were more than 36.2 million reports of suspected online child sexual abuse.

In its 2023 report the International Centre for Missing and Exploited Children said there are 138 countries that have legislation “considered to be sufficient” to fight against child sexual abuse material. Canada is one of them.

And yet cases of making and distributing child pornography, non-consensual intimate images, and accessing child pornography in Canada are all on a steep rise.

It’s all hard to fathom for Hany Farid. “For 20 years, we have not been able to get our heads around the one problem that should be very, very easy to get your head around, which is children, as young as a few years old, being sexually assaulted and extorted and brutalized around the world. And if we cannot, as a technology industry, as regulators, and as a public, get our head around that, what hope is there for anything else?”

On February 26, 2024, the Canadian government tabled the Online Harms Act (Bill C-63), meant to curb the sharing of hateful content and non-consensual intimate content. Had it passed, it would have required platforms to take down child sexual abuse material. But the act died on the order paper when Parliament was prorogued in January 2025.

Newly elected Prime Minister Mark Carney has hinted that his government will bring back a version of the Online Harms Act.
