Pornhub policies reveal legal gaps and lack of enforcement around exploitive videos

OTTAWA — Serena Fleites was in seventh grade when a sexually explicit video of her was uploaded to Pornhub.

The fallout from that fateful post — made without her knowledge after a boyfriend demanded she send him naked images — would send her into a years-long spiral of depression, drug use and self-harm, as the illegal content resurfaced repeatedly on the Montreal-based platform and across the internet.

The California resident, now 19, says Pornhub took more than a week to respond to her initial request to take the video down, and weeks more to actually remove it, only to let it emerge again days later, a traumatic process that played out over and over.

“It had already been downloaded by people all across the world, basically, and it would always be uploaded over and over and over again,” Fleites told a parliamentary committee on Feb. 1, blaming Pornhub in particular.

“They're really selfish. They need to really look at themselves in the mirror, because they're prioritizing money and content over actual human beings' lives.”

Fleites’s words echo those of other survivors who testified over the past month about their depictions in rape videos, revenge porn or sexual material posted to Pornhub without their consent.

Companies like MindGeek, which owns Pornhub and numerous other sites such as YouPorn and RedTube, have thrived for years even as reports poured in about images of children and non-consensual acts, and despite laws against child pornography and sexually abusive material.

“We have been screaming from the rooftops that we are long overdue for regulation,” Lianna McDonald, executive director of the Canadian Centre for Child Protection, said last Monday to the House of Commons ethics committee, which is weighing concerns around privacy and streaming platforms such as Pornhub.

Sites are free to determine their own content moderation policies and reporting processes, leaving victims “at the mercy of these companies,” she said. Meanwhile, there is no self-regulating industry body to handle complaints or uphold standards.

“We’ve never needed government more than we do now to step in and intervene,” McDonald said.

Others say laws are in place but lack teeth, hampered by weak enforcement, poor resources and jurisdictional barriers amid a global scourge.

In Canada, users who upload illicit images — including child pornography or material posted without consent — are jointly liable with the companies that distribute them, or that become aware of such material afterward and neglect to remove it. That differs from the U.S., where the Communications Decency Act does not hold web platforms responsible for user-generated material.

“Do we really need a new law to tell us that broadcasting child-sexual-assault material is illegal?” asked Daniel Bernhard, executive director of Friends of Canadian Broadcasting. “The issue is not that it is unregulated, the issue is that the law isn’t being applied.”

Grey areas and enforcement hurdles are part of the problem.

No regulations explicitly require sites to screen their content — either via algorithms or human moderators — or verify the ages and consent of participants.

Moreover, what constitutes “knowingly” distributing content where the individual did not give consent, a Criminal Code violation, remains unclear.

“There are loopholes like that that need to be closed. Because you don’t want a situation where a company is not monitoring and they’ll say, ‘Well, I can’t have possibly known because I don’t moderate,’” Lloyd Richardson, IT director at the Canadian Centre for Child Protection, said in an interview.

In December, a Toronto-based platform called YesUp Media became the first corporation in Canada known to be convicted on criminal charges under a 10-year-old federal law obliging companies to report online child pornography if they learn their business is being used to access it.

While setting a precedent, the case underscored the dearth of prosecutions for a crime that is ubiquitous online — the RCMP has received 215,000 reports of online child sexual exploitation since 2018.

YesUp, which had been warned hundreds of times that a Vietnam-based client was hosting massive amounts of child pornography, faced a fine of only $100,000 and probation. And four men affiliated with the company who were charged criminally wound up receiving fines of just $1,000 under a plea deal.

Sanctions of that size offer little deterrent to porn giants.

The fine is a drop in the ocean of profits for what is arguably Canada’s largest tech company — MindGeek — and its Pornhub subsidiary, which is often ranked among the dozen most-visited websites in the world, ahead of Netflix, Yahoo and Zoom.

The global scope of online pornography pays little heed to borders, causing headaches for investigators hemmed in by jurisdiction.

“Like many forms of cybercrime, online child sexual exploitation is often multijurisdictional and multinational, which creates many complexities for law enforcement,” the RCMP said in an email.

Police do not have a full toolkit to hold the owners of pages sharing videos of young victims accountable or to shut down sites advertising underage sex, concluded a December report from Quebec's select committee on the sexual exploitation of minors.

Among its recommendations was changing the definition of “place” in the Criminal Code, since it can be tough for police to act if the original crime took place in another country or the website's servers are in a different jurisdiction.

MindGeek is based in Montreal but it operates globally, and jurisdiction is tough to determine because it hosts content outside of Canada, said RCMP spokeswoman Cpl. Caroline Duval.

International standards and co-operation are needed to tackle illegal online distribution, said Kate Isaacs, founder of the U.K.-based Not Your Porn, which campaigns against the use of sexual images posted without consent.

Isaacs is calling for rules requiring companies to maintain trained staff who carry out content moderation and removal "at scale," and to keep detailed records of user reports and responses that can be audited.

“You come across children as young as three on there,” Isaacs said. “And it’s one of those things that you cannot un-see.”

Pornhub has pushed back against accusations it allows child sexual abuse materials on its site, noting it scrubbed some 10 million videos posted by unverified users in December — though only after Visa and Mastercard cut off payment services.

“Any assertion that we allow (that) is irresponsible and flagrantly untrue,” the company said in an email late last year, claiming “zero tolerance” for child sexual abuse materials.

The platform works with dozens of non-profit organizations that flag content and work to stop online child exploitation. It also says it uses extensive measures to shut out such material, including “a vast team” of human moderators who manually review each of the 6.8 million videos uploaded annually and remove illegal material, along with automated detection technologies.

The House ethics committee is expected to present recommendations in a report to Parliament later this year on online privacy and consent.

“It seems that the horrific abuse that Ms. Fleites faced is not an isolated incident,” said NDP ethics critic Charlie Angus. “How is it possible that that can go on?”

This report by The Canadian Press was first published Feb. 28, 2021.

Christopher Reynolds, The Canadian Press
