Pornhub Might Lose Visa and Mastercard After New York Times Exposé

Last week, New York Times columnist Nicholas Kristof detailed the easily searchable horrors uploaded in plain view onto Pornhub, arguing that the site goes unpunished for profiting from child sex abuse material (CSAM), sexual assault, and nonconsensual pornography (sometimes called “revenge porn”). Now, Visa and Mastercard claim that they’re reassessing their relationship with the site’s parent company, Canadian porn megacorp MindGeek.

Personally, as a fan of small business and independent creators: down with MindGeek.

Not to fault the credit card companies for trying, but it’s interesting that Visa and Mastercard would leap to their feet now, in the fallout of a Times exposé, since it’s just the latest in a litany of reports about CSAM embedded in the amateur porn market. Earlier this year, a petition to shut down Pornhub circulated after a mother found her missing 15-year-old daughter via Pornhub uploads. (More reported examples of CSAM that surfaced on Pornhub here and here.)

Last year, Pornhub took months to remove the channel “Girls Do Porn,” a production company that was federally indicted over a business model allegedly built on nonconsensual porn. And while Pornhub’s models (performers who monetize with ads, video sales, downloads, and tips) have to submit a government-issued ID, Pornhub doesn’t require age or identity verification for non-monetized users to upload a video.

In emailed statements to Gizmodo, Visa said that it is “vigilant” in rooting out illegal activity in its network, and a site will no longer be able to accept Visa payment if it “is identified as not complying with applicable laws or the financial institutions’ acceptable use policies and underwriting standards.” Similarly, Mastercard said that it works “closely with law enforcement and organizations like the National and International Center for Missing and Exploited Children to monitor, detect and prevent illegal transactions.”

“We are investigating the allegations raised in the New York Times and are working with MindGeek’s bank to understand this situation,” Mastercard added, “in addition to the other steps they have already taken. If the claims are substantiated, we will take immediate action.”

But it’s unclear whether the credit card companies are investigating the existence of CSAM on MindGeek’s plexus of sites, or investigating whether MindGeek has been aware of CSAM and failed to follow legal requirements to act on it. None of the companies have responded to requests for further comment.

Pornhub accepts Visa and Mastercard from users who pay for the Premium Pornhub experience, the perks of which include the ability to download videos, message members, access exclusive content, and view the site’s offerings ad-free. PayPal terminated its relationship with Pornhub last year, following a pattern of payment processors’ unexplained rejection of sex work-related payments. Customers and models can still pay, or be paid, through cryptocurrency.

In an email to Gizmodo, Pornhub argued that the volume of child sex abuse material is astronomically higher on mainstream social media sites; Pornhub was quick to point out that Facebook and Instagram reported removing a combined 32.9 million pieces of CSAM in 2020, and Twitter reported removing over 775,000 pieces of CSAM in 2019. Pornhub compared this to the 118 pieces of CSAM that the Internet Watch Foundation reportedly identified on Pornhub between 2017 and 2019. The Internet Watch Foundation was not immediately available for comment.

While Facebook was not immediately available for comment, a Twitter spokesperson directed Gizmodo to the company’s zero-tolerance policy for CSAM and said that Twitter immediately removes illegal tweets, including links to third-party sites, and reports them to the National Center for Missing & Exploited Children (NCMEC).

Pornhub, for its part, also said that it employs a “vast team of human moderators dedicated to manually reviewing every single upload,” as well as a flagging system and automated detection systems. The site also claims to use YouTube’s automatic detection technology, Google’s detection AI, Microsoft technology that roots out known child exploitation material, and systems that prevent banned videos from being re-uploaded.

Writing for Slate earlier this year, Jezebel’s Rich Juzwiak suggested that paying for professional porn is the only real safeguard against CSAM. And there are good reasons to poke around for indie alternatives to the “Amazon” of porn. Sex workers and advocates have pointed out year after year that the MindGeek conglomerate monopolizes exposure, takes a sizable cut, and allows piracy to run rampant.

If you find child sexual abuse material, report it to the National Center for Missing and Exploited Children.

Whitney Kimball