
Artificial intelligence won’t save the internet from porn


"I shall not today attempt further to define the kinds of material I understand to be embraced within that shorthand description ["hard-core pornography"], and perhaps I could never succeed in intelligibly doing so. But I know it when I see it, and the motion picture involved in this case is not that." — United States Supreme Court Justice Potter Stewart

In 1964, the Supreme Court overturned an obscenity conviction against Nico Jacobellis, a Cleveland theater manager accused of distributing obscene material. The film in question was Louis Malle’s "The Lovers," starring Jeanne Moreau as a French housewife who, bored with her media-mogul husband and her polo-playing sidepiece, packs up and leaves after a hot night with a younger man. And by "hot," I mean a lot of artful blocking, heavy breathing and one fleeting nipple — basically, nothing you can’t see on cable TV.

In six simple words, Justice Stewart captured the near-impossible task of pinning down a single definition of pornography: "I know it when I see it."

Attitudes toward sex have changed significantly since 1964. Soon after Jacobellis’s case reached the Supreme Court, the United States experienced a sexual revolution, followed by the porn boom of the 1970s and, more recently, the advent of the internet. Today, anyone with an internet connection can be knee-deep in creampies and pearl necklaces in a matter of seconds. We’ve come a long way, but one thing remains the same: We’re still nowhere close to a universal definition of pornography or obscenity.

Jeanne Moreau and Jean-Marc Bory in the not-so-sexy scene from "The Lovers" at the heart of Jacobellis v. Ohio (Image Credit: Getty Images)

But unfettered access to all things smutty, dirty and questionably filthy has created a surge in censorship tools that, in theory, use algorithms and advanced artificial intelligence programs to identify porn and weed it out. Last year, Twitter acquired Madbits, a small AI startup that, according to a Wired report, created a program that accurately identifies NSFW content 99 percent of the time and alerts users to its presence. Late last month, Yahoo open-sourced its own deep-learning porn filter, and there are no doubt similar projects underway at other internet companies.
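Under the hood, filters like these typically boil down to a classifier that assigns an image a score and a human-chosen cutoff that decides what gets flagged. The sketch below is a minimal illustration of that pattern, not Twitter's or Yahoo's actual code: nsfw_score() is a hypothetical stand-in for a trained model, and the thresholds are invented for the example.

```python
# Illustrative sketch of how an AI porn filter gates content.
# nsfw_score() is a hypothetical stand-in for a trained image
# classifier (e.g., a deep convolutional network), not a real API.

def nsfw_score(image_bytes: bytes) -> float:
    """Pretend model: returns a score in [0.0, 1.0], where higher
    means 'more likely NSFW'."""
    raise NotImplementedError("stand-in for a trained classifier")


def moderate(image_bytes: bytes, flag_above: float = 0.8,
             review_above: float = 0.2) -> str:
    """Route an image based on human-chosen cutoffs.

    The cutoffs are the crux: people pick them, so they encode a
    particular definition of 'not safe for work'."""
    score = nsfw_score(image_bytes)
    if score >= flag_above:
        return "flagged"        # hide, blur or warn the user
    if score >= review_above:
        return "human_review"   # ambiguous cases go to moderators
    return "allowed"
```

Even a model that is right 99 percent of the time inherits its notion of "not safe" from whoever labeled its training images and picked those cutoffs.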

Big players have been sinking big money into cleaning up the internet for decades. The trouble is, censorship is a slippery slope, and obscenity is inherently subjective. If we can’t agree on what constitutes pornography, we can’t effectively teach our computers to "know it when they see it." No matter the sophistication of the technology or the apparent margin of error, porn filters still depend on humans to teach them what is and isn’t NSFW.

Sometimes a naked child is more than a naked child.

In the early days of the world wide web, US libraries and schools implemented filters based on rudimentary keyword searches in order to remain in compliance with the Children’s Internet Protection Act. The act attempts, as the name suggests, to protect children from the darker side of the internet, specifically "pictures that are: (a) obscene; (b) child pornography; or (c) harmful to minors (for computers that are accessed by minors)."

But that’s not exactly how it played out.

A 2006 report on internet filtering from NYU’s Brennan Center for Justice referred to early keyword filters and their AI successors as "powerful, often irrational, censorship tools."

"Filters force the complex and infinitely variable phenomenon known as human expression into deceptively simple categories," the report continued. "They reduce the value and meaning of expression to isolated words and phrases. An inevitable consequence is that they frustrate and restrict research into health, science, politics, the arts, and many other areas."

The report found that popular filters inexplicably blocked sites belonging to Boing Boing, GLAAD, photographer Robert Mapplethorpe and Super Bowl XXX, among others, and often reflected the political and social prejudices of their creators. While Yahoo and Google’s AI-powered filters have replaced keyword searches with sophisticated image recognition, they still rely on humans to teach them what is and isn’t safe for work. And as Facebook recently discovered, images are no less divisive than words.
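To see how a filter ends up blocking a page about Super Bowl XXX, it helps to look at a toy version of the rudimentary keyword matching those early tools relied on. The blocklist below is invented for illustration, not taken from any real product, but naive substring matching of this kind is exactly how pages about sports, health or art got swept up.

```python
# Toy recreation of a rudimentary keyword filter of the kind early
# library and school systems used. The blocklist terms are purely
# illustrative, not drawn from any real product.

BLOCKLIST = ["xxx", "porn", "breast"]

def is_blocked(page_text: str) -> bool:
    """Naive substring matching: no context, no exceptions."""
    text = page_text.lower()
    return any(term in text for term in BLOCKLIST)

# Over-blocking in action:
print(is_blocked("Super Bowl XXX halftime show"))           # True
print(is_blocked("Schedule your breast cancer screening"))  # True
print(is_blocked("Jeanne Moreau filmography"))              # False
```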

(Image Credit: ASSOCIATED PRESS)

The social network faced widespread backlash in early September when it took down the photo above, a Pulitzer Prize-winning image from 1972 that shows a naked 9-year-old girl running away from a napalm attack during the Vietnam War. Facebook initially defended the removal as an enforcement of its community standards, saying, "While we recognize that this photo is iconic, it’s difficult to create a distinction between allowing a photograph of a nude child in one instance and not others."

But as the New York Times reported, Facebook reinstated the original post after thousands of users posted the photo to their timelines in protest. The company explained:

"An image of a naked child would normally be presumed to violate our community standards, and in some countries might even qualify as child pornography. In this case, we recognize the history and global importance of this image in documenting a particular moment in time."

It’s not clear how the image was flagged, but whether it was a human, an AI or some mix of the two, the bottom line is the same: Sometimes a naked child is more than a naked child.

Sometimes a man with a bullwhip hanging out of his ass is more than a man with a bullwhip hanging out of his ass.

This isn’t the first time Facebook has been criticized for censoring images that many deem to be "clean." The social network has repeatedly come under fire for deleting posts containing exposed female breasts in the context of nursing photos and information about mammograms. More recently, it learned a lesson about the fine line between pornography and art when it deleted, and later reinstated, a video of a black woman who painted her naked body white on Facebook Live to draw attention to police brutality and the Black Lives Matter movement.

The real world, too, is rife with examples of the debate over what is art and what is porn. In 1990, the Contemporary Arts Center in Cincinnati and its director were charged with, and acquitted of, obscenity for an exhibition of Robert Mapplethorpe’s photography.

It was the first time such charges were brought against a museum in the US, and the photos in question — depictions of gay S&M — were at the center of a national debate headed by the Republican Party. The prosecution argued that the exhibition, funded by the National Endowment for the Arts, constituted pornography, while the defense defined it as art. That case proved that sometimes a man with a bullwhip hanging out of his ass is more than a man with a bullwhip hanging out of his ass. It also proved that our access to art, no matter how controversial, isn’t always guaranteed.

Our personal prejudices continue to undermine our access to information and freedom of expression, despite advances in internet filtering. We may never agree on what NSFW really means, but without a universal definition, our machines will simply act as conduits for our own opinions. Not one of us can claim to know it when we see it, and no amount of code can change that.
