On Etsy, anyone can purchase Olivia Munn t-shirts and photos autographed by the actor herself. There are also Olivia Munn fan mugs that can be personalized with a speech bubble. And, until Tuesday, you could buy AI-generated deepfake digital images of her – some of them pornographic.
“With a resolution of 300 DPI, Olivia's details come to life, making it ideal for digital art, design, and printing. Olivia celebrates the elegance of the female form in a tasteful and artistic manner, making it suitable for a variety of creative applications,” read the description for one image on sale for $2.49.
Bryan Sullivan, an attorney for Munn, an actor who has starred in The Newsroom and multiple superhero films, said that his client feels “violated and offended” by these images.
“This is outrageous and a violation of my client’s rights, and more importantly, her dignity,” he told Forbes. “We will be taking action to remove this and prevent this from happening again. And I've already started that process with Etsy.”
Though Sullivan said he immediately notified Etsy after being contacted by Forbes on December 13, Etsy did not take the images down until six days later, after Forbes reached out to the company directly.
Other sellers had offered similar image sets at slightly higher prices ($5.51) and even took some creative license: “Jenna Ortega just finished taking a bath. Shaved privates, oversized chest.”
Another vendor offered to make “any celebrity you like nude in different positions…whether naked, during sex or simply in lingerie,” for the low price of $0.92.
Etsy removed those accounts as well – 16 that Forbes identified – but left up thousands of other listings containing AI-generated pornography, all easily discoverable with the most rudimentary search terms.
“We are deeply committed to the safety of our marketplace and community, and we removed the violating listings in question immediately when they were brought to our attention,” Alice Wu, Etsy’s head of trust and safety, said in an emailed statement to Forbes.
“Nude celebrity deepfakes are prohibited. Given this is an emerging issue across the Internet, we are actively working to scale our enforcement efforts in this area.”
This publicly traded company – which amassed a market cap of nearly $10 billion facilitating the sale of tchotchkes and artisanal handicrafts – has a deepfake porn problem, ushered in by the mainstreaming of AI.
Etsy declined to explain its policies for AI-generated images of real people who aren’t celebrities, or why there are so many artificial pornographic images on its site.
“While some mature content is allowed on Etsy, pornography is prohibited,” Wu added.
Despite this policy, a search for “deepfake porn” returned over 1,600 results as of December 18. Some of those results were not porn, merely offering non-explicit services to “make your own deepfake video.”
After Forbes reached out to Etsy, the number of results for that search term had fallen to just under 1,500. Similarly, on Monday, searching for “ai nude” yielded over 4,000 results; after Forbes reached out, that number had decreased to under 3,700.
Some of these AI-generated images were deepfakes of female celebrities, from Munn and Jenna Ortega to Ariana Grande. Others depicted entirely fabricated humans, mostly women. (Representatives for Ortega and Grande did not respond to requests for comment.)
The listings were quite clear about what they were selling. “This package contains 40 high quality uncensored JPG images featuring many different gorgeous AI-generated, fully nude young women in different poses and different locations. No duplicates from other listed packages,” read one listing.
While Etsy’s Prohibited Items Policy at the time of publication forbids the sale of pornography — defined as “material that explicitly describes or displays sex acts, sex organs, or other erotic behavior for the purpose of sexual arousal or stimulation” — there is plenty of it being sold on the site.
At the time of publication, the most straightforward searches for phrases like “AI porn” returned explicit imagery, including artificially generated collections of “goth sluts,” “naughty nurses” and “winter flashing,” and a decorative throw pillow depicting oral sex.
Meanwhile, Etsy’s recommendation algorithms, at the time of publication, were pointing users to similar images. At the bottom of the listing for the now-deleted Munn deepfake image were listings for several other erotic, artificially generated images of celebrity women sold by the same vendor, and suggestions to “explore related searches” using terms like “nsfw ai art” and “olivia munn nude.”
According to Hany Farid, a computer science professor at UC Berkeley and an expert on generative AI technologies, there is “no technical reason” that Etsy couldn’t do a better job of filtering out these materials. Searches for the same phrase (“deepfake porn”) on other ecommerce platforms, including Amazon and eBay, do not return similar deepfake pornographic items.
Officially, Etsy has drawn a line between nude imagery, which it allows, and pornography, which it does not. Etsy adheres to a common legal definition of the genre, forbidding images that depict sex organs or sex acts “for the purpose of sexual arousal.”
“We are still working to determine the place that AI-generated products have in our marketplace, but AI-generated listings that violate our longstanding policies will be removed,” Wu said in a statement.
She said that while sellers are “responsible for complying with our policies,” Etsy monitors the site “both manually and through automatic controls.” She declined to explain exactly what that entails or why, if such precautions are in place, a simple search for “AI porn” continues to return products featuring pornographic deepfakes of well-known female actors.
In recent years, deepfake pornography, which disproportionately affects women, has become far more sophisticated, much easier to create, and has proliferated to unprecedented levels, experts say.
Readily available software can take nearly any image and make it pornographic, often with near-realistic details. Websites devoted to altering images of real women by using AI to remove their clothes already allow anyone to create endless AI-generated pornographic images in seconds — but these sites are not publicly traded ecommerce platforms.
“On fuckin’ Etsy!” Farid told Forbes. “Not to be too blunt about it. That’s how you know it’s gone mainstream, man, it shows up on Etsy.”
Etsy, which was founded in 2005, went public in 2015. In 2022, the company made a profit of $643 million, up from $627 million the year before. It laid off 11 percent of its workforce in early December. It is also struggling to manage a flood of AI-generated content, from bizarre coloring books to a plethora of cheap coffee mugs with quippy sayings, The Atlantic reported earlier this year.
According to one Etsy vendor who had been selling these deepfake celebrity images, they are “not particularly popular” because “nowadays, everyone can generate artificial explicit images using AI.” The other vendors did not respond to requests for comment.
Rebecca Delfino, a law professor at Loyola Marymount University in Los Angeles who for years has studied the intersection of deepfakes and the law, told Forbes that while there is no federal law protecting real-world victims of deepfakes, there are some state laws.
“When you are selling anything in a commercial way and mass marketing it, then you are subject to the whole bunch of copyright claims, [from] appropriation of likeness to defamation, to false light, and in some states like California and New York there are now civil actions,” she said, noting the states where many of these celebrities are based.
Delfino said most celebrities’ lawyers would send a cease-and-desist letter to Etsy as a way of protecting their clients.
To date, she added, there have not been any major cases testing these new state laws surrounding deepfake pornography. However, a case involving a non-pornographic deepfake is currently being adjudicated in federal court in Los Angeles.
Earlier this year, a California reality TV personality, Kyland Young, successfully sued a Ukrainian company, Neocortext, which makes the “Reface” face-swapping app. Young argued that Neocortext violated his right of publicity under California state law by allowing users to pay to insert themselves into still images with Young, or even to swap their own face onto his body. This month, Neocortext appealed its loss to the 9th U.S. Circuit Court of Appeals, arguing that its app is protected under the First Amendment.
With limited legal recourse (not to mention the time and resources required to pursue it), much of the onus for managing the proliferation of porn deepfakes ends up with the tech platforms that host them and make them accessible — whether that’s Etsy, or Google, which makes other sites that publish deepfake porn easy to find.
“I don't think this gets solved with letters and lawyers,” Farid said. “This is the internet, man. You have to think about where the chokepoints are.”
This story has been updated with additional comment from Etsy.