AI imaging has found its place in porn - The Washington Post

Source: https://www.washingtonpost.com/technology/2023/04/11/ai-imaging-porn-fakes/

‘Claudia’ offers nude photos for pay. Experts say she’s an AI fake.

Will users feel ripped off as image-generating AI tools fuel a new wave of porn and scams?

April 11, 2023 at 6:00 a.m. EDT
AI-generated fake images created in Civitai. (Photo by Monique Woo/The Washington Post; illustration images by Civitai)

The photo shows the face of a young woman with long dark hair and a soft smile who says she is “feeling pretty today :).” And on Reddit — where Claudia, as she’s named, offers to sell nude photos to anyone who privately messages her — she is quite popular: “Holy crap you are beautiful,” one commenter said.

But Claudia is fake — a bundle of surprisingly convincing photos made by artificial-intelligence image tools, possibly deployed to pull in cash from unsuspecting buyers, according to two synthetic-media researchers.

Rapid advances in AI-image generators like Midjourney and Stable Diffusion have drawn global attention in recent weeks, thanks to inventive art pieces and impressive fakes of ex-presidents and popes.

But Claudia’s case hints at the technology’s more explicit side: By allowing anyone to create images of fake people that look uncannily real, the tools are reshaping how porn is made and consumed.

Porn has for years been an early proving ground for new technology, and AI-image tools have not broken from that pattern. Thousands of accounts are now registered in discussion boards and chatrooms devoted to the creation and refinement of synthetic people, the majority of whom resemble girls and women. It is a rapid shift that could upend a multibillion-dollar industry, undermine demand for real-world models and actors and fuel deeper concerns about female objectification and exploitation.

A systems administrator at a hospital in the Midwest — who, like the other AI-porn creators and viewers interviewed for this story, spoke on the condition of anonymity — said he has been using Stable Diffusion tools to create fetish photos of adult women in diapers, and that advances in image quality have made it so their fakeness doesn’t matter.

“The average person who’s looking at this stuff, I don’t think they care,” he said. “I don’t expect the person I’m looking at online to be the person they say they are. I’m not going to meet this person in real life. … At the end of the day, if they’re not real, who really cares?”

The Claudia account didn't respond to requests for comment, making it impossible to confirm how the photos were made or how much money the months-old ruse brought in.

But the researchers said the photos carried several clear hallmarks of a fake, including strange background details and a neck mole that went missing between poses. “Actually rather easy to create,” one AI programmer said.

The researchers identified several online profiles of women they believe are fake avatars, based on the telltale artifacts that some AI image generators leave behind. The accounts, which had profiles on Instagram, Reddit, Twitter and OnlyFans, shared images of women in varying stages of undress and told viewers they should pay or subscribe if they wanted to see more.

The suspected fake accounts did not respond to questions. And because most AI-generated images are not watermarked or fingerprinted in any way at the time of creation, it can be challenging for any viewer to confirm whether they are real.

One account published videos of an amateur porn actor from Puerto Rico alongside edited images showing the woman's face on someone else's body. Neither the fake account nor the real one responded to requests for comment.

Hundreds of online accounts followed the fake porn profiles and left comments suggesting they believed the women were real.

“Feel a bit cheated,” Reddit user “legalbeagle1966” said after a Washington Post reporter told him Claudia was likely a fraud. A week earlier, he’d commented on her photo that she looked “pretty sexy and perfect.”

Many of the newest fake images rely on AI programs, known as diffusion models, that allow anyone to type a short text prompt and create a fake photo for free. The images can then be edited further to make them more convincing, including by covering up glitchy spots and refining their quality.
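
The mechanics are simple enough to sketch. As a rough illustration only (the checkpoint name, prompt and output file below are placeholders, not details drawn from the accounts in this story), the open-source diffusers library wraps Stable Diffusion so that a single line of text becomes a finished image:

import torch
from diffusers import StableDiffusionPipeline

# Load a publicly released Stable Diffusion checkpoint.
pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5",
    torch_dtype=torch.float16,
)
pipe = pipe.to("cuda")  # generation is far faster on a GPU

# The short text prompt is the entire input; the model turns random
# noise into an image that matches patterns learned in training.
image = pipe("a photorealistic portrait, studio lighting").images[0]
image.save("output.png")  # the default canvas is 512x512 pixels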

The tools are even simpler to use than the “deepfake” software that fueled worries over AI images in 2017. Where deepfakes used deep-learning techniques to edit existing videos, diffusion models generate entirely new photos by following the patterns learned from the billions of images they analyzed during training.

But the new class of images raises many of the same concerns, including that they could be used to impersonate real women. On some forums, users discuss how to use diffusion models and other AI-powered techniques, such as “inpainting,” to superimpose the faces of real women onto the bodies of AI-generated fakes.

“To humiliate and push women out of the public sphere, they don’t even need to look exactly like the women. They rely on the shock effect,” said Sam Gregory, the executive director of Witness, a nonprofit group that specializes in video technology and human rights.

“But the shift to the diffusion models has made it a lot easier to create volume, variance and personalization, and to do it around known individuals. The quality is just better,” he added. “People already want to lean into believing something that humiliates or mocks or targets someone they dislike.”

Stability AI, the London start-up behind Stable Diffusion, prohibits people from using it to create images that a “reasonable person could find obscene, lewd … [or] pornographic.”

But because the company has made the tool freely available for anyone to download onto their own computer, it has no way to stop people from using it to make whatever they want.

Some forums devoted to using the tool to create AI-generated porn even discuss how the tool can be used to create sexually explicit images of women without their consent. People also have shared guides on how to use the technology to edit real images, including to remove the clothing of women who were photographed fully dressed.

On the AI-art clearinghouse Civitai, one tool touted as “Stable Diffusion models for pervs” that can help generate “uber realistic porn” has been downloaded more than 77,000 times in the last three months. Some of the publicly viewable example images include the prompts that creators used to generate them, such as “dreamy black eyes,” “lust” and “teen.”

The images have fueled some concern in the porn industry about the new technological competition. But not everyone thinks the industry’s days are numbered. Mark Spiegler, an agent for porn actors such as Asa Akira and Riley Reid, said the stars of his industry are performers whose charisma, skill and attractiveness no AI can match.

“I don't think you can machine-learn a personality,” he said in an interview. “You can somewhat replicate it, but you're still missing that human spark and spontaneity.”

Zoey Sterling, an art history student and sex worker in Miami who sells explicit pics and videos on OnlyFans, said she’s not concerned.

“The people saying AI could replace sex workers are the same people who act like sex workers aren’t humans,” she said in an interview. “I wish them the best of luck and hope they can figure out what ‘consent’ means. Maybe they can ask the AI.”

Some female avatar accounts explicitly tell viewers that they were created with Stable Diffusion or other AI tools. Others are a bit more subtle: “Ailice,” an avatar with roughly 10,000 Instagram followers, describes itself as “AI life, real vibes.”

An AI-generated fake image posted on heyitsailice’s Instagram account. (The Washington Post illustration; Instagram)

But others, like Claudia, give no indication at all. The account’s first Reddit post, in January, was a long, sexually explicit passage of dirty talk to which several Reddit posters expressed excitement: “I need a good girl like you in my life,” one wrote. ZeroGPT, a tool for detecting computer-generated text, said it was 97 percent confident the passage was generated by an AI language tool such as ChatGPT.

The account posted other explicit passages and images, including one showing a woman in underwear, and anyone who clicked on the profile could see that the account described itself as a 19-year-old woman who would sell private images to paying buyers.

It wasn’t until last week, when Claudia posted the “feeling pretty today” photo to a Reddit forum devoted to “images of human faces,” that people began suspecting an AI was involved.

Some Reddit users posted that Claudia’s skin looked “too smooth, too ‘perfect’” and pointed out that the white, featureless room she was pictured in had a ceiling beam that disappeared behind her head but didn’t reappear on the other side. An AI programmer who spoke with The Post said it also had some technical giveaways: The photo’s width, of 512 pixels, is Stable Diffusion’s default image size.
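
None of those tells is proof on its own, but the simplest can be scripted. Here is a minimal sketch, assuming the Pillow imaging library and a hypothetical file name; it merely flags images that match Stable Diffusion’s 512-pixel default width and carry no camera metadata, a heuristic that real photos can also trip:

from PIL import Image

def looks_like_sd_default(path: str) -> bool:
    # Heuristic, not a detector: Stable Diffusion's default canvas is
    # 512 pixels wide, and generated files carry no camera EXIF data.
    img = Image.open(path)
    width, _height = img.size
    has_camera_exif = bool(img.getexif())  # phone and camera photos usually do
    return width == 512 and not has_camera_exif

print(looks_like_sd_default("claudia.png"))  # hypothetical file name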

Not everyone saw through the act. The Reddit user “Lanky_Grade_268,” who said he is a 21-year-old hotel cleaner in Argentina, had called Claudia “beautiful and charming” and said he was unnerved by the revelation she might be fake. “It is scary because it will be difficult to distinguish between an AI and a real person,” he said in a message.

But the Reddit user “Ryan5197,” who’d told Claudia she looked “incredible,” was less disturbed. “My opinion of the image is unchanged,” he said. “She’s a pretty character.”

Tatum Hunter contributed to this report.

