Fake-porn videos are being weaponised to harass and humiliate women

By Drew Harwell.

It was her face. But it had been seamlessly grafted, without her knowledge or consent, onto someone else’s body: a young pornography actress, just beginning to disrobe for the start of a graphic sex scene. A crowd of unknown users had been passing it around online. …

Airbrushing and Photoshop long ago opened photos to easy manipulation. Now, videos are becoming just as vulnerable to fakes that look deceptively real. …

Actress Scarlett Johansson … has been superimposed into dozens of graphic sex scenes over the past year that have circulated across the web: One video, falsely described as real “leaked” footage, has been watched on a major porn site more than 1.5 million times. She said she worries it may already be too late for women and children to protect themselves against the “virtually lawless (online) abyss”.

[Easy to find with a search like “Actress Scarlett Johansson porn video”]

“Nothing can stop someone from cutting and pasting my image or anyone else’s onto a different body and making it look as eerily realistic as desired,” she said. “The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause … The internet is a vast wormhole of darkness that eats itself.” …

In September, Google added “involuntary synthetic pornographic imagery” to its ban list, allowing anyone to request the search engine block results that falsely depict them as “nude or in a sexually explicit situation”. But there’s no easy fix for their creation and spread. …

Easy to make authentic-looking but fake videos of anyone:

Several users who make videos by request said there’s even a going rate: about $20 per fake. …

Videos have for decades served as a benchmark for authenticity, offering a clear distinction from photos that could be easily distorted. Fake video, for everyone except high-level artists and film studios, has always been too technically complicated to get right.

But recent breakthroughs in machine-learning technology, employed by creators racing to refine and perfect their fakes, have made fake-video creation more accessible than ever. All that’s needed to make a persuasive mimicry within a matter of hours is a computer and a robust collection of photos, such as those posted by the millions onto social media every day. …

Fake videos in politics:

Not all fake videos targeting women rely on pornography for shock value or political points. This spring, a doctored video showed the Parkland school shooting survivor Emma Gonzalez ripping up the Constitution. Conservative activists shared the video as supposed proof of her un-American treachery; in reality, the video showed her ripping up paper targets from a shooting range. …

The problem is not going to go away:

“Most guys never land their absolute dream girl,” [deepfake creator using the name “Cerciusx”] said. “This is why deepfakes thrive.”

Scarring a person’s life is too easy:

In April, Rana Ayyub, an investigative journalist in India, was alerted by a source to a deepfake sex video that showed her face on a young woman’s body. The video was spreading by the thousands across Facebook, Twitter and WhatsApp, sometimes attached to rape threats or alongside her home address.

Ayyub, 34, said she has endured online harassment for years. But the deepfake felt different: uniquely visceral, invasive and cruel. She threw up when she saw it, cried for days afterward and rushed to the hospital, overwhelmed with anxiety. At a police station, she said, officers refused to file a report, and she could see them smiling as they watched the fake.

“It did manage to break me. It was overwhelming. All I could think of was my character: Is this what people will think about me?” she said. “This is a lot more intimidating than a physical threat. This has a lasting impact on your mind. And there’s nothing that could prevent it from happening to me again.”