LONDON: When engineering student Taylor Klein was told she was in a porn video, she was convinced her friend was mistaken but, to her horror, she soon discovered half a dozen explicit clips online.

The realistic videos - known as deepfakes - were created using generative artificial intelligence (AI) to superimpose Klein's face onto another woman's body, but when she went to the police they told her no law had been broken.

"I was just terrified that there was someone out there who wanted to do this to me," said the student, who recounts her quest for justice in a new film called "Another Body".

Klein only agreed to tell her story on film if her identity were hidden. Taylor Klein is a pseudonym and during the documentary she reveals that although viewers are watching her, the face they can see is an AI-generated deepfake created using an actor's features.

The documentary, which has won awards at film festivals and is set to be screened by the BBC in early February, explores how highly abusive deepfake porn has been allowed to proliferate unchecked by laws or regulations.

"This is no longer niche. It has truly hit the mainstream and is exploding," the film's co-director Sophie Compton told the Thomson Reuters Foundation. "We've got to act now."

Campaigners say rapid advances in AI technology and an absence of legislation have helped deepfake porn flourish, facilitated by Big Tech, from search engines to internet service providers and social media.

The devastating consequences of deepfake abuse were highlighted last year when an Egyptian teenager killed herself after she was blackmailed over fabricated images. She left a note that read, "Mom, please believe me. The girl in those pictures isn't me".

The production of deepfakes is doubling every six months, according to tech experts, and while politicians and the media have focused on their potential to disrupt elections and endanger democracy, the vast majority of deepfakes - about 90% - are non-consensual porn of women.

Rights group Equality Now, which is compiling a global study of relevant legislation, has called for countries to introduce strong laws to tackle the issue.

Women who had been targeted told the Thomson Reuters Foundation that legislation must also be accompanied by better police training and a wider shift in cultural attitudes that too often blame the victims of online sexual abuse rather than the perpetrators.


RISK OF MASS-SILENCING OF WOMEN

Campaigners said deepfake abuse was not only "life-shattering" for individuals but could have grave social ramifications - deterring women from engaging in public life and driving them off online platforms, leading to a mass silencing of women.

Noelle Martin, an Australian legal researcher and one of the first women globally to speak out about being targeted, said deepfake porn was often weaponised against women in jobs like politics and journalism to intimidate and undermine them.

High-profile targets of deepfake porn include U.S. Vice President Kamala Harris and politicians Alexandria Ocasio-Cortez and Lauren Book, former German chancellor Angela Merkel, climate activist Greta Thunberg and British actress and women's rights advocate Emma Watson.

"Not only is there a danger that the threat of deepfake porn will deter women away from public life, it can cause women to self-censor," said Martin.

Affected women said it was usually impossible to discover who had created the videos or get them removed, meaning the abuse was permanent.

"It's always going to be there," said Ruby, a British teacher who asked only to use her first name. "That's something victims are very conscious of. Even if you (can) get it taken down it could pop up again."

"A lot of victims I know have never recovered. They completely deleted their online presence. I know women who quit their jobs. Some moved elsewhere. There's a very real impact."


DETECTIVE WORK

There are more than 3,000 sites dedicated to deepfake porn, according to #MyImageMyChoice, a campaign co-founded by Compton to kickstart a digital phase of the #MeToo movement and push for legislation.

Compton said the abuse was increasingly being promoted as a genre of porn with creators building businesses and brands around the practice.

In some cases, people were even creating deepfake porn and then asking their targets for money to take it down.

Compton said it was very hard to identify those behind the abuse with many hopping between virtual private networks (VPNs), turning the hunt into a game of "whack-a-mole".

Some sites organise content by the geographic locations of the women targeted, or even their universities.

American student Klein said her name, college and hometown were posted with the deepfakes, heightening her anxiety.

A slew of degrading comments and threats posted below the videos left her terrified that violent men might try to track her down. She also feared that the videos might crop up if potential employers were doing background checks.

With no police help forthcoming, Klein turned detective herself, trawling through a maze of shady sites and forums.

She eventually identified the deepfake creator as a former friend, who she discovered had targeted many other students, including a high-profile YouTuber who goes by the name Gibi.

Although the police could not charge the man who created the deepfakes, Gibi is working with a lawyer to bring a civil case against him which could be groundbreaking, according to Compton.


LACK OF LAWS

Equality Now said very few countries had laws around deepfake sexual abuse.

Those leading the way include Australia, South Africa and Britain, which recently passed a long-awaited Online Safety Act.

Equality Now's digital rights expert Amanda Manyame said other legislation, such as privacy, defamation and misinformation laws, might afford some protection in certain countries, but this had yet to be tested in court.

With videos potentially shared millions of times across different platforms and jurisdictions, Manyame said legislation needed to be backed up by a strong and coordinated international response.

Campaigners also called for Big Tech to be held accountable.

Compton said search engines, internet service providers and payment systems were complicit in facilitating the creation and proliferation of deepfake porn.

"These (deepfake) sites are operating as businesses and creating the impression that it's another genre of porn when it's not. It's abuse," she said.

She called for search engines to de-rank and de-list deepfake porn and for service providers to block sites, pointing out that they already did this with images of child abuse.

A Google spokesman said the company was expanding protections, had made it easier for affected people to request the removal of non-consensual deepfake porn from search results, and had introduced systems to detect and remove duplicates.

But Compton said it was harder to persuade companies to act when there was no legislation in place.

"This needs to be made illegal to very clearly state that this is not something our society thinks is OK and to very clearly state to tech companies that they must act," she said.