WITH fury in his eyes, Michael Grothaus pointed his gun at a cyclist and told him to give up his backpack or die.
At the end of the terrifying 90-second clip, the would-be victim fled. He was lucky: the gunman knew he was “going to kill this man” had passers-by not interrupted.
However, this wasn’t the real Michael. It was a ‘deepfake’ – a video where his face had been superimposed onto another man’s body using artificial intelligence technology.
Deepfakes have spiralled out of control since first emerging in 2017, and as the technology continues to advance, things will likely only get worse.
And such cases are far from rare. Across the internet, there are fake videos ‘showing’ Hollywood stars in an orgy, Chinese President Xi Jinping declaring nuclear war… and Tom Cruise playing golf.
Committed armed robbery in deepfake
Michael took a deep dive into this dark and treacherous world for his new book and even requested a deepfake to be made of himself.
In a conversation over an encrypted platform, a stranger explained that he charged £150 ($200) to create convincing videos and often it took less than two days to make.
The journalist considered the fee quite expensive compared to other ‘deepfakes for hire’ he had seen, who advertised their services for between £15 and £112.
The tech whizz, known under the pseudonym Brad, claimed to have worked on “more than 20 but less than 100” jobs – and all but one were to create fake celebrity porn.
The only request involving a non-famous person was for a man who wanted to be superimposed so it appeared he was having sex in a number of different positions.
“He wanted to be the one f***ing this Korean porn star in this one video. It was his favourite porn star and his favourite video of her,” Brad said.
The resulting clip was 30 minutes long and took a day and a half to make.
I know I’m going to kill this man, I’m just waiting to hear the bang
Michael Grothaus, watching his deepfake ‘crime’
Michael wanted a clip of himself committing a crime to expose the dangerous potential of such software – and it seemed Brad was able to do it with ease.
The deepfake for hire revealed he only needed a short video to achieve it, because one second of film footage comprises at least 30 still images.
A one-minute clip therefore contains around 1,800 images, and a 90-second clip around 2,700, each of which can be used to superimpose a new face over a real person’s.
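The frame-count arithmetic above can be sketched in a few lines of Python (a minimal illustration only; the 30 frames-per-second figure is the rate cited in the article, and real frame rates vary by format):

```python
# Assumed frame rate from the article; many videos use 24, 25 or 60 fps.
FPS = 30

def frame_count(seconds: int, fps: int = FPS) -> int:
    """Number of still images a clip of the given length contains."""
    return seconds * fps

print(frame_count(60))  # one-minute clip: 1800 still images
print(frame_count(90))  # 90-second clip:  2700 still images
```

Each of those stills is a training example for the face-swapping model, which is why even a short clip gives a deepfaker plenty of material to work with.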
What Brad came back with was a video that he described as his “best deepfake yet” and it took him just four days to create.
The fake version of Michael was seen harassing a cyclist before chasing him down and demanding, in Spanish, that he hand over his backpack.
As the journalist watched his alter-ego hold the stranger at gunpoint, he found himself yelling at the screen: “Just give me the f***ing backpack, it’s not worth dying over!”
He later recalled thinking: “I know I’m going to kill this man, I’m just waiting to hear the bang.”
Thankfully, nearby strangers intervened and the unknown man – who was the real-life victim of a failed armed robbery – managed to escape.
Michael felt “a bit sick” while watching the clip because it was “so real” and said it confirmed his “worst fears” that someone “could believe I was an armed robber”.
Fighting fake porn is a ‘useless pursuit’, says Scarlett Johansson
Many people have been targeted using deepfake technology, including Hollywood actress Scarlett Johansson – and, like many, she’s been unable to fight back.
There are thousands of photoshopped nudes of the Marvel star and hundreds of fake porn films too – and more continue to be produced.
She warned it was only “a matter of time before any one person is targeted” by lurid forgeries created on the dark web – a part of the internet that allows users to remain anonymous and is used for illicit and illegal activities.
“The fact is that trying to protect yourself from the internet and its depravity is basically a lost cause,” Scarlett told The Washington Post in 2018.
“Nothing can stop someone from cutting and pasting my image or anyone else’s on to a different body and making it look as eerily realistic as desired.”
She described trying to fight as “a useless pursuit, legally, mostly because the internet is a vast wormhole of darkness that eats itself”.
Sextortion, blackmail & financial fraud
Michael explained this is just the tip of the iceberg: some criminals use deepfake technology to extort victims financially and sexually.
They include people who pose as love interests online and, after receiving nude images from their victims, use them for blackmail.
In the past, criminals have demanded money, threatening to send naked photographs or deepfake nudes to victims’ loved ones, friends and employers.
Others have demanded victims fulfil the extorter’s sick sexual demands – online, in person or by sending other naked images.
“And for those who do not? Well, enjoy seeing yourself with embarrassing household objects inserted into your orifices when you Google yourself in the future,” Michael wrote.
“Enjoy the rest of the world seeing it, too. This very real possibility is utterly chilling.”
Investigative journalist Rana Ayyub is one of countless victims. After she criticised several Indian politicians in 2018, she was struck by a series of deepfake attacks.
In the days that followed her ‘inflammatory’ comments during a TV appearance, fake tweets emerged that read: “I hate India and Indians!”
And it only got worse from there: while having lunch with a friend, she was informed there was a video of “her face on the body of a young woman having sex”.
To many, the deepfake clip seemed real, and after being posted on the fan page of one politician it was shared more than 40,000 times.
Another side to this dark underbelly of the web is the creation of synthetic voices used to imitate real people.
One British energy company was conned out of £180,000 when hackers pretended to be their CEO and instructed a managing director to transfer funds to an account in Hungary.
In a statement, the unnamed company said: “The software was able to imitate the voice and not only the voice – the tonality, the punctuation, the German accent.”
Criminals could destroy CCTV evidence
Cyber experts have warned it could only be “several years” before criminals are able to digitally tamper with CCTV footage to hide or obscure people’s faces.
This could allow them to disguise themselves or any other passerby as someone else in live footage and potentially alter evidence that could be used by police in court.
Julija Kalpokiene, a law associate who specialises in IT and data, explained it was a particular risk because “all surveillance systems are interconnected”.
“A cyber-criminal may be able to tweak the systems so the surveillance would not show who is the real criminal,” she told the Daily Star.
‘Worryingly good’ celeb deepfakes
Earlier this year, the ease of using deepfake technology was exposed when a TikTok user was able to imitate Tom Cruise.
Followers were stunned by the life-like and convincing nature of the clips, including one of fake Tom playing golf that accumulated over five million views.
One user described it as “one of the best deepfakes I’ve ever seen”, and noted that the voice was “really good too”.
Another added: “These deepfakes are getting worryingly good. How the heck can we trust what we see on TV?”
Last year, Channel 4 employed a similar tactic in its Deepfake Queen: 2020 Alternative Christmas Message.
In the clip, a pretend version of Her Majesty could be seen dancing and flying through the air after giving Meghan Markle a verbal lashing.
Similar videos have been made of Russian President Vladimir Putin, Facebook’s Mark Zuckerberg and former US Presidents Donald Trump and Barack Obama.
British politicians including Prime Minister Boris Johnson and former Labour leader Jeremy Corbyn have been targeted too.
Experts warn there is a threat to democracy as less tech-savvy people and nations may be unaware the clips are fake.
This could cause serious damage nationally and internationally by leading citizens to change who they vote for, sparking protests and even leading to war.
In the clips, all of the individuals were manipulated into saying or doing things they might never do in real life.
If the public believes deepfakes, they risk not only an individual’s reputation but also their job, relationships and liberty.
Trust No One: Inside The World Of Deepfakes was published by Hodder & Stoughton this month and is available to buy now.