It’s Now Dangerously Easy To Make Deepfakes On Your Phone

A recently viral TikTok video showed a very convincing fake Tom Cruise (TikTok: @deeptomcruise)

It’s here. There’s now an app that allows anyone to easily make computer-generated videos of a person, also known as deepfakes, using only a mobile phone and a photograph.

A recently released app called Avatarify lets users control a picture using their own face. It’s as simple as uploading an image, quickly running it through the app, and recording footage of yourself.

It goes further than other apps like Reface and Wombo, which allowed users to make deepfakes, but in a more limited way.

Deepfakes have been around for a long time now, including widely accessible and easy-to-use desktop software that allows anyone to make them. The technology has been used to make creepy-looking political ads in Australia, non-consensual pornographic content of women and even, allegedly, to bully children.

But these latest developments make it more accessible than ever to create convincing deepfakes.

Now, there are reasons to be cautious about embracing the idea that we’ve entered an infopocalypse where we won’t be able to tell what’s real and what’s not.

Firstly, there are people working on tools to detect deepfakes. One recent breakthrough analyses the reflections in subjects’ eyes, where abnormalities tend to crop up in deepfaked individuals.

Anecdotally speaking, most deepfake footage is still pretty obvious to the layperson if they pay attention.

Secondly, convincing synthetic or fake footage predates deepfakes. Using simple editing tools to cut together videos so they appear to show something that never happened, known as ‘cheapfakes’ or ‘shallowfakes’, has been a problem for a very long time.

While maliciously edited videos of public figures like Nancy Pelosi have caused headaches in the past, they haven’t really had any major lasting impact.

While the increasing availability of deepfake tools might not fool audiences outright, it certainly makes it more difficult to ascertain what is real. It’s likely that the biggest impact of deepfakes will be eroding our ability to trust our own eyes.