Microsoft’s Made a Tool to Spot Deepfakes

Contributor: Holly Brockwell
This post originally appeared on Gizmodo UK, which is gobbling up the news in a different timezone.

Anyone who was around in the 80s will be pretty surprised that Bill Gates and Microsoft are turning out to be the good guys of 2020, but here we are.

MS has created a new tool called Video Authenticator to identify deepfakes (those spooky videos that all too realistically swap one person’s face for someone else’s), giving each clip a confidence score for how likely it is to have been artificially manipulated.

The idea, of course, is to “combat disinformation” – like videos of politicians saying things they never said. In a lower-tech recent example, a man with motor neurone disease had his computerised voice manipulated in a video to make him say politically charged words he hadn’t spoken. As the tech improves, so will the fakery.

At the same time, Microsoft has also launched a tool that lets video creators embed hidden digital hashes in their footage, so that any unauthorised changes can be flagged later.
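Microsoft hasn’t published the details of how that embedded fingerprint works, but the general idea – record cryptographic hashes of the content so that later edits break the match – can be shown with a minimal sketch. Everything here (the chunk size, the function names) is a hypothetical illustration, not Microsoft’s implementation:

```python
import hashlib

CHUNK_SIZE = 1 << 20  # 1 MiB per segment; an arbitrary choice for illustration


def fingerprint(path: str) -> list[str]:
    """Hash a video file in fixed-size chunks so edits can be localised."""
    hashes = []
    with open(path, "rb") as f:
        while chunk := f.read(CHUNK_SIZE):
            hashes.append(hashlib.sha256(chunk).hexdigest())
    return hashes


def verify(path: str, original_hashes: list[str]) -> bool:
    """Return True only if every chunk still matches the recorded hashes."""
    return fingerprint(path) == original_hashes
```

In Microsoft’s version the hashes and certificates reportedly travel with the content as metadata and are checked by a separate reader, but the principle is the same: any change to the underlying bytes breaks the match.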

Video Authenticator has been launched now because there’s likely to be a flurry of deepfakes in the run-up to the US election in November. However, MS has sensibly decided not to give the public direct access – because that would just help the deepfakers make better videos that pass detection (sigh). Instead, the BBC reports the company is “only offering it via a third-party organisation, which in turn will provide it to news publishers and political campaigns without charge.”

The tool looks for subtle changes that the human eye easily misses, such as altered pixels along the borders where an artificially added face has been blended into the background.
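Microsoft hasn’t published its detection model, but a toy version of the underlying intuition – a pasted-in face tends to leave a seam of unusual pixel gradients around its boundary – might look something like this. The face-box coordinates and the scoring heuristic below are made up for illustration; the real tool uses a trained model, not a hand-rolled ratio:

```python
import numpy as np


def boundary_seam_score(frame: np.ndarray, box: tuple[int, int, int, int]) -> float:
    """Compare gradient energy along a face box's border with the frame average.

    A strong mismatch *may* indicate blending artefacts from a composited face.
    frame: 2-D greyscale image as a float array; box: (top, bottom, left, right).
    """
    gy, gx = np.gradient(frame.astype(float))
    energy = np.hypot(gx, gy)

    top, bottom, left, right = box
    border = np.concatenate([
        energy[top, left:right], energy[bottom - 1, left:right],
        energy[top:bottom, left], energy[top:bottom, right - 1],
    ])
    # A ratio well above 1 means the seam is "busier" than the rest of the frame.
    return float(border.mean() / (energy.mean() + 1e-9))


# Toy usage with a synthetic frame rather than a real video still.
frame = np.random.rand(720, 1280)
print(boundary_seam_score(frame, (200, 500, 400, 700)))
```

Again, this is only a sketch of why the blending boundary is a useful signal, not Video Authenticator’s actual method.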

Social networks including Facebook, Twitter and TikTok have all banned deepfakes, although there are some very popular apps that can add people’s faces to a video from a single photo.

Nina Schick, author of the book Deep Fakes and the Infocalypse, comments:

“The only really widespread use we’ve seen so far is in non-consensual pornography against women. But synthetic media is expected to become ubiquitous in about three to five years, so we need to develop these tools going forward.

“However, as detection capabilities get better, so too will the generation capability – it’s never going to be the case that Microsoft can release one tool that can detect all kinds of video manipulation.”

Also, even if it could – if we’ve learnt one thing from the past few years, it’s that even empirical, unarguable proof that something’s false won’t change people’s minds if it aligns with what they want to believe. What a future we’re heading for. [BBC]
