The U.S. government agency known for developing wild technology like a penny-sized vacuum, enigmatic sky balloons, and a shit-ton of robots is now tasked with solving perhaps the greatest threat to democracy — misinformation.
The U.S. Department of Defence’s Defence Advanced Research Projects Agency (DARPA) “is soliciting innovative research proposals in the area of semantic technologies to automatically assess falsified media,” according to an announcement posted by the agency on August 23. The program, called Semantic Forensics (SemaFor), aims to develop an automated system that can identify and defend against disinformation campaigns across text, audio, image and video content.
According to the SemaFor announcement, the program wants to develop a suite of algorithms that will analyse these coordinated attacks. The text specifically details three types: a semantic detection algorithm that would determine whether media has been generated or manipulated; an attribution algorithm that would infer whether media came from a particular organisation or person; and a characterisation algorithm that would determine whether media was created or manipulated with malicious intent.
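To make the division of labour concrete, here is a minimal sketch of how those three stages might fit together. Every name, signature and scoring heuristic below is a hypothetical illustration of the announcement's structure, not DARPA's actual design; a real system would replace each placeholder with trained models.

```python
from dataclasses import dataclass

@dataclass
class MediaAssessment:
    detection_score: float   # likelihood the media is generated or manipulated
    attribution: str         # suspected source (organisation or person)
    malicious_intent: bool   # characterisation verdict

def detect(media: bytes) -> float:
    """Semantic detection: score how likely the media is synthetic."""
    # Placeholder heuristic standing in for a trained detector.
    return 0.9 if b"deepfake" in media else 0.1

def attribute(media: bytes) -> str:
    """Attribution: guess which actor produced the media."""
    # Placeholder: a real system would compare stylistic fingerprints.
    return "unknown-actor" if b"deepfake" in media else "none"

def characterise(detection_score: float, attribution: str) -> bool:
    """Characterisation: was the media made or altered with malicious intent?"""
    return detection_score > 0.5 and attribution != "none"

def assess(media: bytes) -> MediaAssessment:
    # The three algorithms run as a pipeline: detect, attribute, characterise.
    score = detect(media)
    source = attribute(media)
    return MediaAssessment(score, source, characterise(score, source))

print(assess(b"a deepfake clip"))
```

The point of the sketch is the pipeline shape: detection feeds attribution and characterisation, so a weak detector poisons every downstream verdict, which is part of why the bias concerns discussed below matter.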
DARPA’s call for proposals makes it clear that it isn’t looking for new spins on old tricks: the agency wants research that explores a new approach to defending against misinformation, not, as it states, “evolutionary improvements to the existing state of practice.”
And while an automated model sounds nice in theory, in practice these types of algorithmic systems have so far proven flawed and biased, and in more disturbing cases, outright discriminatory. Existing applications do not inspire a lot of faith in a near-future system that would be both effective and just.
It’s not inherently bad that the government wants to funnel resources into developing a unique system to prevent the types of coordinated attacks that have enabled the likes of election interference, dangerous conspiracy theories, and genocide. But it’s a bit strange that the agency most famous for pipe-dream technology that rarely sees everyday use is the one charged with figuring out an essential, albeit complex, solution to an increasingly pervasive societal problem.