Aside from figuring out how to end their shutdowns as quickly as possible, one of the main concerns on the minds of businesses large and small is how they can avoid litigation once they do reopen. And it turns out that these businesses’ mission to avoid getting the pants sued off of them might inadvertently end up stomping on the personal privacy of the consumers they’re so eager to bleed dry.
That’s according to Reuters, which surveyed more than a dozen analytics companies currently pitching businesses on their coronavirus-compliance tech. These sixteen companies—which ranged from smallish startups to massive corporations—are offering souped-up computer vision analytics meant to do everything from ensuring that patrons and employees are wearing masks on a construction site, to checking whether those employees are standing the right distance apart. The idea is that, with enough monitoring of and intervention against social distancing violators, this kind of system might help businesses head off coronavirus-related lawsuits—but it’s also a move with questionable ethics and efficacy.
Most major corporations are no strangers to using souped-up cameras to surveil shoppers. For years, this sort of tech has been integral to the way retailers around the world keep checkout lines moving, keep discounts dispensing, and keep track of which products are selling well. Broadly speaking, it’s meant to monitor trends rather than single out individual shoppers, though there has been a push to bring more invasive tech—like facial recognition—to the forefront at retailers worldwide. The reason? To combat what we would call shoplifting, or what they would call “shrinkage,” a scourge that bleeds tens of billions of dollars from these retailers annually, according to recent data.
Thanks to the coronavirus, many of the companies that once branded themselves as the key to keeping stores shoplifter-free are now promoting their ability to put brakes on the pandemic’s spread. The video surveillance provider Kogniz, for example, recently pitched what it’s calling its “Health Cam” as a way to identify the faces of feverish people, and stop them from entering a building. Other companies have pitched their ability to detect mask-wearing faces, or their proprietary social-distancing detection tools.
Multiple corporations told Reuters that this sort of software is “crucial” to their plans to reopen—and stay open—in the middle of the pandemic, saying that “it will allow them to show not only workers and customers but also insurers and regulators, that they are monitoring and enforcing safe practices.”
Obviously, the idea of recording shoppers in their day-to-day movements without their knowledge or consent isn’t just an ethically sticky move, but also one that might not work in the first place. We’ve seen time and time again that this sort of tech can have issues processing non-white, non-male faces—and adding face masks on top of those issues makes accurate detection damn near impossible.
And while it’s likely that some of the companies in the compliance-tech space can deliver on their pitches, folks in the facial recognition business have a history of overpromising what their tech can do. But if onboarding this tech helps these companies sleep better at night, well, that’s what it’s all about, right?