App developers can access more robust data about your face and the expressions you make with the iPhone X, raising concerns from privacy advocates who worry that this sensitive facial data will end up in the hands of advertisers.
Phil Schiller, Apple's senior vice president of worldwide marketing, announces features of the new iPhone X, including Face ID. (Photo: AP)
According to its developer agreement, Apple will grant access to face data if the app maker seeks user permission before using it and agrees not to use the data for marketing or share it with advertisers. The details of the developer agreement were first reported by Reuters; Gizmodo later reviewed a copy of the agreement and confirmed the details.
However, this doesn't mean that Apple is cracking open Face ID and sharing its data with anyone who decides to build an app. Face ID compiles rich details about your face using several sensors and cameras on the front of the iPhone X, and that data never leaves your device. The face data being shared with developers comes from ARKit, Apple's suite of tools for augmented reality, and relies only on camera input. Earlier reports have conflated the two systems, but they are distinct and rely on different inputs.
Make no mistake, ARKit will provide lots of juicy data about your face as you mug for the camera. In a presentation, Apple's game technologies evangelist Allan Schaffer explained that ARKit uses data from iPhone X's front-facing camera to make a 3D "mesh" of your face that's capable of real-time tracking.
But this is different from the data used in Face ID, and isn't powerful enough to unlock your phone. In fact, Apple's developer agreement expressly forbids app developers from using ARKit data to create their own authentication tools. "You may not use Face Data for authentication, advertising, or marketing purposes, or to otherwise target an end-user in a similar manner," the agreement reads.
Still, privacy advocates are concerned that app developers won't read through the entire 77-page agreement and therefore won't be aware of their obligations when it comes to protecting face data. Apple enforces the agreement by booting developers who don't comply from the App Store, but it isn't a perfect system: as Reuters noted, apps don't go through a full code review before launching on the App Store, and Apple relies instead on spot checks and user reports to detect sketchy behavior.