Meta Quest Pro falls short on biometric protections

Meta has released more details about the lengths they’ve gone to in protecting users of their new eye-tracking HMDs.

Captain Renault from the movie Casablanca is “shocked, shocked” to find that gambling is going on in an establishment he himself frequents.

I’ve worked with eye-tracking technology since 2010 (HoloLens) and have written about it publicly since 2019. Here are some thoughts after reading all of the available documents on Meta’s eye-tracking protections and their rationales.

Meta clearly wants us to believe they’ve gone to great lengths. But despite some very capable advisors working to offer some protections, I am deeply concerned that genuine critics and outside domain experts were not adequately consulted or listened to. Meta apparently took the easy way out in almost every case. But kudos for the transparency!

Conclusion up front: I would urge users and policymakers not to be appeased by half-measures. I don’t like regulation, but it’s time to classify and protect this and other biometric data as health data. And until then, I don’t consider this headset safe.

Here are some more detailed examples to support that conclusion:

  1. User opt-in consent is better than opt-out or no choice at all. However, most people won’t understand the risks and will turn on this cool new feature for its benefits, just as they click through coercive EULAs without reading them. The consumer-facing warnings do not convey the risks we’ve identified. This does not rise to the level of informed consent, where people have the information they need to decide for themselves.
  2. Discarding raw eye images is certainly better than uploading them. But the most worrying data is actually the so-called “abstracted gaze data” generated on-device (see the sketch after this list). This data could be used for health diagnosis, psychographic profiling, and ad-tech manipulation. Meta makes no promises of respecting higher-level choices, like “use it only for depicting my eyes” or “to improve system performance” or “for object selection,” versus myriad more dangerous [ab]uses. Meta claims this data will be “continuously overridden,” but omits that it will also be uploaded to Meta’s servers, at the very least to be sent to remote headsets for rendering your avatar’s eyes for other people, and possibly further utilized by 3rd parties for anything else we don’t know about.
  3. Yes, 3rd party developers get this eye data for free via Meta’s APIs, simply by signing an agreement. The data is very useful, say, for foveated rendering in other 3D engines and for rendering social avatars in VR chat apps. Meta prohibits 3rd parties from abusing this data in certain ways and says it will take action otherwise. But will it know?
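To make the stakes concrete, here is a minimal sketch of what a single frame of “abstracted gaze data” might contain, based on typical eye-tracking APIs. The field names are my assumptions, not Meta’s actual schema; the point is that even without raw eye images, a stream of these samples supports exactly the profiling uses described above.

```python
from dataclasses import dataclass

@dataclass
class GazeSample:
    """Hypothetical per-frame "abstracted gaze data" (assumed fields,
    not Meta's documented schema). No raw eye images, yet still sensitive."""
    timestamp_us: int                            # microsecond timestamp
    gaze_origin: tuple[float, float, float]      # eye position in head space (meters)
    gaze_direction: tuple[float, float, float]   # unit vector: where you are looking
    pupil_diameter_mm: float                     # correlates with arousal and cognitive load
    eye_openness: float                          # 0.0 (closed) to 1.0 (open); blink dynamics
    confidence: float                            # tracker confidence, 0.0 to 1.0

# At typical 60-90 Hz rates, one session yields hundreds of thousands of
# samples; fixation patterns plus pupil response can reveal what grabs
# your attention, which is precisely what profiling and ad-tech want.
```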

Q. How does Meta know what happens after this sensitive data is streamed to 3rd party servers?

Meta tells its users: “We do not control how a third party app uses, stores, or shares your abstracted gaze data, so you should only allow access to your data to apps that you trust.”

Q. How is a consumer able to determine if an app is worthy of their trust?

For example, LinkedIn’s app uploaded my contacts a while back, before it was required to explicitly ask.

Q. Whose job is it to inspect the apps?

No one.

Experts surely know there are platform-level solutions available, such as end-to-end encryption, that could help limit this data to only consented use cases, like another user’s HMD rendering our eyes. E2E encryption could also add AAA security assurances: “AAA” here means we can authenticate, authorize, and account for (as in record) all such uses, which is required if a user ever wants to revoke consent and delete their data. Saying “no” is a critical part of informed consent. Consent must be revocable if the user so decides, for any reason, including a breach of trust.
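As a sketch of what AAA could look like at the platform level, consider a broker that authenticates the requesting app, authorizes it against the user’s grants, and accounts for every access in an audit log. This is my own illustration with hypothetical names, not anything Meta has described:

```python
import time
from dataclasses import dataclass, field

@dataclass
class GazeAccessBroker:
    """Hypothetical platform-side broker illustrating the AAA idea:
    authenticate the caller, authorize against user grants, and
    account for (log) every access so consent can later be revoked
    and audited. Names and structure are my assumptions."""
    grants: dict[str, set[str]] = field(default_factory=dict)   # app_id -> allowed purposes
    audit_log: list[tuple[float, str, str]] = field(default_factory=list)

    def grant(self, app_id: str, purpose: str) -> None:
        self.grants.setdefault(app_id, set()).add(purpose)

    def revoke(self, app_id: str) -> None:
        """Revocation: the audit log tells us exactly which apps
        received data and must now delete it."""
        self.grants.pop(app_id, None)

    def request_gaze(self, app_id: str, purpose: str) -> str:
        # Authenticate: a real system would verify a signed app
        # identity here, not trust a caller-supplied string.
        if purpose not in self.grants.get(app_id, set()):
            self.audit_log.append((time.time(), app_id, f"DENIED:{purpose}"))
            raise PermissionError(f"{app_id} not authorized for {purpose}")
        # Account: record every successful access.
        self.audit_log.append((time.time(), app_id, f"GRANTED:{purpose}"))
        return "<gaze-stream-handle>"  # placeholder for an E2E-encrypted stream
```

The design point: revocation is only enforceable because the accounting exists. Without a record of who received the data, “delete my data” is an empty promise.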

Meta can’t promise this unless it knows where the data goes. And it clearly doesn’t even try to know.

For the “lighter” (safer) uses of biometric data, a fuzzed or “minimized” API data stream could address the cases where apps only need superficial data, like event triggers. Imagine that looking at some magic 3D object causes a UI event to happen, like making it glow. The recipient of this event trigger doesn’t need to know anything until the trigger fires. That’s much safer, as long as apps are also limited in the number of triggers they can register (say, 100 to 1000) so a sneaky developer can’t put hidden triggers everywhere.
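Here is a minimal sketch of such a minimized API, under my own assumptions about naming and shape. The app registers a capped number of trigger regions and only ever receives region IDs when a trigger fires; the continuous gaze stream never crosses into the app’s process:

```python
from dataclasses import dataclass

MAX_TRIGGERS = 100  # cap so a sneaky developer can't tile the scene with triggers

@dataclass
class GazeTrigger:
    region_id: str
    center: tuple[float, float, float]  # world-space position of the object
    radius: float                       # activation radius in meters

class MinimizedGazeAPI:
    """Hypothetical 'minimized' gaze API (my naming): apps register a
    small number of trigger regions and only ever learn that a trigger
    fired, never the continuous gaze stream."""
    def __init__(self):
        self._triggers: dict[str, GazeTrigger] = {}

    def register_trigger(self, trigger: GazeTrigger) -> None:
        if len(self._triggers) >= MAX_TRIGGERS:
            raise RuntimeError("trigger quota exceeded")
        self._triggers[trigger.region_id] = trigger

    def _on_gaze_point(self, gaze_point: tuple[float, float, float]):
        """Runs inside the platform, not the app. The raw gaze_point
        never crosses the process boundary; only region IDs do."""
        for t in self._triggers.values():
            dist = sum((g - c) ** 2 for g, c in zip(gaze_point, t.center)) ** 0.5
            if dist <= t.radius:
                yield t.region_id  # the only thing the app receives
```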

There are even stronger mitigations Meta could consider. In a more locked-down system, third parties would have to use platform-level solutions, like system-level foveated rendering code or some future system-wide avatar codecs, that not only provide more UX consistency and features but also never allow the biometric information to be seen by 3rd party apps at all. If they can’t see it, they can’t stream it off device. Meta can keep the data safer if they’re the only ones who handle it (and they make some real promises about their own limited uses).
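As an illustration of that principle (hypothetical API, my naming), the contract could expose gaze-dependent features while offering no call that returns gaze data at all:

```python
class PlatformRenderer:
    """Hypothetical capability-style contract: gaze-dependent features
    exist, but no method ever returns gaze data to the app."""

    def enable_foveated_rendering(self, level: int) -> None:
        """Platform reads the eye tracker internally and adjusts
        shading rate; the app only picks a quality level (0-3)."""

    def attach_avatar_eyes(self, avatar_id: str) -> None:
        """A platform-side avatar codec animates the eyes directly;
        the app never sees the gaze vectors driving them."""

    # Deliberately absent: get_gaze(), stream_gaze(), and the like.
    # If the app can't see the data, it can't exfiltrate it.
```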

That locked-down approach would unfortunately also limit competing solutions on the same hardware, so it should be weighed against the consumer-protection benefits. But Meta could also establish a verification program in which 3rd parties prove that any biometric-related code is safe and compliant in order to receive trusted access. Not all developers would want or need this level of vetting.

More User Options

One of the easiest things Meta could do is give users more choice: not simply “on or off,” but how and when the data is used. For example, if they click “OK to use it for my avatar’s eyes,” then apps that register this specific intent would get access. But if they click “do not track” (as on Apple devices), then any app that also requested network access to upload data to external servers could be blocked. It’s drastic, but people will find clever ways around system-level protections, so it becomes a bit of a cat-and-mouse game. But that’s life in the big city, where your customers’ identities and even their future levels of freedom are at stake.
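A minimal sketch of this per-intent model, with hypothetical names of my own, might look like the following. The user grants specific intents, apps declare the intent they registered for, and a “do not track” flag blocks gaze access for any app that also holds network permission:

```python
from enum import Enum, auto

class GazeIntent(Enum):
    AVATAR_EYES = auto()         # render my eyes on my avatar
    SYSTEM_PERFORMANCE = auto()  # e.g., foveated rendering
    OBJECT_SELECTION = auto()    # gaze-based UI selection

class ConsentManager:
    """Hypothetical per-intent consent model (my naming): users grant
    intents, apps declare intents, and 'do not track' blocks gaze
    access for any app that can also reach the network."""
    def __init__(self, granted_intents: set[GazeIntent], do_not_track: bool):
        self.granted = granted_intents
        self.do_not_track = do_not_track

    def may_access_gaze(self, declared_intent: GazeIntent,
                        app_has_network_access: bool) -> bool:
        if self.do_not_track and app_has_network_access:
            return False  # drastic, but data that stays on-device can't leak
        return declared_intent in self.granted

# Example: user allows avatar eyes only, and enables "do not track".
mgr = ConsentManager({GazeIntent.AVATAR_EYES}, do_not_track=True)
print(mgr.may_access_gaze(GazeIntent.AVATAR_EYES, app_has_network_access=True))   # False
print(mgr.may_access_gaze(GazeIntent.AVATAR_EYES, app_has_network_access=False))  # True
```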

If I’ve missed something in the Meta terms or gotten anything wrong in my analysis, feel free to comment or otherwise let me know.
