FoggySight – Facial Recognition Privacy

Disclaimer: To my knowledge, this is not (yet) a concrete case – so far it is only an idea described in a paper.

Description

FoggySight [0] is an anti-facial recognition technology that aims to enhance privacy in facial lookup settings. It is designed specifically for the situation in which companies have already scraped images from social media and hold a large database of labeled face images. A facial lookup setting is one in which someone wants to learn the identity of a person from an image: they query this large lookup database of labeled face images (or the company that owns it) and receive as output, for example, the top-k identities matching the query.
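To make the lookup setting concrete, here is a minimal sketch of a top-k identity query, assuming the common approach of comparing fixed-length face embeddings by cosine similarity; the function names and the random data are illustrative assumptions, not taken from the paper or any real system.

```python
import numpy as np

def top_k_identities(query_embedding, db_embeddings, db_labels, k=5):
    """Return the k identity labels whose face embeddings are most similar to the query."""
    # Normalize so that a dot product equals cosine similarity.
    q = query_embedding / np.linalg.norm(query_embedding)
    db = db_embeddings / np.linalg.norm(db_embeddings, axis=1, keepdims=True)
    similarities = db @ q
    top_indices = np.argsort(similarities)[::-1][:k]
    return [db_labels[i] for i in top_indices]

# Toy example: a database of 10,000 labeled 512-dimensional face embeddings.
rng = np.random.default_rng(0)
db_embeddings = rng.normal(size=(10_000, 512))
db_labels = [f"person_{i}" for i in range(10_000)]
query_embedding = rng.normal(size=512)
print(top_k_identities(query_embedding, db_embeddings, db_labels, k=5))
```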
The authors suggest and implement a community-driven approach to "crowd out" the links between a user's images and their identity. The approach is community-driven because users modify their own images in order to protect other users from being recognized. These modified "decoy" images crowd, or poison, the top-k matching set for a specific image of a user and thus obscure that user's real identity.
For FoggySight to be successful it needs collective action: users can only protect the identities of other users (not their own), so protection for everyone boils down to a community effort. On a technical level, the decoy photos are generated by adversarial machine learning algorithms. For readers not familiar with this technique: in the case of FoggySight, a volunteer's image is altered with respect to the to-be-protected image such that a neural network (here, the facial recognition model) considers the two images similar, while the pixel-level perturbations are usually invisible to the human eye or cause only a negligible reduction in image quality. A sketch of this idea follows below.
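The following is a minimal sketch of such a decoy generation step, assuming a differentiable face-embedding model and a projected-gradient-style optimization under an L-infinity pixel budget. It illustrates the general adversarial-perturbation idea rather than the authors' implementation; `embed_model`, the hyperparameters, and all names are assumptions.

```python
import torch

def make_decoy(volunteer_img, protected_img, embed_model,
               epsilon=8 / 255, steps=50, step_size=1 / 255):
    """Perturb volunteer_img so its embedding moves close to that of protected_img."""
    with torch.no_grad():
        target_embedding = embed_model(protected_img)
    decoy = volunteer_img.clone().detach()
    for _ in range(steps):
        decoy.requires_grad_(True)
        # Distance between the decoy's embedding and the target embedding.
        loss = torch.nn.functional.mse_loss(embed_model(decoy), target_embedding)
        (grad,) = torch.autograd.grad(loss, decoy)
        with torch.no_grad():
            # Signed gradient step toward the target embedding, then project
            # back into the epsilon-ball around the volunteer's original image.
            decoy = decoy - step_size * grad.sign()
            decoy = volunteer_img + (decoy - volunteer_img).clamp(-epsilon, epsilon)
            decoy = decoy.clamp(0.0, 1.0)  # keep a valid pixel range
    return decoy.detach()
```

The intuition is that the volunteer's perturbed photo now lands near the protected user's photo in the model's embedding space, so it can appear in (and crowd) the top-k results for queries about that user, while looking essentially unchanged to humans.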

Aspects of Coordination

The authors discuss various modes through which this community effort could be organized, e.g. through a trusted central party (for instance, a social media company coordinates who protects whom and applies the image alterations automatically) or through decentralized collaboration, i.e. users select their target images and create the alterations on their own, which could be facilitated by a browser extension. A toy illustration of the centralized mode is sketched below.
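As a hypothetical illustration of the centralized coordination mode, a trusted party could maintain an assignment of volunteers to each protected user. The round-robin policy below is purely an assumption for illustration; the paper does not prescribe a specific assignment scheme.

```python
from collections import defaultdict
from itertools import cycle

def assign_protectors(opted_in_users, volunteers, decoys_per_user=10):
    """Map each protected user to the volunteers who will upload decoys for them."""
    assignment = defaultdict(list)
    volunteer_pool = cycle(volunteers)
    for user in opted_in_users:
        while len(assignment[user]) < decoys_per_user:
            volunteer = next(volunteer_pool)
            if volunteer != user:  # users cannot protect themselves
                assignment[user].append(volunteer)
    return dict(assignment)

# Toy example: three users to protect, drawn from a small volunteer pool.
volunteers = ["alice", "bob", "carol"] + [f"user_{i}" for i in range(7)]
print(assign_protectors(["alice", "bob", "carol"], volunteers, decoys_per_user=3))
```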

Sources

[0] Evtimov, I., Sturmfels, P., & Kohno, T. (2020). FoggySight: A Scheme for Facial Lookup Privacy. arXiv preprint arXiv:2012.08588.