Privacy Concerns for Dual-Use AI Image Clarity Tools

AI technology is a powerful tool. The original photo (left) was cleaned up with an AI deep-learning algorithm, restoring tremendous clarity (image source: Murilo Gustineli).

The AI researchers describe their approach in the paper Towards Real-World Blind Face Restoration with Generative Facial Prior (https://arxiv.org/pdf/2101.04061), and the code is available for others to try on the project webpage: https://xinntao.github.io/projects/gfpgan.

The GFP-GAN system (a Generative Adversarial Network, GAN, built around a Generative Facial Prior, GFP), published by Xintao Wang, Yu Li, Honglun Zhang, and Ying Shan, restores images far better than previous AI systems. The results are nothing short of impressive.


As a privacy professional, when I see such transformational examples, I have grave concerns about undesired monitoring of the population: the ability to clean up distant or low-quality surveillance images could be used to identify and track people.

Digital cameras are widely deployed by businesses and governments, but a major limitation is image clarity at a distance, which makes it very difficult to positively identify subjects. With AI image-clarity tools, identifying people at great distances or with poor-resolution cameras could be automated at scale. That could allow tracking people wherever they go, cataloging everyone they speak with, and, if eventually applied to lip reading, eavesdropping on conversations from afar.

However, you may be shocked to learn that I am equally excited, because this is also a potentially PRIVACY-ENHANCING technology! The same type of AI can be used to perturb clear images in ways that undermine facial-recognition algorithms.

Imagine this tech embedded in privacy-supporting cameras that modify pixels in ways unnoticeable to the human eye but that thwart AI systems from conducting bulk identification of people from the video feed. Humans still see unblurred images, while automated processes are prevented from harvesting identified personal data at scale. Such a usage could strike a desirable balance between security and privacy.
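To make the idea concrete, here is a minimal sketch of how such a bounded, near-invisible perturbation works. It is not the method used by any particular product: it stands in a deep face-recognition network with a toy linear scorer, and uses an FGSM-style step (perturbing each pixel slightly in the direction that most reduces the model's score, capped at a small epsilon) purely for illustration.

```python
import numpy as np

# Toy stand-in for a face recognizer: a linear matching score w.x.
# Real systems use deep embeddings, but the principle is the same:
# nudge each pixel along the direction that most changes the model's
# output, while keeping every change below a visibility threshold.
rng = np.random.default_rng(0)

def perturb(image, w, epsilon=2.0):
    """Add a bounded perturbation that degrades the recognizer's score.

    epsilon is the maximum per-pixel change on a 0-255 scale; a change
    of 2/255 is effectively invisible to a human viewer.
    """
    grad = w  # gradient of the linear score w.x with respect to x
    adversarial = image - epsilon * np.sign(grad)
    return np.clip(adversarial, 0.0, 255.0)

image = rng.uniform(0, 255, size=(64, 64))   # stand-in for a camera frame
w = rng.normal(size=(64, 64))                # stand-in model weights

protected = perturb(image, w, epsilon=2.0)

# No pixel moved by more than epsilon, yet the matching score drops.
print(np.abs(protected - image).max())
print(float((w * image).sum()), float((w * protected).sum()))
```

The key property the camera would rely on is the cap: every pixel moves by at most epsilon, so the human-visible image is unchanged for practical purposes, while the recognizer's score shifts on every frame.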

It is up to everyone to decide how such tools will be used.

Posted with STEMGeeks



5 comments
Powerful tech that could be used to undermine privacy. Impressive results though.

It’s crazy to see how the sci-fi tech trope I used to see on TV is now reality.

And the results are dang impressive.

Agreed! Oddly, it was all the sci-fi shows and The Simpsons! Their visions of technology and events are coming to reality.

Foresight on the shows' part or innovators trying to make tropes into reality?
