Reality 2.0 Newsletter - November 3, 2020: Is There a Place for Facial Recognition?

To get this weekly dose of Reality delivered by email, sign up on our Substack page.

A Quick Plug

In our most recent episode, Doc Searls, Katherine Druckman, and Kyle Rankin discuss what happens when facial recognition and AI are in the hands of individuals who use them to identify police, altering the balance of power. Other topics include surveillance and forensics tech, and privacy as it relates to photography. Please remember to subscribe via the podcast player of your choice.

Episode 46: Facial Recognition, Surveillance Technology, and the Balance of Power


When is Facial Recognition Technology Fair Game?

A recent New York Times article by Kashmir Hill featured examples of individuals turning facial recognition technology around on police officers. In one case, Chris Howell, a man from Portland, Oregon, developed software to identify local police officers who taped over their names during clashes with protesters. Interestingly, Portland recently banned the use of facial recognition technology by police departments and public-facing businesses, but this did not affect an individual’s right to write and use such software on their own. And while most discussion of the ethics around this technology has focused on powerful entities such as Clearview AI and law enforcement, the same technology in the hands of underdogs is somewhat uncharted territory. Is turnabout fair play here? These questions are complex and the answers are not black and white.

Doc Searls explored facial recognition on his Harvard blog in 2019:

[C]omputers doing facial recognition are proving useful for countless purposes: unlocking phones, finding missing persons and criminals, aiding investigations, shortening queues at passport portals, reducing fraud (for example at casinos), confirming age (saying somebody is too old or not old enough), finding lost pets (which also have faces). The list is long and getting longer.

Yet many (or perhaps all) of those purposes are at odds with the sense of personal privacy that derives from the tacit ways we know faces, our reliance on short term memory, and our natural anonymity (literally, namelessness) among strangers. All of those are graces of civilized life in the physical world, and they are threatened by the increasingly widespread use—and uses—of facial recognition by governments, businesses, schools and each other.

These ethical issues are present whether the tech is used by a casino, a law enforcement investigator, or an individual, but at what point does an individual have the right to embrace its use for their own protection? We explore this question in this week’s podcast, but we can’t say we’ve come to a conclusion, so we would love to know your thoughts. Where do you draw the ethical line?

An alternative, and perhaps even more interesting, angle on this ethical conundrum is the question of artistic merit. Artist Paolo Cirio, also mentioned in the New York Times article referenced above, attempted to draw attention to privacy ethics in an exhibition he called “Capture,” which publicly displayed photos of 4,000 police officers taken during protests in France.

The series of photos, Capture, is composed of French police officers’ faces. The artist, Paolo Cirio, collected 1,000 public images of police taken during protests in France and processed them with facial recognition software. Cirio then created an online platform with a database of the resulting 4,000 faces of police officers to crowdsource their identification by name. He also printed the officers’ headshots as street-art posters and posted them throughout Paris to expose them in public space. Capture comments on the potential uses and misuses of facial recognition and artificial intelligence by questioning the asymmetry of power at play. The lack of privacy regulation around such technology eventually turns it against the same authorities that urge its use. Ultimately, as an activist, Cirio launched a campaign to ban facial recognition technology across Europe by organizing a petition in collaboration with privacy organizations.

The work itself is compelling in its dissonance, but the question of whether doxing is ever morally sound remains.

Similarly, the work of posthumously famous photographer Vivian Maier is compelling in its intimacy and voyeurism. Maier was a quiet woman who worked as a nanny and had a secret talent. After her death, a young man named John Maloof bid on an old, abandoned box of negatives at auction, and thus discovered her work. The story is fascinating, both from a biographical perspective and for its significance to contemporary art history and historic preservation. At its heart, though, is a story of peeking into the intimate details of a person’s life when they are no longer around to consent. She never chose to be a recognized artist. Does the tremendous artistic merit of her photographs justify the exposure of her life and her subjects? Do we have a right to view the world through her eyes?

We invite you to draw your own conclusions and join the conversation by commenting here on this post, by visiting us on any of our social outlets, or via our contact form.

Site/Blog/Newsletter | Facebook | Twitter | YouTube | Mastodon


This Week’s Reading List

  • Activists Turn Facial Recognition Tools Against the Police - The New York Times — These activists say it has become relatively easy to build facial recognition tools thanks to off-the-shelf image recognition software that has been made available in recent years. In Portland, Mr. Howell used a Google-provided platform, TensorFlow, which helps people build machine-learning models. (For a sense of how little code a basic face-matching tool requires, see the sketch after this list.)

  • Fawkes — The SAND Lab at University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how unknown third parties can track them by building facial recognition models out of their publicly available photos.

  • Mass Extraction - Upturn — To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”

  • Doc Searls Weblog · About face — We know more than we can tell.

  • Vivian Maier Photographer — The official website of Vivian Maier, with portfolios, prints, exhibitions, books, and the documentary film.

  • Welcome to the 21st Century: How To Plan For The Post-Covid Future - O'Reilly Media — So too, when we look back, we will understand that the 21st century truly began this year, when the COVID-19 pandemic took hold. We are entering the century of being blindsided by things that we have been warned about for decades but never took seriously enough to prepare for, the century of lurching from crisis to crisis until, at last, we shake ourselves from the illusion that our world will go back to the comfortable way it was and begin the process of rebuilding our society from the ground up.
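
About that first item: the claim that facial recognition tools are now “relatively easy to build” is worth making concrete. We don’t have Mr. Howell’s code, and his system reportedly used TensorFlow, so what follows is only a minimal sketch of the same idea using the open-source Python face_recognition library, with hypothetical file paths and names standing in for real data.

    # Minimal sketch (not Mr. Howell's code): match faces in a new photo
    # against a small set of labeled reference photos using the open-source
    # face_recognition library. File paths and names below are hypothetical.
    import face_recognition

    # Hypothetical labeled reference photos, one clear face per file.
    REFERENCES = {
        "Officer A": "refs/officer_a.jpg",
        "Officer B": "refs/officer_b.jpg",
    }

    # Encode each reference face as a 128-dimensional vector.
    known_names, known_encodings = [], []
    for name, path in REFERENCES.items():
        image = face_recognition.load_image_file(path)
        encodings = face_recognition.face_encodings(image)
        if encodings:  # skip files where no face was detected
            known_names.append(name)
            known_encodings.append(encodings[0])

    # Compare every face found in a new photo against the references.
    unknown_image = face_recognition.load_image_file("new_photo.jpg")
    for encoding in face_recognition.face_encodings(unknown_image):
        matches = face_recognition.compare_faces(known_encodings, encoding,
                                                 tolerance=0.6)
        hits = [n for n, matched in zip(known_names, matches) if matched]
        print("Possible match:", ", ".join(hits) if hits else "none")

Everything beyond this toy example — gathering labeled reference photos, scaling the comparison, and deciding who ends up in the database — is where the real work, and the ethical weight discussed above, lies.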



The Reality 2.0 Podcast explores how tech, privacy, and security impact reality in a post-COVID world. Subscribe now and don't miss a thing! We welcome your feedback at our contact page.
