<?xml version="1.0" encoding="UTF-8"?>
<rss version="2.0" xmlns:dc="http://purl.org/dc/elements/1.1/" xmlns:sy="http://purl.org/rss/1.0/modules/syndication/" xmlns:admin="http://webns.net/mvcb/" xmlns:atom="http://www.w3.org/2005/Atom" xmlns:rdf="http://www.w3.org/1999/02/22-rdf-syntax-ns#" xmlns:content="http://purl.org/rss/1.0/modules/content/" xmlns:googleplay="http://www.google.com/schemas/play-podcasts/1.0" xmlns:itunes="http://www.itunes.com/dtds/podcast-1.0.dtd" xmlns:fireside="http://fireside.fm/modules/rss/fireside">
  <channel>
    <fireside:hostname>web02.fireside.fm</fireside:hostname>
    <fireside:genDate>Thu, 30 Apr 2026 08:44:59 -0500</fireside:genDate>
    <generator>Fireside (https://fireside.fm)</generator>
    <title>Reality 2.0 - Episodes Tagged with “Facial Recognition”</title>
    <link>https://www.reality2cast.com/tags/facial%20recognition</link>
    <pubDate>Fri, 23 Jul 2021 07:00:00 -0400</pubDate>
    <description>Join privacy and open source advocates Doc Searls and Katherine Druckman as they navigate the new digital world, covering topics related to digital privacy, cybersecurity, and digital identity, as well as Linux, open source, and other current issues.</description>
    <language>en</language>
    <itunes:type>episodic</itunes:type>
    <itunes:subtitle>Where tech meets reality. Navigating technology, privacy, security and open source.</itunes:subtitle>
    <itunes:author>Katherine Druckman and Doc Searls</itunes:author>
    <itunes:summary>Join privacy and open source advocates Doc Searls and Katherine Druckman as they navigate the new digital world, covering topics related to digital privacy, cybersecurity, and digital identity, as well as Linux, open source, and other current issues.</itunes:summary>
    <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/5/55e8e48a-ae33-492f-bc04-175d577a5a7e/cover.jpg?v=5"/>
    <itunes:explicit>no</itunes:explicit>
    <itunes:keywords>technology, privacy, open source, linux, security, cybersecurity, infosec, FOSS, FLOSS</itunes:keywords>
    <itunes:owner>
      <itunes:name>Katherine Druckman and Doc Searls</itunes:name>
      <itunes:email>podcast@reality2cast.com</itunes:email>
    </itunes:owner>
<itunes:category text="Technology"/>
<item>
  <title>Episode 79: Your Identity - Twitter Verification, Facial Recognition, and More</title>
  <link>https://www.reality2cast.com/79</link>
  <guid isPermaLink="false">ed19afc1-668f-43a4-8de8-2228cd03bf07</guid>
  <pubDate>Fri, 23 Jul 2021 07:00:00 -0400</pubDate>
  <author>Katherine Druckman and Doc Searls</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/55e8e48a-ae33-492f-bc04-175d577a5a7e/ed19afc1-668f-43a4-8de8-2228cd03bf07.mp3" length="74500720" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:author>Katherine Druckman and Doc Searls</itunes:author>
  <itunes:subtitle>Tune in to our new episode! Katherine Druckman, Doc Searls and Shawn Powers chat about Twitter verification, facial recognition, YouTube moderation, and algorithmic bias.</itunes:subtitle>
  <itunes:duration>48:08</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/5/55e8e48a-ae33-492f-bc04-175d577a5a7e/cover.jpg?v=5"/>
  <description>Tune in to our new episode! Katherine Druckman, Doc Searls and Shawn Powers chat about Twitter verification, facial recognition, YouTube moderation, and algorithmic bias.
Subscribe to our newsletter. (https://reality2cast.com/newsletter)
Reality 2.0 around the web:
Site/Blog/Newsletter (https://www.reality2cast.com)
Facebook (https://www.facebook.com/reality2cast)
Twitter (https://twitter.com/reality2cast)
YouTube (https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q)
Mastodon (https://linuxrocks.online/@reality2cast) Special Guest: Shawn Powers.
</description>
  <itunes:keywords>technology, privacy, open source, security, linux, facial recognition, twitter, social media, youtube, identity</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>Tune in to our new episode! Katherine Druckman, Doc Searls and Shawn Powers chat about Twitter verification, facial recognition, YouTube moderation, and algorithmic bias.</p>

<p><a href="https://reality2cast.com/newsletter" rel="nofollow">Subscribe to our newsletter.</a></p>

<p><strong>Reality 2.0 around the web:</strong><br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p>Special Guest: Shawn Powers.</p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="I’m Not a Reporter. But I’m Verified as One on Twitter | WIRED" rel="nofollow" href="https://www.wired.com/story/opinion-im-not-a-reporter-but-im-verified-as-one-on-twitter/">I’m Not a Reporter. But I’m Verified as One on Twitter | WIRED</a> &mdash; I NEVER CALLED myself a journalist until Twitter made me. I’m an attorney, activist, and faculty member, but it was only by using the “journalist” label that I was able to get one of the most coveted assets in social media, the blue “verified” checkmark. My months-long effort to get verified revealed a system that is stacked against grassroots activists, particularly BIPOC communities.</li><li><a title="Twitter verification requirements - how to get the blue check" rel="nofollow" href="https://help.twitter.com/en/managing-your-account/about-twitter-verified-accounts">Twitter verification requirements - how to get the blue check</a> &mdash; 
The blue Verified badge on Twitter lets people know that an account of public interest is authentic. To receive the blue badge, your account must be authentic, notable, and active.</li><li><a title="Goodbye, Fleets" rel="nofollow" href="https://blog.twitter.com/en_us/topics/product/2021/goodbye-fleets">Goodbye, Fleets</a> &mdash; We built Fleets as a lower-pressure, ephemeral way for people to share their fleeting thoughts. We hoped Fleets would help more people feel comfortable joining the conversation on Twitter. But, in the time since we introduced Fleets to everyone, we haven’t seen an increase in the number of new people joining the conversation with Fleets like we hoped. Because of this, on August 3, Fleets will no longer be available on Twitter.

</li><li><a title="Black teen barred from skating rink by inaccurate facial recognition - The Verge" rel="nofollow" href="https://www.theverge.com/2021/7/15/22578801/black-teen-skating-rink-inaccurate-facial-recognition">Black teen barred from skating rink by inaccurate facial recognition - The Verge</a> &mdash; A facial recognition algorithm used by a local roller skating rink in Detroit wouldn’t let teen Lamya Robinson onto the premises, and accused her of previously getting into a fight at the establishment.</li><li><a title="Algorithmic bias - Wikipedia" rel="nofollow" href="https://en.wikipedia.org/wiki/Algorithmic_bias">Algorithmic bias - Wikipedia</a> &mdash; Algorithmic bias describes systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others. Bias can emerge due to many factors, including but not limited to the design of the algorithm or the unintended or unanticipated use or decisions relating to the way data is coded, collected, selected or used to train the algorithm. Algorithmic bias is found across platforms, including but not limited to search engine results and social media platforms, and can have impacts ranging from inadvertent privacy violations to reinforcing social biases of race, gender, sexuality, and ethnicity. The study of algorithmic bias is most concerned with algorithms that reflect "systematic and unfair" discrimination. This bias has only recently been addressed in legal frameworks, such as the 2018 European Union's General Data Protection Regulation. 
More comprehensive regulation is needed as emerging technologies become increasingly advanced and opaque.</li><li><a title="Face-Detection Cameras: Glitches Spur Charges of Racism - TIME" rel="nofollow" href="http://content.time.com/time/business/article/0,8599,1954643,00.html">Face-Detection Cameras: Glitches Spur Charges of Racism - TIME</a> &mdash; When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.</li><li><a title="Meet the Censored: Matt Orfalea - by Matt Taibbi - TK News by Matt Taibbi" rel="nofollow" href="https://taibbi.substack.com/p/meet-the-censored-matt-orfalea">Meet the Censored: Matt Orfalea - by Matt Taibbi - TK News by Matt Taibbi</a> &mdash; Yes, the government is helping crack down on text messages and Facebook posts, but not to worry. At least your private thoughts are safe, right? Not so fast, found filmmaker Matt Orfalea</li><li><a title="Texas’ social media censorship bill pushes unconstitutional limits on free speech" rel="nofollow" href="https://www.dallasnews.com/opinion/commentary/2021/07/11/texas-social-media-censorship-bill-pushes-unconstitutional-limits-on-free-speech/">Texas’ social media censorship bill pushes unconstitutional limits on free speech</a> &mdash; Amid ongoing allegations that social media platforms are censoring conservatives, regulating Big Tech has become one of the hottest issues across the country. In Texas, Gov. Greg Abbott has called a special legislative session in part to debate and pass content moderation legislation.</li><li><a title="I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too. 
- The New York Times" rel="nofollow" href="https://www.nytimes.com/2019/11/04/business/secret-consumer-score-access.html">I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too. - The New York Times</a> &mdash; Little-known companies are amassing your data — like food orders and Airbnb messages — and selling the analysis to clients. Here’s how to get a copy of what they have on you.</li></ul>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>Tune in to our new episode! Katherine Druckman, Doc Searls and Shawn Powers chat about Twitter verification, facial recognition, YouTube moderation, and algorithmic bias.</p>

<p><a href="https://reality2cast.com/newsletter" rel="nofollow">Subscribe to our newsletter.</a></p>

<p><strong>Reality 2.0 around the web:</strong><br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p>Special Guest: Shawn Powers.</p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="I’m Not a Reporter. But I’m Verified as One on Twitter | WIRED" rel="nofollow" href="https://www.wired.com/story/opinion-im-not-a-reporter-but-im-verified-as-one-on-twitter/">I’m Not a Reporter. But I’m Verified as One on Twitter | WIRED</a> &mdash; I NEVER CALLED myself a journalist until Twitter made me. I’m an attorney, activist, and faculty member, but it was only by using the “journalist” label that I was able to get one of the most coveted assets in social media, the blue “verified” checkmark. My months-long effort to get verified revealed a system that is stacked against grassroots activists, particularly BIPOC communities.</li><li><a title="Twitter verification requirements - how to get the blue check" rel="nofollow" href="https://help.twitter.com/en/managing-your-account/about-twitter-verified-accounts">Twitter verification requirements - how to get the blue check</a> &mdash; 
The blue Verified badge on Twitter lets people know that an account of public interest is authentic. To receive the blue badge, your account must be authentic, notable, and active.</li><li><a title="Goodbye, Fleets" rel="nofollow" href="https://blog.twitter.com/en_us/topics/product/2021/goodbye-fleets">Goodbye, Fleets</a> &mdash; We built Fleets as a lower-pressure, ephemeral way for people to share their fleeting thoughts. We hoped Fleets would help more people feel comfortable joining the conversation on Twitter. But, in the time since we introduced Fleets to everyone, we haven’t seen an increase in the number of new people joining the conversation with Fleets like we hoped. Because of this, on August 3, Fleets will no longer be available on Twitter.

</li><li><a title="Black teen barred from skating rink by inaccurate facial recognition - The Verge" rel="nofollow" href="https://www.theverge.com/2021/7/15/22578801/black-teen-skating-rink-inaccurate-facial-recognition">Black teen barred from skating rink by inaccurate facial recognition - The Verge</a> &mdash; A facial recognition algorithm used by a local roller skating rink in Detroit wouldn’t let teen Lamya Robinson onto the premises, and accused her of previously getting into a fight at the establishment.</li><li><a title="Algorithmic bias - Wikipedia" rel="nofollow" href="https://en.wikipedia.org/wiki/Algorithmic_bias">Algorithmic bias - Wikipedia</a> &mdash; Algorithmic bias describes systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others. Bias can emerge due to many factors, including but not limited to the design of the algorithm or the unintended or unanticipated use or decisions relating to the way data is coded, collected, selected or used to train the algorithm. Algorithmic bias is found across platforms, including but not limited to search engine results and social media platforms, and can have impacts ranging from inadvertent privacy violations to reinforcing social biases of race, gender, sexuality, and ethnicity. The study of algorithmic bias is most concerned with algorithms that reflect "systematic and unfair" discrimination. This bias has only recently been addressed in legal frameworks, such as the 2018 European Union's General Data Protection Regulation. 
More comprehensive regulation is needed as emerging technologies become increasingly advanced and opaque.</li><li><a title="Face-Detection Cameras: Glitches Spur Charges of Racism - TIME" rel="nofollow" href="http://content.time.com/time/business/article/0,8599,1954643,00.html">Face-Detection Cameras: Glitches Spur Charges of Racism - TIME</a> &mdash; When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.</li><li><a title="Meet the Censored: Matt Orfalea - by Matt Taibbi - TK News by Matt Taibbi" rel="nofollow" href="https://taibbi.substack.com/p/meet-the-censored-matt-orfalea">Meet the Censored: Matt Orfalea - by Matt Taibbi - TK News by Matt Taibbi</a> &mdash; Yes, the government is helping crack down on text messages and Facebook posts, but not to worry. At least your private thoughts are safe, right? Not so fast, found filmmaker Matt Orfalea</li><li><a title="Texas’ social media censorship bill pushes unconstitutional limits on free speech" rel="nofollow" href="https://www.dallasnews.com/opinion/commentary/2021/07/11/texas-social-media-censorship-bill-pushes-unconstitutional-limits-on-free-speech/">Texas’ social media censorship bill pushes unconstitutional limits on free speech</a> &mdash; Amid ongoing allegations that social media platforms are censoring conservatives, regulating Big Tech has become one of the hottest issues across the country. In Texas, Gov. Greg Abbott has called a special legislative session in part to debate and pass content moderation legislation.</li><li><a title="I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too. 
- The New York Times" rel="nofollow" href="https://www.nytimes.com/2019/11/04/business/secret-consumer-score-access.html">I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too. - The New York Times</a> &mdash; Little-known companies are amassing your data — like food orders and Airbnb messages — and selling the analysis to clients. Here’s how to get a copy of what they have on you.</li></ul>]]>
  </itunes:summary>
</item>
<item>
  <title>Episode 57: You Look Familiar, Did I See You on the Internet?</title>
  <link>https://www.reality2cast.com/57</link>
  <guid isPermaLink="false">e589f5b8-0fae-404e-a7f9-3b8d61e7ef84</guid>
  <pubDate>Fri, 12 Feb 2021 07:00:00 -0500</pubDate>
  <author>Katherine Druckman and Doc Searls</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/55e8e48a-ae33-492f-bc04-175d577a5a7e/e589f5b8-0fae-404e-a7f9-3b8d61e7ef84.mp3" length="37813702" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:author>Katherine Druckman and Doc Searls</itunes:author>
  <itunes:subtitle>Katherine Druckman and Doc Searls talk facial recognition AI using our photos for training, and how we collectively negotiate our own privacy online.</itunes:subtitle>
  <itunes:duration>44:49</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/5/55e8e48a-ae33-492f-bc04-175d577a5a7e/cover.jpg?v=5"/>
  <description>Katherine Druckman and Doc Searls talk facial recognition AI using our photos for training, and how we collectively negotiate our own privacy online.
Subscribe to our newsletter. (https://reality2cast.com/newsletter)
Reality 2.0 around the web:
Site/Blog/Newsletter (https://www.reality2cast.com)
Facebook (https://www.facebook.com/reality2cast)
Twitter (https://twitter.com/reality2cast)
YouTube (https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q)
Mastodon (https://linuxrocks.online/@reality2cast)
</description>
  <itunes:keywords>technology, privacy, open source, security, linux, facial recognition, clearview ai, flickr</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>Katherine Druckman and Doc Searls talk facial recognition AI using our photos for training, and how we collectively negotiate our own privacy online.</p>

<p><a href="https://reality2cast.com/newsletter" rel="nofollow">Subscribe to our newsletter.</a></p>

<p><strong>Reality 2.0 around the web:</strong><br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="This new tool can tell you if your online photos are helping train facial recognition systems - CNN" rel="nofollow" href="https://www.cnn.com/2021/02/04/tech/face-recognition-ai-tool/">This new tool can tell you if your online photos are helping train facial recognition systems - CNN</a> &mdash; Exposing.ai, unveiled in January, lets you know whether photos you've posted to image-sharing site Flickr have been used to advance this controversial application of artificial intelligence by allowing you to search more than 3.6 million photos in six facial-recognition image datasets.</li><li><a title="Check if your photos were used in AI surveillance research projects" rel="nofollow" href="https://exposing.ai/">Check if your photos were used in AI surveillance research projects</a> &mdash; Check if your Flickr photos were used to build face recognition</li><li><a title="Clearview AI’s Facial Recognition App Called Illegal in Canada - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/02/03/technology/clearview-ai-illegal-canada.html">Clearview AI’s Facial Recognition App Called Illegal in Canada - The New York Times</a> &mdash; Canadian authorities declared that the company needed citizens’ consent to use their biometric information, and told the firm to delete facial images from its database.</li><li><a title="This &#39;Anonymizer&#39; Tool Replaces Your Face With a Fake to Trick Facial Recognition | Debugger" rel="nofollow" href="https://debugger.medium.com/replace-your-face-with-an-a-i-twin-to-trick-facial-recognition-22be6931cf1">This 'Anonymizer' Tool Replaces Your Face With a Fake to Trick Facial Recognition | Debugger</a> &mdash; Try the Anonymizer tool to create a fake face that looks like you
</li><li><a title="Doc Searls Weblog · About face" rel="nofollow" href="https://blogs.harvard.edu/doc/2019/10/31/about-face/">Doc Searls Weblog · About face</a> &mdash; We know more than we can tell.</li><li><a title="Facebook CIA Project: The Onion News Network ONN - YouTube" rel="nofollow" href="https://www.youtube.com/watch?v=juQcZO_WnsI">Facebook CIA Project: The Onion News Network ONN - YouTube</a></li><li><a title="Sun on Privacy: &#39;Get Over It&#39; | WIRED" rel="nofollow" href="https://www.wired.com/1999/01/sun-on-privacy-get-over-it/">Sun on Privacy: 'Get Over It' | WIRED</a> &mdash; "You have zero privacy anyway," Scott McNealy told a group of reporters and analysts Monday night at an event to launch his company's new Jini technology.

"Get over it."</li><li><a title="The Andromeda Strain - MichaelCrichton.com" rel="nofollow" href="https://www.michaelcrichton.com/the-andromeda-strain/">The Andromeda Strain - MichaelCrichton.com</a></li><li><a title="A Vast Web of Vengeance - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/01/30/technology/change-my-google-results.html">A Vast Web of Vengeance - The New York Times</a> &mdash; Outrageous lies destroyed Guy Babcock’s online reputation. When he went hunting for their source, what he discovered was worse than he could have imagined.</li><li><a title="Canadian Woman Accused of Defaming Dozens Online Is Arrested in Toronto - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/02/10/technology/nadire-atas-arrest.html">Canadian Woman Accused of Defaming Dozens Online Is Arrested in Toronto - The New York Times</a> &mdash; Nadire Atas, a Canadian woman who wrote thousands of online posts defaming her perceived enemies, was arrested on Tuesday by the police in Toronto. She was charged with crimes including harassment and libel, a Toronto police spokeswoman said.</li><li><a title="CORE Response: Community Organized Relief Effort" rel="nofollow" href="https://www.coreresponse.org/">CORE Response: Community Organized Relief Effort</a> &mdash; Together, we save lives.</li></ul>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>Katherine Druckman and Doc Searls talk facial recognition AI using our photos for training, and how we collectively negotiate our own privacy online.</p>

<p><a href="https://reality2cast.com/newsletter" rel="nofollow">Subscribe to our newsletter.</a></p>

<p><strong>Reality 2.0 around the web:</strong><br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="This new tool can tell you if your online photos are helping train facial recognition systems - CNN" rel="nofollow" href="https://www.cnn.com/2021/02/04/tech/face-recognition-ai-tool/">This new tool can tell you if your online photos are helping train facial recognition systems - CNN</a> &mdash; Exposing.ai, unveiled in January, lets you know whether photos you've posted to image-sharing site Flickr have been used to advance this controversial application of artificial intelligence by allowing you to search more than 3.6 million photos in six facial-recognition image datasets.</li><li><a title="Check if your photos were used in AI surveillance research projects" rel="nofollow" href="https://exposing.ai/">Check if your photos were used in AI surveillance research projects</a> &mdash; Check if your Flickr photos were used to build face recognition</li><li><a title="Clearview AI’s Facial Recognition App Called Illegal in Canada - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/02/03/technology/clearview-ai-illegal-canada.html">Clearview AI’s Facial Recognition App Called Illegal in Canada - The New York Times</a> &mdash; Canadian authorities declared that the company needed citizens’ consent to use their biometric information, and told the firm to delete facial images from its database.</li><li><a title="This &#39;Anonymizer&#39; Tool Replaces Your Face With a Fake to Trick Facial Recognition | Debugger" rel="nofollow" href="https://debugger.medium.com/replace-your-face-with-an-a-i-twin-to-trick-facial-recognition-22be6931cf1">This 'Anonymizer' Tool Replaces Your Face With a Fake to Trick Facial Recognition | Debugger</a> &mdash; Try the Anonymizer tool to create a fake face that looks like you
</li><li><a title="Doc Searls Weblog · About face" rel="nofollow" href="https://blogs.harvard.edu/doc/2019/10/31/about-face/">Doc Searls Weblog · About face</a> &mdash; We know more than we can tell.</li><li><a title="Facebook CIA Project: The Onion News Network ONN - YouTube" rel="nofollow" href="https://www.youtube.com/watch?v=juQcZO_WnsI">Facebook CIA Project: The Onion News Network ONN - YouTube</a></li><li><a title="Sun on Privacy: &#39;Get Over It&#39; | WIRED" rel="nofollow" href="https://www.wired.com/1999/01/sun-on-privacy-get-over-it/">Sun on Privacy: 'Get Over It' | WIRED</a> &mdash; "You have zero privacy anyway," Scott McNealy told a group of reporters and analysts Monday night at an event to launch his company's new Jini technology.

"Get over it."</li><li><a title="The Andromeda Strain - MichaelCrichton.com" rel="nofollow" href="https://www.michaelcrichton.com/the-andromeda-strain/">The Andromeda Strain - MichaelCrichton.com</a></li><li><a title="A Vast Web of Vengeance - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/01/30/technology/change-my-google-results.html">A Vast Web of Vengeance - The New York Times</a> &mdash; Outrageous lies destroyed Guy Babcock’s online reputation. When he went hunting for their source, what he discovered was worse than he could have imagined.</li><li><a title="Canadian Woman Accused of Defaming Dozens Online Is Arrested in Toronto - The New York Times" rel="nofollow" href="https://www.nytimes.com/2021/02/10/technology/nadire-atas-arrest.html">Canadian Woman Accused of Defaming Dozens Online Is Arrested in Toronto - The New York Times</a> &mdash; Nadire Atas, a Canadian woman who wrote thousands of online posts defaming her perceived enemies, was arrested on Tuesday by the police in Toronto. She was charged with crimes including harassment and libel, a Toronto police spokeswoman said.</li><li><a title="CORE Response: Community Organized Relief Effort" rel="nofollow" href="https://www.coreresponse.org/">CORE Response: Community Organized Relief Effort</a> &mdash; Together, we save lives.</li></ul>]]>
  </itunes:summary>
</item>
<item>
  <title>Episode 46: Facial Recognition, Surveillance Technology, and the Balance of Power</title>
  <link>https://www.reality2cast.com/46</link>
  <guid isPermaLink="false">7737bfad-2921-4af3-9633-1dbb0561bd9b</guid>
  <pubDate>Fri, 30 Oct 2020 11:00:00 -0400</pubDate>
  <author>Katherine Druckman and Doc Searls</author>
  <enclosure url="https://aphid.fireside.fm/d/1437767933/55e8e48a-ae33-492f-bc04-175d577a5a7e/7737bfad-2921-4af3-9633-1dbb0561bd9b.mp3" length="50287126" type="audio/mpeg"/>
  <itunes:episodeType>full</itunes:episodeType>
  <itunes:author>Katherine Druckman and Doc Searls</itunes:author>
  <itunes:subtitle>Doc Searls, Katherine Druckman, and Kyle Rankin talk about facial recognition and surveillance technology in the hands of individuals, and how that affects the balance of power.</itunes:subtitle>
  <itunes:duration>58:33</itunes:duration>
  <itunes:explicit>no</itunes:explicit>
  <itunes:image href="https://media24.fireside.fm/file/fireside-images-2024/podcasts/images/5/55e8e48a-ae33-492f-bc04-175d577a5a7e/cover.jpg?v=5"/>
  <description>Doc Searls, Katherine Druckman, and Kyle Rankin talk about facial recognition and surveillance technology in the hands of individuals, and how that affects the balance of power.
Reality 2.0 around the web:
Site/Blog/Newsletter (https://www.reality2cast.com)
Facebook (https://www.facebook.com/reality2cast)
Twitter (https://twitter.com/reality2cast)
YouTube (https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q)
Mastodon (https://linuxrocks.online/@reality2cast) Special Guest: Kyle Rankin.
</description>
  <itunes:keywords>technology, privacy, open source, security, linux, surveillance, facial recognition, ai</itunes:keywords>
  <content:encoded>
    <![CDATA[<p>Doc Searls, Katherine Druckman, and Kyle Rankin talk about facial recognition and surveillance technology in the hands of individuals, and how that affects the balance of power.</p>

<p>Reality 2.0 around the web:<br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p>Special Guest: Kyle Rankin.</p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="Activists Turn Facial Recognition Tools Against the Police - The New York Times" rel="nofollow" href="https://www.nytimes.com/2020/10/21/technology/facial-recognition-police.html">Activists Turn Facial Recognition Tools Against the Police - The New York Times</a> &mdash; These activists say it has become relatively easy to build facial recognition tools thanks to off-the-shelf image recognition software that has been made available in recent years. In Portland, Mr. Howell used a Google-provided platform, TensorFlow, which helps people build machine-learning models.</li><li><a title="Fawkes" rel="nofollow" href="http://sandlab.cs.uchicago.edu/fawkes/">Fawkes</a> &mdash; The SAND Lab at University of Chicago has developed Fawkes1, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how unknown third parties can track them by building facial recognition models out of their publicly available photos.</li><li><a title="Mass Extraction - Upturn" rel="nofollow" href="https://www.upturn.org/reports/2020/mass-extraction/">Mass Extraction - Upturn</a> &mdash; To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”</li><li><a title="Doc Searls Weblog · About face" rel="nofollow" href="http://blogs.harvard.edu/doc/2019/10/31/about-face/">Doc Searls Weblog · About face</a> &mdash; We know more than we can tell.
</li><li><a title="Vivian Maier Photographer | Official website of Vivian Maier | Vivian Maier Portfolios, Prints, Exhibitions, Books and documentary film" rel="nofollow" href="http://www.vivianmaier.com/">Vivian Maier Photographer | Official website of Vivian Maier | Vivian Maier Portfolios, Prints, Exhibitions, Books and documentary film</a></li><li><a title="Welcome to the 21st Century: How To Plan For The Post-Covid Future - O&#39;Reilly Media" rel="nofollow" href="https://www.oreilly.com/tim/21stcentury/">Welcome to the 21st Century: How To Plan For The Post-Covid Future - O'Reilly Media</a> &mdash; So too, when we look back, we will understand that the 21st century truly began this year, when the COVID19 pandemic took hold. We are entering the century of being blindsided by things that we have been warned about for decades but never took seriously enough to prepare for, the century of lurching from crisis to crisis until, at last, we shake ourselves from the illusion that our world will go back to the comfortable way it was and begin the process of rebuilding our society from the ground up.</li></ul>]]>
  </content:encoded>
  <itunes:summary>
    <![CDATA[<p>Doc Searls, Katherine Druckman, and Kyle Rankin talk about facial recognition and surveillance technology in the hands of individuals, and how that affects the balance of power.</p>

<p>Reality 2.0 around the web:<br>
<a href="https://www.reality2cast.com" rel="nofollow">Site/Blog/Newsletter</a><br>
<a href="https://www.facebook.com/reality2cast" rel="nofollow">Facebook</a><br>
<a href="https://twitter.com/reality2cast" rel="nofollow">Twitter</a><br>
<a href="https://www.youtube.com/channel/UCdvdT3quikpi9sd5SxTGk3Q" rel="nofollow">YouTube</a><br>
<a href="https://linuxrocks.online/@reality2cast" rel="nofollow">Mastodon</a></p><p>Special Guest: Kyle Rankin.</p><p><a rel="payment" href="https://www.patreon.com/reality2cast">Support Reality 2.0</a></p><p>Links:</p><ul><li><a title="Activists Turn Facial Recognition Tools Against the Police - The New York Times" rel="nofollow" href="https://www.nytimes.com/2020/10/21/technology/facial-recognition-police.html">Activists Turn Facial Recognition Tools Against the Police - The New York Times</a> &mdash; These activists say it has become relatively easy to build facial recognition tools thanks to off-the-shelf image recognition software that has been made available in recent years. In Portland, Mr. Howell used a Google-provided platform, TensorFlow, which helps people build machine-learning models.</li><li><a title="Fawkes" rel="nofollow" href="http://sandlab.cs.uchicago.edu/fawkes/">Fawkes</a> &mdash; The SAND Lab at University of Chicago has developed Fawkes, an algorithm and software tool (running locally on your computer) that gives individuals the ability to limit how unknown third parties can track them by building facial recognition models out of their publicly available photos.</li><li><a title="Mass Extraction - Upturn" rel="nofollow" href="https://www.upturn.org/reports/2020/mass-extraction/">Mass Extraction - Upturn</a> &mdash; To search phones, law enforcement agencies use mobile device forensic tools (MDFTs), a powerful technology that allows police to extract a full copy of data from a cellphone — all emails, texts, photos, location, app data, and more — which can then be programmatically searched. As one expert puts it, with the amount of sensitive information stored on smartphones today, the tools provide a “window into the soul.”</li><li><a title="Doc Searls Weblog · About face" rel="nofollow" href="http://blogs.harvard.edu/doc/2019/10/31/about-face/">Doc Searls Weblog · About face</a> &mdash; We know more than we can tell.
</li><li><a title="Vivian Maier Photographer | Official website of Vivian Maier | Vivian Maier Portfolios, Prints, Exhibitions, Books and documentary film" rel="nofollow" href="http://www.vivianmaier.com/">Vivian Maier Photographer | Official website of Vivian Maier | Vivian Maier Portfolios, Prints, Exhibitions, Books and documentary film</a></li><li><a title="Welcome to the 21st Century: How To Plan For The Post-Covid Future - O&#39;Reilly Media" rel="nofollow" href="https://www.oreilly.com/tim/21stcentury/">Welcome to the 21st Century: How To Plan For The Post-Covid Future - O'Reilly Media</a> &mdash; So too, when we look back, we will understand that the 21st century truly began this year, when the COVID19 pandemic took hold. We are entering the century of being blindsided by things that we have been warned about for decades but never took seriously enough to prepare for, the century of lurching from crisis to crisis until, at last, we shake ourselves from the illusion that our world will go back to the comfortable way it was and begin the process of rebuilding our society from the ground up.</li></ul>]]>
  </itunes:summary>
</item>
  </channel>
</rss>
