Tune in to our new episode! Katherine Druckman, Doc Searls and Shawn Powers chat about Twitter verification, facial recognition, YouTube moderation, and algorithmic bias.

Support Reality 2.0
- I’m Not a Reporter. But I’m Verified as One on Twitter | WIRED — I never called myself a journalist until Twitter made me. I’m an attorney, activist, and faculty member, but it was only by using the “journalist” label that I was able to get one of the most coveted assets in social media, the blue “verified” checkmark. My months-long effort to get verified revealed a system that is stacked against grassroots activists, particularly BIPOC communities.
- Twitter verification requirements - how to get the blue check — The blue Verified badge on Twitter lets people know that an account of public interest is authentic. To receive the blue badge, your account must be authentic, notable, and active.
- Goodbye, Fleets — We built Fleets as a lower-pressure, ephemeral way for people to share their fleeting thoughts. We hoped Fleets would help more people feel comfortable joining the conversation on Twitter. But, in the time since we introduced Fleets to everyone, we haven’t seen an increase in the number of new people joining the conversation with Fleets like we hoped. Because of this, on August 3, Fleets will no longer be available on Twitter.
- Black teen barred from skating rink by inaccurate facial recognition - The Verge — A facial recognition algorithm used by a local roller skating rink in Detroit wouldn’t let teen Lamya Robinson onto the premises, and accused her of previously getting into a fight at the establishment.
- Algorithmic bias - Wikipedia — Algorithmic bias describes systematic and repeatable errors in a computer system that create unfair outcomes, such as privileging one arbitrary group of users over others. Bias can emerge from many factors, including but not limited to the design of the algorithm, its unintended or unanticipated use, or decisions about the way data is coded, collected, selected, or used to train the algorithm. Algorithmic bias is found across platforms, including but not limited to search engine results and social media platforms, and can have impacts ranging from inadvertent privacy violations to reinforcing social biases of race, gender, sexuality, and ethnicity. The study of algorithmic bias is most concerned with algorithms that reflect "systematic and unfair" discrimination. This bias has only recently been addressed in legal frameworks, such as the European Union's 2018 General Data Protection Regulation. More comprehensive regulation is needed as emerging technologies become increasingly advanced and opaque.
- Face-Detection Cameras: Glitches Spur Charges of Racism - TIME — When Joz Wang and her brother bought their mom a Nikon Coolpix S630 digital camera for Mother's Day last year, they discovered what seemed to be a malfunction. Every time they took a portrait of each other smiling, a message flashed across the screen asking, "Did someone blink?" No one had. "I thought the camera was broken!" Wang, 33, recalls. But when her brother posed with his eyes open so wide that he looked "bug-eyed," the messages stopped.
- Meet the Censored: Matt Orfalea - by Matt Taibbi - TK News by Matt Taibbi — Yes, the government is helping crack down on text messages and Facebook posts, but not to worry. At least your private thoughts are safe, right? Not so fast, found filmmaker Matt Orfalea.
- Texas’ social media censorship bill pushes unconstitutional limits on free speech — Amid ongoing allegations that social media platforms are censoring conservatives, regulating Big Tech has become one of the hottest issues across the country. In Texas, Gov. Greg Abbott has called a special legislative session in part to debate and pass content moderation legislation.
- I Got Access to My Secret Consumer Score. Now You Can Get Yours, Too. - The New York Times — Little-known companies are amassing your data — like food orders and Airbnb messages — and selling the analysis to clients. Here’s how to get a copy of what they have on you.