Internet Censorship Won't Make Kids Safe
October 5, 2023
Today’s young people have a lot to deal with. Pandemic interruptions, social isolation, climate change, political polarization, ever-changing technology — all on top of the typical turbulence of adolescence. Studies on youth mental health outcomes show increasing loneliness and hopelessness, illustrating one thing: the kids are not alright. We all want a silver bullet for the youth mental health crisis, and some lawmakers are claiming they have one: the Kids Online Safety Act, or KOSA for short.
After failing to gain traction in 2022, this bipartisan bill has been revised and reintroduced in Congress — but like most solutions that claim to solve all our woes with the stroke of a pen, KOSA is too good to be true.
If passed, KOSA would allow each state’s attorney general to individually decide what parts of the internet kids can and cannot access. In fact, KOSA proponents have even openly admitted that they plan to use KOSA to block kids from LGBTQ content online.
We at the ACLU, along with other civil rights organizations and parents of queer and trans youth, have spoken out against the bill for all the ways it overreaches, suppresses our right to free information, and targets LGBTQ people. As anti-LGBTQ legislation continues to rise, KOSA is one of many censorship tools masquerading as a kids' safety solution.
Joining us today to explain the consequences this bill could have for us all are Evan Greer, director of the digital rights group Fight for the Future, and Cody Venzke, senior policy counsel at the ACLU.
In this episode
Kendall Ciesemier
This Episode Covers the Following Issues
- Internet Privacy
- Internet Speech
- Consumer Online Privacy
- Online Anonymity and Identity
- LGBTQ Youth
- Privacy & Technology
- Trans and Gender-Nonconforming Youth
- Transgender People and Discrimination
- Transgender Rights
- LGBTQ Rights
- Anti-LGBTQ Web Filtering
- LGBTQ Nondiscrimination Protections
- Free Expression and Censorship
- Free Speech