URL: https://www.washingtonpost.com/business/2024/05/18/facial-recognition-law-enforcement-austin-san-francisco/?utm_campaign=wp_pos...

Democracy Dies in Darkness

Business


These cities bar facial recognition tech. Police still found ways to access it.

Citing concerns about accuracy and racial bias, the cities banned the
technology. So some police officers sought help from other law enforcement
agencies.

By Douglas MacMillan
May 18, 2024 at 6:00 a.m. EDT

Concerns about the accuracy of facial recognition tools have prompted a wave of
local and state bans on the technology. (iStock/Getty Images)


As cities and states push to restrict the use of facial recognition
technologies, some police departments have quietly found a way to keep using the
controversial tools: asking for help from other law enforcement agencies that
still have access.


Officers in Austin and San Francisco — two of the largest cities where police
are banned from using the technology — have repeatedly asked police in
neighboring towns to run photos of criminal suspects through their facial
recognition programs, according to a Washington Post review of police documents.



In San Francisco, the workaround didn’t appear to help. Since the city’s ban
took effect in 2019, the San Francisco Police Department has asked outside
agencies to conduct at least five facial recognition searches, but no matches
were returned, according to a summary of those incidents submitted by the
department to the county’s board of supervisors last year.

SFPD spokesman Evan Sernoffsky said these requests violated the city ordinance
and were not authorized by the department, but the agency faced no consequences
from the city. He declined to say whether any officers were disciplined, saying
those would be personnel matters.



Austin police officers have received the results of at least 13 face searches
from a neighboring police department since the city’s 2020 ban — and have
appeared to get hits on some of them, according to documents obtained by The
Post through public records requests and sources who shared them on the
condition of anonymity.

“That’s him! Thank you very much,” one Austin police officer wrote in response
to an array of photos sent to him by an officer in Leander, Tex., who ran a
facial recognition search, documents show. The man displayed in the pictures,
John Curry Jr., was later charged with aggravated assault for allegedly rushing
toward someone with a knife, and is currently in jail awaiting trial. Curry’s
attorney declined to comment.

But at least one man who was ensnared by the searches argued that police should
be held to the same standards as ordinary citizens.

“We have to follow the laws. Why don’t they?” said Tyrell Johnson, 20, who was
identified by a facial recognition search in August as a suspect in the armed
robbery of an Austin 7-Eleven, documents show. Johnson said he’s innocent,
though prosecutors said in court documents that he bears the same hand tattoo
and was seen in a video on social media wearing the same clothing as the person
caught on tape committing the crime. He’s awaiting trial.

A spokeswoman for the Austin Police Department said these uses of facial
recognition were never authorized by department or city officials. She said the
department would review the cases for potential violations of city rules.

“When allegations are made against any department staff, we follow a consistent
process,” the spokeswoman said in an emailed statement. “We’ve initiated that
process to investigate the claims. If the investigation determines that policies
were violated, APD will take the necessary steps.”

The Leander Police Department declined to comment.

Police officers’ efforts to skirt these bans have not been previously reported
and highlight the challenge of reining in police use of facial recognition. The
powerful but imperfect artificial intelligence technology has played a role in
the wrongful arrests of at least seven innocent Americans, six of whom were
Black, according to lawsuits each of these people filed after the charges
against them were dismissed.

Concerns about the accuracy of these tools — found to be worse when scanning for
people of color, according to a 2019 federal study — have prompted a wave of
local and state bans on the technology, particularly during the policing
overhauls passed in the wake of the Black Lives Matter protests of 2020.

Advertisement


But enforcing these bans is difficult, experts said, because authorities often
conceal their use of facial recognition. Even in places with no restrictions on
the technology, investigators rarely mention its use in police reports. And,
because facial recognition searches are not presented as evidence in court —
legal authorities claim this information is treated as an investigative lead,
not as proof of guilt — prosecutors in most places are not required to tell
criminal defendants they were identified using an algorithm, according to
interviews with defense lawyers, prosecutors and judges.

“Police are using it but not saying they are using it,” said Chesa Boudin, San
Francisco’s former district attorney, who said he was wary of prosecuting cases
that may have relied on information the SFPD obtained in violation of the city’s
ban.

Facial recognition algorithms have been used by some police for over a decade to
identify criminal suspects. The technology analyzes a “probe image” — taken
perhaps from a crime scene photo or surveillance video — and rapidly scans
through a database of millions of images to locate faces with similar features.
Experts said the technology’s effectiveness can hinge on the quality of the
probe image and the cognitive biases of human users, who have the sometimes
difficult task of selecting one possible match out of dozens of candidates that
may be returned by an algorithm.
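The matching step the experts describe can be sketched in a few lines of code. The example below is a hypothetical illustration, not any vendor’s actual system: it assumes faces have already been reduced to numeric “embedding” vectors (real tools use trained neural networks for that step) and simply ranks a gallery of known faces by similarity to the probe image, returning candidates for a human examiner to review.

```python
# Illustrative sketch only -- toy embeddings, not a real face-matching system.
from math import sqrt

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    return dot / (sqrt(sum(x * x for x in a)) * sqrt(sum(y * y for y in b)))

def top_candidates(probe, gallery, k=3):
    """Rank gallery faces by similarity to the probe image's embedding and
    return the k best (index, score) pairs -- candidate leads for a human
    to review, not a definitive identification."""
    scored = [(i, cosine(probe, emb)) for i, emb in enumerate(gallery)]
    scored.sort(key=lambda pair: pair[1], reverse=True)
    return scored[:k]

# Toy gallery of four "faces" as 3-D embeddings (hypothetical data).
gallery = [
    [1.0, 0.0, 0.0],
    [0.9, 0.1, 0.0],   # closest to the probe below
    [0.0, 1.0, 0.0],
    [0.0, 0.0, 1.0],
]
probe = [0.9, 0.1, 0.0]
candidates = top_candidates(probe, gallery, k=2)
```

Note that the function always returns the closest matches it can find, even when none of them is the person in the probe image; that design is why a poor-quality probe photo, or a biased human review of the candidate list, can point investigators at the wrong person.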

The first known false arrest linked to facial recognition was of a Black man in
Detroit. His arrest was the subject of an article in the New York Times in June
2020, one month after the murder of George Floyd at the hands of Minneapolis
police fueled national protests over policing tactics in minority communities.

That same month, Austin passed its ban on facial recognition, part of a city
council resolution that also restricted police use of tear gas, chokeholds,
military equipment and no-knock warrants.

“The outcry was so great in that moment,” said Chris Harris, a policy director
at the Austin Justice Coalition, a nonprofit civil rights group. A long list of
police reforms the community had discussed for years, he said, “suddenly became
possible.”

The city council of Jackson, Miss., soon followed suit, saying the technology
“has been shown to programmatically misidentify people of color, women, and
children: thus supercharging discrimination.” Portland, Maine, passed its own
ban, saying “the use of facial recognition and other biometric surveillance
would disproportionately impact the civil rights and liberties of people who
live in highly policed neighborhoods.”

In all, 21 cities or counties and Vermont have voted to prohibit the use of
facial recognition tools by law enforcement, according to the Security Industry
Association, a Maryland-based trade group.

Boudin, the former San Francisco district attorney, says he saw evidence the
SFPD commonly employed a different workaround that gave it plausible
deniability: sharing “be on the lookout” fliers containing images of suspects
with other police agencies in the Bay Area, who might take it upon themselves to
run the photos through their facial recognition software and send back any
results.

Sernoffsky, the SFPD spokesman, called Boudin’s claim an “outlandish conspiracy
theory,” adding that any assertion that “SFPD routinely engaged in this practice
beyond the cases we made public is absolutely false.”

In September 2020, the San Francisco Chronicle reported that the SFPD had
charged a suspect with illegally discharging a gun after he was identified
through a facial recognition search result. The lead was provided by the
Northern California Regional Intelligence Center, or NCRIC, a multi-jurisdiction
program serving law enforcement agencies in the region. At the time, the SFPD
told the Chronicle that it had not asked the NCRIC to conduct the search and
that it identified the suspect through other means.

Mike Sena, executive director of the NCRIC, said his analysts always send
suspect leads out to agencies whenever they get a hit. “We are not trying to
force anyone to violate their policy, but if I identify a potential lead as a
murder suspect, we are not going to just sit on it,” Sena said.

In the five cases the police department reported to the city’s board of
supervisors, two SFPD officers explicitly asked a “state and local law
enforcement fusion center” to help them identify robbery, aggravated assault and
stabbing suspects in 2020 and 2021, and one of them asked the Daly City Police
Department for help identifying a stabbing suspect in 2021. The disclosure was
part of an annual report in which the SFPD is required to list how it used any
surveillance technology.

A Daly City police official said he had no immediate comment.

The SFPD told the board of supervisors that all five incidents had been examined
by the department’s internal investigators but did not say whether any
disciplinary measures had been taken.

“The SFPD is not taking the facial recognition ban seriously,” said Brian Hofer,
executive director of Secure Justice, a watchdog of police surveillance who
shared the San Francisco document with The Post. “They have repeatedly violated
it and stronger consequences are needed.”

Sernoffsky said the SFPD follows all city laws and department policies and
thoroughly investigates any accusations of policy violations.

Police departments can be deeply entwined with their law enforcement neighbors,
especially when it comes to sharing information that could help catch criminals.
The Rev. Ricky Burgess, a former city council member in Pittsburgh, warned his
colleagues when they passed a 2020 ban on facial recognition that the measure
probably would be ineffective because police frequently collaborate with
neighboring and statewide agencies.

“Right now, today, the city of Pittsburgh is using facial recognition through
the state of Pennsylvania, and we have no control over it whatsoever,” Burgess
said at the time, according to a video of the meeting archived by the public
records database Quorum. “This is a bill simply for window dressing.”

A spokesperson for Pittsburgh police said the department does not use facial
recognition technology.

Austin’s city council tried to prevent such loopholes. Its resolution prohibits
city employees from using facial recognition as well as “information obtained”
from the technology. Exceptions for cases of “imminent threat or danger” require
approval from the city manager.

After the ban went into effect, Austin police discovered that their colleagues
in Leander, a suburb about 30 miles north of Austin, had access to Clearview AI,
one of the most popular providers of facial recognition software for police
agencies. Clearview’s database includes billions of images scraped from social
media and other websites — images that privacy advocates say were collected
without appropriate consent.

Between May 2022 and January 2024, Austin police officers emailed Leander police
at least six times, explicitly asking them to run photos through facial
recognition, documents show. In at least seven other cases, Leander police
provided facial recognition results to Austin police even though it wasn’t clear
they had been explicitly asked to do so. Most of the searches were conducted by
one Leander officer, David Wilson, whose name circulated within the ranks of
Austin police as someone who could help run facial searches, emails reviewed by
The Post show. Wilson was listed on Leander’s contract with Clearview as the
agency’s “influencer” for the technology.

“Hello sir, I was referred to you by our Robbery Unit who advised me you are
able to do facial recognition,” one Austin detective wrote in a December 2022
email to Wilson obtained by The Post. “I am working a case where I am trying to
identify a suspect and was curious if you might be able to help me out with it.”

Wilson did not respond to requests for comment.

Clearview prohibits law enforcement customers from sharing their access to the
platform with anyone outside their agency. But the software lets customers
easily export the results of facial recognition searches. Last year, Clearview
CEO Hoan Ton-That sent an email to customers saying it would begin restricting
this feature, citing concerns that excessive information sharing could hurt the
company’s business and increase the potential for errors.

“Sharing results with others who are not trained on facial recognition usage and
best practices may lead to higher chances of mistakes in these investigations,”
Ton-That said in the email.

The letter said nothing about sharing Clearview results through more rudimentary
methods, such as copying and pasting images into emails — the method Wilson
appeared to use several times, even after Clearview said it was clamping down on
sharing, emails show.

Clearview did not respond to requests for comment.

Nate Jones and Jeremy B. Merrill contributed to this report.
