Amazon’s Facial-Recognition Tech ‘Supercharges’ Local Police

Amazon says its Rekognition AI system is “indistinguishable from magic,” and it now offers the tool for nominal fees to law enforcement at all levels, from local city police and sheriff’s offices to the FBI. There are currently no national laws regulating the use of facial recognition. ⁃ TN Editor

When workers at an Ace Hardware here reported that a woman had walked out of the store with an $11.99 tank of welding gas that she hadn’t paid for in her tote bag, an elaborate high-tech crime-fighting operation sprang into action.

A Washington County sheriff’s detective, working with the agency’s Special Investigations Unit, ran the store’s surveillance footage through an internal facial-recognition program built by Amazon, revealing a possible match.

That woman’s license plate was flagged and, three months later, a narcotics officer in an unmarked SUV saw it and radioed other patrol deputies to stop her. A deputy clapped a pair of handcuffs around her wrists, an arrest report states. She said she’d needed the gas to fix her car.

Deputies in this corner of western Oregon outside ultraliberal Portland used to track down criminals the old-fashioned way, faxing caught-on-camera images of a suspect around the office in hope that someone might recognize the face.

Then, in late 2017, the Washington County Sheriff’s Office became the first law enforcement agency in the country known to use Amazon’s artificial-intelligence tool Rekognition, transforming this thicket of forests and suburbs into a public testing ground for a new wave of experimental police surveillance techniques.

Almost overnight, deputies saw their investigative powers supercharged, allowing them to scan for matches of a suspect’s face across more than 300,000 mug shots taken at the county jail since 2001. A grainy picture of someone’s face – captured by a security camera, a social media account or a deputy’s smartphone – can quickly become a link to their identity, including their name, family and address. More than 1,000 facial-recognition searches were logged last year, said deputies, who sometimes used the results to find a suspect’s Facebook page or visit their home.

But Washington County also became ground zero for a high-stakes battle over the unregulated growth of policing by algorithm. Defense attorneys, artificial-intelligence researchers and civil rights experts argue that the technology could lead to the wrongful arrest of innocent people who bear only a resemblance to a video image. Rekognition’s accuracy is also hotly disputed, and some experts worry that a case of mistaken identity by armed deputies could have dangerous implications, threatening privacy and people’s lives.

Some police agencies have in recent years run facial-recognition searches against state or FBI databases using systems built by contractors such as Cognitec, IDEMIA and NEC. But the rollout by Amazon has marked perhaps the biggest step in making the controversial face-scanning technology mainstream. Rekognition is easy to activate, requires no major technical infrastructure and is offered to virtually anyone at bargain-basement prices. Washington County spent about $700 to upload its first big haul of photos, and now, for all its searches, it pays about $7 a month.

It’s impossible to tell, though, just how accurate or effective the technology has been during its first 18 months of real-world tests. Deputies don’t have to note in arrest reports when a facial-recognition search was used, and the exact number of times it has resulted in an arrest is unclear. Sheriff’s officials said the software has led to dozens of arrests for theft, violence or other crimes, but a public-records request turned up nine case reports in which facial recognition was mentioned.

“Just like any of our investigative techniques, we don’t tell people how we catch them,” said Robert Rookhuyzen, a detective on the agency’s major crimes team who said he has run “several dozen” searches and found it helpful about 75% of the time. “We want them to keep guessing.”

Sheriff’s officials say face scans don’t always mark the end of the investigation: Deputies must still establish probable cause or find evidence before charging a suspect with a crime. But the Sheriff’s Office sets its own rules for facial-recognition use and allows deputies to use the tool to identify bodies, unconscious suspects and people who refuse to give their names.

The search tool’s imperfect results raise the risk of an innocent person being flagged and arrested, especially when the scanned image is blurry, low-quality or partially obscured. Deputies are also allowed to run artist sketches through the search, an unusual use that AI experts said could more often lead to a false match.

Amazon’s guidelines for law enforcement say officials should use Rekognition’s results only when the system is 99% confident in a match. But deputies here are not shown that search-confidence measurement when they use the tool. Instead, they are given five possible matches for every search, even if the system’s certainty in a match is far lower.
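
For context on how that measurement is exposed, Rekognition’s face-search API returns a similarity score for each candidate match and accepts a minimum-similarity parameter; Amazon’s 99% guideline corresponds to setting that parameter to 99. Below is a minimal sketch using the AWS boto3 SDK; the collection name, region and photo path are hypothetical, not details of the Sheriff’s Office system.

```python
import boto3

# Minimal sketch of a Rekognition face search against a mug-shot collection.
# "county-mugshots", the region and the file path are placeholder values.
rekognition = boto3.client("rekognition", region_name="us-west-2")

with open("probe_photo.jpg", "rb") as f:
    probe_bytes = f.read()

# FaceMatchThreshold is the minimum similarity (0-100) a candidate must reach
# to be returned; 99 reflects Amazon's published law-enforcement guidance.
response = rekognition.search_faces_by_image(
    CollectionId="county-mugshots",
    Image={"Bytes": probe_bytes},
    FaceMatchThreshold=99,
    MaxFaces=5,
)

for match in response["FaceMatches"]:
    face = match["Face"]
    print(f"{face['ExternalImageId']}: similarity {match['Similarity']:.1f}%")
```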

After fielding questions from The Washington Post, Amazon added language to those guidelines, stating that officers should manually review all matches before detaining a suspect and that the search “shouldn’t be used as the sole determinant for taking action.”

The relationship between Amazon and Oregon’s third-largest law enforcement agency is mutually beneficial: The Sheriff’s Office is helping to refine the system, which Amazon hopes to sell across the country. But Amazon’s push into law-enforcement sales has alarmed some legal advocates who say the system poses too many risks to civil liberties. (Amazon founder and CEO Jeff Bezos owns The Post.)

“The government is incredibly powerful, and they bring a lot to bear against an individual citizen in a case,” said Mary Bruington, the director of the Washington County Public Defender’s Office, which represents defendants who can’t afford an attorney. “You couple that with Amazon? That’s a powerful partnership.”

Matt Wood, the general manager of artificial intelligence for the company’s cloud-computing division, Amazon Web Services, said in a statement that Rekognition is just “another input among many other leads for a 100 percent human-driven investigation.”

Still, the company faces criticism on many fronts: Top AI researchers, members of Congress and civil rights groups – as well as some of Amazon’s own investors and employees – have urged the company to stop providing the technology to law enforcement, pointing to studies that have found that the system is less accurate with dark-skinned faces. Amazon has disputed that research.

Some of Amazon’s rivals have spurned similar contracts. Microsoft President Brad Smith said in April that the company had recently declined to provide its facial-recognition software to a California law enforcement agency that wanted to run a face scan anytime its officers pulled someone over, but that it had approved a deal putting the technology in a U.S. prison. Microsoft declined to provide details.

Amazon investors will vote in May on a proposal, backed by a group of activist shareholders, that would prevent the company from selling Rekognition to government agencies unless the company’s board determines that it doesn’t pose a risk to human rights.

The Sheriff’s Office allowed Post journalists to spend two days in March in its squad cars, detectives’ offices and county jail, observing how deputies have folded the technology into their daily caseload. Most of those interviewed said the software had saved them time, boosted their arrest numbers and helped them process the growing glut of visual evidence. To date, no legal challenge has been made to an arrest on the grounds that the photo match was mistaken, both deputies and public defenders said.

But lawyers in Oregon argue that the technology should not be embraced as the inevitable next step for policing that many see it as, and they frame the system not as a technical milestone but as a moral one: Is it OK to nab more bad guys if more good guys might get arrested, too?

“People love to always say, ‘Hey, if it’s catching bad people, great, who cares,’ ” said Joshua Crowther, a chief deputy defender in Oregon, “until they’re on the other end.”

‘Indistinguishable from magic’

When Amazon revealed Rekognition in 2016, the company called it a breakthrough for a potent style of deep-learning artificial intelligence that showed results “indistinguishable from magic.” In a blog post illustrated with a photo of an executive’s dog, the company offered some general ideas for how people could begin using it, including for security checkpoints or billboards wired to gather data from a viewer’s face.

The unveiling caught the eye of Chris Adzima, a former eBay programmer who had been hired at the Washington County Sheriff’s Office to work on an iPhone app that deputies use to track inmates’ behavior. His agency had hundreds of thousands of facial photos already online and no real way to analyze them. Using Amazon’s AI, he got a system up and running in less than three weeks.
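
The story doesn’t detail Adzima’s pipeline, but in broad strokes such a system comes down to creating a Rekognition face collection and indexing each booking photo into it so it can later be searched. A minimal sketch of that indexing step, assuming the boto3 SDK and hypothetical names rather than the agency’s actual code, might look like this:

```python
import os
import boto3

# Sketch of bulk-indexing booking photos into a Rekognition face collection.
# The collection name, folder and ID scheme are placeholders.
rekognition = boto3.client("rekognition", region_name="us-west-2")

# Raises ResourceAlreadyExistsException if the collection already exists.
rekognition.create_collection(CollectionId="county-mugshots")

for filename in os.listdir("mugshots"):
    with open(os.path.join("mugshots", filename), "rb") as f:
        image_bytes = f.read()
    # ExternalImageId is how a search result gets tied back to a booking record.
    rekognition.index_faces(
        CollectionId="county-mugshots",
        Image={"Bytes": image_bytes},
        ExternalImageId=os.path.splitext(filename)[0],
        MaxFaces=1,  # index only the most prominent face in each photo
    )
```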

“They didn’t really have a firm idea of any type of use cases in the real world, but they knew that they had a powerful tool that they created,” said Adzima, a senior information systems analyst who works in a small cubicle at the sheriff’s headquarters. “So, you know, I just started using it.”

Deputies immediately began folding facial searches into their daily beat policing, and Adzima built a bare-bones internal website that let them search from their patrol cars. He dropped the search-confidence percentages and designed the system to return five results, every time: When the system returned zero results, he said, deputies wondered whether they’d messed something up. To spice it up, he also added an unnecessary purple “scanning” animation whenever a deputy uploaded a photo – a touch he said was inspired by cop shows like “CSI.”
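
In API terms, that design roughly amounts to requesting the top five candidates with the similarity floor set near zero and discarding the scores before display, the opposite of Amazon’s 99% guidance. A hedged sketch, again with placeholder names rather than the county’s actual code:

```python
import boto3

# Sketch of a "five candidates, no confidence shown" search, mirroring the
# internal tool described above. Names are placeholders.
rekognition = boto3.client("rekognition", region_name="us-west-2")

def top_five_candidates(photo_path):
    with open(photo_path, "rb") as f:
        probe_bytes = f.read()
    response = rekognition.search_faces_by_image(
        CollectionId="county-mugshots",
        Image={"Bytes": probe_bytes},
        MaxFaces=5,            # always ask for five candidates
        FaceMatchThreshold=1,  # effectively no similarity floor
    )
    # Return booking IDs only, dropping the per-match similarity scores.
    return [m["Face"]["ExternalImageId"] for m in response["FaceMatches"]]
```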

Read full story here…
