Customs Says Facial Recognition Use ‘Is Not A Surveillance Program’

Customs and Border Protection (CBP) has been using facial recognition systems in airports and at the border, but insists that it isn’t a surveillance program. Legislators are finally starting to wake up to the fact that, Yes, it IS a surveillance program!  ⁃ TN Editor

Lawmakers and civil liberties advocates might be pressing law enforcement agencies to scale back their use of facial recognition software, but international travelers should only expect to see more of the tech in the years ahead.

It’s been almost two years since Customs and Border Protection began deploying facial recognition systems at U.S. airports, and despite the recent backlash against the software, the agency’s efforts show no signs of slowing down. But if you ask Deputy Executive Assistant Commissioner John Wagner, the agency’s use of facial recognition falls far short of the dystopian panopticon feared by many of the tech’s critics.

“This is not a surveillance program,” Wagner, who heads CBP’s biometric entry and exit initiative, said in a conversation with Nextgov. “We are not just hanging a camera in an airport and randomly identifying people … as they’re walking through.”

Under Wagner’s program, CBP agents use facial recognition to compare real-time images of international travelers to the photos on their passports or visas. For arrivals, people have their faces scanned while officers review their travel documents, and for departures, the tech captures images right at the boarding gate.

Today, the tech is deployed in some capacity at 16 airports across the U.S., and by 2021, CBP expects to scale up the program to cover more than 97 percent of the people flying out of the country. Ultimately, officials anticipate biometrics could render physical boarding passes obsolete.

The system is intended to help agents keep better tabs on who is entering and leaving the country. Instead of relying on traditional flight logs and manual document inspections to monitor international traffic, agents can now use the tech to verify that passengers are who they claim to be with more than 98 percent accuracy in a matter of seconds, Wagner said. The agency is currently testing facial recognition at three checkpoints in Arizona to identify people crossing the U.S.-Mexico border.

And officials are already seeing the program bear fruit. Since August, agents have intercepted six individuals trying to illegally enter the U.S. through airports and another 125 “imposters” along the southern border, a CBP spokesperson told Nextgov. Wagner said the agency also identified more than 14,000 people who left the country after overstaying their visas, a violation that could prevent them from returning to the U.S. for up to a decade. The program has been so successful that the Partnership for Public Service nominated Wagner for one of its annual Service to America awards.

Outside the Homeland Security Department, however, its reception has been mixed. The program came under fire last week on Capitol Hill as lawmakers and legal experts bashed law enforcement agencies for their often dubious use of facial recognition. During the hearing, Neema Singh Guliani, a senior legislative counsel at the ACLU, said she had “lots of questions and concerns” about CBP expanding the use of biometrics beyond airport terminals.

But compared to the sweeping and often covert applications of facial recognition by the FBI and other agencies, Wagner sees CBP’s operations as pretty tame.

People are always aware their picture is being taken, and U.S. citizens have the ability to opt out of face scans, at least for the departure process, he said. The tech is also used only in sections of the airport where people would already need to show identification, he said, and the image itself is compared only to passport and visa photos that are already in the government’s possession.

“The biometric really becomes as simple as validating the information we’ve already received,” Wagner said. “There’s no new information we’re requiring of a person other than taking their photograph and comparing it to a photograph they’ve already given us.”

Instead of running images against a single trove of government IDs, the agency compares them to custom databases created for each individual flight, which significantly reduces the risk of misidentification, Wagner said. Those new airport photos are also deleted from CBP’s systems in less than a day, he added.
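
The per-flight matching Wagner describes, checking a single live capture against the small set of passport and visa photos tied to one flight manifest, can be pictured with a short sketch. The Python below is a hypothetical illustration, not CBP’s actual system: the embedding step, the cosine-similarity metric, the threshold value, and the data structures are all assumptions made for clarity.

    import numpy as np

    SIMILARITY_THRESHOLD = 0.85  # assumed value; real deployments tune this

    def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
        """Cosine similarity between two face-embedding vectors."""
        return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

    def match_against_flight_gallery(live_embedding: np.ndarray, flight_gallery: dict):
        """Compare a live capture to the small gallery built for one flight.

        flight_gallery maps traveler document IDs to embeddings derived from
        passport or visa photos already on file. Returns the best-matching ID
        if it clears the threshold, otherwise None (prompting a manual check).
        """
        best_id, best_score = None, -1.0
        for doc_id, ref_embedding in flight_gallery.items():
            score = cosine_similarity(live_embedding, ref_embedding)
            if score > best_score:
                best_id, best_score = doc_id, score
        return best_id if best_score >= SIMILARITY_THRESHOLD else None

Searching a gallery of a few hundred manifested travelers rather than millions of ID photos is what keeps false matches down, which is the point Wagner is making.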

Read full story here…




Amazon And Facebook Ready To Read Your Emotions

The latest software land rush goes beyond simple face recognition to reading your emotions. This will have thousands of ready applications such as pre-crime analysis and reporting. Of course, just as people misread emotions on a regular basis, so will the algorithm. ⁃ TN Editor

Facebook and Amazon’s insanity only seems to continue, with no sign of slowing down anytime soon. Now, the two tech giants want to move into the uncharted territory of reading human emotions, each in its own way.

Facebook wants “emotionally sensitive” robots with five senses that can read human emotions, explore the world, identify objects and people, and help its users make more friends, Dailymail reported.

The robots would be fitted with wheels or tank-like caterpillar treads that would allow them to trundle about their environment.

Alternatively, such robots could be fitted out with drive systems that would allow them to move around underwater, fly through the air or float in space, Facebook suggests in its patent.

I am not sure why anyone would trust Facebook with data ever again, let alone biometric data, after all the numerous scandals Activist Post has documented, including data mining. But to each their own, I guess.

Amazon is also looking into reading human emotions in a completely different way: by using a voice-activated wearable device that will sense its wearer’s state of mind from the tone of their voice, Bloomberg reported.

It’s worth noting that both companies have a smart home device, and after reading this you should fear what information is being gathered by the cameras and microphones attached to those electronics … beyond the typical targeted advertising that turns consumers into the product.

On the Amazon front, it seems more than likely the company will want to use this technology in a variety of digital gadgets, ranging from personal assistants such as Alexa to new technologies that the retail giant is currently developing. Amazon has announced it’s developing a personal assistant robot, so the new emotion-reading technology could easily be integrated into this at-home robot as a means to “serve the consumer better.” A horrifically terrifying thought indeed.

Amazon and Facebook aren’t the only companies looking into utilizing human emotions. Previously, Activist Post reported that Walmart was also looking into monitoring your biometric data, pulse, and location from the sensors on a shopping cart handle.

This news comes as hundreds of retail stores — and soon thousands — are investigating the use of the biometric facial recognition software FaceFirst to build a database of shoplifters and aid in the fight against theft, Activist Post reported.

FaceFirst is designed to scan faces from as far as 50 to 100 feet away. As customers walk through a store entrance, the video camera captures repeated images of each shopper and chooses the clearest one to store. The software then analyzes that image and compares it to a database of “bad customers” that the retailer has compiled; if there is a match, the software sends an alert to store employees that a “high risk” customer has entered the door.
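
The pipeline described here (capture several frames of a shopper, keep the sharpest one, compare it against a stored watchlist, raise an alert on a match) can be sketched roughly as follows. The Python below is a hypothetical illustration, not FaceFirst’s code: the variance-of-Laplacian sharpness measure, the embedding function, and the distance threshold are all assumptions.

    import cv2
    import numpy as np

    def sharpness(frame: np.ndarray) -> float:
        """Variance of the Laplacian, a common proxy for image sharpness."""
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        return float(cv2.Laplacian(gray, cv2.CV_64F).var())

    def pick_clearest(frames: list) -> np.ndarray:
        """Keep the sharpest of several captures of the same shopper."""
        return max(frames, key=sharpness)

    def alert_if_watchlisted(frames, embed, watchlist, threshold=0.6):
        """Compare the clearest capture to a retailer's 'bad customer' list.

        `embed` turns an image into an embedding vector (assumed provided);
        `watchlist` maps person IDs to stored embeddings. Returns a matched
        ID so staff can be alerted, or None if no one on the list matches.
        """
        query = embed(pick_clearest(frames))
        for person_id, ref_embedding in watchlist.items():
            if np.linalg.norm(query - ref_embedding) < threshold:
                return person_id  # would trigger the "high risk" alert
        return None

Under these assumptions, everything hinges on the distance threshold: set it loosely and ordinary shoppers get flagged, set it tightly and the watchlist rarely fires.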

The future of shopping seems to have biometric scanners written all over it, a worrying prospect for privacy advocates.

Several privacy advocate groups, attorneys, and even recently Microsoft, which also markets its own facial recognition system, have all raised concerns over the technology, pointing to issues of consent, racial profiling, and the potential to use images gathered through facial recognition cameras as evidence of criminal guilt by law enforcement.

Read full story here…