
Walmart Using AI-Powered Cameras In 1,000 Stores To Track Shoppers

Walmart has hired a social engineering company, Everseen, to control and change the behavior of its shoppers through the use of AI-based surveillance systems. By surveilling everyone, Walmart hopes to catch a few cheaters.

According to Everseen’s website, “process mining” “amplifies awareness of scenes unfolding daily in retail, pinpointing the ‘moments that matter’, in order to nudge a behavior one seeks to change, and/or transform the underlying process.” ⁃ TN Editor

 

Walmart is using computer vision technology to monitor checkouts and deter potential theft in more than 1,000 stores, the company confirmed to Business Insider.

The surveillance program, which Walmart refers to internally as Missed Scan Detection, uses cameras to help identify checkout scanning errors and failures.

The cameras track and analyze activities at both self-checkout registers and those manned by Walmart cashiers. When a potential issue arises, such as an item moving past a checkout scanner without getting scanned, the technology notifies checkout attendants so they can intervene.

The program is designed to reduce shrinkage, which is the term retailers use to define losses due to theft, scanning errors, fraud, and other causes.

US retailers lost an estimated 1.33% of revenues to shrinkage in 2017, totaling an estimated $47 billion, according to the National Retail Federation. If Walmart’s shrink rate matches the industry average, the company’s US business would have lost more than $4 billion last year to theft and other related losses.
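
The figures above are simple percentage arithmetic. As a quick sanity check, here is a back-of-the-envelope sketch (the ~$330 billion Walmart US revenue figure is an assumed round number for illustration; the article itself only implies a loss of more than $4 billion):

```python
# Back-of-the-envelope shrink estimate using the NRF figures quoted above.
NRF_SHRINK_RATE = 0.0133      # 1.33% of revenue lost to shrinkage (2017)
INDUSTRY_SHRINK = 47e9        # ~$47 billion industry-wide

# Assumed round figure for Walmart US annual revenue (illustrative only).
walmart_us_revenue = 330e9

implied_loss = walmart_us_revenue * NRF_SHRINK_RATE
print(f"Implied annual shrink: ${implied_loss / 1e9:.1f} billion")  # → $4.4 billion
```

Any plausible revenue figure in that range is consistent with the article’s claim of “more than $4 billion.”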

“Walmart is making a true investment to ensure the safety of our customers and associates,” Walmart spokeswoman LeMia Jenkins said. “Over the last three years, the company has invested over half a billion dollars in an effort to prevent, reduce and deter crime in our stores and parking lots. We are continuously investing in people, programs and technology to keep our stores and communities safe.”

Walmart began rolling out Missed Scan Detection technology to stores two years ago, and it appears to be working so far: shrink rates have declined at stores where it is deployed, Jenkins said.

Ireland-based Everseen is one of several companies supplying Walmart with the technology for the program.

“Everseen overcomes human limitations. By using state of the art artificial intelligence, computer vision systems, and big data, we can detect abnormal activity and other threats,” an Everseen video advertises. “Our digital eye has perfect vision and it never needs a coffee break or a day off.”





Amazon Cloud Will Host DHS Biometric Data On Hundreds Of Millions Of People

Homegrown Surveillance State: “The department seeks a new platform to identify people using fingerprints, irises and faces, and eventually DNA, palm prints, scars and tattoos.” ⁃ TN Editor

The Homeland Security Department is looking to upgrade the software it uses to analyze biometric data on hundreds of millions of people around the globe, and it plans to store that information in Amazon’s cloud.

The agency’s Office of Biometric Identification Management will replace its legacy biometric analysis platform, called the Automated Biometric Identification System, or IDENT, with a new, more robust system hosted by Amazon Web Services, according to a request for information released Monday.

IDENT essentially serves as an enterprisewide clearinghouse for troves of biometric and biographic data collected by the Transportation Security Administration, Customs and Border Protection, Secret Service and other Homeland Security components. The system links fingerprint, iris and face data to biographic information, allowing officials to quickly identify suspected terrorists, immigration violators, criminals and anyone else included in their databases.

In total, IDENT contains information on more than 250 million people, a Homeland Security spokesperson told Nextgov.

According to the solicitation, Homeland Security is in the process of replacing IDENT with the Homeland Advanced Recognition Technology System, or HART. The new system will include the same biometric recognition features as its predecessor, and potentially additional tools that could identify individuals based on DNA, palm prints, scars, physical markings and tattoos.

Whereas IDENT stores records in government-run data centers, the Homeland Security solicitation states “HART will reside in the Amazon Web Services (AWS) FedRAMP certified GovCloud.” Further, “biometric matching capabilities for fingerprint, iris, and facial matching will be integrated with HART in the Amazon Web Services GovCloud.” Amazon Web Services will also store HART’s biometric image data.

Amazon Web Services’ GovCloud US-East and US-West regions are data centers specifically built by the company to house some of the government’s most restricted information. AWS is no stranger to hosting sensitive government data, having already claimed the CIA, the Defense Department, NASA and other federal agencies as customers, in part because of perceived security improvements over government legacy systems.

When reached for comment, an AWS spokesperson referred inquiries to DHS.

In 2018, Northrop Grumman won a $95 million contract to develop the first two stages of the HART system, and its contract is set to expire in 2021. The department plans to use responses to the latest solicitation to inform its strategy for further developing the platform, the DHS spokesperson said.

Specifically, officials are asking vendors for ideas on how to build those multiple identification functions into the new system, while leaving room to add any new recognition “modalities” as they arise. Officials also want input on developing a handful of general reporting, analytics and search tools, as well as desktop and mobile web portals where Homeland Security employees can access the system.






Young Hong Kong Protesters Outwit The Surveillance State

Necessity is the mother of invention, as wary protesters dumped recognizable social media accounts for VPNs and end-to-end encrypted messaging apps. Basically, nobody trusts China anymore. ⁃ TN Editor

The moment the 25-year-old protester got home from demonstrations that turned violent – tear gas still stinging her eyes – she knew what she had to do: delete all of her Chinese phone apps.

WeChat was gone. So was Alipay and the shopping app Taobao. She then installed a virtual private network on her smartphone to use with the secure messaging app Telegram in an attempt to stay hidden from cyber-monitors.

“I’m just doing anything” to stay ahead of police surveillance and hide her identity, said the protester. She asked to be referred only by her first name, Alexa, to avoid drawing the attention of authorities amid the most serious groundswell against Chinese-directed rule in Hong Kong since 2014.

Protests that expanded over the past week against a bill allowing extraditions to mainland China were marked by something unprecedented: A coordinated effort by demonstrators to leave no trace for authorities and their enhanced tracking systems.

Protesters used only secure digital messaging apps such as Telegram, and otherwise went completely analogue in their movements: buying single ride subway tickets instead of prepaid stored value cards, forgoing credit cards and mobile payments in favor of cash, and taking no selfies or photos of the chaos.

They wore face masks to obscure themselves from CCTVs and in fear of facial recognition software, and bought fresh pay-as-you-go SIM cards.

And, unlike the pro-democracy movement in 2014, the latest demonstrations also have remained intentionally leaderless in another attempt to frustrate police, who have used tear gas and rubber bullets against the crowds.

On Saturday, Hong Kong chief executive Carrie Lam announced the postponement of the extradition bill, saying she hoped to return peace to the streets of the city. But the measure was not fully withdrawn and Lam still expressed support.

Protesters, meanwhile, have called for another major show of defiance on the streets on Sunday.

Amid the chaos, Hong Kong has offered a picture of what it looks like to stage mass civil disobedience in the age of the surveillance state.

“The Chinese government will do a lot of things to try to monitor their own people,” said Bonnie Leung, a leader of the Hong Kong-based Civil Human Rights Front.

Leung cited media coverage of Chinese use of artificial intelligence to track individuals and its social credit score system.

“We believe that could happen to Hong Kong, too,” she said.

The core of the protests is over the belief that Beijing – which was handed back control of the former British colony more than 20 years ago – is increasingly stripping Hong Kong of its cherished freedoms and autonomy.

But the identity-masking efforts by the protesters also reflect deep suspicions that lines between China and Hong Kong no longer exist – including close cooperation between Hong Kong police and their mainland counterparts, who operate some of the most advanced and intrusive surveillance systems.

“It is the fundamental reason people are protesting in the first place,” said Antony Dapiran, who wrote a book on protest culture in Hong Kong. “They don’t trust Beijing, they don’t trust their authorities and the legal system, and they don’t like the blurring of lines between Beijing and Hong Kong.”

For many who had taken to the streets over the past week, the fight was a familiar one.

In 2014, protesters occupied Hong Kong’s main arteries for 79 days demanding full universal suffrage in the territory. Prominent student leaders and activists marshaled support night after night in mini cities that had been set up on Hong Kong’s thoroughfares, until they were eventually cleared out by police.

Today, all of the most prominent leaders of that movement – Joshua Wong, only a teenager at the time of the protests, legal scholar Benny Tai and Chan Kin-man, a sociology professor – are in jail.

The masses gathered around government buildings this week were without clear leaders. Demonstrators shared protest tips and security measures with people they had met just hours before to avoid a similar fate. Meetups were primarily planned on Telegram, which became the top trending app on the iPhone app store in Hong Kong in the days leading up to the protest.

“Information on personal safety was passed around on Telegram channels and group chats,” said Caden, a 21-year-old Hong Kong student in Indiana who returned home early to participate. He was among an estimated 1 million people who marched on June 9 to begin the protest movement.

In the group chats, Caden received a barrage of advice, including changing your Telegram username so it sounds nothing like your actual name, changing the phone number associated with the app, and using SIM cards without a contract.

“We are much more cautious now for sure than in 2014. Back then, it was still kind of rare for the police to arrest people through social media,” Caden said, declining to give his full name for fear of retribution. “All of this is definitely new for most people there.”






Unholy Alliance: When DNA Meets Facial Recognition

Scientists believe that your DNA can be used to roughly predict your physical appearance, and vice versa, and they are already talking about using the technology in law enforcement to help solve crimes. ⁃ TN Editor

Technology that was once only seen in dystopian science fiction is rapidly infecting real life. Two means of collecting personal data – DNA databases and facial recognition software – are forming an unholy alliance, and the privacy implications could be devastating.

Privacy concerns surrounding DNA and facial recognition aren’t anything new. As the popularity of DNA genealogy websites like Ancestry DNA and 23andMe increases, so do questions over who has access to that data and how it will be used. The use of facial recognition and other biometric data technology is on the rise, and people are expressing concerns (and outrage) about that technology as well.

Genealogy sites have been making the news of late, mainly for concerns over how our personal data is used – and who has access to that information.

Recently, GEDmatch, which has more than 1 million genetic profiles in its database, decided to stop providing information to police without user permission. Last month, the site faced criticism when it allowed Utah police to use the database while investigating a violent assault. “Prior to the change, GEDmatch had allowed police to use its data only for rapes and homicides,” reports Bloomberg:

The site is now asking users whether they want to allow police to access their DNA information. Previously, law enforcement could search profiles in the database unless a customer explicitly opted out.

Law enforcement agencies have increasingly relied on GEDmatch after it was used to find a man suspected of killing at least 12 people and raping scores of women in California between 1976 and 1986. A distant relative of the alleged Golden State Killer had uploaded DNA into GEDmatch. The genetic profile partially matched crime-scene evidence, indicating the person was related to the anonymous suspect who had eluded capture for four decades. From there, investigators built out that person’s family tree, which led to the arrest last year of a 72-year-old former police officer. (source)

Engineers are figuring out how to match DNA to faces in a database.

Now, DNA and facial recognition have merged, a development that is sure to ignite new concerns over privacy and how our most personal data is collected and used.

A team of engineers just built an algorithm that scans through a database of images and selects the faces that could be linked to the DNA found at a crime scene. Until now, predicting what someone’s face looks like based on a DNA sample has eluded scientists. But it is getting easier for them to use samples to filter the right face from a database, as an international team led by researchers at KU Leuven has shown. Their findings were published in Nature Communications.

Our physical appearance (including our facial details) is hardwired into our genetic material. Scientists have already identified multiple genes that determine the shape of our face – from the distance between our nostrils to the shape of our chin. Some of that research was also conducted by a team at KU Leuven. Back in 2018, they identified fifteen genes that determine our facial features. In a related press release, the researchers explained the technique they used:

In a new study conducted by KU Leuven in collaboration with the universities of Pittsburgh, Stanford and Penn State, the researchers adopted a different approach. “Our search doesn’t focus on specific traits,” lead author Peter Claes (KU Leuven) explains. “My colleagues from Pittsburgh and Penn State each provided a database with 3D images of faces and the corresponding DNA of these people. Each face was automatically subdivided into smaller modules. Next, we examined whether any locations in the DNA matched these modules. This modular division technique made it possible for the first time to check for an unprecedented number of facial features.” (source)

As of now, the ability to use that information to find a match between a DNA sample from a crime scene and someone in a DNA database does not exist, but it appears scientists are getting closer to making that a possibility.

In a press release, Claes, an electrotechnical engineer at KU Leuven and the senior and corresponding author of the new study, explained why that remains a challenge:

“We believe that the shape of our face is determined by thousands of genes, but also by the food we eat and other living conditions. Therefore, it is unlikely that we will ever be able to accurately predict a lifelike face from DNA alone.”

This technology could be used to solve crimes…or for less honorable purposes.

For forensic analysis and other applications, the ideal scenario would be to find a match between a DNA sample from a crime scene and someone in a DNA database. If the DNA doesn’t match anyone in the database, however, it could still be used to predict what the perpetrator’s face looks like and make a sketch. This sketch could then be compared to a database with the faces of known criminals, for example, Claes said:

“This method mostly helps rule people out. In practice, we don’t usually get any further than a sort of reference face, such as ‘a European male’. That’s not much use to a forensic investigator. Unfortunately, while we are learning about more and more genes that determine certain aspects of our face, this does not yet sufficiently translate to a better match between the predicted face and the faces in the database.”

To overcome this obstacle, the researchers developed a “reverse approach”, as explained in the press release:

“Instead of going from DNA to face, we’re trying to go from face to DNA. Using special software, we measure each face and check if this face is a possible outcome based on a unique bit of DNA.”

“It then becomes a game of ‘guess who’. If the face is male and the DNA says it is a woman, all the men are eliminated. If the hair is blond and the genetic material confirms this, that eliminates all other hair colours.”

“The more genes we identify, the more accurate this method becomes, and it will only continue to improve as our knowledge of the relevant genes grows.”

So what if the face you’re looking for is not in the database? “Then you will at least have a list of faces that look a lot like the one you are looking for. Instead of a single reference face, such as ‘Asian female’, you will have a list of Asian females who are most similar to the person you want to find.” (source)
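
The “guess who” process Claes describes is, at its core, trait-by-trait elimination: each trait predicted from the DNA removes every candidate face that contradicts it. A minimal sketch with hypothetical traits and candidates (not the KU Leuven software):

```python
# Trait-by-trait elimination, as in Claes's "guess who" analogy.
# All names and traits below are hypothetical illustration data.
candidates = [
    {"name": "A", "sex": "male",   "hair": "blond"},
    {"name": "B", "sex": "female", "hair": "brown"},
    {"name": "C", "sex": "female", "hair": "blond"},
]

dna_prediction = {"sex": "female", "hair": "blond"}  # inferred from the sample

def eliminate(candidates, prediction):
    """Keep only candidates consistent with every predicted trait."""
    return [c for c in candidates
            if all(c.get(trait) == value for trait, value in prediction.items())]

survivors = eliminate(candidates, dna_prediction)
print([c["name"] for c in survivors])  # → ['C']
```

As the researchers note, each additional reliably predicted gene shrinks this survivor list further.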

If your face isn’t in a database yet, it likely will be soon.

For this technology to be useful, large face databases will be needed, such as ID card or driver’s license databases. Obviously, privacy issues are a huge concern. Claes acknowledges this fact. “Working with databases that contain private information such as DNA or faces requires strict supervision to prevent misuse,” he said.

By 2021, 97% of travelers will have their faces scanned. In some stores, “smart” refrigerators “analyze” your face and track your irises (but don’t worry – the company says they aren’t using true facial recognition). Amazon is developing software called Rekognition that can identify anything it’s looking at – including your face. Surveillance systems are insidious, sneaking into our lives at every turn.





Human Rights Lawyer: Facial Recognition Is Arsenic In The Water Of Democracy

One critic of Social Engineering and Digital Slavery, aka Technocracy, suggests that “Mass surveillance has a chilling effect that distorts public behaviour.” Indeed, the world has little time left to reverse the trend. ⁃ TN Editor

Automated facial recognition poses one of the greatest threats to individual freedom and should be banned from use in public spaces, according to the director of the campaign group Liberty.

Martha Spurrier, a human rights lawyer, said the technology had such fundamental problems that, despite police enthusiasm for the equipment, its use on the streets should not be permitted.

She said: “I don’t think it should ever be used. It is one of, if not the, greatest threats to individual freedom, partly because of the intimacy of the information it takes and hands to the state without your consent, and without even your knowledge, and partly because you don’t know what is done with that information.”

Police in England and Wales have used automated facial recognition (AFR) to scan crowds for suspected criminals in trials in city centres, at music festivals, sports events and elsewhere. The events, from a Remembrance Sunday commemoration at the Cenotaph to the Notting Hill carnival and the Six Nations rugby, drew combined crowds in the millions.

San Francisco recently became the first US city to ban police and other agencies from using automated facial recognition, following widespread condemnation of China’s use of the technology to impose control over millions of Uighur Muslims in the western region of Xinjiang.

When deployed in public spaces, automated facial recognition units use a camera to record faces in a crowd. The images are then processed to create a biometric map of each person’s face, based on measurements of the distance between their eyes, nose, mouth and jaw. Each map is then checked against a “watchlist” containing the facial maps of suspected criminals.
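
The pipeline described above, a biometric “map” of landmark distances compared against a watchlist, can be sketched in a few lines. This is a toy illustration of the idea only; production systems use learned embeddings and calibrated thresholds rather than raw distances, and all coordinates below are made up:

```python
import math
from itertools import combinations

def face_map(landmarks):
    """Toy biometric map: pairwise distances between facial landmarks."""
    pts = list(landmarks.values())
    return [math.dist(p, q) for p, q in combinations(pts, 2)]

def gap(map_a, map_b):
    """Sum of absolute differences between two maps; lower means more similar."""
    return sum(abs(a - b) for a, b in zip(map_a, map_b))

# Hypothetical 2D landmark coordinates for the face seen by the camera.
probe = face_map({"l_eye": (30, 40), "r_eye": (70, 40),
                  "nose": (50, 60), "mouth": (50, 80), "jaw": (50, 100)})

# Hypothetical watchlist of pre-computed maps.
watchlist = {
    "suspect_1": face_map({"l_eye": (31, 41), "r_eye": (69, 40),
                           "nose": (50, 61), "mouth": (50, 79), "jaw": (50, 101)}),
    "suspect_2": face_map({"l_eye": (25, 35), "r_eye": (80, 35),
                           "nose": (52, 65), "mouth": (52, 90), "jaw": (52, 115)}),
}

best_match = min(watchlist, key=lambda name: gap(probe, watchlist[name]))
print(best_match)  # → suspect_1
```

In a real deployment a “match” would also have to clear a similarity threshold before an alert is raised, rather than simply being the nearest watchlist entry.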

Spurrier said: “I think it’s pretty salutary that the world capital of technology has just banned this technology. We should sit up and listen when San Francisco decides that they don’t want this on their streets.

“It goes far above and beyond what we already have, such as CCTV and stop-and-search. It takes us into uncharted invasive state surveillance territory where everyone is under surveillance. By its nature it is a mass surveillance tool.”

She said a lack of strong governance and oversight could allow the police to roll out live facial recognition by stealth, without a meaningful debate on whether the public wanted it or not. The technology was developing so fast, she said, that government was failing to keep up.

“There is a real sense of technological determinism that is often pushed by the big corporations, but also by law enforcement and by government, that it’s inevitable we’ll have this, so we should stop talking about why we shouldn’t have it,” she said.






Interest In Orwell’s Classic ‘1984’ Still Endures After 70 Years

Words and phrases from Orwell’s 1984 abound in American culture. Although many are worried after having read the prophetic dystopian novel, few are alarmed enough to do anything to stop it in modern society. ⁃ TN Editor

Seventy years after its publication, George Orwell’s classic dystopian novel “1984” continues to fascinate readers, in particular youngsters growing up in a social media-dominated age of increasing angst.

“Some students are very shocked by it, and remain shocked by it,” said Michael Callanan, an English teacher and director of the Orwell Youth Prize, which supports political expression amongst young people.

“It is part of the paradox of a book being 70-years-old,” he added.

“I think they were taken aback by how fresh and how true to our lives today it strangely is.”

Written in 1948, and published the following year, “1984” depicted a chilling future world in which a totalitarian state controls people’s thoughts and actions, suppressing any dissent.

This rigidly controlled society features a so-called “ministry of truth” that distorts reality, with the ever-watchful eyes of “Big Brother” keeping tabs on citizens’ behaviour.

The novel introduced other terms that have endured in the lexicon, including “double-think”, which means “the power of holding two contradictory beliefs in one’s mind simultaneously, and accepting both of them”, according to Orwell.

For Jean Seaton, director of the Orwell Foundation, which promotes the work of the writer who died in 1950 aged 46, and administers various awards, his masterpiece was “very prescient”.

She noted the book’s description of “two minutes hate” — in which citizens watch a daily film inciting them to hate enemies of the state — as comparable to online hate-mobs today.

Over the seven decades since its publication, “1984” has remained omnipresent on the social landscape, and even seen periodic spikes in sales.

In 2017 it enjoyed a boom in popularity after one of US President Donald Trump’s advisors used the term “alternative facts”, a phrase widely likened to the novel’s “newspeak”.

In Britain, sales that year increased by an eye-popping 165 percent, publisher Penguin Books told AFP.

The novel also saw a marked increase in purchases in 2013, after the revelations of mass state spying by US whistleblower Edward Snowden.

Callanan, who has been teaching for 30 years, said today’s teenagers are not “frightened of it in a way that maybe previous generations were”.

“But in the past couple of years with the rise of Trump, there is now significant minorities of students that are very concerned about the way the world is going — and of course truth is the big thing,” he added.






House Oversight Committee: Facial Recognition ‘Not Ready For Primetime’

Technocrats in law enforcement obscure the scope of their unconstitutional and possibly illegal activities. An epic battle line is forming between Congress and surveillance hawks. ⁃ TN Editor

The FBI and Transportation Security Administration (TSA) defended their use of controversial facial recognition technology in a hearing before the House Oversight Committee on Tuesday.

Meanwhile, some elected leaders blasted it as “not ready for primetime,” calling for everything from an outright ban of its use to federal legislation regulating it.

The blowback was bipartisan, with Committee Chairman Elijah Cummings, D-MD, pledging to hold more hearings. “I’m a little freaked out by facial recognition,” Rep. Rashida Tlaib, D-MI, said. In his opening statement, Ranking Member Jim Jordan, R-OH, said, “Today is a subject matter where we have a lot of common ground and a lot of agreement.”

Facial recognition tech is under fire from cities and representatives in Congress despite polls showing few Americans want to restrict it.

The San Francisco Board of Supervisors voted last month to ban city departments from using the technology, though it does not restrict private companies’ or citizens’ use of it. Politico reported that ban will also not apply to San Francisco International Airport’s planned deployment of facial recognition cameras due to a loophole in the law that excludes Customs and Border Protection (CBP).

At the federal level, there is uncertainty surrounding the technology’s further use, especially for law enforcement. During her testimony, Kimberly Del Greco, deputy assistant director of the FBI’s Criminal Justice Information Services (CJIS), echoed the views of city police departments who say the technology has useful applications including tracking down suspects more quickly.

The FBI’s CJIS program allows law enforcement agencies at all levels that have signed a memorandum of understanding (MOU) to search its database of criminal mugshots and find a match using facial recognition technology, Del Greco said. The database can return a gallery of between two and 50 “candidate” photos, which are manually reviewed for a match. The MOUs to cooperate with the FBI were signed before the conception of facial recognition, Del Greco said, which some committee members scoffed at.

While the FBI said it only has access to criminal mugshots, Gretta Goodwin, director of justice and law enforcement issues on the Homeland Security and Justice Team at the U.S. Government Accountability Office (GAO), disputed that. The FBI can access 640 million photos from various civil repositories, including state Department of Motor Vehicles (DMV) databases, she said.

Goodwin and some elected leaders took the FBI to task for a report the GAO issued, which concluded that the “FBI has limited information on the accuracy of its face recognition technology capabilities.” To remedy that, the GAO said the FBI must take extra steps to ensure privacy and accuracy, although of the six steps, only one has been completed due to legal questions from the U.S. Department of Justice (DOJ).

Rep. Carolyn Maloney, D-NY, said more needs to be done to show the technology works, especially given the civil liberty concerns.

“You don’t have answers on how it’s working, how it was set up, what’s coming out of it, whether it’s hurting people, helping people,” Maloney said. “You don’t even have information on whether it’s aiding law enforcement in their goal for hunting down terrorists. We need more accountability.”






GAO: FBI Has Access To 640 Million Photographs

Technocrats are addicted to data and currently there are almost no regulations preventing law enforcement from using police-state tools. Where is Congress in all this? Apparently, sound asleep. ⁃ TN Editor

A government watchdog says the FBI has access to about 640 million photographs — including from driver’s licenses, passports and mugshots — that can be searched using facial recognition technology.

The figure reflects how the technology is becoming an increasingly powerful law enforcement tool, but is also stirring fears about the potential for authorities to intrude on the lives of Americans. It was reported by the Government Accountability Office at a congressional hearing in which both Democrats and Republicans raised questions about the use of the technology.

The FBI maintains a database of mugshots, known as the Interstate Photo System, that can help federal, state and local law enforcement officials. It contains about 36 million photographs, according to Gretta Goodwin of the GAO.

But taking into account the bureau’s contracts providing access to driver’s licenses in 21 states, and its use of photos from other databases, the FBI has access to about 640 million photographs, Goodwin told lawmakers at the House Oversight Committee hearing.

Kimberly Del Greco, a deputy assistant director at the FBI, said the bureau has strict policies for using facial recognition. She said it is used only when there is an active FBI investigation or an assessment, which can precede a formal investigation. When using the state databases, the FBI submits a so-called “probe photo” and then states conduct a search to yield a list of potential candidates to be reviewed by trained federal agents.





Customs Says Facial Recognition Use ‘Is Not A Surveillance Program’

Customs and Border Protection (CBP) has been using facial recognition systems in airports and at the border, but insists that it isn’t a surveillance program. Legislators are finally starting to wake up to the fact that, yes, it IS a surveillance program! ⁃ TN Editor

Lawmakers and civil liberties advocates might be pressing law enforcement agencies to scale back their use of facial recognition software, but international travelers should only expect to see more of the tech in the years ahead.

It’s been almost two years since Customs and Border Protection began deploying facial recognition systems at U.S. airports, and despite the recent backlash against the software, the agency’s efforts show no signs of slowing down. But if you ask Deputy Executive Assistant Commissioner John Wagner, the agency’s use of facial recognition falls far short of the dystopian panopticon feared by many of the tech’s critics.

“This is not a surveillance program,” Wagner, who heads CBP’s biometric entry and exit initiative, said in a conversation with Nextgov. “We are not just hanging a camera in an airport and randomly identifying people … as they’re walking through.”

Under Wagner’s program, CBP agents use facial recognition to compare real-time images of international travelers to the photos on their passports or visas. For arrivals, people have their faces scanned while officers review their travel documents, and for departures, the tech captures images right at the boarding gate.

Today, the tech is deployed in some capacity at 16 airports across the U.S., and by 2021, CBP expects to scale up the program to cover more than 97 percent of the people flying out of the country. Ultimately, officials anticipate biometrics could render physical boarding passes obsolete.

The system is intended to help agents keep better tabs on who is entering and leaving the country. Instead of relying on traditional flight logs and manual document inspections to monitor international traffic, agents can now use the tech to verify that passengers are who they claim to be with more than 98 percent accuracy in a matter of seconds, Wagner said. The agency is currently testing facial recognition at three checkpoints in Arizona to identify people crossing the U.S.-Mexico border.

And officials are already seeing the program bear fruit. Since August, agents have intercepted six individuals trying to illegally enter the U.S. through airports and another 125 “imposters” along the southern border, a CBP spokesperson told Nextgov. Wagner said the agency also identified more than 14,000 people who left the country after overstaying their visas, a violation that could prevent them from returning to the U.S. for up to a decade. The program has been so successful that the Partnership for Public Service nominated Wagner for one of its annual Service to America awards.

Outside the Homeland Security Department, however, its reception has been mixed. The program came under fire last week on Capitol Hill as lawmakers and legal experts bashed law enforcement agencies for their often dubious use of facial recognition. During the hearing, Neema Singh Guliani, a senior legislative counsel at the ACLU, said she had “lots of questions and concerns” about CBP expanding the use of biometrics beyond airport terminals.

But compared to the sweeping and often covert applications of facial recognition by the FBI and other agencies, Wagner sees CBP’s operations as pretty tame.

People are always aware their picture is being taken, and U.S. citizens have the ability to opt out of face scans, at least for the departure process, he said. The tech is also only used in sections of the airport where people would already need to show identification, he said, and the image itself is only compared to passport and visa photos that are already in the government’s possession.

“The biometric really becomes as simple as validating the information we’ve already received,” Wagner said. “There’s no new information we’re requiring of a person other than taking their photograph and comparing it to a photograph they’ve already given us.”

Instead of running images against a single trove of government IDs, the agency compares them to custom databases created for each individual flight, which significantly reduces the risk of misidentification, Wagner said. Those new airport photos are also deleted from CBP’s systems in less than a day, he added.
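The per-flight design Wagner describes — matching a boarding-gate photo against only the passport photos of that flight's manifest rather than a nationwide database — can be sketched as below. The embeddings, the cosine-similarity comparison, and the 0.8 threshold are all illustrative assumptions; CBP's actual matching algorithm is not public.

```python
import math

def cosine(a, b):
    """Cosine similarity between two face-embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def match_passenger(probe, flight_gallery, threshold=0.8):
    """Compare a probe face embedding against the small per-flight
    gallery of document photos; return the best-matching traveler ID
    above the threshold, or None if no one on the manifest matches."""
    best_id, best_score = None, threshold
    for traveler_id, ref in flight_gallery.items():
        score = cosine(probe, ref)
        if score > best_score:
            best_id, best_score = traveler_id, score
    return best_id
```

Keeping the gallery small (one flight's worth of travelers instead of millions of IDs) is what drives down the false-match risk: the fewer candidates a probe is compared against, the fewer chances there are for a lookalike to score above the threshold.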

Read full story here…




Amazon And Facebook Ready To Read Your Emotions

The latest software land rush goes beyond simple face recognition to reading your emotions. This will have thousands of ready applications such as pre-crime analysis and reporting. Of course, just as people misread emotions on a regular basis, so will the algorithm. ⁃ TN Editor

Facebook and Amazon’s insanity only seems to continue with no sign of slowing down anytime soon. Now, the two giants want to move into the uncharted territory of reading human emotions, each in its own way.

Facebook wants “emotionally sensitive” robots with five senses that can read human emotions, explore the world, identify objects and people, and enable its users to make more friends, the Daily Mail reported.

The robots would be fitted with wheels or tank-like caterpillar treads that would allow them to trundle about their environment.

Alternatively, such robots could be fitted with drive systems that would allow them to move around underwater, fly through the air, or float in space, Facebook suggests in its patent.

I am not sure why anyone would trust Facebook with data ever again, let alone biometric data, after all the numerous scandals Activist Post has documented including data mining. But to each their own I guess.

Amazon is also looking into reading human emotions in a completely different way: a voice-activated wearable device that will sense its wearer’s state of mind from the tone of their voice, Bloomberg reported.

It’s worth noting that both companies have a smart home device, and after reading this you should fear what information is being gathered by the cameras and microphones attached to those electronics … besides the typical targeted advertising that turns consumers into the product.

On the Amazon front, it seems more than likely the company will want to use this technology in a variety of digital gadgets, ranging from personal assistants such as Alexa to new technologies the retail giant is currently developing. Amazon has announced it’s developing a personal assistance robot, so the new emotional technology could easily be integrated into this at-home robot as a means to “serve the consumer better.” A horrifically terrifying thought indeed.

Amazon and Facebook aren’t the only companies looking into utilizing human emotions. Previously, Activist Post reported that Walmart was also looking into monitoring your biometric data, pulse, and location from sensors on a shopping cart handle.

This news comes as hundreds of retail stores — and soon thousands — are investigating using biometric facial recognition software FaceFirst to build a database of shoplifters to aid in the fight against theft, Activist Post reported.

FaceFirst is designed to scan faces as far as 50 to 100 feet away. As customers walk through a store entrance, the video camera captures repeated images of each shopper and chooses the clearest one to store. The software then analyzes that image and compares it to a database of “bad customers” that the retailer has compiled; if there is a match, the software sends an alert to store employees that a “high risk” customer has entered the door.
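The pipeline described above — capture repeated frames, keep the clearest one, then compare it against the retailer's watchlist — can be sketched as follows. Everything here is a hypothetical stand-in for FaceFirst's proprietary system: the `sharpness` score, the pluggable `match_fn` comparison, and the watchlist structure are all assumptions for illustration.

```python
def sharpness(frame):
    """Hypothetical clarity score: variance of pixel intensities,
    a crude stand-in for a blur metric such as Laplacian variance."""
    mean = sum(frame) / len(frame)
    return sum((p - mean) ** 2 for p in frame) / len(frame)

def alert_if_flagged(frames, watchlist, match_fn, threshold=0.9):
    """Pick the clearest of the repeated captures, then compare it
    against each watchlist entry; return the matched entry's ID
    (where staff would be alerted), or None for ordinary shoppers."""
    best = max(frames, key=sharpness)
    for entry_id, ref in watchlist.items():
        if match_fn(best, ref) >= threshold:
            return entry_id
    return None
```

Selecting only the sharpest frame before matching is the step that lets the system work at 50 to 100 feet: most captures at that distance are blurry, and comparing a blurry face against the database would inflate both misses and false alarms.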

The future of shopping seems to have biometric scanners written all over it, a worrying prospect for privacy advocates.

Several privacy advocate groups, attorneys, and even recently Microsoft, which also markets its own facial recognition system, have all raised concerns over the technology, pointing to issues of consent, racial profiling, and the potential to use images gathered through facial recognition cameras as evidence of criminal guilt by law enforcement.

Read full story here…