FBI Quietly Changes Privacy Rules For Accessing NSA Data Collected On Americans

TN Note: Technocrats always pursue knowing everything about everything and everybody, so a goldmine of data at the NSA is like a candy store to the FBI. No matter that legislators have tried to clamp down on privacy violations, these agencies have simply gone ahead and done whatever they want. Remember also that the FBI and NSA answer directly to the Director of National Intelligence, James Clapper. In March 2013, Clapper told a Congressional committee that the NSA did not collect any type of data at all on millions of Americans – he was subsequently described as having lied under oath, obstructed justice and given false testimony. Two Representatives accused him of perjury and 26 Senators complained about his testimony.

The FBI has quietly revised its privacy rules for searching data involving Americans’ international communications that was collected by the National Security Agency, US officials have confirmed to the Guardian.

The classified revisions were accepted by the secret US court that governs surveillance, during its annual recertification of the agencies’ broad surveillance powers. The new rules affect a set of powers colloquially known as Section 702, the portion of the law that authorizes the NSA’s sweeping “Prism” program to collect internet data. Section 702 falls under the Foreign Intelligence Surveillance Act (Fisa) and is set to expire later this year.

A government civil liberties watchdog, the Privacy and Civil Liberties Oversight Board (PCLOB), alluded to the change in its recent overview of ongoing surveillance practices.

The watchdog confirmed in a 2014 report that the FBI is allowed direct access to the NSA’s massive collections of international emails, texts and phone calls – which often include Americans on one end of the conversation. The activists also expressed concern that the FBI’s “minimization” rules, for removing or limiting sensitive data that could identify Americans, did not reflect the bureau’s easy access to the NSA’s collected international communications.

FBI officials can search through the data, using Americans’ identifying information, for what PCLOB called “routine” queries unrelated to national security. The oversight group recommended more safeguards around “the FBI’s use and dissemination of Section 702 data in connection with non-foreign intelligence criminal matters”.

As of 2014, the FBI was not even required to make note of when it searched the metadata, which includes the “to” or “from” lines of an email. Nor did it record how many of its data searches involved Americans’ identifying details – a practice that apparently continued through 2015, based on documents released last February. The PCLOB called such searches “substantial”, since the FBI keeps NSA-collected data with the information it acquires through more traditional means, such as individualized warrants.

But the PCLOB’s new compliance report, released on Saturday, found that the administration has submitted “revised FBI minimization procedures” that address at least some of the group’s concerns about “many” FBI agents who use NSA-gathered data.

Read full story here…




We’ll Bust You For The Crime You Have Not Yet Committed, But Probably Will

TN Note: This whole pre-crime mentality that is sweeping the law enforcement community is very dangerous. Believing that they can accurately predict the future is an age-old deception, long practiced with Ouija boards, fortune tellers, astrology and the like. In short, it is mysticism at its base.

Computers are getting pretty good at predicting the future. In many cases they do it better than people. That’s why Amazon uses them to figure out what you’re likely to buy, how Netflix knows what you might want to watch, and how meteorologists come up with accurate 10-day forecasts.

Now a team of scientists has demonstrated that a computer can outperform human judges in predicting who will commit a violent crime. In a paper published last month, they described how they built a system that started with people already arrested for domestic violence, then figured out which of them would be most likely to commit the same crime again.

The technology could potentially spare victims from being injured, or even killed. It could also keep the least dangerous offenders from going to jail unnecessarily. And yet, there’s something unnerving about using machines to decide what should happen to people. If targeted advertising misfires, nobody’s liberty is at stake.

For two decades, police departments have used computers to identify times and places where crimes are more likely to occur, guiding the deployment of officers and detectives. Now they’re going a step further: using vast data sets to identify individuals who are criminally inclined. They’re doing this with varying levels of transparency and scientific testing. A system called Beware, for example, is capable of rating citizens of Fresno, California, as posing a high, medium or low level of threat. Press accounts say the system amasses data not only on past crimes but on web searches, property records and social networking posts.
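Press accounts do not describe Beware’s internals, but the general approach – folding weighted signals from disparate records into a coarse threat tier – can be illustrated with a toy sketch. The feature names, weights and thresholds below are invented for illustration and are not the vendor’s.

```python
# Hypothetical sketch of a "threat tier" scorer in the general style press
# accounts attribute to tools like Beware. Features, weights and thresholds
# are invented for illustration only.

def threat_tier(record: dict) -> str:
    # Each signal contributes a weighted amount to a running score.
    weights = {
        "prior_violent_arrests": 3.0,   # criminal-history records
        "weapon_related_posts": 2.0,    # social-media monitoring
        "threatening_searches": 1.5,    # web-search data, if available
        "address_flagged": 1.0,         # property/incident records
    }
    score = sum(weights[k] * float(record.get(k, 0)) for k in weights)

    # Map the continuous score onto the coarse tiers reported in the press.
    if score >= 6.0:
        return "high"
    if score >= 3.0:
        return "medium"
    return "low"

print(threat_tier({"prior_violent_arrests": 1, "weapon_related_posts": 2}))  # -> high
```

Even this toy makes the critics’ point: the weights and cut-offs are policy judgments baked into code, and when the real product treats them as a trade secret there is no way for the public to audit them.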

Critics are warning that the new technology has been rushed into use without enough public discussion. One question is precisely how the software works — it’s the manufacturer’s trade secret. Another is whether there’s scientific evidence that such technology works as advertised.

Read full story here…




Aggression Therapy: Scientists Pinpoint Area Of Brain Where Anger Originates

TN Note: It is just a matter of time before this discovery makes its way into the hands of technocrat social engineers who would use it as mandated therapy to thwart violence in society. Note that this finding is a result of President Obama’s “brain mapping initiative” that seeks to do for understanding the brain what the human genome project did for genetics. 

The groundbreaking study found that the same area of the brain controls both our appetite for food and our aggression.

Scientists have pinpointed the exact area of the brain where violence and anger originate, which could pave the way to aggression-curbing treatments such as those in the movie A Clockwork Orange.

The discovery was made during a study of the ventromedial hypothalamus, or VMH, in the brain.

This part of the brain’s circuitry is usually associated with appetite; animals with damage to the area were found to eat more and become obese.

But the study also found that the VMH has a darker side, as one element of it is thought to control aggression and provide the spark that ignites violence.

This could lead to doctors creating a form of anti-aggression treatment, such as the therapy used in the cult flick A Clockwork Orange.

But a leading scientist warned that this is only a distant possibility, and only if the ethical and legal issues surrounding it can be resolved.

US lead scientist Dr Dayu Lin, from New York University, said: “That said, our results argue that the ventrolateral part of the ventromedial hypothalamus (VMHvl) should be studied further as part of future efforts seeking to correct behaviours from bullying to sexual predation.”

In Stanley Kubrick’s 1971 movie A Clockwork Orange, set in a dystopian near-future society, sociopath delinquent Alex is subjected to an experimental aversion therapy to cure him of his craving for “ultraviolence”. But by rendering him incapable of wrongdoing, the treatment also robs Alex of his free will.

The new study focused on aggression in mice, which have many of the same brain circuits found in humans.

Read full story here…




China Aggressively Pursues Pre-Crime Technology To Catch Evil-Doers Before They Act

TN Note: China is run by technocrats who rule with an iron fist. Surveillance in China is turning into an art form, driven by technology-crazed engineers and scientists who believe that they can completely control their citizens. There are few laws preventing them from accomplishing this. 

China’s effort to flush out threats to stability is expanding into an area that used to exist only in dystopian sci-fi: pre-crime. The Communist Party has directed one of the country’s largest state-run defense contractors, China Electronics Technology Group, to develop software to collate data on jobs, hobbies, consumption habits, and other behavior of ordinary citizens to predict terrorist acts before they occur. “It’s very crucial to examine the cause after an act of terror,” Wu Manqing, the chief engineer for the military contractor, told reporters at a conference in December. “But what is more important is to predict the upcoming activities.”

The program is unprecedented because there are no safeguards from privacy protection laws and minimal pushback from civil liberty advocates and companies, says Lokman Tsui, an assistant professor at the School of Journalism and Communication at the Chinese University of Hong Kong, who has advised Google on freedom of expression and the Internet. The project also takes advantage of an existing vast network of neighborhood informants assigned by the Communist Party to monitor everything from family planning violations to unorthodox behavior. A draft cybersecurity law unveiled in July grants the government almost unbridled access to user data in the name of national security. “If neither legal restrictions nor unfettered political debate about Big Brother surveillance is a factor for a regime, then there are many different sorts of data that could be collated and cross-referenced to help identify possible terrorists or subversives,” says Paul Pillar, a nonresident fellow at the Brookings Institution.

Building a crystal ball to predict and prevent terror attacks, a real-world version of Minority Report, is the ultimate goal of crime fighters the world over. But, so far, more data has just meant more noise, security experts say. “There are not enough examples of terrorist activity to model what it looks like in data, and that’s true no matter how much data you have,” says Jim Harper, a senior fellow at the Cato Institute. “You need yeast to make bread. You can’t make up for a lack of yeast by adding more flour.”
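Harper’s point about scarce examples can be made concrete with a back-of-the-envelope base-rate calculation; the numbers below are purely illustrative.

```python
# Illustrative base-rate arithmetic (all numbers invented): even a very
# accurate classifier drowns rare events in false positives.
population = 1_000_000_000    # people screened
detection_rate = 0.99         # fraction of actual plotters flagged
false_positive_rate = 0.01    # fraction of innocent people wrongly flagged
actual_plotters = 1_000       # vanishingly rare event

flagged_plotters = actual_plotters * detection_rate                        # ~990
flagged_innocents = (population - actual_plotters) * false_positive_rate   # ~10 million

precision = flagged_plotters / (flagged_plotters + flagged_innocents)
print(f"{flagged_innocents:,.0f} innocent people flagged; "
      f"only {precision:.4%} of flags point at a real plotter")
```

Roughly ten million innocent people get flagged for every thousand real targets, and with so few genuine examples there is almost nothing for a model to learn from in the first place – Harper’s missing yeast.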

China was a surveillance state long before Edward Snowden clued Americans in to the extent of domestic spying. Since the Mao era, the government has kept a secret file, called a dang’an, on almost everyone. Dang’an contain school reports, health records, work permits, personality assessments, and other information that might be considered confidential and private in other countries. The contents of the dang’an can determine whether a citizen is eligible for a promotion or can secure a coveted urban residency permit. The government revealed last year that it was also building a nationwide database that would score citizens on their trustworthiness.

Read full story here…




State Police Pushing Connecticut Lawmakers For Options To Arm Drones

TN Note: It’s just a matter of time before law enforcement breaks down public sentiment against arming police drones with various weapons, including tear gas, Tasers and firearms. The military is already aggressively pursuing this technology.

The drone discussion has been buzzing around the state Capitol during hearings this week.

Monday, a bill to ban the weaponization of drones was discussed, and on Tuesday, a bill to control the use of drones in many ways, including by law enforcement agencies, was debated.

Lawmakers listened to testimony concerning restrictions on drones with cameras versus expectations of privacy. The bill would also require police to get a warrant to use drones in many cases.

Police made the argument that armed drones in law enforcement could be an effective weapon for public safety.

“We’ve had a report that somebody’s going to fly a drone into an airplane, into an engine, or it’s a weaponized drone,” Farmington Police Chief Paul Melanson said.  “We’re concerned and we don’t have those answers yet.”

The ACLU of Connecticut testified against allowing police to arm drones, saying it could open the door even more to excessive use of force.

“We are concerned that there could be misuse, particularly on vulnerable communities,” David McGuire of the ACLU said at the hearing.




Listening Devices On Public Buses In Maryland Record Private Conversations

TN Note: The article states, “What [the Maryland Transit Administration] is doing is a mass surveillance.” Technocracy trickle-down mentality: Just do it. Nobody told them not to do it, nor did they ask permission to do it; they just did it and thought it was perfectly OK. 

The Maryland Senate on Tuesday delayed action on a bill that would clamp down on when public buses and trains can record the private conversations of their passengers.

Sen. Robert A. Zirkin (D-Baltimore County), chair of the Senate Judicial Proceedings Committee, which unanimously voted for the measure to move to the Senate floor, said he wanted the committee to address an amendment offered by some of those who are concerned about costs associated with the bill.

The bill is likely to be considered by the Senate on Wednesday, he said.

“What [the Maryland Transit Administration] is doing is a mass surveillance,” Zirkin said.

“I find it outrageous,” he said. “I don’t want to overstate it, but this is the issue of our generation. As technology advances, it becomes easier and easier to encroach on people’s civil liberties.”

While Zirkin and other proponents argue that the technology, which has been in use since 2012, is an infringement on civil liberties, the bill’s opponents say the recordings are a necessary tool for homeland security.

The bill, which would affect MTA buses in the Baltimore area, Ride On buses in Montgomery County and TheBus in Prince George’s County, creates guidelines for audio recordings and places limits on when they can be made.

MTA began using recording devices inside some of its buses in 2012, without seeking legislative approval. Nearly 500 of its fleet of 750 buses now have audio recording capabilities. Officials say the devices can capture important information in cases of driver error or an attack or altercation on a bus.

Under the bill, recording devices would have to be installed near a bus or train operator’s seat. The devices would be controlled by the driver and could be activated only in the event of a public-safety incident.

The legislation to limit the recordings came to the Senate floor last week, but a vote was delayed until Tuesday after several lawmakers raised questions about how much it would cost to retrofit or replace existing recording equipment to meet the bill’s requirements.

Some lawmakers raised the issue of security. Several asked for the delay to allow time to draft amendments.

“I can make an argument to tape everybody, everywhere, everywhere they walk, everywhere they talk, and you can make the excuse for homeland security,” Zirkin said. “But that is not a valid reason to encroach this fundamentally on people’s privacy rights.”

Read full story here…



Google/Boston Dynamics’ Robot Dog Has All The Right Movements

TN Note: The Fourth Industrial Revolution is beginning to come together. Forecasts of human employment being decimated by robots are eerily accurate. Watch these videos and see the natural movements and ability of “Spot” the dog. Also note that it is being controlled by a remote “handler” and does not yet sport Artificial Intelligence. If it weren’t for the military (DARPA in particular) driving the development of this type of robot, there might be some decent, benevolent uses for it. But, as it is, it will ultimately be used to control others.

To be perfectly frank, we have very mixed emotions about the video you’re about to watch. Is this how it begins? The robots trick us into letting our guard down by acting all cute and playful. Then the next thing you know, bam! Skynet becomes sentient and we’re battling real-life Terminators.

OK, maybe we’re getting ahead of ourselves a bit. Instead of panicking, let’s all just sit back, relax, and watch this adorable video of Google’s robot dog “Spot” playing with a real dog. Isn’t it cute? Aren’t they having so much fun together? Yup, this is exactly what the robots want…

Read full story here…




Pentagon’s Secret Pre-Crime Program To Know Your Thoughts, Predict Your Future

TN Note: This is NOT science fiction. Read every word of this article. Predictive behavior analysis is the big rage among technocrats who are bent on “scientific social engineering”. The only hindrance to a complete rollout of this technology is the more advanced computing power needed to analyze big data in real-time mode – but it is very close. Total Awareness Society is a long-established requirement for Technocracy.

The US Department of Defense (DoD) wants contractors to mine your social media posts to develop new ways for the US government to infer what you’re really thinking and feeling — and to predict what you’ll do next.

Pentagon documents released over the last few months identify ongoing classified research in this area that the federal government plans to expand, by investing millions more dollars.

The unclassified documents, which call on external scientists, institutions and companies to submit proposals for research projects, not only catalogue how far US military capabilities have come, but also reveal the Pentagon’s goals: building the US intelligence community’s capacity to forecast population behavior at home and abroad, especially groups involved in political activism.

They throw light on the extent to which the Pentagon’s classified pre-crime R&D has advanced, and how the US military intends to deploy it in operations around the world.

Could your social media signature reveal your innermost thoughts?

A new Funding Opportunity Announcement document issued by the DoD’s Office of Naval Research (ONR) calls for research proposals on how mining social media can provide insight on people’s real thoughts, emotions and beliefs, and thereby facilitate predictions of behavior.

The research for Fiscal Year 2016 is part of the Pentagon’s Multidisciplinary Research Program of the University Research Initiative (MURI), which was initiated over 25 years ago, regularly producing what the DoD describes as “significant scientific breakthroughs with far reaching consequences to the fields of science, economic growth, and revolutionary new military technologies.”

The document calls for new work “to understand latent communication among small groups.” Social meaning comes not just from “the manifest content of communication (i.e., literal information), but also from latent content — how language is structured and used, as well as how communicators address each other, e.g., through non-verbal means — gestures, head nods, body position, and the dynamics in communication patterns.”

The Pentagon wants to understand not just what we say, but what is “latent” in what we say: “Subtle interactions such as deception and reading between the lines, or tacit understanding between communicators, relative societal position or relationship between communicators, is less about what is said and more about what is latent.”

All this, it is imagined, can be derived from examining social media, using new techniques from the social and behavioral sciences.

The Pentagon wants to:

“… recognize/predict social contexts, relationships, networks, and intentions from social media, taking into account non-verbal communication such as gestures, micro-expressions, posture, and latent semantics of text and speech.”

By understanding latent communication, the Pentagon hopes to develop insight into “the links between actors, their intentions, and context for use of latent signals for group activity.” The idea is to create:

“… algorithms for prediction and collection of latent signals and their use in predicting social information.”

These algorithms also need to “accurately detect key features of speech linked to these structural patterns (e.g., humor, metaphor, emotion, language innovations) and subtle non-verbal elements of communication (e.g., pitch, posture, gesture) from text, audio, and visual media.”
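The solicitation does not spell out methods, but the text side of what it asks for resembles ordinary feature extraction from posts. A minimal sketch is below; the tiny lexicons and feature names are placeholders invented for illustration, not anything from the actual MURI program.

```python
# Minimal sketch of extracting crude "latent signals" from text, in the spirit
# of the solicitation's wording. Lexicons and feature names are invented
# placeholders, not the Pentagon's.
import re
from collections import Counter

EMOTION_LEXICON = {
    "anger": {"furious", "outraged", "hate", "fed"},
    "fear": {"afraid", "worried", "scared", "threat"},
    "joy": {"glad", "thrilled", "love", "great"},
}

def latent_features(post: str) -> dict:
    tokens = re.findall(r"[a-z']+", post.lower())
    counts = Counter(tokens)
    feats = {f"emotion_{name}": sum(counts[w] for w in words)
             for name, words in EMOTION_LEXICON.items()}
    # Crude stand-ins for the "structural" cues the document mentions.
    feats["exclamations"] = post.count("!")
    feats["questions"] = post.count("?")
    feats["all_caps_words"] = sum(1 for t in post.split() if t.isupper() and len(t) > 2)
    return feats

print(latent_features("We are FED up. Why does nobody listen?!"))
```

Extracting features like these is the easy part; inferring intent or predicting group behavior from them is where the program’s ambitions run well ahead of the published science.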

The direct military applications of this sort of information can be gleaned from the background of the administrator of this new research program, Dr. Purush Iyer, who is Division chief of Network Sciences at the US Army Research Laboratory (USARL).

Among the goals of Dr. Iyer’s research at the US Army are expanding “Intelligent Networks” which can “augment human decision makers with enhanced-embedded battlefield intelligence that will provide them with tools for creating necessary situational awareness, reconnaissance, and decision making to decisively defeat any future adversarial threats.”

Creeping police state

The allure of co-opting Big Data to enhance domestic policing is already picking up steam in the US and UK.

In the US, an unknown number of police authorities are already piloting software called ‘Beware’, which analyses people’s social media activity, property records, the records of friends, family or associates, among other data, to assign suspects a so-called “threat-score.”

That “threat-score” can then be used by police to pre-judge if a suspect is going to be dangerous, and to adapt their approach accordingly.

Given the police’s discriminatory track record, and with shootings of unarmed black people skyrocketing, the extent to which such ‘Minority Report’-style policing could backfire by justifying more discriminatory policing is alarming.

In the UK, Home Secretary Theresa May just last week told the Police ICT Suppliers Summit that police forces should use predictive analytics to “identify those most at risk of crime, locations most likely to see crimes committed, patterns of suspicious activity that may merit investigation and to target their resources most effectively against the greatest threats.”

Noting that the police have yet to catch up with the “vast quantities of data” being generated by citizens, she complained: “Forces have not yet begun to explore the crime prevention opportunities that data offers.”

In reality, the shift to predictive policing in the UK is well underway, with Greater Manchester, Kent, West Midlands, West Yorkshire and London’s Metropolitan Police having undertaken trials of a software known as “PredPol.”

Read full story here…




Apple Fighting FBI’s Demands To Unlock Terrorist’s iPhone

TN Note: Unfettered access to all information is Technocracy’s dreamscape. Intelligence and law enforcement agencies have tried in vain to force software companies to put “back doors” into their offerings so the Feds could have easy access. This has been fought tooth-and-nail by many (not all) technology companies. This is the latest attempt by the FBI to force Apple to re-write its iPhone security program in order to “crack” the cellphone used by the San Bernardino terrorist, and Apple is fighting back. If the Feds win this case, it will change the face of technology forever.

On Tuesday, a US judge ordered Apple to help the FBI unlock an encrypted iPhone.

The Cupertino, California-based company has reacted furiously.

Apple CEO Tim Cook has published an extremely strongly worded letter, calling the demand “chilling,” arguing that it “would undermine the very freedoms and liberty our government is meant to protect.”

……

This court case isn’t taking place in a vacuum. We’re in the middle of a bitter feud between tech companies and law enforcement about the rise in the use of encryption.

In the years after NSA whistle-blower Edward Snowden’s revelations about the US government’s mass-surveillance programs, there has been a heightened awareness of privacy issues, along with moves to strengthen protections on consumer products.

Apple has been one of the strongest voices in support of this move, and all new iPhones and Apple devices are now encrypted by default.
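Device encryption of this kind generally derives the encryption key from the user’s passcode with a deliberately slow key-derivation function, and then relies on the device to limit and delay retries. The sketch below is a generic illustration of that pattern, not Apple’s actual implementation.

```python
# Generic sketch of passcode-based device encryption (NOT Apple's actual
# design): the data key is protected by a key derived from the passcode with
# a deliberately slow KDF, and the device enforces retry limits and delays.
import hashlib
import os

def derive_key(passcode: str, salt: bytes) -> bytes:
    # A high iteration count makes every passcode guess expensive.
    return hashlib.pbkdf2_hmac("sha256", passcode.encode(), salt, 1_000_000)

salt = os.urandom(16)           # stored on the device; not secret
key = derive_key("1234", salt)  # 32-byte key that would protect the data key
print(len(key))                 # -> 32
```

What the court order asks Apple to build is software that removes the retry limits and delays; the cryptography itself is untouched, but a four-digit passcode has only 10,000 possibilities and would then fall to brute force in short order – which is why Cook’s letter, quoted below, compares the requested tool to a master key.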

……

[Apple CEO Tim Cook writes]

The government suggests this tool could only be used once, on one phone. But that’s simply not true. Once created, the technique could be used over and over again, on any number of devices. In the physical world, it would be the equivalent of a master key, capable of opening hundreds of millions of locks — from restaurants and banks to stores and homes. No reasonable person would find that acceptable.

The government is asking Apple to hack our own users and undermine decades of security advancements that protect our customers — including tens of millions of American citizens — from sophisticated hackers and cybercriminals. The same engineers who built strong encryption into the iPhone to protect our users would, ironically, be ordered to weaken those protections and make our users less safe.

We can find no precedent for an American company being forced to expose its customers to a greater risk of attack. For years, cryptologists and national security experts have been warning against weakening encryption. Doing so would hurt only the well-meaning and law-abiding citizens who rely on companies like Apple to protect their data. Criminals and bad actors will still encrypt, using tools that are readily available to them.

Read full story here…




US Seeks To Redefine Terrorism Rules To Include Anti-Government Protestors

TN Note: Technocracy defends itself by turning the government against anti-government protestors in an unprecedented move. Citizens convicted of supporting domestic groups placed on the “domestic terror” list could draw a penalty of up to 20 years in federal prison. This undermines the Constitution’s protection of free speech and the right to dissent. What kind of groups could potentially be classified as terrorist? Pro-gun lobbies, local and state militias, anti-BLM/land rights groups, etc. Your legislators must be warned to spurn any legislation that plays into this meme.

The U.S. Justice Department is considering legal changes to combat what it sees as a rising threat from domestic anti-government extremists, senior officials told Reuters, even as it steps up efforts to stop Islamic State-inspired attacks at home.

Extremist groups motivated by a range of U.S.-born philosophies present a “clear and present danger,” John Carlin, the Justice Department’s chief of national security, told Reuters in an interview. “Based on recent reports and the cases we are seeing, it seems like we’re in a heightened environment.”

Over the past year, the Justice Department has brought charges against domestic extremist suspects accused of attempting to bomb U.S. military bases, kill police officers and firebomb a school and other buildings in a predominantly Muslim town in New York state.

But federal prosecutors tackling domestic extremists still lack an important legal tool they have used extensively in dozens of prosecutions against Islamic State-inspired suspects: a law that prohibits supporting designated terrorist groups.

Carlin and other Justice Department officials declined to say if they would ask Congress for a comparable domestic extremist statute, or comment on what other changes they might pursue to toughen the fight against anti-government extremists.

The U.S. State Department designates international terrorist organizations to which it is illegal to provide “material support.” No domestic groups have that designation, helping to create a disparity in charges faced by international extremist suspects compared to domestic ones.

A Reuters analysis of more than 100 federal cases found that domestic terrorism suspects collectively have faced less severe charges than those accused of acting on behalf of Islamic State since prosecutors began targeting that group in early 2014.

Read full story here…