The Pentagon's secret pre-crime program to know your thoughts, predict your future behavior. (InsurgeIntelligence).
The US Department of Defense (DoD) wants contractors to mine your social media posts to develop new ways for the US government to infer what you're really thinking and feeling, and to predict what you'll do next.

Pentagon documents released over the last few months identify ongoing classified research in this area that the federal government plans to expand by investing millions more dollars.

The unclassified documents, which call on external scientists, institutions and companies to submit proposals for research projects, not only catalogue how far US military capabilities have come, but also reveal the Pentagon's goals: building the US intelligence community's capacity to forecast population behavior at home and abroad, especially among groups involved in political activism.

They throw light on the extent to which the Pentagon's classified pre-crime R&D has advanced, and how the US military intends to deploy it in operations around the world.
Could your social media signature reveal your innermost thoughts?
A new Funding Opportunity Announcement document issued by the DoD's Office of Naval Research (ONR) calls for research proposals on how mining social media can provide insight into people's real thoughts, emotions and beliefs, and thereby facilitate predictions of behavior.

The research for Fiscal Year 2016 is part of the Pentagon's Multidisciplinary Research Program of the University Research Initiative (MURI), which was initiated over 25 years ago and regularly produces what the DoD describes as "significant scientific breakthroughs with far reaching consequences to the fields of science, economic growth, and revolutionary new military technologies."
The document calls for new work "to understand latent communication among small groups." Social meaning comes not just from "the manifest content of communication (i.e., literal information), but also from latent content — how language is structured and used, as well as how communicators address each other, e.g., through non-verbal means — gestures, head nods, body position, and the dynamics in communication patterns."

The Pentagon wants to understand not just what we say, but what is "latent" in what we say: "Subtle interactions such as deception and reading between the lines, or tacit understanding between communicators, relative societal position or relationship between communicators, is less about what is said and more about what is latent."

All this, it is imagined, can be derived from examining social media, using new techniques from the social and behavioral sciences.
The Pentagon wants to:
“… recognize/predict social contexts, relationships, networks, and intentions from social media, taking into account non-verbal communication such as gestures, micro-expressions, posture, and latent semantics of text and speech.”
By understanding latent communication, the Pentagon hopes to develop insight into "the links between actors, their intentions, and context for use of latent signals for group activity." The idea is to create:

"… algorithms for prediction and collection of latent signals and their use in predicting social information."

These algorithms also need to "accurately detect key features of speech linked to these structural patterns (e.g., humor, metaphor, emotion, language innovations) and subtle non-verbal elements of communication (e.g., pitch, posture, gesture) from text, audio, and visual media."
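As a purely hypothetical illustration of the kind of "latent feature" extraction the documents gesture at, the crudest possible version is a lexicon-based scorer that tags emotion-bearing words in a message. Everything below — the lexicon, the weights, the function name — is invented for this sketch and has no connection to any actual ONR or contractor system:

```python
# Toy sketch of extracting "latent" emotional signals from text.
# The lexicon and weights are invented for illustration only; real research
# systems use statistical models, not hand-written keyword lists.

EMOTION_LEXICON = {
    "furious": ("anger", 0.9),
    "angry": ("anger", 0.7),
    "thrilled": ("joy", 0.9),
    "happy": ("joy", 0.6),
    "terrified": ("fear", 0.9),
    "worried": ("fear", 0.5),
}

def latent_emotion_scores(text: str) -> dict:
    """Aggregate per-emotion scores from keyword hits in a message."""
    scores: dict = {}
    for token in text.lower().split():
        word = token.strip(".,!?;:")  # drop trailing punctuation
        if word in EMOTION_LEXICON:
            emotion, weight = EMOTION_LEXICON[word]
            scores[emotion] = scores.get(emotion, 0.0) + weight
    return scores

print(latent_emotion_scores("I am furious and worried about this!"))
# → {'anger': 0.9, 'fear': 0.5}
```

The gap between this toy and what the FOA asks for — humor, metaphor, deception, gesture, pitch — is exactly why the Pentagon is funding multi-year research rather than buying an off-the-shelf tool.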
The direct military applications of this sort of information can be gleaned from the background of the administrator of this new research program, Dr. Purush Iyer, Division Chief of Network Sciences at the US Army Research Laboratory (USARL).

Among the goals of Dr. Iyer's research at the US Army is expanding "Intelligent Networks," which can "augment human decision makers with enhanced-embedded battlefield intelligence that will provide them with tools for creating necessary situational awareness, reconnaissance, and decision making to decisively defeat any future adversarial threats."
The allure of co-opting Big Data to enhance domestic policing is already picking up steam in the US and UK.
In the US, an unknown number of police authorities are already piloting software called 'Beware', which analyses people's social media activity, property records, and the records of friends, family or associates, among other data, to assign suspects a so-called "threat-score."

That "threat-score" can then be used by police to pre-judge whether a suspect is going to be dangerous, and to adapt their approach accordingly.
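Beware's actual scoring model is proprietary and undisclosed, but press reporting has said it outputs colour-coded ratings. A minimal sketch of what any such weighted-aggregation scheme might look like — with entirely invented feature names, weights and thresholds — makes the mechanism concrete:

```python
# Hypothetical sketch of a "threat-score" aggregator. The features, weights
# and colour bands are invented; they do NOT describe Beware's real model.

WEIGHTS = {
    "flagged_social_posts": 10,
    "prior_arrests": 25,
    "associates_with_priors": 5,
}

def threat_score(record: dict) -> str:
    """Map a weighted sum of record features onto a colour band."""
    total = sum(WEIGHTS[k] * record.get(k, 0) for k in WEIGHTS)
    if total >= 50:
        return "red"
    if total >= 20:
        return "yellow"
    return "green"

print(threat_score({"prior_arrests": 1, "flagged_social_posts": 2}))
# → 'yellow' (25 + 20 = 45, below the 'red' threshold of 50)
```

Even this toy makes the article's concern tangible: the weights and thresholds are arbitrary and opaque to the person being scored, and any bias in the underlying records flows straight through to the final colour.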
Given the police's discriminatory track record, with shootings of unarmed black people skyrocketing, the extent to which such 'Minority Report'-style policing could backfire by justifying yet more discriminatory policing is alarming.
In the UK, Home Secretary Theresa May just last week told the Police ICT Suppliers Summit that police forces should use predictive analytics to "identify those most at risk of crime, locations most likely to see crimes committed, patterns of suspicious activity that may merit investigation and to target their resources most effectively against the greatest threats."
Noting that the police have yet to catch up with the "vast quantities of data" being generated by citizens, she complained: "Forces have not yet begun to explore the crime prevention opportunities that data offers."

Hmmm... Thought crimes are the crimes of the future. Read the full story here.