Gun crime tech 'failed to save lives' in Chicago
An attempt to use software to help prevent gun crime in Chicago did not save lives, according to a study.
In 2013, the city's police began using algorithms to create a list of people deemed to be most at risk of being shot dead.
But the effort had no impact on homicide rates, the report said. Rather, those on the list were more likely to face arrest themselves.
The police defended the tech, saying its predictive power has since improved.
The report was carried out by the Rand Corporation, a public policy-focused research body, and was published in the Journal of Experimental Criminology.
Both the police and the software's developer - the Illinois Institute of Technology - co-operated with Rand's evaluation.
'No benefit'
The so-called "predictive policing" initiative was based on the idea that potential victims of gun crime could be identified by building a social network model.
Specifically, the software calculated a person's risk factor on the basis of two variables:
- how many times they had been arrested with others who had later themselves become gun crime victims
- the number of relationships they had to intermediaries who had been arrested with people who had become homicide victims
This resulted in a total of 426 people being identified as "high risk" in March 2013. They were placed on a register called the Strategic Subjects List (SSL).
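The study does not publish the SSL's actual formula, so the following is only a minimal sketch of how a two-variable, co-arrest-based score of this kind could be computed. The data, names and simple counting below are illustrative assumptions, not the real model.

```python
# Illustrative sketch only: the article describes two variables used by the
# Strategic Subjects List model, but not the exact formula or weighting.
from collections import defaultdict
from itertools import combinations

# Toy co-arrest records: each entry is a set of people arrested together (assumed data).
co_arrests = [
    {"A", "B"},
    {"B", "C"},
    {"C", "D"},
]
gun_crime_victims = {"D"}   # people later shot (assumed example data)
homicide_victims = {"D"}    # people later killed (assumed example data)

# Build an undirected "co-arrest" network: an edge links people arrested together.
network = defaultdict(set)
for record in co_arrests:
    for p, q in combinations(record, 2):
        network[p].add(q)
        network[q].add(p)

def risk_variables(person):
    """Return simplified versions of the two variables described in the article.

    1) direct: co-arrestees who later became gun crime victims
    2) indirect: intermediaries (one step away) who were themselves arrested
       with someone who became a homicide victim
    """
    direct = sum(1 for other in network[person] if other in gun_crime_victims)
    indirect = sum(
        1
        for middle in network[person]
        if any(far in homicide_victims for far in network[middle] - {person})
    )
    return direct, indirect

for person in sorted(network):
    print(person, risk_variables(person))
```

Running the sketch prints each person's pair of counts; a real system would then combine such counts into a single risk ranking used to populate the list.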
The researchers said their analysis of the gun crime that followed indicated that being on the list made no difference to people's chances of being shot or killed. Neither was there any impact on overall homicide levels, they added.
But they said the SSL's members became more likely to be arrested for the shootings of others.
"The effect size was rather large... 2.88 times more likely than their matched counterparts," the study said.
The report's authors said officers had received "no practical direction" about what to do with the list and, in some cases, had decided to use it as a way to identify possible suspects.
The danger, they warned, was that use of the list could lead to civil rights and privacy abuses. This might ultimately backfire, they said, if people felt they were being unfairly treated, although they added they had seen no evidence of this themselves.
Backfire risk
The Chicago Police Department has issued a press release in which it said the findings were "no longer relevant".
The force said it now used a more elaborate model that took account of additional factors, such as how many times an individual had recently been arrested for violent offences.
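To show what "additional factors" might look like in practice, here is a hypothetical extension of the earlier sketch; the extra variable and the weights are placeholder assumptions, not the department's actual model.

```python
# Hypothetical extension of the earlier sketch. The article says the newer model
# weighs extra factors such as recent violent-offence arrests, but the real
# variables and weights are not public, so these numbers are assumptions.
def extended_risk_score(direct, indirect, recent_violent_arrests):
    """Combine the two network counts with an extra factor using assumed weights."""
    return 1.0 * direct + 0.5 * indirect + 2.0 * recent_violent_arrests

# Example: one directly linked victim, no indirect links, two recent violent arrests.
print(extended_risk_score(direct=1, indirect=0, recent_violent_arrests=2))  # 5.0
```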
It also said it now used the technique to arrange visits to members of the list, and to their families and friends, to explain what preventative steps they could take.
Even so, one privacy rights group said the affair served as a warning.
"Using predictive policing might seem like an ingenious solution to
fighting crime, but predictions from data algorithms can often draw
inaccurate conclusions," Renate Samson, chief executive of Big Brother
Watch, told the BBC,
"The police must exercise caution when using
data to target people and be sure that they adhere to the rule of
innocent until proven guilty."