Surveillance Algorithms Assessing and Evaluating Us Based on Our Circumstances

How you are treated by the criminal justice system can be influenced by your money, your postcode, and your friends and family. While the New South Wales program that used algorithmic risk scores to single people out for police surveillance has been scrapped, similar schemes remain in use. Corrective Services NSW, for example, uses a statistical assessment tool called LSI-R (Level of Service Inventory-Revised) to predict whether prisoners will reoffend. Those deemed "high risk" may face more intensive interventions and be denied parole on the basis of factors such as criminal associations, family involvement in crime or drugs, financial problems, living in a high-crime neighborhood, and frequent changes of address.
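To make the mechanics concrete, here is a minimal Python sketch of how a points-based actuarial instrument of this kind turns circumstance factors into a risk band. The factor names, weights, and cut-offs below are invented for illustration; they are not the real LSI-R items or scoring rules.

```python
# Illustrative points-based risk instrument. All factors, weights, and
# cut-offs here are hypothetical, NOT the actual LSI-R items or bands.

RISK_FACTORS = {
    "criminal_associations": 2,
    "family_involvement_in_crime": 2,
    "financial_problems": 1,
    "high_crime_neighborhood": 1,
    "frequent_address_changes": 1,
}

def risk_score(profile: dict) -> int:
    """Sum the points for every factor flagged in a person's profile."""
    return sum(points for factor, points in RISK_FACTORS.items()
               if profile.get(factor))

def risk_band(score: int) -> str:
    """Map a raw score to a coarse band using hypothetical cut-offs."""
    if score >= 5:
        return "high"
    if score >= 3:
        return "medium"
    return "low"

profile = {"criminal_associations": True,
           "financial_problems": True,
           "frequent_address_changes": True}
score = risk_score(profile)
print(score, risk_band(score))  # 4 medium
```

Note that every input in this sketch is a circumstance rather than an act, which is precisely the concern raised below.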

Predictive algorithms are sets of rules that computers apply to make decisions based on patterns in data. They have been criticized for discriminatory effects in settings ranging from search engines to health databases. The book "Artificial Justice" argues that predictive tools built on factors such as poverty or family background should trouble us: punishment should respond only to wrongdoing, not to the circumstances a person has been dealt.

Algorithms are widely used in criminal justice systems worldwide. In the UK, the OASys (Offender Assessment System) tool shapes bail, parole, and sentencing decisions; in the US, COMPAS performs a similar function. Algorithmic risk scores are also used outside criminal justice, for instance in healthcare to predict medication misuse. While these tools can save lives and support decision-making, they can also entrench unjust inequalities.

The article highlights the pitfalls of using algorithms to predict crime. If an algorithm learns to flag crime hotspots from data that links recorded crime to lower-income areas, it can end up reinforcing the biased policing practices that produced the data, as the sketch below illustrates. The article argues that using statistics to predict intentional actions, such as harmful behavior or drug abuse, can produce unjust outcomes, and that the factors driving these predictions are often undisclosed even though they can profoundly affect people's lives.
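The feedback loop is easy to reproduce in a few lines. The following Python sketch uses invented numbers and two areas with identical underlying crime rates: because patrols are sent where recorded crime is highest, and recording depends on patrol presence, a small initial imbalance snowballs.

```python
import random

# Toy model of the predictive-policing feedback loop described above.
# Both areas have the SAME true crime rate; only the initial records differ.
# All numbers are invented for illustration.

random.seed(0)
TRUE_CRIME_RATE = 0.10                  # identical in both areas
POPULATION = 1000
recorded = {"area_A": 5, "area_B": 4}   # small initial imbalance

for year in range(10):
    # "Predictive" step: patrol the area with the most recorded crime.
    target = max(recorded, key=recorded.get)
    for area in recorded:
        detection = 0.9 if area == target else 0.3  # patrols detect more
        incidents = sum(random.random() < TRUE_CRIME_RATE
                        for _ in range(POPULATION))
        recorded[area] += sum(random.random() < detection
                              for _ in range(incidents))

print(recorded)  # area_A's head start compounds despite equal true rates
```

Each year the "hotspot" prediction is confirmed by the very data it generated, which is how a circumstance like living in an over-policed postcode becomes self-reinforcing.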

The article emphasizes the importance of allowing people to make choices that reflect their own values and needs. Punishing someone for factors they cannot easily control treats them as incapable of making good choices. The historical examples of Cesare Lombroso and Charles Goring are cited to show how such tools can revive the old idea that certain people are fated to commit crime.

To address these issues, the article suggests that public bodies be required to disclose the facts behind predictive decisions, and that machine learning be used only where that requirement can be met, so there can be a meaningful public debate about where to draw the line. Harsher penalties should be imposed only for actual wrongdoing, never for physical, mental, or social characteristics. And once people have been penalized for their crimes, they should not be treated differently or held longer because of their friends, family, financial status, or past mistreatment.
