
Technology and the Future of Criminal Justice

Why you need to be more diligent than ever.

By James P.
Smile... you never know who, or what, is watching.

A recent New York Times article highlighted a part of the criminal justice system that most people aren’t aware of: Risk Management Systems and their use in sentencing convicted offenders, predicting recidivism rates, and even estimating the likelihood that children will become criminal offenders later in life. Tools such as these are based on the sciences of Predictive Analytics, Behavioral Analysis, Pattern Recognition, and Machine Learning.

Before looking at what this means for the criminal justice system, and for all of us, it will help to understand how this influences the life of every human living in modern society.

If you have Social Media profiles with “friends,” connections, interests, and so on, then every story you post, like, comment upon, or even open contributes to your profile, not to mention the profiles of the people you connect with. Is this for grand and nefarious purposes? That depends on your viewpoint.

As a general rule, Social Media engines build communities of users, and gather information about them, in order to identify the most effective products and services to market to each individual. This is how these companies generate revenue and stay in business to keep offering their services.
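
As a hedged illustration of the mechanics, the sketch below builds a crude interest profile by weighting different engagement actions. The action types, topics, and weights are all invented for the example; real platforms use far richer signals and models.

```python
from collections import defaultdict

# Toy illustration: turn engagement events into an interest profile.
# Event types, topics, and weights are invented purely for this sketch.
ENGAGEMENT_WEIGHTS = {"open": 1, "like": 3, "comment": 5, "share": 8}

def build_interest_profile(events):
    """events: iterable of (action, topic) pairs, e.g. ("like", "fitness")."""
    profile = defaultdict(int)
    for action, topic in events:
        profile[topic] += ENGAGEMENT_WEIGHTS.get(action, 0)
    # Higher score = more attractive target for ads on that topic.
    return sorted(profile.items(), key=lambda kv: kv[1], reverse=True)

events = [("open", "politics"), ("like", "fitness"), ("comment", "fitness"),
          ("share", "camping"), ("open", "camping"), ("like", "camping")]
print(build_interest_profile(events))
# [('camping', 12), ('fitness', 8), ('politics', 1)]
```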

However, let’s take this to its next step. More mature engines such as Facebook use this information to determine which posts you see from the individuals and groups you follow. Ostensibly, this is to make sure you see information you are more likely to find interesting. Now consider the fact that Facebook has admitted to manipulating the feeds of its users in a sociological experiment to affect their moods. Facebook, Twitter, and other services have removed, or at least minimized the visibility of, posts deemed potentially “controversial” based on social or political views. Take from this what you will, but remember: these services exist and are available to you at no charge not for your benefit, but for theirs.

Speaking of marketing, let’s move into eCommerce. Looking at the 800-pound digital gorilla in the room, Amazon uses multiple behavioral systems in its online business model. A few of these targeted uses include:

  • Targeted Sales and Marketing as a Consumer – You may see “Users who bought X also bought…,” “Related to items you’ve viewed,” “Recommended for You,” “New for You,” etc. All of these are based on your own profile of items you have purchased, looked at, or searched for, compared against product keywords, the behaviors of other customers, and Amazon’s own marketing strategies.
  • Targeted Sales and Marketing as a Seller/Service Provider – As a marketplace, Amazon also allows others to build their own storefronts and offer products through Amazon’s platform. This includes Kindle, Createspace, and ACX in addition to offering your own physical products.
  • Fraud Monitoring – Most major eCommerce sites monitor sales and account access any time it comes from a new device, and also watch for behavior uncommon for that customer. If someone were to log in from a new device and make an exceptionally large and uncharacteristic purchase, it might go through a manual review in addition to the automated checks before the transaction completes (a minimal sketch of this kind of check follows this list).
  • Questionable and Fraudulent Products – For products listed through their platform, Amazon works to guarantee that the products delivered are as advertised. The challenge is even greater for digital products created and distributed through their platform, such as Amazon Kindle titles. Especially for work that is self-published or released through a small press, the verification systems used for new products are improving, but they can either let fake and poor-quality work through or be draconian in how they categorize legitimate work.
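
To make the fraud-monitoring idea concrete, here is a minimal Python sketch of the kind of rule described above: flag an order for human review when it comes from an unrecognized device and is far outside the customer’s usual spending. Everything here, from the function name to the three-sigma threshold, is an invented illustration; production systems weigh many more signals.

```python
from statistics import mean, stdev

def needs_manual_review(order_amount, device_id, known_devices, past_amounts,
                        sigma_threshold=3.0):
    """Flag an order if it comes from an unrecognized device AND is far
    outside the customer's usual spending range (illustrative rule only)."""
    new_device = device_id not in known_devices
    if len(past_amounts) < 2:
        # Not enough history to judge "typical" spending; fall back to device check.
        return new_device
    typical = mean(past_amounts)
    spread = stdev(past_amounts) or 1.0
    unusually_large = order_amount > typical + sigma_threshold * spread
    return new_device and unusually_large

# Example: a $2,400 order from an unseen laptop, against a history of ~$40 purchases.
history = [35.0, 42.5, 28.0, 51.0, 39.99]
print(needs_manual_review(2400.0, "device-9f3",
                          {"device-a01", "device-b22"}, history))  # True
```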

Beyond the more obvious platforms of Social Media and eCommerce is another online presence you use every day and may not realize is also weighted based on your profile: Search Engines.

If you want to perform a small experiment, run the same Google search on your laptop and on a second device, like an Android phone. Often, you will find you get slightly different, and sometimes very different, search results. This is true both for the results that are in fact “sponsored” (paid advertising that boosts the sponsor to the top of the results) and for the “natural” results. This comes from the cookies stored on the device, your other search history stored with Google, and sometimes even the type of device from which the search was performed.
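
If you want to try a rough version of this experiment from a single machine, the sketch below sends the same query twice with different User-Agent headers (one desktop, one Android) and compares the raw responses. It only approximates the device difference, ignores cookies and signed-in history, and Google may serve a consent page or block automated requests, so treat it purely as an illustration; the query string and header values are arbitrary.

```python
import requests  # third-party package: pip install requests

DESKTOP_UA = ("Mozilla/5.0 (Windows NT 10.0; Win64; x64) "
              "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Safari/537.36")
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 14; Pixel 8) "
             "AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0 Mobile Safari/537.36")

def fetch_results_page(query, user_agent):
    # Google may block or redirect automated requests; this is only a sketch.
    resp = requests.get("https://www.google.com/search",
                        params={"q": query},
                        headers={"User-Agent": user_agent},
                        timeout=10)
    return resp.text

query = "best hiking boots"
desktop_html = fetch_results_page(query, DESKTOP_UA)
mobile_html = fetch_results_page(query, MOBILE_UA)

# Even without parsing the pages, differing sizes hint that the two "devices"
# were served different results; a fuller comparison would parse result titles.
print(len(desktop_html), len(mobile_html))
```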

This is all well and good, but you may be asking how all of this ties back to the original question of virtual adjudication and sentencing.

All of these are now part of our daily lives, and all contribute to our larger digital profile, and our digital footprint. All of these use the same basic concepts and tools in creating a virtual you.

In 2016, ROSS, a virtual attorney, was “hired” as a researcher for bankruptcy cases by a private firm, and a lot of attention was paid to this as Artificial Intelligence entering the justice system. What most people are unaware of is how long such technology has already played a role in our courts.

Under the burden of a swell of criminal prosecutions and arrests in the 1980s, states began passing mandatory minimum sentences, standardized sentencing guidelines, reductions or elimination of parole, and other measures to “expedite the justice process” and relieve some of the “burden on the system” that came from prosecutors and defenders exercising discretion in adjudicating and sentencing offenses.

It is during this time that we see the first complex rules-based engines and technologies being introduced to law enforcement and the court systems.

Before long, the rules and guidelines grew faster than the tax code, and often in more painful ways.

As is often the case, the best of intentions is not, by itself, a sound basis for a process or a system. In twenty-five years of designing and implementing systems that use varying levels of rule engines and Artificial Intelligence, I have found that the people, and the considerations they weigh in making a decision, are what determine how the technology acts. We as people often overlook the many factors we use in making decisions every day, from what to have for breakfast to whether or not someone should go to prison for twenty years.

While these guidelines were intended to introduce standards in sentencing and take away some of the inequity, in fact many studies reflected the opposite. The Sentencing Project is but one of many organizations studying and looking to ensure that sentencing criteria are equitable and applied consistently. Government studies have recognized some of the issues in the sentencing process, and I’ll also point to the Sentencing Reform Act as an indicator and an attempt to make improvements in the system. The results are still up for debate.

Considering the complexity of building systems to determine equitable punishments, this model addresses only a fraction of what is needed to evaluate someone for release or parole. This brings us back to Northpointe’s COMPAS system, which combines a series of questions with demographic information. Once it is all said and done, the final decision is still made based on a combination of human configuration and machine learning. One issue that Artificial Intelligences have an even bigger problem with than people do is the idea that Correlation is not necessarily Causation. And this is but one such tool being used today.
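
COMPAS itself is proprietary, so no outside sketch can show its actual scoring. Purely as an illustration of how a questionnaire-plus-demographics risk score can work, here is a toy logistic-style model with invented features and weights; notice that it will happily reward any input that merely correlates with past reoffending, which is exactly the correlation-versus-causation trap described above.

```python
import math

# A toy risk score in the spirit of questionnaire-based tools.
# Features, weights, and the scoring form are invented for illustration only.
WEIGHTS = {
    "prior_arrests": 0.35,         # count of prior arrests
    "age_at_first_arrest": -0.04,  # younger first arrest -> higher score
    "unstable_housing": 0.8,       # 1 if yes, 0 if no
    "unemployed": 0.6,             # 1 if yes, 0 if no
}
BIAS = -2.0

def recidivism_risk(answers):
    """Map questionnaire answers to a 0-1 'risk' via a logistic function."""
    z = BIAS + sum(WEIGHTS[k] * answers.get(k, 0) for k in WEIGHTS)
    return 1.0 / (1.0 + math.exp(-z))

person = {"prior_arrests": 3, "age_at_first_arrest": 19,
          "unstable_housing": 1, "unemployed": 1}
print(round(recidivism_risk(person), 2))  # about 0.42

# The correlation-vs-causation trap: any input that merely correlates with past
# arrest records (employment, housing, neighborhood by proxy) raises the score,
# whether or not it has anything to do with what the person will actually do next.
```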

But what about those who have never been on the other side of the law?

Have you ever said the phrase, “I have nothing to hide”? In Three Felonies a Day, Harvey A. Silverglate argues that every one of us commits acts, mostly unintentionally, that could put us into the system and ultimately wreck our lives. Consider, then, the myriad of surveillance and detection systems now in place, ostensibly to “keep us safe.” The media has well publicized the surveillance technologies the NSA uses on our communications.

The author Barrett Brown was arrested and sent to prison for linking to resources from a data breach in an article, something other media sources (including CNN) had already done in their own news stories. Ostensibly, because of his connections to Anonymous and his activism, motives could be attributed to his posting of the results of the data breach, but larger media outlets spoke out against his prosecution as well.

While it has become an internet meme, there are instances of authors and reporters doing research who received a knock at the door simply because their browsing history triggered the right keywords. Depending on where you are, the average person has their picture taken by ATMs, security cameras, and even people posting unrelated images to social media a minimum of five times a day; in metropolitan areas, the average is fifty times a day. In the largest cities, these averages are rapidly approaching a thousand times a day. Most of us also carry a tracking device that records everywhere we go, and may even record unintended audio and video, in the form of our phones.

We already see instances of things we post on Social Media having civil and criminal implications. Maryland parents lost custody of their two youngest children because of cruel pranks they pulled on the kids, and posted to their YouTube channel. Google “Social Media Posts that got people arrested” and look at the stream of articles that come up.

Based on the current rate of social and technological advancement, how far are we from a day when our phone beeps, and we look down to see our bank account has been charged for an unintended offense? Could you see receiving a ticket for speeding more than five miles per hour over the speed limit for more than thirty seconds? Jaywalking? Using profane or socially unacceptable language?

Or a day when we are told by our smartphones to report to the nearest officer or police station because of the heinousness of our offense? A drone appears at our side and disables us if we fail to comply?

Or what if we go to the level of Minority Report’s future crime? We already see the foundational technologies being used in sentencing and parole hearings to estimate the likelihood of recidivism. What about the day we have a fight with our spouse, then leave the house, go to the hardware store, and buy a shovel, a tarp, rope, and duct tape? Will the police show up to do a wellness check because our cellphones recorded elevated voices, our fitness trackers showed elevated heart rates, and our shopping triggered watch-list items?

What if I now tell you the fight was about going camping?

We currently live in an age where technology is, and has been, outpacing society, the courts, and the law for decades. Like a hammer, technology is nothing more than a tool; it is the intent of the user, and how it is deployed, that determines the outcome. Like a child, technologies have to learn, grow, and mature, as do the people who live with them. I believe it is rare that people use these tools maliciously and with intent, but we do see the unintended consequences of the influences and prejudices of the people who design and implement them.

Looking at the number of projects from entities including Google, IBM, and DARPA, not to mention open source projects like OpenBCI, it is not a matter of if, but when, it will be commonplace to no longer carry devices like our cell phones. The technology will be built into our bodies and minds. We will lose our remaining sense of privacy, such as it is today.

Not only will everything passing through our physical senses be subject to capture and recording, our innermost thoughts and ideas will be as well. We are entering an age where we will regularly interface with Artificial Intelligences, and we won’t be able to lie to them. More likely, we will be unable to know when they are influencing us.

In these days of strict rules and zero tolerance, it is up to each of us to be aware of the world around us and how we interact with it. We have to know and understand that technologies the majority of the population is barely aware of, even as science fiction, are rapidly becoming reality. We will see many people propose using neural interfaces and other technologies integrated with our bodies as a means of monitoring, and even modifying, the behavior of criminals in the future.

How we define crime, anti-social behavior, and society in general will all have to change. We often accept invasive laws and technologies in exchange for convenience. My question for you: how far are you willing to go, and who will you be when you get there?
