The Future of Silicon Valley

An excerpt from Beyond the Valley

In his new book, Beyond the Valley, technology scholar Ramesh Srinivasan delves into the unseen web of technology that winds its way through our daily lives, connecting—and disconnecting—us from each other. We may rely on services like Amazon Prime or Google’s search engine, but as Srinivasan points out, the process is one-way. We’re not asked for our opinions, only our data. But is a democratic Internet even possible? Srinivasan argues that a better Internet is one that is open and inclusive, especially to those who are most likely to be exploited. In the excerpt below, he offers two chilling examples of how technology—in these cases, tools employed by the criminal justice system—can go awry.

The COMPAS system, developed by the Northpointe Company, has received significant scrutiny, thanks to the in-depth reporting of two journalists from the nonprofit organization ProPublica. They first chronicle the case of Brisha Borden, an eighteen-year-old African American girl without a criminal record. In 2014 Borden and her friend picked up a kid-sized scooter and bicycle on the sidewalk, valued at $80, and started to ride the vehicles down the street. A neighbor spotted them and called out that the bike and scooter belonged to her six-year-old son. The girls stopped riding and walked away. But another neighbor who witnessed the event had already called the police, who arrested Borden and charged her with burglary and petty theft.

Compare that to the case of a forty-one-year-old white male, Vernon Prater, who had an armed robbery conviction and had served multiple years in prison. Prater was apprehended for shoplifting merchandise worth $86.35 from a Home Depot store. The COMPAS system concluded that Borden, the teenage black girl (who had committed only a misdemeanor as a juvenile), was at higher risk for recidivism than Prater, the white man convicted of a felony.

The system’s determinations suggest that race, age, and gender weigh more heavily in judgments of potential criminality than a felony conviction and prison time. COMPAS has also flagged black defendants as twice as likely to commit future crimes as white defendants, regardless of their prior criminal records. Based on data from Broward County, Florida, its predictions of violent crime have been found to be incorrect 80 percent of the time. Describing the system, Mark Boessencker, a judge in Napa County (California) Superior Court, where the system has also been used, explained: “A guy who has molested a small child every day for a year could still come out as a low risk because he probably has a job. … Meanwhile, a drunk guy will look high risk because he’s homeless.”

The ProPublica study also included the case of Paul Zilly, a forty-eight-year-old construction worker sent to prison for stealing a lawn mower and some tools. COMPAS rated Zilly with a high score for violent recidivism, even though Zilly had been working with a Christian pastor to address problems that influenced his decision to steal. Despite this proactive attempt to reform, the COMPAS system doomed him to a more punitive sentence than the one he might have received had the judge not seen the rating. The issue, Zilly explained, extends beyond innocence or guilt to concern power over one’s own life: “Not that I’m innocent, but I just believe people do change.”

Questions such as “Was one of your parents ever sent to jail or prison?” and “How often did you get in fights while at school?” form the basis for the system’s algorithmic rating formulas. Puzzlingly, many COMPAS supporters insist that the answers to such questions are not highly correlated with race, despite social science research showing that race plays a role in determinations of criminality. In response to this research we often hear another question: Is this not better than a racist judge? This question presumes that our only alternatives are racist judges or racist algorithms. What if instead we demanded other technologies that support the world we aspire toward, a world where racial justice is prioritized and racial bias is eliminated? What if the most criminalized populations in the country were brought in to help design and implement these technologies?

Predictive Policing and Racism

The Los Angeles Police Department (LAPD) has incorporated the PredPol (predictive policing) system since 2014 in over a third of its divisions. Instead of using a narrative crime report to determine whether a given crime is gang-related, PredPol relies on a “partially generative neural network” to produce, in effect, an algorithmically written crime report, considering quantitative, less “messy” data such as the number of suspects, the primary weapon used, and where the crime took place. A police department then draws upon these three criteria to provide a map of what it deems to be gang territory.

In an article for the journal e-flux, Jackie Wang explains in greater detail: PredPol computes jurisdiction maps covered with red square boxes that indicate where crimes are likely to occur and at what times throughout the day. Police can then patrol these “spatial” zones in the flesh. Wang raises a number of “what happens next” questions, some already troubling to civil rights groups and critics of predictive policing:

What is the attitude or mentality of the officers who are patrolling one of the boxes? When they enter one of the boxes, do they expect to stumble upon a crime taking place? How might the expectation of finding crime influence what the officers actually find? Will people who pass through these temporary crime zones while they are being patrolled by officers automatically be perceived as suspicious? Could merely passing through one of the red boxes constitute probable cause?

Wang has also written Carceral Capitalism, a compilation of essays on topics related to technologies of control (both inside and outside prison) that affect perceptions about racism and crime. She notes how the PredPol system evolved from software funded by the Pentagon and used in Iraq to track insurgents and predict casualties. Jeffrey Brantingham, a UCLA anthropology professor and PredPol’s co-developer, holds an evolutionary view of criminals as “modern-day urban foragers whose desires and behavioral patterns can be predicted.” Such a model, says Wang, “appeals to our desire for certitude and knowledge about the future.” But how might satisfying that desire ultimately work against us?

Let’s go back to examine PredPol’s selling point: that it avoids “messy” crime scene descriptions. This implies that the system’s algorithmically formulated and visualized data represents a neatly reflected (and therefore true) reality. Instead, says Wang, such data “actively constructs our reality.” The system thus ignores the subjectivities and contexts that shape specific real-world events, events that are admittedly more difficult to process and compute using objective learning algorithms. And what if the only data behind the system’s prediction shows that police have patrolled that corner before, perhaps because it’s in a poor neighborhood?

PredPol consistently claims that it can be linked to a decrease in crime in cities that have implemented it. But crime rates have been decreasing steadily throughout the United States since the 1990s, and there is little evidence correlating any of that decrease with PredPol. Moreover, California’s gang database, the very source of the so-called objective information upon which the algorithms are based and modeled, was found to be full of errors. It mistakenly included, for example, the names of forty-two gang members whose recorded birth dates made them one year old or younger. …

Both PredPol and COMPAS illustrate how private companies have developed technologies that impact our public institutions and lives. Both are “intelligent systems” that disproportionately target racial and sexual minorities as well as those of low socioeconomic status, using secretive data sets. Because they are seen as “statistical” or “mathematical,” it becomes very difficult to make prejudice visible. Both systems, and therefore the courts and cops using them, do not take into account the impact of incorrect judgments or predictions. Nor do they consider the amount of pain their decisions may inflict on individuals and communities who are already racially profiled and placed at the margins of social and economic power. But this myopic viewpoint ignores alternative values that we must safeguard: complexity, social context, civic decision-making, and an empathic approach that sees us all as connected. And so, instead of further criminalizing vulnerable communities with technology, we can design tools in the image of compassion and restorative justice.

Excerpted from Beyond the Valley: How Innovators around the World are Overcoming Inequality and Creating the Technologies of Tomorrow by Ramesh Srinivasan. Copyright 2019. Reprinted with permission from The MIT Press.


