Fighting Racist Surveillance in Detroit


Flashing green lights let you know you’re being watched

by Bill Wylie-Kellermann

Reprinted with permission from Sojourners, March 2020, (800) 714-7474, www.sojo.net.

WE GATHERED THIS fall on the steps of St. Peter’s Episcopal Church. Summoned by the Detroit chapter of Black Youth Project 100, we were preparing to march a mile-long stretch of gentrified Michigan Avenue, which intersects there. I had served the church for 11 years as pastor, and over the last dozen or so years this Catholic Worker neighborhood had been invaded by $400,000 condos, plus destination bars and restaurants. Among others, guests at our Manna Meal soup kitchen and Kelly’s Mission, largely black, are stigmatized and made unwelcome.

But the focus of the march was more than the gentrified influx: Accompanying gentrification has been a heavy increase in electronic surveillance by the so-called Project Green Light, where businesses pay for street cameras that feed to a Real-Time Crime Center deep in police headquarters. In areas like this, as with downtown, high surveillance makes white people feel safe moving in or just shopping and dining.

Dan Gilbert, a mortgage, finance, and development billionaire, owns more than 100 buildings downtown, all covered with the cameras of his own Rock Security. They feed to his corporate center, but also to the police crime center. Detroit Public Schools has its own central command, which is not yet tied to the police center. Likewise, a separate system of streetlight cameras, and even drones, is under development. California recently banned the use of facial recognition software in police body cams, which would convert an instrument of police accountability into a device of surveillance. Michigan has not.

Most Project Green Light cameras are in Detroit neighborhoods, on gas stations, bars, party stores, churches, and clinics. There, each is marked with the constant flash of green lights that say: You are being watched. When the city demolished a house behind ours, we had to hang light-impervious curtains to prevent green pulsing on our bedroom wall from the funeral home a block away.

For nearly two years, unacknowledged, facial recognition software was being employed by the police department. The software can be used with a “watch list” of persons of interest, plus access to mugshots, driver’s licenses and state ID photos, and more—perhaps 40 million Michigan likenesses. A string of recent studies indicates the software can be inaccurate for dark-skinned people, creating false matches and prosecutions. A federal study released in December, according to The Washington Post, showed that “Asian and African American people were up to 100 times more likely to be misidentified than white men.” Thus, resistance to the system quickly mounted in Detroit, which is 80 percent black.

There are two surveillance-industrial complexes. It’s said the days are coming when nations will need to choose which surveillance network to join, the American or the Chinese version—parallel to military alliances. China has 200 million cameras focused on its people, but with 50 million, the U.S. has more cameras per capita. The best facial recognition technology is coming out of Chinese firms, but these are banned in the U.S. because of their connection to “human rights abuses.” The Detroit contract is with DataWorks Plus.

Orwell would be astonished

THE BLACK YOUTH PROJECT 100 march along Michigan Avenue was more like a dance parade, thick with drums, rhythm, and chant: “Black Out Green Light.” This stretch is a so-called Green Light Corridor, where all businesses participate, marked with modest and tastefully lighted signs. At each venue we passed—coffee house, bar, restaurant, bagel shop—three or four folks would go inside with signs and leaflets to chant and speak. A nonviolence training a couple weeks prior had prepared participants for the increasing police presence as the action proceeded. Squad cars blared sirens and officers stood by doorways, never blocking entrance, but threatening. Back at the church Rashad Buni of BYP100, which seeks “Black Liberation through a Queer Feminist Lens,” called for surveillance funds to be spent instead on neighborhood investment—foreclosure prevention, job training, clean affordable water—as the real source of community security. We all want safe neighborhoods. The question he puts is: By what means?

The downtown vs. neighborhood experience of Project Green Light reflects a paradox in the new surveillance. On the one hand, it is often not just normalized, but voluntary, welcomed, even paid for by those surveilled. Orwell would be astonished that we not only sit in front of laptops that stare back at us but pay for higher and higher resolution cameras that we carry around self-photographing and broadcasting our locations. Folks pay to turn their DNA codes over to Big Data corporations. We hand off our shopping lists in exchange for membership sale prices. For facial recognition, we tag photos of ourselves and friends to confirm what the software has already discerned or take the “10-year challenge” on Facebook, potentially teaching algorithms to fine-tune the processes of aging. We put on watches that track and broadcast our steps, heart rates, and blood pressure. We report our emotional states with elaborate emojis. We eagerly clamor for 5G, which will integrate all our personal data, from printer usage to thermostat settings and security controls. All of these are resistible to one degree or another, and I resist them as I’m able.

On the other hand, surveillance occurs without consent, notification, or permission. One big backdrop, of course, is the Patriot Act, which allowed the NSA (National Security Agency) to gather and store the phone calls, emails, and text messages of U.S. citizens, and even gain backdoor access to your camera. With Project Green Light, anyone even passing by equipped premises is subject to capture and facial recognition. Business owners were never notified that their video feeds were being scanned by DataWorks Plus.

Among the businesses themselves, some have been eager for it, and others feel jammed as though it were a system of extortion, paying for police protection (up to $6,000 upfront plus a $1,600 annual fee) and setting up tiers as to who will receive prompt response. Detroit Mayor Mike Duggan has announced plans to require it for businesses open late at night. In Baltimore, which also has an aggressive facial recognition program, police scanned the crowds of those protesting the death of Freddie Gray, turning up matches and pulling people with outstanding warrants. It is only two years since Black Lives Matter was reputedly removed from the FBI list of “black identity extremist” organizations.

Though surveillance has always had dimensions both public and private, that distinction is all the more blurred. Project Green Light is a police department program, but it is contracted and funded by private businesses. Ring, Amazon’s surveillance camera division, which began with smart doorbells, has fashioned plans to use facial recognition software in its network of home security systems, creating neighborhood watchlists. Though the program is still in development, connections to local law enforcement are more than contemplated. Imagine residential feeds tied to real-time crime centers. Amazon’s related app, “Neighbors,” creates a residential surveillance social network to share and chat about suspicions. Notice that “neighborhood,” which needs to be built on relationship and trust, is thereby turned to signify a circle of fear and suspicion. A review of the app last year by Motherboard found that in a given period the majority of those reported as “suspicious” were people of color.

Being seen, not watched

THROUGH THE DETROIT Community Technology Project, Tawana “Honeycomb” Petty was instrumental in conducting a study on community experience and perception with respect to personal data capture and storage. Petty wrote:

[Detroiters] expressed experiencing difficulties in having a decent quality of life, based on the tracking and sharing of their information. If they couldn’t afford water, it followed them. If they couldn’t afford to pay tickets, it followed them. If they got into debt and missed payments on utilities, it followed them. They expressed feeling tracked and monitored. That one bad experience prevented good experiences in other areas. They expressed not feeling seen as human, only as a trail of data and bad decisions … This feeling of being watched and tracked, but not seen, was further exacerbated by the implementation of mass surveillance in Detroit.

In sum, she says, “Detroiters want to be seen, not watched.”

I read in that a theological assertion. Simply put, God sees; the powers-that-be watch. No coincidence that in Hebrew, the word for city, ‘iyr or ‘iyr re’em (arguably the earliest naming of the powers biblically), also means the “Watching Angel.” Is this not how Israel in its pastoral nomadism would spiritually name the walled ramparts of the ancient city-state? Its eye turned outward upon them? Constantly scanning the horizon for enemies, outlaws, and the unwelcome? But now the great eye turns inward as well.

There’s no question that historically the omniscience of God, combined with a fiery judgment, has been employed by monarchy and empire as rationale and extension to its own surveillance and sanction. The principalities and powers covet God’s knowing. Ultimately, they aspire to pre-empt, usurp, and supplant the omniscience of God, a presumption rooted in fear and objectification. (Technically, that’s blasphemy). Divine omniscience, however, the knowing of all things, is an estate of steadfast love. As the psalmist put it, “O God, you have searched me and known me. You know when I sit down and when I rise up … Where can I go from your spirit? Or where can I flee from your presence?”

Surveillance capitalism

WRITING IN THE 1970s, William Stringfellow, preeminent theologian of the powers, counted surveillance a stratagem and tactic of the demonic. He counted it as a debilitating method of rule. By his lights, “the prevalence of industrial and commercial espionage, the monitoring of shoppers” had so habituated human beings to being watched that “tolerance of citizens toward political surveillance and the loss of privacy” was normalized. “The kind of open society contemplated by the First Amendment seems impossible—and, what is more ominous, seems undesirable—to very many Americans.” Prescient as he was, he’d still be shocked by the extent to which “the private” has been not just invaded, but plundered and commodified.

The totalization of surveillance by the powers is not only technological but economic. A capitalist system devours new territory. Big Data is a realm currently unregulated and virtually lawless. As capitalism draws more and more things into the market, it commodifies them. Hence, as nature is turned to land and specifically real estate, or work becomes wage labor, so our personal and private human experience (collected and passed along by all those apps) becomes behavioral data rendered to be bought and sold, for the purpose of predicting behavior and even controlling it. Donald Trump was a buyer through Cambridge Analytica in the 2016 election, tailoring ads personally to Facebook users based on their own preferences and personalities—nudging opinions and votes, perhaps more so than Russian meddling. Read the fine print: Privacy policies to which we click “agree” are in actuality surveillance policies. We think we are searching Google, but it is actually searching and mining us. All of this has been painstakingly detailed by Shoshana Zuboff in her book The Age of Surveillance Capitalism.

In surveillance studies, a primary historical image comes from 18th century utilitarian philosopher Jeremy Bentham, who conceived the Panopticon (“all seeing”). The design was for a prison in the round, all its cells opening inward to the center, where a tower with small windows could keep watch on convicts. At night cells would be lit by a combination of lanterns, mirrors, and windows, illuminating them round the clock. The tower did not even need to be occupied at all times, since the important thing was simply for prisoners to imagine and understand they were being watched. Moreover, Bentham envisioned it as a private prison, run by himself and employing the free labor of the prisoners for profit. Take it as one metaphor for the surveillance society.

Racist technology?

IN HER BOOK Dark Matters, Simone Browne reads the panopticon in light of two other developments. One was the slave ship, a mobile, seagoing prison—precursor to the modern land-based prison. The other was the 18th century lantern laws of New York City that required enslaved people (black, mixed-race, and Indigenous) to carry candle lanterns if they went about the streets after sunset unaccompanied by a white person. They were compelled to show their faces on demand, revealing who was in place and who was out. As Browne says, the legal framework of stop and frisk goes way back.

Can a software technology be racist? Of course it can. Just as white supremacy may be structured into institution, law, or policy, so it can be embedded in algorithms and the constructions of technocracy. Algorithms learn from human behavior, including its biases. Try teaching one to recognize suspicious behavior in front of the house and see who turns out most often to be “suspicious.”

In the present instance, white supremacy prevails because whiteness is the normative baseline shading for recognition software. The error rate for people of color is especially dangerous in connection with a criminal justice system structurally stacked against them. The mugshot database is already disproportionately black and brown because of the way communities are policed. Mismatching multiplies criminalization. Last year, in an ACLU test, Amazon’s recognition software mismatched 27 professional athletes to individuals in a mugshot database. As Eric Williams of the Detroit Justice Center put it, “If it were white people who were misidentified at the same rate as black people … by this technology, we wouldn’t even be having this discussion. People would simply say, ‘Oh, it doesn’t work.’”

The weaponization of facial software

FACIAL RECOGNITION TECHNOLOGY can readily be weaponized. The U.S. Army has put out a request to contractors for development of a new generation rifle with facial recognition capability. Two years ago, Stuart Russell of the University of California, Berkeley, who wrote an influential book on artificial intelligence, produced a seven-minute fictional video called “Slaughterbots,” in which a swarm of tiny drones equipped with facial recognition and firepower are released to accomplish targeted assassinations. Fearmongering? Such devices are possible, and The New York Times has confirmed witnessing a military test of Styrofoam bots on that scale. It’s simply a matter of combining existing technologies, driven in this case less by military desire than by commercial interests.

There is no federal legislation regulating facial recognition software. Illinois was the first state to regulate it. Last year the state supreme court ruled that Illinois residents could sue companies under the 2008 Biometric Information Privacy Act for collecting such data (face and fingerprints) without permission. Facebook is in trouble there. San Francisco was the first major city to ban government use of facial recognition software. Somerville, Mass., and Oakland, Calif., have followed.

Building “relational security”

DETROITERS HAD HOPED to ban the use of facial recognition software here. Last summer the focus was on the Board of Police Commissioners, ostensibly a citizen review board that oversees police operations. When the software contract came to light after nearly two years of secret operation, the development of a policy to regulate usage fell to the board. Their meetings, often lively with community participation, now became regularly raucous. In one, the chair even ordered the arrest of another commissioner. A coalition—including the ACLU, Color of Change, CAIR Michigan, Detroit Justice Center, and the Detroit Hispanic Development Center—publicly urged the board to reject the use of this technology.

By September, a policy document had been refined. To confirm it meant sanctioning and accepting the technology already in place. In what has become emblematic of the commission’s attentiveness to community concerns, the board first approved the policy and then opened the floor to public comment. My own remarks to the commission, I confess, were rancorous, accusing the commission not merely of rubber-stamping but of covering up two years of virtual impunity by the mayor and the chief—employing the technology without public notice or accountability.

The struggle continues. Other systems such as drones and streetlight cameras will require their own policy directives by the commission. Though the city council’s record of standing up to the mayor is thin, it will take up the matter, opening the floor to more public action and discussion. A Detroit representative has introduced state legislation that would declare a five-year moratorium on the use of facial recognition, to let policy catch up with technology. And U.S. Rep. Rashida Tlaib of Detroit has introduced federal legislation prohibiting its use in public housing. Nonviolent actions of resistance will need to become more direct and creative.

Corollary to the theological distinction between being watched and being seen, there is an ethic of sorts as well. As means and end, it is the beloved community that sees and trusts, knows, and loves. That is the base from which, and the method by which, we struggle.

Several years ago, in response to events of police brutality, Peace Zones for Life were organized across the city—introducing community awareness, intervention, and conflict resolution as alternatives to aggressive policing for neighborhood security. One of those zones on the east side has now pioneered a project called “Green Chairs Not Green Lights.” Distributing chairs for elders and others to use on their front porches to keep a loving eye on the block, they propose neighborly solidarity and relational security instead of cameras and software. A principle of self-reliance and self-determination. An act of seeing and knowing. An ethic of beloved community.

Bill Wylie-Kellermann, a Sojourners contributing editor, is a community activist, author, teacher, and pastor in Detroit. Among his recent books, Principalities in Particular: A Practical Theology of the Powers That Be (Fortress Press) addresses surveillance.