
MA House Applies Crusher To Senate’s Police Reforms

Yesterday, the Massachusetts House released its own version of a police “reform” bill (https://malegislature.gov/Bills/191/H4860).

TL;DR:
The House bill is, overall, far weaker than the Senate bill. We have till 1pm tomorrow to persuade House members to submit amendments. We want to see the Senate language on qualified immunity, school resource officers, police stops, and military equipment approvals in the House bill. We like the House’s face surveillance language better than the Senate’s. We don’t want, or need, yet more blue-ribbon commissions to consider at length What, If Anything, To Do. It’s quite clear what the problem is:

The police spy on, shoot and hurt people without probable cause, often for racist reasons. People who do that shouldn’t be police, and the people it’s done to should get to sue those who did it to them.

There’s not much time. You can find your House Rep’s phone number at https://malegislature.gov/Search/FindMyLegislator. Please call this morning!

Here’s a quick summary of the key differences:

COMPARISON OF REFORM BILLS                                                             S.2800               H.4860
Police rape of residents outlawed?                                                     Yes                  Yes
Qualified immunity limited?                                                            Yes                  No
School info sharing with “gang” database limited?                                      Yes                  Yes
Government use of face surveillance banned?                                            Temporary, plus RMV  Permanent, minus RMV
Local discretion on whether to have police in schools?                                 Yes                  No
Local elected official approval process for military equipment acquisition by police?  Yes                  No
Chokeholds outlawed if intent or result of unconsciousness or death?                   Yes                  Yes
No-knock warrants limited?                                                             Yes                  Yes
Data collection on police traffic and pedestrian stops to prevent profiling?           Yes                  No

In other words, the House bill has stronger provisions on face surveillance, but strips key language from the Senate version on qualified immunity, school resource officers, military equipment for police, and data collection on traffic stops. And as a last slap in the face to the Black community in Massachusetts, the House bill takes funds designated for securing racial equity in cannabis dispensary licenses, and redirects them to yet more police training.

At Digital Fourth, we would support a bill stronger than the Senate bill. Our optimal bill here would:

  • outlaw chokeholds, tear gas, other chemical irritants, the use of dogs at protests, and police rape;
  • end qualified immunity;
  • end schools’ information sharing with the police and ICE;
  • ban school resource officers;
  • end the 1033 military equipment acquisition program;
  • end no-knock warrants;
  • end civil asset forfeitures;
  • reverse the delays introduced by amendment in the Senate to the decertification process; and
  • still collect data on all police stops.

The Senate bill at least represented progress, especially with the House provisions on face surveillance added. Therefore, we support all amendments adding the Senate language back in, excepting those relating to face surveillance. But the House bill – again, excepting the face surveillance provisions – is a betrayal of everyone genuinely concerned for equal justice, and deserves to wither in the fire. Here’s what happens next.

You have till 1pm tomorrow to persuade your House member to submit or endorse amendments to the House bill. Then, House leadership will allow debate, likely on Tuesday or Wednesday, and a vote on the amendments and the bill. Then, the House and Senate will create a conference committee to try to agree on common language. As you can see above, there are a lot of key differences. If the conference agrees on language, the bill goes back to both bodies for a vote, and then, if passed, it goes to the Governor’s desk. If the bill is not signed by the end of the session, currently scheduled for July 31, it dies for this session, and would be reintroduced when the new session begins in January.

Good luck, and may the Fourth be with you!

Boston Just Banned Face Surveillance. What Now?

The Boston City Council voted unanimously on June 24 to ban government use of face surveillance technologies. Face surveillance systems are systematically worse at recognizing women and people of color, partly because their training datasets contain a preponderance of white, middle-aged male faces. Nothing about our criminal justice system requires the adoption of a technology that further biases arrests and charging decisions against Black people.

But if the technology ever somehow overcomes that, and becomes one hundred percent accurate, it becomes immensely more terrifying. In many cities across the world, the police already track wherever you go in public, and the authorities can easily form a picture of your habits and activities, to keep in their pockets for whenever you’re accused of a crime – or for whenever you become inconvenient to them in other ways. Now, thanks to years of work by the #BosCops and #PressPause coalitions, which we’ve been a part of from the start, Boston will not be one of those cities. This matters.

Now we turn to what’s next. Face surveillance is a unique kind of threat, but the police should not deploy any surveillance technology without public hearings, and without the knowledge and approval of local elected officials. Those officials should have the power to approve or deny the use of such technologies. The surveillance state needs a little more sand in its gears, to stop the continuous ratchet of more and more invasive technologies. Next month – probably – the City Council will consider a surveillance ordinance that would do all that. Similar ordinances are already on the books in Cambridge, Somerville, Brookline, Lawrence, Northampton, and (as of July 1) Easthampton too, and many other municipalities across the nation.

But the face surveillance ban itself, like any ordinance, still has loopholes and limitations. We’ve written to the Boston City Council to lay out some of those problems:

  • There won’t be a public network of City-owned cameras; but what happens if there’s a private network, and the City simply requests that footage?
  • The City has the authority to regulate whether and how private businesses deploy face surveillance, but the ordinance doesn’t yet exercise it. The city-wide ban should be amended to spell out how sports stadiums like Fenway Park and the TD Garden, and retail stores like Home Depot, Macy’s, Best Buy and Kohl’s, will be permitted to use face surveillance software, and to require them to disclose that use to the public.
  • And we still don’t know whether the MBTA uses facial recognition; City agencies, including the police, should need a warrant to obtain its footage.

Here’s our testimony on these points. And if you’d like to help with our continuing municipal campaigns to rein in surveillance in Massachusetts, email us today!

https://warrantless.org/wp-content/uploads/2020/07/Digital-Fourth-response-to-Facial-Recognition-ban-070720.pdf

REAL ID and Islamophobia: Resisting Our Legibility To The State


In most parts of Europe, since the totalitarian governments of the inter-war period, pressure from governments to make their citizens legible has been hard to resist. Germany now has universal biometric ID cards for all adults, which police have a right to demand to see, irrespective of whether they have probable cause to suspect you of a crime; 24 of the 27 EU states have mandatory national ID cards.

Biometrics matter, because outside of science fiction, they can’t be changed. During refugee crises, deep anxieties – Who are these people? Why are they coming here? – induce governments to pin people down to an unchanging identity, like bugs in a biologist’s cabinet.

This is a fundamental difference between mainly-autochthonous and mainly-settler societies. Ideologically, the United States came to be out of westward conquest, by people eager to refashion themselves away from the religious and social strictures of more settled societies. At Ellis Island, you could change your name; on the frontier, a white man could be whoever he declared himself to be. As Walt Whitman wrote, “Of every hue and caste am I, of every rank and religion, / A farmer, mechanic, artist, gentleman, sailor, quaker, / Prisoner, fancy-man, rowdy, lawyer, physician, priest. / I resist any thing better than my own diversity, / Breathe the air but leave plenty after me.” Settler societies are supposed to “leave plenty” of air to breathe for those who come to settle after them; they’re supposed to leave room to self-refashion. Anonymity, pseudonymity and the ability to erase your tracks bolster your power versus the state.

Which brings us to Donald Trump, and his calls for registration of suspiciously Muslim people; and which also brings us to efforts here in Massachusetts to increase our legibility by implementing the REAL ID Act.

Read More →

Drowning in Data, Starved for Wisdom: The surveillance state cannot meaningfully assess terrorism risks


Pity the analysts.

The NSA has just vigorously denied that its new Utah Data Center, intended for storing and processing intelligence data, will be used to spy on US citizens. The center will have a capacity of at least one yottabyte, and will provide employment for 100-200 people. With the most generous assumptions [200 employees, all employed only on reviewing the data, only one yottabyte of data, ten years to collect the yottabyte, 5GB per movie], each employee would be responsible on average for reviewing 500 million terabytes, or approximately 23 million years’ worth of Blu-ray quality movies, every year.
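The per-analyst figure above is just arithmetic on the stated assumptions. A minimal back-of-envelope sketch in Python, using the article’s own inputs (1 yottabyte, 200 employees, 10 years, 5GB per movie) plus an assumed 2-hour average running time, which is not stated in the text:

```python
# Back-of-envelope check of the per-analyst data load, using the
# article's assumptions: 1 yottabyte collected over 10 years,
# reviewed by 200 employees, at 5 GB per Blu-ray-quality movie.
YOTTABYTE = 10**24          # bytes (decimal definition)
TERABYTE = 10**12           # bytes
GIGABYTE = 10**9            # bytes

employees = 200
years_to_collect = 10
movie_bytes = 5 * GIGABYTE  # ~5 GB per movie
movie_hours = 2             # assumed average running time (not in the article)

# Share of the total haul each analyst must review, per year
bytes_per_employee_year = YOTTABYTE / (employees * years_to_collect)
tb_per_employee_year = bytes_per_employee_year / TERABYTE

# Express that volume as continuous movie-watching time
movies_per_employee_year = bytes_per_employee_year / movie_bytes
movie_years = movies_per_employee_year * movie_hours / (24 * 365)

print(f"{tb_per_employee_year:,.0f} TB per analyst per year")
print(f"≈ {movie_years / 1e6:.0f} million years of movies, every year")
```

Which works out to 500 million terabytes per analyst per year – about 23 million years of continuous viewing.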

Must…keep…watching…my…country…needs…me

This astounding and continually increasing mismatch shows that we are well beyond the point where law enforcement is able to have a human review a manageable amount of the data in its possession potentially relating to terrorist threats. Computer processing power doubles every two years, but law enforcement employment is rising at a rate of about 7% every ten years, and nobody’s going to pay for it to double every two years instead. Purely machine-based review inevitably carries with it a far higher probability that important things will be missed, even if we were to suppose that the data was entirely accurate to begin with – which it certainly is not.
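To see how quickly that mismatch compounds, here is a rough projection using the two growth rates quoted above – capacity doubling every two years versus staffing growing about 7% per decade. The `growth` helper is our own illustration, not anything from the article:

```python
# Rough projection of the capacity-vs-staffing mismatch described above:
# processing capacity doubles every 2 years, while law-enforcement
# staffing grows ~7% per decade (both rates from the article).
def growth(years, doubling_period=None, decade_rate=None):
    """Multiplicative growth factor after `years` years."""
    if doubling_period is not None:
        return 2 ** (years / doubling_period)
    return (1 + decade_rate) ** (years / 10)

for years in (10, 20):
    capacity = growth(years, doubling_period=2)
    staffing = growth(years, decade_rate=0.07)
    print(f"after {years} yrs: capacity x{capacity:.0f}, staffing x{staffing:.2f}")
```

After a decade, capacity is up 32-fold while staffing is up 7 percent; after two decades, the gap is roughly a thousand to one.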

So why is anybody surprised that Tamerlan Tsarnaev, the elder of the Boston Marathon bombing suspects and one of around 750,000 people in the TIDE database, was not stopped at the border? That facial recognition software wasn’t able to flag him as a match for a suspect? That the fusion centers, intended to synthesize data into actionable “suspicious activity reports”, flag things too late for them to be of any use? That the Air Force is panicking a little at not having enough people to process the data provided by our drone fleet?

It’s in this context, then, that we should understand the calls for more surveillance after the Boston Marathon attacks for what they are. More cameras, more surveillance drones and more wiretapping, without many more humans to process the data, will make this problem worse, not better. These calls are being driven not by a realistic assessment that surveillance will help prevent the next attack, but by the internal incentives of the players in this market. Neither the drone manufacturers, nor law enforcement, nor elected officials, have an interest in being the ones to call a halt. So instead they’re promoting automation – automated drones, automated surveillance, and automated email-scanning.

They are missing something very simple. We don’t need a terrorism database with 750,000 names on it. There are not 750,000 people out there who pose any sort of realistic threat to America. If the “terrorism watch list” were limited by law to a thousand records, then law enforcement would have to focus only on the thousand most serious threats. Given the federal government’s real and likely manpower, and the rarity of actual terrorism, that’s more than enough. If law enforcement used the power of the Fourth Amendment, instead of trying to find ways around it, it could focus more on the highest-probability threats.

Yes, they would miss stuff. That’s inevitable under both a tight and a loose system. But a tight system has the added advantages that it protects more people’s liberties, and costs a lot less.

UPDATE: With the help of a New Yorker fact-checker, the per-employee figure above has been corrected to “500 million terabytes”.
