It’s A Whole New World

Every two years, the Massachusetts legislature starts a fresh session. Here, we review bills on the top ten topics relating to surveillance, privacy and the Fourth Amendment, that have been introduced in the new session.

Please contact your legislators via https://malegislature.gov/Search/FindMyLegislator, to express your support, and to ask for theirs. Our thanks to Julie Bernstein for conducting the legislative research for this article.

1. Civil Asset Forfeitures: HD1780 / SD2388, HD1328, HD2128
2. Qualified Immunity Reform: SD1970
3. Oversight of Fusion Centers: HD2088
4. Commercial Data Privacy Protection: SD745
5. Restricting Law Enforcement Use Of Facial Recognition: HD2304 / SD750
6. Restricting Automated License Plate Recognition: HD428 & HD2360
7. Protecting Locational Privacy: HD3698
8. Protecting Biometric Information: HD3053
9. Protecting Browsing Information: SD1217
10. Safe Communities Act: HD2459 / SD1937

Summaries and explanations of each of these bills follow after the jump:

1. CIVIL ASSET FORFEITURES: HD1780 / SD2388, HD1328, HD2128

HD.1780 / SD.2388: An Act Relative to Forfeiture Reform

HD.1328: An Act Relative To Civil Asset Forfeiture Transparency And Data Reporting

HD.2128: An Act Relative to Civil Asset Forfeiture

Restore The Fourth’s Issue Brief on Civil Asset Forfeiture

The threshold for civil asset forfeitures (CAFs) in MA is the lowest in the country: mere “probable cause” that a crime was committed. Our state is notorious for seizing cash and vehicles from people who have committed no crime, and the Institute for Justice ranked us worst in the country for civil asset forfeiture policies.

Last year, a special legislative commission was convened to investigate civil asset forfeiture in MA. It requested civil asset forfeiture data from every District Attorney (DA) and every local law enforcement agency. The only response it received was from Suffolk County, which, in cataloging how the assets from its seizures and forfeitures were spent, listed 50% as going to “other”. H.D.1780 is an outcome of the recommendations of the Commission on Civil Asset Forfeiture.

H.D.1780 raises the evidentiary standard for CAFs by one level, to “a preponderance of the evidence”, which is more typical nationwide. In our state, DAs and local law enforcement keep all of the proceeds from forfeiture, which incentivizes seizures. H.D.1780 requires that all proceeds from seizures and forfeitures go to the Treasurer, who, after reimbursing all non-personnel costs associated with the seizure and paying liens, would deposit the remainder in the General Fund.

This bill also narrows a major loophole. Currently, police departments participating in joint task forces with the federal government (often cooperating in large seizures of contraband) are required by the federal government to devote the 80% of the proceeds that they receive to law enforcement purposes. This has enabled law enforcement to purchase surveillance technology like stingrays without any oversight, even where a local surveillance ordinance requires it. Under the new provisions, if federal law prevents the distribution of CAF proceeds to the General Fund, then police departments can no longer accept forfeited property or proceeds from the federal government. A remaining gap is that, while all joint seizures would have to be litigated by a local DA or the AG, seizures of U.S. currency worth more than $50,000 are excepted.

A report by Politico and WBUR about civil asset forfeitures in Worcester County revealed that, of the seizures of cash and property for which the Worcester DA’s office filed forfeitures in 2018, 1 in 4 either was not associated with a criminal conviction or was not even linked to a criminal drug charge, and another 9% had no publicly available court records. Among those, there were more than 90 instances where people lost money or cars, taken most often during traffic stops, frisks and home searches, even though there were no related drug convictions or drug charges. WBUR documented more than 500 occasions between 2016 and 2019 where funds were held by the DA’s office for ten years or more before officials tried to notify people. More than half of the funds seized between 2017 and 2019 were $500 or less. And when the county finally got around to notifying someone that their assets had not been legitimately seized and could be returned, it published only a small notice in the local newspaper.

Elsewhere in the state, there was a well-publicized case in which a vehicle belonging to Malinda Harris was seized after her son was suspected of using it in a crime. Harris had nothing to do with his crime and needed her car for work. Six years later, it was finally returned to her.

H.D.1780 would require that seizures and forfeitures occur only after a court convicts the suspect of a crime, with exceptions for lawful arrests and searches and for seizures of contraband. Police officers would be compelled to itemize everything that they seize, and they would be prohibited from seizing currency of less than $200 and vehicles worth under $10,000. A seizure that occurred before trial could be appealed via a hearing. Both H.D.1780 and H.D.1328 compel every law enforcement agency, including the state police, and all DAs to annually report all seizures and forfeitures, including those under federal jurisdiction, and the crimes associated with them. These would be entered by the executive office of administration and finance into a case tracking system and searchable public website.

H.D. 1328 requires that important additional information be reported including the outcome of any criminal charges, the details of all proceedings related to seizures and forfeitures, all case numbers and the zip code in which the seizure occurred. This granularity is crucial in view of the abuses that have occurred and the need to understand whether the new regulations adequately address these. Furthermore, whereas H.D.1780 requires that the data be reported to the AG, H.D. 1328 requires that all of the data also be reported to the Senate and House Committees on Ways and Means and the Joint Committee on the Judiciary.

H.D.2128 would raise the standard of proof for a civil forfeiture further than H.D.1780 would: instead of the Commonwealth having to prove by “the preponderance of the evidence” that the asset was associated with a crime, it would have to meet a standard of “clear and convincing evidence”. That standard or higher is the law in 28 states. The bill would also route all state forfeiture revenue into the Commonwealth Substance Abuse Prevention and Treatment Fund. It includes process improvements similar to H.D.1780, though less detailed than those in H.D.1328.

Digital Fourth supports these bills individually, and would support a consolidation of them in committee, using the standard-of-proof and revenue provisions from H.D.2128, the detailed process requirements from H.D.1780, and the detailed reporting requirements from H.D.1328. These bills should help to ensure that forfeitures occur only when the vehicle, asset, or realty was involved in a crime, that innocent owners do not lose their property, and that law enforcement agencies have no financial incentive to conduct seizures and forfeitures.

2. QUALIFIED IMMUNITY REFORM: SD1970

Qualified immunity reform was left out of Massachusetts’ 2020 police reform law, unlike in other states. Currently, Massachusetts imposes an unfeasibly high bar on civil rights lawsuits against state government agents, including police: plaintiffs must prove that the civil rights violation involved “threats, intimidation or coercion.” As a consequence, attorneys don’t take these cases, because they don’t expect to win; many plaintiffs can’t afford to pay an attorney unless they win damages.

S.D. 1970 stipulates that: “In an action brought under this section against a person or entity acting under color of law, proof shall not be required that the interference or attempted interference was by threats, intimidation or coercion.”

3. OVERSIGHT OF FUSION CENTERS: HD2088

This bill would require the Commonwealth’s “criminal intelligence systems” – the Boston Regional Intelligence Center, the Commonwealth Fusion Center, and others – to submit to regular outside auditing to ensure that they are complying with 28 CFR Part 23. This federal regulation requires that any information they hold on Massachusetts residents be based on reasonable suspicion of involvement in a crime.

It provides a private right of action to residents who believe that these entities have violated their privacy rights. It also requires the Commonwealth Fusion Center to publish the names of its privacy advisory committee, to have it meet quarterly, and to make its minutes public.

4. COMMERCIAL DATA PRIVACY PROTECTION: SD745

SD.745: An Act Establishing the Massachusetts Data Privacy Protection Act

This is a very comprehensive data privacy bill that covers large corporations, service providers, social media companies and data brokers that collect, process or transfer data. It requires the originating covered entity (CE), for example Google, to limit the data that it collects from you to only what is necessary to provide the service you want. The CE must also give you an easily accessible and user-friendly affirmative consent mechanism, in which you are told what data it collects, where that data goes, and for what purposes, and through which you can consent to or opt out of these uses of your data. The CE must communicate your preferences to all of the service providers (SPs), data brokers (DBs) or other third parties with which it shares your data, because they must comply with your preferences.

Each covered CE and SP must make publicly available an obvious and understandable privacy policy, including: a detailed and accurate representation of its data collection, processing, and transfer activities; the purpose of all data collected; the length of time that the data is to be retained; the data security practices implemented; every data broker or third party to whom the data is transferred; and several forms of contact information, so an individual can readily reach the CE or SP to make requests concerning their data.

If the covered entity makes any changes to the data it collects, shares or transfers, or sends your data to a new party, this must be communicated to you so that you can consent or opt out. You can change your data preferences and delete data twice a year without paying.

All CEs must allow individuals to access their data in a downloadable, portable, structured, interoperable, and machine-readable format and to make any corrections to inaccurate and incomplete data. Requests to change or delete your data should generally be honored within 30 days and you can make these changes twice annually for free.

Companies will have to report to the Attorney General (AG) how many requests they receive and how they have been handled. Any individual alleging a violation of their privacy rights under this act may bring “a civil action in the superior court or any court of competent jurisdiction” against the CE, SPs, DBs or other third parties. If a violation is found to have occurred, the plaintiff will be eligible for damages as well as an injunction or other relief and attorney fees.

DBs must register with the Office of Consumer Affairs and Business Regulation (OCABR), which will maintain a searchable database with information on what data each broker collects and transfers and how you can contact the data broker about removing or verifying your data, linked to a website provided by the DB where you can opt out of data collection. Failure of the DB to comply will result in a fine.

Each DB will also be required to provide the AG with an impact statement for any algorithms it uses that could have a disparate impact on any protected group or on individuals registered to a political party, along with the steps it is taking to mitigate the impact. The AG can take action against a CE or SP that fails to comply with the civil rights provisions.

Large data holders (DHs) must hire at least one privacy officer or data security officer and implement a data privacy program and data security program to safeguard the privacy and security of covered data. All CEs and large DHs must perform a privacy impact assessment that weighs the benefits of their data collection, processing, and transfer practices against the potential adverse consequences of such practices, including substantial privacy risks, to individual privacy, and must review how technologies are being used to secure covered data.

CEs must report to the AG and the general public, on a bimonthly basis, all legal requests for disclosure of personal information that they receive. This includes requests for location information, as well as both the number of legal requests that resulted in the covered entity disclosing location or biometric information and the number that did not.

The bill bans targeted advertisements to minors.

The bill has strong protections for workers against electronic monitoring, limiting monitoring to the least amount of information necessary, from the fewest number of employees, for the shortest length of time needed to accomplish essential job functions or to monitor production processes or quality. The monitoring must not harm the employee’s mental or physical health. Employers must provide employees with notice that electronic monitoring will occur prior to conducting each specific form of electronic monitoring, including the purpose; the specific activities, locations, communications, and job roles that will be electronically monitored; the technologies that will be used; and all vendors and third parties who will receive the data.

5. RESTRICTING LAW ENFORCEMENT USE OF FACIAL RECOGNITION: HD2304 / SD750

This bill implements the findings of last session’s Commission on Face Surveillance. The findings had support from law enforcement as well as from civil liberties organizations. The bill would provide that:

1. Law enforcement other than the State Police and FBI cannot directly possess or access a biometric surveillance database.

2. Law enforcement may not use biometric surveillance to infer a person’s emotion or affect nor for analysis of moving images or video data.

3. The State Police can access the facial recognition database used by the registrar of motor vehicles to conduct a search for local law enforcement, a federal agency or the FBI if they are presented with a warrant issued by a judge based upon probable cause, if there is an immediate danger of serious injury to someone, or if there is a need to identify a deceased person.

4. Law enforcement must document the basis for any emergency requests and file them with the appropriate Superior Court within 48 hours of the request.

5. All searches of the database by the State Police or FBI must be documented and reported quarterly to the executive office of public safety and security (EOPSS), disaggregated by the requesting law enforcement or federal agency, with breakdowns of whether each request involved a warrant, an emergency, or the identification of a deceased person. EOPSS must post all of this publicly by March 31 of the following year.

6. Any person charged with a crime in which they were identified by a facial recognition search must be provided notice that the search occurred and defendants and their attorneys in criminal prosecutions must be provided with all records and information pertaining to any facial recognition searches performed or requested during the course of the investigation of the crime or offense.

6. RESTRICTING AUTOMATED LICENSE PLATE RECOGNITION: HD428 & HD2360

HD.428: An Act Relative to All-Electronic Tolling Data Privacy

This bill provides that:

1. A department may not access, search, review, disclose or exchange tolling data (meaning any data captured or created by an ALPR system or from signals or radio frequencies emitted by a transponder in connection with the assessment or collection of a toll, including, without limitation, GPS coordinates or vehicle location information, dates and times traveled, images, vehicle speed, and license plate numbers, existing in any form or medium, whether electronic, paper or otherwise) unless this is necessary to:

a. assess, collect or pursue the payment of tolls or of fines or surcharges related to unpaid tolls

b. install, maintain or repair a transponder

c. respond to a reasonable belief that an individual is at imminent risk of serious physical injury, death or abduction; provided that, not later than 48 hours after responding, the access and detailed reasons for it are provided to the AG.

d. comply with a search warrant, production order, or preservation request issued in connection with the investigation or prosecution of a felony.

3. a. The department must erase or destroy the tolling data accessed within 120 days of access.

    b. The department may retain tolling data beyond 120 days to comply with a search warrant, production order, or preservation request, or as necessary to collect unpaid tolls or fines or surcharges related to unpaid tolls.

4. a. A person whose tolling data was retained in violation of the above can institute a civil action in district or superior court for damages or in superior court for injunctive relief.

    b. If a violation has occurred, the violator will not be entitled to absolute or qualified immunity, and will be liable for proven actual damages plus treble damages or exemplary damages of between $100 and $1,000, along with costs and reasonable attorney’s fees.

Why this is important: ALPR data records everywhere that someone has driven. If it is maintained in a database, then it can be reviewed retroactively for many unlawful purposes, such as to identify a suspect in a crime for which there is no particularized evidence that they committed it. This means that potentially many people who have traveled to the vicinity of the location of a crime will become suspects. In addition, tolling data can be used to identify individuals who have participated in a political event, rally or protest, which are acts protected by the First Amendment and therefore should not be monitored.

HD.2360: An Act Establishing Driver Privacy Protections

This bill provides that:

Law enforcement or other state government employees or officials may not:

  • use an ALPR system to track or monitor activity protected by freedoms of religion or speech guaranteed by the Massachusetts Declaration of Rights or the First Amendment to the United States Constitution;
  • retain ALPR data longer than 14 days except in connection with a specific criminal investigation based on articulable facts linking the data to a crime;
  • disclose, sell or permit access to ALPR data except as required in a judicial proceeding; or
  • access ALPR data from other governmental or non-governmental entities except with a valid search warrant.

Toll collection technologies may only be used to identify the location of any vehicle for tolling purposes.

The department of transportation may not access, search, review, disclose, or exchange tolling data in its possession, custody, or control except to:

  • assess, collect or pursue the payment of tolls or fines or surcharges related to unpaid tolls;
  • install, maintain or repair an ALPR or transponder system or a system storing tolling data;
  • respond when an individual is at imminent risk of serious physical injury, death or abduction
  • comply with a search warrant, production order, or preservation request issued in connection with the investigation or prosecution of a felony.

The department of transportation must eliminate all tolling data that it possesses or controls within 120 days of its creation, unless it is necessary to comply with a search warrant, production order, or preservation request, or as necessary to collect unpaid tolls or fines or surcharges related to unpaid tolls.

No toll collection or vehicle data may be shared with or provided to any law enforcement entity or official without a search warrant or production order, unless this information is requested because of a reasonable belief that an individual is at imminent risk of serious physical injury, death or abduction and that such data is necessary to respond. Such a request must be narrowly tailored to address the emergency and subject to the following limitations:

  • the request must document the factual basis for the emergency and the applicability of toll collection and/or vehicle data
  • within 48 hours of accessing these records, the government office must file with the Attorney General a written notice describing with particularity the grounds for emergency access and exactly what tolling data was accessed.

If ALPR data, tolling data, or vehicle data is collected, retained, disclosed, sold, or accessed without complying with the above requirements, it may not be admitted, offered or cited by any governmental entity for any purpose in any criminal, civil, or administrative proceeding.

An individual whose rights have been violated by the improper transfer of or access to these data, may introduce evidence concerning this data in a civil action for damages or injunctive relief in a district or superior court or may allow another party in a civil proceeding to do the same.

If a willful violation occurred, the violator will not be allowed to claim any privilege, absolute or qualified. In addition to any proven actual liability, the violator will be liable for treble damages or, alternatively, exemplary damages of between $100 and $1,000 for each violation, as well as costs and reasonable attorney’s fees.

The attorney general will enforce the above and will have the power to petition the court for injunctive relief and other appropriate relief against violators.  

7. PROTECTING LOCATIONAL PRIVACY: HD3698

In this bill, location information is defined as information that directly or indirectly reveals the present or past geographical location of an individual or device within the Commonwealth of Massachusetts with sufficient precision to identify street-level location within a range of 1,850 feet or less. Location information includes but is not limited to (i) an internet protocol address; (ii) Global Positioning System (GPS) coordinates; and (iii) cell-site location information.

HD.3698 prohibits a Covered Entity (CE), meaning “any individual, partnership, corporation, limited liability company, association, or other group” (except a state or local government agency or court), from collecting, processing, or disclosing an individual’s location information from any device that “connects to a cellular, bluetooth, or other wireless network” “for profit or in exchange for monetary or other consideration”, including selling, renting, trading, or leasing location information, without the express consent of the individual, except for the following purposes:

Location information can be collected for “(i) provision of a product, service, or service feature to the individual to whom the location information pertains when that individual requested the provision of such product, service, or service feature by subscribing to, creating an account, or otherwise contracting with a covered entity; (ii) initiation, management, execution, or completion of a financial or commercial transaction or fulfill an order for specific products or services requested by an individual, including any associated routine administrative, operational, and account-servicing activity such as billing, shipping, delivery, storage, and accounting; (iii) compliance with an obligation under federal or state law; or (iv) Response to an emergency service agency, an emergency alert, a 911 communication, or any other communication reporting an imminent threat to human life.”

When location information is collected for any but the last two allowed purposes, the CE must list each purpose in a Location Privacy Policy and individuals must provide discrete consent for each purpose to enable the collection of location information. Each CE must provide a clear, conspicuous, and simple means to opt out of the processing of their location information for purposes of selecting and delivering targeted advertisements.

Permission will be valid for one year unless the individual chooses to revoke it before then. If permission is revoked, any location information possessed by a covered entity must be permanently destroyed. An individual can opt in again at a future time. There cannot be any retaliation against someone who chooses not to have their location information collected, but a service requiring this information can be withheld.

Covered Entities may not:

  • collect more precise location information than necessary to carry out the permitted purpose,
  • retain location information longer than necessary to carry out this purpose,
  • sell, rent, trade, or lease location information to third parties; or
  • derive or infer from location information any data that is not necessary to carry out the permitted purpose.

The CE may not disclose or assist in any way the disclosure of an individual’s location information to third parties (TPs), unless this is necessary to carry out the permissible purpose for which the information was collected, or requested by the individual to whom the location data pertains.

A CE or service provider (SP) may not disclose location information to any federal, state, or local government agency or official unless: (1) the agency or official presents a valid warrant or establishes the existence of exigent circumstances that make it impracticable to obtain a warrant, (2) disclosure is mandated under federal or state law, or (3) the subject of the data requests this disclosure.

The CE must maintain and make available its Location Privacy Policy including:

  • the purpose(s) for which the covered entity is collecting, processing, or disclosing any location information;
  • the type of location information collected, including the precision of the data;
  • the identities of SPs with which the CE contracts with respect to location data;
  • any disclosures of location data necessary to carry out each purpose and the identities of the third parties to whom the location information could be disclosed;
  • whether the CE’s practices include the use of location information for targeted ads;
  • the data management and data security policies governing location information; and
  • the retention schedule and guidelines for permanently deleting location information.

Users of the CE must be given 20 days advance notice of any change in the Location Privacy Policy.

It will be illegal for the government to monetize location data.

Covered entities must annually disclose to the Attorney General (AG) any warrants for location information received by themselves or any associated SPs or TPs (if known), disaggregated by the requesting agency, the statutory offense under investigation, and the source of authority. The AG will make these reports available to the public online.

Any individual alleging a violation of this chapter by a CE or SP may bring a civil action in the superior court or any court of competent jurisdiction. They will not need to file a report with the AG or accept arbitration. If a claim is proven, the plaintiff may be awarded (1) damages, including for emotional distress, or $5,000 per violation, whichever is greater; (2) punitive damages; and (3) any other relief, including but not limited to an injunction or declaratory judgment, that the court deems to be appropriate, as well as attorney’s fees and other costs.

The AG can bring an action against a CE or SP to remedy violations. The AG must conduct investigations of any possible violations of this chapter and refer cases for criminal prosecution to the appropriate federal, state, or local authorities.

Location information may be collected by a healthcare provider for treatment or research purposes in compliance with HIPAA.

CEs must comply with this chapter within 6 months of enactment and delete any location information retroactively for individuals who withhold consent.

8. PROTECTING BIOMETRIC INFORMATION: HD3053

In this bill, “biometric information or data” means information or data that pertains to measurable biological or behavioral characteristics of an individual that can be used alone, in combination with each other, or with other information, for verification, recognition, or identification of an unknown individual. Examples include: fingerprints, retina and iris patterns, voiceprints, DNA sequences, facial characteristics and face geometry, gait, handwriting, keystroke dynamics, and mouse movements. (The bill excludes medical information protected by HIPAA, medical images used for diagnosis or research, donated organs or tissues stored by a federal agency, as well as writing samples, written signatures, mere photographs, human biological samples used for valid scientific testing or screening, demographic data, tattoo descriptions, or physical descriptions such as height, weight, hair color, or eye color.)

The Covered Entities (CEs) include any individual, partnership, corporation, limited liability company, association, or another group, however organized but not a state or local government agency, or any court of Massachusetts.

“Processing” includes collecting, accessing, using, storing, retaining, sharing, monetizing, analyzing, creating, generating, aggregating, altering, correlating, operating on, recording, modifying, organizing, structuring, disclosing, transmitting, selling, licensing, disposing of, destroying, de-identifying, or otherwise manipulating biometric information.

A CE or Data Processor (DP) cannot collect or process (collect, access, use, store, retain, share, monetize, analyze, create, generate, aggregate, alter, correlate, operate on, record, modify, organize, structure, disclose, transmit, sell, license, dispose of, destroy, or de-identify) someone’s biometric information unless they:

  • provide a written explanation of exactly what it will collect or process
  • provide the individual with the Biometric Privacy Policy(BPP)
  • receive advance explicit handwritten or electronic consent from the individual or their legal guardian or representative

Consent will expire after 3 years or when the initial purpose for processing the biometric information has been satisfied, whichever occurs first. Upon expiration, any biometric information possessed by a CE must be permanently destroyed. Consent may be renewed.

The BPP must include:

  • the use models, detailing whether the biometric information is going to be used for identification or verification purposes; 
  • all data management and data security policies governing biometric information; 
  • all disclosure practices; and 
  • the retention schedule and guidelines for permanently deleting biometric information.

The CE must provide notice of any change to its BPP at least 20 business days in advance of implementation and request consent for the changes.

The CE must store, transmit, and protect from disclosure all biometric data in a manner that is the same as or more protective than the manner that it stores, transmits, and protects other confidential and sensitive information, consistent with the standard for similar private industries.

Any CE, DP or third party (TP) may only disclose biometric information if:

  • disclosure is required for the provision of a service or product by the CE and the individual has consented
  • disclosure is needed to complete a financial or commercial transaction requested by the individual and to which they have consented
  • disclosure is for a single purpose to a TP that has been authorized by the individual in handwritten consent
  • federal or state law requires disclosure, but the individual must be notified in advance via the BPP
  • disclosure is in response to a valid warrant
  • disclosure is in response to an imminent threat to life or property

No CE, DP or TP may monetize biometric information.

If a CE, DP or TP is served with a warrant for biometric information (BI), it must immediately provide the individual with a copy of the warrant; a statement of to whom and when their BI was provided; an inventory of the data disclosed; whether the CE, DP or TP provided the data; and, if known, who requested the warrant from the court. However, a government entity may apply to the court for a 30-day delay in notification and for a renewal of that delay.

CEs must annually report to the Attorney General (AG) any warrants for BI received by them or by associated DPs or TPs. CEs required to report BI pursuant to a law must annually report general aggregate information pertaining to those disclosures to the AG.

An individual alleging harm by a violation of this law may bring a civil action in any court of competent jurisdiction directed to any CE, DP or TP believed to have committed the violation.

If the plaintiff prevails, they are eligible for liquidated damages ranging from 0.1% of the annual global revenue of the covered entity or $1,000 per violation, whichever is greater, for negligent violations, to 0.5% of the annual global revenue of the covered entity or $5,000 per violation, whichever is greater, for deliberate violations; punitive damages; and any other relief, including but not limited to an injunction, as well as reasonable attorney’s fees and costs, including expert witness fees and other litigation expenses. Damages are available for each instance of violation.
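To illustrate how those tiers would work in practice (using purely hypothetical figures, not numbers from the bill, and assuming the revenue percentage, like the flat floor, applies per violation): a covered entity with $2 million in annual global revenue that negligently violated the law with respect to 10 individuals would owe the greater of 0.1% × $2,000,000 = $2,000 or $1,000 per violation, i.e. $2,000 × 10 = $20,000; if the violations were deliberate, it would owe the greater of 0.5% × $2,000,000 = $10,000 or $5,000 per violation, i.e. $100,000. The revenue-based floor means the amounts scale with the size of the violator.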

The AG may bring an action pursuant to section 4 of chapter 93A against a CE, DP or TP to remedy violations of this chapter and for other relief that may be appropriate. 

Within 6 months of enactment of the law CEs must obtain consent for all BI collected or stored and must destroy any BI for which consent was not given. The Act will be in effect one year after enactment.

9. PROTECTING BROWSING INFORMATION: SD1217

This law would apply to electronic information collected by any corporation which sends or receives electronic communications, including any service that acts as an intermediary in the transmission of electronic communications, or stores electronic communication information for the general public.

It covers any information pertaining to an electronic communication or the use of an electronic communication service, including, but not limited to the content of electronic communications, metadata, sender, recipients, format, or location of the sender or recipients at any point during the communication, the time or date the communication was created, sent, or received, or any information pertaining to any individual or device participating in the communication.

In order for a government office, law enforcement agency or public official to access your electronic information, from either a service provider or an electronic device itself, they would need to get a particularized search warrant supported by probable cause from a superior court judge. Exceptions would include an emergency threatening immediate physical injury, or your having previously given written consent to the corporation that possesses your electronic data to release it to them. Even in an emergency, the government would need to provide a written explanation of why the data was needed to the local superior court within 48 hours. Corporations would have to share the requested information within 14 days, or earlier if justified, unless the corporation appeals for and is granted more time.

A Massachusetts corporation that provides electronic communication services, remote computing services, or location information services must respond to a warrant or subpoena from another state to produce records that would reveal the identity of the customers using those services, data stored by, or on behalf of the customer, the customer’s usage of those services, the recipient or destination of communications sent to or from those customers, or the content of those communications, as if that warrant or subpoena had been issued under the law of the commonwealth. This element is concerning, because it would allow a state that prohibits abortion to access content that might reveal that someone either had an abortion or received abortion medication.

The law enforcement or government officer who obtains someone’s electronic information via a search warrant must, within 7 days of collecting their information, provide them with a copy of the warrant, the application for the warrant, an explanation of the law enforcement inquiry, and the information requested and the date of the request, unless a reason is provided for a delay; such a delay may be granted for up to 90 days and may compel the entity providing the data to delay notifying the target person.

A warrant for the electronic information requested is not necessary if the owner of the electronic information or the recipient of the information gives the law enforcement or government officer their written consent to share it.

If a government office, law enforcement agency, or public official believes that an electronic device is lost, stolen, or abandoned they may access electronic device information necessary  in order to attempt to identify, verify, or contact the owner or authorized possessor of the device.

Within 5 business days of issuing or denying a warrant, the court must report to the office of court management within the trial court all of the information pertaining to the warrant described above, as well as the name of the agency making the application, the offense described in the warrant, and any modifications or extensions made to the warrant.

Every June, the court administrator in the office of court management in the trial court must provide the legislature with a complete report of the number of applications for warrants authorizing or requiring the disclosure of or access to information including a summary and analysis of the data which will all be public records.

No government office or law enforcement agency may ask any court for a reverse-location court order (including a search warrant or subpoena) to identify the devices present at a particular location, or a reverse-keyword court order to identify who electronically searched for particular words, phrases, or websites; nor may they purchase this data. No court is permitted to issue any order allowing the disclosure of reverse-location or reverse-keyword data.

No government office or law enforcement may make a reverse location request or reverse keyword request from a company. Nor may they seek the assistance of any agency of the federal government or any agency of the government of another state or subdivision thereof in obtaining information or data from a reverse-location court order, reverse-keyword court order, reverse-location request, or reverse-keyword request if they would be barred from directly seeking such information.

No government office, law enforcement agency, or public official may use a cell site simulator (CSS) device for any purpose other than to locate or track the location of a specific electronic device, pursuant to a particularized warrant based on probable cause, or if exigent circumstances exist requiring swift action to prevent imminent danger to the safety of an individual or the public. A warrant, once issued, limits the use of the CSS to 15 days unless an application is made for renewal.

A warrant application must specify:

  • the facts establishing probable cause to believe the targeted individual has committed, is committing, or is about to commit a felony
  • that methods of investigation or surveillance less invasive of the privacy of non-targeted parties have been tried and failed or are reasonably unlikely to succeed
  • the nature and capabilities of the cell site simulator to be used and the name of the government agency that owns the device
  • exactly how it will be deployed, including whether it will obtain data from non-target communications devices
  • the procedures that will be followed to protect the privacy of non-targets during the investigation, including the deletion of data obtained from non-target communications devices
  • that all target data must be deleted within 30 days if there is no longer probable cause that such information or metadata is evidence of a crime

Any individual whose information was obtained by a government entity in violation of the above requirements for the collection of private electronic information must be notified in writing of the violation, and of the legal recourse available to them, by the government office, law enforcement agency, or public official who committed the violation.

Any electronic information collected in violation of the above provisions may not be used in evidence in any trial, hearing, or other proceeding in or before any court, grand jury, department, officer, agency, regulatory body, legislative committee, or other authority of the commonwealth, or a political subdivision thereof.

Anyone who has been harmed by a violation of these protections of private electronic information may bring a civil action against the government office, law enforcement agency, or public official who violated those sections in the Superior Court or any court of competent jurisdiction. Such a person will not need to  file an administrative complaint with the attorney general or to accept mandatory arbitration of a claim.

When the plaintiff prevails in a civil action, the court may award the greater of actual damages (including damages for emotional distress) or $1,000 per violation; punitive damages; and any other relief, including but not limited to injunctive or declaratory relief. In addition to any relief awarded, the court will award reasonable attorney’s fees and costs to the plaintiff.

Any contract whether government or private that infringes the above rights will be considered void.

This bill would also prohibit any government or law enforcement agency from collecting “library user private data,” meaning records of a public library that reveal the identity and intellectual pursuits of a person using the library.

10. SAFE COMMUNITIES ACT: HD2459 / SD1937

This long-standing goal of Digital Fourth and allied organizations, especially MIRA, would prevent local and state law enforcement from sharing information relating to the potential presence of undocumented immigrants with ICE or other federal agencies.

For further details, please see the action alert here: https://actionnetwork.org/letters/tell-lawmakers-prioritize-the-safe-communities-act-this-session-23

Cambridge Spies On CPS Students

Illustration by Annie Zhao for VICE magazine

Many kids in the Cambridge Public Schools (and elsewhere in the Commonwealth) still don’t know that if you’re using a school-issued Chromebook, the school is monitoring whatever you browse, down to deleted draft emails, whether you’re at school or not.

This is through a browser add-on called “Securly.” CPS has an agreement with Securly that all school-issued Chromebooks will have this add-on.

What’s more, wittingly or not, CPS is lying to the City Council about whether student data gets shared. Let’s show you how.

In the Annual Surveillance Report submitted to the City, Cambridge Public Schools cites to the language of its Data Privacy Agreement with Securly, insisting, “This data is not shared with third parties” (Annual Surveillance Report, p.67). However, the DPA actually allows the sharing of data with third parties – specifically, but not limited to, the cops. Law enforcement is allowed to contact Securly to get data on students, and Securly is allowed to disclose that information without waiting for a warrant or evidence of involvement in illegal activities, and without telling either CPS or the student:

II. 4. Law Enforcement Requests. Should law enforcement or other government entities (“Requesting Party(ies)”) contact Provider with a request for Student Data held by the Provider pursuant to the Services, the Provider shall notify the LEA in advance of a compelled disclosure to the Requesting Party, unless lawfully directed by the Requesting Party not to inform the LEA of the request.

Since Securly can tell the cops without telling CPS, there’s no way CPS can truthfully guarantee to the City Council that your “data is not shared with third parties.” It might not be. But they can’t know for sure.

Beyond that, Article IV of the DPA goes into great detail about the circumstances under which Securly may share both personally identifiable student information and de-identified student information, for a variety of purposes. Again, it might be that, despite the DPA allowing them to, Securly is not in fact sharing CPS student information onwards; but we suspect that they are doing whatever the DPA currently allows them to do.

CPS also insists that Securly is being used only as a “Web Filter”, to block various kinds of disagreeable content. The material they have provided to the City Council focuses on students accessing gun-related content and suicide-related content.

But Securly’s Web Filter product not only blocks; it also shows to teachers and to admins what URLs are being blocked, offering what Securly describes as “Complete online visibility … monitor[ing] for signs of bullying, self-harm, gun terms, and violence”, with “AI-based context analysis … for signs of bullying, self-harm, gun terms, and violence across social networking and web searches. If a student is suffering or looking at concerning content, you’ll know.”

It is legal for students to search for content that includes violence, graphic imagery, and guns, and it’s hard to envision how they could research, say, Russia’s invasion of Ukraine without encountering such content.

It’s not clear that school monitoring software in general works. VICE reports, “The few published studies looking into the impacts of these tools indicate that they may have the opposite effect, breaking down trust relationships within schools and discouraging adolescents from reaching out for help—particularly those in minority and LGBTQ communities, who are far more likely to seek help online.” It is evident in places where school monitoring software is in use that students and parents are often contacted, inflicting harm, without administrators or teachers first examining the context of the flagged material. At a minimum, the City Council should find out what terms and sites are being flagged in Securly’s system, in order to evaluate whether there is manifest prejudice going into the selection of those terms and sites and whether each instance is being reviewed by the student’s teacher.

What Securly’s system appears to do is to monitor everything, and then rely on school officials’ discretion to determine whether what gets flagged is really cause for worry. Monitoring and disciplining students for accessing such content places the school district on dangerous legal ground. In its 2021 ruling in Mahanoy Area School District v. B.L., the Supreme Court explained that students’ off-campus speech may be regulated only in cases of “[1] serious or severe bullying or harassment targeting particular individuals; [2] threats aimed at teachers or other students; [3] the failure to follow rules concerning lessons, the writing of papers, the use of computers, or participation in other online school activities; and [4] breaches of school security devices, including material maintained within school computers.” Securly’s systems envision monitoring students’ off-campus speech in a far larger set of circumstances than provided for in Mahanoy.

My master’s thesis was on blocking and filtering technologies, and their potential for discriminating against the provision of LGBT-oriented information. I was also bullied in school, for years. I understand why schools want to track students’ access to gun- and suicide-related imagery. But public schools have to adhere to the Constitution in the surveillance they conduct of students. At most, considering the rights protected by the Fourth and First Amendments, schools would only be justified in starting to track the out-of-school browsing behavior of a particular student on a school-issued device if they had probable cause to believe that the student was engaged in, or was the target of, one of the four kinds of conduct envisioned under Mahanoy. This technology goes far beyond what the law and the Constitution permit. We believe that the City Council should not approve the use of this technology.

This is part of a series on the surveillance technologies the City of Cambridge is reviewing. The City Council has referred consideration of these technologies through to the Public Safety Committee, which will hold a hearing and then report back to the City Council with recommendations. Email us if you’d like to testify at the Public Safety Committee. Now is the time to weigh in on whether you want to see this technology deployed in your community!

Understanding Fusion Centers

Our local fusion center, BRIC, has been at the core of police efforts to surveil and suppress social movements for over a decade. And, since 2012, we’ve been calling them out on their abusive and unconstitutional practices.

This October 30, please join us for a livestreamed discussion on fusion centers, with Boston City Councilor Ricardo Arroyo, law student Dani Hargus, and journalist Emma Best, moderated by our own Alex Marthews!

State Surveillance Cannot Save Us From Mass Violence


After the appalling deaths of 49 people, and injuries to another 53, at a gay nightclub in Orlando this week, the presidential candidates leapt to push their own agendas. For Trump, it was about immigration; he magically transformed the US-born shooter into an Afghan, in order to emphasize that he was right about banning Muslim immigration. For Clinton, it was about gun control; she called for better background checks and limits on obtaining assault weapons. But when it came to surveillance, they might as well have been singing from the same hymn-sheet.

Clinton called for an “intelligence surge,” for increased internet surveillance and suppression of First Amendment-protected speech, to prevent “radicalization”; for propaganda promoting a US-government-seal-of-approval version of Islam; praised a “Countering Violent Extremism” (CVE) program that marks for intervention Muslims whose politics deviate from what the FBI thinks acceptable; and suggested that people on due-process-free terrorism watchlists should not be allowed to buy guns. Then, she wrapped her actual policy proposals in a cotton-wool language of diversity and inclusion, and claimed that this is not “special surveillance on our fellow Americans because of their religion.” She talked about “Islamism” rather than “Islam”, in order to claim to not be against Islam in itself—but in her world, the government gets to define who is a good and who is a bad Muslim. Perhaps the “bad Muslims” in her mind include citizens like Ayyub Abdul-Alim, imprisoned for refusing to inform on other Muslims for the FBI, who seems only to have wanted to help strengthen his community; or Tarek Mehanna, imprisoned for translating al-Qaeda documents and posting them online, who held atrocious opinions but never planned or participated in a violent attack.

Trump, with a little less cotton-wool, actually says much the same about surveillance. Domestically, the “Muslim community” will “have to cooperate with law enforcement and turn in the people who they know are bad”, which is what CVE is intended to achieve, and what Mr. Abdul-Alim is in prison for resisting. Trump proposes an “intelligence gathering system second to none” that “includes better cooperation between state, local and federal officials,” and says that intelligence and law enforcement are “not being allowed to do their job.” And he wraps this up with vehement expressions of solidarity with the LGBT community.

There’s no evidence that mass surveillance, conducted and promoted by the government, works. In every country that is hit with any attack, large or small, there are calls for more surveillance, then more attacks, then more surveillance, then more attacks. It’s a vicious ratchet that we can only step off by becoming aware of it. France implemented its mass surveillance law before the Paris attacks: the law didn’t prevent them. France now lives under a state of near-martial law, where what we would call ordinary First and Fourth Amendment rights have been suspended. Britain is in the process of passing a new surveillance law that will enable the government to view your browsing history without a warrant, and has already outlawed “glorifying terrorism.” The British have gone farther along this ratchet than we have, but they are not reducing their chance of being attacked; instead, the purpose is to reduce the chance that a given politician will be blamed for “not doing enough” against terrorism. In truth, there is no perfect safety, and there is a small proportion of violent criminals in every country that the State is ultimately powerless to eliminate.

Our own mass surveillance systems led this “lone wolf” to be found and interviewed by the FBI, twice. But neither Clinton nor Trump articulated clearly what they thought the FBI should have done next, perhaps because there’s nothing more the FBI could lawfully have done regarding allegations of terrorist affiliation. If the aim of surveillance is for the FBI to interview suspected “radicals,” what should they do then to prevent an entirely hypothetical attack? Preventively detain them, without charge or trial, as happened to Jose Padilla? Preventively shoot them before they kill anyone else, as happened with Usaama Rahim? Do we want a State that, claiming to keep us safe, claims the right to do that to any of us? We are already part-way down that road; has it helped us so far?

State surveillance cannot save us from mass violence. It’s a poor guarantor of LGBT people’s safety. The sad truth is that there is a tendency to violence in every human being’s heart, irrespective of religion. Guns help violent people carry out their violent fantasies on a larger scale, and while comprehensive background checks wouldn’t have helped with this attack, the evidence suggests that they would probably help to prevent others. Mass surveillance doesn’t even enjoy that evidentiary advantage; last time the surveillance agencies were actually confronted on their assertion that mass surveillance had helped to prevent terrorist attacks, during the debate over the renewal of Section 215 of the PATRIOT Act, the agencies’ claims shriveled under scrutiny like an ice-cream in the sun.

More than that, the State perpetrates mass violence on a scale much vaster than a single violent, conflicted misogynist. On a daily basis, the lives the State takes in the name of the War on Terror far exceed the number of lives taken by terrorists. We’re busy implementing a cure that causes more pain than the disease, because the State does not value enough or see enough glory in a more peaceful path. Why, then, should we trust the State with more power over the lives of Muslims and other “extremists,” here or abroad?

Instead of the State, we should look to each other. We should consider how we can build bonds of friendship and support that will encourage kindness, courtesy, and an appreciation of our mutual humanity. As we volunteer together, worship together, take care of loved ones together, work on good causes and reach out across lines of race and religion to those in distress, we step by step build the thriving “beloved community” of which Martin Luther King spoke long ago, so that even when attacks happen, they cannot break our bonds to one another. And so long as we work to trust one another, we can guard safely our thoughts, our opinions, and our liberties, even against a State that urges us constantly, for the sake of “safety,” to abandon them.

Belgian Police Overwhelmed By…Mass Surveillance?


Buzzfeed’s Mitch Prothero reported on the day of the Brussels attacks that “Belgian Authorities [Are] Overwhelmed By Terror Investigations“. He quotes a “Belgian counterterrorism official”, talking prior to the attacks, as having told him that:

[D]ue to the small size of the Belgian government and the huge numbers of open investigations — into Belgian citizens suspected of either joining ISIS, being part of radical groups in Belgium, and the ongoing investigations into last November’s attacks in Paris, which appeared to be at least partially planned in Brussels and saw the participation of several Belgian citizens and residents — virtually every police detective and military intelligence officer in the country was focused on international jihadi investigations. “We just don’t have the people to watch anything else and, frankly, we don’t have the infrastructure to properly investigate or monitor hundreds of individuals suspected of terror links, as well as pursue the hundreds of open files and investigations we have,” the official, who spoke on condition of anonymity because he was not authorized to speak to the media, said. “It’s literally an impossible situation and, honestly, it’s very grave.”

This corroborates a major part of this blog’s – and our group’s – analysis of the surveillance state: that it generates so many false leads that it drowns law enforcement in data they can’t reasonably analyze or follow up on.

As a comparison, consider this 2012 comment from Michael Downing, LAPD deputy chief and head of the department’s counterterrorism unit:


“[suspicious activity reporting has] flooded fusion centers, law enforcement, and other security entities with white noise; [the profusion of SAR reports] complicates the intelligence process and distorts resource allocation and deployment decisions.”

Rein In The Warrior Cops: State House, Tuesday March 8, 10:30am

Last year, across the country, over 1,100 people were shot by police.

In Massachusetts, we pride ourselves on being somehow different from, and more sophisticated than, the rest of the country, but our police still shoot people at sixteen times the rate at which German police shoot theirs.

We have a situation so absurd that the police chief of the tiny town of Rehoboth can apply for, and receive, a $700,000 mine-resistant military vehicle, and the town doesn’t even bat an eye. It held no hearings and took no vote; it simply left it up to the police to decide how far to turn themselves into a military occupying force in that town.

Our police are trained, through initiatives like Urban Shield, to think of themselves as quasi-military, and the people as their enemies.

None of this is good enough.

This morning, Tuesday March 8, there will be a hearing at the State House on our bill to help deal with this, H. 2169. Come make your voice heard; head below the fold for the background.

H. 2169, “An Act assuring municipal control of military equipment procurement by local law enforcement”, sponsored by Rep. Denise Provost
Press Release
Digital Fourth’s Testimony to the Committee

State House Police Reform Showdown 10am 2/4: Be There!

Tomorrow (Thursday, February 4), at 10am, the Massachusetts legislature’s Joint Committee on Public Safety will be holding a public hearing on all of the bills for the 2015-16 session that deal with police accountability, including Digital Fourth’s bill mandating bodycams and data collection on police stops. Join our gallant volunteers Sam T., Shekia S., Chris L., Jason L. and Robert C., in trying to make a difference; if you can make it, let us know and we can help you with your testimony!

The bills under consideration mostly propose improved training for police officers in de-escalation techniques, “emotional CPR”, and dealing with people with autism and mental illness. These are good things, but police training in departmental procedure alone can’t be the answer. Most police stops and shootings occur with police officers acting within the guidelines set by their departments; the problem is that those guidelines themselves can be very broad, and there’s essentially no accountability even if those guidelines are violated. If we’re going to get to a point where police officers routinely respect the constitutional rights of the people they stop on the street, there’s going to have to be meaningful accountability. Some officers should be deprived of their badges; some should be deprived of their liberty; and until that happens much more than it does now, we’ll keep seeing the parade of horrifying police shootings that cost over 1,100 members of the public their lives last year in the US.

Sign up at the Facebook event page
Read our Press Release
See all of the bills up for consideration

UPDATE: Here’s our testimony for the hearing.

Go Smart, Not Broad: A Constitutional Response To Violent Attacks

A former Middle East advisor to President Obama, Steven Simon, suggested in Saturday’s New York Times that the administration’s response to the Paris attacks was likely to include “Tighter border controls, more intensive surveillance in the U.S. and more outreach to local communities in the hope that extremists will be fingered by their friends and family. And a tightening of already intimate cooperation with European intelligence agencies.”

These proposals, if adopted, would be immensely counterproductive, and here’s why.

First, tighter border controls are irrelevant to this attack. It appears that all of the attackers identified so far were EU citizens; none were refugees from Syria.

Second, France already had a draconian mass surveillance law, which came into effect at the beginning of October. It did not thwart these attacks. The reason is the “false positives” problem. Any system that flags people based on demographics, metadata, or past behavior inevitably sweeps up far more innocent people than guilty ones, and diverts police and intelligence resources toward ruling them out. An LA Times study of “pre-crime” efforts to prevent violent crimes by US Army soldiers added every variable the researchers could, and still, for every 15 people in the suspect pool who did in fact commit violence in a given year, 985 did not. Similarly, before the Boston Marathon attacks, the FBI had flagged Tamerlan Tsarnaev for interview; but the FBI interviews hundreds of flagged people every week, and has no way of knowing which among them will actually commit an attack. So it appears that, in the six weeks between the law taking effect and the attacks, France’s intelligence agencies snowed themselves under with an ocean of false positives, and were unable to pick out the genuinely suspicious communications from that traffic. They can’t be faulted for that; it’s mathematically impossible. All mass surveillance allows is what’s happening now: going back into the system afterwards to see what you missed.
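
To see why the arithmetic is so unforgiving, here is a minimal, hypothetical sketch of the base-rate problem. Only the 15-in-1,000 ratio above comes from the study; the population size, sensitivity, and false-positive rate below are assumptions chosen purely for illustration.

```python
# Hypothetical illustration of the base-rate problem behind "false positives".
# Only the 15-per-1,000 ratio comes from the study cited above; every other
# number here is an assumption chosen for illustration.

def screening_outcomes(population, base_rate, sensitivity, false_positive_rate):
    """Return (true positives, false positives, precision) for a screening system."""
    actual_positives = population * base_rate
    actual_negatives = population - actual_positives
    true_positives = actual_positives * sensitivity
    false_positives = actual_negatives * false_positive_rate
    precision = true_positives / (true_positives + false_positives)
    return true_positives, false_positives, precision

# Even a screen that catches 90% of real cases and wrongly flags only 1% of
# everyone else performs poorly when what it looks for is rare.
tp, fp, precision = screening_outcomes(
    population=1_000_000,        # assumed number of people swept into the system
    base_rate=15 / 1_000,        # the study's ratio: 15 actual cases per 1,000 suspects
    sensitivity=0.90,            # assumed share of real cases the screen catches
    false_positive_rate=0.01,    # assumed share of innocent people wrongly flagged
)

print(f"True positives flagged:  {tp:,.0f}")
print(f"False positives flagged: {fp:,.0f}")
print(f"Precision: {precision:.1%} of flagged people are actual cases")

# With these assumptions, over 40% of flags are wrong even at a 1.5% base rate.
# At the far lower base rate of planned attacks in a whole population, almost
# every flag is a false positive - which is exactly the investigators' complaint.
```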

Third, Muslim and black communities were already under very heavy pressure in France, and are already under heavy pressure here from the FBI, through its “Countering Violent Extremism” (CVE) program, to “finger friends and family”. CVE uses models of radicalization with no solid academic basis to identify people as potentially radical simply because they have changed their dietary habits or become more devout about their religion. To make their numbers, the FBI has even resorted, in case after case, to creating their own terrorists out of poor, mentally unstable young men, using confidential informants to lead them through every stage of devising a plot until they do something the FBI can arrest them for. We don’t need more of that either.

Fourth, when it comes to “more intimate cooperation” with European intelligence agencies, the fact is that such cooperation is already “intimate” – so intimate that the British systematically tap Internet traffic and hand us the contents; so intimate that we share “raw take” intelligence with Israeli security services; so intimate that the German intelligence agency helped the NSA spy on Europe’s top politicians in exchange for access to the latest in surveillance wizardry. Short of actually being in bed with one another, there’s no more “intimacy” to be had – and it still isn’t working.

This kind of mass surveillance is not working to thwart attacks. But in four important ways, it does work. Mass surveillance intimidates citizens in their ordinary conversations and activities of life. It allows bigoted politicians to curry favor with their base, and coast on a wave of anti-Muslim suspicion. It brings great profits to the private security firms smart enough to fill their cup at the never-failing spigot of federal counterterrorism funding. And it makes the general public feel that Something Is Being Done, convincing them to trade more of their rights away for a little temporary safety.

Last, if we react in this particular way, it also serves the ends of the violent criminals who committed this attack. Lacking resources themselves to wage war, they seek to provoke a backlash that will garner them support among the peaceful Muslim majority. Back in the day, the IRA posed as the defenders of the rights of peaceful Northern Irish Catholics against foreign oppression; today, the Islamic State poses as the defenders of the rights of peaceful Muslims against foreign oppression. A governmental backlash against Muslims in general will merely bolster their propaganda: See? We told you they’re out to get you! Come join us!

Instead, we should use the Constitution to solve the false positives problem. The Fourth Amendment bars mass surveillance, requiring, before surveillance is conducted, a warrant based on individualized probable cause of involvement in actual criminal activity. Imagine that, instead of having a “TIDE” terrorist database with 750,000+ names on it, it were limited to a maximum of one thousand, but that the one thousand were each investigated thoroughly on the basis of actual evidence. The surveillance agencies would waste a lot less time chasing fruitless leads, building data centers, or shoveling money to software vendors to try to solve this insoluble problem.
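
As a back-of-the-envelope sketch of the same point: the list sizes below (750,000+ and one thousand) come from the paragraph above, but the pool of investigator-hours is a made-up assumption, there only to show how list size determines how thoroughly each lead can possibly be examined.

```python
# Back-of-the-envelope comparison of watchlist sizes. The available
# investigator-hours figure is an assumption made up for illustration;
# only the list sizes (750,000+ vs. 1,000 names) come from the text above.

available_hours_per_year = 2_000_000  # assumed pool of analyst/agent hours

for list_size in (750_000, 1_000):
    hours_per_lead = available_hours_per_year / list_size
    print(f"{list_size:>7,} names -> {hours_per_lead:,.1f} hours per lead per year")

# 750,000 names leave under 3 hours per lead per year; 1,000 names allow
# roughly 2,000 hours each - the difference between a cursory database check
# and a genuine, evidence-based investigation.
```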

Foreign policy and economic solutions are beyond our remit, but it should be obvious that in order to drain the Islamic State of support, we have to provide those fleeing its rule with a credible chance at a better life. At the bare minimum, we should let them know that if they come to our country, they will be treated justly, not kept constantly under watch even if innocent of any crime.

MA, Feds Behind The Curve on Warrants for Email Searches

Back in the days of DEC and Wang Laboratories, there was serious doubt whether California’s Route 101 or Massachusetts’ Route 128 would be at the forefront of the digital economy. My aunt immigrated to Massachusetts to stake out her part of the new digital frontier. And round about that time, when I was seven years old, was also the last time Congress passed an email privacy bill. Called the “Electronic Communications Privacy Act”, by now it might as well be called the “Ordinance Describing Rules Regarding the Transportation of Speedy Telegrams Via Means Faster Than The Horse” for all the good it does. Among other ridiculously outdated provisions, it treated holding onto your emails for more than six months as a crazy-expensive thing only crazy people would ever do, so emails older than that are considered “abandoned”, like your curbside trash, and law enforcement doesn’t need a warrant to search them.

Fast-forward to now, and a federal bill to solve this, the Email Privacy Act, has more than 300 cosponsors in the House. That would be more than enough to pass it if it got to the floor, and is more cosponsors than any other bill still being held up in committee. It’s being held up essentially because the SEC and FTC want a free hand to avoid getting warrants:

“In a hearing before the Senate Judiciary Committee, representatives of the SEC and FTC claimed that other types of court orders provide a comparable standard to a warrant based on probable cause. Nothing could be further from the truth. A search warrant – the standard in the Constitution – allows access to information only when there is a strong likelihood it will show evidence of criminal violations of the law. That is a high standard that applies only in a narrow class of cases. By contrast, the SEC and FTC are seeking access to email whenever it is relevant to civil violations of the law – such as mistakenly filling out a tax form. That is a low standard which applies in many cases. This rule would then apply to every agency – from the IRS to the local health inspector. Agencies shouldn’t be able to hijack reform to seek a digital power grab,” said Chris Calabrese, Vice President for Policy at the Center for Democracy & Technology (CDT).

In reaction to these roadblocks, organizers at the state level, including our fellow chapters, Restore The 4th – SF Bay Area and Restore The 4th – LA, have advocated for stronger state privacy laws, and in California they just succeeded in passing CalECPA, the strongest email privacy law in the nation.

In Massachusetts, our path to Warrant Protection for Email City has so far resembled the federal path far more than the one California just chose. Last session, a bill was reported out favorably from committee, only to be dropped on the last day of the session amid the usual flurry of the close of business. It came up again at this week’s Judiciary Committee hearing; the ACLU testified on it, and we support it too. But California has now lit the path for us to follow, and it will matter greatly whether we follow it. Massachusetts residents should not enjoy lesser protections for their emails than California residents, any more than the Sox deserve to lose to the Giants.

MA Senate Maj. Leader Strongly Opposes Fusion Centers. So Do We.

In its October 7 hearing on “Protected Classes, Privacy, and Data Collection Legislation”, the Massachusetts legislature heard impassioned testimony on the fusion centers from Senate Majority Leader Harriette Chandler. She argued that they represent an illegitimate intrusion of federal surveillance into our everyday lives.

The fusion centers gather a vast array of data on law-abiding Massachusetts residents whom they believe to have been behaving “suspiciously” in some lawful way. This violates the Fourth Amendment, and is also bad policy. Right now, as far as we have been able to determine, no external body ever evaluates the accuracy or appropriateness of the data the fusion centers hold. DHS evaluates them every five years to certify their adherence to DHS procedures for fusion centers; the fusion centers self-certify annually that they are ramping up according to plan, and that they respect privacy and civil liberties. (They give themselves full marks, naturally). That’s it.

We too dislike the fusion centers, and also see them as sinisterly ensnaring Massachusetts residents in a web of surveillance. To us, the question is not so much whether we as a state should regulate the fusion centers, but whether we should fire all their employees, blow up their buildings, and then salt the earth beneath them as a mark of horror for future generations. Still, still, we love that there is a fusion center reform bill, and we warmly support it.

Our five-year vision for the Massachusetts fusion centers differs sharply from theirs.

The bill’s provisions make good, if incremental, sense. They require the fusion centers to audit themselves annually to determine whether they have investigations open that shouldn’t be, and to make that report a public record; they empower an inspector general to conduct outside audits; and they specify metrics by which the fusion centers can determine how well they are respecting people’s privacy. These are important first steps toward establishing whether anything the fusion centers do actually does the rest of us any good, and they will prepare the ground for discussions in future years about closing them entirely.