The Data for Justice Project | ACLU of Massachusetts
https://data.aclum.org

Flock Gives Law Enforcement All Over the Country Access to Your Location
Tue, 07 Oct 2025

Police in Texas, Florida, and thousands of departments nationwide can track Massachusetts drivers in real time — without a warrant, probable cause, or even reasonable suspicion of wrongdoing. Documents obtained by the ACLU of Massachusetts reveal that police across the state are collecting detailed information about the locations of Massachusetts drivers and sharing that information with a network of over 7,000 agencies and organizations all over the country — including in states that have passed laws banning abortion and gender-affirming healthcare for minors, and laws requiring police to conduct civil immigration enforcement operations.

The records and other publicly available information confirm that over 80 Massachusetts police departments have entered into contracts to deploy Flock Safety’s automatic license plate reader (LPR) technology to surveil drivers when they pass one of Flock’s cameras on the roads. Over the past three years, Massachusetts police have spent over $2 million in taxpayer funds on this technology. Many of these departments have been sharing LPR data collected in their jurisdiction with Flock to be entered into its national database, where it can be accessed by thousands of out-of-state police departments and even federal agencies.  

At minimum, this dragnet surveillance means warrantless tracking of everyone on the road. At worst, it means a digital police state wherein law enforcement officials in far-flung jurisdictions outside of Massachusetts can track protesters, political opponents, immigrants, patients, and others not suspected of any crime and use the information to hurt them. 

What Is Flock and How Does It Work? 

Flock Safety is a major player in the LPR industry, contracting with thousands of law enforcement agencies, running LPR surveillance across nearly 7,000 networks, and deploying nearly 90,000 cameras nationwide as of July 2025. The company’s CEO claims their technology can eradicate all crime in America, though that’s more marketing hype than reality.

License plate readers are cameras that automatically capture and record license plates, locations, and timestamps as vehicles pass by. These AI-enabled systems allow police to instantly track where motorists are now and where they’ve been. 
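To make the description above concrete, here is a minimal sketch of the fields a single LPR "read" contains — a plate, a timestamp, and a location tied to a specific camera. The field names are illustrative assumptions, not Flock's actual schema.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class PlateRead:
    plate: str             # license plate captured by the camera
    captured_at: datetime  # timestamp of the read
    latitude: float        # camera location
    longitude: float
    camera_id: str         # which camera produced the read

# One hypothetical read: a single pass by a single camera.
read = PlateRead(
    plate="1ABC23",
    captured_at=datetime(2025, 7, 1, 12, 30, tzinfo=timezone.utc),
    latitude=42.3601,
    longitude=-71.0589,
    camera_id="cam-017",
)
print(read.plate, read.camera_id)
```

Even this tiny record shows why aggregation is the issue: each read is innocuous alone, but a database of millions of them reconstructs where a driver has been.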

The crux of the problem with Flock is its nationwide data sharing model. Police departments that contract with Flock can choose to share the LPR data they collect with no other departments, with specific named departments, with all departments in their state, or with the entire Flock network nationwide. But Flock has designed its system to incentivize maximum sharing: if a police department chooses to share its data with the entire nationwide network, that department can also search the entire nationwide network. In effect: “You show me yours, I’ll show you mine.” A Flock training video received in response to public records requests demonstrates how seamless and unrestricted data sharing is within Flock’s system. To share their data with police nationwide, all an administrator needs to do is click a button:

This screenshot from Flock’s training materials shows that simply by selecting “Enable National Lookup[,]” agencies contracted with Flock can give their staff access to Flock’s national database and, in turn, share their data with agencies nationwide.
While departments contracted with Flock can opt to only accept specific data sharing requests, departments also have the option of automatically accepting sharing requests from other agencies.
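The sharing tiers and the reciprocal incentive described above can be sketched as follows. The tier names and the one-line rule are illustrative assumptions, not Flock's actual terminology or code.

```python
from enum import Enum

class SharingTier(Enum):
    NONE = 0               # share with no other departments
    NAMED_DEPARTMENTS = 1  # share with specific named departments
    STATEWIDE = 2          # share with all departments in the state
    NATIONWIDE = 3         # share with the entire Flock network

def search_scope(sharing_tier: SharingTier) -> SharingTier:
    # The reciprocity incentive: a department's search reach matches
    # how widely it shares. Opting into nationwide sharing is what
    # unlocks nationwide search.
    return sharing_tier

print(search_scope(SharingTier.NATIONWIDE).name)
```

The design choice to tie search access to sharing is what pushes departments toward the maximum-sharing setting.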

We know many Massachusetts police departments are taking part in this nationwide network because, after submitting public records requests with nearly 80 departments, we received numerous records called “Flock Network Audits[.]” These network audits document searches conducted by police across the country of approximately 7,000 agencies and organizations’ LPR data, including data on Massachusetts drivers collected by dozens of Massachusetts police departments. What this means in practice is that officers with the Florida Highway Patrol and those in Dallas, TX, Jacksonville, FL, Columbus, OH, and thousands of other locations can track where and when Massachusetts residents are driving, even when they are in Massachusetts — all without demonstrating any probable cause or even reasonable suspicion that those people have committed a crime.  

Even some federal law enforcement agencies may have access to Flock’s database, despite Flock’s recent assertion that it ended its pilot program with the Department of Homeland Security’s Customs and Border Protection (CBP). News reports indicate local police have conducted searches on behalf of ICE agents.  

And that’s not all. Flock isn’t the only license plate reader company with a large presence in Massachusetts. The Massachusetts State Police (MSP) and some local departments also have contracts with Vigilant Solutions, a company that maintains its own national database and has contracts with agencies nationwide — including those in states that have passed extreme restrictions on abortion and gender-affirming healthcare access. Vigilant also has contracts with federal agencies, including ICE. Indeed, records obtained through a public records request to the town of Auburn, Massachusetts, show ICE agents have direct access to query the Vigilant database. The ACLU of Massachusetts is currently involved in public records litigation to learn more about the MSP’s license plate reader surveillance network and its statewide LPR tracking database. Stay tuned for more details as that litigation progresses.

ICE staff and agents have access to search Vigilant’s nationwide database, as shown in the above screenshot from public records shared by the Auburn, MA police department.

Is Warrantless, Dragnet Surveillance of Motorists Legal? 

In 2014, in Commonwealth v. Augustine, the Massachusetts Supreme Judicial Court held that police are required to obtain a search warrant to access cell site location information under the Massachusetts Constitution’s Declaration of Rights. Four years later, the Supreme Court applied similar protections nationwide in Carpenter v. United States. The Court reasoned that technology enabling the government to track everyone, to monitor all our public movements, and to do so both in real time and retroactively, posed a significant threat to our Fourth Amendment rights.

LPRs are analogous to cellphone tracking. These AI-enabled cameras track motorists in real time and historically, giving the government the means to track people’s locations in a manner similar to the cell site location information at issue in Augustine and Carpenter.

The Massachusetts Supreme Judicial Court recognized this in 2020, holding in Commonwealth v. McCarthy that “[w]ith enough cameras in enough locations, the historic location data from an [LPR] system in Massachusetts would invade a reasonable expectation of privacy and would constitute a search for constitutional purposes.” Yet today, police in Massachusetts are subject to no state or federal statute governing their use of automatic license plate readers. In the absence of a state statute, police are engaged in mass surveillance of all drivers. 

Flock’s Data Sharing Model vs. the Massachusetts Shield Law 

In August 2024, Massachusetts strengthened its Shield Law, which prohibits Massachusetts law enforcement from providing information or assistance to any other state’s law enforcement agency in relation to investigations into reproductive healthcare or gender-affirming healthcare that is lawful in the Commonwealth. The Shield Law was designed to protect people from other states’ laws that criminalize abortion and restrict access to gender-affirming care, ensuring that people who receive and provide protected healthcare that is lawful in Massachusetts can do so without fear of retribution from out-of-state actors.  

But Massachusetts law enforcement’s use of Flock’s nationwide data sharing undermines the effectiveness of the Shield Law. Out-of-state officers from thousands of agencies across the country can and do access information about where and when people are driving in Massachusetts, and there is nothing stopping them from using that information to track the movements of people in Massachusetts who are seeking or providing protected healthcare. 

These concerns are not hypothetical. Earlier this year, police in Johnson County, Texas performed a nationwide search in Flock’s database to find a woman who they believed had a self-administered abortion — searching LPR data from many states where abortion is legal and protected, including Massachusetts. We know about this investigation because the Texas officer entered “had an abortion, search for female” in the “Reason” field in Flock’s database. Initially, Flock tried to dismiss the criticism stemming from the negative press, suggesting that the woman’s “family feared she was hurt” and that police merely sought to make sure that she was alright. But subsequent reporting from 404 Media, based on public records obtained by the Electronic Frontier Foundation, shows that the police who made this search were pursuing the woman as part of a “death investigation” into the abortion. As 404 Media reporters put it: “In documents created prior to the publication of our article, there is zero mention of concern about the woman’s safety. The records show that the police retroactively created a separate document about the Flock search a week after our article was published, in which they justify the search by saying they were concerned for her safety.”

Flock claims it protects people’s privacy and legal rights by requiring all law enforcement officers to enter a “search reason” before accessing database results. According to Flock’s recent response to similar data sharing concerns in other states, if an out-of-state officer enters a reason that would violate a state law that protects access to reproductive and gender-affirming healthcare (for example, Massachusetts’ Shield Law), that state’s LPR data would be excluded from the search results. But the Flock network audits obtained by the ACLU of Massachusetts demonstrate that the company’s measures are woefully inadequate. 

The network audits show police frequently enter vague, tautological terms like “investigation” or “susp” instead of information about the substance of the investigation. But even if Flock were to prohibit officers from accessing search data unless they provide a substantive reason, police could evade the system’s guardrails. As Flock’s technology attracts more negative media attention and scrutiny from public officials concerned about its data sharing practices, officers in states that criminalize abortion and gender-affirming healthcare could simply decline to provide specific details about their investigations or use terms like “homicide” or “suspicious death” when investigating an abortion case. The scale of the nationwide searches makes case-by-case oversight impossible. For example, a Flock network audit provided to the ACLU of Massachusetts in response to a July 2025 public records request documents over 450,000 searches of the nationwide database in just a 30-day period in the spring of 2025. 

The above screenshot shows an actual Massachusetts police department’s Flock network audit, with personally identifiable information redacted.
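The vague-reason problem described above could be checked mechanically, as in this sketch that scans audit rows for non-descriptive search reasons. The row layout and the list of vague terms are illustrative assumptions, not Flock's actual audit schema.

```python
# Illustrative list of bare, tautological reason strings.
VAGUE_REASONS = {"investigation", "susp", "inv", "case"}

def flag_vague_searches(audit_rows):
    """Return audit rows whose stated reason is a bare, non-descriptive term."""
    return [row for row in audit_rows
            if row["reason"].strip().lower() in VAGUE_REASONS]

# Hypothetical audit excerpt, not real data.
audit = [
    {"agency": "Department A", "reason": "investigation"},
    {"agency": "Department B", "reason": "stolen vehicle, case #25-1034"},
    {"agency": "Department C", "reason": "susp"},
]
print(len(flag_vague_searches(audit)))  # → 2
```

Of course, as the post notes, even a check like this is easy to evade: an officer can simply type a plausible-sounding but false reason, which no keyword filter can detect — and at over 450,000 searches per month, human review of each entry is impossible.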

Flock’s purported solution to comply with state laws that restrict the sharing of information related to investigations of protected healthcare services is no solution at all. Today, out-of-state officers continue to have effectively unrestricted access to Flock’s nationwide database, including data about Massachusetts residents and visitors. 

Flock’s Contract Problem 

Police departments that use Flock and seek to protect the rights of Massachusetts residents from out-of-state or federal scrutiny must turn off the information sharing settings that authorize those entities to search the data they collect. But even when departments opt to restrict sharing in Flock’s system settings, they may not actually be protecting people’s privacy. According to public records reviewed by the ACLU of Massachusetts, the template user agreement many law enforcement agencies sign gives Flock “a non-exclusive, worldwide, perpetual, royalty-free right and license (during and after the Service Term hereof) to (i) use and distribute [] Aggregated Data to improve and enhance the Services and for other development, diagnostic and corrective purposes, other Flock offerings, and crime prevention efforts, and (ii) disclose the Agency Data (both inclusive of any Footage) to enable law enforcement monitoring against law enforcement hotlists as well as provide Footage search access to law enforcement for investigative purposes only” [italics added for emphasis].

The above screenshot shows an excerpt of Flock’s template user agreement, which grants Flock the license to use, distribute, and disclose data collected by a given agency.

What this means in practice is troubling: even when a police department chooses in Flock’s application to restrict data access to its own officers, the template agreement gives Flock the right to disclose the local police department’s data both to law enforcement nationwide and federal agencies for “investigative purposes.” 

Individual police departments can address this problem by amending Flock’s contract language, and some have. For example, when the Boston Police Department (BPD) conducted a pilot of Flock’s technology earlier this year, the department elected in its sharing settings not to share data outside the department — but the BPD also appears to have rewritten Flock’s template contract. Unlike the template contract language that many police departments have agreed to, the Boston Police Department’s agreement does not give Flock the right to disclose its LPR data.

The above screenshot shows an excerpt of the Boston Police Department’s user agreement with Flock, which notably differs from Flock’s standard template user agreement in that the Boston Police Department’s agreement does not give Flock the right to disclose agency data.

But the vast majority of police departments in Massachusetts lack the legal resources and sophistication of Boston’s department, putting them at a serious disadvantage. Flock, which recently received a large investment from Andreessen Horowitz, has been engaged in an aggressive marketing campaign to win more contracts and secure more taxpayer dollars. This campaign includes potent sales tactics, such as invitations to participate in a free TopGolf event and an opportunity for agencies to participate in Flock’s “Project Prove It[,]” in which Flock installs LPR cameras on the roads and the agency can back out at no cost within 45 days. Flock sales staff also offer to work with local Massachusetts departments and other municipal leaders to identify and apply for grant funding for Flock LPR systems. 

Flock draws police departments into its nationwide data sharing network by offering officers and other staff the opportunity to participate in free events, like at TopGolf.
The above screenshot of an email from a Flock salesperson to the Waltham police department shows Flock’s typical sales tactics used to convince even departments that are hesitant to invest in its LPR technology to adopt it through Flock’s “Project Prove It.”
Flock sales staff even offer to help local police departments obtain funding to contract with Flock.

Faced with this onslaught from a company with seemingly endless resources at its disposal, small- and medium-sized police departments across the country are falling prey to aggressive marketing and sales pitches. Individual action by sophisticated departments like Boston’s isn’t enough. Residents in communities with less well-resourced departments deserve protection too.  

How Lawmakers and Law Enforcement Can Address Issues with LPRs 

The scope and severity of this dragnet surveillance make clear that both legislative action and individual police department changes are needed to achieve statewide protections from unrestricted LPR use. 

First, state lawmakers must pass comprehensive LPR legislation. H.3755 strikes the right balance, protecting civil rights and civil liberties while allowing police to use LPRs for legitimate investigations. The bill prohibits law enforcement from using LPRs to monitor individuals based on First Amendment-protected activities, including political protests or religious gatherings. It requires that LPR data be deleted within 14 days unless a record is tied to a specific, ongoing criminal investigation supported by articulable facts. Access to LPR data would be limited to law enforcement for investigative purposes and could not be disclosed outside judicial proceedings. The bill also prohibits buying, selling, renting, or sharing LPR and related location data unless required by judicial process, and bars law enforcement from accessing LPR data collected by another entity without a valid search warrant. These provisions prevent the accumulation and sharing of data on millions of people not suspected of any wrongdoing, while allowing law enforcement to use LPRs in legitimate criminal investigations. This legislation would ensure consistent protections apply for all Massachusetts residents. 
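The 14-day retention rule summarized above can be sketched as a simple purge policy: reads older than 14 days are deleted unless tied to a specific, ongoing criminal investigation. The record layout below is an illustrative assumption, not language from the bill.

```python
from datetime import datetime, timedelta, timezone

RETENTION_WINDOW = timedelta(days=14)

def apply_retention(records, now):
    """Keep a read only if it is within the 14-day window or is
    tied to a specific, ongoing criminal investigation."""
    return [r for r in records
            if r["tied_to_investigation"]
            or now - r["captured_at"] <= RETENTION_WINDOW]

now = datetime(2025, 7, 15, tzinfo=timezone.utc)
records = [
    {"captured_at": now - timedelta(days=20), "tied_to_investigation": False},
    {"captured_at": now - timedelta(days=20), "tied_to_investigation": True},
    {"captured_at": now - timedelta(days=3),  "tied_to_investigation": False},
]
print(len(apply_retention(records, now)))  # → 2
```

A short retention window like this is what prevents the database from accumulating months of location history on drivers never suspected of any crime.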

Second, police departments must take immediate action. Individual departments should stop voluntarily sharing data with out-of-state and federal agencies. They should redraft contracts with Flock to ensure their department retains full control of all data they collect. Finally, police departments must adopt internal policies requiring that every Flock network search be justified by a specific, documented reason for the inquiry, clarifying to all officers and staff that a non-descriptive entry like “investigation” will not suffice. 

Take Action 

Right now, your movements on Massachusetts roads are being tracked and shared with thousands of agencies nationwide. No warrant, no suspicion, no safeguards. 

Join us in calling on the Massachusetts Legislature to pass common sense checks and balances on police use of license plate readers, to ensure that you retain basic civil liberties on the road.  

Email your state legislators today to show your support for H.3755, An Act Establishing Driver Privacy Protections.

This post was updated on February 26, 2026, to reflect that more than 80 police departments across Massachusetts have contracts with Flock Safety.

What we know (and what we don’t know) about BPD’s surveillance camera network
https://data.aclum.org/2025/05/21/bpd-surveillance-video/
Wed, 21 May 2025

Fact check: In Boston, pro-immigrant policies coexist with lower crime rates, not higher ones
https://data.aclum.org/2025/02/24/fact-check-boston-crime-rates/
Mon, 24 Feb 2025

Trump administration officials claim that state and local laws preventing police from participating in federal civil immigration enforcement (sometimes referred to as “sanctuary” policies) make communities more dangerous. But these claims don’t fit the facts.  

The Center for American Progress conducted a nationwide analysis of over 2,000 counties in 2017, finding that crime per capita is significantly lower in sanctuary counties compared to similar non-sanctuary counties. On average, there were 35 fewer crimes committed per 10,000 people in sanctuary counties. These results accord with many other peer-reviewed studies. What’s more, the Center for American Progress report found that sanctuary counties are also more economically prosperous. A 2020 study published by researchers at UC San Diego found that immigrants are less likely to trust local law enforcement if they work with ICE. These studies support the view, long espoused by the ACLU and other supporters of welcoming city policies, that creating a clear boundary between policing and immigration enforcement enhances rather than diminishes community safety. 

In Boston, the city’s Trust Act was signed in August 2014 and amended in 2019. The law prohibits City of Boston officials from using city resources to assist with federal civil immigration enforcement. Recently, the Trump administration and Congressional Republicans have taken aim at Boston and Mayor Michelle Wu, demanding that the city assist with Trump’s plans for mass deportations. Underlying these demands are claims that policies such as Boston’s harm public safety. 

But the facts tell a very different story: FBI and BPD data show crime rates, already declining when the Trust Act passed in 2014, have continued to decline in the nearly 11 years since the legislation became law. Indeed, Boston has reported historic lows in the number of homicides and shootings over the last few years. Both property crime and violent crime, according to FBI Uniform Crime Reporting statistics, have been on the decline since 2005, a decline that continued following the enactment of the Trust Act.  

The data from Boston reflects a trend observed in peer-reviewed studies, with some finding that sanctuary laws actually result in a decrease in crime. Contrary to claims from anti-immigrant officials in Washington, cities across the country would do well to consider Boston a model. 

We saved hundreds of Biden-era AI documents, so you don’t have to
https://data.aclum.org/2025/01/13/biden-era-ai-archive/
Mon, 13 Jan 2025

Image credit: Adapted from Anton Grabolle / Better Images of AI / Classification Cupboard / CC-BY 4.0


On January 20th, Trump will begin his second term in office.

If Trump and his administration follow through on their campaign promises, the transition of power will bring devastating consequences for our fundamental rights and freedoms, particularly for immigrants, reproductive health care seekers, people of color, poor people, and queer and trans communities.

Among the many anticipated impacts, the new administration has signaled it intends to reverse Biden-era progress related to protecting our civil rights and civil liberties in the realms of privacy and artificial intelligence.

The day after the election, Trump announced his intention to repeal a Biden-era executive order regulating AI. Since then, Trump has granted Elon Musk unprecedented influence over government affairs, and hosted scores of tech leaders at Mar-a-Lago. Tech CEOs have in turn donated millions to Trump’s inauguration, with contributions from Apple, Meta, Amazon, and OpenAI far outpacing their donations to Biden in 2020.

These and other developments foreshadow an incoming administration that is going to take a lax approach toward regulating technology companies.

Over the last four years, the Biden administration made important strides in this area, including a directive increasing transparency into government use of AI, FTC enforcement against location data brokers, and a DOJ lawsuit challenging the legality of RealPage’s rent price-fixing algorithm.

These efforts were buttressed by legal documents, reports, blogs, and other records laying out the administration’s efforts to protect consumers and hold Big Tech accountable. But once administrative agencies change hands under Trump, there is no guarantee that documentation of these and other initiatives will remain accessible to advocates, journalists, and interested members of the public.

So, we saved them.

Below, you can view archived copies of over 250 documents and webpages on topics like algorithmic discrimination, generative AI, and biometric surveillance. Try filtering by agency (e.g., DHS, FTC) or keywords (e.g., AI use case inventory, Kochava, risk). Use the drop-down menu to change how many rows you can see at a time.

No matter what happens, the ACLU of Massachusetts remains committed to fighting for law reforms to protect the public interest, civil rights, and civil liberties. Click here to find out ways you can get involved.
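The agency and keyword filtering described above can be sketched as follows. The three rows are a small sample drawn from the table in this post; the function name and row layout are illustrative assumptions about how such a filter might work, not the actual code behind the interactive table.

```python
# Sample rows from the archive: (agency, document title).
docs = [
    ("DHS", "DHS AI roadmap (2024)"),
    ("FTC", "FTC vs Kochava complaint"),
    ("CFPB", "Guidance on black box algorithms"),
]

def filter_docs(rows, agency=None, keyword=None):
    """Narrow archive rows by exact agency and/or case-insensitive keyword."""
    out = rows
    if agency is not None:
        out = [r for r in out if r[0] == agency]
    if keyword is not None:
        out = [r for r in out if keyword.lower() in r[1].lower()]
    return out

print(filter_docs(docs, keyword="kochava"))
```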


Agency | Document | Source
Administrative Conference | AI and regulatory enforcement | Original source
Administrative Conference | Guidance on agency use of AI (2020) | Original source
Administrative Conference | Report on automated legal guidance at federal agencies | Original source
ANSI | Public private partnerships | Original source
CFPB | Adverse action notification on credit algorithms | Original source
CFPB | AI in financial services - comment | Original source
CFPB | AI in home appraisals - blogpost | Original source
CFPB | AI in home appraisals - rule | Original source
CFPB | Background dossiers and AI in employment | Original source
CFPB | Call to action for tech whistleblowing | Original source
CFPB | Fair Credit Reporting and Name-Only Matching Procedures | Original source
CFPB | Guidance on black box algorithms | Original source
CFPB | Guidance on credit denials using AI | Original source
CFPB | No-action letter to Upstart - blogpost | Original source
CFPB | No-action letter to Upstart (credit lender) | Original source
CFPB | On false matches in tenant and employment screening | Original source
CFPB | Proposed registry for AI harm repeat offenders | Original source
CFPB | Rights for job seekers on background screening | Original source
CFPB | Rights for tenants on rental application denials | Original source
CFPB | Tenant screening - consumer snapshot | Original source
CFPB | Tenant screening - market report | Original source
Chief Information Officers Council | Guidance for federal agencies on AI use case inventory (2024) | Original source
Commerce | Department of Commerce AI use case inventory (2023) | Original source
Congress | Advancing American AI act | Original source
Copyright Office | Copyright and AI: Digital Replicas | Original source
Department of State | Department of State AI use case inventory (2024) | Original source
Department of State | Department of State use of AI compliance plan | Original source
DHS | DHS AI roadmap (2024) | Original source
DHS | DHS AI use case inventory - blog post | Original source
DHS | DHS AI use case inventory - landing page | Original source
DHS | DHS AI use case inventory (2022) | Original source
DHS | DHS AI use case inventory (2023) | Original source
DHS | DHS AI use case inventory (2024) | Original source
DHS | DHS playbook for GenAI in the public sector | Original source
DHS | DHS Simplified AI use case inventory landing page | Original source
DHS | DHS use of AI compliance plan | Original source
DHS | Press release on DHS playbook on GenAI in the public sector | Original source
DOD | DOD use of AI compliance plan | Original source
DOJ | AI and disability discrimination in hiring | Original source
DOJ | DOJ AI use case inventory | Original source
DOJ | DOJ Sues RealPage for Algorithmic Pricing Scheme | Original source
DOJ | DOJ Sues Six Large Landlords for Algorithmic Pricing Scheme | Original source
DOJ | US et al. v. RealPage - proposed final judgment | Original source
DOJ | US vs RealPage | Original source
DOJ | US vs RealPage - amended complaint | Original source
DOL | AI and ADS under Fair Labor Standards Act | Original source
DOL | AI and worker well-being principles | Original source
DOL | Federal contractors use of AI | Original source
State | Risk Management Profile for AI and Human Rights | Original source
DOT | DOT AI use case inventory | Original source
DOT | DOT use of AI compliance plan | Original source
Education | AI Discrimination in Education | Original source
Education | Department of Education AI use case inventory | Original source
Education | Department of Education use of AI compliance plan | Original source
Election Assistance Commission | Election Assistance Commission use of AI compliance plan | Original source
Energy | Department of Energy AI use case inventory (2023) | Original source
Energy | Department of Energy GenAI Reference Guide | Original source
Energy | Department of Energy use of AI compliance plan | Original source
EEOC | Addressing adverse impact of AI in employment under CRA | Original source
EEOC | Artificial Intelligence and Algorithmic Fairness Initiative | Original source
EEOC | EEOC use of AI compliance plan (2024) | Original source
EEOC | Guidance on AI in employment and ADA | Original source
EEOC | Implications of big data for equal employment opportunity law | Original source
EEOC | iTutorGroup age discrimination lawsuit | Original source
EEOC | iTutorGroup settlement | Original source
EEOC | Press release on guidance on AI in employment and ADA | Original source
EEOC | Tips for workers on AI and ADA | Original source
EEOC | Visual Disabilities in the Workplace and ADA | Original source
EEOC | Wearables in the workplace under federal discrimination laws | Original source
EOT | AI and future of teaching and learning | Original source
EPA | EPA use of AI compliance plan | Original source
Executive Office of President | Promoting the Use of Trustworthy Artificial Intelligence in the Federal Government | Original source
FCC | AI in robocalls and proposed rule on robotexts | Original source
Federal Housing Finance Agency | Federal Housing Finance Agency use of AI compliance plan (2024) | Original source
FTC | AI and the risk of consumer harm | Original source
FTC | Approaches to Address AI-enabled Voice Cloning | Original source
FTC | Best practices for use of FRT | Original source
FTC | Blogpost on location data cases | Original source
FTC | Explainer on proposed settlements with Avast, X-Mode, and InMarket | Original source
FTC | Explainer on real-time bidding | Original source
FTC | Explainer on surveillance pricing | Original source
FTC | FTC vs Facebook (2012) settlement | Original source
FTC | FTC on deceptive AI claims | Original source
FTC | FTC vs Ascend Ecom deceptive claim lawsuit | Original source
FTC | FTC vs DoNotPay "AI lawyer" deceptive claim - agreed consent order | Original source
FTC | FTC vs DoNotPay "AI lawyer" deceptive claim - complaint | Original source
FTC | FTC vs Ecommerce Empire deceptive claim complaint | Original source
FTC | FTC vs Everalbum Photo App complaint | Original source
FTC | FTC vs Everalbum Photo App press release | Original source
FTC | FTC vs Everalbum settlement | Original source
FTC | FTC vs Facebook (2012) settlement order press release | Original source
FTC | FTC vs Facebook settlement order | Original source
FTC | FTC vs Facebook settlement order press release | Original source
FTC | FTC vs FBA machine deceptive claim | Original source
FTC | FTC vs FBA machine order | Original source
FTC | FTC vs Flo Health press release | Original source
FTC | FTC vs Flo Health statement on settlement | Original source
FTC | FTC vs GravyAnalytics complaint | Original source
FTC | FTC vs GravyAnalytics concurring statement (1 of 3) | Original source
FTC | FTC vs GravyAnalytics concurring statement (2 of 3) | Original source
FTC | FTC vs GravyAnalytics concurring/dissenting statement (3 of 3) | Original source
FTC | FTC vs GravyAnalytics press release | Original source
FTC | FTC vs GravyAnalytics proposed order | Original source
FTC | FTC vs InMarket complaint | Original source
FTC | FTC vs InMarket press release | Original source
FTC | FTC vs InMarket press release on finalized order | Original source
FTC | FTC vs Inmarket proposed order | Original source
FTC | FTC vs Intellivision (unsupported claims about FRT) complaint | Original source
FTC | FTC vs Intellivision blogpost | Original source
FTC | FTC vs Intellivision press release | Original source
FTC | FTC vs Intellivision proposed consent order | Original source
FTC | FTC vs Kochava amended complaint | Original source
FTC | FTC vs Kochava case page | Original source
FTC | FTC vs Kochava complaint | Original source
FTC | FTC vs Kochava concurring statement | Original source
FTC | FTC vs Kochava memorandum decision 02/2024 | Original source
FTC | FTC vs Kochava press release | Original source
FTC | FTC vs MobileWalla complaint | Original source
FTC | FTC vs MobileWalla proposed settlement order | Original source
FTC | FTC vs RiteAid complaint | Original source
FTC | FTC vs RiteAid concurring statement | Original source
FTC | FTC vs RiteAid press release | Original source
FTC | FTC vs RiteAid proposed order | Original source
FTC | FTC vs Rytr (writing assistant) deceptive claim complaint | Original source
FTC | FTC vs Rytr proposed order | Original source
FTC | FTC vs X-mode agreement with consent order (Jan 2024) | Original source
FTC | FTC vs X-mode analysis of proposed consent order | Original source
FTC | FTC vs X-Mode and Outlogic press release on finalized order | Original source
FTC | FTC vs X-mode and Outlogic press release on outcome | Original source
FTC | FTC vs X-mode complaint (April 2024) | Original source
FTC | FTC vs X-mode complaint (Jan 2024) | Original source
FTC | FTC vs X-mode decision and order (April 2024) | Original source
FTC | FTC vs X-mode proposed order (Jan 2024) | Original source
FTC | Health app breach notification rule | Original source
FTC | Impersonation rule press release | Original source
FTC | Operation AI Comply | Original source
FTC | Policy statement on use of biometric information | Original source
FTC | Press release on health app breach notification rule | Original source
FTC | Statement by FTC commissioner on health app breaches | Original source
FTC | Surveillance pricing order press release | Original source
FTC | Surveillance pricing order to file report | Original source
FTC | Warning about biometric surveillance | Original source
GAO | Gov AI accountability highlights | Original source
GAO | Gov AI accountability report | Original source
General Services Administration | GSA use case inventory | Original source
General Services Administration | GSA use of AI compliance plan - governance | Original source
General Services Administration | GSA use of AI compliance plan - responsible innovation | Original source
General Services AdministrationGSA use of AI compliance plan - risksOriginal source
HealthDepartment of Health AI use case inventory (2024)Original source
HHSU.S. Department of Health and Human Services use of AI compliance planOriginal source
HUDFair Housing Act and use of criminal records in housing transactionsOriginal source
HUDFair Housing Act guidance on AIOriginal source
HUDGuidance on AI in advertising of housing, credit and real estateOriginal source
HUDGuidance on AI in tenant screeningOriginal source
HUDHUD AI use case inventory (2023)Original source
HUDHUD AI use case inventory (2024)Original source
HUDHUD use of AI compliance plan (2024)Original source
Industry and Security BureauProposed rule for mandatory AI reportingOriginal source
InteriorDepartment of Interior AI use case inventory (2024)Original source
InteriorDepartment of Interior use of AI compliance planOriginal source
IRSIRS transitions away from FRT for third-party verificationOriginal source
LaborDOL AI use case inventoryOriginal source
LaborDOL use of AI compliance planOriginal source
Library Of CongressAI and copyrightOriginal source
Multi-agency2023 Joint Statement on Enforcement of Civil Rights, Fair Competition, Consumer Protection, and Equal Opportunity Laws in Automated SystemsOriginal source
Multi-agency2024 Joint Statement on Enforcement of Civil Rights, Fair Competition, Consumer Protection, and Equal Opportunity Laws in Automated SystemsOriginal source
Multi-agency2024 use case inventory reporting instructionsOriginal source
Multi-agencyCFPB and Federal Partners on ADSOriginal source
Multi-agencyCFPB and NLRB Announce Information Sharing Agreement Original source
Multi-agencyFederal Agency AI Use Case Inventory (2023)Original source
Multi-agencyFederal Agency AI Use Case Inventory (2024)Original source
Multi-agencyFederal Agency AI Use Case Inventory READMEOriginal source
Multi-agencyJoint statement against AI biasOriginal source
Multi-agencyJoint statement on ADS enforcementOriginal source
NAIACAI literaryOriginal source
NAIACAI positive impact on science and medicineOriginal source
NAIACAI's Procurement ChallengeOriginal source
NAIACData Challenges and Privacy Protections for Safeguarding Civil Rights in GovernmentOriginal source
NAIACExpand the AI Use Case Inventory by Limiting the ‘Common Commercial Products’ ExceptionOriginal source
NAIACExpand the AI Use Case Inventory by Limiting the ‘Sensitive Law Enforcement’ ExceptionOriginal source
NAIACFAQs on Foundation Models and GenAIOriginal source
NAIACFindings and recommendations on AI SafetyOriginal source
NAIACFindings on The Potential Future Risks of AIOriginal source
NAIACGenAI risksOriginal source
NAIACHarnessing AI for Scientific ProgressOriginal source
NAIACImplementing the NIST AI Rights Management FrameworkOriginal source
NAIACInstitutional Structures to Support Safer AI SystemsOriginal source
NAIACNAIAC webpagesOriginal source
NAIACNational Artificial Intelligence Advisory Committee: Year 1 ReportOriginal source
NAIACNational Artificial Intelligence Advisory Committee: Year 2 ReportOriginal source
NAIACNational Campaign on Lifelong AI Career SuccessOriginal source
NAIACOn adverse event reporting of emerging risks from AI Original source
NAIACOn field testing of law enforcement AI toolsOriginal source
NAIACOn implementing the NIST AI Safety InstituteOriginal source
NAIACPublic Summary Reporting on Use of High-Risk AIOriginal source
NAIACPublic Use Policies for High-Risk AIOriginal source
NAIACRationales, Mechanisms, and Challenges to Regulating AIOriginal source
NAIACReport on impact of AIOriginal source
NAIACReport on law enforcement use of AIOriginal source
NAIACResponsible Procurement Innovation for AI at Government AgenciesOriginal source
NAIACStatement of support on Safe, Secure and Trustworthy AI EOOriginal source
NAIACStatement on AI and Existential RiskOriginal source
NAIACTowards Standards for Data Transparency for AI ModelsOriginal source
NAIACWorking Group on Rights-Respecting AIOriginal source
NAIRRNAIRR task force final reportOriginal source
NAIRRNAIRR task force interim reportOriginal source
NAIRRNational AI Research Resource pilot launchOriginal source
NASANASA use of AI compliance planOriginal source
NISTAI risk management frameworkOriginal source
NISTAI risk management playbookOriginal source
NISTAI technical standardsOriginal source
NISTGenAI risk managementOriginal source
NISTGenAI software development practicesOriginal source
NISTNIST landing post on Safe, Secure, Trustworthy AI EOOriginal source
NISTOne-pager on carrying out Safe, Secure, Trustworthy AI EOOriginal source
NISTRisks of foundation modelsOriginal source
NISTSecure Software Development FrameworkOriginal source
NISTStandards for identifying biasOriginal source
NSFNSF 2023/2024 AI use case inventoryOriginal source
NSFNSF use of AI compliance planOriginal source
NSSCETNational Standards Strategy for Critical and Emerging Technology roadmapOriginal source
NSTTrustworthy AI R&DOriginal source
Nuclear Regulatory CommissionNuclear Regulatory Commission use of AI compliance planOriginal source
OETGuidance for developers on AI in EducationOriginal source
Office of Personnel ManagementOffice of Personnel management use of AI compliance planOriginal source
OMBBlogpost on responsible acquisition of AIOriginal source
OMBMemo on AI governanceOriginal source
OMBMemo on responsible acquisition of AIOriginal source
OSTPBlueprint for AI Bill Of RightsOriginal source
SECSEC AI use case inventory 2024Original source
SECSEC use of AI compliance planOriginal source
SSASSA use of AI compliance planOriginal source
TreasuryAI use case inventory May 2023Original source
TreasuryTreasury use of AI compliance planOriginal source
US Commission on Civil RightsAI in K-12 educationOriginal source
US Commission on Civil RightsCivil rights implications of algorithmsOriginal source
US Commission on Civil RightsCivil rights implications of FRTOriginal source
US Commission on Civil RightsCivil rights implications of FRT factsheetOriginal source
US Commission on Civil RightsUSCCR use of AI compliance planOriginal source
USAIDUSAID use of AI compliance planOriginal source
USDAUSDA AI strategyOriginal source
USDAUSDA AI use case inventory (2024)Original source
USDAUSDA use of AI compliance planOriginal source
VeteransDepartment of Veterans Affairs use of AI compliance planOriginal source
White HouseEO on Safe, Secure and Trustworthy AIOriginal source
White House EO on Safe, Secure and Trustworthy AI - press releaseOriginal source
White HouseDelivering on the Promise of AI to Improve Health OutcomesOriginal source
White HouseEO on Responsible AIOriginal source
White HouseFramework on AI governance in national securityOriginal source
White HouseNational Standards Strategy for Critical and Emerging Technology - press releaseOriginal source
White HouseThe Cost of Anticompetitive Pricing Algorithms in Rental HousingOriginal source
White HouseVoluntary Commitment on AI - press releaseOriginal source

Can’t find what you’re looking for? It may be available on the Internet Archive.

]]>
Eyes in the Sky: Big Brother is (still) watching https://data.aclum.org/2024/06/24/government-drones-ma-2023/ Mon, 24 Jun 2024 17:28:19 +0000 https://data.aclum.org/?p=74168

New records obtained from the Federal Aviation Administration (FAA) in December 2023 show that the number of drones licensed by government agencies has gone up across the Commonwealth. We’ve updated our interactive tool, which lets anyone explore the dataset and identify drones owned by public entities in their communities.

Search Government Drones in Massachusetts

Keep reading to see what we learned. 

In 2021, we published a report detailing data acquired by the ACLU on government agencies’ use of drones in Massachusetts. According to late 2023 data, it remains the case that almost half (43%) of active government drones in Massachusetts are registered to police departments. The Massachusetts State Police has the largest number of drones of any police agency in the state. 

In 2022, we published documents revealing police used drones to monitor Black Lives Matter protests in five cities in Massachusetts, including Boston. Video feeds from these drones were streamed in real-time to local police departments and the State Police “Commonwealth Fusion Center,” which shares information with federal agencies and out-of-state police entities.  

Protecting public safety?  

One of the most common drone models registered by government agencies comes from the DJI Matrice line. A single Matrice drone costs between $10,000 and $20,000. With 133 Matrice drones in operation in Massachusetts as of 2023, these drones alone likely cost taxpayers between roughly $1.3 and $2.7 million.
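The fleet-cost estimate above is straightforward arithmetic; a quick sketch (unit prices are the $10,000–$20,000 range quoted above, not actual contract figures) makes the range explicit:

```python
# Rough cost range for Massachusetts' DJI Matrice fleet.
# Unit prices are the range cited in the post; actual contract
# prices vary by model and configuration.
MATRICE_COUNT = 133
UNIT_PRICE_LOW, UNIT_PRICE_HIGH = 10_000, 20_000

low_estimate = MATRICE_COUNT * UNIT_PRICE_LOW    # $1,330,000
high_estimate = MATRICE_COUNT * UNIT_PRICE_HIGH  # $2,660,000
print(f"${low_estimate:,} to ${high_estimate:,}")
# prints "$1,330,000 to $2,660,000"
```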

Despite the hefty price tag, DJI drones – especially the Matrice and Mavic lines – have been prone to crashes. In August 2022, a police-operated DJI Mavic 2 Enterprise drone used to locate a suspect in the UK crashed into a building after its battery failed and it plummeted 130 feet. According to FAA documents, the Massachusetts State Police has registered eight DJI Mavic 2 Enterprise drones.

The DJI Matrice 210 drone is even less reliable. Reports of crashes were frequent enough that, in June 2020, the website www.reportdroneaccident.com urged Matrice 210 pilots to “not fly over any people,” echoing a warning the drone’s manufacturer issued in 2018. That same year, an FAA-certified pilot and drone expert with the Wake Forest Fire Department reported that “three public safety agencies … had batteries fail in flight” on the Matrice 210. In 2020, a Matrice 210 failed at 270 feet, crashing hard enough that a piece of the drone ended up “buried 8 inches deep.” Based on the DROPS standards, if a Matrice 210 were to fall on someone from a height of just six feet or more, the resulting injury would likely be fatal.  
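DROPS-style assessments classify dropped-object severity from an object's mass and fall height. Purely as an illustration (the ~4.8 kg takeoff weight is a published Matrice 210-class spec, not a figure from this post, and real DROPS charts also consider object shape and what is struck), the impact energy of a six-foot fall can be estimated with E = mgh:

```python
# Illustrative impact-energy estimate for a falling drone, E = m * g * h.
# Assumptions (not from the post): mass ~4.8 kg (Matrice 210-class
# takeoff weight), drop height of 6 ft.
MASS_KG = 4.8          # assumed takeoff weight, kg
G = 9.81               # gravitational acceleration, m/s^2
height_m = 6 * 0.3048  # 6 feet converted to metres

energy_joules = MASS_KG * G * height_m
print(f"{energy_joules:.0f} J")  # roughly 86 J on impact
```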

These crashes are not due to pilot error but rather stem from known issues with the technology itself. Despite these problems, as of December 2023, Massachusetts government agencies had 58 active Matrice 210 drones.  

More drones, more money, more problems 

The December 2023 FAA data shows that, across Massachusetts, the total number of drones registered by government agencies increased by 169 between 2021 and 2023. The Massachusetts State Police acquired 19 additional drones, a 25% increase in the department’s drone fleet. Likewise, the Norfolk County Sheriff’s Department, which had a single drone in 2021, had acquired 19 more by 2023. Four of those new drones were Mavic 2 Enterprise models, the same line discussed above.  

Two years ago, we raised concerns that government entities, particularly police departments, were not doing enough to prevent the misuse of drones or drone footage. In 2022, the Worcester City Council approved a request by the Worcester Police to purchase a $25,000 drone, even though the department had come under fire from homeless advocates for using a drone to monitor people at encampments.  

While Massachusetts has no laws on the books regulating police use of drones, the ACLU of Massachusetts supports legislation that would ban the weaponization of drones and other robots. 

Search government drones in Massachusetts 

If you want to look at the data yourself, you can use our interactive tool to explore the dataset or download it in full.   

Search Government Drones in Massachusetts

If you’re interested in learning more about how government agencies in Massachusetts use drones, you can use our model public records request to find out. More information about this process and relevant resources are available here. 

]]>
Boston Police Records Show Nearly 70 Percent of ShotSpotter Alerts Led to Dead Ends https://data.aclum.org/2024/04/08/boston-shotspotter/ Mon, 08 Apr 2024 14:21:33 +0000 https://data.aclum.org/?p=73861

Image credit: Sketch illustration by Inna Lugovyh

The ACLU of Massachusetts has acquired over 1,300 documents detailing the use of ShotSpotter by the Boston Police Department from 2020 to 2022. These public records shed light for the first time on how this controversial technology is deployed in Boston.  

ShotSpotter — now SoundThinking — is a for-profit technology company that uses microphones, algorithmic assessments, and human analysts to record audio and attempt to identify potential gunshots. A public records document from 2014 describes a deployment process that considers locations for microphones including government buildings, housing authorities, schools, private buildings and utility poles.

According to city records, Boston has spent over $4 million on ShotSpotter since 2012, deploying the technology mostly in communities of color. Despite the hefty price tag, in nearly 70 percent of ShotSpotter alerts, police found no evidence of gunfire. The records indicate that over 10 percent of ShotSpotter alerts flagged fireworks, not weapons discharges.

The records add more evidence to support what researchers and government investigators have found in other cities: ShotSpotter is unreliable, ineffective, and a danger to civil rights and civil liberties. It’s time to end Boston’s relationship with ShotSpotter. Boston’s ShotSpotter contract expires in June, making now the pivotal moment to stop wasting millions on this ineffective technology.

Boston’s relationship with ShotSpotter dates from 2007. A recent leak of ShotSpotter locations confirms ShotSpotter is deployed almost exclusively in communities of color. In Boston, ShotSpotter microphones are installed primarily in Dorchester and Roxbury, in areas where some neighborhoods are over 90 percent Black and/or Latine.  

Coupled with the high error rate of the system, BPD records indicate that ShotSpotter perpetuates the over-policing of communities of color, encouraging police to comb through neighborhoods and interrogate residents in response to what often turn out to be false alarms.  

For each instance of potential gunfire, ShotSpotter makes an initial algorithmic assessment (gunshot, fireworks, other) and sends the audio file to a team of human analysts, who make their own prediction about whether it is definitely, possibly, or not gunfire. These analysts use heuristics like whether the audio waveform looks like “a sideways Christmas tree” and whether there is “100% certainty of gunfire in the reviewer’s mind.” 

ShotSpotter relies heavily on these human analysts to correct ShotSpotter predictions; an internal document estimates that human workers overrule around 10 percent of the company’s algorithmic assessments. The remaining alerts comprise the reports we received: cases in which police officers were dispatched to investigate sounds ShotSpotter identified as gunfire. But the records show that in most cases, dispatched police officers did not recover evidence of shots fired. 

Analyzing over 1,300 police reports, we found that almost 70 percent of ShotSpotter alerts returned no evidence of shots fired.  

In all, 16 percent of alerts corresponded to common urban sounds: fireworks, balloons, vehicles backfiring, garbage trucks and construction. Over 1 in 10 ShotSpotter alerts in Boston were just fireworks, despite a “fireworks suppression mode” that ShotSpotter implements on major holidays.  
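Tallies like these come from coding each report's outcome and computing shares. A minimal sketch of that calculation (the labels and counts below are toy values chosen to mirror the reported percentages, not the actual BPD dataset or the ACLU's coding scheme):

```python
from collections import Counter

# Toy stand-in for coded after-action reports; the real analysis
# drew on over 1,300 records.
reports = (
    ["no_evidence"] * 7   # alerts where no evidence of gunfire was found
    + ["fireworks"] * 1   # alerts that turned out to be fireworks
    + ["confirmed"] * 2   # alerts with ballistic or witness evidence
)

counts = Counter(reports)
total = len(reports)
shares = {outcome: n / total for outcome, n in counts.items()}
print(shares)
# prints {'no_evidence': 0.7, 'fireworks': 0.1, 'confirmed': 0.2}
```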

ShotSpotter markets its technology as a “gunshot detection algorithm,” but these records indicate that it struggles to perform that central task reliably and accurately. Indeed, email metadata we received from the BPD describes several emails that appear to refer to inaccurate ShotSpotter readings. The records confirm what public officials and independent researchers have reported about the technology’s use in communities across the country. For example, in 2018, Fall River Police abandoned ShotSpotter, saying it didn’t “justify the cost.” In recent years, many communities in the United States have either declined to adopt ShotSpotter after unimpressive pilots or elected to stop using the technology altogether.

Coupled with reports from other communities, these new BPD records indicate that ShotSpotter is a waste of money. But it’s worse than just missed opportunities and poor resource allocation. In the nearly 70 percent of cases where ShotSpotter sent an alert but police found no evidence of gunfire, residents of mostly Black and brown communities were confronted by police officers looking for shooters who may not have existed, creating potentially dangerous situations for residents and heightening tension in an otherwise peaceful environment.  

Just this February, a Chicago police officer responding to a ShotSpotter alert fired his gun at a teenage boy who was lighting fireworks. Luckily, the boy was not physically harmed. Tragically, 13-year-old Chicago resident Adam Toledo was not so fortunate; he was killed when Chicago police officers responded to a ShotSpotter alert in 2021. The resulting community outrage led Chicago Mayor Brandon Johnson to campaign on the promise of ending ShotSpotter. This year, Mayor Johnson followed through on that promise by announcing Chicago would not extend its ShotSpotter contract. 

The most dangerous outcome of a false ShotSpotter alert is a police shooting. But over the years, ShotSpotter alerts have also contributed to wrongful arrests and increased police stops, almost exclusively in Black and brown neighborhoods. BPD records — detailing incidents from 2020-2022 — include several cases where people in the vicinity of an alert were stopped, searched, or cited — just because they happened to be in the wrong place at the wrong time.  

For instance, in 2021, someone driving in the vicinity of a ShotSpotter alert was pulled over and cited for an “expired registration, excessive window tint, and failure to display a front license plate.” Since ShotSpotter devices in Boston are predominately located in Black and brown neighborhoods, its alerts increase the funneling of police into those neighborhoods, even when there is no evidence of a shooting. This dynamic exacerbates the cycle of over-policing of communities of color and increases mistrust towards police among groups of people who are disproportionately stopped and searched.  

This dynamic can lead to grave civil rights harms. In Chicago, a 65-year-old grandfather was charged with murder after he was pulled over in the area of a ShotSpotter alert. The charges were eventually dismissed, but only after he had already spent a year in jail.  

In summary, our findings add to the large and growing body of research that all comes to the same conclusion: ShotSpotter is an unreliable technology that poses a substantial threat to civil rights and civil liberties, almost exclusively for the Black and brown people who live in the neighborhoods subject to its ongoing surveillance. 

Since 2012, Boston has spent over $4 million on ShotSpotter. But BPD records indicate that, more often than not, police find no evidence of gunfire — wasting officer time looking for witness corroboration and ballistics evidence of gunfire they never find. The true cost of ShotSpotter goes beyond just dollars and cents and wasted officer time. ShotSpotter has real human costs for civil rights and liberties, public safety, and community-police relationships.  

For these and other reasons, cities including Canton, OH, Charlotte, NC, Dayton, OH, Durham, NC, Fall River, MA, and San Antonio, TX have decided to end the use of this controversial technology. In San Diego, CA, after a campaign by residents to end the use of ShotSpotter, officials let the contract lapse. And cities like Atlanta, GA and Portland, OR tested the system but decided it wasn’t worth it. 

From coast to coast, cities across the country have wised up about ShotSpotter. The company appears to have taken notice of the trend, and in 2023 spent $26.9 million on “sales and marketing”. But the cities that have decided not to partner with the company are right: Community safety shouldn’t rely on unproven surveillance that threatens civil rights. Boston’s ShotSpotter contract is up for renewal in June. To advance racial justice, effective anti-violence investments, and civil rights and civil liberties, it’s time for Boston to drop ShotSpotter. 

An earlier version of this post stated that one of the false alerts was due to a piñata. That was incorrect.


Further reading 

Emiliano Falcon-Morano contributed to the research for this post. With thanks to Kade Crockford for comments and Tarak Shah from HRDAG for technical advice.

]]>
Yes, All Location Data: Separating fact from myth in industry talking points about “anonymous” location data https://data.aclum.org/2024/02/28/yes-all-location-data/ Wed, 28 Feb 2024 18:15:58 +0000 https://data.aclum.org/?p=73762

Image credit: Joahna Kuiper / Better Images of AI / Little data houses (square) / CC-BY 4.0

We carry our phones around wherever we go – and our cellphone location data follows us every moment along the way, revealing the most sensitive and intimate things about us. Everywhere we go, everyone we meet, and everything we do – it’s all accessible to anyone with a credit card, thanks to the data broker industry.  

Apps use location data for a variety of purposes including finding directions, logging runs, ordering food, and hailing rideshares. While this information can be used for legitimate purposes, this sensitive data is also exploited for profit and extremist agendas, putting every cellphone user at risk. In 2023, right-wing extremists capitalized on the unregulated open data marketplace to out gay Catholic priests. This disturbing undertaking was possible because data brokers are allowed to buy location information, repackage it, and sell it to anyone who wants to buy it. And, currently, there’s nothing stopping them.  

As independent researchers have shown time and time again, it is all too easy to trace cellphone location data back to the people holding those phones. 

Data generated by apps are superficially “pseudo-anonymized” by assigning each device a unique string of numbers called a MAID (“Mobile Advertising ID”), known on Apple devices as an IDFA (“Identifier for Advertisers”). But since each MAID is tied to a single device and shared across apps, it is easy to paint a unique picture of someone by aggregating location datapoints across apps.  
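Because the same MAID appears in every app's data, linking records across apps is a trivial join. A hedged sketch of that aggregation (all identifiers, field names, and coordinates below are invented for illustration):

```python
# Two hypothetical app datasets keyed by the same advertising ID (MAID).
fitness_pings = [
    {"maid": "ab12", "lat": 42.355, "lon": -71.065, "app": "run_tracker"},
]
food_pings = [
    {"maid": "ab12", "lat": 42.361, "lon": -71.058, "app": "delivery"},
]

# Aggregate by MAID: each device accumulates a cross-app location trail.
profiles: dict[str, list[dict]] = {}
for ping in fitness_pings + food_pings:
    profiles.setdefault(ping["maid"], []).append(ping)

print(len(profiles["ab12"]))  # prints 2: one device, two apps' worth of points
```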

In fact, just a few data points are sufficient to uniquely identify most individuals. Several highly-cited scientific studies using real-world cellphone location data – including a Scientific Reports research paper – showed that a few linked spatiotemporal data points are enough to uniquely identify most individuals from a crowd. Intuitively, if someone finds out where your phone is between midnight and five a.m., then they know where you likely live. If they then find out where your phone is between nine a.m. and five p.m. on weekdays, then they know where you likely work.  
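The home-and-work intuition above can be sketched directly: take the most common location cell in the overnight window as "home" and the weekday-daytime mode as "work". All timestamps and coordinates below are synthetic, for illustration only:

```python
from collections import Counter
from datetime import datetime

# Synthetic pings: (timestamp, rounded lat/lon cell). Real broker data
# would hold thousands of points per MAID.
pings = [
    (datetime(2024, 1, 8, 2, 30), (42.35, -71.06)),  # Monday, 2:30 am
    (datetime(2024, 1, 9, 1, 15), (42.35, -71.06)),  # Tuesday, 1:15 am
    (datetime(2024, 1, 8, 10, 0), (42.36, -71.05)),  # Monday, 10:00 am
    (datetime(2024, 1, 9, 14, 0), (42.36, -71.05)),  # Tuesday, 2:00 pm
]

def modal_cell(cells):
    """Most frequent location cell among the given points."""
    return Counter(cells).most_common(1)[0][0]

# Overnight (midnight-5 am) mode ~ home; weekday 9-5 mode ~ work.
home = modal_cell(c for t, c in pings if 0 <= t.hour < 5)
work = modal_cell(c for t, c in pings
                  if t.weekday() < 5 and 9 <= t.hour < 17)
print(home, work)  # prints (42.35, -71.06) (42.36, -71.05)
```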

While two location points – home and work – are plenty, data brokers have much more than that. In fact, they peddle sprawling digital dossiers on millions of people, with incredible temporal and spatial detail.

Recently, data broker Kochava was thrown into the spotlight as a result of a shocking investigation by the Federal Trade Commission (FTC). Among other revelations, analysts from the FTC were able to obtain a free sample of cellphone location data and use that information to track someone who visited an abortion clinic all the way back to their home. This data, like all data from data brokers, was supposed to be anonymous – instead, it revealed a person’s private health care practices and real identity. For vulnerable people travelling from states where reproductive health care is now a crime, the open sale of their cellphone location data is a serious matter. But Kochava is not a lone bad apple. They are one company out of a multibillion dollar industry that exists solely to profit off our data, putting us – and our loved ones – at risk.  

Just in case location information is not sufficient to identify someone, it is easy to connect this data with other pieces of information that are easily accessible, such as a person’s public work directory, LinkedIn profile, or by using one of many people search sites that list people’s full names and addresses. Indeed, a spinoff industry has cropped up that offers “identity resolution” services to do just that. For instance, a company called Liveramp partners with several well-known location data brokers, claiming to “resolve data to the user or household level”, helping ad companies “build, configure, and maintain a unified view of your customer, easily connecting customer data from any and all data sources.” Similarly, data brokers like Adobe and Oracle offer identity resolution services to aggregate data across disparate data sources.  

Mobile advertising IDs, as mentioned above, are part of the problem – but not the end of the road. In 2021, Google made some strides to secure MAIDs – but left opting out to more tech-savvy users. Meanwhile, Apple phased out MAIDs for users who don’t explicitly opt in to tracking. While these moves were a step in the right direction, they still leave a lot of room for loopholes. From consent for cookies to Do Not Track requests, the ad industry has historically countered every superficial privacy win with dogged – and successful – efforts to circumvent restrictions. When it comes to the “end” of MAIDs, the ad industry has already developed workarounds, allowing companies to match location data to users using “identity graphs”, even if they lack advertising IDs for those people.  

As one ad tech executive himself described, “when you move to these more restrictive methods, what happens is that all the shady companies … try to find alternative workarounds to the MAID but with methods the user doesn’t have any control over, ultimately hurting end-user privacy.”  

Data brokers claim they want to protect our privacy as much as we do. But we can’t trust that they will choose our privacy over their profits. We need more than superficial solutions.  

That’s why the ACLU of Massachusetts and our partners are working to pass legislation to ban the sale of cellphone location data. The bill would prohibit the tracking and trading of location data for anyone in the state of Massachusetts – a vital defense against a multibillion-dollar industry that profits from our personal data. We can’t do this without your help – so click here to contact your legislator and urge them to pass this crucial legislation. It’s time to end this shady practice once and for all.  

Essential reading 

]]>
Balancing the scales of justice: Why right to counsel in eviction cases is a racial justice and housing justice issue https://data.aclum.org/2024/01/16/eviction-cases-analysis/ Tue, 16 Jan 2024 21:15:08 +0000 https://data.aclum.org/?p=73621

Evictions devastate lives and communities. Research shows evictions lead to displacement from neighborhoods, decreased physical and mental well-being, instability in employment and education, increased likelihood that children will be placed in foster or other out-of-home care, and greater reliance on social service supports.  

Legislation before the Massachusetts Joint Committee on the Judiciary, An Act promoting access to counsel and housing security in Massachusetts (H.1731/S.864), would provide both low-income tenants and low-income owner-occupants with access to full legal representation in eviction proceedings – and thus the crucial fighting power to stay in their homes. This legislation is supported by a broad coalition, including the legal community, health care providers, local politicians, and faith-based organizations. 

Despite the many harms of evictions, only 3 percent of tenants in Massachusetts facing an eviction have a lawyer representing them in housing court. In contrast, over 90 percent of Massachusetts landlords have legal representation in those cases. The result of this imbalance is no surprise: evictions.

While it is illegal to evict someone without going through housing court, this protection is meaningless if tenants have no legal support to fight their pending eviction. Since most eviction cases stem from non-payment of rent, defendants who cannot afford rent likely cannot afford a lawyer either. Tenants without counsel must navigate the confusing court system and complex housing law on their own, and some cannot attend their court hearing at all due to childcare, employment, or transportation barriers. For people with disabilities and those who do not speak English, the barriers are even higher. 

In 2020, as COVID-19 hit Massachusetts, the state put a temporary moratorium on evictions. With large segments of the population out of work, this humanitarian stopgap was essential, allowing people to stay in their homes during a deadly pandemic and stay-at-home orders. But the state moratorium ended in October 2020, followed by the end of the federal moratorium in August 2021. Since then, evictions have snowballed.

Evictions are a racial justice issue. Black and Latine households are more likely than white households to rent. Research indicates these communities are also over-represented in households facing eviction. In Massachusetts, eviction cases and eviction outcomes were more frequent in communities with a higher proportion of Black and Hispanic residents. This correlation was highly statistically significant.

Adults aren’t the only ones affected by evictions – kids are too. On average, 11 percent of children under age 5 in Massachusetts face eviction each year. In Black and Hispanic communities, that figure rises to 27 percent. These evictions fuel a vicious cycle of disrupted education, higher dropout rates, and worse physical and mental health. In this way, evictions inflict lasting generational harms that can scar communities of color for years to come.

We need meaningful action to prevent unfair evictions. Right to counsel will correct the power imbalance that gives landlords an unfair advantage in eviction cases. Tenants deserve a fair process. Massachusetts legislators can balance the scales.  


Learn more: An Act promoting access to counsel and housing stability in Massachusetts (H.1731/S.864)


Further reading:

Anthony Cilluffo, A.W. Geiger & Richard Fry, More U.S. Households Are Renting Than At Any Point In 50 Years, Pew Research Center Fact Tank (July 19, 2017), https://www.pewresearch.org/fact-tank/2017/07/19/more-u-s-households-are-renting-than-at-any-point-in-50-years/  

Jaboa Lake, The Pandemic Has Exacerbated Housing Instability for Renters of Color, Center for American Progress (October 30, 2020), https://cdn.americanprogress.org/content/uploads/2020/10/29133957/Renters-of-Color-2.pdf

Emily Badger, Claire Cain Miller & Alicia Parlapiano, The Americans Most Threatened by Eviction: Young Children, The New York Times (October 2, 2023), https://www.nytimes.com/2023/10/02/upshot/evictions-children-american-renters.html 

Investigating Boston Police Department SWAT Raids from 2012 to 2020

https://data.aclum.org/2023/09/27/investigating-boston-police-department-swat-raids-from-2012-to-2020/ (September 27, 2023)

Today, the ACLU of Massachusetts released a new interactive tool allowing members of the public to visualize and analyze nearly a decade of Boston Police Department SWAT team after-action reports. An ACLU analysis of these reports has identified troubling racial disparities in BPD SWAT raids, particularly when those raids involve drug investigations. 

Review Boston Police SWAT After-Action Reports

About the data 

This analysis draws on after-action reports of BPD SWAT incidents from 2012 to May 2020. After-action reports are created by the BPD SWAT team to document the unit’s deployments. The reports include descriptions of the incident, information about the people involved (the police, targets, and others impacted), the geographic location of the deployment, and details about the level of police force used during the SWAT action, among other details. The BPD released these reports in response to a 2020 subpoena from then-City Councilor Michelle Wu.  

Unfortunately, the documents produced by the BPD were difficult to analyze in bulk; among other problems, they were not machine readable. While the ACLU and our partners made every attempt to ensure the accuracy of the data pulled from these reports, our analysis has some data limitations. For example, the underlying after-action reports occasionally include missing, clearly incorrect, or difficult-to-interpret data. As explained in greater detail in the methodology section at the bottom, some errors may also have arisen in our processing of the records and extraction of information. That said, we engaged in a human review of the documents to ensure any errors in our extraction do not impact the overall analysis of trends.  

 


The data: Disproportionate SWAT policing of Black and Brown communities and people

There are 262 after-action incident reports representing deployments between 2012 and May 2020. A significant portion of these police actions took place in police precincts roughly corresponding to the neighborhoods of Mattapan, Roxbury, and Dorchester.  

According to data reported by the Boston Planning & Development Agency Research Division, these three neighborhoods had the highest percentage of residents who are non-white (i.e., identifying as a race other than white, or as white along with one or more other races) of any Boston neighborhood in 2020. These neighborhoods also had a relatively higher share of residents identifying as Black or African American and/or Hispanic than most other Boston neighborhoods.   

After-action reports typically list one or more incident types indicating the reason(s) for the raid. In order of frequency, the most common types were Search Warrant – Other/Unknown, Search Warrant – Drugs, Mental Health Crisis, and Barricaded Suspect. 

Like other SWAT teams, the BPD SWAT unit is deployed both in response to active crises and in deliberate, planned SWAT raids. Typically, the latter cases involve the serving of search and arrest warrants. While the BPD has discretion about whether to deploy the SWAT unit in response to calls for service, deployments for mental health crises, barricaded suspects, and domestic violence incidents are nonetheless responses to active emergencies in the community. On the other hand, the deployment of the SWAT unit to serve warrants, particularly search warrants, is more reflective of self-directed police work. For that reason, racial disparities in the use of SWAT units to serve search warrants are a significant cause for concern.  

Data indicate that white and Black people in the United States use and sell drugs at approximately the same rates. But the BPD’s SWAT unit serves drug warrants almost exclusively on people of color.  

The data: Targets of the BPD’s SWAT raids

According to the after-action reports, 784 people were subjected to SWAT raids between 2012 and 2020. 270 of these individuals were specifically marked as the “suspects” or “targets” of the raid.  

The ages of all raid subjects ranged widely, from infants to 83-year-olds, with individuals between 20 and 40 years old making up the largest group impacted. For those marked specifically as raid suspects or targets, the age range is narrower: the youngest person was 16 and the oldest was 67.  

According to the reports, 105 children under the age of 18, including 25 children under the age of 5, lived in homes subjected to BPD SWAT raids between 2012 and early 2020. 

Those impacted by raids were predominantly men, accounting for 88% of those targeted by raids and 62% of all individuals subjected to raids. When considering both race and gender, Black men were the largest group of those both subjected to raids (38%) and marked as raid suspects/targets (50%).

There are significant racial disparities both in those targeted by raids and in all those impacted. 56.3% of those targeted by raids were Black non-Hispanic and 8.5% were Black Hispanic. These figures are disproportionate to the share of Black residents in Boston. As of the 2020 Census, 25.5% of Boston’s population was Black or African American; this figure includes multiracial residents and both Hispanic and non-Hispanic residents. Of this, 22% of Boston’s population was Black non-Hispanic, and 3.4% was Black Hispanic. Hispanic residents are also disproportionately impacted by raids: 28.7% of those targeted were recorded as Hispanic, White Hispanic, or Black Hispanic, yet the 2020 Census found that Hispanic or Latino residents made up just 19% of Boston’s population. 

The disproportionate share of Black and Hispanic individuals impacted by raids varies by incident type; the disparity is notable in “officer-initiated” raids* such as those for search warrants. In contrast, the demographics of people subjected to SWAT team involvement in crisis-related incidents, such as mental health crises, more closely reflect the demographics of Boston.  

*To better understand BPD SWAT unit practices, we categorized incident types into “officer-initiated” incidents such as search warrants or arrest warrants and “crisis-related” incident types that reflect an emergency such as a mental health crisis. 

Another way to view this disparity is to look at the share of white non-Hispanic residents impacted by raids. White non-Hispanic people are underrepresented in officer-initiated raids: Boston’s population is 44.6% white non-Hispanic, but only 6.5% of the people impacted by officer-initiated raids were white. In contrast, white people were involved in crisis-related raids at a rate more proportionate to the overall population; 33.5% of those impacted by these raids were white.  

The data: SWAT team after-action reports document frightening, militarized surprise raids 

Reports detail the SWAT team battering down doors, deploying aerosol grenades, waking individuals asleep in bed, and detaining relatives of suspects and children. According to the reports, well over half of the people subjected to this level of force were not even the targets of the operation but were just roommates, partners, family members, or children who were present in the home during the raid.  

SWAT raids can even harm people who live near a person wanted by police. One record details a raid in which no one in the household was a target of the operation: the SWAT team failed to go to the correct address on a search warrant and wrongfully invaded another family’s home. In 2018, the ACLU of Massachusetts represented this family in a lawsuit against the BPD. As our lawsuit on behalf of the family alleged, “[the BPD SWAT team] used a battering ram to break down the door, trained guns on and handcuffed the parents … and their 15-year-old child, and ransacked the home with two younger children present.” The family suffered lasting emotional distress from this assault. Ultimately, the case was settled and the City was forced to pay the family $500,000.  

If you are interested in learning more about the practices of the BPD SWAT team, please use the interactive BPD SWAT records review tool. In addition to exploring the data, you can also read the original after-action reports, which include a narrative description of the incident at the end of each document. Please don’t hesitate to contact [email protected] with any significant findings.

Review Boston Police SWAT After-Action Reports

Data Notes and Methodology

The original incident reports were made public as non-machine-readable PDFs. Therefore, the first task of this project involved determining how to extract information from these files in a usable format. Volunteer data scientists began by OCR-ing and processing the PDF files to extract useful fields from the documents. Due to the nature of the PDFs and occasional image quality issues, the programmatic data processing and extraction of these reports was subject to error. As a final measure to address any errors and ensure overall data quality, selected fields were manually reviewed and updated by volunteers. For additional data documentation please see the “data notes” section of the BPD SWAT tool.  
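To illustrate the kind of field extraction this pipeline performed, here is a minimal Python sketch that pulls labeled fields out of OCR'd text with regular expressions. The field labels and sample text below are hypothetical; the actual after-action reports and the project's extraction code differ.

```python
import re

# Hypothetical OCR output; real after-action reports use different layouts.
SAMPLE_OCR = """
INCIDENT TYPE: Search Warrant - Drugs
DATE: 03/14/2017
DISTRICT: B-3
"""

# Illustrative field patterns; a real pipeline would need many more,
# plus tolerance for OCR noise in the labels themselves.
FIELD_PATTERNS = {
    "incident_type": re.compile(r"INCIDENT TYPE:\s*(.+)"),
    "date": re.compile(r"DATE:\s*([\d/]+)"),
    "district": re.compile(r"DISTRICT:\s*(\S+)"),
}

def extract_fields(text):
    """Return a dict of whichever fields were found; missing fields map to None."""
    out = {}
    for name, pattern in FIELD_PATTERNS.items():
        match = pattern.search(text)
        out[name] = match.group(1).strip() if match else None
    return out

record = extract_fields(SAMPLE_OCR)
print(record)
```

Because OCR output is noisy, any such automated pass needs exactly the kind of manual review described above: fields that fail to match, or match garbage, must be checked by a human against the original PDF.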

This data extraction project was possible with the support of: 

  • Tarak Shah, Human Rights Data Analysis Group — programmatically extracted data from the after-action report PDFs 
  • Aaron Boxer, ACLU-MA volunteer — conducted data extraction, processing, manual entry, and analysis 
  • Natasha Ceol, ACLU-MA volunteer — assisted with data quality-checking and redaction of the after-action reports.  
Analyzing Mayor Wu’s FY24 Boston Police Department Budget Recommendation

https://data.aclum.org/2023/05/05/analyzing-fy24-boston-police-department-budget-recommendation/ (May 5, 2023)

See our previous work on policing in Massachusetts

On April 12, Mayor Wu submitted her recommended operating budget for the upcoming fiscal year (FY24) to the Boston City Council, kicking off the public portion of the City’s budgeting process. Among other investments, the Mayor’s budget proposes increasing police department funding by $9.88 million over its FY23 budget. 

Over the coming months, the City Council will review the Mayor’s budget recommendation and can propose and vote on changes. During this same review period last year for the FY23 budget, the City Council unanimously proposed reducing funding for the police department and increasing funding for youth engagement and employment programming, among other proposed changes. These efforts had a limited impact in 2023, however. The Mayor’s budget passed with a few small tweaks from the Council.  

This analysis focuses on the Boston Police Department’s (BPD) budget. Our hope is that this breakdown will be useful for community members, journalists, and other interested parties as the City’s various stakeholders and residents engage in conversations and debate on the budget, which are set to take place between now and July. To explore the full recommended operating budget and design your own changes, please see our (Co)Design the Boston Budget calculator tool.

 


The Mayor’s proposal would increase the Police Department budget by $9.88 million

Of the $4.28 billion city operating budget, the Mayor’s recommended budget allocates $404.97 million to the Boston Police Department.  

The recommendation is an increase of $9.88 million from last year’s BPD adopted budget of $395.09 million. (Note: the City’s recommended budget document describes this as a smaller $9 million increase for the BPD budget, comparing the Mayor’s FY24 recommendation to the FY23 appropriation of $395.91 million rather than the FY23 adopted budget of $395.09 million. This analysis compares the FY24 recommended budget to the adopted budget of previous years, not the appropriated budgets.)
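The two figures reconcile arithmetically. A quick check, using only the dollar amounts quoted above:

```python
# Dollar amounts (in millions) quoted in the City's budget documents.
fy24_recommended = 404.97
fy23_adopted = 395.09        # FY23 adopted budget
fy23_appropriation = 395.91  # FY23 appropriation

# Comparing against the adopted budget yields this analysis's $9.88M figure...
increase_vs_adopted = round(fy24_recommended - fy23_adopted, 2)

# ...while comparing against the appropriation yields the City's ~$9M figure.
increase_vs_appropriation = round(fy24_recommended - fy23_appropriation, 2)

print(increase_vs_adopted, increase_vs_appropriation)
```

Both numbers are correct; they simply answer different questions about which FY23 baseline to measure against.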

In adopted BPD budgets of recent years, there have been significant reductions to the police overtime line item; notably, this FY24 recommended budget does not make additional changes to police overtime. The FY21 and FY22 budgets both significantly reduced the overtime line item from previous fiscal years, and both the FY23 adopted budget and FY24 recommendation preserve those cuts. However, police overtime spending is not constrained by a reduction in the line-item overtime allocation; all police overtime is paid by the city even when it exceeds its appropriation. Since actual overtime spending frequently surpasses its budget line item, the analysis below compares the Mayor’s FY24 BPD budget recommendation both in its entirety and with the overtime line item excluded. 

When excluding the overtime budget, the remaining portion of the Mayor’s recommended FY24 BPD budget is the largest in recent years.  

The proposed increase in the FY24 police budget represents a shift. Since 2020, growth in BPD’s adopted budget has slowed in response to community calls to shift funds from policing into community health and safety initiatives. Excluding the overtime line item, the $12.38 million increase between FY19-FY20 was followed by much smaller increases of $1.94 million between FY20-FY21 and $0.57 million between FY21-FY22. Last year’s FY23 adopted budget, also proposed by Mayor Wu, showed a reduction of $4.78 million from the FY22 budget—the first such reduction to non-overtime sections of the budget since at least FY15. An increase of $9.88 million for FY24 would break from the trend of the past few years. 

 

Where is this additional money going?

The bulk of the Mayor’s proposed budget increase for BPD provides funding for police department staff. According to the City’s budget documents, the FY24 recommended budget would fund “a recruit class to replace project attrition” and allocate $582,000 to the Youth Connect program, which provides social workers for police precincts. Last year’s FY23 budget also increased the size of the police cadet class from 60 to 90; it appears that this budget preserves that increase. 

Here’s how the BPD’s staffing levels under the FY24 recommendation would compare to other city departments, excluding the Boston Public Schools. 

The Mayor’s budget recommendation proposes 2,766 full-time equivalent employees (FTE) at BPD on 1/1/24. This represents: 

    • 3.3x the FTEs of the Public Health Commission 
    • 7x the FTEs of the Public Library 
    • 10.3x the FTEs of the Boston Center for Youth & Families 
    • 11.8x the FTEs of Parks and Recreation 

This is an increase of 108.6 full-time equivalent employees compared to 1/1/23, though as city documents note, the number of employees on January 1 of a given year often varies due to retirement timing and the start of new recruit classes. Budget documents attribute this change in staffing levels to last year’s police class being delayed until April 2023. 

The size of BPD’s full-time equivalent staff exceeds not just other departments but entire city cabinets. The police department is larger than every cabinet besides Education and its own Public Safety cabinet. 

Boston has typically had higher-than-average police staffing. As we previously demonstrated using FBI UCR data, in 2021 Boston had more officers per resident than 70 percent of similarly sized cities. 

BPD has a higher budget than every city department other than Boston Public Schools

Year after year, the Boston Police Department has the second-largest budget of any City department, behind only the Boston Public Schools.

Note: The chart above excludes Boston Public Schools, which has a FY24 recommended budget of $1.45 billion. 

Just as the size of the police department staff surpasses nearly all other departments and cabinets, so does the size of its budget. Per the Mayor’s recommended FY24 budget, the BPD budget would be: 

    • 3.2x the size of the Public Health Commission 
    • 8.5x the size of the Library Department 
    • 9.9x the size of the Environment, Energy, & Open Space Cabinet 
    • 13.5x the size of the Boston Center for Youth & Families 
    • 22.5x the size of Youth Employment and Opportunity 

Boston Police Department overtime spending

The police department’s funds extend beyond the already-sizable adopted budget; the City pays all overtime costs that the police department incurs, even when they exceed the initial budgeted amount. And as we’ve documented, the police also receive large amounts of grant funding from the state and federal governments each year. 

The department continually exceeds its overtime appropriation. Last year’s overage was particularly substantial, since the FY22 police overtime budget was reduced on paper but actual police expenditures did not decline accordingly. The department spent $72.33M on overtime: similar to previous years’ overtime spending but $28.41M more than budgeted. 

To put this overage in perspective, the difference between BPD’s budgeted overtime allocation and its actual overtime spending in one year could fund not just departments but entire City cabinets. $28.41M is enough to cover the FY22 expenditures for the Cabinets of Equity and Inclusion ($9.8M), Economic Opportunity and Inclusion ($7.6M), Community Engagement ($4.22M), Arts and Culture ($3.50M), and Office of Police Transparency and Accountability ($718K) combined, with a couple million to spare. This FY22 overage is also over half of the Boston Public Library’s budget. 
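That comparison can be checked directly from the FY22 figures cited above:

```python
# FY22 expenditures (in millions) for the cabinets and office listed above.
cabinet_spending = {
    "Equity and Inclusion": 9.8,
    "Economic Opportunity and Inclusion": 7.6,
    "Community Engagement": 4.22,
    "Arts and Culture": 3.50,
    "Office of Police Transparency and Accountability": 0.718,
}

# BPD overtime spending above its FY22 appropriation ($72.33M spent vs. budget).
overtime_overage = 28.41

total = round(sum(cabinet_spending.values()), 3)       # combined cabinet cost
remainder = round(overtime_overage - total, 3)         # "to spare" after covering them

print(total, remainder)
```

The combined cabinet expenditures come to roughly $25.84M, leaving about $2.57M of the overage unspent — the "couple million to spare."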

For more on BPD payroll and overtime during the 2022 calendar year, please see our previous analysis. 


Data Sources

  • Boston’s City budget website  
    • Current and past fiscal year budgets at bottom of page; information was obtained from the “operating budget” and “public safety cabinet” sections of the budget documents 


Explore the FY24 budget recommendation

(Co)Design the Boston Budget – Budget Calculator

How to get involved

  • Members of the public may testify at City Council hearings on the budget, which started this week and will continue through early June. Public testimony, including virtual public testimony, will be accepted at each hearing and at the public testimony hearings.  
    • Resources
    • Schedule for BPD budget hearings, public testimony, and working sessions as of 5/5/23. These are subject to change—confirm hearing dates and times through Public Notice. Both the Iannella Chamber and Piemonte Room are on the Fifth Floor of Boston City Hall: 
      • Thursday, May 11 at 10am (Iannella Chamber) – Hearing: BPD Revolving Funds 
      • Thursday, May 11 at 2pm (Iannella Chamber) – Hearing: Grants, Reform, Community Programs, and Crisis Response 
      • Thursday, May 17 at 6pm (Iannella Chamber) – Public Testimony: BPD, BFD, Safety, BEMS 
      • Tuesday, May 30 at 2pm (Piemonte Room) – Amendments Working Session: BFD, BPD 

 


 

The analysis benefitted from feedback and input from Fatema Ahmad (MJL), Lauren Chambers (UC Berkeley I School; ACLUM), and Youth Justice and Power Union. Data analysis was informed by work done in collaboration with Boston University students under the supervision of the BU Spark! Program.
