Your location data is more than a simple point on a map—it’s a revealing digital fingerprint. It can show where you live, where you work, where you worship, and even where you access healthcare. In today’s hyper-connected environment, these movements are silently collected, packaged, and sold to the highest bidder. For those seeking reproductive or gender-affirming care, attending protests, or visiting immigration clinics, this data can become a dangerous weapon.
Last year, privacy advocates raised urgent concerns, calling on lawmakers to address the risks posed by unchecked location tracking technologies. These tools are now increasingly used to surveil and criminalize individuals for accessing fundamental services like reproductive healthcare.
There is hope. States such as California, Massachusetts, and Illinois are now moving forward with legislation designed to limit the misuse of this data and protect individuals from digital surveillance. These bills aim to preserve the right to privacy and ensure safe access to healthcare and other essential rights.
Imagine a woman in Alabama—where abortion is entirely banned—dropping her children at daycare and driving to Florida for a clinic visit. She uses a GPS app to navigate and a free radio app along the way. Without her knowledge, the apps track her entire route, which is then sold by a data broker. Privacy researchers demonstrated how this could happen using Locate X, a tool developed by Babel Street, which mapped a user’s journey from Alabama to Florida.
Although Locate X is marketed as a law enforcement tool, it was accessed by private investigators who falsely claimed an affiliation with authorities. That loophole shows how thin current data protections are, and how easily they can be exploited by anyone willing to pose as law enforcement.
The data broker ecosystem remains largely unregulated, enabling a range of actors, from law enforcement to ideological groups, to access and weaponize this information. The broker Near Intelligence reportedly sold location data on Planned Parenthood visitors to an anti-abortion organization. Meanwhile, in Idaho, cell phone location data was used to charge a mother and her son with aiding an abortion, showing that this data can be turned not only against patients but also against those who support them.
The Massachusetts bill proposes a protected zone of 1,850 feet around sensitive locations, while California takes a broader stance with a five-mile radius. These efforts are gaining support from privacy advocates, including the Electronic Frontier Foundation.
“A ‘permissible purpose’ (which is key to the minimization rule) should be narrowly defined to include only: (1) delivering a product or service that the data subject asked for, (2) fulfilling an order, (3) complying with federal or state law, or (4) responding to an imminent threat to life.”
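To make the “protected zone” idea concrete, here is a minimal sketch of the kind of radius check a collector could run before retaining or selling a location point, using the 1,850-foot and five-mile figures from the Massachusetts and California bills. The coordinates, function names, and enforcement logic are illustrative assumptions, not language from either bill.

```python
import math

EARTH_RADIUS_M = 6_371_000  # mean Earth radius in meters


def haversine_m(lat1, lon1, lat2, lon2):
    """Great-circle distance in meters between two latitude/longitude points."""
    phi1, phi2 = math.radians(lat1), math.radians(lat2)
    dphi = math.radians(lat2 - lat1)
    dlmb = math.radians(lon2 - lon1)
    a = math.sin(dphi / 2) ** 2 + math.cos(phi1) * math.cos(phi2) * math.sin(dlmb / 2) ** 2
    return 2 * EARTH_RADIUS_M * math.asin(math.sqrt(a))


# Radii drawn from the bills discussed above, converted to meters.
MA_PROTECTED_RADIUS_M = 1_850 * 0.3048  # 1,850 feet
CA_PROTECTED_RADIUS_M = 5 * 1_609.344   # five miles


def in_protected_zone(point, sensitive_sites, radius_m):
    """Return True if `point` falls within `radius_m` of any sensitive site."""
    lat, lon = point
    return any(
        haversine_m(lat, lon, site_lat, site_lon) <= radius_m
        for site_lat, site_lon in sensitive_sites
    )


# Hypothetical clinic coordinates and a nearby GPS fix (illustrative only).
sensitive_sites = [(30.3322, -81.6557)]
gps_fix = (30.3450, -81.6700)

for label, radius_m in (("1,850 ft zone", MA_PROTECTED_RADIUS_M),
                        ("5 mile zone", CA_PROTECTED_RADIUS_M)):
    # Under a minimization rule, a point inside the zone would simply
    # never be collected, retained, or sold.
    print(label, in_protected_zone(gps_fix, sensitive_sites, radius_m))
```

With these made-up coordinates, the same GPS fix lands outside the 1,850-foot zone but inside the five-mile one, which is the practical difference between the two radii: the wider the zone, the more of a person’s movements near a sensitive site never enter the data broker pipeline.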
Time and again, we’ve seen location data weaponized to monitor immigrants, LGBTQ+ individuals, and those seeking reproductive care. In response, state legislatures are advancing bills focused on curbing this misuse. These proposals are grounded in long-standing privacy principles such as informed consent and data minimization: collect only the data needed to deliver the service a person actually asked for, and store it securely.
These laws don’t just protect residents. They also give peace of mind to travelers from other states, allowing them to exercise their rights without fear of being tracked, surveilled, or retaliated against.
To help guide new legislation, this post outlines essential recommendations for protecting communities through smart policy design. These include:
- Strong definitions,
- Clear rules,
- Affirmation that all location data is sensitive,
- Empowerment of consumers through a strong private right of action,
- Prohibition of “pay-for-privacy” schemes, and
- Transparency through clear privacy policies.
These protections are not just legal reforms—they’re necessary steps toward reclaiming control over our digital movements and ensuring no one is punished for seeking care, support, or safety.