Introduction and summary
In a world where much of our daily lives is captured online, major technology and telecommunications companies hold tremendous power over Americans’ data and the fate of people’s digital footprints.1 Particularly under leadership that has boasted about overturning Roe v. Wade,2 there is a dire urgency to protect people’s online privacy and private reproductive health information.3 The U.S. Supreme Court’s decision in Dobbs v. Jackson Women’s Health Organization set a foundation for states to ban abortion across the country,4 and it emboldened opponents of abortion rights, state legislatures, and prosecutors to use any means necessary5 to control pregnant people’s bodies—including monitoring their digital lives.6
Many authors of and contributors to Project 2025, the far right’s authoritarian playbook for the presidential administration, are being proposed as potential candidates for positions in the Trump administration7—a clear sign of an orchestrated effort to criminalize abortion care and surveil people pursuing reproductive health care.8 As part of those efforts, major tech companies could be weaponized as conduits to track and prosecute pregnant people seeking abortion care and their providers.9 If not adequately mitigated, the technology and telecommunications industries could play an instrumental, though inadvertent, role in implementing a backdoor, nationwide abortion ban—that is, imposing severe restrictions that make it functionally impossible to access abortion care—and controlling women’s bodies.10 Furthermore, Black women and other women of color and immigrants who already face heightened levels of surveillance and criminalization could be disproportionately affected by the negative impacts of using tech to track pregnancy outcomes, given that they are already more likely than their white counterparts to be criminalized during pregnancy and motherhood.11 Discriminatory surveillance tactics already used by law enforcement could be further adapted to enforce abortion bans.12
As many people increasingly turn to telemedicine and online options for abortion care,13 weak protections for digital information may put hundreds of thousands of abortion seekers and their providers at risk of criminal prosecution or individual harassment. In states where abortion is banned, law enforcement agencies are increasingly turning to information obtained online to prosecute those who seek or provide abortion care.14 This prosecution is likely to become more common as digital data become more easily accessible.15
Tech companies must be required to act to safeguard Americans’ online information and to prevent the misuse of any type of data that could be used as evidence in pregnancy-related prosecutions. Likewise, the U.S. government must address significant gaps in privacy laws by advancing comprehensive federal privacy protections for all Americans and must establish oversight and enforcement mechanisms to ensure companies comply with the requirements. Individuals can also take steps to protect their sensitive reproductive health information from digital surveillance and misuse by limiting with whom they share data and by exercising caution in online communications, such as by using secure messaging platforms.
HIPAA’s role in protecting individuals’ personal health information
The Health Insurance Portability and Accountability Act (HIPAA) of 1996 establishes federal standards to protect the privacy and security of individuals’ sensitive, personal health information, referred to as “protected health information” (PHI).16 HIPAA’s Privacy Rule aims to ensure that PHI cannot be used or disclosed without an individual’s consent. PHI includes “individually identifiable health information”17 that could identify an individual or reasonably be used to do so. Most health care plans, clearinghouses, and providers—along with their business associates—are required to comply with the rule. These entities are required to implement measures to ensure the confidentiality, integrity, and security of PHI.18 HIPAA also provides individuals with rights over their health data, including the ability to access and correct their PHI and, in certain cases, restrict its use or disclosure.
In 2024, the Biden administration finalized a rule to modify regulations under HIPAA to further strengthen protections for reproductive health information.19 The new rule prohibits the use or disclosure of PHI for the purpose of imposing or conducting investigations or other activities to establish the liability—administrative, civil, or criminal—of any person for the mere act of seeking, obtaining, providing, or facilitating lawful reproductive health care.
However, even with this new rule, HIPAA has notable limitations,20 and these gaps can make individuals vulnerable to the mishandling, sharing, and misuse of their personal health information. Not every entity or business that collects health information is required to adhere to HIPAA.21 For instance, apps that track users’ fitness goals, menstruation, and mental health are not covered by HIPAA, even when they collect sensitive health information.22 This includes instances when a health care provider communicates with a patient through an app that is not a covered entity. This also means that sensitive health information provided to these apps by individuals is potentially vulnerable to unauthorized access and misuse, not only by the app itself but also by third parties such as data brokers, advertisers, or even law enforcement.
Big Tech’s business model facilitates mass surveillance of users seeking reproductive health care
Many tech companies—whether by design or negligence—play a role in the hypersurveillance of people seeking sexual and reproductive health care, particularly abortion care.23 As noted in a September 2024 report from the Federal Trade Commission (FTC), online advertising practices, in particular targeted advertising, incentivize companies to collect mass amounts of data from users for profit.24 With few protective measures in place, companies are encouraged to practice more invasive methods of data collection,25 and this incentive structure provides little motive for companies to self-regulate and protect consumers’ data.
Furthermore, companies employ a “notice and consent” framework as the default approach to user privacy.26 Most commonly, this process presents users with a service’s terms and conditions and privacy policy notice and asks them to give permission or “consent” to the company’s data collection practices by checking a box.27 However, this practice is insufficient in ensuring individuals’ informed consent and protecting their privacy, and it should not be the default framework for user privacy. Research shows that people are inclined to click “I agree” to these notices,28 often without reading them29 or without fully understanding them.30 Under this model, consumers have little to no bargaining power when it comes to how companies collect and use personal data: They are forced to choose between accepting a company’s terms to gain access to a service or declining and being denied access altogether.
The consequences of this business model are far-reaching and daunting in the absence of federal protections for abortion, as the collection of sensitive health data exposes users to the risk of significant privacy invasion and criminalization.31 Rather than collecting data by default, companies should exercise a data minimization approach in which they only collect information that is reasonably necessary to provide a given service.32 This approach would protect sensitive health data by limiting the volume of data collected and retained,33 and it would reduce the risk of unauthorized access to and misuse of people’s data.
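To make the data minimization approach concrete, the sketch below shows one way a service could restrict incoming data to an explicit allowlist of fields it actually needs before anything is stored. The field names and the minimal set are hypothetical illustrations, not any company’s actual schema:

```python
# Illustrative sketch of data minimization: keep only the fields strictly
# necessary to deliver the service, and discard everything else before it
# is ever stored. The field names below are hypothetical.

REQUIRED_FIELDS = {"user_id", "appointment_date"}  # assumed minimal set

def minimize(payload: dict) -> dict:
    """Return only the allowlisted fields from an incoming payload."""
    return {k: v for k, v in payload.items() if k in REQUIRED_FIELDS}

incoming = {
    "user_id": "u123",
    "appointment_date": "2025-03-01",
    "precise_location": (40.7128, -74.0060),  # unnecessary: never stored
    "search_history": ["clinic near me"],     # unnecessary: never stored
}

stored = minimize(incoming)
print(stored)  # {'user_id': 'u123', 'appointment_date': '2025-03-01'}
```

Because the unnecessary fields are dropped before storage, they cannot later be retained, sold, or subpoenaed; data that is never collected cannot be misused.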
Data brokers collect and sell sensitive information to the highest bidder
Data brokers are companies that collect information from different sources and then aggregate and sell that information to other third-party clients.34 Although the information they compile does not include protected health information such as medical records, which are covered by HIPAA, data brokers are still able to create detailed personal profiles and sell them. Although some state laws impose limited requirements on data brokers, the sector remains largely unregulated at both the state and federal levels.35 The U.S. data broker industry generated more than $250 billion in 2023 from the sale of personal information,36 including sensitive reproductive health data, often collected without users’ awareness or informed consent.37 For instance, public records show that 79 data brokers registered in California currently sell precise geolocation data and 25 sell reproductive health information.38
State-level legislation, such as the California Consumer Privacy Act and the California Delete Act,39 attempts to mitigate these risks by requiring companies to disclose what data they collect,40 granting consumers the right to opt out of their data’s sale, and enabling deletion requests.41 The Delete Act goes further by introducing a centralized mechanism for consumers to request deletion of their data from data brokers’ collections. However, these protections rely heavily on individual users to opt out of data collection and sale,42 a burden that many people may not realize they need to shoulder. Furthermore, data brokers operating outside of California are not bound by these rules, creating a patchwork of protections that leaves significant gaps. This underscores the need for federal legislation that standardizes privacy protections across all states, ensuring consistent rules for data brokers regardless of where they operate.
Data brokers can infer intimate private details about a person’s life and activities using seemingly unrelated pieces of data.43 This information may then be sold to anyone, including law enforcement agencies,44 allowing government agencies to exploit a loophole that would otherwise require them to comply with the legal process and obtain a warrant to access certain information.45 Law enforcement agencies have used this loophole to purchase users’ information, circumventing Americans’ constitutional right against unreasonable searches and seizures, because users’ private information has become a commodity for anyone willing to buy it.46 These companies profit from collecting and selling user data at the potential expense of people being prosecuted simply for seeking, receiving, or providing health care.
What happens when government agencies purchase data from brokers?
In the absence of a comprehensive federal privacy law, data brokers can source and package hundreds of thousands of data points on individuals.47 Federal agencies exploiting the data broker loophole are then able to purchase this sensitive information without a warrant and may use it as a surveillance tool.48 For example, a 2020 news report said that some agencies within the U.S. Department of Homeland Security, including U.S. Customs and Border Protection and Immigration and Customs Enforcement (ICE),49 purchased geolocation cell phone data from data brokers Venntel and Babel Street.50 Using this information, the agencies were able to target and track people in certain areas for immigration enforcement. ICE also reportedly accessed a private database containing information from various utility companies, giving the agency access to sensitive records related to people’s employment, housing, credit reports, criminal histories, and more—all without a warrant.51
Although it is unclear exactly how federal agencies have used such information to investigate or prosecute individuals, the use of data unveils a surreptitious surveillance partnership between private companies and the federal government that circumvents Americans’ constitutional right to privacy.52 This type of government overreach threatens abortion access and reproductive rights in general.
The consequences of this for-profit infringement of people’s personal information have become dire as states increasingly consider various ways to ban abortion care and criminalize abortion seekers and providers. In fact, the 210 pregnancy-related prosecutions nationwide from June 2022 to June 2023 represent the highest number ever recorded in just one year.53 The prosecution data show that charging documents may include alleged actions such as “researching or exploring the possibility of an abortion” or “any mention of an abortion procedure or attempt to end pregnancy” as evidence that the person may have violated the law.54 For example, just months after the overturn of Roe, law enforcement in Nebraska issued a warrant to Facebook to release private messages between a mother and her teenage daughter regarding abortion pills.55 The mother, Jessica Burgess, was later sentenced to two years in prison.56
Weak privacy and security practices pose threats to everyone’s online privacy
Tech and telecommunications companies’ failure to adopt stringent privacy protections—including robust data retention policies and other safeguards such as end-to-end encryption, a process for making messages unreadable except by their intended audiences57—means users’ sensitive private health information is constantly at risk.58 It is also crucial to note that in most cases, sensitive health data is combined with other nonsensitive data, making it difficult to parse out and protect in an all-inclusive manner. Insufficient protections leave digital records vulnerable to law enforcement subpoenas and unauthorized access by third parties.59 A 2022 study by Accountable Tech of Google’s location data retention policies revealed that the company was still retaining location search queries and histories for some users,60 despite having issued an updated policy claiming that it would begin deleting location history data when users visited sensitive locations.61 In a follow-up study in 2023, Accountable Tech found that Google still retained sensitive transit information and location search queries from users who visited Planned Parenthood locations in four out of eight experiments conducted across seven states.62 Soon thereafter, Google announced that it would gradually roll out expanded protections for location history data over the course of 2024.63
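A robust data retention policy of the kind described above can be sketched as a periodic purge that drops visits to locations a service classifies as sensitive immediately and deletes all other location records once they age past a fixed window. The schema, categories, and 30-day threshold below are illustrative assumptions, not any company’s actual policy:

```python
from datetime import datetime, timedelta, timezone

# Illustrative sketch of a data retention policy: records of visits to
# sensitive locations are dropped immediately, and all other location
# records are deleted once they exceed the retention window.
RETENTION_WINDOW = timedelta(days=30)     # assumed retention window
SENSITIVE_CATEGORIES = {"health_clinic"}  # assumed classification scheme

def apply_retention(records: list[dict], now: datetime) -> list[dict]:
    """Keep only recent, non-sensitive location records."""
    return [
        r for r in records
        if r["category"] not in SENSITIVE_CATEGORIES
        and now - r["timestamp"] <= RETENTION_WINDOW
    ]

now = datetime(2024, 6, 1, tzinfo=timezone.utc)
records = [
    {"place": "grocery store", "category": "retail",
     "timestamp": now - timedelta(days=5)},
    {"place": "clinic visit", "category": "health_clinic",
     "timestamp": now - timedelta(days=1)},   # sensitive: dropped
    {"place": "old trip", "category": "retail",
     "timestamp": now - timedelta(days=90)},  # stale: dropped
]

kept = apply_retention(records, now)
print([r["place"] for r in kept])  # ['grocery store']
```

The studies cited above suggest the hard part is not the purge logic itself but consistently classifying which records are sensitive and verifying that the policy actually runs against every data store.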
These discrepancies between companies’ privacy policies and their implementation undermine public trust and highlight a lack of meaningful accountability for companies that hold vast amounts of sensitive user information.64 When state privacy legislation insufficiently defines or is silent on what constitutes “sensitive data,” enforcement of privacy is left to the discretion of individual companies and their interpretation of what counts as sensitive information and how it should be protected.65 Without accountable oversight, strict enforcement, and transparent policies, users have little control over how their private data are stored, shared, and potentially used against them.
Types of data subject to misuse and criminalization
Tech and telecommunications companies collect and manage vast amounts of user data, much of which can be weaponized to target, monitor, or prosecute individuals seeking reproductive health care.66 While each data type—such as search histories, location data, and app usage—presents unique privacy challenges, their combination creates a detailed and invasive digital profile. The accumulation of this digital paper trail may serve as a mountain of evidence that prosecutors can use to show intent.67 For example, combining location data with search histories can paint a detailed picture of an individual’s activities, such as searching for abortion clinics and then visiting those locations. This integration amplifies surveillance risks and increases the potential for misuse by law enforcement and government agencies or groups that oppose abortion rights.68
Fears of these data types being used to criminally punish abortion seekers are not theoretical. Pregnant people and their supporters already live in fear of criminalization,69 and the weaponization of their online data further proves why these fears are justified.
Search histories
Internet search and browsing histories about abortion or related topics can be used to infer intentions to end a pregnancy.70 There are already instances of law enforcement agencies accessing and using women’s private messages on social media platforms and search histories as evidence to criminalize them for their pregnancy outcomes and private reproductive decisions.71 For example, in 2017, years before the overturn of Roe, Latice Fisher, a woman in Mississippi, was charged with second-degree murder after she experienced a stillbirth at home.72 Police searched her phone and found she had previously searched for medication abortion.73 Though the case was later dismissed, it shows how private search data can be weaponized to suggest intent to terminate a pregnancy and subject women to criminalization.
Location tracking
One of the biggest threats to people’s data privacy is location tracking.74 Mobile phones can consistently share location data with telecom companies, service providers, apps, and tech companies.75 This may occur without users’ full awareness or consent,76 and often without their understanding of how the data can later be used against them.77 Many states that have banned abortion are also the biggest users of location data to surveil women ending their pregnancies.78
Opponents of abortion rights have already used these data to target people based on their proximity to reproductive health facilities.79 For example, the organization Veritas Society used geolocation tracking technology to run an ad campaign from 2019 to 2022 that sent advertisements against abortion to people visiting Planned Parenthood clinics.80 The campaign demonstrated how organizations can target people with alarming specificity, and it exemplified the much broader issue of targeting people simply for their proximity to reproductive health care facilities. Combined with browsing data and other types of personally identifiable information, location tracking can be used against people seeking care to place them in or around abortion providers.81
Period-tracking apps
Millions of people use period-tracking apps to monitor their menstrual cycles, birth control use, sexual activity, and fertility-related concerns.82 In doing so, users share an abundance of sensitive reproductive health-related information with these apps—data that are not protected by HIPAA83 and, in many cases, are vulnerable to use and sharing by the app itself and by third parties. While some state privacy laws protect certain types of personal data, these laws vary widely and leave significant gaps, especially for users in states without comprehensive privacy regulations. Apps may cooperate with law enforcement during criminal investigations, meaning that period-tracking data could be a target for investigators.84
One industry survey of more than 900 Americans conducted by Secure Data Recovery showed that women’s health apps were some of the least trusted apps among Americans ages 18 to 77 when it comes to concerns about data security.85 This general distrust is buttressed by real-world missteps, as period-tracking apps have come under scrutiny in recent years for alleged misleading privacy practices and disclosures.86 For example, in 2021, the FTC reached a settlement with a popular period-tracking app called Flo for allegedly misleading users about sharing their reproductive health data with “Facebook’s analytics division, Google’s analytics division, Google’s Fabric service, AppsFlyer, and Flurry.”87
Recommendations for a more private, secure future
Digital information can be used to criminally charge, arrest, and prosecute abortion seekers and providers. However, emerging solutions can address the real threats digital surveillance poses to reproductive health care, protect individuals’ privacy, and prevent tech companies from helping to criminalize abortion care.
Address data privacy and reproductive health protections
Tech companies have a crucial role to play in safeguarding user privacy and preventing the misuse of sensitive health information, particularly in the context of reproductive health. By adopting proactive privacy practices that go beyond current legal requirements, companies can help mitigate risks, protect vulnerable users, and set a higher standard for data protection. Key best practices include:
- Implementing data minimization and consent policies: Limit the collection, retention, and sharing of user data to only what is strictly necessary for the functionality of the service.
- Providing transparency and user control: Clearly communicate data collection and usage practices to users in accessible, concise language and explicitly state the purpose for which personal information is being collected. Offer tools that allow users to easily manage their privacy settings, including the ability to permanently delete their data.
- Integrating privacy by design principles: Embed privacy protections into the design of apps and services from the outset, ensuring that user data are protected by default.
- Collaborating on standards and advocacy: Work with policymakers and civil society organizations to develop and promote clear, enforceable standards for data privacy. Advocate for comprehensive federal privacy legislation that addresses existing regulatory gaps and ensures consistent protections nationwide.
Protect individuals at the state level with state shield laws
Since the overturn of Roe, 19 states have enacted shield laws either through executive order or legislation to protect providers and the privacy of people seeking reproductive care.88 State shield laws serve as legal protections for patients, health care providers, and supporters from civil and criminal consequences from out-of-state abortion restrictions. These types of laws will become increasingly important over the next few years as the sexual and reproductive health landscape is tossed into greater uncertainty. Shield laws in California,89 Connecticut,90 and Washington91 generally prohibit their courts and law enforcement agencies from cooperating with out-of-state investigations related to abortion care. Additionally, Washington state enacted H.B. 1155, which expressly protects certain types of consumer information, such as location and reproductive health data, gathered by period-tracking apps.92 These laws help prevent the use of digital surveillance to prosecute people who travel to access care and health care professionals who provide it.
These protections are particularly important in preventing the misuse of sensitive data—such as search histories, location tracking, or data from period-tracking and women’s health-related apps93—that could otherwise be weaponized to prosecute individuals for accessing care across state lines. By limiting cooperation with states that criminalize abortion, shield laws can help reduce the likelihood that individuals’ private data will be exposed and that they will face legal consequences simply for seeking health care. The efficacy of shield laws would be seriously jeopardized, though, were there to be a federal abortion ban.
Protect individuals’ data
While the onus should not be on individuals to prevent the misuse and abuse of their data and private reproductive health information, they can take practical steps to mitigate this risk and protect themselves against mass digital surveillance.94 People can limit data sharing on apps; turn off or limit location services; opt out of personalized and targeted ads; and use encrypted communications, privacy-focused search engines, and secure messaging platforms for reproductive health-related communications.95 Various strategies can be used to defend against location tracking depending on the tracking method employed. For example, the Surveillance Self-Defense project by the Electronic Frontier Foundation covers different threats to abortion seekers and providers and compiles ways to protect the privacy of digital data.96
While these steps can help individual abortion seekers reduce their vulnerability to data misuse, these are merely stopgaps in the absence of stronger, systemic privacy protections. The absence of a comprehensive federal law to protect Americans’ data privacy has allowed tech companies to capitalize on the use of individuals’ data with little oversight and regulation. Fortunately, there are bipartisan opportunities to fix that.
Strengthen federal consumer protections and privacy
In April 2024, the U.S. House voted 219-199 to pass the Fourth Amendment Is Not For Sale Act sponsored by Rep. Warren Davidson (R-OH), a bipartisan bill that takes an important step toward closing the data broker loophole and updating Title II of the Electronic Communications Privacy Act of 1986, the Stored Communications Act.97 Had it been signed into law, the bill would have restricted law enforcement and other government entities from purchasing some communications-related information and location data that they would otherwise need a warrant to obtain.98 The bill would have helped protect Americans against government access to their personal data. However, it had some limitations: It covered only a few types of data, including “communications content, geolocation information, and non-contents information pertaining to a consumer or subscriber of an ECS or RCS provider,” meaning that it did not cover health data or other types of sensitive data.99 In combination with other measures, this law would have been a step in the right direction for securing stronger Fourth Amendment protections regarding digital surveillance. Although the U.S. Senate has not taken action on the bill since receiving it in April 2024, the bill may be reintroduced in future congressional sessions.
The American Privacy Rights Act (APRA), introduced in 2024 and sponsored by Sen. Maria Cantwell (D-WA) and Rep. Cathy McMorris Rodgers (R-WA), is a bicameral bill with bipartisan support that would establish the first-ever comprehensive federal data privacy law in the United States.100 The bill was groundbreaking: Its earliest iteration included foundational civil rights provisions that would protect users against data-driven discrimination.101 It aimed to minimize the amount of personal information companies can collect and sell, strengthen protections for certain health-related data, and give consumers greater control over their personal data. The measure would also require companies to limit the amount of information they collect to only what is necessary, and it would give individuals more power to prevent the transfer or sale of their personal data without their affirmative express consent.102
If passed, the bill would also establish enforcement mechanisms—including through the FTC and state attorneys general and, notably, a private right of action for individuals—to hold violators accountable.103 This would be a vast improvement upon existing data privacy laws and would put more power into the hands of everyday consumers to decide what happens to their data. However, the civil rights safeguards policymakers stripped from the bill after the U.S. House Energy and Commerce Subcommittee on Innovation, Data, and Commerce markup were absolutely crucial for guaranteeing fair privacy protections for everyone, and future bills must include robust civil rights protections if they are to have meaningful and practical utility.104
While APRA’s future is uncertain, as its most meaningful and consequential components were eliminated, Congress must pass a comprehensive federal privacy bill to markedly strengthen existing data privacy laws and protect everyday online consumers from unauthorized access by companies and agencies and from the misuse of their personal data. This includes requiring the adoption of a data minimization framework as a foundation for companies’ privacy policies and implementing civil rights protections to prevent discrimination and biased targeting of protected groups.
Strengthen protections for private, sexual, and reproductive health information
The My Body, My Data Act, first introduced in 2022, was the first congressional action to propose strengthening data protections specifically for sensitive reproductive health information.105 The bill aims to protect personal sexual and reproductive health information by limiting the data that can be collected, retained, and shared by companies, including data brokers. That includes data related to pregnancy, menstruation, contraception, and more. Similar to APRA, it would require companies to follow a data minimization framework and prohibit regulated entities from collecting, retaining, using, or sharing reproductive health data aside from what is strictly necessary to provide a product or service. This bill also provides people with necessary rights to access, delete, and correct their reproductive health information at any time, offering them greater control over how their data are used and shared. If enacted, it would create a uniform standard for protecting reproductive health data nationwide.
The Reproductive Data Privacy and Protection Act of 2024, sponsored by Rep. Ted Lieu (D-CA), would prohibit law enforcement from accessing or utilizing data related to sexual and reproductive health—including location data for clinics; period-tracking app records; records of reproductive-related surgeries and procedures such as in vitro fertilization treatments; and contraception and medication abortion purchases and prescriptions or recommendations for such products—for the purpose of investigating or prosecuting any person.106 The intent is to prevent such sensitive data from being used to investigate, criminalize, or prosecute individuals for seeking or assisting with reproductive health care. If enacted, this bill would help protect Americans against the ever-increasing invasion of private reproductive health information and would be an additional layer of protection against the use of reproductive health data to prosecute any person in connection with inquiring about, seeking, obtaining, providing, or facilitating reproductive or sexual health treatment or care.
Conclusion
As reproductive rights remain under attack across the country, the role of the technology industry in enabling the surveillance and criminalization of abortion cannot be ignored. Through data collection practices such as obtaining search histories, tracking locations, and storing information from period-tracking apps, tech companies have created a system that leaves people vulnerable to hypersurveillance and prosecution simply for seeking health care they need. Technology and telecommunications companies must reckon with the potentially devastating impact of this infringement on people’s privacy, bodily autonomy, and freedom. The current sexual and reproductive health landscape, coupled with the reality of existing data privacy practices, necessitates that individuals do what they can to protect themselves from the misuse of their sensitive data.
By adopting more robust privacy protections, reassessing data practices that leave people vulnerable, and practicing greater transparency, companies can protect people’s rights to make their own health decisions without fear of surveillance or criminalization. Lawmakers must also prioritize bipartisan, comprehensive federal data privacy protections so that all Americans have better control over their data privacy and feel protected in a rapidly innovating sector. U.S. government agencies such as the FTC and the Department of Health and Human Services must act swiftly to oversee and hold these companies accountable to these laws.
The overturn of Roe v. Wade and the resulting patchwork of state abortion bans, along with the lack of privacy laws at the state and federal levels, mean the danger posed by tech surveillance and tracking is acute. That danger to reproductive freedom is likely to persist until abortion is protected again at the federal level.
Acknowledgments
The author would like to thank Amina Khalique, Kate Kelly, Nicole Alvarez, Megan Shahi, Adam Conner, and Andrea Ducas for their reviews of and contributions to this report.