The comment draws upon past CAP work to answer key questions the ANPR poses, including:
Question 1: To the question of what practices are used to surveil consumers, CAP offers a research-based explanation of common surveillance technologies, including those used in ad targeting, creation of social graphs, third-party cookies, Federated Learning of Cohorts, and software development kits (“SDKs”).
Question 4: To the question of how commercial surveillance harms consumers, CAP outlines literature illustrating the economic, privacy, and consumer protection issues arising from commercial surveillance practices and related services. This synthesis notes that these harms are difficult for consumers to avoid, even with substantial work and expertise, and that they fall asymmetrically on low-income communities and communities of color.
Question 5: To the question of harms that consumers may not easily discern or identify, CAP highlights polling and other measures showing that, while consumers are generally concerned about corporate surveillance of their activities, they have little detailed knowledge or understanding of those practices or their potential impacts on the services and information they receive.
Questions 11 and 17: To Question 11 on commercial incentives and business models that lead to lax data security measures or harmful commercial surveillance practices, CAP outlines common tech industry algorithmic strategies to increase user engagement. Simultaneously, this response draws connections to Question 17 on “techniques that manipulate consumers into prolonging online activity (e.g., video autoplay, infinite or endless scroll, quantified public popularity).” Although Question 17 focuses specifically on children and teenagers, CAP’s response addresses the algorithmic techniques that affect all users, including children and teenagers.
Question 12: To the question of which “commercial surveillance practices are unlawful such that new trade regulation rules should set out clear limitations or prohibitions on them,” CAP encourages the Commission to develop rules that clearly prohibit certain harmful practices in commercial surveillance and data security, especially those practices which present an inherent risk to civil rights.
Question 30: To the question of whether the Commission should commence a Section 18 rulemaking on commercial surveillance and data security, CAP makes an argument in the affirmative. Amidst a lack of regulation, the market for more data has created a race-to-the-bottom on commercial surveillance. Thoughtful regulatory interventions are an appropriate tool to fix this market failure, and it is well within the Commission’s power to do so. There is no reason to believe that continued reliance on self-regulation will produce anything but the status quo: predatory, deceptive practices as the industry standard.
Question 31: To the question of whether the Commission should commence a Section 18 rulemaking exclusively on data security, CAP provides a set of questions about process and scope to help inform the Commission’s decision. It notes that the set of entities affected by a data security rulemaking is likely to be far wider (potentially all entities that store data) than the set of entities that engage in commercial surveillance data collection, monetization, and movement.
Question 65: To the question of the prevalence of algorithmic discrimination based on protected categories such as race, sex, and age, CAP discusses the broad civil rights harms stemming from the rise of online services and their impacts on low-income communities and communities of color.
Question 86: To the question of opacity in different forms of commercial surveillance practices, including technical or legal mechanisms companies use to shield them from public scrutiny, CAP highlights the impact of the loss of third-party analytics firms. Such firms previously sought to provide insight into online activity, especially on large digital gatekeeper platforms. Their closure, or their acquisition by gatekeepers seeking to foreclose access to the limited information they provided, contributes to the excessive opacity with which digital gatekeepers shield even basic operations and widespread practices.
To address harms stemming from commercial surveillance and lax data security, CAP recommends that the FTC start a Section 18 rulemaking. Such ex ante regulation is a critical first step to addressing the harms that online services perpetuate. The FTC must focus on prohibiting the practices that have proved most harmful while remaining flexible regarding scope and innovation. As data security rules could affect far more entities than commercial surveillance rules, CAP urges the FTC to consider the differences in stakeholders and scope between those two topics. A Section 18 rulemaking will undoubtedly be a significant undertaking, but it is an essential, and long-overdue, step toward making the U.S. economy safer and more competitive. CAP applauds the FTC for its effort.