The Privacy Center's Deputy Director Laura Moy testified before the U.S. House of Representatives on Wednesday during a joint hearing of the House Energy & Commerce Committee's Subcommittee on Communications & Technology and Subcommittee on Digital Commerce and Consumer Protection.
The hearing, titled "Algorithms: How Companies’ Decisions About Data and Content Impact Consumers," covered a wide range of issues, from algorithms and consumer expectations to FCC Chairman Ajit Pai's recently announced proposal to reverse the 2016 Open Internet Order that enshrines net neutrality protections.
In her written testimony, Laura explained the importance of protecting consumers' information in circumstances where consumers have no real choice about whether to share it, such as with Internet service providers and credit reporting agencies:
"Virtually every single consumer shares information about everything they do online with an Internet service provider (ISP). Consumers share this information not because they want to, but because they must. . . Sharing information with an ISP is an unavoidable part of going online.
"An ISP can see what websites its subscribers visit and when they visit them, and can make inferences based on that information. . . In addition, even when consumers’ online activities have been purged of personal identifiers, such as name or a subscriber identifier, browsing histories can still be linked back to specific individuals. . . No other type of actor in the Internet ecosystem has access to as rich and reliable a stream of private information about individual users as ISPs. [...]
"As with Internet service providers, consumers have no choice but to share highly private information with CRAs like Equifax. The massive troves of valuable and potentially damaging information that CRAs maintain are provided by furnishers, not by consumers themselves. This is part of why consumers are so outraged by the recent Equifax breach."
She went on to describe the powerful ways in which algorithms, which often rely on such private data, can impact consumers' daily lives:
"Algorithms may be used to determine which job applicants are invited to come in for an interview, where police officers should patrol, or how long a person convicted of a crime should spend in jail. Algorithms also select much of what we read and see online. They may determine which products are presented to us in advertisements, which movies are recommended to us, which friends’ photos we see, and which news articles we read.
"Algorithmic decision-making may streamline some aspects of our lives, but algorithms can sometimes have flaws that lead to negative or unfair consequences. For example, hiring algorithms have been accused of unfairly discriminating against people with mental illness. Sentencing algorithms—intended to make sentencing fairer by diminishing the role of potentially biased human judges—may actually discriminate against Black people. Search algorithms may be more likely to surface advertisements for arrest records—regardless of whether such records exist—when presented with characteristically Black names.
"The use of consumer data to power algorithmic decision-making deserves particularly close scrutiny when the decisions to be made will affect opportunities for education, healthcare, financial products, or employment.
"Consumers want more control over their private information, and consistently are asking for it. According to a 2016 report from the Pew Research Center, “91% of adults agree or strongly agree that consumers have lost control of how personal information is collected and used by companies,” and 68% believe current laws are not good enough in protecting people’s privacy online. Consumers need clear forward-looking protections that are flexible, strongly enforced, and appropriate based on context."
Laura's testimony—which we highly recommend you read in full—concludes with a concrete set of recommendations to protect consumer information in this rapidly evolving field:
"To improve privacy and data security for consumers, the FTC—or another agency or agencies—must be given more powerful regulatory tools and stronger enforcement authority.
The law should grant an expert agency or agencies the authority to develop prospective privacy and data security rules, in consultation with the public, so that data collectors and users can know in advance what standards apply to consumers’ information. Regulations should also be flexible, allowing agencies to adjust them as technology changes, as the FTC did just a few years ago with the COPPA Rule.
Congress also should ensure that whatever agency or agencies are to be in charge of enforcing privacy and data security standards have substantial civil penalty enforcement authority. . . Regulations are effective to deter violations only if entities fear the punishment that would surely follow.
"[P]rivacy laws and regulations should be context-specific, carefully tailored based on the avoidability of the information sharing, the sensitivity of the information shared, and the expectations of consumers... When information sharing is unavoidable or less avoidable by consumers, it is important that the information be protected. [...]
Policymakers should also consider how the avoidability of any particular choice presented to a consumer may be affected or distorted by other factors that make it unavoidable as a practical matter, such as whether the choice is technically difficult for most consumers to understand or exercise. [...]
"In determining what level of protection should be afforded to information shared in a particular context, policymakers should also examine how sensitive the shared information is. . . Protection for consumers’ information should also be tailored based on consumers’ expectations for how the information will be used."
Laura's written testimony also included specific recommendations in the context of Credit Reporting Agencies, and her oral testimony went into a host of additional issues relating to the importance of net neutrality protections.
As we said, it's worth the read.