News and Commentary Impacting Data Privacy and Cybersecurity Programs
- A Bipartisan Federal Comprehensive Data Privacy Proposal
- New Additions to the State Consumer Privacy Patchwork
- Peanuts, Cracker Jacks, and Facial Authentication?
- Concluding Consideration
“In Order to Form a More Perfect [Data Privacy] Union”?
Hopes and fears for a federal comprehensive data privacy law revived last week with the bipartisan, bicameral unveiling of the draft American Privacy Rights Act (“APRA”). According to the drafters’ press release, APRA “eliminates the existing patchwork of state comprehensive data privacy laws,” of which there are now 15 and counting. As written, APRA would establish national consumer data privacy rights and data security standards that differ from many state laws. In particular, we will be monitoring these key distinctions:
- Preemption. Preempts comprehensive state consumer privacy laws, but empowers state agencies to enforce APRA themselves.
- Coverage. Includes nonprofits as covered entities, but excludes most “small businesses” (less than $40M annual revenue; covered data of fewer than 200,000 individuals, excluding covered data used strictly to process payment transactions; no revenue from transferring covered data).
- Unique Disclosure, Registry, and Reporting Obligations. For “data brokers”; “covered high-impact social media companies” ($3B in revenue and 300M monthly users); and “large data holders” ($250M in revenue, plus either covered data of more than 5M individuals, 15M portable connected devices linkable to individuals, or 35M connected devices linkable to individuals; or sensitive data of more than 200,000 individuals, 300,000 portable connected devices linkable to individuals, or 700,000 connected devices linkable to individuals).
- Data Minimization. APRA’s default position is to prohibit collecting/processing personal data unless it is “necessary, proportionate, and limited” to provide a requested product/service, to provide an anticipated communication, or to accomplish any of 15 expressly permitted purposes. This default approach contrasts with the trend among state laws, which is to permit uses of personal data provided that the covered entity properly discloses those uses to the consumer and obtains consumer consent when required. APRA’s approach is more akin to that of its European counterpart, the GDPR, which provides that processing of personal data is not lawful unless it falls within one of six lawful bases.
- Broader Definition of Sensitive Data. Expands traditional state scope to include private communications; personal calendar and address data; private photos and recordings; video viewing and online browsing activities over time and across third-party websites; etc.
- Algorithmic Decision-Making and Civil Rights.
- Prohibits processing covered data in a way that discriminates on the basis of race, color, religion, national origin, sex, or disability.
- Requires special impact assessments for large data holders that use a covered algorithm that poses specified consequential risks of harm.
- Requires any entity to provide clear notice and opt-out mechanisms when using a covered algorithm to facilitate a “consequential decision” (i.e., a decision, offer, or advertisement that uses covered data and relates either to opportunities in housing, employment, education, healthcare, insurance, or credit, or to access to any place of public accommodation).
- FTC “Privacy Bureau” and Rulemaking. The Federal Trade Commission would enforce APRA and issue implementing regulations. Within one year of enactment, the FTC must establish a new bureau comparable to its existing bureaus for competition and consumer protection.
- Private Right of Action/Arbitration Prohibitions. Permits private cause of action for violations of certain (not all) consumer privacy rights. Invalidates pre-dispute arbitration provisions as used against minors or against any individual who can demonstrate a “substantial privacy harm” (i.e., minimum $10,000 financial harm; physical/mental harm requiring health care; physical injury; highly offensive intrusions into privacy; or discrimination based on race, color, religion, national origin, sex, or disability).
- Compressed Ramp-Up to Compliance. Effective 180 days after enactment.
The legislative process is only just beginning, and despite the law’s bipartisan origin, it is too early to predict whether Congress will pass APRA or a version of it this election year. Previous attempts foundered on the issues of preemption and private right of action; APRA tweaks both provisions. Next steps will include subcommittee and committee hearings, followed by the introduction of a formal bill, if any.
Latest on State Consumer Privacy Patchwork – Maryland Stands Out
Preemptive federal privacy efforts aside, state legislatures continue to fill the regulatory void left by Congress. With the signing of acts in New Hampshire in March and Kentucky earlier this month, there are now 15 state privacy laws formally on the books. That number will soon grow to 17, as the governors of Maryland and Nebraska are poised to sign their states’ bills.
We continue to see substantial overlap among the state consumer privacy laws. Most companies are now familiar with basic privacy law terminology, consumer rights, and company obligations. Below we highlight key provisions and distinctions in New Hampshire and Kentucky, and we give early attention to Maryland, which promises to be a genuine outlier.
New Hampshire Consumer Data Privacy Act
- Effective Jan. 1, 2025.
- Covered entities: for-profit companies doing business in New Hampshire or offering products and services to New Hampshire residents that either process the personal data of 35,000 consumers (excluding processing strictly for completing payment transactions) or process the personal data of 10,000 consumers while deriving more than 25% of their revenue from the sale of personal data.
- Typical exclusions for employee data, as well as for entities and data regulated by federal privacy laws, such as financial institutions, healthcare providers and business associates, and higher education.
- Attorney General enforcement with 60-day opportunity to cure, which sunsets in 2026. No private cause of action.
Kentucky Consumer Data Protection Act
- Effective Jan. 1, 2026.
- Covered entities: for-profit companies doing business in Kentucky or offering products and services to Kentucky residents that either process the personal data of 100,000 consumers or process the personal data of 25,000 consumers while deriving 50% of their revenue from the sale of personal data.
- Same exclusions as noted above for New Hampshire.
- Attorney General enforcement with 30-day opportunity to cure and no sunset period. No private cause of action.
Maryland Online Data Privacy Act of 2024 (“MODPA”)
Maryland’s law does not copy other state laws. In some ways MODPA more closely resembles the federal approach discussed above. Here are some key differentiators to consider as we await enactment:
- Effective Date. October 1, 2025.
- Data Minimization by Default.
- Prohibits personal data processing except to the extent reasonably necessary for a requested product or service.
- Prohibits sale of sensitive personal data.
- Bans targeted advertising to, and the sale of personal data of, children under 18.
- Prohibits personal data processing that discriminates on the basis of race, color, religion, national origin, sex, sexual orientation, gender identity, or disability, with few specified exceptions.
- Lower Applicability Thresholds. Covered entities conduct business in Maryland or offer products/services to Maryland residents and either process the personal data of 35,000 consumers (excluding processing strictly for completing payment transactions) or process the personal data of at least 10,000 consumers while deriving more than 20% of revenue from sale of personal data.
- Narrower Exemptions. Includes the entity- and data-level exemptions typical of other states, but no general exemption for nonprofits (joining Colorado, Delaware, and Oregon).
- Enforcement. Attorney General enforcement with 60-day opportunity to cure, which sunsets April 1, 2027. No private right of action.
Peanuts, Cracker Jacks, and Facial Authentication – MLB’s “Go-Ahead Entry” Ticketing Policy
Major League Baseball’s (MLB) optional ticketing initiative, “Go-Ahead Entry”, purports to provide participating fans with frictionless entry on game day. The technology, developed by Nippon Electric Company (NEC), relies on a “high-quality selfie” uploaded in advance by the user. The tool analyzes and converts the image into a “unique number” and then deletes the image. MLB links the unique number to tickets purchased by the user (a simplified sketch of this pattern appears after the list below). If your company intends to use similar facial recognition technologies for authentication and access purposes, here is a list of data privacy concerns to consider:
- Biometric Data is Sensitive Data. Biometric data includes facial features and patterns unique to individuals. In this context, state consumer privacy laws generally regard such data as “sensitive data”, requiring heightened protections such as data privacy impact assessments and special disclosures and opt-in/opt-out mechanisms.
- Linking Personal Data. Unique identifiers generated by facial authentication software are linked to the user’s consumer activity. A breach exposing that linkage could reveal personal details, compromise user privacy, and result in identity theft and fraud.
- Data Retention Policies. Establish clear policies regarding retention and deletion of biometric data. Keeping data for longer than necessary increases the risk of unauthorized access and misuse. Regularly purging unnecessary data mitigates these risks.
- Data Storage and Security. Data classification and security remain crucial to the prevention of unauthorized access, data breaches, or misuse of personal information.
- Consent and Transparency. Before collecting and processing biometric data, inform users about how you will use their data, who will have access to it, and what protective measures are in place.
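For readers who want to picture the data flow described above, below is a minimal, hypothetical Python sketch of the general pattern: convert the uploaded image into an opaque identifier, delete the image, link the identifier to tickets, and purge stale records. The embed_face stub, the 30-day retention window, and all names are illustrative assumptions only; this is not MLB’s or NEC’s actual implementation.

```python
import hashlib
import os
from dataclasses import dataclass, field
from datetime import datetime, timedelta, timezone

RETENTION_PERIOD = timedelta(days=30)  # assumed retention window, not a legal standard


def embed_face(image_bytes: bytes) -> bytes:
    """Stand-in for a hypothetical vendor face-embedding model."""
    # A real system would compute a biometric template; hashing keeps this sketch runnable.
    return hashlib.sha256(image_bytes).digest()


@dataclass
class EnrollmentRecord:
    face_token: str    # the opaque "unique number" linked to tickets
    ticket_ids: list
    created_at: datetime = field(default_factory=lambda: datetime.now(timezone.utc))


def enroll(image_path: str, ticket_ids: list) -> EnrollmentRecord:
    """Turn an uploaded selfie into an opaque token, then delete the image."""
    with open(image_path, "rb") as f:
        image_bytes = f.read()
    face_token = hashlib.sha256(embed_face(image_bytes)).hexdigest()
    os.remove(image_path)  # the raw selfie is never retained after tokenization
    return EnrollmentRecord(face_token=face_token, ticket_ids=list(ticket_ids))


def purge_expired(records: list) -> list:
    """Drop enrollments older than the retention window (a data-minimization step)."""
    cutoff = datetime.now(timezone.utc) - RETENTION_PERIOD
    return [r for r in records if r.created_at >= cutoff]
```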
Concluding Consideration
Even the least onerous comprehensive consumer privacy laws create corporate obligations around principles of minimizing data collection and protecting the consumer from unwitting, unfair consequences of profiling and automated decision making. As companies collect and process that data through an expanding array of third-party AI-type solutions, a system that relies on detailed consumer disclosures and consumer action seems unsustainable. Whether Congress establishes a national standard or one emerges from a consensus of state laws, a growing number of companies will likely need to internalize these principles and develop competencies for data privacy compliance, such as:
- conducting data protection impact assessments before rolling out new products, services, or marketing campaigns;
- demanding such assessments of technology partners and solutions prior to purchase; and
- drafting appropriate contractual provisions to assign risk to technology partners when their solutions harm consumers.
For further information or guidance on these issues, please contact:
Sherwin M. Yoder, CIPP/US, CIPP/E and CIPM
Partner
203.575.2649
syoder@carmodylaw.com
This information is for educational purposes only to provide general information and a general understanding of the law. It does not constitute legal advice and does not establish any attorney-client relationship.