This memo covers significant developments in state and local privacy laws, and related litigation, in 2023.
NEW STATE PRIVACY LAWS
The privacy landscape was transformed by the passage of thirteen comprehensive state privacy laws (CSPLs). Between 2020 and 2022, California, Virginia, Colorado, Connecticut, and Utah passed CSPLs. In 2023, eight states followed: Iowa, Indiana, Tennessee, Montana, Texas, Florida, Delaware, and Oregon. These new laws take effect between July 2024 and January 2026. Despite variances, no state has chosen to reinvent the data privacy wheel. The CSPLs increasingly share common definitions and provisions but differ in certain respects, particularly in their scope of applicability.
SCOPE VARIATIONS
The scope of applicability of the 2023 laws mostly follows pre-existing approaches. They generally exclude governmental institutions, institutions of higher education, and most not-for-profit institutions. All exclude data and entities covered under HIPAA and other specified federal laws. Iowa, Indiana, Oregon, Montana, and Tennessee established thresholds that trigger each act’s applicability, similar to the earlier CSPLs. The Tennessee and Texas Acts have a threshold based on income, as do California and Utah.
While the majority of CSPLs apply to either in-state businesses or businesses that target state residents, Texas has a much broader scope, applying to legal or natural persons who produce products or services consumed by Texas residents. Texas exempts small businesses, as defined by the U.S. Small Business Administration, but still prohibits them from selling sensitive personal data without consumer consent.
Florida applies to for-profit business organizations that conduct business in Florida, generate more than one billion dollars in global gross annual revenue, and determine the purposes and means of processing personal data about consumers. Focused on big tech companies, the Florida Act applies only to entities that: (1) generate 50 percent or more of their global gross annual revenues from the sale of ads online; (2) operate an app store or digital distribution platform that offers at least 250,000 different software applications for consumers; or (3) operate particular kinds of consumer smart speaker and voice command component services.
EMERGING COMPLIANCE FLOOR
All of the 2023 laws contain a comparable basket of core consumer rights, including the rights to know about, access, delete, and port personal data. Except for Iowa, all of the 2023 laws afford residents the right to correct inaccurate personal data (Utah’s earlier law likewise omits this right). All 2023 laws prohibit discriminating against consumers for exercising their privacy rights. All 2023 laws require similar privacy notices, reliable and intuitive methods for exercising consumer rights, appeal mechanisms, and nearly identical frameworks for handling consumer requests.
All 2023 laws share similar definitions of sensitive personal information (SPI), which includes information revealing a person’s racial or ethnic origin, religious beliefs, mental or physical health diagnosis, citizenship or immigration status, and precise geolocation data, as well as personal data collected from a known child. All 2023 laws require the implementation of reasonable data security measures. No 2023 law establishes a private right of action. Except for Iowa, all 2023 laws allow consumers to opt out of the use of their data for sale, targeted advertising, and profiling. Except for Iowa, all 2023 laws require covered entities to conduct data protection assessments.
Significant variations between CSPLs do not necessarily signal higher compliance burdens. The Florida Act, which contains numerous outlier provisions, applies only to businesses that satisfy its billion-dollar revenue threshold. One provision in Tennessee, however, may increase compliance costs: it creates an affirmative defense for any business that adopts a privacy program that reasonably conforms to the National Institute of Standards and Technology Privacy Framework or other documented policies, standards, and procedures designed to safeguard consumer privacy.
EMERGING COMPLIANCE MODELS
There are notable differences among the 2023 laws. Some states define SPI more broadly than others. All 2023 laws define SPI to include genetic or biometric data processed for the purpose of uniquely identifying an individual; Delaware includes any genetic or biometric data regardless of usage. While Texas defines SPI to include information revealing an individual’s “sexuality,” all other 2023 laws use the term “sexual orientation.” Delaware and Montana extend the definition to include sex life and one’s status as transgender or nonbinary, and Oregon likewise includes transgender or nonbinary status. Oregon also extends the definition to include one’s status as a victim of a crime.
While most 2023 laws require consumer consent before processing SPI, Iowa and Florida give consumers the right to notice and the opportunity to opt out of SPI processing. Florida also provides consumers with the opportunity to opt out of the collection of personal data via voice recognition or facial recognition features.
There are also commonalities: three broad approaches to data privacy are exemplified by the California, Virginia, and Connecticut laws. The California Act covers employment contexts, establishes a state privacy protection agency, and creates a limited private right of action. Virginia focuses on harmonizing state law with those global best practices that businesses might find more straightforward to implement. Connecticut comes closest to global best practices, making it easier for U.S. businesses to offer their products and services on the global market.
Of the 2023 laws, Iowa, Indiana, Tennessee, and Texas are modeled after the Virginia law, while Montana, Oregon, and Delaware align with Connecticut. For example, Iowa, Indiana, and Tennessee do not require recognition of a universal opt-out mechanism that would allow consumers to indicate their privacy choices across all websites. As an additional example, Florida, Texas, Oregon, Delaware, and Montana prohibit the use of dark patterns for eliciting consent, while the Iowa, Indiana, and Tennessee Acts are silent on the matter. A dark pattern is a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice. It includes, but is not limited to, any practice the Federal Trade Commission refers to as a dark pattern.
Finally, the 2023 laws take varying approaches to notice-and-cure periods that provide opportunities to remedy violations and avoid enforcement measures. Iowa, Indiana, Tennessee, and Montana establish permanent cure periods, ranging from thirty to ninety days. Delaware, Oregon, and Florida offer cure periods that are either discretionary or set to expire.
CONSUMER HEALTH DATA
Washington and Nevada passed limited consumer health data privacy laws. Each act requires prior consent for the collection or processing of consumer health data. Each includes entity- and data-level exemptions similar to those in most CSPLs and aims to fill privacy and data protection gaps in the consumer health context. Each defines consumer health data similarly to include information regarding gender-affirming care, reproductive or sexual health, and health-related location information. Both definitions also treat biometric and genetic data as consumer health data. Each requires consumer health data privacy policies.
Both laws give consumers the right to have their consumer health data deleted. Under the Washington law, obligations to delete data extend to third parties and affiliates of covered entities. To varying degrees, both laws prohibit geofencing around entities that provide health care services. Both laws require written authorization for the sale of personal health data, and Washington enumerates detailed requirements for such authorization. Unlike Nevada, Washington creates a private right of action. The Nevada Act exempts from coverage information used to access or enable gameplay on a video game platform and information used to identify shopping habits or interests unrelated to health status.
Connecticut amended its data privacy provisions to address consumer health data, which it defines as personal data that a controller uses to identify a consumer’s physical or mental health condition or diagnosis, including, but not limited to, gender-affirming health data and reproductive or sexual health data. Connecticut amended its definition of sensitive data to include consumer health data and one’s status as a victim of a crime. Connecticut now prohibits regulated companies from processing consumer health data without obtaining the consumer’s consent. It also prohibits the use of a geofence near a health facility for the purpose of identifying, tracking, or collecting data from any consumer regarding consumer health data. Connecticut also now requires that regulated companies’ data protection assessments cover the processing of consumer health data.
DIGITAL PRIVACY FOR MINORS
The California Age-Appropriate Design Code Act applies to businesses that develop and provide online services, products, or features that children are likely to access. It requires covered businesses to consider the best interests of children when designing, developing, and providing any online service, product, or feature and to prioritize the privacy, safety, and well-being of children over commercial interests. It prohibits covered businesses from using any child’s personal information in a way that the business knows, or has reason to know, is materially detrimental to the child. The Act requires a high level of privacy as a default setting for children and prohibits the use of dark patterns. It requires privacy policies and other relevant materials to be written in language suited to the age of the children likely to access them. It requires obvious signals to children when they are being monitored or tracked online by parents, guardians, or any other consumer. The Act mandates biennial Data Protection Impact Assessments that identify the risks of material detriment to children arising from a business’s data management practices, along with timed plans to mitigate those risks before children gain access. Opportunities to cure violations are available only to businesses already in substantial compliance with the law. Finally, the Act establishes the California Children’s Data Protection Working Group.
New state laws in Utah, Louisiana, and Arkansas also address the privacy of minors on social media platforms. All three laws prohibit minors from opening or holding accounts on a social media platform without the express consent of a parent or guardian. Utah’s Social Media Regulation Amendments and Louisiana’s Secure Online Child Interaction and Age Limitation Act restrict direct messaging with minors, targeted advertising, and the collection and use of minors’ personal information, and require that a parent or guardian be given a password to access the minor’s account. Utah grants a parent or guardian sweeping access to the minor’s social media accounts. Utah also imposes additional requirements, such as requiring platforms to block minors’ access from 10:30 PM to 6:30 AM; a parent or guardian can change these settings and otherwise limit a minor’s usage. All three laws authorize state enforcement through civil action. Utah empowers its Division of Consumer Protection with rulemaking authority. Utah provides a thirty-day notice-and-cure period for violators, Louisiana provides a forty-five-day cure period, and Arkansas’ Social Media Safety Act does not contain a cure period.
Connecticut amended its data privacy statute to include similar new protections for minors. Connecticut also established an Internet Crimes Against Children Task Force.
Businesses that need to comply with requirements across multiple jurisdictions should be cautious, given the extensive rights granted to parents under the Utah Act, which might conflict with the rights to privacy established for children in other states or under federal law.
COLORADO
Colorado passed a law that requires state and local government agencies, including colleges, that use (or intend to develop, procure, or use) a facial recognition service (FRS) to file a notice with their reporting authority. Colorado also created a task force to consider FRSs.
NEW YORK CITY RULES ON AUTOMATED EMPLOYMENT DECISION TOOLS
The New York City Department of Consumer and Worker Protection adopted first-in-the-country rules restricting the use of automated employment decision tools in hiring. The rules clarify key terms and set requirements for bias audits, notice and disclosure to current and prospective employees, and other obligations for employers and employment agencies covered under the law.
ILLINOIS AND TEXAS BIOMETRIC INFORMATION ACTS
Decisions handed down by the Illinois courts significantly expanded the reach of the Illinois Biometric Information Privacy Act (BIPA).
The Illinois Supreme Court ruled that claims for damages under BIPA accrue on each violation, not just the first. Individuals also now have five years after any alleged BIPA violation to bring claims under the law’s private right of action. These rulings dramatically increase the potential damages a court might award under BIPA.
An Illinois appellate court ruled that BIPA’s requirement for a written retention-and-destruction schedule is triggered on the initial date of possession, not afterward.
Federal courts in Illinois have determined that educational institutions that lend funds directly to students qualify for BIPA’s financial institution exemption in the higher education context.
A federal court in Illinois found that BIPA claims were covered under insurance policies for personal and advertising injuries, including any injury arising out of an oral or written publication, including electronic publication, of material that violates a person’s right to privacy.
A federal court in Illinois granted a defendant’s motion for a new trial on damages in a BIPA class action after initially awarding $228 million. After the jury found 45,600 violations of BIPA, the court set the per-violation award at $5,000, for a total of $228 million. The court later held that the jury, not the court, should have determined the appropriate amount of damages and granted a new trial on that issue.
The Texas attorney general sued Google under Texas’ Biometric Identifier Act, alleging the collection of Texans’ facial and voice recognition information without explicit consent violated that Act.