AI Genetic and Biometric Data Capture: The Legal Fight for Privacy
Among the most pressing legal concerns in the field of data privacy is the rise of AI-powered tools that capture, analyze, and store genetic and biometric information, often without meaningful notice or consent. From facial recognition platforms and wearable health devices to clinical AI systems and consumer genomics applications, companies across nearly every sector are ingesting biological data at a scale that the legal system is still working to contain.
In this blog, we focus specifically on where AI-driven data collection intersects with two of the most consequential privacy statutes in the United States: the Illinois Biometric Information Privacy Act (BIPA) and the Genetic Information Privacy Act (GIPA). If your biometric or genetic information was captured, stored, or shared without your written authorization, you may have valid legal claims, regardless of whether you suffered financial harm.
What Is AI Genetic and Biometric Data Capture?
AI genetic data capture refers to the automated collection and analysis of biological identifiers that are either genetically derived or unique to a specific individual. This includes far more than direct DNA sequencing. Modern machine learning systems can extract actionable genetic signals from facial geometry, iris scans, voiceprints, gait analysis, and continuous physiological monitoring.
When an AI platform maps the contours of a human face during onboarding, it may be generating a biometric template that doubles as an ancestry or health inference engine. When a wearable device transmits real-time heart rate variability and skin conductance data to a cloud algorithm, that data may be feeding predictive models capable of inferring hereditary disease risk.
The boundary between traditional biometric collection and functional genetic capture is eroding, and companies that ignore consent obligations can be held accountable.
How BIPA and GIPA Apply to AI Data Collection
- BIPA requires private entities to provide written notice, obtain informed written consent, and publish a data retention policy before any collection occurs, and it prohibits profiting from biometric data. It provides a private right of action with statutory damages of $1,000 per negligent violation and $5,000 per intentional or reckless violation. Critically, no proof of actual harm is required — the unauthorized collection itself is actionable.
- GIPA governs the collection, use, and disclosure of genetic testing information by healthcare providers, employers, insurers, and other entities. It prohibits employers from conditioning employment on genetic testing and bars insurers from using genetic data in underwriting decisions. As AI systems grow capable of inferring genetic characteristics from non-genetic inputs, GIPA’s reach is expanding well beyond its original consumer DNA testing context. Penalties reach $2,500 for negligent violations and $15,000 for intentional ones.
What makes the intersection of AI, BIPA, and GIPA especially perilous for companies is that a single AI deployment may trigger violations under both statutes simultaneously.

Key Case Law: Genetic Capture in the AI Context
Courts have issued a few landmark decisions over the past three years that define the scope of liability for AI-driven biometric and genetic data collection. Understanding these rulings is essential context for any potential claimant.
Cothron v. White Castle System, Inc. (Ill. Supreme Court, 2023) In what became the most consequential BIPA ruling of the decade, the Illinois Supreme Court held that a separate violation accrues each time a private entity collects or transmits biometric data without consent — not just at the first instance. White Castle’s fingerprint-based employee system, used daily for years, exposed the company to an estimated $17 billion in aggregate liability.
The ruling triggered a 65 percent surge in BIPA filings almost immediately. While a 2024 legislative amendment later limited recovery to one per person per method of collection, Cothron established that AI systems collecting biometric data on a recurring basis face compounding legal exposure.
Tims v. Black Horse Carriers, Inc. (Ill. Supreme Court, 2023) This decision resolved a long-running dispute over which statute of limitations applies to BIPA claims. The Illinois Supreme Court held that a uniform five-year limitations period governs all BIPA violations. For plaintiffs whose biometric data was captured by AI systems as far back as early 2020, this ruling substantially extends the window to file suit and recover damages.
Zellmer v. Meta Platforms, Inc. (9th Cir., 2024) The Ninth Circuit affirmed dismissal of a BIPA claim against Meta involving the company’s AI-powered “Tag Suggestions” feature, which generated internal “face signatures” from uploaded photos. The court held that to qualify as a biometric identifier under BIPA, the captured data must actually be capable of identifying an individual.
Because Meta’s face signatures could not independently identify a person, they fell outside the statute’s reach. This ruling established an important threshold: AI systems that generate biometric proxies without a direct identification function may avoid BIPA coverage — a nuance that defendants are increasingly leveraging.
Tibbs v. Arlo Technologies (N.D. Ill., 2024) In a significant win for plaintiffs, a federal district court allowed BIPA claims to proceed against smart home AI company Arlo Technologies, whose doorbell and security camera systems used infrared and visible light scanning to map face geometry. This decision is especially relevant to AI robotics and home security companies deploying machine vision systems.
Cisneros v. Nuance Communications, Inc. (7th Cir., 2025) Nuance, a healthcare and enterprise AI company, faced BIPA claims over its voiceprint identification system used in financial advisory call centers. The district court dismissed the case, but the plaintiff appealed. The outcome will have significant implications for AI companies operating across the healthcare and financial sectors that process voice data for identity verification.
GIPA Litigation Surge (2023–2025) More than 100 GIPA class actions were filed between 2023 and early 2025, primarily targeting employers that collected family medical history information during pre-employment physicals — data GIPA treats as genetic information. Courts have been receptive to these claims, and the litigation wave has expanded to include healthcare systems, staffing agencies, and large manufacturers. As AI-assisted hiring tools begin incorporating predictive health screening components, GIPA’s application to algorithmic hiring decisions is an emerging and rapidly developing area of law.
Industries and Companies Facing Scrutiny
- Direct-to-Consumer Genetic Testing Platforms: 23andMe became the subject of a joint regulatory investigation by Canadian and U.K. privacy authorities. Ancestry.com faces ongoing questions about secondary use of DNA data in pharmaceutical and insurance partnerships without sufficiently disclosed consent terms.
- AI Facial Recognition and Surveillance Companies: Clearview AI settled a multi-state BIPA class action for $51.75 million after building a facial recognition database from billions of scraped internet images without consent. In Texas, Meta agreed to a $1.4 billion settlement under that state’s Capture or Use of Biometric Identifier (CUBI) Act for similar conduct — the largest biometric privacy settlement in U.S. history. NEC Corporation and Idemia have also faced scrutiny in government and commercial deployment contexts.
- Healthcare and Clinical AI Companies: Nuance Communications, Amazon Web Services’ HealthLake, and other clinical AI platforms that authenticate patients using voice or facial recognition face layered exposure under both BIPA and GIPA where pre-collection consent frameworks are absent or inadequate. Hospital systems deploying AI documentation tools that passively capture patient biometrics during clinical interactions have also been targeted.
- Robotics and Wearable Technology: Warehouse robotics operators using biometric time and attendance verification, AI security systems with embedded facial recognition, and wearable health tech companies transmitting continuous physiological data to third-party analytical platforms all operate in significant risk zones. Any AI deployment that draws inferences about hereditary disease risk from wearable sensor data may also implicate GIPA independently.
- Employers Across Sectors: Traditional employers requiring pre-employment physicals with family health history questionnaires remain the primary GIPA defendants. Hospitals, staffing firms, logistics companies, and large manufacturers have all faced class actions in recent years. As AI-integrated applicant tracking systems begin flagging health-related risk factors from submitted personal information, the next wave of litigation is expected to target the platforms themselves.
Statute of Limitations: How Long Do You Have?
All BIPA claims are subject to a five-year statute of limitations in Illinois. For GIPA claims, the standard five-year limitations period for civil statutory violations also applies. This means that individuals whose biometric or genetic data was collected without consent as far back as 2021 may still have viable claims today.
Frequently Asked Questions
Do I need to prove I was financially harmed to bring an AI Genetic Capture Privacy claim? No. Both statutes allow claims based on the unconsented collection or use of protected data itself. Courts have consistently recognized that the violation of your privacy rights constitutes a cognizable injury without any accompanying financial loss.
What counts as biometric data in an AI context? BIPA covers fingerprints, retina and iris scans, voiceprints, and scans of face or hand geometry. AI-generated biometric data qualifies if it can be used to identify a specific individual, meaning many machine vision and facial mapping systems used in consumer electronics, security, and healthcare fall within the statute’s reach.
What if the AI company is headquartered outside Illinois? BIPA and GIPA apply based on where the data subject is located at the time of collection, not where the company is incorporated or headquartered. Illinois residents whose biometric or genetic data is collected by out-of-state AI companies have pursued claims under both statutes.
How much could my claim be worth? Under BIPA, statutory damages are $1,000 per negligent violation and $5,000 per intentional or reckless violation, now capped at one recovery per person for the same collection method following the 2024 amendment. Under GIPA, damages reach $2,500 for negligent violations and $15,000 for intentional ones.
What should I do if I think my data was captured without consent? Document any notice or consent forms you did or did not receive, preserve any communications with the company regarding data collection, and consult with a data privacy attorney as soon as possible. The statute of limitations is running, so contact our firm to learn about the full range of your legal options.

Why Hire The Lyon Firm for AI Biometric and Genetic Privacy Cases?
The Lyon Firm has built a focused national practice in biometric and genetic data privacy litigation, with deep experience prosecuting data privacy class actions. Our attorneys understand both the technical architecture of AI data collection systems and the procedural demands of privacy class action litigation — a combination that allows us to build strong cases from investigation through resolution.
We represent clients exclusively on a contingency fee basis. You pay nothing unless we recover compensation on your behalf. If your biometric or genetic data was collected by a healthcare provider, employer, AI platform, robotics company, or any other entity without your written consent, contact The Lyon Firm today for a free and confidential case evaluation.