Policy Roast: JCPenney's AI Makeup Advisor Just Became a $50M BIPA Liability
Virtual try-on tech meets Illinois biometric law. JCPenney faces class action over facial scanning without consent. Again.
JCPenney is facing a class action lawsuit in Illinois for allegedly using facial recognition technology in its virtual makeup try-on tool without obtaining the required consent under the state's Biometric Information Privacy Act (BIPA). The lawsuit claims the retailer scanned customers' faces to provide personalized cosmetics recommendations while illegally capturing and storing biometric data.
This is the latest in a long line of BIPA cases that have cost tech companies hundreds of millions in settlements. The difference? This time it's retail AI, not social media or workplace surveillance.
What BIPA Requires (and What JCPenney Allegedly Didn't Do)
Illinois' Biometric Information Privacy Act, passed in 2008, is the strictest biometric privacy law in the United States. It regulates the collection, storage, and distribution of biometric identifiers - fingerprints, voiceprints, iris scans, and facial geometry.
BIPA's requirements are simple:
- Inform the subject that their biometric data is being collected
- Explain the purpose of the collection and how long it will be stored
- Obtain written consent before collecting any biometric information
- Provide a publicly available retention schedule and destruction timeline
The law also creates a private right of action, meaning individuals can sue directly without waiting for the state attorney general to act. Violations carry statutory damages of $1,000 per negligent violation or $5,000 per intentional or reckless violation.
Do the math on a class action with thousands of customers, and you understand why BIPA settlements routinely hit eight figures.
How Virtual Try-On Becomes Biometric Collection
JCPenney's virtual makeup advisor works like most AI-powered try-on tools: scan your face, map facial geometry, overlay cosmetics in real time. The technology analyzes facial features to recommend products and show how they'll look on your specific face shape and skin tone.
From a user experience perspective, it's convenient. From a BIPA perspective, it's biometric data collection.
The lawsuit alleges JCPenney:
- Scanned customers' faces without disclosure
- Stored facial geometry data without consent
- Failed to provide a retention and destruction schedule
- Never obtained the required written release
If these allegations hold up, every scan is a potential violation. Every customer who used the virtual try-on tool in Illinois becomes a potential class member. And every violation carries statutory damages.
The BIPA Litigation Boom
Illinois has become the epicenter of biometric privacy lawsuits. Since BIPA's passage in 2008, plaintiffs' attorneys have filed hundreds of class actions against companies ranging from Facebook to Snapchat, Clearview AI to Microsoft Teams.
Notable settlements include:
- Facebook (Meta): $650 million for facial tagging without consent
- Clearview AI: $51.75 million plus operational restrictions
- TikTok: $92 million for collecting biometric data from users
- Google: Multiple ongoing cases over facial recognition in photos
The Illinois Supreme Court ruled in 2023 that BIPA claims have a five-year statute of limitations - longer than most privacy statutes - giving plaintiffs a wide window to file. The court also held that each scan or collection can constitute a separate violation, multiplying potential damages exponentially.
This creates massive liability exposure for any company using facial recognition, voiceprint analysis, or fingerprint scanning in Illinois without perfect BIPA compliance.
Why Retail AI Is the New Frontier
Most BIPA cases have targeted social media platforms (Facebook's facial tagging), workplace biometrics (fingerprint timeclocks), or surveillance tech (Clearview's facial recognition database). JCPenney represents a newer category: retail AI used for customer experience.
Virtual try-on tools are everywhere. Sephora, Ulta, L'Oreal, and dozens of beauty brands offer apps that scan your face to preview makeup, hair color, or skincare products. Eyewear retailers use facial mapping to recommend frames. Fashion brands use body scanning for fit recommendations.
All of this involves collecting biometric data. And if you're offering these tools to Illinois residents without BIPA-compliant consent, you're building class action exposure every time someone opens the app.
The retail angle matters because it's not just tech companies anymore. Traditional retailers - many of which don't have in-house privacy counsel familiar with biometric regulations - are deploying AI tools from third-party vendors without fully understanding the compliance obligations.
The Vendor Liability Question
Here's the wrinkle: JCPenney likely didn't build this facial recognition tool in-house. Most retailers license the technology from third-party vendors who provide white-labeled AI solutions.
BIPA doesn't care. The statute applies to whoever "collects, captures, purchases, receives through trade, or otherwise obtains" biometric information. If JCPenney is the entity presenting the tool to customers and collecting the data (even if it's processed by a vendor), they're potentially liable.
This creates a vendor risk problem:
- Retailer deploys AI tool without understanding BIPA requirements
- Vendor may or may not have built BIPA-compliant consent flows
- Retailer assumes vendor handles compliance
- Class action names the retailer (who has deeper pockets)
- Retailer discovers too late that their vendor agreement doesn't indemnify BIPA violations
Smart retailers are now asking vendors: "Is this tool BIPA-compliant? Do you provide compliant consent mechanisms? Will you indemnify us if we get sued?"
Smart vendors are building BIPA compliance directly into their products - or excluding Illinois from deployment altogether.
What Compliance Actually Looks Like
If you're deploying facial recognition, voiceprint analysis, or any biometric tech that touches Illinois residents, BIPA compliance isn't optional.
Here's the checklist:
- Before collection: Display a clear notice explaining that biometric data will be collected, what it will be used for, and how long it will be stored
- Obtain written consent: Users must affirmatively opt in. Pre-checked boxes don't count. Buried terms-of-service clauses don't count. You need explicit, informed consent.
- Publish a retention schedule: How long will you keep the data? When will you delete it? This must be publicly available.
- Actually delete the data: When the stated retention period expires or the original purpose is satisfied, delete the biometric data. Don't just archive it.
- Document everything: Keep records of consent, retention policies, and deletion practices. You'll need them if you get sued.
This isn't complicated, but it requires intentional design. Your AI tool can't just scan faces and move on. It has to pause, inform, request consent, and respect the user's choice.
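The pause-inform-consent flow described above can be sketched as a gate in front of the scanning step. This is an illustrative outline only, not legal advice; the class names and fields are hypothetical:

```python
# Illustrative sketch of gating a face scan behind informed, written consent.
# Names and fields are hypothetical; this is an outline, not legal advice.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentRecord:
    user_id: str
    purpose: str           # disclosed purpose of collection
    retention_days: int    # disclosed retention period
    granted_at: datetime   # timestamp of the affirmative opt-in

class ConsentRequired(Exception):
    """Raised when a scan is attempted without a written release on file."""

def scan_face(user_id: str, consents: dict[str, ConsentRecord]) -> str:
    """Refuse to scan unless an explicit consent record exists, and carry
    the disclosed deletion deadline forward with the collected data."""
    record = consents.get(user_id)
    if record is None:
        # No pre-checked boxes, no buried ToS clause: no scan.
        raise ConsentRequired("Inform the user and obtain a written release first.")
    delete_by = record.granted_at + timedelta(days=record.retention_days)
    return f"scanned {user_id}; biometric data must be deleted by {delete_by:%Y-%m-%d}"
```

The design point is that the scan function itself refuses to run without a consent record, so compliance is enforced at the point of collection rather than bolted onto the UI afterward.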
Many vendors don't build this by default. Many retailers don't ask for it. And then the lawsuits arrive.
The Business Calculation
From JCPenney's perspective, the virtual makeup tool was probably a business decision: enhance customer experience, drive online cosmetics sales, compete with Sephora and Ulta's digital tools.
From a BIPA perspective, it was a liability decision: expose the company to statutory damages of $1,000–$5,000 per customer who used the tool in Illinois, multiplied by however many scans occurred over the five-year limitations period.
If 10,000 Illinois customers used the tool once each, that's $10 million to $50 million in potential statutory damages. If they used it multiple times (each scan being a separate violation), multiply accordingly.
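The arithmetic above can be made explicit. A back-of-the-envelope sketch, where the class size and scan counts are hypothetical illustrations rather than figures from the actual complaint:

```python
# Back-of-the-envelope BIPA exposure estimate.
# Class size and scan counts are hypothetical, not figures from the lawsuit.

NEGLIGENT_DAMAGES = 1_000   # per negligent violation
RECKLESS_DAMAGES = 5_000    # per intentional or reckless violation

def statutory_exposure(class_members: int, scans_per_member: int = 1) -> tuple[int, int]:
    """Return (low, high) statutory-damages exposure, treating each
    scan as a separate violation per Cothron v. White Castle."""
    violations = class_members * scans_per_member
    return violations * NEGLIGENT_DAMAGES, violations * RECKLESS_DAMAGES

low, high = statutory_exposure(10_000)       # one scan each
print(f"${low:,} to ${high:,}")              # $10,000,000 to $50,000,000

low3, high3 = statutory_exposure(10_000, 3)  # three scans each
print(f"${low3:,} to ${high3:,}")            # $30,000,000 to $150,000,000
```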
And that's before legal fees, settlement administration costs, and the reputational hit of being sued for privacy violations.
The business case for BIPA compliance is simple: the cost of building compliant consent flows is a fraction of one settlement.
What Happens Next
The lawsuit is in early stages. JCPenney will likely move to dismiss, arguing that the virtual tool doesn't meet BIPA's definition of "biometric identifier" or that users consented via terms of service.
Courts have rejected both arguments before. Illinois courts have held that facial geometry analysis qualifies as biometric data, and that buried consent clauses don't satisfy BIPA's written consent requirement.
If the case survives dismissal, it will follow the standard BIPA playbook: discovery on how many people used the tool, how the data was stored, whether any consent was obtained. Then settlement negotiations, because no company wants to risk a jury trial with statutory damages in the tens of millions.
The settlement will probably include:
- Cash payouts to class members
- Destruction of collected biometric data
- Injunctive relief requiring BIPA-compliant practices going forward
- Attorney fees (often one-third of the total settlement value)
And other retailers will watch carefully, because they're running the same tools with the same vendors and the same lack of BIPA-compliant consent flows.
The Takeaway
If you're deploying AI that touches faces, voices, or fingerprints, and you operate in Illinois, you need BIPA compliance before launch. Not after. Not when the lawsuit arrives.
Audit your tools. Ask your vendors for BIPA compliance documentation. Build consent flows that actually inform users and obtain written permission. Publish retention schedules. Delete data when you said you would.
Or stop offering the tool to Illinois residents. Some companies have made that calculation: the juice isn't worth the squeeze when the squeeze costs $50 million in settlements.
JCPenney bet that virtual makeup try-on was worth the risk. Now they'll find out what that risk actually costs.