The California Consumer Privacy Act ("CCPA") was enacted in early 2018 as a political compromise to stave off a poorly drafted and plaintiff-friendly ballot initiative. Although the CCPA is scheduled to go into force in early 2020, there is a great deal of confusion regarding the requirements of the CCPA, including the degree to which it aligns with other privacy regulations, such as the European General Data Protection Regulation ("GDPR").
To help address that confusion, BCLP published the California Consumer Privacy Act Practical Guide, and is publishing a multi-part series that discusses the questions most frequently asked by clients concerning the CCPA.
One of the differences between the GDPR and the Privacy Directive that preceded it can be found in how the two frameworks define "special categories" of information. Specifically, under the Privacy Directive, the term referred to
. . . personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, trade-union membership, and the processing of data concerning health or sex life.1
The GDPR expanded the definition to include “the processing of genetic data, [and] biometric data for the purposes of uniquely identifying a natural person.”2 “Biometric data” in turn was defined to include data “resulting from specific technical processing relating to the physical, [or] physiological . . . characteristics of a natural person, which allow or confirm the unique identification of that natural person, such as facial images . . . .”3 This is consistent with the position taken by the Article 29 Working Party, a pre-GDPR advisory body formed to provide independent counsel to the European Commission on data protection matters, that the use of facial recognition technology “as a means for identification and verification” falls within the definition of “biometric data.”4
As with other "special category" data, the use of facial recognition technology is "prohibited" by default unless one of ten exceptions enumerated within the GDPR applies.5 For most consumer-oriented apps, the only exception likely to apply is the data subject's "explicit consent" to the use of the technology.6 The net result is that in order to use facial recognition technology as an authentication mechanism for an app, the opt-in consent of the user must be obtained. It should be noted that the Article 29 Working Party has suggested that "[i]n order for the consent to be valid, an alternative, and equally secure, access control system must be in place (such as a strong password)" and that the "alternative privacy friendly option should be the default."7
In addition to obtaining opt-in consent, the owner of the application that uses facial recognition technology will likely be considered the controller of the reference image (i.e., the image against which the user will be indirectly compared) and the reference template (i.e., the algorithm, measurements, or values against which future images will be directly compared).8 As a result, the normal obligations of a controller will likely apply, such as the obligation to (1) distribute a privacy notice to the user, (2) minimize the length of time that the data will be maintained, (3) respond to access, modification, and deletion requests, (4) protect the reference image and the reference template (e.g., encrypt them),9 and (5) require processors to adhere to the contractual obligations found within Article 28. One oddity is that, in the case of facial recognition technology, the Article 29 Working Party took the position that if a data subject made an access request, a controller might have to provide access to "both the original images, and the template generated in the context of facial recognition."10 As the template is not likely to be in a format that could be understood by a data subject, it is not clear how it could, or would, be produced or what benefit it would confer on a recipient. In addition to the "normal" obligations of a controller, because biometric data is considered a "special" category of data, a company may also need to conduct a Data Protection Impact Assessment and/or appoint a Data Protection Officer.
In comparison to Europe, the United States does not have a federal statute that governs the use of facial recognition technology. While the Federal Trade Commission has provided a list of recommended practices, those recommendations do not exceed the requirements imposed by the GDPR. On the state level, Illinois has the most restrictive statute governing the use of facial recognition technology. The Illinois Biometric Information Privacy Act (“BIPA”) was enacted in 2008, and governs the collection and use of “biometric identifiers,” a term which excludes a photograph by itself, but includes a “scan of . . . face geometry.”11 BIPA imposes notice, consent, data retention, and data security requirements that are more restrictive, in some respects, than those imposed under the GDPR. Specifically:
Damages. BIPA permits individuals whose biometric identifiers were captured or used in violation of the statute to bring suit and recover up to $5,000 in liquidated damages.17 In addition, a consumer may seek attorneys' fees, costs, and injunctive relief.18
1. Privacy Directive, Article 8(1).
2. GDPR, Article 9(1).
3. GDPR, Article 4(14) (emphasis added).
4. WP 193 at 21-22.
5. GDPR, Article 9(1), (2)(a)-(j).
6. GDPR, Article 9(2)(a).
7. WP 192 at 6.
8. WP 192 at 5 (stating that "data controllers will typically be website owners and/or online service providers as well as mobile application operators who engage in facial recognition in that they determine the purpose and/or means of the processing.").
9. WP 192 at 8.
10. WP 192 at 9.
11. 740 ILCS 14/10.
12. 740 ILCS 14/15(b)(1)-(2).
13. 740 ILCS 14/15(b)(3).
14. 740 ILCS 14/15(a).
16. 740 ILCS 14/15(e)(2).
17. 740 ILCS 14/20(1)-(4).