Introduction
Facial recognition technology (“FRT”), an innovation that has garnered both praise and concern, has become increasingly prevalent in our daily lives, from unlocking smartphones to clocking in at work. While it offers numerous benefits, such as convenience and improved security, it also raises significant privacy and data protection concerns. To address these concerns, the Cyberspace Administration of China has recently released the Provisions on the Security Management of FRT Application (Trial) (Draft for Soliciting Opinions) (“Draft”). This article delves into the nuances of the Draft that we found interesting and their potential implications.
Purpose
Article 1 states that the purpose of the Draft is “to regulate the application of facial recognition technology, protect rights and interests in personal information [(“PI”)] and other personal and property rights and interests, maintain social order and public security.” It is notable that the Draft does not mention organisations or businesses within Article 1. Moreover, FRT is not defined anywhere within the Draft.
Territorial Scope
China’s Personal Information Protection Law (“PIPL”) has an extraterritorial effect on PI processing activities outside of China if the purposes of the overseas processing activities are to provide products and services to individuals in China or to assess their behaviours.
Article 1 of the Draft states that its provisions are formulated in accordance with the PIPL and some other laws. However, it seems that the Draft limits its own territorial scope to China only. Article 2 of the Draft states that it applies to the use of facial recognition technology to process facial information and the provision of facial recognition technology products or services within China. We understand that such a limitation will not affect the operation of the PIPL.
General Obligations
Article 3 of the Draft contains general obligations, including generic legal compliance requirements and prohibitions. One notable prohibition, however, is that FRT may not infringe upon organisations’ rights and interests. This is notable because the express purpose of the Draft does not include the protection of organisations, and no details are offered to describe what an organisation’s rights and interests might include in this context.
Purpose and Necessity
Article 4 of the Draft emphasises that the use of FRT should have a specific purpose, sufficient necessity, and strict protection measures. In context, the requirement of sufficient necessity appears to mean the specific purpose cannot be achieved by any means other than FRT. However, the meaning of a specific purpose still requires some further clarification. That being said, as the purpose of the Draft is to protect individuals and maintain social order and public security, specific purpose and sufficient necessity should, arguably, be understood from these perspectives.
Article 4 also highlights the importance of prioritising, where feasible, alternative non-biological feature recognition technologies that “achieve the same purpose or meet the same business requirements…” As such, it appears that when non-biological feature recognition technologies, such as passwords, 2-factor authentication, keys, etc., can adequately achieve the specific purpose, FRT should not be prioritised.
The Draft also goes as far as to suggest that for personal identity verification, “it is encouraged to give priority to using authoritative channels such as the National Population Basic Information Database and the national network identity authentication public service.”
Data Minimisation
Article 17 of the Draft provides that FRT users may not “retain original facial images, pictures, videos, except for facial information that has undergone anonymisation.” Article 18 of the Draft then states that FRT users should “try to avoid collecting facial information unrelated to the provision of services. If unavoidable, it should be promptly deleted or anonymised.”
Consent and Separate Consent
Due to the wording of Article 29 of the PIPL, whereby separate consent is needed to process any sensitive PI, it is no surprise that Article 5 of the Draft generally requires separate consent to be collected before facial recognition information is processed.
Article 13 provides that, to process the facial information of minors under the age of 14, the separate consent or written consent of the minor’s parents or other guardians needs to be obtained.
Private and Public Places
Article 6 of the Draft provides: “Hotels, public bathrooms, changing rooms, restrooms, and other places that could infringe upon the privacy of others shall not install image capture or personal identity recognition devices.” In places where facial recognition devices may be installed, “it should be necessary to maintain public safety, comply with relevant national regulations, and provide significant warning signs” (Article 7 of the Draft). Any captured images must be kept confidential and used for public safety purposes only unless separate consent is provided for other uses.
Article 8 states the requirements for “Organisations that install image capture and personal identity recognition devices for internal management purposes…” The requirements themselves are relatively generic. However, the implicit acceptance of the legitimacy of organisations using personal identity recognition devices for internal management purposes is interesting per se. While the precise limits of such purposes are unknown, some degree of employee monitoring in the workplace seems acceptable.
Article 9 provides, among other things, that “Operating venues such as hotels, banks, stations, airports, sports venues, exhibition halls, museums, art galleries, and libraries, etc., shall not, by force, deception, fraud, or coercion, require individuals to undergo facial recognition technology verification for the purpose of conducting business or improving service quality…” It should be noted that Article 9 does not prohibit these operating venues from using FRT if the individual voluntarily chooses to use FRT to verify personal identity, the individual is fully informed of the circumstances, and the purpose of identity verification is conveyed to the individual in the verification process.
To conduct remote, non-sensory recognition of specific natural persons in public places or operating venues via FRT, the purpose and necessity of use is limited to that which is “…necessary for the maintenance of national security, public safety or for the protection of the life, health and property of natural persons in emergency situations, and initiated by an individual or interested party” (Article 10 of the Draft). The time, place and scope of such services must also implement the principle of data minimisation.
The Draft does not clearly define public places, though the operating venues listed in Article 9 (See above.) likely fall within this category, while those listed in Article 6 (such as “Hotels, public bathrooms, changing rooms, restrooms, and other places that could infringe upon the privacy of others”), likely fall outside this category.
In the context of access to managed buildings, FRT may not be used as the sole method of entering or exiting (Article 14 of the Draft). Management companies must provide alternative methods of access. In light of Article 4 of the Draft, it seems other access methods should be prioritised.
Profiling
Article 11 restricts but does not prohibit profiling as follows: “Except where required by statutory conditions or obtaining individual consent, users of facial recognition technology shall not analyse sensitive [PI] such as race, ethnicity, religious beliefs, health conditions, and social class using facial recognition technology.”
Matters of Significant Personal Interest
In matters of significant personal interest, such as social assistance and real estate disposal, FRT may not replace manual identity verification but can be used as an auxiliary means to verify personal identity (Article 12 of the Draft). The Draft does not specify what else would be considered a matter of significant personal interest.
Personal Information Protection Impact Assessments
Personal Information Protection Impact Assessments (“PIPIA”) are mandated by the PIPL in certain situations, including the use of FRT (see Articles 28 and 55 of the PIPL). The PIPL provides only very high-level requirements for conducting a PIPIA, which Article 15 of the Draft builds upon in the context of FRT by requiring a PIPIA to consider:
“(1) Compliance with the mandatory requirements of laws, administrative regulations, and national standards, and whether it conforms to ethical principles;
(2) Whether the processing of facial information has a specific purpose and sufficient necessity;
(3) Whether the processing is limited to the required accuracy, precision, and distance to achieve the purpose;
(4) Whether the protective measures taken are legal, effective, and commensurate with the level of risk;
(5) Risks of facial information leakage, tampering, loss, damage, illegal acquisition, and illegal use, and potential harm;
(6) Harm and influence on individual rights and measures to reduce adverse effects, as well as whether these measures are effective.”
Consistent with the PIPL, the Draft requires a PIPIA report to be retained for at least three years.
When the purpose or method of processing facial recognition information changes, or after a major security incident, FRT users must conduct a fresh PIPIA. As the PIPL does not explicitly list triggering conditions for conducting a fresh PIPIA, this could be considered a special requirement in the context of FRT.
Government Registration
Article 16 of the Draft requires FRT users who hold the facial information of over 10,000 individuals to make filings with cyberspace administration (CAC) departments at the municipal level or above within 30 working days. Such filings should contain:
“(1) Basic information of the user of facial recognition technology and the person responsible for [PI] protection;
(2) Explanation of the necessity of processing facial information;
(3) The purpose, method, and security protection measures for processing facial information;
(4) Rules and operating procedures for processing facial information;
(5) Impact assessment report on [PI] protection;
(6) Other materials the cyberspace administration department deems necessary to provide.”
If there are substantial changes to the filed information, or the use of facial recognition technology is terminated, FRT users must complete the relevant procedures within a given period of time.
FRT Service Providers
Article 17, Paragraph 2 of the Draft provides that FRT service providers’ relevant technology systems need to meet at least the requirements of Level Three network security protection and employ measures such as data encryption, security auditing, access control, authorisation management, and intrusion detection and defence to protect the security of facial information.
FRT Service Users
FRT service users need to conduct annual security and risk assessments of image capture equipment and personal identity recognition equipment, improve security strategies based on the assessment results, adjust the confidence threshold, and take effective measures to protect image capture equipment and personal identity recognition equipment from attacks, intrusions, interference, and damage (Article 19 of the Draft).
Reporting
Any organisation or individual may report violations of the Draft to the government. Based on our observations of how other laws and regulations function in practice, such reports will more likely than not be made by disgruntled or former employees.
Violations
The Draft does not explicitly list punishments for violations. Instead, it refers to other laws and regulations in Article 23. Most notable among those laws and regulations is the PIPL, which allows for the confiscation of unlawful gains, fines of up to CNY 50 million or 5% of the prior year’s revenue (whichever is higher), fines against the individuals responsible, business suspension and business licence revocation.
Conclusion
As technology advances, it becomes ever more crucial to establish a clear and comprehensive legal framework to safeguard individual privacy and data security. The Draft makes a significant contribution towards establishing such a framework and allows stakeholders to provide feedback and contribute toward the further refinement of the final regulations, which we hope will harness the potential of technology while respecting the rights and interests of individuals.
We note that readers have until 7 September 2023 to give regulators feedback. The full text of the Draft can be accessed at http://www.cac.gov.cn/2023-08/08/c_1693064670537413.htm.