On February 3, 2021, the conclusions of a joint investigation conducted by the Office of the Privacy Commissioner of Canada (OPC), the Commission d’accès à l’information du Québec (QC Commission), the Office of the Information and Privacy Commissioner of British Columbia (BC OIPC) and the Office of the Information and Privacy Commissioner of Alberta (AB OIPC), collectively referred to as the Offices, were released—finding that Clearview AI violated the privacy rights of Canadians.
What happened?
As can be seen in the PIPEDA Report of Findings, the Offices noted that Clearview AI, an American tech company, developed and delivered its facial recognition software and combined database solution (App) to clients around the world. The App allowed clients to upload a digital image of an individual’s face and run a search against it; the App applied its algorithm to the digital image and ran the result against Clearview AI’s database to identify and display likely matches, along with associated source information.
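For readers less familiar with the mechanics, the following is a minimal sketch of this kind of pipeline: an uploaded photo is reduced to a numeric vector (a “biometric array”) and compared against vectors computed from previously scraped images. It is illustrative only and not Clearview AI’s actual implementation; the embed_face placeholder, the in-memory database, and the distance threshold are all assumptions.

```python
import numpy as np

def embed_face(image: np.ndarray) -> np.ndarray:
    # Placeholder embedding: a real system runs a trained neural network
    # that maps a face image to a fixed-length vector. Here we simply
    # flatten and normalise the first 128 pixel values so the sketch runs
    # end to end (assumes images of at least 128 pixels).
    vec = image.astype(float).ravel()[:128]
    norm = np.linalg.norm(vec)
    return vec / norm if norm else vec

# Illustrative database: each scraped image is reduced to a vector and
# stored alongside the source link it was scraped from.
database: list[tuple[np.ndarray, str]] = []

def enrol(scraped_image: np.ndarray, source_url: str) -> None:
    database.append((embed_face(scraped_image), source_url))

def search(query_image: np.ndarray, threshold: float = 0.6) -> list[str]:
    """Return source links whose stored vectors sit close to the query face."""
    query_vec = embed_face(query_image)
    return [
        source_url
        for stored_vec, source_url in database
        if np.linalg.norm(stored_vec - query_vec) < threshold  # smaller distance = more similar
    ]
```

The key point for what follows is that each scraped image is reduced to a stored vector tied to its source link, which is what allows a single uploaded photo to be matched against billions of scraped records.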
In early 2020, public reports suggested that Clearview AI was populating its facial recognition database by scraping digital images from public websites, including Facebook, YouTube, Instagram, Twitter, and Venmo, in violation of those organizations’ terms of service and without the consent of the individuals concerned. What is more, the reports stated that these digital images were stored indefinitely in Clearview AI’s database, to be sourced and served as results for facial recognition searches. Several reports then confirmed that a number of Canadian law enforcement agencies and private organizations were using Clearview AI’s services to identify individuals.
In response, the Offices commenced their investigations to determine whether Clearview AI complied with Canada’s Personal Information Protection and Electronic Documents Act (PIPEDA), Québec’s Act Respecting the Protection of Personal Information in the Private Sector (QC Private Sector Act), Québec’s Act to Establish a Legal Framework for Information Technology (QC Information Technology), British Columbia’s Personal Information Protection Act (BC PIPA), and Alberta’s Personal Information Protection Act (AB PIPA), which are collectively referred to as the Acts.
What were the findings?
The Offices found the following:
1. Consent
The Offices concluded that Clearview AI did not attempt to obtain the consent required for its collection, use, and disclosure of personal information through the App, and instead erroneously relied upon the “publicly available” exception. The Acts required consent for the collection, use, or disclosure of personal information unless an exception applied, and the type of consent required varied with the circumstances and the type of information involved. As set out in the Guidelines for Obtaining Meaningful Consent (jointly issued by the OPC, AB OIPC, and BC OIPC), organizations had to obtain express consent when:
- the information being collected, used, or disclosed was sensitive;
- the collection, use, or disclosure was outside of the reasonable expectations of the individual; and/or
- the collection, use, or disclosure created a meaningful residual risk of significant harm.
In addition to Clearview AI’s collection of images, the creation of biometric information in the form of vectors constituted a distinct and further collection and use of personal information, as previously found in Cadillac Fairview, which I discussed here. As well, QC Information Technology specifically required the express consent of the person concerned, meaning that the consent had to be explicit and unequivocal: individuals had to perform positive actions that clearly demonstrated their agreement, which in turn required that they be informed about what the consent entailed. Consent had to be free, enlightened, given for specific purposes, and limited in time.
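For illustration only, here is a minimal sketch of what an express, opt-in consent gate might look like before any biometric vector is created; the names (ConsentRecord, consent_registry, may_create_biometric_vector) and the structure are hypothetical and are not drawn from the Acts themselves.

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class ConsentRecord:
    purpose: str          # the specific purpose the individual agreed to
    granted_at: datetime  # when the positive opt-in action was taken
    valid_for: timedelta  # consent limited in time

# Hypothetical registry of express opt-in consents, keyed by individual.
# A record is created only when the individual performs a positive action
# (e.g., ticking an unchecked box) after being informed of the purpose.
consent_registry: dict[str, ConsentRecord] = {}

def may_create_biometric_vector(person_id: str, purpose: str) -> bool:
    """Allow vector creation only with express, specific, unexpired consent."""
    record = consent_registry.get(person_id)
    if record is None:             # no positive action was ever taken
        return False
    if record.purpose != purpose:  # consent is given for specific purposes
        return False
    return datetime.now() < record.granted_at + record.valid_for
```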
Biometric information was sensitive in almost all circumstances: it was intrinsically, and in most instances permanently, linked to the individual. It was distinctive, unlikely to change over time, difficult to modify, and largely unique to the individual. While there were degrees of sensitivity, facial biometric information was particularly sensitive. Accordingly, in the absence of an applicable exception, Clearview AI should have obtained express opt-in consent before it collected the images of any individual in Canada.
Although Clearview AI argued that the information it collected was publicly available and hence there was no reasonable expectation of privacy, the Offices rejected this position: information from sources such as social media or professional profiles, collected from public websites and then used for an unrelated purpose, did not fall under the “publicly available” exceptions of PIPEDA, AB PIPA, and BC PIPA (and the relevant regulations). Moreover, “publicly available” in this legal sense was distinct from the common understanding of “publicly accessible” information. An examination of the QC Private Sector Act and QC Information Technology, together with a previous finding of the QC Commission (that even where personal information had been posted on a public website, it did not follow that the information could be used for other purposes without consent), led to the conclusion that the “publicly available” exception did not apply.
Therefore, Clearview AI contravened the Acts.
2. Appropriate purpose
The Offices found that Clearview AI’s purpose for collecting, using, or disclosing personal information was neither appropriate nor legitimate. Ultimately, the Offices found that Clearview AI’s collection of images and creation of biometric facial recognition arrays, for its stated purpose of providing a service to law enforcement personnel (and for use by others through trial accounts), represented the mass identification and surveillance of individuals by a private entity in the course of commercial activity.
That is, a reasonable person would not consider this purpose to be appropriate, reasonable, or legitimate in the circumstances, within the meaning of the Acts.
The Offices emphasized the sensitive nature of the information (facial biometric data was particularly sensitive since there was an enhanced ability to identify and surveil individuals) along with the additional contextual information provided via source links (social media and websites). There was also the mass indiscriminate collection of the personal information of minors, which would be considered particularly sensitive.
Essentially, Clearview AI did not have an appropriate purpose for:
- the mass and indiscriminate scraping of images from millions of individuals across Canada, including children, amongst over 3 billion images scraped worldwide;
- the development of biometric facial recognition arrays based on these images, and the retention of this information even after the source image or link had been removed from the Internet; or
- the subsequent use and disclosure of that information for its own commercial purposes, where those purposes were unrelated to the purposes for which the images were originally posted, were often to the detriment of the individual, and created a risk of significant harm to individuals whose images were captured by Clearview AI, the vast majority of whom had never been and would never be implicated in a crime or identified to assist in the resolution of a serious crime.
What is more, Clearview AI collected this sensitive biometric personal information without obtaining the express consent of the individuals in question, or indeed any form of knowledge or consent on their part.
Since Clearview AI did not collect the information directly from the individuals in question or have any relationship with the third parties whose websites it scraped (many of whom alleged that Clearview AI was not authorized to collect the information from their websites), Clearview AI achieved its purposes via collection that inherently contravened the Acts, and those purposes were not considered to be appropriate.
3. Biometric obligations
Briefly, the Offices noted that companies building a biometric system in Québec had to comply with the rules set out in the QC Private Sector Act and QC Information Technology, namely to obtain the express consent of the persons concerned and to disclose the creation or existence of the biometric system to the QC Commission.
Plainly put, Clearview AI failed to obtain the express consent of the persons concerned (it sought no consent at all), and it failed to disclose the existence of its biometric system, namely its database of biometric characteristics and measurements, to the QC Commission.
4. Recommendations
The Offices considered the matter to be well-founded and recommended that Clearview AI:
- cease offering the facial recognition services that have been the subject of this investigation to clients in Canada,
- cease the collection, use and disclosure of images and biometric facial arrays collected from individuals in Canada, and
- delete images and biometric facial arrays in its possession that were collected from individuals in Canada.
Although Clearview AI expressly disagreed with the Offices’ conclusions, it voluntarily withdrew from the Canadian market earlier in the investigation. At the time the report was written, Clearview AI had not committed to following the above recommendations or orders. If this position continues, the Offices plan to pursue other available actions to bring Clearview AI into compliance with the Acts.
What can organizations take from this?
As the above discussion shows, it is critical that organizations understand consent and the exceptions to consent. Organizations would be well advised to review the Guidelines for Obtaining Meaningful Consent, together with their own policies and procedures. In addition, organizations should appreciate the sensitive nature of biometric information, and the particularly sensitive nature of facial biometric information, when assessing consent and the purposes for which personal information is collected, used, and disclosed.