The Federal Court of Appeal just released an important opinion in Google LLC v. Canada (Privacy Commissioner) 2023 FCA 200, holding that Google cannot rely on the journalistic exception in PIPEDA to operate legally in Canada. The decision is a frustrating one because it fails to resolve whether PIPEDA’s restrictions on Google and other search engines are consistent with the Charter and, if not, how search engines and artificial intelligence systems can operate legally in Canada. These important questions are unlikely to be resolved soon under the OPC’s ill-conceived reference process. Further, while the CPPA would introduce a new legitimate interests exception that could apply to search engines and AI systems, the proposed exception in the CPPA is narrower than the exception in the GDPR, which will still leave the resolution of these questions in doubt.
I address these questions below. I start with an overview of the reference and then summarize the reference decision and the decision of the Court of Appeal. I then comment on the ill-conceived reference made by the OPC. Finally, I compare the legitimate interests exception in the GDPR with what is proposed in the CPPA before making recommendations to clarify the legal basis for search engines and AI systems to operate legally in Canada.
Overview of reference
The questions before the Court arose from a reference submitted to the Federal Court by the Office of the Privacy Commissioner (OPC). The central threshold questions posed by the reference were:
(1) Does Google, in the operation of its search engine service, collect, use or disclose personal information in the course of commercial activities within the meaning of paragraph 4(1)(a) of PIPEDA when it indexes web pages and presents search results in response to searches of an individual’s name?
(2) Is the operation of Google’s search engine service excluded from the application of Part 1 of PIPEDA by virtue of paragraph 4(2)(c) of PIPEDA because it involves the collection, use, or disclosure of personal information for journalistic, artistic or literary purposes and for no other purpose?
Regrettably, the Commissioner chose not to refer a third issue that Google had raised—whether any interpretation of PIPEDA that would prohibit Google from providing search results leading to journalistic content would contravene the freedom of expression guaranteed by section 2(b) of the Charter. (Unfortunately, Google did not take the initiative to launch its own application for declaratory relief to resolve the Charter issue.)
I summarize below the decision of the reference judge and then focus on the decision in the appeal.
Summary of reference decision
On the main issue in the case, the reference judge concluded that Google’s search business was subject to PIPEDA. She found that Google’s search activities constituted a commercial activity and were thus prima facie within PIPEDA.
The reference judge also rejected Google’s argument that it fell within the PIPEDA journalistic exception by merely facilitating access to information, such as news media. She held that facilitating access does not give Google’s search engine control over content; it expresses no opinion and involves no content creation. In so holding, she relied on an “ordinary understanding of the word journalism” as something that “encompasses content creation and content control” and on the definition of journalism referred to in AT v. Globe24h.com, under which “an activity should qualify as journalism only where its purpose is to (1) inform the community on issues the community values, (2) it involves an element of original production, and (3) it involves a “self-conscious discipline calculated to provide an accurate and fair description of facts, opinion and debate at play within a situation”. She also characterized Google’s primary purpose as being to index and present search results and held this was not primarily a journalistic purpose. She thus concluded that “Google’s purposes for collecting, using and disclosing personal information […] are not journalistic, and they are certainly not exclusively so.”
Summary of the appeal decision
Google made two main submissions in the appeal which were summarized by Laskin JA, who wrote the 2:1 majority reasons for the Court.
First, Google submits, the reference judge should have struck out the notice of application for a reference, or declined to answer the question. It says that the question could not properly be answered, and should not have been answered, in a “Charter vacuum”—that is, without considering whether subjecting Google’s search engine to Part 1 of PIPEDA would infringe the freedom of expression guaranteed by the Canadian Charter of Rights and Freedoms—and without considering whether the Commissioner has jurisdiction to determine Charter issues.
Second, and in the alternative, Google submits that the reference judge too narrowly interpreted the exemption in paragraph 4(2)(c) of PIPEDA. That exemption applies when an organization collects, uses or discloses personal information for journalistic, artistic or literary purposes and does not collect, use or disclose it for any other purpose. The reference judge erred in applying that exception, Google contends, in holding that only a publisher of news can act with a journalistic purpose, when the principles of statutory interpretation support a broad meaning of the term that includes dissemination of news. Therefore, if the reference question is to be answered, Google submits that this Court, giving the judgment the reference judge should have given, should set aside her answer and answer the question “yes”.
The Court first addressed and dismissed Google’s argument that the reference judge should have declined to answer the reference questions because the Charter issue was not before the Court.
On the main issue of whether the journalistic exception applied, Google made three main arguments. First, Google submitted, the principles of statutory interpretation support a broad definition of “journalistic purpose,” one that includes disseminating the news. Second, it submitted, the reference judge erred by accepting and applying the definition of “journalism” adopted in Globe24h. Third, it submitted, she erred by failing to consider that paragraph 4(2)(c) applies to specific information, rather than an organization as a whole.
Google argued that “journalistic purpose” includes the distribution of journalistic content and includes “the entire journalistic process,” from gathering information through to publication and dissemination. Google had put before the Court definitions of journalism that included the distribution of journalistic works.[i]
Google also submitted that given the ambiguity in interpreting the exception, it should be construed broadly to give effect to Charter protected freedom of expression and freedom of the press. In this regard it argued that Parliament intended that paragraph 4(2)(c) be “a broad exemption that excludes Charter-protected speech from PIPEDA”.
The Court rejected these submissions, holding that there was no genuine ambiguity in the meaning of the exception. This conclusion was largely based on the Court’s holding that, whatever the term “journalism” meant (the Court declined to rule on the meaning of the term), the exception required that the “journalistic purpose” be the only purpose and that “Google’s focus is on the fact of dissemination of news, not its purpose.”
As noted above, Google also argued that the reference judge failed to consider the specific nature of the personal information before her. It argued that PIPEDA and its exemptions apply on a case-by-case basis depending on the specific personal information at issue and how it is used. The Court agreed with this submission, but ruled that the reference judge did not ignore this distinction.
On this latter point, Webb JA wrote strong dissenting reasons, concluding that on the facts before the reference judge, Google’s search engine only collected and disclosed journalistic articles for journalistic purposes and was thus within the exception. The dissent likened Google’s activities to those of newspaper retailers who distribute news and radio and TV broadcasters who find and use an article as part of the news.[ii]
Comments on the reference and decision
The search engine reference and the decision of the Court bring us no closer to fulfilling the fundamental purposes of the proceeding, namely, to determine whether search engines are subject to PIPEDA and, if they are, how they can operate legally in Canada and whether they are subject to individuals’ right of erasure (aka the right to be forgotten).
The OPC is largely to blame for the predicament we find ourselves in. The OPC intentionally structured the reference to avoid answering the fundamental question as to whether PIPEDA, by not having an exception that would permit search engines (and now AI systems) to operate legally and in a responsible manner, violates the Charter. This was something that could have been included in the reference but was not. In response to a question I asked former Commissioner Therrien at an open meeting several years ago as to why the Charter issue was not included, the Commissioner responded that it was, among other things, for “financial reasons”. However, one cannot imagine a more expensive, time-consuming, and frustrating process than the one selected by the Commissioner, as the current reference will need to be followed by one or more further references to resolve the important questions, unless the issue is resolved under the CPPA.
What is particularly maddening is that the Commissioner’s decision has also resulted in a lack of guidance on whether training and deploying AI systems that scrape personal information from the web or other sources is legal under PIPEDA, something that is currently being investigated by the OPC.
Need for amendments in the CPPA
It is clear that Canada’s privacy law needs to be clarified and updated to provide a clear legal basis for search engines and AI systems to be legally and responsibly operated in Canada. Canada’s draft new privacy law in Bill C-27 is a helpful first step, but it fails to provide the same legal basis for search engines and AI systems as exists under the GDPR.
Search engines and AI systems under the GDPR
Under the General Data Protection Regulation (GDPR) search engines can operate legally in the EU, provided they comply with the regulation’s provisions. According to Article 6(1)(f) of the GDPR, processing of personal data is lawful if it is necessary for the purposes of the legitimate interests pursued by the controller or by a third party, except where such interests are overridden by the interests or fundamental rights and freedoms of the data subject which require protection of personal data.
Search engines can fall under the “legitimate interests” exception in the GDPR due to the nature of their operation. The primary function of a search engine is to index and provide access to publicly available information. This action can be seen as a legitimate interest because it serves a broad societal interest in the free flow of information and can also be necessary for the search engine provider’s economic interest. This means that search engines can process personal data based on legitimate interests, provided they ensure that these interests do not override the rights and freedoms of the data subjects (users) in their personal data. The latter condition involves a balancing test: do the individual’s interests override the legitimate interest?
The Court of Justice of the EU (CJEU) has provided some guidance on this issue in the Google Spain (C-131/12) and GC and Others (C-136/17) cases. It held that a search engine, as a data controller, has a responsibility to consider requests from individuals to de-index certain search results (the so-called “right to be forgotten”). The Court found that the individual’s rights to privacy and data protection can override the search engine’s economic interest and the public’s interest in having access to that information.
The GDPR also provides a legal basis for AI systems to operate in the EU, though some questions on how the GDPR applies remain open. See The impact of the General Data Protection Regulation (GDPR) on artificial intelligence (europa.eu).
Search engines and AI systems under the CPPA
Section 18(3) of the CPPA would introduce a new legitimate interests exception into the new privacy law. The proposed provision reads as follows:
18(3) An organization may collect or use an individual’s personal information without their knowledge or consent if the collection or use is made for the purpose of an activity in which the organization has a legitimate interest that outweighs any potential adverse effect on the individual resulting from that collection or use and
(a) a reasonable person would expect the collection or use for such an activity; and
(b) the personal information is not collected or used for the purpose of influencing the individual’s behaviour or decisions.
(4) Prior to collecting or using personal information under subsection (3), the organization must
(a) identify any potential adverse effect on the individual that is likely to result from the collection or use;
(b) identify and take reasonable measures to reduce the likelihood that the effects will occur or to mitigate or eliminate them; and
(c) comply with any prescribed requirements.
The CPPA’s proposed exception is significantly narrower than the legitimate interests exception in the GDPR. While both have balancing tests, under the GDPR the societal benefits and the benefits to the controller are balanced against the interests or fundamental rights and freedoms of the data subject which require protection of personal data, whereas under the CPPA the balance is against “any potential adverse effect on the individual”. The CPPA exception also applies only to the collection and use of personal information, which would leave disclosures, such as disclosures in search results, still offside the CPPA. It also does not include processing by third parties, something that is required for AI systems. The GDPR legitimate interests exception is also not subject to the additional conditions precedent found in the CPPA. Nor can the GDPR exception be overridden by anything like the new overreaching and confusing “appropriate purposes” limitation in the CPPA.
Bill C-27’s AIDA, if enacted, will regulate certain high-impact systems via regulatory processes. The government has outlined amendments that will be made to C-27 in a letter to the INDU Committee following the Minister’s appearance at the Standing Committee on Industry and Technology on September 26, 2023. However, none of the proposed amendments addresses the many problems with the CPPA, including those which will leave Canadian search engine and AI system operators with a narrower legal basis to operate those platforms in Canada, despite the government’s repeated commitments that Bill C-27 would provide laws interoperable with those of our trading partners.
Based on the foregoing, I recommend the following:
Recommend: The Government refer the Charter issue as to whether PIPEDA violates the Charter to the Supreme Court of Canada, which could hear this reference at the same time as it may hear any appeal from the recent FCA decision.
Recommend: The government amend the legitimate interests exception to align it more closely with the GDPR exception, expressly including disclosures within the exception and extending the exception to processing by third parties.
By Barry B. Sookman
This article was first posted on www.barrysookman.com.
[i] These were summarized by the Court:
It refers in particular to the definitions in Encyclopedia Britannica (“Journalism”, Encyclopedia Britannica, Inc., online: published January 27, 2021) and the Canada Evidence Act, R.S.C. 1985, c. C-5, s. 39.1. The former defines “journalism” as “the collection, preparation, and distribution of news and related commentary and feature materials […]”. The latter defines “journalist” as “a person whose main occupation is to contribute directly, either regularly or occasionally, for consideration, to the collection, writing or production of information for dissemination by the media […].” Google also relies on the recognition by the Supreme Court of Canada that freedom of the press encompasses “the right to transmit news and other information”: Canadian Broadcasting Corp. v. New Brunswick (Attorney General), 1996 CanLII 184 (SCC), [1996] 3 S.C.R. 480 at para. 24.
[ii] Webb JA expressed these reasons as follows:
In response to a search request, Google provides the excerpt from the newspaper articles and provides the link to the website where the full articles can be found. Its sole purpose in disclosing this information is to provide access to the newspaper articles that could have been found by the same individual if, instead of using Google’s search engine, that person would have searched the websites maintained by the publishers of the articles. Google simply facilitated access to the newspaper articles.
Google’s purpose was therefore to provide a link between an individual who is searching for information and the source where the information can be found on the internet. Google is helping the searcher find relevant newspaper articles and is helping the publishers distribute these articles.
The role of Google can be compared to the role played by a retailer that sold the newspapers in issue when they were first published. The retailer collected the personal information when it received the newspapers from the publishers. The retailer disclosed the information when it sold the newspapers. The retailer would not otherwise perform any other journalistic function. In my view, it would not be intended that PIPEDA would apply to a retailer who sells newspapers.
Another example will illustrate why, in my view, PIPEDA does not apply to the Google search engine when it collects, uses and discloses journalistic articles published by a newspaper. Assume an individual, in preparing for a newscast or other television or radio program, finds the same articles in issue in this appeal by reading a paper copy of the newspapers. If the personal information in the article is reported as part of the program (with an acknowledgement of the source of the information), in my view PIPEDA would not apply to the collection, use or disclosure of that information as part of the television or radio program. The individual citing the article is simply distributing the news that has already been reported by the newspaper.
Similarly, if instead of reading a paper copy of the newspaper, the individual finds the same articles by searching the newspaper’s website and then discloses the personal information (acknowledging the source), in my view PIPEDA would not apply to the collection, use or disclosure of that information as part of the television or radio program. In my view, the result would also be the same if the individual uses the Google search engine to find the newspaper articles.
In each scenario, the operator of the television or radio station is simply finding newspaper articles and reporting on what was published by the newspaper. The only functions performed by the operator of the television or radio station are the collection of the newspaper articles (by reading the newspapers or searching for the articles) and the use and disclosure of that information as part of the television or radio program. No other journalistic function is performed as the operator is relying on the journalists who wrote the articles to have verified the accuracy of the information.
The activities of the operator of the television and radio station are essentially the same as the activities carried out by Google. Both the operator and Google search for information and disclose that information to the public. The operator of the television or radio station searches newspapers and other sources for articles that would be of interest to its viewing or listening audience. A decision is made concerning when the article will be disclosed during the television or radio program. Google searches the internet and organizes the information that its search engine finds. It displays the search results based on its determination of the relevance of the information to the searcher using algorithms maintained by Google. In the case of the operator of the television or radio station, the information is disclosed to anyone who is watching or listening to the program. In Google’s case, the information is disclosed to anyone who searches for the information.
If PIPEDA applies to Google when its search engine finds and discloses a newspaper article, would PIPEDA apply to a television or radio station operator who searches for newspaper articles that would be of interest to its viewing or listening audience and then discloses the article? In my view, PIPEDA should not apply to either Google when its search engine finds and discloses a newspaper article (which is a journalistic article) or the operator of a television or radio station when the operator finds and discloses a newspaper article (which is a journalistic article) to its viewing or listening audience.
- Legality of search engines and AI systems under PIPEDA and CPPA: Google v Privacy Commissioner - October 23, 2023