When do commonplace Internet advertising “nudge” techniques cross the line into inappropriate manipulation?
Regulators now have a new source of Internet harm in their crosshairs: so-called “dark patterns”. Generally speaking, the expression “dark patterns” refers to manipulative or deceptive user interface design choices engineered to nudge individuals towards decisions they might not otherwise make, including unintended and potentially harmful ones. Given the increasing complexity and pervasiveness of dark patterns, several jurisdictions have passed or are considering legislation to limit their use and mitigate their impact on consumers.
In this article, we explore how dark patterns are regulated in the U.S. and the EU and consider how they could potentially be regulated in Canada.
How dark patterns are regulated in the U.S. and the EU
In October 2021, the U.S. Federal Trade Commission (“FTC”), in an enforcement policy statement, vowed to bring actions against companies employing dark patterns that “trick and trap” consumers into subscription services. Although the statement only targets a specific use of dark patterns (i.e., in relation to subscription services), the FTC has laid the groundwork for recognizing dark patterns in general as an unfair and deceptive practice under the Federal Trade Commission Act (“FTC Act”) and other laws.
The FTC and State Attorneys General may also rely on the Restore Online Shoppers’ Confidence Act (“ROSCA”) to combat unlawful dark patterns. This law, in essence, prevents sellers of negative option subscriptions from charging any consumer unless they clearly and conspicuously disclose all material terms of the transaction and provide a simple mechanism to cancel subscriptions. The FTC has already filed charges against specific companies, relying on the FTC Act and ROSCA.
Moreover, the U.S. Consumer Financial Protection Bureau (“CFPB”) has a growing interest in suppressing the use of dark patterns and intends to focus on repeat offenders and hold executives accountable where necessary. The CFPB recently filed a lawsuit against TransUnion, two of its subsidiaries, and a former TransUnion executive for violating a 2017 law enforcement order. Among other allegations, the CFPB claims that TransUnion repeatedly used dark patterns to trick customers into recurring payments that are difficult to cancel.
Individual states are also starting to target dark patterns through privacy laws. The California Privacy Rights Act (“CPRA”), which will become fully operative in 2023, amends and expands the existing California Consumer Privacy Act by introducing new consumer rights and obligations for businesses that collect and process personal data. Notably, consent under the CPRA’s new definition cannot be obtained through dark patterns, which are defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice, as further defined by regulation.” Similarly, the Colorado Privacy Act also addresses dark patterns by excluding an “agreement obtained through dark patterns” from its definition of consent.
In March 2022, the European Data Protection Board published the draft Guidelines 3/2022 on dark patterns in social media platform interfaces. The Guidelines’ purpose is to provide practical recommendations and guidance to developers and users of social media platforms on how to recognize and avoid dark patterns that violate the EU’s General Data Protection Regulation (“GDPR”). Certain dark patterns could directly infringe the conditions for obtaining consent for personal data processing under the GDPR, which requires users to agree to share their data explicitly and to retain the option to withdraw consent as easily as it was given. Typical dark patterns that violate the GDPR’s consent provisions include consent forms and notices without a “reject all” option, preselected choices, and designs that require users to go through more effort to reject data sharing than to agree to it.
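To make these three consent patterns concrete, the sketch below is a hypothetical lint-style check, not taken from the Guidelines: the function name, configuration keys, and rules are our own illustrative assumptions, and a real GDPR compliance assessment is a legal question, not a mechanical one.

```python
# Illustrative sketch only (not legal advice): a hypothetical check that flags
# a consent-form configuration for the three dark patterns described above.
def audit_consent_form(form: dict) -> list[str]:
    issues = []
    # Pattern 1: accepting is offered but there is no "reject all" option,
    # even though refusing should be as easy as agreeing (GDPR Art. 7).
    if "reject_all" not in form.get("buttons", []):
        issues.append("no 'reject all' option alongside 'accept all'")
    # Pattern 2: preselected (opt-out) choices do not amount to an
    # unambiguous, affirmative act of consent.
    if any(opt.get("preselected") for opt in form.get("options", [])):
        issues.append("preselected (opt-out) choices")
    # Pattern 3: rejecting takes more steps (effort) than accepting.
    if form.get("steps_to_reject", 0) > form.get("steps_to_accept", 0):
        issues.append("rejecting requires more effort than accepting")
    return issues

# A hypothetical consent form exhibiting all three patterns at once.
flagged = audit_consent_form({
    "buttons": ["accept_all"],
    "options": [{"name": "analytics", "preselected": True}],
    "steps_to_accept": 1,
    "steps_to_reject": 3,
})
print(flagged)
```

The example form triggers all three flags; removing the preselection, adding a “reject all” button, and equalizing the step counts would clear them, mirroring the remediation the Guidelines contemplate.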
How existing privacy legislation in Canada can potentially regulate dark patterns
In Canada, the Personal Information Protection and Electronic Documents Act (“PIPEDA”) applies to private organizations that collect, use or disclose personal information in the course of commercial activity. So far, enforcement by the Office of the Privacy Commissioner of Canada (“OPC”) has mostly concerned insufficient user consent rather than deception through design elements found in a user interface. The Guidelines for obtaining meaningful consent published by the OPC emphasize the importance of obtaining “meaningful consent” from users when collecting, using, and disclosing their personal information. The Guidelines notably recommend involving UI/UX designers in the development of consent processes in order to ensure that they are “understandable, user-friendly and customized to the nature of the product or service”.
With regard to enforcement efforts related to design-based deception, in a 2016 joint investigation with Australia’s Privacy Commissioner, the OPC investigated Ashley Madison, a dating website for married individuals. The website’s registration page displayed trust marks suggesting a high level of security and discretion. However, the website’s Terms of Service contradicted the trust marks by warning users that security or privacy could not be guaranteed. The OPC held that Ashley Madison contravened PIPEDA Principle 4.3.5, which prohibits obtaining consent through deception: some individuals might have chosen not to share their personal information had they not been misled by the false security trust marks, which were deliberately designed to deceive.
Furthermore, in policy proposals submitted to the OPC in 2020, Prof. Ignacio Cofone of McGill University’s Faculty of Law suggested protecting meaningful consent by expressly prohibiting deceptive design mechanisms such as dark patterns. Notably, the recently unveiled Bill C-27, which would replace PIPEDA with the Consumer Privacy Protection Act, appears to adopt this recommendation. The bill proposes broad language that could be interpreted as prohibiting businesses from using certain types of dark patterns to obtain consent for the collection, use or disclosure of personal information. Section 16 of Bill C-27 expressly provides: “An organization must not obtain or attempt to obtain an individual’s consent by providing false or misleading information or using deceptive or misleading practices. Any consent obtained under those circumstances is invalid.” Should the bill be adopted, enforcement efforts can reasonably be expected to shift towards design-based deception rather than only inadequate privacy notices.

The bill would also enact the Artificial Intelligence and Data Act (“AIDA”), which prohibits the possession or use of illegally obtained personal information for the purpose of designing, developing, using or making available for use an artificial intelligence system. Under the AIDA, personal information is illegally obtained when it is acquired through the commission of an offence under federal or provincial law. For the time being, it is unclear whether personal information obtained through certain types of dark patterns could qualify as “illegally obtained” under the AIDA; this will depend on how penal provisions are enforced across federal and provincial privacy laws, including those found in Quebec’s Bill 64.
At the provincial level, section 14 of Quebec’s private-sector privacy law, the Act Respecting the Protection of Personal Information in the Private Sector, as amended by Bill 64, stresses the importance of obtaining clear, free, and informed consent from individuals for specific purposes, but does not explicitly address dark patterns. It remains to be seen whether these heightened consent requirements could be leveraged by the Commission d’accès à l’information to curb the use of dark patterns.
Finally, Canada’s Anti-Spam Legislation (“CASL”) amended the Competition Act in 2014 to prohibit false and misleading statements in electronic messages that promote a business interest or a product, which limits to some extent the use of dark patterns. Specifically, the Competition Act, as amended by CASL, prohibits the use of misleading representations in an electronic message’s sender description, subject matter or message field, or locator. In practice, these prohibitions make it illegal, amongst other things, to include misleading claims in email subject matter lines, create misleading URLs, or send electronic messages under a false name.
Recent developments in the U.S. and the EU have shown that regulators are starting to target manipulative online business practices, including dark patterns.
Although Canada currently does not have any regulation specifically addressing dark patterns, the OPC has shown a willingness to interpret meaningful consent requirements applicable under privacy law as a means of limiting their use. With Bill C-27 and the arrival of heightened consent requirements through the Bill 64 amendments, Canadian businesses should monitor regulatory trends concerning dark patterns, as they may need to review their user interfaces to comply with emerging regulatory standards.
 S. 310.2 of the Federal Trade Commission’s Telemarketing Sales Rule provides that: “Negative option feature means, in an offer or agreement to sell or provide any goods or services, a provision under which the customer’s silence or failure to take an affirmative action to reject goods or services or to cancel the agreement is interpreted by the seller as acceptance of the offer.”
 See CFPB Charges TransUnion and Senior Executive John Danaher with Violating Law Enforcement Order, available at <https://www.consumerfinance.gov/about-us/newsroom/cfpb-charges-transunion-and-senior-executive-john-danaher-with-violating-law-enforcement-order/>
 S. 14(h) and (l) of the CPRA.
 S. 6-1-1303 (5)(c) and (9) of the Colorado Privacy Act; “Dark pattern” is defined as “a user interface designed or manipulated with the substantial effect of subverting or impairing user autonomy, decision-making, or choice.” Note the similarity with the CPRA definition.
 Art. 7 of the GDPR.
 Jeremy Wiener, Deceptive Design and Ongoing Consent in Privacy Law, 2021 53-1 Ottawa Law Review 133.
 See Joint investigation of Ashley Madison by the Privacy Commissioner of Canada and the Australian Privacy Commissioner/Acting Australian Information Commissioner, available at <https://www.priv.gc.ca/en/opc-actions-and-decisions/investigations/investigations-into-businesses/2016/pipeda-2016-005/>
 PIPEDA Principle 4.3.5; “In obtaining consent, the reasonable expectations of the individual are also relevant. […] Consent shall not be obtained through deception.”
 S. 38 of the Artificial Intelligence and Data Act in Bill C-27
 S. 9 of Bill 64 amends s. 14 of the Act as follows (deletions struck through, additions in italics): “Consent under this Act to the collection, communication or use of personal information must be ~~manifest~~ *clear*, free and ~~enlightened~~ *informed* and must be given for specific purposes. It must be requested for each such purpose, in clear and simple language. […]”
 S. 52.01 and 74.011 of the Competition Act.