2023 Year in Review: Privacy

February 16, 2024
By Parnian Soltanipanah and Melanie Szweras

2023 was an active year for advances in Canadian privacy law. Here are some notable developments in legislation and case law to keep you abreast of this ever-evolving field.

Legislative Updates

Law 25

The majority of amendments to Quebec’s Act Respecting the Protection of Personal Information in the Private Sector (QC ARPPIPS) came into effect in September 2023. The amendments proposed by Law 25 (formerly Bill 64) bring the QC ARPPIPS closer in line with the European Union’s General Data Protection Regulation (GDPR).

Law 25 applies to all businesses that process the personal information (PI) of Quebec residents. Key obligations that came into force include requirements for businesses to develop privacy governance frameworks and conduct privacy impact assessments (PIAs), along with additional consent requirements and more severe penalties for non-compliance.

Businesses must establish and implement a governance framework that explains, in clear terms on their website, their PI retention and destruction practices and their process for dealing with complaints, and that defines the roles and responsibilities of personnel throughout the life cycle of the PI.

Of note is the requirement to conduct a privacy impact assessment (PIA) (a) when acquiring, developing, or redesigning an information or electronic delivery system involving PI, (b) when transferring PI outside of Quebec, and (c) when transferring PI to a third party for research purposes without consent. Earlier this year, the Commission d’accès à l’information (CAI) released a guidance document on conducting PIAs along with a PIA template, available in French here.

Moreover, Law 25 articulated a new consent framework. As described in greater detail in the consent guidelines released by the CAI earlier this year, the new framework requires that consent be:

  • clear: the consent must be obvious, clearly demonstrating the individual’s intention;
  • free: the consent must be provided of the individual’s free will without pressure or coercion, with the option of withdrawing consent at any time;
  • informed: the individual must know and understand what they are consenting to;
  • specific: consent must be given for a specific purpose;
  • granular: consent must be requested for each specified purpose;
  • understandable: the request for consent must be in clear and simple language, such that it is comprehensible;
  • temporary: consent must be valid only for the period of time necessary to achieve the purposes for which it was requested; and
  • distinct: the request for consent must be separate from other information if it is made in writing.

Other notable requirements include the right to be forgotten, the right to be informed of automated decision-making, privacy by default, and consent requirements for minors.

It is important for businesses to be aware of these changes, as lack of compliance with the updated QC ARPPIPS could lead to hefty penalties. Given the enhanced enforcement regime, businesses can be held liable for up to $10 million or 2% of their worldwide turnover in administrative monetary penalties (AMPs), and up to $25 million or 4% of their worldwide turnover for penal offences. Additionally, Law 25 introduces a private right to claim punitive damages where an unlawful infringement of a right afforded under the QC ARPPIPS causes injury and is intentional or results from gross fault.

Proposed Amendments to the CPPA and AIDA (to replace parts of PIPEDA)

Bill C-27, which proposes to enact a number of statutes including the Consumer Privacy Protection Act (CPPA) and the Artificial Intelligence and Data Act (AIDA), has passed second reading in the House of Commons and is currently under consideration by the Standing Committee on Industry and Technology (INDU).

In October 2023, the Minister of Innovation, Science and Industry (the Minister) proposed several amendments to the CPPA and the AIDA for consideration by INDU, accessible here. The government’s recommendations take into consideration feedback from stakeholders, including the Privacy Commissioner of Canada (Commissioner).


The Minister suggested amending the preamble and the purpose clause to qualify the right to privacy as a fundamental right. Moreover, while the CPPA already includes stronger protections for minors, the Minister suggested further amending the bill to ensure organizations consider the special interests of minors when determining whether PI is being collected, used or disclosed for an appropriate purpose (per section 12 of the CPPA). Lastly, the Minister put forward an amendment to the CPPA to permit compliance agreements to contain financial consideration, in order to address concerns that the Privacy Commissioner cannot impose financial penalties on non-compliant organizations. This would be in addition to the Commissioner’s power to impose interim orders, which are not subject to an automatic right of appeal, and to the ability of the proposed Personal Information and Data Protection Tribunal to impose AMPs of up to the greater of $10 million or 3% of an organization’s gross global annual revenue.

Unfortunately, the Minister’s recommendations did not call for further clarity on some of the more ambiguous new standards proposed by the CPPA, such as the appropriate purposes limitation. Nor did they call for clarification of provisions that seem to impose impractical standards, such as the definition of “anonymized” data, which is even more stringent than that imposed by the GDPR.

While protecting privacy is of utmost importance, and qualifying the right to privacy as a fundamental right would be a further step towards that goal, the purpose of the CPPA is to balance the privacy of individuals with an organization’s need for collecting, using and disclosing PI. Overly stringent legislation combined with the threat of hefty penalties may work to skew that balance.


The Minister suggested clarifying the meaning of “high-impact system” by setting out a number of key classes. The initially proposed classes cover AI systems used for:

  • matters relating to employment;
  • matters relating to the determination of whether to provide services to an individual, the determination of the cost or type of service, or the prioritization of the service to be provided to the individual;
  • processing biometric data relating to identification or an individual’s behaviour;
  • online content moderation or the prioritization of the presentation of such content;
  • matters relating to health care or emergency services;
  • decision making by a court or administrative body; and
  • assisting a peace officer in exercising their law enforcement powers and duties.

The Minister recognized that this list of classes could change over time with evolving technologies. However, the identification of these classes still does not define the criteria that would characterize an AI system as “high impact”. This is perhaps something that should be explained in the AIDA itself, and not left to be defined in its regulations.

The Minister further suggested targeted amendments to key definitions in the AIDA in order to align it more closely with international frameworks such as those in the EU and the Organization for Economic Cooperation and Development (OECD). For example, one of the suggestions included broadening the definition of AI to “a technological system that, using a model, makes inferences in order to generate output, including predictions, recommendations or decisions”.

The Minister also proposed providing clearer obligations for developers, persons making available high-impact systems, and those managing the operations of such systems.

Similarly, the Minister recommended outlining distinct obligations for general-purpose AI systems (such as ChatGPT), which can be used for many different tasks. Stakeholders believe that while general-purpose AI systems could be regulated as high-impact systems, they are distinct enough to warrant their own recognition in the law. Among these proposed amendments are provisions to ensure that Canadians can identify AI-generated content.

Lastly, the Minister asserted support for amendments that would clarify the function and roles of the Artificial Intelligence and Data Commissioner (AIDC). Currently, the AIDA generally defines the AIDC’s role as assisting in the administration and enforcement of the AIDA.

Despite the Minister’s attempt to further clarify the AIDA, many stakeholders believe it still has a long way to go and continue to suggest removing the AIDA from Bill C-27 altogether. Removing the AIDA would allow for more thorough consultation and perhaps lead to a clearer, more robust piece of legislation.

However, there is trepidation that removing the AIDA may further dampen the push to pass Bill C-27 before Canada’s next federal election, especially in light of the European Commission’s recent adequacy decision. In January of this year, the European Commission concluded that Canada, along with 10 other countries under review, has adequate data protection safeguards via the Personal Information Protection and Electronic Documents Act (PIPEDA) to permit continued PI transfers between the jurisdictions, lessening the urgency to enact Bill C-27.

Voluntary AI Code of Conduct

In the meantime, while the AIDA as well as the proposed amendments are contemplated, Canada’s Voluntary Code of Conduct on the Responsible Development and Management of Advanced Generative AI Systems is available to help bridge the gap. Organizations are encouraged to apply the measures outlined in the voluntary code when managing their general-purpose AI systems. The voluntary code is focused on six core principles: accountability, safety, fairness and equity, transparency, human oversight and monitoring, and validity and robustness.

Notable Cases

In Privacy Commissioner of Canada v Facebook, Inc [2023 FC 533], the Office of the Privacy Commissioner of Canada (OPC) conducted an investigation and subsequently requested a Federal Court (FC) hearing to ascertain whether Facebook failed to obtain meaningful consent before sharing users’ PI with third-party applications (apps), and whether it failed to adequately safeguard its users’ PI.

The Federal Court dismissed the application, asserting that it found “itself in an evidentiary vacuum”. Consequently, the Federal Court was unable to provide additional guidance on the meaning of “meaningful consent”. Ultimately, the FC held that the OPC failed to discharge its burden in proving that Facebook had breached PIPEDA. With respect to Facebook’s safeguarding duties, the FC held that a company’s safeguarding responsibilities are with respect to the internal handling of PI, and once that PI is disclosed to a third-party app with user consent, Facebook’s safeguarding duties under PIPEDA end. For more information, please see our full case comment here.

The OPC is appealing the decision to the Federal Court of Appeal, with a hearing scheduled for February 21, 2024.

The Divisional Court’s decision in Broutzas v Rouge Valley Health System [2023 ONSC 540] highlights the high bar for certifying class actions under the common law tort of intrusion upon seclusion.

The representative plaintiffs appealed the motion judge’s decision denying certification of their respective class actions. The plaintiffs brought an action after hospital employees independently accessed the hospital records of patients who had recently given birth, either to solicit sales of RESPs or to sell the patients’ contact information to RESP salespeople. The information accessed was characterized as “contact information”, which, while personal, was not private, and was regularly provided when showing proof of identity. Additional information accessed, such as the date and place of birth and the child’s name and gender, was also not considered “private”, since the parents would likely have publicized it. The Divisional Court agreed that while there was intrusion, there was no intrusion upon seclusion.

Further affirming the difficulty of succeeding in class actions generally, in Lamoureux v. Organisme Canadien de Réglementation du Commerce des Valeurs Mobilières (OCRCVM) [2023 CanLII 24495], the Supreme Court of Canada dismissed the application for leave to appeal the judgment of the Court of Appeal of Quebec (QCCA).

The case initially arose when Mr. Lamoureux instituted a class action against the Investment Industry Regulatory Organization of Canada (IIROC), after a laptop containing the PI of thousands of Canadian investors was left on a train by an IIROC inspector. The laptop was never returned or found. Following a full trial on the merits, the Superior Court of Quebec dismissed the class action, a decision affirmed by the QCCA. The lower courts held that the class members’ fear and annoyance resulting from the loss of PI were normal inconveniences of life that anyone living in society today encounters and should accept, and thus did not amount to compensable harm. The lower courts further found that there was no evidence of a causal link between the loss of the computer and the unlawful uses alleged by the class members, and that the defendant had adhered to best practices and responded diligently to the incident.

Investigation into OpenAI’s ChatGPT. In the spring of 2023, the OPC along with the privacy authorities for Quebec, British Columbia and Alberta launched a joint investigation into OpenAI’s ChatGPT following a complaint alleging the collection, use and disclosure of user PI without consent. The full announcement can be viewed here.

This joint investigation occurred on the heels of the Italian Data Protection Authority’s (IDPA) temporary ban on ChatGPT in the country while investigating compliance with the GDPR. While ChatGPT was made available again to Italian users following the implementation of certain privacy controls, the IDPA’s continued investigation has more recently asserted that ChatGPT violates the GDPR, as related to the mass collection of users’ data for training the algorithm and concerns regarding exposure of inappropriate content to young users.

While the OPC and provincial privacy authorities have yet to release their decision, it will be interesting to see how it compares to their European counterparts.

Following an investigation into Home Depot (PIPEDA Findings #2023-001), the OPC concluded that the home improvement retailer had not obtained meaningful consent before disclosing its customers’ PI to Facebook.

Home Depot used “Offline Conversions”, a Facebook business tool that measures the extent to which Facebook ads lead to real-world outcomes. When customers provided their email address for an e-receipt, Home Depot sent the customer’s hashed email address and offline purchase details to Facebook. After matching each email address to a Facebook account, Facebook would provide Home Depot with an aggregate report of its findings, and would further use the information for its own purposes.
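For readers unfamiliar with the mechanism, the matching described above typically works on one-way digests of normalized email addresses rather than the raw addresses themselves. The following is a minimal illustrative sketch, assuming SHA-256 hashing of a lowercased, whitespace-trimmed address (the function name and normalization steps shown are common industry practice, not a statement of Facebook's exact implementation):

```python
import hashlib

def hash_email(email: str) -> str:
    """Normalize and hash an email address for conversion matching.

    Both parties apply the same normalization (trim whitespace,
    lowercase) before hashing, so the same address always produces
    the same digest on each side and the digests can be matched.
    """
    normalized = email.strip().lower()
    return hashlib.sha256(normalized.encode("utf-8")).hexdigest()

# Different surface forms of the same address yield the same digest.
print(hash_email("Jane.Doe@example.com") == hash_email(" jane.doe@example.com "))
```

Note that, as the OPC's finding reflects, hashing of this kind does not by itself amount to anonymization: the platform receiving the digest can still match it to an account it already holds, which is precisely why meaningful consent was at issue.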

The OPC concluded that Home Depot required express opt-in consent prior to sharing customer PI with Facebook, which it had not obtained. Moreover, Home Depot did not obtain implied consent, nor could it have relied on implied consent, as most customers were unaware of Home Depot’s information-sharing practices and would not reasonably expect it.

The privacy sector saw many changes in 2023, and 2024 is bound to be just as dynamic. While privacy laws continue to change, it is vital for businesses to stay on top of their obligations. For more information please reach out to Bereskin & Parr’s Privacy and Data Protection team.
