Strengthening Privacy for the Digital Age

Proposals to modernize the Personal Information Protection and Electronic Documents Act

Introduction

Technology has long brought enormous benefits, along with profound changes, to almost every aspect of human life. Much as the printing press revolutionized society starting in the 15th century, the digital revolution has had, and will continue to have, an enormous impact on daily life. Business, communications, entertainment, transportation, banking, education, health care, our interpersonal interactions and our physical movements — almost every aspect of our lives is mediated by digital technology. And with those interactions, enormous amounts of data about individuals are being created and harnessed for a vast array of purposes.

Digital and data-driven technology is already empowering science, supporting innovation, and driving economic growth. For example, advancements in areas including robotics, artificial intelligence (AI), quantum computing, and nanotechnology are leading to ground-breaking discoveries with significant economic and social benefits. But while these technological achievements are in many ways enriching our society, this transformation also brings with it challenges and uncertainty that we as a country must be prepared to address. In response to this, some stakeholders have called for the Government to adopt a National Data Strategy.

On June 19, 2018, the Government of Canada launched its National Digital and Data consultations to demonstrate its commitment to continuing to work with Canadians to make Canada a nation of innovators. As we noted in Canada’s Digital Charter in Action: A Plan by Canadians, for Canadians, we asked Canadians across the country to share their unique perspectives and ideas on the challenges and areas of opportunity facing Canada in this time of transformation. And we received a resounding response — from small business owners and multi-national companies; students, teachers, and researchers; innovators and entrepreneurs; and everyone in between.

Canadians shared their optimism with us about the great social and economic potential for Canada in this digital age. But they also shared their concerns about how personal data could be used. Simply put, the way forward on data collection, management and use must be built on a strong foundation of trust and transparency between citizens, companies and government.

Trust is indeed the lynchpin of the digital and data-driven economy. Yet, clearly, individuals' trust is at risk. Popular media is rife with stories of data breaches; misuse of personal information by large companies; foreign interference and malicious actors; and cyberbullying; along with increasing concern about the impacts of the digital and data revolution on issues ranging from our mental healthFootnote 1 to democratic institutionsFootnote 2. Ineffective or inconsistent security hygiene, a lack of competition, and business models based on the surveillance of individualsFootnote 3 have left individuals increasingly wary of how the products and services on which they now depend for nearly all aspects of their activities collect and use their personal information.

Trust, the Digital Economy and the Personal Information Protection and Electronic Documents Act

In the early days of the commercial Internet, when e-commerce was emerging, the Government of Canada enacted the Personal Information Protection and Electronic Documents Act (PIPEDA) to ensure trust in the emerging economy. Its stated purpose is:

to establish, in an era in which technology increasingly facilitates the circulation and exchange of information, rules to govern the collection, use and disclosure of personal information in a manner that recognizes the right of privacy of individuals with respect to their personal information and the need of organizations to collect, use or disclose personal information for purposes that a reasonable person would consider appropriate in the circumstances.Footnote 4

A principles-based, technology-neutral law, PIPEDAFootnote 5 applies to a wide range of commercial activity and is overseen by an Agent of Parliament, the Office of the Privacy Commissioner of Canada. In the nearly 20 years since it came into force, commercial activity has evolved rapidly and in ways unforeseen. Based on the internationally accepted privacy principles contained in the Organisation for Economic Co-operation and Development (OECD) Guidelines on the Protection of Privacy and Transborder Flows of Personal Data (Privacy Guidelines)Footnote 6, the 10 interrelated privacy principles (and related sub-paragraphs) in PIPEDA guide organizations' personal information handling activities. One of these principles, Knowledge and Consent, along with a limited set of exceptions to consent, authorizes those activities, which must be "appropriate in the circumstances." The remaining principles, including accountability, openness, accuracy, access, safeguards and redress, are intended to ensure that organizations treat personal information in a manner that is fair and understandable to the average person and in keeping with their reasonable expectations. The law has been applied to a wide variety of business activities, including in the context of transborder data flows, and has proven reasonably nimble.

That said, it has been criticizedFootnote 7, particularly in terms of its consent regime and enforcement model, for not providing the incentives needed in a data- and digitally-driven economy to ensure that organizations comply. The House of Commons Standing Committee on Access to Information, Privacy and Ethics has also recommended updates to improve individual control and organizational transparency, in order to strengthen privacy protections in an age where individuals feel a lack of control and understanding. The Government of Canada has stated its agreement with recommendations made in several recent Parliamentary reportsFootnote 8 that changes are required to Canada's federal private-sector privacy regime to ensure that rules for the use of personal information in a commercial context are clear and enforceable and will support the level of privacy protection that Canadians expect.

The principles outlined in Canada's Digital Charter, along with their supporting activities, collectively provide the foundation for achieving a strong and vibrant digital economy for Canada. The reform of PIPEDA must contribute to achieving the outcomes related to these principles. PIPEDA, as a key element of Canada's marketplace framework, must also contribute to achieving an inclusive digital economy that provides a level playing field, fairness of opportunity, enhanced security and privacy, predictability for business, and international competitiveness.

Canada is facing these opportunities and challenges in parallel with other leading nations as part of a global innovation race. Our global competitors are taking aggressive action in terms of supporting trust and privacy to lead in a data-driven, digital global marketplace.

Next generation privacy and e-protection laws, specifically in the European Union but also in the United States, are impacting domestic policies and practices. There is a desire for an approach to personal information protection in the private sector that meets Canada's needs and remains interoperable with leading jurisdictions. While there is commonality amongst privacy statutes in Canada and abroad, a number of important distinctions between Canadian and international frameworks are challenging the goal of an integrated digital economy both at the domestic and international levels.

The Government is considering how best to modernize its private-sector policy and regulatory framework in order to protect privacy and support innovation and prosperity. In short, the goal is to respect individuals and their privacy by providing them with meaningful control without creating onerous or redundant restrictions for business; enable responsible innovation on the part of organizations; and ensure an enhanced, reasoned enforcement model.

Specifically, the Government is proposing clarifications under PIPEDA that detail what information individuals should receive when they provide consent; certain exceptions to consent; data mobility; deletion and withdrawal of consent; incentives for certification, codes, standards, and data trusts; enhanced powers for the Office of the Privacy Commissioner; as well as certain modernizations to the structure of the law itself and various definitions. The proposals outlined in this paper fall within a broader conceptual framework, detailed in Annex A, for advancing policy work in the digital and data context.

With this discussion paper, Innovation, Science and Economic Development Canada (ISED) is continuing the dialogue on "Trust and Privacy" that was initiated in the Data and Digital consultations in 2018. This paper outlines a series of policy considerations related to specific proposals that would serve to enhance consumers' control, enable responsible innovation and enhance enforcement.

The Government is also studying potential reforms to the Privacy Act, which governs the personal information-handling practices of federal institutions. That initiative is being led by Justice Canada, working closely with the Treasury Board Secretariat.

Part 1: Enhancing individuals' control

Issue:

The increased volume and complexity of data flows have strained the traditional knowledge-and-consent system and left individuals without meaningful control over their personal information and privacy.

Why is this an issue?

Digital platforms and services have become an integral part of how Canadians live, work and play. Yet, platforms and products are increasingly designed to gather and share data and/or monitor users by default, reducing consumer choice and making consent less relevant. As noted by Teresa Scassa: "…the Personal Information Protection and Electronic Documents Act's consent-based regime may need to be supplemented, and there is considerable interest in consumer- and competition-friendly tools, such as data portability, that give consumers more control over their personal information. Increasingly, public harms — algorithmic bias and the manipulation of individuals and groups — flow from the capture and use of personal information. New frameworks are required for the ethical use of data."Footnote 9

PIPEDA's knowledge and consent principle requires organizations to inform individuals of the purpose of the collection, use or disclosure of their personal information, and to obtain their consent. In practice, however, this has meant that individuals bear a great deal of the responsibility to inform themselves of an organization's privacy management practices and to understand the nature, purpose and consequences of consenting to have their information collected, used and disclosed by the organization.

This is what Daniel Solove from George Washington University Law School has labeled a Privacy Self-Management approach, whereby the onus is on the individual to manage their privacy.Footnote 10 Complex data flows involving numerous parties strain an individual's ability to fully comprehend what they are consenting to. Although many organizations have privacy policies in place, these are notoriously long and complex, and most individuals have neither the time nor the legal training to understand themFootnote 11. Solove notes that "(b)ecause individual decisions to consent to data collection, use, or disclosure might not collectively yield the most desirable social outcome, privacy self-management often fails to address these larger social values."Footnote 12 The multiplicity of online interactions can make it challenging for individuals to understand the nature and extent of information sharing that occurs in this environment.

Furthermore, a lack of transparency around automated decision-making processes and the resulting decisions increases individuals' concerns related to bias and potential discrimination. Ian Kerr notes that "... AIs [artificial intelligences] are designed in ways that raise unique challenges for privacy. Many use machine learning to excel at decision-making; this means AIs can go beyond their original programming, to make 'discoveries' in the data that human decision-makers would neither see nor understand... I would therefore submit that PIPEDA requires a duty to explain decision-making by machines."Footnote 13 There is also the emerging presence of software agents and bots interacting in the marketplace. This has the potential to deceive users and undermine confidence in the digital marketplace and underscores the need for measures to ensure trust is maintained.

Canadians have made their concerns very clear. Eighty-four percent of Canadians are concerned with the use of personal information by social media platformsFootnote 14. Nearly three in four (74%) Canadians think they have less privacy protection than ten years agoFootnote 15. Ninety percent of Canadians would be "very" or "fairly" likely to sever ties with businesses that use data "unethically"Footnote 16. Seventy-one percent of Canadians would be more likely to do business with a company if it was subject to strict financial penalties.Footnote 17 According to the Canadian Automobile Association's survey of Canadians regarding autonomous vehicles, 81 percent of Canadians feel a "need for clear, enforced rules to protect their privacy of personal information when it comes to vehicle data."Footnote 18

The results of the National Data and Digital Consultations showed that Canadians want more transparency in how their data is being collected and used. However, current models that rely entirely on an individual's consent to complex and lengthy privacy policies are inadequate and do not help to build trust. Canadians also want greater control over how their information is used, and need to see the value of the benefits it brings. Moreover, next-generation privacy laws are responding to these issues by providing explicit new rights to data mobility and to deletion of information, and by expanding rights around transparency and automated decision-making. Canada needs to consider these options as possible responses to ensure that Canadians have the control they need to trust the data and digital economy.

A. Possible options — Consent and transparency

  • The ETHI study on PIPEDA, as well as consultations undertaken by the OPC, have resulted in a number of suggestions on how to improve individuals' control, and what the role of consent in privacy protection should be. Personal information protection laws, including PIPEDA (in consent and the existing exceptions to it) and the GDPR (outlined in its grounds for "processing"), recognize a variety of grounds for the handling of personal information. A number of proposals currently being put forward for a national US privacy law also reflect such an approach. Indeed, the Privacy Commissioner noted, "(i)n seeking to find solutions to the challenges of the consent model in the digital realm, it may be that consent is simply not practicable in certain circumstances."Footnote 19
  • The first possible set of options focuses on consent, exceptions to consent, transparency, and the definitions of de-identified information and publicly available information. One potential approach for PIPEDA is to specify what information is needed as the basis for meaningful consent.Footnote 20 Requiring too much detailed information in the consent process can overwhelm individuals or become yet another screen to click through in the rush to get to the product or service. That said, when it comes to transparency or accountability in automated decision-making, an area of increasing concern as we move towards artificial intelligence and machine learning, a light should be shone on the use of these technologies to help inform individuals and oversight bodies about the basis for impactful decisions about individuals — a fundamental tenet of privacy protection.
  • Equally, we must focus consent on situations where there is an opportunity for individuals to make a meaningful and informed decision.Footnote 21 To do so it will be necessary to identify purposes for which consent may not be necessary or even appropriate.Footnote 22
  • A component of this issue concerns the concept of de-identification.
  • Under PIPEDA, personal information is defined as information about an identifiable individual. According to the Federal Court of Canada, "information will be about an 'identifiable individual' where there is a serious possibility that an individual could be identified through the use of that information, alone or in combination with other information."Footnote 23 "About" means that the information is not just the subject of something but also relates to or concerns the subject.Footnote 24 Generally speaking, the definition of personal information is given a broad and expansive interpretation. In the era of Big Data, however, when vast amounts of data are being created every day, this potentially means that any piece of data could be considered to be about an identifiable individual. Moreover, there are increasingly sophisticated means to re-identify information that ostensibly appears to be non-personal. True anonymization, which would place information outside the scope of privacy legislation, is therefore unlikely to be practically attainable. That said, a risk-based approach, in which de-identified information is defined and its use allowed in certain specified circumstances, with penalties for re-identification, could both address privacy concerns and enable innovation.
  • Concepts such as pseudonymous information are being incorporated into other privacy lawsFootnote 25, in recognition that there is a desire to use information that need not necessarily be personally identified, but that remains identifiable, and that protections are needed for such information. The concept of pseudonymous information could be incorporated into exceptions to consent, to clarify that while this information may not be "identified", it still retains a privacy interest and must be protected (a brief illustrative sketch follows this list).
  • ETHI also raised concerns about consent and personal information in the public domain. PIPEDA currently contains the Regulations Specifying Publicly Available Information, which set out information that is outside the knowledge and consent requirements. Drafted at the same time as PIPEDA, the Regulations reflect, to some degree, the technology and uses of their time, and include information sets (such as registries) that were mandated to be publicly available. Personal information and classes of personal information are listed, along with restrictions (generally, the collection, use or disclosure of such information must relate directly to the purposes for which it was made publicly available), and are exempt from the requirements for knowledge and consent. Some have argued that these Regulations need updating to reflect the current environment; others have raised concerns about the need to respect the privacy interests attached to such information.Footnote 26 This issue also relates to individuals' online reputation, which is discussed later in this Part.
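The following minimal sketch illustrates the distinction being drawn above. It is not drawn from PIPEDA or from these proposals; the key, field names and values are entirely hypothetical. Pseudonymization replaces a direct identifier with a token, but anyone holding the key can re-link the token to the individual, and remaining quasi-identifiers (such as a postal code) can still enable re-identification in combination with other data, which is why pseudonymized information retains a privacy interest.

```python
import hmac
import hashlib

# Hypothetical secret key held under organizational control. Anyone holding it
# can re-link pseudonyms to individuals, so the output remains "identifiable".
SECRET_KEY = b"example-key-kept-under-organizational-control"

def pseudonymize(direct_identifier: str) -> str:
    """Replace a direct identifier (e.g. an email address) with a stable token."""
    return hmac.new(SECRET_KEY, direct_identifier.encode("utf-8"),
                    hashlib.sha256).hexdigest()

record = {"email": "jane@example.com", "postal_code": "K1A 0H5", "purchase": 42.50}

pseudonymized_record = {
    "subject_token": pseudonymize(record["email"]),  # direct identifier replaced
    "postal_code": record["postal_code"],            # quasi-identifier: residual re-identification risk
    "purchase": record["purchase"],
}
print(pseudonymized_record)
```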

We therefore propose to:

Provide more meaningful controls and increased transparency to individuals by:

  • Requiring organizations to provide individuals with the information they need to make informed decisions, including specific, standardized, plain-language information on the intended use of the information and the third parties with which it will be shared, and prohibiting the bundling of consent into a contract.
  • Providing for certain alternatives or exceptions to consent to facilitate use of personal information by business under specific circumstances, to cover, for example, common uses of personal information for standard business activities. Likewise, adding a definition of de-identified information, along with an exception to consent for its use and disclosure for certain prescribed purposes and penalties for re-identification, could also enable the use of such information for appropriate purposes, while at the same time ensuring that it is otherwise subject to the protections afforded under PIPEDA. Developing such a definition, however, will be challenging given that nearly any information can be personal information.
  • Consent would still be required for those uses that have the biggest impact on individuals. This would of course not encompass those situations where consent is inappropriate or contrary to the activity, such as investigations, responding to subpoenas or other lawful means to compel the production of information.
  • Informing individuals about the use of automated decision-making, the factors involved in the decision and, where the decision is impactful, information about the logic upon which the decision is based (a simple illustration follows this list). Such a requirement would not extend to revealing confidential commercial information to an individual. As more complex data uses, especially those that do not involve human discretion, such as those supporting the development of artificial intelligence, increasingly move out of research labs and into the marketplace, automated decision-making will become the norm. With it comes the risk of misuse of personal information that can result in undue discrimination and bias. The purpose of shining more light on automated decisions is to assist individuals in better understanding how such decisions are made about them.
  • Requiring enhanced transparency of practices, by explicitly requiring organizations to demonstrate their accountability, including in the context of transborder data flows. This proposal is explored further in Parts 2 and 3.
  • Exploring the definition of publicly available information.
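As a purely illustrative sketch of the kind of transparency contemplated above (not a format prescribed by PIPEDA or these proposals; all names and values are hypothetical), an organization making an impactful automated decision might surface a plain-language notice alongside the outcome, listing the factors considered and the general logic applied, without exposing confidential commercial details:

```python
from dataclasses import dataclass
from typing import List

@dataclass
class AutomatedDecisionNotice:
    """Plain-language notice accompanying an impactful automated decision."""
    decision: str          # the outcome communicated to the individual
    automated: bool        # whether the decision was made without human review
    factors: List[str]     # the personal information factors considered
    logic_summary: str     # general description of the logic, not proprietary details
    recourse: str          # how to contest the decision or request human review

notice = AutomatedDecisionNotice(
    decision="Credit limit increase declined",
    automated=True,
    factors=["payment history", "current balance", "length of account history"],
    logic_summary=("Applications are scored against historical repayment patterns; "
                   "scores below a set threshold are declined automatically."),
    recourse="Contact the privacy office to request a human review of this decision.",
)
print(notice)
```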

Considerations and questions:

  • PIPEDA already requires organizations to notify individuals of the purposes for the collection, use or disclosure, and to inform individuals of the types of personal information being collected as part of being open about their personal information policies and practices. The Office of the Privacy Commissioner of Canada (along with its provincial counterparts in Alberta and British Columbia) has issued guidanceFootnote 27 on this question. Under consideration are further clarifications in law.
    • Will the additions we are proposing be enough to increase meaningful consent for individuals?
    • Some jurisdictionsFootnote 28 have defined enhanced measures, such as prohibitions or explicit consent, linked to the collection and processing of sensitive personal information, whereas PIPEDA has always taken a contextual approach. Should sensitive personal information be defined, with specific protections added?
  • Taken together, enhancing consent requirements where the impact is greatest, and reducing reliance on consent for common practices or trust environments (see Part 2), could provide a balanced approach for individuals and businesses. For example, it could give more meaningful control to individuals and reduce the risk of "consent fatigue" by removing the need to consent to uses that most individuals would consider reasonable and focusing on those of greater risk. For business, it could provide greater certainty about those practices where consent is required and reduce the risk of objection to common business practices (though they will still need to be open about these practices by outlining them in a privacy policy readily available to individuals).
  • The term "standard business practices" could capture purposes such as fulfilling a service; using information for authentication purposes; sharing information with third-party processors; risk management; or meeting regulatory requirements. The accountability principle, as well as all other requirements under the Act would continue to apply.
    • What are the benefits or risks of removing the requirement to obtain consent to process personal information for purposes that are considered to be standard business practices?
    • What activities should be captured by such a provision?
    • What must be outside the scope of such a provision?
  • This approach would require companion amendments to enhance the oversight role for the Privacy Commissioner. The existing overarching requirement that purposes be appropriate would, of course, remain.
  • The concept of de-identified/pseudonymized data is being recognized in other laws as a way forward to enable innovation and protect privacy, with appropriate conditions surrounding it. Adopting an approach similar to those in other jurisdictions would also help with interoperability concerns and bring greater certainty to individuals and organizations, especially in the cross-border context. In Part 2, we propose a particular use, without consent, for de-identified information, related to research and data trusts.
  • While de-identification can provide increased security and confidence to allow greater data use and sharing, there are increasingly sophisticated techniques that permit the re-identification of so-called de-identified or anonymized data (one commonly discussed safeguard is sketched following these questions).
    • What other protections for de-identified data would be required to mitigate the risks of re-identification?
  • Lastly, with respect to publicly available information, this is a very complex issue, with many implications for individuals and organizations. Recent controversies have illustrated the privacy implications that arise from the use of social media information, for example.
    • Does the current definition provide enough clarity and certainty for businesses and individuals to understand how personal information in the public domain should be protected?
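On the question of additional protections for de-identified data, one safeguard often discussed in the de-identification literature (offered here only as an illustrative sketch, not a measure proposed in this paper; the threshold, field names and data are hypothetical) is to test a dataset for k-anonymity before release: every combination of quasi-identifiers must be shared by at least k records, so that no individual stands out.

```python
from collections import Counter
from typing import Dict, Iterable, List

def k_anonymity(records: Iterable[Dict[str, str]], quasi_identifiers: List[str]) -> int:
    """Return the size of the smallest group sharing the same quasi-identifier values.

    A dataset is k-anonymous if this value is at least k.
    """
    groups = Counter(
        tuple(record[qi] for qi in quasi_identifiers) for record in records
    )
    return min(groups.values()) if groups else 0

dataset = [
    {"age_band": "30-39", "postal_prefix": "K1A", "diagnosis": "A"},
    {"age_band": "30-39", "postal_prefix": "K1A", "diagnosis": "B"},
    {"age_band": "40-49", "postal_prefix": "M5V", "diagnosis": "A"},
]

# The third record is unique on (age_band, postal_prefix), so k = 1;
# the dataset would fail a release threshold of, say, k >= 2.
print(k_anonymity(dataset, ["age_band", "postal_prefix"]))
```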

B. Possible options — data mobility

"Data mobilityFootnote 29" refers to enabling individuals to request that the personal information that they have provided to an organization, be provided to another organization. It has been touted as having the potential to empower individuals to "vote with their feet" so to speak. It is viewed by some as an evolution to the existing provisions in PIPEDA for access to one's own personal information and for withdrawal of consent.

Studies in other jurisdictionsFootnote 30 have determined that data mobility has the potential to enhance consumer choice, thus fostering the emergence and growth of innovative new goods and services, in addition to supporting greater individual control over data and encouraging competition. As noted by Michael Geist in his Senate testimony on data mobility in the banking sector, "there are undoubtedly security protocols and standards to be developed, but the starting point is regulated support for a consumer-focused system that gives consumer control by opening their data at their request."Footnote 31
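By way of illustration only, data mobility is typically imagined as an export of the information the individual has provided, in a common machine-readable format that a receiving organization can ingest. The schema below is hypothetical; it is not defined by PIPEDA, by these proposals, or by any standards body, and it deliberately excludes derived information, reflecting one of the open questions raised later in this section.

```python
import json
from datetime import date

# Hypothetical export of the information an individual provided to Organization A,
# in a machine-readable format that a receiving Organization B could ingest.
export = {
    "format_version": "1.0",
    "exported_on": date.today().isoformat(),
    "data_subject": {"name": "Jane Doe", "email": "jane@example.com"},
    "provided_data": {
        "mailing_address": "123 Example St, Ottawa ON",
        "preferences": {"language": "en", "marketing_emails": False},
        "transaction_history": [
            {"date": "2019-01-15", "item": "Monthly plan", "amount": 45.00},
        ],
    },
    # Derived information (profiles, scores) is deliberately omitted here.
}

print(json.dumps(export, indent=2))
```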

We therefore propose to:

Introduce new data mobility opportunities to enhance individuals' control over information by:

  • Providing an explicit right for individuals to direct that their personal information be moved from one organization to another in a standardized digital format, where such a format exists.

Considerations and questions:

  • This approach would need to be complemented by the development of common approaches to data transference, reception, and use, potentially through codes of practice or the development of technical standards.
  • Exceptions to the requirement to port data must be provided for situations where porting would interfere with law enforcement, prejudice an investigation, or reveal proprietary processes or technologies, or where it is not technically feasible.
  • There are a number of issues with respect to the implementation of data mobility under PIPEDA. In particular, there is a question as to whether the provision should capture derived information (e.g. a profile or categorization of the individual by that organization) and information pertaining to a third party (e.g. the individual's contact list). It will also be important to manage expectations. Data mobility can raise significant consumer expectations; for example, there may be instances where a competitor does not yet exist or where mobility is not technically feasible.
    • Should a provision for data mobility refer strictly to information that is provided by the individual and exclude derived information?
    • Should a provision for data mobility capture third-party information?

C. Possible options — online reputation

  • There has been a great deal of discussion in recent years about the impact of the online environment on individuals' ability to manage their privacy and reputation. While social media facilitates connectivity amongst individuals, it can also present challenges for individuals to control all possible third-party use of their personal information. The incident involving Facebook and Cambridge Analytica is but one example of this.
  • Some of PIPEDA's existing provisions are helpful in addressing reputation issues, in particular those allowing individuals to have inaccurate information corrected, to withdraw consent for the use of their information, and to go on the record in relation to disputed personal information. There are also provisions limiting the retention of personal information and requiring its disposal when no longer required. In all, these contribute to providing individuals with some measure of control.Footnote 32
  • However, there are limits to the effect of these provisions in the online environment and the Government recognizes the particular risks to youth and children.Footnote 33
  • Some responses put forward in other jurisdictions include de-indexing and source takedown/deletion. De-indexing involves the removal or suppression of certain links in online search engine results. Source takedown/deletion refers to the removal of personal information from sites where the information is directly provided by an individual.
  • At the time of writing, the application of PIPEDA to search engines is before the Federal Court of CanadaFootnote 34. This discussion paper will therefore not focus on de-indexing at this time.
  • Another key challenge is that storage of information has become extremely cheap, and the amount of information that can be retained is vast. Although PIPEDA currently requires personal information to be kept only as long as needed to fulfill identified purposes, this often does not happen. In an age in which personal information is increasingly monetized, the incentives to keep personal information — in case it may be of use at a later point — are great. Stale information may be used against individuals and have impacts on their reputation.Footnote 35 It can also be replicated easily (this is particularly challenging in the social media context).
  • The following are possible options to enhance existing abilities to remove personal information at the source:

Enhance the ability of individuals to maintain their online reputation by:

  • Requiring organizations to inform minors about their right to have the personal information they provided deleted or de-identified, and how to do so, with minimal exceptions;
  • Providing all individuals with the explicit right to request deletion of information about them that they provided, with some caveats.
  • Ensuring the accuracy and integrity of information about an individual throughout the chain of custody by requiring organizations to communicate changes to, or deletion of, information to any other organization to which it has been disclosed.
  • Exploring the use of defined retention periods to increase data integrity and decrease the risk of misuse.

Considerations and questions:

  • These options expand on or clarify existing rights under PIPEDA. They also reflect a movement in other jurisdictions towards providing individuals with explicit rights to deletion of their information in certain circumstances.Footnote 36
  • The possible options would clarify and enhance PIPEDA's rules around deletion and withdrawal of consent, thereby giving individuals, in particular youth, greater control over their personal information and reputation. Providing a specific right for young people recognizes the importance of being able to explore interests and friendships online without these activities prejudicing them later in life.
  • There is a potential for such provisions to impose a compliance burden on organizations, and to unnecessarily impede access to information that is critical for business.
  • For example, it may be challenging in some instances for an organization to know who is a minor. It may also be difficult to comply with a deletion request when information was provided by a third party.
    • Do these proposals adequately enhance control for young people?
    • What parameters should be considered in order to mitigate the burden to organizations of complying with such provisions?
    • Should there be a defined retention period under PIPEDA?

Part 2: Enabling responsible innovation

Issue:

Increasingly, new business models and emerging technologies rely on complex uses of personal information by a variety of players. This has led to calls for enhanced access to personal information for the development of innovative products and services. At the same time, this triggers the need for increased accountability and higher standards of care to ensure privacy and security are respected. Added to this is the concern that it is not always clear how a principles-based law applies to new business models/technologies.

Why is this an issue?

As noted in Canada’s Digital Charter in Action: A Plan by Canadians, for Canadians: "Canada has the right ingredients to thrive in an increasingly digital world. We have strong research capacity, a diverse and highly educated workforce, and a strong investment climate. We are tech savvy and well-connected with 87 percent of Canadians and 95 percent of Canadian businesses connected to the Internet. Eighty-eight percent of Canadians have a mobile device."Footnote 37

Data is the fuel to grow the Canadian data-driven economy, yet complex data flows involving numerous parties, often across borders, can reduce an individual's sense of control over their personal information and, ultimately, their trust that it can be adequately protected. This, combined with the perceived lack of transparency around automated decision-making processes, including programmatic processes, increases individuals' concerns over the potential for abuse of the data being collected and used.

Almost every organization is now in the data business in some way, resulting in a lack of clarity about who is accountable for personal information. The autonomous vehicle industry is illustrative of this point. In addition to the sensors the vehicle manufacturer has placed within the vehicle, other platforms and application developers are also collecting data about the vehicle and the driver. In other instances, there is increasing collaboration between the public and private sectors (a timely example is the smart city scenario), which raises concerns about accountability, appropriate uses of data in the public interest, and access to data for public policy-making.

Given the importance of data- and digitally-driven innovation to Canada's economy and future prosperity, the legislative frameworks that support this marketplace must be balanced and fit for purpose.

A. Possible option: Enabling data trusts for enhanced data sharing

As in other jurisdictions, Canada could benefit from models that maximize the use of available data and provide a means to securely pool data in pursuit of innovation and the public good, particularly in areas such as health or transportation. Emerging solutions, such as "data trusts", may provide a way forward to help enable responsible innovation, particularly in the case of public-private partnerships.

Data trusts would involve trusted third parties managing access by organizations to sensitive databanks for research and development purposes, while protecting privacy and ensuring that organizations use data appropriately. Trusts are a framework for fiduciary asset management; policymakers, firms, and experts have begun to explore the potential application of such a trust model to data governance. Bianca Wylie and Sean McDonald of the Centre for International Governance Innovation have identified data trusts as providing "a way for data rights holders to aggregate and build leverage toward collectively bargaining for more balanced, publicly beneficial data relationships... The act of creating a data trust... is inherently specific, requiring the parties involved to agree on a common purpose, a governance structure and a clear theory of shared benefit."Footnote 38 Data trusts treat datasets as the assets that an independent third party must manage according to contractual terms designed to ensure the responsible, appropriate use of those assets.Footnote 39
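To make the mechanism concrete, the following is a rough, hypothetical sketch (not a design drawn from this paper or from any existing data trust) of how a trustee might gate organizations' access to de-identified databanks against the purposes and parties agreed to in the trust's governing terms, while keeping a log to support oversight:

```python
from dataclasses import dataclass
from typing import Dict, List

@dataclass
class TrustTerms:
    """Contractual terms under which the trustee manages the datasets."""
    approved_purposes: List[str]        # purposes agreed to by the trust's parties
    approved_organizations: List[str]   # organizations accredited by the trustee
    reidentification_prohibited: bool = True

class DataTrust:
    """Trusted third party managing access to de-identified databanks."""

    def __init__(self, terms: TrustTerms, databanks: Dict[str, object]):
        self.terms = terms
        self.databanks = databanks
        self.access_log: List[Dict[str, str]] = []  # supports oversight and audit

    def request_access(self, organization: str, purpose: str, databank: str):
        if organization not in self.terms.approved_organizations:
            raise PermissionError(f"{organization} is not accredited by the trustee")
        if purpose not in self.terms.approved_purposes:
            raise PermissionError(f"Purpose '{purpose}' is outside the trust's terms")
        # Every grant is logged so the trustee (and an oversight body) can review use.
        self.access_log.append(
            {"organization": organization, "purpose": purpose, "databank": databank}
        )
        return self.databanks[databank]

trust = DataTrust(
    TrustTerms(approved_purposes=["transportation research"],
               approved_organizations=["University Lab"]),
    databanks={"transit_trips_deidentified": [{"trip_km": 4.2, "mode": "bus"}]},
)
data = trust.request_access("University Lab", "transportation research",
                            "transit_trips_deidentified")
print(data)
```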

We therefore propose the following:

Encourage use of data for research/innovation

  • Establishing a regime in PIPEDA for the use of de-identified data could support the creation of data trusts, whereby de-identified information could be processed without consent when managed by a data trust. This could be done by revising the existing consent exception for research to encourage the creation of data trusts. Such a regime could create a clear legal framework for the sharing of information without the need to seek consent, but would otherwise ensure appropriate coverage under the Act.
  • This approach would need to be accompanied by prohibitions against intentional re-identification or targeting of individuals in data, or re-identification as the result of negligence or recklessness.
  • This approach would need to clearly establish linkages between enforcement of PIPEDA and oversight of a data trust.

Considerations and questions:

  • Data trusts could potentially provide a secure and privacy-enhancing means to share data in order to spur the development of AI innovations in a broad spectrum of the economy. Data trusts have the potential to allow for greater sharing and use of data for socio-economically beneficial purposes within a framework that protects against abuses of that data.
  • Canada's closest allies are exploring trusts in AI/data policies. The United Kingdom is examining data trusts as standard contracts for sharing of public- and private-sector data and Australia is in the process of creating new entities to manage private access to public-sector data.
  • There are also a number of private-sector examples of data trusts, such as firms that serve as a GDPR-compliant data controller on behalf of their clients, or organizations that provide academics with access to data through a trusted intermediary for approved research purposes.
  • The model has the potential to provide trusted access to and disclosure of data, as well as a level of oversight and accountability for those that gain access to data, for example through authentication and identity schemes.
    • Could PIPEDA be harnessed to encourage the development of data trusts, in particular the existing exception to knowledge and consent for the disclosure of personal information for research and statistical purposes?Footnote 40

B. Possible options — self-regulation and technical standards

In keeping with Canada's strategic objective to preserve the free flow of information across borders while maintaining meaningful privacy protection, Canada is engaged in international fora that promote global interoperability of privacy frameworks. Specifically, Canada supports multilateral approaches to privacy that seek to "bridge" privacy regimes internationally such that a form of "mutual recognition" is established across multiple countries or regions. APEC's Cross Border Privacy Rules System (of which Canada is a participant) is a good example. Given various initiatives are currently underway to "bridge" the APEC system with non-legislative EU legal instruments for transborder flows, the APEC system could be of considerable interest (and utility). The OECD Privacy Guidelines is another example of the type of international arrangement that Canada supports. Promoting global interoperability of privacy frameworks is a key foundation of Canada's approach to privacy.Footnote 41 Integration of codes and standards as part of a statutory framework can further assist in aligning privacy frameworks both domestically and internationally.

PIPEDA provides a baseline for privacy protection, and currently contemplates a role for codes of practice. Other jurisdictions have recognized the value of codes, standards and certification schemes in improving regulatory agility, and supporting responsible innovation. These schemes have the potential to provide more specific protections for certain sectors or activities, and to increase transparency and certainty for individuals. Furthermore, such approaches could potentially help individuals make choices based on organizations' privacy practices. In short, there is a need to recognize the value and utility of standards, codes and certification as tools to underpin privacy "rules" and try to influence and encourage their development in areas that reflect Canadian requirements, priorities and interests. Moreover, adherence to codes and standards could incentivize compliance and potentially help enable a more proactive enforcement model. In Revisiting the Governance of Privacy, Contemporary Policy Instruments in a Global Perspective, Colin Bennett and Charles Raab propose, "In domestic and international arenas, standards could fill important gaps in the enforcement regime, relieve regulators of compliance work and serve as credible methods of certification for transnational transfers of data."Footnote 42

We therefore propose the following:

Incentivize the use of standards and codes

  • The development of codes of practice, accreditation/certification schemes and standards could be encouraged through formal recognition in PIPEDA as a means of demonstrating due diligence with regard to compliance with certain provisions of the Act.
  • This could be further supported by providing the Minister of ISED, as part of his responsibilities for administration of the Act, with broad regulation-making authority related to accreditation/certification schemes.
  • Validation of codes or certification mechanisms could be achieved through recognition by the OPC and serve as a mitigating factor in the event of investigations or enforcement action.

Considerations and questions:

  • Certification, codes of conduct and standards could be used as mechanisms to enhance international interoperability and coherence across substantially similar legislation. However, there are two potential downsides to certification and codes, in particular: typically, these are expensive, and without appropriate oversight, they can be at best meaningless and at worst deceptive.
  • To address the concern about oversight, such schemes, if officially provided for under PIPEDA, could offer a route for proactive oversight by the OPC.
    • Would a certification mechanism for companies to proactively demonstrate compliance with PIPEDA, with the OPC being given authority to periodically review an organization's adherence to the certification scheme or code, be welcomed by both organizations and consumers?
    • What role could other organizations play, for example, the Standards Council of Canada?
    • How could such bodies co-operate?

Part 3: Enhancing Enforcement and Oversight

Issue:

There is a growing viewFootnote 43 that PIPEDA's ombudsman-based enforcement model, which relies largely on recommendationsFootnote 44, naming of organizations in the public interest, and recourse to the Federal Court to effect compliance, is outdated and does not incentivize compliance, especially when compared to the latest generation of privacy laws. The current state of affairs cannot continue; meaningful but reasoned enforcement is required to ensure that there are real consequences when the law is not followed.

Why is this an issue?

The consequences for organizations that do not comply with PIPEDA are currently limited. Following an investigation, the Privacy Commissioner can make recommendations, enter into a compliance agreement with an organization or pursue the matter in Federal Court, where there will be a de novo hearing. Should there be recommendations at the end of an investigation, an organization will likely incur costs associated with implementing them. If the Commissioner makes his findings public and names the organization, there may be some negative attention, which can impact the bottom line. However, while the Commissioner's recommendations are often followed, this is not always the case. Indeed, a recent privacy incident has its roots in a complaint investigated by the Privacy Commissioner 10 years ago; the Commissioner found that the earlier recommendations had not been fully followed or implemented, and the behaviour that was offside the law continuedFootnote 45. The lack of consequences for egregious behaviour has been noted as being unfair to others in the economy, and is unacceptable. Good players, who seek advice, make improvements, and who, in short, spend money to ensure compliance, may welcome a more level playing field.

The possibility of stronger financial consequences for organizations that are offside of the law would incentivize them to take measures to comply. There are some indications, based on the response of organizations when breach reporting became mandatory in 2018, and with it the possibility of fines for willfully failing to report or to keep records of breaches, that the threat of financial penalties causes organizations to pay attention. Likewise, when the GDPR came into force, much of the media coverage and discussion in various fora centred on the substantial fines that could accrue to organizations found offside of the lawFootnote 46. While the United States is still considering a federal privacy law covering the private sector, the Federal Trade Commission (FTC) has negotiated a number of settlements under its current (and more limited) privacy rules. Closer to home, some of the Privacy Commissioner's provincial counterparts (including all three who oversee provincial private-sector privacy law) have order-making powers. Should Bill C-58 pass, the Information Commissioner, an Agent of Parliament like the Privacy Commissioner, would also have order-making power.

It should also be recognized, though, that non-compliance can sometimes be the result of a lack of clarity or certainty in terms of organizations' obligations under the Act. Organizations may want to comply but have difficulty understanding what they need to do in certain circumstances. Our proposals to address this are outlined further in Parts 2 and 4.

While the current model largely emphasizes mediation and negotiation, as well as education, to achieve compliance objectives, high-profile and significant incidents involving unexpected uses of personal information, as well as breaches, are eroding confidence in the digital economy and raising privacy concerns. Now is the time to strengthen Canada's privacy framework to ensure that Canada's federal private-sector privacy regime does not fall further behind.

Possible options — enhance the Commissioner's powers

The Commissioner currently has a range of investigation powers, including the ability to compel evidence, administer oaths, enter premises, examine documents, and interview witnesses. He or she may initiate an investigation on his or her own, where there are reasonable grounds, and he or she can accept complaints from any member of the public. At the end of an investigation, the Commissioner can issue a report with recommendations or enter into a compliance agreement. He or she may also take a matter to the Federal Court at the end of an investigation (there is no recourse to the Court in cases where the Commissioner initiated the complaint). The Court can then order remedies or award damages. The Commissioner may also conduct an audit if the Commissioner has reasonable grounds to believe that the organization has contravened a provision of the Act or Schedule 1. The Commissioner has no recourse to the Federal Court at the end of an audit.

Apart from these investigatory powers, PIPEDA requires the Commissioner to educate organizations and individuals on the Act, conduct research, develop guidance and encourage the development of codes of practice.

An effective enforcement regime generally involves activities related to four key componentsFootnote 47:

Figure: Effective compliance (text version)

Education/Outreach

  • Develops guidance and educates the public and businesses on the Act and their related rights and obligations

Investigation and audit

  • Investigating complaints
  • Self-initiating complaints and/or auditing

Advance/Proactive Advice and Dialogue

  • Offering advice and guidance to organizations on proposed business models and the implementation of new technology

Tools to address non-compliance or offences

  • Administration of penalties, damage awards and fines for offences or to assist in correcting non-compliance

We therefore propose modernizing Canada's private-sector privacy law to incentivize compliance by multinationals and small- and medium-sized enterprises (SMEs) alike, by:

A. Education/Outreach:

  • Maintaining the Privacy Commissioner's education and awareness mandate to provide high-quality guidance on complex matters that are covered by PIPEDA as well as its ongoing implementation.
  • Extending the Minister's existing authority under PIPEDA to request that the Privacy Commissioner undertake research, so that privacy themes relevant to the Minister's mandate can be included in the Commissioner's research and guidance development. This may allow for greater clarity for industry on emerging issues and place the work within a broader policy role for the Department.

B. Investigation and Audit:

  • Providing increased discretion to the Commissioner on whether to investigate complaints, and allowing for consideration of adherence to standards, certification or codes of practice in making decisions to investigate.
  • Providing for increased flexibility for auditing or reviewing organizations (related to the proposed option, outlined in Part 2, to give the OPC authority to periodically review an organization's adherence to the certification scheme or code).
  • Exploring options and mechanisms to provide for increased cooperation and information sharing with other enforcement agencies.

C. Tools to address non-compliance or offences:

  • Providing the Privacy Commissioner, in the context of its investigation and audit functions, with circumscribed order-making power in the form of cessation and records preservation orders. These powers may be used by the Commissioner to halt collection, use or disclosure of personal information by a non-compliant organization. With respect to cessation orders, the use of such orders can be further circumscribed to situations where the non-compliance has caused or is likely to cause a risk of harm or significant distress to an individual. This would provide a strong tool that the Commissioner can deploy to protect individuals when organizations put them and their personal information at risk.
  • Extending the existing regime for finesFootnote 48 to other key provisions of the Act, in particular the consent requirements, data safeguard requirements, and the requirements limiting use, disclosure and retention. This involves the Privacy Commissioner referring matters of concern to the Attorney General of Canada for investigation. New obligations pertaining to deletion and data mobility would be considered key provisions of the Act, and subject to fines.
  • Substantially increasing the range of fines that are tied to offences under the Act, and providing for a scheme that identifies the mitigating and aggravating factors that should be considered, including adherence to codes, certification or standards. Scalable enforcement that takes into account the impacts on SMEs may need to be considered.
  • Further empowering the Court to order statutory damages for certain contraventions. PIPEDA could be amended to prescribe a range of damage awards, setting out minimum and maximum amounts for contraventions of specific provisions.

D. Advance/Proactive Advice

  • Reviewing the Office of the Privacy Commissioner's current proactive advisory activities to ensure that such activities can support new or emerging business models and technologies. Given that privacy is only one of many elements to consider when developing innovative new business models, it may be preferable to have multi-stakeholder dialogue and approaches to support codes of practice and standards in such areas. This could be leveraged through ISED's access to stakeholder communities.

Considerations and questions:

  • For business, this approach provides a level of certainty (in terms of the nature of orders that can be issued) and some incentives to comply with the law (cessation orders and the possibility of fines). For individuals, this approach also provides certainty, as well as some stronger tools to protect their rights, particularly when a cessation order is made.
  • This approach allows the Commissioner to make more efficient use of resources.
  • The proposal more closely aligns with enforcement regimes of other jurisdictions, in particular the provinces and the EU.
  • Further examination of other options (some proposed by the Privacy Commissioner, such as Administrative Monetary Penalties or AMPs) is needed, as they may have impacts on the machinery of government. The goal is to ensure that incentives are strong, that SMEs are not unfairly penalized, and that individuals' personal information is appropriately protected.
    • If an AMP regime is introduced, would this mechanism need to be mediated by a third party (for example, by a tribunal)?
    • Will offence-related fines be a strong enough incentive given that the current fining powers have never been used in the nearly 20 years of the law's existence and that they are outside the control of the Privacy Commissioner?
    • How do we ensure administrative fairness in a model with stronger enforcement mechanisms and an education/advisory mandate?
    • Given the significance of impacts on individuals in the digital and data economy, what should be the appropriate range of statutory damages (and conditions) to be included in the Act?
    • How could codes of practice (and possibly certification schemes) be part of a renewed enforcement scheme that incentivizes organizations, gives greater certainty and brings transparency and accountability to practices, to the benefit of all stakeholders in the data economy?

Part 4: Areas of Ongoing Assessment

Issue:

PIPEDA is a principles-based and technology-neutral law. These are its strengths and should remain. However, it has been criticized for, among other things, being inaccessible to individuals and organizations, especially small- and medium-sized organizations, due to its complex structure. Evolving business models, and the numerous players involved, mean that the scope of application of the law should be examined in order to ensure that Canadians are protected, that businesses have a level playing field, and that accountabilities — along with the responsibilities they entail — are appropriately apportioned.

Why is this an issue?

Clarity of obligations

PIPEDA is a creature of compromise, and its drafting reflects that. When PIPEDA was enacted, it followed a number of years in which there was pressure on government and industry to act with respect to personal information protection. The passage of EU Directive 95/46/EC, the burgeoning e-commerce industry, and growing concerns among Canadians about how their personal information was being used led to consideration of how best to proceed to ensure trust in the economy and to support Canada's trade goals. Industry, preferring self-regulation, along with representatives of consumer groups, academia and government, developed the CSA Model Code, which contained 10 principles of privacy protection, based on the OECD Privacy Guidelines. The federal government ultimately decided, however, that self-regulation was not enough, and moved to legislate. In doing so, it chose to incorporate the Model Code into the Act, without changes to the language, given that the Code had represented a consensus among industry, consumer protection advocates, academics and government participants. It was of the view that this would be the most effective and expeditious way to act in a relatively short timeframe.

Although praised for being principles-based and technology-neutral, PIPEDA has been criticized for being difficult to understandFootnote 49. Having rights and obligations contained in Schedule 1, instead of in the body of the law, cast in non-legal language, and mixing obligations with best practices ("shall" v. "should"), has posed challenges for individuals and organizations, as well as the courts.

As a result, it is difficult for individuals to challenge organizations' compliance, and for organizations to understand their obligations. Moreover, PIPEDA applies to other issues, apart from privacy (the Electronic Documents part of its name, or Parts 2 to 5, specifically).

The Government has supported a number of initiatives to enhance digital literacy. To assist with this, we are proposing redrafting the law to set out personal information protection rights and requirements in a manner that is easier for all to understand.

Scope of application, accountability

Since PIPEDA was enacted, new business models and types of organizations have emerged which are not traditionally acting as "controllers" or "processors". As the business environment continues to evolve and more players appear (for example in the Internet of Things or AI environments), the applicability of the Act should be updated and clarified, including in the context of transborder data flows.

A growing number of organizations and entities are engaging in non-commercial data collection activities. While these activities are not covered under PIPEDA, it might be appropriate to assess the relevance of extending PIPEDA to these activities.

Ensuring that the Act properly applies to the various players is also particularly important when considering accountability and the need for privacy management programs that include flexible risk assessment processes, including Privacy by Design.

Considerations and questions

  • It will be important to maintain the principles-based and technology-neutral approach. Alberta and British Columbia's respective Personal Information Protection Acts offer solid examples of maintaining that approach in the text. The nuances — the respect for context, individuals' expectations and the overall emphasis on reasonableness — should remain.
    • What are the risks and benefits of these important "house-keeping" measures?
    • Are there others?

Next steps

Discussions that result from this paper will inform the development of options for legislative reform.

Annex A: Overview of marketplace policy conceptual framework

The conceptual framework outlines a complete policy approach that focuses on the whole of the marketplace. Such an approach is not limited to legislative and regulatory reform, but also creates incentives for industry-led standards and codes, while supporting progressive international agreements that address digital commerce considerations.

Annex A: Text version

The figure illustrates a policy approach, listing activities focused at the domestic level at the bottom and moving up to activities at the global level. An arrow on the side of the figure illustrates the movement from domestic to global activities. From top to bottom, the activities are:

  • International Trade
  • Standards and Codes / Data Trusts
  • Sector specific guidance / activity-based regulations (for example — connected and automated vehicles)
  • Legislation of General Application: Privacy & Data protection, Competition & IP statutes, Other marketplace policy statutes