The European Union’s General Data Protection Regulation (GDPR) and the California Consumer Privacy Act (CCPA) / California Privacy Rights Act (CPRA) are groundbreaking privacy regulations that aim to give individuals greater control over how companies collect, manage, and use their personal data. Yet the first few years since these regulations took effect have given rise to a variety of challenges and opportunities as organizations have worked to put the laws into practice, regulators have worked to enforce them, and individuals have tried to exercise their rights.
In July 2022, the UC Berkeley Center for Long-Term Cybersecurity (CLTC) hosted its second annual research symposium to examine how different stakeholders — including firms and consumers — have been affected by and responded to the GDPR and CCPA/CPRA.
A key goal of the symposium was to provide a space for scholars and practitioners to share insights across an emerging body of empirical studies of privacy and data protection regulations. The papers presented were chosen following CLTC’s request for proposals disseminated in late 2021, which called for research papers that look at effects and responses in organizational structure and business models, technical design and development processes, political engagement and contestation, social norms and behavior, and rulemaking and enforcement.
The working papers are briefly summarized below, along with high-level takeaways and future directions.
Alice in Wonderland: Challenges of Data Compliance for Startups — The Case of Serbia
In this paper, Branka Anđelković and Jelena Šapić explore the impact of data protection regulations on data-dependent start-ups in Serbia. Based on a qualitative survey of local start-ups, as well as semi-structured interviews with selected start-ups, legal experts, and venture capital firms active in Serbia, the paper argues that failing to adopt data management practices aligned with the GDPR and CCPA exposes local start-ups to a variety of vulnerabilities and risks that can result in lost business opportunities. The authors found that the GDPR and, to a lesser extent, the CCPA affect start-ups searching for business opportunities, regardless of industry. The two laws differ in scope, depth, liability, and penalties, with the GDPR exerting relatively more influence than the CCPA; even so, the costs and challenges of complying with the CCPA somewhat dampened Serbian start-ups’ desire to enter the California market.
The authors find that start-ups have internalized the weight of the GDPR and have begun working to meet its requirements as early as possible. “GDPR has a high impact on startups from Serbia,” the authors conclude. “Even if they do not belong to the industry which is under the direct focus of GDPR such as blockchain or gaming, they are exposed to GDPR depending on which data they collect and how they process them. In this regard, GDPR imposed itself as an omnipresent regulation for Serbian startups interested in the EU market… These costs are not just associated with fines (which none of the interviewed startups encountered) but also with potentially lower investments into their business and related exits. In sum, the attractiveness of the EU market compels startups to adopt GDPR rules regardless of the operational costs.”
From Policy to Pixels: Strategic UX Design and User Support for GDPR Implementation
In this paper, Ame Elliott and Susan Kennedy of Superbloom, formerly Simply Secure, focus on the development of “cookie banners,” which websites have widely implemented to satisfy the GDPR’s requirement that users be notified when they may be tracked via cookies. Through interviews with designers and front-end developers who have experience creating cookie banners, the researchers learned that companies often focus on meeting the legal requirements of the regulation rather than using design to ensure consumers are fully informed and empowered to protect their privacy.
The paper introduces a preliminary framework of “personas” for capturing a range of attitudes in how businesses approach cookie development and GDPR compliance and implementation broadly. Among the personas are “Ambivalent Non-Deciders,” who take a laissez-faire attitude to implementation, and “Liability Avoiders,” who are primarily concerned with ensuring their text complies with the law. “Front-end implementers (designers and developers) and legal teams need to work more collaboratively with each other in creating cookie banners that are easy to navigate, read, and comprehend — and don’t simply ‘comply’ to the letter of the law,” they wrote. “Equipping people without specialized domain knowledge or personal passion for privacy to participate in discussion about GDPR implementation is essential for shifting the status quo and making informed consent a reality.”
Encoding privacy? How tech workers shape privacy regulations
When data protection regulations are passed from the policy arena to technical teams, how do tech workers’ attitudes and experiences shape how such policies are implemented? This question was at the heart of a paper by Rohan Grover, who conducted semi-structured interviews with tech workers about their experiences implementing technical requirements for GDPR and CCPA.
Grover found that technologists responsible for bringing their websites into compliance with the privacy regulations — for example, by implementing cookie consent notices, purging data, or updating data infrastructure — were primarily interested in complying with the “spirit of the GDPR/CCPA,” and they regarded the privacy regulations as “less serious” than other laws, such as the Health Insurance Portability and Accountability Act (HIPAA). Engineers described having high levels of autonomy and minimal oversight over their work; they noted that lawyers were often not available to help them, and that there was a lack of auditing or follow-up to ensure high-quality compliance. “Overall, this study validates the importance of understanding not only the behavior of developers and other tech workers when implementing privacy features, but also their attitudes toward that work — especially in the case of data protection regulations,” Grover concluded.
Do Data Breach Notification Laws Work? A Staggered Synthetic Control Approach to FTC Panel Data 2000–2020
In his paper, Aniket Kesari introduces a new taxonomy to understand data breach notification laws in the U.S., which have been enacted in all 50 states over the past 20 years. Kesari found “rich variation” across states’ laws, including “differences in disclosure requirements to regulators and credit monitoring agencies; varied mechanisms for public and private enforcement; and a range of thresholds that define how firms should assess the likelihood that a data breach will ultimately harm consumers.”
Using a method called “staggered synthetic control” — a popular method in the social sciences for conducting data-driven comparative case studies — Kesari found that whether data breach notification and identity theft laws work depends on which legal provisions are employed. “The most effective provisions are those that require disclosure to state regulators and those that apply breach notification requirements to encrypted data,” Kesari wrote. “As more states create regulatory bodies [similar to the California Privacy Protection Agency (CPPA)], scholars would benefit from paying attention to how these agencies approach data breach and other areas of privacy, in addition to examining how the FTC will regulate privacy under the Biden Administration. State-level privacy regulation may prove to be an important source of regulatory innovation.”
Translating Data Protection into Practice: Exploring Gaps in Data Privacy and Regulation within Organizations
Privacy laws risk being watered down as they are translated into practice. This paper by Jillian Kwong explores gaps in the US regulatory environment by taking a closer look at organizational responses to data protection mandates, and particularly the “human actions and behavior taking place within small to medium sized organizations within the United States.”
Based on interviews with experts from across the law, privacy, and security fields, Kwong’s paper uses a “communication lens” to understand the breakdowns that occur at the intersection of policy, law, business, management, and organizational behavior at small- to medium-sized enterprises (SMEs). She examines how terms are used differently in the U.S. and Europe; for example, the U.S. has defined and codified laws in terms of data privacy (rather than data protection, as in the European Union), which “has major implications for how these laws converge and diverge in practice.” Kwong concluded that more research should be done to understand the perspectives of data privacy and security practitioners in SMEs: “this community has vital insights and expertise about the resiliency of organizations, specifically how data gets protected in practice, which can greatly inform the development and implementation of future privacy and cybersecurity strategies.”
Investigating Dark Patterns in GDPR’s Legitimate Interest
Deceptive designs, or “dark patterns”, are user interfaces that lead users into making decisions that benefit the online service, often through subtle deception; an “elaborate dark pattern” is a dark pattern that extends beyond the user interface. In this paper, Lin Kyi, Asia Biega, and Franziska Roesner examine how firms use a form of elaborate dark pattern to justify processing data based on the GDPR’s allowance that data can be processed if it is “in the legitimate interests of the data controller or another third party.” “We are especially interested in seeing potential elaborate dark patterns that emerge from legitimate interest because it allows for broad interpretations, does not require user consent for data collection, and the lack of legal oversight allows for it to be taken advantage of by data controllers as a loophole for dubious data practices,” the researchers wrote. “We believe the term, ‘legitimate interest’ and its usage is vague; there is a lack of guidance on what an actual legitimate interest that we all can agree should not require user consent is, and a ‘legitimate’ interest that is exploited by data controllers.”
The authors used quantitative and qualitative methods to analyze the wording and designs of 10,000 websites’ consent notices, with the goal of understanding how legitimate interest is used by data controllers. “From our preliminary analysis of the consent notices, we found that around 643 websites mention using legitimate interest as a ground for collecting data in their consent notices,” they wrote. “Upon further analysis, we found that more popular websites were more likely to collect data under legitimate interest grounds, and more likely to have complicated consent banners…. Overall, our findings, along with those of previous findings suggest that privacy laws, such as the GDPR and ePrivacy Directive, should impose more explicit guidelines about designing to avoid elaborate dark patterns so that users are not nudged into making poor privacy decisions.”
Over the Shoulder Enforcement in European Regulatory Networks: The role of arbitrage mitigation mechanisms in the General Data Protection Regulation
The European Union enforces the GDPR through “regulation by networks,” as a network of national regulatory bodies implements regulations across member states. This paper by Abraham Newman examines how firms may be taking advantage of this enforcement approach to “arbitrage” their data practices across nations and shirk the requirements of the GDPR. He also explores how a novel governance tool — arbitrage mitigation mechanisms — may address some of these issues. These mechanisms are “formal institutional procedures, which attempt to generate equivalent application of EU law for cross border regulation, and include horizontal checks among regulatory peers as well as vertical checks that allow civil society to voice accountability concerns.”
Newman notes that the GDPR has arbitrage mitigation mechanisms built in that can help reduce the risks of arbitrage: “Article 60, often referred to as the One-Stop-Shop provision, sets out practices to encourage cooperation between lead and other supervisory authorities, and Article 63, which is called the consistency mechanism, legalizes joint implementation. Using a register that publishes decisions made under Article 60 and cases of enforcement actions under the GDPR, we show preliminary evidence that these mechanisms may generate new forms of accountability…. Our argument forces a reassessment of the One-Stop-Shop, which we argue has been mischaracterized as a regulatory structure, which accentuates arbitrage possibilities. Our work looks to yet underexamined procedures in GDPR that may serve to enhance political contestation over and salience of privacy implementation, minimizing such failures and enhancing the civil liberties and security of European citizens.”
Defining ‘Reasonable’ Cybersecurity: Lessons from the States
U.S. states have passed a series of laws requiring “reasonable” cybersecurity, such as for manufacturers of internet-connected devices. The California Consumer Privacy Act, for example, “limits the private right of action to only those instances where the underlying business fails to maintain ‘reasonable’ security.” Yet the exact definition of “reasonable” cybersecurity reporting and operating practices remains unclear. In this paper, Scott Shackelford, Anne Boustead, and Christos Makridis summarize the state of state-level cybersecurity policymaking, with a special emphasis on how states are defining “reasonable” cybersecurity. They also present the results of a statewide survey on cybersecurity perceptions and practices among organizations in Indiana (conducted in partnership with the Indiana Attorney General’s Office) and draw on these findings to suggest how to better educate firms about, and incentivize them to institute, reasonable cybersecurity best practices.
“Our findings point to the need for an empirically grounded, flexible approach to the problem that combines a minimum (i.e., ‘common floor’) comprised of widely recognized cybersecurity best practices with sector-specific guidance and an effort to inform consumers of their rights and importance of exercising them,” they wrote. “A universal baseline standard of “reasonable” cybersecurity… is impossible to state for all circumstances, but should be thought of as a sliding scale but with certain universal precautions that all businesses, regardless of their size or sophistication, should arguably be taking that takes into account (1) the sensitivity of the information in question, and (2) utilizes cost/benefit analysis…. No single checklist or framework will protect at-risk organizations from the wide variety of cyber threats they face. Rather, each decision should be tailored to the particular cybersecurity needs of a given organization, including its functions, footprint, assets, and customer base.”
Privacy Legislation as Business Risks: How GDPR and CCPA are Represented in Technology Companies’ Investment Risk Disclosures
While much privacy research has focused on how users perceive privacy and interact with companies, this paper by Richmond Wong, Cooper Aspegren, and Andrew Chong focuses on how privacy legislation is discussed among a different set of relationships: those between companies and investors. This paper investigates how companies translate the GDPR and CCPA into business risks in documents created for investors, specifically the annual disclosure regulatory filings (Form 10-K) from nine major technology companies.
“We outline five ways that technology companies consider GDPR and CCPA as business risks, describing both direct and indirect ways that the legislation may affect their businesses,” they write. “We highlight how these findings are relevant for the broader Computer-Supported Cooperative Work And Social Computing (CSCW) and privacy research communities in research, design, advocacy, and policy. Creating meaningful changes within existing institutional structures requires some understanding of the dynamics of these companies’ decision-making processes and the role of capital. As we push for more responsible decision-making at technology companies, these representations to investors provide an important indication for how we can push for effective privacy design and policy within these companies.”
To close out the symposium, participants reflected on what research questions could be pursued in the future; what practical problems could be solved through continued research; and what resources would be needed to build a robust field of empirical studies that could serve both. Following are some of the key takeaways:
Implementation is Key: It will be important for researchers to continue to explore how privacy laws are put into practice, i.e., how they affect the day-to-day operations of a heterogeneous landscape of companies and organizations. Interdependencies exist among various actors, and this complexity should be investigated to better understand what is (and is not) working, and to support organizations as they address shared challenges. Helping small- and medium-sized enterprises (SMEs) comply may be especially important.
Impacts are Different Across Jurisdictions: While we may think of privacy laws as offering blanket rights, we need to recognize that they are often limited to residents of particular jurisdictions, and that full compliance is a moving target for organizations as laws vary across state and national borders. More work is needed to understand whether and how to reconcile and harmonize these laws across borders (e.g., the EU’s human-rights-based approach vs. the U.S. constitutional framework); this will remain a salient issue.
Organizational Context Matters: Compliance with privacy regulations is deeply situated in organizational contexts, where attitudes and interpretations of developers (those responsible for the technical implementation necessary for compliance) — and others — can vary widely. This includes small- and medium-sized enterprises (SMEs), where ambiguity in data privacy laws can create points of contention between departments with competing realities or interests. Developers play a unique role in shaping compliance work when guidelines for privacy compliance are lacking.
A Holistic Approach is Necessary: Empirical findings from the papers suggested that organizations often treat privacy as either a technical or a legal problem, leading to reliance on developers or lawyers, respectively. However, the findings suggest that organizations should instead approach privacy more holistically, encouraging cross-functional understanding and collaboration through shared vocabularies, employee training, and appropriate resource allocation. Most importantly, privacy should be woven into an organization’s culture, rather than left as the job of a few individuals or a single team.
We need to better define privacy harms: While privacy laws aim to offer enhanced privacy protections to individuals by imposing compliance responsibility on corporations, it is often the public who needs to be agile and proactive to exercise their rights under the ongoing notice-and-choice regime. Much of the privacy research has focused on examining compliance. Further research on concrete identification and understanding of privacy harms is needed. For example, it remains hard to establish commonality among individuals when privacy harms are individualized and cannot be part of a common class action.
We need to think ahead to anticipate changing technologies: While various privacy laws are emerging, technologies are fast-changing and constantly evolving. We should think ahead about how to handle newer technologies like artificial intelligence and quantum computing. Further research on anticipating future technologies and their implications for privacy could inform some open questions the group posed, including: would current privacy laws address privacy issues that emerge from future technologies; would regimes like the GDPR and CCPA/CPRA need to be adapted to meet the needs of future technologies; or would governments need entirely new privacy regimes to address privacy concerns coming down the pike?
Empirical Evidence is Now Available to Policymakers: The GDPR and CCPA have been in effect long enough to be examined through empirical research. We have entered a new juncture where actual evidence from the ground is available to policymakers. The most effective research will be holistic and interdisciplinary, and researchers should offer not only critique, but also constructive support for the effective aspects of the laws. Highlighting aspects of privacy legislation that work well helps policymakers understand the legislation’s strengths and builds a record of which parts of the policy are effective.
Overall, this latest symposium highlighted the array of challenges and questions tied to the implementation of data protection and privacy regulations. As more state and national governments take measures to regulate data protection and privacy, scholars should continue to research the on-the-ground impacts — and translate their findings into actionable ideas and recommendations for policymakers.
The Center for Long-Term Cybersecurity is proud to be playing a role in convening scholars from around the world who are beginning to tackle these vital issues, and we hope to serve as a hub for future dialogue not just among academic researchers, but also with policymakers, institutional leaders, and other key stakeholders.