Artificial intelligence (AI) – and particularly generative AI – has accelerated the race toward ever more innovative data-driven products and services, and sharpened focus on the ever-growing importance of data. As the value of data increases, legislators, regulators, and enforcement bodies are jockeying for position as referees of AI and data protection. The result is an expanding body of laws, regulations, guidance, and enforcement actions that create new obligations for companies concerning how they collect and use data, protect consumer rights, secure their systems, and develop and deploy data-driven technologies. Companies also may find themselves facing pointed and wide-ranging questions from regulators, the media, customers, employees, suppliers, and shareholders about their data-related practices.
The SEC has long focused on encouraging greater board oversight of cybersecurity, issuing staff guidance as early as 2011 and most recently adopting new disclosure rules in July 2023. Shareholders and proxy advisory firms have been taking notice as well, with Glass Lewis setting disclosure expectations for cybersecurity incidents that well exceed the new SEC rules starting this year. And the Delaware courts have shown increasing willingness to find that corporate boards have failed to exercise proper fiduciary duties when it comes to managing risks and disclosures. These same issues – board oversight, risk management, and disclosure – apply to a broad range of risks related to data. From data privacy to the numerous global regulatory regimes governing data to the emergence of generative AI, regulatory and reputational risk regarding data is at an all-time high.
Data is a mission-critical asset, and it is important for the company’s board of directors to exercise appropriate oversight of the company’s data governance – not merely cybersecurity, but also its regulatory and reputational risks – including by understanding potential data-related risks and setting appropriate expectations for management reporting to the board on data governance and protection. Below, we describe several key areas of emerging data-related regulation and risk that boards should know about, and measures that boards can take to help exercise appropriate oversight in this rapidly changing area.
What boards should know about emerging data-related legal risks
Data-related risks are expanding under multiple areas of law, including (1) emerging regulation of the development and use of AI technologies, particularly in contexts that may significantly affect individuals; (2) cybersecurity laws that govern how companies prepare for, respond to, and disclose cyber incidents; and (3) data privacy laws that impose heightened obligations regarding the safeguarding of personal information. While all three of these areas will affect most companies, certain areas may have a greater impact based on the nature of a company’s business and data-related activities.
- AI regulation
- Regulators and legislators are actively considering new laws and regulations to mitigate potential risks and harms from the use of AI technologies.
- The White House’s Executive Order on the Safe, Secure, and Trustworthy Development and Use of Artificial Intelligence lays the groundwork for standards and reporting regimes related to the development and use of AI systems in public-sector activities.
- Meanwhile, states across the country are passing protective measures to address concerns about AI’s use in employment, financial services, healthcare, and a host of other sectors. For example, many of the new U.S. state consumer data privacy laws impose new restrictions on certain types of automated decision-making.
- The European Union is finalizing a comprehensive AI Act that will have extraterritorial effect for companies with users in Europe, along with a suite of other data-related regulations.
- The draft AI Act introduces EU-wide minimum requirements for AI systems with a sliding scale of rules based on the level of risk.
- The AI Act will affect providers, users, end-product manufacturers, importers or distributors of AI systems from inside or outside of the EU, to the extent the AI system (a) is placed on the EU market, (b) is used in a manner that affects people located in the EU, or (c) produces output that is used in the EU.
- Cybersecurity
- Publicly traded companies are now required to provide more detailed information in their SEC filings regarding their cybersecurity practices, including the board’s oversight role.
- The SEC’s cybersecurity disclosure rules require companies to provide annual disclosure in their Form 10-K about their cybersecurity governance, strategy, and risk management processes, including the board’s oversight of the company’s management of cybersecurity risks.
- The SEC’s rules also require companies to disclose any material cybersecurity incident in a Form 8-K, generally within four business days after determining the incident was material.
- The SEC has demonstrated that it will not hesitate to take action against company executives as well as the company itself for alleged fraudulent misstatements and omissions regarding the company’s cybersecurity posture, as shown in its recent SolarWinds action.
- Regulators are expanding sector-specific cybersecurity requirements.
- Companies operating in certain sectors also may be subject to specialized cybersecurity regulations, such as the Department of Homeland Security’s cybersecurity regulations for critical infrastructure sectors and the New York Department of Financial Services’ rules for companies offering financial or insurance products.
- The White House has also released a roadmap for increased cybersecurity regulation, as part of the National Cybersecurity Strategy, foreshadowing additional areas of potential cybersecurity regulation.
- Data Privacy
- Regulators are finding increasingly sharp tools to enforce privacy regulations, some of which can impose existential crises for companies.
- As a noteworthy example, the FTC has been sharpening its enforcement tool of “algorithmic disgorgement”: i.e., requiring companies, as part of an FTC consent order, to delete AI models and algorithms developed using data that the FTC determined had been improperly obtained. This type of remedy can strike a fundamental blow to a company’s core product, beyond any injunctive or financial relief that may be obtained.
- Data privacy legislation is expanding across the U.S., as well as globally, and may require fundamental changes in data-driven products, services, and business models.
- In the U.S., over a dozen states have passed comprehensive consumer data privacy laws and many other states are considering similar legislation. To date, these laws primarily affect B2C companies, but the California Consumer Privacy Act (CCPA) notably extends to HR and B2B data as well.
- These laws impose new obligations and restrictions on the handling of covered personal information, including new requirements concerning notices, consents, and opt-out rights, and new restrictions on automated decision-making. They require greater transparency to consumers, and impose significant new limitations on targeted advertising and data sharing activities in particular.
- Privacy litigation presents a growing risk for businesses, with many companies facing class action lawsuits.
- Privacy and data security breaches of personal information have resulted in massive enforcement actions, such as Equifax’s $575M settlement with the FTC, Consumer Financial Protection Bureau, and 50 states and territories arising from its 2017 data breach.
- Data security breaches also are triggering large volumes of consumer class action cases, and the CCPA’s private right of action for certain security incidents has provided even more fertile ground for this type of litigation.
What measures can boards take to help exercise appropriate oversight of the company’s data governance?
What is data governance? Like all governance, data governance involves understanding risks, managing and mitigating them, establishing an appropriate risk appetite and direction for management, implementing appropriate communication channels for internal reporting and disclosure of compliance or issues, and ensuring adequate board oversight of these risks. One element of data governance is establishing an overarching framework for management to report to the board on what risks exist, their potential scale and impact on the company, a target risk appetite, and options for mitigating, externalizing, and accepting risks. Third-party frameworks such as NIST’s Privacy Framework, Cybersecurity Framework, and AI Risk Management Framework can provide helpful resources as the company develops a data governance framework tailored to its own business and needs. Regular review of the data governance framework and risk assessment will be critical to ensuring that the company’s processes evolve in light of a fast-changing regulatory, technological, and risk landscape.
Considerations for boards: Boards should engage with management to understand how the company assesses the risks – from regulatory risks to reputational risks – related to its data collection, use, and storage. Those risks should be measured against an enterprise risk management framework. And the board should set expectations for management as to what level of risk is acceptable, allocate budget for mitigation of risks, and track the progress of management’s mitigation and remediation efforts. As in all areas of governance, boards should also set “red flag rules” for management to raise certain elevated risks to the board, such as the impact of new regulations or the risks associated with a new product line, like those involving generative AI. Increasingly, boards must determine how to establish oversight and ensure appropriate management of these risks from a multi-disciplinary perspective across the organization.
Key questions for boards:
- Does the company have a framework for measuring risks related to data, understanding controls and mitigations for those risks, and accepting residual risks?
- Does management keep the board informed regarding critical risks, including risks related to its most important “crown jewel” data, ongoing regulatory risks, and potential reputation impact of its data practices?
- Does the board understand the company’s data strategy and how data is used in its key products?
- Is data central enough to the company’s mission and success that a board committee should be assigned oversight of data governance? Has a cadence of regular reporting to the committee and the board been established? Have committee charters been updated or revised to conform to this allocation of responsibilities?
- How often does the board meet with management regarding existing, new, and emerging risks related to data, and has the board set expectations for management on data governance? Do management and the board or committee share an understanding of what level of information is necessary for the committee and board to establish effective oversight?
- Does the board have a method to conduct oversight of management’s data governance, including public reporting of materiality and risks related to cybersecurity?
- Is the board regularly informed regarding how the company is complying with new data privacy and security laws and regulations and the potential financial and reputational risks of noncompliance?
- Are the company’s CISO, chief privacy officer, and data governance team properly positioned within management so that their views are heard and their compensation appropriately incentivizes their oversight of risk mitigation?
- Has the board discussed the company’s incident response plan, how management plans to engage the board in incidents, and how it assesses materiality? When was the last time the board participated in a tabletop exercise?
A strong data governance program not only mitigates legal risks to the business, but it also can help to protect and maximize the value of the company’s investment in its data, setting the course for future growth and opportunity.