Leveraging Data Governance to Manage Diversity, Equity, and Inclusion (DEI) Data Risk

By Mark Milone

This is part 2 of a two-part article. Find part 1 here. In this section, we discuss how to leverage Data Governance to manage DEI data risk.

In part 1 of this article, we identified several best practices for data that chief diversity officers can leverage to support data-driven approaches to diversity, equity, and inclusion (DEI). As discussed in part 1, risk arising from DEI data and metrics is a significant roadblock for advocates. Now we turn to the practices that organizations use to mitigate data risk and demonstrate how chief diversity officers can more effectively partner with data professionals to advocate for data-driven DEI.

DEI Data Compliance

DEI data risk introduces the next Data Governance concept we must consider: compliance. Compliance is the process we use to ensure that DEI data is collected by (and made accessible to) only the right people. To demonstrate compliance, DEI data stewards must formally delegate responsibility to data producers for classifying the data they collect based on its sensitivity. Depending on the classification approach adopted by the organization, DEI data will probably be classified as “restricted,” which is applied to highly sensitive data regarding customer or internal business operations. Next, stewards delegate responsibility to platform managers for administering access controls based on data risk assessments and data sensitivity classifications. 
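
How classification feeds access control varies by platform, but the basic logic is simple. Below is a minimal Python sketch, assuming a hypothetical four-tier classification scheme and a role-to-ceiling policy table that are illustrative only; in practice these rules live in your data platform's access-control layer rather than in application code.

```python
from enum import IntEnum

class Sensitivity(IntEnum):
    """Hypothetical classification tiers; adapt to your organization's scheme."""
    PUBLIC = 1
    INTERNAL = 2
    CONFIDENTIAL = 3
    RESTRICTED = 4  # DEI data will typically land here

# Hypothetical policy table: the highest sensitivity each role may access.
ROLE_CEILINGS = {
    "data_consumer": Sensitivity.INTERNAL,
    "hr_analyst": Sensitivity.CONFIDENTIAL,
    "dei_steward": Sensitivity.RESTRICTED,
}

def can_access(role: str, dataset_sensitivity: Sensitivity) -> bool:
    """Grant access only if the role's ceiling covers the dataset's classification."""
    ceiling = ROLE_CEILINGS.get(role, Sensitivity.PUBLIC)
    return dataset_sensitivity <= ceiling

# A DEI metrics table classified as RESTRICTED is visible to the steward but not to a general analyst.
assert can_access("dei_steward", Sensitivity.RESTRICTED)
assert not can_access("hr_analyst", Sensitivity.RESTRICTED)
```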

Lastly, stewards delegate responsibility to data consumers for protecting the data they use based on sensitivity classifications and any other DEI compliance obligations. Remember that a significant portion of DEI risk is attributable to the actions people take based on the metrics, rather than the metrics themselves. This risk can grow with the notes DEI data consumers take, the emails they send, and the discussions they lead. Anyone with access to diversity metrics must be sensitized to these risks through proper training. Your protocols should also include guidance on recognizing when information should be shared only verbally, rather than in writing.

DEI Data Democratization 

In organizations with a healthy data culture, the counterpart to compliance is data democratization. Democratization is the ability to make data accessible to the right people at the right time in compliance with all relevant legal, regulatory, and contractual obligations. Leaders delegate responsibility to stewards for driving data culture by democratizing data so that high-quality data is available to the enterprise in a compliant manner. Such democratized data enables frontline action by placing data into the hands of people who are solving business problems. Stewards democratize data by eliminating silos and moving past the inertia that develops around sensitive data sources. 

An essential aspect of democratization, therefore, is compliance. Stewards will not be able to democratize data without a clear ability to assess and manage risk associated with sensitive data. That said, it is critical that DEI advocates limit democratization of DEI data, especially at the outset of their project or program. Legal and compliance SMEs are likely to require that DEI advocates implement controls that mitigate compliance risks before any data is processed on data platforms or used to make decisions. This is because any visualizations and compilations of DEI data will be discoverable in the event of litigation and plaintiffs will do their best to take such work product out of context. Another essential aspect of a DEI initiative, therefore, is clear communication of how risks will be managed and compliance obligations satisfied prior to democratization. As we will see, it also makes sense to explicitly limit democratization until sufficient levels of data maturity have been reached.

DEI Data Protection Impact Assessment

Assessment is a key activity for demonstrating compliance and enabling controlled democratization of DEI data. To understand why, we need to dig a bit deeper into evolving privacy laws. Outside the United States, many nations have developed comprehensive data protection laws that must be considered for data-driven DEI. The European Union (EU), for example, addresses employee privacy within its General Data Protection Regulation (GDPR), which applies broadly to the protection of individuals’ personal data. The EU’s approach has also taken root in the United States, specifically in states such as California, Colorado, and Virginia. [1] Due to its relative maturity, the GDPR is a useful benchmark for assessing data practices.

Under comprehensive privacy regimes, DEI data is likely to constitute a “special category” of personal data that warrants additional protections because its processing poses a high risk to the rights and freedoms of individuals. [2] Although the terms differ among jurisdictions, the basic concepts are substantially similar, and these special categories of data require assessments referred to as Data Protection Impact Assessments (DPIAs). DPIAs are required any time an organization begins a new project that is likely to involve such elevated risks. [3] Examples of conditions that trigger a DPIA include processing personal data related to racial or ethnic origin, religious or philosophical beliefs, health, or a natural person’s sex life or sexual orientation. Even where the high-risk standard is not met, it may still be prudent to conduct a DPIA to minimize liability and ensure that best practices for data security and privacy are followed. DPIAs are an example of the “protection by design” principle in privacy governance, and DEI advocates should leverage these concepts when gathering support for their initiatives.

Because it is highly likely that DEI projects will trigger the need for a DPIA, DEI advocates should partner with privacy professionals to assess and document the following (see the sketch after this list):

1. Lawful Processing: a systematic description of the likely processing operations and the purposes of the processing including the legitimate interest pursued by the controller

2. Necessity/Proportionality: an assessment of the necessity and proportionality of the processing operations in relation to the purposes

3. Data Subject Rights: an assessment of the risks to the rights and freedoms of data subjects
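
For teams that want to capture these three elements consistently, here is a minimal sketch of a DPIA record as a Python dataclass. The field names are illustrative only and are not drawn from the GDPR or any particular template; your privacy team will have its own format.

```python
from dataclasses import dataclass, field

@dataclass
class DPIARecord:
    """Illustrative structure mirroring the three elements listed above."""
    project_name: str
    # 1. Lawful processing: what is processed, for what purpose, on what basis.
    processing_description: str
    processing_purpose: str
    legitimate_interest: str
    # 2. Necessity/proportionality: why this processing, and no more, is required.
    necessity_assessment: str
    proportionality_assessment: str
    # 3. Data subject rights: risks identified and the mitigations adopted.
    risks_to_data_subjects: list[str] = field(default_factory=list)
    mitigations: list[str] = field(default_factory=list)

    def is_complete(self) -> bool:
        """A simple completeness check before the record goes to legal review."""
        return all([self.processing_description, self.processing_purpose,
                    self.necessity_assessment, self.proportionality_assessment,
                    self.risks_to_data_subjects, self.mitigations])
```

A record like this can be versioned alongside the code and queries that process the data, which keeps the assessment visible to the whole team.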

In addition to lawyers, privacy professionals will be another crucial part of your team. This brings us to the last Data Governance practice we will consider.

DEI Data Teams

Data professionals whose roles encompass Human Resources (HR) are familiar with the many legal issues that arise across the employment lifecycle. In addition to consulting with the legal and information technology (IT) departments, data professionals keep in close contact with HR experts, and these relationships should be leveraged to support DEI initiatives. HR professionals have mature practices for handling confidential information that will be important for managing DEI data risk.

HR professionals often receive detailed training about how to collect information relevant to employment decisions while avoiding practices that increase the risk of an antidiscrimination claim. Company policies that prohibit discrimination often provide more detailed guidance about which interview and background screening practices are permitted. One common strategy to reduce risk is to avoid questions that elicit information about membership in a protected class. Another is to be consistent and ask the same questions of all candidates. Unfortunately, these strategies may be at odds with data-driven DEI objectives, and long-standing HR policies may create reluctance to adapt practices for new use cases like DEI. This is where forming the right data team is critical to the success of DEI projects.

The first step in standing up the right team is securing the understanding and support of leadership. Prior to any collection and analysis of DEI data, advocates need buy-in from senior leadership and a budget to address the problems that data and metrics reveal. You do not necessarily need a perfect solution, but you will need to act promptly as the data reveals insights into your operations so that you can mitigate foreseeable risks. As such, it makes sense to start small by launching a pilot, which allows responses and interventions to be fine-tuned iteratively before rolling the program out more broadly. With that support in place, you can turn to the data team itself.

One key value delivered by mature Data Governance is the ability to quickly stand up teams to address data challenges. A data team is a working group of stewards, data architects, lawyers, and other SMEs who develop the administrative, technical, and physical controls that make it easier to find, use, and understand data. Data team operations (sometimes referred to as DataOps) trace their origins to two software engineering practices: continuous integration (CI) and continuous delivery (CD). Data teams use these practices to improve data quality and mitigate data risk, which reduces the end-to-end cycle time of data analytics and accelerates the creation of new data sets, data assets, and models.
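
To make the CI/CD analogy concrete, the sketch below shows the kind of automated data-quality check a data team might run whenever a pipeline changes. The column names and thresholds are hypothetical placeholders, not requirements from any framework.

```python
import pandas as pd

# Hypothetical expectations for a self-identification extract; adjust to your schema.
REQUIRED_COLUMNS = {"employee_id", "survey_wave", "self_id_response"}
MAX_NULL_RATE = 0.05

def check_dei_extract(df: pd.DataFrame) -> list[str]:
    """Return a list of data-quality failures; an empty list means the check passes."""
    failures = []
    missing = REQUIRED_COLUMNS - set(df.columns)
    if missing:
        failures.append(f"missing columns: {sorted(missing)}")
    if "employee_id" in df.columns and df["employee_id"].duplicated().any():
        failures.append("duplicate employee_id values")
    if "self_id_response" in df.columns:
        null_rate = df["self_id_response"].isna().mean()
        if null_rate > MAX_NULL_RATE:
            failures.append(f"null rate {null_rate:.1%} exceeds {MAX_NULL_RATE:.0%}")
    return failures
```

Wired into continuous integration, a check like this stops a bad extract before it ever reaches a dashboard, which is precisely the cycle-time benefit DataOps borrows from CI/CD.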

Best practices for DataOps direct organizations to start “business backwards, not data forward” and follow several key steps to stand up the right team:

  1. Identify a few impactful, visible, relatable opportunities. 
  2. Create a starting hypothesis. 
  3. Build a cross-functional team.  
  4. Use your platform to start running experiments (see the sketch after this list). 
  5. Make sure the insights are acted upon by enabling frontline action (subject to the limitations discussed). 
  6. Seek feedback, measure, and refine. 
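
As a concrete illustration of step 4, the sketch below runs a simple experiment against a hypothetical pilot extract with `group` and `promoted` columns; the column names, statistical test, and interpretation are illustrative assumptions, not prescriptions.

```python
import pandas as pd
from scipy.stats import chi2_contingency

def promotion_rate_gap(df: pd.DataFrame) -> tuple[pd.Series, float]:
    """Compare promotion rates across groups; return the rates and a chi-square p-value."""
    rates = df.groupby("group")["promoted"].mean()
    contingency = pd.crosstab(df["group"], df["promoted"])
    _, p_value, _, _ = chi2_contingency(contingency)
    return rates, p_value

# Example usage with a hypothetical pilot DataFrame:
# rates, p = promotion_rate_gap(pilot_df)
# A small p-value suggests the observed gap is unlikely to be noise and should be
# escalated to the steward and legal counsel before anyone acts on it.
```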

DEI advocates should work with data professionals and attorneys to form the right team, identify the right opportunities, and tailor the right message to leadership to gather support. This will ensure that pilot programs are overseen by a cross-disciplinary team that has a clear mandate and specific goals. This will also help you to build your business case, determine what documents can be protected by privilege, and persuade key stakeholders to support your DEI project. 

A well-trained DEI steward should lead the team with the authority to oversee the proper collection and use of sensitive information. The steward will work with the team to document who is authorized to collect, process, and analyze DEI data. The steward will also establish a procedure for adding new members to the team. The steward should retain authority for final approval prior to any sharing of sensitive information outside the team. Team members should be put on notice that any violation of the protocol may lead to disciplinary action. Using these mature data practices is the best path to a sustained series of small, incremental improvements that achieve DEI goals.
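
One hypothetical way to operationalize the steward's authority is to keep the team roster and sharing approvals in a simple, auditable structure, as in the sketch below; the class and method names are illustrative, not a prescribed tool.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class DEITeamRegistry:
    """Tracks who may handle DEI data and logs steward-approved external shares."""
    steward: str
    members: set[str] = field(default_factory=set)
    share_log: list[dict] = field(default_factory=list)

    def add_member(self, name: str, approved_by: str) -> None:
        # Per the protocol above, only the steward may add team members.
        if approved_by != self.steward:
            raise PermissionError("only the steward may add team members")
        self.members.add(name)

    def approve_share(self, recipient: str, description: str) -> None:
        # Every share outside the team is logged with the steward's approval and the date.
        self.share_log.append({
            "recipient": recipient,
            "description": description,
            "approved_by": self.steward,
            "date": date.today().isoformat(),
        })
```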

[1] See the California Privacy Rights Act (CPRA) and Virginia Consumer Data Protection Act (VCDPA), which became effective January 1, 2023, and the Colorado Privacy Act (CPA), which will become effective July 1, 2023.

[2] See, e.g., GDPR Article 9, stating that “[p]rocessing of personal data revealing racial or ethnic origin, political opinions, religious or philosophical beliefs, or trade union membership, and the processing of genetic data, biometric data for the purpose of uniquely identifying a natural person, data concerning health or data concerning a natural person’s sex life or sexual orientation shall be prohibited. Paragraph 1 shall not apply if one of the following applies: the data subject has given explicit consent to the processing of those personal data for one or more specified purposes, except where Union or Member State law provide that the prohibition referred to in paragraph 1 may not be lifted by the data subject.”

[3] For example, according to Article 35 of the GDPR: “[w]here a type of processing in particular using new technologies, and taking into account the nature, scope, context and purposes of the processing, is likely to result in a high risk to the rights and freedoms of natural persons, the controller shall, prior to the processing, carry out an assessment of the impact of the envisaged processing operations on the protection of personal data.”