
Loopholes in DEI Data

More than half of U.S. companies now openly report on DEI data, a meaningful increase (+13%) since the beginning of 2021. It’s important to note that as of 2021, public companies are required by the SEC to report on human capital, including DEI. But while reporting on DEI data is becoming more common, how and what companies report isn’t regulated. As a result, a lot of reporting today continues to miss the mark on two important components: accountability and action. So... why does reporting still miss the mark?


Aggregates

The most common reporting we come across spans representation, promotion rates, attrition, and pay parity across gender identity and race. Companies have to resort to aggregation to avoid exposing individual data, but the persistent lack of representation across specific groups means that looking only at aggregates fails to acknowledge the experiences of less represented groups.

Gender aggregations we commonly see are limited to women and men, excluding gender identities such as non-binary or trans. Race aggregations often group everyone into “underrepresented minorities” (URMs) or “underrepresented groups” (URGs), bundling any race other than white into a single category.

There are multiple problems with this approach: it misses individual experiences and diminishes their importance, it assumes that all races in the workplace share the same experiences, and it skews representation.


How does this show up in organizations?

Let’s say a company looks at promotion rates for women. It might find that the overall promotion rate is comparable to men’s, but what about intersectional identities? Because the organization only employs a few women of any race other than white, its general DEI reporting dashboard doesn’t allow a further breakdown and misses the experiences of Black women in the organization, who have lower-than-average promotion rates.
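To make this concrete, here is a minimal sketch in Python with pandas, using entirely hypothetical numbers, of how an aggregate promotion rate can look equitable while an intersectional breakdown of the same data tells a very different story:

```python
import pandas as pd

# Hypothetical employee records for one review cycle (made-up numbers for illustration only).
records = pd.DataFrame({
    "gender": ["woman"] * 50 + ["man"] * 50,
    "race":   ["white"] * 45 + ["Black"] * 5 + ["white"] * 50,
    "promoted": [1] * 10 + [0] * 35 + [0] * 5   # women: 10 of 45 white women promoted, 0 of 5 Black women
              + [1] * 10 + [0] * 40,            # men: 10 of 50 promoted
})

# Aggregate view: promotion rates look comparable across gender (20% vs. 20%).
print(records.groupby("gender")["promoted"].mean())

# Intersectional view: the same data shows Black women were not promoted at all.
print(records.groupby(["gender", "race"])["promoted"].mean())
```

The aggregate comparison reports parity, while the grouped breakdown surfaces exactly the kind of gap Rosa’s story below describes.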

We’d like to share a real story of a Black woman who worked at a technology company for 5 years. Let’s call her Rosa. Rosa worked at a company where the average time to promotion at more junior levels ranges between 6 and 12 months. Rosa wasn’t promoted until her 4th year, despite multiple lateral moves, increased responsibility, and strong performance reviews. When Rosa’s manager flagged this with HR, HR said that Rosa had never been perceived as someone with high potential by her managers.


Throughout Rosa’s career, all 7 of her managers were white and all but one were male. Only one noticed that something seemed off, and until that new manager arrived, Rosa didn’t feel comfortable raising her concerns. While Rosa may now be in a role where her pay is equitable to the position she recently moved into, and has a manager invested in her career growth, the damage of lost career progression, and therefore lost salary growth, can’t be dismissed. Rosa’s salary grew only 70% from her starting salary, versus her peers, whose salaries grew 2x to 3x over the same timeframe. This reiterates the persistent wage gap for Black women compared to white men (64 cents for every $1 earned by white men).

Rosa’s story is still the story of many Black women in organizations, and DEI data today, unfortunately, continues to miss experiences like hers. While Rosa eventually ended up with a manager who advocated for her, more commonly we hear that people either get gaslit by their HR teams, stay silent, or ultimately leave the organization.


Skewed representation

We mentioned earlier that companies often bundle every race other than white into groups such as URGs and URMs. This kind of grouping tends to look better at a high level than it does on closer inspection. A company might have good representation of one or two sub-groups that brings up the overall number, while other groups have little or no representation at all.


How does this show up in organizations?

Let’s take another real example. An engineering function at a technology company reported on its representation by race and proudly announced that representation had improved significantly, with 41% of the engineering organization now identifying as part of a URG. While this progress is exciting, over the 10+ years the company has existed, it has had only one Black male engineer and no Black or Latine women engineers. Its representation is heavily driven by South and East Asian hires, while it still fails to attract talent from Latine and Black communities. In the overall reporting, this is easy to miss, which would also sideline any efforts to improve hiring from those communities.
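As an illustration only, with made-up headcounts chosen to land on the same 41% figure, the sketch below shows how a single URG share can hide the absence of entire groups:

```python
from collections import Counter

# Hypothetical engineering headcount by self-identified race/ethnicity (not real data).
engineers = Counter({
    "white": 59,
    "South Asian": 25,
    "East Asian": 16,
    "Black": 0,
    "Latine": 0,
})

total = sum(engineers.values())

# Aggregate view: everyone who isn't white is counted as one URG bucket.
urg_share = sum(n for group, n in engineers.items() if group != "white") / total
print(f"URG share: {urg_share:.0%}")  # 41% -- looks like strong progress

# Per-group breakdown: the same headcount shows zero Black or Latine engineers.
for group, n in engineers.items():
    print(f"{group}: {n} ({n / total:.0%})")
```

The headline number and the breakdown come from the exact same data; only the second view points at where hiring efforts are still falling short.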

As you can see above, aggregates can make it really hard for organizations to pinpoint where they might be failing their employees, and to figure out where they need to take informed action. You might wonder… why didn’t any of Rosa’s previous managers see this? Or why wouldn’t the engineering leader proactively notice the lack of Black and Latine representation? Well, because they don’t have to. Loopholes in the data leave enough room for unconscious (or conscious) bias to persist and don’t create avenues for holding leadership accountable. A white male manager who managed Rosa never experienced similar discrimination, and his own unconscious (or conscious) biases may allow him not to see that there is a problem.

The lack of data surfacing issues like this also creates more room for managers or HR to excuse it as a “one-off” situation that isn’t happening across the board and can’t be generalized, while making it harder for the person affected to navigate. To say the least, if it happened to Rosa, it can happen to another Black woman in the organization, and it requires thoughtful, systemic changes to ensure it doesn’t.


Sentiment surveys aren’t the answer either…

A lot of companies have invested in getting a general pulse on employee sentiment and experience through methods where employees can comfortably share their feedback. Most commonly, organizations leverage platforms such as CultureAmp to survey their workforce.

This is a meaningful step in a better direction, one where employees are given a voice, but we often hear that a major disconnect persists between what the survey outputs and what employees are actually asking for. “Sentiment” surveys are typically run only every 6 to 12 months, so the surveyed population changes between runs, skewing results by not maintaining a steady baseline, and external factors and recency bias can influence how employees answer based on where they are at that moment in time.

Let’s take another real example from a tech company we work with. The company ran its bi-annual sentiment survey, and one of the strongest negative sentiments was toward meeting culture. Leadership wanted to act by implementing no-meeting days but faced surprising pushback from employees, who worried this would mean meetings getting jammed into fewer days. When digging deeper, the leadership team found that it wasn’t just the number of meetings, but that they were often ineffective and repetitive, and that the constant context-switching was draining. Instead of implementing forced no-meeting days, the organization had to look into how to help teams run more effective meetings and regain control of their calendars. This demonstrates that overall sentiment often isn’t enough and companies need to go the extra mile to dive deeper. Sentiment should be seen as a leading indicator of a problem, not the root cause.

Examples like these lead to misguided focus in organizations, which costs US companies an estimated $50B every year. We also see that most investment continues to be focused on the top of the funnel, on increasing representation, but even when representative hires are made, organizations fail to grow and retain them. Data lends itself to informing systemic change, but only if it is looked at, and used, the right way.

What can we do?

Read Part 2 of our blog next week to learn how we can leverage our current data to identify the gaps and experiences we might be overlooking in our own DEI data, and how to fix them.
