Meta’s Plan To Evaluate Employees’ AI Use May Negatively Impact Women

ForbesWomen

Meta's plan to start evaluating employees' AI use in performance reviews in 2026 may pose gender bias risks. New research finds that women who use AI are perceived as less competent, and as less significant contributors, than men who also use AI and produce identical work product.

Meta's Chief People Officer Janelle Gale informed employees of the policy via internal memo. Under the new evaluation criteria, Meta plans to reward employees for AI-driven impact, whether in their own work or in their team's performance. While Meta's policy is intended to be gender neutral, new research raises concerns about unique risks for women workers: evaluating employees based on AI use can trigger gender bias, so that when women and men both use AI to produce identical work product, the women are evaluated as less competent and less significant contributors than the men. Recognizing and mitigating these risks will become increasingly important as more companies move in the same direction as Meta.

A similar expectation was set by Google CEO Sundar Pichai in an all-hands meeting in July, after Google vice president Megan Kacholia's June announcement that using AI would be added to the job descriptions for many software engineering positions. While Meta, Microsoft, and Google have made their plans public, many other companies are assessing their workers' AI adoption more quietly; providers of worker-tracking software have reported a steep rise in demand. As more companies start evaluating employees based on their use of AI, business leaders should acknowledge and proactively address the gender bias risks that this trend may pose.

Researchers launched the 2025 study after leaders at a global technology company reached out to them for help. The company, which was anonymized for the study, ranks among the top 50 of the Forbes Global 2000. It had launched an internal campaign to incentivize its software engineers to use a generative AI programming tool, but after a year-long effort only 41% of its engineers were using the tool. The researchers wanted to understand whether the lack of AI adoption was linked to how managers evaluated employee use of AI.
The researchers gave 1,026 software engineers at the company the exact same piece of computer code and asked them to rate both the code's quality and the coder's competence. While the code itself was identical, the evaluators were told different information about whether the coder was male or female and whether the code was written with or without AI assistance.

The evaluators rated the quality of the computer code similarly in all conditions, which made sense because they were all assessing identical code. However, they gave very different ratings to the coders. Even though the company actively encouraged AI use, the evaluators imposed a significant "competence penalty" on AI users compared to non-AI users: overall, engineers who purportedly used AI assistance received 9% lower competence ratings, even though the evaluators were viewing identical work product. What's more, this penalty was not gender neutral. The competence penalty imposed on women AI users was more than twice as large as the one imposed on male AI users who produced the same code. While male AI users received 6% lower competence ratings than non-AI users, female AI users received 13% lower ratings. This gender bias existed even at a company that had invested in a year-long effort to incentivize employee AI use.

Meta's plan to evaluate employees' AI use is also part of a broader effort to expand AI adoption throughout the organization. Meta began tracking employees' AI usage on dashboards earlier this year, while incentivizing adoption through a game that awards badges when employees meet AI usage milestones. The study's findings suggest, however, that a company's support for employee AI use is not enough to eliminate evaluation bias: the use of AI reinforced preexisting stereotypes about women's technical competence, even though AI use was desired by the company.
"The AI assistance is framed as a 'proof' of their inadequacy rather than evidence of their strategic tool use," explained the study's researchers in an August 1st, 2025 publication.

In the same study, the researchers also asked the code evaluators to estimate the relative contribution of the engineer versus the AI tool. Although the code itself was identical, the evaluators estimated a larger contribution for the AI tool when they were told that the coder was a woman rather than a man. In other words, when both women and men use AI to produce identical work product, women's contributions to the project are devalued.

This pattern echoes earlier research, which found that women in scientific research teams are significantly less likely than men to be credited with authorship on publications despite similar contributions. When men and women coauthor academic articles, tenure reviewers assume that the men contributed more, according to a 2017 study. "There's quite a bit of evidence women receive less credit for group work, so receiving less credit for work done with AI seems like a 2020's twist," said Alan Benson, a professor at the Carlson School of Management, via email. This insight should inform practices at companies like Meta that plan to evaluate not only employees' personal use of AI, but also employees' perceived AI-driven impact on their team's performance.

The findings from the 2025 study may also help companies understand disappointing results from efforts to increase employees' AI use. Employees may be understandably reluctant to use AI to complete job tasks if they know or suspect that doing so will result in lower competence or contribution evaluations, even when their company encourages AI use. The researchers verified this concern among women engineers in their study: in a follow-up survey of 919 engineers at the same company, women expressed significantly more concern than men that using AI would decrease their manager's evaluation of their coding ability.
The researchers offer several suggestions for how companies can both reduce gender bias and more effectively encourage employee AI use.

In the 2025 study, the researchers found that evaluators who did not use AI themselves imposed the most severe competence penalties for AI use, and also demonstrated the largest gender bias in their reviews. Specifically, male engineers who did not use AI themselves rated women engineers who used AI 26% more harshly than they rated men who used AI, even when both produced identical work product. This finding suggests that companies that evaluate employees' AI use and contributions should ensure that all evaluators are themselves proficient AI users.

According to the study's authors, the most effective way to reduce gender bias in AI-related evaluations is to adopt a "blind review" process, meaning that the evaluator does not know which employee produced the work product being evaluated. When a blind review process is not feasible, bias may still be reduced by explicitly shifting the evaluation focus from the worker to the actual work. In the study, for example, the evaluators did not demonstrate gender bias when they were asked to rate the quality of the computer code: they rated the identical code similarly in all conditions, regardless of whether the coder was identified as male or female. It was only when the evaluators were asked to rate the coder's competence and contributions that gender bias skewed the evaluations against women. Companies should therefore train evaluators to assess the quality of AI-driven work product, rather than asking them to assess qualities of the worker who uses an AI tool.

Finally, using subjective assessment criteria tends to increase the impact of gender bias in evaluations of competence and workplace contributions.
Evaluations of employee AI use should therefore be linked to objective productivity or impact measures, rather than holistic impressions about performance. Taking these proactive steps should not only benefit companies by counteracting gender-based stereotypes; it should also increase overall employee willingness to adopt AI tools and to discover their most effective uses.
