Companies must address AI bias or they risk litigation and reputational damage. Here are three ways workplaces can tackle AI discrimination.
As companies rush to implement AI tools, they may be unaware of how these tools can quietly embed bias and accelerate discrimination at a faster pace and on a larger scale than any manager could. A growing body of research finds that existing AI tools and algorithms produce gender and racial bias in decisions related to hiring, pay increases and promotions. For example, one study found that ChatGPT generates different salary advice depending on gender, which could perpetuate gender pay gaps if acted on.

"Facial recognition tools, for example, have a 1% error rate for light-skinned men but a 35% error rate for darker-skinned women. We know that global financial services companies are already using AI tools in their credit scoring. But those tools tend to discriminate against people from marginalized groups, downgrading the likelihood they will be approved for a loan," says Bates.

AI discrimination is not just an ethical problem: organizations must address its negative impact on their talent management processes, or they risk litigation and reputational damage. Here, Bates shares three ways workplaces can tackle AI discrimination.

As a starting point to safeguard employees against existing AI bias, companies can map the use of AI tools against the employee lifecycle to identify how AI influences decisions. Companies can then run a number of test cases, varying demographic data while keeping qualifications and experience the same, to identify any bias in hiring, development, pay and promotion recommendations.

Bates says that AI tools, especially those used in recruitment, are designed to filter out candidates who differ from the dominant group in an organization, which results in biased outcomes. "Even if you make AI race and gender blind, or turn off those identifying factors, it's going to discriminate by proxy.
So if you have a C-suite that is full of white men, then it is going to start picking up on that and provide you with a homogenous set of privately educated white male candidates," she says. By regularly auditing AI tools for bias, organizations can not only prevent discrimination but also ensure they are hiring the best person for the job.

Large language models learn by processing large amounts of text from a wide range of sources, such as the internet and news articles. Often, these sources contain discriminatory language and reflect existing inequities. "When you ask a generative model to populate that advertising campaign with supermodels, for example, it will immediately give you images of very thin, blonde, white-skinned, non-disabled women. If you ask for an image of a CEO, you will get a white middle-class man sitting at a desk. So it's taking progress that we fought really hard for and wiping it out," says Bates.

One way to debias AI tools is to use synthetic or segmented data sets, but even the most unbiased algorithms can be undermined by humans. To address this, companies need to train hiring managers, recruiters and other decision-makers to understand existing forms of discrimination and inequity, so they can recognize when AI tools are mirroring those same patterns and avoid biased decision-making themselves.

The proliferation of biased AI tools is normalizing prejudice. "It has always been the case previously that the youngest people had the most socially progressive attitudes, and the oldest cohorts held outdated views about women. There was reason to believe that misogynistic ideas were gradually disappearing. That isn't the case anymore," says Bates. "Products are being implemented at pace by workplaces who are so keen not to miss out or fall behind. They are not necessarily working in women's best interests; this even plays out in the ways in which these products are used."
A good example of the gendered nature of AI tools is digital voice assistants, which often default to a female name and voice. Bates says research finds that around 10% of conversations with these voice assistants are abusive. "This increases the normalization of calling a woman a name if she doesn't give the answer you are looking for immediately. Children are growing up in homes where they hear 'Oh, for God's sake, Siri, you idiot!' repeatedly. And it all has a cumulative impact."

While companies scramble to implement AI, they need to identify and manage the associated risks and potential harms to all individuals, which can be done by including a diverse range of perspectives in the development and implementation of AI tools. AI might be here to stay, but whether it perpetuates inequality is a decision workplaces can make.
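The counterfactual audit Bates recommends as a starting point, varying demographic data while holding qualifications and experience constant, can be sketched in a few lines. This is a minimal illustration, not a production audit: `score_candidate` is a hypothetical stand-in for whatever AI tool is under test, and the profile fields are invented for the example.

```python
def score_candidate(profile):
    # Hypothetical stand-in for the AI tool being audited; a real audit
    # would call the vendor's screening API here. This placeholder scores
    # on qualifications only, so it behaves like an unbiased tool.
    return 0.1 * profile["years_experience"] + 0.05 * len(profile["skills"])

def audit(base_profile, demographic_variants):
    """Score each demographic variant of the same underlying candidate.

    Qualifications and experience stay fixed; only demographic fields
    change. Any score gap between variants signals demographic bias.
    """
    results = {}
    for variant_name, demographic_fields in demographic_variants.items():
        profile = {**base_profile, **demographic_fields}
        results[variant_name] = score_candidate(profile)
    return results

# Identical qualifications and experience across all variants.
base = {"years_experience": 8, "degree": "MSc", "skills": ["python", "sql"]}
variants = {
    "variant_a": {"name": "James Miller", "gender": "male"},
    "variant_b": {"name": "Aisha Okafor", "gender": "female"},
}

scores = audit(base, variants)
gap = max(scores.values()) - min(scores.values())
```

Run across hiring, pay and promotion tools, a nonzero `gap` flags exactly the proxy discrimination Bates describes, even when explicit race and gender fields are switched off.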