A college professor wants to expose the hidden bias in AI, and then use it for good


Credit: Pixabay/CC0 Public Domain

Lauren Rhue researches artificial intelligence and the fast-moving world of machine learning. But she wants everyone to slow down.

Rhue, assistant professor of information systems at the University of Maryland's Robert H. Smith School of Business, recently examined the emotion-recognition features of three facial recognition services: Amazon Rekognition, Face++ and Microsoft. Her research revealed racial differences that Rhue described as "really stark."

Amazon Rekognition is offered for use by other companies. Face++ is used for identity verification. Microsoft plans to retire its facial recognition technology this year, including its emotion-recognition tools.

Rhue collected photos of Black and white NBA players from the 2016 season, rated by how much each player was smiling. Then she ran those photos through the facial recognition software.

In general, Rhue found, the models assigned more negative emotions to Black players. And when players had ambiguous facial expressions, Black players were more likely to be read as having a negative expression, while white players were more likely to be given the benefit of the doubt.

"I think we should all step back and think: do we need to analyze faces this way?" Rhue said.

Rhue, 39, is not the first to discover racial disparity in AI systems. For example, MIT graduate student Joy Buolamwini gave a TED Talk about her experience with facial analysis software that couldn't detect her face because the algorithm wasn't coded to identify a wide enough range of skin tones and facial structures.

"With the current enthusiasm for AI, there seems to be a need to model everything you can model," Rhue said. "But I would really like to see a little more stopping and thinking, 'Do we need this? What does it bring to the table?'"

Facial recognition technology is widely used. The Port of Baltimore uses facial recognition technology to verify the identities of passengers upon disembarkation. HireVue, based in Utah, conducts video interviews of potential employees and records candidates' faces and emotions as part of its analysis of their candidacy. Artificial intelligence has been deployed to examine emotions and body language to find potential threats in crowds.

Some states limit the use of artificial intelligence. California, for example, is considering restricting the use of artificial intelligence to screen job candidates to avoid a "discriminatory effect." In Illinois, employers must disclose when AI tools are used during video interviews. Maryland has a similar law.

And last summer, the Baltimore City Council imposed a ban on facial recognition technology, except for the police department, until December of this year.

And with AI seeping into all areas of society, Rhue just wants people – and companies – to stop and think about the long-term implications.

"These types of systems are becoming increasingly integrated into our technology. We are not always aware of them. We are not always aware of how they are used," Rhue said. "And I think it's important to understand the potential for bias. And then my research looks at human intervention, to see if that makes it better — if people are able to offset that bias."

Rhue pointed out that in every situation, a combination of AI tools and human intervention should be used to mitigate bias. She wants to avoid "unintended negative consequences."

She believes the rest of her field is beginning to prioritize this type of work as well. She said the death of George Floyd at the hands of the police and subsequent calls for racial justice in 2020 led to interest in understanding the struggles of marginalized communities and how technology can promote inclusion.

Jui Ramaprasad, assistant professor of information systems at Maryland's business school, works with Rhue and has known her since Rhue was a doctoral student at New York University. She said Rhue's work on bias in machine learning is some of the "most influential" work in their field.

"I think she's doing work that she cares about because it affects her and it affects people in the community," Ramaprasad said. "I think it's really hard to be the one doing the work when you're also someone who faces this bias or discrimination in the environment we live and work in."

Despite the disparities she has revealed, Rhue believes that technology can be used for good. For example, Rhue has done research on digital crowdfunding platforms, focusing on Kickstarter, which curates campaigns based on staff interest. In an effort to highlight projects put forward by Black creators, she found that using predictive models rather than relying on subjective human judgment increased recommendation rates for Black-led projects without decreasing the success rate.

"I think there is a lot of potential for technology to have a really positive impact on inclusion, especially financial inclusion," Rhue said.

Outside of her research, Rhue teaches data visualization to undergraduate and master's students. She previously taught at Wake Forest University.

Rhue said she can see the effect she is having in the classroom. Students, including some in graduate school, have told her she's the only Black professor they've ever had. Others have told her that they want to pursue a Ph.D. because she made it seem possible.

Will Hawkes is an assistant professor of management at Nova Southeastern University in Florida. Before that, he was a student of Rhue's at Wake Forest.

While Rhue wasn't his first Black female professor — Hawkes had previously attended Florida A&M University, a historically Black institution — seeing her on campus still made an impression.

"To see someone like you accomplish things you never thought could be achieved — the impossible becomes real," Hawkes said. "Being a Black professor in the same field now ... our presence means a lot to them. And I know that because I've been in their shoes."

Hawkes called Rhue a "game changer" for him. He has kept in contact with her since his graduation. He reached out to her when he was applying for academic jobs, and recently invited her to join him in researching the hate crimes known as "Zoom bombings" and how those incidents affect organizations and people. Hawkes believes Rhue changed the course of his life.

"You wouldn't be talking to Dr. Will Hawkes now if I hadn't met Dr. Rhue," Hawkes said.


©2022 Baltimore Sun.
Distributed by Tribune Content Agency, LLC.

Citation: A college professor wants to expose the hidden bias in AI, and then use it for good (2022, September 14) retrieved September 14, 2022 from https://techxplore.com/news/2022-09-university-professor-expose-hidden-bias.html

This document is subject to copyright. Notwithstanding any fair dealing for the purpose of private study or research, no part may be reproduced without written permission. The content is provided for informational purposes only.
