Twitter finds its AI tends to crop out Black people and men from photos

06:45 AM May 20, 2021

Twitter Inc’s image-cropping algorithm has a problematic bias toward excluding Black people and men, the company said in new research on Wednesday, adding that “how to crop an image is a decision best made by people.”

The study by three of its machine learning researchers was conducted after user criticism last year about image previews in posts excluding Black people’s faces.

It found an 8% difference from demographic parity in favor of women, and a 4% difference in favor of white individuals.
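Demographic parity here means the algorithm's crops would retain people from each group at equal rates, so the percentages describe the gap from that baseline. As a rough illustration of how such a gap could be measured (this is a minimal sketch with hypothetical group labels and made-up numbers, not Twitter's methodology or code):

```python
# Minimal sketch (not Twitter's code): computing a demographic parity gap
# for a binary "kept in the auto-crop" outcome. Groups and data are hypothetical.
from collections import defaultdict

def demographic_parity_gap(records):
    """records: iterable of (group, kept) pairs, where kept is True if the
    person remained visible in the auto-generated crop. Returns the gap
    between the highest and lowest per-group selection rates, plus the rates."""
    totals = defaultdict(int)
    kept_counts = defaultdict(int)
    for group, kept in records:
        totals[group] += 1
        if kept:
            kept_counts[group] += 1
    rates = {g: kept_counts[g] / totals[g] for g in totals}
    return max(rates.values()) - min(rates.values()), rates

# Illustrative only: a 0.08 gap corresponds to the 8% figure cited in the article.
sample = ([("women", True)] * 54 + [("women", False)] * 46
          + [("men", True)] * 46 + [("men", False)] * 54)
gap, rates = demographic_parity_gap(sample)
print(f"selection rates: {rates}, parity gap: {gap:.2f}")
```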

The paper cited several possible reasons, including issues with image backgrounds and eye color, but said none were an excuse.


“Machine learning based cropping is fundamentally flawed because it removes user agency and restricts user’s expression of their own identity and values, instead imposing a normative gaze about which part of the image is considered the most interesting,” the researchers wrote.

To counter the problem, Twitter recently started showing standard aspect ratio photos in full – without any crop – on its mobile apps and is trying to expand that effort.

The researchers also assessed whether crops favored women’s bodies over heads, reflecting what is known as the “male gaze,” but found that does not appear to be the case.

The findings are another example of the disparate impact from artificial intelligence systems including demographic biases identified in facial recognition and text analysis, the paper said.

Work by researchers at Microsoft Corp and the Massachusetts Institute of Technology in 2018 and a later U.S. government study found that facial analysis systems misidentify people of color more often than white people.

Amazon Inc in 2018 scrapped an AI recruiting tool that showed bias against women.

(Reporting by Paresh Dave; Editing by Edwina Gibbs)
