Twitter says it is investigating allegations of racial bias in its image previews after users complained that its picture-cropping algorithm favours white faces over Black ones.
According to the complaints, Twitter's preview shows only the white person's face when photos of both a Black person and a white person appear in the same post.
The controversy began when Colin Madland, a university manager from Vancouver, tried to troubleshoot a problem during a video conference on Zoom: his Black colleague's head kept vanishing from the screen. He realised the software was identifying the man's head as part of the background because of his colour and kept removing it.
When Madland posted his observation on Twitter, he discovered that Twitter's mobile app preview also cropped out his colleague's face and featured only his own.
This discovery prompted a flurry of reactions from other users, who tested the algorithm with a variety of examples, including photos of US Senate Majority Leader Mitch McConnell and former President Barack Obama, and a stock photo of a white man and a Black man in suits. In each case, Twitter's algorithm picked the white face over the Black one.
In its response, Twitter said it had tested the algorithm for racial and gender bias during development but would conduct further analysis.
“We did analysis on our model when we shipped it – but [it] needs continuous improvement. Love this public, open, and rigorous test – and eager to learn from this,” Twitter’s Chief Technology Officer, Parag Agrawal, tweeted.