
Twitter will reward hackers for fixing racial and gender bias in its software


Twitter is turning to hackers to help it fix problems with its algorithm (Photo by Nikolas Kokovlis/NurPhoto via Getty Images)

Twitter has announced a bug bounty program to encourage hackers to identify and fix faults in one of its algorithms.

The social network launched the competition to deal with apparent racial and gender bias in its image-cropping algorithm.

Last year, researchers pointed out the apparently racist way Twitter’s cropping algorithm behaved. When a photo attached to a tweet is larger than, or a different shape from, the thumbnail, the site automatically chooses which portion of the image to show in the preview.
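Cropping systems like this are typically built around a saliency model: the algorithm scores how "interesting" each part of the image is, then keeps the crop window with the highest score. The sketch below is purely illustrative and is not Twitter's actual code; the function name, the toy saliency map, and the brute-force search are all assumptions made for the example (a real system would predict the saliency scores with a trained model, which is exactly where learned bias can creep in).

```python
# Illustrative sketch only -- not Twitter's implementation. It shows the
# general idea of saliency-based cropping: score every candidate crop
# window and keep the highest-scoring one for the preview thumbnail.

def best_crop(saliency, crop_h, crop_w):
    """Return (top, left) of the crop window with the highest total
    saliency. `saliency` is a 2D list of per-pixel importance scores."""
    rows, cols = len(saliency), len(saliency[0])
    best_score, best_pos = float("-inf"), (0, 0)
    # Brute-force search over every valid window position.
    for top in range(rows - crop_h + 1):
        for left in range(cols - crop_w + 1):
            score = sum(
                saliency[r][c]
                for r in range(top, top + crop_h)
                for c in range(left, left + crop_w)
            )
            if score > best_score:
                best_score, best_pos = score, (top, left)
    return best_pos

# Toy example: a 4x4 saliency map whose "interesting" region sits in the
# bottom-right corner, cropped to a 2x2 thumbnail.
saliency = [
    [0, 0, 0, 0],
    [0, 0, 1, 1],
    [0, 1, 2, 2],
    [0, 1, 2, 3],
]
print(best_crop(saliency, 2, 2))  # -> (2, 2)
```

If the model that produces the saliency scores systematically rates some faces as less salient than others, this otherwise neutral search will reproduce that bias in every thumbnail it generates.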

Twitter itself acknowledged that the algorithm favoured white, male faces over others and promised to address the issue.

‘As part of our commitment to address this issue, we also shared that we’d analyze our model again for bias,’ wrote Twitter’s Rumman Chowdhury. ‘Over the last several months, our teams have accelerated improvements for how we assess algorithms for potential bias and improve our understanding of whether ML is always the best solution to the problem at hand.’

Now the company is trying to incentivise people to help deal with the problem by paying the winner of the competition $3,500 (£2,500) for identifying the cause of the bias.

‘We want to take this work a step further by inviting and incentivizing the community to help identify potential harms of this algorithm beyond what we identified ourselves,’ Twitter wrote in a blog post announcing the challenge.

‘With this challenge we aim to set a precedent at Twitter, and in the industry, for proactive and collective identification of algorithmic harms.’

