
Facebook moderators speak out about vetting ‘90% sexual’ private chats


Moderators have to vet any message that gets flagged to them (Getty)

Those responsible for keeping Facebook free of violence, prejudice and graphic sexual content have spoken out about the daily toll of their work.

Speaking to the Guardian on condition of anonymity, the contractors described the realities of keeping Facebook users in check day in, day out.

Perhaps most worrying of all was the task of vetting private chats conducted through Facebook Messenger that were flagged for moderation.

‘You understand something more about this sort of dystopic society we are building every day,’ one of the moderators told the Guardian, explaining that of the messages they examine ‘90% are sexual’.

The Berlin-based worker spoke anonymously because he had signed a non-disclosure agreement with Facebook.

‘We have rich white men from Europe, from the US, writing to children from the Philippines … they try to get sexual photos in exchange for $10 or $20.’

Facebook boss Mark Zuckerberg says he wants the platform to start valuing privacy (AFP)

Metro contacted Facebook’s representatives about these claims and was provided with the same statement given to the Guardian.

‘Content moderators do vital work to keep our community safe, and we take our responsibility to ensure their wellbeing incredibly seriously,’ a Facebook spokesperson said.

‘We work closely with our partners to ensure they provide the support people need, including training, psychological support and technology to limit their exposure to graphic content.

‘Content moderation is a new and challenging industry, so we are always learning and looking to improve how it is managed. We take any reports that our high standards are not being met seriously and are working with our partner to address these concerns.’

The other moderators who spoke out said that, over time, the constant viewing of graphic pornography, violence and fake news took a cumulative toll. They said they became ‘addicted’ to it and began hoarding increasingly extreme examples in their own private collections.

Facebook is responsible for moderating Messenger and Instagram as well (AP)

The moderators also spoke explicitly about how the job affects their political views.

‘Maybe because all this hate speech we have to face every day affects our political view somehow,’ one source said.

‘So a normal person, a liberal person, maybe also a progressive person, can get more conservative, more concerned about issues like migrants for example. Indeed, many of the hate speech contents we receive on a daily basis are fake news … which aim to share very particular political views.’

In 2018, a man reportedly died at his desk from a heart attack after moderating child abuse images for Facebook in the US.

While many agree that Facebook, along with other social networks such as YouTube and Twitter, is making small improvements to moderation, these latest accounts show there is still a lot of work to be done.


