IT’S one of the coolest companies in the world but a Californian woman says she developed PTSD from working there.

My office job gave me PTSD

IF YOU think you have a soul-sucking office job, be thankful you don't work as a content moderator for a global social media company.

Armies of content moderators working for major internet platforms like Facebook, Twitter and Google spend their time looking at every single picture and video that has been flagged as potentially objectionable.

As you might expect, that includes some pretty nasty stuff such as snuff films, self-harm images, bestiality clips, videos of people being decapitated and child porn.

At least one employee says it was too much and is suing Facebook for being "exposed to highly toxic, unsafe and injurious content during her employment as a content moderator" for the billion-dollar company.

Selena Scola was a content moderator at Facebook's Menlo Park headquarters in California from June 2017 to March of this year. She has filed a damages claim against the tech giant, alleging her experience working for Facebook resulted in her developing post-traumatic stress disorder (PTSD).

The case was filed as a class-action civil case, but at the moment Ms Scola is the only named plaintiff, according to Reuters.

The claim says thousands of content moderators face mental trauma on the job at Facebook and criticises the company's efforts to protect employees.

For its part, the company says psychological help is available to all its moderators 24 hours a day.

A former Facebook employee has sued the company after reportedly developing PTSD.

Those who work as content moderators are bombarded with "thousands of videos, images and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide and murder," according to court papers.

"Facebook is ignoring its duty to provide a safe workplace and instead (is) creating a revolving door of contractors who are irreparably traumatised by what they witnessed on the job," Korey Nelson, a lawyer for Ms Scola said in a statement.

Facebook has about 7500 content moderators around the world who conduct this type of work, many of whom are employed through outsourced contractors.

A recent documentary called The Cleaners detailed the toll taken on these outside workers used by tech companies to determine whether videos and photos that have been shared online should be removed or not.

The film tracks a handful of people based in Manila who spend their days looking at terrorist videos, political propaganda, self-harm videos and child pornography, breaking them into binary categories: "ignore", where they let the post stay up, and "delete", where the imagery is removed for violating community standards.

The release of Facebook's content moderation guidelines earlier this year drew attention to the difficult work of those whose job it is to review such material all day long.

While automated filtering systems catch some of this material, companies still need to hire humans to sift through the vile content that gets uploaded.

"New technology like machine learning, computer vision and artificial intelligence helps us find more bad content, more quickly," Antonia Sanda, Head of Communications for Facebook Australia told news.com.au this week.

"We are also investing heavily in more people to review content that is flagged," she added.

The social network has faced regulatory scrutiny over not doing enough to prevent content like fake news and hate speech on its platform. But the trauma faced by content moderators could be a growing problem for the company.

Facebook said in a statement: "We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources."
