Thursday, 13 February 2020

Seminar on content moderation

You can’t say that! Policing the long tail of public opinion, and why Facebook won’t allow us to talk about it
Presented by Tech Won’t Build It Ireland, the School of Multidisciplinary Technology at TU Dublin, and the TU Dublin Critical Media Literacy Group
With Chris Gray.

7pm-8:30pm, Wednesday 19th February 2020, Room 259, TU Dublin, Bolton Street.

Social media platforms such as Facebook, YouTube and Twitter employ around 100,000 people worldwide to vet content posted by their users: online bullying, hate speech, extreme violence, pornography, fake news, and worse. Most content moderators are employed by third-party companies that provide these services to the major Internet companies.

Thousands of those content moderators are based here in Ireland. It’s becoming increasingly clear that many moderators experience trauma and PTSD as a result of the volume of disturbing content they’re exposed to, often without adequate preparation or support from their employer[1]. In late January this year, one contractor (Accenture) even asked employees to sign a document acknowledging the risk of PTSD and making them individually responsible for dealing with it.

Two groups of current and former employees are suing Facebook over this issue, in both California and Ireland. Chris Gray is the lead plaintiff in the case against Facebook and its contractor Cpl Resources, which is now going through the Irish High Court.

Chris will walk us through how content moderation works, how moderators are trained, conditions on the job, and how moderators make decisions. He will tell us about the psychological impact of this work, and his experience taking a legal case against his former employer Facebook, one of the world’s largest technology companies. 

We’ll also discuss the trade-offs between free speech, hate speech, censorship and fake news, and the role of regulation versus corporate responsibility for content on Internet platforms.
