The Hidden, Exploited Workforce
Every murder, suicide, sexual assault or child abuse video that does not make it onto a platform has been viewed and flagged by a content moderator, or by an automated system trained on data most likely supplied by a content moderator. Employees performing these tasks suffer from anxiety, depression and post-traumatic stress disorder due to constant exposure to this horrific content.

Here are some questions I posed to the group:
- Is it right that underpaid, often untrained workers have to look at extremely violent or sexual materials so that you don’t accidentally view this when you use generative AI?
- Is it right for big tech companies (some of the wealthiest companies in the world) to outsource 'disturbing content' work to countries where labour laws are weaker, just because it's cheaper?
I believe critical AI literacy is absolutely essential for our learners (and teachers), providing a space where issues such as this can be discussed. It is the perfect opportunity to help develop 'ethical and informed learners' who are prepared to address the question, "Just because we can, does that mean we should?" I very much hope that this year's update to the digital competence framework will provide opportunities to address issues such as this, along with a focus on AI's impact on the environment, society and intellect.
