TY - JOUR
T1 - "how advertiser-friendly is my video?"
T2 - YouTuber's Socioeconomic Interactions with Algorithmic Content Moderation
AU - Ma, Renkai
AU - Kou, Yubo
N1 - Funding Information:
This work is partially supported by the National Science Foundation under grant no. 2006854.
Publisher Copyright:
© 2021 ACM.
PY - 2021/10/18
Y1 - 2021/10/18
AB - To manage user-generated harmful video content, YouTube relies on AI algorithms (e.g., machine learning) in content moderation and follows a retributive justice logic to punish convicted YouTubers through demonetization, a penalty that limits or deprives them of advertisements (ads), reducing their future ad income. Moderation research is burgeoning in CSCW, but relatively little attention has been paid to the socioeconomic implications of YouTube's algorithmic moderation. Drawing from the lens of algorithmic labor, we describe how algorithmic moderation shapes YouTubers' labor conditions through algorithmic opacity and precarity. YouTubers coped with such challenges from algorithmic moderation by sharing and applying practical knowledge they learned about moderation algorithms. By analyzing video content creation as algorithmic labor, we unpack the socioeconomic implications of algorithmic moderation and point to necessary post-punishment support as a form of restorative justice. Lastly, we put forward design considerations for algorithmic moderation systems.
UR - http://www.scopus.com/inward/record.url?scp=85117930620&partnerID=8YFLogxK
UR - http://www.scopus.com/inward/citedby.url?scp=85117930620&partnerID=8YFLogxK
DO - 10.1145/3479573
M3 - Article
AN - SCOPUS:85117930620
SN - 2573-0142
VL - 5
JO - Proceedings of the ACM on Human-Computer Interaction
JF - Proceedings of the ACM on Human-Computer Interaction
IS - CSCW2
M1 - 429
ER -