Ignoring As a Moderation Strategy for Volunteer Moderators on Twitch
Document Type
Conference Proceeding
Publication Date
4-19-2023
Abstract
Content moderation is a crucial aspect of online platforms, and it requires human moderators (mods) to repeatedly review and remove harmful content. However, this moderation process can lead to cognitive overload and emotional labor for mods. As new platforms and designs emerge, such as live streaming spaces, new challenges arise due to the real-time nature of the interactions. In this study, we examined the use of ignoring as a moderation strategy by interviewing 19 Twitch mods. Our findings indicated that ignoring involves complex cognitive processes and significant invisible labor in the decision-making process. Additionally, we found that ignoring is an essential component of real-time moderation. These preliminary findings suggest that ignoring has the potential to be a valuable moderation strategy in future interactive systems, highlighting the need to design better support for ignoring in interactive live-streaming systems.
Identifier
85158115341 (Scopus)
ISBN
9781450394222
Publication Title
Conference on Human Factors in Computing Systems Proceedings
External Full Text Location
https://doi.org/10.1145/3544549.3585704
Grant
1928627
Fund Ref
National Science Foundation
Recommended Citation
Li, Na; Cai, Jie; and Wohn, Donghee Yvette, "Ignoring As a Moderation Strategy for Volunteer Moderators on Twitch" (2023). Faculty Publications. 1774.
https://digitalcommons.njit.edu/fac_pubs/1774