Received a new NSF grant to study content moderation practices in live streaming:
This research will investigate how individuals and small groups handle content moderation in real time in the context of live streaming, from both technical and social perspectives, distinguishing between professional content creators who stream for a living and hobbyists. Live streaming services such as Twitch are the latest form of social media, marrying user-generated content with the traditional concept of live television broadcasting: as someone broadcasts, viewers can post comments in a chat interface displayed alongside the broadcast, creating an interactive, synchronous media experience. This real-time interaction, however, makes the platform ripe for deviant behavior, since potential harassers can see the immediate impact of their harsh words on the person broadcasting. Most current forms of social media rely on crowdsourced moderation, in which users report bad content that is ultimately reviewed by a human moderator. That approach does not work well for real-time moderation, which poses greater social and technological challenges. This project will study the sociotechnical aspects of content moderation from the perspective of micro communities on live streaming platforms. By understanding how streamers currently moderate audiences through manual and automated labor, the research will identify opportunities for technology to assist and enhance the moderation process, and will provide guidelines for sustainable and scalable moderation. Exploring different governance structures of moderation may also yield alternative models of moderation for the future of social media, as well as insight into how different moderation practices influence the evolution of positive and negative norms in micro communities.
Because live streaming is such a new phenomenon, presenting novel technical and social challenges, exploratory research is required before any serious attempt to solve its problems through technology design. This research agenda will advance qualitative knowledge about how moderation influences the development of social norms in micro communities, laying the groundwork for future large-scale empirical studies, experiments, and the development of useful artificial intelligence tools. By building a taxonomy of moderation, the research will identify the breadth of methods employed in moderation practice, yield a comprehensive framework for understanding the conceptual functions of moderation, and develop a common language for both academics and practitioners that enables mapping between problems and potential design solutions. Moreover, through ethnographic work, the research will provide descriptive knowledge of this new form of social media, surfacing novel research questions unique to this technology. This research will inform the design of moderation tools and practices that could impact millions of people who publish content online, and even more people who view that content. By placing the individual producer, rather than the corporation running the platform, at the center of their own system, the findings may help empower a new era of Internet activity in which individuals and small groups have more agency over what happens online.