Queuing for Civility: Regulating Emotions and Reducing Toxicity in Digital Discourse
By: Akriti Verma, Shama Islam, Valeh Moghaddam, and more
Potential Business Impact:
Helps online chats stay calm and kind.
The pervasiveness of online toxicity, including hate speech and trolling, disrupts digital interactions and online well-being. Previous research has mainly focused on post-hoc moderation, overlooking the real-time emotional dynamics of online conversations and the impact of users' emotions on others. This paper presents a graph-based framework to identify the need for emotion regulation within online conversations. This framework promotes self-reflection to manage emotional responses and encourage responsible behaviour in real time. Additionally, a comment queuing mechanism is proposed to address intentional trolls who exploit emotions to inflame conversations. This mechanism introduces a delay in publishing comments, giving users time to self-regulate before further engaging in the conversation and helping maintain emotional balance. Analysis of social media data from Twitter and Reddit demonstrates that the graph-based framework reduced toxicity by 12%, while the comment queuing mechanism decreased the spread of anger by 15%, with only 4% of comments being temporarily held on average. These findings indicate that combining real-time emotion regulation with delayed moderation can significantly improve well-being in online environments.
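To make the comment queuing idea concrete, the sketch below shows one plausible way such a mechanism could work: comments whose text scores high on a simple anger measure are held for a cooling-off delay before publication, while calmer comments post immediately. The anger lexicon, thresholds, delay length, and all class and function names here are illustrative assumptions for this summary, not the authors' implementation, which scores emotions over a conversation graph.

```python
# Minimal sketch of a comment-queuing mechanism (illustrative assumptions only).
import time
import heapq
from dataclasses import dataclass, field

# Toy anger lexicon standing in for a real emotion classifier.
ANGER_WORDS = {"hate", "stupid", "idiot", "trash", "shut"}

def anger_score(text: str) -> float:
    """Fraction of tokens that hit the toy anger lexicon (placeholder scorer)."""
    tokens = text.lower().split()
    if not tokens:
        return 0.0
    return sum(t.strip(".,!?") in ANGER_WORDS for t in tokens) / len(tokens)

@dataclass(order=True)
class QueuedComment:
    release_at: float
    comment_id: str = field(compare=False)
    text: str = field(compare=False)

class CommentQueue:
    """Holds anger-driven comments for a cooling-off delay before publishing."""

    def __init__(self, anger_threshold: float = 0.2, delay_seconds: float = 60.0):
        self.anger_threshold = anger_threshold
        self.delay_seconds = delay_seconds
        self._held: list[QueuedComment] = []

    def submit(self, comment_id: str, text: str, now: float | None = None) -> str:
        """Publish immediately, or hold the comment if it looks anger-driven."""
        now = time.time() if now is None else now
        if anger_score(text) >= self.anger_threshold:
            heapq.heappush(
                self._held,
                QueuedComment(now + self.delay_seconds, comment_id, text),
            )
            return "queued"  # user is nudged to reflect or edit before it posts
        return "published"

    def release_due(self, now: float | None = None) -> list[str]:
        """Publish any held comments whose cooling-off period has elapsed."""
        now = time.time() if now is None else now
        released = []
        while self._held and self._held[0].release_at <= now:
            released.append(heapq.heappop(self._held).comment_id)
        return released

# Example usage
queue = CommentQueue(anger_threshold=0.2, delay_seconds=60.0)
print(queue.submit("c1", "Thanks, that was a helpful explanation."))  # published
print(queue.submit("c2", "You are a stupid idiot, I hate this."))     # queued
print(queue.release_due(now=time.time() + 61))                        # ['c2']
```

In this toy setup, only comments crossing the anger threshold incur the delay, which mirrors the paper's finding that a small share of comments (about 4% on average) needs to be temporarily held.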
Similar Papers
Modelling the Spread of Toxicity and Exploring its Mitigation on Online Social Networks
Social and Information Networks
Bots reduce online hate speech by changing its message.
The High Cost of Incivility: Quantifying Interaction Inefficiency via Multi-Agent Monte Carlo Simulations
Artificial Intelligence
Shows how rude arguments take longer and cost money.
HateBuffer: Safeguarding Content Moderators' Mental Well-Being through Hate Speech Content Modification
Human-Computer Interaction
Protects online workers from seeing mean words.