Towards Simulating Social Influence Dynamics with LLM-based Multi-agents
By: Hsien-Tsung Lin, Pei-Cing Huang, Chan-Tung Ku, and others
Potential Business Impact:
LLM-based agents can simulate how people interact and influence one another in online discussions.
Recent advancements in Large Language Models (LLMs) offer promising capabilities to simulate complex human social interactions. We investigate whether LLM-based multi-agent simulations can reproduce core human social dynamics observed in online forums. We evaluate conformity dynamics, group polarization, and fragmentation across different model scales and reasoning capabilities using a structured simulation framework. Our findings indicate that smaller models exhibit higher conformity rates, whereas models optimized for reasoning are more resistant to social influence.
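As a rough illustration of the kind of structured simulation the abstract describes, the sketch below is not the authors' code: the agent count, stance scale, conformity pull, and the `stance_update` rule are all assumptions chosen for readability. It shows how a conformity rate could be computed for a group of forum agents; in the actual framework, each update would come from an LLM-generated post rather than a numeric drift rule.

```python
"""Minimal sketch (assumed setup, not the paper's framework) of measuring
conformity in a multi-agent forum simulation: agents hold a stance on a
topic, read the group's posts each round, and may shift toward the majority."""

import random
import statistics

random.seed(0)

N_AGENTS = 20          # hypothetical number of forum participants
N_ROUNDS = 5           # discussion rounds
CONFORMITY_PULL = 0.4  # placeholder for model-dependent susceptibility


def stance_update(own_stance: float, visible_stances: list[float]) -> float:
    """Stand-in for an LLM agent's reply: drift toward the group mean.

    In an LLM-backed run, the new stance would be parsed from the agent's
    generated post; per the paper's findings, smaller models would show a
    stronger pull than reasoning-optimized ones."""
    group_mean = statistics.mean(visible_stances)
    return own_stance + CONFORMITY_PULL * (group_mean - own_stance)


def run_simulation() -> float:
    # Initial stances on a -1..+1 scale, e.g. opposition vs. support.
    stances = [random.uniform(-1, 1) for _ in range(N_AGENTS)]
    initial = stances[:]
    majority_sign = 1 if statistics.mean(initial) >= 0 else -1

    for _ in range(N_ROUNDS):
        for i in range(N_AGENTS):
            others = stances[:i] + stances[i + 1:]
            stances[i] = stance_update(stances[i], others)

    # Conformity rate: share of agents who ended closer to the initial
    # majority position than they started.
    conformed = sum(
        1 for s0, s1 in zip(initial, stances)
        if abs(majority_sign - s1) < abs(majority_sign - s0)
    )
    return conformed / N_AGENTS


if __name__ == "__main__":
    print(f"Conformity rate: {run_simulation():.2f}")
```

Comparing conformity rates across backends (e.g., small vs. reasoning-optimized models driving `stance_update`) would mirror the paper's comparison across model scales.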
Similar Papers
An Empirical Study of Group Conformity in Multi-Agent Systems
Artificial Intelligence
Debating AI agents shift their opinions under group pressure, much as people do.
Social Simulations with Large Language Model Risk Utopian Illusion
Computation and Language
LLM-simulated participants tend to be unrealistically agreeable, painting a too-rosy picture of online discussion.
Simulating Online Social Media Conversations on Controversial Topics Using AI Agents Calibrated on Real-World Data
Social and Information Networks
AI agents calibrated on real-world data can reproduce online conversations about controversial topics.