Epistemic diversity across language models mitigates knowledge collapse
By: Damian Hodel, Jevin D. West
Potential Business Impact:
Keeps AI smart by using many different minds.
The growing use of artificial intelligence (AI) raises concerns about knowledge collapse, i.e., a reduction to the most dominant and central set of ideas. Prior work has demonstrated single-model collapse, defined as performance decay in an AI model trained on its own output. Inspired by ecology, we ask whether AI ecosystem diversity, that is, diversity among models, can mitigate such a collapse. We build on the single-model approach but focus on ecosystems of models trained on their collective output. To study the effect of diversity on model performance, we segment the training data across language models and evaluate the resulting ecosystems over ten self-training iterations. We find that increased epistemic diversity mitigates collapse, but, interestingly, only up to an optimal level. Our results suggest that an ecosystem containing only a few diverse models fails to express the rich mixture of the full, true distribution, resulting in rapid performance decay. Yet distributing the data across too many models reduces each model's capacity to approximate the true distribution, leading to poor performance as early as the first iteration. In the context of AI monoculture, our results suggest the need to monitor diversity across AI systems and to develop policies that incentivize more domain- and community-specific models.
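The loop the abstract describes (segment a shared corpus across an ecosystem of models, let their pooled output become the next round's training data, repeat for ten iterations) can be made concrete with a toy simulation. The sketch below is a minimal illustration under stated assumptions, not the authors' implementation: it substitutes smoothed unigram models for language models, and every name and parameter (fit_model, sample_from, NUM_MODELS, and so on) is hypothetical.

import numpy as np

# Toy sketch of the ecosystem self-training loop from the abstract.
# Unigram "models" stand in for LLMs; all names are illustrative.
rng = np.random.default_rng(0)

VOCAB = 50                                 # toy token vocabulary size
TRUE_DIST = rng.dirichlet(np.ones(VOCAB))  # the "true" distribution
CORPUS_SIZE = 10_000                       # tokens per training round
ITERATIONS = 10                            # self-training iterations
NUM_MODELS = 4                             # ecosystem size (diversity knob)

def fit_model(tokens):
    """'Train' a model: estimate a smoothed unigram distribution."""
    counts = np.bincount(tokens, minlength=VOCAB) + 1e-3
    return counts / counts.sum()

def sample_from(dist, n):
    """'Generate' n tokens from a fitted model."""
    return rng.choice(VOCAB, size=n, p=dist)

def kl_to_truth(dist):
    """Performance proxy: KL divergence from the true distribution."""
    return float(np.sum(TRUE_DIST * np.log(TRUE_DIST / dist)))

# Iteration 0: real data, sampled from the true distribution.
data = sample_from(TRUE_DIST, CORPUS_SIZE)

for step in range(ITERATIONS):
    # Segment the shared corpus across the ecosystem's models.
    segments = np.array_split(rng.permutation(data), NUM_MODELS)
    models = [fit_model(seg) for seg in segments]

    # Each model generates; the pooled output is the next corpus.
    per_model = CORPUS_SIZE // NUM_MODELS
    data = np.concatenate([sample_from(m, per_model) for m in models])

    # Evaluate the ecosystem's mixture against the true distribution.
    mixture = np.mean(models, axis=0)
    print(f"iteration {step + 1}: KL(true || ecosystem) = {kl_to_truth(mixture):.4f}")

Sweeping NUM_MODELS in such a simulation is the analogue of the paper's diversity knob: with too few models the pooled output drifts toward the dominant modes and decays quickly, while with too many models each segment becomes too small to fit well from the very first round.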
Similar Papers
Epistemic Diversity and Knowledge Collapse in Large Language Models
Computation and Language
AI models share less knowledge than web searches.
Autonomous AI imitators increase diversity in homogeneous information ecosystems
Computers and Society
AI can add or remove news variety.