Introducing multiverse analysis to bibliometrics: The case of team size effects on disruptive research
By: Christian Leibel, Lutz Bornmann
Potential Business Impact:
Tests whether published research findings hold up under alternative analysis choices.
Although bibliometrics has become an essential tool in the evaluation of research performance, bibliometric analyses are sensitive to a range of methodological choices. Subtle choices in data selection, indicator construction, and modeling decisions can substantially alter results. Ensuring robustness, meaning that findings hold up under different reasonable scenarios, is therefore critical for credible research and research evaluation. To address this issue, this study introduces multiverse analysis to bibliometrics. Multiverse analysis is a statistical tool that enables analysts to transparently discuss modeling assumptions and thoroughly assess model robustness. Whereas standard robustness checks usually cover only a small subset of all plausible models, multiverse analysis includes all plausible models. We illustrate the benefits of multiverse analysis by testing the hypothesis posed by Wu et al. (2019) that small teams produce more disruptive research than large teams. While we found robust evidence of a negative effect of team size on disruption scores, the effect size is so small that its practical relevance seems questionable. Our findings underscore the importance of assessing the multiverse robustness of bibliometric results to clarify their practical implications.
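The core idea of a multiverse analysis is to enumerate every plausible combination of analytic choices, fit a model under each combination, and inspect the full distribution of effect estimates rather than a single result. The following is a minimal sketch of that idea, not the authors' actual analysis: the data are simulated, and the two "forks" shown (log-transforming team size, dropping solo-author papers) are hypothetical stand-ins for the data-selection and modeling decisions the paper discusses.

```python
import itertools
import math
import random
import statistics

random.seed(42)

# Simulated stand-in data: team sizes and disruption scores with a
# tiny built-in negative effect of team size (purely illustrative).
n = 5000
teams = [random.randint(1, 10) for _ in range(n)]
disruption = [0.05 - 0.002 * t + random.gauss(0, 0.1) for t in teams]

def ols_slope(x, y):
    """Closed-form simple-regression slope: cov(x, y) / var(x)."""
    mx, my = statistics.fmean(x), statistics.fmean(y)
    num = sum((xi - mx) * (yi - my) for xi, yi in zip(x, y))
    den = sum((xi - mx) ** 2 for xi in x)
    return num / den

# Hypothetical analytic forks; a real multiverse would include many more
# (indicator variants, field normalization, sample windows, etc.).
forks = {
    "log_team_size": [False, True],
    "drop_solo_authors": [False, True],
}

# Fit one model per combination of choices and collect all effect sizes.
slopes = []
for log_t, drop_solo in itertools.product(*forks.values()):
    xs, ys = teams, disruption
    if drop_solo:
        kept = [(x, y) for x, y in zip(xs, ys) if x > 1]
        xs, ys = [p[0] for p in kept], [p[1] for p in kept]
    if log_t:
        xs = [math.log(x) for x in xs]
    slopes.append(ols_slope(xs, ys))

print(f"{len(slopes)} specifications; "
      f"slopes range from {min(slopes):.4f} to {max(slopes):.4f}")
```

If the sign of the estimate is stable across all specifications, the finding is robust in the multiverse sense; the paper's further point is that robustness alone does not guarantee practical relevance when the effect sizes are uniformly tiny.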
Similar Papers
Single-Dataset Meta-Analysis For Many-Analysts And Multiverse Studies
Methodology
Shows how different analytic choices change results.
Synergy, not size: How collaboration architecture shapes scientific disruption
Digital Libraries
Finds best ways for scientists to work together.
Exploring the Garden of Forking Paths in Empirical Software Engineering Research: A Multiverse Analysis
Software Engineering
Finds many ways to analyze data, making results more trustworthy.