Replications in Context: A Framework for Evaluating New Methods in Quantitative Political Science (with Jeff Harden and Anand Sokhey). Political Analysis, Vol. 27 No. 1, 2019: 119-125.
How should quantitative political scientists evaluate the practical utility of new methods? Political methodologists often justify a new method’s value, in part, by reporting results from a small number of replication studies for which the new method yields divergent substantive conclusions compared to the original research. We contend that this approach encourages replication selection bias and establishes inadequate evaluation criteria, which may lead to overconfidence in the new method. We propose an alternative evaluation framework that involves preregistering a replication plan, collecting a representative sample of replication studies, and formally assessing the average difference between existing and new methods from a substantive perspective. We employ this framework in evaluating several examples of newly introduced methods. We conclude that our approach sets a more rigorous standard for assessing whether a new method is useful to applied researchers and complements the discipline’s rising norms of transparency and easy access to replication data.
Read the paper here.
Work in progress:
Influence in State Legislatures
Political Institutions & Legislative Cue-Taking Networks