META-REP: A Meta-scientific Programme to Analyse and Optimise Replicability in the Behavioural, Social, and Cognitive Sciences (SPP 2317)

Towards a Falsifiable Framework for Theories in Psychological Science

Building falsifiable theories is particularly important for unravelling the scientific underpinnings of neurodevelopmental disorders such as developmental dyslexia, because it establishes a testable, logical skeleton for hypothesis testing (Popper, 2005; Wacker, 1998). Yet several factors have impeded the construction of falsifiable theories in psychological science. These include, for example, ill-defined terminology and indeterminate operational indicators of psychological constructs (e.g., Meehl, 1978), psychological tests of dubious validity (e.g., Eronen & Bringmann, 2021), flexibility in data analysis (Simmons et al., 2011), and publication bias (e.g., Francis, 2013). These factors may be associated with the replication crisis in psychological science, manifested in conflicting evidence and small effect sizes across empirical studies (Open Science Collaboration, 2015). To date, however, the relative contribution of these theoretical, methodological, and systemic factors to replicability is unknown. Before sound theories can be built in psychological science, it is fundamental to understand and assess the relationship between these factors and replicability.

In light of the above, this meta-science project aims to evaluate the effects of theory underspecification, questionable measurement, and biased research practices on replicability in psychological science. Taking developmental dyslexia research as a case study, the project quantifies the extent to which (1) a theory might be underspecified, (2) psychological measurements might be invalid or unreliable, and (3) research designs and statistical models might be maladapted and reported with bias, and (4) how these factors together predict effect-size variability and replicability. The project comprises three work packages (WP1, WP2, and WP3): WP1 develops methods to quantify theory specificity, in particular the specificity with which key terminology is defined and hypotheses are generated; WP2 develops ways to quantify the methodological strength of the empirical studies underpinning each dyslexia theory, as indicated by the psychometric properties of the cognitive tasks, the reproducibility and robustness of the studies, and publication bias; WP3 explores the relative importance of the factors examined above, including theory underspecification, poor measurement, and publication bias, as correlates of low replicability. The resulting framework for quantifying and weighting potential factors associated with low replicability, especially the importance of theory specification, is expected to generalize to other subfields of psychology.

For more details, our project proposal is available at https://osf.io/b9nmr/.

References:

  • Eronen, M. I., & Bringmann, L. F. (2021). The Theory Crisis in Psychology: How to Move Forward. Perspectives on Psychological Science, 16(4), 779–788. https://doi.org/10.1177/1745691620970586
  • Francis, G. (2013). Replication, statistical consistency, and publication bias. Journal of Mathematical Psychology, 57(5), 153–169. https://doi.org/10.1016/j.jmp.2013.02.003
  • Meehl, P. E. (1978). Theoretical risks and tabular asterisks: Sir Karl, Sir Ronald, and the slow progress of soft psychology. Journal of Consulting and Clinical Psychology, 46(4), 806–834. https://doi.org/10.1037/0022-006X.46.4.806
  • Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251), aac4716. https://doi.org/10.1126/science.aac4716
  • Popper, K. (2005). The Logic of Scientific Discovery. Routledge.
  • Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-Positive Psychology: Undisclosed Flexibility in Data Collection and Analysis Allows Presenting Anything as Significant. Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632
  • Wacker, J. G. (1998). A definition of theory: Research guidelines for different theory-building research methods in operations management. Journal of Operations Management, 16(4), 361–385. https://doi.org/10.1016/S0272-6963(98)00019-9
