META-REP: A Meta-scientific Programme to Analyse and Optimise Replicability in the Behavioural, Social, and Cognitive Sciences (SPP 2317)

Results & Publications

[Open Access] [Open Materials] Kohrt, F., Smaldino, P. E., McElreath, R., & Schönbrodt, F. D. (2022, October 6). Replication of the natural selection of bad science. https://doi.org/10.31222/osf.io/sjyp3

[Open Access] [Open Data] [Preregistration] [Open Materials] Breznau, N., Rinke, E., Wuttke, A., Nguyen, H. H. V., Adem, M., Adriaans, J., … Żółtak, T. (2022). How Many Replicators Does It Take to Achieve Reliability? Investigating Researcher Variability in a Crowdsourced Replication. https://doi.org/10.31235/osf.io/j7qta

[Open Access] [Preregistration] Trübutschek, D.*, Yang, Y.*, Gianelli, C.*, Cesnaite, E., Fischer, N. L., Vinding, M. C., Marshall, T., Algermissen, J., Pascarella, A., Puolivali, T., Vitale, A., Busch, N. A.**, Nilsonne, G.** (2022). EEGManyPipelines: A Large-scale, Grassroots Multi-analyst Study of EEG Analysis Practices in the Wild. https://osf.io/preprints/metaarxiv/jq342
* shared first authorship
** shared senior authorship

Gollwitzer, M., & Schwabe, J. (2022). Context dependency as a predictor of replicability. Review of General Psychology, 26(2), 241-249. https://doi.org/10.1177/10892680211015635

Ankel-Peters, J., Vance, C., & Bensch, G. (2022). Spotlight on researcher decisions – Infrastructure evaluation, instrumental variables, and first-stage specification screening. Ruhr Economic Papers, 22(991). https://www.rwi-essen.de/fileadmin/user_upload/RWI/Publikationen/Ruhr_Economic_Papers/REP_22_991.pdf

[Open Access] [Open Materials] Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023). Is Economics Self-Correcting? Replications in the American Economic Review. Ruhr Economic Papers, 23(1005). https://www.rwi-essen.de/fileadmin/user_upload/RWI/Publikationen/Ruhr_Economic_Papers/REP_23_1005.pdf

[Open Access] Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023). Do Economists Replicate? I4R Discussion Paper Series, No. 13. Institute for Replication (I4R). https://www.econstor.eu/handle/10419/267931

[Open Access] [Open Data] [Preregistration] [Open Materials] Krähmer, D., Schächtele, L., & Schneck, A. (2023). Care to share? Experimental evidence on code sharing behavior in the social sciences. PLOS ONE, 18(8), e0289380. https://journals.plos.org/plosone/article?id=10.1371/journal.pone.0289380

Bißantz, S. M., Frick, S., Melinscak, F., Iliescu, D., & Wetzel, E. (2024). The Potential of Machine Learning Methods in Psychological Assessment and Test Construction. European Journal of Psychological Assessment, Advance online publication. https://doi.org/10.1027/1015-5759/a000817

Schneck, A. (2023). Are most published research findings false? Trends in statistical power, publication selection bias, and the false discovery rate in psychology (1975–2017). PLOS ONE, 18(10), e0292717. https://doi.org/10.1371/journal.pone.0292717


Legend on research transparency

In the META-REP programme, we advocate for open, transparent, and reproducible research. To promote these goals, we indicate Open Access, Open Data, Preregistration, and Open Materials for each publication. Clicking one of the labels in the publication list above takes you directly to the corresponding resource.

[Open Access] [Open Data] [Preregistration] [Open Materials]