Results & Publications
Kohrt, F., Smaldino, P. E., McElreath, R., & Schönbrodt, F. (2023). Replication of the natural selection of bad science. Royal Society Open Science, 10(2), 221306. https://doi.org/10.1098/rsos.221306
Breznau, N., Rinke, E., Wuttke, A., Nguyen, H. H. V., Adem, M., Adriaans, J., … Żółtak, T. (2022). How Many Replicators Does It Take to Achieve Reliability? Investigating Researcher Variability in a Crowdsourced Replication. https://doi.org/10.31235/osf.io/j7qta
Trübutschek, D.*, Yang, Y.*, Gianelli, C.*, Cesnaite, E., Fischer, N. L., Vinding, M. C., Marshall, T., Algermissen, J., Pascarella, A., Puolivali, T., Vitale, A., Busch, N. A.**, Nilsonne, G.** (2024). EEGManyPipelines: A Large-scale, Grassroots Multi-analyst Study of Electroencephalography Analysis Practices in the Wild. Journal of Cognitive Neuroscience, 36(2), 217-224. https://doi.org/10.1162/jocn_a_02087
* shared first authorship
** shared senior authorship
Gollwitzer, M., & Schwabe, J. (2022). Context dependency as a predictor of replicability. Review of General Psychology, 26(2), 241-249. https://doi.org/10.1177/10892680211015635
Ankel-Peters, J., Vance, C., & Bensch, G. (2022). Spotlight on researcher decisions – Infrastructure evaluation, instrumental variables, and first-stage specification screening. Ruhr Economic Papers, No. 991. https://www.rwi-essen.de/fileadmin/user_upload/RWI/Publikationen/Ruhr_Economic_Papers/REP_22_991.pdf
Paul, K., Short, C. A., Beauducel, A., Carsten, H. P., Härpfer, K., Hennig, J., ... & Wacker, J. (2022). The methodology and dataset of the CoScience EEG-personality project – A large-scale, multi-laboratory project grounded in cooperative forking paths analysis. Personality Science, 3(1), e7177. https://doi.org/10.5964/ps.7177
Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023a). Do Economists Replicate? Journal of Economic Behavior & Organization, 212, 219-232. https://doi.org/10.1016/j.jebo.2023.05.009
Ankel-Peters, J., Fiala, N., & Neubauer, F. (2023b). Is Economics Self-correcting? Replications in the American Economic Review. Economic Inquiry. https://doi.org/10.1111/ecin.13222
Krähmer, D., Schächtele, L., & Schneck, A. (2023). Care to share? Experimental evidence on code sharing behavior in the social sciences. PLOS ONE, 18(8), e0289380. https://doi.org/10.1371/journal.pone.0289380
Schneck, A. (2023). Are most published research findings false? Trends in statistical power, publication selection bias, and the false discovery rate in psychology (1975–2017). PLOS ONE, 18(10), e0292717. https://doi.org/10.1371/journal.pone.0292717
Kristanto, D., Gießing, C., Marek, M., Zhou, C., Debener, S., Thiel, C. M., & Hildebrandt, A. (2023). An Extended Active Learning Approach to Multiverse Analysis: Predictions of Latent Variables from Graph Theory Measures of the Human Connectome and Their Direct Replication. Brainiacs Journal of Brain Imaging and Computing Sciences, 4, Edoc J962E0F53. https://doi.org/10.48085/J962E0F53
Kristanto, D., Hildebrandt, A., Sommer, W., Zhou, C. (2023). Cognitive abilities are associated with specific conjunctions of structural and functional neural subnetworks. NeuroImage. https://doi.org/10.1016/j.neuroimage.2023.120304
Gießing, C. (2023). Identifying Reproducible Biomarkers of Autism Based on Functional Brain Connectivity. Biological Psychiatry, 94, 2-3. https://doi.org/10.1016/j.biopsych.2023.04.021
Nebe, S., Reutter, M., Baker, D.H., Bölte, J., Domes, G., Gamer, M., Gärtner, A., Gießing, C., Gurr, C., Hilger, K., Jawinski, P., Kulke, L., Lischke, A., Markett, S., Meier, M., Merz, C.J., Popov, T., Puhlmann, L.M.C., Quintana, D.S., Schäfer, T., Schubert, A.L., Sperl, M.F.J., Vehlen, A., Lonsdorf, T.B., & Feld, G.B. (2023). Enhancing precision in human neuroscience. eLife, 12. https://doi.org/10.7554/eLife.85980
Burkhardt, M., & Gießing, C. (2024). A dynamic functional connectivity toolbox for multiverse analysis. bioRxiv. https://doi.org/10.1101/2024.01.21.576546
Rausch, M., Hellmann, S., & Zehetleitner, M. (2023). Measures of metacognitive efficiency across cognitive models of decision confidence. Psychological Methods. Advance online publication. https://doi.org/10.1037/met0000634
Leung, A. Y., & Schmalz, X. (2023). In search of proxy measures of heterogeneity in conceptual definitions: A cognitive linguistic perspective. Proceedings of the Annual Meeting of the Cognitive Science Society, 45. https://escholarship.org/uc/item/2c80j0rx
Schmalz, X., Biurrun Manresa, J., & Zhang, L. (2021, November 15). What Is a Bayes Factor? Psychological Methods. Advance online publication. https://doi.org/10.1037/met0000421
Jacobsen, N. S. J., Kristanto, D., Welp, S., Inceler, Y. C., & Debener, S. (2024). Preprocessing Choices for P3 Analyses with Mobile EEG: A Systematic Literature Review and Interactive Exploration. https://doi.org/10.1101/2024.04.30.591874
Bißantz, S., Frick, S., Melinscak, F., Iliescu, D., & Wetzel, E. (2024). The potential of machine learning methods in psychological assessment and test construction. European Journal of Psychological Assessment, 40(1), 1–4. https://doi.org/10.1027/1015-5759/a000817
Knöpfle, P., & Schatto-Eckrodt, T. (2024). The Challenges of Replicating Volatile Platform-Data Studies: Replicating Schatto-Eckrodt et al. (2020). Media and Communication, 12. https://doi.org/10.17645/mac.7789
Knöpfle, P., Haim, M., & Breuer, J. (2024). Ethics in Computational Communication Science: Between values and perspectives. https://www.ssoar.info/ssoar/handle/document/91769
Beinhauer, L.J., Fünderich, J.H., Renkewitz, F. (2024). Erroneous Generalization - Exploring Random Error Variance in Reliability Generalizations of Psychological Measurements. https://doi.org/10.31234/osf.io/ud9rb
Fünderich, J. H., Beinhauer, L. J., & Renkewitz, F. (2024). Whoever has will be given more? How to use the intercept-slope correlation in improving our understanding of replicability, heterogeneity and theory development. https://doi.org/10.31234/osf.io/s82zr
Frank, M. & Heene, M. (2024). The effect of suboptimal model choice - Ordinal modeling as a way to better understand effect size heterogeneity? https://doi.org/10.31234/osf.io/txnpg
Brodeur, A., Esterling, K., Ankel-Peters, J., Dreber, A., Johannesson, M., Miguel, E., Green, D., et al. (2024). Promoting Reproducibility and Replicability in Political Science. Research & Politics. https://doi.org/10.1177/20531680241233439
Breznau, N., & Nguyen, H. H. V. (2024). Enter The Theory Multiverse: Economizing Theory Development Through Meta-Analysis of Theories-as-Data. https://doi.org/10.31235/osf.io/4dbau
Glöckner, A., Jekel, M., & Lisovoj, D. (in press). Using machine learning to evaluate and enhance models of probabilistic inference. Decision.
Rausch, M., & Hellmann, S. (2024). statConfR: An R Package for Static Models of Decision Confidence and Metacognition. PsyArXiv. https://osf.io/dk6mr/
Röseler, L., Kaiser, L., Doetsch, C. A., Klett, N., Seida, C., Schütz, A., … Rausch, M., … Zhang, Y. (2024). The Replication Database: Documenting the Replicability of Psychological Science. MetaArXiv. https://osf.io/preprints/metaarxiv/me2ub
Knöpfle, P., Haim, M., & Breuer, J. (2024). Key topic or bare necessity? How Research Ethics are Addressed and Discussed in Computational Communication Science. Publizistik. https://doi.org/10.1007/s11616-024-00846-7
Leung, A. Y., Melev, I., & Schmalz, X. (2024). Quantifying Concept Definition Heterogeneity in Academic Texts: Insights into Variability in Conceptualization. https://osf.io/preprints/osf/gu7b5
Fünderich, J. H., Beinhauer, L. J., & Renkewitz, F. (2024). Reduce, reuse, recycle: Introducing MetaPipeX, a framework for analyses of multi-lab data. Research Synthesis Methods, jrsm.1733. https://doi.org/10.1002/jrsm.1733
Breuer, J., & Haim, M. (2024). Are We Replicating Yet? Reproduction and Replication in Communication Research. Media and Communication, 12. https://doi.org/10.17645/mac.8382
Kristanto, D., Burkhardt, M., Thiel, C. M., Debener, S., Gießing, C., & Hildebrandt, A. (2024). The multiverse of data preprocessing and analysis in graph-based fMRI: A systematic literature review of analytical choices fed into a decision support tool for informed analysis. Neuroscience & Biobehavioral Reviews, 105846. https://doi.org/10.1016/j.neubiorev.2024.105846
Special Issues and Topical Sections
Breuer, J., & Haim, M. (Eds.) (2024). Special issue on “Reproducibility and Replicability in Communication Research” in Media and Communication. https://doi.org/10.17645/mac.i429
Legend on research transparency
In the META-REP program, we advocate for open, transparent, and reproducible research. To promote these goals, we flag Open Access, Open Data, Preregistration, and Open Materials for our publications. Clicking on one of the icons in the publication list above will take you directly to the corresponding resource.
Applications
MetaPipeX
The MetaPipeX Shiny App was developed to harmonize, analyze, visualize, and document multi-lab data. The web implementation of the app on servers of the Leibniz Supercomputing Centre (LRZ) lets you explore the MetaPipeX framework without installing any software or R packages. Simulated data are provided, so you can try out its functionality immediately; a basic understanding of meta-analyses and multi-lab replications is required. For an introduction to the MetaPipeX framework, please consult our publication: https://onlinelibrary.wiley.com/doi/10.1002/jrsm.1733
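The sketch below is illustrative only and does not use the MetaPipeX API: it simulates per-lab effect sizes and pools them with a DerSimonian-Laird random-effects model in plain Python/NumPy, i.e. the kind of harmonized summary the framework computes and documents for multi-lab data. All values (number of labs, effect size, heterogeneity) are invented for the example.

```python
# Minimal sketch (not the MetaPipeX API): pool simulated per-lab effect sizes
# with a DerSimonian-Laird random-effects meta-analysis.
import numpy as np

rng = np.random.default_rng(42)

# Simulated multi-lab data: each lab contributes an effect size (Cohen's d)
# and its approximate sampling variance (equal group sizes assumed).
k = 12                                        # number of labs
true_d, tau = 0.3, 0.15                       # mean effect and heterogeneity (SD)
n_per_lab = rng.integers(40, 200, size=k)     # per-lab sample sizes
v_i = 4 / n_per_lab + true_d**2 / (2 * n_per_lab)
d_i = rng.normal(true_d, np.sqrt(tau**2 + v_i))

# Fixed-effect weights and Q statistic
w_fe = 1 / v_i
d_fe = np.sum(w_fe * d_i) / np.sum(w_fe)
Q = np.sum(w_fe * (d_i - d_fe) ** 2)

# DerSimonian-Laird estimate of between-lab heterogeneity tau^2
c = np.sum(w_fe) - np.sum(w_fe**2) / np.sum(w_fe)
tau2_hat = max(0.0, (Q - (k - 1)) / c)

# Random-effects pooled estimate and standard error
w_re = 1 / (v_i + tau2_hat)
d_re = np.sum(w_re * d_i) / np.sum(w_re)
se_re = np.sqrt(1 / np.sum(w_re))

print(f"pooled d = {d_re:.3f} (SE {se_re:.3f}), tau^2 = {tau2_hat:.3f}")
```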
g-fMRI-METEOR
The g-fMRI-METEOR app lets users navigate a knowledge space of data processing and analysis decisions for parameterizing graph theory metrics that describe the functional human connectome. The app is described in Kristanto et al. (2024) and is available at https://www.apps.meta-rep.lmu.de/METEOR/. The code and data can be accessed via https://github.com/kristantodan12/fMRI_Multiverse/
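As an illustration of the kind of decision the app's knowledge space maps out (this is not the app's own code), the sketch below thresholds a simulated functional connectivity matrix at several edge densities and computes two graph metrics with NetworkX; the number of regions and the densities are arbitrary example values.

```python
# Illustrative forking path in graph-based fMRI: vary the proportional
# threshold applied to a connectivity matrix and recompute graph metrics.
import numpy as np
import networkx as nx

rng = np.random.default_rng(0)

# Toy functional connectivity matrix: correlations between 30 simulated regions
n_regions = 30
ts = rng.normal(size=(200, n_regions))            # simulated regional time series
fc = np.corrcoef(ts, rowvar=False)
np.fill_diagonal(fc, 0.0)
upper = np.triu_indices(n_regions, k=1)

# Decision node: proportional threshold (keep only the strongest X% of edges)
for density in (0.05, 0.10, 0.20):
    n_edges = int(density * n_regions * (n_regions - 1) / 2)
    cutoff = np.sort(fc[upper])[::-1][n_edges - 1]
    adj = np.triu(fc >= cutoff, k=1).astype(int)  # binarized upper triangle
    G = nx.from_numpy_array(adj)
    print(f"density {density:.2f}: "
          f"global efficiency = {nx.global_efficiency(G):.3f}, "
          f"mean clustering = {nx.average_clustering(G):.3f}")
```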
mEEG-METEOR
The mEEG-METEOR app lets users navigate a knowledge space of data processing decisions for parameterizing P300 event-related potentials in studies collected with mobile EEG. The app is described in Jacobsen et al. (2024) and is available at https://meteor-eeg-oldenburg.shinyapps.io/eeg_multiverse/
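The following toy example (not the app's code) shows why such a knowledge space matters: on a simulated single-channel ERP, the estimated P3 mean amplitude shifts depending on the baseline and measurement windows chosen. All signal parameters are invented for illustration.

```python
# Tiny preprocessing multiverse for P3 amplitude on a simulated ERP epoch.
import numpy as np

fs = 250                                   # sampling rate (Hz)
t = np.arange(-0.2, 0.8, 1 / fs)           # epoch from -200 to 800 ms

rng = np.random.default_rng(1)
p3 = 5.0 * np.exp(-((t - 0.35) ** 2) / (2 * 0.05**2))   # P3-like component (uV)
drift = 2.0 * t                            # slow drift, e.g. movement artefact
erp = p3 + drift + rng.normal(0, 0.5, size=t.size)      # noisy single-channel ERP

baselines = {"-200..0 ms": (-0.2, 0.0), "-100..0 ms": (-0.1, 0.0)}
windows = {"250-450 ms": (0.25, 0.45), "300-500 ms": (0.30, 0.50)}

for b_name, (b0, b1) in baselines.items():
    corrected = erp - erp[(t >= b0) & (t < b1)].mean()   # baseline correction
    for w_name, (w0, w1) in windows.items():
        amp = corrected[(t >= w0) & (t < w1)].mean()     # mean-amplitude measure
        print(f"baseline {b_name}, window {w_name}: mean P3 = {amp:.2f} uV")
```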
Code & Implementations
COMET Toolbox
The COMET Toolbox, described in Burkhardt and Gießing (2024), is available at https://github.com/mibur1/dfc-multiverse. The toolbox currently supports dynamic functional connectivity (dFC) analysis, graph analysis, and multiverse analysis of dFC. It is modular: individual components can be used in combination with others or on their own.
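For readers unfamiliar with dFC, the generic sketch below computes sliding-window correlations between simulated regional time series, which is the basic kind of dFC estimate such a multiverse can vary. It is not the COMET API; the window length and step size are arbitrary example choices.

```python
# Generic sliding-window dynamic functional connectivity (dFC) sketch.
import numpy as np

rng = np.random.default_rng(7)
n_time, n_regions = 600, 10
ts = rng.normal(size=(n_time, n_regions))      # simulated regional time series

window, step = 60, 10                          # samples per window, step size
dfc = []
for start in range(0, n_time - window + 1, step):
    seg = ts[start:start + window]
    dfc.append(np.corrcoef(seg, rowvar=False)) # connectivity within this window
dfc = np.stack(dfc)                            # (n_windows, n_regions, n_regions)

# Simple summary: temporal variability of each connection across windows
variability = dfc.std(axis=0)
print(dfc.shape, variability[np.triu_indices(n_regions, k=1)].mean())
```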
Extended Active Learning (AL) approach to Multiverse Analysis
The code for an Extended Active Learning (AL) approach to Multiverse Analysis, described in Kristanto et al. (2023), is available at https://github.com/kristantodan12/ExtendedAL. This extends the AL approach proposed by Dafflon et al. (2022) as an alternative to exhaustively exploring all forking paths in the multiverse. The approach quantifies the similarity of preprocessing and analysis pipelines, embeds this similarity in a low-dimensional space, and uses an active-learning algorithm based on Bayesian optimization and Gaussian processes to approximate an exhaustive multiverse analysis. Dafflon et al. (2022) evaluated this approach and provided open code for predicting an observed outcome variable. In computational psychiatry and neurocognitive psychology, however, latent traits are conceptualized as common causes of a variety of observable behavioral symptoms, so the prediction of dimensional latent traits is often of interest. We therefore extended the pipeline to predict a latent outcome variable. A further advancement was to implement the pipeline-similarity estimates not only for region-specific graph metrics (as in Dafflon et al., 2022), but also for brain-wide graph metrics characterizing the functional human connectome.
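The schematic sketch below mirrors the loop described above without reproducing the released code: candidate pipelines are points in a (here hypothetical, randomly generated) low-dimensional embedding, a Gaussian process is fit to the outcomes of the pipelines evaluated so far, and the next pipeline is chosen with an upper-confidence-bound acquisition rule. The toy objective function stands in for an expensive multiverse evaluation such as predicting a latent trait.

```python
# Schematic active-learning loop over an embedded pipeline space (illustration
# only, not the ExtendedAL code): fit a Gaussian process to evaluated pipelines
# and pick the next one by an upper-confidence-bound acquisition rule.
import numpy as np
from sklearn.gaussian_process import GaussianProcessRegressor
from sklearn.gaussian_process.kernels import RBF

rng = np.random.default_rng(3)

# Hypothetical 2-D embedding of 200 candidate pipelines and a toy outcome
# (e.g., prediction accuracy) that we pretend is expensive to evaluate.
embedding = rng.uniform(-1, 1, size=(200, 2))

def evaluate_pipeline(x):
    return np.exp(-4 * np.sum((x - 0.3) ** 2)) + rng.normal(0, 0.02)

evaluated = list(rng.choice(len(embedding), size=5, replace=False))
outcomes = [evaluate_pipeline(embedding[i]) for i in evaluated]

for _ in range(20):                            # active-learning iterations
    gp = GaussianProcessRegressor(kernel=RBF(length_scale=0.5), alpha=1e-3)
    gp.fit(embedding[evaluated], outcomes)
    mu, sd = gp.predict(embedding, return_std=True)
    ucb = mu + 1.5 * sd                        # mean + exploration bonus
    ucb[evaluated] = -np.inf                   # do not re-evaluate pipelines
    nxt = int(np.argmax(ucb))
    evaluated.append(nxt)
    outcomes.append(evaluate_pipeline(embedding[nxt]))

best = int(np.argmax(outcomes))
print("best pipeline index:", evaluated[best], "outcome:", round(outcomes[best], 3))
```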