  • rtf
Same data, different conclusions: Radical dispersion in empirical results when independent analysts operationalize and test the same hypothesis
Number of Authors: 179
2021 (English)
In: Organizational Behavior and Human Decision Processes, ISSN 0749-5978, E-ISSN 1095-9920, Vol. 165, p. 228-249
Article in journal (Refereed) Published
Abstract [en]

In this crowdsourced initiative, independent analysts used the same dataset to test two hypotheses regarding the effects of scientists' gender and professional status on verbosity during group meetings. Not only the analytic approach but also the operationalizations of key variables were left unconstrained and up to individual analysts. For instance, analysts could choose to operationalize status as job title, institutional ranking, citation counts, or some combination. To maximize transparency regarding the process by which analytic choices are made, the analysts used a platform we developed called DataExplained to justify both preferred and rejected analytic paths in real time. Analyses lacking sufficient detail or reproducible code, or containing statistical errors, were excluded, resulting in 29 analyses in the final sample. Researchers reported radically different analyses and dispersed empirical outcomes, in a number of cases obtaining significant effects in opposite directions for the same research question. A Boba multiverse analysis demonstrates that decisions about how to operationalize variables explain variability in outcomes above and beyond statistical choices (e.g., covariates). Subjective researcher decisions play a critical role in driving the reported empirical results, underscoring the need for open data, systematic robustness checks, and transparency regarding both analytic paths taken and not taken. Implications for organizations and leaders, whose decision making relies in part on scientific findings, consulting reports, and internal analyses by data scientists, are discussed.
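The abstract's central methodological point, that how a variable is operationalized can drive results as much as downstream statistical choices, can be conveyed with a small multiverse-style sketch. The Python example below is not the authors' actual Boba specification: the synthetic data, the column names (job_title_rank, citation_count, words_spoken), and the decision points are hypothetical stand-ins chosen only to show how crossing operationalizations with covariate sets yields one effect estimate per "universe".

```python
# Minimal illustrative sketch of a multiverse-style analysis (assumed
# setup, not the paper's Boba specification). We cross hypothetical
# operationalizations of "status" with hypothetical covariate sets and
# collect the effect estimate from each resulting model.
import itertools

import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(0)
n = 500

# Synthetic stand-in data; all column names are assumptions.
df = pd.DataFrame({
    "job_title_rank": rng.integers(1, 6, n),   # one way to measure status
    "citation_count": rng.poisson(30, n),      # another way
    "gender": rng.integers(0, 2, n),
    "meeting_length": rng.uniform(10, 90, n),
})
df["words_spoken"] = (
    5 * df["job_title_rank"] + 0.5 * df["citation_count"]
    + 2 * df["meeting_length"] + rng.normal(0, 40, n)
)

# Decision points of the multiverse: operationalization x covariates.
status_measures = ["job_title_rank", "citation_count"]
covariate_sets = ["", " + gender", " + gender + meeting_length"]

results = []
for status, covs in itertools.product(status_measures, covariate_sets):
    model = smf.ols(f"words_spoken ~ {status}{covs}", data=df).fit()
    results.append({
        "status_measure": status,
        "covariates": covs.strip(" +") or "none",
        "beta": model.params[status],
        "p": model.pvalues[status],
    })

# One row per "universe"; dispersion across such rows is what a
# multiverse analysis decomposes into operationalization vs. model choices.
print(pd.DataFrame(results))
```

A Boba specification expresses the same cross of decision points declaratively and generates the individual analysis scripts; the brute-force loop above is meant only to convey the structure of the analysis space.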

Place, publisher, year, edition, pages
2021. Vol. 165, p. 228-249
Keywords [en]
crowdsourcing data analysis, scientific transparency, research reliability, scientific robustness, researcher degrees of freedom, analysis-contingent results
National Category
Psychology
Research subject
Psychology
Identifiers
URN: urn:nbn:se:su:diva-197352
DOI: 10.1016/j.obhdp.2021.02.003
ISI: 000674429500016
OAI: oai:DiVA.org:su-197352
DiVA, id: diva2:1599662
Available from: 2021-10-01
Created: 2021-10-01
Last updated: 2022-03-07
Bibliographically approved

Open Access in DiVA

No full text in DiVA

Other links

Publisher's full text

Authority records

Schweinsberg, Martin
Albers, Casper
McNamara, Amelia A.
Nilsonne, Gustav
Skarzynski, Martin
Stoffel, Martin A.

Search in DiVA

By author/editor
Schweinsberg, Martin
Otner, Sarah M. G.
Albers, Casper
McNamara, Amelia A.
Nilsonne, Gustav
Skarzynski, Martin
Stoffel, Martin A.
By organisation
Stress Research Institute
In the same journal
Organizational Behavior and Human Decision Processes
Psychology
