Evaluating Implementation of the Transparency and Openness Promotion Guidelines: Reliability of Instruments to Assess Journal Policies, Procedures, and Practices
Issued Date
2023-01-01
Resource Type
ISSN
2515-2459
eISSN
2515-2467
Scopus ID
2-s2.0-85152087550
Journal Title
Advances in Methods and Practices in Psychological Science
Volume
6
Issue
1
Rights Holder(s)
SCOPUS
Bibliographic Citation
Advances in Methods and Practices in Psychological Science Vol.6 No.1 (2023)
Suggested Citation
Kianersi S., Grant S.P., Naaman K., Henschel B., Mellor D., Apte S., Deyoe J.E., Eze P., Huo C., Lavender B.L., Taschanchai N., Zhang X., Mayo-Wilson E. Evaluating Implementation of the Transparency and Openness Promotion Guidelines: Reliability of Instruments to Assess Journal Policies, Procedures, and Practices. Advances in Methods and Practices in Psychological Science Vol.6 No.1 (2023). doi:10.1177/25152459221149735 Retrieved from: https://repository.li.mahidol.ac.th/handle/20.500.14594/82264
Title
Evaluating Implementation of the Transparency and Openness Promotion Guidelines: Reliability of Instruments to Assess Journal Policies, Procedures, and Practices
Author's Affiliation
Indiana University School of Education
The University of North Carolina at Chapel Hill
Indiana University-Purdue University Indianapolis
Indiana University Bloomington
Brigham and Women's Hospital
Faculty of Medicine Ramathibodi Hospital, Mahidol University
University of Oregon
Indiana University School of Informatics and Computing
Center for Open Science
Other Contributor(s)
Abstract
The Transparency and Openness Promotion (TOP) Guidelines describe modular standards that journals can adopt to promote open science. The TOP Factor quantifies the extent to which journals adopt TOP in their policies, but there is no validated instrument to assess TOP implementation. Moreover, raters might assess the same policies differently. Instruments with objective questions are needed to assess TOP implementation reliably. In this study, we examined the interrater reliability and agreement of three new instruments for assessing TOP implementation in journal policies (instructions to authors), procedures (manuscript-submission systems), and practices (journal articles). Independent raters used these instruments to assess 339 journals from the behavioral, social, and health sciences. We calculated interrater agreement (IRA) and interrater reliability (IRR) for each of 10 TOP standards and for each question in our instruments (13 policy questions, 26 procedure questions, 14 practice questions). IRA was high for each standard in TOP; however, IRA might have been high by chance because most standards were not implemented by most journals. No standard had “excellent” IRR. Three standards had “good,” one had “moderate,” and six had “poor” IRR. Likewise, IRA was high for most instrument questions, and IRR was moderate or worse for 62%, 54%, and 43% of policy, procedure, and practice questions, respectively. Although results might be explained by limitations in our process, instruments, and team, we are unaware of better methods for assessing TOP implementation. Clarifying distinctions among different levels of implementation for each TOP standard might improve its implementation and assessment (study protocol: https://doi.org/10.1186/s41073-021-00112-8).
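The abstract's key observation, that interrater agreement (IRA) can be high by chance when most journals do not implement a standard while interrater reliability (IRR) remains low, can be illustrated with a minimal sketch. The ratings below are invented for illustration, IRA is computed as simple percent agreement, and IRR as Cohen's kappa for two raters; the study itself may have used different statistics.

```python
# Sketch (hypothetical data): percent agreement vs. chance-corrected kappa.
from collections import Counter

def percent_agreement(r1, r2):
    """Interrater agreement: share of items rated identically."""
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    """Interrater reliability: agreement corrected for chance agreement."""
    n = len(r1)
    po = percent_agreement(r1, r2)          # observed agreement
    c1, c2 = Counter(r1), Counter(r2)
    categories = set(r1) | set(r2)
    pe = sum(c1[c] * c2[c] for c in categories) / (n * n)  # expected by chance
    return (po - pe) / (1 - pe)

# Two raters scoring 10 journals on one TOP standard (0 = not implemented).
# Most journals score 0, so raw agreement is high almost automatically.
rater1 = [0, 0, 0, 0, 0, 0, 0, 1, 2, 0]
rater2 = [0, 0, 0, 0, 0, 0, 0, 1, 0, 0]
print(round(percent_agreement(rater1, rater2), 2))  # high agreement
print(round(cohens_kappa(rater1, rater2), 2))       # noticeably lower kappa
```

Because nearly every journal receives the lowest score, two raters agree on most items even if they disagree whenever a standard is actually implemented, which is why kappa (which subtracts chance agreement) falls well below raw agreement.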