In this article, we provide a toolbox of recommendations and resources for those aspiring to promote the uptake of open scientific practices. Open Science encompasses a range of behaviours that aim to improve the transparency of scientific research. This paper is divided into seven sections, each devoted to a different group or institution in the research ecosystem: colleagues, students, departments and faculties, universities, academic libraries, journals, and funders. We describe the behavioural influences and incentives for each of these stakeholders, as well as changes they can make to foster Open Science. Our primary goal, however, is to suggest actions that researchers can take to promote these behaviours, inspired by simple principles of behaviour change: make it easy, social, and attractive. In isolation, a small shift in one person’s behaviour may appear to make little difference, but when combined, many shifts can radically alter shared norms and culture. We offer this toolbox to assist individuals and institutions in cultivating a more open research culture.
Many scientific disciplines are currently experiencing a “reproducibility crisis”. Psychology, economics, and medicine are just a few of the disciplines where many influential findings are failing to replicate (Camerer et al., 2018; Duvendack et al., 2017; Open Science Collaboration, 2015; Prinz et al., 2011). Questionable research and publication practices (QRPPs) are partly to blame for this crisis. For example, a norm of publishing positive results (Ferguson & Brannick, 2012; Rosenthal, 1979) incentivises researchers to be especially liberal when analysing their data (e.g., ‘p-hacking’; Simmons et al., 2011), and to generate hypotheses after the results of an experiment are known as if they were expected from the outset (‘HARKing’; Kerr, 1998). More than half of psychology researchers surveyed, for instance, reported peeking at a study’s results before deciding whether to collect more data, and more than one third claimed to have engaged in HARKing (John et al., 2012; but QRPPs are not limited to psychology, see Fraser et al., 2018; Gopalakrishna et al., 2021). QRPPs provide fertile ground for further irreproducibility and result in part from the culture and incentive structures in academia (Edwards & Roy, 2017; Munafò et al., 2017; Nosek et al., 2012; Smaldino & McElreath, 2016).
A movement of Open Science has emerged in response to these issues (see Vazire, 2018). The umbrella term ‘Open Science’ encompasses a range of practices that aim to increase the transparency and rigour of scientific research (Fecher & Friesike, 2014).1 Reforms such as preregistration, preprints, replication studies, and publicly sharing data can make research easier to evaluate, use, and reproduce (Corker, 2018; Spellman et al., 2018). However, many researchers are not embracing open practices: perhaps they are unaware of the benefits of these practices, perhaps they are dissuaded by a perception that more transparent science is too laborious, or perhaps they have not had the time or energy to integrate these practices into their current way of working (see Gagliardi et al., 2015). Academic incentive structures that reward publication volume and journal impact factors (rather than the quality or rigour of the research) may also undermine the adoption of Open Science practices.
The primary goal of this paper is to recommend some ways in which individual researchers can use principles of behaviour change to promote the uptake and maintenance of open practices in others. There are a few key behaviours that we believe are critical to Open Science: preregistration (and Registered Reports), preprints and open access publication, publicly sharing open and usable data and code, and conducting replication studies. However, the research ecosystem involves many stakeholder groups (both individual and institutional), and there are direct and indirect ways for researchers to encourage others to adopt open practices.
First, researchers can influence other individuals with whom they are in close contact, namely their colleagues and students. Researchers work with other academics and commonly serve as mentors and teachers to thousands of students—the next generation’s researchers and research consumers. In the first part of this paper, we outline actions that researchers can take to directly influence their colleagues and students to adopt open practices.
Of course, researchers and students are heavily influenced by top-down barriers and incentive structures. The policies and practices of institutions may be such that Open Science simply presents a barrier to one’s research goals. If researchers try to influence individuals directly, their actions may have little effect so long as institutions (departments and faculties, universities, libraries, journals, and funders) fail to change as well (Munafò, 2019). In the second part of this paper, we focus on how individuals can influence institutions. In each of the sections, we outline changes that those with institutional decision-making power can directly enact, but most importantly we also recommend ways in which the typical researcher, even one who does not hold a position of influence, can increase the likelihood that key decision-makers will enact change. For each section, we also provide a table summary with useful online resources.
Behaviour change
Whether or not individuals and institutions decide to adopt Open Science practices and policies is largely a behavioural question (Bishop, 2020; Norris & O’Connor, 2019). Insights from psychology and other behavioural sciences suggest that people routinely make decisions through automatic, impulsive, and emotional processes, and are often driven by social pressures and immediate cues in their environment (Kahneman, 2011; Tversky & Kahneman, 1974). Psychosocial factors influence the everyday decisions people make, whether choosing which product to buy at a supermarket or deciding how to conduct, report, evaluate, publish, or fund research.
Theories and findings from across the behavioural sciences can inform practically any situation where a human decision-maker is involved. People tend to behave in the way that is easiest or offers the least resistance. Sweden, for example, enjoys far higher rates of organ donation than Denmark not because Swedes are more compassionate or because of some cultural difference, but simply because Sweden requires people to opt out of donating their organs whereas people in Denmark must opt in (Johnson & Goldstein, 2003). Highlighting a social norm (the accepted standard of behaviour of most people in a group that one cares about) can also greatly influence how people act. If people discover that 90% of fellow group members (rather than 10%) put their rubbish in the bin, they are more likely to do the same (for similar examples, see Hallsworth, 2014; Hallsworth et al., 2016; Nolan et al., 2008).
There are at least two key frameworks for effective behaviour change (see Figure 1). We use these to ground our recommendations for researchers to promote Open Science. The first framework is the Pyramid of Culture Change (Nosek, 2019) and the second is EAST (Easy, Attractive, Social, and Timely; UK Behavioural Insights Team, 2014). Though the frameworks are somewhat distinct, the underlying principles are similar. Both the Pyramid of Culture Change and the EAST framework assert that people are more likely to engage in a desired behaviour if it is made easy. Both frameworks also highlight the power of social connection, commitments, and norms in influencing behaviour, and they both underscore the effectiveness of making desired behaviours attractive or rewarding. If researchers wish to influence individuals and institutions, they ought to focus on making behaviours easy, social, and attractive. Incremental shifts across the entirety of the research ecosystem are likely to greatly improve the way scientific research is conducted, evaluated, and disseminated. These principles of behaviour change provide a means to promote such shifts.
Individuals
Individual researchers ultimately determine how scientific studies are conducted; they design the experiments, gather the data, and write up the results. Here, we suggest several open practices that individuals can readily adopt—preregistration, preprints, open data and code, and replication—and then we recommend ways in which researchers can influence colleagues and students to adopt these practices using the principles of make it easy, make it social, and make it attractive. We first turn our attention to colleagues, and then to students.
Colleagues
A common aphorism in academia is ‘publish or perish’ (see Grimes et al., 2018); there is significant pressure to publish results that are both novel and positive, and to publish a lot (Nosek et al., 2012; Rosenthal, 1979). Although researchers seem well-intentioned and strive to produce good scientific research, they are also incentivised to make their work as palatable to journals as possible. This problem is exacerbated because one’s publications often play a role in funding or career decisions (Schimanski & Alperin, 2018).
One can imagine that the prevailing incentive structures might make it tempting for researchers to engage in QRPPs without even being aware of it. Scientists, like all humans, are prone to biases such as confirmation bias (favouring results that confirm one’s beliefs; Klayman, 1995) and hindsight bias (overestimating, after the fact, one’s ability to have predicted an outcome; Roese & Vohs, 2012). Encouraging practices that improve rigour and transparency in one’s research is therefore critical to cultivating a culture of Open Science (Nuzzo, 2015; Spellman et al., 2018).
Target Behaviours
Though there are now many Open Science reforms taking place (e.g., national Reproducibility Networks, Open Science Communities), and many different practices that can contribute to remedying the replication crisis, here we focus on a handful of key behaviours that researchers can integrate into their routine workflow.
Preregistration and Registered Reports. Preregistration is a commitment to one’s research planning decisions, including predictions, recruitment strategies, exclusion criteria, stopping rules, materials, procedures, study design, and analysis plans before one has knowledge of a study’s outcomes (DeHaven, 2017; Gelman & Loken, 2013; Nosek et al., 2018). This commitment means that researchers must demarcate between confirmatory and exploratory decisions, analyses, and findings. Although some have questioned the utility of preregistration (see Devezer et al., 2020; Szollosi et al., 2020), shifting decisions about theory and study design to earlier in the research process helps to ensure that unforeseeable issues with an experiment are addressed early on. Preregistration also constrains the effect of researcher biases on the results or interpretation of a study and allows future readers to process findings within the broader analytic context. The practice therefore serves as a partial antidote to QRPPs (see Fischhoff & Beyth, 1975; Munafò et al., 2017; Nosek et al., 2018; Wagenmakers et al., 2012; Wicherts et al., 2016).
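To make one component of such a plan concrete, consider the sketch below of how a sample-size justification and stopping rule might be written into a preregistration using R and the open-source pwr package. The effect size, error rates, and design here are purely illustrative assumptions, not recommendations.

# Sample-size justification for a hypothetical preregistered two-group study.
# Assumptions (illustrative only): smallest effect size of interest d = 0.5,
# alpha = .05, and 90% power.
library(pwr)  # install.packages("pwr") if not already installed

plan <- pwr.t.test(d = 0.5, sig.level = 0.05, power = 0.90,
                   type = "two.sample", alternative = "two.sided")
n_per_group <- ceiling(plan$n)  # 86 per condition under these assumptions

# Stopping rule: recruit exactly n_per_group participants per condition,
# then stop data collection; no interim analyses will be conducted.

Specifying the sampling plan as executable code has the added benefit that reviewers and readers can rerun it to verify the stated sample size.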
Registered Reports take preregistration to the next level. In a Registered Report, one submits a detailed plan of the research questions, hypotheses, methodology, and analyses to a scientific journal for review prior to collecting data (Chambers et al., 2014). Once a Registered Report is accepted, the journal agrees to publish the study if the quality control criteria are met, regardless of the results. A recent initiative, Peer Community In Registered Reports (PCI RR), also provides a community of scientists dedicated to reviewing and recommending Registered Reports. Once a Registered Report is positively evaluated, authors have the option to publish their work in traditional journals that have committed to accepting PCI RR recommendations without further peer review.
Preregistration and Registered Reports help to ensure that the scientific value of hypothesis testing is determined by the quality of research questions and methodology rather than the findings themselves. Importantly, preregistered studies reveal different patterns of results than traditional research practices, suggesting that they reduce false positive results and tend to outperform traditional research on several quality control criteria (Schäfer & Schwarz, 2019; Scheel et al., 2021; Soderberg et al., 2021).
Preprints and Open Access Publication. Expanding scientific knowledge rests on individuals having access to a broad range of research products so that anyone can build on prior work. The way people publish and disseminate research therefore influences how knowledge is created. Researchers can make their work Open Access (OA) either by making articles freely accessible at the point of publication (see the ‘Libraries’ section) or by self-archiving the published article or the author’s version of the accepted manuscript (Harnad et al., 2008). A preprint is a form of self-archiving where one posts (often publicly) a version of a scientific paper before it is formally peer-reviewed in a scholarly journal. Sharing preprints is becoming increasingly popular among researchers (Berg et al., 2016; Narock & Goldstein, 2019).
Peer review is an integral aspect of science, and some have raised concerns over the lack of quality control in non-peer-reviewed manuscripts such as preprints (see Bauchner, 2017; Maslove, 2018). Nonetheless, knowing that a preprint has not been peer-reviewed might incentivise readers to be more critical and sceptical of a study (Velterop, 2016). More eyes on a manuscript might also improve replicability because researchers can be notified about errors, alternative interpretations, or logical flaws before formal publication (Oakden-Rayner et al., 2018). Posting preprints of null or negative findings can also alleviate the ‘file-drawer’ problem by giving a platform to results that may not be palatable for formal publication (Verma, 2017). Some have also suggested that preprints be used as a platform for open peer review (Saderi & Greaves, 2021). A recent initiative (Peer Community In; see Table 1) provides such a platform, arranging peer review and requiring a recommendation before preprints are posted. Authors then have the option to submit their ‘accepted’ preprint to a traditional journal.
Table 1. Target behaviours, researcher actions, and general resources for promoting Open Science among colleagues.

Target behaviours
- Preregister studies.
- Submit Registered Reports.
- Upload preprints (and submit for peer community review).
- Make data and code openly available.
- Conduct replication studies.

Researcher actions
- Organise an Open Science seminar or workshop.
- Use presentations to signal your use of open practices.
- Share templates, guides, infographics, and papers during meetings or via email.
- Host regular ‘ReproducibiliTea’ meetings with colleagues.
- Pledge with others on ‘Free Our Knowledge’.
- Highlight to others the incentives for open practices.

General resources
- Preregistration servers: Open Science Framework (https://osf.io/), AsPredicted (https://aspredicted.org/), ClinicalTrials.gov (https://clinicaltrials.gov/)
- Open Science information: https://how-to-open.science/
- Preregistration guides and templates: https://cos.io/prereg/, https://osf.io/zab38/, van ’t Veer et al. (2016): https://doi.org/10.31234/osf.io/4frms
- Registered Report guide: https://cos.io/rr/
- Preprint servers: arXiv (https://arxiv.org/), PsyArXiv (https://psyarxiv.com/), bioRxiv (https://www.biorxiv.org/), OSF (https://osf.io/preprints/)
- Preprint guide: https://help.osf.io/hc/en-us/articles/360019930533-Upload-a-Preprint
- Example journals that accept preprints for submission: PLOS (https://plos.org/open-science/preprints/), eLife (https://elifesciences.org/articles/64910)
- Peer Community In (PCI): preprint review (https://peercommunityin.org/); Registered Reports (https://rr.peercommunityin.org/about)
- Data repositories: OSF (https://osf.io/), Figshare (https://figshare.com), Harvard Dataverse (https://dataverse.harvard.edu/), Zenodo (https://zenodo.org/)
- Data sharing guide: Soderberg (2018): https://doi.org/10.1177%2F2515245918757689
- FAIR data information: https://www.go-fair.org/fair-principles/
- Guide to replication: https://doi.org/10.17605/osf.io/jx2td
- ReproducibiliTea guide: https://reproducibilitea.org/about/
- Free Our Knowledge: https://freeourknowledge.org/
Open Data and Code. Publicly sharing data and analytic scripts can also counter irreproducibility. Most (70%) surveyed researchers reported that they are likely to reuse open datasets (Fane et al., 2019), yet a majority of surveyed researchers admitted to sharing data in fewer than 10% of their projects (Houtkoop et al., 2018). Sharing data and code enables others to verify the appropriateness of a study’s data and analyses, and to check for errors (Klein et al., 2018). Moreover, sharing data and code can make research more reproducible if researchers ensure that the data are findable, accessible, interoperable, and reusable (FAIR; see Wilkinson et al., 2016). There are now many data repositories that researchers can use to supplement their publications (see Table 1). When sharing certain data is not feasible or ethical, explaining why also aligns with transparent practice.
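To illustrate what this can look like in practice, the sketch below shows one way a small dataset might be prepared for deposit in a repository such as those listed in Table 1, using base R. The files, variable names, and values are hypothetical, and each repository has its own upload workflow.

# A minimal, hypothetical example of preparing a dataset for public sharing.
trials <- data.frame(
  participant_id = rep(1:3, each = 2),                     # anonymised identifiers
  condition      = rep(c("control", "treatment"), times = 3),
  response_ms    = c(512, 487, 530, 465, 498, 471)         # response time in ms
)

# Save in an open, non-proprietary format so the data remain accessible.
write.csv(trials, "trials.csv", row.names = FALSE)

# Ship a plain-text codebook so each variable is interpretable and reusable.
writeLines(c(
  "participant_id: anonymised participant number",
  "condition: control or treatment",
  "response_ms: response time in milliseconds"
), "codebook.txt")

# Record the software environment so the analysis can be rerun later.
writeLines(capture.output(sessionInfo()), "session_info.txt")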
Replication. Remedying irreproducibility relies on the capacity to assess whether a phenomenon or effect replicates; that is, whether the finding is stable and/or generalisable. If a finding is not replicable, it cannot advance scientific thinking; without replication studies, strong theories that map well onto statistical models are unlikely to develop (Fiedler, 2017; Szollosi et al., 2020). To replicate the work of others, researchers can follow the very same procedures that were used originally and see whether they attain similar findings (direct replication), or take a theory or effect and see whether it reoccurs in novel circumstances or populations (conceptual replication; Zwaan et al., 2018). Of course, the capacity to conduct replication studies hinges on open access to papers, data, and analyses, again highlighting the benefits of such practices.
Researcher Actions
How then can researchers make the above practices easy, attractive, and socially engaging for their colleagues? We summarise our ideas in Table 1 (but see Kowalczyk et al., 2020, for other useful tips and resources). A first step to making open practices easier is to provide educational opportunities to learn about what Open Science practices are (and are not). Researchers can organise Open Science seminars and workshops for colleagues. These do not have to be extensive or time-consuming; if there are existing journal clubs or regular colloquia, researchers can select articles about Open Science for the group to read, or invite an expert to give a presentation. Many barriers to entry and maintenance can also arise during the research process (Corker, 2018; Munafò et al., 2017; Nosek et al., 2015); thus, sharing practical tips, infographics, video links, templates, guides, and papers during these gatherings, via email, or on an online platform can reduce barriers to adopting and maintaining open behaviours.
Additionally, researchers can use conference talks, poster presentations, and lab meetings as opportunities to highlight instances where they have preregistered a study, made data publicly available, or posted a relevant preprint. Signalling these practices helps establish a subtle and persistent social norm. Such a norm can be further strengthened by creating a physical ‘Open Science Wall’ in the department, or by circulating a regular Open Science newsletter.
Hosting ‘ReproducibiliTea’ journal clubs (Orben, 2019) can also foster a social environment for researchers to discuss interesting Open Science ideas and practices. These events can offer opportunities for researchers to raise reservations, dispel misconceptions, and solve problems related to Open Science practices. Regular meetings will foster group cohesion and strengthen identification with the Open Science movement. Researchers can also make social commitments with one another to engage in certain practices; Free Our Knowledge (see https://freeourknowledge.org/) is a platform where individuals can commit to a behaviour (e.g., submitting a Registered Report) such that once the number of pledges reaches a critical mass, there is public pressure to follow through with one’s commitment. If people know that many others are committed to the same behaviour, they can trust that their actions will be part of a broader collective movement.
Meetings, presentations, social media, and discussion forums also provide opportunities to promote the personal advantages that open practices can afford researchers. Some journals now offer Open Science badges to those who preregister or share data (Kidwell et al., 2016). Posting preprints can prevent ‘scooping’ because they are associated with time-stamped digital-object identifiers (DOIs). Moreover, papers associated with a preprint or open data also tend to generate more citations (Fu & Hughey, 2019; Piwowar & Vision, 2013; Serghiou & Ioannidis, 2018), which may be particularly beneficial to early career researchers (see Allen & Mehler, 2019; Berg et al., 2016; Farnham et al., 2017; Pownall et al., 2021; Sarabipour et al., 2019). Emphasising these attractive incentives will likely encourage colleagues to adopt these practices.
Students
Students are the next generation of researchers and research consumers, and thus stand to become the future torchbearers of Open Science reform. In many disciplines, students are key members of research teams and engage in all aspects of the research process, from conceptualising and conducting studies to communicating the resulting data. Students also have fewer research habits to unlearn and can serve as a conduit for change among more senior researchers, who might face the added inertia of entrenched practices. Students might therefore benefit from the suggestions in the previous section when swaying others to adopt transparent research behaviours.
Indeed, many students may be aware of Open Science and generally hold positive views toward it, but this does not necessarily translate into higher levels of implementation (Toribio-Flórez et al., 2021). Students also face unique barriers; those looking to pursue a career in research, for example, are under pressure to succeed via recognised metrics of academic achievement because efforts to innovate or adopt open practices are not yet widely recognised (Schönbrodt, 2019). A lack of knowledge about Open Science is another barrier; doctoral candidates are less likely to implement Open Science practices if they are not exposed to such practices by their mentors or in their research methods courses (Toribio-Flórez et al., 2021).
Target Behaviours
The key behaviours that students ought to engage in are the same as those mentioned in the previous section: preregistration, posting preprints, sharing open data and code, and conducting replication studies. These behaviours can be ingrained in students early in their training. Open Science provides the means for students to put statistical and methodological reasoning into practice, and many universities cover open principles and reproducible research practices in science degrees (Chopik et al., 2018). Many students are intrinsically motivated to learn, but others are looking to get a competitive edge in the job market. Teaching Open Science tools and practices can provide both an engaging learning environment and a coveted skill set (e.g., Kathawalla et al., 2021).
Researcher Actions
Researchers typically serve as teachers and mentors at their institutions, and they are therefore uniquely placed to foster Open Science principles and practices among students. We summarise our ideas for how to do so in Table 2. There are several strategies teachers can use to make it easier for students to adopt open practices from their very first course or research project. For example, teachers can design lectures or courses devoted to Open Science theory (e.g., Sarafoglou et al., 2019). Teachers can also embed certain practices into graded assignments. It is common for students to be graded on how well they report experiments; teachers could additionally task students with posting a mock preprint on an internal server, where classmates read and review each other’s work before the revised version is submitted for grading. Teachers can also set a replication study as the class project, or as the topic of an Honours thesis. Additionally, preregistration can be incorporated into thesis and dissertation projects as a necessary aspect of research proposals. With these learning experiences in place, open practices will be the default as students venture into research careers.
Table 2. Target behaviours, researcher actions, and general resources for promoting Open Science among students.

Target behaviours
- Preregister studies.
- Submit Registered Reports.
- Upload preprints.
- Make data and code openly available.
- Conduct replication studies.

Researcher actions
- Design Open Science courses for students.
- Integrate preregistration and preprints in assessment.
- Introduce students to open-source platforms and educate them on creating reproducible code.
- Design group problem-solving activities involving open practices.
- Demonstrate how to integrate open practices to lab groups.

General resources
- Materials for teaching Open Science: Framework for Open Research Training (https://forrt.org/); course materials and syllabi (https://osf.io/zbwr4/, https://osf.io/vkhbt/, https://www.projecttier.org/tier-classroom/course-materials/); resources for teaching (https://www.osc.uni-muenchen.de/toolbox/resources_for_teaching/index.html)
- Open-source statistical software: R (https://r-project.org/), GitHub (https://github.com/), JASP (https://jasp-stats.org/)
- Courses for learning statistics with open-source software: https://psyteachr.github.io/, https://learningstatisticswithr.com/, https://r4ds.had.co.nz/
- Replication guide for undergraduates: https://doi.org/10.17605/osf.io/jx2td
- Open Scholarship Knowledge Base: https://www.oercommons.org/hubs/OSKB
- Student Initiative for Open Science: https://studentinitiativeopenscience.wordpress.com/
Teachers can further habituate students to creating interpretable and reusable code, and to building statistical skills, inside and outside the classroom. Students should be introduced to open-source statistical programming languages such as Python and R. Proficiency in these languages, and in the user-friendly open software built on them, provides students with the means to share and document their analysis scripts for others to review and reuse. Moreover, teachers can impart ways of writing code and running analysis scripts with detailed commentary so that errors or inconsistencies are easy to discover. Wrangling data into tidy, more readable formats, using sensible variable names, and producing attractive visualisations are other small steps that make analyses and results more interpretable and accessible. Learning how to simulate data from scratch can also help students to connect the dots between different designs, data structures, and data analysis techniques, which in turn helps to improve the clarity of their research plans and preregistration protocols. These skills can be taught through practical challenges in class that provide hands-on experience with reproducible tools and research workflows, making these behaviours easier as students continue into their research training.
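As a classroom-style illustration, the short R script below combines several of these habits: a fixed random seed, a simulated dataset in tidy format (one row per observation), descriptive variable names, and a commented analysis. The scenario and numbers are invented for teaching purposes.

# Simulate a simple two-condition experiment so students can practise
# the full analysis workflow before touching real data.
set.seed(2021)  # fixing the seed makes the simulation exactly reproducible

n_per_group <- 50
reaction_times <- data.frame(
  condition   = rep(c("control", "caffeine"), each = n_per_group),
  # Build in a 30 ms speed-up for the caffeine group (SD = 50 ms).
  response_ms = c(rnorm(n_per_group, mean = 500, sd = 50),
                  rnorm(n_per_group, mean = 470, sd = 50))
)

# Descriptive summary: mean response time per condition.
aggregate(response_ms ~ condition, data = reaction_times, FUN = mean)

# Planned comparison, commented so errors are easy to spot and discuss.
t.test(response_ms ~ condition, data = reaction_times)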
Of course, teachers have the means to make these practices socially desirable as well. Classrooms provide a rich environment for social support, and the activities mentioned above can easily be adapted for group projects. Supervisors and mentors also play an important role in students’ lives, including establishing and perpetuating social norms regarding research practices. Thus, modelling rigorous, open research practice is important; if students observe their mentors thoughtfully integrating Open Science into their own research, or in the lab more generally, then students are likely to follow suit.
Institutions
Directly swaying other researchers and students to engage in open practices is unlikely to be effective in isolation if top-down influences run counter to these efforts. Institutions in the research ecosystem incentivise and dissuade certain behaviours. If these institutions adopt structures that encourage Open Science, then open practices will prosper.
We now turn our attention to institutions: departments and faculties, universities, academic libraries, journals, and funders. For each institution, we describe its role in research and then outline several target behaviours that the institution can implement to influence researchers and students to adopt open practices. Some researchers have direct decision-making power and may be able to enact such policies and practices themselves. However, many researchers do not hold positions of considerable influence. We therefore end each section with recommended actions that the typical researcher can take to effect institutional change. This is not an exhaustive list, but we aim to provide a few concrete ways for individual researchers to indirectly influence scientific norms and practices by targeting institutional change in whatever capacity they can. Again, we take inspiration from the principles of behaviour change: make it easy, social, and attractive.
Departments and Faculties
Departments (and faculties) influence the practices of their researchers in several ways: departments disseminate information about trends and changes in scientific practices, set curricula that determine what students are taught and how they are assessed, and have considerable say in hiring and promotion decisions and in how researchers are rewarded. Many departments and faculties still evaluate researchers based on how regularly they publish in ‘high-impact’ journals or how frequently their work is cited (Rice et al., 2020). These metrics can motivate researchers to conduct quick and easy experiments that produce just enough new data for a new publication. To illustrate, the fastest way to accumulate publications would be to tweak experiments in minor ways and to write a separate article for each of these ‘least publishable units’ (Broad, 1981; also known as ‘salami slicing’). Collecting new data for publication’s sake is not a recipe for scientific progress, but given that many departments rely on publication counts when evaluating researchers, it is an ingredient of career progress. Alternatively, if departments and faculties focused more on quality (transparency and ‘contribution to scientific progress’), researchers and students would be incentivised to devise careful experiments that aim to falsify or expand key theories in the field, and to share their data, code, and materials. Lacking the requisite learning experiences can also lead students to wander down research paths without the knowledge and skills required to conduct quality research. Departments can be either catalysts or barriers to Open Science reform, depending on the messaging, infrastructure, and incentives they put in place.
Target Behaviours
Those in positions of influence in departments and faculties—deans, department heads, and committees—often make decisions regarding administrative processes, online infrastructure, and hiring and promotion practices. Of course, researchers themselves tend to hold these positions and can therefore implement many initiatives directly. For instance, departments could require a (mock) preregistration as part of the ethical review processes or require plans to make data available and reusable where appropriate.
Awareness about open practices can grow if departments implement useful online infrastructure. Heads of departments and academic committees could provide funds to create and curate Open Science resources and have these available on the department’s website. Researchers would then have easy access to guides and support services. Departments could also fund workshops and research projects related to Open Science for researchers and students. Additionally, departments could consider hiring an expert on data management to advise researchers on how best to organise and prepare research data for open sharing (in line with the FAIR principles and national and international guidelines).
Departments can further encourage open practices by emphasising research quality when evaluating candidates for hire and promotion. For instance, job advertisements and interview panels could ask candidates for evidence of open practices. Rather than reporting publication or citation counts, candidates could be asked to comment on the quality and likely replicability of their publications, and on the steps they have taken to increase transparency and contribute to scientific advances more broadly. Candidates, for example, could be asked to submit an annotated CV detailing preregistrations, replications, open data and code, power analyses, theoretical motivations, and rigorous experimental design. The dissertations of postgraduate students could be evaluated in similar ways. Such evaluation practices signal that open behaviours are widespread and highly valued, and might encourage researchers to reconsider their own practices (Nosek, 2019). For those interested in criteria for assessing researchers, see Moher and colleagues’ (2020) review, the San Francisco Declaration on Research Assessment (DORA, 2012), and the Leiden Manifesto (Hicks et al., 2015). Gernsbacher (2018) also provides specific recommendations for how to reward open practices.
Researcher Actions
In Table 3, we provide a summary of actions and resources for encouraging departments and faculties to promote Open Science. Perhaps the most effective way for the typical researcher to encourage change at the departmental or faculty level is to serve on subcommittees and hiring panels, given that there are often many opportunities to take on these roles regardless of one’s career stage. In these roles, researchers can draw attention to research practices that improve (or threaten) research integrity. For instance, one can serve on an ethics subcommittee to promote preregistration and open data plans in ethical review processes, on a teaching and learning subcommittee to promote Open Science education for students, or on hiring panels and committees to promote changes to researcher evaluation. Committees and panels give researchers a social platform to encourage these sorts of initiatives from the bottom up because they can bring transparency and scientific contribution to the forefront of decision-making processes. Providing examples of each initiative can also make the transition easier; for example, researchers can present example open evaluation criteria (see Table 3) or an Open Science curriculum (see Table 2).
Table 3. Target behaviours, researcher actions, and general resources for departments and faculties.

Target behaviours
- Include preregistration as part of ethical review processes.
- Create support and online infrastructure for open practices.
- Consider Open Science practices in hiring and promotion processes.

Researcher actions
- Serve on subcommittees and panels to promote Open Science.
- Pitch changes at staff/academic meetings.
- Offer example open evaluation criteria for candidates for hire and promotion.

General resources
- Example Open Science guide: https://www.uu.nl/en/research/open-science
- Leiden Manifesto for research evaluation: http://www.leidenmanifesto.org/
- Job descriptions and offers with an Open Science focus: https://doi.org/10.17605/osf.io/7jbnt
Of course, these official roles are not always available, but any faculty member can ask to have Open Science initiatives added as agenda items at staff meetings, and can use these opportunities to point to the advantages and benefits of such initiatives. The increasingly normative nature of Open Science (see Christensen et al., 2020; Nosek & Lindsay, 2018), and frequent calls for more transparent research from journals and funding bodies, suggest that a track record of open practices and teaching will become increasingly sought after. Adopting and fostering Open Science at a departmental level early could be highly consequential, as such initiatives are likely to benefit institutional rankings and in turn attract more funding and demand from prospective students (McKiernan et al., 2016).
Universities
Universities are complex organisations with many often-competing interests. Like departments and faculties, universities often decide which researchers to hire and promote, which researchers to recognise with prizes, awards, and funds, and when and how to publicise researchers’ work. Universities also decide which training and development activities are recommended or compulsory, how integrity issues are handled, and what type of support to provide researchers and students.
University administrators want researchers at their institution to produce well-cited research in prestigious journals because these metrics are often used by governments and institutions when evaluating a university’s research output (e.g., the Research Excellence Framework; Times Higher Education; World University Rankings; see Huang, 2012). Such metrics and rankings influence how funds are allocated and can affect (among other things) the enrolment rates of domestic and international students (Harvey, 2008; Hurley, 2021; Nietzel, 2021). Some universities have even implemented financial incentive structures to reward publications in high impact journals (Abritis & McCook, 2017; Quan et al., 2017).
Many of the criteria in current frameworks and rankings are often too narrowly defined (Martin, 2011) and overlook societal impact, teaching quality, and open practices (Gadd, 2020; Pagliaro, 2021), but they nonetheless influence the local metrics that universities use to evaluate academic staff. As such, they largely reward poor research practice; by prizing publication volume and citations, they may implicitly encourage researchers to engage in QRPPs (see Smaldino & McElreath, 2016).
Target Behaviours
There are a number of initiatives that universities can adopt to incentivise more open practices among their researchers. We highlight three such initiatives, in no particular order. First, universities can sponsor Open Science task forces and join a national Reproducibility Network (if one exists; e.g., https://www.ukrn.org), which typically involves nominating at least one Open Science officer or leader. An Open Science task force can comprise anyone motivated to improve scientific practice at their university: academics, deans, or professional staff. The task force (or officer) can lead initiatives on behalf of the wider community, such as determining researchers’ attitudes and perceived barriers to Open Science, examining institutional policies and practices, suggesting alternative open policies, and making general recommendations to the Deputy Vice-Chancellor (or equivalent) on matters of Open Science (Munafò, 2019). Members of the task force or network could also offer resources and ongoing training to researchers and students at the university.
Second, universities can adopt ‘OSF Institutions’, which is a free scholarly web tool designed to enhance transparency, foster collaboration, and increase the visibility of research outputs at the institutional level. ‘OSF Institutions’ makes it easy for users to incorporate the Open Science Framework (as well as other data repository services; see Table 4) into their existing research workflow. An option is also available for universities to recommend the Open Science Framework as a platform on which to manage research projects and make materials and data available to others.
Table 4. Target behaviours, researcher actions, and general resources for universities.

Target behaviours
- Establish an Open Science task force or officer, or become a member of a Reproducibility Network.
- Adopt ‘OSF Institutions’.
- Sign the Declaration on Research Assessment (DORA).

Researcher actions
- Start an Open Science Community.
- Conduct institution-wide surveys on open practices.
- Pitch suggestions for Open Science reform to the chancellery (or equivalent).
- Draft Open Science commitment statements.

General resources
- OSF Institutions: https://www.cos.io/products/osf-institutions
- Declaration on Research Assessment (DORA): https://sfdora.org/read/, https://sfdora.org/signers/
- Open Science Communities: Open Science Communities Starter Kit (https://www.startyourosc.com/); Network of (German-speaking) Open Science Initiatives (https://doi.org/10.17605/osf.io/tbkzh)
- Example Open Science task force report and surveys: https://doi.org/10.17605/osf.io/vpwf7
- Example Open Science commitment statements: University of Helsinki (https://www.helsinki.fi/en/research/research-integrity/open-science), Sorbonne University (https://www.sorbonne-universite.fr/en/research-and-innovation/research-strategy/commitment-open-science)
- Leiden Ranking for research evaluation: https://www.leidenranking.com/
Third, universities (or even individual departments) can make a commitment to prioritise Open Science in their endeavours, which will in turn signal what researchers ought to value when conducting and disseminating research. As a first step, universities could sign DORA (2012) and commit to counteracting the perverse incentives that traditional ranking systems promote. DORA makes 18 recommendations to transform how the academic community evaluates researchers and research outputs. More than 2,200 organisations and over 17,000 individuals across the world are now signatories, many of which have revised their research assessment guidelines to align with DORA. As more institutions sign up, assessment frameworks will increasingly align with open practices. From here, a university can devise its own public commitment statement. Public declarations that explicitly communicate open values, norms, and aspirations can provide top-down goals for individual researchers within the institution to consider when conducting and disseminating research.
Researcher Actions
Though few researchers have direct decision-making power at a university-wide level, they can draw the attention of administration heads to emerging Open Science initiatives and behaviours. We summarise our recommended researcher actions in Table 4. Researchers might first consider starting an Open Science Community (OSC; see Armeni et al., 2021), which can be a grassroots forerunner to an Open Science task force. An Open Science Community is a group of researchers who want to educate one another (and others) on Open Science tools and practices. Beyond education, Open Science Communities also discuss how the university can support its academics, and they can inform university administrators on how to shape Open Science policies. For example, Open Science Communities can design and conduct surveys to understand the perceived barriers to Open Science at their institution. A large group of researchers collectively advocating for initiatives can place considerable social pressure on institutions to enact change.
Open Science Communities and researchers can also give brief presentations to members of the chancellery (or equivalent) to advocate for Open Science initiatives such as adopting ‘OSF Institutions’, signing DORA, or implementing an Open Science task force. An effective pitch might explain the problem (i.e., the replication crisis), demonstrate its influence on research across numerous disciplines, and highlight concrete actions the university can take. The pitch ought to mention that other universities have already taken steps to address such issues and then provide similar solutions to make any change easy to implement. For example, one could illustrate how other universities and funders have aligned their assessment guidelines with DORA or provide a draft statement of public commitment to Open Science. In these pitches, researchers should also highlight the appeal of the proposed initiatives, including the potential for the university to become a leader in the emerging Open Science space and new university rankings that place greater value on open practices (e.g., Leiden Ranking).
Libraries
Academic libraries deliver services across the full research lifecycle. They provide access to scholarly resources and support research activities such as literature searches, systematic reviews, working with data (e.g., The Carpentries; https://carpentries.org/), bibliometrics, funding opportunity identification, and grant writing. Academic libraries have also championed Open Access (OA) publishing, the public dissemination of scholarly and scientific literature free of charge, for more than two decades, motivated in part by the idea that research should be publicly available to advance discovery and drive innovation. Open Access can potentially remedy the ever-increasing cost of publishing in academia. However, some publishers have exploited Open Access for profit, and many traditional subscription-only journal publishers now charge authors excessive up-front article processing charges (APCs). In response, some libraries have invested significantly in repository infrastructure to support access to research outputs (including data), while others have begun to manage APCs and provide services to ensure that researchers meet the Open Access requirements of funding bodies. Open Access publishing, as well as the increasing emphasis on managing research data to enable purposeful sharing, are among the major global drivers that have shaped academic libraries as they are today (Brown et al., 2018). All academic libraries manage access to online subscription resources (e.g., journals), routinely negotiate deals with major publishers, and grapple with budgets dominated by their online journal spend. Researchers are often unaware of these activities, yet it is the research community who, via peer review and editorial roles, effectively contribute free labour to the publishers of the very resources that libraries pay for.
Target Behaviours
Libraries are already actively engaged in Open Access, research data management, and other areas directly relevant to open research, but they can struggle to have a broader impact. To connect library services to the broader Open Science agenda, libraries require a more holistic approach, one that promotes and advocates broadly for Open Access and research data management across the research lifecycle (Tzanova, 2020). A shift towards library staff with strong research backgrounds may also enable more holistic support for Open Science.
A key piece of infrastructure in the academic library’s toolkit is the institutional repository. However, repositories are often not well integrated with other systems or researcher workflows, nor do they integrate seamlessly with the external infrastructure that supports open scholarly communication. Libraries therefore ought to consider how they can provide sophisticated, user-friendly repository infrastructure that integrates with researcher workflows and external systems (including external tools that enable a more open approach to active projects).
Libraries also need to engage more deeply with the research community. Library staff often have large networks built through liaison and other activities, and these networks can be leveraged to lead institution-wide events that connect researchers from various disciplines and drive a shared understanding of the opportunities afforded by Open Science. Some academic libraries have also had success in driving change in the scholarly communication system by leading campus-wide discussions on the challenges of scholarly publishing. For example, a committee at the University of California unanimously endorsed the Declaration of Rights and Principles to Transform Scholarly Communication (see Table 5) to directly address concerns over the financially unsustainable subscription model that extracts money from universities and free labour from authors.
Table 5. Target behaviours, researcher actions, and general resources for academic libraries.

Target behaviours
- Adopt a holistic approach to staffing targeted toward the Open Science agenda.
- Lead campus-wide discussions and events to address APCs and barriers to Open Access.
- Provide sophisticated, user-friendly repository infrastructure.

Researcher actions
- Engage with libraries to discover a strategic approach to Open Access publishing and APCs.
- Prioritise the circumstances under which one will do unpaid work.
- Link ORCiD to all of one’s research projects.
- Create and use open educational resources supported by institutional libraries.

General resources
- Open Science guides for libraries: https://www.fosteropenscience.eu/learning/open-science-at-the-core-of-libraries/, https://www.ala.org/acrl/publications/keeping_up_with/open_science
- Documentary about academic publishing: Schmitt (2018): https://paywallthemovie.com/
- Example commitments to open scholarship: https://osc.universityofcalifornia.edu/uc-publisher-relationships/
- Find Open Access journals: https://doaj.org
- ORCiD information: https://info.orcid.org/what-is-orcid/
- Open Educational Resources: https://www.oercommons.org
Researcher Actions
What many libraries currently do aligns with the goals of Open Science, and researchers can help make these initiatives easier by engaging more with library services (see Table 5). Researchers should proactively seek to understand the pressures that publishers and vendors place on their university’s library budget and the sophisticated games that publishers play to increase their revenue. By engaging with libraries, researchers can find a strategic approach to their scholarly publishing practices, considering when and where they should (and should not) pay. Moreover, researchers should explore all models of publishing that enable Open Access to their research outputs, including green Open Access routes using institutional repositories, and ensure they retain ownership rights over their author-accepted manuscripts.
Researchers have some social influence on publishers through their roles as editors and peer reviewers. They therefore ought to prioritise the circumstances under which they are prepared to do unpaid work, and voice concerns directly when current practices are not aligned with Open Science. Importantly, researchers can coordinate this activity with their library colleagues and work out how to further support libraries in cancelling subscription (closed) journals and/or moving to other models (e.g., Read and Publish agreements; see Borrego et al., 2021).
Workflow and data management practices can also be integrated more seamlessly with the library’s infrastructure. A small step each individual researcher can take to help make the open research ecosystem more efficient is to get an ORCiD and use it whenever and wherever it is enabled. Researchers should ask their institutional repository to integrate with ORCiD to help ensure the publications in their ORCiD profile are openly available.
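To make the value of this step concrete, a researcher (or repository manager) can audit which works are attached to a public ORCiD record programmatically. The following is a minimal sketch against ORCiD's public v3.0 API, assuming the Python requests library; the iD shown is the fictitious example record used in ORCiD's own documentation.

```python
# Minimal sketch: list work titles on a public ORCiD record via the public API.
import requests

def orcid_work_titles(orcid_id: str) -> list[str]:
    """Return the titles of works attached to a public ORCiD record."""
    resp = requests.get(f"https://pub.orcid.org/v3.0/{orcid_id}/works",
                        headers={"Accept": "application/json"}, timeout=10)
    resp.raise_for_status()
    titles = []
    for group in resp.json().get("group", []):          # works grouped by identifier
        for summary in group.get("work-summary", []):
            title = ((summary.get("title") or {}).get("title") or {}).get("value")
            if title:
                titles.append(title)
    return titles

# 0000-0002-1825-0097 is the fictitious test record from ORCiD's documentation.
for title in orcid_work_titles("0000-0002-1825-0097"):
    print(title)
```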
Finally, researchers can engage more with library services and resources. As noted previously, researchers are commonly involved in teaching and learning at universities, and they can use and promote Open Educational Resources (OERs) supported by the library rather than relying on paywalled journal articles or expensive textbooks (print and electronic).
Journals
Research is primarily disseminated via refereed journal articles. Journal editors and publishers want to publish work that is perceived to be rigorous, of high quality, and ultimately 'impactful' (e.g., highly cited) because this enhances the journal's prestige. Prestige, in turn, attracts research that is likely to have the greatest impact in the respective field, which tends to yield more prestige, more citations, more subscriptions, and higher revenue. Indeed, various organisations have generated indices that seek to reflect the quality and impact of a journal, such as Clarivate's journal impact factor and Elsevier's CiteScore (https://researchguides.uic.edu/if/impact). Such indices have become a key currency, with various organisations investing a great deal of resources to monitor and publish them.
However, much work that is 'impactful' is not necessarily rigorous, important, or transparent. The quest for impactful work has led to a range of problematic consequences. For example, journals have tended to publish research that claims to be novel and reports statistically significant results, in the hope that this work will have great impact. At the same time, journals have been less likely to publish research that reports weak, null, or negative findings, or replication research (Ferguson & Brannick, 2012). Yet if any field is to accumulate a reliable body of knowledge, research needs to be rigorous (regardless of novelty, popularity, or statistical significance) and reported in an open, transparent, reproducible way so that people can grasp the actual state of knowledge and build on it appropriately.
Target Behaviours
There are several ways for those in positions of influence at journals (e.g., editors) to encourage open, rigorous, and reproducible research. For instance, editors (as well as funders) can adopt the Transparency and Openness Promotion (TOP) guidelines, which set out eight standards of openness and transparency (e.g., preregistration, data transparency) that journals can apply when research is submitted or reviewed (Nosek et al., 2015). Editors could adapt these guidelines to give reviewers explicit evaluation criteria that promote clear, collaborative, and constructive peer review. Journals should also consider making the TOP recommendations the default at submission. For example, submission processes can be redesigned so that authors must opt out of making their data and materials openly available. At the journal Cognition, a default open data policy made open data more prevalent and reusable (Hardwicke et al., 2018).
Editors can also alter review processes in other ways. For example, they could implement open peer review (pre- and post-publication), which tends to encourage more respectful and constructive feedback as well as clearer communication between reviewers, editors, and authors (Ross-Hellauer, 2017). Results-blind peer review (Grand et al., 2018) is another initiative editors could adopt to avoid a bias toward publishing 'positive' results.
Another simple (and increasingly common) way in which journals can support Open Science is to offer a wider variety of publication (submission) formats that emphasise the quality of the research process rather than its outcomes. For example, journals can adopt Registered Reports as a submission format (Nosek & Lakens, 2014) and encourage authors to submit replications.
Finally, editors can increase the visibility and appeal of open practices among researchers by adopting Open Science Badges (https://www.cos.io/initiatives/badges). These badges are awarded to articles whose authors, for example, (a) preregistered the study, (b) made their materials openly available, or (c) made their data openly available. Badges are an easy, low-cost initiative that serves as an injunctive norm of appropriate research behaviour (signalling what we value and 'should' do). Their introduction has increased the extent to which researchers engage in open practices such as data sharing (Kidwell et al., 2016), strengthening the reliability and trustworthiness of the research that is published.
Researcher Actions
How then can a typical researcher influence journals? In Table 6, we provide a summary of ideas. The typical researcher is likely to interact with senior editors when submitting manuscripts for review and when reviewing manuscripts. As a first step, reviewers ought to consider open evaluation guides and resources (such as TOP guidelines) when reviewing manuscripts, and commend instances where authors have demonstrated open practices such as preregistration or open data. These actions are likely to plant the seed that such practices are becoming increasingly normative and worthwhile.
Table 6. Journals: target behaviours, researcher actions, and general resources.

Target behaviours
- Adopt the TOP guidelines for manuscript evaluation.
- Embrace open peer review and/or results-blind peer review.
- Welcome Registered Reports (and Peer Community In) and replication studies.
- Adopt Open Science badges.

Researcher actions
- Commend the use of open practices when reviewing manuscripts.
- Suggest initiatives when interacting with editors, referring to relevant resources, norms, and benefits.
- Prioritise reviewing manuscripts that demonstrate a commitment to open practices (PRO initiative).

General resources
- TOP guidelines: https://www.cos.io/initiatives/top-guidelines, https://osf.io/9f6gx, https://osf.io/fe2pz/
- Open peer review information: https://plos.org/resource/open-peer-review/
- Example guidelines for replication research: https://royalsocietypublishing.org/rsos/replication-studies
- Journals that offer Registered Reports: https://www.cos.io/initiatives/registered-reports
- Journals that have endorsed 'Peer Community In' Registered Reports: https://rr.peercommunityin.org/about/pci_rr_friendly_journals
- Open Science badges: https://www.cos.io/initiatives/badges
- Peer Reviewers' Openness (PRO) initiative: Morey et al. (2016), https://www.opennessinitiative.org/
Editorial teams may have reservations about promoting open practices (Hopp & Hoover, 2019). However, in their interactions with editors, reviewers can take the opportunity to suggest initiatives such as the TOP guidelines, badges, wider publication formats, and open peer review. Relevant information can be made easier to find by providing useful links. Moreover, one can point out the growing norms around such practices: at the time of writing, more than 75 journals have implemented Open Science badges, more than 250 offer Registered Reports as a publication format, and more than 1,000 have implemented the TOP guidelines (see http://cos.io/). Highlighting what the journal stands to gain will also make these initiatives more attractive. Introducing the TOP guidelines, for example, is likely to reduce the time that authors and reviewers spend communicating, and it can improve the reporting standards of published research (Nosek et al., 2015).
Furthermore, as reviewers, researchers can be selective about what they agree to review. For example, they can prioritise reviewing articles that demonstrate a commitment to open practices, such as open data (or a stated explanation of why data are closed), and they ought to explain this decision in their communication with editors. The Peer Reviewers' Openness (PRO) initiative is one guide that advocates this action to drive change, and it serves to incentivise journals to encourage authors to adopt open practices. Similarly, reviewers and authors can publicly commit to prioritising journals that are committed to Open Science (e.g., Meta-Psychology, Collabra) when deciding where to publish their work.
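Finding such journals need not be laborious. The Directory of Open Access Journals (DOAJ, listed in Table 5) offers a public search API; the sketch below is illustrative only, assuming the Python requests library, and the exact API path may differ by version from the one shown.

```python
# Minimal sketch: shortlist Open Access journals indexed in DOAJ by keyword.
# DOAJ's public API is documented at https://doaj.org/api/ ; the unversioned
# path below is an assumption and may need a version segment (e.g., /v3/).
import requests

def doaj_journals(query: str, page_size: int = 10) -> list[str]:
    """Return titles of DOAJ-indexed journals matching a search query."""
    resp = requests.get(f"https://doaj.org/api/search/journals/{query}",
                        params={"pageSize": page_size}, timeout=10)
    resp.raise_for_status()
    return [hit["bibjson"].get("title", "")
            for hit in resp.json().get("results", [])]

for title in doaj_journals("psychology"):
    print(title)
```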
Funders
Funding bodies are often responsible for deciding how and where to allocate research funds from government, industry, and philanthropic sources. These funds are limited, and the process of evaluating research proposals is competitive. Grant applications are typically written by researchers before being sent to specialists in the field for review. A committee then assesses and ranks the applications to prioritise where available funds are allocated. This review process typically favours researchers with more publications, higher citation counts, and a track record of publishing in 'high-impact' journals, which then places these same researchers in a better position to secure further funding (i.e., the Matthew effect; Bol et al., 2018). In fact, researchers at the top 20% of universities received over 60% of National Science Foundation funding (Drutman, 2012). A model that consistently awards more funds to established, senior academics and to conventional research may be stifling scientific advancement and scientific reforms such as those advocating Open Science.
Target Behaviours
There are several ways to address these issues. Some funding bodies have adopted policies to promote Open Science by committing to make funded outputs freely accessible (e.g., the Social Sciences and Humanities Research Council of Canada, the National Institutes of Health, and cOAlition S). The European Research Council (2017) has also proposed that funded projects attach open and reusable data to their outputs. Policies that require open practices from funded projects would certainly increase the uptake of open practices among researchers.
Funding bodies ought also to consider altering how funds are allocated. Moderate recommendations include changes to evaluation, such as a greater focus on the quality rather than the quantity of a researcher's scientific contributions. A focus on quality may prioritise an investigator's plans to engage in open practices such as preregistration, sharing data and code publicly, conducting replications, and submitting Registered Reports. The Dutch Research Council (NWO) and the European Research Council have both signed DORA and committed to weighting open practices and theoretical contribution more heavily. The NWO, for instance, now requires CVs to contain a narrative academic profile and no more than five or ten key research outputs, as opposed to typical 'impact' metrics (e.g., the h-index). More radical suggestions for funding allocation include innovation lotteries and open review (Gurwitz et al., 2014; Liu et al., 2020). In the long term, funders might consider randomising grants among proposals that pass a certain quality threshold, where quality encompasses strong theory and a commitment to open practices and Registered Reports.
Researcher Actions
A greater appreciation of Open Science when determining how research funds are allocated can be promoted through several small actions. We summarise some suggested actions in Table 7. Senior researchers who serve as reviewers on funding committees are in a position to favourably weigh those projects that signal quality rather than quantity. Researchers more generally, however, can also influence funding processes in how they write their grant proposals. For instance, researchers can include prepared statements of commitment to open practices in grant applications even when these are not specifically required. Funders often ask for broader impact statements and a history of translating research into action. A track record of open data, open access, and replication studies may be seen favourably in this context. Researchers can also outline how they plan to make their research outputs openly available, or their plans to provide Open Science training to postgraduate and postdoctoral students. These plans can even be accounted for in the project’s budget to demonstrate how easily these behaviours can be incorporated into a research project.
Table 7. Funders: target behaviours, researcher actions, and general resources.

Target behaviours
- Plan to fund open access publishing and require open data for funded projects.
- Adopt review processes that value open practices and theoretical contribution.
- Adopt narrative CV formats for grant proposals and 'best five' research outputs.

Researcher actions
- As a reviewer, promote research proposals that include open and rigorous research practices.
- Include statements of commitment to open practices in grant applications.
- Include impact statements, plans, and budgets to make data and publications openly accessible.
- Advertise alternative metrics and best outputs in grant applications.

General resources
- Example Open Science commitment statement: http://www.researchtransparency.org/
- Example open CV format: https://www.nwo.nl/en/dora
- Integrating Open Science into grant proposals: https://www.fosteropenscience.eu/content/winning-horizon-2020-open-science
- Open access publication initiative: cOAlition S, https://www.coalition-s.org/about/
- Data management plan guides and templates: https://www.fosteropenscience.eu/index.php/foster-taxonomy/research-data-management and https://dmponline.vu.nl/public_templates
- Guide on collaborating with industry using Open Science: https://www.cos.io/blog/how-to-collaborate-with-industry-using-open-science
It is common for university research offices to edit grant proposals to emphasise traditional research metrics. However, researchers can take it upon themselves to highlight their five or ten best papers, to write a statement of their broader impact, or to use less traditional metrics such as Altmetric. If an increasing number of researchers do the same, it creates social pressure for these actions to become the expectation rather than the exception.
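For those wanting to report such metrics, attention data can be retrieved programmatically. The sketch below uses Altmetric's free, rate-limited Details Page API, assuming the Python requests library; the DOI shown is the example from Altmetric's own documentation.

```python
# Minimal sketch: fetch Altmetric attention data for a DOI via the free,
# rate-limited Details Page API (https://api.altmetric.com/).
import requests

def altmetric_record(doi: str) -> dict | None:
    """Return Altmetric's record for a DOI, or None if no attention is tracked."""
    resp = requests.get(f"https://api.altmetric.com/v1/doi/{doi}", timeout=10)
    if resp.status_code == 404:        # Altmetric has no data for this DOI
        return None
    resp.raise_for_status()
    return resp.json()

record = altmetric_record("10.1038/480426a")   # example DOI from Altmetric's docs
if record:
    print(record.get("score"))                 # composite attention score
    print(record.get("cited_by_posts_count"))  # number of mentions tracked
```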
Researchers ought also to use open practices when collaborating with industry funding partners. In applied fields, for instance, there is a great deal of communication between researchers and industry collaborators. Searston et al. (2019) have outlined ways in which the OSF and other open tools can keep partners updated at every stage of a research project. Transparency while working with industry is likely to improve the quality of the end product and establish a norm of open collaboration.
Conclusions
Our aim in this paper has been to provide recommendations and resources that the everyday researcher can use to promote Open Science. For various nodes and stakeholder groups in the research ecosystem, we described how current behaviours, norms, and cultures sustain irreproducibility and slow scientific progress, while also suggesting alternative behaviours and practices that are more conducive to Open Science. Most critically, however, we recommended actions that individual researchers can take to promote these changes. We also used two behaviour change frameworks—EAST and the Pyramid of Culture Change—to ground these recommendations. In essence, these frameworks propose that, for behaviours to be adopted, they ought to be made easy, social, and attractive.
In the first part of this paper, we proposed ways that researchers could directly influence open practices among individuals with whom they work closely: colleagues and students. Progress, however, often also hinges on top-down influences from larger institutions. At the same time, there will be little drive for institutional change without pressure from researchers. In the second part of this paper, we proposed ways in which institutions—departments and faculties, universities, libraries, journals, and funders—can promote open practices, and suggested actions that the typical researcher can take to influence institutions despite not having direct decision-making power. Practices across the scientific community determine the quality of the research that is generated and disseminated. A holistic approach to improving the infrastructure, norms, and reward structures is needed to shift to a culture of Open Science. Inspired by principles of behaviour change, we hope to have provided useful means to empower researchers in this endeavour.
Contributions
Here we list the authors who contributed to each section: Introduction (SGR), Colleagues (MAB, JB, HB, CDK & DM), Students (RAS), Departments/faculties (JMC, KJ, REL & HAS), Universities (JLB & SGR), Journals (NKS), Libraries (AT) and Funders (SGR & HAS). The paper’s structure was organised by SGR. Additionally, MAB, JB, JLB, KJ, CDK, SGR, HAS, RAS, NKS, and JMT contributed to general editing. The idea for this paper was conceived by JMT during the Society for the Improvement of Psychological Science (SIPS) 2019 meeting in Rotterdam.
Funding
This research was supported by grant No. LP170100086 from the Australian Research Council to JMT and RAS, and by a grant from the National Science Centre, Poland (2015/19/B/HS6/01253) to KJ.
Competing Interests
JC is president of the Association for Interdisciplinary Metaresearch and Open Science (AIMOS), a non-profit organization. All other authors have no conflicts of interest to declare.
Footnotes
In this paper, we primarily focus on rigour and transparency as the main aspects of 'Open Science'. However, other, much broader definitions encompass aspects of inclusivity, equity, and citizen science (see European Commission, 2019; Fox et al., 2021; Masuzzo, 2019).