In this article, we provide a toolbox of recommendations and resources for those aspiring to promote the uptake of open scientific practices. Open Science encompasses a range of behaviours that aim to improve the transparency of scientific research. This paper is divided into seven sections, each devoted to different groups or institutions in the research ecosystem: colleagues, students, departments and faculties, universities, academic libraries, journals, and funders. We describe the behavioural influences and incentives for each of these stakeholders as well as changes they can make to foster Open Science. Our primary goal, however, is to suggest actions that researchers can take to promote these behaviours, inspired by simple principles of behaviour change: make it easy, social, and attractive. In isolation, a small shift in one person’s behaviour may appear to make little difference, but when combined, many shifts can radically alter shared norms and culture. We offer this toolbox to assist individuals and institutions in cultivating a more open research culture.

Many scientific disciplines are currently experiencing a “reproducibility crisis”. Psychology, economics, and medicine are just a few of the disciplines where many influential findings are failing to replicate (Camerer et al., 2018; Duvendack et al., 2017; Open Science Collaboration, 2015; Prinz et al., 2011). Questionable research and publication practices (QRPPs) are partly to blame for this crisis. For example, a norm of publishing positive results (Ferguson & Brannick, 2012; Rosenthal, 1979) incentivises researchers to be especially liberal when analysing their data (e.g., ‘p-hacking’; Simmons et al., 2011), and to generate hypotheses after the results of an experiment are known as if they were expected from the outset (‘HARKing’; Kerr, 1998). More than half of psychology researchers surveyed, for instance, reported peeking at a study’s results before subsequently deciding whether to collect more data, and more than one third claimed to have engaged in HARKing (John et al., 2012; but QRPPs are not limited to psychology, see Fraser et al., 2018; Gopalakrishna et al., 2021). QRPPs provide fertile ground for further irreproducibility and result in part from the culture and incentive structures in academia (Edwards & Roy, 2017; Munafò et al., 2017; Nosek et al., 2012; Smaldino & McElreath, 2016).

A movement of Open Science has emerged in response to these issues (see Vazire, 2018). The umbrella term ‘Open Science’ encompasses a range of practices that aim to increase the transparency and rigour of scientific research (Fecher & Friesike, 2014). Reforms such as preregistration, preprints, replication studies, and publicly sharing data are some practices that can improve how easy research is to evaluate, use, and reproduce (Corker, 2018; Spellman et al., 2018). However, many researchers are not embracing open practices: perhaps they are unaware of the benefits of these practices, perhaps they are dissuaded by a perception that more transparent science is too laborious, or perhaps they have not had the time or energy to integrate these practices into their current way of working (see Gagliardi et al., 2015). Academic incentive structures that reward the volume of one’s publications and impact factors (rather than the quality or rigour of the research) may also undermine the adoption of Open Science practices.

The primary goal of this paper is to recommend some ways in which individual researchers can use principles of behaviour change to promote the uptake and maintenance of open practices in others. There are a few key behaviours that we believe are critical to Open Science: preregistration (and Registered Reports), preprints and open access publication, publicly sharing open and usable data and code, and conducting replication studies. However, the research ecosystem involves many stakeholder groups (both individual and institutional), and there are direct and indirect ways for researchers to encourage others to adopt open practices.

First, researchers can influence other individuals with whom they are in close contact, namely their colleagues and students. Researchers work with other academics and commonly serve as mentors and teachers to thousands of students—the next generation’s researchers and research consumers. In the first part of this paper, we outline actions that researchers can take to directly influence their colleagues and students to adopt open practices.

Of course, researchers and students are heavily influenced by top-down barriers and incentive structures. The policies and practices of institutions may be such that Open Science simply presents a barrier to one’s research goals. If researchers try to influence individuals directly, their actions may have little effect so long as institutions—departments and faculties, universities, libraries, journals, and funders—fail to change as well (Munafò, 2019). In the second part of this paper, we focus on how individuals can influence institutions. In each of the sections, we outline changes that those with institutional decision-making power can directly enact, but most importantly we also recommend ways in which the typical researcher, even those who do not hold positions of influence, can increase the likelihood that key decision-makers will enact change. For each section, we also provide a table summary with useful online resources.

Whether or not individuals and institutions decide to adopt Open Science practices and policies is largely a behavioural question (Bishop, 2020; Norris & O’Connor, 2019). Insights from psychology and other behavioural sciences suggest that people routinely make decisions through automatic, impulsive, and emotional processes, and are often driven by social pressures and immediate cues in their environment (Kahneman, 2011; Tversky & Kahneman, 1974). Psychosocial factors influence the everyday decisions people make, whether they are choosing which product to buy at a supermarket or deciding how to conduct, report, evaluate, publish, or fund research.

Theories and findings from across the behavioural sciences can inform practically any situation where a human decision-maker is involved. People tend to behave in whichever way is easiest or offers the least resistance. Sweden, for example, enjoys far higher rates of organ donation than Denmark not because the Swedes are more compassionate or because of some cultural difference, but simply because Sweden requires people to opt out of donating their organs whereas people in Denmark must opt in (Johnson & Goldstein, 2003). Highlighting a social norm—the accepted standard of behaviour of most people in a group that one cares about—can also greatly influence how people act. If people discover that 90% of fellow group members (rather than 10%) put their rubbish in the bin, they are more likely to do the same (for similar examples, see Hallsworth, 2014; Hallsworth et al., 2016; Nolan et al., 2008).

There are at least two key frameworks for effective behaviour change (see Figure 1). We use these to ground our recommendations for researchers to promote Open Science. The first framework is the Pyramid of Culture Change (Nosek, 2019) and the second is EAST (Easy, Attractive, Social, and Timely; UK Behavioural Insights Team, 2014). Though the frameworks are somewhat distinct, the underlying principles are similar. Both the Pyramid of Culture Change and the EAST framework assert that people are more likely to engage in a desired behaviour if it is made easy. Both frameworks also highlight the power of social connection, commitments, and norms in influencing behaviour, and they both underscore the effectiveness of making desired behaviours attractive or rewarding. If researchers wish to influence individuals and institutions, they ought to focus on making behaviours easy, social, and attractive. Incremental shifts across the entirety of the research ecosystem are likely to greatly improve the way scientific research is conducted, evaluated, and disseminated. These principles of behaviour change provide a means to promote such shifts.

Figure 1. Two frameworks for behaviour change

Note. An illustration of two behaviour change frameworks: the Pyramid of Culture Change and the EAST framework. Both emphasise making desired behaviours easy, attractive/rewarding, and social/normative.

Individual researchers ultimately determine how scientific studies are conducted; they design the experiments, gather the data, and write up the results. Here, we suggest several open practices that individuals can readily adopt—preregistration, preprints, open data and code, and replication—and then we recommend ways in which researchers can influence colleagues and students to adopt these practices using the principles of make it easy, make it social, and make it attractive. We first turn our attention to colleagues, and then to students.

Colleagues

A common refrain in academia is ‘publish or perish’ (see Grimes et al., 2018); there is significant pressure to publish results that are both novel and positive, and to publish a lot (Nosek et al., 2012; Rosenthal, 1979). Although researchers seem well-intentioned and strive to produce good scientific research, they are also incentivised to make their work as palatable to journals as possible. This problem is exacerbated because one’s publications often play a role in funding and career decisions (Schimanski & Alperin, 2018).

One can imagine that the prevailing incentive structures might tempt researchers to engage in QRPPs without even being aware of it. Scientists, like all humans, are prone to biases such as confirmation bias (favouring results that confirm one’s beliefs; Klayman, 1995) and hindsight bias (overestimating, once an outcome is known, the extent to which it could have been predicted; Roese & Vohs, 2012). Encouraging practices that improve rigour and transparency in one’s research is therefore critical to cultivating a culture of Open Science (Nuzzo, 2015; Spellman et al., 2018).

Target Behaviours

Though there are now many Open Science reforms taking place (e.g., national Reproducibility Networks, Open Science Communities), and many different practices that can contribute to remedying the replication crisis, here we focus on a handful of key behaviours that researchers can integrate into their routine workflow.

Preregistration and Registered Reports. Preregistration is a commitment to one’s research planning decisions, including predictions, recruitment strategies, exclusion criteria, stopping rules, materials, procedures, study design, and analysis plans before one has knowledge of a study’s outcomes (DeHaven, 2017; Gelman & Loken, 2013; Nosek et al., 2018). This commitment means that researchers must demarcate between confirmatory and exploratory decisions, analyses, and findings. Although some have questioned the utility of preregistration (see Devezer et al., 2020; Szollosi et al., 2020), shifting decisions about theory and study design to earlier in the research process helps to ensure that unforeseeable issues with an experiment are addressed early on. Preregistration also constrains the effect of researcher biases on the results or interpretation of a study and allows future readers to process findings within the broader analytic context. The practice therefore serves as a partial antidote to QRPPs (see Fischhoff & Beyth, 1975; Munafò et al., 2017; Nosek et al., 2018; Wagenmakers et al., 2012; Wicherts et al., 2016).

Registered Reports take preregistration to the next level. In a Registered Report, one submits a detailed plan of the research questions, hypotheses, methodology, and analyses to a scientific journal for review prior to collecting data (Chambers et al., 2014). Once a Registered Report is accepted, the journal agrees to publish the study if the quality control criteria are met, regardless of the results. A recent initiative, Peer Community In Registered Reports (PCI RR), also provides a community of scientists dedicated to reviewing and recommending Registered Reports. Once a Registered Report is positively evaluated, authors have the option to publish their work in traditional journals that have committed to accepting PCI RR recommendations without further peer review.

Preregistration and Registered Reports help to ensure that the scientific value of hypothesis testing is determined by the quality of research questions and methodology rather than the findings themselves. Importantly, preregistered studies reveal different patterns of results than traditional research practices, suggesting that they reduce false positive results and tend to outperform traditional research on several quality control criteria (Schäfer & Schwarz, 2019; Scheel et al., 2021; Soderberg et al., 2021).

Preprints and Open Access Publication. Expanding scientific knowledge rests on individuals having access to a broad range of research products so that anyone can build on prior work. The way people publish and disseminate research therefore influences how knowledge is created. One can make their work Open Access (OA) either by making articles freely accessible at the point of publication (see the ‘Libraries’ section), or by self-archiving either the published article or the author’s version of the accepted manuscript (Harnad et al., 2008). A preprint is a form of self-archiving where one posts (often publicly) a version of a scientific paper before it is formally peer-reviewed in a scholarly journal. Sharing preprints is becoming increasingly popular for many researchers (Berg et al., 2016; Narock & Goldstein, 2019).

Peer review is an integral aspect of science, and some have raised concerns over the lack of quality control in non-peer-reviewed manuscripts such as preprints (see Bauchner, 2017; Maslove, 2018). Nonetheless, knowing that a preprint has not been peer-reviewed might incentivise readers to be more critical and sceptical of a study (Velterop, 2016). More eyes on a manuscript might also improve replicability because researchers can be notified about errors, alternative interpretations, or logical flaws before formal publication (Oakden-Rayner et al., 2018). Posting preprints of null or negative findings can also alleviate the ‘file-drawer’ problem by giving a platform to results that may not be palatable for formal publication (Verma, 2017). Some have also suggested that preprints be used as a platform for open peer review (Saderi & Greaves, 2021). A recent initiative (Peer Community In; see Table 1) provides such a platform, arranging peer review and requiring a recommendation before preprints are posted. Authors then have the option to submit their ‘accepted’ preprint to a traditional journal.

Table 1. Colleagues
Target behaviours Preregister studies. 
 Submit Registered Reports. 
 Upload preprints (and submit for peer community review). 
 Make data and code openly available. 
 Conduct replication studies. 
Researcher actions Organise an Open Science seminar or workshop. 
 Use presentations to signal your use of open practices. 
 Share templates, guides, infographics, and papers during meetings or via email. 
 Host regular ‘ReproducibiliTea’ meetings with colleagues. 
 Pledge with others on ‘Free Our Knowledge’. 
 Highlight to others the incentives for open practices. 
General resources Preregistration servers 
 Open Science Framework: https://osf.io/ 
 AsPredicted: https://aspredicted.org/ 
 ClinicalTrials.gov: https://clinicaltrials.gov/ 
 Open Science information 
 https://how-to-open.science/ 
 Preregistration guides and templates 
 https://cos.io/prereg/, https://osf.io/zab38/
 van't Veer et al. (2016): https://doi.org/10.31234/osf.io/4frms 
 Registered Report guide 
 https://cos.io/rr/ 
 Preprint servers 
 arXiv: https://arxiv.org/ 
 PsyArXiv: https://psyarxiv.com/ 
 bioRxiv: https://www.biorxiv.org/ 
 OSF: https://osf.io/preprints/ 
 Preprint guide 
 https://help.osf.io/hc/en-us/articles/360019930533-Upload-a-Preprint 
 Example journals that accept preprints for submission 
 PLOS: https://plos.org/open-science/preprints/ 
 eLife: https://elifesciences.org/articles/64910 
 Peer Community In (PCI) 
 Preprint review: https://peercommunityin.org/ 
 Registered Reports: https://rr.peercommunityin.org/about 
 Data repositories 
 OSF: https://osf.io/ 
 Figshare: https://figshare.com 
 Harvard Dataverse: https://dataverse.harvard.edu/ 
 Zenodo: https://zenodo.org/ 
 Data sharing guide 
 Soderberg (2018): https://doi.org/10.1177%2F2515245918757689 
 FAIR data information 
 https://www.go-fair.org/fair-principles/ 
 Guide to replication 
 https://doi.org/10.17605/osf.io/jx2td 
 ReproducibiliTea guide 
 https://reproducibilitea.org/about/ 
 Free Our Knowledge 
 https://freeourknowledge.org/ 

Open Data and Code. Publicly sharing data and analytic scripts can also counter irreproducibility. Most (70%) surveyed researchers reported that they are likely to reuse open datasets (Fane et al., 2019), but a majority of surveyed researchers admitted to sharing data in fewer than 10% of their projects (Houtkoop et al., 2018). Sharing data and code can enable others to verify the appropriateness of a study’s data and analyses, and to check for errors (Klein et al., 2018). Moreover, sharing data and code can make research more reproducible if researchers ensure that the data is findable, accessible, interoperable, and reusable (FAIR; see Wilkinson et al., 2016). There are now many data repositories that researchers can use to supplement their publications (see Table 1). However, explaining why sharing certain data is not feasible or ethical also aligns with transparent practice.
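To make the idea of FAIR-friendly sharing concrete, here is a minimal sketch of preparing a small dataset for deposit in a repository: a tidy, plain-text CSV plus a human-readable data dictionary, so the files remain reusable without proprietary software. All file names and variables are illustrative, not drawn from any particular study.

```python
import csv

# Illustrative tidy records: one row per observation, sensible variable names.
records = [
    {"participant_id": 1, "condition": "control", "score": 97.3},
    {"participant_id": 2, "condition": "treatment", "score": 104.1},
]

# A simple data dictionary describing each variable, to accompany the data file.
data_dictionary = {
    "participant_id": "Integer; anonymised participant identifier.",
    "condition": "String; 'control' or 'treatment'.",
    "score": "Float; task score (arbitrary units).",
}

def write_shareable_files(records, data_dictionary, data_path, dict_path):
    """Write the data as plain CSV and the variable descriptions alongside it,
    keeping both files open, non-proprietary, and easy to reuse."""
    with open(data_path, "w", newline="") as f:
        writer = csv.DictWriter(f, fieldnames=list(records[0].keys()))
        writer.writeheader()
        writer.writerows(records)
    with open(dict_path, "w") as f:
        for variable, description in data_dictionary.items():
            f.write(f"{variable}: {description}\n")

write_shareable_files(records, data_dictionary, "study_data.csv", "codebook.txt")
```

Pairing every shared data file with a codebook like this is a small step toward the ‘interoperable’ and ‘reusable’ parts of FAIR, since future users can interpret each variable without contacting the original authors.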

Replication. Remedying irreproducibility relies on the capacity to assess whether a phenomenon or effect replicates: that is, whether the finding is stable and/or generalisable. If a finding is not replicable, it cannot advance scientific thinking; without replication studies, strong theories that map well onto statistical models are unlikely to develop (Fiedler, 2017; Szollosi et al., 2020). To replicate the work of others, researchers can follow the same procedures that were used originally and see whether they attain similar findings (direct replication), or take a theory or effect and see whether it reoccurs in novel circumstances or populations (conceptual replication; Zwaan et al., 2018). Of course, the capacity to conduct replication studies hinges on open access to papers, data, and analyses, yet again highlighting the benefits of such practices.

Researcher Actions

How then can researchers make the above practices easy, attractive, and socially engaging for their colleagues? We summarise our ideas in Table 1 (but see Kowalczyk et al., 2020 for other useful tips and resources). A first step to making open practices easier is to provide educational opportunities to learn about what Open Science practices are (and are not). Researchers can organise Open Science seminars and workshops for colleagues. These do not have to be extensive or time-consuming; if there are existing journal clubs or regular colloquia, researchers can select articles about Open Science for the group to read, or invite an expert to give a presentation. Many barriers to entry and maintenance can also arise during the research process (Corker, 2018; Munafò et al., 2017; Nosek et al., 2015); thus, sharing practical tips, infographics, video links, templates, guides, and papers during these gatherings, via email, or on an online platform, can reduce barriers to adopting and maintaining open behaviours.

Additionally, researchers can use conference talks, poster presentations, and lab meetings as opportunities to highlight an instance where they have preregistered a study, made data publicly available, or posted the relevant preprint. Signalling these practices will serve as a subtle and persistent social norm. Such a norm can be further strengthened by creating a physical ‘Open Science Wall’ in the department, or by circulating a regular Open Science newsletter.

Hosting ‘ReproducibiliTea’ journal clubs (Orben, 2019) can also foster a social environment for researchers to discuss interesting Open Science ideas and practices. These events can offer opportunities for researchers to raise reservations, dispel misconceptions, and solve problems related to Open Science practices. Regular meetings will foster group cohesion and strengthen identification with the Open Science movement. Researchers can also make social commitments with one another to engage in certain practices; Free Our Knowledge (see https://freeourknowledge.org/) is a platform where individuals can commit to a behaviour (e.g., submit a Registered Report) such that once the number of pledges reaches a critical mass, there is then public pressure to follow through with one’s commitment. If people know that many others are committed to the same behaviour, they can trust that their actions will be part of a broader collective movement.

Meetings, presentations, social media, and discussion forums also provide opportunities to promote the personal advantages that open practices can afford researchers. Some journals now offer Open Science badges to those who preregister or share data (Kidwell et al., 2016). Posting preprints can prevent ‘scooping’ because they are associated with time-stamped digital-object identifiers (DOIs). Moreover, papers associated with a preprint or open data also tend to generate more citations (Fu & Hughey, 2019; Piwowar & Vision, 2013; Serghiou & Ioannidis, 2018), which may be particularly beneficial to early career researchers (see Allen & Mehler, 2019; Berg et al., 2016; Farnham et al., 2017; Pownall et al., 2021; Sarabipour et al., 2019). Emphasising these attractive incentives will likely encourage colleagues to adopt these practices.

Students

Students are the next generation of researchers and research consumers, and thus stand to become the future torchbearers of the Open Science reform. In many disciplines, students are key members of research teams and engage in all aspects of the research process, from conceptualising and conducting studies to communicating the resulting data. Students also have fewer research habits to unlearn and can serve as a conduit for change among more senior researchers who might face the added inertia of entrenched practices. In that role, students might benefit from the suggestions in the previous section for swaying others to adopt transparent research behaviours.

Indeed, many students may be aware of Open Science and generally hold positive views toward it, but this does not necessarily translate into higher levels of implementation (Toribio-Flórez et al., 2021). Students also face unique barriers; those looking to pursue a career in research, for example, are under pressure to succeed via recognised metrics of academic achievement because efforts to innovate or adopt open practices are not yet widely recognised (Schönbrodt, 2019). Knowledge of Open Science is another barrier; doctoral candidates are less likely to implement Open Science practices if not exposed to such practices by their mentors, or in their research methods courses (Toribio-Flórez et al., 2021).

Target Behaviours

The key behaviours that students ought to engage in are the same as those mentioned in the previous section: preregistration, posting preprints, sharing open data and code, and conducting replication studies. These behaviours can be ingrained in students early in their training. Open Science provides the means for students to put statistical and methodological reasoning into practice, and many universities cover open principles and reproducible research practices in science degrees (Chopik et al., 2018). Many students are intrinsically motivated to learn, but others are looking to get a competitive edge in the job market. Teaching Open Science tools and practices can provide both an engaging learning environment and a coveted skill set (e.g., Kathawalla et al., 2021).

Researcher Actions

Researchers typically serve as teachers and mentors at their institutions, and they are therefore uniquely placed to foster Open Science principles and practices among students. We summarise our ideas for how to do so in Table 2. There are several strategies teachers can use to make it easier for students to adopt open practices from their very first course or research project. For example, teachers can design lectures or courses devoted to Open Science theory (e.g., Sarafoglou et al., 2019). Teachers can also embed certain practices into graded assignments. It is common for students to be graded on how well they report experiments. However, teachers could task students with posting a mock preprint on an internal server for fellow classmates to read and review each other’s work before submitting the revised version for grading. Teachers can also set a replication study as the class project, or as the topic of an Honours thesis. Additionally, preregistration can be incorporated into thesis and dissertation projects as a necessary aspect of research proposals. With these learning experiences in place, open practices will be the default as students venture into research careers.

Table 2. Students
Target behaviours Preregister studies. 
 Submit Registered Reports. 
 Upload preprints. 
 Make data and code openly available. 
 Conduct replication studies. 
Researcher actions Design Open Science courses for students. 
 Integrate preregistration and preprints in assessment. 
 Introduce students to open-source platforms and educate them on creating reproducible code. 
 Design group problem-solving activities involving open practices. 
 Demonstrate how to integrate open practices to lab groups. 
General resources Materials for teaching Open Science 
 Framework for open research training: https://forrt.org/ 
 Course materials and syllabi: https://osf.io/zbwr4/, https://osf.io/vkhbt/
 https://www.projecttier.org/tier-classroom/course-materials/ 
 Resources for teaching: 
 https://www.osc.uni-muenchen.de/toolbox/resources_for_teaching/index.html 
 Open-source statistical software 
 R: https://r-project.org/ 
 GitHub: https://github.com/ 
 JASP: https://jasp-stats.org/ 
 Courses for learning statistics with open-source software 
 https://psyteachr.github.io/, https://learningstatisticswithr.com/, https://r4ds.had.co.nz/ 
 Replication guide for undergraduates 
 https://doi.org/10.17605/osf.io/jx2td 
 Open Scholarship Knowledge Base 
 https://www.oercommons.org/hubs/OSKB 
 Student initiative for Open Science 
 https://studentinitiativeopenscience.wordpress.com/ 

Teachers can further habituate students to writing interpretable, reusable code and to honing statistical skills inside and outside the classroom. Students should be introduced to open-source statistical programming languages such as Python and R. Proficiency in these languages, and in the user-friendly open software built on them, gives students the means to share and document their analysis scripts for others to review and reuse. Moreover, teachers can impart ways of writing code and running analysis scripts with detailed commentary so that errors or inconsistencies are easy to discover. Wrangling data into tidy, more readable formats, using sensible variable names, and producing attractive visualisations are other small steps that make analyses and results more interpretable and accessible. Learning how to simulate data from scratch can also help students to connect the dots between different designs, data structures, and data analysis techniques, which in turn helps to improve the clarity of their research plans and preregistration protocols. These skills can be taught through practical challenges in class that provide hands-on experience with reproducible tools and research workflows, making these behaviours easier for students as they continue into their research training.
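The simulate-then-analyse exercise described above can be sketched in a few lines. This is a minimal, hypothetical classroom example in Python (the paragraph names both Python and R): a fixed random seed makes the simulated ‘dataset’ exactly reproducible, descriptive names and docstrings keep the script readable, and all design parameters are illustrative.

```python
import random
import statistics

def simulate_scores(n_per_group, control_mean, treatment_mean, sd, seed):
    """Simulate scores for a hypothetical two-group experiment.

    A fixed seed means anyone rerunning this script obtains the
    identical 'dataset', which is the core of the reproducibility lesson.
    """
    rng = random.Random(seed)  # local generator: no hidden global state
    control = [rng.gauss(control_mean, sd) for _ in range(n_per_group)]
    treatment = [rng.gauss(treatment_mean, sd) for _ in range(n_per_group)]
    return control, treatment

def mean_difference(control, treatment):
    """Descriptive group difference (treatment minus control)."""
    return statistics.mean(treatment) - statistics.mean(control)

# Illustrative parameters: 50 participants per group, a 5-point true effect.
control, treatment = simulate_scores(
    n_per_group=50, control_mean=100, treatment_mean=105, sd=15, seed=2024
)
print(round(mean_difference(control, treatment), 2))
```

Because students chose the true parameters themselves, they can check whether their planned analysis recovers the effect they built in, which makes gaps in a preregistration protocol (e.g., an unspecified exclusion rule) visible before any real data are collected.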

Of course, teachers have the means to make these practices socially desirable as well. Classrooms provide a rich environment for social support, and the activities mentioned above can easily be adapted for group projects. Supervisors and mentors also play an important role in students’ lives, including establishing and perpetuating social norms regarding research practices. Thus, modelling rigorous, open research practice is important; if students observe their mentors thoughtfully integrating Open Science into their own research, or in the lab more generally, then students are likely to follow suit.

Directly persuading other researchers and students to engage in open practices is unlikely to be effective in isolation if top-down influences run counter to these efforts. Institutions in the research ecosystem incentivise some behaviours and dissuade others. If these institutions adopt structures that encourage Open Science, then open practices will prosper.

We now turn our attention to institutions: departments and faculties, universities, academic libraries, journals, and funders. For each institution, we describe its role in research and then outline several target behaviours the institution can implement to influence researchers and students to adopt open practices. Some researchers have direct decision-making power and may find these ideas valuable if they have considerable sway in enacting such policies and practices. Many researchers, however, do not hold positions of considerable influence. We therefore end each section with recommended actions that the typical researcher, even one without a position of influence, can take to effect institutional change. This is not an exhaustive list, but we aim to provide a few concrete ways for individual researchers to indirectly influence scientific norms and practices by targeting institutional change in whatever capacity they can. Again, we take inspiration from the principles of behaviour change: make it easy, social, and attractive.

Departments and Faculties

Departments (and faculties) influence the practices of their researchers in several ways: they disseminate information about trends and changes in scientific practices, set curricula that determine what students are taught and how they are assessed, and have considerable say in hiring and promotion decisions and in how researchers are rewarded. Many departments and faculties still evaluate researchers by how regularly they publish in ‘high-impact’ journals or how frequently their work is cited (Rice et al., 2020). These metrics can motivate researchers to conduct quick and easy experiments that produce just enough new data for a new publication. To illustrate, the fastest way to accumulate publications is to tweak experiments in minor ways and write a separate article for each of these ‘least publishable units’ (Broad, 1981; also known as ‘salami slicing’). Collecting new data for publication’s sake is not a recipe for scientific progress, but given that many departments rely on publication counts when evaluating researchers, it is an ingredient of career progress. If departments and faculties instead focused more on quality (transparency and contribution to scientific progress), researchers and students would be incentivised to devise careful experiments that aim to falsify or extend key theories in the field, and to share their data, code, and materials. Lacking the requisite learning experiences can also lead students to wander down research paths without the knowledge and skills required to conduct quality research. Departments can be either catalysts or barriers to Open Science reform among researchers and students, depending on the messaging, infrastructure, and incentives they put in place.

Target Behaviours

Those in positions of influence in departments and faculties—deans, department heads, and committees—often make decisions regarding administrative processes, online infrastructure, and hiring and promotion practices. Of course, researchers themselves tend to hold these positions and can therefore implement many initiatives directly. For instance, departments could require a (mock) preregistration as part of the ethical review process, or require plans to make data available and reusable where appropriate.

Awareness about open practices can grow if departments implement useful online infrastructure. Heads of departments and academic committees could provide funds to create and curate Open Science resources and have these available on the department’s website. Researchers would then have easy access to guides and support services. Departments could also fund workshops and research projects related to Open Science for researchers and students. Additionally, departments could consider hiring an expert on data management to advise researchers on how best to organise and prepare research data for open sharing (in line with the FAIR principles and national and international guidelines).

Departments can further encourage open practices by emphasising research quality when evaluating candidates for hire and promotion. For instance, job advertisements and interview panels could ask candidates for evidence of open practices. Rather than reporting their number of publications or citations, candidates could be asked to comment on the quality and likely replicability of their publications, and on the steps they have taken to increase transparency and contribute to scientific advances more broadly. Candidates could, for example, be asked to submit an annotated CV detailing preregistrations, replications, open data and code, power analyses, theoretical motivations, and rigorous experimental design. The dissertations of postgraduate students could be evaluated in similar ways. Such evaluation criteria signal that these practices are widespread and highly valued, and might encourage researchers to reconsider their own practices (Nosek, 2019). For those interested in criteria for assessing researchers, see Moher and colleagues’ (2020) review, the San Francisco Declaration of Research Assessment (DORA, 2012), and the Leiden Manifesto (Hicks et al., 2015). Gernsbacher (2018) also provides specific recommendations for how to reward open practices.

Researcher Actions

In Table 3, we provide a summary of actions and resources for encouraging departments and faculties to promote Open Science. Perhaps the most effective way for the typical researcher to encourage change at the departmental or faculty level is to serve on subcommittees and hiring panels, given that opportunities to take on these roles are often available regardless of career stage. In these roles, researchers can draw attention to research practices that improve (or threaten) research integrity. For instance, one can serve on an ethics subcommittee to promote preregistration and open data plans in ethical review processes, on a teaching and learning subcommittee to promote Open Science education for students, and on hiring panels and committees to promote changes to researcher evaluation. Committees and panels give researchers a social platform to encourage these initiatives from the bottom up by bringing transparency and scientific contribution to the forefront of decision-making processes. Providing examples of each initiative can also make the transition easier; for example, researchers can present example open evaluation criteria (see Table 3) or an Open Science curriculum (see Table 2).

Table 3. Departments and Faculties
Target behaviours:
 Include preregistration as part of ethical review processes.
 Create support and online infrastructure for open practices.
 Consider Open Science practices in hiring and promotion processes.
Researcher actions:
 Serve on subcommittees and panels to promote Open Science.
 Pitch changes at staff/academic meetings.
 Offer example open evaluation criteria for candidates for hire and promotion.
General resources:
 Example Open Science guide: https://www.uu.nl/en/research/open-science
 Leiden Manifesto for research evaluation: http://www.leidenmanifesto.org/
 Job descriptions and offers with an Open Science focus: https://doi.org/10.17605/osf.io/7jbnt

Of course, these official roles are not always available, but any faculty member can ask to have Open Science initiatives added as agenda items at staff meetings and use these opportunities to point to their advantages and benefits. The increasingly normative nature of Open Science (see Christensen et al., 2020; Nosek & Lindsay, 2018), and frequent calls for more transparent research from journals and funding bodies, suggest that a track record of open practices and teaching will become increasingly sought after. Adopting and fostering Open Science at the departmental level early could therefore be highly consequential: such initiatives are likely to benefit institutional rankings and, in turn, attract more funding and more demand from prospective students (McKiernan et al., 2016).

Universities

Universities are complex organisations with many often-competing interests. Like departments and faculties, universities often decide which researchers to hire and promote, which researchers to recognise with prizes, awards, and funds, and when and how to publicise researchers’ work. Universities also decide which training and development activities are recommended or compulsory, how integrity issues are handled, and what type of support to provide researchers and students.

University administrators want researchers at their institution to produce well-cited research in prestigious journals because these metrics are often used by governments and institutions when evaluating a university’s research output (e.g., the Research Excellence Framework; Times Higher Education; World University Rankings; see Huang, 2012). Such metrics and rankings influence how funds are allocated and can affect (among other things) the enrolment rates of domestic and international students (Harvey, 2008; Hurley, 2021; Nietzel, 2021). Some universities have even implemented financial incentive structures to reward publications in high impact journals (Abritis & McCook, 2017; Quan et al., 2017).

Many of the criteria in current frameworks and rankings are narrowly defined (Martin, 2011) and overlook societal impact, teaching quality, and open practices (Gadd, 2020; Pagliaro, 2021), yet they nonetheless influence the local metrics that universities use to evaluate academic staff. As such, they largely incentivise poor research practice: by rewarding publication volume and citations, they implicitly encourage researchers to engage in QRPPs (see Smaldino & McElreath, 2016).

Target Behaviours

There are a number of initiatives that universities can adopt to incentivise more open practices among their researchers; we highlight three, in no particular order. First, universities can sponsor Open Science task forces and join a national Reproducibility Network (if one exists; e.g., https://www.ukrn.org), which typically includes nominating at least one Open Science officer or leader. An Open Science task force can comprise anyone motivated to improve scientific practice at their university: academics, deans, or professional staff. The task force (or officer) can lead initiatives on behalf of the wider community, such as determining researchers’ attitudes and perceived barriers to Open Science, examining institutional policies and practices, suggesting alternative open policies, and making general recommendations to the Deputy Vice Chancellor (or equivalent) on matters of Open Science (Munafò, 2019). Members of the task force or network could also offer resources and ongoing training to researchers and students at the university.

Second, universities can adopt ‘OSF Institutions’, a free scholarly web tool designed to enhance transparency, foster collaboration, and increase the visibility of research outputs at the institutional level. ‘OSF Institutions’ makes it easy for users to incorporate the Open Science Framework (as well as other data repository services; see Table 4) into their existing research workflows, and universities can recommend the Open Science Framework as a platform on which to manage research projects and make materials and data available to others.

Table 4. Universities
Target behaviours:
 Establish an Open Science task force or officer, or become a member of a Reproducibility Network.
 Adopt ‘OSF Institutions’.
 Sign the Declaration of Research Assessment (DORA).
Researcher actions:
 Start an Open Science Community.
 Conduct institution-wide surveys on open practices.
 Pitch suggestions for Open Science reform to the chancellery (or equivalent).
 Draft Open Science commitment statements.
General resources:
 OSF Institutions: https://www.cos.io/products/osf-institutions
 Declaration of Research Assessment (DORA): https://sfdora.org/read/, https://sfdora.org/signers/
 Open Science Communities:
  Starter Kit: https://www.startyourosc.com/
  Network of (German-speaking) Open Science Initiatives: https://doi.org/10.17605/osf.io/tbkzh
 Example Open Science task force report and surveys: https://doi.org/10.17605/osf.io/vpwf7
 Example Open Science commitment statements:
  University of Helsinki: https://www.helsinki.fi/en/research/research-integrity/open-science
  Sorbonne University: https://www.sorbonne-universite.fr/en/research-and-innovation/research-strategy/commitment-open-science
 Leiden Ranking for research evaluation: https://www.leidenranking.com/

Third, universities (or even individual departments) can commit to prioritising Open Science in their endeavours, which in turn signals what researchers ought to value when conducting and disseminating research. As a first step, universities could sign DORA (2012) and commit to counteracting the perverse incentives that traditional ranking systems promote. DORA makes 18 recommendations to transform how the academic community evaluates researchers and research outputs. More than 2,200 organisations and over 17,000 individuals across the world are now signatories, many of which have revised their research assessment guidelines to align with DORA. As more institutions sign up, assessment frameworks will increasingly align with open practices. From here, a university can devise its own public commitment statement. Public declarations that explicitly communicate open values, norms, and aspirations provide top-down goals for individual researchers within the institution to consider when conducting and disseminating research.

Researcher Actions

Though few researchers have direct decision-making power at a university-wide level, they can draw the attention of administration heads to emerging Open Science initiatives and behaviours. We summarise our recommended researcher actions in Table 4. Researchers might first consider starting an Open Science Community (OSC; see Armeni et al., 2021), which can be a grassroots forerunner to an Open Science task force. An Open Science Community is a group of researchers who aim to educate each other (and others) about Open Science tools and practices. Beyond education, Open Science Communities also discuss how the university can support its academics, and they inform university administrators on how to shape Open Science policies. For example, Open Science Communities can design and conduct surveys to understand the perceived barriers to Open Science at their institution. A large group of researchers collectively advocating for initiatives can place considerable social pressure on institutions to enact change.

Open Science Communities and researchers can also give brief presentations to members of the chancellery (or equivalent) to advocate for Open Science initiatives such as adopting ‘OSF Institutions’, signing DORA, or implementing an Open Science task force. An effective pitch might explain the problem (i.e., the replication crisis), demonstrate its influence on research across numerous disciplines, and highlight concrete actions the university can take. The pitch ought to mention that other universities have already taken steps to address such issues and then provide similar solutions to make any change easy to implement. For example, one could illustrate how other universities and funders have aligned their assessment guidelines with DORA or provide a draft statement of public commitment to Open Science. In these pitches, researchers should also highlight the appeal of the proposed initiatives, including the potential for the university to become a leader in the emerging Open Science space and new university rankings that place greater value on open practices (e.g., Leiden Ranking).

Libraries

Academic libraries deliver services across the full research lifecycle. They provide access to scholarly resources and support research activities such as literature searches, systematic reviews, working with data (e.g., The Carpentries; https://carpentries.org/), bibliometrics, funding opportunity identification, and grant writing. Academic libraries have also championed Open Access (OA) publishing—the public dissemination of scholarly and scientific literature free of charge—for more than two decades, motivated in part by the idea that research should be publicly available to advance discovery and drive innovation. Open Access can potentially remedy the ever-increasing publishing costs in academia. However, some publishers have exploited Open Access for profit, and many traditional subscription journal publishers now also charge authors excessive up-front article processing charges (APCs). In response, some libraries have invested significantly in repository infrastructure to support access to research outputs (including data), while others have begun to manage APCs and provide services to ensure that researchers meet the Open Access requirements of funding bodies. Open Access publishing, together with the increasing emphasis on managing research data to enable purposeful sharing, is among the major global drivers that have shaped academic libraries as they are today (Brown et al., 2018). Meanwhile, all academic libraries manage access to online subscription resources (e.g., journals), routinely negotiate deals with major publishers, and grapple with budgets dominated by their online journal spend. Researchers are often unaware of these activities, yet it is the research community who, via peer review and editorial roles, effectively contribute free labour to the publishers of the very resources libraries pay for.

Target Behaviours

Libraries are already actively engaged in Open Access, research data management, and other areas directly relevant to open research, but they can struggle to have broader impact. To connect library services to the broader Open Science agenda, libraries require a more holistic approach: one that promotes and advocates broadly for Open Access and research data management across the research lifecycle (Tzanova, 2020). Recruiting library staff with strong research backgrounds may also enable more holistic support for Open Science.

A key piece of infrastructure in the academic library’s toolkit is the institutional repository, but these repositories are often poorly integrated with other systems and researcher workflows, and rarely connect seamlessly with external infrastructure that supports open scholarly communication. Libraries therefore ought to consider how they can provide sophisticated, user-friendly repository infrastructure that integrates with researcher workflows and external systems (including external tools that enable a more open approach to active projects).

Libraries also need to engage more deeply with the research community and gather its feedback. Library staff often have large networks built through liaison and other activities, and these networks can be leveraged by leading institution-wide events that connect researchers from various disciplines and drive a shared understanding of the opportunities afforded by Open Science. Some academic libraries have also had success in driving change in the scholarly communication system by leading campus-wide discussions on the challenges of scholarly publishing. For example, a committee at the University of California unanimously endorsed the Declaration of Rights and Principles to Transform Scholarly Communication (see Table 5) to directly address concerns over a financially unsustainable subscription model that extracts money from universities and free labour from authors.

Table 5. Libraries
Target behaviours:
 Adopt a holistic approach to staffing targeted toward the Open Science agenda.
 Lead campus-wide discussions and events to address APCs and barriers to Open Access.
 Provide sophisticated, user-friendly repository infrastructure.
Researcher actions:
 Engage with libraries to discover a strategic approach to Open Access publishing and APCs.
 Prioritise the circumstances under which one will do unpaid work.
 Link ORCiD to all of one’s research projects.
 Create and use open educational resources supported by institutional libraries.
General resources:
 Open Science guides for libraries: https://www.fosteropenscience.eu/learning/open-science-at-the-core-of-libraries/, https://www.ala.org/acrl/publications/keeping_up_with/open_science
 Documentary about academic publishing: Schmitt (2018): https://paywallthemovie.com/
 Example commitments to open scholarship: https://osc.universityofcalifornia.edu/uc-publisher-relationships/
 Find Open Access journals: https://doaj.org
 ORCiD information: https://info.orcid.org/what-is-orcid/
 Open Educational Resources: https://www.oercommons.org

Researcher Actions

Much of what libraries currently do aligns with the goals of Open Science, and researchers can make these initiatives easier by engaging more with library services (see Table 5). Researchers should proactively seek to understand the pressures that publishers and vendors place on their university’s library budget and the sophisticated games that publishers play to increase revenue. By engaging with libraries, researchers can develop a strategic approach to their scholarly publishing practices, considering when and where they should (and should not) pay. Moreover, researchers should explore all publishing models that enable Open Access to their research outputs, including green Open Access routes via institutional repositories, and ensure they retain ownership rights in their author-accepted manuscripts.

Researchers have some social influence over publishers through their roles as editors and peer reviewers. They therefore ought to prioritise the circumstances under which they are prepared to do unpaid work, and voice concerns directly when current practices are not aligned with Open Science. Importantly, researchers can share this activity with their library colleagues and work out how they can further support libraries in cancelling subscriptions to closed journals and/or moving to other models (e.g., Read and Publish agreements; see Borrego et al., 2021).

Workflow and data management practices can also be integrated more seamlessly with the library’s infrastructure. A small step each individual researcher can take to help make the open research ecosystem more efficient is to get an ORCiD and use it whenever and wherever it is enabled. Researchers should ask their institutional repository to integrate with ORCiD to help ensure the publications in their ORCiD profile are openly available.

Finally, researchers can engage more with library services and resources. As noted previously, researchers are commonly involved in teaching and learning at universities and have the option to use and promote Open Educational Resources (OERs) that are supported by the library rather than rely on journal articles locked behind a paywall, or expensive textbooks (print and ebooks).

Journals

Research is primarily disseminated via refereed journal articles. Journal editors and publishers want to publish work that is seemingly rigorous, of high quality, and ultimately ‘impactful’ (e.g., well cited) because this enhances the journal’s prestige. Prestige attracts research that is likely to have the greatest impact in the respective field, which in turn tends to bring more prestige, more citations, more subscriptions, and higher revenue. Various organisations (e.g., Elsevier) have accordingly generated indices that seek to reflect the quality and impact of a journal, such as its impact factor (https://researchguides.uic.edu/if/impact). These indices have become a key currency, and considerable resources are invested in monitoring and publishing journal impact factors.

However, much work that is ‘impactful’ is not necessarily rigorous, important, or transparent, and the quest for impactful work has led to a range of problematic consequences. For example, journals have tended to publish research that claims to be novel and reports statistically significant results in the hope that this work will have great impact. At the same time, journals have been less likely to publish research that reports weak, null, or negative findings, or replication research (Ferguson & Brannick, 2012). Yet if any field is to accumulate a reliable body of knowledge, research needs to be rigorous (regardless of novelty, popularity, or statistical significance) and reported in an open, transparent, reproducible way so that people can grasp the actual state of what we know and build on it appropriately.

Target Behaviours

There are several ways for those in positions of influence at journals (e.g., editors) to encourage open, rigorous, and reproducible research. For instance, editors (as well as funders) can adopt the Transparency and Openness Promotion (TOP) guidelines, which describe standards across eight dimensions of openness and transparency (e.g., preregistration, data transparency) for submitted research (Nosek et al., 2015). Editors could adapt these guidelines to provide reviewers with explicit evaluation criteria and so promote clear, collaborative, and constructive peer review. Journals should also consider making TOP recommendations the default at submission. For example, submission processes can be redesigned so that authors must opt out of making their data and materials openly available. At the journal Cognition, an open data policy made open data more prevalent and reusable (Hardwicke et al., 2018).

Editors can also alter review processes in other ways. For example, they could implement open peer review (pre- and post-publication), which tends to encourage more respectful and constructive feedback as well as clearer communication between reviewers, editors, and authors (Ross-Hellauer, 2017). Result-blinded peer review (Grand et al., 2018) is another initiative that editors could adopt to avoid a bias toward publishing ‘positive’ results.

Another simple (and increasingly common) way in which journals can support Open Science is to offer a wider variety of publication formats that emphasise the quality of the research process rather than its outcomes. For example, journals can adopt Registered Reports as a submission format (Nosek & Lakens, 2014) and encourage authors to submit replications.

Finally, editors can increase the visibility and appeal of open practices among researchers by adopting Open Science Badges (https://www.cos.io/initiatives/badges). These badges are awarded to articles if authors, for example, (a) preregister, or make openly available their (b) materials and (c) data. The use of badges is an easy, low-cost initiative, and serves as an injunctive norm of appropriate research behaviours (signalling what we value and ‘should’ do). The introduction of these badges has increased the extent to which researchers engage in open practices such as data sharing (Kidwell et al., 2016), while strengthening the reliability and trustworthiness of the research that is published.

Researcher Actions

How then can a typical researcher influence journals? In Table 6, we provide a summary of ideas. The typical researcher is likely to interact with senior editors when submitting manuscripts for review and when reviewing manuscripts. As a first step, reviewers ought to consider open evaluation guides and resources (such as TOP guidelines) when reviewing manuscripts, and commend instances where authors have demonstrated open practices such as preregistration or open data. These actions are likely to plant the seed that such practices are becoming increasingly normative and worthwhile.

Table 6. Journals
Target behaviours:
 Adopt the TOP guidelines for manuscript evaluation.
 Embrace open peer review and/or result-blinded peer review.
 Welcome Registered Reports (and Peer Community In) and replication studies.
 Adopt Open Science badges.
Researcher actions:
 Commend the use of open practices when reviewing manuscripts.
 Suggest initiatives when interacting with editors, referring to relevant resources, norms, and benefits.
 Prioritise reviewing manuscripts that demonstrate a commitment to open practices (PRO initiative).
General resources:
 TOP guidelines: https://www.cos.io/initiatives/top-guidelines, https://osf.io/9f6gx, https://osf.io/fe2pz/
 Open peer review information: https://plos.org/resource/open-peer-review/
 Example guidelines for replication research: https://royalsocietypublishing.org/rsos/replication-studies
 Journals that offer Registered Reports: https://www.cos.io/initiatives/registered-reports
 Journals that have endorsed ‘Peer Community In’ Registered Reports: https://rr.peercommunityin.org/about/pci_rr_friendly_journals
 Open Science badges: https://www.cos.io/initiatives/badges
 Peer Reviewers’ Openness (PRO) initiative: Morey et al. (2016): https://www.opennessinitiative.org/

Editorial teams may have reservations about promoting open practices (Hopp & Hoover, 2019). However, in their interactions with editors, reviewers can take the opportunity to suggest initiatives such as TOP guidelines, badges, wider publication formats, and open peer review. Relevant information can also be made easier to find by providing useful links. Moreover, one can point out the increasing norms around such practices; at the time of writing, more than 75 journals have implemented Open Science badges, more than 250 have Registered Reports as a publication format, and more than 1,000 have implemented TOP guidelines (see http://cos.io/). Highlighting what the journal can gain will also increase how attractive these initiatives appear. Introducing TOP guidelines, for example, is likely to reduce the time that authors and reviewers spend communicating, and it can improve the reporting standards of published research (Nosek et al., 2015).

Furthermore, as reviewers, researchers can be selective about what they agree to review. For example, they can prioritise reviewing articles that demonstrate a commitment to open practices, such as open data (or where explanations for closed data are provided), and they ought to explain this decision when communicating with editors. The Peer Reviewers’ Openness (PRO) initiative is one guide that advocates this action to drive change, and it serves to incentivise journals to encourage authors to adopt open practices. Similarly, when deciding where to publish their work, researchers can publicly commit to prioritising journals that are committed to Open Science (e.g., Meta-Psychology, Collabra).

Funders

Funding bodies are often responsible for deciding how and where to allocate research funds from government, industry, and philanthropic sources. These funds are limited, and the process of evaluating research proposals is competitive. Grant applications are typically written by researchers before being sent to specialists in the field for review. A committee then assesses and ranks these applications to prioritise where available funds are allocated. This review process typically favours researchers with more publications, higher citation counts, and a track record of publishing in ‘high-impact’ journals, which then places these same researchers in a better position to publish further research (i.e., the Matthew effect; Bol et al., 2018). In fact, researchers at the top 20% of universities received over 60% of the funding from the National Science Foundation (Drutman, 2012). A model that consistently awards more funds to established, senior academics and to conventional research may stifle scientific advancement and scientific reforms such as those that advocate for Open Science.

Target Behaviours

There are several ways to address these issues. Some funding bodies have adopted policies that promote Open Science by committing to make funded output freely accessible (e.g., the Social Sciences and Humanities Research Council of Canada; the National Institutes of Health; cOAlition S). The European Research Council (2017) has also proposed that funded projects attach open and reusable data to their output. Policies that require open practices from funded projects would likely increase the uptake of open practices among researchers.

Funding bodies ought also to consider altering how funds are allocated. Moderate recommendations include changes to evaluation, such as a greater focus on the quality rather than the quantity of a researcher’s scientific contributions. A focus on quality may prioritise an investigator’s plans to engage in open practices such as preregistration and public sharing of data and code, as well as their plans to conduct replications and submit Registered Reports. The Dutch Research Council (NWO) and the European Research Council have both signed DORA and have committed to weighting open practices and theoretical contribution more heavily. The NWO, for instance, now requires CVs with a narrative academic profile and no more than five or ten key research outputs, in place of typical ‘impact’ metrics (e.g., the h-index). More radical suggestions for allocating funding include innovation lotteries and open review (Gurwitz et al., 2014; Liu et al., 2020). Perhaps in the long term, funders ought to consider randomising grants among proposals that pass a certain threshold of quality, where quality encompasses strong theory and a commitment to open practices and Registered Reports.

Researcher Actions

A greater appreciation of Open Science when determining how research funds are allocated can be promoted through several small actions. We summarise some suggested actions in Table 7. Senior researchers who serve as reviewers on funding committees are in a position to favourably weigh those projects that signal quality rather than quantity. Researchers more generally, however, can also influence funding processes in how they write their grant proposals. For instance, researchers can include prepared statements of commitment to open practices in grant applications even when these are not specifically required. Funders often ask for broader impact statements and a history of translating research into action. A track record of open data, open access, and replication studies may be seen favourably in this context. Researchers can also outline how they plan to make their research outputs openly available, or their plans to provide Open Science training to postgraduate and postdoctoral students. These plans can even be accounted for in the project’s budget to demonstrate how easily these behaviours can be incorporated into a research project.

Table 7. Funders
Target behaviours
- Plan to fund open access publishing and require open data for funded projects.
- Adopt review processes that value open practices and theoretical contribution.
- Adopt narrative CV formats for grant proposals and ‘best five’ research outputs.

Researcher actions
- As a reviewer, promote research proposals that include open and rigorous research practices.
- Include statements of commitment to open practices in grant applications.
- Include impact statements, plans, and budgets to make data and publications openly accessible.
- Advertise alternative metrics and best outputs in grant applications.

General resources
- Example Open Science commitment statement: http://www.researchtransparency.org/
- Example open CV format: https://www.nwo.nl/en/dora
- Integrating Open Science into grant proposals: https://www.fosteropenscience.eu/content/winning-horizon-2020-open-science
- Open access publication initiative: cOAlition S; https://www.coalition-s.org/about/
- Data management plan guides and templates: https://www.fosteropenscience.eu/index.php/foster-taxonomy/research-data-management, https://dmponline.vu.nl/public_templates
- Guide on collaborating with industry using Open Science: https://www.cos.io/blog/how-to-collaborate-with-industry-using-open-science

It is common for university research offices to edit grant proposals to place emphasis on traditional research metrics. However, researchers can take it upon themselves to highlight their five or ten ‘best’ papers, to write a statement of their broader impact, or to use less traditional metrics such as Altmetric. If an increasing number of researchers do the same, this can create social pressure for these actions to become the expectation rather than the exception.

Researchers ought also to use open practices when collaborating with industry funders and partners. In applied fields, for instance, there is a great deal of communication between researchers and industry collaborators. Searston et al. (2019) have outlined ways in which the OSF and other open tools can keep partners updated at every stage of a research project. Transparency while working with industry will likely improve the quality of the end product and establish a norm of open collaboration.

Our aim in this paper has been to provide recommendations and resources that the everyday researcher can use to promote Open Science. For various nodes and stakeholder groups in the research ecosystem, we described how current behaviours, norms, and cultures sustain irreproducibility and slow scientific progress, while also suggesting alternative behaviours and practices that are more conducive to Open Science. Most critically, however, we recommended actions that individual researchers can take to promote these changes. We also used two behaviour change frameworks—EAST and the Pyramid of Culture Change—to ground these recommendations. In essence, these frameworks propose that, for behaviours to be adopted, they ought to be made easy, social, and attractive.

In the first part of this paper, we proposed ways that researchers could directly influence open practices among individuals with whom they work closely: colleagues and students. Progress, however, often also hinges on top-down influences from larger institutions. At the same time, there will be little drive for institutional change without pressure from researchers. In the second part of this paper, we proposed ways in which institutions—departments and faculties, universities, libraries, journals, and funders—can promote open practices, and suggested actions that the typical researcher can take to influence institutions despite not having direct decision-making power. Practices across the scientific community determine the quality of the research that is generated and disseminated. A holistic approach to improving the infrastructure, norms, and reward structures is needed to shift to a culture of Open Science. Inspired by principles of behaviour change, we hope to have provided useful means to empower researchers in this endeavour.

Here we list the authors who contributed to each section: Introduction (SGR), Colleagues (MAB, JB, HB, CDK & DM), Students (RAS), Departments/faculties (JMC, KJ, REL & HAS), Universities (JLB & SGR), Journals (NKS), Libraries (AT) and Funders (SGR & HAS). The paper’s structure was organised by SGR. Additionally, MAB, JB, JLB, KJ, CDK, SGR, HAS, RAS, NKS, and JMT contributed to general editing. The idea for this paper was conceived by JMT during the Society for the Improvement of Psychological Science (SIPS) 2019 meeting in Rotterdam.

This research was supported by grant No. LP170100086 from the Australian Research Council to JMT and RAS, and by a grant from the National Science Center (2015/19/B/HS6/01253) to KJ.

JC is president of the Association for Interdisciplinary Metaresearch and Open Science (AIMOS), a non-profit organization. All other authors have no conflicts of interest to declare.

1.

In this paper, we primarily focus on rigor and transparency as main aspects of ‘Open Science’. However, other, much broader definitions encompass aspects of inclusivity, equity, or citizen science (see European Commission, 2019; Fox et al., 2021; Masuzzo, 2019).

Abritis, A., & McCook, A. (2017, August 10). Cash bonuses for peer-reviewed papers go global. Science. https://doi.org/10.1126/science.aan7214
Allen, C., & Mehler, D. M. A. (2019). Open Science challenges, benefits and tips in early career and beyond. PLoS Biology, 17(12), e3000246. https://doi.org/10.1371/journal.pbio.3000246
Armeni, K., Brinkman, L., Carlsson, R., Eerland, A., Fijten, R., Fondberg, R., Heininga, V. E., Heunis, S., Koh, W. Q., Masselink, M., Moran, N., Baoill, A. Ó., Sarafoglou, A., Schettino, A., Schwamm, H., Sjoerds, Z., Teperek, M., van den Akker, O. R., van’t Veer, A., & Zurita-Milla, R. (2021). Towards wide-scale adoption of open science practices: The role of open science communities. Science and Public Policy, 48(5), 605–611. https://doi.org/10.1093/scipol/scab039
Bauchner, H. (2017). The rush to publication: An editorial and scientific mistake. JAMA, 318(12), 1109–1110. https://doi.org/10.1001/jama.2017.11816
Berg, J. M., Bhalla, N., Bourne, P. E., Chalfie, M., Drubin, D. G., Fraser, J. S., Greider, C. W., Hendricks, M., Jones, C., Kiley, R., King, S., Kirschner, M. W., Krumholz, H. M., Lehmann, R., Leptin, M., Pulverer, B., Rosenzweig, B., Spiro, J. E., Stebbins, M., … Wolberger, C. (2016). Preprints for the life sciences. Science, 352(6288), 899–901. https://doi.org/10.1126/science.aaf9133
Bishop, D. V. (2020). The psychology of experimental psychologists: Overcoming cognitive constraints to improve research: The 47th Sir Frederic Bartlett Lecture. Quarterly Journal of Experimental Psychology, 73(1), 1–19. https://doi.org/10.1177/1747021819886519
Bol, T., de Vaan, M., & van de Rijt, A. (2018). The Matthew effect in science funding. Proceedings of the National Academy of Sciences, 115(19), 4887–4890. https://doi.org/10.1073/pnas.1719557115
Borrego, Á., Anglada, L., & Abadal, E. (2021). Transformative agreements: Do they pave the way to open access? Learned Publishing, 34(2), 216–232. https://doi.org/10.1002/leap.1347
Broad, W. J. (1981). The publishing game: Getting more for less. Science, 211(4487), 1137–1139. https://doi.org/10.1126/science.7008199
Brown, S., Alvey, E., Danilova, E., Morgan, H., & Thomas, A. (2018). Evolution of Research Support Services at an Academic Library: Specialist Knowledge Linked by Core Infrastructure. New Review of Academic Librarianship, 24(3–4), 337–348. https://doi.org/10.1080/13614533.2018.1473259
Camerer, C. F., Dreber, A., Holzmeister, F., Ho, T.-H., Huber, J., Johannesson, M., Kirchler, M., Nave, G., Nosek, B. A., Pfeiffer, T., Altmejd, A., Buttrick, N., Chan, T., Chen, Y., Forsell, E., Gampa, A., Heikensten, E., Hummer, L., Imai, T., … Wu, H. (2018). Evaluating the replicability of social science experiments in Nature and Science between 2010 and 2015. Nature Human Behaviour, 2(9), 637–644. https://doi.org/10.1038/s41562-018-0399-z
Chambers, C. D., Feredoes, E., Muthukumaraswamy, S. D., & Etchells, P. (2014). Instead of “playing the game” it is time to change the rules: Registered Reports at “AIMS Neuroscience” and beyond. AIMS Neuroscience, 1(1), 4–17. https://doi.org/10.3934/neuroscience.2014.1.4
Chopik, W. J., Bremner, R. H., Defever, A. M., & Keller, V. N. (2018). How (and whether) to teach undergraduates about the replication crisis in psychological science. Teaching of Psychology, 45(2), 158–163. https://doi.org/10.1177/0098628318762900
Christensen, G., Wang, Z., Levy Paluck, E., Swanson, N., Birke, D., Miguel, E., & Littman, R. (2020). Open science practices are on the rise: The State of Social Science. https://osf.io/preprints/metaarxiv/5rksu/
Corker, K. (2018). Open science is a behavior. Center for Open Science. https://cos.io/blog/open-science-is-a-behavior/
DeHaven, A. (2017). Preregistration: A Plan, Not a Prison. Center for Open Science. https://cos.io/blog/preregistration-plan-not-prison/
Devezer, B., Navarro, D. J., Vandekerckhove, J., & Ozge Buzbas, E. (2020). The case for formal methodology in scientific reform. Royal Society Open Science, 8(3), 200805. https://doi.org/10.1098/rsos.200805
DORA. (2012). Declaration on Research Assessment. https://ascb.org/dora/
Drutman, L. (2012). How the NSF allocates billions of federal dollars to top universities. Sunlight Foundation. http://sunlightfoundation.com/blog/2012/09/13/nsf-funding/
Duvendack, M., Palmer-Jones, R., & Reed, W. R. (2017). What Is Meant by “Replication” and Why Does It Encounter Resistance in Economics? American Economic Review, 107(5), 46–51. https://doi.org/10.1257/aer.p20171031
Edwards, M. A., & Roy, S. (2017). Academic Research in the 21st Century: Maintaining Scientific Integrity in a Climate of Perverse Incentives and Hypercompetition. Environmental Engineering Science, 34(1), 51–61. https://doi.org/10.1089/ees.2016.0223
European Commission. (2019). Open Science. https://ec.europa.eu/info/sites/default/files/research_and_innovation/knowledge_publications_tools_and_data/documents/ec_rtd_factsheet-open-science_2019.pdf
European Research Council. (2017). Guidelines on the implementation of Open Access to scientific publications and research data in projects supported by the European Research Council under Horizon 2020. https://ec.europa.eu/research/participants/data/ref/h2020/grants_manual/hi/oa_pilot/h2020-hi-oa-pilot-guide_en.pdf
Fane, B., Ayris, P., Hahnel, M., Hrynaszkiewicz, I., Baynes, G., & Farrell, E. (2019). The State of Open Data Report 2019. Digital Science. https://doi.org/10.6084/m9.figshare.9980783.v2
Farnham, A., Kurz, C., Öztürk, M. A., Solbiati, M., Myllyntaus, O., Meekes, J., Pham, T. M., Paz, C., Langiewicz, M., Andrews, S., Kanninen, L., Agbemabiese, C., Guler, A. T., Durieux, J., Jasim, S., Viessmann, O., Frattini, S., Yembergenova, D., Benito, C. M., … Hettne, K. (2017). Early career researchers want Open Science. Genome Biology, 18(1), 1–4. https://doi.org/10.1186/s13059-017-1351-7
Fecher, B., & Friesike, S. (2014). Open science: one term, five schools of thought. In S. Bartling & S. Friesike (Eds.), Opening science (pp. 17–47). Springer, Cham. https://doi.org/10.1007/978-3-319-00026-8_2
Ferguson, C. J., & Brannick, M. T. (2012). Publication bias in psychological science: Prevalence, methods for identifying and controlling, and implications for the use of meta-analyses. Psychological Methods, 17(1), 120–128. https://doi.org/10.1037/a0024445
Fiedler, K. (2017). What constitutes strong psychological science? The (neglected) role of diagnosticity and a priori theorizing. Perspectives on Psychological Science, 12(1), 46–61. https://doi.org/10.1177/1745691616654458
Fischhoff, B., & Beyth, R. (1975). I knew it would happen: Remembered probabilities of once —future things. Organizational Behavior and Human Performance, 13(1), 1–16. https://doi.org/10.1016/0030-5073(75)90002-1
Fox, J., Pearce, K. E., Massanari, A. L., Riles, J. M., Szulc, Ł., Ranjit, Y. S., Trevisan, F., Soriano, C. R. R., Vitak, J., Arora, P., Ahn, S. J. (Grace), Alper, M., Gambino, A., Gonzalez, C., Lynch, T., Williamson, L. D., & Gonzales, A. L. (2021). Open Science, Closed Doors? Countering Marginalization through an Agenda for Ethical, Inclusive Research in Communication. Journal of Communication, jqab029. https://doi.org/10.1093/joc/jqab029
Fraser, H., Parker, T., Nakagawa, S., Barnett, A., & Fidler, F. (2018). Questionable research practices in ecology and evolution. PloS One, 13(7), e0200303. https://doi.org/10.1371/journal.pone.0200303
Fu, D. Y., & Hughey, J. J. (2019). Meta-Research: Releasing a preprint is associated with more attention and citations for the peer-reviewed article. eLife, 8, e52646. https://doi.org/10.7554/elife.52646
Gadd, E. (2020, November 24). University rankings need a rethink. Nature, 587, 523. https://doi.org/10.1038/d41586-020-03312-2
Gagliardi, D., Cox, D., & Li, Y. (2015). Institutional inertia and barriers to the adoption of open science. In The transformation of university institutional and organizational boundaries (pp. 107–133). Brill Sense. https://doi.org/10.1007/978-94-6300-178-6_6
Gelman, A., & Loken, E. (2013). The garden of forking paths: Why multiple comparisons can be a problem, even when there is no “fishing expedition” or “p-hacking” and the research hypothesis was posited ahead of time. Department of Statistics, Columbia University. http://stat.columbia.edu/~gelman/research/unpublished/forking.pdf
Gernsbacher, M. A. (2018). Rewarding research transparency. Trends in Cognitive Sciences, 22(11), 953–956. https://doi.org/10.1016/j.tics.2018.07.002
Gopalakrishna, G., ter Riet, G., Cruyff, M., Vink, G., Stoop, I., Wicherts, J. M., & Bouter, L. (2021). Prevalence of questionable research practices, research misconduct and their potential explanatory factors: A survey among academic researchers in The Netherlands. Metaarxiv. https://doi.org/10.31222/osf.io/vk9yt
Grand, J. A., Rogelberg, S. G., Banks, G. C., Landis, R. S., & Tonidandel, S. (2018). From outcome to process focus: Fostering a more robust psychological science through registered reports and results-blind reviewing. Perspectives on Psychological Science, 13(4), 448–445. https://doi.org/10.1177/1745691618767883
Grimes, D. R., Bauch, C. T., & Ioannidis, J. P. A. (2018). Modelling science trustworthiness under publish or perish pressure. Royal Society Open Science, 5(1), 171511. https://doi.org/10.1098/rsos.171511
Gurwitz, D., Milanesi, E., & Koenig, T. (2014). Grant application review: The case of transparency. PLoS Biology, 12(12), e1002010. https://doi.org/10.1371/journal.pbio.1002010
Hallsworth, M. (2014). The use of field experiments to increase tax compliance. Oxford Review of Economic Policy, 30(4), 658–679. https://doi.org/10.1093/oxrep/gru034
Hallsworth, M., Chadborn, T., Sallis, A., Sanders, M., Berry, D., Greaves, F., Clements, L., & Davies, S. C. (2016). Provision of social norm feedback to high prescribers of antibiotics in general practice: A pragmatic national randomised controlled trial. The Lancet, 387(10029), 1743–1752. https://doi.org/10.1016/s0140-6736(16)00215-4
Hardwicke, T. E., Mathur, M. B., MacDonald, K., Nilsonne, G., Banks, G. C., Kidwell, M. C., Hofelich Mohr, A., Clayton, E., Yoon, E. J., Tessler, M. H., Lenne, R. L., Altman, S., Long, B., & Frank, M. C. (2018). Data availability, reusability, and analytic reproducibility: Evaluating the impact of a mandatory open data policy at the journal Cognition. Royal Society Open Science, 5(8), 180448. https://doi.org/10.1098/rsos.180448
Harnad, S., Brody, T., Vallières, F., Carr, L., Hitchcock, S., Gingras, Y., Oppenheim, C., Hajjem, C., & Hilf, E. R. (2008). The access/impact problem and the green and gold roads to open access: An update. Serials Review, 34(1), 36–40. https://doi.org/10.1080/00987913.2008.10765150
Harvey, L. (2008). Rankings of higher education institutions: A critical review. Quality in Higher Education, 14(3), 187–207. https://doi.org/10.1080/13538320802507711
Hicks, D., Wouters, P., Waltman, L., & Rafols, I. (2015, April 23). Bibliometrics: The Leiden Manifesto for research metrics. Nature, 520(7548), 429–431. https://doi.org/10.1038/520429a
Hopp, C., & Hoover, G. A. (2019). What Crisis? Management Researchers’ Experiences with and Views of Scholarly Misconduct. Science and Engineering Ethics, 25, 1–40. https://doi.org/10.1007/s11948-018-0079-4
Houtkoop, B. L., Chambers, C., Macleod, M., Bishop, D. V. M., Nichols, T. E., & Wagenmakers, E. J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science, 1(1), 70–85. https://doi.org/10.1177/2515245917751886
Huang, M. H. (2012). Opening the black box of QS World University Rankings. Research Evaluation, 21(1), 71–78. https://doi.org/10.1093/reseval/rvr003
Hurley, P. (2021, January 14). 2021 is the year Australia’s international student crisis really bites. The Conversation. https://theconversation.com/2021-is-the-year-australias-international-student-crisis-really-bites-153180
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the Prevalence of Questionable Research Practices With Incentives for Truth Telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Johnson, E. J., & Goldstein, D. (2003). Do defaults save lives? Science, 302(5649), 1338–1339. https://doi.org/10.1126/science.1091721
Kahneman, D. (2011). Thinking, fast and slow. Farrar, Straus and Giroux.
Kathawalla, U.-K., Silverstein, P., & Syed, M. (2021). Easing Into Open Science: A Guide for Graduate Students and Their Advisors. Collabra: Psychology, 7(1), 18684. https://doi.org/10.1525/collabra.18684
Kerr, N. L. (1998). HARKing: Hypothesizing after the results are known. Personality and Social Psychology Review, 2(3), 196–217. https://doi.org/10.1207/s15327957pspr0203_4
Kidwell, M. C., Lazarević, L. B., Baranski, E., Hardwicke, T. E., Piechowski, S., Falkenberg, L.-S., Kennett, C., Slowik, A., Sonnleitner, C., Hess-Holden, C., Errington, T. M., Fiedler, S., & Nosek, B. A. (2016). Badges to acknowledge open practices: A simple, low-cost, effective method for increasing transparency. PLoS Biology, 14(5), e1002456. https://doi.org/10.1371/journal.pbio.1002456
Klayman, J. (1995). Varieties of confirmation bias. Psychology of Learning and Motivation, 32, 385–418. https://doi.org/10.1016/s0079-7421(08)60315-1
Klein, O., Hardwicke, T. E., Aust, F., Breuer, J., Danielsson, H., Mohr, A. H., IJzerman, H., Nilsonne, G., Vanpaemel, W., Frank, M. C., & Vazire, S. (2018). A practical guide for transparency in psychological science. Collabra: Psychology, 4(1). https://doi.org/10.1525/collabra.158
Kowalczyk, O., Lautarescu, A., Blok, E., Dall’Aglio, L., & Westwood, S. J. (2020). What senior academics can do to support reproducible and open research: A short, three-step guide. PsyArXiv. https://doi.org/10.31234/osf.io/jyfr7
Liu, M., Choy, V., Clarke, P., Barnett, A., Blakely, T., & Pomeroy, L. (2020). The acceptability of using a lottery to allocate research funding: A survey of applicants. Research Integrity and Peer Review, 5(1), 1–7. https://doi.org/10.1186/s41073-019-0089-z
Martin, B. R. (2011). The research excellence framework and the ‘impact agenda’: Are we creating a Frankenstein monster? Research Evaluation, 20(3), 247–254. https://doi.org/10.3152/095820211x13118583635693
Maslove, D. M. (2018). Medical preprints—a debate worth having. JAMA, 319(5), 443–444. https://doi.org/10.1001/jama.2017.17566
Masuzzo, P. (2019, October 25). From Open Science to Inclusive Science [Webinar]. Zenodo. https://zenodo.org/record/3518951
McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., McDougall, D., Nosek, B. A., Ram, K., Soderberg, C. K., Spies, J. R., Thaney, K., Updegrove, A., Woo, K. H., & Yarkoni, T. (2016). Point of view: How open science helps researchers succeed. Elife, 5, e16800. https://doi.org/10.7554/elife.16800
Moher, D., Bouter, L., Kleinert, S., Glasziou, P., Sham, M. H., Barbour, V., Coriat, A.-M., Foeger, N., & Dirnagl, U. (2020). The Hong Kong Principles for assessing researchers: Fostering research integrity. PLOS Biology, 18(7), e3000737. https://doi.org/10.1371/journal.pbio.3000737
Morey, R. D., Chambers, C. D., Etchells, P. J., Harris, C. R., Hoekstra, R., Lakens, D., Lewandowsky, S., Morey, C. C., Newman, D. P., Schönbrodt, F. D., Vanpaemel, W., Wagenmakers, E.-J., & Zwaan, R. A. (2016). The Peer Reviewers’ Openness Initiative: Incentivizing open research practices through peer review. Royal Society Open Science, 3(1), 150547. https://doi.org/10.1098/rsos.150547
Munafò, M. R. (2019, December 10). Raising research quality will require collective action. Nature, 576, 183. https://doi.org/10.1038/d41586-019-03750-7
Munafò, M. R., Nosek, B. A., Bishop, D. V. M., Button, K. S., Chambers, C. D., Percie du Sert, N., Simonsohn, U., Wagenmakers, E.-J., Ware, J. J., & Ioannidis, J. P. A. (2017). A manifesto for reproducible science. Nature Human Behaviour, 1(1), 1–9. https://doi.org/10.1038/s41562-016-0021
Narock, T., & Goldstein, E. B. (2019). Quantifying the growth of preprint services hosted by the Center for Open Science. Publications, 7(2), 44. https://doi.org/10.3390/publications7020044
Nietzel, M. T. (2021, May 5). The trend continues: More universities decide to freeze tuition. Forbes. https://forbes.com/sites/michaeltnietzel/2021/05/05/the-trend-continues-more-universities-decide-to-freeze-tuition/?sh=76e90e0e3c76
Nolan, J. M., Schultz, P. W., Cialdini, R. B., Goldstein, N. J., & Griskevicius, V. (2008). Normative social influence is underdetected. Personality and Social Psychology Bulletin, 34(7), 913–923. https://doi.org/10.1177/0146167208316691
Norris, E., & O’Connor, D. B. (2019). Science as behaviour: Using a behaviour change approach to increase uptake of open science. Psychology & Health, 34(12), 1397–1406. https://doi.org/10.1080/08870446.2019.1679373
Nosek, B. A. (2019, June 11). Strategy for culture change. Center for Open Science. https://www.cos.io/blog/strategy-for-culture-change
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425. https://doi.org/10.1126/science.aab2374
Nosek, B. A., Ebersole, C. R., DeHaven, A. C., & Mellor, D. T. (2018). The preregistration revolution. Proceedings of the National Academy of Sciences, 115(11), 2600–2606. https://doi.org/10.1073/pnas.1708274114
Nosek, B. A., & Lakens, D. (2014). Registered reports: A method to increase the credibility of published results. Social Psychology, 45(3), 137–141. https://doi.org/10.1027/1864-9335/a000192
Nosek, B. A., & Lindsay, S. (2018, February 28). Preregistration becoming the norm in psychological science. Association for Psychological Science. https://www.psychologicalscience.org/observer/preregistration-becoming-the-norm-in-psychological-science
Nosek, B. A., Spies, J. R., & Motyl, M. (2012). Scientific utopia: II. Restructuring incentives and practices to promote truth over publishability. Perspectives on Psychological Science, 7(6), 615–631. https://doi.org/10.1177/1745691612459058
Nuzzo, R. (2015). How scientists fool themselves – and how they can stop. Nature, 526, 182–185. https://doi.org/10.1038/526182a
Oakden-Rayner, L., Beam, A. L., & Palmer, L. J. (2018). Medical journals should embrace preprints to address the reproducibility crisis. International Journal of Epidemiology, 47(5). https://doi.org/10.1093/ije/dyy105
Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349(6251). https://doi.org/10.1126/science.aac4716
Orben, A. (2019, September 24). A journal club to fix science. Nature, 573(7775), 465. https://doi.org/10.1038/d41586-019-02842-8
Pagliaro, M. (2021). Purposeful evaluation of scholarship in the open science era. Challenges, 12(1), 6. https://doi.org/10.3390/challe12010006
Piwowar, H. A., & Vision, T. J. (2013). Data reuse and the open data citation advantage. PeerJ, 1, e175. https://doi.org/10.7717/peerj.175
Pownall, M., Talbot, C. V., Henschel, A., Lautarescu, A., Lloyd, K. E., Hartmann, H., Darda, K. M., Tang, K. T. Y., Carmichael-Murphy, P., & Siegel, J. A. (2021). Navigating Open Science as early career feminist researchers. Psychology of Women Quarterly, 45(4), 526–539. https://doi.org/10.1177/03616843211029255
Prinz, F., Schlange, T., & Asadullah, K. (2011). Believe it or not: How much can we rely on published data on potential drug targets? Nature Reviews Drug Discovery, 10(9), 712. https://doi.org/10.1038/nrd3439-c1
Quan, W., Chen, B., & Shu, F. (2017). Publish or impoverish: An investigation of the monetary reward system of science in China (1999-2016). Aslib Journal of Information Management, 69(5), 486–502. https://doi.org/10.1108/ajim-01-2017-0014
Rice, D. B., Raffoul, H., Ioannidis, J. P. A., & Moher, D. (2020). Academic criteria for promotion and tenure in biomedical sciences faculties: Cross sectional analysis of international sample of universities. BMJ, 369, m2081. https://doi.org/10.1136/bmj.m2081
Roese, N. J., & Vohs, K. D. (2012). Hindsight bias. Perspectives on Psychological Science, 7(5), 411–426. https://doi.org/10.1177/1745691612454303
Rosenthal, R. (1979). The file drawer problem and tolerance for null results. Psychological Bulletin, 86(3), 638–641. https://doi.org/10.1037/0033-2909.86.3.638
Ross-Hellauer, T. (2017). What is open peer review? A systematic review. F1000Research, 6, 588. https://doi.org/10.12688/f1000research.11369.2
Saderi, D., & Greaves, S. (2021, July 7). Using preprint reviews to drive journal peer review. ASAPbio. https://asapbio.org/using-preprint-reviews-to-drive-journal-peer-review
Sarabipour, S., Debat, H. J., Emmott, E., Burgess, S. J., Schwessinger, B., & Hensel, Z. (2019). On the value of preprints: An early career researcher perspective. PLoS Biology, 17(2), e3000151. https://doi.org/10.1371/journal.pbio.3000151
Sarafoglou, A., Hoogeveen, S., Matzke, D., & Wagenmakers, E.-J. (2019). Teaching good research practices: Protocol of a research master course. Psychology Learning & Teaching, 19(1), 46–59. https://doi.org/10.1177/1475725719858807
Schäfer, T., & Schwarz, M. A. (2019). The meaningfulness of effect sizes in psychological research: Differences between sub-disciplines and the impact of potential biases. Frontiers in Psychology, 10, 813. https://doi.org/10.3389/fpsyg.2019.00813
Scheel, A. M., Schijen, M. R. M., & Lakens, D. (2021). An excess of positive results: Comparing the standard Psychology literature with Registered Reports. Advances in Methods and Practices in Psychological Science, 4(2). https://doi.org/10.1177/25152459211007467
Schimanski, L. A., & Alperin, J. P. (2018). The evaluation of scholarship in academic promotion and tenure processes: Past, present, and future. F1000Research, 7, 1605. https://doi.org/10.12688/f1000research.16493.1
Schmitt, J. (Producer, Director). (2018). Paywall: The Business of scholarship [Film]. https://paywallthemovie.com
Schönbrodt, F. (2019). Training students for the open science future. Nature Human Behaviour, 3(10), 1031. https://doi.org/10.1038/s41562-019-0726-z
Searston, R. A., Thompson, M. B., Robson, S. G., Corbett, B. J., Ribeiro, G., Edmond, G., & Tangen, J. M. (2019). Truth and transparency in expertise research. Journal of Expertise, 2(4), 199–209. https://doi.org/10.31234/osf.io/bn85g
Serghiou, S., & Ioannidis, J. P. (2018). Altmetric scores, citations, and publication of studies posted as preprints. JAMA, 319(4), 402–404. https://doi.org/10.1001/jama.2017.21168
Simmons, J. P., Nelson, L. D., & Simonsohn, U. (2011). False-positive psychology: Undisclosed flexibility in data collection and analysis allows presenting anything as significant. Psychological Science, 22(11), 1359–1366. https://doi.org/10.1177/0956797611417632
Smaldino, P. E., & McElreath, R. (2016). The natural selection of bad science. Royal Society Open Science, 3(9), 160384. https://doi.org/10.1098/rsos.160384
Soderberg, C. K. (2018). Using OSF to share data: A step-by-step guide. Advances in Methods and Practices in Psychological Science, 1(1), 115–120. https://doi.org/10.1177/2515245918757689
Soderberg, C. K., Errington, T. M., Schiavone, S. R., Bottesini, J., Thorn, F. S., Vazire, S., Esterling, K. M., & Nosek, B. A. (2021). Initial evidence of research quality of registered reports compared with the standard publishing model. Nature Human Behaviour, 5(8), 990–997. https://doi.org/10.1038/s41562-021-01142-4
Spellman, B. A., Gilbert, E. A., & Corker, K. S. (2018). Open science. In J. T. Wixted & E.-J. Wagenmakers (Eds.), Stevens’ handbook of experimental psychology and cognitive neuroscience, Volume 5: Methodology (4th ed., pp. 729–776). John Wiley. https://doi.org/10.1002/9781119170174.epcn519
Szollosi, A., Kellen, D., Navarro, D. J., Shiffrin, R., van Rooij, I., Van Zandt, T., & Donkin, C. (2020). Is preregistration worthwhile? Trends in Cognitive Sciences, 24(2), 94–95. https://doi.org/10.1016/j.tics.2019.11.009
Toribio-Flórez, D., Anneser, L., deOliveira-Lopes, F. N., Pallandt, M., Tunn, I., & Windel, H. (2021). Where do early career researchers stand on open science practices? A survey within the Max Planck Society. Frontiers in Research Metrics and Analytics, 5, 17. https://doi.org/10.3389/frma.2020.586992
Tversky, A., & Kahneman, D. (1974). Judgment under uncertainty: Heuristics and biases. Science, 185(4157), 1124–1131. https://doi.org/10.1126/science.185.4157.1124
Tzanova, S. (2020). Changes in academic libraries in the era of Open Science. Education for Information, 36(3), 281–299. https://doi.org/10.3233/EFI-190259
UK Behavioural Insights Team. (2014). EAST: Four simple ways to apply behavioural insights. https://www.bi.team/publications/east-four-simple-ways-to-apply-behavioural-insights/
van ’t Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology: A discussion and suggested template. PsyArXiv. https://doi.org/10.31234/osf.io/4frms
Vazire, S. (2018). Implications of the credibility revolution for productivity, creativity, and progress. Perspectives on Psychological Science, 13(4), 411–417. https://doi.org/10.1177/1745691617751884
Velterop, J. (2016, October 20). Is the reproducibility crisis exacerbated by pre-publication peer review? SciELO in Perspective. https://blog.scielo.org/en/2016/10/20/is-the-reproducibility-crisis-exacerbated-by-pre-publication-peer-review
Verma, I. M. (2017). Preprint servers facilitate scientific discourse. Proceedings of the National Academy of Sciences, 114(48), 12630. https://doi.org/10.1073/pnas.1716857114
Wagenmakers, E.-J., Wetzels, R., Borsboom, D., van der Maas, H. L. J., & Kievit, R. A. (2012). An agenda for purely confirmatory research. Perspectives on Psychological Science, 7(6), 632–638. https://doi.org/10.1177/1745691612463078
Wicherts, J. M., Veldkamp, C. L. S., Augusteijn, H. E. M., Bakker, M., Van Aert, R. C. M., & Van Assen, M. A. L. M. (2016). Degrees of freedom in planning, running, analyzing, and reporting psychological studies: A checklist to avoid p-hacking. Frontiers in Psychology, 7, 1832. https://doi.org/10.3389/fpsyg.2016.01832
Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3(1), 1–9. https://doi.org/10.1038/sdata.2016.18
Zwaan, R. A., Etz, A., Lucas, R. E., & Donnellan, M. B. (2018). Making replication mainstream. Behavioral and Brain Sciences, 41. https://doi.org/10.1017/s0140525x17001972
This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.