The scientific community has long recognized the benefits of open science. Today, governments and research agencies worldwide are increasingly promoting and mandating open practices for scientific research. However, for open science to become the by-default model for scientific research, researchers must perceive open practices as accessible and achievable. A significant obstacle is the lack of resources providing clear direction on how researchers can integrate open science practices into their day-to-day workflows. This article outlines and discusses ten concrete strategies that can help researchers use and disseminate open science. The first five strategies address basic ways of getting started in open science that researchers can put into practice today. The last five strategies are for researchers who are more advanced in open practices and want to advocate for open science. Our paper will help researchers navigate the transition to open science practices and support others in shifting toward openness, thus contributing to better science.

In the last twenty years, we have witnessed the rise of open science—the cultural movement and ethical-political perspective that strives to make all processes and products involved in scientific research available and accessible to all people without barriers (Friesike & Schildhauer, 2015; Lakomý et al., 2019). Open science is so relevant today that it has been proposed as a by-design and by-default model for conducting scientific research (National Academies of Sciences, Engineering, and Medicine, 2018; Office of the Chief Science Advisor of Canada, 2020, 2021).

Open science comprises a diverse set of best practices that seek to democratize knowledge and improve the quality of scientific research. These practices are often grouped into areas as varied as open access, open educational resources, open data, open labs, open notebooks, open innovation, open evaluation, open hardware, open-source software, and citizen science (Chan et al., 2020; Pontika et al., 2015). In the field of psychology, open science practices such as pre-registration, sharing data in public repositories alongside code, and using open-source statistical software such as R (R Core Team, 2021) have been emphasized, because they can foster the reproducibility and replicability of findings (Crüwell et al., 2019; Davis-Kean & Ellis, 2019).

The benefits of open science go beyond enhancing the rigor and reliability of the scientific research process. For example, open publications get more citations and allow authors to retain the rights to their scholarly output through open licenses (Gargouri et al., 2010; McKiernan et al., 2016). Open data practices can contribute to generating high-quality datasets that others can reuse, thus avoiding unnecessary duplication of research efforts (Ali-Khan, Jean, & Gold, 2018), and citizen science can lead to more socially relevant research by involving citizens in the scientific endeavor (see Robinson et al., 2018). Open science practices can reduce costs, for example when articles are published open access rather than behind a paywall. Further, they enable impactful activities that would not otherwise be possible, like fostering innovative collaborations, products, and industries (Fell, 2019). This is because ideas, data, and discoveries flow freely, more quickly, and in more diverse contexts than do the results of closed research. It is, therefore, sensible to argue that the “early adoption of open and reproducible methods is an investment in the future and can put researchers ahead of the curve” (Allen & Mehler, 2019, p. 9).

Nevertheless, practically implementing open science poses significant difficulties. As Kathawalla et al. (2021) point out, researchers who want to transition to the open model often develop “a sense of paralysis associated with not knowing where to begin” (p. 1). Several factors likely contribute to this feeling. For one, open science resources are often more theoretical than practical (Farnham et al., 2017). Even if researchers agree with the general principles of open science, they may not know how to apply them on a day-to-day basis. In turn, navigating the plethora of open science resources and tools available can take significant time and energy, especially where the institutional rewards of practicing open science are more ephemeral than concrete (Gernsbacher, 2018; Hall et al., 2022).

Additionally, researchers wishing to embrace open science must often adjust aspects of their day-to-day scientific behavior (Chubin, 1985; Robson et al., 2021), which can prove daunting. For example, pre-registration and registered reports require researchers to specify an analysis pipeline prior to data collection, thus inverting the order in which they traditionally do research (Allen & Mehler, 2019). This inversion can contribute to the feeling that open practices such as pre-registration crush creativity, increase work-related stress, or involve additional work that lengthens project duration unnecessarily (McDermott, 2022; Sarafoglou et al., 2022). Similarly, researchers contemplating open data may have concerns about data ownership, responsibility, and control (Berghmans et al., 2021; Inkpen et al., 2021). As another example, for-profit journals (especially those with high prestige) often have astronomical article processing charges that authors or research institutions are required to pay in order to publish openly (e.g., Asai, 2019, 2021; Khoo, 2019). Such high publication costs constitute an economic hardship for many authors, and are simply unaffordable for authors with fewer resources, restricting the free circulation of knowledge (see Persic et al., 2021; Smith et al., 2022). Finally, the limited empirical research on the benefits, risks, and limits of open science (see Ali-Khan, Jean, MacDonald, et al., 2018) acts as a barrier, holding back students, researchers, and the public from perceiving open science practices as a worthwhile endeavor (for recent research on this topic, see Bakker et al., 2021; Bottesini et al., 2021; Echevarría et al., 2021; Nosek et al., 2015; Pardo Martínez & Poveda, 2018; Tenopir et al., 2020).

In this article, we outline ten concrete strategies that can help researchers in psychology and beyond to foster open science practices (see Table 1). Rather than discussing the characteristics of different open science practices (e.g., the peculiarities of publishing a preprint or pre-registering a study), we describe simple courses of action that readers can take immediately to foster their implementation. In this way, our article calls on readers to revise day-to-day practices and implement changes at every step of the research cycle. We begin by describing five strategies that will allow researchers to make the transition to becoming open science users. This will be helpful both for those who are not familiar with the open approach to research practice and for those who are but have not yet implemented open science practices in their workflows. Afterward, we present five more advanced strategies for researchers to transition from users to advocates who help disseminate and broaden the adoption of open science in their local contexts. It is worth noting that the difference between a user and an advocate of open science is one of degree and not of kind. For example, doing open science where others can notice it and learn from it can be thought of as a simple act of advocacy. Our advanced strategies discuss more complex and challenging ways of making open science more widely available.

Table 1.
Strategies for fostering open science in psychology and beyond

Basic strategies: becoming an open science user

1. Experiment with integrating open science practices into daily workflows. Start small: use open-source software, publish a preprint, pre-register a study, or sign a peer review (i.e., open peer review). As you get more comfortable with open science practices, share lab notebooks, deposit data and code in public repositories, or write a registered report.

2. Become familiar with national and institutional open science policies. Create spaces to read, share, and discuss open science mandates developed by government agencies and other institutions.

3. Analyze and share successful cases of implementation of open science practices. Become familiar with and inspired by cases of successful implementation of open science practices. Discuss them with others. Identify, compare, and eventually implement strategies to spur open science at your institution.

4. Introduce open science practices in your courses. Incorporate open science practices in course content. Allow students to come into contact and experiment with the open research culture before carrying out major projects.

5. Embrace open educational resources (OERs). Use and/or create OERs, such as open courses, syllabi, lectures, assignments, and textbooks. If you create OERs, publish them under a Creative Commons license through open repositories.

Advanced strategies: from open science user to advocate

6. Collaborate with others using open tools. Exploit collaborations to rethink, question, and overcome embedded closed practices. Try writing an article with others using R Markdown or Quarto, sharing data using open formats, or keeping a version history of research files using GitHub or similar platforms.

7. Develop networks of open collaboration. Establish collaborative networks with others who advocate for open research and work on related topics. Develop standards and guidelines to wrangle, analyze, and visualize data; to comment and proofread code; and to create accompanying documentation.

8. Voice your opinion. Write testimonials, reports, glossaries, or declarations of support for open science. Outline principles, values, and considerations relevant to your community.

9. Rethink and promote changes in the assessment of scholarly production. Take ownership of the discussion on research assessment. Collaborate in building spaces to discuss and promote alternatives to traditional measures of academic performance (e.g., impact factor, h-index). Advocate for the recognition of a variety of research outputs (e.g., open datasets, open peer reviews).

10. Create opportunities for people to specialize in open science. If you have or can pursue funding to hire personnel, create open science-related job opportunities (e.g., PhD scholarships and research assistantships, research manager positions, postdoctoral positions).

Strategy 1: Experiment with Integrating Open Science Practices into Daily Workflows

Rather than being all or nothing, openness should be considered a continuum of practices ranging from uncomplicated to highly complex (McKiernan et al., 2016). Researchers can begin small and gradually integrate open science practices into everyday workflows by, so to speak, picking and choosing what to experiment with from the “open science buffet” (Bergmann, 2018). This is a wide-ranging strategy that can be implemented in various ways depending on the specific objectives of each investigation. Ways to start include using open-source software, publishing a preprint, pre-registering a study, or signing a review (i.e., open peer review). There are good guides on how to get started with these practices (e.g., Cook et al., 2022; Kathawalla et al., 2021; Masuzzo & Martens, 2017; Moshontz et al., 2021; Simmons et al., 2021).

As research team members become more comfortable with some basic open science practices, they can begin to consider more complex ones, such as sharing lab notebooks, depositing research data and code in public repositories, or writing a registered report article (Kiyonaga & Scimeca, 2019; Nosek & Lakens, 2014; Reich, 2021). Although more costly in terms of effort, ventures such as these are likely to have a greater impact on the behavior of researchers and eventually change the dynamics of the research process itself.

Early career researchers seem to have a more positive attitude towards open science than more established researchers and are eager to participate in open initiatives, even if they sometimes feel they need more support to do so (Abele-Brehm et al., 2019; Pownall et al., 2021; Toribio-Flórez et al., 2021; Zečević et al., 2021). Thus, giving graduate students and postdoctoral researchers the freedom and support to explore open science practices—and eventually use this knowledge to help other researchers engage in open science practices—can prove a valuable strategy for catalyzing change.

Strategy 2: Become Familiar with National and Institutional Open Science Policies

Transitioning to open science is a challenging undertaking because it involves both a shift in how science is understood and substantive changes in behaviors associated with research (see, for instance, Attendees Of The NDSF Summit, 2020). Strategy 1 developed above seeks to encourage researchers to gradually render certain behaviors normative. It is a bottom-up approach to promoting change in how science is done by accumulating meaningful practices at the grassroots level. By contrast, mandates from governments and granting agencies are a top-down approach to change.

Mandates help open science become a standard and no longer an add-on to research. For instance, in Canada, the federal funding agencies (also known as the Tri-council) have enacted an open access policy, whereby grant recipients must ensure that publications stemming from agency funding are freely accessible within 12 months of publication (e.g., by depositing peer-reviewed manuscripts in institutional or disciplinary repositories, or by publishing in open-access journals). Mandates put all researchers on an even playing field, and give those who have already transitioned voluntarily a head start. Moreover, researchers are usually diligent in complying with mandates because of the potential consequences if they do not (Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, & Social Sciences and Humanities Research Council of Canada, 2021, pp. 14–15). However, as noted elsewhere, imposing a policy for its own sake is unlikely to be the basis for sustainable change (Ali-Khan, Jean, & Gold, 2018). This is because people must perceive laws and policies as reasonable, doable, and necessary for their implementation to be effective (Caillaud et al., 2021; Kelly, 1999; Vicente-Saez et al., 2021).

Along these lines, we argue that implementing open science practices necessitates disseminating and discussing top-down strategies within the communities in which they are intended to be applied. Mandates will only become fully effective after all stakeholders familiarize themselves with them, consider their motivation, and discuss what might be on the horizon. This type of discussion is becoming even more important in the current context, in which the number of open science mandates and policies is growing steadily year by year (see, e.g., data from ROARMAP, the Registry of Open Access Repository Mandates and Policies). Researchers and students can create spaces to read, comment, and reflect on the contents of the policies that affect their local communities. For instance, Eurodoc, the European Council of Doctoral Candidates and Junior Researchers, launched an Open Science Working Group seeking to ease the transition to open science by connecting policy makers and researchers to elaborate recommendations on the implementation of open practices. Those who wish to take it one step further can also organize informal meetings, working groups, discussion panels, and conferences on open science policies. Less conventional formats such as hackathons, unconferences (i.e., participant-oriented meetings, without a predetermined agenda, aimed at fostering collaboration and discussion; see Budd et al., 2015), and book sprints (i.e., short events aimed at the collaborative writing of a book; see Heller & Brinken, 2018) can also be constructive.

Creating new spaces dedicated to open science can nevertheless be challenging. For example, setting up an open science working group is more feasible in institutions that have long been building momentum for a change in research culture. In these institutions, new initiatives crystallize previous efforts and feel like natural outcomes of longer processes. In institutions where an open science culture is nascent, it is possible to begin by taking advantage of pre-existing venues with built-in audiences, such as lecture series, reading groups, and lab meetings, as well as informal spaces. For example, a PI may propose to discuss an open science mandate in a lab meeting, and a group of graduate students can organize a convivial lunch to brainstorm how to include open science practices in funding applications. These and other activities should be aimed at understanding the rationale behind policies, assessing whether this rationale is sound, and discussing how and to what extent the content of the policies would affect the researchers’ practice and careers.

Strategy 3: Analyze and Share Successful Cases of Implementation of Open Science Practices

Knowing and discussing how others have implemented open science practices is of great help to identify, compare, and eventually enact strategies for transitioning to open science. Each country, discipline, institution, and lab has particularities that must be considered when designing open science adoption policies. However, other individuals’ and institutions’ experiences can help researchers recognize potential obstacles and ways to overcome them.

A simple action researchers can take is to directly consult other peers with experience in implementing open science practices. This could be the head of a lab that has transitioned to open science, a researcher working openly with similar types of data, or an open science advocate working in the same department. These consultations can shed light on difficulties and provide first-hand solutions for adopting open science that work in specific research contexts. For example, colleagues might be able to share examples of pre-registration templates (e.g., Havron et al., 2020) or data sharing approaches (e.g., Houtkoop et al., 2018; Tenney et al., 2021; Walsh et al., 2018) that are tailored to a subfield and its particularities.

While such one-on-one interactions are largely informal and undocumented, we have better records of how large projects and institutions have embraced open science practices. These records document common pitfalls, as well as the solutions that different communities have found along the way. ManyBabies 1, for instance, was a collaborative project involving 100+ researchers (including one of the authors of this paper) and 70+ labs that tested infants’ preference for infant-directed speech (The ManyBabies Consortium, 2020).

At the end of the project, the research team published a paper reporting on the experience and addressing how Big Team Science—the interdependent work of multiple teams to carry out joint projects—can help in fostering open science practices (Byers-Heinlein et al., 2020). For example, the team ensured that all study stimuli, analysis scripts, and data were shared via the Open Science Framework and published findings under the registered report format. In addition, many laboratories began to implement open science practices as a result of their participation in the project. The article also points out barriers the team faced while adopting open science practices, such as the lack of specific technical skills and the varying resources available in different labs, and provides valuable insights that can benefit other large-scale collaborations in psychology interested in doing research openly.

Another interesting example is that of the Montreal Neurological Institute in Canada. This institution recently adopted an ambitious open science policy thanks to the creation of the Tanenbaum Open Science Institute (TOSI) in 2016. The institute’s mission is to establish best practices and develop tools to support the transition to open science, as well as to measure its impact (Rouleau, 2017). Poupon et al. (2017, 2020) describe in detail the 18-month “buy-in process”, dividing it into six phases: (i) building awareness of open science at the institution, (ii) carrying out the internal consultation process, (iii) extending the discussion to McGill University, where all faculty of the institute are employed, (iv) establishing open science guiding principles and presenting them to the community, (v) generating a definition of open science for the institution, and (vi) spreading awareness of the open science initiative nationally and internationally. Similarly, other articles describe the attitudes and concerns of the institute’s researchers towards open science and their motivations and disincentives to participate in it (Ali-Khan et al., 2017, see also Ali-Khan, Jean, & Gold, 2018), and the instruments used by the institution to measure the impact of open science principles and practices on research and innovation (Gold et al., 2018). Knowledge of these materials can help researchers working elsewhere strategize how to spearhead the transition to open science at their own institution.

Spreading the word about successful cases of implementation of open science is essential, regardless of whether it is possible to initiate a transition process at one’s own institution. The main purpose is to amplify the message that another style of science is possible and to inspire others to initiate change. In this sense, we strongly encourage those who have implemented open practices to document and share their experience as soon as possible, and those interested in open science to disseminate successful case studies.

Strategy 4: Introduce Open Science Practices in Your Courses

Instructors can contribute to a sustainable shift towards open science practices by introducing them into their courses and workshops. Incorporating open science into the curriculum allows students to come into contact and experiment with the open research culture long before carrying out major projects in the context of a research assistantship or their doctoral research. Moreover, since many of today’s students will be tomorrow’s researchers, professors, and policy makers, early contact with open science could contribute greatly to establishing new procedural standards for research.

In some knowledge areas, including open science practices in teaching is more straightforward. For instance, statistics and research methods courses can be designed around the “new statistics” approach to better science (Calin-Jageman & Cumming, 2019; Cumming & Calin-Jageman, 2016; Morling & Calin-Jageman, 2020). This entails discussing issues such as the reproducibility crisis, explaining the importance of effect sizes, confidence intervals, and meta-analysis, and addressing the importance of transparent reporting (see Nosek et al., 2022). Also, procedural content such as data wrangling, statistical analysis, and visualization can be taught with open-source tools such as R or Python (Auker & Barthelmess, 2020). Embracing open-source tools brings two benefits: (i) students can learn programming in a new language while learning statistics in a scaffolded educational context, and (ii) they gain expertise in open-source tools, which are much more flexible than commercial software and are enriched continuously thanks to packages and modules developed by the community. In that sense, open-source programming languages such as R have a virtually unlimited expansion capacity and can act as natural bridges between data and open science practices (Lortie, 2017). Finally, instructors can incorporate replication research in their courses. For example, Hawkins et al. (2018) describe an experience in which students in a graduate-level experimental methods course conducted pre-registered replications of findings from a volume of the journal Psychological Science.
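The emphasis on effect sizes and confidence intervals can be demonstrated in class with just a few lines of open-source code. The following sketch, in Python, computes Cohen's d and a 95% confidence interval for the difference between two small groups; the scores are invented for illustration, and the critical t value is hard-coded for simplicity rather than looked up from a table:

```python
import math
import statistics

# Hypothetical post-test scores from two teaching conditions (n = 8 each)
control = [72, 75, 68, 80, 74, 69, 77, 73]
treatment = [78, 82, 74, 85, 80, 76, 83, 79]

def cohens_d(a, b):
    """Standardized mean difference using the pooled standard deviation."""
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a) +
                  (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(b) - statistics.mean(a)) / math.sqrt(pooled_var)

def mean_diff_ci(a, b, t_crit=2.145):
    """95% CI for the mean difference; t_crit = 2.145 is for df = 14 here."""
    se = math.sqrt(statistics.variance(a) / len(a) +
                   statistics.variance(b) / len(b))
    diff = statistics.mean(b) - statistics.mean(a)
    return diff - t_crit * se, diff + t_crit * se

d = cohens_d(control, treatment)
lo, hi = mean_diff_ci(control, treatment)
print(f"Cohen's d = {d:.2f}, 95% CI for the difference = [{lo:.2f}, {hi:.2f}]")
```

In a course taught with R, the same exercise translates directly into a few lines using t.test() and a pooled standard deviation; either way, students see that the interval, not a p-value, carries the information about effect magnitude and uncertainty.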

Fortunately, the possibilities are not limited to courses on statistics or research methods. Course-based research projects in applied subjects (e.g., developmental psychology, social psychology, educational psychology, and other social sciences) can also act as playgrounds for testing open science practices (Frankowski, 2021). For instance, Marwick et al. (2020) implemented a replication report assignment in an upper-level archaeology class at the University of Washington. Working in groups, students chose a study to replicate, wrote their own code to analyze the original data, and finally produced a compendium of report, code, and data to submit for grading. Note that for most students, the assignment was their first experience trying to replicate the findings of a scholarly publication. Yet 91% agreed or strongly agreed that the ability to replicate is an important skill, and 80% agreed or strongly agreed that the replication assignment helped them understand the base paper better than they would have by reading it to write an essay.

There are myriad options for incorporating open science into courses: students can reuse open stimuli and data, deposit data and research reports in public repositories such as OSF, pre-register a study, or create repositories for version control and tracking group work (e.g., using GitHub). Also, institutions can create instruction programs consisting of open science concepts, principles, and techniques that can be adapted to be taught in several subjects (see Hanna et al., 2021).
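Several of these options come down to small, repeatable technical habits. As a minimal sketch of one of them, depositing data in an open format alongside documentation, the following Python snippet writes a tiny dataset to CSV together with a machine-readable codebook; the file names, variables, and values are purely illustrative:

```python
import csv
import json

# Hypothetical study records (anonymized)
rows = [
    {"participant_id": "P01", "condition": "control", "score": 72},
    {"participant_id": "P02", "condition": "treatment", "score": 81},
]

# Store the data in an open, non-proprietary format (CSV) that any
# statistical package can read
with open("study_data.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["participant_id", "condition", "score"])
    writer.writeheader()
    writer.writerows(rows)

# Accompany the data with a machine-readable codebook so that reusers
# can interpret each variable without guesswork
codebook = {
    "participant_id": "Anonymized participant identifier",
    "condition": "Experimental condition: 'control' or 'treatment'",
    "score": "Post-test score, 0-100",
}
with open("codebook.json", "w") as f:
    json.dump(codebook, f, indent=2)
```

The resulting pair of files can then be deposited together in a public repository such as OSF or Zenodo, so that the documentation travels with the data.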

Strategy 5: Embrace Open Educational Resources

Implementing the previous strategy requires new types of educational materials, which demand a considerable amount of time and effort to create. To help solve this problem, the open community has begun to engage with open educational resources (OERs), defined as “the open provision of educational resources, enabled by information and communication technologies, for consultation, use and adaptation by a community of users for non-commercial purposes” (UNESCO, 2002, p. 24). OERs encompass a variety of formats, including, but not limited to, complete courses, syllabi, lectures, assignments, textbooks, and other pedagogical resources (OER Commons, 2022).

Just as open data helps avoid duplication of research efforts by reusing datasets, OERs prevent instructors from working in parallel on the creation of potentially similar pedagogical materials. The goal is that instructors can access materials created by others; remix, modify, and enrich them; and release them to the open community again. In this way, using OERs contributes to building a network of increasingly developed and updated resources, drawing on the contributions of many individuals. Moreover, as open resources available to all, OERs favor the democratization of knowledge and allow institutions and individuals to save considerable sums of money (Hilton et al., 2013; Ikahihifo et al., 2017). In recent years, OER advocates have created resources to assess OER quality and to facilitate their creation and use (e.g., FORRT, Azevedo et al., 2022; the TIPS framework, Kawachi, 2014). Importantly, studies have shown that the transition to the OER model is uncomplicated and leads to successful learning outcomes for students (e.g., da Silva et al., 2021).

An exciting example of OER implementation in psychology is Open Stats Lab (OSL), an initiative driven by Prof. Kevin McIntyre and funded through a grant from the APS Fund for Teaching and Public Understanding of Psychological Science (see McIntyre, 2017). Open Stats Lab is an open platform where users can download a set of articles published in Psychological Science, the datasets associated with those articles, activities designed to teach statistics by analyzing the data, and R guidance scripts, in addition to recommendations for students and instructors on how to use the platform. A related undertaking involves open textbooks made using technologies such as Jupyter and R Markdown notebooks, which permit the interweaving of narrative text, code, and figures, and can export both static and dynamic output formats (see, for example, Neth, 2022; Rhoads & Gan, 2022). Some books even provide access to richer interactive educational experiences. For an example outside of psychology, the textbook Computational fluid dynamics: An open source approach (Vermeire et al., 2020), published by Concordia University, features myriad open-source technologies that can be run in the cloud, allowing users to perform simulations and write their own code easily. Additionally, users can contribute to the development of the book, which is available as a public repository on GitLab, and interact with the authors by suggesting modifications or enriching the existing material.

At a time when educational institutions are increasingly encouraging the creation and use of OERs (Ahammad, 2019; Santos-Hermosa et al., 2021; Thompson & Muir, 2020; Zaid & Alabi, 2021), some practical suggestions are in order. During the planning stage of a course, educators can search for existing OERs in open repositories. It is essential to note the license of each downloadable resource to assess the permissions for reuse. Similarly, it is advisable to cite the source for each derivative material to allow for the reconstruction of its history. Also, it is worth contacting the original author(s) to let them know that their resource has been reworked (they will likely be pleased to see their creation being used!). This could lead to new collaborations to assess the quality of the material produced or even create new materials.

Whenever educators create new pedagogical resources (e.g., syllabi, slides, assignments, reading guides), they can publish them openly, under a Creative Commons license, through institutional repositories or open repositories such as OSF, Zenodo, and GitHub. If sharing the materials right away is not viable (e.g., an institution might require that the materials be made public only after its students benefit from them), educators can still deposit them in open repositories and set an embargo period, after which the resources become open. Whenever possible, it is preferable to allow users to interact with the creators to suggest modifications to the educational resources. The most straightforward option is to let users leave comments on the resource, but more powerful alternatives include collaborating through GitHub or GitLab. Whether searching for or creating OERs, it is critical to share open resources as soon as they become available and to accompany them with documentation that allows users to understand how they were produced (i.e., by whom, when, how, for what course, and at what institution).

Just as with other open science practices, adopting OERs is not an all-or-nothing concept but rather a goal to strive towards. For example, teachers who do not have control over the syllabus content (e.g., due to content standards in place) can nevertheless reuse and create open slides and assignments, and those who must use existing slides or assignments can still document their teaching strategies so that others can potentially replicate them.

The above strategies allow researchers to begin their journey to opening up research practices. However, once they have transitioned to becoming users of open science practices, researchers may want to more explicitly encourage others to adopt the open way of doing research. The following five strategies support the task of open science advocacy by discussing concrete actions that researchers can begin to implement progressively.

Strategy 6: Collaborate with Others Using Open Tools

Transitioning to the open research model involves substantive changes in the ways of doing science. Lasting habits can develop thanks to deliberate, small, and motivation-driven changes in attitude (Verplanken & Orbell, 2022), so it is a good idea to incorporate open tools into personal daily workflows (see Strategy 1). However, a good part of what researchers do is collaborate with others, and changes in personal workflows may not reach beyond the local level. In this section, we want to argue that peer communication and collaboration provide a unique opportunity to rethink and overcome deeply embedded closed practices and take the adoption of open science practices to a new level. Teamwork helps sustain efforts over time to achieve common goals, fosters a sense of affiliation to a larger project, and provides a social framework that eases personal and professional insecurities (see Salas et al., 2004). Researchers can leverage these benefits to foster the implementation of open science practices.

How to get started? Researchers can attempt to collaborate in the writing of an article using R Markdown, the format for creating dynamic documents in R that we introduced in the previous section. Although an important advantage of this format is that it integrates both code and text into the same document (thereby improving the reproducibility of analyses), researchers need not include code to get started with this tool. It should be noted that although using R Markdown to write manuscripts used to be somewhat challenging, there are now packages that help researchers with the task. The papaja package, for example, helps users seamlessly export submission-ready manuscripts conforming to the American Psychological Association (APA) guidelines, which is of great help to researchers in the field of psychology (Aust & Barth, 2022). Researchers already familiar with R Markdown can also capitalize on collaborations to learn new technologies. Quarto, for instance, is a recently developed open-source scientific and technical publishing system sponsored by RStudio that supports executable R, Python, and Julia code blocks within Markdown, as well as Observable JS for interactive data exploration and analysis (Allaire et al., 2022). Quarto eases the implementation of reproducible research and publications and conveniently allows users to export their documents in a plethora of formats.
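For readers who have never seen one, a minimal sketch of what such a manuscript file might look like follows; the title, author, and bibliography values are placeholders, and the field layout loosely follows papaja's document template (papaja's `apa6_pdf` output format produces an APA-formatted PDF):

````markdown
---
title        : "A hypothetical manuscript title"
author       :
  - name     : "First Author"
    affiliation: "1"
output       : papaja::apa6_pdf
bibliography : "references.bib"
---

# Methods

Narrative text and analysis code live in the same file. A code chunk
like the one below is executed when the document is rendered, so the
statistics reported in the text always match the data:

```{r descriptives}
mean(sleep$extra)  # `sleep` is a built-in R example dataset
```
````

Because the whole manuscript is plain text, collaborators can track every change to both prose and analyses with ordinary version control.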

Other possible practices include keeping a version history of research-related files using GitHub or other open repositories, or sharing research data with team members in open formats (e.g., CSV or JSON) instead of proprietary ones (e.g., Excel or SPSS formats) that are not widely interoperable and do not guarantee long-term data preservation. The main idea is that collaborative work should increasingly happen through open tools so that their use becomes natural. Of course, in the early stages of building habits, incorporating open tools may involve a steep learning curve (e.g., Lasser, 2020). Moreover, certain types of functionality may be less available in open tools, which should not be overlooked. For example, R can extend its power via a remarkable 18,861 packages currently available on the Comprehensive R Archive Network (CRAN). However, users trained with more traditional software such as SPSS may miss the ability to interact through an uncomplicated graphical user interface (GUI) when working with R, which relies mainly on typed commands. We offer two solutions to this dilemma. First, with regard to R, users can transition by using JASP (JASP Team, 2022), an open-source statistical program whose interface is designed to be familiar to users of SPSS and which produces APA-style tables ready to copy and paste. Second, users can consider how to optimize their timing when adopting open-source software. As a case in point, it is advisable for a student who has to get started with statistical software to opt directly for open software. Comparably, it may be better for established researchers to switch to open-source software when starting a new project rather than changing tools midstream.
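To make the open-formats point concrete, the following Python sketch writes the same small dataset to both CSV and JSON using only the standard library; the file names and data are hypothetical, chosen purely for illustration:

```python
import csv
import json

# A small, hypothetical dataset: one record per participant.
rows = [
    {"participant": "p01", "condition": "bilingual", "score": 17},
    {"participant": "p02", "condition": "monolingual", "score": 21},
]

# CSV: a plain-text tabular format readable by virtually any tool.
with open("scores.csv", "w", newline="") as f:
    writer = csv.DictWriter(f, fieldnames=["participant", "condition", "score"])
    writer.writeheader()
    writer.writerows(rows)

# JSON: a plain-text format that also preserves data types and nesting.
with open("scores.json", "w") as f:
    json.dump(rows, f, indent=2)
```

Because both outputs are human-readable text, they remain usable decades from now without any particular vendor's software, and they diff cleanly under version control.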

Despite their limitations, open tools are constantly being updated and improved by their supporting communities, which will progressively lessen these issues. At the same time, using open tools will often make collaboration more manageable. For example, centralizing research-related documents in an open repository makes it easier for new collaborators to catch up with how the project is progressing. It also eliminates the constant need to send e-mail attachments and allows multiple collaborators to work on a document simultaneously while ensuring that version history is handled efficiently (e.g., avoiding accidental deletions or overwriting other people’s contributions). Concurrently, collaborating through open tools has the potential to place researchers at the forefront of the shift toward openness. For instance, growing expertise might position the team or laboratory as an institutional reference for open science practices. Individuals can become open science champions who inspire other researchers (Seibold, 2020). These labs and individuals lead others by example.

Strategy 7: Develop Networks of Open Collaboration

Once researchers have incorporated open science practices into their daily workflows, they can seek to establish or join collaborative networks with other labs or teams that advocate for open research and work on related topics. Big Team Science has advantages for research, as it allows investigators to access more resources, work with greater sample sizes, take advantage of the expertise of a larger team of researchers in areas such as data analysis, and distribute work more efficiently (see Coles et al., 2022). All of this helps to overcome current challenges related to the replicability, reproducibility, and generalizability of research findings, which are of particular concern in psychology (see Byers‐Heinlein et al., 2021; Forscher et al., 2020). ManyBabies, which we discussed in a previous section, is one example of Big Team Science.

Big Team Science ventures are becoming more and more prevalent today. One notable example is the Psychological Science Accelerator (Moshontz et al., 2018), a globally distributed network of laboratories which, as of November 15, 2022, has 1,328 members working in 84 countries. The Accelerator is investigating major issues such as the mental simulation of object orientation, the gendered nature of prejudice, and how situational factors shape moral judgements (Bago et al., 2022; Chen et al., 2018; Phills et al., 2021).

Participating in these large-scale projects is not easy and calls on researchers to develop specific standards and guidelines that ensure effective communication among collaborators and enable projects to develop coherently and cohesively. For example, a successful collaboration will require shared ways to wrangle, analyze, and visualize data, care in commenting and proofreading code, and high-quality accompanying documentation. In addition, shared standards will likely call for open repositories for file sharing (e.g., GitHub) and open formats (e.g., CSV datasets). This makes Big Team Science projects outstanding venues for training researchers in open science practices. Our recommendation for those who have begun to explore the ecosystem of open practices and tools and feel comfortable with them (or just want to learn more) is to contact existing consortia to assess the feasibility of participating. That could lead, eventually, to initiating a new project within an existing consortium, or even developing a new consortium. Participation in large-scale projects is a productive way to spark team members’ interest in both the research itself and open science practices while collaborating to achieve more robust research findings.

Strategy 8: Voice Your Opinion

Communities that have spent some time discussing open science policies will probably have reached some consensus and have interesting ideas to share with others. Where this is the case, one consolidating project that communities may wish to pursue is broadcasting their ideas as articles, testimonials, reports, glossaries, or declarations of support for open science. Such documents can outline the principles, values, and considerations that were most relevant during the debates as community standards were being formed. Some examples are recent papers reflecting on the benefits of open science (Dal Ben et al., 2022; Engzell & Rohrer, 2021), The Beijing declaration on research data (CODATA - Committee on Data of the International Science Council et al., 2019), The Lindau guidelines, a project spearheaded by Nobel Prize winner Elizabeth Blackburn (Council for the Lindau Nobel Laureate Meetings/Foundation Lindau Nobel Laureate Meetings, 2020), and the community-sourced glossary of open scholarship terms developed by FORRT, the Framework for Open and Reproducible Research Teaching (Parsons et al., 2022).

Roadmaps are another format worth exploring. These are documents that often identify challenges, possible strategies for overcoming them, and recommendations. These documents are usually more practically oriented than manifestos and declarations of support, and are organized according to well-defined timeframes. Good examples are the open science roadmap produced by LIBER—Europe’s Research Libraries Network—for the period 2018-2022 (Ayris et al., 2018) and the roadmap produced by the Government of Canada (Office of the Chief Science Advisor of Canada, 2020), which lists a series of concrete milestones that would enable a phased transition to open science at the institutional level for research funded by government departments and agencies by 2025.

One benefit of these documents is that they effectively bring to light nuances in how different communities interpret and implement open science. For example, the notion of open access and the need to support it are widespread and reiterated in different documents. However, there are variations in its implementation in different regions and institutions that bear consideration if the aim is to grasp the bigger picture (see Becerril-García, 2022 on open access in Latin America versus other regions). At the same time, it should be remembered that open science is a growing cultural movement. All contributions add value to the overall effort to transform research practices. Finally, it is not only the contents of the documents that are important: it is also the act of publication itself. Discussing public policy and making the outcome of the discussion public are transformative acts for communities (i.e., a set of people and institutions with common interests), for they allow them to become political players in the process of building open science and society.

Strategy 9: Rethink and Promote Changes in the Assessment of Scholarly Production

Science benefits from open practices. However, current incentive and reward structures typically reward the quantity and prestige of research outputs such as grants and peer-reviewed publications, with limited consideration of open science activities. Without an incentive and reward structure that recognizes and values researchers’ engagement with open science, their efforts are bound to be side projects sparked by curiosity or ethical convictions. If open science is to become a standard for research, then its presence in the curricula vitae of academics must become a requirement and not just a nice addition. Adoption of open science practices should be considered in processes such as hiring, tenure, promotion, and the granting of student fellowships and awards.

Public declarations and recommendations have pointed out the problems with current incentive structures. The San Francisco Declaration on Research Assessment (DORA, 2012; Schmid, 2017), for instance, promotes academic assessment practices that take into account a myriad of scientific outputs (e.g., articles, open datasets, open software, open peer reviews) and cautions against the misuse of the impact factor as a proxy for assessing the quality of publications, which skews the dynamics of knowledge production. For example, researchers working at institutions that continue to focus on the impact factor are discouraged from publishing in some types of open access journals (even if those journals adhere to the open science framework), because some do not have an impact factor.

Despite efforts such as DORA, traditional academic assessment practices remain quite impermeable to change. For example, a recent study that analyzed 305 job advertisements for academic positions at 91 institutions found that only two of them mentioned open science (Khan et al., 2022). Relatedly, in 2019, the European University Association published the results of a survey of 260 universities in 32 European countries about research assessment and open science. The results show that European universities recognize the importance of modifying assessment practices but consider the complexity of this modification a barrier to attempting its implementation (Saenen et al., 2019, pp. 31–32).

Researchers can start making a difference right away. Sharing their concerns about academic evaluation with colleagues and including open science wording in job postings are great starting points (for some examples of academic job offers mentioning open science, see Schönbrodt et al., 2022). When participating in hiring and promotion committees, they can also point out candidates’ commitment to open science, consider scholarly outputs such as preprints and datasets to be as important as articles, and assess the quality of a candidate’s articles by the soundness of their contribution rather than by the metrics of the journals where they are published. They can also get involved in more formal discussions on academic assessment (Susi et al., 2022). A clear example is Project TARA, currently under development by DORA, which seeks to build a toolkit of resources informed by the academic community to support the improvement of evaluation policies and practices (DORA, 2021). All these efforts support the greater goal of putting the issue of academic assessment on the agenda and sustaining awareness so that more and more people can join the discussion.

Strategy 10: Create Opportunities for People to Specialize in Open Science

Achieving the full implementation of open science will only happen through sustained efforts in each area of practice (i.e., open access, open data, open educational resources, among others). Many open community initiatives depend on and have been successful thanks to the altruistic contributions of researchers. However, this is not optimal if the goal is to make open science a global standard for scientific research. The spirit of open science—that the processes and outputs of scientific practice be freely available and accessible to all—should not be conflated with the nature of the efforts leading to it, which, like all other forms of work, deserve to be appropriately rewarded.

In this sense, it is paramount that those who have or can obtain funding and are in a position to hire personnel create job opportunities specifically related to open science. This recommendation is important considering that the people most likely to implement open science are early career researchers (e.g., Allen & Mehler, 2019; Farnham et al., 2017) and the people most likely to have such institutional power or access to funding are not. Possibilities to support open science include creating grants to organize and attend open science events, offering PhD scholarships and research assistantships for open science research, hiring research managers with an open science profile, and creating postdoctoral positions dealing specifically with open science issues. These roles can assist research teams and hiring institutions in transitioning to open science and amplifying the influence of the open research model in every discipline. At the same time, and perhaps more importantly, these jobs will promote the perception of open science as an area of expertise in its own right—just as important as any other area—and cultivate interest in open science over the long term.

We are aware that this may not always be possible. For instance, a faculty member might lack the funding or authority to hire a full-time open science specialist. Where this is the case, a good side strategy is to record and report to the administration the time employees spend on open science practices, and to communicate the benefits of these practices. This helps to clarify that open science practices are valuable and to signal the effort involved in implementing them.

Adopting open science practices is imperative both for its scientific and social benefits. The current context increasingly favors the transition to open science, thanks to the hard work of many advocates, supported by open science mandates from governments and agencies. As a result, researchers have a unique opportunity to embrace a fairer and more effective way to do science. However, this process challenges researchers because of its complexity and the many behavioral and ideological changes involved.

In this paper, we provided ten concrete strategies through which researchers can begin or further their transition to open science. Strategies 1–5 are practical steps to becoming an open science user. Meanwhile, strategies 6–10 are aimed at those in a position to advocate for open science, amplify its reach, and facilitate its adoption. Taken together, the strategies are comprehensive in that they address different levels of analysis (i.e., procedural, social, ideological, political, and economic). In addition, we included select references for each strategy that researchers can consult to dive deeper into the information we presented.

Open science has the potential to make scientific research more robust, sustainable, reproducible, and replicable. It also helps break down barriers that prevent the free flow of knowledge. As noted elsewhere, open science is simply “science done right” (Imming & Tennant, 2018). It should therefore be a priority for all those involved in scientific research. However, there is still a long way to go before open science becomes the by-design and by-default model for scientific research. Researchers can become agents of change by modifying everyday workflows to foster open science practices and supporting others to make changes towards openness. Hopefully, the strategies we have outlined here will help researchers achieve this goal and build a better science.

Contributed to conception: NA, KBH. Drafted and revised the article: NA, KBH. Approved the submitted version for publication: NA, KBH.

The authors have no known conflict of interest to disclose.

This work was supported by a Concordia Horizon Postdoctoral Fellowship held by NA, and a grant from the Social Sciences and Humanities Research Council of Canada to KBH (Grant #890-2020-0059). KBH holds the Concordia University Research Chair in Bilingualism and Open Science.

Abele-Brehm, A. E., Gollwitzer, M., Steinberg, U., & Schönbrodt, F. D. (2019). Attitudes toward open science and public data sharing: A survey among members of the German Psychological Society. Social Psychology, 50(4), 252–260.
Ahammad, N. (2019). Open source digital library on open educational resources. The Electronic Library, 37(6), 1022–1039.
Ali-Khan, S. E., Harris, L. W., & Gold, E. R. (2017). Motivating participation in open science by examining researcher incentives. eLife, 6, 29319.
Ali-Khan, S. E., Jean, A., & Gold, E. R. (2018). Identifying the challenges in implementing open science. MNI Open Research, 2, 5.
Ali-Khan, S. E., Jean, A., MacDonald, E., & Gold, E. R. (2018). Defining success in open science. MNI Open Research, 2, 2.
Allaire, J. J., Dervieux, C., Scheidegger, C., Teague, C., & Xie, Y. (2022). Quarto (0.9.230). RStudio.
Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLOS Biology, 17(5), e3000246.
Asai, S. (2019). Determinants of article processing charges for medical open access journals. The Journal of Electronic Publishing, 22(1).
Asai, S. (2021). An analysis of revising article processing charges for open access journals between 2018 and 2020. Learned Publishing, 34(2), 137–143.
Attendees Of The NDSF Summit. (2020). Kanata Declaration 2020.
Auker, L. A., & Barthelmess, E. L. (2020). Teaching R in the undergraduate ecology classroom: Approaches, lessons learned, and recommendations. Ecosphere, 11(4).
Aust, F., & Barth, M. (2022). papaja: Prepare reproducible APA journal articles with R Markdown [R package version].
Ayris, P., Bernal, I., Cavalli, V., Dorch, B., Frey, J., Hallik, M., Hormia-Poutanen, K., Labastida, I., MacColl, J., Ponsati Obiols, A., Sacchi, S., Scholze, F., Schmidt, B., Smit, A., Sofronijevic, A., Stojanovski, J., Svoboda, M., Tsakonas, G., van Otegem, M., … Horstmann, W. (2018). LIBER Open Science Roadmap. LIBER - Europe’s Research Library Network.
Azevedo, F., Liu, M., Pennington, C. R., Pownall, M., Evans, T. R., Parsons, S., Elsherif, M. M., Micheli, L., Westwood, S. J., & Framework for Open, R. R. T. (FORRT). (2022). Towards a culture of open scholarship: The role of pedagogical communities. BMC Research Notes, 15(1), 75.
Bago, B., Kovacs, M., Protzko, J., Nagy, T., Kekecs, Z., Palfi, B., Adamkovic, M., Adamus, S., Albalooshi, S., Albayrak-Aydemir, N., Alfian, I. N., Alper, S., Alvarez-Solas, S., Alves, S. G., Amaya, S., Andresen, P. K., Anjum, G., Ansari, D., Arriaga, P., & Aczel, B. (2022). Situational factors shape moral judgements in the trolley dilemma in Eastern, Southern and Western countries in a culturally diverse sample. Nature Human Behaviour.
Bakker, B. N., Jaidka, K., Dörr, T., Fasching, N., & Lelkes, Y. (2021). Questionable and open research practices: Attitudes and perceptions among quantitative communication researchers. Journal of Communication, 71(5), 715–738.
Becerril-García, A. (2022). Academic Publishing and Latin America [Oral presentation]. Open Science European Conference.
Berghmans, S., Cousijn, H., Deakin, G., Meijer, I., Mulligan, A., Plume, A., Rushforth, A., Rijcke, S., Tatum, C., Tobin, S., Leeuwen, T., & Waltman, L. (2017). Open data: The researcher perspective. CWTS.
Bergmann, C. (2018). How to integrate open science into language acquisition research. Student Workshop at the 43rd Boston University Conference on Language Development (BUCLD).
Bottesini, J. G., Rhemtulla, M., & Vazire, S. (2021). What do participants think of our research practices? An examination of behavioral psychology participants’ preferences. PsyArXiv.
Budd, A., Dinkel, H., Corpas, M., Fuller, J. C., Rubinat, L., Devos, D. P., Khoueiry, P. H., Förstner, K. U., Georgatos, F., Rowland, F., Sharan, M., Binder, J. X., Grace, T., Traphagen, K., Gristwood, A., & Wood, N. T. (2015). Ten simple rules for organizing an unconference. PLOS Computational Biology, 11(1), e1003905.
Byers-Heinlein, K., Bergmann, C., Davies, C., Frank, M. C., Hamlin, J. K., Kline, M., Kominsky, J. F., Kosie, J. E., Lew-Williams, C., Liu, L., Mastroberardino, M., Singh, L., Waddell, C. P. G., Zettersten, M., & Soderstrom, M. (2020). Building a collaborative psychological science: Lessons learned from ManyBabies 1. Canadian Psychology / Psychologie Canadienne, 61(4), 349–363.
Byers‐Heinlein, K., Bergmann, C., & Savalei, V. (2021). Six solutions for more reliable infant research. Infant and Child Development, 31(5).
Caillaud, S., Haas, V., & Castro, P. (2021). From one new law to (many) new practices? Multidisciplinary teams re‐constructing the meaning of a new disability law. British Journal of Social Psychology, 60(3), 966–987.
Calin-Jageman, R. J., & Cumming, G. (2019). The new statistics for better science: Ask how much, how uncertain, and what else is known. The American Statistician, 73(sup1), 271–280.
Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, Social Sciences and Humanities Research Council of Canada. (2021). Tri-Agency Framework. Responsible Conduct of Research. Secretariat on Responsible Conduct of Research / Government of Canada.
Chan, E., Bourgeois-Doyle, D., Donaldson, M., Haine-Bennett, E. (2020). Toward a UNESCO recommendation on open science: Canadian perspectives. Canadian Commission for UNESCO.
Chen, S.-C., Szabelska, A., Chartier, C. R., Kekecs, Z., Lynott, D., Bernabeu, P., Jones, B. C., DeBruine, L. M., Levitan, C., Werner, K. M., Wang, K., Milyavskaya, M., Musser, E. D., Papadatou-Pastou, M., Coles, N. A., Janssen, S., Özdoğru, A. A., Storage, D., Manley, H., … Schmidt, K. (2018). Investigating object orientation effects across 14 languages. PsyArXiv.
Chubin, D. E. (1985). Open science and closed science: Tradeoffs in a democracy. Science, Technology, Human Values, 10(2), 73–81.
CODATA - Committee on Data of the International Science Council, CODATA International Data Policy Committee, CODATA And CODATA China High-Level International Meeting On Open Research Data Policy And Practice, Hodson, S., Mons, B., Uhlir, P., Zhang, L. (2019). The Beijing Declaration on Research Data. Zenodo.
Coles, N. A., Hamlin, J. K., Sullivan, L. L., Parker, T. H., Altschul, D. (2022). Build up big-team science. Nature, 601(7894), 505–507.
Cook, B. G., Fleming, J. I., Hart, S. A., Lane, K. L., Therrien, W. J., van Dijk, W., Wilson, S. E. (2022). A how-to guide for open-science practices in special education research. Remedial and Special Education, 43(4), 270–280.
Council for the Lindau Nobel Laureate Meetings/Foundation Lindau Nobel Laureate Meetings. (2020). The Lindau guidelines. Council for the Lindau Nobel Laureate Meetings/Foundation Lindau Nobel Laureate Meetings.
Crüwell, S., van Doorn, J., Etz, A., Makel, M. C., Moshontz, H., Niebaum, J. C., Orben, A., Parsons, S., Schulte-Mecklenbeck, M. (2019). Seven easy steps to Open Science: An annotated reading list. Zeitschrift Für Psychologie, 227(4), 237–248.
Cumming, G., Calin-Jageman, R. (2016). Introduction to the new statistics. Routledge.
da Silva, S., White, K., Ben, R., Brouillard, M., Gonzalez-Barrero, A. M., Killam, H., Kremin, L. V., Quirk, E., Sander-Montant, A., Schott, E., Tsui, R. K.-Y., Byers-Heinlein, K. (2021). Development and use of open educational resources in research methods for psychology. International Journal for the Scholarship of Teaching and Learning, 15(2).
Dal Ben, R., Brouillard, M., Gonzalez-Barrero, A. M., Killam, H., Kremin, L. V., Quirk, E., Sander-Montant, A., Schott, E., Tsui, R. K.-Y., Byers-Heinlein, K. (2022). How open science can benefit bilingualism research: A lesson in six tales. Bilingualism: Language and Cognition, 25(5), 913–920.
Davis-Kean, P. E., Ellis, A. (2019). An overview of issues in infant and developmental research for the creation of robust and replicable science. Infant Behavior and Development, 57, 101339.
DORA. (2012). San Francisco Declaration on Research Assessment.
DORA. (2021). Project TARA.
Echevarría, L., Malerba, A., Arechavala-Gomeza, V., Rohrer, J. M. (2021). Researcher’s perceptions on publishing “negative” results and open access. Nucleic Acid Therapeutics, 31(3), 185–189.
Engzell, P., Rohrer, J. M. (2021). Improving social science: Lessons from the open science movement. PS: Political Science Politics, 54(2), 297–300.
Farnham, A., Kurz, C., Öztürk, M. A., Solbiati, M., Myllyntaus, O., Meekes, J., Pham, T. M., Paz, C., Langiewicz, M., Andrews, S., Kanninen, L., Agbemabiese, C., Guler, A. T., Durieux, J., Jasim, S., Viessmann, O., Frattini, S., Yembergenova, D., Benito, C. M., … Hettne, K. (2017). Early career researchers want open science. Genome Biology, 18(1), 221.
Fell, M. J. (2019). The economic impacts of open science: A rapid evidence assessment. Publications, 7(3), 46.
Forscher, P. S., Wagenmakers, E.-J., Coles, N. A., Silan, M. A. A., Dutra, N. B., Basnight-Brown, D., IJzerman, H. (2020). The benefits, barriers, and risks of big team science. PsyArXiv.
Frankowski, S. D. (2021). Increasing participation in psychological science by using course-based research projects: Testing theory, using open-science practices, and professionally presenting research. Teaching of Psychology, 009862832110242.
Friesike, S., Schildhauer, T. (2015). Open science: Many good resolutions, very few incentives, yet. In I. M. Welpe, J. Wollersheim, S. Ringelhan, M. Osterloh (Eds.), Incentives and performance (pp. 277–289). Springer International Publishing.
Gargouri, Y., Hajjem, C., Larivière, V., Gingras, Y., Carr, L., Brody, T., Harnad, S. (2010). Self-selected or mandated, open access increases citation impact for higher quality research. PLoS ONE, 5(10), e13636.
Gernsbacher, M. A. (2018). Rewarding research transparency. Trends in Cognitive Sciences, 22(11), 953–956.
Gold, E. R., Ali-Khan, S. E., Allen, L., Ballell, L., Barral-Netto, M., Carr, D., Chalaud, D., Chaplin, S., Clancy, M. S., Clarke, P., Cook-Deegan, R. R., Dinsmore, A. P., Doerr, M., Federer, L., Hill, S. A., Jacobs, N., Jean, A., Jefferson, O. A., Jones, C., … Thelwall, M. (2018). An open toolkit for tracking open science partnership implementation and impact. Gates Open Research.
Hall, C. A., Saia, S. M., Popp, A. L., Dogulu, N., Schymanski, S. J., Drost, N., van Emmerik, T., Hut, R. (2022). A hydrologist’s guide to open science. Hydrology and Earth System Sciences, 26(3), 647–664.
Hanna, S., Pither, J., Vis-Dunbar, M. (2021). Implementation of an open science instruction program for undergraduates. Data Intelligence, 3(1), 150–161.
Havron, N., Bergmann, C., Tsuji, S. (2020). Preregistration in infant research—A primer. Infancy, 25(5), 734–754.
Hawkins, R. X. D., Smith, E. N., Au, C., Arias, J. M., Catapano, R., Hermann, E., Keil, M., Lampinen, A., Raposo, S., Reynolds, J., Salehi, S., Salloum, J., Tan, J., Frank, M. C. (2018). Improving the replicability of psychological science through pedagogy. Advances in Methods and Practices in Psychological Science, 1(1), 7–18.
Heller, L., Brinken, H. (2018). How to run a book sprint – in 16 steps. LSE Impact Blog.
Hilton, J. L., III, Gaudet, D., Clark, P., Robinson, J., Wiley, D. (2013). The adoption of open educational resources by one community college math department. The International Review of Research in Open and Distributed Learning, 14(4).
Houtkoop, B. L., Chambers, C., Macleod, M., Bishop, D. V. M., Nichols, T. E., Wagenmakers, E.-J. (2018). Data sharing in psychology: A survey on barriers and preconditions. Advances in Methods and Practices in Psychological Science, 1(1), 70–85.
Ikahihifo, T. K., Spring, K. J., Rosecrans, J., Watson, J. (2017). Assessing the savings from open educational resources on student academic goals. The International Review of Research in Open and Distributed Learning, 18(7).
Imming, M., Tennant, J. (2018, June 8). Open science: just science done right (ENG) [Sticker]. Zenodo.
Inkpen, R., Gauci, R., Gibson, A. (2021). The values of open data. Area, 53(2), 240–246.
JASP Team. (2022). JASP (Version 0.16.3) [Computer software].
Kathawalla, U.-K., Silverstein, P., Syed, M. (2021). Easing into open science: A guide for graduate students and their advisors. Collabra: Psychology, 7(1), 18684.
Kawachi, P. (2014). Quality assurance guidelines for open educational resources: TIPS framework. Commonwealth Educational Media Centre for Asia.
Kelly, T. F. (1999). Why state mandates don’t work. The Phi Delta Kappan, 80(7), 543–544.
Khan, H., Almoli, E., Franco, M. C., Moher, D. (2022). Open science failed to penetrate academic hiring practices: A cross-sectional study. Journal of Clinical Epidemiology, 144, 136–143.
Khoo, S. Y.-S. (2019). Article processing charge hyperinflation and price insensitivity: An open access sequel to the serials crisis. LIBER Quarterly: The Journal of the Association of European Research Libraries, 29(1), 1–18.
Kiyonaga, A., Scimeca, J. M. (2019). Practical considerations for navigating registered reports. Trends in Neurosciences, 42(9), 568–572.
Lakomý, M., Hlavová, R., Machackova, H. (2019). Open science and the science-society relationship. Society, 56(3), 246–255.
Lasser, J. (2020). Creating an executable paper is a journey through open science. Communications Physics, 3(1), 143.
Lortie, C. (2017). Open sesame: R for data science is open science. Ideas in Ecology and Evolution, 10(1), 78–86.
Marwick, B., Wang, L.-Y., Robinson, R., Loiselle, H. (2020). How to use replication assignments for teaching integrity in empirical archaeology. Advances in Archaeological Practice, 8(1), 78–86.
Masuzzo, P., Martens, L. (2017). Do you speak open science? Resources and tips to learn the language. PeerJ Preprints.
McDermott, R. (2022). Breaking free: How preregistration hurts scholars and science. Politics and the Life Sciences, 41(1), 55–59.
McIntyre, K. P. (2017). Teaching statistics in the age of open science. APS Observer, 30(10).
McKiernan, E. C., Bourne, P. E., Brown, C. T., Buck, S., Kenall, A., Lin, J., McDougall, D., Nosek, B. A., Ram, K., Soderberg, C. K., Spies, J. R., Thaney, K., Updegrove, A., Woo, K. H., Yarkoni, T. (2016). How open science helps researchers succeed. ELife, e16800.
Morling, B., Calin-Jageman, R. J. (2020). What psychology teachers should know about open science and the new statistics. Teaching of Psychology, 47(2), 169–179.
Moshontz, H., Binion, G., Walton, H., Brown, B. T., Syed, M. (2021). A guide to posting and managing preprints. Advances in Methods and Practices in Psychological Science, 4(2), 251524592110199.
Moshontz, H., Campbell, L., Ebersole, C. R., IJzerman, H., Urry, H. L., Forscher, P. S., Grahe, J. E., McCarthy, R. J., Musser, E. D., Antfolk, J., Castille, C. M., Evans, T. R., Fiedler, S., Flake, J. K., Forero, D. A., Janssen, S. M. J., Keene, J. R., Protzko, J., Aczel, B., … Chartier, C. R. (2018). The Psychological Science Accelerator: Advancing psychology through a distributed collaborative network. Advances in Methods and Practices in Psychological Science, 1(4), 501–515.
National Academies of Sciences, Engineering, and Medicine. (2018). Open science by design: Realizing a vision for 21st century research. National Academies Press.
Neth, H. (2022). Data science for psychologists.
Nosek, B. A., Alter, G., Banks, G. C., Borsboom, D., Bowman, S. D., Breckler, S. J., Buck, S., Chambers, C. D., Chin, G., Christensen, G., Contestabile, M., Dafoe, A., Eich, E., Freese, J., Glennerster, R., Goroff, D., Green, D. P., Hesse, B., Humphreys, M., … Yarkoni, T. (2015). Promoting an open research culture. Science, 348(6242), 1422–1425.
Nosek, B. A., Hardwicke, T. E., Moshontz, H., Allard, A., Corker, K. S., Dreber, A., Fidler, F., Hilgard, J., Kline Struhl, M., Nuijten, M. B., Rohrer, J. M., Romero, F., Scheel, A. M., Scherer, L. D., Schönbrodt, F. D., Vazire, S. (2022). Replicability, robustness, and reproducibility in psychological science. Annual Review of Psychology, 73(1), 719–748.
Nosek, B. A., Lakens, D. (2014). Registered reports. Social Psychology, 45(3), 137–141.
OER Commons. (2022, March 14). Learn about the movement.
Office of the Chief Science Advisor of Canada. (2020). Roadmap for open science. Government of Canada.
Office of the Chief Science Advisor of Canada. (2021). A framework for implementing open-by-default with federal government science. Government of Canada.
Pardo Martínez, C., Poveda, A. (2018). Knowledge and perceptions of open science among researchers—A case study for Colombia. Information, 9(11), 292.
Parsons, S., Azevedo, F., Elsherif, M. M., Guay, S., Shahim, O. N., Govaart, G. H., Norris, E., O’Mahony, A., Parker, A. J., Todorovic, A., Pennington, C. R., Garcia-Pelegrin, E., Lazić, A., Robertson, O., Middleton, S. L., Valentini, B., McCuaig, J., Baker, B. J., Collins, E., … Aczel, B. (2022). A community-sourced glossary of open scholarship terms. Nature Human Behaviour, 6(3), 312–318.
Persic, A., Beigel, F., Hodson, S., Oti-Boateng, P. (2021). The time for open science is now. In UNESCO Science Report: The race against time for smarter development (pp. 12–16). UNESCO.
Phills, C., Kekecs, Z., Chartier, C. R., Akkas, H., Söylemez, S., Solak, Ç., Buchanan, E. M. (2021, March 31). Psychological Science Accelerator -- Gendered Prejudice Project.
Pontika, N., Knoth, P., Cancellieri, M., Pearce, S. (2015). Fostering open science to research using a taxonomy and an eLearning portal. Proceedings of the 15th International Conference on Knowledge Technologies and Data-Driven Business, 1–8.
Poupon, V., Seyller, A., Edwards, A., Gold, R., Rouleau, G. (2020). Open Science at the Montreal Neurological Institute and Hospital: The buy-in process. Gates Open Research.
Poupon, V., Seyller, A., Rouleau, G. A. (2017). The Tanenbaum Open Science Institute: Leading a paradigm shift at the Montreal Neurological Institute. Neuron, 95(5), 1002–1006.
Pownall, M., Talbot, C. V., Henschel, A., Lautarescu, A., Lloyd, K. E., Hartmann, H., Darda, K. M., Tang, K. T. Y., Carmichael-Murphy, P., Siegel, J. A. (2021). Navigating open science as early career feminist researchers. Psychology of Women Quarterly, 45(4), 526–539.
R Core Team. (2021). R: A language and environment for statistical computing. R Foundation for Statistical Computing.
Reich, J. (2021). Preregistration and registered reports. Educational Psychologist, 56(2), 101–109.
Rhoads, S., Gan, L. (2022). Computational models of human social behavior and neuroscience: An open educational course and Jupyter Book to advance computational training. Journal of Open Source Education, 5(47), 146.
Robinson, L. D., Cawthray, J. L., West, S. E., Bonn, A., Ansine, J. (2018). Ten principles of citizen science. In S. Hecker, M. Haklay, A. Bowser, Z. Makuch, J. Vogel, A. Bonn (Eds.), Citizen science. Innovation in open science, society and policy (pp. 27–40). UCL Press.
Robson, S. G., Baum, M. A., Beaudry, J. L., Beitner, J., Brohmer, H., Chin, J. M., Jasko, K., Kouros, C. D., Laukkonen, R. E., Moreau, D., Searston, R. A., Slagter, H. A., Steffens, N. K., Tangen, J. M., Thomas, A. (2021). Promoting open science: A holistic approach to changing behaviour. Collabra: Psychology, 7(1), 30137.
Rouleau, G. (2017). Open science at an institutional level: An interview with Guy Rouleau. Genome Biology, 18(1), 14.
Saenen, B., Morais, R., Gaillard, V., Borrel-Damián, L. (2019). Research assessment in the transition to open science. European University Association.
Salas, E., Sims, D. E., Klein, C. (2004). Cooperation at work. In C. Spielberger (Ed.), Encyclopedia of Applied Psychology (pp. 497–506). Elsevier.
Santos-Hermosa, G., Estupinyà, E., Nonó-Rius, B., París-Folch, L., Prats-Prats, J. (2021). Open educational resources (OER) in the Spanish universities. El Profesional de La Información, e290637.
Sarafoglou, A., Kovacs, M., Bakos, B., Wagenmakers, E.-J., Aczel, B. (2022). A survey on how preregistration affects the research workflow: Better science but more work. Royal Society Open Science, 9(7), 211997.
Schmid, S. L. (2017). Five years post-DORA: Promoting best practices for research assessment. Molecular Biology of the Cell, 28(22), 2941–2944.
Schönbrodt, F., Mellor, D., Bergmann, C., Penfold, N., Westwood, S., Lautarescu, A., Kowalczyk, O., Dall’Aglio, L., Blok, E., Schettino, A., Schramm, L., Weber, B., Tananbaum, G., Etzel, F., Montoya, A. (2022). Academic job offers that mentioned open science. Open Science Framework.
Seibold, H. (2020). Let’s become (open)science champions. Kick-off meeting of the Open Reproducible Data Science and Statistics (ORDS) network. Universität Rostock, Germany.
Simmons, J., Nelson, L., Simonsohn, U. (2021). Pre‐registration: Why and how. Journal of Consumer Psychology, 31(1), 151–162.
Smith, A. C., Merz, L., Borden, J. B., Gulick, C. K., Kshirsagar, A. R., Bruna, E. M. (2022). Assessing the effect of article processing charges on the geographic diversity of authors using Elsevier’s “Mirror Journal” system. Quantitative Science Studies, 2(4), 1123–1143.
Susi, T., Heintz, M., Hnatkova, E., Koch, W., Leptin, M., Andler, M., Masia, M., Garfinkel, M. (2022). Centrality of researchers in reforming research assessment. ISE - Initiative for Science in Europe.
Tenney, E. R., Costa, E., Allard, A., Vazire, S. (2021). Open science and reform practices in organizational behavior research over time (2011 to 2019). Organizational Behavior and Human Decision Processes, 162, 218–223.
Tenopir, C., Rice, N. M., Allard, S., Baird, L., Borycz, J., Christian, L., Grant, B., Olendorf, R., Sandusky, R. J. (2020). Data sharing, management, use, and reuse: Practices and perceptions of scientists worldwide. PLoS ONE, 15(3), e0229003.
The ManyBabies Consortium. (2020). Quantifying sources of variability in infancy research using the infant-directed-speech preference. Advances in Methods and Practices in Psychological Science, 3(1), 24–52.
Thompson, S. D., Muir, A. (2020). A case study investigation of academic library support for open educational resources in Scottish universities. Journal of Librarianship and Information Science, 52(3), 685–693.
Toribio-Flórez, D., Anneser, L., deOliveira-Lopes, F. N., Pallandt, M., Tunn, I., Windel, H. (2021). Where do early career researchers stand on open science practices? A survey within the Max Planck Society. Frontiers in Research Metrics and Analytics, 5, 586992.
UNESCO. (2002). Forum on the impact of open courseware for higher education in developing countries. UNESCO.
Vermeire, B. C., Pereira, C. A., Karbasian, H. (2020). Computational fluid dynamics: An open-source approach. Concordia University.
Verplanken, B., Orbell, S. (2022). Attitudes, habits, and behavior change. Annual Review of Psychology, 73(1), 327–352.
Vicente-Saez, R., Gustafsson, R., Martinez-Fuentes, C. (2021). Opening up science for a sustainable world: An expansive normative structure of open science in the digital era. Science and Public Policy, 48(6), 799–813.
Walsh, C. G., Xia, W., Li, M., Denny, J. C., Harris, P. A., Malin, B. A. (2018). Enabling open-science initiatives in clinical psychology and psychiatry without sacrificing patients’ privacy: Current practices and future challenges. Advances in Methods and Practices in Psychological Science, 1(1), 104–114.
Zaid, Y. A., Alabi, A. O. (2021). Sustaining open educational resources (OER) initiatives in Nigerian universities. Open Learning: The Journal of Open, Distance and e-Learning, 36(2), 181–197.
Zečević, K., Houghton, C., Noone, C., Lee, H., Matvienko-Sikar, K., Toomey, E. (2021). Exploring factors that influence the practice of open science by early career health researchers: A mixed methods study. HRB Open Research, 3, 56.
This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.

Supplementary data