The scientific community has long recognized the benefits of open science. Today, governments and research agencies worldwide are increasingly promoting and mandating open practices for scientific research. However, for open science to become the by-default model for scientific research, researchers must perceive open practices as accessible and achievable. A significant obstacle is the lack of resources providing clear direction on how researchers can integrate open science practices into their day-to-day workflows. This article outlines and discusses ten concrete strategies that can help researchers use and disseminate open science practices. The first five strategies address basic ways of getting started in open science that researchers can put into practice today. The last five strategies are aimed at researchers who are more advanced in open practices and want to advocate for open science. Our paper will help researchers navigate the transition to open science practices and support others in shifting toward openness, thus contributing to building a better science.
In the last twenty years, we have witnessed the rise of open science—the cultural movement and ethical-political perspective that strives to make all processes and products involved in scientific research available and accessible to all people without barriers (Friesike & Schildhauer, 2015; Lakomý et al., 2019). Open science is so relevant today that it has been proposed as a by-design and by-default model for conducting scientific research (National Academies of Sciences, Engineering, and Medicine, 2018; Office of the Chief Science Advisor of Canada, 2020, 2021).
Open science comprises a diverse set of best practices that seek to democratize knowledge and improve the quality of scientific research. These practices are often grouped into areas as varied as open access, open educational resources, open data, open labs, open notebooks, open innovation, open evaluation, open hardware, open-source software, and citizen science (Chan et al., 2020; Pontika et al., 2015). In the field of psychology, open science practices such as pre-registration, sharing data in public repositories alongside code, and using open-source statistical software such as R (R Core Team, 2021) have been emphasized, because they can foster the reproducibility and replicability of findings (Crüwell et al., 2019; Davis-Kean & Ellis, 2019).
The benefits of open science go beyond enhancing the rigor and reliability of the scientific research process. For example, open publications get more citations and allow authors to retain the rights to their scholarly output through open licenses (Gargouri et al., 2010; McKiernan et al., 2016). Open data practices can contribute to generating high-quality datasets that others can reuse, thus avoiding unnecessary duplication of research efforts (Ali-Khan, Jean, & Gold, 2018), and citizen science can lead to more socially relevant research by involving citizens in the scientific endeavor (see Robinson et al., 2018). Open science practices can reduce costs, for example when articles are published open access rather than behind a paywall. Further, they enable impactful activities that would not otherwise be possible, like fostering innovative collaborations, products, and industries (Fell, 2019). This is because ideas, data, and discoveries flow freely, more quickly, and in more diverse contexts than do the results of closed research. It is, therefore, sensible to argue that the “early adoption of open and reproducible methods is an investment in the future and can put researchers ahead of the curve” (Allen & Mehler, 2019, p. 9).
Nevertheless, practically implementing open science poses significant difficulties. As Kathawalla et al. (2021) point out, researchers who want to transition to the open model often develop “a sense of paralysis associated with not knowing where to begin” (p. 1). Several factors likely contribute to this feeling. For one, open science resources are often more theoretical than practical (Farnham et al., 2017). Even if researchers agree with the general principles of open science, they may not know how to apply them on a day-to-day basis. In turn, navigating the plethora of open science resources and tools available can take significant time and energy, especially where the institutional rewards of practicing open science are more ephemeral than concrete (Gernsbacher, 2018; Hall et al., 2022).
Additionally, researchers wishing to embrace open science must often adjust aspects of their day-to-day scientific behavior (Chubin, 1985; Robson et al., 2021), which can prove daunting. For example, pre-registration and registered reports require researchers to specify an analysis pipeline before data collection, thus inverting the order in which they traditionally do research (Allen & Mehler, 2019). This can contribute to the feeling that open practices such as pre-registration stifle creativity, increase work-related stress, or involve additional work that lengthens project duration unnecessarily (McDermott, 2022; Sarafoglou et al., 2022). Similarly, researchers contemplating open data may have concerns about data ownership, responsibility, and control (Berghmans et al., 2021; Inkpen et al., 2021). As another example, for-profit journals (especially those with high prestige) often have astronomical article processing charges that authors or research institutions are required to pay in order to publish openly (e.g., Asai, 2019, 2021; Khoo, 2019). Such high publication costs constitute an economic hardship for many authors, and are simply unaffordable for authors with fewer resources, restricting the free circulation of knowledge (see Persic et al., 2021; Smith et al., 2022). Finally, the limited empirical research on the benefits, risks, and limits of open science (see Ali-Khan, Jean, MacDonald, et al., 2018) acts as a barrier, holding back students, researchers, and the public from perceiving open science practices as a worthwhile endeavor (for recent research on this topic, see Bakker et al., 2021; Bottesini et al., 2021; Echevarría et al., 2021; Nosek et al., 2015; Pardo Martínez & Poveda, 2018; Tenopir et al., 2020).
In this article, we outline ten concrete strategies that can help researchers in psychology and beyond to foster open science practices (see Table 1). Rather than discussing the characteristics of different open science practices (e.g., the peculiarities of publishing a preprint or pre-registering a study), we describe simple courses of action that readers can take immediately to foster their implementation. In this way, our article calls on readers to revise day-to-day practices and implement changes at every step of the research cycle. We begin by describing five strategies that will allow researchers to make the transition to becoming open science users. This will be helpful both for those who are not familiar with the open approach to research practice and for those who are but have not yet implemented open science practices in their workflows. Afterward, we present five more advanced strategies for researchers to transition from users to advocates who help disseminate and broaden the adoption of open science in their local contexts. It is worth noting that the difference between a user and an advocate of open science is one of degree and not of kind. For example, doing open science where others can notice it and learn from it can be thought of as a simple act of advocacy. Our advanced strategies discuss more complex and challenging ways of making open science more widely available.
Table 1. Ten strategies for fostering open science practices and concrete actions for implementing them.

Basic strategies: becoming an open science user

| Strategy | Concrete actions |
| --- | --- |
| Experiment with integrating open science practices and policies into daily workflows | Start small: use open-source software, publish a preprint, pre-register a study, or sign a peer review (i.e., open peer review). As you get more comfortable with open science practices, share lab notebooks, deposit data and code in public repositories, or write a registered report. |
| Become familiar with national and institutional open science policies | Create spaces to read, share, and discuss open science mandates developed by government agencies and other institutions. |
| Analyze and share successful cases of implementation of open science practices | Become familiar with and inspired by cases of successful implementation of open science practices. Discuss them with others. Identify, compare, and eventually implement strategies to spur open science at your institution. |
| Introduce open science practices in your courses | Incorporate open science practices in course content. Allow students to come into contact with and experiment with the open research culture before carrying out major projects. |
| Embrace open educational resources (OERs) | Use and/or create OERs, such as open courses, syllabi, lectures, assignments, and textbooks. If you create OERs, publish them under a Creative Commons license through open repositories. |

Advanced strategies: from open science user to advocate

| Strategy | Concrete actions |
| --- | --- |
| Collaborate with others using open tools | Exploit collaborations to rethink, question, and overcome embedded closed practices. Try writing an article with others using R Markdown or Quarto, sharing data using open formats, or keeping a version history of research files using GitHub or similar platforms. |
| Develop networks of open collaboration | Establish collaborative networks with others who advocate for open research and work on related topics. Develop standards and guidelines to wrangle, analyze, and visualize data, to comment and proofread code, and to create accompanying documentation. |
| Voice your opinion | Write testimonials, reports, glossaries, or declarations of support for open science. Outline principles, values, and considerations relevant to your community. |
| Rethink and promote changes in the assessment of scholarly production | Take ownership of the discussion on research assessment. Collaborate in building spaces to discuss and promote alternatives to the traditional measurement of academic performance (e.g., impact factor, h-index). Advocate for the recognition of a variety of research outputs (e.g., open datasets, open peer reviews). |
| Create opportunities for people to specialize in open science | If you have or can pursue funding to hire personnel, create open-science-related job opportunities (e.g., PhD scholarships and research assistantships, research manager positions, postdoctoral positions). |
Basic Strategies: Becoming an Open Science User
Strategy 1: Experiment with Integrating Open Science Practices into Daily Workflows
Rather than being all or nothing, openness should be considered a continuum of practices ranging from uncomplicated to highly complex (McKiernan et al., 2016). Researchers can begin small and gradually integrate open science practices into everyday workflows by, so to speak, picking and choosing what to experiment with from the “open science buffet” (Bergmann, 2018). This is a wide-ranging strategy that can be implemented in various ways depending on the specific objectives of each investigation. Ways to start include using open-source software, publishing a preprint, pre-registering a study, or signing a review (i.e., open peer review). There are good guides on how to get started with these practices (e.g., Cook et al., 2022; Kathawalla et al., 2021; Masuzzo & Martens, 2017; Moshontz et al., 2021; Simmons et al., 2021).
As research team members become more comfortable with some basic open science practices, they can begin to consider more complex ones, such as sharing lab notebooks, depositing research data and code in public repositories, or writing a registered report article (Kiyonaga & Scimeca, 2019; Nosek & Lakens, 2014; Reich, 2021). Although more costly in terms of effort, ventures such as these are likely to have a greater impact on the behavior of researchers and eventually change the dynamics of the research process itself.
Early career researchers seem to have a more positive attitude towards open science than more established researchers and are eager to participate in open initiatives, even if they sometimes feel they need more support to do so (Abele-Brehm et al., 2019; Pownall et al., 2021; Toribio-Flórez et al., 2021; Zečević et al., 2021). Thus, giving graduate students and postdoctoral researchers the freedom and support to explore open science practices—and eventually use this knowledge to help other researchers engage in open science practices—can prove a valuable strategy for catalyzing change.
Strategy 2: Become Familiar with National and Institutional Open Science Policies
Transitioning to open science is a challenging undertaking because it involves both a shift in how science is understood and substantive changes in behaviors associated with research (see, for instance, Attendees Of The NDSF Summit, 2020). Strategy 1 developed above seeks to encourage researchers to gradually render certain behaviors normative. It is a bottom-up approach to promoting change in how science is done by accumulating meaningful practices at the grassroots level. By contrast, mandates from governments and granting agencies are a top-down approach to change.
Mandates help open science become a standard and no longer an add-on to research. For instance, in Canada, the federal funding agencies (also known as the Tri-council) have enacted an open access policy, whereby grant recipients must ensure that publications stemming from agency funding are freely accessible within 12 months of publication (e.g., by depositing peer-reviewed manuscripts in institutional or disciplinary repositories, or by publishing in open-access journals). Mandates put all researchers on a level playing field, and give those who have already transitioned voluntarily a head start. Moreover, researchers are usually diligent in complying with mandates because of the potential consequences if they do not (Canadian Institutes of Health Research, Natural Sciences and Engineering Research Council of Canada, & Social Sciences and Humanities Research Council of Canada, 2021, pp. 14–15). However, as noted elsewhere, imposing a policy for its own sake is unlikely to be the basis for sustainable change (Ali-Khan, Jean, & Gold, 2018). This is because people must perceive laws and policies as reasonable, doable, and necessary for their implementation to be effective (Caillaud et al., 2021; Kelly, 1999; Vicente-Saez et al., 2021).
Along these lines, we argue that implementing open science practices necessitates disseminating and discussing top-down strategies within the communities in which they are intended to be applied. Mandates will only become fully effective after all stakeholders familiarize themselves with them, consider their motivation, and discuss what might be on the horizon. This type of discussion is becoming even more important in the current context in which the number of open science mandates and policies is growing steadily year by year (see, e.g., data from ROARMAP, the Registry of Open Access Repository Mandates and Policies, http://roarmap.eprints.org/). Researchers and students can create spaces to read, comment, and reflect on the contents of the policies that affect their local communities. For instance, Eurodoc, the European Council of Doctoral Candidates and Junior Researchers, launched an Open Science Working Group seeking to ease the transition to open science by connecting policy makers and researchers to elaborate recommendations on the implementation of open practices (http://www.eurodoc.net/wg/open-science-wg). Those who wish to take it one step further can also organize informal meetings, working groups, discussion panels, and conferences on open science policies. Less conventional formats such as hackathons, unconferences (i.e., participant-oriented meetings, without a predetermined agenda, aimed at fostering collaboration and discussion; see Budd et al., 2015), and book sprints (i.e., short events aiming at the collaborative writing of a book; see Heller & Brinken, 2018) can also be constructive.
Creating new spaces dedicated to open science can nevertheless be challenging. For example, setting up an open science working group is more feasible in institutions that have long been building momentum for a change in research culture. In these institutions, new initiatives crystallize previous efforts and feel like natural outcomes of longer processes. In institutions where an open science culture is nascent, it is possible to begin by taking advantage of pre-existing venues with built-in audiences, such as lecture series, reading groups, and lab meetings, as well as informal spaces. For example, a PI may propose to discuss an open science mandate in a lab meeting, and a group of graduate students can organize a convivial lunch to brainstorm how to include open science practices in funding applications. These and other activities should be aimed at understanding the rationale behind policies, assessing whether this rationale is sound, and discussing how and to what extent the content of the policies would affect the researchers’ practice and careers.
Strategy 3: Analyze and Share Successful Cases of Implementation of Open Science Practices
Knowing and discussing how others have implemented open science practices is of great help to identify, compare, and eventually enact strategies for transitioning to open science. Each country, discipline, institution, and lab has particularities that must be considered when designing open science adoption policies. However, other individuals’ and institutions’ experiences can help researchers recognize potential obstacles and ways to overcome them.
A simple action researchers can take is to directly consult other peers with experience in implementing open science practices. This could be the head of a lab that has transitioned to open science, a researcher working openly with similar types of data, or an open science advocate working in the same department. These consultations can shed light on difficulties and provide first-hand solutions for adopting open science that work in specific research contexts. For example, colleagues might be able to share examples of pre-registration templates (e.g., Havron et al., 2020) or data sharing approaches (e.g., Houtkoop et al., 2018; Tenney et al., 2021; Walsh et al., 2018) that are tailored to a subfield and its particularities.
While such one-on-one interactions are largely informal and undocumented, we have better records for how large projects and institutions have embraced open science practices. These records document common pitfalls as well as solutions that different communities have found for embracing open science practices. ManyBabies 1, for instance, was a collaborative project involving 100+ researchers (including one of the authors of this paper) and 70+ labs that tested the development of infant-directed speech preference (The ManyBabies Consortium, 2020).
At the end of the project, the research team published a paper reporting on the experience and addressing how Big Team Science—the interdependent work of multiple teams to carry out joint projects—can help in fostering open science practices (Byers-Heinlein et al., 2020). For example, the team ensured that all study stimuli, analysis scripts, and data were shared via the Open Science Framework and published findings under the registered report format. In addition, many laboratories began to implement open science practices as a result of their participation in the project. The article also points out barriers the team faced while adopting open science practices, such as the lack of specific technical skills and the varying resources available in different labs, and provides valuable insights that can benefit other large-scale collaborations in psychology interested in doing research openly.
Another interesting example is that of the Montreal Neurological Institute in Canada. This institution recently adopted an ambitious open science policy thanks to the creation of the Tanenbaum Open Science Institute (TOSI) in 2016. The institute’s mission is to establish best practices and develop tools to support the transition to open science, as well as to measure its impact (Rouleau, 2017). Poupon et al. (2017, 2020) describe in detail the 18-month “buy-in process”, dividing it into six phases: (i) building awareness of open science at the institution, (ii) carrying out the internal consultation process, (iii) extending the discussion to McGill University, where all faculty of the institute are employed, (iv) establishing open science guiding principles and presenting them to the community, (v) generating a definition of open science for the institution, and (vi) spreading awareness of the open science initiative nationally and internationally. Similarly, other articles describe the attitudes and concerns of the institute’s researchers towards open science and their motivations and disincentives to participate in it (Ali-Khan et al., 2017, see also Ali-Khan, Jean, & Gold, 2018), and the instruments used by the institution to measure the impact of open science principles and practices on research and innovation (Gold et al., 2018). Knowledge of these materials can help researchers working elsewhere strategize how to spearhead the transition to open science at their own institution.
Spreading the word about successful cases of implementation of open science is essential, regardless of whether it is possible to initiate a transition process at one's own institution. The main purpose is to amplify the message that another style of science is possible and to inspire others to initiate change. In this sense, we strongly encourage those who have gone through the process of implementing open practices to document and share their experience as soon as possible, and those interested in open science to disseminate successful case studies.
Strategy 4: Introduce Open Science Practices in Your Courses
Instructors can contribute to a sustainable shift towards open science practices by introducing them into their courses and workshops. Incorporating open science into the curriculum allows students to come into contact with and experiment with the open research culture long before carrying out major projects in the context of a research assistantship or their doctoral research. Moreover, since many of today’s students will be tomorrow’s researchers, professors, and policy makers, early contact with open science could contribute greatly to establishing new procedural standards for research.
In some knowledge areas, including open science practices in teaching is more straightforward. For instance, in statistics and research methods, courses can be designed around the “new statistics for better science” approach (Calin-Jageman & Cumming, 2019; Cumming & Calin-Jageman, 2016; Morling & Calin-Jageman, 2020). This entails discussing issues such as the reproducibility crisis, explaining the importance of effect sizes, confidence intervals, and meta-analysis, and addressing the importance of transparent reporting (see Nosek et al., 2022). Also, procedural content such as data wrangling, statistical analysis, and visualization can be taught with open-source tools such as R or Python (Auker & Barthelmess, 2020). Embracing open-source tools brings two benefits: (i) students can learn programming in a new language while learning statistics in a scaffolded educational context, and (ii) they gain expertise in open-source tools, which are much more flexible than commercial software and are enriched continuously thanks to packages and modules developed by the community. In that sense, open-source programming languages such as R have a virtually unlimited expansion capacity and can act as natural bridges between data and open science practices (Lortie, 2017). Finally, instructors can incorporate replication research in their courses. For example, Hawkins et al. (2018) describe an experience in which students in a graduate-level experimental methods course engaged in pre-registered replications of findings from a volume of the journal Psychological Science.
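As a concrete illustration, the short base-R sketch below (a minimal example with simulated data, so all numbers are purely illustrative) shows how a class exercise might foreground an effect size and its confidence interval rather than a bare p-value:

```r
# Simulated scores for two hypothetical groups (illustrative data only)
set.seed(2023)
control   <- rnorm(40, mean = 100, sd = 15)
treatment <- rnorm(40, mean = 108, sd = 15)

# 95% confidence interval for the mean difference (Welch t-test)
t_out <- t.test(treatment, control)
t_out$conf.int

# Standardized effect size: Cohen's d with a pooled standard deviation
pooled_sd <- sqrt((var(treatment) + var(control)) / 2)
(mean(treatment) - mean(control)) / pooled_sd
```

Because the sketch relies only on base R, students can run and modify it without any licensing costs, in line with the open-source emphasis described above.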
Fortunately, the possibilities are not limited to courses on statistics or research methods. Course-based research projects in applied subjects (e.g., developmental psychology, social psychology, educational psychology, and other social sciences) can also act as playgrounds for testing open science practices (Frankowski, 2021). For instance, Marwick et al. (2020) implemented a replication report assignment in an upper-level archaeology class at the University of Washington. Working in groups, students chose a study to replicate, wrote their own code to analyze the original data, and finally produced a compendium of report, code, and data to submit for grading. Note that for most students, the assignment was their first experience of trying to replicate the findings of a scholarly publication. Yet, 91% agreed or strongly agreed that the ability to replicate is an important skill, and 80% agreed or strongly agreed that the replication assignment helped them understand the original paper better than they would have by reading it to write an essay.
There are myriad options for incorporating open science into courses: students can reuse open stimuli and data, deposit data and research reports in public repositories such as OSF, pre-register a study, or create repositories for version control and tracking group work (e.g., using GitHub). Also, institutions can create instruction programs consisting of open science concepts, principles, and techniques that can be adapted to be taught in several subjects (see Hanna et al., 2021).
Strategy 5: Embrace Open Educational Resources
Implementing the previous strategy requires new types of educational materials, which demand a considerable amount of time and effort to create. To help solve this problem, the open community has begun to engage with open educational resources (OERs), defined as “the open provision of educational resources, enabled by information and communication technologies, for consultation, use and adaptation by a community of users for non-commercial purposes” (UNESCO, 2002, p. 24). OERs encompass a variety of formats, including, but not limited to, complete courses, syllabi, lectures, assignments, textbooks, and other pedagogical resources (OER Commons, 2022).
Just as open data helps avoid duplication of research efforts by enabling the reuse of datasets, OERs prevent instructors from working in parallel on the creation of potentially similar pedagogical materials. The goal is that instructors can access materials created by others, remix, modify, and enrich them, and release them to the open community again. In this way, using OERs contributes to building a network of increasingly developed and updated resources, drawing on the contributions of many individuals. Moreover, as open resources available to all, OERs favor the democratization of knowledge and allow institutions and individuals to save considerable sums of money (Hilton et al., 2013; Ikahihifo et al., 2017). In recent years, OER advocates have created resources to assess the quality of OERs and to facilitate their creation and use (e.g., FORRT, Azevedo et al., 2022; TIPS framework, Kawachi, 2014). Importantly, studies have shown that the transition to the OER model is uncomplicated and leads to successful learning outcomes for students (e.g., da Silva et al., 2021).
An exciting example of OER implementation in psychology is Open Stats Lab (OSL), an initiative driven by Prof. Kevin McIntyre and funded through a grant from the APS Fund for Teaching and Public Understanding of Psychological Science (see McIntyre, 2017). Open Stats Lab is an open platform where users can download a set of articles published in Psychological Science, the datasets associated with those articles, activities designed to learn statistics by analyzing the data, and R guidance scripts, in addition to recommendations for students and instructors on how to use the platform. A related undertaking involves open textbooks made using technologies such as Jupyter and R Markdown notebooks, which permit the interweaving of narrative text, code, and figures, and have the ability to export static and dynamic output formats (see, for example, Neth, 2022; Rhoads & Gan, 2022). Some books even provide access to richer interactive educational experiences. For an example outside of psychology, the textbook Computational fluid dynamics: An open source approach (Vermeire et al., 2020), published by Concordia University, features myriad open-source technologies that can be run in the cloud to allow users to perform simulations and write their own code easily. Additionally, users can contribute to the development of the book, as it is available as a public repository on GitLab. Readers can interact with the authors by suggesting modifications or enriching the existing material.
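To give a flavor of how such notebooks interweave narrative, code, and figures, here is a minimal, hypothetical R Markdown lesson (the title and chunk contents are invented for illustration and make no claims about any existing textbook):

````markdown
---
title: "Sampling distributions: a hands-on lesson"
output: html_document
---

The histogram below is regenerated from the code every time the document is
knit, so students can change the sample size and immediately see the effect.

```{r sampling-demo}
# Means of 1,000 simulated samples of size 30 from a standard normal
means <- replicate(1000, mean(rnorm(30)))
hist(means, main = "Sampling distribution of the mean (n = 30)")
```
````

Rendering the file (e.g., with the rmarkdown package) produces a web page or PDF in which the figure is always consistent with the code that generated it.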
At a time when educational institutions are increasingly encouraging the creation and use of OERs (Ahammad, 2019; Santos-Hermosa et al., 2021; Thompson & Muir, 2020; Zaid & Alabi, 2021), some practical suggestions are in order. During the planning stage of a course, educators can search for existing OERs in open repositories. It is essential to note the license of each downloadable resource to assess the permissions for reuse. Similarly, it is advisable to cite the source for each derivative material to allow for the reconstruction of its history. Also, it is worth contacting the original author(s) to let them know that their resource has been reworked (they will likely be pleased to see their creation being used!). This could lead to new collaborations to assess the quality of the material produced or even create new materials.
Whenever educators create new pedagogical resources (e.g., syllabi, slides, assignments, reading guides), they can publish them openly, using a Creative Commons license, through institutional repositories or open repositories such as OSF, Zenodo, and GitHub. If sharing the materials right away is not viable (e.g., an institution might require that the materials be made public only after its students benefit from them), educators can still deposit them in open repositories by setting an embargo period, after which the resources become open. Whenever possible, it is preferable to allow users to interact with the creators to suggest modifications to the educational resources. The most straightforward choice is to allow users to leave comments on the resource, but more efficient alternatives include collaborating through GitHub or GitLab. Whether searching for or creating OERs, it is critical to share open resources as soon as they become available and accompany them with documentation that allows users to understand how they were produced (i.e., by whom, when, how, for what course, and at what institution).
Just as with other open science practices, adopting OERs is not an all-or-nothing concept but rather a goal to strive towards. For example, teachers who do not have control over the syllabus content (e.g., due to content standards in place) can nevertheless reuse and create open slides and assignments, and those who must use existing slides or assignments can still document their teaching strategies so that others can potentially replicate them.
Advanced Strategies: From Open Science User to Open Science Advocate
The above strategies allow researchers to begin their journey to opening up research practices. However, once they have transitioned to becoming users of open science practices, researchers may want to more explicitly encourage others to adopt the open way of doing research. The following five strategies support the task of open science advocacy by discussing concrete actions that researchers can begin to implement progressively.
Strategy 6: Collaborate with Others Using Open Tools
Transitioning to the open research model involves substantive changes in the ways of doing science. Lasting habits can develop thanks to deliberate, small, and motivation-driven changes in attitude (Verplanken & Orbell, 2022), so it is a good idea to incorporate open tools into personal daily workflows (see Strategy 1). However, a good part of what researchers do is collaborate with others, and changes in personal workflows may not reach beyond the local level. In this section, we want to argue that peer communication and collaboration provide a unique opportunity to rethink and overcome deeply embedded closed practices and take the adoption of open science practices to a new level. Teamwork helps sustain efforts over time to achieve common goals, fosters a sense of affiliation to a larger project, and provides a social framework that eases personal and professional insecurities (see Salas et al., 2004). Researchers can leverage these benefits to foster the implementation of open science practices.
How to get started? Researchers can attempt to collaborate in the writing of an article using R Markdown, the format for creating dynamic documents in R that we introduced in the previous section. Although an important advantage of this format is that it integrates both code and text into the same document, thereby improving the reproducibility of analyses, researchers need not include code to get started with this tool. It should be noted that although using R Markdown to write manuscripts used to be somewhat challenging, there are now packages that help researchers with the task. The papaja package, for example, helps users seamlessly export submission-ready manuscripts conforming to the American Psychological Association (APA) guidelines, which is of great help to researchers in the field of psychology (Aust & Barth, 2022). Researchers already familiar with R Markdown can also capitalize on collaborations to learn how to use new technologies. Quarto, for instance, is a recently developed open-source scientific and technical publishing system sponsored by RStudio that supports executable R, Python, and Julia code blocks within Markdown, as well as Observable JS for interactive data exploration and analysis (Allaire et al., 2022). Quarto eases the implementation of reproducible research and publications and conveniently allows users to export their documents in a plethora of formats.
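As a rough sketch of what this can look like in practice, the skeleton below is a minimal Quarto document (the title, author names, and analysis are placeholders we invented for illustration); collaborators edit the same plain-text file, and rendering it regenerates both the narrative and the embedded results:

````markdown
---
title: "A collaboratively written manuscript"
author:
  - "First Author"
  - "Second Author"
format: pdf   # html and docx output also work
---

# Introduction

Narrative text and analysis code live in the same version-controlled file,
so reported numbers are regenerated whenever the document is rendered.

```{r}
#| label: descriptives
#| echo: false
summary(mtcars$mpg)   # built-in example dataset stands in for real data
```
````

Running quarto render on the file produces the chosen output format; papaja plays an analogous role for teams that prefer R Markdown and need APA-formatted manuscripts.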
Other possible practices that can be incorporated are keeping a version history of research-related files using GitHub or other open repositories, or sharing research data with team members using open formats (e.g., CSV or JSON formats) instead of proprietary ones (e.g., Excel or SPSS formats) that are not widely interoperable and do not guarantee long-term data preservation. The main idea is that collaborative work should increasingly happen through open tools to make their use natural. Of course, in the early stages of building habits, incorporating open tools may involve a steep learning curve (e.g., Lasser, 2020). Moreover, certain types of functionality may be less available in open tools, which should not be overlooked. For example, R can be extended via a remarkable 18,861 packages currently available on the Comprehensive R Archive Network (CRAN). However, users trained with more traditional software such as SPSS may miss the ability to interact through an uncomplicated GUI (graphical user interface) when working with R, which mainly relies on typed commands. We offer two solutions to this dilemma. First, with regard to R, users can ease the transition by using JASP (JASP Team, 2022), an open-source program with an SPSS-like graphical interface that runs its analyses in R and supports copy-and-paste APA-style tables. Second, users can consider how to optimize their timing when adopting open-source software. As a case in point, it is advisable for a student who has to get started with statistical software to opt directly for open software. Similarly, it may be better for established researchers to switch to open-source software when starting a new project rather than changing tools midstream.
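The data-format point is easy to put into practice; the base-R sketch below (with a made-up three-row dataset standing in for real study data) shows how little is needed to share analysis-ready data as plain text:

```r
# A small, made-up dataset standing in for real study data
trial_data <- data.frame(
  participant = 1:3,
  condition   = c("control", "treatment", "treatment"),
  score       = c(12.5, 14.1, 13.8)
)

# CSV is plain text: any current or future software can read it
write.csv(trial_data, "trial_data.csv", row.names = FALSE)

# Reading the file back requires no proprietary tools or licenses
shared <- read.csv("trial_data.csv")
str(shared)
```

Pairing such a file with a short data dictionary (what each column means and how it was measured) goes a long way toward making the dataset reusable by collaborators and outside researchers alike.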
Despite their limitations, open tools are being updated and improved all the time by their supporting communities, which will progressively lessen the issue. At the same time, using open tools will often result in a more manageable collaboration. For example, centralizing research-related documents in an open repository makes it easier for new collaborators to catch up with how the project is progressing. It also eliminates the constant need to send e-mail attachments and allows multiple collaborators to work on a document simultaneously while ensuring that version history is handled efficiently (e.g., avoiding accidental deletions or overwriting other people’s contributions). Concurrently, collaborating through open tools has the potential to place researchers at the forefront of the shift toward openness. For instance, growing expertise might position the team or laboratory as an institutional reference for open science practices. Individuals can become open science champions who inspire other researchers (Seibold, 2020). These labs and individuals lead others by example.
Strategy 7: Develop Networks of Open Collaboration
Once researchers have incorporated open science practices into their daily workflows, they can seek to establish or join collaborative networks with other labs or teams that advocate for open research and work on related topics. Big Team Science has advantages for research, as it allows investigators to access more resources, work with greater sample sizes, take advantage of the expertise of a larger team of researchers in areas such as data analysis, and distribute work more efficiently (see Coles et al., 2022). All of this helps to overcome current challenges related to the replicability, reproducibility, and generalizability of research findings, which are of particular concern in psychology (see Byers‐Heinlein et al., 2021; Forscher et al., 2020). ManyBabies, which we discussed in a previous section, is one example of Big Team Science.
Big Team Science ventures are becoming more and more prevalent today. One notable example is the Psychological Science Accelerator (Moshontz et al., 2018), a globally distributed network of laboratories which as of November 15, 2022 has 1,328 members working in 84 countries (see https://psysciacc.org/). The accelerator is investigating major issues such as the mental simulation of object orientation, the gendered nature of prejudice, and how situational factors shape moral judgements (Bago et al., 2022; Chen et al., 2018; Phills et al., 2021).
Participating in these large-scale projects is not easy and calls on researchers to develop specific standards and guidelines that ensure effective communication among collaborators and enable projects to develop coherently and cohesively. For example, a successful collaboration will require shared ways to wrangle, analyze, and visualize data, care in commenting and proofreading code, and high-quality accompanying documentation. In addition, such standards will likely extend to the use of open repositories for file sharing (e.g., GitHub) and open formats (e.g., CSV datasets). This makes Big Team Science projects outstanding venues for training researchers in open science practices. Our recommendation for those who have begun to explore the ecosystem of open practices and tools and feel comfortable with them (or just want to learn more) is to contact existing consortia to assess the feasibility of participating. That could lead, eventually, to initiating a new project within an existing consortium, or even developing a new consortium. Participation in large-scale projects is a productive way to spark team members’ interest in both the research itself and open science practices while collaborating to achieve more robust research findings.
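To make the idea of shared standards more tangible, here is a sketch of the kind of small, documented helper function a consortium might agree to use across labs (the function name, column name, and cutoffs are entirely hypothetical):

```r
#' Exclude trials with implausible reaction times.
#'
#' @param data  Data frame with a numeric `rt` column, in milliseconds.
#' @param lower Minimum plausible reaction time.
#' @param upper Maximum plausible reaction time.
#' @return The data frame with out-of-range trials removed.
exclude_rt_outliers <- function(data, lower = 200, upper = 3000) {
  stopifnot(is.data.frame(data), "rt" %in% names(data))
  data[data$rt >= lower & data$rt <= upper, , drop = FALSE]
}

# Toy example: the 150 ms and 4200 ms trials are dropped
trials <- data.frame(lab = "lab_01", rt = c(150, 480, 950, 4200))
exclude_rt_outliers(trials)
```

Agreeing on, documenting, and versioning such helpers in a shared repository means every lab applies exactly the same exclusion logic, which is precisely the kind of consistency that large collaborations depend on.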
Strategy 8: Voice Your Opinion
Communities that have spent some time discussing open science policies will probably have reached some consensus and have interesting ideas to share with others. Where this is the case, one consolidating project that communities may wish to pursue is broadcasting their ideas as articles, testimonials, reports, glossaries, or declarations of support for open science. Such documents can outline the principles, values, and considerations that proved most relevant during the debates through which community standards were formed. Some examples are recent papers reflecting on the benefits of open science (Dal Ben et al., 2022; Engzell & Rohrer, 2021), The Beijing declaration on research data (CODATA - Committee on Data of the International Science Council et al., 2019), The Lindau guidelines, a project spearheaded by Nobel Prize winner Elizabeth Blackburn (Council for the Lindau Nobel Laureate Meetings/Foundation Lindau Nobel Laureate Meetings, 2020), and the community-sourced glossary of open scholarship terms developed by FORRT, the Framework for Open and Reproducible Research Teaching (Parsons et al., 2022).
Roadmaps are another format worth exploring. These are documents that often identify challenges, possible strategies for overcoming them, and recommendations. These documents are usually more practically oriented than manifestos and declarations of support, and are organized according to well-defined timeframes. Good examples are the open science roadmap produced by LIBER—Europe’s Research Libraries Network—for the period 2018-2022 (Ayris et al., 2018) and the roadmap produced by the Government of Canada (Office of the Chief Science Advisor of Canada, 2020), which lists a series of concrete milestones that would enable a phased transition to open science at the institutional level for research funded by government departments and agencies by 2025.
One benefit of these documents is that they effectively bring to light nuances in how different communities interpret and implement open science. For example, the notion of open access and the need to support it are widespread and reiterated in different documents. However, there are variations in its implementation in different regions and institutions that bear consideration if the aim is to grasp the bigger picture (see Becerril-García, 2022 on open access in Latin America versus other regions). At the same time, it should be remembered that open science is a growing cultural movement. All contributions add value to the overall effort to transform research practices. Finally, it is not only the contents of the documents that are important: it is also the act of publication itself. Discussing public policy and making the outcome of the discussion public are transformative acts for communities (i.e., a set of people and institutions with common interests), for they allow them to become political players in the process of building open science and society.
Strategy 9: Rethink and Promote Changes in the Assessment of Scholarly Production
Science benefits from open practices. However, current incentive and reward structures typically reward the quantity and prestige of research outputs such as grants and peer-reviewed publications, with limited consideration of open science activities. Without an incentive and reward structure that recognizes and values researchers’ engagement with open science, their efforts are bound to be side projects sparked by curiosity or ethical convictions. If open science is to become a standard for research, then its presence in the curricula vitae of academics must become a requirement and not just a nice addition. Adoption of open science practices should be considered in processes such as hiring, tenure, promotion, and the granting of student fellowships and awards.
Public declarations and recommendations have pointed out the problem with current incentive structures. The San Francisco Declaration on Research Assessment (DORA, 2012; Schmid, 2017), for instance, promotes academic assessment practices that take into account a myriad of scientific outputs (e.g., articles, open datasets, open software, open peer reviews) and cautions against the misuse of the impact factor as a proxy for assessing the quality of publications, which skews the dynamics of knowledge production. For example, researchers working at institutions that continue to focus on the impact factor are discouraged from publishing in some types of open access journals (even if they adhere to the open science framework), because some do not have an impact factor.
Despite efforts such as the DORA, traditional academic assessment practices are still quite impermeable to change. For example, a recent study that analyzed 305 job advertisements for academic positions in 91 institutions found that only 2 of them mentioned open science (Khan et al., 2022). Relatedly, in 2019, the European University Association published the results of a survey of 260 universities in 32 European countries about research assessment and open science. Results show that European universities recognize the importance of modifying assessment practices but consider that the complexity of this modification is a barrier to attempting its implementation (Saenen et al., 2019, pp. 31–32).
Researchers can start making a difference right away. Sharing their concerns about academic evaluation with colleagues and including open science wording in job postings are great starting points (for some examples of academic job offers mentioning open science, see Schönbrodt et al., 2022). When participating in hiring and promotion committees, they can also point out candidates’ commitment to open science, consider scholarly outputs such as preprints and datasets to be as important as articles, and assess the quality of the candidate’s articles by the soundness of their contribution instead of by the metrics of the journals where they are published. They can also get involved in more formal discussions on academic assessment (Susi et al., 2022). A clear example of this is Project TARA, currently under development by the DORA, which seeks to build a toolkit of resources informed by the academic community to support the improvement of evaluation policies and practices (DORA, 2021). All these efforts support the greater goal of putting the issue of academic assessment on the agenda and sustaining awareness so that more and more people can join the discussion.
Strategy 10: Create Opportunities for People to Specialize in Open Science
The full implementation of open science will only happen through sustained efforts in each area of practice (i.e., open access, open data, open educational resources, among others). Many open community initiatives depend on and have been successful thanks to the altruistic contributions of researchers. However, this is not optimal if the goal is to make open science a global standard for scientific research. The spirit of open science—that the processes and outputs of scientific practice be freely available and accessible to all—should not be conflated with the nature of the efforts leading to it, which, like all other forms of work, deserve to be appropriately rewarded.
In this sense, it is paramount that those who have or can obtain funding and are in a position to hire personnel create job opportunities specifically related to open science. This recommendation is important considering that the people most likely to implement open science are early career researchers (e.g., Allen & Mehler, 2019; Farnham et al., 2017) and the people most likely to have such institutional power or access to funding are not. Possibilities to support open science include creating grants to organize and attend open science events, offering PhD scholarships and research assistantships for open science research, hiring research managers with an open science profile, and creating postdoctoral positions dealing specifically with open science issues. These roles can assist research teams and hiring institutions in transitioning to open science and amplifying the influence of the open research model in every discipline. At the same time, and perhaps more importantly, these jobs will promote the perception of open science as an area of expertise in its own right—just as important as any other area—and cultivate interest in open science over the long term.
We are aware that this may not always be possible. For instance, a faculty member might lack the funding or authority to hire a full-time open science specialist. Where this is the case, a good side strategy is to record and report to the administration the time employees spend on open science practices, and to communicate the benefits of these practices. This helps to clarify that open science practices are valuable and to signal the effort involved in implementing them.
To Wrap Up
Adopting open science practices is imperative because of both their scientific and social benefits. The current context increasingly favors the transition to open science, thanks to the hard work of many advocates, supported by open science mandates from governments and agencies. As a result, researchers have a unique opportunity to embrace a fairer and more effective way to do science. However, this process challenges researchers because of its complexity and the many behavioral and ideological changes involved.
In this paper, we provided ten concrete strategies through which researchers can begin or further their transition to open science. Strategies 1–5 are practical steps to becoming an open science user. Meanwhile, strategies 6–10 are aimed at those in a position to advocate for open science, amplify its reach, and facilitate its adoption. Taken together, the strategies are comprehensive in that they touch on different levels of analysis (i.e., procedural, social, ideological, political, and economic). In addition, we included selected references for each strategy that researchers can consult to dive deeper into each topic.
Open science has the potential to make scientific research more robust, sustainable, reproducible, and replicable. It also helps break down barriers that prevent the free flow of knowledge. As noted elsewhere, open science is simply “science done right” (Imming & Tennant, 2018). It should therefore be a priority for all those involved in scientific research. However, there is still a long way to go before open science becomes the by-design and by-default model for scientific research. Researchers can become agents of change by modifying everyday workflows to foster open science practices and supporting others to make changes towards openness. Hopefully, the strategies we have outlined here will help researchers achieve this goal and build a better science.
Author Contributions
Contributed to conception: NA, KBH. Drafted and revised the article: NA, KBH. Approved the submitted version for publication: NA, KBH.
Competing Interests
The authors have no known conflict of interest to disclose.
Funding
This work was supported by a Concordia Horizon Postdoctoral Fellowship held by NA, and a grant from the Social Sciences and Humanities Research Council of Canada to KBH (Grant #890-2020-0059). KBH holds the Concordia University Research Chair in Bilingualism and Open Science.