Ever-increasing anthropogenic greenhouse gas emissions narrow the timeframe for humanity to mitigate the climate crisis. Scientific research activities are resource-demanding and consequently contribute to climate change; at the same time, scientists play a central role in advancing knowledge, including on climate-related topics. In this opinion piece, we discuss (1) how open science – adopted on an individual as well as on a systemic level – can contribute to making research more environmentally friendly, and (2) how open science practices can make research activities more efficient and thereby foster scientific progress and solutions to the climate crisis. While many building blocks are already at hand, systemic changes are necessary to create academic environments that support open science practices and encourage scientists from all fields to become more carbon-conscious, ultimately contributing to a sustainable future.

The Earth is subject to a dramatic shift in climate dynamics, which threatens the existence of countless lifeforms, including humans. These ongoing changes will have a lasting impact, including on future generations. There is scientific consensus that this planetary health crisis is caused by human action (Whitmee et al., 2015) leading to ecological depletion and rising greenhouse gas emissions (Parmesan & Yohe, 2003; Rogelj et al., 2018; Thomas et al., 2004), and consequently to a heating planet. According to the Intergovernmental Panel on Climate Change (IPCC), there are between 7 and 25 years left (as of January 2022) before the limit of 1.5°C of global warming is reached (see Table 2.2 in Rogelj et al. (2018) and reports by the Mercator Research Institute estimating a remaining budget of 280–1030 Gt CO2: https://web.archive.org/web/20220122093931/https://www.mcc-berlin.net/en/research/co2-budget.html), while a 2°C increase is estimated to be more likely (IPCC, 2021, B.1), which would miss the goal set by many nations in the Paris Agreement in 2015.
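To make this timeframe concrete, a back-of-the-envelope calculation (our own illustration, assuming annual global emissions of roughly 42 Gt CO2, the figure underlying the Mercator carbon clock) divides the remaining budget by yearly emissions:

$$\frac{280\ \mathrm{Gt\ CO_2}}{42\ \mathrm{Gt\ CO_2/yr}} \approx 7\ \mathrm{years}, \qquad \frac{1030\ \mathrm{Gt\ CO_2}}{42\ \mathrm{Gt\ CO_2/yr}} \approx 25\ \mathrm{years}.$$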

Scientific endeavors also contribute to climate change, through high travel demands, the high energy consumption of experimental machines and computers, costly substances, and non-reusable materials. For instance, the average astronomer based in Germany was estimated to emit 18 t of CO2 equivalents (tCO2e) per year (Jahnke et al., 2020), and the emissions of one PhD project were estimated at 21.5 tCO2e (Achten et al., 2013), about two to three times more than the average EU citizen emits in one year (8.4 tCO2e per year¹). While (air) travel is a major contributor (about 50%), emissions also stem from the energy expenditure of scientific methods (e.g., high performance computing), the size of and access to acquired datasets, and the consumption of research materials. Since estimated emissions depend on the research methodologies employed in a particular field, additional CO2 assessment and monitoring are needed to determine whether these numbers also apply to other fields. A slowly growing number of open source tools for CO2 assessment is being developed by bottom-up initiatives (Anthony et al., 2020; Lannelongue et al., 2021; Mariette et al., 2021; Schmidt et al., 2021), for example to assess the environmental impact of large-scale research projects like the GRAND project (Aujoux et al., 2021). The climate crisis urges the scientific community to become more sustainable on an individual and systemic level, which requires carbon-conscious scientists, the development of low-carbon labs and projects, and a reformation of academic incentive structures (Fardet et al., 2020; Rosen, 2017).
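To illustrate how lightweight such assessment can be, the following minimal sketch wraps a computation with the open source CodeCarbon tracker (Schmidt et al., 2021); the project name and the workload are hypothetical placeholders standing in for a real analysis:

```python
# Minimal sketch: estimating the carbon footprint of a computation with
# the open source CodeCarbon package (Schmidt et al., 2021).
# The project name and the workload below are hypothetical placeholders.
from codecarbon import EmissionsTracker

tracker = EmissionsTracker(project_name="my_analysis")
tracker.start()
try:
    result = sum(i * i for i in range(10_000_000))  # stand-in for a real analysis
finally:
    emissions_kg = tracker.stop()  # estimated emissions in kg CO2-equivalents

print(f"Estimated emissions: {emissions_kg:.6f} kg CO2e")
```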

A promising approach to making science more sustainable is open science. Open science practices have gained momentum in the last decade, owing to an increasing interest in the replicability and reproducibility of scientific findings (triggered by the “reproducibility crisis”; Baker, 2016; Ioannidis, 2005) as well as in the accessibility of scientific data, methods, and findings. Open science² encompasses many different practices that aim to make science more open, accessible, transparent, reproducible, rigorous, and diverse (see this taxonomy of open science: https://web.archive.org/web/20220203165032/https://www.fosteropenscience.eu/taxonomy/term/100).

In this opinion paper, we discuss how some of these open science practices can help to reduce resource demands in research, thus making scientific activities more environmentally friendly. Moreover, we argue that open science practices, by being more sustainable, can also be more efficient and therefore foster progress in scientific research, which can ultimately accelerate the discovery of solutions to the climate crisis.³ This includes progress in climate tracking and modeling, advances in the fields of (bio-)engineering, energy production, sustainable manufacturing, agriculture and land management, carbon capture, biodiversity strategies, city planning, behavioral change, nudging, and many more. That open science can benefit innovative problem solving and societal developments during crises was exemplified in the COVID-19 pandemic, when open science practices were readily implemented and rapidly spurred knowledge about the disease, the course of the pandemic, and vaccine development (Fraser et al., 2021; Hörmann et al., 2020; Zastrow, 2020). We believe that a similar and even stronger open exchange of knowledge is imperative to face the ongoing challenges of the climate crisis.

Previously, several arguments have been put forward to make the case for open science: the logical, the ethical, and the selfish argument. The logical argument holds that open science practices can help to increase overall research quality and credibility (Strech et al., 2020). The ethical argument prompts responsible work with living beings, not only valuing animal welfare but also making research maximally transparent by adhering to open science principles (Strech & Dirnagl, 2019). The selfish argument outlines the benefits for the individual researcher of engaging in open science, including a lowered risk of failure, improved coherence in writing and reviewing manuscripts, and enhanced continuity and reputation of one’s own work (Markowetz, 2015). We additionally propose the sustainable argument for open science, to highlight the motive of adopting and supporting open science practices in order to save resources. We use sustainability as a multifaceted term of wide applicability, referring to the saving of human, monetary, and, most of all, natural resources.

In the following, we discuss how open science (1) can help make research practices more environmentally friendly and (2) can foster scientific progress, and thus solutions to the climate crisis, by increasing the efficiency of the academic endeavor. We argue that all scientific stakeholders and decision-makers, operating on the individual as well as the systemic level, are needed to create enabling environments for open and sustainable research.

Opening Up the Research Process

Study Design and Data Collection

Many studies waste resources by design: either redundant or excessive data are acquired that end up not being analyzed, or the sample size does not suffice to draw reliable conclusions from the study results (Button et al., 2013; Szucs & Ioannidis, 2020). That is, statistically underpowered studies are hardly informative and can even be misleading. More sustainable research designs and forms of data collection can be achieved with open science practices:

(1) Preregistrations and Registered Reports prompt researchers to plan data collection and analysis more carefully by including precise a priori hypotheses. Under both practices, researchers define and publish the design and analysis pipeline of a study before data are collected and analyzed.⁴ This can increase the quality and efficiency of scientific workflows by counteracting unreported (intentional or unintentional) p-hacking (i.e., analyzing data until there is a statistically significant result) and HARKing (i.e., hypothesizing after results are known; John et al., 2012), and by investing only carefully planned resources. Moreover, preregistration increases transparency, which improves both verifiability and reproducibility. Preregistrations are mostly used in planned, confirmatory research, but can also make exploratory data analysis more transparent (Dirnagl, 2020). Registered Reports, which are peer-reviewed preregistrations that are published regardless of the results (if the researchers adhere to their predefined plans), usually demand a power analysis. Power analyses deliver statistical estimates of how much data is required for testing hypothesized effects, reducing the risk that studies are run with sample sizes that are too small (or needlessly large); a minimal sketch follows this list.

(2) To increase statistical power by obtaining larger sample sizes, scientists can turn to crowd science methods (see the Open Science for the Public section) or engage in multi-lab collaborations, including megastudies (Milkman et al., 2021), for example in research consortia like the Psychological Science Accelerator (https://web.archive.org/web/20220204122937/https://psysciacc.org/). Another example comes from neuroimaging research, where 70 teams were invited to analyze the same dataset (Botvinik-Nezer et al., 2020). Because neuroimaging analysis pipelines are not standardized and are employed very differently by individual research groups, this project revealed high methodological variability, which was also reflected in the results. A similar project is currently being carried out for research with electroencephalography (EEG; EEGManyPipelines: https://web.archive.org/web/20220204123123/https://www.eegmanypipelines.org/). By providing research ideas, resources, and methods, and by connecting research teams, such platforms and collaborative efforts have the potential to accumulate and streamline research and make it less redundant, while increasing reliability, transparency, and generalizability.

(3) Lastly, meta-analyses pool the findings of previous studies to systematically assess the overall evidence for an effect and help unveil publication bias (Simonsohn et al., 2014; see the Publishing Null Findings section). A special form is the community-augmented meta-analysis (Tsuji et al., 2014; examples can be found, e.g., on MetaLab: https://web.archive.org/web/20220204123258/https://langcog.github.io/metalab/): an open platform (repository) that can be continuously updated as “living evidence” (Elliott et al., 2021), making meta-analyses more sustainable. Even though meta-analyses might not be thought of as a prototypical open science practice, they benefit from open science practices such as data sharing and open code. By design they are resource-friendly, avoiding energy-demanding data collection, and they can increase the value of underpowered studies (for examples from the field of Developmental Psychology, see Bergmann et al., 2018).
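As a minimal sketch of the power analysis mentioned in point (1), the following Python snippet uses the statsmodels library to estimate the sample size needed for a two-group comparison; the effect size, significance level, and power values are illustrative assumptions, not recommendations:

```python
# Minimal power-analysis sketch: how many participants per group does a
# two-sample t-test need to detect an assumed effect?
# All parameter values below are illustrative assumptions.
from statsmodels.stats.power import TTestIndPower

n_per_group = TTestIndPower().solve_power(
    effect_size=0.5,  # assumed standardized mean difference (Cohen's d)
    alpha=0.05,       # significance level
    power=0.8,        # desired probability of detecting the effect
)
print(f"Required sample size per group: {n_per_group:.0f}")  # roughly 64
```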

In spite of these benefits for the efficient use of resources, individual researchers may experience drawbacks that should be weighed against the collective benefits of open science practices. For instance, the long planning periods of Registered Reports can be perceived as blocking progress, which weighs especially heavily on early career researchers, who are often employed on temporary contracts. Being more transparent in general can also be perceived as a risk, for instance for future career steps, since transparency comes with a higher visibility of errors (Allen & Mehler, 2019). It may also pay off only in the long run, and potentially not directly for the individual. For the larger community, however, transparency improves the quality of research (Allen & Mehler, 2019) and, as the arguments above outline, decreases the risk of wasted human, monetary, and natural resources. Open science practices for study design and data collection should therefore be enabled and supported by scientific decision-makers. Importantly, the best way to avoid negative consequences for early career researchers is to align the academic incentive structure such that employing open science practices becomes beneficial for research careers. Moreover, since public research is usually not governed by economic interests, scientific decision-makers have the possibility to prioritize sustainable endeavors and potentially save costs in the long run.

Taken together, well-planned and adequately powered study designs as well as meta-analyses are invaluable assets for sustainable scientific progress in general and can play a crucial role in climate-related research in particular, all while reducing the waste of resources in research.

Open Data, Open Code, Open Source

In classical research practice, scientists summarize their findings in highly condensed and often idealized journal publications without providing access to data, analysis code, or the initial analysis plan. This is problematic in two ways: first, the science behind the publication remains largely inaccessible to the public, and second, the replication of research studies and their methodological aspects is hindered. Because replication studies, which are vital procedures for validating research findings, already entail an additional investment of resources, it is important to conduct them maximally efficiently. When information on previous studies is restricted, replication attempts can be based on diverging assumptions and will consequently cost resources for piloting data collection as well as for re-implementing and optimizing analysis workflows. As discussed above, differences in data analysis may lead to differences in results (Botvinik-Nezer et al., 2020), which makes transparency in the reporting of data collection and analysis essential. This hindrance of scientific exchange and transparency thus leads to the waste of personal and natural resources. Instead, making data, code, and software openly available can improve the transparency, replicability, and sustainability of research.

Open data allows other researchers to conduct replication analyses and/or analyses addressing research questions different from the initial one, thereby saving (scientific) resources. Open datasets like the UK Biobank, Many Labs 1+2+3 (Klein et al., 2014; Stroebe, 2019; http://web.archive.org/web/20220419093519/https://osf.io/wx7ck/), and FACES (a full list of open datasets for psychological research: http://web.archive.org/web/20220203164620/https://docs.google.com/spreadsheets/d/1ejOJTNTL5ApCuGTUciV0REEEAqvhI2Rd2FCoj7afops/edit), as well as data sharing platforms such as OpenNeuro (https://web.archive.org/web/20220204123347/https://openneuro.org/) with more than 500 neuroimaging studies, or the Copernicus Climate Data Store (https://web.archive.org/web/20220203163413/https://cds.climate.copernicus.eu/), host data that is ready to be used and analyzed. Publishing open code and analysis protocols, together with detailed information on methodology, ensures that results can be reproduced and replicated and that the study can be taken into account for meta-analyses. Open data requires setting community standards, including naming conventions and file formats (Eggleton & Winfield, 2020). In neuroscience, there is an ongoing community-wide effort to standardize the organization of research data for better accessibility, spanning various imaging modalities (Brain Imaging Data Structure, BIDS; Gorgolewski et al., 2016; Pernet et al., 2019). Setting these standards requires widespread acceptance and adoption; however, it eventually benefits the community and its research as a whole, making it an expense with a high sustainable return. To support this endeavor, the FAIR Guiding Principles formalize scientific data management, with the main goal of organizing data in such a way that they are Findable, Accessible, Interoperable, and Reusable (Wilkinson et al., 2016). Further, open data increases the accessibility of research and may therefore help streamline climate action across governments and societies (Grinspan & Worker, 2021).
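As an illustration of why such standards pay off, the sketch below queries a BIDS-formatted dataset with the open source pybids library; the dataset path is a hypothetical placeholder, and any BIDS-compliant dataset could be substituted:

```python
# Minimal sketch: exploring a BIDS-organized neuroimaging dataset.
# Because the layout is standardized, the same few lines work for any
# BIDS dataset; the path below is a hypothetical placeholder.
from bids import BIDSLayout

layout = BIDSLayout("/data/my_bids_dataset")  # index the dataset
subjects = layout.get_subjects()              # e.g., ['01', '02', ...]
t1w_files = layout.get(suffix="T1w", extension=".nii.gz")  # all T1-weighted scans
print(f"{len(subjects)} subjects, {len(t1w_files)} T1w images")
```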

Open code means sharing code in repositories (e.g., on GitLab or GitHub), as code snippets in forums (e.g., on Stack Overflow), or as blog posts. This practice saves personal resources by reducing redundant coding time, advances community-based software, and boosts progress in many scientific fields and industries while transcending regional boundaries of knowledge exchange.

Open source comprises free and open source software (FOSS), such as R and Python, which can be used without restriction and without paying license fees. FOSS is extended further by community efforts and, through that process, can be developed to industry standards. Using FOSS enables other scientists to review code, preventing bugs from staying hidden for years (e.g., as reported for the neuroimaging field in Eklund et al., 2016). Moreover, major scientific initiatives, like the Human Connectome Project in the neurosciences, CERN in the physical sciences, and Pangeo in the geosciences, have already shown that open source is well suited for large-scale collaborative projects. Both open code and open source can confront individual researchers with software of low quality. At the same time, open issue management, transparent changes, and version control promise to increase coding literacy among researchers, establish new standards and conventions, and lower the risk of hidden mistakes, ultimately facilitating reproducible research.

Open data, open code, and open source require large computing and cloud services with high energy costs. However, computing centers hosting such services are usually highly optimized and energy efficient compared to individual local computing resources. Moreover, there are further endeavors towards more energy-efficient data centers and sustainable hardware procurement and software use, aiming for “digital sustainability” (George et al., 2020). Finally, we expect that the above-listed benefits of sharing data openly outweigh the costs of running data centers and cloud services.

To improve science and its environmental impact, individual researchers and large-scale projects should implement open data, open code, and open source to reduce resource use and further increase the efficiency of scientific research.

Publishing Process and Formats

Open Access and Open Peer Reviewing

Although scientific research is mainly funded by public money with the aim of broadening knowledge and eventually serving the public welfare, depending on the field about 70% of published articles remain behind the paywalls of scientific publishing houses (Day et al., 2020). In 2017, the annual global budget for research and development was estimated at about $2.3 trillion, supporting 8 million researchers and 2 million published scientific articles (Markram, 2017), which implies an overwhelming 1.4 million articles per year behind paywalls. Access fees are hardly affordable for interested individuals and for research institutions with limited funding. While this is problematic for science in general, a reformation of publishing formats and the publishing process becomes most pressing in light of the climate crisis: access to all previous literature is essential to conduct well-informed studies, to contribute effectively to specific research fields, and to improve public literacy, for instance on climate change.

A route to more sustainable publishing is open access, a set of principles for making research findings retrievable online and free of charge for the reader. Open access can be granted on various levels: in the gold model, publishers make articles and accompanying material openly available, and the article processing charge (APC) is usually covered by the authors or their associated institutions. The green model allows authors to self-archive and post their work online, for example on their website or on preprint or postprint servers (see below). Finally, the platinum or diamond model is used by journals that grant full open access without APCs. While publishers in this case need to find alternative financing models (e.g., via grants), diamond models are important for research groups that aim to make their research open access but lack the funds to pay APCs. Indeed, a common critique of open access publishing is the high costs that have to be paid by the authors, which limits APC-based open access publishing to financially privileged research institutions.

In recent years, open publishing beyond traditional publishing houses has rapidly expanded with the advent of open preprint servers (Bourne et al., 2017), to which authors can submit their manuscript before it has been formally peer reviewed and/or accepted for publication in a journal. Research becomes immediately available through preprints, allowing others to build upon its results, the benefits of which became clear during the COVID-19 pandemic (Fraser et al., 2021). Preprint servers allow sharing work via social media and often have a built-in commenting function for public feedback. Importantly, it should be clearly stated that the research in a preprint has not yet been peer reviewed. Lacking this form of quality assessment, preprints may have higher rates of errors and scientific shortcomings. However, if reviews of any form are already available, they can be linked to and displayed alongside the preprint, even after journal publication, increasing the verifiability of the research.

The common peer review process for journal publications is often a concealed, subjective and finite procedure in which a handful of people give anonymous feedback. In contrast, open review processes use transparent and interactive commenting and rating functions for both the research work and the reviewers’ feedback itself. In this way, reviews can evolve with time and thus enrich publications and the interpretation of their results continuously (an example is the OpenReview platform: https://web.archive.org/web/20220204123850/https://openreview.net/). In this process, reviewers are not selected by editors, which arguably could lead to a decline in quality of reviews. However, reviews are not anonymous, and therefore more constructive feedback can be expected, while comments on reviews themselves can further increase the quality of the assessment.

In summary, despite potential additional costs for research institutions, open access and open review save resources and stimulate international collaborations. For scientists, as well as for the general public, open access and transparent reviewing are crucial for high quality research. Moreover, better access to scientific literature also improves our understanding of climate-related challenges and helps to find respective solutions.

Publishing Null Findings

Another problem in the publishing process that impedes the validity and reliability of research is publication bias, also called the file-drawer problem. Publication bias arises when the decision to publish depends on whether a study reports a statistically significant finding, so that null results (i.e., results that do not reach the preset significance level) remain unpublished. This systemic imbalance dissipates valuable resources (public funds, time, materials) within a research community on topics that may have been researched already but ended up in the file drawer. Moreover, publication bias leads to a skewed representation of scientific knowledge (Murad et al., 2018). Ultimately, this asymmetry of evidence hinders scientific advancement, which is of particular relevance for pressing topics such as the climate crisis.
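The distortion can be made tangible with a small simulation (our own illustration under assumed parameters, not an analysis from the cited literature): many identical studies of a small true effect are generated, and the average effect size across all studies is compared with the average across only the statistically significant, "publishable" ones:

```python
# Illustrative simulation: publication bias inflates reported effects.
# All parameters (true effect, sample size, number of studies) are
# assumptions chosen for demonstration only.
import numpy as np
from scipy import stats

rng = np.random.default_rng(42)
true_effect, n_per_group, n_studies = 0.2, 30, 10_000

published = []    # effects from studies with p < .05 (the file-drawer survivors)
all_effects = []  # effects from every simulated study

for _ in range(n_studies):
    treatment = rng.normal(true_effect, 1.0, n_per_group)
    control = rng.normal(0.0, 1.0, n_per_group)
    t, p = stats.ttest_ind(treatment, control)
    observed = treatment.mean() - control.mean()
    all_effects.append(observed)
    if p < 0.05 and t > 0:
        published.append(observed)

print(f"True effect: {true_effect}")
print(f"Mean effect, all studies:        {np.mean(all_effects):.2f}")  # close to 0.2
print(f"Mean effect, 'published' subset: {np.mean(published):.2f}")    # markedly inflated
```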

Publication bias can be mitigated with the help of several open science practices. For instance, publishing preprints disseminates research prior to peer review and is not bound to editorial choices; thereby, studies with null findings that might not otherwise find an outlet become openly accessible. However, the missing quality assessment of preprints comes at its own price, as argued above, and publishing null findings only as a preprint misses the scientific endorsement the authors would receive through journal publication. A solution to this, and another tool against publication bias, are Registered Reports (see the Study Design and Data Collection section), which are published regardless of their findings. Journals, too, can play an important role in fighting publication bias by featuring null results if the submitted work has a robust and credible methodology, for instance if it is well-grounded in theory (van Rooij & Baggio, 2021) and has open data and/or open code, possibly combined with preregistration. It can be argued that publishing any result could lead to an inflation of research papers, meaning more time and effort for scientists to screen and review prior literature. However, the unsustainable consequences of skewed knowledge bases and unnecessarily re-run experiments come at a higher cost than screening additional literature.

Reducing publication bias, while minimizing the incentive to polish results, will provide a more objective perspective on the current state of research. This is fundamental for tackling epistemically well-grounded and informative research questions and for avoiding the consumption of resources on studies that have been conducted before but were never published. Especially in light of the climate crisis, finding the right questions and their solutions is of paramount importance.

Open Science for the Public

The general public is hardly involved in the research process. Crowd science (or citizen science) counteracts this lack of involvement by including citizens at different stages of the research process. Opening science to the public can not only make research more sustainable by reducing resource usage, but also has the potential to increase public knowledge and stimulate engagement with topics of wide societal significance such as the climate crisis.

Crowd Science to Make Science More Sustainable

Crowd science can help save resources in the research process by engaging citizens in data collection and data analysis, for instance by reducing the travel emissions and staff time that would normally be needed to collect equal amounts of data. Moreover, crowd science can help reach adequate sample sizes more effectively for accurately powered studies (see the Study Design and Data Collection section), and it offers the possibility to collect data from more diverse samples, making research more generalizable (Moshontz et al., 2018). However, depending on the research topic, crowd science potentially attracts a non-representative sub-population with specific interests, a bias that researchers should carefully assess (Sauermann et al., 2020).

Examples of crowd science-based data collection in climate-related research include CurieuzeNeuzen (https://web.archive.org/web/20220204122006/https://2016.curieuzeneuzen.be/en/) for measuring urban nitrogen dioxide levels, INCREASE (https://web.archive.org/web/20220204122127/https://www.pulsesincrease.eu/) for increasing pulse biodiversity, bird counting initiatives that investigate how environmental changes affect birds, such as the Audubon Christmas Bird Count in North America (https://web.archive.org/web/20220203164825/https://www.audubon.org/conservation/science/christmas-bird-count) and the “Garden Bird Hour” in Germany (https://web.archive.org/web/20220203164909/https://www.nabu.de/tiere-und-pflanzen/aktionen-und-projekte/stunde-der-gartenvoegel/), and Flora Incognita for mapping plant occurrences (Mahecha et al., 2021). Crowd science-based analysis platforms include Zooniverse (https://web.archive.org/web/20220204122249/https://www.zooniverse.org/), with over 2.3 million volunteers, and MRIQC for neuroimaging quality control metrics (Esteban et al., 2019). Compared with studies in more controlled lab environments, this method risks lower data quality; however, these effects can be mitigated by integrating adequate quality checks.

Crowd Science for Climate Action

Apart from making the research process more sustainable and improving scientific efficiency, crowd science has the additional potential to increase public understanding of scientific progress, including topics as fundamental as climate change. Citizens can be involved in the conceptualization of research questions (see CRIS: https://web.archive.org/web/20220203164853/https://ois.lbg.ac.at/en/projects/crowdsourcing-research-questions-in-science) and study design, a concept called co-creation, which can give a greater sense of agency in otherwise highly complex topics. In particular, the climate crisis can be perceived as overwhelming and too distant to grasp (Schubert et al., 2019), triggering feelings of loss of agency and emotional or existential distress, referred to as eco-anxiety and solastalgia (Albrecht et al., 2007; Clayton et al., 2017; Panu, 2020). This can also result in the coping mechanisms of denial and disengagement (Wong-Parodi & Feygina, 2020). Involving the public in climate-related scientific activities increases agency and literacy on climate change and has the potential to strengthen the willingness to adopt more sustainable lifestyles (Dickinson et al., 2012). Moreover, co-creation can help to identify problems that might not be apparent to scientists, and to diminish voyeuristic practices by involving minority communities as active members of research efforts (Jull et al., 2018), thereby aligning the interests of researchers and the community (Greenhalgh et al., 2016). This is important to ensure that, for instance, the implementation of science-based sustainability measures in communities actually improves their ecological situation and does not have unforeseen negative side effects for the inhabitants and their environment (Ribeiro et al., 2018; Sauermann et al., 2020).

Moreover, crowd science is a tool for science communication. This is especially important since a growing part of the public shows a lack of confidence in science and its methods in general, which becomes apparent in the context of climate change in particular (Kabat, 2017). Open tools such as the climate simulation models by Climate Interactive (https://web.archive.org/web/20220208132820/https://www.climateinteractive.org/en-roads/) demonstrate how science communication can be effective in making complex relationships graspable for a lay audience, and should encourage climate researchers to continue engaging in discussions about climate change mitigation strategies. In an era of targeted disinformation campaigns, especially with respect to climate change (Treen et al., 2020), science communication is key to a well-informed public and the indispensable foundation for evidence-based policy-making.

In summary, crowd science can contribute to improving research output while making research practices more sustainable. Moreover, it can incentivize citizens to get involved in sustainability-related research, which increases the likelihood that they act on the climate crisis.

The sections above have discussed how open science practices can provide answers to unsustainable research methods. Yet the questions remain how those practices can be implemented on a large scale, and who is responsible and in a position to employ them.

Some open science practices are closely linked to primary research activities (e.g., data collection, publishing preprints, and making code openly available) and can therefore be addressed by individual researchers and their working groups. Other open science practices require more systemic changes and large-scale acceptance in the scientific community (e.g., publishing null results, contributing to open source software), which can only be accomplished by changing the academic incentive structure and the procurement laws that regulate the spending of public funds. This addresses in particular academic, administrative, and political decision-makers, who have the means to create enabling environments that make this change possible. One impactful systemic change is to reform the assessment of scientific output for hiring and funding decisions, for example by going beyond the impact factor of journals, as proposed (amongst other measures) in the San Francisco Declaration on Research Assessment (DORA). By signing this agreement and implementing its suggestions, individual researchers, research organizations, and scientific publishers contribute to fostering a scientific culture that incentivizes more open and therefore more sustainable research practices. Individual researchers may also advocate for systemic changes, for example by founding and supporting grassroots movements for open science (see this list of global open science initiatives: http://web.archive.org/web/20220203164905/https://docs.google.com/spreadsheets/d/1LNF5_bOkRV-RLIF4HYmu-gOemIa4IdfXEer89fM-Vy8/edit; and a guide on how to found an open science initiative: https://web.archive.org/web/20220203164924/https://ecrlife.org/how-to-start-an-os-initiative-2/).

Thus, we believe that beyond individual behavior, changes in the academic incentive structure are crucial to create the enabling environment needed to take fundamental steps towards a more sustainable academia.

We have outlined the sustainable argument for open science. Open science offers a set of practices that can make research more efficient and more sustainable, that is, save human, monetary, and natural resources in all research fields.

These open science practices can also improve climate-related research. For instance, open, sustainable study designs and data collection can streamline urgent studies on the effectiveness of specific measures against the heating of the planet or on the production of novel renewable energy sources. Open data, open code, and open source enable a global exchange of invaluable information on weather extremes and climate tracking. Open access to all research, independent of its results, is important to enable necessary follow-up studies and to accelerate the development of new technologies. Lastly, open, transparent, and intelligible communication and the involvement of the public are crucial to foster climate literacy, a sense of agency, and acceptance of the potentially strong political regulations and changes that become inevitable in the face of the climate crisis.

We believe that the adoption of open science practices at the individual and the systemic level will make science more sustainable and can play a crucial role in advancing and accelerating urgently needed transformations out of the climate crisis.

All authors contributed equally.

Substantial contributions to conception and design: GHG, SMH, EM

Drafting the article or revising it critically for important intellectual content: GHG, SMH, EM

Final approval of the version to be published: GHG, SMH, EM

We thank Tanguy Fardet (Max Planck Sustainability Network) and Tobias Leutritz (Max Planck Institute for Human Cognitive and Brain Sciences) for inspiring content, proof-reading and critical discussions. We also thank Bernd Wirsing (Max Planck Society General Administration) for taking the time for a short interview. We thank multiple groups of the Max Planck Society for their encouragement and support: the Open Science Initiative at the Max Planck Institute for Human Cognitive and Brain Sciences and the Open Science working group in the Max Planck PhDnet (GHG is an active member in both groups), the Max Planck Sustainability Network (EM is an active member and former steering committee member), and the Max Planck PhDnet (SMH is a former member of the steering group).

This research did not receive any specific funding. GHG is funded by the Max Planck Society and the Einstein Center for Neurosciences Berlin. SMH and EM are funded by the Max Planck Society.

The authors declare no competing interests.

There was no data collected or analyzed.

2. We note that the term open science might appear as not inclusive of the humanities, and we are sympathetic to the term “open science and scholarship”. For reasons of brevity, we use the term open science, by which we mean to refer to both the natural sciences and the humanities.

3. Since solutions to complex problems often arise from unforeseen disciplines (Annan-Diab & Molinari, 2017), we explicitly note that we mean climate-related research in its broadest form, whether it is approached from an ecological, sociological, cognitive, philosophical, economical, or another scientific angle.

4. Preregistrations (either non-reviewed or as part of a Registered Report) are time-stamped and publicly available. They can be embargoed for the period in which the research is being conducted.

Achten, W. M. J., Almeida, J., & Muys, B. (2013). Carbon footprint of science: More than flying. Ecological Indicators, 34, 352–355. https://doi.org/10.1016/j.ecolind.2013.05.025
Albrecht, G., Sartore, G.-M., Connor, L., Higginbotham, N., Freeman, S., Kelly, B., Stain, H., Tonna, A., & Pollard, G. (2007). Solastalgia: The distress caused by environmental change. Australasian Psychiatry, 15(1_suppl), S95–S98. https://doi.org/10.1080/10398560701701288
Allen, C., & Mehler, D. M. A. (2019). Open science challenges, benefits and tips in early career and beyond. PLoS Biology, 17(5), e3000246. https://doi.org/10.1371/journal.pbio.3000246
Annan-Diab, F., & Molinari, C. (2017). Interdisciplinarity: Practical approach to advancing education for sustainability and for the Sustainable Development Goals. The International Journal of Management Education, 15(2, Part B), 73–83. https://doi.org/10.1016/j.ijme.2017.03.006
Anthony, L. F. W., Kanding, B., & Selvan, R. (2020). Carbontracker: Tracking and Predicting the Carbon Footprint of Training Deep Learning Models.
Aujoux, C., Blanchard, O., & Kotera, K. (2021). How to assess the carbon footprint of a large-scale physics project. Nature Reviews Physics, 3(6), 386–387. https://doi.org/10.1038/s42254-021-00325-2
Baker, M. (2016). 1,500 scientists lift the lid on reproducibility. Nature, 533(7604), 452–454. https://doi.org/10.1038/533452a
Bergmann, C., Tsuji, S., Piccinini, P. E., Lewis, M. L., Braginsky, M., Frank, M. C., & Cristia, A. (2018). Promoting replicability in developmental research through meta-analyses: Insights from language acquisition research. Child Development, 89(6), 1996–2009. https://doi.org/10.1111/cdev.13079
Botvinik-Nezer, R., Holzmeister, F., Camerer, C. F., Dreber, A., Huber, J., Johannesson, M., & Rieck, J. R. (2020). Variability in the analysis of a single neuroimaging dataset by many teams. Nature, 582(7810), 84–88. https://doi.org/10.1038/s41586-020-2314-9
Bourne, P. E., Polka, J. K., Vale, R. D., & Kiley, R. (2017). Ten simple rules to consider regarding preprint submission. PLOS Computational Biology, 13(5), e1005473. https://doi.org/10.1371/journal.pcbi.1005473
Button, K. S., Ioannidis, J. P. A., Mokrysz, C., Nosek, B. A., Flint, J., Robinson, E. S. J., & Munafò, M. R. (2013). Power failure: Why small sample size undermines the reliability of neuroscience. Nature Reviews Neuroscience, 14(5), 365–376. https://doi.org/10.1038/nrn3475
Clayton, S., Manning, C., Krygsman, K., & Speiser, M. (2017). Mental Health and Our Changing Climate: Impacts, Implications, and Guidance. American Psychological Association, and ecoAmerica.
Day, S., Rennie, S., Luo, D., & Tucker, J. D. (2020). Open to the public: Paywalls and the public rationale for open access medical research publishing. Research Involvement and Engagement, 6(1), 8. https://doi.org/10.1186/s40900-020-0182-y
Dickinson, J. L., Shirk, J., Bonter, D., Bonney, R., Crain, R. L., Martin, J., Phillips, T., & Purcell, K. (2012). The current state of citizen science as a tool for ecological research and public engagement. Frontiers in Ecology and the Environment, 10(6), 291–297. https://doi.org/10.1890/110236
Dirnagl, U. (2020). Preregistration of exploratory research: Learning from the golden age of discovery. PLOS Biology, 18(3), e3000690. https://doi.org/10.1371/journal.pbio.3000690
Eggleton, F., & Winfield, K. (2020). Open Data Challenges in Climate Science. Data Science Journal, 19(1). https://doi.org/10.5334/dsj-2020-052
Eklund, A., Nichols, T. E., & Knutsson, H. (2016). Cluster failure: Why fMRI inferences for spatial extent have inflated false-positive rates. Proceedings of the National Academy of Sciences, 113(28), 7900–7905. https://doi.org/10.1073/pnas.1602413113
Elliott, J., Lawrence, R., Minx, J. C., Oladapo, O. T., Ravaud, P., Tendal Jeppesen, B., Thomas, J., Turner, T., Vandvik, P. O., & Grimshaw, J. M. (2021). Decision makers need constantly updated evidence synthesis. Nature. https://doi.org/10.1038/d41586-021-03690-1
Esteban, O., Blair, R. W., Nielson, D. M., Varada, J. C., Marrett, S., Thomas, A. G., Poldrack, R. A., & Gorgolewski, K. J. (2019). Crowdsourced MRI quality metrics and expert quality annotations for training of humans and machines. Scientific Data, 6(1), 30. https://doi.org/10.1038/s41597-019-0035-4
Fardet, T., Hütten, M., Lohmann, S., Medawar, E., Milucka, J., Roesch, J. H., Rolfes, J. D., & Schweizer, J. (2020). Making Science Organizations Sustainable—The Mission of the Max Planck Sustainability Network. Frontiers in Sustainability, 1. https://doi.org/10.3389/frsus.2020.567211
Fraser, N., Brierley, L., Dey, G., Polka, J. K., Pálfy, M., Nanni, F., & Coates, J. A. (2021). Preprinting the COVID-19 pandemic. BioRxiv, 2020.05.22.111294. https://doi.org/10.1101/2020.05.22.111294
George, G., Merrill, R. K., & Schillebeeckx, S. J. D. (2020). Digital Sustainability and Entrepreneurship: How Digital Innovations Are Helping Tackle Climate Change and Sustainable Development. Entrepreneurship Theory and Practice, 45(5), 999–1027. https://doi.org/10.1177/1042258719899425
Gorgolewski, K. J., Auer, T., Calhoun, V. D., Craddock, R. C., Das, S., Duff, E. P., Flandin, G., Ghosh, S. S., Glatard, T., Halchenko, Y. O., Handwerker, D. A., Hanke, M., Keator, D., Li, X., Michael, Z., Maumet, C., Nichols, B. N., Nichols, T. E., Pellman, J., … Poldrack, R. A. (2016). The brain imaging data structure, a format for organizing and describing outputs of neuroimaging experiments. Scientific Data, 3(1), 1–9. https://doi.org/10.1038/sdata.2016.44
Greenhalgh, T., Jackson, C., Shaw, S., & Janamian, T. (2016). Achieving Research Impact Through Co-creation in Community-Based Health Services: Literature Review and Case Study. The Milbank Quarterly, 94(2), 392–429. https://doi.org/10.1111/1468-0009.12197
Grinspan, D., & Worker, J. (2021). Implementing Open Data Strategies for Climate Action: Suggestions And Lessons Learned for Government and Civil Society Stakeholders. World Resources Institute. https://doi.org/10.46830/wriwp.19.00093
Hörmann, N., van Scherpenberg, C., & Goel, R. (2020). Open doors with social distance—Research opens up during Covid-19 pandemic. The Offspring Blog. https://www.phdnet.mpg.de/131182/2020-04-14_openscience-covid19?c=22833
Ioannidis, J. P. A. (2005). Why Most Published Research Findings Are False. PLOS Medicine, 2(8), e124. https://doi.org/10.1371/journal.pmed.0020124
IPCC. (2021). Summary for Policymakers. In V. Masson-Delmotte, P. Zhai, A. Pirani, S. L. Connors, C. Péan, S. Berger, N. Caud, Y. Chen, L. Goldfarb, M. I. Gomis, M. Huang, K. Leitzell, E. Lonnoy, J. B. R. Matthews, T. K. Maycock, T. Waterfield, O. Yelekçi, R. Yu, & B. Zhou (Eds.), Climate Change 2021: The Physical Science Basis. Contribution of Working Group I to the Sixth Assessment Report of the Intergovernmental Panel on Climate Change In Press. https://www.ipcc.ch/report/ar6/wg1/downloads/report/IPCC_AR6_WGI_SPM_final.pdf
Jahnke, K., Fendt, C., Fouesneau, M., Georgiev, I., Herbst, T., Kaasinen, M., Kossakowski, D., Rybizki, J., Schlecker, M., Seidel, G., Henning, T., Kreidberg, L., & Rix, H.-W. (2020). An astronomical institute’s perspective on meeting the challenges of the climate crisis. Nature Astronomy, 4(9), 812–815. https://doi.org/10.1038/s41550-020-1202-4
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Jull, J., Morton-Ninomiya, M., Compton, I., & Picard, A. (2018). Fostering the conduct of ethical and equitable research practices: The imperative for integrated knowledge translation in research conducted by and with indigenous community members. Research Involvement and Engagement, 4(1), 45. https://doi.org/10.1186/s40900-018-0131-1
Kabat, G. C. (2017). Taking distrust of science seriously. EMBO Reports, 18(7), 1052–1055. https://doi.org/10.15252/embr.201744294
Klein, R. A., Ratliff, K. A., Vianello, M., Adams, R. B., Jr., Bahník, Š., Bernstein, M. J., et al. (2014). Investigating variation in replicability: A “many labs” replication project. Social Psychology, 45, 142–152.
Lannelongue, L., Grealey, J., & Inouye, M. (2021). Green Algorithms: Quantifying the Carbon Footprint of Computation. Advanced Science, 8(12), 2100707. https://doi.org/10.1002/advs.202100707
Mahecha, M. D., Rzanny, M., Kraemer, G., Mäder, P., Seeland, M., & Wäldchen, J. (2021). Crowd-sourced plant occurrence data provide a reliable description of macroecological gradients. Ecography, 44(8), 1131–1142. https://doi.org/10.1111/ecog.05492
Mariette, J., Blanchard, O., Berné, O., Aumont, O., Carrey, J., Ligozat, A. L., Lellouch, E., Roche, P., Guennebaud, G., Thanwerdas, J., Bardou, P., Salin, G., Maigne, E., Servan, S., & Ben-Ari, T. (2021). An open-source tool to assess the carbon footprint of research. BioRxiv, 2021.01.14.426384. https://doi.org/10.1101/2021.01.14.426384
Markowetz, F. (2015). Five selfish reasons to work reproducibly. Genome Biology, 16(1), 274. https://doi.org/10.1186/s13059-015-0850-7
Markram, K. (2017, April 17). Open Science can save the planet. https://www.youtube.com/watch?v=uPtP6-nAjJ0
Milkman, K. L., Gromet, D., Ho, H., Kay, J. S., Lee, T. W., Pandiloski, P., & Duckworth, A. L. (2021). Megastudies improve the impact of applied behavioural science. Nature, 600(7889), 478–483.
Moshontz, H., Campbell, L., Ebersole, C. R., IJzerman, H., Urry, H. L., Forscher, P. S., Grahe, J. E., McCarthy, R. J., Musser, E. D., Antfolk, J., Castille, C. M., Evans, T. R., Fiedler, S., Flake, J. K., Forero, D. A., Janssen, S. M. J., Keene, J. R., Protzko, J., Aczel, B., … Chartier, C. R. (2018). The Psychological Science Accelerator: Advancing Psychology Through a Distributed Collaborative Network. Advances in Methods and Practices in Psychological Science, 1(4), 501–515. https://doi.org/10.1177/2515245918797607
Murad, M. H., Chu, H., Lin, L., & Wang, Z. (2018). The effect of publication bias magnitude and direction on the certainty in evidence. BMJ Evidence-Based Medicine, 23(3), 84–86. https://doi.org/10.1136/bmjebm-2018-110891
Panu, P. (2020). Anxiety and the Ecological Crisis: An Analysis of Eco-Anxiety and Climate Anxiety. Sustainability, 12(19), 7836. https://doi.org/10.3390/su12197836
Parmesan, C., & Yohe, G. (2003). A globally coherent fingerprint of climate change impacts across natural systems. Nature, 421(6918), 37–42. https://doi.org/10.1038/nature01286
Pernet, C. R., Appelhoff, S., Gorgolewski, K. J., Flandin, G., Phillips, C., Delorme, A., & Oostenveld, R. (2019). EEG-BIDS, an extension to the brain imaging data structure for electroencephalography. Scientific Data, 6(1), 1–5. https://doi.org/10.1038/s41597-019-0104-8
Ribeiro, B., Bengtsson, L., Benneworth, P., Bührer, S., Castro-Martínez, E., Hansen, M., Jarmai, K., Lindner, R., Olmos-Peñuela, J., Ott, C., & Shapira, P. (2018). Introducing the dilemma of societal alignment for inclusive and responsible research and innovation. Journal of Responsible Innovation, 5(3), 316–331. https://doi.org/10.1080/23299460.2018.1495033
Rogelj, J., Shindell, D., Jiang, K., Fifita, S., Forster, P., Ginzburg, V., Handa, C., Kheshgi, H., Kobayashi, S., Kriegler, E., Mundaca, L., Séférian, R., Vilariño, M. V., Masson-Delmotte, V., Zhai, P., Pörtner, H.-O., Roberts, D., Skea, J., Shukla, P. R., … Waterfield, T. (2018). Global Warming of 1.5°C. An IPCC Special Report on the impacts of global warming of 1.5°C above pre-industrial levels and related global greenhouse gas emission pathways, in the context of strengthening the global response to the threat of climate change, sustainable development, and efforts to eradicate poverty. https://www.ipcc.ch/site/assets/uploads/sites/2/2019/02/SR15_Chapter2_Low_Res.pdf
Rosen, J. (2017). Sustainability: A greener culture. Nature, 546(7659), 565–567.
Sauermann, H., Vohland, K., Antoniou, V., Balázs, B., Göbel, C., Karatzas, K., Mooney, P., Perelló, J., Ponti, M., Samson, R., & Winter, S. (2020). Citizen science and sustainability transitions. Research Policy, 49(5), 103978. https://doi.org/10.1016/j.respol.2020.103978
Schmidt, V., Goyal, K., Joshi, A., Feld, B., Conell, L., Laskaris, N., Blank, D., Wilson, J., Friedler, S., & Luccioni, S. (2021). CodeCarbon: Estimate and Track Carbon Emissions from Machine Learning Computing. Zenodo. https://doi.org/10.5281/ZENODO.4658424
Schubert, S., Caviola, L., & Faber, N. S. (2019). The psychology of existential risk: Moral judgments about human extinction. Scientific Reports, 9(1), 1–8. https://doi.org/10.1038/s41598-019-50145-9
Simonsohn, U., Nelson, L. D., & Simmons, J. P. (2014). P-curve: A key to the file-drawer. Journal of Experimental Psychology: General, 143(2), 534–547. https://doi.org/10.1037/a0033242
Strech, D., & Dirnagl, U. (2019). 3Rs missing: Animal research without scientific value is unethical. BMJ Open Science, 3(1). https://doi.org/10.1136/bmjos-2018-000048
Strech, D., Weissgerber, T., & Dirnagl, U. (2020). Improving the trustworthiness, usefulness, and ethics of biomedical research through an innovative and comprehensive institutional initiative. PLOS Biology, 18(2), e3000576. https://doi.org/10.1371/journal.pbio.3000576
Stroebe, W. (2019). What Can We Learn from Many Labs Replications? Basic and Applied Social Psychology, 41(2), 91–103. https://doi.org/10.1080/01973533.2019.1577736
Szucs, D., & Ioannidis, J. P. A. (2020). Sample size evolution in neuroimaging research: An evaluation of highly-cited studies (1990–2012) and of latest practices (2017–2018) in high-impact journals. NeuroImage, 221, 117164. https://doi.org/10.1016/j.neuroimage.2020.117164
Thomas, C. D., Cameron, A., Green, R. E., Bakkenes, M., Beaumont, L. J., Collingham, Y. C., Erasmus, B. F. N., de Siqueira, M. F., Grainger, A., Hannah, L., Hughes, L., Huntley, B., van Jaarsveld, A. S., Midgley, G. F., Miles, L., Ortega-Huerta, M. A., Townsend Peterson, A., Phillips, O. L., & Williams, S. E. (2004). Extinction risk from climate change. Nature, 427(6970), 145–148. https://doi.org/10.1038/nature02121
Treen, K. M. d’I., Williams, H. T. P., & O’Neill, S. J. (2020). Online misinformation about climate change. WIREs Climate Change, 11(5), e665. https://doi.org/10.1002/wcc.665
Tsuji, S., Bergmann, C., & Cristia, A. (2014). Community-Augmented Meta-Analyses: Toward Cumulative Data Assessment. Perspectives on Psychological Science, 9(6), 661–665. https://doi.org/10.1177/1745691614552498
van Rooij, I., & Baggio, G. (2021). Theory Before the Test: How to Build High-Verisimilitude Explanatory Theories in Psychological Science. Perspectives on Psychological Science, 16(4), 682–697. https://doi.org/10.1177/1745691620970604
Whitmee, S., Haines, A., Beyrer, C., Boltz, F., Capon, A. G., de Souza Dias, B. F., Ezeh, A., Frumkin, H., Gong, P., Head, P., Horton, R., Mace, G. M., Marten, R., Myers, S. S., Nishtar, S., Osofsky, S. A., Pattanayak, S. K., Pongsiri, M. J., Romanelli, C., … Yach, D. (2015). Safeguarding human health in the Anthropocene epoch: Report of The Rockefeller Foundation–Lancet Commission on planetary health. Lancet (London, England), 386(10007), 1973–2028. https://doi.org/10.1016/s0140-6736(15)60901-1
Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3(1), 160018. https://doi.org/10.1038/sdata.2016.18
Wong-Parodi, G., & Feygina, I. (2020). Understanding and countering the motivated roots of climate change denial. Current Opinion in Environmental Sustainability, 42, 60–64. https://doi.org/10.1016/j.cosust.2019.11.008
Zastrow, M. (2020). Open science takes on the coronavirus pandemic. Nature, 581(7806), 109–110. https://doi.org/10.1038/d41586-020-01246-3
This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
