The undergraduate research dissertation in psychology is the capstone demonstration of research skills, including project planning and design, considering and resolving ethical issues, and the analysis and dissemination of findings. The dissertation represents an opportunity for learning as well as an opportunity to contribute to the research literature in the student’s chosen area; however, few articles have considered both dimensions in detail. This article provides a roadmap for undergraduate thesis supervision, aimed at early-career supervisors and at supervisors seeking to better align their supervision and research activities and/or to engage their students in open research practices via the dissertation.

Specifically, we review prior literature on undergraduate psychology research supervision and identify several dimensions that vary in existing approaches. Drawing on our own supervision experiences, we describe four key recommendations for undergraduate supervision in psychology and discuss how these can support student learning as well as benefit research.

The undergraduate (UG) research dissertation/thesis in psychology requires the student to carry out an empirical piece of research over the course of a single academic year. Students must individually demonstrate a range of research skills including project planning and design, considering and resolving ethical issues, and the analysis and dissemination of findings (British Psychological Society [BPS], 2019; Psychological Society of Ireland [PSI], 2019). Although the dissertation is an important learning process for individual UG students, it may also contribute to the research literature in the student’s chosen area, in the form of grey literature or a peer-reviewed publication. Therefore, completing a dissertation has individual pedagogical benefits for the student, and potentially wider benefits for the research literature. The purpose of this paper is to discuss the dissertation as both a pedagogical exercise and a research endeavour, and to consider how best to support students’ learning while also enhancing the potential benefits for research. In doing so we hope to make the “private realm” (Wiggins et al., 2016, p. 11) of dissertation supervision explicit.

There is already an extensive literature on the value of undergraduate involvement in research (Perlman & McCann, 2005), models of research involvement aside from the dissertation (e.g., Lloyd et al., 2019), considerations specific to qualitative dissertations (e.g., Freeman et al., 2020), effective mentorship (e.g., Boysen et al., 2020), and postgraduate research (e.g., S. Williams, 2019). To be clear, our focus is not on those topics but on principles of undergraduate dissertation supervision that increase the quality both of undergraduate research and of the student learning experience. This is timely given advances in open science practices intended to improve the efficiency, reliability, and accessibility of research outputs.

Traditionally, the dissertation has been conducted in the context of a dyadic relationship between supervisor and student and involves the collection of new data. Thus, while the student learns from the supervisor, there can be limited opportunities for peer learning or collaboration outside of this dyadic relationship, and limited scope to learn from intellectual dialogue between the supervisor and other researchers (colleagues, or students). Besides this, the sheer number of student projects conducted can generate numerous small-scale quantitative studies with low statistical power and an increased chance of false positive findings, some of which will ultimately be published. Thus, the traditional model has limitations not only for students’ learning, but for the research literature more broadly. In recent years, alternative models of supervision have been documented (see Table 1) that involve some variation on a team approach, and which may address some of the limitations of the traditional model.

Table 1.
Summary of models of undergraduate thesis supervision in psychology.
Traditional model (e.g., Derounian, 2011; Todd et al., 2006): The dissertation is a partnership between the student and an individual staff member.
Consortium Model (Button et al., 2019): Groups of students across multiple universities collaborate on a common research protocol under the supervision of their local academic supervisor and a PhD student or post-doctoral researcher. Students design their dissertations to align with this larger multi-site study and meet with their supervisor as a group. The common protocol addresses an overarching research question and students test secondary hypotheses for their dissertations.
Apprentice-Based Senior Thesis (Mickley-Steinmetz & Reid, 2019): Groups of 3–6 students collaborate on common research protocols under a single academic supervisor. This model is similar to the consortium model run on a smaller scale (i.e., in a single institution rather than across multiple universities).
Collective Academic Supervision (Dautel, 2020): A group of students undertake independent, but related, research projects under the supervision of a common supervisor. Students meet only on a group basis in this instance.
Laddered Model (Detweiler-Bedell & Detweiler-Bedell, 2019): Students are organised into three-person laddered teams, with an experienced student (team leader) mentoring a mid-level student alongside a student new to the research group. Though not specific to dissertation supervision, the duration of involvement in the laddered team over multiple years can facilitate dissertation supervision for the student acting as team leader.
“Cluster” projects (Freeman et al., 2020): In the context of qualitative supervision, two members of staff are paired together to supervise a group of students (in addition to one-to-one supervision), for projects that operate around one or more of the following: (a) a single topic, (b) a specific research question, (c) a specific method or set of methods, (d) a specific approach to data analysis.
Collaborative Replications and Education Project (CREP; Wagge et al., 2019): Students participate in high-quality direct replications selected and overseen by the CREP team.
Psychological Science Accelerator (PSA; Moshontz et al., 2018): The PSA is a distributed network of laboratories designed to enable and support crowdsourced research projects. Unlike CREP, it is not explicitly focused on either replications or on students; however, it is a mechanism within which undergraduate theses may be conducted.

In addition to a shift from the traditional dyadic model, explicit consideration of how to raise awareness of and avoid questionable research practices (QRPs; John et al., 2012) at undergraduate level is necessary. One study of PhD researchers (Lubega et al., 2023) indicates that the majority experienced issues in reproducing published findings and tended to attribute this “failure” to a lack of skill on their own part. Participants described experiencing self-doubt, frustration, and depression; in some instances this interfered with their health and/or ability to work. Given that undergraduate students typically develop their dissertation based on findings from published literature, they are likely to be vulnerable to these same issues, to some degree. Undergraduate students may also unknowingly engage in QRPs themselves, particularly in relation to analysis and reporting (Krishna & Peter, 2018). Because dissertation supervisors are key in shaping students’ attitudes towards QRPs (Krishna & Peter, 2018), modelling best practice, and actively training students in relation to QRPs where needed, should be priorities for undergraduate dissertation supervision.

Given an increased discourse around models of undergraduate dissertation supervision (e.g., Giuliano et al., 2019), it is timely to consider how best to support learning and research in the context of the dissertation. Based on our collective supervision experiences, we argue that adopting four key recommendations may help increase the quality of the research generated as part of the UG dissertation without compromising (and indeed potentially increasing) the quality of student training and learning:

These recommendations are:

  1. Consider efficient use of data (e.g., by using secondary and/or meta-data)

  2. Consider team science approaches

  3. Promote openness and transparency

  4. Raise awareness of and avoid incentivising QRPs

We discuss each recommendation below with examples drawn from quantitative, qualitative, and mixed methods projects. We begin by discussing the efficient use of data (recommendation 1) with examples from (1) secondary data and (2) evidence synthesis. Given a limited literature on supervision processes in psychology, we describe our own experiences of team science approaches in relative depth across quantitative, qualitative, and mixed methods projects (recommendation 2). We then discuss how to promote openness and transparency in the context of the dissertation (recommendation 3), with a particular focus on planning (primarily via pre-registration), and open data. Finally, and to some degree in parallel with our recommendation to promote openness and transparency, we discuss how to raise awareness of and avoid incentivising QRPs (recommendation 4).

First, when planning a dissertation project, consider whether new data need to be collected at all to address the research question. Data collection involves practical skills development (e.g., learning how to manage an experimental testing session); however, if these skills can be acquired outside of the dissertation, the use of existing data, such as publicly available data, data already held by the supervisor, data not intended for research purposes, meta-data, or meta-synthesis, may be appealing. Meta-research projects are likely becoming more popular (e.g., Clarke et al., 2023) and can provide students with the opportunity to engage deeply with methodological issues in the literature. Replication studies using existing data are also feasible (e.g., Coyle et al., 2020) and the advantages of replication for learning, and guidance on choosing what to replicate for teaching purposes, are discussed in depth elsewhere (see Janz, 2016; Wagge et al., 2019). Using existing data reduces research waste by minimising the unnecessary collection of new data, reducing the overall burden on potential participants, and reducing the burden on research ethics committees. In addition, a large sample size or dataset is typically available, and sampling is often more representative of the general population than would be achieved with convenience or snowball sampling, leading to potentially better-quality research. Here, we discuss the benefits and drawbacks of conducting (1) a secondary analysis and (2) an evidence synthesis.

In addition to the benefits for research, there are several potential benefits for learning. Secondary data allow students to gain hands-on experience with real-world datasets and all their idiosyncratic messiness. Students’ skills in data wrangling may require support at the start, particularly where they have previously encountered only unrealistically “clean” data. Moreover, the skills required to obtain data from sources like Twitter are likely to be available only to students enrolled in psychology programmes that have incorporated data skills and programming into their curricula (e.g., PsyTeachR, n.d.). However, these are skills that are useful for a range of graduate jobs beyond those focused on research.

An important consideration for secondary data projects is that students will not gain first-hand experience of participant recruitment or data collection. Depending on individual School/Department requirements, students may not gain experience of developing a formal research ethics application, so considering how to develop competency in ethics is necessary; for example, by creating ethics forms that relate directly to secondary data and/or internet-mediated research. Ethics is particularly important for data not originally collected for research purposes. Researchers relying on data from online forums, for example, need to consider whether individual forum members should be contacted for permission to analyse their online discussions. If contacting members is deemed appropriate, researchers must then consider whether it is feasible and whether doing so could alter the nature of the online discussions. There may be alternatives, such as seeking permission from a forum administrator, or there may be grounds for researchers to choose not to seek consent. These considerations are complex (see e.g., Ahmed et al., 2017; Buchanan, 2017, for further discussion) and there is no clear answer. Indeed, for large-scale Twitter analyses that scrape data from a particular hashtag, informed consent is practically impossible to obtain. Instead, it is important to consider a formal application for access via Twitter’s Academic Research application, and to ensure users are granted anonymity in the write-up and/or the publication of data and analysis code. For example, Attard and Coulson (2012) used data in the public domain and thus did not seek consent. To preserve anonymity, they omitted not only participants’ names/pseudonyms but also the names of the online support groups themselves, and only short segments of the original posts were quoted to reduce their traceability through search engines. In addition to this example, useful guidance on relevant ethical issues is available from the BPS (2017) (see also Granger et al. (2021), Sugiura et al. (2017), and Williams et al. (2017)).

As an alternative to secondary data analysis, conducting an evidence synthesis allows students to engage deeply with the literature and develop their methodological and appraisal skills. For dissertation projects adhering to British and Irish standards, our interpretation of current accreditation guidelines is that data analysis must be conducted; therefore, a systematic review without a meta-analysis (or equivalent) is unlikely to be acceptable in Britain and Ireland (e.g., BPS, 2019), while U.S. guidelines are less prescriptive. University libraries often offer training in systematic review techniques and there are many published exemplars available. An evidence synthesis can be undertaken even if others addressing the same question have previously been conducted. For example, Ahern and Semkovska (2017) addressed some limitations of an earlier meta-analysis (Lee et al., 2012) of cognitive functioning in the first episode of major depressive disorder. For qualitative evidence synthesis (QES, see Noyes et al., 2019) a student will ideally have prior experience with the methodology they are synthesizing (e.g., students undertaking thematic syntheses will have experience in thematic analysis), which is unlikely at UG level. However, working with data in existing papers where themes are already summarized is arguably more accessible than working with raw qualitative data, making qualitative evidence synthesis a viable option for some students in specific circumstances. Students undertaking an evidence synthesis can use pre-registration templates and the PRISMA (Page et al., 2021) reporting guidelines to help scaffold and guide their project, as well as to encourage transparent reporting.

It is important to ensure students appreciate the distinction between the narrative and selective literature review that forms part of an assignment or research project, and the substantial workload involved in undertaking a systematic review prior to even conducting a meta-analysis. In contrast to narrative reviews, at least some steps of a systematic review should be conducted as part of a team (Jahan et al., 2016). The availability of team members may determine whether a systematic review is a feasible option for the dissertation. Additionally, the supervisor must consider the accreditation requirement for meta-analysis. Given the typical timeframe for UG dissertations, it may be helpful to consider whether the research question and process can be constrained to reduce the burden associated with the searching and screening phases of the process. For example, it may be possible to update a previous review, or to conduct a review of research from the last five years, or from the date a key research recommendation was made. Finally, supervisors and students should also plan for insufficient or inadequate reporting of data for meta-analysis. Although researchers commonly contact study authors to access missing data, there is no guarantee authors will be responsive. If at least some data are available, conducting a meta-analysis while acknowledging the limitations of available data may demonstrate students’ computational skills. However, given the considerable time required to conduct evidence syntheses well, and the accreditation requirement for data analysis, this option is often sub-optimal for UG students.
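Where a meta-analysis does prove feasible, the pooling step itself is computationally light; the real workload lies in searching and screening. As an illustrative sketch only (the effect sizes and variances below are invented for demonstration, and students would normally use an established meta-analysis package), a DerSimonian-Laird random-effects model can be computed in a few lines:

```python
import numpy as np

def random_effects_pool(effects, variances):
    """Pool effect sizes using a DerSimonian-Laird random-effects model."""
    effects = np.asarray(effects, dtype=float)
    variances = np.asarray(variances, dtype=float)
    w = 1.0 / variances                          # inverse-variance weights
    fixed = np.sum(w * effects) / np.sum(w)      # fixed-effect estimate
    k = len(effects)
    q = np.sum(w * (effects - fixed) ** 2)       # Cochran's Q heterogeneity statistic
    c = np.sum(w) - np.sum(w ** 2) / np.sum(w)
    tau2 = max(0.0, (q - (k - 1)) / c)           # between-study variance estimate
    w_re = 1.0 / (variances + tau2)              # random-effects weights
    pooled = np.sum(w_re * effects) / np.sum(w_re)
    se = np.sqrt(1.0 / np.sum(w_re))
    return pooled, se, tau2

# Hypothetical standardized mean differences (d) and variances from five studies
d = [0.30, 0.45, 0.12, 0.50, 0.25]
v = [0.04, 0.06, 0.05, 0.08, 0.03]
pooled, se, tau2 = random_effects_pool(d, v)
print(f"pooled d = {pooled:.2f}, 95% CI [{pooled - 1.96 * se:.2f}, {pooled + 1.96 * se:.2f}]")
```

Even a small worked example like this can help students see how study weights and between-study heterogeneity shape the pooled estimate they will report.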

Our second broad recommendation is to consider team science approaches to supervision. As illustrated in Table 1, several models of supervision involve a form of team approach. Teaming up across institutions as in the consortium model (Button et al., 2019) can lead to very large datasets and more generalizable results. However, many of the benefits for learning and for research can be achieved by groups of students working together within an individual department. The BPS/PSI accreditation guidance endorses group projects as long as the student can individually demonstrate each of the skills involved in conducting the empirical project, whilst the APA guidelines include refining project-management skills and enhancing teamwork capacity as core goals. Depending on the institution’s interpretation of the guidance, this could be as simple as each student writing up their dissertations separately (based on identical research questions and a common dataset) or ensuring that each student has a different research question.

The benefits for research include the generation of better-powered datasets to address a specific research question. Team approaches can provide opportunities for peer learning and peer support that are absent from the one-to-one supervision model. A team approach can facilitate practical data collection skills while making efficient use of the data collected, and students are potentially more likely to have an opportunity for co-authorship on a resulting publication from the pooled, better-powered dataset.
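The statistical payoff of pooling can be made concrete for students with a simple power calculation. The sketch below uses a normal approximation to the power of a two-sided independent-samples t-test at alpha = .05; the effect size and per-student sample sizes are hypothetical illustrations, not recommendations:

```python
import math

def approx_power(d, n_per_group):
    """Approximate power of a two-sided independent-samples t-test (alpha = .05),
    using the normal approximation to the noncentral t distribution."""
    z_crit = 1.959964                       # critical z for two-sided alpha = .05
    ncp = d * math.sqrt(n_per_group / 2)    # approximate noncentrality under H1
    phi = lambda x: 0.5 * (1 + math.erf(x / math.sqrt(2)))  # standard normal CDF
    return (1 - phi(z_crit - ncp)) + phi(-z_crit - ncp)

effect = 0.4          # hypothetical medium-small effect size (Cohen's d)
solo_n = 25           # per group: one student's convenience sample
team_n = 25 * 4       # per group: four students pooling an identical protocol

solo_power = approx_power(effect, solo_n)
team_power = approx_power(effect, team_n)
print(f"one student:  n = {solo_n}/group, power ≈ {solo_power:.2f}")
print(f"team of four: n = {team_n}/group, power ≈ {team_power:.2f}")
```

Under these assumed figures, four students pooling data move the study from severely underpowered to roughly the conventional 80% threshold, which is a useful talking point in early team meetings.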

Because the literature on undergraduate dissertation supervision is relatively limited, and because team approaches vary in how they are implemented, we outline below how team approaches might be implemented for quantitative, mixed methods, and qualitative studies, and conclude by considering drawbacks to team approaches.

Quantitative studies

For quantitative studies, similar to the consortium model, group lab or survey projects can be supported by having an overarching primary hypothesis or project aim that is pre-registered and forms the primary focus of any paper written up for publication. Students then build in a series of secondary questions and hypotheses to become the focus of their individual dissertations. A key concern of this approach is the tension between pedagogy and research. The consortium lab-based example incorporates multiple outcomes and/or moderator variables for pedagogical reasons (i.e., to facilitate individual student research questions), and this complexity increases with the number of students in the team. However, the integrity of the overarching project (on which all students will be co-authors) is supported by pre-registering the primary aims, enabling easier detection of QRPs. Thus, any resulting publication will be confirmatory for the primary aims, with the students’ dissertation aims treated as secondary. Pre-registering (even informally) individual student hypotheses ensures that the dissertation projects retain their individuality, both in terms of academic integrity, and the perception of the process from the students’ point of view. It also minimizes the temptation to make use of measures other than those that were pre-registered, without a justifiable rationale for doing so.

Mixed methods

Mixed methods designs involve the collection and analysis of both quantitative and qualitative data and as such lend themselves well to a team dissertation project. Mixed methods provide both breadth and depth to the question under investigation (Johnson et al., 2007) and, in the case of exploratory mixed methods, can support the formation of evidence-based hypotheses for null hypothesis significance testing (NHST; Erzberger & Prein, 1997). Specific to the dissertation process, mixed methods projects have several advantages. First, the shared topic means that students can engage in peer support such as sharing papers and discussing theoretical models and the interpretation of their data; however, the divergence in methods and subsequent write-up ensures that the dissertation project retains its individuality. Additionally, students can support each other with participant recruitment and with reviewing and proofreading study materials. Team mixed method designs also allow (or indeed may require) multiple supervisors with different methodological expertise to be involved on the project. Students still benefit from an individual supervisor, but group meetings and reviews help promote a team science approach.

One important consideration with mixed method dissertations involves the timeline and the choice of core design. Explanatory and exploratory designs (see Creswell & Clark, 2017) require one student to “go first” in the collection of their qualitative or quantitative data, which may make convergent designs (where qualitative and quantitative data collection and analysis are conducted largely in parallel) more appealing. Supervisors should be clear upfront about the nature of the project and be prepared to support students whose timelines differ from what they may have expected; for example, completing drafts of the introduction and methods in advance of any data collection. It is crucial that contingency plans (e.g., switching to a convergent design) are developed in case the primary study does not proceed on the expected timeline, so that, for example, issues impacting student A do not disadvantage student B. Additionally, the quantitative component remains susceptible to the limitations of individual quantitative projects (e.g., small sample sizes). There are of course higher-level concerns with mixed method designs regarding how to meaningfully integrate studies that have different epistemological positions into a single paper (Clark, 2019), although this is not specific to mixed method research as a dissertation model.

Qualitative dissertations

For qualitative dissertations, team approaches are effective when students are interested in distinct but related questions with the same participant group (e.g., exploring experiences of (a) social support and (b) self-management for students with Type 1 diabetes). It is most straightforward when students also plan to use the same analytical approach (e.g., thematic analysis), as the same “thickness” of data is sought during interviews, and there are no differences in transcription requirements (e.g., text only transcription is acceptable for both research questions).

In addition to sharing participant recruitment, students can review each other’s materials such as interview or focus group questions before finalising a common schedule. Students can divide the conduct of interviews/focus groups and transcribe those they do not conduct, to gain familiarity with the data. In contrast to exploratory or explanatory mixed methods designs, students work to the same timeline for data collection.

The key concern with teaming up for qualitative studies involves assuring the quality and originality of the two (or more) individual research studies within the common research process. Students must clearly articulate their own specific research questions prior to combining question schedules. Otherwise, they may end up conducting several interviews or focus groups around the broad topic while gathering very little material pertinent to their specific research question. Each student must also be sufficiently familiar with the others’ research question(s) to probe and follow up on participant responses relating to those questions (and vice versa). Students can decide a priori to analyse the interviews in their entirety, or alternatively, that only about half of each interview will be relevant for each dissertation. Even with training, there is potential for a high level of variability in interview skills and establishment of rapport, and any section on reflexivity will become increasingly complex with an increasing number of interviewers/researchers.

What are the drawbacks to team approaches?

Supervisors contemplating a team approach may be concerned about students’ ability to generate a truly independent project while working as part of a team. However, clarifying to students which activities can be shared (e.g., circulating relevant research papers), and which should not be done collaboratively (writing one’s individual results sections), can be helpful in supporting the retention of individual research integrity within a larger team project. Other strategies include balancing team meetings with some individual meetings focused on the student’s own specific research question and intellectual development and bringing the team together for particular activities (e.g., pilot testing in the lab), before dispersing for other activities.

In our experiences, students have valued both the informal peer support that comes from working in this way, and the tangible advantages of shared data collection (e.g., sharing efforts to recruit participants, generation of a larger sample size in the time allowed, and opportunities to gain teamwork skills); these benefits have also been highlighted by others (e.g., Dautel, 2020). Moreover, if social loafing is a concern, this may be mitigated a priori by clear discussions and agreement on roles and responsibilities, potentially supported by the CRediT taxonomy, and/or by requiring individuals to collect a certain proportion of data to access the larger shared dataset. Students value opportunities for one-on-one discussions with their supervisor, so a combination of team and individual meetings is possibly most beneficial (Dautel, 2020). There may also be instances where students have very legitimate concerns about working together; in these cases, individual projects may be more appropriate.

Besides this, the social and emotional aspects of learning and of the dissertation are also important to consider. Students can gain a sense of pride having completed a dissertation that they may not feel to the same degree working on a team-based project; this could be addressed by highlighting individual achievements within the team as well as team-work overall. Students progressing from a team-based dissertation to a traditional student-supervisor project for masters or PhD research may be vulnerable to “impostor syndrome” or find the transition disconcerting. Therefore, scaffolding transitions from team to traditional projects (as you would for transitions from traditional to team projects) may be necessary.

Having described the efficient use of data (via secondary/meta-data and evidence synthesis) and team science approaches, we move to our third recommendation: promote openness and transparency.

There are several ways in which openness and transparency can be promoted; including pre-registration, open data, code, and/or materials, reporting contributions (e.g., using the Contributor Roles Taxonomy [CRediT]) and planning for dissemination of study findings (e.g., via conference presentation and/or journal submission). We discuss pre-registration and open data practices in further detail below.

Pre-registration

Full pre-registration with detailed analytic specificity is not appropriate for all research designs and analytic approaches (and indeed there is ongoing debate as to its efficacy and purpose for any research; see, e.g., Nosek et al. (2019) vs. Szollosi et al. (2019)). However, incorporating the development of a study protocol, lighter-touch pre-registration of hypotheses for confirmatory work, or the explicit registration that the work is exploratory/intended solely as a learning experience, is entirely feasible. Pre-registration (or a similar a priori plan) can also be a tool to support transparency in qualitative research. In terms of benefits for research, pre-registration can help promote transparent ways of working and protect against the increased risk of publication bias in the wider literature (Pownall, 2020). From a learning perspective, working through a detailed plan prior to data collection is likely to lead to clearer thinking, better research questions, and higher quality dissertations. Indeed, van’t Veer and Giner-Sorolla (2016) note that a focus on theory testing and/or methodological replication and validation over results is likely to benefit researchers at an earlier stage of their career, specifically students and post-doctoral researchers. Further, the importance of transparency has long been recognised in qualitative constructs such as reflexivity: the process of continual internal dialogue and critical self-evaluation of a researcher’s positionality, together with active acknowledgement that this position may affect the research process and outcome (Berger, 2015). Students doing qualitative or mixed methods research can also include reflexivity and/or positionality statements in their theses to enhance transparency.

Open data and code

Similarly, students can consider making their data “open” in line with FAIR data principles (Wilkinson et al., 2016). Although there is debate about the value of open data initiatives (e.g., Kitchin, 2013), and it is highly challenging to truly anonymise data (e.g., Rocher et al., 2019), making a conscious decision about whether to make data open (or not, particularly for qualitative research) is an important element of student training. Unless the thesis will be published, the benefits of open data and code are primarily for student learning. For quantitative studies, students can be asked to provide syntax files so that the results reported in the dissertation can be reproduced. This exercise will better prepare students for subsequent research projects, given the increasing emphasis on open data and on data management in general, and it supports students’ awareness of the ethical use of data. For qualitative studies, there is considerable debate about the relevance of open data guidelines (see Branney et al., 2019; Prosser et al., 2023, for discussions of this issue). Nonetheless, even when data are not open, generating a data availability statement to accompany the dissertation can facilitate students’ learning, given that such statements are commonly required by journals regardless of whether the research is quantitative or qualitative.

Alongside promoting openness and transparency, our final recommendation is to explicitly raise awareness of and avoid incentivising QRPs. The literature indicates that QRPs exist in psychology, that some students engage in some QRPs (Krishna & Peter, 2018), that students learn about QRPs from supervisors (Krishna & Peter, 2018), and that early-career researchers can be disheartened when their project findings do not replicate existing published work (potentially owing to QRPs in that original work; Nelson et al., 2022). To be clear, we do not want to overemphasize the impact of QRPs, nor create a descriptive norm that QRPs are common (Fiedler & Schwarz, 2016). However, given QRPs are documented as problematic, it falls to the dissertation supervisor to model appropriate research practices and to provide specific training as needed both to promote transparency (as noted in recommendation #3) and to avoid QRPs.

The benefits of avoiding QRPs for research are perhaps obvious: published research arising from undergraduate theses is more likely to be of high quality.

In terms of benefits for learning, supporting students to be aware of and to avoid QRPs develops research integrity and ethical awareness in these students. Supervisors can achieve this by modelling best practice themselves and by promoting openness and transparency as outlined above. Supervisors can also explicitly emphasize the methods used over the results those methods generate, engage students in critical discussion of prior literature, guide students in adhering to their pre-registration (or in explaining deviations from it), and emphasize caution in interpreting statistically significant findings. Otherwise, given that the literature available to students disproportionately favours statistically significant results, students may be disappointed with null or counter-intuitive findings arising from their own projects (e.g., Nelson et al., 2022).

In summary, the above recommendations are intended to increase both the quality of undergraduate research and the quality of the student learning experience during the undergraduate dissertation. By reflecting on our collective supervision experiences across three institutions, we hope to illuminate aspects of supervision practice that typically remain private or inaccessible to early-career academics embarking on supervision for the first time. Given the many approaches to dissertation supervision, what does supervision that adopts these recommendations look like? At their core, our recommendations are not revolutionary: careful consideration of the need to collect data and of the merits and challenges of using existing data; planning early, with an interim deadline for a written plan (using a pre-registration template if preferred) that includes plans for dissemination. Implementing these recommendations also involves incorporating and scaffolding some element of collaboration (e.g., by formally sharing data collection, or discussing plans with peers), explicit consideration of transparent and open practices (even if the decision is not to share data), and a focus on methodological rigour and appropriate interpretation of results, supported by guidance from the supervisory team. This paper, combining our supervision experiences with the available relevant literature, aims to provide researchers and educators with pragmatic ways to support learning outcomes as well as the overall quality of undergraduate research. Given the increasing emphasis on open science practices and the increasing popularity of team approaches, formal evaluation of the impact of these approaches on student learning is an important next step in this area. In addition, further elaboration of emerging forms of undergraduate dissertation, such as meta-research projects, is also required to ensure that early-career supervisors are well-equipped to support their dissertation students.

Contributed to conception and design: AMC, KB, HCW, EN

Contributed to analysis of literature: AMC, KB, HCW, EN

Drafted and revised the article: AMC, KB, HCW, EN

Approved the submitted version for publication: AMC, KB, HCW, EN

The authors report no conflicts of interest.

We are very grateful to Professor Neil Coulson (University of Nottingham) for his helpful contribution to and feedback on an earlier draft of this manuscript and to Dr Peter Branney (Bradford University) for feedback also.

We are also grateful to the National Forum for the Enhancement of Teaching and Learning in Higher Education, which supported a seminar on this topic as part of the 2020/2021 seminar series (recording here: https://www.youtube.com/watch?v=7iBVt2ZqPCo).

Ahern, E., & Semkovska, M. (2017). Cognitive functioning in the first-episode of major depressive disorder: A systematic review and meta-analysis. Neuropsychology, 31(1), 52–72. https://doi.org/10.1037/neu0000319
Ahmed, W., Bath, P. A., & Demartini, G. (2017). Using Twitter as a data source: An overview of ethical, legal, and methodological challenges. The Ethics of Online Research, 79–107. https://doi.org/10.1108/s2398-601820180000002004
Attard, A., & Coulson, N. S. (2012). A thematic analysis of patient communication in Parkinson’s disease online support group discussion forums. Computers in Human Behavior, 28(2), 500–506. https://doi.org/10.1016/j.chb.2011.10.022
Berger, R. (2015). Now I see it, now I don’t: researcher’s position and reflexivity in qualitative research. Qualitative Research, 15(2), 219–234. https://doi.org/10.1177/1468794112468475
Boysen, G. A., Sawhney, M., Naufel, K. Z., Wood, S., Flora, K., Hill, J. C., & Scisco, J. L. (2020). Mentorship of undergraduate research experiences: Best practices, learning goals, and an assessment rubric. Scholarship of Teaching and Learning in Psychology, 6(3), 212–224. https://doi.org/10.1037/stl0000219
Branney, P., Reid, K., Frost, N., Coan, S., Mathieson, A., & Woolhouse, M. (2019). A context-consent meta-framework for designing open (qualitative) data studies. Qualitative Research in Psychology, 16(3), 483–502. https://doi.org/10.1080/14780887.2019.1605477
British Psychological Society. (2017). Ethics Guidelines for Internet-mediated Research. https://www.bps.org.uk/sites/www.bps.org.uk/files/Policy/Policy%20-%20Files/Ethics%20Guidelines%20for%20Internet-mediated%20Research%20%282017%29.pdf
British Psychological Society [BPS]. (2019). Standards for the accreditation of undergraduate, conversion and integrated Masters programmes in psychology. https://www.bps.org.uk/sites/bps.org.uk/files/Accreditation/Undergraduate%20Accreditation%20Handbook%202019.pdf
Buchanan, E. (2017). Considering the ethics of big data research: A case of Twitter and ISIS/ISIL. PloS One, 12(12), e0187155. https://doi.org/10.1371/journal.pone.0187155
Clark, V. L. P. (2019). Meaningful integration within mixed methods studies: Identifying why, what, when, and how. Contemporary Educational Psychology, 57, 106–111. https://doi.org/10.1016/j.cedpsych.2019.01.007
Clarke, B., Schiavone, S., & Vazire, S. (2023). What limitations are reported in short articles in social and personality psychology? Journal of Personality and Social Psychology, 125(4), 874–901. https://doi.org/10.1037/pspp0000458
Coyle, D. K. T., Howard, S., Bibbey, A., Gallagher, S., Whittaker, A. C., & Creaven, A.-M. (2020). Personality, cardiovascular, and cortisol reactions to acute psychological stress in the Midlife in the United States (MIDUS) study. International Journal of Psychophysiology, 148, 67–74. https://doi.org/10.1016/j.ijpsycho.2019.11.014
Creswell, J. W., & Clark, V. L. P. (2017). Designing and conducting mixed methods research. SAGE.
Dautel, J. B. (2020). Applying a Collective Academic Supervision Model to the undergraduate dissertation. Psychology Teaching Review, 26(1), 18–26. https://doi.org/10.53841/bpsptr.2020.26.1.18
Derounian, J. (2011). Shall we dance? The importance of staff-student relationships to undergraduate dissertation preparation. Active Learning in Higher Education, 12(2), 91–100. https://doi.org/10.1177/1469787411402437
Detweiler-Bedell, B., & Detweiler-Bedell, J. B. (2019). Undergraduate research teams that build bridges, produce publishable research, and strengthen grant proposals. Frontiers in Psychology, 10, 133. https://doi.org/10.3389/fpsyg.2019.00133
Erzberger, C., & Prein, G. (1997). Triangulation: Validity and empirically-based hypothesis construction. Quality and Quantity, 31(2), 141–154. https://doi.org/10.1023/a:1004249313062
Fiedler, K., & Schwarz, N. (2016). Questionable research practices revisited. Social Psychological and Personality Science, 7(1), 45–52. https://doi.org/10.1177/1948550615612150
Freeman, L., Brooks, J., Crowley, C., Elmi-Glennan, C., Gordon-Finlayson, A., McDermott, H., & Seymour-Smith, S. (2020). Beyond the Comfort Zone: A Guide to Supervising Qualitative Undergraduate Psychology Dissertations for Quantitative Researchers. Psychology Teaching Review, 26(1), 39–47. https://doi.org/10.53841/bpsptr.2020.26.1.39
Giuliano, T., Skorinko, J. L. M., & Fallon, M. (2019). Editorial: Engaging Undergraduates in Publishable Research: Best Practices. Frontiers in Psychology, 10, 1878. https://doi.org/10.3389/fpsyg.2019.01878
Granger, J., Branney, P., Sullivan, P., & McDermott, S. (2021, August 26). Ethical considerations in post-GDPR social media based research. https://doi.org/10.31234/osf.io/6fsw7
Jahan, N., Naveed, S., Zeshan, M., & Tahir, M. A. (2016). How to conduct a systematic review: A narrative literature review. Cureus, 8(11), e864. https://doi.org/10.7759/cureus.864
Janz, N. (2016). Bringing the gold standard into the classroom: Replication in university teaching. International Studies Perspectives, 17(4), 392–407. https://doi.org/10.1111/insp.12104
John, L. K., Loewenstein, G., & Prelec, D. (2012). Measuring the prevalence of questionable research practices with incentives for truth telling. Psychological Science, 23(5), 524–532. https://doi.org/10.1177/0956797611430953
Johnson, R. B., Onwuegbuzie, A. J., & Turner, L. A. (2007). Toward a definition of mixed methods research. Journal of Mixed Methods Research, 1(2), 112–133. https://doi.org/10.1177/1558689806298224
Kitchin, R. (2013). Four critiques of open data initiatives. https://blogs.lse.ac.uk/impactofsocialsciences/2013/11/27/four-critiques-of-open-data-initiatives/#author
Krishna, A., & Peter, S. M. (2018). Questionable research practices in student final theses – Prevalence, attitudes, and the role of the supervisor’s perceived attitudes. PLoS ONE, 13(8), e0203470. https://doi.org/10.1371/journal.pone.0203470
Lee, R. S., Hermens, D. F., Porter, M. A., & Redoblado-Hodge, M. A. (2012). A meta-analysis of cognitive deficits in first-episode major depressive disorder. Journal of Affective Disorders, 140(2), 113–124. https://doi.org/10.1016/j.jad.2011.10.023
Lloyd, S. A., Shanks, R. A., & Lopatto, D. (2019). Perceived Student Benefits of an Undergraduate Physiological Psychology Laboratory Course. Teaching of Psychology, 46(3), 215–222. https://doi.org/10.1177/0098628319853935
Lubega, N., Anderson, A., & Nelson, N. C. (2023). Experience of irreproducibility as a risk factor for poor mental health in biomedical science doctoral students: A survey and interview-based study. PLoS ONE, 18(11), e0293584. https://doi.org/10.1371/journal.pone.0293584
Mickley Steinmetz, K. R., & Reid, A. K. (2019). Providing Outstanding Undergraduate Research Experiences and Sustainable Faculty Development in Load. Frontiers in Psychology, 10. https://doi.org/10.3389/fpsyg.2019.00196
Moshontz, H., Campbell, L., Ebersole, C. R., IJzerman, H., Urry, H. L., Forscher, P. S., & Chartier, C. R. (2018). The Psychological Science Accelerator: Advancing psychology through a distributed collaborative network. Advances in Methods and Practices in Psychological Science, 1(4), 501–515. https://doi.org/10.1177/2515245918797607
Nosek, B. A., Beck, E. D., Campbell, L., Flake, J. K., Hardwicke, T. E., Mellor, D. T., van ’t Veer, A. E., & Vazire, S. (2019). Preregistration is hard, and worthwhile. Trends in Cognitive Sciences, 23(10), 815–818. https://doi.org/10.1016/j.tics.2019.07.009
Noyes, J., Booth, A., Cargo, M., Flemming, K., Harden, A., Harris, J., Garside, R., Hannes, K., Pantoja, T., & Thomas, J. (2019). Qualitative evidence. In J. P. T. Higgins, J. Thomas, J. Chandler, M. Cumpston, T. Li, M. J. Page, & V. A. Welch (Eds.), Cochrane Handbook for Systematic Reviews of Interventions (pp. 525–545). Wiley. https://doi.org/10.1002/9781119536604.ch21
Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., … Moher, D. (2021). The PRISMA 2020 statement: an updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71
Perlman, B., & McCann, L. I. (2005). Undergraduate research experiences in psychology: A national study of courses and curricula. Teaching of Psychology, 32(1), 5–14. https://doi.org/10.1207/s15328023top3201_2
Pownall, M. (2020). Pre-registration in the undergraduate dissertation: A critical discussion. Psychology Teaching Review, 26(1), 71–76. https://doi.org/10.53841/bpsptr.2020.26.1.71
Prosser, A. M. B., Hamshaw, R. J. T., Meyer, J., Bagnall, R., Blackwood, L., Huysamen, M., Jordan, A., Vasileiou, K., & Walter, Z. (2023). When open data closes the door: A critical examination of the past, present and the potential future for open data guidelines in journals. British Journal of Social Psychology, 62(4), 1635–1653. https://doi.org/10.1111/bjso.12576
Psychological Society of Ireland [PSI]. (2019). Undergraduate Accreditation Guidelines. https://www.psychologicalsociety.ie/accreditation/PSI-Accredited-Undergraduate-Courses-3
PsyTeachR. (n.d.). PsyTeachR. Retrieved April 26, 2021, from https://psyteachr.github.io/
Rocher, L., Hendrickx, J. M., & de Montjoye, Y.-A. (2019). Estimating the success of re-identifications in incomplete datasets using generative models. Nature Communications, 10(1), 3069. https://doi.org/10.1038/s41467-019-10933-3
Sugiura, L., Wiles, R., & Pope, C. (2017). Ethical challenges in online research: Public/private perceptions. Research Ethics, 13(3–4), 184–199. https://doi.org/10.1177/1747016116650720
Szollosi, A., Kellen, D., Navarro, D. J., Shiffrin, R., van Rooij, I., Van Zandt, T., & Donkin, C. (2019). Is preregistration worthwhile? Trends in Cognitive Sciences, 24(2), 94–95. https://doi.org/10.1016/j.tics.2019.11.009
Todd, M. J., Smith, K., & Bannister, P. (2006). Supervising a social science undergraduate dissertation: staff experiences and perceptions. Teaching in Higher Education, 11(2), 161–173. https://doi.org/10.1080/13562510500527693
van ’t Veer, A. E., & Giner-Sorolla, R. (2016). Pre-registration in social psychology—A discussion and suggested template. Journal of Experimental Social Psychology, 67, 2–12. https://doi.org/10.1016/j.jesp.2016.03.004
Wagge, J. R., Brandt, M. J., Lazarevic, L. B., Legate, N., Christopherson, C., Wiggins, B., & Grahe, J. E. (2019). Publishing research with undergraduate students via replication work: The Collaborative Replications and Education Project. Frontiers in Psychology, 10, 247. https://doi.org/10.3389/fpsyg.2019.00247
Wiggins, S., Gordon-Finlayson, A., Becker, S., & Sullivan, C. (2016). Qualitative undergraduate project supervision in psychology: current practices and support needs of supervisors across North East England and Scotland. Qualitative Research in Psychology, 13(1), 1–19. https://doi.org/10.1080/14780887.2015.1075641
Wilkinson, M. D., Dumontier, M., Aalbersberg, Ij. J., Appleton, G., Axton, M., Baak, A., Blomberg, N., Boiten, J.-W., da Silva Santos, L. B., Bourne, P. E., Bouwman, J., Brookes, A. J., Clark, T., Crosas, M., Dillo, I., Dumon, O., Edmunds, S., Evelo, C. T., Finkers, R., … Mons, B. (2016). The FAIR Guiding Principles for scientific data management and stewardship. Scientific Data, 3(1), 160018. https://doi.org/10.1038/sdata.2016.18
Williams, M. L., Burnap, P., & Sloan, L. (2017). Towards an ethical framework for publishing Twitter data in social research: Taking into account users’ views, online context and algorithmic estimation. Sociology, 51(6), 1149–1168. https://doi.org/10.1177/0038038517708140
Williams, S. (2019). Postgraduate Research Experience Survey. Advance HE. https://www.wlv.ac.uk/media/departments/research/documents/AdvanceHE-Postgraduate-Research-Experience-Survey-Report-2019.pdf
This is an open access article distributed under the terms of the Creative Commons Attribution License (4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original author and source are credited.
