An introductory cell biology laboratory course was redesigned using two inquiry-based modules to align with the goals of scientific inquiry as described in Vision and Change. To evaluate the lab's efficacy, we used a broad range of assessment measures, including pretests and posttests, online surveys, focus group interviews, and course evaluations. Although our students made significant learning gains in technical laboratory skills, methods, and data analysis, their affective dispositions toward the experience became more negative during the first two years. By evaluating our class data in light of insights from the scholarship of teaching and learning, we were able to provide better guidance to students, persist past this implementation dip, sustain positive gains in student learning outcomes, and eliminate the negative impact on student affective outcomes. Our experience underscores the value of scientific teaching, using class data and evidence-based practices to persist beyond the implementation dips that come with adopting new curricula.

Introduction

Improving scientific literacy is a national priority for undergraduate education. Foundational to informed citizenship in a democratic society, scientific literacy includes core competencies highlighted by Vision and Change (AAAS, 2011) such as proficiency in using scientific processes, quantitative reasoning, and the ability to tap into the interdisciplinary nature of science. These competencies are particularly crucial with the current generation of STEM students pursuing vocations that address climate change, genome editing, chronic diseases, pollution, and other complex issues in contemporary society.

An introductory laboratory class, in particular, is well suited to instilling these core competencies in students. The laboratory is a natural setting for teaching scientific processes: experiments can be given an inquiry-based design that mimics the authentic experience of scientists (Freeman et al., 2014). With the right choice of focus, quantitative reasoning can be an integral part of the experience (Baumgartner et al., 2015). Connections to the interdisciplinary nature of science can be made by incorporating concepts and practices from related fields, such as other natural and social sciences, a need that is receiving renewed attention and emphasis (Garlick & Levine, 2016).

Many of us, however, are also familiar with the implementation challenges associated with instilling core competencies, even in laboratory settings. If the design is inquiry-based and requires student self-direction, students can quickly become overwhelmed and frustrated by the lack of direction (Meyer et al., 2008). Affective outcomes can suffer with an inquiry-based design (Gormally et al., 2009), and if students do not “like” the laboratory experience, that negative affect can undermine the overall effectiveness of their learning (Trujillo & Tanner, 2014). Similarly, attempts to make connections to quantitative reasoning and other sciences can fall flat, as students often fail to transfer learning from one setting to another (Mestre, 2005). These and other challenges can result in an implementation dip (Fullan, 2007). Research shows that instructors adopting evidence-based pedagogical changes should expect the first years of a new curriculum to be unpopular with students and to fall short of the learning gains promised by the research literature (Fullan, 2007). This can undermine the success of such revisions to the point where they are deemed counterproductive and abandoned (Felder & Brent, 1996), rather than revised to be more effective.

Here, we will describe new inquiry-based modules for an introductory biology course for majors. We will demonstrate how the laboratory activities are aligned with the goals of scientific literacy as described in Vision and Change. We will also show, through our assessment data, how this course was subject to an implementation dip. We will further describe how, by learning from our assessment data and revising the course, we were able to persist past the implementation dip, show net positive gains in student learning outcomes, and eliminate the negative impact on student affective outcomes. We hope that this article will provide the details of an implementation-ready laboratory investigation as well as an encouraging narrative of how to persist beyond the implementation dips that come with adopting new curricula.

Taken in the second semester of the first year, the Cellular and Genetic Systems lab uses investigatory modules to teach essential laboratory skills, prevailing methods, and core competencies. Recognizing that the typical flow of a research project – reading literature, designing and conducting experiments, analyzing data – parallels the learning cycle (Lawson, 1995), we surmised that modules aligned with this flow would promote effective learning. We recognized a growing body of literature – accessible to first-year students – showing that cooking affects the nutritional properties of foods (e.g., López-Berenguer et al., 2007). To take advantage of innate student interest in food, nutrition, and health, we selected the preparation of vegetables, specifically broccoli, for consumption as the research context for these modules. We believed this would provide a platform for achieving significant engagement with our target core competencies: scientific process, quantitative reasoning, and interdisciplinary science skills. The literature forms the basis for designing interesting experiments. Affordable, durable instrumentation and easy-to-use methodologies, even for students with little or no prior experience, enable a focus on quantitative skills, data standardization, and interpretation based on descriptive statistics and graphs. Inherent interdisciplinary connections to chemistry, statistics, and social issues (including food systems and health policy) facilitate integrative skills.

Materials

For a list of instruments and materials needed each week, please consult the Instructor's Manual provided in the online Supplemental Material.

Methods

This laboratory course comprises two multiweek modules, each centered around a research question pertaining to the production of isothiocyanates, nutraceutical compounds produced from glucosinolates in a reaction catalyzed by the enzyme myrosinase (Vermeulen et al., 2008). Initial weeks introduce relevant literature and essential laboratory skills. Next, students learn key experimental methods by performing them. Finally, student teams formulate a focused hypothesis (based on the literature), design and conduct an experiment to test this hypothesis, and then standardize their data and perform descriptive statistical analyses. (Note: a course in statistical analysis is not a prerequisite to our course.)
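
To illustrate the kind of descriptive analysis student teams perform at this stage, the following is a minimal sketch in Python, using entirely hypothetical values rather than class data, of computing the mean and standard error of standardized enzyme activities for two cooking treatments.

```python
# Minimal sketch of the descriptive statistics student teams compute:
# mean and standard error of standardized activities for each treatment.
# All values are hypothetical and for illustration only.
import math
import statistics

activity_units_per_g = {              # myrosinase activity per gram of broccoli
    "raw": [12.1, 11.4, 13.0, 12.6],
    "steamed 5 min": [7.8, 8.4, 6.9, 7.5],
}

for treatment, values in activity_units_per_g.items():
    mean = statistics.mean(values)
    sem = statistics.stdev(values) / math.sqrt(len(values))  # standard error of the mean
    print(f"{treatment}: {mean:.1f} ± {sem:.1f} units/g (mean ± SE, n = {len(values)})")
```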

The following is an outline of our two modules. Please see the Instructor's Manual in the Supplemental Material for a more descriptive summary of each week's activities.

Module 1: How do food preparation parameters affect nutraceutical properties of broccoli?

  • Weeks 1–3: Explore literature and master essential skills

    • Conducting literature searches, reading scientific papers, microliter pipetting, preparing dilutions, using spectrophotometers, graphing

  • Weeks 4 and 6: Learn methods

    • Protein extraction and Bradford assay, myrosinase (glucose production) assay

  • Weeks 5 and 7: Design and conduct experiments to test hypotheses; analyze data

    • Measure protein content and myrosinase activity of broccoli cooked using various methods, standardize data, conduct descriptive statistical analyses, formulate conclusions based on data interpretation

Module 2: How do isothiocyanates (ITCs) affect the proliferation of human (Jurkat) cancer cells?

  • Weeks 1–2: Explore literature and master essential skills

    • Conducting literature searches, reading scientific papers, using microscopes, using sterile technique

  • Weeks 2–4: Learn methods

    • Human cell culture, cell counting using a hemocytometer, microplate-based cell viability assay, DNA extraction, gel electrophoresis

  • Weeks 2 and 4: Design and conduct experiments to test hypotheses; analyze data

    • Measure cell death and DNA fragmentation to assess the effect of ITCs on apoptosis, standardize data, conduct descriptive statistical analyses, formulate conclusions based on data interpretation

Methods: Assessment Strategy

Our assessment of the revised curriculum had several components. In the first week of the course, students completed a pre-assessment consisting of open-ended questions that involved quantitative reasoning skills (such as calculating dilutions) that are taught in a prerequisite chemistry course, and graphical interpretation. At midterm, we administered a lab practical exam based on Module 1 and, at the end of the course, a comprehensive final exam that included items that matched the pretest as well as other items assessing mastery of core competencies and essential skills, familiarity with literature, understanding of methodologies, and mastery of data standardization, graphing, and interpretation skills. We directly measured student learning gains by comparing the mean scores on questions presented to students on the pre-assessment to the mean scores on matched questions (see  Appendix B) presented on a midterm and a final exam (Table 1).

Table 1.
Student learning gains as measured by comparing performance on pretests and posttests in 2013–2014. Pretests were administered at the beginning of the semester, before students had been exposed to the lab activities. Posttests were conducted at midterm and during the final lab. Normalized gain (ḡ) was calculated by averaging individual gains (Bao, 2006). See  Appendix B for question details.
Learning Outcome | Pretest (% correct) | Posttest (% correct) | Pa | ḡ
Scientific literature search | 59.0% | 87.0% | 0.000b | 77.0%
Simple dilutions | 14.7% | 68.3% | 0.000b | 64.3%
Serial dilutions | 3.3% | 64.7% | 0.000b | 70.1%
Using a standard curve | 26.3% | 77.8% | 0.000b | 36.7%
Quantitative transfer with micropipette | 2.0% | 85.0% | 0.000c | 85.5%
Test total | 19.9% | 72.7% | 0.000d | 70.6%

aα = 0.05 adjusted with a Bonferroni correction.

bWilcoxon signed-rank test.

cSign test.

dPaired-sample t-test.
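
For readers who wish to reproduce this style of analysis, the following is a minimal sketch in Python, using hypothetical scores rather than our grading data, of averaging individual normalized gains (Bao, 2006) and comparing matched pretest and posttest scores with a Wilcoxon signed-rank test.

```python
# Sketch of the Table 1 statistics with hypothetical matched scores:
# average of individual normalized gains (Bao, 2006) and a Wilcoxon
# signed-rank test on the paired pretest/posttest scores.
import numpy as np
from scipy import stats

pre = np.array([10.0, 0.0, 25.0, 20.0, 5.0, 15.0])      # pretest % correct (hypothetical)
post = np.array([60.0, 75.0, 80.0, 70.0, 65.0, 85.0])   # matched posttest % correct

gains = (post - pre) / (100.0 - pre)   # each student's normalized gain
g_bar = gains.mean()                   # averaged individual gains (ḡ)

statistic, p_value = stats.wilcoxon(pre, post)  # nonparametric paired comparison

print(f"normalized gain = {g_bar:.1%}, Wilcoxon signed-rank p = {p_value:.3f}")
```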

We also assessed affective outcomes, another key indicator of the efficacy of our reforms. Students' overall attitude toward the lab was measured with campus-wide course evaluations ( Appendix C) administered at the end of the semester. To measure students' self-perception of learning, we administered customized Student Assessment of Learning Gains (SALG) surveys ( Appendix A) to the control (2012) and treatment (2013–16) groups (Figure 1). Finally, we used focus group interviews at the end of the first two treatment years (2013 and 2014) to solicit specific suggestions for improvement from students (Table 2).

Figure 1.

Students' perceptions of their core competency gains. Four SALG survey questions were used to assess perceived gains, by way of a Likert-like scale ranging from “no gains” (1) to “great gains” (5), with respect to scientific process, basic laboratory skills, analytical and quantitative skills, and interdisciplinary integration (see  Appendix A for question details). Sample sizes varied by year: 86 (2012), 93 (2013), 63 (2014), 71 (2015), and 58 (2016). Bars represent means ± SE. Data from 2012 serve as the pre-implementation controls; cohorts from 2013–16 are independent treatments reflecting ongoing adjustments to the course.


Table 2.
Feedback from student focus groups. Interviews were conducted in one section of the course by a non-biology member of the faculty with expertise in student assessment. After compiling a larger set of responses to each question, each class voted on three that best reflected their consensus.
What aspects of the lab helped you learn?

  Top three responses, 2013:
  • Instructor and TA were helpful during lab
  • Learning in a hands-on way
  • Practiced procedures before we had to use them

  Top three responses, 2014:
  • Learning practical techniques
  • Lab manual clarity
  • Group work

What aspects of the lab hindered your learning?

  Top three responses, 2013:
  • Class was too much work, overwhelming
  • Provide more explanation concerning procedures
  • Instructor feedback was inconsistent

  Top three responses, 2014:
  • Lab write-ups were too lengthy
  • Lab went over scheduled time
  • Uncertainty as to why a procedure would be done

What specific suggestions do you have for modifying the lab?

  Top three responses, 2013:
  • More class discussion
  • Better preparation for the midterm
  • Recap of last week's lab every week

  Top three responses, 2014:
  • Increased clarity in expectations
  • Increased clarity in lab manual
  • Connect lab to lecture

Results

Significant Learning Gains Realized

By comparing responses to identical questions on pre- and post-assessments (midterm and final exam), we directly measured learning gains in core competencies (Table 1). In all cases, normalized learning gains (ḡ) were significant. In particular, we examined science process skills such as ability to use the literature and perform lab techniques, quantitative reasoning as used in dilution calculations and with a standard curve, and interdisciplinary connections with chemistry and statistics. Most students entering this course have some knowledge of Internet search tools that are pertinent to scientific literature. Nevertheless, it was clear from the comparison of students' responses to the relevant pretest and posttest question (see  Appendix B) that they made learning gains that reflect a greater ability to distinguish primary from review articles and more attentiveness to the types of key words to use when performing literature searches. In the prerequisite General Chemistry course, students were taught to perform simple dilutions and use standard curves; yet only a minority of our students were able to explain these tasks on the pretest. Even so, through repeated performance of these tasks throughout the biology laboratory course, the majority of students demonstrated mastery of these crucial skills. Our assessments also revealed that few students came into the course with any experience in quantitative transfer with micropipettes or with serial dilution; indeed, neither of these was taught in our General Chemistry course. Despite this fact, the majority of our students mastered these skills. This is especially true of micropipetting, a technique introduced early and employed nearly every week; the normalized learning gain for this skill was the highest observed, at 85.5%.

Student Perceptions Reveal an Implementation Dip

Student perceptions of their mastery of core competencies, as determined by analyzing their responses on end-of-course SALG surveys (Figure 1), tended to decline in the first two years of implementation (2013 and 2014) compared to our control (2012). Students were asked to rate their improvement in four bioscience competencies on a scale from 1 to 5 (1 = no gains, 3 = moderate gains, 5 = great gains): mastering the scientific process of investigation; developing basic lab skills; developing analytical and quantitative skills; and understanding the connection between the concepts in biology, chemistry, and mathematics. Figure 1 reveals that for all competencies, except developing basic lab skills, mean perception scores declined in the first two years (2013 and 2014) following the curriculum revision in comparison to our pre-implementation control (2012). However, as improvements were made in subsequent years (2015 and 2016), there was a trend toward higher ratings in all of these areas. These ratings tended to mirror the kinds of comments voiced by students during the course, with frustrations most commonly expressed in 2013 and 2014.

To further illuminate the nature of our implementation dip (Fullan, 2007), we also scrutinized student ratings from end-of-semester course evaluations. Figure 2 reveals several informative patterns. First, students perceived that the intellectual challenge and effort required in our project-based modules had increased compared to more traditional labs (which these students had experienced the prior semester in General Chemistry). Because project-based experiences provide a richer learning environment, we expected students to find them more challenging. Second, these data revealed a mismatch between the significant learning gains made by students in 2013–14 (Table 1) and these same students' low perceptions of their learning gains (Figure 2). Finally, our results show a clear correlation among higher perceived levels of challenge and effort, lower confidence in the amount learned, and lower perceptions of the course and instructor. Student opinion of the course as a whole declined significantly from a mean of 3.8, on par with department averages for all courses, to means of 2.6 in 2013 and 2.5 in 2014 (Figure 2). Similarly, perceptions of the course's organization declined from a mean of 4.0 to 2.7–2.8. These declines spilled over into lower opinions of the instructor's teaching effectiveness, clarity of instruction, fairness, and (to some extent) even the instructor's enthusiasm. Together, these data led us to initiate a series of revisions aimed at recovering from this demoralizing implementation dip.

Figure 2.

Students' perceptions and affective outcomes. End-of-semester course evaluation questions were used to assess perceptions of this course in relation to other courses by way of a Likert scale ranging from “much lower than average/poor” (1) to “average” (3) to “much higher than average/excellent” (5) (see  Appendix C for question details). Data are from one instructor who taught one or two sections of the course each year. Sample sizes varied by year: 16 (2012), 38 (2013), 34 (2014), 35 (2015), and 16 (2016). Bars represent means ± SE. Data from 2012 serve as the pre-implementation controls; cohorts from 2013–16 are independent treatments reflecting ongoing adjustments to the course.


Scientific Teaching Aids Recovery

Scientific teaching entails the combined use of evidence-based pedagogies and course data to promote better learning (Handelsman et al., 2004). Our data reveal that even though our students were making significant learning gains (Table 1), their perceptions were declining (Figures 1 and 2). This is consistent with commonly held faculty concerns about student resistance as a barrier to reform efforts (Seidel & Tanner, 2013). Nevertheless, we took these as evidence of the need for corrective action. Qualitative data from open-ended SALG survey questions and focus group interviews revealed the top areas of student dissatisfaction (Table 2):

  • Lack of a clear connection between lecture and lab material

  • Lack of understanding of significance of learning goals

  • Expectation of more direction from faculty, less student self-direction (i.e., less inquiry)

  • Too much effort required to meet lab expectations

In response to these student concerns, we made several revisions to the curriculum. We modified lab manual introductions to more clearly explain how weekly activities pertain to the scientific research process and the driving research question. We added scaffolding, such as calculation prompts, to guide quantitative analysis (Hmelo-Silver et al., 2007). We added text boxes to highlight learning outcomes, critical-thinking tips, and vocational connections. We also added end-of-lab reflection questions to the lab manual. Collectively, these steps were correlated with marked improvements in student perceptions and affective outcomes in 2015 and 2016 (Figures 1 and 2). Importantly, desired learning outcomes and perceived intellectual challenge were sustained during the 2015 and 2016 iterations of the course; thus, it is unlikely that the improvement in affective outcomes can be attributed to any perception that the course had been made “easier.”

Implementation Notes

After our first year with the new course, the implementation dip guided revisions that we now see as key for other potential adopters to consider. The signature component of our implementation dip was the mismatch between student perceptions of their learning (Figures 1 and 2) and their actual learning gains (Table 1). This contributed to lower student self-efficacy, science identity, motivation, and negative attitudes (as is evident in written comments on SALG surveys, course evaluations, and focus group interviews; Table 2), all of which pertain to the affective domain (Trujillo & Tanner, 2014). This type of mismatch between perception and achievement is not uncommon when instructors adopt learner-centered pedagogies that alter the classroom dynamics (Van Sickle, 2016). Inquiry-based learning often requires the students to invest more time and effort, which the students may think could have been avoided if they were “told what to learn” (Loughran & Derry, 1997). We surmise that this contributed to our markedly lower scores for course organization and clarity of instruction in 2013 and 2014 (Figure 2). Indeed, providing prompts for performing a complex set of calculations for data standardization (involving unit conversions and adjusting for dilutions) – rather than guiding students to discover the required steps themselves by constructing a flowchart of the experimental manipulations (Figure 3) – contributed significantly to our implementation dip recovery in 2015 and 2016 (Figures 1 and 2).

Figure 3.

Type of instructional scaffolding influences learning outcomes. In 2014 three different instructors tested different types of instructional scaffolding in their sections of the course. One section (18 students) constructed experimental flowcharts to reveal quantitative manipulations. Another section (24 students) was given instructional prompts outlining the necessary steps in quantitative reasoning. Other sections (38 students) used both types of instructional scaffolding. One week later, a midterm exam assessed performance on an analogous quantitative problem (see  Appendix B for details). Scores for that problem are shown; bars represent means ± SE. One-way ANOVA revealed significant differences in performance as denoted by the asterisk (p = 0.006). Scheffe post hoc comparisons indicated that the mean score for the “flowchart only” section was significantly lower than the mean scores for the “prompts only” section (p = 0.011) and the “flowchart + prompts” sections (p = 0.036).

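For illustration, the sketch below (with made-up item scores, not our exam data) shows the kind of one-way ANOVA used to compare the three scaffolding conditions in Figure 3. The Scheffe post hoc comparisons reported above are not reproduced here because SciPy does not provide them; a dedicated statistics package is needed for that step.

```python
# Sketch of the section comparison in Figure 3: one-way ANOVA across three
# scaffolding conditions. Scores are hypothetical, not the actual exam data.
from scipy import stats

flowchart_only = [55, 60, 48, 52, 58, 50]        # midterm item scores (hypothetical)
prompts_only = [78, 85, 80, 74, 88, 82]
flowchart_and_prompts = [75, 70, 82, 79, 73, 77]

f_statistic, p_value = stats.f_oneway(flowchart_only, prompts_only, flowchart_and_prompts)
print(f"one-way ANOVA: F = {f_statistic:.2f}, p = {p_value:.4f}")
# Post hoc pairwise comparisons (Scheffe in the original analysis) would
# follow a significant ANOVA result.
```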

Between the first and second iterations of the course, we took steps to address the concern that the lab contained too much material and was overwhelming. We attended to this by reducing the number of modules from three to two (Table 3), thereby providing more time for in-class guidance when performing more complex calculations (quantifying protein levels and enzyme activities; see scaffolding in  Appendix D) and making appropriate inferences that take into account the variance in the data (see Instructor's Manual, p. 9, in Supplemental Material). After this modification, we noted that student affective dispositions stopped declining in 2014 (Figures 1 and 2). However, they did not improve significantly until we had taken additional pedagogical steps to provide sufficient guidance in the research process and data analysis stages. Our data demonstrate the necessity of providing direct instruction in teaching unfamiliar methodologies. For examples of this direct instruction, see teaching tips in the Instructor's Manual for procedures such as micropipetting (p. 8), gel electrophoresis (p. 29), microscopy (p. 24), the use and calibration of a spectrophotometer (p. 8), the creation and implementation of Beer's Law standard curves (p. 9), and performance of the Bradford assay (p. 13). As a result of making these pedagogical improvements, students indicated in SALG surveys that these lab skills were taught well and would be helpful in the future.

Table 3.
Adaptive evolution of our “Cellular and Genetic Systems” laboratory course. In prior versions of the course (e.g., 2012), students investigated isolated research questions pertaining to weekly lecture topics. The first implementation of the revised course in 2013 involved the coalescing of key methodologies and skills into multiweek projects, each focused on a general research question and culminating in a presentation and assessment exam. Adjustments were made in 2014–16 to promote mastery of core competencies while improving students' affective outcomes.
Week | 2012: Last Iteration of Old Lab Activities | 2013: First Implementation of Project-Based Lab Modules | 2014–16: Adjustments to Facilitate Desired Outcomes
1 | Learning strategies | Effects of cooking on broccoli proteins: literature skills | Effects of cooking on broccoli proteins: literature skills
2 | Exploring cell diversity: microscopy | Essential skills: pipetting, dilutions, spectroscopy | Essential skills: pipetting, spectroscopy, graphing
3 | Assessing protein content: dilutions, pipetting, spectroscopy, Bradford assay | Intro to protein extraction, Bradford assay; experimental proposal | Essential skills: simple & serial dilutions
4 | Assessing protein content: quantify protein in milk/juice, graphing | Cooking experiment; protein quantification | Intro to protein extraction, Bradford assay; essential skill: standard curves; experimental proposal
5 | Characterizing catalase catalytic properties | Myrosinase assay | Cooking experiment; protein quantification
6 | Investigating respiration & fermentation in yeast | ITC effects on bacteria: ampicillin dose-response; literature & experiment proposal | Myrosinase assay
7 | Investigating photosynthetic electron transport | ITC dose-response experiment; BacTiter Glo assay | Data analyses; ELN reports
8 | Modeling mitosis and meiosis | e-Poster presentations; midterm lab practical exam | Midterm lab practical exam
9 | Detecting GM foods: DNA extraction, PCR | Effects of ITCs on cancer cell death: essential skills (microscopy, hemocytometry); literature & experiment proposal | Effects of ITCs on cancer cell death: essential skills (microscopy, hemocytometry); literature & experiment proposal
10 | Detecting GM foods: gel electrophoresis | Essential skill: sterile tech; ITC dose-response experiment; CellTiter Glo assay | Essential skill: sterile tech; ITC dose-response experiment; CellTiter Glo assay
11 | Investigating mutant strains of E. coli | Jurkat cell DNA extraction; intro to electrophoresis | Jurkat cell DNA extraction; intro to electrophoresis
12 | Modeling Drosophila genetics | Jurkat DNA electrophoresis | Jurkat DNA electrophoresis; data analyses; ELN reports
13 | – | e-Poster presentations; final exam | Final exam

The way we developed scaffolding to assist with quantitative tasks was fortuitous in that it enabled us to measure its impact directly. Prior to its development, even our best students were often confused and frustrated during the data standardization and analysis stage of the process. This led one instructor, who taught the first section of the course, to abandon the existing strategy – guiding students to discover the required steps themselves by constructing a flowchart of the experimental manipulations – and instead prompt them through a series of steps in the calculations. The prompts (PowerPoint slides given in  Appendix D) then became a scaffolding, which the instructor shared with colleagues. One colleague implemented the switch; the other did not. Subsequently, in the analysis of a complex midterm question, we observed a marked difference between sections that was directly correlated with exposure to the scaffolding (Figure 3). To eliminate the possibility that this difference was due to a baseline difference in students' inherent abilities, pre-assessment mean scores were also compared with a one-way ANOVA and revealed no significant difference between the three sections (p = 0.798). These data are consistent with the idea that in inquiry-based learning, students require more guidance to successfully navigate complex data analysis problems, a process that requires higher-order cognitive skills encompassing the application of knowledge and critical thinking and involving deep conceptual understanding (Zoller, 1993).

Finally, in 2015 and 2016 the instructor who taught at least one section of this course each year also made a conscious effort to engage in more non-content dialogue with students (Seidel et al., 2015). This included building rapport with students and within student teams, sharing personal experiences in research, explaining pedagogical choices, and clarifying expectations. As a result, this instructor realized a full recovery from our implementation dip. Other instructors, who have not been consistently assigned to this course, have seen improvements since the implementation dip, but not a full recovery.

Conclusion

Our experiences underscore the value of evidence-based curriculum development and revision (i.e., scientific teaching; Handelsman et al., 2004). In addition to using evidence-based pedagogies, our use of assessment data to drive interventions in the learning process was crucial in our recovery from an implementation dip (Fullan, 2007). Our experiences also attest to an essential feature of guided inquiry: since different levels of guidance are needed at different stages, its success depends on high-quality student–faculty and student–student partnerships. Our experiences demonstrate the importance of persistence in scientific teaching practices in order to find the appropriate level of guidance. In terms of educational theory, our scaffolding reduces the extraneous cognitive load (challenges posed by instructional choices), making it easier for students to manage the intrinsic cognitive load (imposed by the complexity of the calculations) and resulting in a higher germane cognitive load that facilitates effective learning (Sweller, 1988).

Reading the literature about inquiry-based learning, it is easy to get the impression that “if you build it, they will come.” But perhaps the comparison between traditional didactic teaching and inquiry-based learning is more like dancing. Instead of insisting that the role of the instructor is to lead and the role of the students is to follow, the reality is that each has to constantly pay close attention to the other. For the dance to be beautiful, both have to practice working together. If one dancer tries to do it all, then there is no dance. “It takes two to tango.”

This project was funded by the National Science Foundation (TUES grant no. 1140767). We are indebted to our biology department colleagues for their contributions in the planning and implementation of the revised lab curricula. We also thank our Calvin College collaborators – Paul Moes and Randall Pruim – for assistance with statistical analyses.

References

AAAS (2011). Vision and Change in Undergraduate Biology Education: A Call to Action. Washington, DC: American Association for the Advancement of Science.
Ashcraft, M.H. (2002). Math anxiety: personal, educational, and cognitive consequences. Current Directions in Psychological Science, 11, 181–185.
Babikian, Y. (1971). An empirical investigation to determine the relative effectiveness of discovery, laboratory, and expository methods of teaching science concepts. Journal of Research in Science Teaching, 8, 201–209.
Bao, L. (2006). Theoretical comparisons of average normalized gain calculations. American Journal of Physics, 74, 917–922.
Baumgartner, E., Biga, L., Bledsoe, K., Dawson, J., Grammer, J., Howard, A. & Snyder, J. (2015). Exploring phytoplankton population growth to enhance quantitative literacy: putting Vision and Change into action. American Biology Teacher, 77, 265–272.
Blanchard, M.R., Southerland, S.A., Osborne, J.W., Sampson, V.D., Annetta, L.A. & Granger, E.M. (2010). Is inquiry possible in light of accountability? A quantitative comparison of the relative effectiveness of guided inquiry and verification laboratory instruction. Science Education, 94, 577–616.
Brownell, S.E., Kloser, M.J., Fukami, T. & Shavelson, R. (2012). Undergraduate biology lab courses: comparing the impact of traditionally based “cookbook” and authentic research-based courses on student lab experiences. Journal of College Science Teaching, 41, 18–27.
Chandler, P. & Sweller, J. (1991). Cognitive load theory and the format of instruction. Cognition and Instruction, 8, 293–332.
de Jong, T. (2010). Cognitive load theory, educational research, and instructional design: some food for thought. Instructional Science, 38, 105–134.
Felder, R.M. & Brent, R. (1996). Navigating the bumpy road to student-centered instruction. College Teaching, 44, 43–47.
Freeman, S., Eddy, S.L., McDonough, M., Smith, M.K., Okoroafor, N., Jordt, H. & Wenderoth, M.P. (2014). Active learning increases student performance in science, engineering, and mathematics. Proceedings of the National Academy of Sciences USA, 111, 8410–8415.
Fullan, M. (2007). The New Meaning of Educational Change. New York, NY: Routledge.
Garlick, J.A. & Levine, P. (2016). Where civics meets science: building science for the public good through civic science. Oral Diseases, 23, 692–696.
Goldey, E.S., Abercrombie, C.L., Ivy, T.M., Kusher, D.I., Moeller, J.F., Rayner, D.A., et al. (2012). Biological inquiry: a new course and assessment plan in response to the call to transform undergraduate biology. CBE–Life Sciences Education, 11, 353–363.
Gormally, C., Brickman, P., Armstrong, N. & Hallar, B. (2009). Effects of inquiry-based learning on students' science literacy skills and confidence. International Journal of Scholarship of Teaching and Learning, 3(2).
Hake, R.R. (1998). Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. American Journal of Physics, 66, 64–74.
Handelsman, J., Ebert-May, D., Beichner, R., Bruns, P., Chang, A., DeHaan, R., et al. (2004). Scientific teaching. Science, 304, 521–522.
Hmelo-Silver, C.E., Duncan, R.G. & Chinn, C.A. (2007). Scaffolding and achievement in problem-based and inquiry learning: a response to Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42, 99–107.
Huang, S.-H., Hsu, M.-H., Hsu, S.-C., Yang, J.-S., Huang, W.-W., Huang, A.-C., et al. (2014). Phenethyl isothiocyanate triggers apoptosis in human malignant melanoma A375.S2 cells through reactive oxygen species and the mitochondria-dependent pathways. Human & Experimental Toxicology, 33, 270–283.
Hwang, J.H. & Lim, S.B. (2015). Antioxidant and anticancer activities of broccoli by-products from different cultivars and maturity stages at harvest. Preventive Nutrition and Food Science, 20, 8–14.
Jensen, J.L. & Lawson, A. (2011). Effects of collaborative group composition and inquiry instruction on reasoning gains and achievement in undergraduate biology. CBE–Life Sciences Education, 10, 64–73.
Kirschner, P.A., Sweller, J. & Clark, R.E. (2006). Why minimal guidance during instruction does not work: an analysis of the failure of constructivist, discovery, problem-based, experiential, and inquiry-based teaching. Educational Psychologist, 41, 75–86.
Koetje, D. & Wilstermann, A. (2011). Living Systems: Global Concepts, Local Connections – a SENCER model course. http://ncsce.net/living-systems-global-concepts-local-connections/.
Kuhn, D. (2007). Is direct instruction an answer to the right question? Educational Psychologist, 42, 109–113.
Lawson, A.E. (1995). Science Teaching and the Development of Thinking. Belmont, CA: Wadsworth.
López-Berenguer, C., Carvajal, M., Moreno, D.A. & García-Viguera, C. (2007). Effects of microwave cooking conditions on bioactive compounds present in broccoli inflorescences. Journal of Agricultural and Food Chemistry, 55, 10001–10007.
Loughran, J. & Derry, N. (1997). Researching teaching for understanding: the students' perspective. International Journal of Science Education, 19, 925–938.
Mestre, J.P. (Ed.) (2005). Transfer of Learning from a Modern Multidisciplinary Perspective. Charlotte, NC: IAP.
Meyer, P., Hong, H.H. & Fynewever, H. (2008). Inquiry-based chemistry curriculum for pre-service education students. Chemical Educator, 13, 120–125.
Momsen, J.L., Long, T.M., Wyse, S.A. & Ebert-May, D. (2010). Just the facts? Introductory undergraduate biology courses focus on low-level cognitive skills. CBE–Life Sciences Education, 9, 435–440.
Moreno, D.A., Carvajal, M., López-Berenguer, C. & García-Viguera, C. (2006). Chemical and biological characterisation of nutraceutical compounds of broccoli. Journal of Pharmaceutical and Biomedical Analysis, 41, 1508–1522.
Schmidt, H.G., Loyens, S.M.M., Van Gog, T. & Paas, F. (2007). Problem-based learning is compatible with human cognitive architecture: commentary on Kirschner, Sweller, and Clark (2006). Educational Psychologist, 42, 91–97.
Schnotz, W. & Kürschner, C. (2007). A reconsideration of cognitive load theory. Educational Psychology Review, 19, 469–508.
Seidel, S.B., Reggi, A.L., Schinske, J.N., Burrus, L.W. & Tanner, K.D. (2015). Beyond the biology: a systematic investigation of noncontent instructor talk in an introductory biology course. CBE–Life Sciences Education, 14, ar43.
Seidel, S.B. & Tanner, K.D. (2013). “What if students revolt?” Considering student resistance: origins, options, and opportunities for investigation. CBE–Life Sciences Education, 12, 586–595.
Siritunga, D., Montero-Rojas, M., Carrero, K., Toro, G., Vélez, A. & Carrero-Martínez, F.A. (2011). Culturally relevant inquiry-based laboratory module implementations in upper-division genetics and cell biology teaching laboratories. CBE–Life Sciences Education, 10, 287–297.
Spell, R.M., Guinan, J.A., Miller, K.R. & Beck, C.W. (2014). Redefining authentic research experiences in introductory biology laboratories and barriers to their implementation. CBE–Life Sciences Education, 13, 102–110.
Sundberg, M.D. & Moncada, G.J. (1994). Creating effective investigative laboratories for undergraduates. BioScience, 44, 698–704.
Sweller, J. (1988). Cognitive load during problem solving: effects on learning. Cognitive Science, 12, 257–285.
Thomas, B. & Snider, B. (1969). The effects of instructional method upon the acquisition of inquiry skills. Journal of Research in Science Teaching, 6, 377–386.
Thomson, S.J., Brown, K.K., Pullar, J.M. & Hampton, M.B. (2006). Phenethyl isothiocyanate triggers apoptosis in Jurkat cells made resistant by the overexpression of Bcl-2. Cancer Research, 66, 6772–6777.
Tobias, S. & Duffy, T.M. (2009). Constructivist Instruction: Success or Failure? New York, NY: Routledge.
Trujillo, G. & Tanner, K.D. (2014). Considering the role of affect in learning: monitoring students' self-efficacy, sense of belonging, and science identity. CBE–Life Sciences Education, 13, 6–15.
Udovic, D., Morris, D., Dickman, A., Postlethwait, J. & Wetherwax, P. (2002). Workshop biology: demonstrating the effectiveness of active learning in an introductory biology course. BioScience, 52, 272–281.
Van Sickle, J. (2016). Discrepancies between student perception and achievement of learning outcomes in a flipped classroom. Journal of the Scholarship of Teaching and Learning, 16, 29–38.
Vermeulen, M., Klöpping-Ketelaars, I.W.A.A., van den Berg, R. & Vaes, W.H.J. (2008). Bioavailability and kinetics of sulforaphane in humans after consumption of cooked versus raw broccoli. Journal of Agricultural and Food Chemistry, 56, 10505–10509.
Volkmann, M.J., Abell, S.K. & Zgagacz, M. (2005). The challenges of teaching physics to preservice elementary teachers: orientations of the professor, teaching assistant, and students. Science Education, 89, 847–869.
Yuan, G., Sun, B., Yuan, J. & Wang, Q. (2009). Effects of different cooking methods on health-promoting compounds of broccoli. Journal of Zhejiang University Science B, 10, 580–588.
Zoller, U. (1993). Are lecture and learning compatible? Maybe for LOCS: unlikely for HOCS. Journal of Chemical Education, 70, 195.

Appendix A

Assessment of Students' Perceptions of Their Core Competency Gains: SALG Survey Questions (see Figure 1)

  • Students were asked: Please rate your GAINS in each of the following BIOSCIENCE COMPETENCIES.

    [Scale = 1 (no gains), 2 (a little gain), 3 (moderate gain), 4 (good gain), 5 (great gain)]

  • Scientific Process:

    Mastering the scientific process of investigation (designing and conducting experiments)

  • Laboratory Skills:

    Developing basic lab skills (pipetting, making dilutions, following procedures)

  • Analytical/Quantitative Skills:

    Developing analytical and quantitative skills (statistical analysis, graphing)

  • Interdisciplinary Integration:

    Understanding the connection between the concepts in biology, chemistry, and mathematics

Appendix B

Assessment of Student Learning Gains: Matched Pretest & Posttest Questions (see Table 2)

Scientific Literature Search: If you had to find a scientific review article about the health effects of different “nutraceutical” (having nutritional and pharmaceutical properties) compounds in broccoli, how would you go about doing this? Explain in as much detail as you can.

Simple Dilutions: You are given a stock solution of 1 M glucose and told that you need to prepare 10 mL that has a final concentration of 10 mM glucose. How do you go about doing this?
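
For instructors adapting this item, the following small sketch shows the C1V1 = C2V2 calculation it targets; the function name is ours, introduced only for illustration.

```python
# C1 * V1 = C2 * V2: volume of stock needed for the simple-dilution item above
# (1 M stock, 10 mL of 10 mM needed). Units must match across the arguments.
def stock_volume_needed(c_stock, c_final, v_final):
    """Return the stock volume required for a simple dilution."""
    return c_final * v_final / c_stock

v_stock_mL = stock_volume_needed(c_stock=1000.0, c_final=10.0, v_final=10.0)  # mM, mM, mL
print(f"Pipette {v_stock_mL:.2f} mL of stock and bring to a final volume of 10 mL")  # 0.10 mL
```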

Serial Dilutions: By using a standard curve, we will be able to quantify the amount of ATP in our cultures. The manufacturer of our assay kit suggests preparing ATP standards with a final volume of 100 μL and concentrations of 0.01–1000 nM. Starting with a 1 μM stock solution, what would be the most accurate way to prepare these standards? Show all your work.

Using a Standard Curve: Biologists often use a spectrophotometer to measure the absorbance of a solution and thereby determine the concentration of a particular solute in that solution. To do this requires the use of a standard curve in which different concentrations of known quantities of solute are measured.

  • What is a standard curve and how do you prepare one?

  • Absorbance is a ratio of the level of light passing through a sample over the level of incident light coming from a light source. As the concentration of solute increases, the absorbance _______________ (choose one: increases/decreases).

  • How can one determine the concentration of the solute in an experimental tissue sample in comparison with the standard curve?
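
The sketch below illustrates, with hypothetical readings, the reasoning this item assesses: fit a linear standard curve of absorbance against known concentrations, then invert the fit to estimate the concentration of an unknown sample.

```python
# Fit a linear (Beer's-law) standard curve and use it to estimate an unknown.
# Readings are hypothetical and for illustration only.
import numpy as np

known_conc = np.array([0.0, 0.25, 0.5, 1.0, 2.0])       # protein standards (mg/mL)
absorbance = np.array([0.02, 0.14, 0.27, 0.52, 1.01])   # corresponding A595 readings

slope, intercept = np.polyfit(known_conc, absorbance, 1)  # least-squares line

sample_absorbance = 0.40
sample_conc = (sample_absorbance - intercept) / slope     # invert the standard curve
print(f"Estimated sample concentration: {sample_conc:.2f} mg/mL")
```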

Quantitative Transfer with a Micropipette: Suppose you have to measure out 250 μL of a 10 μM protein solution and transfer that into a microcentrifuge tube containing 1 mL of water. What is the best way to accurately measure out the 10 μM protein solution?

Appendix C

Assessment of Students' Perceptions & Affective Outcomes: Course Evaluation Questions (see Figure 2)

For the following questions, response options were “Much Higher,” “Higher,” “Average,” “Lower,” and “Much Lower”:

  • The intellectual challenge was …

  • The amount that I learned was …

  • The amount of effort expected of students in the course was …

For the following questions, response options were “Excellent,” “Very Good,” “Good,” “Fair,” and “Poor”:

  • The course as a whole was …

  • The effectiveness of the instructor's teaching methods in actively engaging me with the ideas of this course was…

  • The enthusiasm the instructor showed for the content of this course was …

  • The course organization and planning were …

  • The clarity of the instructor's interactions with students was …

  • The fairness of the instructor's evaluation of student performance was …

Appendix D

Assessment of Type of Instructional Scaffolding That Influences Learning Outcomes: Scaffolding Prompts (see Figure 3)

The following three PowerPoint slides were used as scaffolding.

Complete the table below.
Procedure | Starting material or aliquot (incl. amount) | Sample dilution during process | Ending material (incl. amounts)
Cooking | __ g of broccoli | none | __ g of broccoli
Protein extraction | __ g of broccoli in __ μL of extraction buffer | | __ μL of broccoli extract
Bradford assay | __ μL of broccoli extract | | __ μL in each well
Myrosinase assay | __ μL of broccoli extract | | __ μL in each well

STANDARDIZING BRADFORD DATA

Steps needed to get us there:

  1. Determine what is known from the standard curve & sample data.

  2. Take into account the sample dilution (V1/Vf). For example, if you diluted 1:10, then you need to multiply by 10 to figure out the protein concentration of your broccoli extract.

  3. Convert your protein concentration to a weight (mg) of protein. Multiply the concentration by the volume and make sure your units “factor out.”

  4. Divide by the amount of broccoli tissue in your extract.
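
A worked sketch of these four steps, using hypothetical numbers rather than class data, is shown below.

```python
# Worked sketch of the Bradford standardization steps (hypothetical values).
conc_from_curve = 0.45      # step 1: mg/mL protein read off the standard curve (diluted sample)
dilution_factor = 10        # step 2: extract was diluted 1:10 before the assay
extract_volume_mL = 2.0     # step 3: total volume of the broccoli extract
tissue_mass_g = 0.50        # step 4: grams of broccoli used for the extraction

extract_conc = conc_from_curve * dilution_factor       # mg/mL in the undiluted extract
total_protein_mg = extract_conc * extract_volume_mL    # mg protein in the whole extract
protein_per_g = total_protein_mg / tissue_mass_g       # mg protein per g of broccoli

print(f"{protein_per_g:.1f} mg protein per g broccoli")  # 18.0 mg/g with these numbers
```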

STANDARDIZING AMPLEX RED DATA

Steps needed to get us there:

  1. Determine what is known from the standard curve & sample data.

  2. Take into account the sample dilution (V1/Vf).

  3. Convert your glucose concentration to an amount (nmol) of glucose produced by the myrosinase. Multiply the concentration by the volume and make sure your units “factor out.”

  4. Divide by the total reaction time (min). This yields a rate of glucose production per minute. Note: 1 unit of myrosinase = 1 nmol of glucose/min

  5. Divide by the amount of broccoli tissue (or protein) in your extract.
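
A parallel sketch for the myrosinase (glucose production) data, again with hypothetical numbers, follows.

```python
# Worked sketch of the myrosinase standardization steps (hypothetical values).
glucose_conc_uM = 25.0       # step 1: glucose read off the standard curve (µM, diluted sample)
dilution_factor = 5          # step 2: dilution of the extract into the assay
reaction_volume_mL = 0.10    # step 3: reaction volume used to convert concentration to amount
reaction_time_min = 30.0     # step 4: total reaction time
protein_in_assay_mg = 0.020  # step 5: protein contributed by the extract aliquot

glucose_nmol = glucose_conc_uM * dilution_factor * reaction_volume_mL  # µM x mL = nmol
units = glucose_nmol / reaction_time_min          # 1 unit = 1 nmol glucose per minute
specific_activity = units / protein_in_assay_mg   # units per mg protein

print(f"{specific_activity:.2f} units of myrosinase per mg protein")
```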

Supplementary data