Recent reforms in K-16 science education advocate for the integration of science content and practice. However, engaging students in authentic science practices can be particularly challenging for certain subjects such as evolution. We describe Avida-ED, a research-based platform for digital evolution that overcomes many of the challenges associated with using biological model organisms in the classroom. We then report the findings of a nationwide, multiple-case study on classroom implementation of Avida-ED and its influence on student understanding and acceptance of evolution. We found that engagement in lessons with Avida-ED both supported student learning of fundamental evolution concepts and was associated with an increase in student acceptance of evolution as evidence-based science. In addition, we found a significant, positive association between increased understanding and acceptance. We discuss the implications of supporting reform-based pedagogical practices with tools such as Avida-ED that integrate science content with authentic science practice.

Introduction

In biology, evolution features prominently among disciplinary core concepts in recent reforms (Brewer & Smith, 2011; College Board, 2011; NGSS Lead States, 2013). The explicit emphasis on evolution in national science education standards is significant, as less than half of the adult population in the United States accepts that humans have evolved (Miller et al., 2006; Newport, 2012). Evolution forms the basis for everything we understand about the diversity and history of life (Dobzhansky, 1973), and evolutionary biology is also particularly representative of the nature of science (Pennock, 2005). An understanding of evolution depends a great deal on understanding the nature and practices of science (Akyol et al., 2012; Lombrozo et al., 2008; Rutledge & Warden, 2000).

Misconceptions about evolution and its associated mechanisms abound and are well documented in the literature (e.g., Gregory, 2009). Low levels of understanding, as well as misalignment with personal beliefs, may contribute to outright rejection. However, science educators disagree about appropriate goals for learner outcomes where controversial issues such as evolution are concerned. Some argue that belief and acceptance are prerequisites for understanding scientific theories (Alters, 1997; Cobern, 1994; McKeachie et al., 2002), while others maintain that understanding and acceptance are independent (Ha et al., 2012; Sinatra et al., 2003; Smith & Siegel, 2004) and that the responsibility of science educators lies in teaching for understanding rather than belief (Smith & Siegel, 2004). Regardless, there is little doubt that understanding and acceptance play important roles when it comes to learning evolution, and a number of studies have focused on these constructs and how they interact (reviewed in Lloyd-Strovas & Bernal, 2012).

The results of studies in which the relationship between understanding and acceptance has been examined are largely inconclusive (reviewed in Glaze & Goldston, 2015; Lloyd-Strovas & Bernal, 2012; Pobiner, 2016). Significant positive relationships (e.g., Akyol et al., 2012; Rice et al., 2011) as well as nonsignificant associations (e.g., Bishop & Anderson, 1990; Ingram & Nelson, 2006) have been reported, and at least one study found that acceptance decreased with increasing levels of understanding (Bailey et al., 2011).

This paper reports on a study designed to assess an innovative way to address issues of evolution understanding and acceptance using an experimental evolution platform that allows students both to directly observe evolutionary processes and to engage in science practices with evolving populations of digital organisms.

Avida-ED: Digital Evolution for Education

There is evidence that teaching evolution is a good way to integrate content and practices (Pennock, 2005, 2007a; reviewed in Glaze & Goldston, 2015). However, it can be difficult to engage students in authentic scientific practice around the topic of evolution, mainly because biological evolution can be difficult to observe. An option that overcomes limitations posed by biological model organisms is digital evolution. Populations of digital organisms—mini-programs similar to computer viruses capable of self-replication—evolve in minutes and can produce large quantities of data in a short time. An example of digital evolution software is Avida, a research platform that was developed to model and test hypotheses about evolutionary mechanisms in a highly controlled and fast system. Avida allows biologists to investigate evolutionary questions that are difficult or impossible to test in organic systems (Adami, 2006), and has been used as a model system in well over a hundred experimental evolution studies for many kinds of evolutionary hypotheses (e.g., Clune et al., 2010; Grabowski et al., 2013).

Software that simulates evolution is available for educators (e.g., SimBio's EvoBeaker), but Avida goes further in allowing teachers to incorporate authentic research experiences on evolution in the classroom. Chief among the many advantages of using Avida to study evolutionary processes is that it constitutes a true instance of evolution rather than a simulation of it (Pennock, 2007b). We will not repeat the argument here, but the key point is that Avida implements the causal mechanisms of evolution, producing outcomes that are not predetermined but can be studied experimentally. Digital organisms in Avida (aka “Avidians”) replicate, mutate, and compete with other organisms for resources in their computational environment (Fig. 1). The system possesses all of the requirements necessary for evolution by natural selection to occur (Dennett, 1995). This is why it is especially useful to evolutionary biologists for basic research, but it is also compelling to teachers who want their students to actually observe evolutionary change in the classroom in real time.

Figure 1.

A growing population of digital organisms in Avida.

Recognizing the potential of Avida as a powerful tool for teaching about evolutionary processes and the nature of scientific reasoning and inquiry, one of us (RTP) developed an educational version of the software called Avida-ED (Pennock, 2007a). The program features a friendlier user interface than Avida and allows students to observe and experiment with evolutionary processes without the need for any special computer science knowledge. It makes use of a bacterial analogy to make the digital organisms and environment less abstract for students. Individual Avidians are visually represented by a circular instruction set (their “genome”) that consists of a sequence of 50 computer commands, each denoted by a letter of the alphabet representing one of 26 basic commands (Fig. 2). Like some organic prokaryotic organisms, Avidians are haploid and asexual, reproducing by making copies of their genome.

Figure 2.

Individual Avidians are visually represented by a circular instruction set: their “genome.”

Populations of Avidians grow in a virtual Petri dish consisting of a grid, the size of which is defined by the user (Fig. 3). Each cell on the grid is a potential space for one individual. The user also determines a number of other parameters in the settings. The per-site mutation rate, or the probability that any one locus in the genome will change randomly during a given replication event, can range from zero to 100%. In Avida, mutation types include insertions and deletions as well as substitutions, but for novice users only substitutions are allowed in Avida-ED; the Avidian genome therefore remains fixed at 50 instructions in length. Offspring placement can be set to occur next to the parent (assigned randomly to one of the eight cells adjacent to the parent) or randomly anywhere in the dish, giving the user the ability to test the effects of different distribution patterns.
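
To make this mechanism concrete, the sketch below models a per-site substitution process of the kind just described. It is a minimal illustration only; the instruction alphabet, genome length, and parameter names are our own placeholders, not Avida-ED's actual implementation.

```python
import random
import string

# Hypothetical illustration (not Avida-ED's code): a genome is a fixed-length
# sequence of instruction "letters", and during replication each site may be
# substituted independently with probability equal to the per-site mutation rate.
INSTRUCTION_ALPHABET = string.ascii_lowercase  # 26 possible instructions, a-z
GENOME_LENGTH = 50

def replicate(parent_genome, per_site_mutation_rate=0.02):
    """Copy a genome, applying substitution mutations site by site.

    Because only substitutions are allowed, the offspring genome keeps
    the same length as the parent.
    """
    offspring = []
    for instruction in parent_genome:
        if random.random() < per_site_mutation_rate:
            offspring.append(random.choice(INSTRUCTION_ALPHABET))  # random substitution
        else:
            offspring.append(instruction)  # faithful copy
    return "".join(offspring)

ancestor = "".join(random.choice(INSTRUCTION_ALPHABET) for _ in range(GENOME_LENGTH))
child = replicate(ancestor, per_site_mutation_rate=0.02)
print(sum(a != b for a, b in zip(ancestor, child)), "sites mutated")
```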

Figure 3.

Populations of Avidians grow in a virtual Petri dish.

Finally, the user determines which of nine resources are available in the environment. The resources are analogous to sugars that would be part of an actual bacterial growth medium and, keeping with the analogy, end in “-ose” (e.g., notose, nanose, orose). These resources correspond to various computational logic operations or functions that the digital organisms can evolve to perform (e.g., NOT, NAND, OR). The default ancestor organism in Avida-ED is capable only of replication and cannot perform any of the logic functions, but during an experiment substitution mutations occur that alter the ancestor's genomic sequence. Sometimes these mutations accumulate to produce, by chance, a sequence of commands that enables the organism to perform one of the logic functions, such as inputting two bit strings of numbers (binary series of 0s and 1s) from the environment, manipulating them, and outputting their conjunction (i.e., the function AND). If the organism evolves the ability to perform a function corresponding to an available environmental resource (“andose” in this example), the organism will receive an “energy boost” in the form of increased processing power, which will allow that organism to acquire energy at a faster rate. For this reason, these operations are commonly referred to as metabolic functions. A biological example of an analogous process would be the evolution of an enzyme that allows for the metabolism of a particular sugar in an organism's environment, one that was previously unavailable as a resource; this very situation has been documented in E. coli (Blount et al., 2008).
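
The sketch below is a hypothetical illustration of what it means for an organism to "perform" a logic function such as AND on two input bit strings; the reward multiplier shown is a made-up placeholder and does not reflect Avida-ED's actual bonus scheme.

```python
# Hypothetical illustration (not Avida-ED's code): an organism "performs AND"
# when its output is the bitwise conjunction of the two inputs it received.
def performs_and(input_a, input_b, output):
    """True if `output` is the bitwise conjunction of the two inputs."""
    return output == (input_a & input_b)

a, b = 0b1100, 0b1010                  # two bit strings drawn from the environment
organism_output = a & b                # an evolved organism outputs their conjunction
print(format(organism_output, "04b"))  # -> 1000

# Placeholder reward: if the matching resource ("andose") is available, the
# organism's processing speed is boosted. The multiplier here is invented.
merit_multiplier = 2.0 if performs_and(a, b, organism_output) else 1.0
print(merit_multiplier)
```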

Because resources are uniformly distributed and unlimited in the computational environment, organisms in Avida-ED essentially compete for space. Avidians possessing functions that increase replication rate will tend to produce relatively more offspring than organisms lacking those functions—they will be more fit—and the frequency of beneficial mutations that led to the expression of those fitter phenotypes will increase in the population—which is to say, the population will evolve.

Although Avida-ED lacks the full functionality of Avida, it still allows the user considerable latitude to experiment by manipulating various parameters and observing the effects on an organism or population. Thus, it is potentially ideal for engaging students in authentic science practices to learn about evolution and the nature of science, making it well aligned with the NGSS. Avida-ED can be used to teach about fundamental concepts such as mutations and mutation rates and their effects on individual organisms' genomes, as well as populations, selection and fitness, the relationship between genotype and phenotype, artificial selection, evolutionary loss vs. gain in function, and more. As mentioned, the current version only includes asexual organisms, but genetic recombination by sexual reproduction is already possible in Avida and will be implemented in the next version of Avida-ED. Model curricular materials are available on the project website (avida-ed.msu.edu). In addition, Smith et al. (2016) describe the development of the Avida-ED laboratory handbook, also free to download.

The primary purpose of this study (part of a larger project) was to investigate the effectiveness of Avida-ED in facilitating learning about basic aspects of evolution, and the influence this learning may have on student understanding and acceptance of evolution. To that end, we posed the following research questions:

  1. To what degree is Avida-ED an effective context for teaching and learning about fundamental evolution concepts, such as the role of random mutations and natural selection?

  2. What influence does learning with Avida-ED have on student acceptance of evolution?

Methods

Study Design and Overview

We conducted a nationwide, multiple-case study in which we characterized classroom implementations of Avida-ED and assessed student outcomes. In total, eleven instructors teaching ten courses at eight institutions across the United States volunteered for the study; this was a convenience sample. Each case consisted of one course taught during a single semester (fall 2012 or spring 2013). To determine whether similar patterns emerged independent of context, we explicitly ensured a broad range of institution sizes and types, from small, private liberal arts colleges to very large, research-intensive universities (Table 1). The courses differed in topic, enrollment, type (lecture, lab, or combination), and level (lower or upper division; see Table 2 for details). Cases also differed in the instructors' general teaching experience and expertise as well as their familiarity with Avida-ED specifically (Table 3). Instructors who were using Avida-ED for the first time were designated as novices, whereas experienced instructors had used the program in their courses at least once prior to the study, and expert users had worked extensively with the Avida research platform in addition to having used Avida-ED in their classrooms. For detailed accounts of individual classroom implementations, see Lark (2014).

Table 1.
List of participating institutions, characterized by size (estimated student population*) and Carnegie classification data (if applicable).
| Institution Code | Level | Control | Student Pop'n (est.) | Carnegie Classification (Size and Setting) | Carnegie Classification (Basic) |
| --- | --- | --- | --- | --- | --- |
|  | High school | Private | 500 | N/A | N/A |
|  | 4 year | Private | 1,650 | S4/HR: Small four-year, highly residential | Bac/A&S: Baccalaureate Colleges, Arts & Sciences |
|  | 4 year | Public | 10,500 | M4/R: Medium four-year, primarily residential | DRU: Doctoral/Research Universities |
|  | 4 year | Public | 30,500 | L4/NR: Large four-year, primarily nonresidential | RU/H: Research Universities (high research activity) |
|  | 4 year | Public | 34,750 | L4/R: Large four-year, primarily residential | RU/VH: Research Universities (very high research activity) |
|  | 4 year | Public | 43,000 | L4/NR | RU/VH |
|  | 4 year | Public | 48,000 | L4/R | RU/VH |
|  | 4 year | Public | 51,000 | L4/NR | RU/VH |
* Rounded values are based on average student enrollments for the 2012–2013 academic year, determined from information made publicly available on institution websites.

Table 2.
Case summaries. Case codes are designated by institution code (see Table 1) and course level/type. Class levels are designated as Lower (AP, 100-, or 200-level) or Upper (300- or 400-level). Only students taking both the pre- and post-assessment were included in data analyses; therefore, the number of students enrolled in each course may actually be greater than what is reported here.
| Case Code | Class Level | Course Type | Lecture/Lab | Major/Non-Major | N (matched pre/post) |
| --- | --- | --- | --- | --- | --- |
| A_APBio | Lower | Advanced Placement Biology (High School) | Combined | N/A | 17 |
| B_300Evo | Upper | Evolution | Lecture | Major |  |
| C_400Evo | Upper | Evolution | Lecture | Major | 15 |
| D_100Evo | Lower | Evolution | Lecture | Major | 12 |
| E_200Bio_HC | Lower | Biology (Honors College) | Combined | Non-Major | 30 |
| F_400Evo | Upper | Evolution | Combined | Major | 33 |
| G_100BioLabA | Lower | Biology | Lab | Major | 153 |
| G_100BioLabB | Lower | Biology | Lab | Major | 234 |
| G_100BioRes | Lower | Biology (Residential College) | Combined | Major | 101 |
| H_100CompBio | Lower | Computational Biology | Combined | Major | 24 |

Table 3.
Instructor profiles.
| Case Code | Position | Familiarity with Avida-ED |
| --- | --- | --- |
| A_APBio | HS Teacher | Expert |
| B_300Evo | Professor | Experienced |
| C_400Evo | Assistant Professor | Novice |
| D_100Evo | Senior Lecturer (tenured) | Novice |
| E_200BioHC | Postdoctoral Fellow | Experienced |
| F_400Evo | Associate Professor | Expert |
| G_100BioLabA | Visiting Assistant Professor | Experienced |
| G_100BioLabB | Coordinator | Experienced |
| G_100BioRes | Postdoctoral Fellows | Novice |
| H_100CompBio | Instructor (non-tenure) | Expert |

Data Sources and Analyses

Pre-/post-assessment of students

Students completed a common assessment immediately before and immediately after the lessons involving Avida-ED, allowing us to document the direction and magnitude of change in student outcomes associated with Avida-ED instruction within each case.

Assessing conceptual understanding

The assessment consisted of two components, the first of which included two constructed-response items dealing with conceptual issues (Table 4). The concepts assessed—origins of genetic variation and the basic mechanism of natural selection—were chosen because they are (1) key to understanding how adaptive evolution occurs, and yet associated with a number of common misconceptions (Bishop & Anderson, 1990; Gregory, 2009); (2) universal (i.e., applicable to all evolving systems, biological and digital alike); and (3) particularly easy to observe in Avida-ED.

Each student-constructed response was compared to an ideal response synthesized from the relevant literature (e.g., Sadava et al., 2012; University of California Museum of Paleontology, 2014; Zimmer & Emlen, 2013) to ensure content validity (as discussed in Campbell & Nehm, 2013), and each was assessed for degree of accuracy and completeness. For each ideal response, we identified one or more critical components applicable to evolution in both digital and biological organisms. For example, on the question of the origins of variation, there were two critical components: “mutations” and “random”.

We used strict criteria when scoring assessment data. Student responses with components judged accurate and complete (i.e., well aligned with the ideal response) were given 2 points, and those with accurate but incomplete components (i.e., emerging understanding) were given 1 point. Responses that were ambiguous, incorrect, or missing the relevant critical component were given 0 points. Student scores were interpreted as a percentage of the ideal response, calculated as the ratio of points assigned to points possible. To reduce variability within cases and make cross-case patterns easier to see, scores for each question were pooled, and mean scores were calculated to arrive at the average content score for each case. These were compared statistically from pre- to post-assessment (one-tailed paired Student's t-test for cases in which scores were normally distributed, and one-tailed Mann-Whitney U test for cases in which score distributions did not meet assumptions of normality according to a Kolmogorov-Smirnov Goodness of Fit test). See Table 4 for questions and ideal responses, along with examples of actual student responses to illustrate how they were scored against the rubric. Additional details regarding rationale for and analysis of assessment items can be found elsewhere (Lark, 2014).
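
For readers who wish to reproduce this kind of analysis, the following is a minimal sketch of the per-case comparison as described above, using SciPy. The normality check applied to the score distributions is one reasonable reading of the procedure, and all helper and variable names are ours rather than the authors'.

```python
import numpy as np
from scipy import stats

# Minimal sketch of the per-case pre/post comparison described above: check
# whether the score distributions look normal with a Kolmogorov-Smirnov
# goodness-of-fit test, then apply a one-tailed paired t-test or a one-tailed
# Mann-Whitney U test.
def compare_pre_post(pre, post, alpha=0.05):
    pre, post = np.asarray(pre, float), np.asarray(post, float)

    def looks_normal(x):
        # KS test against a normal distribution fitted to the sample
        _, p = stats.kstest(x, "norm", args=(x.mean(), x.std(ddof=1)))
        return p > alpha

    if looks_normal(pre) and looks_normal(post):
        t, p_two_tailed = stats.ttest_rel(post, pre)
        # convert to a one-tailed p-value for the hypothesis post > pre
        p = p_two_tailed / 2 if t > 0 else 1 - p_two_tailed / 2
        return "paired t-test", p
    _, p = stats.mannwhitneyu(post, pre, alternative="greater")
    return "Mann-Whitney U", p

test_used, p_value = compare_pre_post(pre=[20, 35, 10, 50, 30], post=[40, 45, 30, 55, 35])
print(test_used, round(p_value, 3))
```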

Table 4.
Examples of student constructed responses and scoring against rubric.
Question 1: Explain how variation arises in a population.

| Student Response | Mutation | Random | Total | % Ideal Response |
| --- | --- | --- | --- | --- |
| Variation arises due to the appearance of recessive traits when the right genetic makeup is achieved. |  |  |  | 0% |
| Genetic mutations occur in an individual, which then may or may not pass on to offspring. |  |  |  | 50% |
| Variation arises through random mutations or natural selection. |  |  |  | 100% |

Question 2: Imagine that a new life form was just discovered on another planet. It is not made up of cells, nor does it contain DNA. What characteristics of this life form would be necessary in order for it to evolve? Explain your reasoning.

| Student Response | Inheritance | Variation | Selection | Total | % Ideal Response |
| --- | --- | --- | --- | --- | --- |
| The life form would have to be made up of some kind of matter. If the matter has the ability to change, the life form can evolve. |  |  |  |  | 0% |
| Reproduction that allows for a mixing of genetic material so that weak and strong characteristics can emerge and natural selection can occur |  |  |  |  | 33% |
| Genetic information that is passed on. Reproduces. Mutations occur. Respond to environment. |  |  |  |  | 67% |
| It would need a replication system that isn't perfect, or random changes in whatever code it does use, this would allow for variation. It would also need to have some forms more fit for the environment than others in order to select for good or bad mutations. These two things would be enough for evolution: changes occur that make some things better/worse, and the environment is harsh and weeds out the worse ones. |  |  |  |  | 100% |

Inter-rater reliability of constructed response items

Two hundred constructed response items (twenty per case; one hundred pre- and one hundred post-test) were independently coded by two raters, and inter-rater reliability was determined for each of the five critical components (mutation, random, inheritance, variation, and selection) using percent agreement and Cohen's κ (Cohen, 1960). Initially, reliability for the critical component “inheritance” was low; the raters discussed the rubric for clarification, and re-coded these items. Ultimately, all five critical components had a percent agreement of over 80%, and κ values fell within “substantial” or “almost perfect” ranges, using benchmarks for interpretation proposed by Landis and Koch (1977). The inter-rater reliability outcomes are summarized in Table 5.
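
As an illustration of the reliability calculation, the sketch below computes percent agreement and Cohen's κ for a single critical component from two raters' codes; the example ratings are hypothetical, not the study's data.

```python
from collections import Counter

# Minimal sketch: percent agreement and Cohen's kappa for one critical component,
# given two raters' scores (0, 1, or 2) on the same set of responses.
def percent_agreement(r1, r2):
    return sum(a == b for a, b in zip(r1, r2)) / len(r1)

def cohens_kappa(r1, r2):
    n = len(r1)
    p_o = percent_agreement(r1, r2)                           # observed agreement, Pr(a)
    c1, c2 = Counter(r1), Counter(r2)
    categories = set(r1) | set(r2)
    p_e = sum((c1[k] / n) * (c2[k] / n) for k in categories)  # chance agreement, Pr(e)
    return (p_o - p_e) / (1 - p_e)

rater1 = [2, 1, 0, 2, 2, 1, 0, 0, 1, 2]   # hypothetical codes for "mutation"
rater2 = [2, 1, 0, 2, 1, 1, 0, 0, 1, 2]
print(percent_agreement(rater1, rater2), round(cohens_kappa(rater1, rater2), 2))
```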

Table 5.
Summary of inter-rater reliability outcomes. Pr(a) is equivalent to percent agreement between coders.
| Critical Component | Pr(a) | Cohen's Kappa | Interpretation |
| --- | --- | --- | --- |
| Mutation | 0.96 | 0.89 | Almost Perfect |
| Random | 0.92 | 0.80 | Substantial |
| Inheritance | 0.82 | 0.73 | Substantial |
| Variation | 0.88 | 0.78 | Substantial |
| Selection | 0.81 | 0.64 | Substantial |

Assessing evolution acceptance

The second component of the assessment consisted of ten forced-choice items that addressed student acceptance of evolution (Table 6). These items were borrowed or modified from two instruments: the widely used MATE (Measure of Acceptance of the Theory of Evolution; Rutledge & Warden, 1999) and a new survey developed for internal assessment of education goals at the NSF-funded BEACON Center for the Study of Evolution in Action (Mead & Libarkin, 2013). Rutledge and Sadler (2007) established the reliability of the MATE with university students. Content validity of the BEACON Evolution in Action items was established through expert responses, and an exploratory factor analysis was performed to examine internal structure, with items loading onto three factors (Cronbach's α ranged from 0.724 to 0.833). The items chosen from these two instruments emphasized three main premises: (1) evolution is a real phenomenon that has happened and continues to happen; (2) evolutionary theory provides a good explanation for the diversity of life on earth; and (3) evolution is evidence-based science. We intentionally avoided any items having to do with religion or human evolution so as not to trigger an emotional or defensive response in students, which could have artificially reduced acceptance or otherwise confounded the results (Demastes et al., 1995; Berkman et al., 2008). Because several such items occur on the original MATE, these were excluded from our instrument.

Table 6.
Student acceptance of evolution was assessed with the following items, scored on a 5-point Likert scale. Some items were borrowed or modified from the Measure of Acceptance of the Theory of Evolution (MATE) instrument (Rutledge & Warden, 1999). Other items, from an unpublished survey by Mead & Libarkin (2013), were written in the style of the MATE.
| Question | Source | Major Concept(s) Addressed |
| --- | --- | --- |
| 1. Organisms existing today are the result of evolutionary processes that have occurred over millions of years. | Rutledge & Warden (MATE) | Evolution happens; Evolution is explanatory |
| 2. Evolution is a process that is happening right now. | Mead & Libarkin | Evolution happens |
| 3. Evolution cannot ever be observed because it happens over very long periods of time. | Mead & Libarkin | Evolution happens; Evolution is evidence-based |
| 4. Evolutionary biology generally does not investigate testable ideas about the natural world. | Mead & Libarkin | Evolution is evidence-based |
| 5. Evolutionary biology relies on evidence to make claims about the natural world. | Mead & Libarkin | Evolution is evidence-based |
| 6. The available data are ambiguous (unclear) as to whether evolution actually occurs. | Rutledge & Warden | Evolution happens; Evolution is evidence-based |
| 7. Evolution can explain changes in populations of species over time. | Mead & Libarkin | Evolution is explanatory |
| 8. Evolutionary theory is supported by factual, historical, and laboratory data. | Rutledge & Warden | Evolution is evidence-based |
| 9. Computer programs can create instances of evolution (within a computational environment). | Mead & Libarkin | Evolution happens |
| 10. Evolution is a scientifically valid theory. | Rutledge & Warden | Evolution is explanatory; Evolution is evidence-based |

Student acceptance scores were calculated in the same manner as the MATE instrument (Rutledge & Warden, 1999), by assigning a numeric value to each response category (Strongly Disagree = 1; Strongly Agree = 5). The scale was inverted for items that were worded negatively (items 3, 4, and 6) for consistency in scoring. The sum across all ten items was calculated and doubled to produce an acceptance score ranging from 20 to 100. The average student acceptance score was then calculated for each case. Pre- and post-test mean acceptance scores for each case were then compared using paired Student's t-tests. Rutledge (1996) developed criteria for interpreting scores on the MATE instrument, and these were used as a very general guide to interpretation of the student acceptance scores in this study.
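
As an illustration, the sketch below implements the scoring procedure just described; the Likert category labels and helper names are our own assumptions, not the survey's exact wording.

```python
# Sketch of the acceptance-score calculation described above (our own helper,
# not the authors' code). `responses` holds one student's ten Likert choices.
LIKERT = {"Strongly Disagree": 1, "Disagree": 2, "Undecided": 3,
          "Agree": 4, "Strongly Agree": 5}
NEGATIVELY_WORDED = {3, 4, 6}  # items reverse-scored for consistency

def acceptance_score(responses):
    """responses: dict mapping item number (1-10) to a Likert category."""
    total = 0
    for item, choice in responses.items():
        value = LIKERT[choice]
        if item in NEGATIVELY_WORDED:
            value = 6 - value          # invert the 1-5 scale
        total += value
    return total * 2                   # doubled, so scores range from 20 to 100

example = {i: "Agree" for i in range(1, 11)}
example[3] = "Disagree"                # disagreeing with a negatively worded item
print(acceptance_score(example))       # -> 80
```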

Content vs. acceptance

To determine whether there was a relationship between student learning of content and acceptance outcomes, normalized gains (g-ave; Hake, 2002) were calculated for each student's pre- and post-assessment scores and then averaged for each case. We used normalized rather than raw gains to account for the fact that students in some cases (i.e., upper-division courses) were likely to possess higher initial understanding of fundamental evolution concepts. Pearson's correlation coefficient was then calculated to determine the degree of linear correlation between these two variables.
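
The sketch below illustrates this analysis. The normalized-gain helper follows Hake's definition, and the per-case gain values shown are hypothetical placeholders, not the study's data.

```python
import numpy as np
from scipy import stats

# Sketch of the gain analysis described above (our own helper names): compute each
# student's normalized gain, average within a case, then correlate per-case
# content gains with per-case acceptance gains.
def normalized_gain(pre, post, max_score=100.0):
    """Hake-style normalized gain: fraction of the possible improvement achieved."""
    pre, post = np.asarray(pre, float), np.asarray(post, float)
    return (post - pre) / (max_score - pre)

# e.g., one case's students, scored as a percentage of the ideal response
case_content_gain = normalized_gain(pre=[20, 40, 35], post=[45, 50, 55]).mean()
print(round(case_content_gain, 2))

# hypothetical per-case average gains for ten cases (placeholders, not study data)
content_gains = [0.18, 0.04, 0.09, 0.13, 0.18, 0.08, 0.09, 0.07, 0.22, 0.16]
acceptance_gains = [0.35, -0.10, 0.01, 0.15, 0.38, 0.22, 0.08, 0.15, 0.16, 0.19]

r, p = stats.pearsonr(content_gains, acceptance_gains)
print(f"r = {r:.2f}, p = {p:.3f}")
```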

Variations in institution size/type, course size, type, and level, and instructor teaching experience and familiarity with Avida-ED ruled out certain comparisons, but we were able to look for patterns and emergent themes across cases (Yin, 2009).

Results

Student Learning of Foundational Evolution Concepts

Content scores for each case are summarized in Table 7. In six of ten cases, average student content scores increased significantly from pre- to post-test (Fig. 4). All six of these were lower-division courses. Although one lower-division course, D_100Evo, did not show a statistically significant increase on the post-test, there was a moderate effect size (d = 0.59). No upper-division course showed significant changes from pre- to post-test in average student content score. In two of the three upper-division evolution courses, B_300Evo and F_400Evo, students had pre-test content scores that averaged 51% and 52% of the ideal response, respectively (refer to Table 4 for examples of how percent of the ideal response was calculated). These were the highest pre-test scores of all ten cases. Students in these two courses had similarly high scores on the post-test. In contrast, students in a third upper-division evolution course, C_400Evo, had the second lowest pre-test score (19%) and the lowest post-test score (26%).

Table 7.
Summary of average content scores by case. Pre- and post-test content scores and pre/post change are reported as the average percentage of the ideal response. Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen's d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.
| Case Code | n | Pre content | Post content | Pre/post change | p | Effect Size (d) |
| --- | --- | --- | --- | --- | --- | --- |
| A_APBio | 17 | 46% | 56% | 10% | 0.01 | 0.69 |
| B_300Evo |  | 51% | 53% | 2% | 0.40 | 0.13 |
| C_400Evo | 15 | 19% | 26% | 7% | 0.09 | 0.51 |
| D_100Evo | 12 | 21% | 31% | 10% | 0.05 | 0.59 |
| E_200Bio_HC | 30 | 17% | 32% | 15% | 0.00 | 1.62 |
| F_400Evo | 33 | 52% | 56% | 4% | 0.12 | 0.22 |
| G_100BioLabA | 153 | 33% | 39% | 6% | 0.00* | 0.32 |
| G_100BioLabB | 234 | 36% | 40% | 5% | 0.01* | 0.29 |
| G_100BioRes | 101 | 36% | 50% | 15% | 0.00* | 0.85 |
| H_100CompBio | 24 | 33% | 44% | 11% | 0.01 | 0.68 |

Figure 4.

Average student content scores increased significantly from pre- to post-test.

Student Acceptance of Evolution

Average student acceptance scores across all ten cases ranged from 73.95 to 90.04 on the pre-test and 76.28 to 91.06 on the post-test, or moderate to very high acceptance for both pre- and post-tests, using Rutledge's (1996) guide to interpretation (Table 8). Average acceptance score increased significantly from pre- to post-test in four of the ten cases (Fig. 5). These four cases were all lower-division courses that also had statistically significant gains in average content score. Students in two of the three upper-division evolution courses, B_300Evo and F_400Evo, had very high acceptance on both the pre- and post-tests, with no significant change. These were the same two upper-division courses in which the highest pre-test content scores were observed. Students in the remaining upper-division course, C_400Evo, also did not show a significant change in acceptance from pre- to post-test, with the lowest average acceptance score on the post-test (76.28). Thus, case C_400Evo showed the lowest average scores for both content and acceptance on the post-test, despite being a senior-level evolution course (discussed below).

Table 8.
Summary of average acceptance scores by case. Significance values marked with an asterisk were calculated using non-parametric analyses (Mann-Whitney U test). Effect sizes were determined using Cohen's d and are interpreted accordingly: 0.2 = small, 0.5 = moderate, 0.8 = large.

| Case Code | n | Pre acceptance | Post acceptance | Pre/post change | p | Effect Size (d) |
| --- | --- | --- | --- | --- | --- | --- |
| A_APBio | 17 | 82.98 | 89.85 | 6.87 | 0.00 | 0.92 |
| B_300Evo |  | 90.04 | 88.51 | –1.53 | 0.15 | –0.15 |
| C_400Evo | 15 | 76.15 | 76.28 | 0.13 | 0.48 | 0.01 |
| D_100Evo | 12 | 75.48 | 79.10 | 3.62 | 0.29 | 0.21 |
| E_200Bio_HC | 30 | 81.86 | 88.82 | 6.95 | 0.00 | 0.66 |
| F_400Evo | 33 | 88.58 | 91.06 | 2.48 | 0.06 | 0.22 |
| G_100BioLabA | 153 | 80.70 | 82.23 | 1.53 | 0.22* | 0.14 |
| G_100BioLabB | 234 | 80.21 | 83.14 | 2.93 | 0.00* | 0.28 |
| G_100BioRes | 101 | 73.94 | 78.03 | 4.09 | 0.02* | 0.29 |
| H_100CompBio | 24 | 86.17 | 88.73 | 2.57 | 0.06 | 0.24 |

Figure 5.

Average acceptance score increased significantly from pre- to post-test in four of the ten cases.

Understanding and Acceptance of Evolution

Most of the lower-division cases showed significant increases in both average content and average acceptance scores, suggesting a relationship between the two. Again, we accounted for differences in students' prior knowledge by using normalized gains (g-ave; Hake, 2002), calculated for each student's pre- and post-assessment scores and then averaged for each case. A Pearson correlation confirmed a significant, positive association between average normalized content gain and average normalized acceptance gain across the ten cases (r = 0.60, p < 0.05; Fig. 6).

Figure 6.

The Pearson correlation confirmed a significant, positive association between the change in average normalized content score and in average normalized acceptance score.

Discussion

Cross-case analysis of student assessment data provides evidence that Avida-ED may be an effective tool for teaching evolution content and simultaneously engaging students in authentic science practices. Students in lower-division biology courses who engaged in lessons with Avida-ED demonstrated increased knowledge of foundational evolutionary concepts. Our evidence suggests that students' experience with Avida-ED also had a positive influence on their acceptance of evolution. Average student acceptance scores increased significantly in four of the six lower-division courses, despite the fact that acceptance of evolution was already quite high on the pre-assessments for all ten cases in this study; indeed, there seems to be a ceiling effect with regard to acceptance. An especially notable finding was a significant, positive correlation between normalized gains in content and acceptance scores from pre- to post-assessment across the ten cases. These results indicate that there was a positive relationship between learning foundational evolution concepts and increasing acceptance of evolution after engaging in lessons with Avida-ED.

The results of the current study provide additional support for relationships between instruction and both understanding and acceptance of evolution, but our findings differ markedly from those of other researchers (e.g., Nehm & Schonfeld, 2007) in that we did find a significant, positive relationship between understanding and acceptance, contributing to the small number of published investigations that report the same (e.g., Akyol et al., 2012; Rice et al., 2011).

The paucity of significant associations between understanding and acceptance of evolution in the literature should not be surprising, as the two constructs are not necessarily linked: one need not understand evolution in order to accept it, nor must one accept evolution in order to understand it mechanistically. Meadows, Doster, and Jackson (2000) described teachers who held young earth creationist beliefs but were able to compartmentalize those beliefs and teach the science of evolution accurately and without conflict. In their widely cited paper on the subject, Bishop and Anderson (1990) comment on this lack of association between understanding and acceptance, a point that has been echoed by others (e.g., Robbins & Roy, 2007):

It appears that a majority of both sides of the evolution-creation debate do not understand the process of natural selection or its role in evolution. One result of this lack of knowledge is that the debate is reduced to, as creationists argue, a dispute between two different kinds of faith. Most students who believed in the truth of evolution apparently based their beliefs more on acceptance of the power and prestige of science than on an understanding of the reasoning that had led scientists to their conclusions. (p. 426)

The results of this study provide evidence that engaging students in authentic science practice using a tool like Avida-ED improves not only student understanding of content but also acceptance of established scientific ideas, and that the degree to which acceptance increases is related to student learning. Although the exact nature of this relationship is not yet understood and requires further investigation, we are optimistic that Avida-ED can be used to address the problem of evolution denial in the United States. In addition, insights arising from this work might be extended to address other socio-politically contentious issues in which understanding and acceptance of science co-vary, such as climate change.

Future Directions and Next Steps

The results of the current study are based almost exclusively on quantitative data. Understanding the potential mechanisms underlying student learning and changes in acceptance in relation to using Avida-ED may require qualitative approaches such as student interviews and case studies. Future studies may also examine the influences of contextual variables such as institution type and size. We have examined the influence of instructor familiarity with Avida-ED on instructional decisions elsewhere, and the results of that investigation are forthcoming.

Finally, there is the curious case of C_400Evo. None of the three upper-division evolution courses had statistically significant gains in either student content or acceptance scores from pre- to post-assessment. However, one of those three cases differed markedly from the other two. Both cases B_300Evo and F_400Evo had among the highest content and acceptance scores (pre and post), which makes sense given that these students, who were all in their junior or senior year of study and had successfully completed many college biology courses, would be expected to have mastered the fundamental concepts targeted by the assessment in addition to possessing a high rate of evolution acceptance. Case C_400Evo, in stark contrast, had among the lowest pre-assessment scores for both content and acceptance, and the lowest scores for both on the post-assessment. This raises concerns, particularly because the demographics of this case were very different from the other nine. Unlike the other institutions, this was an HBCU in the southern United States, and 100% of the students in the class were racial minorities. Bailey et al. (2011) studied attitudes toward science among students from the same demographic and concluded that they interact with science, and particularly evolutionary biology, differently from non-minority demographics. The authors argue that African Americans tend to have a more fatalistic worldview than other subpopulations, owing to their relatively strong belief in God as an external locus of control in their lives. This fatalistic worldview is often at odds with the progressive nature of science, and may cause African Americans to reject and avoid engagement in science at higher rates than other groups. In addition, they found that the strength of religious beliefs among the African American college students in their study was negatively correlated with knowledge of and attitudes toward evolution (and of science in general). Similarly, Mead et al. (2015) found that members of underrepresented minority groups, in general, show less interest in and understanding of evolutionary biology, which appears to correlate with holding misconceptions about evolution and higher levels of religiosity. The students in case C_400Evo may thus have differed from students in the other cases in the ways described by Bailey et al. (2011) and Mead et al. (2015), and these differences may account for the results; it will be valuable to explore these factors further in light of the equity and accessibility issues often associated with underrepresented groups.

Implications For Educators

In this paper, we have described an educational technology useful for teaching the nature and practices of science in the context of evolutionary biology, and have presented evidence that Avida-ED can positively affect student conceptual understanding of and attitudes toward evolution. We hope that biology teachers will find that Avida-ED can facilitate the kind of three-dimensional pedagogy advanced by the Next Generation Science Standards, and expand their existing pedagogical toolkit for teaching evolution.

Accessing Materials

The Avida-ED software and all associated curricular materials can be downloaded free of charge from the project website: http://avida-ed.msu.edu

Acknowledgments

This material is based in part upon work supported by the National Science Foundation under Cooperative Agreement No. DBI-0939454 as part of BEACON's Avida-ED Curriculum Development and Assessment Study to Pennock (PI), Mead, and Smith (Co-PIs). Any opinions, findings, and conclusions or recommendations expressed in this material are those of the author(s) and do not necessarily reflect the views of the funding agency.

References

Adami, C. (2006). Digital genetics: Unravelling the genetic basis of evolution. Nature Reviews Genetics, 7, 109–118.

Akyol, G., Tekkaya, C., Sungur, S., & Traynor, A. (2012). Modeling the interrelationships among pre-service science teachers' understanding and acceptance of evolution, their views on nature of science and self-efficacy beliefs regarding teaching evolution. Journal of Science Teacher Education, 23(8), 1–21.

Alters, B. J. (1997). Should student belief of evolution be a goal? Reports of the National Center for Science Education, 17(1), 15–16.

Bailey, G., Han, J., Wright, D., & Graves, J. (2011). Religiously expressed fatalism and the perceived need for science and scientific process to empower agency. Science in Society, 2(3), 55–87.

Berkman, M. B., Pacheco, J. S., & Plutzer, E. (2008). Evolution and creationism in America's classrooms: A national portrait. PLoS Biology, 6(5), 920–924.

Bishop, B. A., & Anderson, C. W. (1990). Student conceptions of natural selection and its role in evolution. Journal of Research in Science Teaching, 27(5), 415–427.

Blount, Z. D., Borland, C. Z., & Lenski, R. E. (2008). Historical contingency and the evolution of a key innovation in an experimental population of Escherichia coli. Proceedings of the National Academy of Sciences of the USA, 105(23), 7899–7906.

Brewer, C. A., & Smith, D. (Eds.). (2011). Vision and change in undergraduate biology education: A call to action. Washington, D.C.: American Association for the Advancement of Science.

Campbell, C. E., & Nehm, R. H. (2013). A critical analysis of assessment quality in genomics and bioinformatics education research. CBE Life Sciences Education, 12(3), 530–541.

Clune, J., Goldsby, H., Ofria, C., & Pennock, R. T. (2010). Selective pressures for accurate altruism targeting: Evidence from digital evolution for difficult-to-test aspects of inclusive fitness theory. Proceedings of the Royal Society of London B: Biological Sciences, 278, 666–674.

Cobern, W. W. (1994). Comments and criticism. Point: Belief, understanding, and the teaching of evolution. Journal of Research in Science Teaching, 31(5), 583–590.

Cohen, J. (1960). A coefficient of agreement for nominal scales. Educational & Psychological Measurement, 20(1), 37–46.

College Board. (2011). AP Biology Curriculum Framework 2012–2013. New York: The College Board.

Demastes, S. S., Settlage, J., & Good, R. (1995). Students' conceptions of natural selection and its role in evolution: Cases of replication and comparison. Journal of Research in Science Teaching, 32(5), 535–550.

Dennett, D. C. (1995). Darwin's dangerous idea: Evolution and the meanings of life. New York: Simon & Schuster.

Dobzhansky, T. (1973). Nothing in biology makes sense except in the light of evolution. American Biology Teacher, 35, 127–129.

Glaze, A. L., & Goldston, M. J. (2015). U.S. science teaching and learning of evolution: A critical review of the literature 2000–2014. Science Education, 99(3), 500–518.

Grabowski, L. A., Bryson, D. M., Dyer, F. D., Pennock, R. T., & Ofria, C. (2013). A case study of the de novo evolution of a complex odometric behavior in digital organisms. PLoS One, 8(4), e60466.

Gregory, T. R. (2009). Understanding natural selection: Essential concepts and common misconceptions. Evolution: Education & Outreach, 2(2), 156–175.

Ha, M., Haury, D. L., & Nehm, R. H. (2012). Feeling of certainty: Uncovering a missing link between knowledge and acceptance of evolution. Journal of Research in Science Teaching, 49(1), 95–121.

Hake, R. R. (2002, August). Relationship of individual student normalized learning gains in mechanics with gender, high-school physics, and pretest scores on mathematics and spatial visualization. Paper presented at the Physics Education Research Conference, Boise, Idaho.

Ingram, E. L., & Nelson, C. E. (2006). Relationship between achievement and students' acceptance of evolution or creation in an upper-level evolution course. Journal of Research in Science Teaching, 43(1), 7–24.

Landis, J. R., & Koch, G. G. (1977). The measurement of observer agreement for categorical data. Biometrics, 33(1), 159–174.

Lark, A. (2014). Teaching and learning with digital evolution: Factors influencing implementation and student outcomes (Doctoral dissertation). Michigan State University.

Lloyd-Strovas, J. D., & Bernal, X. E. (2012). A review of undergraduate evolution education in U.S. universities: Building a unifying framework. Evolution: Education & Outreach, 5(3), 453–465.

Lombrozo, T., Thanukos, A., & Weisberg, M. (2008). The importance of understanding the nature of science for accepting evolution. Evolution: Education & Outreach, 1(3), 290–298.

McKeachie, W. J., Lin, Y. G., & Strayer, J. (2002). Creationist vs. evolutionary beliefs: Effects on learning biology. American Biology Teacher, 64(3), 189–192.

Mead, L. S., & Libarkin, J. C. (2013). BEACON internal survey (unpublished).

Mead, L. S., Clarke, J. B., Forcino, F., & Graves, J. L., Jr. (2015). Factors influencing minority student decisions to consider a career in evolutionary biology. Evolution: Education & Outreach, 8, 6. https://doi.org/10.1186/s12052-015-0034-7

Meadows, L., Doster, E., & Jackson, D. F. (2000). Managing the conflict between evolution & religion. American Biology Teacher, 62(2), 102–107.

Miller, J. D., Scott, E. C., & Okamoto, S. (2006). Public acceptance of evolution. Science, 313(5788), 765–766.

Nehm, R. H., & Schonfeld, I. S. (2007). Does increasing biology teacher knowledge of evolution and the nature of science lead to greater preference for the teaching of evolution in schools? Journal of Science Teacher Education, 18(5), 699–723.

Newport, F. (2012). In U.S., 46% hold creationist view of human origins. Gallup Poll. http://www.gallup.com/poll/155003/Hold-Creationist-View-Human-Origins.aspx

NGSS Lead States. (2013). Next Generation Science Standards: For states, by states. Washington, D.C.: National Academies Press.

Pennock, R. T. (2005). On teaching evolution and the nature of science. In J. Cracraft & R. W. Bybee (Eds.), Evolutionary science and society: Educating a new generation. Colorado Springs, CO: Biological Sciences Curriculum Study.

Pennock, R. T. (2007a). Learning evolution and the nature of science using evolutionary computing and artificial life. McGill Journal of Education, 42(2), 211–224.

Pennock, R. T. (2007b). Models, simulations, instantiations, and evidence: The case of digital evolution. Journal of Experimental & Theoretical Artificial Intelligence, 19(1), 29–42.

Pobiner, B. (2016). Accepting, understanding, teaching, and learning (human) evolution: Obstacles and opportunities. Yearbook of Physical Anthropology, 159, S232–S274.

Rice, J. W., Olson, J. K., & Colbert, J. T. (2011). University evolution education: The effect of evolution instruction on biology majors' content knowledge, attitude toward evolution, and theistic position. Evolution: Education & Outreach, 4(1), 137–144.

Robbins, J. R., & Roy, P. (2007). The natural selection: Identifying & correcting non-science student preconceptions through an inquiry-based, critical approach to evolution. American Biology Teacher, 69(8), 460–466.

Rutledge, M. L. (1996). Indiana high school biology teachers and evolutionary theory: Acceptance and understanding (Doctoral dissertation). Ball State University, Indiana.

Rutledge, M. L., & Sadler, K. C. (2007). Reliability of the Measure of Acceptance of the Theory of Evolution (MATE) instrument with university students. American Biology Teacher, 69(6), 332–335.

Rutledge, M. L., & Warden, M. A. (1999). The development and validation of the Measure of Acceptance of the Theory of Evolution instrument. School Science & Mathematics, 99(1), 13–18.

Rutledge, M. L., & Warden, M. A. (2000). Evolutionary theory, the nature of science, and high school biology teachers: Critical relationships. American Biology Teacher, 62(1), 23–31.

Sadava, D. E., Hillis, D. M., Heller, H. C., & Berenbaum, M. (2012). Life: The Science of Biology (10th ed.). London: Macmillan.

Sinatra, G. M., Southerland, S. A., McConaughy, F., & Demastes, J. W. (2003). Intentions and beliefs in students' understanding and acceptance of biological evolution. Journal of Research in Science Teaching, 40(5), 510–528.

Smith, J. J., Johnson, W. R., Lark, A. M., Mead, L. S., Wiser, M. J., & Pennock, R. T. (2016). An Avida-ED digital evolution curriculum for undergraduate biology. Evolution: Education & Outreach, 9(1), 9–20.

Smith, M. U., & Siegel, H. (2004). Knowing, believing, and understanding: What goals for science education? Science & Education, 13(6), 553–582.

University of California Museum of Paleontology. (2014). Understanding Evolution. Retrieved May 22, 2014, from http://evolution.berkeley.edu/

Yin, R. K. (2009). Case study research: Design and methods (4th ed.). Thousand Oaks, CA: Sage.

Zimmer, C., & Emlen, D. J. (2013). Evolution: Making Sense of Life. Greenwood Village, CO: Roberts and Company Publishers, Inc.