This simple inquiry-based lab was designed to teach the principle of osmosis while also providing an experience for students to use the skills and practices commonly found in science. Students first design their own experiment using very basic equipment and supplies, which generally results in mixed, but mostly poor, outcomes. Classroom “talk and argument” is then used to determine how their experiments could be changed to gather more reliable data. The final assessment consists of both formal and subjective testing, requiring students to explain their design choices.

“Why are you weighing that beaker?” I once asked a student in my Human Physiology course. I’ve never forgotten his reply: “Because that’s what scientists do, they weigh stuff.” The student had no idea what he was doing, or why he was doing it, only that he thought it was something he should do. This experience reinforced what I had learned from conferences and journals about the necessity of inquiry-based learning, and it initiated my work to transform my lab experiences into more student-centered activities.

There has been increased emphasis on the use of inquiry-based teaching in the past 10 years. Most recently, the Committee on a Conceptual Framework for New K–12 Science Education Standards reaffirmed that there has been a serious failure in how we teach science – emphasizing discrete facts with a focus on breadth over depth – and that teachers have not been providing students engaging opportunities to experience how science is actually done (National Research Council [NRC], 2013). The American Association for the Advancement of Science (AAAS) has long encouraged experiences wherein teamwork leads to problem solving (Varma-Nelson et al., 2005), and studies suggest that as students do so, they will naturally determine whether their plan will result in the desired outcome (Lord et al., 2007). If constructed properly, a laboratory experience can present an interest-sparking problem for which students then formulate their own solutions. This self-development of knowledge and understanding is what defines scientific inquiry (National Science Teachers Association, 2004). Therefore, the paramount criterion of all laboratory experiences should be to encourage students to talk with each other in problem-solving activities (Varma-Nelson & Cracolice, 2001; Michaels et al., 2007).

The following osmosis lab uses an inquiry-based approach wherein students create their own knowledge during collaborative work. This lab has been used in mostly nonmajor introductory biology courses for 8 years (typical lab sections contain 24 students). The intent of this lab design is to (1) allow students to use their prior knowledge to perform an experiment in a safe and constructive environment; (2) use this initial experiment to assess student misconceptions; (3) discuss, as a class and through lab-journal entries, these misconceptions; (4) let the students design a new experiment using newly found content knowledge; and (5) reassess their understanding through a final competition that provides a way for students to creatively work as a team to problem-solve.

The introductory biology courses that use this lab are designated as General Education (GE) courses. As such, the content is assumed to be new to the students. Moreover, GE courses are expected to contain a small number of our institution’s essential learning outcomes, which are based on the Association of American Colleges and Universities’ (AAC&U) LEAP learning outcomes and VALUE rubrics (AAC&U, 2013). This lab was designed to introduce students to basic inquiry and analysis skills (nature of science), requiring them to solve a problem creatively and to productively interact with others to complete an assignment. Additionally, this lab will help students (1) gain specific skills, such as proper use of rulers, calipers, and electronic scales to measure length and weight using the metric system; (2) increase their ability to solve problems scientifically; and (3) develop a deeper knowledge of osmosis as assessed by their lab journals.

Methods & Materials

This lab consists of three distinct phases that fill an action-packed 2 hours (Figure 1). The prelab setup is very easy, a strength of this lab design (for a list of materials and prelab directions, see Table 1). Three 5-gallon buckets are gathered, along with a variety of equipment and resources that students may elect to use throughout the lab. The materials are not placed at each work station but, rather, around the room. For example, the rulers are left in the drawer where they are typically stored. Hot plates, scales, and microscopes are also left in their usual locations. This setup introduces the students to the lab, helping them view it as their own space to do science and streamlining future labs as well.

Figure 1.

Flowchart of the three phases of this 2-hour lab exercise.


Table 1.

A list of materials and directions for prelab setup.

Materials
The following materials are needed for a typical class size of 24. With smaller classes, reduce the amount of potatoes. 
5-gallon buckets 
500-mL beakers for warming water 
100 Plastic cups (large party size) 
Electronic scales (±0.01 g) 
Dissecting microscopes 
Digital thermometers 
Black markers for labeling 
Container (737 g) of salt 
Small paring knives 
5-lb sack of potatoes 
Certificates of Achievement 
Directions 
1. Put tap water in each 5-gallon bucket until half filled. 
2. Place 300–500 mL of table salt in one bucket. Mix. 
3. Place 20–50 mL of table salt in a different bucket. Mix. 
4. OPTION: You could label the buckets (A, B, C). I have elected not to label them, preferring that students keep track of their samples as they see fit. Occasionally, students will mix up their water samples, leading to confusing results. This has led to wonderful discussions on the importance of paying attention to details and good field-journal practices. 
5. Place scales, hotplates, beakers, microscopes, knives, rulers, digital thermometers, cups, and potatoes in one location within the lab. These materials should be made available for use if students so choose, but they should not feel compelled to use them. 

Phase 1: Mistakes Are Good

To begin, I walk in and point to the three 5-gallon buckets sitting on the floor and say, “One of these buckets has very salty water in it. One of them has a touch of salt, and the other is just tap water. You have 30 minutes to design and conduct an experiment that uses a potato to help you determine the difference between these liquids. Feel free to use almost anything in the room, as long as you ask for permission and describe in your journal why you want to use it. Go for it!”

I have seen numerous odd ideas and misconceptions over the years, and each lab brings something new. Few, if any, of the students’ ideas actually work. Students typically look to the instructor for verification of their ideas, but I purposely withhold my guidance, intervening only when safety might be an issue. Typical verification labs tend to focus on terminology, concepts, and facts rather than the students’ prior experiences and knowledge (Concannon & Brown, 2008). By contrast, Phase 1 is designed to provide an environment in which students use their prior knowledge in a creative way to solve the problem. Invariably, the students will turn to their textbooks, frantically scanning the section that discusses the principles of osmosis and diffusion, with terms like “hypotonic” and “isotonic” filling the air. Many groups engage in “talk and argument,” discussing their methods and recognizing the weaknesses of their experimental designs (Varma-Nelson & Cracolice, 2001; Michaels et al., 2007). The most common mistake is that students act first and think about their design later. Many groups, realizing their poor design, will ask to start over. In these situations, I smile encouragingly and point to the clock, reminding them that they must have an answer within the 30-minute time frame.

At the end of Phase 1, each group orally reports their methods and findings, ranking the buckets of water from the most salty to the least. Proceeding group by group, we discuss, as a class, the strengths and weaknesses of each design. This student-centered discussion is enlightening as we methodically address the misconceptions the students have. For example, I have had groups cut a potato in half and stick a thermometer into the potato. When describing their methods, they explained that they thought the “potato in the saltiest water would be the coldest because we add salt to ice to make it colder when making homemade ice cream.” Some groups point out that others had left the potato skins on their samples, or that other groups soaked their slices longer. Some students report that the shapes and sizes of the potatoes were different or that groups didn’t tare the scale. With detailed probing, I address all these concepts and ask why they think these facets contributed to our spotty outcome.

The key to this debriefing process is not to give out any of the answers but to guide the students to reason through them. Typically, groups will point out that weight gain or loss is the best way to determine the salinity of the buckets. When I think they understand the underlying concepts of diffusion and osmosis, we begin Phase 2.

Phase 2: Reliability & Repeatability

To begin Phase 2, I ask the students, “What steps should we take to ensure reliability?” A list develops that carefully outlines these controls (size and shape of the potato, soaking time, amount of water used in soaking, how to calculate change in weight, etc.). Once the class is in agreement and the new parameters are understood, they are given 30 minutes to test the buckets again.

The typical list of variables to control includes the following: (1) uniformly shaped and weighted potato samples (generally, cubed samples that weigh 20 ± 0.1 g), (2) a standardized weighing technique that includes patting the potato samples dry, (3) uniform soaking times (typically 10 minutes), (4) a uniform amount of water/solution to use for soaking, and (5) how they will report their data and calculations ([(final weight – initial weight)/initial weight] × 100 = percent change in weight).
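The percent-change arithmetic in item (5) can be sketched in a few lines of Python; the sample weights below are hypothetical, chosen only to illustrate the calculation:

```python
def percent_change(initial_g, final_g):
    """Percent change in weight: [(final - initial) / initial] x 100."""
    return (final_g - initial_g) / initial_g * 100

# Hypothetical sample: a 20.0 g potato cube that weighs 18.6 g after soaking
print(round(percent_change(20.0, 18.6), 1))  # -7.0 (the cube lost 7% of its weight)
```

A negative value indicates water leaving the potato (a hypertonic bath); a positive value indicates water entering it.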

After 30 minutes, the groups record and report their data. The students type their resulting data into an Excel spreadsheet that can be viewed by the entire class using a projector. As this may be their first experience with statistics, I perform very basic descriptive statistics (i.e., mean, standard deviations, and standard error) and create a graph showing their data. Occasionally the studies suggest statistically significant differences in salinity between the buckets (Figure 2). However, understanding how to perform statistical tests is not one of the learning outcomes. We use the graph and statistics as a way to interpret our results and reflect upon our data. These reflections are written in their lab journals. This subsequent discussion helps ingrain the processes observed and gives them more experience using the proper science vocabulary.
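For instructors who prefer to step outside Excel, the same descriptive statistics (mean, standard deviation, and standard error per bucket) can be reproduced with Python's standard library. The percent-change values below are hypothetical stand-ins for class results, not real data:

```python
import statistics

# Hypothetical percent-change-in-weight results, one value per student group
buckets = {
    "tap water": [2.1, 1.8, 2.5, 1.9],
    "slightly salty": [0.4, -0.2, 0.1, 0.6],
    "very salty": [-6.5, -7.2, -5.9, -6.8],
}

for name, values in buckets.items():
    mean = statistics.mean(values)
    sd = statistics.stdev(values)    # sample standard deviation
    se = sd / len(values) ** 0.5     # standard error of the mean
    print(f"{name}: mean = {mean:.2f}%, SD = {sd:.2f}, SE = {se:.2f}")
```

The standard errors computed this way correspond to the error bars plotted in Figure 2.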

Figure 2.

Typical graph generated by in-class results in Phase 2. The buckets containing various levels of salinity are labeled on the x axis, and the y axis represents the percent change in weight, [(final weight – original weight)/original weight] × 100. The error bars represent the standard error. Note the lack of statistical significance in this particular data set.


Phase 3: The Great Potato Race

For the final phase, students are broken into new, smaller groups of two. I instruct them to create a potato design that will gain as much weight as possible after 10 minutes of soaking in a liquid of their choosing. Their finished design must weigh 20 g (±0.1) prior to soaking time. Grown college students suddenly and enthusiastically revert to their childhood, asking for unique supplies in hushed tones and setting up textbooks to hide their “secret” designs (Figure 3). The lab transforms into a playground. Eventually they present their designs that are intended to maximize the surface-area-to-volume ratio. After officially weighing the potatoes, I instruct them to prepare the liquid of their choice. One might assume that all groups would select distilled water. However, few ever do. They use as much creativity in this step as in their Phase 1 design, with most of the ideas being misguided. Students have used alcohol, soapy water, Gatorade, Diet Pepsi, and a potpourri of other liquids to soak their potatoes in. This is yet another assessment of their misconceptions.

Figure 3.

Examples of the creative ways students have tried to maximize the surface area of 20-g pieces of potato during Phase 3.


Once the 10-minute soaking period begins, we discuss their designs. This informal discussion is a significant learning step. As the students orally share their various designs, one group may proclaim defeat upon hearing that another group procured distilled water and heated it with a hot plate. Another group might realize that soaking their potato in a popular cola drink may not have been the best idea. After the allotted soaking time, the potato designs are weighed and the change in weight tabulated. The winning group is awarded a certificate or other reward (I often allow the winners to select from a wide variety of candy bars on their way out of lab).

The final portion of the lab requires the students to describe their final design in detail and explain why they designed it as they did. This reflective process is the main assessment for the lab. It is from these entries that I determine their level of understanding about the nature of science and osmosis. Phase 3 is much more than a contest. For many, this phase is a moment of realization. One past group soaked their design in ethyl alcohol and witnessed a 20% drop in weight. When initially describing their design, the group noted how it maximized surface area and that they felt the alcohol would be the perfect solution for weight gain. This belief was based on prior experience: one of the group members reinforced the idea by stating that he typically could drink more beer than any other liquid. However, after weighing their potato sample, they quickly realized that their choice of solution was a poor one. Coming to this conclusion on his own was a powerful learning moment for the beer drinker: “This is why I am so thirsty after I’ve been drinking!” After the other students stopped laughing, we had a great discussion regarding alcohol as a diuretic.

The journaling method of assessment represents a large shift from how I historically measured understanding, which used a few multiple-choice questions as part of a larger exam. These lab journals are collected and serve as documentation of writing skills, creative and analytical thinking, and improvements in technical skills.

Conclusions

The strength of this laboratory is twofold. First, the setup is very quick, easy, and inexpensive. Second, it is a great model to help nonmajor students experience “what scientists do.” By removing heavy calculations and numerous and complex cookbook-style steps, the students experience the excitement of science. They fluidly move between the Investigating, Evaluating, and Developing Explanation and Solutions spheres of activity (NRC, 2012). Throughout this laboratory experience, students are routinely engaged in all of the Scientific Practices outlined by the National Academy of Sciences in the Next Generation Science Standards (see Table 2; NRC, 2013).

Table 2.

A comparison of the scientific skills and practices found in this lab with those outlined in the Next Generation Science Standards (NRC, 2012).

Practices in Science (NRC, 2012) | Practices Found in Activity
Ask Questions, Define Problems: 
Formulating empirically answerable questions about phenomena. Students engage in the practices of science, gaining a better appreciation of how scientists ask questions and solve problems creatively. 
Develop & Use Models: 
Models enable predictions, helping develop explanations about natural phenomena. Potatoes are used as a simple model to learn more about a natural phenomenon, assisting them in applying that model more generally. 
Plan, Carry Out Investigations: 
Scientists plan and conduct systematic investigations, identifying what is to be recorded. Phase 1: Students use prior knowledge to plan and conduct an investigation, deciding what data to collect. 
Phase 2: Students develop a more systematic approach to investigate salinity, identifying the controls and variables themselves. 
Analyzing, Interpreting Data: 
Deriving meaning from data, using a range of tools (tabulation, graphing, visual, and statistical analysis). Phase 1: Students derive meaning from the data they collected. Often, their data are erroneous, which is discovered as they interpret them. 
Phase 2: Basic statistics is introduced by the instructor, offering a more reliable method to analyze and interpret the data. 
Using Math & Computational Thinking: 
These techniques are used to construct simulations, statistically analyze data, and recognize and express relationships. Phase 2: Graphing and basic descriptive statistics are used to determine differences in salinity between the buckets. 
Constructing Explanations & Designing Solutions: 
Construct logically coherent explanations of phenomena that incorporate their current understanding of science. Students are not given a “cookbook” set of instructions to follow. Rather, they must use their prior knowledge to explain the phenomena observed. Once discrepancies between empirical evidence and prior understanding have been resolved, students can reconstruct their knowledge. 
Arguing from Evidence: 
Reasoning and argument are used to identify strengths and weaknesses of explanations. This practice is found after each of the three rounds, in discussion about their experimental designs. 
Obtaining, Evaluating, & Communicating Information: 
Scientific findings are advanced when shared. Students share their creative designs, problems, and successes through oral and written communication. 

The design of this lab accomplishes everything I had hoped it would. Students easily surpass the benchmark of 2 when using the AAC&U VALUE rubrics to assess lab journals for Inquiry and Analysis, Teamwork, and Written Communication (Rhodes, 2010). Furthermore, because this may be the only life science course taken in their college careers, I want my students to understand that scientists investigate questions using the same skills they used. And, just like in this lab, scientists’ experiments occasionally don’t go as planned, requiring them to adjust the methods and try again. Ultimately, I want my students to understand that science is an adventure in learning and that it is really fun.

References

Association of American Colleges and Universities. (2013). VALUE: Valid assessment of learning in undergraduate education. [Online.] Available at https://www.aacu.org/value/rubrics/.
Concannon, J. & Brown, P.L. (2008). Transforming osmosis: labs to address standards for inquiry. Science Activities: Classroom Projects and Curriculum Ideas, 45(3), 23–25.
Lord, T., Shelly, C. & Zimmerman, R. (2007). Putting inquiry teaching to the test: enhancing learning in college botany. Journal of College Science Teaching, 36(7), 62–65.
Michaels, S., Shouse, A.W. & Schweingruber, H.A. (2007). Ready, Set, SCIENCE! Putting Research to Work in K–8 Science Classrooms. Washington, DC: National Academies Press.
National Research Council. (2013). Next Generation Science Standards: For States, By States. Washington, DC: National Academies Press.
National Science Teachers Association. (2004). NSTA Position Statement on Scientific Inquiry. Washington, DC: NSTA Press.
Rhodes, T., Ed. (2010). Assessing Outcomes and Improving Achievement: Tips and Tools for Using Rubrics. Washington, DC: Association of American Colleges and Universities.
Varma-Nelson, P. & Cracolice, M.S. (2001). Peer-Led Team Learning: General, Organic and Biological Chemistry. Upper Saddle River, NJ: Prentice Hall.
Varma-Nelson, P., Cracolice, M.S. & Gosser, D.K. (2005). Peer-led team learning: a student-faculty partnership for transforming the learning environment. In Invention and Impact: Building Excellence in Undergraduate Science, Technology, Engineering, and Mathematics. Washington, DC: AAAS.