1) Now called Summer Institutes on Scientific Teaching (abbreviated SI), this professional development opportunity focuses on how to more effectively teach science.

3) While there are some inconsistencies among terms, we treat “interactive teaching methods” as ones in which instructors collect data about student learning as it happens (Handelsman et al. 2007). This allows instructors to modify instruction, sometimes on the spot, based on students’ current levels of understanding and performance. Interactive teaching methods can also refer to students working with and getting feedback from their peers. Note that active teaching is usually framed as any form of instruction other than lecture, whereas interactive teaching emphasizes instructors’ roles in cultivating environments in which students continually demonstrate and receive feedback on their understanding.

4) Scientific teaching, sometimes abbreviated ST, is a collection of “methods that encourage students to construct new knowledge and to develop scientific ways of thinking, provide both students and instructors with feedback about learning, and foster success for all students. Scientific teaching aims to create classrooms that reflect the true nature of science and promotes teaching as a scholarly endeavor” (Miller et al. 2008). The book Scientific Teaching (Handelsman et al. 2007) offers more in-depth information.

6) This study uses two independent sources of qualitative data: observations of classrooms and surveys of faculty.

7) Later in the paper, the authors define pedagogical goals as “learning processes, course structures, and classroom environments that are desired by the instructor.” The results of the study allow instructors to identify the methods that are commonly used in scientific teaching.

8) Later in the paper, the authors define supporting practices as the teaching practices that instructors use to achieve their pedagogical goals. The results of the study allow instructors to identify the methods that are commonly used in scientific teaching.

9) Students act in predictable ways when their instructors teach scientifically. One of the products of this paper is a list of behaviors that can be observed. For example, the taxonomy states that students may “identify, construct, or evaluate hypotheses and make predictions based on their hypotheses.” Similarly, students may “construct graphs or tables and analyze results presented in these formats.” These observable behaviors bridge the high-level goals in the taxonomy and the specific practices that can be seen and measured.

10) In this context, artifacts are materials that an instructor develops that show evidence of scientific teaching, for example syllabi, class slides, and websites.

11) Tomes have been written to try to unpack the meaning of the nature of science. A straightforward entry point into this literature is the 2011 Vision and Change Report. In addition to defining five core concepts for undergraduate biology majors, the report defines six competencies and practices that collectively describe the nature of science.

12) Scientific practice or process is actually doing scientific work, whereas the phrase nature of science tends to include why we do science and how it influences different societies (see Misconceptions about Teaching the Nature and Process of Science from the Understanding Science website). Philosophers distinguish between the nature of science and the process of science; the tendency within scientific teaching is to blend, rather than dichotomize, these definitions.

13)Learning principles often refer to ideas about how people learn. For example, constructivist learning theory as framed by Dewey (1916) posits that new information needs to be placed in the context of prior knowledge. It also includes factors such as motivation and how feedback impacts a student’s ability to acquire and assimilate new knowledge. Jones and Brader-Araje (2002) summarize the recent history and impacts of constructivism on classroom teaching and learning.

14)The Cognitive Science Society defines cognitive science as an interdisciplinary approach to “understanding the nature of the human mind”. One way in which cognitive scientists contribute to discipline-based education research is through their expertise in how people learn.

15) Gathering student performance data is also referred to as assessment, and it is a key component of scientific teaching. ST involves testing the hypothesis that students have learned; thus, collecting data about what students understand about the material being taught, as well as where they need more practice, is critical.

16) The achievement gap between different student populations has been the focus of a great deal of research in LSE, including a special issue on broadening participation in the life sciences (Volume 15, Issue 3, September 1, 2016).

18) The literature on decreasing the achievement gap shows that evidence-based teaching practices, including ST, elevate the performance of all students, but especially of those who have been historically marginalized (e.g., Haak et al. 2011).

19) As reported in the National Center for Education Statistics report on STEM Attrition (2013), retention and degree-completion rates in science, technology, engineering, and mathematics (STEM) majors are lower for students from historically underrepresented groups.

20) The Bio 2010 Report inspired the Summer Institutes on Scientific Teaching, which in turn influenced the high-impact Vision and Change Report.

22) The Carnegie Classification of Institutions of Higher Education describes different institution types. According to the most recently published classification, 2-yr colleges are now referred to as Associate’s Colleges, 4-yr colleges are Baccalaureate Colleges, and doctoral universities are divided into highest research activity (R1), higher research activity (R2), and moderate research activity (R3). Couch et al. do not mention other institution types, including Master's Colleges and Universities, Special Focus Institutions, and Tribal Colleges. Biology education researchers have overemphasized studying students from baccalaureate colleges, master’s colleges, and doctoral universities, despite the fact that most biology students are enrolled in community colleges (Schinske et al. 2017).

24) This is sometimes referred to as inquiry-based learning. The idea is to model the practice of scientific discovery as part of the learning cycle by having students do research and actively investigate questions relevant to the subject of the inquiry. The Next Generation Science Standards also make this recommendation for K-12 teaching.

26)Active learning requires input from the students to achieve the learning goals that an instructor develops. For example, students might generate hypotheses or graphically represent information. This approach contrasts with passive learning such as reading, watching a video, or listening to a lecture without taking notes (Handelsman et al. 2007).

27)Assessment means collecting data to determine where learners are in their progress towards achieving the learning outcomes. Assessments are often divided into formative assessments, which are low stakes queries taken during the learning process, and summative assessments, which are higher stakes performance evaluations such as exams that occur after the classroom learning interval (Dirks et al. 2014).

28)Teaching inclusively means intentionally supporting all students in the classroom, regardless of background or disability, to work towards their learning goals (Gargiulo and Metcalf 2017). The concept of universal design, in a broad use of the term, is often helpful. How can we enable all learners to achieve?

29)This paragraph includes a nice example of defining terms for the reader so that the definitions are clear for scholars who are new to the area, as well as for people who may use the same terms in different ways.

30) Formative assessment is used to inform the instructor and students where the students are relative to the learning outcomes during the learning process, for example through an in-class activity, quiz, or homework assignment. It is often low stakes or no stakes. The outcomes of this assessment are then used to modify instruction to improve learning, creating an assessment-feedback loop that an instructor and a student use during a learning cycle. For additional information see Dirks et al. (2014) and Dixon and Worrell (2016).

31)Summative assessment is used to measure where students are at the end of a lesson, for example an exam or final course grade. In addition to being used as a more final measure of student achievement of learning outcomes, it is used in the assessment-feedback cycle to inform an instructor how to revise a segment of a course or the whole course. For additional information see Dirks et al. (2014) and Dixon and Worrell (2016).

35a)Task characteristics refers to the nature of what the student is being asked to do. Are they being asked to compare, decide, judge? Are they required to draw, calculate, or write?

35b)Cognitive demand refers to the kind and level of thinking needed to complete a task. (See Chapter 1 of Stein et al. 2009).

36)This is an example of appropriate uses and functions of footnotes. This footnote provides a clarification of a change in the choice of terms used in the paper and provides the rationale for this choice.

38)One challenge in developing observation protocols is determining whether they assess what is intended. That is why developing a taxonomy of evidence-based teaching practices is a separate step from creating an observation protocol to see if these practices are being implemented. The protocols that the authors cite here use different approaches and capture different aspects of high-impact teaching.

42)In this context, the word domain refers to a broad category of practices that needs to be further refined. In this paper, that domain is scientific teaching.

44)In education research, instruments are any tools used to collect data, such as tests, questionnaires, surveys, and observational protocols. A taxonomy generated through interviews and surveys with experts, and tested in the classroom, can identify the components that an instrument could measure.

45)The authors use the last paragraph of the Introduction to identify why their study is needed, and to state the goals of the study.

46)One of the goals of this study is to develop a mechanism for identifying and documenting the use of scientific teaching practices in the classroom. In this case, what behaviors can an observer see that indicate the use of scientific teaching?

47)This study documents the development of a taxonomy of scientific teaching, which sets the stage for a follow-up study presenting a measurement instrument (Durham et al. 2017).

48)Formal instruments have been assessed for evidence of reliability and validity with particular audiences. Reliability refers to the consistency with which an instrument measures the intended construct. For more on reliability, see Social Research Methods (Trochim 2006). Validity refers to whether the instrument is actually measuring the intended construct. Note that validity is contextual. What produces valid results in one context, for example biology courses for college students, or for a particular purpose, may not produce valid results in other contexts or for other purposes. For more on validity, see Social Research Methods (Trochim 2006). Additional discussion of reliability and validity can be found in annotations of Hanauer and Dolan (2014).

1) As with any scientific study, the scope of a qualitative research study must be carefully defined. In this case, the scope is framed by the question, “What observable practices indicate the use of scientific teaching?” The scope of a qualitative study can be harder for people with a more quantitative background to gauge, in part because sample sizes tend to be smaller. However, the data tend to be richer and more faceted in qualitative studies than in quantitative studies. Here, the authors explain how broadly they sampled, and they assess whether their sample is sufficient for making generalizable conclusions. For more about qualitative research design methods see Social Research Methods (Trochim 2006).

2)It is helpful to include a diagram or table that describes the steps in the study. What are the key points the reader needs to know and focus on? Figure 1 is especially helpful given the iterative nature of this kind of research.

3)This figure illustrates that taxonomy development is both logical and iterative. The researchers decide the next step of their study based on prior results. Assessment phases and benchmarks inform how the study is progressing and what modifications need to be made in subsequent iterations.

4) Validity refers to the idea that the instrument is actually measuring the intended construct. In this context, the authors need to determine whether their taxonomy provides evidence of scientific teaching in their sample. It is typical to present multiple forms of validity evidence to argue that an instrument is a valid or trustworthy measure of a construct. To assess construct validity in this study, the authors apply their taxonomy to 10 classroom observations of 10 different science instructors at the University of Colorado; they also survey 14 additional faculty members, 11 of whom are from other institutions, to determine whether the taxonomy captures the instructors’ approaches to teaching.

5)The research design acknowledges the assumptions and limitations of the work.

7)The study begins by retrieving demonstrably effective pedagogical features of scientific teaching from the literature, sometimes called evidence-based practices. The other sections of the Methods explain how the authors refine this initial list to form an evidence-based consensus of the key features of scientific teaching.

12)Often, the term “student-centered” refers to a teaching style in which students learn through instructor-designed experiences. Here, the authors use “student-centered” to mean something complementary: they reword the scientific teaching practices in terms of students’ experiences, rather than instructors’ choices. For example, an instructor might say “I ask my students to solve problems in groups.” Rewording this to focus on the student experience would be “students collaborate to solve problems.” This helps generate a list of observations that can be made to classify what is happening in the classroom based on what the students are doing.

13) After identifying how instructors want to teach, the authors look at the specific practices instructors use to meet these goals. The authors explain this procedure with the example of teaching students to use metacognition, defined as an awareness of their own learning, that is, what they know and do not know. A goal might be to have students practice metacognition, and a practice supporting this goal might be to ask students to identify and reflect on the assumptions they made to solve a problem.

14)Metacognition can be thought of as an awareness about one’s learning or the ability to know what you know and do not know and how you know it. For more about metacognition, see Bransford et al. 2000.

16)Research that aims to develop guidelines or standards is necessarily iterative. In this case, cycles of observing, classifying, validating results, and revising continue until the rate of making changes slows dramatically, asymptoting toward a steady, consistent determination. For example, the authors conducted additional classroom observations to look for additional scientific teaching practices that earlier drafts of the taxonomy may have missed.

17)The next step was to “field test” the taxonomy. For researchers used to interpreting quantitative data, a sample size of 10 may seem small. However, it is appropriate here because the goal was to see whether the practices in the taxonomy were, in fact, being used in classrooms. Therefore, the authors chose to observe instructors who were experienced in using scientific teaching. By carefully selecting this sample, they increased the chances that the supporting practices in the taxonomy would be observed.

20)Interrater reliability is helpful to report, because data coded by people is always subject to different interpretations. For more on interrater reliability, see Social Research Methods (Trochim 2006).

21)Because of the iterative nature of this study, some of the results are included in the Methods, because they inform the next step of the study. The Results section of the paper focuses on the end product, rather than the intermediate products.

23) When researchers use instruments such as surveys, it is best practice, and often required by the journal, to include these instruments in the supplemental materials so that readers can use and/or evaluate them. Moreover, including the survey acknowledges that survey design itself is a research enterprise, and that surveys can be assessed for the validity and reliability of their results. For more about survey research design, see Social Research Methods (Trochim 2006).

24)It is time-consuming and costly to conduct classroom observations, so the authors restricted their field-testing to one institution. To determine whether the taxonomy describes approaches to teaching at other institutions, the authors distributed a survey to 14 additional faculty members, 11 of whom were from different institutions.

25) Because the taxonomy is long, the authors wanted to minimize survey fatigue. Thus, each volunteer survey participant evaluated roughly half of the 17 goals listed in the draft taxonomy.

26)Table 1 describes the pedagogical goals, the references supporting each goal, and the way each goal is refined into supporting practices. The table is particularly effective at emphasizing that the strategies mentioned in the taxonomy are based on evidence.

27)Alignment speaks to teaching that focuses on meeting predefined learning outcomes and goals. In an aligned class, the goals, outcomes, and expectations are made clear to the students, and the learning activities are designed to achieve the goals and outcomes. Moreover, the formative and summative assessments provide information about whether students achieved the outcomes. On the other hand, lack of alignment can occur if an instructor tests students on material that was not addressed in class activities. Note that this misalignment means that the students have not participated in an assessment-feedback loop (Handelsman et al. 2007).

28)Vision and Change: A Call to Action includes the abilities to apply and use scientific practices as part of the Core Competencies and Disciplinary Practices. In Vision and Change, these include hypothesis testing, evidence-based reasoning, the ability to see relationships between components of complex systems, and the ability to communicate ideas about science.

33)Cognitive processes are higher order mental processes including “problem solving...and abstract thinking” (American Psychological Association Dictionary; see also Bransford et al. 2000). Scientific teaching fosters an environment that enables students to practice higher order mental processes.

37)It is helpful to reiterate sample sizes in a summary table.

38) “Instructors trained elsewhere” refers to people who attended formal professional development programs other than the Summer Institutes.

39) Instructors informally trained refers to individuals who have not attended formal professional development training in scientific teaching, but who have experience and a track record of teaching scientifically.

44) The authors provide their rationale for keeping an item in their taxonomy that did not meet the criterion applied to the others.

47)By explaining how the taxonomy of practices differs from the original recommendations of scientific teaching, the authors caution that developing a taxonomy of teaching practices is an iterative and dynamic process. Their work reflects the current state-of-the-art, and this state may change.

49)The taxonomy is framed in a way that focuses on the student experience, rather than instructor work.

55)Table 3 provides a concise summary of the taxonomy in a format that is easy to reference. It also makes clear the relationships between the pedagogical goals; general approaches; and supporting, observable practices.

60)Constructivism is a learning theory stating that students learn new information based on the foundations of their prior knowledge and experiences. Students build upon what they already know. Learning is also impacted by the environmental contexts in which learning occurs. Jones and Brader-Araje (2002) discuss how constructivism has impacted approaches to teaching and learning.

61)In this context, cultural competency refers to teaching in a way that reaches across and respects the cultures that are present in a classroom (for more about cultural competence, see Tanner and Allen, 2007). Cultural competency is critical to increase the diversity of scientists, because it helps students from underrepresented groups see themselves as scientists.

64) Cognitive processes are higher order mental processes including “problem solving...and abstract thinking” (American Psychological Association Dictionary; see also Bransford et al. 2000). Scientific teaching fosters an environment that enables students to practice higher order mental processes.

65) The authors refer to the levels in Bloom’s taxonomy of cognitive domains, which has frequently been used in biology education research to identify the cognitive demands of different tasks (e.g., Crowe et al. 2008). The lower levels involve rote memorization and explanation, and the higher levels involve analysis, evaluation, and creation/synthesis. Because scientists use all of these levels to conduct research, students should learn how to use all of them, too.

1)The authors point out the contributions of their research to the field and remind the readers that the goal of their study was to develop a set of observations that can be used to assess the implementation of scientific teaching practices in the classroom.

2)The authors are providing information about the rationale and approach of their experimental design. One of the keys to their success is to draw on their experiences with scientific teaching while trying to view it from a student’s perspective. Trying to view scientific teaching from a non-instructor role helped them identify the practices that are the foundation of their taxonomy.

3)In backward design, instructors begin the course preparation by identifying what they want students to learn (learning goals), deciding how they will assess students’ progress towards those goals (learning outcomes that specify what students will be able to do), and then designing activities to achieve those outcomes. See Scientific Teaching (Handelsman et al. 2007).

4)The authors acknowledge the limitations of their study.

5)The authors discuss the relevance and applicability of their study.

7)As part of their experimental design, the authors have attempted to build a taxonomy that can be applied to any scientific discipline, although they acknowledge that their study design emphasizes its applicability to biology classrooms.

8)The authors point to outstanding questions and future research areas that the community may need to address.

9)The authors state the contributions of this work to the field and discuss the value of compiling this taxonomy in a single document.

10a)Here the authors hint at their future research plans, indicating that the taxonomy lays the groundwork for developing measurement instruments. In fact, these authors have since published MIST, a survey instrument using this taxonomy (Durham et al. 2017). One of the strengths of the MIST is the fact that it can be completed by students, instructors, or observers.

10b)Some instruments have been published since this was written. This includes the authors’ survey based on the scientific teaching taxonomy Measurement Instrument for Scientific Teaching (MIST) (Durham et al. 2017), as well as the Practical Observation Rubric To Assess Active Learning (PORTAAL) (Eddy et al. 2015) and the Decibel Analysis for Research in Teaching (DART) (Owens et al. 2017).

13)This paragraph describes some possible applications of this research in improving science education on individual as well as programmatic levels.

13a)Note that the taxonomy can be used with different goals in mind and at different scales of reflection. This paragraph offers advice for how individual members of the faculty can think about their own teaching, as well as how departments and national efforts at improving biology education can use this tool. It may be helpful to combine this taxonomy with the corresponding observation instrument (Durham et al. 2017).

14)LSE often publishes instruments in Supplemental Materials, rather than in the main text.

15) One outcome of the Vision and Change Report is the PULSE community (Partnership for Undergraduate Life Sciences Education), a national network of administrators and faculty from biology departments throughout the country that are collaborating to implement evidence-based teaching practices. The PULSE community has developed rubrics for departments to evaluate their biology teaching (Aguirre et al. 2013; Brancaccio-Taras et al. 2016).

17)Depending upon current practices of the journal, a “who to contact” notation may be appropriate.

1) LSE now requires that the Methods section of a paper include either IRB approval or a statement of exemption for any article that involves human subjects. That includes all data from surveys and classroom observations.

Teaching scientifically

Annotated by Clark R. Coffman and Rebecca M. Price

Annotation published July 26, 2018

Couch et al. develop a taxonomy of observable classroom practices that are aligned to the teaching goals in the Vision and Change Report and the principles of scientific teaching (Handelsman et al. 2004). Their approach offers one way to answer the question, “What does scientific teaching look like in practice?” Scientific teaching is a pedagogical approach in which instructors are constantly testing the hypothesis that their students are achieving course outcomes and learning how to think like scientists. The resulting taxonomy is useful for personal reflection on teaching practices, structuring professional development around best practices in pedagogy, and strategizing how departments can implement state-of-the-art teaching. It also lays the groundwork for developing a teaching observation protocol (Durham et al. 2017) by operationalizing the principles of scientific teaching.

We include this article in “Anatomy of an Education Study” because it expertly illustrates how to use qualitative methods to develop a framework for teaching scientifically. The authors use a combination of interviews and surveys to both identify scientific teaching practices and see whether those practices are actually used by college instructors. Researchers new to this approach will appreciate how the results of part of the qualitative research inform the methods applied in the subsequent stages of the research project. The authors’ careful choices and justification illustrate why qualitative methods are so powerful.

We thank Brian A. Couch and Jennifer K. Knight for answering questions that arose as we annotated this article and for their editorial contributions.

Scientific Teaching: Defining a Taxonomy of Observable Practices

Published Online: https://doi.org/10.1187/cbe.14-01-0002

Abstract

Over the past several decades, numerous reports have been published advocating for changes to undergraduate science education. These national calls inspired the formation of the National Academies Summer Institutes on Undergraduate Education in Biology (SI), a group of regional workshops to help faculty members learn and implement interactive teaching methods. The SI curriculum promotes a pedagogical framework called Scientific Teaching (ST), which aims to bring the vitality of modern research into the classroom by engaging students in the scientific discovery process and using student data to inform the ongoing development of teaching methods. With the spread of ST, the need emerges to systematically define its components in order to establish a common description for education researchers and practitioners. We describe the development of a taxonomy detailing ST’s core elements and provide data from classroom observations and faculty surveys in support of its applicability within undergraduate science courses. The final taxonomy consists of 15 pedagogical goals and 37 supporting practices, specifying observable behaviors, artifacts, and features associated with ST. This taxonomy will support future educational efforts by providing a framework for researchers studying the processes and outcomes of ST-based course transformations as well as a concise guide for faculty members developing classes.

INTRODUCTION

Recognizing the importance of undergraduate science education, national organizations have issued dozens of reports over the past several decades calling for dramatic alterations to undergraduate curricula and teaching methods. Written by scientists, educators, and policy leaders, these reports have three recurrent themes. First, they propose that students should learn about the nature of science and engage in scientific practices (American Association for the Advancement of Science [AAAS], 1989, 2011; National Research Council [NRC], 1999). Second, they stress the need to incorporate learning principles from the cognitive sciences and student performance data in the ongoing development of teaching methods (NRC, 2000, 2003b, 2012). Finally, they call attention to the persistent achievement gap for members of historically underrepresented groups and recommend teaching practices that promote the success and persistence of all students (NRC, 2011; President’s Council of Advisors on Science and Technology [PCAST], 2012).

These calls have had broad impacts within the life sciences community, serving as the impetus for local and national transformation efforts. In 2003, the National Research Council’s BIO2010 report initiated an important movement within biology education by specifically calling for a professional development workshop to help faculty members cultivate their teaching skills (NRC, 2003a). In 2004, this call was answered through the founding of the Summer Institutes on Undergraduate Education in Biology (SI) with support from the Howard Hughes Medical Institute (HHMI) and the National Academy of Sciences (NAS; Pfund et al., 2009). Initially focused on biology faculty members at research institutions and held only once per year, the SI has since expanded to seven regional sites, and more than 1000 faculty members have attended the 5-d workshop as of 2014. These faculty members are primarily—but not exclusively—biologists, and they represent more than 200 institutions from across the country, including 2- and 4-yr colleges and almost all of the nation’s research-intensive universities. SI participants are trained to develop, implement, and disseminate innovative teaching practices at their home institutions, leading to an extensive network of faculty members who have been influenced by the SI program.

The SI curriculum promotes a pedagogical framework called Scientific Teaching (ST) (Handelsman et al., 2004), an approach described in a book by the same name (Handelsman et al., 2007). Reflecting the national calls from which the SI emerged, ST builds on the foundational idea that the way science is taught should reflect the way science is practiced (AAAS, 1990). ST aims to capture the spirit of scientific research by immersing students in the scientific discovery process and using evidence, either local or published, to justify the selection of teaching methods (Cross and Steadman, 1996; Angelo, 1998; Hutchings and Shulman, 1999; Handelsman et al., 2002).

ST encompasses three central tenets: active learning, assessment, and inclusivity. Active learning refers to exercises in which students do something (e.g., writing, discussing, solving, or reflecting), rather than passively listening to a lecture (Crouch and Mazur, 2001; Prince, 2004; Michael, 2006; Wood, 2009; Osborne, 2010). Assessment can be used during a learning event (formative assessment) or at the completion of a unit (summative assessment), in each case providing information to students and instructors regarding student progress (Black and Wiliam, 1998; Tanner and Allen, 2004). Inclusivity embodies the idea that undergraduate science courses contain students of diverse backgrounds and that conscious efforts are required to achieve course environments that minimize potential biases and promote the success of all students (Milem, 2001; Tanner and Allen, 2007).1 In the past decade, ST has spread throughout the biology education community, providing an overarching framework for biology education research projects and serving as the basis for a number of professional development workshops (Ebert-May and Hodder, 2008; Miller et al., 2008).

The increasing prominence of the ST approach has created a specific need to identify and define its core elements and supporting practices. ST represents a specific articulation of key educational principles pertaining to undergraduate science instruction. It is consistent with broader consensus reports, but it is distilled and packaged in a manner suitable for biology faculty with little pedagogical training. ST implementation involves complex human behaviors modified by social interactions, classroom environments, and task characteristics, such as the content and cognitive demand of a given activity (Hora and Ferrare, 2013). ST can be embodied to different degrees, with one practitioner incorporating a short classroom activity, another revamping an entire course according to the ST paradigm, and both self-reporting as engaging in ST. Some ST practices are readily apparent in the classroom environment, while other important elements are less visible, occurring behind the scenes as an instructor makes plans and adjustments throughout the semester. Several observation protocols have been developed to document classroom practices, but the degree to which they align with ST has not been defined (e.g., Piburn and Sawada, 2000; Hora and Ferrare, 2013; Smith et al., 2013). Furthermore, students can engage in course-related activities either during class or outside class through homework, projects, online forums, or other exercises. For these reasons, future efforts to study ST implementation and associated student outcomes will require systematic definition of ST in a way that accounts for its diverse applications.

Taxonomy development has been used as a research methodology in many disciplines to clarify and elaborate overarching processes, structures, and goals. For example, within the medical education community, taxonomies have been used to better describe medical errors as well as to specify desired competencies for medical residents (Zhang et al., 2004; Graham et al., 2009). Often adopting a hierarchical organization, taxonomies use explicit criteria to systematically identify, classify, and define elements that fit within a broader structure. Specifying a given domain through taxonomy development is recommended as preparation for curriculum building, program evaluation, and instrument construction (Chatterji, 2003). In addition to informing these activities, taxonomies can also guide future research efforts by summarizing current understandings and providing a defined reference point for future studies (Bordage, 2009).

In this article, we describe the development of a taxonomy that operationalizes ST principles through the specification of observable teaching practices associated with ST. We detail the process underlying its initial construction and iterative revision, and we report on classroom observations demonstrating the applicability of the taxonomy within the course context. We also provide data from faculty surveys supporting the comprehensiveness of the taxonomy. The resulting taxonomy identifies core pedagogical goals of ST and articulates specific practices aligned with each goal. By defining observable indicators of ST practice, this taxonomy provides a common framework for researchers studying the processes and outcomes of ST-based educational transformations as well as an important resource for faculty members engaged in course transformations. This taxonomy can also serve as the basis for formal instruments designed to document the implementation of ST within a course.

METHODS

Project Scope

The overall goal of this project was to make explicit the pedagogical goals of ST and to develop a list of observable practices supporting these goals (Figure 1). Using the book Scientific Teaching as the primary guide (Handelsman et al., 2007), we sought to define ST elements that apply within the undergraduate course context, acknowledging from the outset that the resulting product would not include all valuable teaching practices or address other important considerations surrounding higher education. For example, though firmly rooted in the principles of teaching inclusively and embracing diversity, ST does not extensively address issues of affordable access or student disabilities. These issues fall outside the purview of a typical instructor, being governed largely by institutional policies and student support offices. There are also general behaviors associated with quality teaching (e.g., speaking clearly, being organized, or maintaining professionalism) that are not explicitly emphasized in ST.

Figure 1. Flowchart providing a general overview of the taxonomy development process.

Identification of Pedagogical Goals

The first part of the taxonomy development process involved identifying ST’s core pedagogical goals. Here, pedagogical goals are defined as learning processes, course structures, and classroom environments that are desired by the instructor. We began by deconstructing the book Scientific Teaching into a comprehensive list of its recommended teaching practices, using other education literature to elaborate certain topics (e.g., see references included in Table 1). We next identified the central intentions and features of each teaching recommendation. For example, the specific suggestion to employ group problem solving in the classroom could be generalized into two components: having students work together and having students solve problems. These broad components were further consolidated based on related features into a list of core pedagogical goals, which were subsequently refined and translated into student-centered terms.

Table 1. References related to each ST goal

Course alignment
Students understand learning and performance expectations based on information from the instructor that defines what students should know and be able to do at course completion. (Wiggins and McTighe, 2005; Allen and Tanner, 2007; Mestre, 2008; Wood, 2009)
Students work to accomplish course objectives by participating in exercises and formative assessments that align with the desired outcomes. (Biggs, 2003; Phillips et al., 2008; Blumberg, 2009)
Student achievement of course objectives is accurately measured using summative assessments that are aligned with the desired outcomes. (Danili and Reid, 2005; Allen and Tanner, 2006; Brilleslyper et al., 2012; Kishbaugh et al., 2012)
Students inform course curriculum decisions by providing feedback and performance data to the instructor. (Novak et al., 1999; Richardson, 2005)
Science practices
Students explore the relationship between science and society by reflecting upon science in the context of society throughout history and in the present day. (Sadler et al., 2004; Zeidler et al., 2005; Chamany et al., 2008; Labov and Huddleston, 2008; Pierret and Friedrichsen, 2009)
Students use science process skills by engaging in practices integral to the performance of science. (Hanauer et al., 2006; Bao et al., 2009; Coil et al., 2010; Wei and Woodin, 2011; Goldey et al., 2012)
Students synthesize experimental results by critically evaluating multiple pieces of data and drawing conclusions based on evidence and reasoning. (Svoboda and Passmore, 2013; Wiley and Stover, 2014; Osborne, 2010)
Students engage in formal scientific discourse by interpreting and communicating scientific ideas. (Hoskins et al., 2007; Brownell et al., 2013; Stanton, 2013)
Student participation
Students engage in class by participating in active-learning exercises that serve as formative assessments. (Black and Wiliam, 1998; Hake, 1998; Prince, 2004; Nicol and Macfarlane-Dick, 2006; Freeman et al., 2007; Armbruster et al., 2009)
Students refine their knowledge through peer interactions by participating in small-group activities that require discussion. (Springer et al., 1999; Wright and Boggs, 2002; Tanner et al., 2003; Smith et al., 2009; Tanner, 2009)
Students participate at the whole-class level, because the instructor provides mechanisms and formats that facilitate class-wide participation. (Nicol and Boyle, 2003; Wood, 2004; Crossgrove and Curran, 2008; Kay and LeSage, 2009)
Students of diverse backgrounds are affirmed as members of the class and scientific community by considering the perspectives and contributions of people with different origins, genders, and affiliations. (Steele, 1997; Seymour, 2000; Dasgupta and Greenwald, 2001; Uhlmann and Cohen, 2005; Tanner and Allen, 2007)
Cognitive processes
Students practice higher-order cognitive skills by applying, analyzing, synthesizing, or evaluating evidence, concepts, or arguments. (Dori et al., 2003; Miri et al., 2007; Crowe et al., 2008; DeHaan, 2009)
Students transfer knowledge and skills across disciplines by utilizing skills or concepts from multiple disciplines to solve scientific problems. (Bialek and Botstein, 2004; Labov et al., 2010; Tra and Evans, 2010)
Students learn to think metacognitively by reflecting on the effectiveness of their learning and problem-solving strategies. (Ertmer and Newby, 1996; Pintrich, 2002; Schraw et al., 2006; Tanner, 2012)

Elaboration of Supporting Practices

To operationalize ST in an explicit manner, we further described each pedagogical goal in terms of specific practices that support its achievement. For example, one goal is for students to “learn to think metacognitively,” a process that can manifest itself in many ways and be elicited by different kinds of activities. To elaborate each goal, we articulated a general approach to encapsulate the different ways that the goal could be achieved and then compiled a series of supporting practices that exemplify each general approach. Many supporting practices were found within the Scientific Teaching book, and we drew from our collective experiences as students and instructors to supplement these practices. Once a draft list had been developed, we conducted informal classroom observations of ST-trained instructors to identify additional practices that had been potentially overlooked. Several rounds of iterative revisions ultimately led to the production of a complete draft taxonomy.

Testing the Applicability of the Draft Taxonomy

Like many educational approaches, the nature of ST precludes definition in absolute or authoritative terms. The growing community of practitioners who implement and disseminate ST shapes its features in an ongoing manner. In light of these qualities, we conducted classroom observations to determine whether the supporting practices listed were in fact detectable within classroom environments. The intent of this exercise was not to use the taxonomy as a measurement tool but rather to ensure that the supporting practices we had compiled were being implemented in practice. Ten faculty members from biology and other science disciplines at the University of Colorado were recruited by email and observed for one class meeting each (Table 2). These faculty members were primarily, but not exclusively, former SI participants and/or had a reputation for utilizing transformed teaching practices. For each observation, two investigators recorded field notes regarding student and instructor activities. After class, each investigator separately determined which of the initial taxonomy’s 38 supporting practices occurred at least once during the class. Initial interrater agreement was 84%, and consensus was reached on the remaining items through discussion. Importantly, 89% of the supporting practices (34 of 38) were scored at least once during this series of classroom observations. These observations suggest that the supporting practices in the draft taxonomy represent a reasonable list of behaviors that are both used and observable.
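
For readers who want to compute a comparable agreement figure with their own observation data, the short Python sketch below shows one way to calculate percent agreement (the statistic reported here) and Cohen's kappa (a common chance-corrected alternative that the authors do not report). The presence/absence codes in the sketch are randomly generated placeholders, not the authors' data.

    # Minimal sketch: agreement between two observers coding which of the 38
    # supporting practices occurred in a class. Codes below are illustrative
    # placeholders (1 = practice observed, 0 = not observed), not real data.
    import random

    random.seed(0)
    n_practices = 38
    rater_a = [random.randint(0, 1) for _ in range(n_practices)]
    # In this toy example, the second rater agrees with the first about 85% of the time.
    rater_b = [a if random.random() < 0.85 else 1 - a for a in rater_a]

    # Percent agreement: fraction of practices coded identically by both raters.
    agreement = sum(a == b for a, b in zip(rater_a, rater_b)) / n_practices

    # Cohen's kappa corrects the raw agreement for agreement expected by chance.
    p_yes_a = sum(rater_a) / n_practices
    p_yes_b = sum(rater_b) / n_practices
    p_chance = p_yes_a * p_yes_b + (1 - p_yes_a) * (1 - p_yes_b)
    kappa = (agreement - p_chance) / (1 - p_chance)

    print(f"Percent agreement: {agreement:.0%}")
    print(f"Cohen's kappa: {kappa:.2f}")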

Table 2. Sample demographics for class observations and faculty surveys

Class observations (n = 10 total)
Instructors trained at SI 5
Instructors trained elsewhere 2
Instructors informally trained 3
Lower-division courses 5
Upper-division courses 5
Small enrollment (10–25 students) 4
Medium enrollment (26–100 students) 3
Large enrollment (>100 students) 3
Biology courses 7
Other STEM courses 3
Faculty surveys (n = 14 total)
Instructors trained at SI 9
Instructors trained elsewhere 5
Biology instructors 10
Other STEM instructors 4

As a supplement to classroom observations, we also conducted an online survey to determine the extent to which a sample of faculty members would recapitulate our list of supporting practices. Fourteen additional faculty members, including 11 from other institutions, who had completed SI or other similar training were recruited by email to complete a 30-min online survey administered via Qualtrics (Table 2). In the survey, participants were presented with a series of pedagogical goals along with corresponding general approaches and were asked to describe practices they use to achieve these goals in their classes. Participants were given five text-entry boxes for each goal as well as an additional box for general comments. For prevention of survey fatigue, each participant was presented with roughly half of the 17 different goals from the draft taxonomy. From this survey, we collected a total of 288 reported practices, which were subsequently reviewed to determine alignment with the existing supporting practices.
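
The article does not specify how the goals were divided among respondents. As a hypothetical illustration only, the Python sketch below assigns each of 14 participants a random subset of 9 of the 17 draft goals, which is one simple way to limit survey length for any individual while still covering the full taxonomy across the sample; the goal and participant labels are placeholders, not the authors' procedure.

    # Hypothetical sketch of splitting the 17 draft goals across 14 respondents
    # so that each person sees roughly half of the items.
    import random

    random.seed(42)
    goals = [f"Goal {i:02d}" for i in range(1, 18)]               # 17 draft pedagogical goals
    participants = [f"Respondent {i:02d}" for i in range(1, 15)]  # 14 faculty volunteers

    assignments = {}
    for person in participants:
        # Each respondent evaluates a random 9 of the 17 goals (roughly half).
        assignments[person] = sorted(random.sample(goals, 9))

    # Quick check that every goal is covered by at least one respondent.
    coverage = {g: sum(g in items for items in assignments.values()) for g in goals}
    print("Fewest respondents assigned to any single goal:", min(coverage.values()))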

The faculty surveys allowed us to address two principal objectives: to determine whether the supporting practices on the draft taxonomy were similar to those generated by an independent group of faculty members and to identify additional supporting practices reported by faculty members. Two investigators worked together to determine the degree of alignment between the so-called faculty-generated (FG) practices and the existing practices by determining whether each FG response qualified as a general restatement or a specific example of an existing practice based on related keywords, themes, and characteristics. For example, when presented with the goal of students “using science process skills,” the FG response of having “students generate hypotheses and make predictions” was judged to be analogous to the existing practice of having “students identify, construct, or evaluate hypotheses and make predictions based on their hypotheses.” One investigator initially aligned the FG practices to the existing supporting practices, a second investigator reviewed all of the assignments, and then the two investigators discussed any disagreements. Again, 89% of the existing practices (34 of 38) were aligned with one or more FG responses, providing confirmation that the existing practices could largely be corroborated by faculty practitioners.2 Faculty members employ similar practices to achieve certain pedagogical goals, and many supporting practices were therefore corroborated by several FG responses. While the majority of the FG practices could be paired with an existing practice, roughly 10% (28 of 288) did not align with an existing practice for the pedagogical goal under which they were originally submitted. Among these, some aligned with other pedagogical goals, while others were deemed to be outside the scope of ST. The few remaining FG practices were added during a final round of taxonomy revisions.

Final Taxonomy Revisions

While the draft taxonomy (with 38 practices) showed considerable alignment with observed and reported teaching practices, our efforts revealed a few areas for further revision. Language throughout the taxonomy was clarified to be more parsimonious, including the merging of two pairs of pedagogical goals based on overlap within faculty responses (e.g., faculty members did not make distinctions between collaborative and cooperative learning approaches). Four supporting practices were removed because they were redundant with other items, occurred outside the observable course context, or were not reported by faculty members. Three new supporting practices were added to reflect previously unlisted FG practices. As an example, for the goal of “affirming students of diverse backgrounds,” multiple respondents mentioned “employing mechanisms to enhance diversity within student groups.” This practice was added to the taxonomy. After final taxonomy revisions, only one supporting practice remained that had not been observed in the classroom or mentioned on faculty surveys. This was having “students analyze data using appropriate methods.” Because this practice had been identified in national reports as an important component of developing science process skills (NRC, 2003a; AAAS, 2011), it was retained in the final taxonomy (37 practices altogether).
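
As a quick consistency check on these counts: the draft contained 38 supporting practices, 4 were removed and 3 were added, which yields the 37 practices of the final taxonomy (38 - 4 + 3 = 37).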

RESULTS

Taxonomy Structure

The final ST taxonomy consists of a series of 15 pedagogical goals, 15 general approaches, and 37 supporting practices arranged in a hierarchical manner (Table 3). The taxonomy operationalizes ST by identifying its core elements and elaborating explicit behaviors, artifacts, and features associated with each element. Several aspects of the taxonomy reflect the ongoing evolution of ST since its original publication. For example, while ST was developed within the context of biology education, the taxonomy maintains its applicability throughout the sciences by utilizing interdisciplinary language. Furthermore, the SI curriculum and Scientific Teaching book are geared toward instructors, and they fittingly describe actions that instructors can take to build productive learning environments for their students. In contrast, the taxonomy is phrased with an explicit focus on student actions and perceptions. This student-centered language is not intended to diminish the importance of the instructor role but rather to emphasize that ST is ultimately about what students do and perceive.

Table 3. The complete taxonomy of observable ST practices

Pedagogical goal: a particular learning process, structure, or environment desired by the instructor.
General approach: a general statement of how the given pedagogical goal will be achieved.
Supporting practices: specific actions, materials, or capabilities that exemplify the general approach.
Course alignment
Students understand learning and performance expectations based on information from the instructor that defines what students should know and be able to do at course completion. 1. Students are provided learning goals detailing conceptual understandings, content knowledge, and process skills they are expected to master.
Students work to accomplish course objectives by participating in exercises and formative assessments that align with the desired outcomes. 2. Students are able to connect activities and formative assessments with specific learning objectives.
Student achievement of course objectives is accurately measured by using summative assessments that are aligned with the desired outcomes. 3. Students are able to connect material on summative assessments to specific learning objectives.
4. Student summative assessments use different formats or multiple types of answer input.
Students inform course curriculum decisions by providing feedback and performance data to the instructor.
  5. Students are given the opportunity to provide feedback on course structure and content.
  6. Students ask questions or state interests that are pursued during class.
  7. Students are given supporting activities when assessment reveals a problem area.

Science practices

Students explore the relationship between science and society by reflecting upon science in the context of society throughout history and in the present day.
  8. Students use historical information to recognize why certain discoveries represent paradigm shifts or major technological advancements.
  9. Students relate scientific concepts to everyday phenomena or human experiences.
  10. Students utilize scientific judgment to address challenges facing nature or society.

Students use science process skills by engaging in practices integral to the performance of science.
  11. Students identify, construct, or evaluate hypotheses and make predictions based on their hypotheses.
  12. Students design and evaluate experimental strategies.
  13. Students analyze data using appropriate methods, such as descriptive or inductive statistics.
  14. Students construct graphs or tables and analyze results presented in these formats.

Students synthesize experimental results by critically evaluating multiple pieces of data and drawing conclusions based on evidence and reasoning.
  15. Students formulate or evaluate conceptual models based on data and inference.
  16. Students attempt to reconcile conflicting pieces of data.
  17. Students develop arguments or make decisions based on experimental data.

Students engage in formal scientific discourse by interpreting and communicating scientific ideas.
  18. Students read and evaluate scientific literature, including peer-reviewed and popular media articles.
  19. Students present scientific ideas in written or oral formats.

Student participation

Students engage in class by participating in active-learning exercises that serve as formative assessments.
  20. Students answer questions, solve problems, or construct representations.
  21. Students complete formative assessment activities and receive feedback on their answers.

Students refine their knowledge through peer interactions by participating in small-group activities that require discussion.
  22. Students complete worksheets, discuss problems, and perform other activities in groups of two or more.
  23. Students provide peer feedback on projects, assessments, or other activities.
  24. Students complete tasks wherein the success of the group involves the participation of each group member.

Students participate at the whole-class level because the instructor provides mechanisms and formats that facilitate class-wide participation.
  25. Students use an audience response system or other polling method to answer content questions.
  26. Students report the results of group work to the whole class.
  27. Students are encouraged to respond to other student ideas.

Students of diverse backgrounds are affirmed as members of the class and scientific community by considering the perspectives and contributions of people with different origins, genders, and affiliations.
  28. Students consider contributions of diverse people and perspectives in the realm of scientific discovery.
  29. Students utilize examples and analogies that reflect diverse people and cultures.
  30. Students are grouped using mechanisms that enhance the diversity of each group.
  31. Students are aware of instructor sensitivity to socially controversial issues.

Cognitive processes

Students practice higher-order cognitive skills by applying, analyzing, synthesizing, or evaluating evidence, concepts, or arguments.
  32. Students incorporate lower-order knowledge into higher-order cognitive skills development.
  33. Students interpret or construct conceptual representations in a variety of formats, including video, pictorial, graphic, or mathematical.
  34. Students engage in structured, open-ended inquiry exercises, such as case-based or problem-based activities.

Students transfer knowledge and skills across disciplines by utilizing skills or concepts from multiple disciplines to solve scientific problems.
  35. Students apply knowledge from mathematics, computer science, biology, chemistry, physics, or other disciplines within the context of a different discipline.

Students learn to think metacognitively by reflecting on the effectiveness of their learning and problem-solving strategies.
  36. Students consider assumptions, appropriateness of skills utilized, or thought processes when solving problems or answering questions.
  37. Students reflect on the effectiveness of their study habits.

While ST is traditionally defined according to the general tenets of active learning, assessment, and inclusivity, the taxonomy is divided more specifically into four sections pertaining to course alignment, science practices, student participation, and cognitive processes. The course alignment section focuses on the interrelation of three curricular components: learning goals, instructional activities, and summative assessments. Learning goals define what students should know and be able to do upon course completion, instructional activities provide students with opportunities to accomplish this learning, and summative assessments gauge the degree to which students achieved the original goals. ST advocates that instructors explicitly communicate their learning goals to ensure that students are aware of the conceptual understandings, content knowledge, and process skills they are expected to master. ST also stresses that activities and assessments must align with course learning goals to provide suitable avenues for intended learning and accurate measures of student achievement. This alignment creates an important feedback loop that enables instructors to use student data to decide how to spend valuable class time and to improve their teaching methods.
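To make the alignment idea concrete, the sketch below models a course as a set of learning goals, each mapped to the activities and summative assessments intended to address it, and flags goals that lack either. The course content, goal names, and the check_alignment helper are hypothetical illustrations under this interpretation of alignment, not an instrument defined by the taxonomy itself.

```python
# Minimal sketch of a course-alignment check (hypothetical data and helper names).
from dataclasses import dataclass, field

@dataclass
class LearningGoal:
    text: str
    activities: list = field(default_factory=list)    # instructional activities that give practice with the goal
    assessments: list = field(default_factory=list)   # summative assessment items that measure the goal

def check_alignment(goals):
    """Report goals lacking an aligned activity or summative assessment."""
    problems = []
    for g in goals:
        if not g.activities:
            problems.append(f"No activity gives students practice with: {g.text}")
        if not g.assessments:
            problems.append(f"No summative assessment measures: {g.text}")
    return problems

# Hypothetical example course
goals = [
    LearningGoal("Interpret a graph of enzyme kinetics",
                 activities=["clicker questions on Vmax", "group graphing worksheet"],
                 assessments=["exam 1, question 4"]),
    LearningGoal("Design a controlled experiment",
                 activities=["case study on experimental design"],
                 assessments=[]),  # flagged: taught but never assessed
]

for issue in check_alignment(goals):
    print(issue)
```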

The science practices section elaborates the idea that undergraduate science courses should capture the spirit of scientific discovery by engaging students in scientific processes. This differentiates ST from more general pedagogical approaches (e.g., student-centered learning or team-based learning) that do not explicitly focus on incorporating scientific reasoning. By articulating that students should explore the relationship between science and society, engage in experimental design and interpretation, and participate in formal scientific discourse, this section addresses the dual intentions of preparing scientifically literate citizens and training future scientists.

The student participation section of the taxonomy contains pedagogical goals related to how students participate within a course. These goals embody a constructivist approach by acknowledging the importance of student knowledge and the value of enabling students to build their own mental models through active engagement and peer feedback. One pedagogical goal describes involving students at the whole-class level through the use of classroom response systems such as clickers or the reporting out of group work. Another goal in this section serves to help students feel affirmed as members of the class and the larger scientific community, irrespective of their backgrounds or future career aspirations. Instructors are challenged to demonstrate “cultural competence” by adopting teaching materials and practices that help students from diverse backgrounds self-identify as science practitioners (Tanner and Allen, 2007).

The final section of the taxonomy focuses on different types of cognitive processes that should be cultivated. For students to be innovative and productive members of the future workforce, they must be able not only to remember and comprehend essential definitions and processes but also to apply, analyze, synthesize, and evaluate disparate pieces of information (Bloom et al., 1956; Anderson and Krathwohl, 2001). The increasingly interdisciplinary nature of modern research fields has created demand for the ability to integrate concepts across disciplinary boundaries. Furthermore, students need to be encouraged to develop metacognitive habits that allow them to self-reflect in order to optimize their problem-solving and study skills (Tanner, 2012).

To visualize the relationship between the taxonomy and ST’s traditional tenets, we categorized each supporting practice as supporting active learning, assessment, or inclusivity (Figure 2). Not surprisingly, we found extensive overlap among these tenets, with most supporting practices addressing multiple tenets. Scientific Teaching itself suggests that active learning and assessment are inextricably linked, in that active learning incorporates assessment of student understanding, and assessment necessarily elicits active student engagement. This is borne out in the taxonomy: 27 practices address both active learning and assessment. ST is also intended to be inclusive, and there is growing evidence that implementing interactive teaching practices and developing science process skills can have beneficial outcomes for members of traditionally underrepresented groups (Dirks and Cunningham, 2006; Haak et al., 2011; NRC, 2012). Within the taxonomy, we found that 21 supporting practices were related to inclusive teaching. While these designations are not intended to be absolute, they do serve to illustrate the relationships among the traditional ST tenets.

Figure 2. Venn diagram showing the classification of supporting practices under the ST pillars of active learning, assessment, and inclusivity. Numbers from the ST taxonomy are used to indicate the categorization of each supporting practice.
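As a rough illustration of how such a categorization can be tallied, the sketch below assigns a few practice numbers to each tenet and uses set operations to compute the regions of a Venn diagram. The specific number-to-tenet assignments are placeholders chosen for illustration only; they are not the categorization reported in Figure 2.

```python
# Sketch of tallying taxonomy practices under the three ST tenets (placeholder assignments).
from itertools import combinations

# Hypothetical partial assignments of practice numbers to tenets (not the Figure 2 data).
tenets = {
    "active learning": {20, 21, 22, 25, 26},
    "assessment": {5, 7, 21, 22, 25, 26},
    "inclusivity": {22, 28, 29, 30},
}

# Practices addressing each pair of tenets.
for a, b in combinations(tenets, 2):
    shared = tenets[a] & tenets[b]
    print(f"{a} and {b}: {sorted(shared)}")

# Practices addressing all three tenets at once (the center of the Venn diagram).
center = set.intersection(*tenets.values())
print("all three tenets:", sorted(center))
```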

ST consolidates research-based practices from across the science education literature, but it contains particular terminology and emphases that make it unique. After constructing the final taxonomy, we sought to align it with the broader education literature by collecting reviews and research articles describing and justifying ST approaches. Table 1 lists several references related to each pedagogical goal. These citations demonstrate that the ST pedagogical goals are consistent with the broader educational dialogue. Most articles apply to more than one pedagogical goal, so the designation of an article under a particular goal does not negate its applicability to other parts of the taxonomy. For example, in addition to an overall focus on helping students develop higher-order cognitive skills, the Blooming Biology Tool described by Crowe et al. (2008) has applications for course alignment, activity development, assessment, and student metacognition. We hope that this table will provide a helpful starting point for practitioners wanting to learn more about ST’s research foundations.

DISCUSSION

In response to ongoing national calls, ST was formulated as a way to help scientists bring their expertise into the classroom in more authentic and productive ways to improve student learning. With numerous copies of Scientific Teaching in circulation and hundreds of educators being trained in ST-based programs each year, ST has achieved significant influence within the education community. Future efforts to monitor ST’s use and impact will depend on having ways to identify its application. In this study, we have operationalized ST practices through the development of a taxonomy that identifies the core goals of ST and defines observable practices associated with each goal, thus providing faculty members with a concise, inclusive reference guide representing key ST elements.

To achieve the goal of defining ST in explicit terms, we took the viewpoint of an objective observer with course access similar to that of a student (i.e., someone who can attend class, download course materials, take exams, etc.). While many aspects of ST remain hidden from such an observer, we attempted to identify how these hidden elements are manifested. For example, ST advocates the use of a curriculum design process called backward design (Wiggins and McTighe, 2005). During the backward design process, an instructor first drafts course learning goals, then considers evidence that would indicate student mastery of the learning goals, and finally designs instructional activities that allow students to reach the desired performance level. While this design process provides a valuable guide, it is not possible to objectively determine whether an instructor’s method was in fact “backward” without being present throughout course construction. Nonetheless, the desired outcome of a rational course design can be judged through the provision of written learning goals to students and alignment of subsequent activities and assessments to these goals.

The ST taxonomy is designed to have broad relevance within the context of undergraduate science courses. ST can be used in many different parts of a course, including lecture, lab, recitation, homework, and online forums. The taxonomy describes student actions and experiences that can occur within any of these different settings and includes practices that are found across a wide range of course sizes. While some ST practices are easier to implement in smaller classes, all of the practices listed in the taxonomy are feasible in a large-enrollment class, particularly with the aid of instructional technologies or additional personnel (Caldwell, 2007; Otero et al., 2010). Finally, the taxonomy is intended to be applicable across the sciences, and efforts were made during the validation process to include perspectives from non–biology disciplines. However, given ST’s historical roots and current prominence within biology, it remains possible that the taxonomy does not fully account for teaching practices unique to other disciplines. Understanding these pedagogical differences remains an important research question, particularly in light of ongoing efforts to develop interdisciplinary courses and learning environments (Meredith and Redish, 2013; O’Shea et al., 2013; Thompson et al., 2013).

The taxonomy addresses two important issues related to conducting research on the use of transformed teaching practices. First, previous efforts to gauge teaching practices have been criticized for an overreliance on self-reported data, which may not accurately reflect actual classroom practices (Kane et al., 2002; Ebert-May et al., 2011). Second, many previous studies on the implementation of research-based instructional practices have focused on the use of specific strategies (e.g., clicker questions) that each contain a number of different components (e.g., question content, peer discussion, group sharing, etc.). Instructors tend to adapt these strategies according to their own classroom needs, resulting in a wide range of different practices that recapitulate the original design with varying degrees of fidelity (Turpen and Finkelstein, 2009; Dancy and Henderson, 2010). By specifying a comprehensive list of practices indicative of ST, the taxonomy begins to address each of these issues, laying the groundwork for observation-based rubrics that separately identify the different layers present within any given course exercise.
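As one sketch of how observation-based coding might separate the layers of a single exercise, the snippet below records a hypothetical clicker episode as the set of taxonomy practice numbers it touches, drawing labels from the numbered practices listed above. The episode description and its practice assignments are illustrative only, not an official coding scheme derived from the taxonomy.

```python
# Sketch: coding one classroom exercise against taxonomy practice numbers (illustrative only).

# Short labels for a few supporting practices from the taxonomy above.
PRACTICES = {
    21: "complete formative assessment activities and receive feedback",
    22: "discuss problems in groups of two or more",
    25: "answer content questions via an audience response system or other polling method",
    26: "report group results to the whole class",
}

# One hypothetical observed episode: a clicker question with peer discussion and report-out.
observed_episode = {
    "description": "clicker question on experimental controls, then pair discussion and share-out",
    "practices": {25, 22, 26, 21},
}

# An observer (or a rubric built on the taxonomy) can then list the separate layers of the exercise.
for number in sorted(observed_episode["practices"]):
    print(f"Practice {number}: students {PRACTICES[number]}")
```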

While the items on the taxonomy are written in explicit terms, the taxonomy is not intended as a formal observation protocol for classroom evaluation. We anticipate that the taxonomy will serve as the basis for the development of such instruments, which will require additional delineation of scoring mechanisms, response scales, and measurement criteria. The taxonomy is consistent with the frameworks underlying several existing observation protocols, but also identifies elements unique to ST. Developed for K–12 classrooms, the Reformed Teaching Observation Protocol, or RTOP, captures the student-centeredness of a classroom, and most of its items are consistent with the goals and practices listed in the ST taxonomy (Piburn and Sawada, 2000). The Teaching Dimensions Observation Protocol, or TDOP, is based on systems-of-practice theory and accounts for different dimensions of the classroom environment, including teaching methods, pedagogical strategies, student–teacher interactions, cognitive engagement, and instructional technology (Hora and Ferrare, 2013). The TDOP thus includes codes that capture elements of ST as well as practices beyond the scope of ST. Developed under the TDOP framework, the Classroom Observation Protocol for Undergraduate STEM, or COPUS, focuses on faculty and student behaviors, but does not include codes to capture course alignment, science practices, or cognitive processes (Smith et al., 2013).

In its current form, the taxonomy has several applications for instructors and departments working to improve their educational programs. First, individual faculty members can use the taxonomy to 1) self-assess their own teaching and identify ways to diversify their courses; 2) report and justify the use of transformed teaching practices to promotion and tenure committees; and 3) communicate the rationale behind instructional decisions to students (e.g., on the first day of class in discussing course goals and format). Second, as a professional development tool, the taxonomy can be used to informally document classroom practices and to facilitate dialogue with course instructors (see Supplemental Material 1 and 2). Third, the taxonomy can provide a basis for pedagogy-related conversations at the departmental level: 1) departments engaged in curricular reviews can identify when, where, and how their students are addressing each pedagogical goal within their program; 2) faculty members can use the taxonomy as a common frame of reference drawn from the education literature to guide formative peer feedback (Gormally et al., 2014), rather than invoking outdated or subjective ideas of what constitutes “good” teaching; and 3) since the content of the taxonomy complements the PULSE Vision and Change rubrics, it can be used as part of departmental efforts to self-evaluate awareness, acceptance, and use of transformed teaching practices (Aguirre et al., 2013).

While the ST taxonomy has numerous applications for researchers and instructors with an ST background, we propose that the taxonomy is also useful for individuals engaged in other reform initiatives. The language of the taxonomy is general in nature and captures many key ideas related to educational reform. Thus, we foresee the taxonomy serving as a theoretical underpinning for researchers studying the outcomes of general professional development workshops or other efforts of a more comprehensive nature. We also expect that the taxonomy will be useful for introducing instructors to different aspects of course transformation and providing a concise representation of research-based educational practices. The instructional uses described above would apply equally well to faculty members who have received ST training, non-ST training, or no educational training.

The taxonomy presented in this article identifies and defines observable practices associated with ST. It is not expected that all the goals listed on the taxonomy would be addressed in a single class period or assignment, as the suitability of these practices depends on the scope and goals of a course. Finally, the taxonomy represents an articulation of ST in its current state. Staying true to ST principles, the taxonomy should evolve as future research efforts lead to a deeper understanding of how different teaching practices affect student outcomes. Please contact the corresponding author (B.A.C.) if you would like a copy of the taxonomy that has been formatted as a one-page handout.

FOOTNOTES

1This tenet was originally dubbed “diversity” in the ST literature; the label “inclusivity” is used here to remain consistent with the SI curriculum, which recently adopted this new terminology to reflect the notion that deliberate steps are required to achieve unbiased learning environments.

2Classroom observations and faculty surveys together supported 32 practices. Of the remaining practices, two practices were not observed, two practices were not reported, and two practices were neither observed nor reported.

ACKNOWLEDGMENTS

This work was supported by the University of Colorado Boulder through the Science Education Initiative and a Chancellor’s Award for Excellence in STEM Education to B.A.C. M.J.G. was supported by an HHMI Professors Program award to Jo Handelsman (principal investigator) and the Yale Center for Scientific Teaching. This work was also supported by a National Science Foundation TUES type 3 award (DUE-1323019) to B.A.C., M.J.G., J.K.K., and others. We thank Bill Wood for providing crucial feedback on the taxonomy and manuscript. We thank the faculty members and students who participated in classroom observations and the faculty members who completed our online survey. This research was classified as exempt from institutional review board review (protocol 13-0315).

REFERENCES

  • Aguirre KM, Balser TC, Jack T, Marley KE, Miller KG, Osgood MP, Pape-Lindstrom PA, Romano SL (2013). PULSE Vision & Change rubrics. CBE Life Sci Educ 12, 579-581.
  • Allen D, Tanner K (2006). Rubrics: tools for making learning goals and evaluation criteria explicit for both teachers and learners. Cell Biol Educ 5, 197-203.
  • Allen D, Tanner K (2007). Putting the horse back in front of the cart: using visions and decisions about high-quality learning experiences to drive course design. CBE Life Sci Educ 6, 85-89.
  • American Association for the Advancement of Science (AAAS) (1989). Science for All Americans, New York: Oxford University Press.
  • AAAS (1990). The Liberal Art of Science, Washington, DC.
  • AAAS (2011). Vision and Change in Undergraduate Biology Education: A Call to Action, Washington, DC.
  • Anderson LW, Krathwohl DR (2001). A Taxonomy for Learning, Teaching, and Assessing: A Revision of Bloom’s Taxonomy of Educational Objectives, New York: Longman.
  • Angelo TA (1998). Classroom Assessment and Research: An Update on Uses, Approaches, and Research Findings, San Francisco, CA: Jossey-Bass.
  • Armbruster P, Patel M, Johnson E, Weiss M (2009). Active learning and student-centered pedagogy improve student attitudes and performance in introductory biology. CBE Life Sci Educ 8, 203-213.
  • Bao L, Cai T, Koenig K, Fang K, Han J, Wang J, Qing L, Ding L, Cui L, Luo Y, et al. (2009). Learning and scientific reasoning. Science 323, 586-587.
  • Bialek W, Botstein D (2004). Introductory science and mathematics education for 21st-century biologists. Science 303, 788-790.
  • Biggs J (2003). Aligning teaching and assessing to course objectives. Teach Learn High Educ 2, 13-17.
  • Black P, Wiliam D (1998). Assessment and classroom learning. Assess Educ Princ Policy Pract 5, 7-74.
  • Bloom BS, Engelhart MD, Furst FJ, Hill WH, Krathwohl DR (1956). Taxonomy of Educational Objectives: The Classification of Educational Goals, Handbook I: Cognitive Domain, New York: David McKay.
  • Blumberg P (2009). Maximizing learning through course alignment and experience with different types of knowledge. Innov High Educ 34, 93-103.
  • Bordage G (2009). Conceptual frameworks to illuminate and magnify. Med Educ 43, 312-319.
  • Brilleslyper M, Ghrist M, Holcomb T, Schaubroeck B, Warner B, Williams S (2012). What’s the point? The benefits of grading without points. PRIMUS 22, 411-427.
  • Brownell SE, Price JV, Steinman L (2013). A writing-intensive course improves biology undergraduates’ perception and confidence of their abilities to read scientific literature and communicate science. Adv Physiol Educ 37, 70-79.
  • Caldwell JE (2007). Clickers in the large classroom: current research and best-practice tips. CBE Life Sci Educ 6, 9-20.
  • Chamany K, Allen D, Tanner K (2008). Making biology learning relevant to students: integrating people, history, and context into college biology teaching. CBE Life Sci Educ 7, 267-278.
  • Chatterji M (2003). Designing and Using Tools for Educational Assessment, Boston: Allyn and Bacon.
  • Coil D, Wenderoth MP, Cunningham M, Dirks C (2010). Teaching the process of science: faculty perceptions and an effective methodology. CBE Life Sci Educ 9, 524-535.
  • Cross KP, Steadman MH (1996). Classroom Research: Implementing the Scholarship of Teaching, San Francisco, CA: Jossey-Bass.
  • Crossgrove K, Curran KL (2008). Using clickers in nonmajors- and majors-level biology courses: student opinion, learning, and long-term retention of course material. CBE Life Sci Educ 7, 146-154.
  • Crouch CH, Mazur E (2001). Peer instruction: ten years of experience and results. Am J Phys 69, 970-977.
  • Crowe A, Dirks C, Wenderoth MP (2008). Biology in Bloom: implementing Bloom’s taxonomy to enhance student learning in biology. CBE Life Sci Educ 7, 368-381.
  • Dancy M, Henderson C (2010). Pedagogical practices and instructional change of physics faculty. Am J Phys 78, 1056-1063.
  • Danili E, Reid N (2005). Assessment formats: do they make a difference? Chem Educ Res Pract 6, 204-212.
  • Dasgupta N, Greenwald AG (2001). On the malleability of automatic attitudes: combating automatic prejudice with images of admired and disliked individuals. J Pers Soc Psychol 81, 800-814.
  • DeHaan RL (2009). Teaching creativity and inventive problem solving in science. CBE Life Sci Educ 8, 172-181.
  • Dirks C, Cunningham M (2006). Enhancing diversity in science: is teaching science process skills the answer. Cell Biol Educ 5, 218-226.
  • Dori YJ, Tal RT, Tsaushu M (2003). Teaching biotechnology through case studies—can we improve higher order thinking skills of nonscience majors? Sci Educ 87, 767-793.
  • Ebert-May D, Derting TL, Hodder J, Momsen JL, Long TM, Jardeleza SE (2011). What we say is not what we do: effective evaluation of faculty professional development programs. BioScience 61, 550-558.
  • Ebert-May D, Hodder J (2008). Pathways to Scientific Teaching, Sunderland, MA: Sinauer.
  • Ertmer PA, Newby TJ (1996). The expert learner: strategic, self-regulated, and reflective. Instruct Sci 24, 1-24.
  • Freeman S, O’Connor E, Parks JW, Cunningham M, Hurley D, Haak D, Dirks C, Wenderoth MP (2007). Prescribed active learning increases performance in introductory biology. CBE Life Sci Educ 6, 132-139.
  • Goldey ES, Abercrombie CL, Ivy TM, Kusher DI, Moeller JF, Rayner DA, Smith CF, Spivey NW (2012). Biological inquiry: a new course and assessment plan in response to the call to transform undergraduate biology. CBE Life Sci Educ 11, 353-363.
  • Gormally C, Evans M, Brickman P (2014). Feedback about teaching in higher ed: neglected opportunities to promote change. CBE Life Sci Educ 13, 187-199.
  • Graham MJ, Naqvi Z, Encandela J, Harding KJ, Chatterji M (2009). Systems-based practice defined: taxonomy development and role identification for competency assessment of residents. J Grad Med Educ 1, 49-60.
  • Haak DC, HilleRisLambers J, Pitre E, Freeman S (2011). Increased structure and active learning reduce the achievement gap in introductory biology. Science 332, 1213-1216.
  • Hake RR (1998). Interactive-engagement versus traditional methods: a six-thousand-student survey of mechanics test data for introductory physics courses. Am J Phys 66, 64-74.
  • Hanauer DI, Jacobs-Sera D, Pedulla ML, Cresawn SG, Hendrix RW, Hatfull GF (2006). Teaching scientific inquiry. Science 314, 1880-1881.
  • Handelsman J, Ebert-May D, Beichner R, Bruns P, Chang A, DeHaan R, Gentile J, Lauffer S, Stewart J, Tilghman SM, Wood WB (2004). Scientific teaching. Science 304, 521-522.
  • Handelsman J, Houser B, Kriegel H (2002). Biology Brought to Life: A Guide to Teaching Students to Think Like Scientists, New York: McGraw-Hill.
  • Handelsman J, Miller S, Pfund C (2007). Scientific Teaching, New York: Freeman.
  • Hora MT, Ferrare JJ (2013). Instructional systems of practice: a multidimensional analysis of math and science undergraduate course planning and classroom teaching. J Learn Sci 22, 212-257.
  • Hoskins SG, Stevens LM, Nehm RH (2007). Selective use of the primary literature transforms the classroom into a virtual laboratory. Genetics 176, 1381-1389.
  • Hutchings P, Shulman LS (1999). The scholarship of teaching: new elaborations, new developments. Change 31, 10-15.
  • Kane R, Sandretto S, Heath C (2002). Telling half the story: a critical review of research on the teaching beliefs and practices of university academics. Rev Educ Res 72, 177-228.
  • Kay RH, LeSage A (2009). Examining the benefits and challenges of using audience response systems: a review of the literature. Comp Educ 53, 819-827.
  • Kishbaugh TLS, Cessna S, Horst SJ, Leaman L, Flanagan T, Graber Neufeld D, Siderhurst M (2012). Measuring beyond content: a rubric bank for assessing skills in authentic research assignments in the sciences. Chem Educ Res Prac 13, 268-276.
  • Labov JB, Huddleston NF (2008). Integrating policy and decision making into undergraduate science education. CBE Life Sci Educ 7, 347-352.
  • Labov JB, Reid AH, Yamamoto KR (2010). Integrated biology and undergraduate science education: a new biology education for the twenty-first century? CBE Life Sci Educ 9, 10-16.
  • Meredith DC, Redish EF (2013). Reinventing physics for life-sciences majors. Phys Today 66, 38-43.
  • Mestre J (2008). Learning goals in undergraduate STEM education and evidence for achieving them. In: Commissioned Paper Presented at NRC Workshop on Evidence on Selected Promising Practices in Undergraduate Science, Technology, Engineering, and Mathematics (STEM) Education, held 30 June 2008, in Washington, DC.
  • Michael J (2006). Where’s the evidence that active learning works? Adv Physiol Educ 30, 159-167.
  • Milem JM (2001). Increasing diversity benefits: how campus climate and teaching methods affect student outcomes. In: Diversity Challenged: Evidence on the Impact of Affirmative Action, ed. G Orfield, Cambridge, MA: Harvard Education Publishing Group.
  • Miller S, Pfund C, Pribbenow CM, Handelsman J (2008). Scientific teaching in practice. Science 322, 1329-1330.
  • Miri B, David B-C, Uri Z (2007). Purposely teaching for the promotion of higher-order thinking skills: a case of critical thinking. Res Sci Educ 37, 353-369.
  • National Research Council (NRC) (1999). Transforming Undergraduate Education in Science, Mathematics, Engineering, and Technology, Washington, DC: National Academies Press.
  • NRC (2000). How People Learn: Brain, Mind, Experience, and School, Washington, DC: National Academies Press.
  • NRC (2003a). BIO2010: Transforming Undergraduate Education for Future Research Biologists, Washington, DC: National Academies Press.
  • NRC (2003b). Evaluating and Improving Undergraduate Teaching in Science, Technology, Engineering, and Mathematics, Washington, DC: National Academies Press.
  • NRC (2011). Expanding Underrepresented Minority Participation: America’s Science and Technology Talent at the Crossroads, Washington, DC: National Academies Press.
  • NRC (2012). Discipline-Based Education Research: Understanding and Improving Learning in Undergraduate Science and Engineering, Washington, DC: National Academies Press.
  • Nicol DJ, Boyle JT (2003). Peer instruction versus class-wide discussion in large classes: a comparison of two interaction methods in the wired classroom. Stud High Educ 28, 457-473.
  • Nicol DJ, Macfarlane-Dick D (2006). Formative assessment and self-regulated learning: a model and seven principles of good feedback practice. Stud High Educ 31, 199-218.
  • Novak G, Patterson E, Gavrin A, Christian W (1999). Just-in-Time Teaching: Blending Active Learning with Web Technology, Upper Saddle River, NJ: Addison-Wesley.
  • Osborne J (2010). Arguing to learn in science: the role of collaborative, critical discourse. Science 328, 463-466.
  • O’Shea B, Terry L, Benenson W (2013). From f = ma to flying squirrels: curricular change in an introductory physics course. CBE Life Sci Educ 12, 230-238.
  • Otero V, Pollock S, Finkelstein N (2010). A physics department’s role in preparing physics teachers: the Colorado learning assistant model. Am J Phys 78, 1218-1224.
  • Pfund C, Miller S, Brenner K, Bruns P, Chang A, Ebert-May D, Fagen AP, Gentile J, Gossens S, Khan IM, et al. (2009). Summer Institute to improve university science teaching. Science 324, 470-471.
  • Phillips AR, Robertson AL, Batzli J, Harris M, Miller S (2008). Aligning goals, assessments, and activities: an approach to teaching PCR and gel electrophoresis. CBE Life Sci Educ 7, 96-106.
  • Piburn M, Sawada D (2000). Reformed Teaching Observation Protocol (RTOP) Reference Manual, Technical Report, Arizona Collaborative for Excellence in the Preparation of Teachers.
  • Pierret C, Friedrichsen P (2009). Stem cells and society: an undergraduate course exploring the intersections among science, religion, and law. CBE Life Sci Educ 8, 79-87.
  • Pintrich PR (2002). The role of metacognitive knowledge in learning, teaching, and assessing. Theor Pract 41, 219-225.
  • President’s Council of Advisors on Science and Technology (2012). Engage to Excel: Producing One Million Additional College Graduates with Degrees in Science, Technology, Engineering, and Mathematics, Washington, DC: Executive Office of the President.
  • Prince M (2004). Does active learning work? A review of the research. J Eng Educ 93, 223-231.
  • Richardson JT (2005). Instruments for obtaining student feedback: a review of the literature. Assess Eval High Educ 30, 387-415.
  • Sadler TD, Chambers FW, Zeidler DL (2004). Student conceptualizations of the nature of science in response to a socioscientific issue. Int J Sci Educ 26, 387-409.
  • Schraw G, Crippen KJ, Hartley K (2006). Promoting self-regulation in science education: metacognition as part of a broader perspective on learning. Res Sci Educ 36, 111-139.
  • Seymour E (2000). Talking about Leaving: Why Undergraduates Leave the Sciences, Boulder, CO: Westview.
  • Smith MK, Jones FHM, Gilbert SL, Wieman CE (2013). The Classroom Observation Protocol for Undergraduate STEM (COPUS): a new instrument to characterize university STEM classroom practices. CBE Life Sci Educ 12, 618-627.
  • Smith MK, Wood WB, Adams WK, Wieman C, Knight JK, Guild N, Su TT (2009). Why peer discussion improves student performance on in-class concept questions. Science 323, 122-124.
  • Springer L, Stanne ME, Donovan SS (1999). Effects of small-group learning on undergraduates in science, mathematics, engineering, and technology: a meta-analysis. Rev Educ Res 69, 21-51.
  • Stanton JD (2013). A poster-session review to reinforce course concepts and improve scientific communication skills. J Microbiol Biol Educ 14, 116-117.
  • Steele CM (1997). A threat in the air. How stereotypes shape intellectual identity and performance. Am Psychol 52, 613-629.
  • Svoboda J, Passmore C (2013). The strategies of modeling in biology education. Sci Educ 22, 119-142.
  • Tanner KD (2009). Talking to learn: why biology students should be talking in classrooms and how to make it happen. CBE Life Sci Educ 8, 89-94.
  • Tanner KD (2012). Promoting student metacognition. CBE Life Sci Educ 11, 113-120.
  • Tanner K, Allen D (2004). From assays to assessments—on collecting evidence in science teaching. Cell Biol Educ 3, 69-74.
  • Tanner K, Allen D (2007). Cultural competence in the college biology classroom. CBE Life Sci Educ 6, 251-258.
  • Tanner K, Chatman LS, Allen D (2003). Approaches to cell biology teaching: cooperative learning in the science classroom—beyond students working in groups. Cell Biol Educ 2, 1-5.
  • Thompson KV, Chmielewski J, Gaines MS, Hrycyna CA, LaCourse WR (2013). Competency-based reforms of the undergraduate biology curriculum: integrating the physical and biological sciences. CBE Life Sci Educ 12, 162-169.
  • Tra YV, Evans IM (2010). Enhancing interdisciplinary mathematics and biology education: a microarray data analysis course bridging these disciplines. CBE Life Sci Educ 9, 217-226.
  • Turpen C, Finkelstein ND (2009). Not all interactive engagement is the same: variations in physics professors’ implementation of Peer Instruction. Phys Rev Spec Top Phys Educ Res 5, 020101.
  • Uhlmann E, Cohen GL (2005). Constructed criteria: redefining merit to justify discrimination. Psychol Sci 16, 474-480.
  • Wei CA, Woodin T (2011). Undergraduate research experiences in biology: alternatives to the apprenticeship model. CBE Life Sci Educ 10, 123-131.
  • Wiggins G, McTighe J (2005). Understanding by Design, Alexandria, VA: Association for Supervision and Curriculum Development.
  • Wiley EA, Stover NA (2014). Immediate dissemination of student discoveries to a model organism database enhances classroom-based research experiences. CBE Life Sci Educ 13, 131-138.
  • Wood W (2004). Clickers: a teaching gimmick that works. Dev Cell 7, 796-798.
  • Wood WB (2009). Innovations in teaching undergraduate biology and why we need them. Annu Rev Cell Dev Biol 25, 93-112.
  • Wright R, Boggs J (2002). Learning cell biology as a team: a project-based approach to upper-division cell biology. Cell Biol Educ 1, 145-153.
  • Zeidler DL, Sadler TD, Simmons ML, Howes EV (2005). Beyond STS: a research-based framework for socioscientific issues education. Sci Educ 89, 357-377.
  • Zhang J, Patel VL, Johnson TR, Shortliffe EH (2004). A cognitive taxonomy of medical errors. J Biomed Inform 37, 193-204.