JOURNAL OF RESEARCH IN SCIENCE TEACHING VOL. 51, NO. 1, PP. 65–83 (2014)

Research Article

Meaningful Assessment of Learners' Understandings About Scientific Inquiry—The Views About Scientific Inquiry (VASI) Questionnaire

Judith S. Lederman,1 Norman G. Lederman,1 Stephen A. Bartos,2 Selina L. Bartels,1 Allison Antink Meyer,3 and Renee S. Schwartz4

1Mathematics & Science Education Department, Illinois Institute of Technology, Chicago, Illinois
2Womack Educational Leadership Department, Middle Tennessee State University, Murfreesboro, Tennessee
3School of Teaching and Learning, Illinois State University, Normal, Illinois
4Department of Biological Sciences, The Mallinson Institute for Science Education, Western Michigan University, Kalamazoo, Michigan

Received 16 October 2012; Accepted 9 October 2013

Abstract: Helping students develop informed views about scientific inquiry (SI) has been and continues to be a goal of K-12 science education, as evidenced in various reform documents. Nevertheless, research focusing on understandings of SI has taken a perceptible backseat to that which focuses on the "doing" of inquiry. We contend that this is partially a function of the typical conflation of scientific inquiry with nature of science (NOS), and is also attributable to the lack of a readily accessible instrument to provide a meaningful assessment of learners' views of SI.
This article (a) outlines the framework of scientific inquiry that undergirds the Views About Scientific Inquiry (VASI) questionnaire; (b) describes the development of the VASI, in part derived from the Views of Scientific Inquiry (VOSI) questionnaire; (c) presents evidence for the validity and reliability of the VASI; (d) discusses the use of the VASI and associated interviews to elucidate views of the specific aspects of SI that it attempts to assess; and (e) discusses the utility of the resulting rich-descriptive views of SI that the VASI provides for informing both further research efforts and classroom practice. The trend in recent reform documents, unfortunately, ignores much of the research on NOS and SI and implicitly presumes that the "doing" of inquiry is sufficient for developing understandings of SI. The VASI serves as a tool in further discrediting this contention and provides both the classroom teacher and the researcher a more powerful means for assessing learners' conceptions about essential aspects of SI, consonant with the vision set forth in the recently released Next Generation Science Standards (Achieve, Inc., 2013). © 2013 Wiley Periodicals, Inc. J Res Sci Teach 51: 65–83, 2014

Keywords: scientific inquiry; assessment; knowledge of inquiry; nature of science

Correspondence to: Judith S. Lederman; E-mail: [email protected]
DOI 10.1002/tea.21125
Published online 5 November 2013 in Wiley Online Library (wileyonlinelibrary.com).

Scientific inquiry has been a perennial focus of science education for the past century. Scientific inquiry refers to the combination of general science process skills with traditional science content, creativity, and critical thinking to develop scientific knowledge (Lederman, 2009). Various reform documents (e.g., Benchmarks for Science Literacy, AAAS, 1993; A Framework for K-12 Science Education: Practices, Crosscutting Concepts, and Core Ideas, National Research Council [NRC], 2011) emphasize that students should develop the abilities necessary to do inquiry, while the National Science Education Standards (NRC, 2000) differentiate between the abilities to do inquiry and having a fundamental understanding about specific characteristics of scientific inquiry. Oftentimes a learner's knowledge about scientific inquiry is not explicitly assessed, and it is assumed that students who do inquiry will necessarily develop understandings about inquiry.

Scientific inquiry and nature of science are often used as synonymous terms. Although scientific inquiry and nature of science are not independent from one another, there is a difference between the two. Nature of science (NOS) embodies what makes science different from other disciplines such as history or religion. NOS refers to the characteristics of scientific knowledge that are necessarily derived from how the knowledge is developed (Lederman, 2006). Scientific inquiry (SI) refers to the processes by which scientists do their work and by which the resulting scientific knowledge is generated and accepted. This contrast is further supported by the Next Generation Science Standards (NGSS; Achieve, Inc., 2013), which distinguish between nature of science and scientific practices.

Research indicates that, much like research on understandings of NOS, neither teachers nor students typically hold informed views of SI (Driver, Leach, Millar, & Scott, 1996; Lederman & Lederman, 2004; Schwartz et al., 2002). The research base for SI, though, is markedly smaller than that for NOS, due to the lack of a readily available, or frequently utilized, instrument similar in nature to the Views of Nature of Science questionnaire (VNOS; Lederman, Abd-El-Khalick, Bell, & Schwartz, 2002). What is notable is the dearth of research centered on students' understandings about SI as compared to that related to students' understandings of NOS.
Evident in the former, beyond the issue of conflation, is a preponderance of research focused on the doing of inquiry, which oftentimes is assumed to tacitly develop understandings of inquiry. The belief that doing SI is a sufficient condition for developing understandings about SI, unfortunately, is a misconception shared with the research on NOS (e.g., Wong & Hodson, 2009, 2010). Inquiry is typically taught in science classrooms by having students conduct investigations or, in general, by "doing" inquiry, or by the immersion of learners in authentic contexts (Sadler, Burgin, McKinney, & Ponjuan, 2010). This is assumed to develop students' knowledge about SI. The problematic nature of this assumption can be illuminated by a simple example: students are often asked to control for variables when conducting investigations, but may not necessarily have an informed conception of the purpose of doing so, as it relates to the design. Students can participate in inquiry "experiences," but unless instruction explicitly addresses common characteristics of SI, students are more likely to continue to hold naïve conceptions. As Metz (2004) summarizes, "the small research literature examining the epistemic outcomes of inquiry-based classroom instruction indicates that simply engaging students in 'inquiry' is insufficient to bring about these desired changes" (p. 220).

The paucity of research focused on improving students' understandings of SI is also related to the lack of readily accessible and meaningful assessments regarding educationally appropriate aspects of SI. The persistent belief equating the doing of inquiry with understandings about inquiry, as well as the common conflation of NOS with SI, are both partially to blame.
In contrast, both a previous instrument, the Views of Scientific Inquiry (VOSI) questionnaire (Schwartz, Lederman, & Lederman, 2008), and the current Views About Scientific Inquiry (VASI) questionnaire focus on understandings about inquiry, not just students' actions while engaged in inquiry activities. Although the VOSI provides a measure of understandings about inquiry, revisions to it were deemed necessary, particularly in light of the need for more comprehensive attention to characteristics of scientific inquiry and scientific practices, consonant with those espoused in the Framework for K-12 Science Education (NRC, 2011) and the related Next Generation Science Standards (NGSS; Achieve, Inc., 2013). In short, the VOSI assesses a subset of what is currently emphasized in education research and recent reform documents. The intent here is to report on the development of a new open-ended instrument for assessing understandings about scientific inquiry, the Views About Scientific Inquiry questionnaire (VASI). This article will (a) outline the framework of scientific inquiry assessed by the VASI; (b) describe the development of the VASI, as derived from the aforementioned VOSI; (c) present evidence for the validity and reliability of the VASI; (d) discuss the use of the VASI and associated interviews to elucidate views of specific aspects of SI assessed; and (e) discuss the utility of the resulting rich-descriptive views of SI that the VASI provides for informing further research efforts and classroom practice.

Scientific Inquiry

Like NOS, the meaning of "science as inquiry" has been debated for decades, and precise descriptions of what inquiry means for science education seem to vary as much as the methods of inquiry (Bybee, 2000).
"Scientific inquiry refers to the characteristics of the processes through which scientific knowledge is developed, including the conventions involved in the development, acceptance, and utility of scientific knowledge" (Schwartz, 2004, p. 8). The NSES states:

[Inquiry] involves making observations; posing questions; examining books and other sources of information to see what is already known; planning investigations; reviewing what is already known in light of experimental evidence; using tools to gather, analyze, and interpret data; proposing answers, explanations, and predictions; and communicating the results. Inquiry requires identification of assumptions, use of critical and logical thinking, and consideration of alternative explanations. (NRC, 1996, p. 23)

The Framework for K-12 Science Education (NRC, 2011) includes inquiry under the umbrella term "scientific practices," and states, "we use the term 'practices' instead of a term such as 'skills' to emphasize that engaging in scientific investigation requires not only skill but also knowledge that is specific to each practice" (p. 30). The National Science Education Standards (NRC, 2000) also clearly emphasizes both "skills of" and "knowledge about" inquiry. These documents stress what students should be able to do as well as what they should know. The National Academy of Sciences (2002) identified guiding principles for scientific inquiry that serve as a common basis across disciplines of research, including political science, geophysics, and education. These principles are similar to the items on the list of generalized inquiry skills identified in the NSES (NRC, 2000), and are included among the eight scientific practices of the NGSS. The NRC (2000, 2011), AAAS (1993), and the National Academy of Sciences (2002) offer descriptions of knowledge about SI, beyond basic investigative skills, that share commonalities.
When describing aspects of SI, we draw from these documents as they highlight the fundamental understandings learners should develop about SI (e.g., "different kinds of questions suggest different kinds of scientific investigations; current scientific knowledge and understanding guide scientific investigations" (NRC, 2000, p. 20)). Researchers who have explored scientists in practice (e.g., Dunbar, 2001; Knorr-Cetina, 1999; Latour & Woolgar, 1979) have described similar aspects. Some of these common elements about inquiry were used as the framework in the development of the Views of Scientific Inquiry (VOSI) questionnaire (Schwartz, 2004; Schwartz et al., 2008; Schwartz, Lederman, & Thompson, 2001). We have since revisited the common elements and identified additional aspects from this literature base to serve as the framework for the VASI instrument, which are likewise educationally and developmentally appropriate in the context of K-16 science classrooms. Specifically, students should develop an informed understanding of the following aspects of SI:

(1) scientific investigations all begin with a question and do not necessarily test a hypothesis;
(2) there is no single set of steps followed in all investigations (i.e., there is no single scientific method);
(3) inquiry procedures are guided by the question asked;
(4) all scientists performing the same procedures may not get the same results;
(5) inquiry procedures can influence results;
(6) research conclusions must be consistent with the data collected;
(7) scientific data are not the same as scientific evidence; and
(8) explanations are developed from a combination of collected data and what is already known.

The specific conceptualization of the construct represented by these eight aspects is not forwarded as the only possible one; rather, these eight aspects are the ones that guided the development of the VASI instrument.
Furthermore, it is argued that these understandings are important for students to know and are developmentally appropriate for K-16 students.

Scientific Investigations All Begin With a Question and Do Not Necessarily Test a Hypothesis

It is valid to think that observations spark interest before a question exists, and this is part of science. However, it is important to distinguish science from simply walking through this world and observing. In other words, watching a baseball game is not doing science. It is this very issue that is at the heart of students not being able to ask a valid scientific question. They need to have specific knowledge that has been melded into some curious pattern or question. This is the practice followed in science investigations and in research in any area. There is no denying the importance of observing the world, but observing the world without something guiding your observations is not science. Furthermore, "scientific investigations involve asking and answering a question and comparing the answer with what scientists already know about the world" (NRC, 2000). In order for scientific investigations to "begin" there needs to be a question asked about the world and how it works. Though these questions may originate through a variety of means (e.g., general curiosity about the world, a response to a prediction of a theory), congruent with the vision set forth in the NGSS, students need to understand that, in general, "science begins with questions" (Practice 1, Appendix F, p. 4). Contrary to what is prescribed by the scientific method, not all investigations must formally state a hypothesis. Though traditional experimental designs typically include one, a hypothesis is not necessary for, or typical of, other designs in the more descriptive sciences.
There Is No Single Set or Sequence of Steps Followed in All Investigations

Even when not explicitly communicated, school science often looks like the scientific method because of an overreliance on experimental design. Clearly, there are other ways that scientists perform investigations, such as observing natural phenomena. The field of astronomy relies heavily on ways of gathering data, drawing inferences, and developing scientific knowledge that do not follow the scientific method, with descriptive and correlational research as two of the more prominent examples. Students need to develop not only an understanding of the variety of research methodologies employed both across and within the domains of science, but also that, in general, "scientist[s] use different kinds of investigations depending on the questions they are trying to answer" (NRC, 2000, p. 20). Put another way, these methods are guided by epistemological goals (Sandoval, 2005). This is supported by The Framework for K-12 Science Education (NRC, 2011), which states that "[s]tudents should have the opportunity to plan and carry out several different kinds of investigations. . ." (p. 61), including both "laboratory experiments" and "field observations." Assessing students' understanding that there is no single scientific method, after instruction to explicitly address these different methods and their appropriateness, is consonant with this aim.

Inquiry Procedures Are Guided by the Question Asked

Though scientists may design different procedures to answer the same question, these invariably need to be capable of answering the question proposed.
The procedures implied by the scientific method (i.e., experimental design) are not always tenable approaches for answering certain questions, as "control of conditions may be impractical (as in studying stars), or unethical (as in studying people), or likely to distort the natural phenomena (as in studying wild animals in captivity)" (AAAS, 1990, Chapter 1). Similar to the aforementioned aspect of SI, students need to understand the necessity of this alignment between research question and method, in that the former drives, and ultimately determines, the latter. In general, students should understand that the question governs the approach, with approaches differing both within and between scientific disciplines and fields (Lederman, Antink, & Bartos, 2012). Furthermore, the method of investigation must be suitable for answering the question that is asked. This aim is mirrored by the NGSS, which emphasize that learners should be able to "plan a course of action that will provide the evidence to support their conclusions" (Practice 3, Appendix F, p. 7).

All Scientists Performing the Same Procedures May Not Get the Same Results

Students need to understand that scientific data do not stand alone, can be interpreted in various ways, and "that scientists may legitimately come to different interpretations of the same data" (Osborne, Collins, Ratcliffe, Millar, & Duschl, 2003, p. 708). As such, scientists who ask similar questions and follow similar procedures may reach different conclusions. This is due, in part, to a scientist's theoretical framework, what he or she considers as evidence, and how anomalous data are handled. Likewise, as students engage in the scientific practice of analyzing and interpreting data (NGSS, Practice 4) and in critiquing the work of their peers, an understanding of the subjective nature of this process can be made explicit. The history of science is replete with examples of differences in interpretation.
The use of similar data by evolutionary biologists to support their specific conclusions is a case in point. A researcher operating from a Darwinian framework might focus his or her efforts on the location of transitional species; by contrast, from a punctuated equilibrium perspective, transitional species would not be expected, nor would what a Darwinian considered a transitional species be considered as such (see Gould & Eldredge, 1977).

Inquiry Procedures Can Influence Results

The procedure selected for a scientific investigation invariably influences the outcome. The operationalization of variables, the methods of data collection, and how variables are measured and analyzed all influence the conclusions reached by the researcher. For instance, a common investigation in high school biology classes examines the root cells of a plant to identify cells in various stages of mitosis. The procedures used by students invariably influence the type of data they collect, therefore affecting the conclusions they may reach. More generally, throughout the history of science, technological advances have impacted the common practices of scientists, the results of their undertakings, and the knowledge generated. Our understanding of the structure of the nucleus is just one example that shows our knowledge changing as a function of the investigatory procedures employed. As one of the eight scientific practices emphasized in the NGSS (Practice 4), students must not only be adept at analyzing and interpreting data, but must also be able to compare the results from different data sets generated through a variety of methodologies. As such, they should develop an understanding of the logical connection between the method of inquiry, the specific procedures therein, the data collected, and thus the conclusions drawn.
Research Conclusions Must Be Consistent With the Data Collected

Each research conclusion must be supported by evidence that stems from the data collected (see the following aspect). Students need to understand that the strength of a scientist's claim is a function of the preponderance of evidence that supports it. The validity of the claims is further strengthened by the alignment of the research method with the research question. It follows, then, that claims must be reflected in the data collected. Scientific knowledge is empirically based; thus any explanations are anchored by the data that facilitate scientists' development of those explanations. Consider the relatively unusual case of pharmaceuticals whose clinical trial data, emerging after their approval, exhibit questionable links to more serious side effects than reported. Although the safety claims about such medicines may be supported to an extent by clinical studies, trends in the data that are suggestive of serious concerns may go without interpretation. The conclusions in these situations are inconsistent with the data, and such inconsistencies in these types of cases have serious implications for consumers. As communicated through the NGSS, students are expected to construct explanations supported empirically and to engage in argumentation from evidence (Appendix F, Practice 6). As such, it is important for students to understand that the tenets of their explanations and arguments must be consistent with, and are likewise qualitatively a function of, the data they collected.

Scientific Data Are Not the Same as Scientific Evidence

Data and evidence serve different purposes in a scientific investigation. Data are observations gathered by the scientist during the course of the investigation, and they can take various forms (e.g., numbers, descriptions, photographs, audio, physical samples).
Evidence, by contrast, is a product of data analysis procedures and subsequent interpretation, and is directly tied to a specific question and a related claim. Observations of the orbit of Mars around the sun, in and of themselves, are, simply put, an example of data. When these observations are made in conjunction with an attempt to determine the validity of Einstein's General Theory of Relativity, they constitute evidence in support of, or in opposition to, this claim. It is necessary that students understand the distinction between data and evidence and can describe how the interpretation of data (i.e., the use of data as evidence) is a potential source of bias. This is congruent with the NGSS (2013, Appendix F), which state:

Being a critical consumer of information about science and engineering requires the ability to read or view reports of scientific or technological advances or applications (whether found in the press, the Internet, or in a town meeting) and to recognize the salient ideas, identify sources of error and methodological flaws, distinguish observations from inferences, arguments from explanations, and claims from evidence [emphasis added] (p. 13).

Explanations Are Developed From a Combination of Collected Data and What Is Already Known

"Scientists strive to make sense of observations of phenomena by constructing explanations for them that use, or are consistent with, currently accepted scientific principles" (AAAS, 1990, Chapter 1). As such, investigations are guided by current knowledge, and conclusions, while derived from empirical data, are additionally informed by previous investigations and accepted scientific knowledge. Students who are engaged in scientific practices consistent with the NGSS need to grasp this relationship.
In addition, they must also understand that scientists must recognize when well-supported conclusions differ from accepted scientific knowledge, or have "greater explanatory power of phenomena than previous theories" (NRC, 2011, p. 52). Scientists are then faced with determining how these findings must be interpreted given what is already understood. Consider, for example, when paleontologists unearth dinosaur bones. These bones are not found in a perfect skeleton. Indeed, the bones are not even found in complete pieces. Scientists must use what they already know about skeletons in conjunction with the data (the newly unearthed bones) to construct the skeleton, while also remaining aware of any potential inconsistencies with current knowledge.

The overarching intent of the VASI questionnaire is to assess learners' knowledge of scientific inquiry, as represented by these eight well-supported, core aspects. Though there is clear overlap between the eight scientific practices in the Next Generation Science Standards (NGSS) and the aspects of SI targeted by the VASI, the focus of the VASI is on understandings about SI as opposed to the doing of SI. Thus, while the practices emphasize that students must be able to plan and carry out investigations, the VASI targets students' knowledge about aspects of these common scientific practices. Put another way, a procedure tightly aligned with a guiding question may serve as an assessment criterion for discerning a student's ability to plan and carry out an investigation, but whether a student has an informed conception of this necessary coupling is not always assessed; it is, however, a targeted aspect of the VASI. The doing of inquiry and the knowledge about inquiry are both important; unfortunately, the knowledge about inquiry is typically not assessed.
For practices such as "engaging in argument from evidence" and "constructing explanations," for example, the VASI could help discern students' understanding of such constituent aspects as: research conclusions must be consistent with the data collected; scientific data are not the same as scientific evidence; and explanations are developed from a combination of collected data and what is already known. The Framework contends that "[e]ngaging in the practices of science helps students understand how scientific knowledge develops; such direct involvement gives them an appreciation of the wide range of approaches that are used to investigate, model, and explain the world" (NRC Framework, 2012, p. 42). The intent of the VASI is to facilitate an "inquiry as ends" approach (Abd-El-Khalick et al., 2004) that positions scientific inquiry as an explicit instructional outcome. This is unlike implicit approaches that typically emphasize the "doing" of science, but neglect to address instructional objectives targeting learners' understandings of foundational aspects of scientific practice (i.e., knowledge about scientific inquiry). In this paradigm, "doing science" is seen as a sufficient vehicle to help students "know science." A body of research that spans decades (e.g., Abd-El-Khalick, 1998; Barufaldi, Bethel, & Lamb, 1977; Haukoos & Penick, 1985; Riley, 1979; Scharmann & Harris, 1992; Spears & Zollman, 1977; among others) has indicated that these implicit approaches are not sufficient for improving students' and teachers' understandings of NOS or SI. In general, we echo the sentiments of Sandoval and Reiser (2004) in that "[p]lacing these epistemic aspects of scientific practice in the foreground of inquiry may help students to understand and better conduct inquiry, as well as provide a context to overtly examine the epistemological commitments underlying it" (p. 346).
With this in mind, it is argued that the eight aspects of SI targeted by the VASI, while not explicitly communicated in all eight core scientific practices of the NGSS, are nonetheless contained therein to varying degrees. An understanding of each aspect of SI, it is further argued, can only serve to bolster efforts to develop the knowledge that is specific to each of these core practices. The VASI, it is hoped, will ensure that teachers are not leaving to chance whether their students' engagement in various scientific practices leads to their development of informed conceptions about these practices. Put another way, in the NSES and NGSS there is a clear recognition that doing something alone does not engender an understanding of what is being done. The knowledge targeted by the VASI is embedded in the doing or practices of science, but it needs to be made explicit so that students can recognize and apply the knowledge to unique situations. Again, students can learn to use procedures and algorithms and not understand what is actually being done. This view is supported by the NGSS, in large part, through the following statement from the Reflecting on the Practice of Science and Engineering section of Appendix F:

Engaging students in the practices of science and engineering outlined in this section is not sufficient for science literacy. It is also important for students to stand back and reflect on how these practices have contributed to their own development, and to the accumulation of scientific knowledge and engineering accomplishments over the ages (p. 16).

This section continues, further emphasizing "that reflection is essential if students are to become aware of themselves as competent and confident learners and doers in the realms of science and engineering" (p. 16). The VASI is a means to assess the efficacy of these explicit-reflective classroom practices aimed at developing understandings of the practice of science outlined in the NGSS.
In summary, understandings about SI have long been viewed as an integral component of scientific literacy. Various reform documents emphasize developing students' and teachers' "understandings of inquiry" (AAAS, 1993; NRC, 2000), beyond that of typical science concepts and common procedures of scientific investigations. Furthermore, these aspects of SI are not new to the field of science, as they were explicated a half century ago in Schwab's The Teaching of Science (1962). The NRC (2000), AAAS (1993), and the NAS (2002), along with various science educators (e.g., Flick & Lederman, 2004; Osborne et al., 2003) and researchers who have examined how scientists "practice" (e.g., Dunbar, 2001; Knorr-Cetina, 1999), helped inform the identification of essential aspects of scientific inquiry that serve as a foundation for the VASI questionnaire. With respect to the overarching goal of developing a scientifically literate populace, the general citizen will need to have a strong knowledge about how scientists construct knowledge and what level of confidence they should have in that knowledge. They need to know why and how scientists looking at the same data can validly disagree, for example. The scientifically literate citizen will make decisions about controversial topics through their knowledge about scientific inquiry and scientific practices, as opposed to running to their garage to do an experiment. Accurate descriptions of the changing landscape of scientific inquiry from science studies and the learning sciences clearly support Duschl's (2008) contention that "there are additional important details for the development of learners' scientific literacy" (p. 9) beyond the eight aforementioned aspects of SI.
At the level of the K-12 learner, though, the aspects of SI undergirding the development of the VASI, it is argued, are at a level of generality that positions them as requisite knowledge for developing understandings of these "additional important details."

Problems With Standardized/Convergent and Broad Epistemic Assessments of SI Understandings

During the time that knowledge of SI has been emphasized as a worthwhile instructional objective and overarching goal in science education, various standardized and convergent assessments have been developed that were (or could be) used to assess students' understandings about scientific inquiry (see Supporting Information Appendix A). It should be noted that a detailed evaluation of each is not the intended goal here. What is of concern, though, is that with the exception of the VOSI, none of these instruments were designed to provide an assessment of knowledge about SI consistent with the framework described herein. Before proceeding, it seems appropriate to reassert previously leveled criticisms of standardized instruments as a means to assess students' understandings of NOS, and similarly SI. The first concerns the basis on which such assessments are developed, specifically regarding the assumption that "respondents perceive and interpret an instrument's items in a manner similar to that of the instrument developer" (Lederman et al., 2002, p. 502). This, in short, constitutes a threat to validity that makes the use of such instruments untenable. Additionally, forced-choice responses almost invariably reflect the philosophical lens of their developers, which tends to distort respondents' replies with a bias toward more developed, firmly held, and coherent philosophical orientations (Lederman et al., 2002).
One instrument of note, the Student Understanding of Science and Scientific Inquiry (SUSSI; Liang et al., 2006), does provide an interesting addition to the use of a forced-choice, 5-point Likert scale, in that respondents are required to provide an explanation for each answer they select. Unfortunately, the instrument has been shown to lack concurrent validity with the VNOS (Mesci & Schwartz, 2013). Furthermore, and irrespective of the architecture of the instrument, only one clear-cut aspect of scientific inquiry, Scientific Methods, is addressed on the SUSSI; the instrument also measures attitudes toward science and the teaching of NOS. This further bolsters the need to develop a questionnaire to assess key aspects of SI. Concerns also arise with other assessment approaches that aim to explore learner epistemologies, given the conflation of NOS and SI and the assumption that doing inquiry equates with knowing about inquiry. The aspects of SI targeted by the VASI are epistemological, meaning they deal with the nature of scientific knowledge development. Other researchers aiming to explore epistemological views, including knowledge about SI, have used instruments emphasizing NOS ideas solely or in conjunction with some SI aspects (e.g., Cobern & Loving, 2002; Tao, 2002). The findings may describe broader epistemological stances because they blend NOS and SI, but they do not describe how respondents understand specific aspects of either NOS or SI. Other scholars maintain that learners’ epistemological views can be inferred from what learners do while engaging in scientific practices (Sandoval, 2005; Sandoval & Millwood, 2005). Nonetheless, Sandoval (2005) recognizes that studies of students conducting inquiries “are insufficient to illuminate the epistemological goals or criteria that students pursue during their practice” (p. 650).
These works offer insights into how learners conduct inquiries and engage in argumentation, yet they do not sufficiently explore the conceptual knowledge learners hold about the SI aspects targeted on the VASI. Various reform documents have emphasized the importance of developing students’ understandings about scientific inquiry (NRC, 1996, 2000, 2011), but the standardized tests with which these understandings are assessed provide little utility to the classroom teacher. What is needed is a means for readily assessing students’ conceptions of SI, much like that provided by the VNOS instrument for assessing views of NOS. Though the VOSI provided the initial means to accomplish this task, a broader representation of important aspects of SI needed to be reflected in the questionnaire. Such an instrument would provide a means for the assessment of additional educationally appropriate aspects of SI, all in line with the vision set forth in the recently released NGSS.

Development of the Views About Scientific Inquiry (VASI) Questionnaire

The Views of Scientific Inquiry (VOSI; Schwartz, 2004; Schwartz et al., 2008) questionnaire is actually a collection of instruments developed to measure students’, teachers’, and scientists’ understandings about the nature of scientific inquiry (see Supporting Information Appendix B). As mentioned, the aspects of scientific inquiry assessed were determined through a thorough examination of various reform documents, along with the work of science educators and other researchers. The first version of the VOSI is a five-question, open-ended, paper-and-pencil survey that addresses five aspects of SI. It is evaluated holistically to inform the generation of a descriptive profile of respondents’ understandings of SI, and also to provide “a rating” for each aspect of inquiry as unclear, naïve, mixed, or informed.
The VOSI was also available as an oral protocol, designed to overcome the difficulty of administering paper-and-pencil assessments to young learners without appropriate reading and writing skills. While the VOSI has provided valid insights into respondents’ views of scientific inquiry (Schwartz & Lederman, 2008; Schwartz et al., 2008), after scoring and reflecting on many VOSI items and responses, it was determined that this instrument did not assess the more comprehensive list of aspects of inquiry previously identified. As such, an updated version of the VOSI was desired, and the VASI questionnaire was the result of these efforts (see Supporting Information Appendix C). An expert panel was assembled to guide the development of the new questionnaire. This group comprised two of the science educators who were part of the developmental team responsible for the creation of the original VOSI and VNOS questionnaires. In addition, the group included 10 PhD students, all with backgrounds in various science education contexts: as inservice K-12 teachers in a variety of content areas, including physics, chemistry, and biology, and as developers and providers of professional development in science content, NOS, and SI. Although the convention is typically to use only external experts, the group had been working on the construct for several years, had administered and scored hundreds of VOSI questionnaires, and was clear on what was missing from the original questionnaire. As such, it was felt that this group was the most informed for this undertaking, as its members knew what they wanted to measure and were likewise acutely aware of the problems of measuring it. It should be noted that people external to the group (e.g., teachers) were employed in both the initial vetting of items and the reliability check. The panel first examined the VOSI for correspondence between its questions and the aspects of SI they addressed.
Of particular interest was identifying which, if any, of the aforementioned eight aspects of SI the VOSI did not explicitly target. Table 1 provides a comparison between targeted aspects of SI and the VOSI items. Questions 1 and 2 of the VOSI intentionally did not correspond with any of the eight aspects of SI, serving more as icebreakers for the open-ended nature of the questionnaire. Aspect #1, “science begins with a question,” could be, but is not, explicitly addressed in question 3a of the VOSI, which reads, “Do you consider this person’s investigation to be scientific?” Question 3c of the VOSI, which explicitly asks if science can follow more than one method, corresponds with aspect #2 (“multiple methods”), while aspect #4 (“all scientists performing the same procedures may not get the same results”) is aligned with question 4a on the VOSI. This question allows researchers to see whether the respondent understands that scientists can ask the same question and follow the same procedure but still potentially come to different conclusions. VOSI question 4b relates to aspect #5 in targeting an understanding that the procedures of an investigation can impact results. VOSI questions 4a and 4c target an understanding of the distinction between data and evidence, and thus are aligned with aspect #7.

Table 1
Aspects of scientific inquiry (SI) and corresponding VOSI questionnaire items

Aspect of Scientific Inquiry (VOSI item #)
1. Scientific investigations all begin with a question but do not necessarily test a hypothesis (3a)
2. There is no single set and sequence of steps followed in all scientific investigations (i.e., there is no single scientific method) (3c)
3. Inquiry procedures are guided by the question asked (not addressed)
4. All scientists performing the same procedures may not get the same results (4a)
5. Inquiry procedures can influence the results (4b)
6. Research conclusions must be consistent with the data collected (not addressed)
7. Scientific data are not the same as scientific evidence (4a, 4c)
8. Explanations are developed from a combination of collected data and what is already known (not addressed)

After the original VOSI was mapped to the aspects of SI, three aspects were left unaddressed: (3) inquiry procedures are guided by the question asked, (6) research conclusions must be consistent with the data collected, and (8) explanations are developed from a combination of collected data and what is already known. With this in mind, additional questions were added to fully capture all the targeted aspects of SI. New questions were vetted with the committee, revised when necessary, and then confirmed with 100% agreement as addressing these remaining aspects. The first two “icebreaker” questions from the VOSI were removed over concerns that the questionnaire was too long. VOSI questions 4c and 4d were also removed due to similar concerns, as it was agreed that questions 4a and 4b adequately addressed aspects #4 and #5 (“All scientists performing the same procedures may not get the same results” and “Inquiry procedures can influence the results,” respectively). Question 5 was rewritten for the VASI in a simpler and more direct way (“Please explain if ‘data’ and ‘evidence’ are different from one another”) to address aspect #7. Four new questions were added to complete the VASI. First, a new question was designed to further assess knowledge of the first aspect of SI (science begins with a question). This question presents a scenario in which two students argue about how scientists begin their work, and it asks the respondent to side with one student and explain their reasoning. A second question was designed to determine the degree to which respondents understand that the procedures followed in a scientific investigation need to be guided by the question that is asked (aspect #3).
A case of scientists with differing opinions is presented, and students are asked to determine which scientists’ procedure is “better” and to further explain their reasoning. For the third new question, respondents are presented with a scientific question and a data set to assess their knowledge of aspect #6 (“Research conclusions must be consistent with the data collected”). A data set that displays information contrary to common science knowledge was used to ensure that the question would elicit a response connecting the data gathered and the conclusion drawn. The final question added to the VASI sought to address aspect #8, that explanations are developed from a combination of collected data and what is already known. A hypothetical situation (please refer to the full instrument in the Appendix) is presented in which a box of dinosaur bones is found by a group of scientists, who arrive at two possible arrangements for the bones. The respondent needs to explain why most of the scientists selected one arrangement over the other, and also what kinds of information the scientists used to explain their conclusions.

Validity and Reliability

In general, to establish the content validity of the VASI questionnaire, all new questions were vetted by the committee, revised when necessary, and then confirmed to address the targeted aspect of SI with 100% agreement among the 12 committee members. The committee also ensured that all aspects of SI were addressed. This process was identical to that previously described for examining the congruence of the VOSI with the eight essential aspects of SI, and it further ensured validity. Please see Table 2 for the connection between targeted aspects of SI and the VASI questionnaire. Once content validity was determined, face validity was established by the original panel, in conjunction with a group of three middle school teachers.
This group inspected the VASI regarding the appropriateness of the language used for upper middle school to high school learners. Additionally, the VASI was piloted with a group of 60 grade eight students. Concerns were addressed by the committee until unanimous approval was reached. Evidence for construct validity can be established if individuals believed to differ in understanding actually respond differently on a targeted assessment. Research has indicated that students typically hold uninformed or naïve conceptions of NOS and SI prior to explicit instruction (Lederman, 2006; Schwartz et al., 2008). As such, students should score differently prior to instruction than following instruction that strongly emphasizes developing knowledge about SI across the aforementioned eight aspects. Two further groups of grade eight middle school students, over the course of two years (N = 111 in year one, N = 116 in year two), completed the VASI at the start of the school year and again following instruction. Three teachers were involved in the instruction. These teachers met with a PhD student weekly over a 7-month period to help them plan for explicit instruction targeting appropriate aspects of SI. These lesson plans included instructional objectives related to SI and formative and summative assessments of students’ understandings, in addition to providing explicit-reflective instruction (Khishfe & Abd-El-Khalick, 2002). The same PhD student observed the lessons to gauge fidelity to the written plans. Instructional content related to SI was examined by the expert panel for congruence with the previously defined eight aspects.

Table 2
Aspects of scientific inquiry (SI) and corresponding items on the VASI questionnaire

Aspect of Scientific Inquiry (VASI item #)
1. Scientific investigations all begin with a question but do not necessarily test a hypothesis (1a, 1b, 2)
2. There is no single set and sequence of steps followed in all scientific investigations (i.e., there is no single scientific method) (1b, 1c)
3. Inquiry procedures are guided by the question asked (5)
4. All scientists performing the same procedures may not get the same conclusions (3a)
5. Inquiry procedures can influence the conclusions (3b)
6. Research conclusions must be consistent with the data collected (6)
7. Scientific data are not the same as scientific evidence (4)
8. Explanations are developed from a combination of collected data and what is already known (7)

Individual profiles were developed based on holistic analysis of VASI responses. Results indicated that the majority of students (see Table 3) (a) were uninformed in their conceptions of SI prior to instruction, and (b) increased their understandings to some degree for each aspect of inquiry (e.g., from Naïve to Mixed or from Mixed to Informed) following instruction. These results were considered strong evidence that the VASI questionnaire was indeed accurately representing the construct of SI and its eight constituent aspects. It should be noted that quantification of results and the use of inferential statistics were not deemed appropriate by the committee and are not recommended with this or other similar instruments (e.g., the Views of Nature of Science Questionnaire), as numerical scores provide a very limited picture of this complex construct and do not provide a rich description of respondents’ conceptions. To ensure that respondents were interpreting the VASI in a manner congruent with the questionnaire’s intent, profiles generated through the VASI data were compared to those developed from interviews with approximately 20% of the respondents. These interviews were used to verify the evaluators’ assumptions about each respondent’s views as well as to provide greater insight. All interviews were recorded, transcribed, and independently coded. A 95% level of agreement was reached between the codes for the written VASI responses and the interview responses.
Results support the efficacy of the VASI data for validly reflecting respondents’ understandings about SI across the eight aspects. Lastly, interrater reliability was determined by multiple members of the committee analyzing a sample of 24 VASI questionnaires. Agreement was reached on over 90% of the questions scored.

Exemplary Responses

Table 4 provides example responses to each of the VASI items. These are verbatim quotes selected from the responses of eighth graders who completed the VASI prior to and following instruction that strongly emphasized developing understandings of SI. The students in this sample were explicitly taught the eight aspects of SI over a period of 7 months through various activities. The naïve view respondent examples are taken from pretests and the more informed examples are taken from students’ posttests. These views are presented along a continuum from naïve to more informed understandings of SI.

Table 3
Percentage of students (N = 227) categorized as holding naïve, transitional, and informed views across eight aspects of SI (Pre/Post)

Aspect                                                            Naïve      Transitional   Informed
Begins with a question                                            89 / 21    10 / 44         0 / 34
Multiple methods                                                  57 / 17    42 / 36         0 / 42
Same procedures may not get the same results                      66 / 36    31 / 49         2 / 14
Procedures influence results                                      58 / 16    39 / 61         2 / 22
Procedures are guided by the question asked                       66 / 24    20 / 42        12 / 33
Data is not the same as evidence                                  76 / 25    18 / 43         4 / 21
Explanations are developed from data and what is already known    50 / 17    16 / 49         0 / 20
Conclusions consistent with data collected                        26 / 4     42 / 28        26 / 66

Table 4
Exemplary responses in both the naïve and informed categories across eight aspects of SI

1. Scientific investigations all begin with a question but do not necessarily test a hypothesis
   More naïve: “I agree with no, because they don’t always need to have a question.”
   More informed: “Yes, because in order to know what to investigate you have to have a question asking you or telling you what to find.”
2. There is no single set and sequence of steps followed in all scientific investigations (i.e., there is no single scientific method)
   More naïve: “because you have to have the scientific method: purpose, hypothesis, procedure. . .”
   More informed: “Yes the scientist could (1) dissect frogs, observe internal organs or (2) Grow plants and change a part of photosynthesis”
3. Inquiry procedures are guided by the question asked
   More naïve: “Team B’s procedure is better because they show the tires reactions to different types of roads.”
   More informed: “Team A’s procedure is better because it matches the question. My evidence is that both the question and Team A’s procedure involves different types of tires.”
4. All scientists performing the same procedures may not get the same results
   More naïve: “Yes they would because they’re doing the same thing step by step”
   More informed: “If several scientists are working independently, ask the same questions and follow the same procedure to collect data they won’t necessarily draw the same conclusion because things can be different indicators to different people based on their experiences, they may also collect different data and data leads to different conclusions.”
5. Inquiry procedures can influence the results
   More naïve: “Yes because if you have the same question it must lead to the same answer no matter what the procedures are.”
   More informed: “If they are doing different procedures they may get them different results.”
6. Research conclusions must be consistent with the data collected
   More naïve: “Plants need water, food and sunlight to grow.”
   More informed: “Plants grow taller with less sunlight because you can see on the data table above you see the more light the less it grow.”
7. Scientific data are not the same as scientific evidence
   More naïve: “They are the same because you collect both.”
   More informed: “Data is stuff you observe from an experiment, evidence is organized data making them support the conclusion.”
8. Explanations are developed from a combination of collected data and what is already known
   More naïve: “Because it is bigger.”
   More informed: “I think they use the main dino structure, their prior knowledge of how the dino looked, and fixing the dino like a puzzle.”

As evidenced in the exemplary responses, the data generated by the VASI surpass the oftentimes cumulative, quantitative data generated by standardized, convergent, paper-and-pencil assessments of scientific inquiry. Due to the nature of the questionnaire, respondents are forced to think critically about SI and the reasons underlying their thinking. This reasoning can and should be further examined in follow-up interviews, as described in what follows.

Administering the VASI

Similar to procedures for administering the VNOS questionnaire, it is preferred that the VASI be given under controlled conditions with no set time limit for completion. VASI respondents have typically taken 30–45 min to complete the questionnaire. As is suggested for the VNOS and VOSI questionnaires, respondents are encouraged to write down as much information as they can to address the given item. Respondents should also be encouraged to provide illustrative examples to help support their explanations. Due to the design of the questions on the VASI, students need to exhibit more than simple declarative answers by providing examples to further illustrate their knowledge about SI. By elaborating on their responses, the VASI is capable of providing data beyond simple answers and declarative statements.
Informed responses can range from a sentence or two to multiple paragraphs. As is the case with any open-ended question, the goal is for the respondent to answer as completely and clearly as possible. The VASI can be used as either a summative or a formative assessment. It should be noted that the VASI is deemed valid for students with at least 6th-grade (middle school) reading and writing ability, as no discernible influence of reading and writing ability has been evidenced with students in grade 6 or above.

Scoring the VASI

Interviews. When administering the VASI, it is recommended that at least 20% of respondents be interviewed to corroborate responses on the VASI with a scorer’s inferences regarding the quality of respondents’ understandings of various aspects of SI. Once a high degree of congruity has been evidenced between scores gleaned from questionnaire responses and those from interview data, fewer respondents need be interviewed, though no less than 15% is recommended. When attempting to calibrate the scoring accuracy of numerous scorers (interrater reliability), it is recommended that each researcher examine a common set of VASI questionnaires until at least 80% agreement is reached. Analysis of the remaining data should proceed only once an acceptable level of reliability has been established by the scoring group.

Holistic Scoring. Although a table of specifications linking questions to various aspects of SI is provided, it should not be assumed that scoring is a simple one-to-one correspondence between an aspect and a single item. Respondents should preferably demonstrate their understandings in various contexts, as often a more holistic picture of understandings of SI can be gleaned from considering responses to the VASI as a whole, as opposed to examining each item in isolation. Although each item targets a particular aspect of SI, comments pertinent to several aspects may be found in a single item response.
For example, one student in the aforementioned sample answered question 3b, “No because the person will end up with different conclusions. If you don’t follow the same questions, you will end up doing different procedures.” This question was written to elicit students’ understandings about how inquiry procedures can influence results. This student’s response not only allows us to ascertain their understanding of procedures influencing results but also reveals that the student understands that procedures are guided by the question asked.

Evaluating Responses. When assessing each aspect of SI, views are categorized as informed, mixed, naïve, or unclear. If a respondent provides a response, consistent across the entire questionnaire, that is wholly congruent with the target response for a given aspect of SI, they are labeled “informed.” If, by contrast, a response is either only partially explicated, and thus not totally consistent with the targeted response, or contains an evident contradiction, a score of “mixed” is given. A response that is contradictory to accepted views of a particular aspect, or that provides no evidence of congruence with accepted views of the specific aspect of SI under examination, is scored as “naïve.” Lastly, responses that are incomprehensible, unintelligible, or that, in total, indicate no relation to the particular aspect are categorized as “unclear.” With regard to concerns about the open-ended format of the VASI, any essay-type question requires additional effort by the teacher or researcher to discern a student’s level of understanding. This type of open-ended instrument is typically used to identify general trends in student understanding at the unit, term, semester, or even course level, a task facilitated by the four-tiered assessment scale.
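The calibration threshold described in the scoring procedure (at least 80% agreement across a common set of questionnaires) amounts to a simple percent-agreement computation over categorical scores. A minimal sketch of that arithmetic follows; the function name, rater labels, and scores are hypothetical illustrations, not data from the study:

```python
def percent_agreement(rater_a, rater_b):
    """Simple percent agreement between two raters' lists of categorical scores."""
    if len(rater_a) != len(rater_b):
        raise ValueError("raters must score the same set of responses")
    matches = sum(a == b for a, b in zip(rater_a, rater_b))
    return 100.0 * matches / len(rater_a)

# Hypothetical scores from two raters across eight aspect ratings;
# categories follow the four-tier scale (informed, mixed, naive, unclear).
rater_1 = ["informed", "mixed", "naive", "mixed", "informed", "unclear", "naive", "mixed"]
rater_2 = ["informed", "mixed", "naive", "informed", "informed", "unclear", "naive", "mixed"]

agreement = percent_agreement(rater_1, rater_2)  # 7 of 8 categories match -> 87.5
calibrated = agreement >= 80.0  # meets the 80% calibration threshold
```

Chance-corrected statistics such as Cohen’s kappa are stricter alternatives, but the thresholds reported here (80%, 90%, 95%) are stated as simple agreement percentages.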
The format also best serves the overarching intent of the instrument, which is to create profiles of student understanding and afford deeper insight into how students may vary greatly in the nature and extent of their understandings, as is the case with the VNOS. The VASI can also be used to get a general idea of students’ understandings of certain aspects of SI across the four aforementioned categories (unclear, naïve, mixed, informed). Trends within particular groups (i.e., between students) are also more easily identified with this scoring system, while the construction of individual profiles affords a more nuanced understanding of students’ conceptions of SI. This scoring system is not unlike the commonly used A, B, C, D, and F grade scale. Though it would certainly be more insightful for a classroom teacher to address each student’s learning in a long-form essay detailing his/her strengths and weaknesses across each instructional objective, a more economical approach is often favored, albeit at the loss of information. As such, the four-tiered grading scale is used in conjunction with, and often helps inform, the construction of more detailed student and, at times, class profiles of SI understanding.

Discussion and Implications

Since the unveiling of the National Science Education Standards (NSES; NRC, 2000), the teaching and research community has realized that inquiry can be considered content, just as more familiar science subject matter is (e.g., forces and motion, photosynthesis, oxidation-reduction reactions). The recently released Next Generation Science Standards (NGSS; Achieve, Inc., 2013) similarly recognize the importance of students knowing about scientific inquiry and science practices in addition to being able to do what scientists do. To this end, the VASI is one of the first instruments that will allow teachers and researchers to measure what students and teachers know about SI.
Hopefully, with this readily available instrument, research on learners’ understandings of SI can be undertaken more systematically, much like that on understandings of NOS, most notably addressing the assumption that students develop understandings about SI by simply doing science (i.e., engaging in inquiry activities) or by teachers “modeling” certain inquiry-related scientific practices. Furthermore, the VASI will allow researchers to explore many of the long-standing assumptions regarding understandings about SI that are presumed similar to understandings about NOS. Questions from NOS research that have not been similarly investigated for SI include, but are not limited to: (a) how teachers’ and students’ conceptions of SI develop over time, (b) the relative efficacy of various approaches to teaching about SI, (c) the relationship between a teacher’s understanding of SI and their ability to teach about SI, (d) the relationship between conceptions of SI and learning of other science content, and (e) the relationship between the cognitive load of a particular science topic and students’ abilities to learn about SI concurrently with the science topic. That is, are there certain topics that facilitate the teaching and learning of scientific inquiry? More systematic information about teachers’ understandings of SI will help inform professional development that targets not just the doing of inquiry, but also developing understandings about SI. Likewise, and on a broader level, the VASI will assist the research community in assessing the success of the NGSS in meeting its goal of students developing informed conceptions of scientific inquiry/practices. Overall, the development of the VASI is just a beginning. The instrument will most likely need further revision as it is used with more teachers and students from a more diverse population.
In addition, similar assessments will need to be developed for elementary students and for very young students who have limited reading and writing abilities. References Abd-El-Khalick, F., (1998) The influence of history of science courses on students’ conceptions of the nature of science. Unpublished dissertation. Oregon State University. Abd-El-Khalick, F., BouJaoude, S., Duschl, R., Lederman, N. G., Mamlok-Naaman, R., Hofstein, A., Niaz, M., Treagust, D., & Tuan, H. (2004). Inquiryin science education: International perspectives. Science Education, 88(3), 397–419. Abd-El-Khalick, F., & Lederman, N. G. (2000). The influence of history of science courses on students’ views of nature of science. Journal of Research in Science Teaching, 37(10), 1057–1095. Achieve, Inc., on behalf of the twenty-six states and partners that collaborated on the NGSS (2013). Next generation science standards. Retrieved June 25, 2013 from http://www.nextgenscience.org/nextgeneration-science-standards Akerson, V. L., Abd-El-Khalick, F., & Lederman, N. G. (2000). Influence of a reflective explicit activity-based approach on elementary teachers’ conceptions of nature of science. Journal of Research in Science Teaching, 37(4), 295–317. American Association for the Advancement of Science [AAAS]. (1990). Science for all Americans: Project 2061. New York: Oxford University Press. American Association for the Advancement of Science [AAAS]. (1993). Benchmarks for science literacy. New York: Oxford University Press. Barufaldi, J. P., Bethel, L. J., & Lamb, W. G. (1977). The effect of a science methods course on the philosophical view of science among elementary education majors. Journal of Research in Science Teaching, 14(4), 289–294. Bybee, R. (2000). Teaching science as inquiry. In: Minstrell J. & van Zee E. (Eds.), Inquiring into inquiry learning and teaching in science. Washington, DC: American Association for the Advancement of. Science. Cobern, W. W., & Loving, C. C. (2002). 
Investigation of preservice elementary teachers’ thinking about science. Journal of Research in Science Teaching, 39(10), 1016–1031. Driver, R., Leach, J., Millar, R., & Scott, P. (1996). Young people’s images of science. Buckingham, England: Open University Press. Dunbar, K. (2001). What scientific thinking reveals about the nature of cognition. In: Crowley, K. Shunn, C. & Okada T. (Eds.), Designing for science: Implications from everyday classroom and professional settings. Mahwah, NJ: Lawrence Erlbaum Associations, Inc. Duschl, R. (2008). Science education in three-part harmony: Balancing conceptual, epistemic, and social learning goals. Review of Research in Education, 32(1), 268–291. Flick L. & Lederman N. (Eds.), (2004). Scientific inquiry and nature of science. Dordrecht, The Netherlands: Kluwer Academic Publishers. Journal of Research in Science Teaching 82 LEDERMAN ET AL. Gould, S. J., & Eldridge, N. (1977). Punctuated equilibria: The tempo and model of evolution reconsidered. Paleobiology, 3, 115–151. Haukoos, G. D., & Penick, J. E. (1985). The effects of classroom climate on college science students: A replication study. Journal of Research in Science Teaching, 22, 163–168. Khishfe, R., & Abd-El-Khalick, F. (2002). Influence of explicit and reflective versus implicit inquiry oriented instruction on sixth graders views of nature of science. Journal of Research in Science Teaching, 39, 551–578. Knorr-Cetina, K. (1999). Epistemic cultures: How the sciences make knowledge. Cambridge: Harvard University Press. Latour, B., & Woolgar, S. (1979). Laboratory life: The social construction of scientific facts. London: Sage. Lederman, J. S. (2009). Teaching scientific inquiry: Exploration, directed, guided, and opened-ended levels. In National geographic science: Best practices and research base (pp. 8–20). Hapton-Brown Publishers. Lederman, N. G. (1992). Students’ and teachers’ conceptions of the nature of science: A review of the research. 
Journal of Research in Science Teaching, 29, 331–359.
Lederman, N. G. (2006). Research on nature of science: Reflections on the past, anticipations of the future. Asia-Pacific Forum on Science Learning and Teaching, 7(1). Retrieved July 13, 2012, from http://www.ied.edu.hk/apfslt/v7_issue1/foreword/index.htm
Lederman, N. G., Abd-El-Khalick, F., Bell, R. L., & Schwartz, R. S. (2002). Views of nature of science questionnaire: Toward valid and meaningful assessment of learners' conceptions of nature of science. Journal of Research in Science Teaching, 39, 497–521.
Lederman, N. G., Antink, A., & Bartos, S. (2012). Nature of science, scientific inquiry, and socioscientific issues arising from genetics: A pathway to developing a scientifically literate citizenry. Science & Education. doi: 10.1007/s11191-012-9503-3
Lederman, N. G., & Lederman, J. S. (2004). Project ICAN: A professional development project to promote teachers' and students' knowledge of nature of science and scientific enquiry. In Proceedings of the 11th annual SAARMSTE conference. Cape Town, South Africa.
Liang, L. L., Chen, S., Chen, X., Kaya, O. N., Adams, A. D., Macklin, M., & Ebenezer, J. (2006). Student understanding of science and scientific inquiry: Revision and further validation of an assessment instrument. Paper presented at the annual conference of the National Association for Research in Science Teaching (NARST), San Francisco, CA.
Mesci, G., & Schwartz, R. (2013). Changing preservice teachers' views of NOS: Why some conceptions are more easily altered than others (Unpublished document). Kalamazoo, MI: Western Michigan University.
Metz, K. E. (2004). Children's understanding of scientific inquiry: Their conceptualization of uncertainty in investigations of their own design. Cognition and Instruction, 22(2), 219–290.
National Academy of Sciences. (2002). Guiding principles for scientific inquiry. In R. J. Shavelson & L. Towne (Eds.), Scientific research in education.
Washington, DC: National Academy Press.
National Research Council [NRC]. (1996). National science education standards. Washington, DC: National Academy Press.
National Research Council [NRC]. (2000). Inquiry and the national science education standards. Washington, DC: National Academy Press.
National Research Council [NRC]. (2011). A framework for K-12 science education: Practices, crosscutting concepts, and core ideas. Washington, DC: National Academy Press.
Osborne, J., Collins, S., Ratcliffe, M., Millar, R., & Duschl, R. (2003). What "ideas-about-science" should be taught in school science? A Delphi study of the expert community. Journal of Research in Science Teaching, 40(7), 692–720.
Riley, J. P., II. (1979). The influence of hands-on science process training on preservice teachers' acquisition of process skills and attitude toward science and science teaching. Journal of Research in Science Teaching, 16, 373–384.
Sadler, T., Burgin, S., McKinney, L., & Ponjuan, L. (2010). Learning science through research apprenticeships: A critical review of the literature. Journal of Research in Science Teaching, 47(3), 235–256.
Sandoval, W. A. (2005). Understanding students' practical epistemologies and their influence on learning through inquiry. Science Education, 89(4), 634–656.
Sandoval, W. A., & Millwood, K. (2005). The quality of students' use of evidence in written scientific explanations. Cognition and Instruction, 23(1), 23–55.
Sandoval, W. A., & Reiser, B. J. (2004). Explanation-driven inquiry: Integrating conceptual and epistemic scaffolds for scientific inquiry. Science Education, 88(3), 345–372.
Scharmann, L. C., & Harris, W. M., Jr. (1992). Teaching evolution: Understanding and applying the nature of science. Journal of Research in Science Teaching, 29, 375–388.
Schwartz, R. S. (2004).
Epistemological views in authentic science practices: A cross-discipline comparison of scientists' views of nature of science and scientific inquiry (Unpublished doctoral dissertation). Oregon State University, Corvallis, OR.
Schwartz, R., & Lederman, N. (2008). What scientists say: Scientists' views of nature of science and relation to science context. International Journal of Science Education, 30(6), 727–771.
Schwartz, R. S., Lederman, N., & Crawford, B. A. (2004). Developing views of nature of science in an authentic context: An explicit approach to bridging the gap between nature of science and science inquiry. Science Education, 88, 610–645.
Schwartz, R. S., Lederman, N., Khishfe, R., Lederman, J. S., Matthews, L., & Liu, S. (2002). Explicit/reflective instructional attention to nature of science and scientific inquiry: Impact on student learning. Paper presented at the 2002 annual international conference of the Association for the Education of Teachers in Science, Charlotte, NC.
Schwartz, R. S., Lederman, N., & Lederman, J. (2008, March). An instrument to assess views of scientific inquiry: The VOSI questionnaire. Paper presented at the international conference of the National Association for Research in Science Teaching (NARST), Baltimore, MD.
Schwartz, R. S., Lederman, N. G., & Thompson, R. (2001, March). Grade nine students' views of nature of science and scientific inquiry: The effects of an inquiry-enthusiast's approach to teaching science as inquiry. Paper presented at the international conference of the National Association for Research in Science Teaching (NARST), St. Louis, MO.
Spears, J., & Zollman, D. (1977). The influence of structured versus unstructured laboratory on students' understanding of the process of science. Journal of Research in Science Teaching, 14, 33–38.
Tao, P. K. (2002). A study of students' focal awareness when studying science stories designed for fostering understanding of the nature of science.
Research in Science Education, 32(1), 97–120.
Wong, S. L., & Hodson, D. (2009). From the horse's mouth: What scientists say about scientific investigation and scientific knowledge. Science Education, 93(1), 109–130.
Wong, S. L., & Hodson, D. (2010). More from the horse's mouth: What scientists say about science as a social practice. International Journal of Science Education, 32(11), 1431–1463.

Supporting Information

Additional Supporting Information may be found in the online version of this article at the publisher's web-site.