Thursday, May 04, 2006

Inquiry learning: an alternative paradigm to direct instruction.

Science 28 April 2006:
Vol. 312. no. 5773, pp. 532 - 533
DOI: 10.1126/science.1127750
Education Forum

COMPUTER SIMULATIONS:

Technological Advances in Inquiry Learning
Ton de Jong*

The promise offered by inquiry learning is tempered by the problems students typically experience when using this approach. Fortunately, integrating supportive cognitive tools with computer simulations may provide a solution.

Learning by Inquiry

Studies of young students' knowledge and skills indicate that many students in large parts of the world are not optimally prepared for the requirements of society and the workplace (1). To meet this challenge, curricula should be designed to help students learn how to regulate their own learning, how to continue to gain new knowledge, and how to update their existing knowledge.

Inquiry learning is defined as "an approach to learning that involves a process of exploring the natural or material world, and that leads to asking questions, making discoveries, and rigorously testing those discoveries in the search for new understanding" (2). This means that students adopt a scientific approach and make their own discoveries; they generate knowledge by activating and restructuring knowledge schemata (3). Inquiry learning environments also ask students to take initiative in the learning process and can be offered in a naturally collaborative setting with realistic material.
The idea of inquiry, or discovery, as a learning approach has a long history (4, 5). Technological developments such as computer simulations now make more effective inquiry learning possible. Using simulations that model a phenomenon or process, students can perform experiments by changing variables (such as resistances in an electrical circuit) and then observe the effects of their changes (e.g., on the current). In this way, students (re-)discover the properties of the underlying model (Ohm's law).
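As a minimal sketch, the kind of simulation described above can be expressed in a few lines of Python. The function name and the voltage and resistance values here are illustrative choices, not taken from any system cited in this article; the point is only that the model stays hidden while the learner manipulates its inputs.

```python
def simulate_circuit(voltage_v: float, resistance_ohm: float) -> float:
    """Hidden model the learner is meant to (re-)discover: Ohm's law, I = V / R."""
    return voltage_v / resistance_ohm

# A learner's experiment: hold the voltage fixed, vary the resistance,
# and observe the effect on the current.
voltage = 9.0  # volts
for resistance in [1.0, 3.0, 9.0]:
    current = simulate_circuit(voltage, resistance)
    print(f"R = {resistance} ohm -> I = {current:.2f} A")
```

From the printed pairs of values, a learner can induce that current is inversely proportional to resistance, without ever being shown the formula.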

The Inquiry Process

Inquiry learning mimics authentic inquiry. [There are some exceptions, such as the origin of the research question, the number of (known) variables, and the presence of flaws in data (6).] Because the two are closely related, they share the following constitutive cognitive processes (7): orientation (identification of variables and relations); hypothesis generation (formulation of a statement or a set of statements, perhaps as a model); experimentation (changing variable values, making predictions, and interpreting outcomes); reaching conclusions (on the validity of the hypothesis); evaluation (reflection on the learning process and the acquired knowledge); planning (outlining a schedule for the inquiry process); and monitoring (maintaining an overview of the inquiry process and the developing knowledge).

However, research indicates that, overall, students have substantial problems with all of the inquiry processes listed above (8). Students have difficulty choosing the right variables to work with, they find it difficult to state testable hypotheses, and they do not necessarily draw the correct conclusions from experiments. They may have difficulty linking experimental data and hypotheses, because their pre-existing ideas tend to persist even when they are confronted with data that contradict those ideas (9). Students also struggle with basic experimental processes. They find it difficult to translate theoretical variables from their hypothesis into manipulable and observable variables in the experiment (10); they design ineffective experiments, for example, by varying too many variables at one time (11); they may use an "engineering approach," where they try to achieve a certain state in the simulation instead of trying to test a hypothesis (12); they fail to make predictions; and they make mistakes when interpreting data (13). Students also tend to do only short-term planning and do not adequately monitor what they have done (14).

Supporting the Inquiry Process

Research in inquiry learning currently focuses on finding scaffolds, or cognitive tools, that help to alleviate these problems and produce effective and efficient learning situations. Computer environments can integrate these cognitive tools with the simulation. Examples of cognitive tools are assignments (exercises that set the simulation in the appropriate state); explanations and background information; monitoring tools (to help students keep track of their experiments); hypothesis scratchpads (software tools to create hypotheses from predefined variables and relations); predefined hypotheses; experimentation hints (such as "vary one thing at a time" or "try extreme values"); process coordinators (which guide the students through the complete inquiry cycle); and planning tools. Overviews can be found in (7) and (15); examples of integrated inquiry systems are SimQuest applications (16), Co-Lab (17), GenScope (18), and Inquiry Island (19).

One example from a SimQuest application explores the physics of moments (see the first figure) (20). Support is offered in the form of an assignment that asks students to explore the balance of the seesaw by changing variables. Another available aid is a hypothesis scratchpad that lets students build expressions from variables (e.g., force F1, distance a1, and moment M1) and relations (e.g., increases) to create testable hypotheses (e.g., if F1 increases, then M1 increases).

Most experimental evaluations of cognitive tools offer different configurations of learning environments to different experimental groups. Effects measured include the acquisition of conceptual knowledge, procedural knowledge, and/or inquiry skills. Often the learning process can be analyzed from log files that track the behavior of students in the learning environment and/or data from students who are requested to think aloud during learning. The most effective learning results are found with tools that structure the learning process, provide students with predefined hypotheses and background information, help students plan (e.g., by providing a sequence of assignments), or give hints for efficient experimentation (7, 15, 21). For example, students offered simulations and assignments performed better in tests of intuitive knowledge of the physics of oscillation (22). Also, biology students who received prompts on experimental strategies outperformed, on tests, those who received other prompts or no prompts at all (23).

The Road Ahead

Unguided inquiry is generally found to be an ineffective way of learning (24). Reviewing classical research in three areas of learning (problem-solving rules, conservation strategies, and programming concepts), Mayer (3) concluded that guided discovery learning is effective. These guided inquiry environments are starting to enter educational practice, especially for ages 14 and up, and large-scale evaluations are promising (18). Most evaluations have involved physical science topics, but inquiry environments have also been used in other areas. In psychology, for instance, simulations have modeled Pavlovian (classical) conditioning, in which an organism learns to relate one event to another, previously unrelated, event (25, 26) (see the figure below).

A number of research issues still lie ahead. First, the introduction of cognitive tools may lead to overly complex learning environments that hinder learning by requiring too much working memory capacity. Ways to reduce this extraneous cognitive load, such as by integrating representations (27), are being investigated. Another challenge lies in adapting the learning environment to respond not only to differences between learners but also to the developing knowledge and skills of an individual learner. Learning environments could use "fading," in which cognitive tools gradually disappear so that the learner can ultimately take over the learning process. Automating this would need an adequate cognitive diagnosis of both a student's learning process and developing knowledge and might be based on the log files of the student's interactions with the system (28). A further challenge is to find ways to combine collaborative learning and inquiry learning (17, 29). Specific tools to structure the collaboration and sharing of (intermediate) models between students are only now being developed. Students may also be offered the opportunity to create informal models (17). Such a facility helps them to articulate intuitive knowledge and at the same time gives them a specific task to complete.
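The "fading" idea above can be sketched concretely. In this illustrative example (the scaffold names come from the article; the diagnosis rule and data structures are invented for illustration, not drawn from any cited system), scaffolds are withdrawn one by one as a crude diagnosis of the learner's log data indicates growing competence at varying one thing at a time.

```python
# Scaffolds ordered from most to least structuring; faded in this order.
SCAFFOLDS = ["process coordinator", "hypothesis scratchpad",
             "experimentation hints"]

def active_scaffolds(log: list) -> list:
    """Return the scaffolds still shown to the learner.

    Crude diagnosis: the share of logged experiments in which the
    learner varied exactly one variable. One scaffold is faded for
    each third of the experiments that were well designed."""
    good = sum(1 for entry in log if entry["vars_changed"] == 1)
    n_faded = (good * len(SCAFFOLDS)) // max(len(log), 1)
    return SCAFFOLDS[n_faded:]

# An invented log: two well-designed experiments, one that varied
# three variables at once.
log = [{"vars_changed": 1}, {"vars_changed": 3}, {"vars_changed": 1}]
print(active_scaffolds(log))  # two of the three scaffolds have been faded
```

A real system would of course need a far richer cognitive diagnosis than this single strategy count, which is precisely the research challenge the article identifies.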

Sound curricula combine different forms of tuition: both inquiry learning and direct instruction. Inquiry learning may be more effective for acquiring intuitive, deep conceptual knowledge; direct instruction and practice can be used for more factual and procedural knowledge. Ultimately, we want students to gain a well-organized knowledge base that allows them to reason and solve problems in the workplace and in academic settings. Finding the right balance between inquiry learning and direct instruction, therefore, is a major challenge.

References and Notes

1. Organisation for Economic Co-operation and Development, Learning for Tomorrow's World-First Results from PISA 2003 (OECD, Paris, 2004).
2. National Science Foundation, in Foundations: Inquiry: Thoughts, Views, and Strategies for the K-5 Classroom (NSF, Arlington, VA, 2000), vol. 2, pp. 1-5 (www.nsf.gov/pubs/2000/nsf99148/intro.htm).
3. R. E. Mayer, Am. Psych. 59, 14 (2004).
4. J. S. Bruner, Harvard Ed. Rev. 31, 21 (1961).
5. J. Dewey, Logic: The Theory of Inquiry (Holt, New York, 1938).
6. C. A. Chinn, B. A. Malhotra, Sci. Ed. 86, 175 (2002).
7. T. de Jong, in Dealing with Complexity in Learning Environments, J. Elen, R. E. Clark, Eds. (Elsevier Science, London, 2006), pp. 107-128.
8. T. de Jong, W. R. van Joolingen, Rev. Ed. Res. 68, 179 (1998).
9. C. A. Chinn, W. F. Brewer, Rev. Ed. Res. 63, 1 (1993).
10. A. E. Lawson, J. Res. Sci. Teach. 39, 237 (2002).
11. A. Keselman, J. Res. Sci. Teach. 40, 898 (2003).
12. L. Schauble, R. Glaser, R. A. Duschl, S. Schulze, J. John, J. Learn. Sci. 4, 131 (1995).
13. E. L. Lewis, J. L. Stern, M. C. Linn, Ed. Technol. 33, 45 (1993).
14. S. Manlove, A. W. Lazonder, T. de Jong, J. Comput. Assist. Learn. 22, 87 (2006).
15. C. Quintana et al., J. Learn. Sci. 13, 337 (2004).
16. W. R. van Joolingen, T. de Jong, in Authoring Tools for Advanced Technology Educational Software: Toward Cost-Effective Production of Adaptive, Interactive, and Intelligent Educational Software, T. Murray, S. Blessing, S. Ainsworth, Eds. (Kluwer Academic, Dordrecht, Netherlands, 2003), pp. 1-31.
17. W. R. van Joolingen, T. de Jong, A. W. Lazonder, E. Savelsbergh, S. Manlove, Comput. Human. Behav. 21, 671 (2005).
18. D. T. Hickey, A. C. H. Kindfield, P. Horwitz, M. A. Christie, Am. Ed. Res. J. 40, 495 (2003).
19. B. White, J. Frederiksen, Ed. Psych. 40, 211 (2005).
20. The full interactive example, including hypothesis scratchpad, is available online (www.simquest.nl).
21. M. C. Linn, P. Bell, E. A. Davis, in Internet Environments for Science Education, M. Linn, E. A. Davis, P. Bell, Eds. (Lawrence Erlbaum Associates, Mahwah, NJ, 2004), pp. 315-341.
22. J. Swaak, W. R. van Joolingen, T. de Jong, Learn. Instruct. 8, 235 (1998).
23. X. Lin, J. D. Lehman, J. Res. Sci. Teach. 36, 837 (1999).
24. D. Klahr, M. Nigam, Psych. Sci. 15, 661 (2004).
25. C. D. Hulshof, T. H. S. Eysink, S. Loyens, T. de Jong, Interactive Learn. Environ. 13, 39 (2005).
26. The classical conditioning example is available online (http://zap.psy.utwente.nl/english/).
27. J. Sweller, J. J. G. van Merriënboer, F. Paas, Ed. Psych. Rev. 10, 251 (1998).
28. K. H. Veermans, W. R. van Joolingen, T. de Jong, Int. J. Sci. Ed. 28, 341 (2006).
29. T. Okada, H. A. Simon, Cog. Sci. 21, 109 (1997).
30. In part sponsored by Netherlands Organization for Scientific Research (NWO/PROO), the Information Society Technologies (IST) priority of the European Community (the Kaleidoscope Network of Excellence), and Stichting SURF.

The author is at the Faculty of Behavioral Sciences, University of Twente, Enschede 7500AE, Netherlands. E-mail: a.j.m.dejong@utwente.nl
