Designing a Research Plan
Ways to Answer Your Research Questions
What information will help you answer your research questions? The classroom is a rich source of data and it is easy to get overwhelmed. As you design your study and think about methodology, the following considerations should help improve the quality of the information you collect.
Making Thinking Visible
A hitting coach tells a player, “When you start to swing you drop your back shoulder. As a result you’re swinging under the ball. That’s a major reason why you are making poor contact and popping it up so much.” Hitting a baseball is a complex act, but it is directly observable. The baseball coach can watch all facets of the act: the stance, how the player moves into the pitch, the body mechanics given different types of pitches, and so forth.
Thinking is a complex act too. But intellectual activity is not always overt, and teachers may have little direct access to students’ thinking. Too often, teachers see only the final product of students’ thinking (the completed paper or examination), not the intermediate phases of thinking that lead up to the product. This is the equivalent of being a baseball coach whose sole source of information about his team is what he reads in the newspaper.
A significant challenge in classroom inquiry is trying to make students’ thinking visible and subject to observation and analysis at key times during the learning process. This involves engaging students in activities, exercises, and assignments through which their understanding becomes externalized.
To illustrate, consider the following episode from an elementary school math lesson. At the start of class, the teacher introduces a problem, 1/2 + 1/3 = ?, for the class to solve. This is the first time they have seen addition of fractions with unlike denominators. After the children have worked out their answers, the teacher asks three children to put their answers on the board.
Child A: 1/2 + 1/3 = 2/5
Child B: 1/2 + 1/3 = 2/5
Child C: 1/2 + 1/3 = about 3/4
Based on their answers alone, about all we can conclude is that all three children are wrong. But the teacher also asks the children to explain how they arrived at their answers, talking through their solutions. As they do so, it is evident that each child has a different understanding of the mathematical concepts involved.
Child A: 1/2 + 1/3 = 2/5 (“I added the numerators first and got 2 and then the denominators and got 5. That gave me 2/5.”)
Child B: 1/2 + 1/3 = 2/5 (“First I changed 1/2 to 2.1 and then I changed 1/3 to 3.1. Then I added 2.1 + 3.1 and got 5.2. Then I changed this back to a fraction, 2/5.”)
Child C: 1/2 + 1/3 = about 3/4 (“Well, I don’t know how to add the fractions. But I thought about how big the two fractions are. I imagined a pizza and what 1/2 and 1/3 of it would look like. It just seemed like if you added 1/2 a pizza and 1/3 of a pizza you’d get about 3/4 of a pizza. I don’t know if it’s right, it’s just kind of an estimate.”)
As the students describe how they solved the problem, we get a glimpse of how they make sense of the subject matter, and their explanations reveal significant differences in their understanding. If we focus only on the endpoint of their thinking, we find out whether they got the correct answer, but we have no basis for explaining how and why students learn, or fail to learn, what they were taught.
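For reference, the conventional common-denominator computation, which these children had not yet been taught, gives 5/6, or about 0.83, so Child C’s estimate of 3/4 is the closest of the three answers, while 2/5 (0.4) is far off:

```latex
\frac{1}{2} + \frac{1}{3} = \frac{3}{6} + \frac{2}{6} = \frac{5}{6} \approx 0.83
```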
Strategies for making students’ thinking visible
To the extent possible, we want access to students’ thinking as they engage the subject matter and tasks in our classes. These opportunities already exist when students answer questions, participate in discussions, work out problems, write papers, and in general, “show their work.”
So, why use “strategies” if students already reveal their thinking in class? The answer depends on whether students naturally produce the kind of evidence relevant to your research question. In some cases a naturalistic observation may suffice: you simply observe or record their classroom activity. In other cases, you may need to interrupt students’ activity to ask specific questions. Or you may need to design a new task that externalizes their thinking.
Strategies can be as simple as asking students to write a brief response to a question in class or as elaborate as highly structured performances of understanding (e.g., presentations of projects). The key feature is that they evoke students’ understanding and reveal how students make sense of the subject matter relevant to your research question (e.g., they engage students in using knowledge to explain, interpret, analyze, compare, make analogies, extrapolate, and find connections among seemingly unrelated facts and ideas).
Examples of Strategies to Make Students’ Thinking Visible
The type of evidence you need will determine the type of strategies you use. The list below is not ordered in any particular way.
Background knowledge probe. To determine students’ knowledge and understanding of a new topic, ask them one or more questions that you hope they will be able to answer by the end of the class period (or unit, or eventually the course). These should focus on major concepts rather than details. Over time you might develop a set of questions that works particularly well. (See Classroom Assessment Techniques, 2nd ed., by Tom Angelo and Pat Cross, p. 119.)
Written responses. Ask students to answer a question, solve a problem or identify gaps in their understanding at the end of class. Have students summarize their grasp of the material (i.e., the “minute paper”) or identify ideas that do not make sense (i.e., muddiest points). Alternatively, pose a question, problem or exercise that elicits students’ understanding of the subject at any time during a class period. These can vary from brief insertions (e.g., the instructor stops and asks students to write a brief answer to a question) to more extensive “performances of understanding” (e.g., a challenging exercise or activity through which students demonstrate and develop their understanding).
Think-Pair-Share. Pose a thought-provoking question in class. Give students a minute or two to write a response, then ask them to compare their responses to classmates sitting next to them. Ask several students who have different views to report their responses to the entire class.
ConcepTests. Harvard physicist Eric Mazur uses ConcepTests during a class period to determine students’ understanding of key concepts. He follows a sequence similar to the think-pair-share technique: 1) the instructor poses a question, 2) students think about it and then record individual answers, 3) students try to convince their neighbors of their answers, 4) students revise their answers, and 5) the instructor tallies the answers and explains the correct one. About one third of class time is used for ConcepTests. (See Peer Instruction, published by Prentice Hall, for a full explanation.)
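The tallying step is simple enough to do by hand, but the bookkeeping can also be sketched in a few lines. In this illustrative sketch the response lists and answer choices are invented, not data from Mazur’s classes:

```python
from collections import Counter

# Hypothetical ConcepTest responses (choices A-D), recorded before and
# after the peer-discussion step; all data invented for illustration.
before = ["A", "C", "A", "B", "A", "D", "C", "A"]
after = ["A", "A", "A", "B", "A", "A", "C", "A"]

def tally(responses):
    """Return, for each answer choice, its count and share of the class."""
    counts = Counter(responses)
    n = len(responses)
    return {choice: (count, count / n) for choice, count in sorted(counts.items())}

print("Before discussion:", tally(before))
print("After discussion: ", tally(after))
```

Comparing the two tallies shows at a glance whether peer discussion moved the class toward the correct answer.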
Class presentations can be significant performances of understanding. However, not all presentations demonstrate or develop understanding, especially if students have scripted the entire event. As part of the presentation, presenters should respond to questions that probe their grasp of the topic. Questions might be submitted before, during, or after the presentation.
Class discussions can be performances of understanding to the extent that they involve progressive discourse, that is, discussions in which students actually develop ideas rather than merely trade comments. Whether this happens depends upon how the instructor structures the discussion and on the topic or question motivating it. Instructors also need a way to monitor students’ understanding during the discussion, such as a checklist of the types of ideas, perspectives, etc. that ought to appear.
Online assignments and discussions. Students post their ideas to an electronic discussion forum and respond to classmates’ work, or submit their work electronically to the instructor prior to class (e.g., students write answers to complex questions and email them to the instructor one day before class).
Personal Response Systems. Some classrooms have Personal Response Systems in which students use keypads on their desks to select their answers to questions. Instructors can stop periodically to ask an understanding question and get immediate feedback from the entire class. You can also do this procedure without the expensive technology by asking for a show of hands.
Drafts of significant papers and projects. Drafts of papers and projects can reveal intermediate forms of thinking and understanding. Of course, whether the drafts reflect qualitative differences depends on how the instructor structures the revision process.
Think aloud protocols. Cognitive psychologists use a research technique called verbal protocol analysis to gain access to individuals’ thought processes while they perform a task. The person is instructed to say whatever comes to mind and to keep talking for the duration of the task. This method, sometimes called “Think Alouds,” has been used to study complex psychological processes. For example, much of what we know about how people solve complex problems is derived from protocol analysis. The technique produces a verbal record that closely reflects the thought processes occurring during task performance. Refer to the article by Lendol Calder, which describes how he has used think aloud protocols to analyze students’ understanding in an introductory history class.
Monitoring the development of understanding throughout the semester
Instructors can examine the development of important concepts and skills across the entire semester (and beyond it, if so desired). Students’ understanding of central ideas may evolve throughout the semester, and it is easy to overlook this development while focusing on day-to-day material. Examine development by periodically asking students to address questions or problems related to core concepts (i.e., the most central ideas you want students to understand deeply in the class). The questions could be the same ones asked earlier in the term, or they could be a series of questions that are progressively more sophisticated.
To make this a manageable task, decide how many different pieces of work to examine and how many students to sample. For example, one strategy is to select several pieces of work from 2-3 students at each of several performance levels (high performing, typical, low performing).
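As a sketch of that sampling strategy, one might draw up to three students at random from each performance level. All student names and group sizes here are invented:

```python
import random

# Hypothetical roster grouped by performance level (all names invented).
students = {
    "high": ["Ana", "Ben", "Cara", "Dev"],
    "typical": ["Eli", "Fay", "Gus", "Hana", "Ian"],
    "low": ["Jo", "Kai", "Lena"],
}

random.seed(1)  # fix the seed so the selection is reproducible
sample = {
    level: random.sample(roster, k=min(3, len(roster)))
    for level, roster in students.items()
}
for level, picked in sample.items():
    print(level, picked)
```

Drawing the same small sample at each point in the term keeps the amount of work to examine constant while still spanning the range of performance levels.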
Assessing student understanding beyond the end of the course. Sometimes instructors want to know whether student understanding endures beyond the class in which it first developed. If you have the time and inclination, you can contact a sample of students (email is good, phone calls are better) to do a follow-up assessment. Another option is to ask instructors of subsequent classes to use an assessment instrument in their classes. This works well if the instrument can serve as a diagnostic pre-test in the follow-up course.
In summary, thinking aloud is a natural part of the college classroom. Making students’ understanding visible is not difficult, but for it to be useful we need to think about how, when, and why to observe students in the process of learning and thinking.
Sample Method: Using "Think Alouds"
Using "Think Alouds" to Evaluate Deep Understanding
Lendol Calder and Sarah-Eva Carlson
"Deep understanding" is what teachers want for students. But how do we know when it has been achieved? And are certain assessments better than others at shedding light on what students really know and understand? The story is told of a deaf English public schools inspector who assessed learning on the basis of facial reconnaissance. "I have not been able to hear anything you have said," he would admit to students, "but I perceive by the intelligent looks on your faces that you have fully mastered the text." Are today's assessments any more valid than this? The following essay describes the authors' experience with invalid student learning assessments and their subsequent employment of an effective technique called the think aloud method.
Think alouds are a research tool originally developed by cognitive psychologists for the purpose of studying how people solve problems. The basic idea behind a think aloud is that if a subject can be trained to think out loud while completing a defined task, then the introspections can be recorded and analyzed by researchers to determine what cognitive processes were employed to deal with the problem. In fields such as reading comprehension, mathematics, chemistry, and history, think alouds have been used to identify what constitutes "expert knowledge" as compared to the thinking processes of nonexperts. For first-year assessors, think alouds offer a promising method to uncover what conventional assessment methods often miss: hidden levels of student insight and/or misunderstanding.
Experienced teachers know that popular assessment methods conceal as much as they reveal. Papers and exams, for example, offer little help in figuring out why a student has recorded a wrong answer or struggled unsuccessfully with an assignment. Conventional assessments also run into problems of validity. Because they rely on students' ability to articulate themselves in formal language, papers and exams tend to conflate understanding with fluency. But sometimes, especially with first-year students, the tongue-tied harbor deep understandings even though they perform poorly. The reverse is true as well; sometimes articulate students are able to say more than they really understand. According to Grant Wiggins and Jay McTighe (1998), "the thorniest problem" of assessment is differentiating between the quality of an insight and the quality of how the insight is expressed.
We first utilized think alouds when assessing a new design for a first-year history course. The new design shifted emphasis away from tidy summaries of historical facts and knowledge toward the central questions, methods, assumptions, skills, and attitudes that characterize history as a discipline. Students completed eight identical assignments in the course, and student learning was measured by comparing the students' first and last papers. The results were disheartening. It was the rare student who showed steady progress from week to week, and few of the final papers were superior to the first ones. On the basis of this evidence, it seemed the new course was a failure.
But different evidence suggested otherwise. In course evaluations and self-reports, students insisted they had learned a great deal, a claim that certainly squared nicely with the intelligent looks on their faces at the end of the term. Puzzled by the conflicting evidence, we turned to think alouds for help.
Our procedure was as follows. From sixty students in the course, twelve were selected to participate in a think aloud study, representing a cross-section of students in terms of gender, grade point average, and major/nonmajor status. For their participation, subjects were paid ten dollars an hour. In week one of the course, we sat down with each student in a room equipped with a tape recorder. After training subjects how to think out loud, we presented them with documents concerning the Battle of the Little Bighorn, a subject most knew little about. Then we asked our subjects to think out loud while "making sense" of the documents. This was essentially the same task they would perform eight times over the length of the course, though in this case their thoughts would not be filtered by the task of composing an essay. With the tape recorder running, subjects read through the documents aloud, verbalizing any and all thoughts that occurred to them. When subjects fell silent, we would prompt them to think out loud or to elaborate on their thoughts as they attempted to make sense of the historical evidence.
Our think aloud sessions lasted anywhere from forty to ninety minutes. After all twelve sessions were completed, the tape recordings were transcribed for analysis. Analysis took the form of coding each discrete verbalization in the transcript according to the type of thinking it exemplified. We were able to identify fifteen different types of thinking processes displayed in the think alouds, from the uncategorizable ("it sure is hot in here") to comprehension monitoring ("I don't understand that part") to the six types of historical thinking we were particularly looking for, such as sourcing a document ("I can't trust Whittaker; he wasn't there"), asking a historical question ("I wonder what caused this battle?"), or recognizing limits to knowledge ("I need to see more evidence than this"). After coding each think aloud independently, we used a common rubric to rate each subject's proficiency on the six thinking skills taught in the course. For this, we used a 5-point Likert scale where "1" indicated the undeveloped ability of an average high school senior and "5" indicated a sophistication comparable to that of a professional historian. We then compared our coded transcripts until reaching consensus on how to rate the students' abilities in the six key areas. To prevent our bias as course designers from influencing the results, we contracted with an outside analyst to help us code the transcripts and rate students' abilities.
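The bookkeeping behind this kind of analysis, counting coded verbalizations and averaging two raters' rubric scores as a starting point for consensus, can be sketched as follows. All utterances, code labels, and ratings below are invented for illustration; they are not data from the study:

```python
from collections import Counter
from statistics import mean

# Hypothetical coded verbalizations from one transcript: (utterance, code).
# The utterances echo the categories described in the essay; data invented.
coded = [
    ("I can't trust Whittaker; he wasn't there", "sourcing"),
    ("I wonder what caused this battle?", "questioning"),
    ("I don't understand that part", "monitoring"),
    ("I need to see more evidence than this", "limits"),
    ("It sure is hot in here", "uncodable"),
    ("Who wrote this account, and why?", "sourcing"),
]

# How often each thinking process appears in the transcript.
frequencies = Counter(code for _, code in coded)

# Hypothetical 1-5 rubric ratings from two raters on two of the skills.
# Averaging gives a starting point before the raters discuss to consensus.
rater_1 = {"sourcing": 3, "questioning": 2}
rater_2 = {"sourcing": 4, "questioning": 2}
averaged = {skill: mean([rater_1[skill], rater_2[skill]]) for skill in rater_1}

print(frequencies.most_common())
print(averaged)
```

Where the two raters' scores diverge (here, on sourcing), the averaged value flags a disagreement worth discussing before settling on a consensus rating.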
At the end of the term the twelve subjects completed a second think aloud. When these sessions had been transcribed and coded and the subjects' abilities rated, we compared the first and second think alouds to determine whether students had made gains in their understanding of what it means to "think historically."
The think alouds opened a fascinating window into the thinking patterns of students before and after the course. Overall, the think alouds revealed cognitive enhancements that were not as dramatic as claimed in student self-reports, but much greater than indicated by comparisons of early and late papers.
Other surprises were equally interesting. Under-performing students struggled less with historical thinking than with reading itself. Moreover, in the second set of think alouds, we noted that some of the best insights and meaning making came from students who, in the gradebook, were steady "B" and "C" performers. For them, deep understandings seemed to evaporate when they tried to wrestle their thoughts onto paper. This told us that we had work to do if we wanted to distinguish between assessing understanding and assessing students' ability to communicate their understanding. The real roadblocks to learning historical thinking, we discovered, are weak reading comprehension and weak prose writing.
On our campus, the potential of think aloud protocols has not been lost on other faculty. For example, library staff are using think alouds to assess how first-year students find information when they go to the library. Information gained from the study will be used to help library staff identify misconceptions and teach against common missteps students make when doing research.
Think alouds are not perfect assessment instruments. Their advantage is that they give us insight into our students' struggle to formulate problem-solving strategies, employ skills, and develop insights. Papers, exams, and ex post facto commentary by students are helpful in their own way. But they make the process of understanding seem more orderly than it is, covering up the confusion, the disorientation, the mimicry of correct responses, and the lucky guesses, all of which are good to know about when assessing teaching and learning.
As the emphasis in first-year pedagogy shifts from teaching to learning, from "what am I going to do today" to "what are they going to do today," the days of simply using papers and exams to assess student learning are long gone. Teachers need more procedures capable of opening up the hidden world of learning. Think alouds can be helpful in this way, especially in courses emphasizing the development of cognitive skills.
The authors
Lendol Calder is chair of the department of history at Augustana College, Rock Island, IL. Sarah-Eva Carlson is an Augustana senior and research assistant for the think aloud project.
On course design and assessment:
Grant Wiggins and Jay McTighe, Understanding by Design (Association for Supervision and Curriculum Development, 1998).
On the history, effectiveness, and procedures of think alouds:
Maarten W. van Someren, Yvonne F. Barnard, and Jacobijn A. C. Sandberg, The Think Aloud Method: A Practical Guide to Modelling Cognitive Processes (Academic Press, 1994).
K. Anders Ericsson and Herbert Alexander Simon, Protocol Analysis: Verbal Reports as Data (MIT Press, 1993).
Debra K. Meyer, "Recognizing and Changing Students' Misconceptions: An Instructional Perspective," College Teaching 41 (Summer 1993): 104-108.
Disciplinary Uses of Think Alouds:
Catherine Crain-Thoreson, Marcia Z. Lippman, and Deborah McClendon-Magnuson, "Windows on Comprehension: Reading Comprehension Processes as Revealed by Two Think-Aloud Procedures," Journal of Educational Psychology 89 (December 1997): 579-591.
Linda Kucan and Isabel L. Beck, "Thinking Aloud and Reading Comprehension Research: Inquiry, Instruction, and Social Interaction," Review of Educational Research 67 (Fall 1997): 271-299.
Rebecca Swearingen and Diane Allen, Classroom Assessment of Reading Processes (Houghton Mifflin, 1997), pp. 21-25.
Suzanne E. Wade, "Using Think Alouds to Assess Comprehension," Reading Teacher 43 (Mar 1990): 442-451.
Craig W. Bowen, "Think-Aloud Methods in Chemistry Education: Understanding Student Thinking," Journal of Chemical Education 71 (March 1994): 184-190.
Alan H. Schoenfeld, Cognitive Science and Mathematics Education (Lawrence Erlbaum Associates, 1987).
Lendol Calder, "Looking for Learning in the History Survey," American Historical Association Perspectives 40 (March 2002): 43-45.
Samuel S. Wineburg, "Probing the Depths of Students' Historical Knowledge," American Historical Association Perspectives 30 (March 1992): 1-24.
Samuel S. Wineburg, "Historical Problem Solving: A Study of the Cognitive Processes Used in the Evaluation of Documentary and Pictorial Evidence," Journal of Educational Psychology 83 (March 1991): 73-87.
Copyright 2002, Lendol Calder and Sarah-Eva Carlson. This essay may be copied and used for non-commercial purposes without obtaining copyright permission but should include the source and author information. Any commercial use must be approved by the authors. A Web version of this essay is available on the Policy Center's Web site should you wish to bookmark or print the article.