Improving Student Response Quality
Utilizing “Thinking Maps®”
-Action Research Final Report-
Ann Younce
University of Colorado at Denver
IT 6720 – Research in Information and Learning Technologies
May 2, 2010
Abstract
This introductory experience into action research focuses on the utilization of “Thinking Maps®”
(specialized visual organizational tools for content learning, designed by Thinking Maps, Inc.) to increase
student response quality and depth of understanding of complex content. A fifth grade teacher (the
researcher) working in a suburban district southeast of Denver, Colorado, was motivated to find out about
the effectiveness of “Thinking Maps®” (or TMs) strategies in her science class, and whether the maps
could produce a significant change in the quality and depth of responses from her students, many of whom
were reluctant to provide more than surface answers and who generally resisted higher level thought. The
researcher had never before used the TMs tools, but was inspired by a team member’s inclusion in a
school-wide pilot of the “Thinking Map®” program, and observed its use in the neighboring classroom
earlier in the year. The researcher believed that these purposeful thinking tools offered the right blend of
graphic organizer and critical thinking strategies to increase student engagement and achieve additional
scaffolding for differentiated student learning.
The guiding question evolved into determining to what extent “Thinking Maps®” (or TMs) are
effective tools and strategies to significantly produce a change in the quality and depth of written
responses, as well as to build scaffolding techniques, deepen understanding, and increase engagement.
The research process included implementing the use of a variety of TMs into the instruction of a new
“Human Body” science unit. As the action research progressed over the course of the four-week study, the
researcher taught students to use a variety of TMs, applied related key word and questioning strategies,
recorded field notes, observed student behaviors and attitudes, collected data in the form of questionnaires
and assessments, and monitored changes in the quality of student written responses and the level of
engagement with new information. She also pursued peer contributions in the form of discussions and
interviews. Throughout the implementation, she began to recognize improved note taking and study
patterns among her students, and determined that TMs indeed added a new layer of metacognition for
students that led to an increase in performance on science assessments. As a result of this action research,
the researcher plans to fully incorporate TMs into her instructional methodology, and to participate in a
school-wide “Thinking Map®” cohort next school year to continue with a new phase of action research.
Introduction
In my current role as a fifth grade teacher in a Denver-area elementary school, I serve an impacted suburban community, represented by the following school profile data: an enrollment of 580 students (100 of whom are in the fifth grade; 24 of whom are in my class), with 34% on free/reduced lunch, an 82% stability rate, a 95% attendance rate, and diversity demographics including Native American (0.9%), Asian American (8.7%), African American (16.5%), Hispanic American (17%), and Caucasian (56.9%) populations.
The school's setting and culture are characterized by high expectations for best practice and professional learning, with a strong base of data-driven practices among faculty and staff. The school utilizes embedded Professional Learning Communities to build capacity while continually seeking and expanding learning experiences. The school's organizational purpose and motto clearly states, "Learning…Whatever It Takes," and it is an effective philosophy for student achievement as well as professional growth. This foundational
message both supports the implementation of research-based solutions and encourages action research
among colleagues.
Research Statement and Guiding Question
Because the topic was largely unexplored for me, I planned to focus my action research on the general area of effective instructional strategies that apply across all content areas. I aimed to take a closer look at the use of graphic
organizers as metacognitive tools for deepening understanding and promoting higher level thinking. A
large number of my students were consistently providing written responses that lacked substance and
failed to reach higher levels of thinking. I had many questions about what the appropriate solutions could
be and I saw this action research as an opportunity to investigate and attempt a workable remedy. The
overall research goal was to increase literacy and science achievement (as demonstrated by improved comprehension and better written responses, particularly when students faced large amounts of new science information) by using "Thinking Maps®" (or TMs), in an effort to close current achievement gaps. This topic involved an increase in skills for students: the independent, automatic use of strategies to communicate and show their thinking, the ability to apply those strategies in cross-curricular ways, and the ability to internalize a new learning process that incorporates specific visual representations and patterns for thinking using a common language (Hyerle, 1996). My main goal was to incorporate
new TMs strategies during a large instructional science unit, utilizing the process of journaling, interviews,
disaggregated pre- and post-assessment data, anchor work samples, and similar artifacts.
The projected effect of such research was three-fold. First, my students could gain an improved ability to acquire and retain large amounts of information, making it both meaningful and relevant to them through cross-curricular strategies. Second, the teachers (initially me and my grade level team)
could also gain in our ability to differentiate instruction for those who need a more structured approach to
internalizing information and communicating higher level thought. Ultimately, my colleagues and the
entire school community could benefit from shared information and increased student, instructional and
programmatic successes. This was a beneficial challenge, and my personal interest in brain-compatible instruction and metacognition served as another impetus for this research. Finally, the research was both timely and relevant because sections of my school were in various stages of implementing a new "Thinking Maps®" pilot program and had established a cross-grade level cohort for vertical alignment.
As a desired outcome, I hoped for increased understanding and achievement for students, utilizing
quick, readily-available scaffolding frames at an independent level in order to successfully accomplish
three goals: (1) read/acquire to filter information, consistently identify main idea and provide supporting
details, (2) organize to construct meaningful information, secure logical patterns and make emotional
connections, and (3) write to demonstrate increased comprehension by showing quality responses, high
level thinking and internalized deep understanding. The guiding question for my research continued to
evolve and finally emerged with several facets:
To what extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce
a change in the quality and depth of written responses, and (2) to build scaffolding techniques,
deepen understanding, and increase engagement?
Review of Literature
It was important for me to select parameters to guide my search for relevant literature as I began
the ongoing process of researching "Thinking Maps®" (or TMs). As my topic became increasingly focused, it naturally branched into several themes: graphic organizers, engagement, higher-order thinking,
effective cross-disciplinary instructional strategies, depth/complexity of understanding, and metacognition.
I quickly made use of the various online databases such as Google Scholar, and Education Full Text
(Wilson’s Web), available from the university’s Auraria Library, to access electronic resources. In addition,
I created general Internet searches of familiar professional sites I regularly use, such as ASCD and McREL,
in addition to the “Thinking Maps®” website itself, as well as made use of professional texts from my own
personal library. A quick search through each of their bibliographies allowed me to identify additional data
sources that these other references had used.
As I began to search for relevant literature to collect and review, I compiled a list of keyword
search terms and tried multiple variations, such as: “thinking maps,” “graphic organizers + deep
understanding,” “common visual language,” “non-linguistic representation,” and similar combinations
involving maps or organizers used to solidify learning and increase comprehension. At first, it seemed like
a treasure hunt as I searched through articles about “Thinking Maps®” (or TMs) and found references to
additional articles, studies and names of researchers. In particular, the “Thinking Maps®” website had
devoted entire sections to the research conducted in support of their product, including links to the
Thinking Foundation and Designs for Thinking websites. These sites contained various research formats
including journal articles, case studies, and news articles, and I found excerpts and glimpses of classroom
teacher research projects explained throughout the article texts.
In order to produce relevant search results from the literature, I established various descriptors to serve as lenses or filters through which to pass my reference sources. These filter criteria included:
recently published references, results from actual studies, implementation information, elementary/middle
school-preferred ranges, and a science-based focus (for example, brain research in education). These filters
helped me to stay organized, categorize my research into subtopic themes, and evaluate potential
references.
The chart below provides a general overview of the types of collected literature which includes a
variety of source formats such as professional journals, professional texts, and relevant websites. (For a
complete listing of all literature sources, see References).
Figure 1: Sample Literature Collected
Texts/Authors: Hyerle, Medina, Marzano, Costa, Robinson & Lai, Jensen, Tomlinson
Periodicals: New Hampshire Journal of Ed, Journal of Educational Psych., Teaching, Thinking & Creativity, Educational Leadership, The Reading Teacher, Metacognitive Learning
Websites: www.mapthemind.com, www.thinkingfoundation.org
Other: course website, site cohort
I began to collect large amounts of information before I realized that most of it was related to the
same researcher (Dr. David Hyerle, the founder of “Thinking Maps®”) and was limited to a specific time
frame (the late 1980s through the 1990s). I quickly became frustrated at not being able to find many current (within the last few years) studies or publications that related to my topic. Although I utilized several database searches, I still came up with the same listings, from 10 to 15 years ago. Even though the brain research of the last decade shows links between understanding, non-linguistic representation and achievement (Marzano, Pickering & Pollack, 2001), I was unable to find more recent studies and immediately wondered why, second-guessing my choice of research. These gaps of somewhat outdated literature, as well as some additional identified problems (such as many articles that applied only to ELA or special education needs), left me disappointed with the small scope of the research, and I had to consider some sources to be less valid and reliable than others. I have since assumed that some of the gaps
(especially in Hyerle’s case) may have been due to renewed longitudinal studies since the earlier years of
research. Therefore, as far as scope is concerned, certain literature may not be included here simply because very little (in terms of research studies) seems to have been conducted past 2001.
Amid the articles and studies that I did happen to find, there were varying degrees of cohesiveness
in the findings, and just as many areas of conflict. From the significant researchers, relevant ideas emerged, and my beliefs were further shaped and substantiated. As for a few key studies or articles of significance
to my research, there were some similarities that related directly to my own questions, beginning with the
foremost expert in the field, Dr. David Hyerle. Hyerle suggests that after being reinforced over several
years (a main difference from my own study) students are able to “transfer multiple maps into each content
area, becoming spontaneous in their ability to choose and use maps for whatever content information and
concepts they are learning” (Hyerle & Williams, 2009). Although my students had only about four weeks
of "intensive" map training, they too could choose from multiple maps (although the breakdown here was that my students did not always use the maps correctly for the corresponding skill). Like Hyerle's recent
study, some of my students were also able to identify the correct thinking process and select the
appropriate map and use the obvious cognition pattern. In a path that diverges from Hyerle’s study, I found
that my students would all start with the common language and graphic, yet how they completed the
information varied greatly among students.
I also found design strategies among the literature that were appealing to me and similar to those I chose for my study. Stull and Mayer's (2007) study, although far more complex than my own, had students in one condition generate their own graphic organizers after being frontloaded in how to generate various types of visual tools, such as hierarchies, lists, flowcharts and matrices. Both their study and my
own had the instructor first modeling types of graphic organizers.
In contrast however, I found conflicting analyses and conclusions in the Stull and Mayer (2007)
piece, where they found that there was little to no evidence that students constructing their own graphic
organizers achieved any more than students who received author-provided organizers. Although my
students all received the same “blank” organizer, they differed on their responses within. There were many
discrepancies in the results of studies and the conclusions of articles as to whether or not the visual
patterns (and I interpreted “Thinking Maps®” here) resulted in extraneous processing or generative
processing, and the evidence for deeper learning was contradictory. I question here whether this affects "Thinking Maps®" (or TMs): do the various types of graphic organizers interfere with a student's cognitive process while selecting and using the maps, and do students waste time and cognitive effort independently deciphering which version is the best tool to use?
In addition I found very little from which to model the design of my own research among the
articles and studies. Many of the earlier studies had hundreds of participants across multiple schools or
districts, whereas I was limited to my own class of 24 students; I did, however, find a few usable
similarities in some of the shared vignettes from classroom to classroom that were mentioned in some of
the articles and research studies (Leary, 1999; Hyerle, Curtis & Alpert, 2004).
TMs are advanced graphic tools used to secure brain pathways for learning, and metacognitive
tools used to increase awareness of cognitive cues and help to deepen student understanding. One other aspect that concerned (and confused) me was the contradictory evidence in some studies (Merkley & Jeffries, 2000/2001; Stull & Mayer, 2007; Ritchhart, Turner & Hadar, 2009) showing that general graphic organizers (not TMs specifically) may or may not increase student
comprehension or deepen learning. Recent brain research states that vision dominates all other senses in the brain, and that the more elaborate the encoding, the better the storage and retrieval (Medina, 2008), which seems to make an argument in favor of visual tools like graphic organizers. Medina also states that it is through pictures, not through written or spoken words, that the brain makes the most lasting connections, which leads me to wonder why the TMs program makes no mention of incorporating sketches and pictures in place of some words within the various map structures.
In summary of the literature, although there were far more articles than recent studies pertaining to my research topic, I feel I came close to exhausting my search across the related fields of cognitive science, effective instructional practice, and brain research, and it did allow me to form a more complete research picture, especially as I began to see a saturation of the same researchers mentioned again and again. Some further questions prompted by my literature review include whether my choice of study still represents a valid instructional practice. The methodology and research presented by the TMs website itself seem relevant and robust, yet I am still not clear as to why there is not more current research available. Perhaps I have stumbled upon a gap in the literature, as it seems there should be more studies that explore variations of the TMs strategies or, at the very least, further challenge the relevance of this type of tool as it compares with other brain research.
Relevant Background Knowledge
TMs were designed to integrate content learning with thinking process instruction across
disciplines and be utilized as a “common visual language” for learning (Hyerle & Yeager, 2007). Learners
are expected to identify 8 fundamental thinking skills and link them to dynamic visual representations to
create specific pathways for “thinking about their thinking”. Maps are designed to be used in combination
to increase depth and complexity, and as a developmental process, complexity in use increases over time
(Hyerle, Curtis & Alpert, 2004). “Frames of reference” surround each map to provide connections to
sources, reflection and ownership. The maps are consistent, integrative, and flexible for various types of
implementation. Each map corresponds with a specific thinking process designed to activate and build
schema:
Figure 2: “Thinking Maps®” and 8 Corresponding Cognitive Skills
“Thinking Maps®” Cognitive Skill
“circle map” defines in context; presents point of view
“bubble map” describes sensory, emotional, logical qualities
“double bubble map” compares & contrasts qualities
“tree map” shows relationships between main idea & supporting detail
“flow map” relates events as a sequence
“multi-flow map” indicates cause & effect; helps predict outcomes
“brace map” deconstructs physical structures & part-to-whole relationships
“bridge map” transfers or forms analogies
(See Appendix D for actual map designs)
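To make the idea of a common visual language concrete, the short sketch below is my own illustration (written in Python; it is not part of the TMs program materials). It encodes the map-to-thinking-process pairings from Figure 2, together with a few key-word cues of the kind discussed later in this report; the specific cue phrases are my own assumptions for illustration only.

```python
# Illustrative only: each Thinking Map paired with its thinking process (from Figure 2)
# and a few hypothetical key-word cues a student might watch for in a prompt.
THINKING_MAPS = {
    "circle map":        {"process": "defining in context / point of view",
                          "cues": ["define", "brainstorm what you know"]},
    "bubble map":        {"process": "describing qualities",
                          "cues": ["describe the qualities", "what is it like"]},
    "double bubble map": {"process": "comparing & contrasting",
                          "cues": ["compare and contrast"]},
    "tree map":          {"process": "main idea & supporting details",
                          "cues": ["classify", "give supporting details"]},
    "flow map":          {"process": "sequencing events",
                          "cues": ["describe how it works", "describe the process"]},
    "multi-flow map":    {"process": "cause & effect / predicting outcomes",
                          "cues": ["what causes", "what would happen if"]},
    "brace map":         {"process": "part-to-whole relationships",
                          "cues": ["what are the parts of"]},
    "bridge map":        {"process": "analogies",
                          "cues": ["is to ... as"]},
}

def suggest_map(prompt: str) -> list[str]:
    """Return the maps whose cue words appear in a test or discussion prompt."""
    prompt = prompt.lower()
    return [name for name, info in THINKING_MAPS.items()
            if any(cue in prompt for cue in info["cues"])]

if __name__ == "__main__":
    print(suggest_map("Describe the process of digestion."))  # -> ['flow map']
```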
Expository text structure is often difficult to digest for intended meaning, complex concepts,
specialized vocabulary and inferred relationships such as cause & effect, compare/contrast, sequences or
cycles (McCormick, 1995). TMs enable learners to assess prior knowledge, respond to complex inquiry
and, working independently or collaboratively, draw out essential knowledge, identify critical information
with supporting details, and make inferences and find connections to the text.
TMs have been purposefully organized to align with current brain research to enable movement
from concrete to abstract thinking with greater depth and to directly apply thinking to complex tasks.
Meaningful context links to emotional and metacognitive “frames of reference” and ownership, addressing
essential questions, prior knowledge, primary sources, point of view and other influences. Over the last
decade, brain research shows that 80% of all information that comes into the brain is visual (Jensen, 1998)
and the visual patterns of TMs help learners create concrete images from abstract thought. The brain is a
natural pattern detector and teachers need to provide students with experiences that enable them to
perceive patterns and make connections (Caine & Caine, 1991). It has been shown that "explicitly engaging students in the creation of nonlinguistic representations stimulates and increases activity in the brain" (Gerlic & Jausovec, 1999). Unlike earlier forms of concept maps (Novak & Gowin, 1984) and other types of random or ready-made graphic organizers, which do not fully suffice, TMs encourage learners to become independent thinkers because of the ownership and connections made in their completion.
In further support, the dual-coding theory (Paivio, 1986) states that the simultaneous use of pictures and words enhances the brain's ability to organize information for subsequent retrieval and use. Research indicates that the more learners use both linguistic and nonlinguistic representations, the better their thinking
and recall (Marzano, Pickering & Pollack, 2001). TMs align nicely and integrate into practice with many
of Marzano's nine strategies most likely to increase achievement, among them identifying similarities & differences, summarizing & note taking, cooperative learning, and cues, questions & advance organizers.
Class time spent using TMs increases engagement with specialized thinking skills, and it is suggested that
students should spend 60-80% of class time engaged in “process, discussion, group work, self-assessment,
journal writing, feedback, mapping” (Tomlinson & Kalbfleisch, 1998) with such differentiation evident in
the strategic components of TMs.
Methodology/Research Design
The purpose of the study was to determine the effectiveness of “Thinking Maps®” (or TMs)
strategies for use in science instruction, and whether the maps could produce a significant change in the
quality and depth of responses from students, many of whom were reluctant to provide more than surface
answers and who generally resisted higher level thought. The guiding research question was: To what
extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce a change in the
quality and depth of written responses, and (2) to build scaffolding techniques, deepen understanding, and
increase engagement?
In order to best serve my research into TMs, I drew on both qualitative and quantitative research traditions, and one small way in which I employed aspects of problem-based methodology was through limited use of the "Theory of Action" techniques (Robinson & Lai, 2006), exploring constraints, actions and consequences, which are addressed later in this study. I
began by asking questions about the assumptions and beliefs I held about my classroom and my concepts
of student response quality. Students need to give a certain type of response to demonstrate understanding, and I believed I had been a good judge of the depth of their understanding, asking them to "write/draw it in another way" or to "tell me more" to elicit the best possible response. From here I needed to further
investigate what I do in my practice and why I do it when assessing student response quality.
The main type of data analysis I employed, however, most resembles Wolcott's Three Stages of Analysis (Wolcott, 1994), in which different aspects of analysis based on qualitative data collection
inform each other to incorporate (1) descriptive data from original notes, questionnaires and pre-/post- test
experiences, (2) analyses based on how the descriptive data initially relates to the research question in an
exploratory manner, looking for patterns and themes, identifying key factors and relationships, and (3) my
interpretations to make sense of findings and the conclusions based on the extent to which they pose
meaningful, relevant and compelling answers to the research question. Quantitative data collection
appeared in the formulation of results tables to quantify through tallies the number of occurrences of
specific written response types, frequency of TMs used, or data points for behaviors.
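As a rough illustration of what "quantifying through tallies" looked like in practice, the minimal Python sketch below counts occurrences of coded events from field-note entries. The entries and event codes here are hypothetical placeholders; the actual tallies in this study were kept by hand on sticky notes and a clipboard.

```python
from collections import Counter

# Hypothetical field-note entries: each observation is logged as a short event code.
# These example codes are illustrative, not the actual codes used in the study.
field_notes = [
    "used flow map correctly",
    "looked around room for visual cues",
    "short written response only",
    "used flow map correctly",
    "looked up vaguely, trying to remember",
    "short written response only",
]

# Tally how often each coded event occurred across the observation period.
tally = Counter(field_notes)
for event, count in tally.most_common():
    print(f"{count:>2}  {event}")
```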
Site Selection
I selected my own fifth grade classroom as the site for this study, as I believed this would help me
achieve the maximum impact and benefit, thus making the experience as meaningful and personally
relevant as possible. The classroom is one of four adjacent fifth grade classes, with 24 students (11 boys and 13 girls) and a demographic composition that closely reflects that of the entire school population (see
Abstract). The science lessons take place predominantly in the afternoon block, being the last 45-minute
class of the day. The students are situated in table groups of 4 to 5 students per group, and a typical day
consists of a quick preview of new material (or a review of the previous day’s lessons), followed by either
a group exploration activity or experiment to “kick-off” the inquiry. Time is then usually spent in direct
instruction through interactive SmartBoard lessons, class discussion, group reading or data mining,
collaborative or independent work production, and finally reflection. Occasionally, due to schedule
changes or anticipated interruptions of the instructional time, I would switch (or add) science instruction
time into the morning block, with each day usually starting out with a pre-reading activity or TMs
preparatory work as part of a morning menu.
Role
My role was generally that of “facilitator”, where I would introduce content, model and suggest
strategies, employ questioning techniques, elicit responses and reflection, and monitor collaboration. My
goal for consistency was to develop an instructional routine to closely follow in order to keep the
experience familiar and consistent for the students, and to keep an order for me to follow in terms of
making observational notations and focusing specifically on student response quality and incidents of
engagement. The classroom climate needed to be one where thinking is valued, visible and actively
modeled (Ritchhart, Turner & Hadar, 2009). Just prior to the beginning of the first data collection cycle,
we held an initial class meeting to discuss the action research project they would be a vital part of – using
TMs to improve content learning and assess response quality. I answered any questions and addressed any concerns they had (and the only "fear" that needed to be alleviated was that they wouldn't be graded on
their questionnaire responses – only on the regular unit quizzes and tests). In describing my relationship
with the students, I tried to maintain a positive rapport and communication style, rigorous expectations, and a casual teaching style that incorporated a flexible pace as needed and interjected humor to engage. I utilized
a school-wide positive reinforcement behavior management system (“Mountain of Trust” – earning their
way “up the mountain” using silent and visual cues employed without a break in instructional momentum)
with predictable rewards and consequences. My students were comfortable with me, and that climate allowed them to take risks in this group endeavor when asked and, at times, independently.
Purposeful Sampling Strategies
When it came to the selection of individual respondents, I made the decision to sample all 24 of
the students in my class. As an intermediate grade level teacher, it was inconceivable to run any sort of "control group" and "variable group" when I am ethically bound to provide consistent instruction for all. Instead, I would make general comparisons of my class, as TM beginners, to other classes with more or less TM training. Although my grouping was a relatively small data set, it was one that was manageable for
me. For my interview subjects, I first selected one of my grade level colleagues who was participating in
the tandem TMs pilot cohort on site, as well as an instructional coach/trainer who was overseeing the
cohort. Additional questions and probes were asked of other team members who were not part of the pilot.
The selection of activities to observe included taking anecdotal field notes on random student
behaviors, attitudes, engagement, and expressions. I rotated around the classroom listening-in on table
collaborations and cooperative learning routines. Other activities included observing TM use during class
and for assessments. I was looking for confident and appropriate use of TMs as well as an improvement in
overall response quality.
As for the selection of documents to analyze, I looked at the use of thinking strategies, TM
completion and response quality on student work products such as class and individual TMs, science
notebooks, test questions and research questionnaires, and my own field notes. I created a mix of closed
questions with a range of pre-determined responses, and open-ended opportunities that were non-
threatening and would allow for rich and detailed responses (Ritchhart, Turner & Hadar, 2009).
Data Collection Strategies
In order to remain consistent with the traditions of the selected research designs, data collection
tools were designed purposefully to elicit clear pictures of student thought processes to more accurately
determine content knowledge and assess response quality. I established a balance of different artifacts to
collect (Robinson & Lai, 2006) which included observations of students recorded in field notes,
questionnaires for students to explain use of strategies or to rate difficulty, examination of student work
product (actual TMs), and interviews with colleagues. Data was obtained by observation techniques, note
taking, disaggregating student responses on questionnaires, pre-tests and post-tests, and comparisons of
student TMs to anchor maps. I was intrigued by the notion that students learn more deeply when they construct their own organizers, or "learn by doing," than when the organizers are provided for them, or
“learn by viewing” (Stull & Mayer, 2007). I knew I had to model the initial use of each type of TM while
balancing the need to allow for learner-generated variance. I established a collection cycle and kept it
consistent for each subtopic within the unit: pre-test, map introduction and use (“circle map”, followed by
“tree map” or “bubble map” or “brace map” as needed, and culminating with a “flow map”) and finally
post-test. I would rotate from group to group, observe students over the shoulder, have students verbally explain their strategy to me, or observe from a clear vantage point during an assessment.
Interviews of colleagues were conducted largely in an unstructured conversational style and I designed
questions so as not to be leading in any way. The collection of actual data was planned for as I recorded
observations, thoughts and process notes in a spiral notebook. Throughout the lessons and assessments I
would collect data in an informal way (sticky notes or clipboard), tally according to criteria (for example,
the number of times students looked up vaguely, trying to remember; or the number of times they looked
around the room looking for visual cues), or make anecdotal notations to be formally transcribed after class.
Assurances of Confidentiality
In order to ensure confidentiality throughout the process, assurances were made and designed into
the research methodology. Verbal consent of colleagues was obtained, and informed consent of the students was implied as I spoke with them at length about the process and their role in it. Parental consent
was not necessary as this was part of the daily routines of class and no identifying elements would present
a conflict. I concentrated on the student group as a whole, with no individual identifying comments in field
notes (other than pre-determined codes to safeguard identity) and I took care not to mention any
identifying factors of specific individuals in class conversations, interviews with colleagues, or in the
writing of this document.
Data Analysis Strategies
I endeavored to employ techniques consistent with traditional qualitative and quantitative methods, although my approach was not a particularly linear process but one that remained flexible amid the complexities of the teaching and learning day. I looked for obvious shifts in the classroom in
behavior, engagement, strategy use, and quality of response, and I also looked to evaluate changes in
student conceptions of thinking. The literature provided me with guidance in the form of 3 components for
successful and creative data analysis: (1) patience - reminding myself that it was my first time with action
research, as well as a first for my students, (2) mistakes – maintaining a willingness to take risks and an
acceptance that mistakes would be made, and (3) playfulness – remembering to have fun with the process
and with my students as we investigated TMs together (Hubbard & Power, 2003).
In each of the data collection tools I attempted to examine frequency of specific question
responses, and tally instances of successful use, or marginal attempts at use, or non-use of various TMs.
Some of the participant coding makes reference to both gender as well as ethnicity data (and this particular
use of demographics is something that my district and school consistently utilize in order to focus on
equity issues, as well as serving as a reminder to always be cognizant of potential equity constraints in
instructional practices). The coding indicated: B (boy), G (girl), 4/3 (ethnicity code for “hispanic/black”),
and 5 (ethnicity code for “white”) as there are no other ethnicity groups represented in this participant
pool. I also made an attempt to code emotions (such as “easy, fun, hard, or confusing”) in my field notes
and look for the emergence of new types of responses. There also came the realization that, even after many years of experimenting with tracking student responses and behavior, it may be difficult to analyze the "subtle nuances of change," and that I would need to find different ways to code than the simple anecdotal records that had been intuitive thus far (Hubbard & Power, 2003).
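A minimal sketch, assuming hypothetical records, of how coded observations could be rolled up by participant group using the code scheme above (gender B/G with ethnicity codes 4/3 or 5); this mirrors the row structure of the results tables that follow but does not use the study's actual data.

```python
from collections import defaultdict

# Hypothetical assessment records: (gender, ethnicity_code, tm_use_category).
# These rows are made-up placeholders, not the study's data.
records = [
    ("B", "4/3", "successful"),
    ("B", "5",   "marginal"),
    ("G", "4/3", "successful"),
    ("G", "5",   "non-use"),
    ("G", "4/3", "marginal"),
]

# Count TM-use categories within each gender/ethnicity group,
# mirroring the row structure of the results tables in this report.
groups: dict[tuple[str, str], dict[str, int]] = defaultdict(lambda: defaultdict(int))
for gender, ethnicity, category in records:
    groups[(gender, ethnicity)][category] += 1

for (gender, ethnicity), counts in sorted(groups.items()):
    print(f"{gender} {ethnicity}: {dict(counts)}")
```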
Other techniques of verification included transcribing notes after the fact into usable data, looking for recurring patterns, and the use of triangulation to observe from several different angles, such as student interviews, observations, and pre-tests/post-tests, to see whether they all independently produced the same conclusions (Robinson & Lai, 2006). The additional use of colleague data helped to round out the questionnaires, observations and test data to confirm stronger triangulation within the data collection.
Presentation of Findings: Results, Implications and Significance
When looking at the data through the lens of the "Theory of Action" (Robinson & Lai, 2006), making sense of the constraints meant identifying the factors that led to my use of TMs: the program was being piloted at my school with a cohort on site, and one of my fifth-grade team members was actively using TMs in our classes. In my class there was varying ability to read and write independently and a wide
variance in the depth of learning and quality of written responses for some individuals, so there were many
diverse learning styles and needs within the same group. TM authors recommend at least an 8-week introduction period followed by a year-long process of working with TMs until they start to become intuitive for learners. My own 3-4 week study pales in comparison, and the uneven exposure to TMs (some students may have received more TM training in flexible grade-level groupings for math and reading with the pilot teacher, or from other grade level team teachers attempting to emulate the maps without the training) may have produced inequities, with TM instruction being repetition for some and new for others. In addition to unavoidable absences, the majority of instruction occurred at the end of the day, which may have produced shortened attention spans, frequent interruptions (PA announcements, students leaving school for appointments) and the like. One other possible constraint was determining whether it
was the presentation of new material (“human body”) or new map use (or both) that made the difference in
results.
In an attempt to address these constraints, various instructional actions were used to narrow toward a range of solutions (Robinson & Lai, 2006). I worked to keep students focused (using SmartBoard strategies designed to engage) and created predictable routines for each lesson. I implemented alternate activities to increase attention span (integrating movement and varying the use of text, video,
SmartBoard, discussion, cooperative groupings for pair-shares and jigsaw activities). I also tried to
transition the students in their use of TMs by starting out with the pre-printed blank TMs and gradually releasing them to use their own self-drawn maps. TMs by their design were meant to allow for
differentiation and the needs of those learners with varying abilities were met in the flexible nature of the
maps. I attempted to prevent lesson interruption by occasionally rearranging the day so that the science
TMs lesson could be earlier to avoid afternoon disruption. I also maintained constant communication with
other TM teachers to aim for consistency in instruction, modeling and the rate of exposure to TMs.
The subsequent consequences of instructional actions produced a more focused and engaged
climate with an apparent increase in student interest (but again, was that the TMs or subject matter – or
both?) and both intended and unintended consequences emerged (Robinson & Lai, 2006). Intended consequences included positive reactions to the TMs demonstrations: students were more motivated to get their own maps written, were more productive in written responses, and the collaborative groups seemed to increase the sense of community while working with new maps and sharing connections. I made sure to encourage different types of color-coding on the maps, and as a result, students' connections to references and schema were more effective in the depth to which they could transfer their learning during class.
Other expected results included the findings that TMs were most effective when used in
combination for students to fully develop their conclusions (Hyerle & Yeager, 2007). The most precise and
detailed responses came when students used two or more maps to explain their thinking. Also intended
was my increased ability to use TMs to assess students’ development of concepts and misconceptions
(Novak & Gowan, 1984). It was far easier to spot patterns and connections in thinking as well as the
glaring gaps from misconceptions or map misuse. It was anticipated that students would achieve the ability to organize their ideas independently, assess the quality and quantity of their own written responses, and possess increased awareness of their own thinking and an increased motivation to learn. In conversations with students, it was made clear that they thought these strategies helped them as they read through text, watched videos and participated in class discussions. About 20% of the class even indicated that the use of TMs helped them more than the traditional study guides when preparing for the test.
There were, however, unintended consequences that I did not anticipate. Although I knew that
those students who generally do well would continue to do well (and the data supported this), I assumed
that all of those who did not generally do well would make some growth. Instead, I found that only a small
group began to improve and for some, there was no improvement. For some inexplicable reason, there was
also a reversal in performance about midway through, and students started to use TMs less and less
effectively. I relied on the assumption that the maps would begin to make a permanent change in the written response quality of most students, and during direct instruction or independent class time, they did. However, on assessments it appears that perhaps the "novelty" wore off, because TMs were utilized less and
less across the board. My only thought is that the constraint of available time and the length of data collection could not be altered; therefore, it may still be too early to tell whether any significant growth is possible without continuing the TM instruction and monitoring again toward the end of the school year.
The overall presentation of the data is organized both by type of data tool and chronological stage
of data collection. By examining the data closely I was able to comment and reflect on different aspects
which appear either below the data sets or further on in the discussion section. As with all the data
presented, absences occurred on a daily basis so there are few data sets showing results for all the
respondents at any one time. For "Questionnaire #1", I thought it was important to establish a baseline of prior knowledge about the specific content to be studied (the human body) and of the various thinking strategies and tools students had used prior to the introduction of any TMs in my science class.
Figure 3: Student Questionnaire #1: Prior Knowledge
Q-A: How much do you know about the human body before we start our unit?
Participant code/Response “ a lot” “some” “a little” “nothing”
B 4/3 0 2 0 0
B 5 1 2 2 0
G 4/3 0 0 6 0
G 5 0 4 0 1
Q-B: What strategies and tools do you use most to help you learn information?*
Participant code/Response “ study guide” “books” “notes” “TMs”
B 4/3 0 0 2 1
B 5 1 2 1 0
G 4/3 0 1 3 2
G 5 1 1 1 1
*Other: There were other responses noted, such as mnemonics, cheers/chants, flash cards, games, videos, online support, group study, memorization…
(See Appendix A for actual questionnaire)
Most of the respondents (about 88%) had “some” or “a little” prior knowledge about the human
body content which I think would make for an easier transition into using TMs to organize knowledge and
remember terms, facts and processes. There were 4 students (18%) who already used TMs to varying
degrees as a learning strategy or tool in other classes, and the majority (59%) used some traditional form of
study strategy. A smaller percentage (22%) used additional strategies that they delineated, but the data
shows that all students were comfortable using strategies of some kind. I anticipated that even though the number of participants previously familiar with TMs was small, this number would grow with each sub-unit of instruction (body system). As for the ethnicity data, there were few if any implications here showing variance among ethnic groups, and I wasn't sure going forward if any
significant patterns would be revealed. (One of the initial constraints I noticed was the issue of participant
absences and it made for a varying number of responses every time I collected data. Because the absences
were attributed to different individuals each time, I wasn't sure how to address the validity of data collection,
other than to continue to be consistent for all who were present. If I were to try to “make-up” the data
when a student returned, it might skew the overall data because it was collected in different settings, and if
I were to ignore the absent individuals, then the total number of participants would change with each
collection tool. I chose to try a little of each remedy, although I’m not sure that I can disaggregate the
effect from the overall data).
Throughout each sub-unit, pretests were given to assess prior knowledge. At no time (especially in
the later pre-tests) did students independently utilize TMs to share their thinking. I assumed that they
would gradually start to appear, but not once in a pre-test situation was a TM used by a student. During
instruction, TMs were utilized by all students to their fullest capacity, since I began by modeling with the whole class emulating, and then wandered as they worked, looking for completeness. When it came
time to assess independent use, the next form of data collection was the post-tests following each sub-unit
where it was anticipated that TMs would appear in student responses. Both the “pre-test” and “post-test”
asked an identical question: whether the student could describe the process for that specific body system
(such as the process of digestion, the process of breathing, the process of blood circulation, the process of
the brain sending a message, etc.). Most pre-tests were answered in the same manner (short written
responses to the question, or “I don’t know”) but after content instruction and TMs modeling of the “flow
map”, the post tests began to show some TMs use to varying degrees of success:
Figure 4: Post-Test Results
Body System    Participant Code   Successful TM Use   Marginal TM Use   Non-TM Use
Digestive      B 4/3              1                   1                 0
               B 5                4                   5                 0
               G 4/3              4                   3                 0
               G 5                1                   5                 0
Respiratory    B 4/3              0                   2                 0
               B 5                2                   5                 1
               G 4/3              2                   6                 0
               G 5                2                   0                 3
Circulatory    B 4/3              0                   3                 0
               B 5                1                   3                 3
               G 4/3              4                   2                 2
               G 5                1                   3                 0
Nervous        B 4/3              0                   2                 3
               B 5                0                   1                 4
               G 4/3              0                   4                 1
               G 5                1                   2                 2
The initial pre-test responses were, for the most part, the same: minimal answers with little terminology or content knowledge, and no use of TMs, even after the students had started to receive
instruction in their use. Targeted TMs were introduced (such as the “flow map” for sequence of
events/processes) and utilized throughout instruction and, when the same question was posed on the post-
test, my expectation was that the students would use flow maps on the assessment to explain the specific
body function process. The data shows TMs usage with varying degrees of success over time – the
digestive system was the first sub-unit introduced, with the nervous system being one of the last in the 4-
week time frame. Some participants successfully used the correct TMs with the correct terminology at
each body system stage. Others attempted to use the flow map, but did not provide the correct information
in each stage of the process. Still others made no attempt at using TMs whatsoever, reverting back to short
written responses, or various other diagrams/graphic organizers such as Venn Diagrams, cycles and
labeled models. Subsequent body systems (skeletal and muscular) did not make use of the same type of
flow map for processes, and a different set of TMs were utilized. Data was not collected in the same
manner for these final systems.
After all of the TM instruction and modeling was introduced, about midway through the data
collection, a questionnaire was given to students to see if they could identify the specific thinking
processes and match them to the corresponding maps:
Figure 5: Student Questionnaire #2: Identify Intended Thinking Strategy
Q- Match each TM to its intended thinking strategy (that is, why do we use it)?
Participant code / Number Correct (of 6):   0   1   2   3   4   5   6
B 4/3 0 0 1 0 0 0 1
B 5 0 2 1 2 2 0 1
G 4/3 0 0 2 1 3 0 0
G 5 0 0 1 3 2 0 0
(See Appendix D for actual questionnaire)
Most of the students ranged from 2 to 4 correct matches with only about 41% passing acceptably.
It was still only midway through an admittedly short research period, so there was hope that with more
practice an increasing number of students would be more successful in the future.
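As a check on that 41% figure (assuming "passing acceptably" means at least 4 of the 6 matches correct, which is my reading rather than a criterion stated above), the counts from Figure 5 work out as follows:

```python
# Rows from Figure 5: number of students scoring 0..6 correct, by participant group.
scores = {
    "B 4/3": [0, 0, 1, 0, 0, 0, 1],
    "B 5":   [0, 2, 1, 2, 2, 0, 1],
    "G 4/3": [0, 0, 2, 1, 3, 0, 0],
    "G 5":   [0, 0, 1, 3, 2, 0, 0],
}

total = sum(sum(row) for row in scores.values())        # 22 respondents present
passing = sum(sum(row[4:]) for row in scores.values())  # 9 scored 4 or better
print(f"{passing}/{total} = {passing / total:.0%}")     # 9/22 = 41%
```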
Yet another area within the data where unintended results occurred was with the human body "final test". It was designed by one of the other grade level team members and administered to the entire grade level, including my participants (and ultimately was not always consistent in language with my earlier post-tests). Students were asked on 3 occasions to describe body function "processes", all 3 of which could have been suited to TM responses if the student was aware of the cues (the test used similar language such as "describe how __ works" and "describe how ___ occurs", and in one instance very consistent with a pre-test and class work, "compare and contrast ____"; in addition there was a bonus section where students were asked to "describe" additional facts not previously required). Even though the questions
were worded in a slightly different way, it was still my hope that my students would see the key word,
make the connections and utilize the appropriate TMs:
Figure 6: Final Test Questions (that included key word descriptions of specific processes)
Entries show TM use as successful/marginal for each question; the final column counts responses with no TM attempt.
Participant Code   Arteries & Veins   Breathing   Muscle Pairs   "Extra credit facts"   No TM attempt
B 4/3              0/1                0/1         0/0            0/0                    1
B 5                1/1                0/2         0/0            0/0                    5
G 4/3              1/2                0/0         0/0            0/0                    3
G 5                2/0                0/0         0/0            0/0                    4
Students seemed to utilize TMs much more fully during class and independent work than they did on the final assessment. Perhaps this was because I purposely did not mention TMs as a specific option for test responses; I wanted to see whether they would transfer the use of TMs independently without my prompting. The number of students who were successful with their maps on these questions (only about 16%) was disappointing, but even more alarming was the number of students who chose not to use a TM at all (about 54%) on test responses, even within the "low pressure" extra credit section. The "muscle pairs" question also drew no attempts at TM usage, for some unknown reason, even though we distinctly used TMs in class to discuss the relationship.
At the end of all the TM instruction and modeling, and after the final assessment near the end of
the data collection, a third questionnaire was given to students to see if they could make connections with
the key words and phrases associated with each thinking process and then draw the corresponding map:
Figure 7: Student Questionnaire #3: Connect with key words and draw TM
Q- Connect the key words and phrases to each thinking strategy and draw the corresponding TM.
Participant code / Number Correct (of 6):   0   1   2   3   4   5   6
B 4/3 0 1 0 0 0 1 0
B 5 0 0 2 1 3 0 1
G 4/3 0 1 0 2 0 0 2
G 5 0 1 1 2 0 0 1
(See Appendix E for actual questionnaire)
Most of the students once again ranged from 2 to 4 correct responses and, similarly to the second questionnaire, only about 42% passed within acceptable margins. Although a few had moved into the 4, 5, or 6 correct columns, even more had moved in the opposite direction, showing more incorrect responses overall. I was curious how this result might have related to the frequency of use of each type of map over the course of the data collection. I tallied the frequency of each map as it was used throughout the sub-units, then determined the percentage of students that correctly connected key words with the drawn map, to see if there were any correlations:
Figure 8: Frequency of TM use v. success rate on questionnaire #3
Type of map            Frequency of use across sub-units   % correct responses
“circle map” 7 70%
“tree map” 3 39%
“double bubble map” 1 48%
“flow map” 5 57%
“cause & effect map” 1 39%
“brace map” 1 30%
* Not all TM types were used during this data collection period
The "circle map" was most widely used, and this showed in the number of correct responses. The "flow map" and "tree map" were the next most used maps, yet their numbers of correct responses do not seem to correlate. Maybe it had to do with the amount of exposure time, or perhaps my sample is too small?
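To follow up on that question, the minimal sketch below is a post-hoc check of my own (not part of the original analysis) that computes a simple correlation between frequency of use and the correct-response rate from Figure 8; with only six map types, any value it returns should be read as suggestive at best.

```python
# Data from Figure 8: frequency of use across sub-units vs. % correct responses.
maps = {
    "circle map":         (7, 0.70),
    "tree map":           (3, 0.39),
    "double bubble map":  (1, 0.48),
    "flow map":           (5, 0.57),
    "cause & effect map": (1, 0.39),
    "brace map":          (1, 0.30),
}

freq = [f for f, _ in maps.values()]
correct = [c for _, c in maps.values()]

def pearson(xs, ys):
    """Plain Pearson correlation coefficient for two equal-length lists."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# Six data points only: a high value here would hint at, not prove, a relationship.
print(f"r = {pearson(freq, correct):.2f}")
```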
The remainder of data was collected in the form of observation notes on student engagement,
depth of knowledge and colleague interview responses (see Appendices F & G). General observations
included that, after four weeks of TM introduction, some students (about 25%) began to speak about
content in a more confident manner and often referred back to a TM when making a point and attempting
to clarify their thoughts. More students (about 30%) were able to discuss concepts with great detail using
TMs with little prompting from me. I observed a limited increase in written response depth and quality
(about 20%), but a much larger increase in class engagement (about 60%). The data collected from
colleague input was more revealing, however. All (100%) of my colleagues found the use of TMs,
especially the common language they created, to be extremely helpful. Vertical conversations including
teachers and administration revealed that students were largely on task and participated more when able to
offer the different perspectives that the TMs encourage. The use of TMs also enabled the students in their
classrooms to be more independent and concise, and several expressed their amazement at what their
students could do with the maps. Most (about 80%) agreed that more teacher professional training was
necessary moving forward.
Emerging Themes/Results
Looking across all the data, I was surprised to see that the strongest results (and participation in TMs usage) came at the very beginning and declined with every data collection that followed. Perhaps most curious for me is that the data appears to show a decrease in consistent and independent TM use; instead of getting stronger with TM use, the students were becoming less and less consistent with the proper TMs, not independently choosing those graphic organizers but reverting to other, more general forms (or poorly constructed written responses). Not at all what I had expected! In addition, some participants appear to have difficulty remembering which TM belongs with which thinking/process skill, and I'm concerned that this might present a "double load" and be just as hard as learning the
content material itself, presenting further areas of confusion. As an initial response to my research
questions, the first use of TMs produced more depth of responses and more engagement in the TMs
process. As each sub unit progressed, the participants utilized TMs in their notes, class discussions and
post-tests, but with decreasing degrees of accuracy. There seemed to be a small number of students who increased their understanding, response levels and engagement, but they weren't the ones who most needed to grow in terms of skill progression. I am still trying to sort out the reasons why this may have occurred.
Some other constraints I hadn't thought to explore emerged around learning styles: TMs are predominantly visual and systematic organizers (with some tactile qualities in the writing or drawing, and auditory qualities when paired with "verbal rehearsal" of their contents). Looking at the data, I realize that I should have allowed students the opportunity to "think aloud" and verbalize the process; some students could have explained their thinking better that way, and it would have produced different results for them. In wanting to remain consistent in how I modeled the use of
TMs and the expectations for assessment responses, I overlooked meeting the needs of other learning
styles. Another potential barrier to clear and consistent data could have been that this participant group
may have had varying degrees of previous “response instruction” (modeled by different team teachers in
flexible groupings for reading and math, in addition to a different graphic organizer system used in a
school-wide writing program), as well as differing amounts of previous exposure to TM use, and these differences may have introduced some confusion when students were confronted with the TMs approach.
My overall findings appear to parallel those of a study I read (Burden & Silver, 2006), in which teachers were convinced that students were capable of achieving more than they were demonstrating and intended to raise expectations and see an increased number of students become more active learners with TMs. Although the results did not quite meet their assumptions, and the researchers noted that the TMs needed extensive further piloting, they came to the conclusion that students were just starting to show signs that the TM tools helped guide thinking and problem-solving and that
there were many different uses still ahead for multiple maps implementation.
I likewise see early signs of habit-forming and believe that TMs are beneficial, with connections to several related clusters of Costa's "Habits of Mind" (Hyerle, 2008), such as "attending to tasks that require patience & analytical, detailed thinking." Over time, students should develop fluency with each map and become better able to transfer skills from multiple maps into each content area, gaining "the flexibility to choose and use maps for a variety of purposes" (Hyerle & Williams, 2009).
Discussion and Suggestions for Further Research
From the perspective of my action research, I have learned that TMs are powerful tools when modeled and used appropriately. Consistent use, especially across disciplines, allows learners to make deeper connections to their learning. They offer a way to express
thoughts and personalize learning by using the map features (frames of reference and color coding) to differentiate for individual needs and to give structure to note-taking and the absorption of new content.
One realization I had as a result of this action research is that I have barely scratched the surface with my implementation of TMs; having learned these TM strategies "second-hand" from a team member who was part of the actual pilot group, I recognize that there are more deliberate skills and strategies I need to learn in order to have the maximum effect on my students. As I observed my students, I noticed an emerging trend: many still required constant "nudging" to use the TMs to their fullest potential, and the most difficult instructional hurdle was getting some students to transition to independent TM use.
As I sought answers to my research questions, I came to appreciate (and need to keep reinforcing) the notion that only through extended exposure and consistent practice will TM use become intuitive for students and teachers alike. Reflecting on my research technique, I found field notes used in this way to be more powerful than other types of anecdotal record keeping I have attempted in the past. I have also become more reflective and metacognitive about my own instructional practices, and I use the strategies more often myself as TMs have slowly become more intuitive with use.
As for the future, I will continue to use TMs in my classroom across all subject disciplines. I will also begin using portfolios to monitor learners as they assimilate new knowledge and develop incremental skill with TMs, noting patterns as they emerge. This research warrants continued exploration of my practice: actively inviting explicit thinking, integrating more TMs into the curriculum, and monitoring the effects of TMs on scores. The way to increase long-term memory (and improve student achievement) is to introduce new information gradually and repeat it at regular intervals to "mind the gap" while attracting and holding attention (Medina, 2008); this increased exposure to TM use should solidify these metacognitive strategies for students to carry into future years.
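As a small illustration of what "repeat it at regular intervals" could look like in planning terms, the sketch below generates a simple review calendar for revisiting a map or topic after its first exposure. The specific intervals are my own placeholders, not a schedule prescribed by Medina or by the Thinking Maps® program.

```python
# Minimal sketch (placeholder intervals): dates for revisiting a Thinking Map
# or topic after its first exposure, to space out repeated practice.
from datetime import date, timedelta

def review_schedule(first_exposure, intervals_in_days=(1, 3, 7, 14, 30)):
    """Return the dates on which the map/topic would be revisited."""
    return [first_exposure + timedelta(days=d) for d in intervals_in_days]

# Example: a circle map introduced on 3/9/10, the first day of the
# "Circulatory System" sub-unit noted in my field notes.
for review_date in review_schedule(date(2010, 3, 9)):
    print(review_date.isoformat())
```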
I have since participated in a school-wide TM training and reviewed a culminating report of the first pilot group's findings. I will take part in a summer training conducted by the first pilot group to review "lessons learned" and to share how to implement TMs successfully in the classroom. I have also recently been invited to join the second round of the TM pilot program in the fall, so I will be starting a new cycle of action research on TMs and refocusing my efforts with a new class. Perhaps I will also look at keeping students motivated once they have been trained in TM use, in an effort to prevent the decline in use over time that I found in the current study. I look forward to learning more about Thinking Maps® in the future, increasing my instructional skill and continually informing my practice.
References
Burden, B., & Silver, J. (2006). Thinking maps in action. Teaching, Thinking & Creativity. Retrieved from http://www.teachthinking.com
Caine, R.N., & Caine, G. (1991). Making connections: Teaching and the human brain. Wheaton, MD: ASCD.
Dual-coding theory. (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Dual-coding_theory
Gerlic, I., & Jausovec, N. (1999). Multimedia: Differences in cognitive processes observed with EEG. Educational Technology Research and Development, 47(3), 5-14.
Hubbard, R.S., & Power, B.M. (2003). The art of classroom inquiry: A handbook for teacher-researchers. Portsmouth, NH: Heinemann.
Hyerle, D. (1996). Thinking maps: Seeing is understanding. Educational Leadership, 53(4).
Hyerle, D., Curtis, S., & Alper, L. (2004). Student successes with thinking maps: School-based research, results and models for achievement using visual tools. Thousand Oaks, CA: Corwin Press.
Hyerle, D., & Yeager, C. (2007). Thinking maps: A language for learning. Cary, NC: Thinking Maps, Inc.
Hyerle, D. (2008). Thinking maps: Visual tools for activating habits of mind. In A.L. Costa & B. Kallick (Eds.), Learning and leading with habits of mind: 16 essential characteristics for success (pp. 149-176). Alexandria, VA: ASCD.
Hyerle, D., & Williams, K. (2009). Bifocal assessment in the cognitive age: Thinking maps for assessing content learning and cognitive processes. New Hampshire Journal of Education, 12, 32-38.
Jensen, E. (1998). Brain-based learning: The new paradigm of teaching. Thousand Oaks, CA: Sage.
Leary, S.F. (1999). The effect of thinking maps instruction on the achievement of fourth-grade students. Retrieved from http://www.thinkingfoundation.org/research/graduate_studies/pdf/samuel-leary-dissertation.pdf
Marzano, R.J., Pickering, D.J., & Pollock, J.E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Prentice Hall.
McCormick, S. (1995). Instructing students who have literacy problems. Englewood Cliffs, NJ: Prentice Hall.
Medina, J. (2008). Brain rules. Retrieved from IT 6710: http://cu.ecollege.com
Merkley, D.M., & Jeffries, D. (Dec. 2000/Jan. 2001). Guidelines for implementing a graphic organizer. The Reading Teacher, 54(4), 350-357.
Novak, J.D., & Gowin, D.B. (1984). Learning to learn. New York, NY: Cambridge University Press.
Paivio, A. (1990). Mental representations: A dual coding approach. Available from http://books.google.com/books?id=hLGmKkh_4K8C&lpg=PA3&ots=B1GWeFjljn&dq=Paivio%20%2B%20dual%20coding%20theory&lr&pg=PP1#v=onepage&q=Paivio%20+%20dual%20coding%20theory&f=false
Ritchhart, R., Turner, T., & Hadar, L. (2009). Uncovering students' thinking about thinking using concept maps. Metacognition and Learning, 4(2), 145-159.
Robinson, V., & Lai, M.K. (2006). Practitioner research for educators: A guide to improving classrooms and schools. Thousand Oaks, CA: Corwin Press.
Stull, A.T., & Mayer, R. (2007). Learning by doing versus learning by viewing: Three experimental comparisons of learner-generated versus author-provided graphic organizers. Journal of Educational Psychology, 99(4), 808-820.
Tomlinson, C.A., & Kalbfleisch, M.L. (1998). Teach me, teach my brain: A call for differentiated classrooms. Educational Leadership, 56(3), 52-55.
Appendix A: (Data collection questionnaire sample 1)
Questionnaire #1
Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.
1. How much do you know about “the human body” before we start our unit? (Circle one)
a lot some a little nothing
2. What strategies or tools do you use the most to help you learn and remember new information?
_____________________________________________________________________________________
_____________________________________________________________________________________
3. How do you think you usually do on tests? (Circle one)
very well pretty good just OK not very well
4. What do you think of the quality of your test responses? (Select the most appropriate answer)
• My work is usually thorough and I’m proud of my success
• I usually recognize my mistakes and I can fix them
• I’m usually not sure where I went wrong and need further clarification
• My responses are not usually sufficient
• Other: ________________________________________________________________________
**************************************************************
The questions above were about your learning strategies for science. Please answer the following questions
about learning in all subject areas:
Should students be taught “thinking skills”? (Circle one) Y N
Why or why not?
_____________________________________________________________________________________
_____________________________________________________________________________________
List all of the thinking skills/tools/strategies that you use to help you learn in school:
____________________________ ____________________________
____________________________ ____________________________
____________________________ ____________________________
Appendix B: (Data collection pre-test sample)
Pretest #1: Digestion
Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.
1. Using words or pictures, describe how human digestion works:
2. How do you know?
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
3. Where will you be able to find out more about digestion?
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Appendix C: (Data collection post-test sample)
Post-test #3: Circulation
Directions:
• Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.
1. Using words and/or pictures, describe the process of human circulation:
2. How do you know?
_____________________________________________________________________________________
_____________________________________________________________________________________
_____________________________________________________________________________________
Appendix D: (Data collection questionnaire sample 2)
Questionnaire #2
Directions:
• Look at each of the “thinking maps” on the left side of the page.
• Carefully draw a line to match each to its intended thinking strategy on the right side of the page.
Restated: Match each “thinking map” to the reason we use it…
comparing & contrasting
sequencing & ordering
identifying part-to-whole
relationships
brainstorming or
defining in context
analyzing cause & effect
classifying & grouping
Appendix E: (Data collection questionnaire sample 3)
Questionnaire #3
Directions:
• Carefully read each description below that contains the “code words & phrases” that help us know which type of “thinking map” works best for what we study.
• Decide which “thinking map” is being described and draw it into the frame of reference.
Restated: Draw the “thinking map” that best describes the phrases below it…
define
brainstorm
describe in context
list
identify
generate ideas…
…as many as you can
…tell everything you know
…use your prior knowledge
explore the meaning of…
classify
categorize
organize
group
sort
give sufficient details for…
types of…
describe parts & functions of…
list & elaborate on…
compare/contrast
similarities/differences
alike/unlike
distinguish between…
show what ___ has in common…
describe shared components of…
sequence
order events
describe phases/cycles/process
recount/retell
summarize procedures
tell/show how…
identify steps/stages
first/then/next…
put in order
how to get from ___ to ___
cause/effect
discuss consequences of…
why did/does…
what would happen if…
predict the outcome of…
describe changes
discuss strategy/outcome
what is the impact of…
what would be the result if…
part-to-whole
describe/show structure
what are the parts/components
take apart __/ put together __
__ is made up of…
describe the
subparts/subcomponents
how does __ fit in a larger context
Appendix F: Field notes excerpts
Entry (3/9/10): Started on the third sub-unit today “Circulatory System” … introduced circle map to be
completed as part of the “morning menu”… First elicited “what you already know” and marked the pre-
established color-coding on the board: brain = pencil;
For the first time, a G5 asked at the beginning of ‘morning menu’, “Can we add in our own notes with other colors once we finish our brainstorming?” This is the first time that anyone has made the
connection to use additional colors for additional sources on their own, without my prompting. Maybe
they’re starting to get it?? We then added to the board the remainder of the color-coding chart and left it
there permanently for the duration of all the human body units:
Brain = pencil; text = 2nd color; class discussions & teacher lesson = 3rd color; video = 4th color;
Assigned corresponding text readings and note-taking to the menu… Will introduce tree map this
afternoon for 3 types of blood cells
***
Entry (3/17/10): I walked around the room during group work activity (students were establishing a flow
chart for “Nervous System” – How does the brain send/respond to a message?)… observed individuals
displaying – and/or + behaviors with respect to the DB map skill…
General Observations (recorded by participant subgroup: B 3/4, B 5, G 3/4, G 5):
- quickly scribbling a double bubble; no care taken to align related bubbles
+ using tree map for facts to put into DB map
+ completed DB map before rest of group with lots of detail
- starts a Venn Diagram when asked to compare/contrast
+ explaining how to use a double bubble to elbow partner
+ comparing DB map to the difference paragraph ‘bug box’ in writing
+ 2 working together “divide and conquer”: one uses red for compare, other uses blue for contrast
- needs explanation from elbow partner; does not remember how to use DB
- uses bubble map instead of DB map
- off task and talking; no signs of map being started
Note: this week’s lessons are becoming “extra” – feeling like “one more thing” in an already tight schedule; will have to embed the brace map as a “quick check” activity (direct instruction) prior to the quiz instead, to make time….
***
Entry (3/23/10): Group Observation tally
Behaviors during “Skeletal/Muscular” assessment (tallies for B 3/4 | B 5 | G 3/4 | G 5 participants):
Staring at page or looking up in thought: III | I | I | II
Using FM instead of MFM to show contractions: II | IIII | III | II
Color-coding maps on test: I | I | II | I
Asks what map to use: I I
Note: 6 absences today!! Need to determine how that will affect overall data…. Make-up exams!!
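For anyone curious how tallies like the one above could be rolled up across several field-note entries, here is a minimal Python sketch; the subgroup labels come from my notes, but the example counts are invented for illustration.

```python
# Minimal sketch (invented example counts): summing field-note behavior
# tallies by participant subgroup across multiple observation entries.
from collections import defaultdict

SUBGROUPS = ["B 3/4", "B 5", "G 3/4", "G 5"]

# Each entry maps a behavior to tallies listed in SUBGROUPS order.
entries = [
    {"Staring at page or looking up in thought": [3, 1, 1, 2],
     "Color-coding maps on test": [1, 1, 2, 1]},
    {"Staring at page or looking up in thought": [2, 2, 0, 1],
     "Color-coding maps on test": [0, 2, 1, 1]},
]

totals = defaultdict(lambda: [0] * len(SUBGROUPS))
for entry in entries:
    for behavior, counts in entry.items():
        for i, count in enumerate(counts):
            totals[behavior][i] += count

for behavior, counts in totals.items():
    breakdown = ", ".join(f"{g}: {c}" for g, c in zip(SUBGROUPS, counts))
    print(f"{behavior} -> {breakdown}")
```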
Appendix G: Interview question format
The following questions were asked of my colleagues (grade-level team members, pilot participants, other
teachers trying to emulate the pilot instruction on their own) in an effort to find common experiences and
to help me make comparisons between my class and other classes at similar levels of implementation:
Domain: Quantitative
1. Which TM do you most frequently use? Why?
2. How often do you use each of the TMs?
3. How many of each type of map have the students completed?
4. Have you measured growth?
Probe further: How long does the introduction/modeling take for each?

Domain: Qualitative
1. What instructional strategies do you use to teach TMs?
2. Describe the experience of using TMs in your classroom.
3. How have you extended the use of TMs?
Probe further: Which TMs seem easiest for your students? Most difficult?

Domain: Reflective (Outcome of Lesson)
1. How does the TM relate to the lesson being taught?
2. How do you monitor engagement?
3. How do you incorporate TMs into assessment?
Probe further: How do you differentiate using TMs?

Domain: Professional Impact and Practice
1. Describe your pacing for TM implementation.
2. What success have you observed?
3. What areas for improvement have you observed?
Probe further: How does the use of TMs support our school or grade-level agreements?
(adapted from the literature: Leary, 1999; that study’s questionnaire format served as the model)
Appendix H: Research Timeline
Task Date/Time Frame People involved
Scope the research problem Begin Feb 1 (due Feb. 7) Self
Create timeline, or “plan of action” Begin Feb 1 (due Feb. 14) Self
Conduct the initial literature review Begin Feb 1 (due Apr. 4) Self
Design the study; plan data collection Begin Feb 7 (due Feb. 21) Self; team
Create data collection tools Begin Feb 7 (due Feb. 21) Self
Arrange to conduct research with respondents and involved educators Begin Feb 7 (due Feb. 21) Self; team; students
Create permission forms and obtain permission to conduct research at designated site Begin Feb 7 (due Feb. 21) Self; team; parents
Collect data (activity part 1) Begin Feb 22 (due Feb. 28) Self; team; students
Collect data (activity part 2) Begin Feb 22 (due Mar. 7) Self; team; students
Continue data collection Begin Feb 22 (due Mar 26) Self; team; students
Analyze data Begin Mar. 8 (due Mar. 28) Self; team
Make inferences and determine findings Begin Mar. 8 (due Apr. 11) Self
Literature review (activity due) Begin Feb. 1 (due Apr. 4) Self
Finalize literature review Begin Feb. 1 (due Apr. 25) Self
Write final report draft Begin Feb 7 (due Apr. 11) Self
Peer edit Begin Apr. 12 (due Apr. 18) Self; IT 6720 cohort; team
Review findings with key person Begin Apr. 12 (due Apr. 18) Self; team
Revise final report Begin Apr. 19 (due Apr. 25) Self
Submit final report ** Due Apr. 25 ** Self
Prepare presentation Begin Apr. 25 (due May 2) Self
Presentation: review of study and findings ** Due May 2 ** Self

Weitere ähnliche Inhalte

Was ist angesagt?

SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...
SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...
SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...Doris Soares
 
Defining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional StrategiesDefining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional StrategiesMarilyn Velez
 
Powerpoint presentation wk3
Powerpoint presentation wk3Powerpoint presentation wk3
Powerpoint presentation wk3Makenzy Deckard
 
Reciprocal teaching klingner and vaughn
Reciprocal teaching klingner and vaughnReciprocal teaching klingner and vaughn
Reciprocal teaching klingner and vaughnannes86
 
Defining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional StrategiesDefining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional Strategieseilene315
 
Content literacy research into practice
Content literacy   research into practiceContent literacy   research into practice
Content literacy research into practiceJennifer Evans
 
Into the Gauntlet: Letting Students Teach One Another
Into the Gauntlet: Letting Students Teach One AnotherInto the Gauntlet: Letting Students Teach One Another
Into the Gauntlet: Letting Students Teach One Anotherjcmcintosh
 
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...Guided Reading: A Research-Based Response to the Challenges of Early Reading ...
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...rathx039
 
Literate environment analysis
Literate environment analysisLiterate environment analysis
Literate environment analysisGillian27
 
Enhancing efl readers’ metacognition
Enhancing efl readers’ metacognitionEnhancing efl readers’ metacognition
Enhancing efl readers’ metacognitionAlexander Decker
 
USING AND EVALUATING INSTRUCTIONAL MATERIALS
USING AND EVALUATING INSTRUCTIONAL MATERIALSUSING AND EVALUATING INSTRUCTIONAL MATERIALS
USING AND EVALUATING INSTRUCTIONAL MATERIALSjanehbasto
 
Readers are Leaders
Readers are Leaders Readers are Leaders
Readers are Leaders galindoar526
 
Literacy environment analysis
Literacy environment analysisLiteracy environment analysis
Literacy environment analysisMelinda Bratton
 

Was ist angesagt? (19)

Fors.whitepaper
Fors.whitepaperFors.whitepaper
Fors.whitepaper
 
Annotations
AnnotationsAnnotations
Annotations
 
SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...
SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...
SOARES, DORIS DE A. Developing critical writing skills in L2. BRAZ-TSOL Newsl...
 
Defining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional StrategiesDefining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional Strategies
 
Powerpoint presentation wk3
Powerpoint presentation wk3Powerpoint presentation wk3
Powerpoint presentation wk3
 
Reciprocal teaching klingner and vaughn
Reciprocal teaching klingner and vaughnReciprocal teaching klingner and vaughn
Reciprocal teaching klingner and vaughn
 
Defining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional StrategiesDefining Comprehension Strategies and Instructional Strategies
Defining Comprehension Strategies and Instructional Strategies
 
Content literacy research into practice
Content literacy   research into practiceContent literacy   research into practice
Content literacy research into practice
 
Into the Gauntlet: Letting Students Teach One Another
Into the Gauntlet: Letting Students Teach One AnotherInto the Gauntlet: Letting Students Teach One Another
Into the Gauntlet: Letting Students Teach One Another
 
From Divide and Conquer to Dynamic Teamwork: A New Approach to Teaching Publi...
From Divide and Conquer to Dynamic Teamwork: A New Approach to Teaching Publi...From Divide and Conquer to Dynamic Teamwork: A New Approach to Teaching Publi...
From Divide and Conquer to Dynamic Teamwork: A New Approach to Teaching Publi...
 
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...Guided Reading: A Research-Based Response to the Challenges of Early Reading ...
Guided Reading: A Research-Based Response to the Challenges of Early Reading ...
 
Language Learning Strategy Use; Does Critical Thinking Make a Difference?
Language Learning Strategy Use; Does Critical Thinking Make a Difference?Language Learning Strategy Use; Does Critical Thinking Make a Difference?
Language Learning Strategy Use; Does Critical Thinking Make a Difference?
 
Literate environment analysis
Literate environment analysisLiterate environment analysis
Literate environment analysis
 
Bc week 6
Bc week 6Bc week 6
Bc week 6
 
Enhancing efl readers’ metacognition
Enhancing efl readers’ metacognitionEnhancing efl readers’ metacognition
Enhancing efl readers’ metacognition
 
ksklitreview
ksklitreviewksklitreview
ksklitreview
 
USING AND EVALUATING INSTRUCTIONAL MATERIALS
USING AND EVALUATING INSTRUCTIONAL MATERIALSUSING AND EVALUATING INSTRUCTIONAL MATERIALS
USING AND EVALUATING INSTRUCTIONAL MATERIALS
 
Readers are Leaders
Readers are Leaders Readers are Leaders
Readers are Leaders
 
Literacy environment analysis
Literacy environment analysisLiteracy environment analysis
Literacy environment analysis
 

Andere mochten auch

Reaching Your Audience Faster Final
Reaching Your Audience Faster FinalReaching Your Audience Faster Final
Reaching Your Audience Faster FinalMichael Zabinski
 
Digital Storytelling Documentation
Digital Storytelling DocumentationDigital Storytelling Documentation
Digital Storytelling Documentationayounce
 
EdWeb Analysis & Design Documentation
EdWeb Analysis & Design DocumentationEdWeb Analysis & Design Documentation
EdWeb Analysis & Design Documentationayounce
 
Creative Designs Documentation
Creative Designs DocumentationCreative Designs Documentation
Creative Designs Documentationayounce
 
Job Aid Spectacular
Job Aid SpectacularJob Aid Spectacular
Job Aid Spectacularayounce
 
Open APIs: A Telco's Perspective
Open APIs: A Telco's PerspectiveOpen APIs: A Telco's Perspective
Open APIs: A Telco's Perspectiveaubs
 
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...Webinar: Planning for Success: Formulating a Strategy Before and After Your E...
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...Social Tables
 
Study: The Future of VR, AR and Self-Driving Cars
Study: The Future of VR, AR and Self-Driving CarsStudy: The Future of VR, AR and Self-Driving Cars
Study: The Future of VR, AR and Self-Driving CarsLinkedIn
 

Andere mochten auch (8)

Reaching Your Audience Faster Final
Reaching Your Audience Faster FinalReaching Your Audience Faster Final
Reaching Your Audience Faster Final
 
Digital Storytelling Documentation
Digital Storytelling DocumentationDigital Storytelling Documentation
Digital Storytelling Documentation
 
EdWeb Analysis & Design Documentation
EdWeb Analysis & Design DocumentationEdWeb Analysis & Design Documentation
EdWeb Analysis & Design Documentation
 
Creative Designs Documentation
Creative Designs DocumentationCreative Designs Documentation
Creative Designs Documentation
 
Job Aid Spectacular
Job Aid SpectacularJob Aid Spectacular
Job Aid Spectacular
 
Open APIs: A Telco's Perspective
Open APIs: A Telco's PerspectiveOpen APIs: A Telco's Perspective
Open APIs: A Telco's Perspective
 
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...Webinar: Planning for Success: Formulating a Strategy Before and After Your E...
Webinar: Planning for Success: Formulating a Strategy Before and After Your E...
 
Study: The Future of VR, AR and Self-Driving Cars
Study: The Future of VR, AR and Self-Driving CarsStudy: The Future of VR, AR and Self-Driving Cars
Study: The Future of VR, AR and Self-Driving Cars
 

Ähnlich wie Younce action researchproject

Classroom Instr That Works
Classroom Instr That WorksClassroom Instr That Works
Classroom Instr That WorksTeresa Castellaw
 
Curriculum InceptionTaya Hervey-McNuttStrayer Universi
Curriculum InceptionTaya Hervey-McNuttStrayer UniversiCurriculum InceptionTaya Hervey-McNuttStrayer Universi
Curriculum InceptionTaya Hervey-McNuttStrayer UniversiOllieShoresna
 
Development and Evaluation of Concept Maps as Viable Educational Technology t...
Development and Evaluation of Concept Maps as Viable Educational Technology t...Development and Evaluation of Concept Maps as Viable Educational Technology t...
Development and Evaluation of Concept Maps as Viable Educational Technology t...paperpublications3
 
Information is used in various ways, including as a tool to gain sup.docx
Information is used in various ways, including as a tool to gain sup.docxInformation is used in various ways, including as a tool to gain sup.docx
Information is used in various ways, including as a tool to gain sup.docxdoylymaura
 
Debbie watkin realms, achievement
Debbie watkin   realms, achievementDebbie watkin   realms, achievement
Debbie watkin realms, achievementguest2b32b2e
 
Debbie watkin realms, achievement
Debbie watkin   realms, achievementDebbie watkin   realms, achievement
Debbie watkin realms, achievementguest2b32b2e
 
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...University of Jaén-Psychology
 
Planning for learning in maritime education
Planning for learning in maritime educationPlanning for learning in maritime education
Planning for learning in maritime educationStein Laugerud
 
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4Cuong NGUYEN
 
Collaborative Lesson Plan Farooqi
Collaborative Lesson Plan  FarooqiCollaborative Lesson Plan  Farooqi
Collaborative Lesson Plan FarooqiAysha Farooqi
 
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...Ramakanta Mohalik
 
Implementation of Semantic Mapping
Implementation of Semantic MappingImplementation of Semantic Mapping
Implementation of Semantic MappingRic
 
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...Marion Piper
 
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...Marion Piper
 
Fy ssp 15
Fy ssp 15Fy ssp 15
Fy ssp 15Har1982
 
Journal Effective, Consistent, and Ethical CommunicationInformati.docx
Journal Effective, Consistent, and Ethical CommunicationInformati.docxJournal Effective, Consistent, and Ethical CommunicationInformati.docx
Journal Effective, Consistent, and Ethical CommunicationInformati.docxcareyshaunda
 
Actionresearch 110301041332-phpapp02
Actionresearch 110301041332-phpapp02Actionresearch 110301041332-phpapp02
Actionresearch 110301041332-phpapp02rechelle anasco
 

Ähnlich wie Younce action researchproject (20)

Classroom Instr That Works
Classroom Instr That WorksClassroom Instr That Works
Classroom Instr That Works
 
Curriculum InceptionTaya Hervey-McNuttStrayer Universi
Curriculum InceptionTaya Hervey-McNuttStrayer UniversiCurriculum InceptionTaya Hervey-McNuttStrayer Universi
Curriculum InceptionTaya Hervey-McNuttStrayer Universi
 
Development and Evaluation of Concept Maps as Viable Educational Technology t...
Development and Evaluation of Concept Maps as Viable Educational Technology t...Development and Evaluation of Concept Maps as Viable Educational Technology t...
Development and Evaluation of Concept Maps as Viable Educational Technology t...
 
6. Waminton Rajagukguk.pdf
6. Waminton Rajagukguk.pdf6. Waminton Rajagukguk.pdf
6. Waminton Rajagukguk.pdf
 
6. Waminton Rajagukguk.pdf
6. Waminton Rajagukguk.pdf6. Waminton Rajagukguk.pdf
6. Waminton Rajagukguk.pdf
 
Information is used in various ways, including as a tool to gain sup.docx
Information is used in various ways, including as a tool to gain sup.docxInformation is used in various ways, including as a tool to gain sup.docx
Information is used in various ways, including as a tool to gain sup.docx
 
Debbie watkin realms, achievement
Debbie watkin   realms, achievementDebbie watkin   realms, achievement
Debbie watkin realms, achievement
 
Debbie watkin realms, achievement
Debbie watkin   realms, achievementDebbie watkin   realms, achievement
Debbie watkin realms, achievement
 
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...
INNOVATION OF LEARNING IN THE EUROPEAN FRAMEWORK OF EDUCATION AT UNIVERSITIES...
 
Planning for learning in maritime education
Planning for learning in maritime educationPlanning for learning in maritime education
Planning for learning in maritime education
 
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4
Nguyen, Phuong LIT5203 Strengthening Literacy - Module 4
 
Collaborative Lesson Plan Farooqi
Collaborative Lesson Plan  FarooqiCollaborative Lesson Plan  Farooqi
Collaborative Lesson Plan Farooqi
 
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...
Effectiveness of Concept Mapping Strategy on Cognitive Process in Sciences at...
 
Implementation of Semantic Mapping
Implementation of Semantic MappingImplementation of Semantic Mapping
Implementation of Semantic Mapping
 
Module 4 Application
Module 4 ApplicationModule 4 Application
Module 4 Application
 
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
 
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
ACEL - Girl talk and writing voices _ improving conferencing for middle schoo...
 
Fy ssp 15
Fy ssp 15Fy ssp 15
Fy ssp 15
 
Journal Effective, Consistent, and Ethical CommunicationInformati.docx
Journal Effective, Consistent, and Ethical CommunicationInformati.docxJournal Effective, Consistent, and Ethical CommunicationInformati.docx
Journal Effective, Consistent, and Ethical CommunicationInformati.docx
 
Actionresearch 110301041332-phpapp02
Actionresearch 110301041332-phpapp02Actionresearch 110301041332-phpapp02
Actionresearch 110301041332-phpapp02
 

Younce action researchproject

  • 1. Improving Student Response Quality Utilizing “Thinking Maps®” -Action Research Final Report- Ann Younce University of Colorado at Denver IT 6720 – Research in Information and Learning Technologies May 2, 2010
  • 2. Thinking Maps® 2 Abstract This introductory experience into action research focuses on the utilization of “Thinking Maps®” (specialized visual organizational tools for content learning, designed by Thinking Maps, Inc.) to increase student response quality and depth of understanding of complex content. A fifth grade teacher (the researcher) working in a suburban district southeast of Denver, Colorado, was motivated to find out about the effectiveness of “Thinking Maps®” (or TMs) strategies in her science class, and whether the maps could produce a significant change in the quality and depth of responses from her students, many of whom were reluctant to provide more than surface answers and who generally resisted higher level thought. The researcher had never before used the TMs tools, but was inspired by a team member’s inclusion in a school-wide pilot of the “Thinking Map®” program, and observed its use in the neighboring classroom earlier in the year. The researcher believed that these purposeful thinking tools offered the right blend of graphic organizer and critical thinking strategies to increase student engagement and achieve additional scaffolding for differentiated student learning. The guiding question evolved into determining to what extent “Thinking Maps®” (or TMs) are effective tools and strategies to significantly produce a change in the quality and depth of written responses, as well as to build scaffolding techniques, deepen understanding, and increase engagement. The research process included implementing the use of a variety of TMs into the instruction of a new “Human Body” science unit. As the action research progressed over the course of the four-week study, the researcher taught students to use a variety of TMs, applied related key word and questioning strategies, recorded field notes, observed student behaviors and attitudes, collected data in the form of questionnaires and assessments, and monitored changes in the quality of student written responses and the level of engagement with new information. She also pursued peer contributions in the form of discussions and interviews. Throughout the implementation, she began to recognize improved note taking and study patterns among her students, and determined that TMs indeed added a new layer of metacognition for students that led to an increase in performance on science assessments. As a result of this action research, the researcher plans to fully incorporate TMs into her instructional methodology, and to participate in a school-wide “Thinking Map®” cohort next school year to continue with a new phase of action research.
  • 3. Thinking Maps® 3 Introduction In my current role as a fifth grade teacher in a Denver-area elementary school, I serve an impacted suburban community, represented by the following school profile data: 580 student enrollment (100 of which are in the fifth grade; 24 of which are in my class), with 34% on free-reduced lunch, an 82% stability rate, a 95% attendance rate, and diversity demographics including Native American (0.9%), Asian American (8.7%), African American (16.5%), Hispanic American (17%) and Caucasian (56.9%) populations. The school setting and culture is one of high expectations for best practice and professional learning with a strong base of data-driven practices among faculty and staff. The school utilizes embedded Professional Learning Communities to build capacity while continually seeking and expanding learning experiences. The school’s organizational purpose and motto clearly states: “Learning…Whatever It Takes” and it is an effective philosophy for student achievement as well as professional growth. This foundational message both supports the implementation of research-based solutions and encourages action research among colleagues. Research Statement and Guiding Question As a largely unexplored topic, I planned to focus my action research in the general area of effective instructional strategies that impact all content. I aimed to take a closer look at the use of graphic organizers as metacognitive tools for deepening understanding and promoting higher level thinking. A large number of my students were consistently providing written responses that lacked substance and failed to reach higher levels of thinking. I had many questions about what the appropriate solutions could be and I saw this action research as an opportunity to investigate and attempt a workable remedy. The overall research goal was to try to increase literacy and science achievement (as demonstrated by increased comprehension and improved written response, predominantly with large amounts of new science information) by using “Thinking Maps®” (or TMs) in an effort to close current achievement gaps. This topic involved both an increase of skills for students - the need for independent/automatic skill use to communicate/show their thinking and the ability to apply strategies in cross-curricular ways, as well as their ability to internalize a new learning strategy process -incorporating specific, visual representations and patterns for thinking, utilizing a common language (Hyerle, 1996). My main goal was to incorporate new TMs strategies during a large instructional science unit, utilizing the process of journaling, interviews, disaggregated pre- and post-assessment data, anchor work samples, and similar artifacts. The projected affect of such research was three-fold: specifically, my students could gain improved ability to acquire and retain large amounts of information, making it both meaningful and relevant to them through cross-curricular strategies. The teachers (initially me and my grade level team)
  • 4. Thinking Maps® 4 could also gain in our ability to differentiate instruction for those who need a more structured approach to internalizing information and communicating higher level thought. Ultimately, my colleagues and the entire school community could benefit from shared information and increased student, instructional and programmatic successes. This was a beneficial challenge because my personal interest in brain-compatible instruction and metacognition served as another impetus for this research. Finally, the research was both timely and relevant due to the fact that sections of my school have been in various stages of the implementation of a new “Thinking Maps®” pilot program and have established a cross-grade level cohort for vertical alignment. As a desired outcome, I hoped for increased understanding and achievement for students, utilizing quick, readily-available scaffolding frames at an independent level in order to successfully accomplish three goals: (1) read/acquire to filter information, consistently identify main idea and provide supporting details, (2) organize to construct meaningful information, secure logical patterns and make emotional connections, and (3) write to demonstrate increased comprehension by showing quality responses, high level thinking and internalized deep understanding. The guiding question for my research continued to evolve and finally emerged with several facets: To what extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce a change in the quality and depth of written responses, and (2) to build scaffolding techniques, deepen understanding, and increase engagement? Review of Literature It was important for me to select parameters to guide my search for relevant literature as I began the ongoing process of researching “Thinking Maps®” (or TMs). As my topic became more and more narrowed, it naturally diverged into several themes: graphic organizers, engagement, high-order thinking, effective cross-disciplinary instructional strategies, depth/complexity of understanding, and metacognition. I quickly made use of the various online databases such as Google Scholar, and Education Full Text (Wilson’s Web), available from the university’s Auraria Library, to access electronic resources. In addition, I created general Internet searches of familiar professional sites I regularly use, such as ASCD and McRel, in addition to the “Thinking Maps®” website itself, as well as made use of professional texts from my own personal library. A quick search through each of their bibliographies allowed me to identify additional data sources that these other references had used. As I began to search for relevant literature to collect and review, I compiled a list of keyword search terms and tried multiple variations, such as: “thinking maps,” “graphic organizers + deep understanding,” “common visual language,” “non-linguistic representation,” and similar combinations involving maps or organizers used to solidify learning and increase comprehension. At first, it seemed like a treasure hunt as I searched through articles about “Thinking Maps®” (or TMs) and found references to
  • 5. Thinking Maps® 5 additional articles, studies and names of researchers. In particular, the “Thinking Maps®” website had devoted entire sections to the research conducted in support of their product, including links to the Thinking Foundation and Designs for Thinking websites. These sites contained various research formats including journal articles, case studies, and news articles, and I found excerpts and glimpses of classroom teacher research projects explained throughout the article texts. In order to produce relevant search results of the literature, I instituted various descriptors to serve as lenses or filters through which to pass my reference sources through. These filter criteria included: recently published references, results from actual studies, implementation information, elementary/middle school-preferred ranges, and a science-based focus (for example, brain research in education). These filters helped me to stay organized, categorize my research into subtopic themes, and evaluate potential references. The chart below provides a general overview of the types of collected literature which includes a variety of source formats such as professional journals, professional texts, and relevant websites. (For a complete listing of all literature sources, see References). Figure 1: Sample Literature Collected Texts/Authors Periodicals Websites Other Hyerle New Hampshire Journal of Ed www.mapthemind.com course website Medina Journal of Educational Psych. www.thinkingfoundation.org site cohort Marzano Teaching, Thinking & Creativity Costa Educational Leadership Robinson & Lai The Reading Teacher Jensen Metacognitive Learning Tomlinson I began to collect large amounts of information before I realized that most of it was related to the same researcher (Dr. David Hyerle, the founder of “Thinking Maps®”) and was limited to a specific time frame (late 1980’s and through the 1990s). I quickly became frustrated not being able to find many current (within the last few years) studies or publications that related to my topic. Although I utilized several database searches, I still came up with the same listings, from 10 to 15 years ago. Even though the brain research of the last decade shows links between understanding, non-linguistic representation and achievement (Marzano, Pickering & Pollack, 2001), I was unable to find any more recent studies and immediately wondered why, second-guessing my choice of research. These gaps of somewhat outdated literature, as well as some additional identified problems (such as many articles that applied to ELA or special education needs only) left me disappointed with the small scope of the research and I had to consider some sources to not be as valid and reliable as others. I have since assumed that some of the gaps (especially in Hyerle’s case) may have been due to renewed longitudinal studies since the earlier years of
  • 6. Thinking Maps® 6 research. Therefore, as far as the scope is concerned, certain literature may not be included because there seems to be very little (as far as research studies) conducted past 2001. Amid the articles and studies that I did happen to find, there were varying degrees of cohesiveness in the findings, and just as many areas of conflict. From the significant researchers, relevant ideas emerged and my beliefs were further derived and substantiated. As for a few key studies or articles of significance to my research, there were some similarities that related directly to my own questions, beginning with the foremost expert in the field, Dr. David Hyerle. Hyerle suggests that after being reinforced over several years (a main difference from my own study) students are able to “transfer multiple maps into each content area, becoming spontaneous in their ability to choose and use maps for whatever content information and concepts they are learning” (Hyerle & Williams, 2009). Although my students had only about four weeks of “intensive” map training, they too could choose from multiple maps. (Although the breakdown here for my students was that they were not always used correctly for the corresponding skill). Like Hyerle’s recent study, some of my students were also able to identify the correct thinking process and select the appropriate map and use the obvious cognition pattern. In a path that diverges from Hyerle’s study, I found that my students would all start with the common language and graphic, yet how they completed the information varied greatly among students. I also found design strategies among the literature that were appealing to me and similar those I chose for my study. Stull and Mayer’s (2007) study, although far more complex than my own, had students in one portion generate their own graphic organizers after being frontloaded in how to generate various types of visual tools, such as hierarchical, lists, flowcharts and matrices. Both their study and my own had the instructor first modeling types of graphic organizers. In contrast however, I found conflicting analyses and conclusions in the Stull and Mayer (2007) piece, where they found that there was little to no evidence that students constructing their own graphic organizers achieved any more than students who received author-provided organizers. Although my students all received the same “blank” organizer, they differed on their responses within. There were many discrepancies in the results of studies and the conclusions of articles as to whether or not the visual patterns (and I interpreted “Thinking Maps®” here) resulted in extraneous processing or generative processing, and the evidence for deeper learning was contradictory. I question here whether this effects “Thinking Maps®” (or TMs) and do all the various types of graphic organizers interfere with a student’s cognitive process as they’re selecting and using the maps – do they use too many versions of graphic organizers and waste time/cognitive processes independently deciphering which is the best tool to use? In addition I found very little from which to model the design of my own research among the articles and studies. Many of the earlier studies had hundreds of participants across multiple schools or districts, whereas I am limited to my own class of 24 students; although I did find a few useable
  • 7. Thinking Maps® 7 similarities in some of the shared vignettes from classroom to classroom that were mentioned in some of the articles and research studies (Leary, 1999; Hyerle, Curtis & Alpert, 2004). TMs are advanced graphic tools used to secure brain pathways for learning, and metacognitive tools used to increase awareness of cognitive cues and help to deepen student understanding. One other aspect that concerned me (and confused me) was in the contradiction of some of the evidence in some studies (Merkley & Jeffries, 2000/2001; Stull & Mayer, 2007; and Ritchhart, Turner & Hadar, 2009) that showed that general graphic organizers (not mentioning TMs specifically) may or may not increase student comprehension or deepen learning. Following recent brain research, it was stated that vision dominates all other senses in the brain, and the more elaboration, the better the coding and hence better storing and retrieval (Medina, 2008), which seems to make an argument in favor of visual tools like graphic organizers. Although Medina also states that it is through pictures - not through written or spoken words – that the brain makes the most lasting connections, that leads me to wonder why the TMs program doesn’t make mention of incorporating sketches and pictures in place of some words within the various map structures? In summary of the literature, although there definitely were more articles than any recent studies that pertained to my research topic, I feel as though I semi-exhausted my search across the related fields of cognitive science, effective instructional practice, and brain research, and it did allow me to form a more complete research picture, especially as I began to see a saturation of the same researchers mentioned again and again. Some further thoughts or questions that were prompted by my literature review would have to include whether or not my choice of study still represents a valid instructional practice? The methodology and research presented by the TMs website itself seems relevant and robust, yet I am still not clear as to why there is not more current research out there? Perhaps I have stumbled upon a gap in the literature as it seems as though there should be more studies that explore variations of the TMs strategies or, at the very least, further challenge the relevance of this type of tool as it compares with other brain research. Relevant Background Knowledge TMs were designed to integrate content learning with thinking process instruction across disciplines and be utilized as a “common visual language” for learning (Hyerle & Yeager, 2007). Learners are expected to identify 8 fundamental thinking skills and link them to dynamic visual representations to create specific pathways for “thinking about their thinking”. Maps are designed to be used in combination to increase depth and complexity, and as a developmental process, complexity in use increases overtime (Hyerle, Curtis & Alpert, 2004). “Frames of reference” surround each map to provide connections to sources, reflection and ownership. The maps are consistent, integrative, and flexible for various types of
  • 8. Thinking Maps® 8 implementation. Each map corresponds with a specific thinking process designed to activate and build schema: Figure 2: “Thinking Maps®” and 8 Corresponding Cognitive Skills “Thinking Maps®” Cognitive Skill “circle map” defines in context; presents point of view “bubble map” describes sensory, emotional, logical qualities “double bubble map” compares & contrasts qualities “tree map” shows relationships between main idea & supporting detail “flow map” relates events as a sequence “multi-flow map” indicates cause & effect; helps predict outcomes “brace map” deconstructs physical structures & part-to-whole relationships “bridge map” transfers or forms analogies (See Appendix D for actual map designs) Expository text structure is often difficult to digest for intended meaning, complex concepts, specialized vocabulary and inferred relationships such as cause & effect, compare/contrast, sequences or cycles (McCormick, 1995). TMs enable learners to assess prior knowledge, respond to complex inquiry and, working independently or collaboratively, draw out essential knowledge, identify critical information with supporting details, and make inferences and find connections to the text. TMs have been purposefully organized to align with current brain research to enable movement from concrete to abstract thinking with greater depth and to directly apply thinking to complex tasks. Meaningful context links to emotional and metacognitive “frames of reference” and ownership, addressing essential questions, prior knowledge, primary sources, point of view and other influences. Over the last decade, brain research shows that 80% of all information that comes into the brain is visual (Jensen, 1998) and the visual patterns of TMs help learners create concrete images from abstract thought. The brain is a natural pattern detector and teachers need to provide students with experiences that enable them to perceive patterns and make connections (Caine & Caine, 1991). It has been shown that “explicitly engaging students in the creation of nonlinguistic representations stimulates and increases activity in the brain” (Gerlic & Jausovec, 1999) and unlike earlier forms of concept maps (Novak & Gowin, 1984) and other types of random or ready-made graphic organizers that do not fully suffice, TMs encourage learners to become independent thinkers due to the ownership and connections made in their completion. In further support, the dual-coding theory (Paivio, 1986) states that simultaneous use of pictures and words enhance the brain’s ability to organize information for subsequent retrieval and use. Research proves that the more learners use both linguistic and nonlinguistic representations the better their thinking and recall (Marzano, Pickering & Pollack, 2001). TMs align nicely and integrate into practice with many
  • 9. Thinking Maps® 9 of Marzano’s 9 strategies most likely to increase achievement, among them: identifying similarities & differences, summarization & note-taking, cooperative learning, cues, questions and advanced organizers. Class time spent using TMs increases engagement with specialized thinking skills, and it is suggested that students should spend 60-80% of class time engaged in “process, discussion, group work, self-assessment, journal writing, feedback, mapping” (Tomlinson & Kalbfleisch, 1998) with such differentiation evident in the strategic components of TMs. Methodology/Research Design: The purpose of the study was to determine the effectiveness of “Thinking Maps®” (or TMs) strategies for use in science instruction, and whether the maps could produce a significant change in the quality and depth of responses from students, many of whom were reluctant to provide more than surface answers and who generally resisted higher level thought. The guiding research question was: To what extent are “Thinking Maps®” effective tools and strategies: (1) to significantly produce a change in the quality and depth of written responses, and (2) to build scaffolding techniques, deepen understanding, and increase engagement? In order to best serve my research into TMs, one small way in which I employed aspects of problem-based methodology in using both qualitative and quantitative research methodology and design traditions, was in exploring some limited use of the “Theory of Action” techniques (Robinson & Lai, 2006) in exploration of constraints, actions and consequences which will be addressed later in this study. I began by asking questions about the assumptions and beliefs I held about my classroom and my concepts of student response quality. Students need a certain type of response to demonstrate understanding and I believe I had been a good judge of the depth of their understanding, asking them to “write/draw it in another way” or to “tell me more” to elicit the best possible response. From here I needed to further investigate what I do in my practice and why I do it when assessing student response quality. The main type of data analysis I employed however, most resembles that of Wolcott’s Three Stages of Analysis (Wolcott, 1994), where different aspects of analysis based on qualitative data collection inform each other to incorporate (1) descriptive data from original notes, questionnaires and pre-/post- test experiences, (2) analyses based on how the descriptive data initially relates to the research question in an exploratory manner, looking for patterns and themes, identifying key factors and relationships, and (3) my interpretations to make sense of findings and the conclusions based on the extent to which they pose meaningful, relevant and compelling answers to the research question. Quantitative data collection appeared in the formulation of results tables to quantify through tallies the number of occurrences of specific written response types, frequency of TMs used, or data points for behaviors.
Site Selection

I selected my own fifth grade classroom as the site for this study, as I believed this would help me achieve the maximum impact and benefit, thus making the experience as meaningful and personally relevant as possible. The classroom is one of four adjacent fifth grade classes, with 24 students (11 boys and 13 girls) and a demographic composition that closely reflects that of the entire school population (see Abstract). The science lessons take place predominantly in the afternoon block, being the last 45-minute class of the day. The students are situated in table groups of 4 to 5 students per group, and a typical day consists of a quick preview of new material (or a review of the previous day's lessons), followed by either a group exploration activity or experiment to "kick off" the inquiry. Time is then usually spent in direct instruction through interactive SmartBoard lessons, class discussion, group reading or data mining, collaborative or independent work production, and finally reflection. Occasionally, due to schedule changes or anticipated interruptions of the instructional time, I would switch (or add) science instruction time into the morning block, with each day usually starting out with a pre-reading activity or TMs preparatory work as part of a morning menu.

Role

My role was generally that of "facilitator", where I would introduce content, model and suggest strategies, employ questioning techniques, elicit responses and reflection, and monitor collaboration. My goal for consistency was to develop an instructional routine to follow closely, in order to keep the experience familiar and consistent for the students and to give me a set order for making observational notations and focusing specifically on student response quality and incidents of engagement. The classroom climate needed to be one where thinking is valued, visible, and actively modeled (Ritchhart, Turner & Hadar, 2009). Just prior to the beginning of the first data collection cycle, we held an initial class meeting to discuss the action research project the students would be a vital part of: using TMs to improve content learning and assess response quality. I answered any questions and addressed any concerns they had (the only "fear" that needed to be alleviated concerned grading: they would not be graded on their questionnaire responses, only on the regular unit quizzes and tests). In describing my relationship with the students, I tried to maintain a positive rapport and communication style, rigorous expectations, and a casual teaching style that incorporated a flexible pace as needed and interjected humor to engage. I utilized a school-wide positive reinforcement behavior management system ("Mountain of Trust", earning their way "up the mountain" using silent and visual cues employed without a break in instructional momentum) with predictable rewards and consequences. My students were comfortable with me, and that climate allowed them to take risks in this group endeavor when asked and, at times, independently.
Purposeful Sampling Strategies

When it came to the selection of individual respondents, I made the decision to sample all 24 of the students in my class. As an intermediate grade level teacher, it was inconceivable to run any sort of "control group" and "variable group" when I am ethically bound to provide consistent instruction for all. Instead, I would make general comparisons of my class, as TM beginners, to other classes with more or less TM training. Although my grouping produced a relatively small data set, it was one that was manageable for me. For my interview subjects, I first selected one of my grade level colleagues who was participating in the tandem TMs pilot cohort on site, as well as an instructional coach/trainer who was overseeing the cohort. Additional questions and probes were asked of other team members who were not part of the pilot. The selection of activities to observe included taking anecdotal field notes on random student behaviors, attitudes, engagement, and expressions. I rotated around the classroom listening in on table collaborations and cooperative learning routines. Other activities included observing TM use during class and for assessments. I was looking for confident and appropriate use of TMs as well as an improvement in overall response quality. As for the selection of documents to analyze, I looked at the use of thinking strategies, TM completion, and response quality on student work products such as class and individual TMs, science notebooks, test questions, and research questionnaires, and in my own field notes. I created a mix of closed questions with a range of pre-determined responses and open-ended opportunities that were non-threatening and would allow for rich and detailed responses (Ritchhart, Turner & Hadar, 2009).

Data Collection Strategies

In order to remain consistent with the traditions of the selected research designs, data collection tools were designed purposefully to elicit clear pictures of student thought processes, to more accurately determine content knowledge and assess response quality. I established a balance of different artifacts to collect (Robinson & Lai, 2006), which included observations of students recorded in field notes, questionnaires for students to explain their use of strategies or to rate difficulty, examination of student work products (actual TMs), and interviews with colleagues. Data was obtained through observation techniques, note taking, disaggregating student responses on questionnaires, pre-tests, and post-tests, and comparisons of student TMs to anchor maps. I was intrigued by the notion that students learn more deeply when they construct their own organizers, or "learn by doing", than when the organizers are provided for them, or "learn by viewing" (Stull & Mayer, 2007). I knew I had to model the initial use of each type of TM while balancing the need to allow for learner-generated variance. I established a collection cycle and kept it consistent for each subtopic within the unit: pre-test, map introduction and use ("circle map", followed by "tree map" or "bubble map" or "brace map" as needed, and culminating with a "flow map"), and finally
post-test. I would rotate from group to group, observing students over the shoulder, having students verbally explain their strategies to me, or observing from a clear vantage point during an assessment. Interviews of colleagues were conducted largely in an unstructured conversational style, and I designed questions so as not to be leading in any way. I planned for the collection of actual data by recording observations, thoughts, and process notes in a spiral notebook. Throughout the lessons and assessments I would collect data in an informal way (sticky notes or clipboard), tally according to criteria (for example, the number of times students looked up vaguely, trying to remember, or the number of times they scanned the room for visual cues), and formally transcribe anecdotal notations after class.

Assurances of Confidentiality

In order to ensure confidentiality throughout the process, assurances were made and designed into the research methodology. Verbal consent of colleagues was obtained, and informed consent of the students was implied as I spoke with them at length about the process and their role in it. Parental consent was not necessary, as this was part of the daily routines of class and no identifying elements would present a conflict. I concentrated on the student group as a whole, with no individual identifying comments in field notes (other than pre-determined codes to safeguard identity), and I took care not to mention any identifying factors of specific individuals in class conversations, interviews with colleagues, or in the writing of this document.

Data Analysis Strategies

I endeavored to employ techniques consistent with traditional qualitative and quantitative methods, although my approach was not a particularly linear process; rather, it remained flexible amid the complexities of the teaching and learning day. I looked for obvious shifts in the classroom as far as behavior, engagement, strategy use, and quality of response, and I also looked to evaluate changes in student conceptions of thinking. The literature provided me with guidance in the form of 3 components for successful and creative data analysis: (1) patience - reminding myself that it was my first time with action research, as well as a first for my students, (2) mistakes - maintaining a willingness to take risks and an acceptance that mistakes would be made, and (3) playfulness - remembering to have fun with the process and with my students as we investigated TMs together (Hubbard & Power, 2003). In each of the data collection tools I attempted to examine the frequency of specific question responses and to tally instances of successful use, marginal attempts at use, or non-use of various TMs. Some of the participant coding makes reference to both gender and ethnicity data (this particular use of demographics is something that my district and school consistently utilize in order to focus on equity issues, and it serves as a reminder to always be cognizant of potential equity constraints in
instructional practices). The coding indicated: B (boy), G (girl), 4/3 (ethnicity code for "Hispanic/Black"), and 5 (ethnicity code for "White"), as there are no other ethnicity groups represented in this participant pool. I also made an attempt to code emotions (such as "easy, fun, hard, or confusing") in my field notes and to look for the emergence of new types of responses. I also came to the realization that, after many years of experimenting with tracking student responses and behavior, it might be difficult to analyze the "subtle nuances of change" and to find different ways to code beyond what had been intuitive thus far as simple anecdotal records (Hubbard & Power, 2003). Other techniques of verification included transcription after the fact into usable data, looking for recurring patterns to identify, as well as the use of triangulation to observe from several different angles (such as student interviews, observations, and pre-tests/post-tests) to see whether they all independently produce the same conclusions (Robinson & Lai, 2006). The additional use of colleague data helped to round out the questionnaires, observations, and test data to confirm stronger triangulation within data collection.

Presentation of Findings: Results, Implications and Significance

When looking at the data through the lens of "Theory of Action" (Robinson & Lai, 2006), making sense of the constraints meant identifying the factors that led to my use of TMs: the program was being piloted at my school with a pilot cohort on site, and one of my fifth-grade team members was actively using TMs in our classes. In my class there was varying ability to read and write independently and a wide variance in the depth of learning and quality of written responses for some individuals, so there were many diverse learning styles and needs within the same group. The TM authors recommend at least an 8-week introduction period followed by a year-long process of working with TMs until they start to become intuitive for learners. My own 3-4 week study pales in comparison, and the limited exposure to TMs (except in flexible grade-level groupings for math and reading, where students may have been exposed to more TM training with the pilot teacher, or from other grade level team teachers attempting to emulate without the training) may have produced inequities, with TM instruction being repetition for some and entirely new for others. In addition to unavoidable absences, the majority of instruction occurred at the end of the day, which may have produced shortened attention spans, frequent interruptions (PA announcements, students leaving school for appointments), and the like. One other possible constraint was determining whether it was the presentation of new material ("human body") or new map use (or both) that made the difference in results. In an attempt to satisfy these constraints on the problem, various actions through instruction were used to narrow to a range of solutions (Robinson & Lai, 2006). I worked to keep students focused (using SmartBoard strategies designed to engage) and created predictable routines for each lesson. I implemented alternate activities to increase attention spans (integrated movement; varied use of text, video,
SmartBoard, discussion, and cooperative groupings for pair-shares and jigsaw activities). I also tried to transition the students in their use of TMs by starting out with the pre-printed blank TMs at first and gradually releasing them to use their own self-drawn maps. TMs by their design are meant to allow for differentiation, and the needs of learners with varying abilities were met through the flexible nature of the maps. I attempted to prevent lesson interruption by occasionally rearranging the day so that the science TMs lesson could be earlier, to avoid afternoon disruption. I also maintained constant communication with other TM teachers to aim for consistency in instruction, modeling, and the rate of exposure to TMs. The subsequent consequences of these instructional actions produced a more focused and engaged climate with an apparent increase in student interest (but again, was that the TMs or the subject matter, or both?), and both intended and unintended consequences emerged (Robinson & Lai, 2006). Intended consequences included positive reactions to the TMs demonstrations: students were more motivated to get their own maps written, were more productive in written responses, and the collaborative groups seemed to increase the sense of community while working with new maps and sharing connections. I made sure to encourage different types of color-coding on the maps, and as a result, student connections to references and schema were more effective in terms of the depth to which they could transfer their learning during class. Other expected results included the finding that TMs were most effective when used in combination for students to fully develop their conclusions (Hyerle & Yeager, 2007). The most precise and detailed responses came when students used two or more maps to explain their thinking. Also intended was my increased ability to use TMs to assess students' development of concepts and misconceptions (Novak & Gowin, 1984). It was far easier to spot patterns and connections in thinking, as well as the glaring gaps from misconceptions or map misuse. It was anticipated that students would achieve the ability to organize their ideas independently, assess the quality and quantity of their own written responses, and possess increased awareness of their own thinking and an increased motivation to learn. In conversations with students, it was made clear that they thought these strategies helped them as they read through text, watched videos, and participated in class discussions. Twenty percent of the class even indicated that the use of TMs helped them more than the traditional study guides when preparing for the test. There were, however, unintended consequences that I did not anticipate. Although I knew that those students who generally do well would continue to do well (and the data supported this), I assumed that all of those who did not generally do well would make some growth. Instead, I found that only a small group began to improve, and for some there was no improvement. For some inexplicable reason, there was also a reversal in performance about midway through, and students started to use TMs less and less effectively. I relied on the assumption that the maps would begin to make a permanent change in the written response quality of most students, and during direct instruction or independent class time, they did. However, on assessments it appears that perhaps the "novelty" wore off, because TMs were utilized less and
less across the board. My only thought is that the constraints of available time and length of data collection could not be altered; therefore, it may still be too early to tell whether any significant growth is possible without continuing the TM instruction and monitoring again toward the end of the school year.

The overall presentation of the data is organized both by type of data tool and by chronological stage of data collection. By examining the data closely I was able to comment and reflect on different aspects, which appear either below the data sets or further on in the discussion section. As with all the data presented, absences occurred on a daily basis, so there are few data sets showing results for all the respondents at any one time. For "Questionnaire #1", I thought it was important to establish a baseline of prior knowledge about the specific content to be studied (the human body) and of the various thinking strategies and tools used prior to the introduction of any TMs in my science class.

Figure 3: Student Questionnaire #1: Prior Knowledge

Q-A: How much do you know about the human body before we start our unit?
  Participant code | "a lot" | "some" | "a little" | "nothing"
  B 4/3            |    0    |    2   |     0      |    0
  B 5              |    1    |    2   |     2      |    0
  G 4/3            |    0    |    0   |     6      |    0
  G 5              |    0    |    4   |     0      |    1

Q-B: What strategies and tools do you use most to help you learn information?*
  Participant code | "study guide" | "books" | "notes" | "TMs"
  B 4/3            |       0       |    0    |    2    |   1
  B 5              |       1       |    2    |    1    |   0
  G 4/3            |       0       |    1    |    3    |   2
  G 5              |       1       |    1    |    1    |   1

*Other: additional responses included mnemonics, cheers/chants, flash cards, games, videos, online support, group study, memorization…
(See Appendix A for actual questionnaire)

Most of the respondents (about 88%) had "some" or "a little" prior knowledge about the human body content, which I thought would make for an easier transition into using TMs to organize knowledge and remember terms, facts, and processes. There were 4 students (18%) who already used TMs to varying degrees as a learning strategy or tool in other classes, and the majority (59%) used some traditional form of study strategy. A smaller percentage (22%) used additional strategies that they delineated, but the data shows that all students were comfortable using strategies of some kind. I anticipated that even though the number of participants previously familiar with TMs was small, this number would grow with the instruction in each sub-unit, or body system. As for the ethnicity data, there are few if any implications here that show variance between ethnic groups, and I was not sure going forward whether any significant patterns would be revealed. (One of the initial constraints I noticed was the issue of participant absences, which made for a varying number of responses every time I collected data. Because the absences were attributed to different individuals each time, I was not sure how to address the validity of data collection,
other than to continue to be consistent for all who were present. If I were to try to "make up" the data when a student returned, it might skew the overall data because it was collected in different settings, and if I were to ignore the absent individuals, then the total number of participants would change with each collection tool. I chose to try a little of each remedy, although I am not sure that I can disaggregate the effect from the overall data.)

Throughout each sub-unit, pre-tests were given to assess prior knowledge. At no time (especially in the later pre-tests) did students independently utilize TMs to share their thinking. I assumed that they would gradually start to appear, but not once in a pre-test situation was a TM used by a student. During instruction, TMs were utilized by all students to their fullest capacity, since I began by modeling with the whole class emulating, and then wandered as they worked, looking for completeness. When it came time to assess independent use, the next form of data collection was the post-tests following each sub-unit, where it was anticipated that TMs would appear in student responses. Both the pre-test and post-test asked an identical question: whether the student could describe the process for that specific body system (such as the process of digestion, the process of breathing, the process of blood circulation, the process of the brain sending a message, etc.). Most pre-tests were answered in the same manner (short written responses to the question, or "I don't know"), but after content instruction and TMs modeling of the "flow map", the post-tests began to show some TM use with varying degrees of success:

Figure 4: Post-Test Results (columns: Successful TM Use | Marginal TM Use | Non-TM Use)

Digestive system:
  B 4/3 | 1 | 1 | 0
  B 5   | 4 | 5 | 0
  G 4/3 | 4 | 3 | 0
  G 5   | 1 | 5 | 0

Respiratory system:
  B 4/3 | 0 | 2 | 0
  B 5   | 2 | 5 | 1
  G 4/3 | 2 | 6 | 0
  G 5   | 2 | 0 | 3

Circulatory system:
  B 4/3 | 0 | 3 | 0
  B 5   | 1 | 3 | 3
  G 4/3 | 4 | 2 | 2
  G 5   | 1 | 3 | 0

Nervous system:
  B 4/3 | 0 | 2 | 3
  B 5   | 0 | 1 | 4
  G 4/3 | 0 | 4 | 1
  G 5   | 1 | 2 | 2
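Purely as an editorial illustration (not an analysis performed in the study), the tallies in Figure 4 can be rolled up into per-system usage rates with a short script. The numbers below are copied from the Digestive and Nervous blocks above; the helper name summarize is my own and carries no special meaning.

    # Figure 4 tallies as (successful, marginal, non-use) per participant group.
    # Only two of the four body systems are included here for brevity.
    post_tests = {
        "Digestive": {"B 4/3": (1, 1, 0), "B 5": (4, 5, 0), "G 4/3": (4, 3, 0), "G 5": (1, 5, 0)},
        "Nervous":   {"B 4/3": (0, 2, 3), "B 5": (0, 1, 4), "G 4/3": (0, 4, 1), "G 5": (1, 2, 2)},
    }

    def summarize(system_tallies):
        """Return the share of post-test responses in each category for one body system."""
        successful = sum(t[0] for t in system_tallies.values())
        marginal = sum(t[1] for t in system_tallies.values())
        non_use = sum(t[2] for t in system_tallies.values())
        total = successful + marginal + non_use
        return {"successful": successful / total,
                "marginal": marginal / total,
                "non-use": non_use / total}

    for system, tallies in post_tests.items():
        shares = summarize(tallies)
        print(system, {label: f"{value:.0%}" for label, value in shares.items()})

Rolled up this way, the first sub-unit taught (Digestive) shows a much larger share of successful and marginal TM use than the last one shown (Nervous), which is consistent with the decline discussed below.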
The initial pre-test responses were for the most part the same: minimal answers with little terminology or content knowledge, and no use of TMs, even after the students had started to receive instruction in their use. Targeted TMs were introduced (such as the "flow map" for sequences of events/processes) and utilized throughout instruction, and when the same question was posed on the post-test, my expectation was that the students would use flow maps on the assessment to explain the specific body function process. The data shows TM usage with varying degrees of success over time: the digestive system was the first sub-unit introduced, with the nervous system being one of the last in the 4-week time frame. Some participants successfully used the correct TMs with the correct terminology at each body system stage. Others attempted to use the flow map, but did not provide the correct information in each stage of the process. Still others made no attempt at using TMs whatsoever, reverting to short written responses or various other diagrams/graphic organizers such as Venn diagrams, cycles, and labeled models. Subsequent body systems (skeletal and muscular) did not make use of the same type of flow map for processes, and a different set of TMs was utilized. Data was not collected in the same manner for these final systems.

Once the TM instruction and modeling had been introduced, about midway through the data collection, a questionnaire was given to students to see if they could identify the specific thinking processes and match them to the corresponding maps:

Figure 5: Student Questionnaire #2: Identify Intended Thinking Strategy

Q: Match each TM to its intended thinking strategy (that is, why do we use it)?
(cell values = number of students in each participant group scoring that many correct, of 6)
  Participant code |  0 | 1 | 2 | 3 | 4 | 5 | 6
  B 4/3            |  0 | 0 | 1 | 0 | 0 | 0 | 1
  B 5              |  0 | 2 | 1 | 2 | 2 | 0 | 1
  G 4/3            |  0 | 0 | 2 | 1 | 3 | 0 | 0
  G 5              |  0 | 0 | 1 | 3 | 2 | 0 | 0
(See Appendix D for actual questionnaire)

Most of the students ranged from 2 to 4 correct matches, with only about 41% passing acceptably. It was still only midway through an admittedly short research period, so there was hope that with more practice an increasing number of students would be more successful in the future.

Yet another area within the data where unintended results occurred was the human body "final test". It was designed by one of the other grade level team members and administered to the entire grade level, including my participants (and ultimately it was not always consistent in language with my earlier post-tests). Students were asked on 3 occasions to describe body function "processes", all 3 of which could have been suited to TM responses if the student was aware of the cues (the test used similar language, like "describe how __ works", "describe how ___ occurs", and, in one instance very consistent with a pre-test and class work, "compare and contrast ____"; in addition, there was a bonus section where students were asked to "describe" additional facts not previously required). Even though the questions
were worded in a slightly different way, it was still my hope that my students would see the key words, make the connections, and utilize the appropriate TMs:

Figure 6: Final Test Questions (that included key word descriptions of specific processes)
(cell values show TM use as successful/marginal for each question; the last column counts students making no TM attempt)
  Participant code | Arteries & Veins | Breathing | Muscle Pairs | "Extra credit facts" | No TM attempt
  B 4/3            |       0/1        |    0/1    |     0/0      |         0/0          |      1
  B 5              |       1/1        |    0/2    |     0/0      |         0/0          |      5
  G 4/3            |       1/2        |    0/0    |     0/0      |         0/0          |      3
  G 5              |       2/0        |    0/0    |     0/0      |         0/0          |      4

Students seemed to utilize TMs much more fully during class, and independently, than they did on the final assessment. Perhaps this was because I purposely did not mention TMs as a specific option for test responses; I wanted to see whether they would transfer the use of TMs independently, without my prompting. The number of students who were successful with their maps on these questions (only about 16%) was disappointing, but even more alarming was the number of students who chose not to use a TM at all (about 54%) on test responses, even within the "low pressure" extra credit section. The "muscle pairs" question also drew no attempts at TM usage, for some unknown reason, even though we distinctly used TMs in class to discuss that relationship.

At the end of all the TM instruction and modeling, and after the final assessment near the end of the data collection, a third questionnaire was given to students to see if they could make connections with the key words and phrases associated with each thinking process and then draw the corresponding map:

Figure 7: Student Questionnaire #3: Connect with key words and draw TM

Q: Connect the key words and phrases with each thinking process and draw the corresponding TM.
(cell values = number of students in each participant group scoring that many correct, of 6)
  Participant code |  0 | 1 | 2 | 3 | 4 | 5 | 6
  B 4/3            |  0 | 1 | 0 | 0 | 0 | 1 | 0
  B 5              |  0 | 0 | 2 | 1 | 3 | 0 | 1
  G 4/3            |  0 | 1 | 0 | 2 | 0 | 0 | 2
  G 5              |  0 | 1 | 1 | 2 | 0 | 0 | 1
(See Appendix E for actual questionnaire)

Most of the students once again ranged from 2 to 4 correct responses, and, similarly to the second questionnaire, only about 42% passed within acceptable margins. Although a few had moved over into the 4, 5, or 6 correct columns, even more had moved in the opposite direction, showing more incorrect responses overall. I was curious how this result might have related to the frequency of use of each type of map over the course of the data collection. I tallied the frequency of each map as it was used throughout the sub-units, and then I determined the percentage of students who correctly connected key words with drawings, to see if there were any correlations:
Figure 8: Frequency of TM use vs. success rate on Questionnaire #3*
  Type of map          | Frequency of use across sub-units | % correct responses
  "circle map"         |                 7                 |        70%
  "tree map"           |                 3                 |        39%
  "double bubble map"  |                 1                 |        48%
  "flow map"           |                 5                 |        57%
  "cause & effect map" |                 1                 |        39%
  "brace map"          |                 1                 |        30%
* Not all TM types were used during this data collection period.

The "circle map" was the most widely used, and this showed in the number of correct responses. The "flow map" and "tree map" were the next most used maps, yet their numbers of correct responses do not seem to correlate as cleanly. Maybe it had to do with the amount of exposure time, or perhaps my sample is too small?
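As an editorial aside (not an analysis performed in the study), the question of whether exposure tracks with success can be checked informally by computing a correlation over the six points in Figure 8. The values are copied from that table; with so few points the coefficient is suggestive at best.

    # (frequency of use, % correct) pairs taken from Figure 8.
    data = {
        "circle map": (7, 70),
        "tree map": (3, 39),
        "double bubble map": (1, 48),
        "flow map": (5, 57),
        "cause & effect map": (1, 39),
        "brace map": (1, 30),
    }

    xs = [freq for freq, _ in data.values()]
    ys = [pct for _, pct in data.values()]
    n = len(xs)
    mean_x = sum(xs) / n
    mean_y = sum(ys) / n

    # Pearson correlation coefficient, computed by hand so no external libraries are needed.
    cov = sum((x - mean_x) * (y - mean_y) for x, y in zip(xs, ys))
    spread = (sum((x - mean_x) ** 2 for x in xs) * sum((y - mean_y) ** 2 for y in ys)) ** 0.5
    r = cov / spread

    print(f"Pearson r across {n} map types: {r:.2f}")  # comes out around 0.87 for these values

On these six points the coefficient is strongly positive, which is consistent with the exposure-time explanation, but a sample this small cannot settle the question.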
The remainder of the data was collected in the form of observation notes on student engagement and depth of knowledge, and colleague interview responses (see Appendices F & G). General observations included that, after four weeks of TM introduction, some students (about 25%) began to speak about content in a more confident manner and often referred back to a TM when making a point and attempting to clarify their thoughts. More students (about 30%) were able to discuss concepts in great detail using TMs with little prompting from me. I observed a limited increase in written response depth and quality (about 20%), but a much larger increase in class engagement (about 60%). The data collected from colleague input was more revealing, however. All (100%) of my colleagues found the use of TMs, especially the common language they created, to be extremely helpful. Vertical conversations including teachers and administration revealed that students were largely on task and participated more when able to offer the different perspectives that the TMs encourage. The use of TMs also enabled the students in their classrooms to be more independent and concise, and several expressed their amazement at what their students could do with the maps. Most (about 80%) agreed that more teacher professional training was necessary moving forward.

Emerging Themes/Results:

As a result of all the data, I was surprised to see that the strongest data (and participation in TM usage) came at the very beginning and declined with every data collection that followed. Perhaps most curious for me is that the data appears to show a decrease in consistent and independent TM use; instead of getting stronger with TM use, the students were becoming less and less consistent with the proper TMs, not independently choosing those graphic organizers but reverting to other, more general forms (or poorly constructed written responses). Not at all what I had expected! In addition, some participants appeared to have difficulty remembering which TM belongs with which thinking/process skill, and I am concerned that this might present a "double load" and be just as hard as learning the content material itself, presenting further areas of confusion. As an initial response to my research questions, the first use of TMs produced more depth of responses and more engagement in the TMs process. As each sub-unit progressed, the participants utilized TMs in their notes, class discussions, and post-tests, but with decreasing degrees of accuracy. There seemed to be a small number of students who increased their understanding, response levels, and engagement, but they were not the ones who most needed to in terms of skill progression. I am still trying to sort out the reasons why this may have occurred.

Some other constraints I had not thought to explore emerged as learning-style constraints that TMs seemed to produce, TMs being predominantly visual and systematic organizers (with some tactile qualities of writing or drawing, and auditory qualities when paired with "verbal rehearsal" of their contents). Looking at the data, I realize that I should have allowed students the opportunity to "think aloud" and verbalize the process; some could have explained their thinking better that way, and it would likely have produced different results for some students. In wanting to remain consistent in how I modeled the use of TMs and the expectations for assessment responses, I overlooked meeting the needs of other learning styles. Another potential barrier to clear and consistent data could have been that this participant group may have had varying degrees of previous "response instruction" (modeled by different team teachers in flexible groupings for reading and math, in addition to a different graphic organizer system used in a school-wide writing program), as well as different amounts of previous exposure to TM use, and these instances may have caused some confusion when students were confronted with the TMs approach.

My overall findings appear to parallel those in a sample study (Burden & Silver, 2006) in which teachers were convinced that students were capable of achieving more than they were demonstrating, intended to raise expectations, and hoped to realize an increased number of students becoming more active learners with TMs. Although the results did not quite meet their assumptions, and the researchers noted that the TMs needed extensive further piloting, they came to the conclusion that students were just starting to show signs that the TM tools helped guide thinking and problem-solving, and that there were many different uses still ahead for multiple maps implementation. I too agree that there are early signs of habit-forming and that TMs are beneficial, having connections to several related clusters of Costa's "Habits of Mind" (Hyerle, 2008), such as "attending to tasks that require patience & analytical, detailed thinking". Over time, students should develop fluency with each map and become more able to transfer skills from multiple maps into each content area, having "the flexibility to choose and use maps for a variety of purposes" (Hyerle & Williams, 2009).

Discussion and Suggestions for Further Research

From the perspective of my action research, I have learned that TMs are really powerful tools when modeled and used in appropriate ways. Consistent use (and especially their use across disciplines) provides the ability to make connections to one's learning in more depth. They offer a way to express
thoughts and personalize learning by utilizing the map features (frames of reference and color coding) to differentiate one's learning and to offer structure to note-taking and the absorption of new content. Some realizations that I had as a result of this action research were that I have barely scratched the surface with my implementation of TMs, and although I learned about these TM strategies "second-hand" from a team member who was part of the actual pilot group, I realize that there are more deliberate skills and strategies that I need to learn in order to have the maximum effect on my students. As I made observations of my students, I noticed some emerging trends: many students still required constant "nudging" to encourage them to utilize the TMs to their fullest potential, and the most difficult instructional glitch was getting some of the students to transition to independence in their use of TMs. In seeking answers to my research questions, I have come to appreciate, and need to reinforce, the notion that only through extended exposure over time and consistent practice will TM use become intuitive for the students (and teachers)! Upon reflection on my research technique, I found the use of field notes in this way to be more powerful than other types of anecdotal record keeping that I have attempted in the past. I have gained increased reflection and metacognition regarding my own instructional practices and an increased use of strategies on my part, since TMs have slowly become more intuitive with use.

As for the future, I will continue to utilize TMs in my classroom across all subject disciplines. I will also begin the use of portfolios to monitor learners as they assimilate new knowledge and develop incremental skill using TMs, making note of patterns as they emerge. This research has warranted continued exploration of my practice: actively inviting explicit thinking, integrating more TMs into the curriculum, and monitoring the effects of TMs on scores. The way to increase long-term memory (and improve student achievement) is to incorporate new information gradually and repeat it at regular intervals to "mind the gap" while attracting and holding attention (Medina, 2008), and this increased exposure to TM usage should solidify these metacognitive strategies for students to take with them in future years. I have since participated in a school-wide training on TMs and reviewed a culminating report of the first pilot group's findings. I will be part of a summer training conducted by the first pilot group to review "lessons learned" and share in how to implement TMs successfully in the classroom. I have also recently been invited to participate in the second round of the TMs pilot program in the fall, so I will be starting again with action research on TMs and refocusing my efforts with a new class. Perhaps I will also take a look at keeping students motivated once they have been trained in TM use, in an effort to prevent the decline in use over time that I found in this current study. I look forward to learning more about Thinking Maps® in the future, increasing my instructional skill and continually informing my practice.
References

Burden, B., & Silver, J. (2006). Thinking maps in action. Teaching, Thinking & Creativity. Retrieved from http://www.teachthinking.com

Caine, R.N., & Caine, G. (1991). Making connections: Teaching and the human brain. Wheaton, MD: ASCD.

Dual-coding theory. (n.d.). In Wikipedia. Retrieved from http://en.wikipedia.org/wiki/Dual-coding_theory

Gerlic, I., & Jausovec, N. (1999). Multimedia: Differences in cognitive processes observed with EEG. Educational Technology Research and Development, 47(3), 5-14.

Hubbard, R.S., & Power, B.M. (2003). The art of classroom inquiry: A handbook for teacher-researchers. Portsmouth, NH: Heinemann.

Hyerle, D. (1996). Thinking maps: Seeing is understanding. Educational Leadership, 53(4).

Hyerle, D., Curtis, S., & Alper, L. (2004). Student successes with thinking maps: School-based research, results and models for achievement using visual tools. Thousand Oaks, CA: Corwin Press.

Hyerle, D., & Yeager, C. (2007). Thinking maps: A language for learning. Cary, NC: Thinking Maps, Inc.

Hyerle, D. (2008). Thinking maps: Visual tools for activating habits of mind. In A.L. Costa & B. Kallick (Eds.), Learning and leading with habits of mind: 16 essential characteristics for success (pp. 149-176). Alexandria, VA: ASCD.

Hyerle, D., & Williams, K. (2009). Bifocal assessment in the cognitive age: Thinking maps for assessing content learning and cognitive processes. New Hampshire Journal of Education, 12, 32-38.

Jensen, E. (1998). Brain-based learning: The new paradigm of teaching. Thousand Oaks, CA: Sage.

Leary, S.F. (1999). The effect of thinking maps instruction on the achievement of fourth-grade students. Retrieved from http://www.thinkingfoundation.org/research/graduate_studies/pdf/samuel-leary-dissertation.pdf

Marzano, R.J., Pickering, D.J., & Pollock, J.E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Prentice Hall.

McCormick, S. (1995). Instructing students who have literacy problems. Englewood Cliffs, NJ: Prentice Hall.

Medina, J. (2008). Brain rules. Retrieved from IT 6710: http://cu.ecollege.com

Merkley, D.M., & Jeffries, D. (Dec. 2000/Jan. 2001). Guidelines for implementing a graphic organizer. The Reading Teacher, 54(4), 350-357.

Novak, J.D., & Gowin, D.B. (1984). Learning how to learn. New York, NY: Cambridge University Press.
Paivio, A. (1990). Mental representations: A dual coding approach. Available from http://books.google.com/books?id=hLGmKkh_4K8C&lpg=PA3&ots=B1GWeFjljn&dq=Paivio%20%2B%20dual%20coding%20theory&lr&pg=PP1#v=onepage&q=Paivio%20+%20dual%20coding%20theory&f=false

Ritchhart, R., Turner, T., & Hadar, L. (2009). Uncovering students' thinking about thinking using concept maps. Metacognition and Learning, 4(2), 145-159.

Robinson, V., & Lai, M.K. (2006). Practitioner research for educators: A guide to improving classrooms and schools. Thousand Oaks, CA: Corwin Press.

Stull, A.T., & Mayer, R. (2007). Learning by doing versus learning by viewing: Three experimental comparisons of learner-generated versus author-provided graphic organizers. Journal of Educational Psychology, 99(4), 808-820.

Tomlinson, C.A., & Kalbfleisch, M.L. (1998). Teach me, teach my brain: A call for differentiated classrooms. Educational Leadership, 56(3), 52-55.

Wolcott, H.F. (1994). Transforming qualitative data: Description, analysis, and interpretation. Thousand Oaks, CA: Sage.
Appendix A: (Data collection questionnaire sample 1)

Questionnaire #1

Directions:
  • Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. How much do you know about "the human body" before we start our unit? (Circle one)
   a lot     some     a little     nothing

2. What strategies or tools do you use the most to help you learn and remember new information?
   _____________________________________________________________________________________
   _____________________________________________________________________________________

3. How do you think you usually do on tests? (Circle one)
   very well     pretty good     just OK     not very well

4. What do you think of the quality of your test responses? (Select the most appropriate answer)
   • My work is usually thorough and I'm proud of my success
   • I usually recognize my mistakes and I can fix them
   • I'm usually not sure where I went wrong and need further clarification
   • My responses are not usually sufficient
   • Other: ________________________________________________________________________

**************************************************************

The questions above were about your learning strategies for science. Please answer the following questions about learning in all subject areas:

Should students be taught "thinking skills"? (Circle one)   Y   N
Why or why not?
   _____________________________________________________________________________________
   _____________________________________________________________________________________

List all of the thinking skills/tools/strategies that you use to help you learn in school:
   ____________________________    ____________________________    ____________________________
   ____________________________    ____________________________    ____________________________
Appendix B: (Data collection pre-test sample)

Pre-test #1: Digestion

Directions:
  • Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. Using words or pictures, describe how human digestion works:

2. How do you know?
   _____________________________________________________________________________________
   _____________________________________________________________________________________
   _____________________________________________________________________________________

3. Where will you be able to find out more about digestion?
   _____________________________________________________________________________________
   _____________________________________________________________________________________
   _____________________________________________________________________________________
Appendix C: (Data collection post-test sample)

Post-test #3: Circulation

Directions:
  • Please carefully read each question and respond thoughtfully. Remember to write neatly and make your thoughts clear.

1. Using words and/or pictures, describe the process of human circulation:

2. How do you know?
   _____________________________________________________________________________________
   _____________________________________________________________________________________
   _____________________________________________________________________________________
Appendix D: (Data collection questionnaire sample 2)

Questionnaire #2

Directions:
  • Look at each of the "thinking maps" on the left side of the page.
  • Carefully draw a line to match each to its intended thinking strategy on the right side of the page.
Restated: Match each "thinking map" to the reason we use it…

[The "thinking map" drawings appear down the left side of the original page; the strategy choices on the right side are:]
   comparing & contrasting
   sequencing & ordering
   identifying part-to-whole relationships
   brainstorming or defining in context
   analyzing cause & effect
   classifying & grouping
Appendix E: (Data collection questionnaire sample 3)

Questionnaire #3

Directions:
  • Carefully read each description below that contains the "code words & phrases" that help us know which type of "thinking map" works best for what we study.
  • Decide which "thinking map" is being described and draw it into the frame of reference.
Restated: Draw the "thinking map" that best describes the phrases below it…

  • define; brainstorm; describe in context; list; identify; generate ideas…; …as many as you can; …tell everything you know; …use your prior knowledge; explore the meaning of…

  • classify; categorize; organize; group; sort; give sufficient details for…; types of…; describe parts & functions of…; list & elaborate on…

  • compare/contrast; similarities/differences; alike/unlike; distinguish between…; show what ___ has in common…; describe shared components of…

  • sequence; order events; describe phases/cycles/process; recount/retell; summarize procedures; tell/show how…; identify steps/stages; first/then/next…; put in order; how to get from ___ to ___

  • cause/effect; discuss consequences of…; why did/does…; what would happen if…; predict the outcome of…; describe changes; discuss strategy/outcome; what is the impact of…; what would be the result if…

  • part-to-whole; describe/show structure; what are the parts/components; take apart __ / put together __; __ is made up of…; describe the subparts/subcomponents; how does __ fit in a larger context
Appendix F: Field notes excerpts

Entry (3/9/10): Started on the third sub-unit today, "Circulatory System" … introduced circle map to be completed as part of the "morning menu"… First elicited "what you already know" and marked the pre-established color-coding on the board: brain = pencil. For the first time, a G5 asked at the beginning of "morning menu", "Can we add in our own notes with other colors once we finish our brainstorming?" This is the first time that anyone has made the connection to use additional colors for additional sources on their own, without my prompting. Maybe they're starting to get it?? We then added to the board the remainder of the color-coding chart and left it there permanently for the duration of all the human body units: brain = pencil; text = 2nd color; class discussions & teacher lesson = 3rd color; video = 4th color. Assigned corresponding text readings and note-taking to the menu… Will introduce tree map this afternoon for 3 types of blood cells

***

Entry (3/17/10): I walked around the room during the group work activity (students were establishing a flow chart for "Nervous System" – How does the brain send/respond to a message?)… observed individuals displaying – and/or + behaviors with respect to the DB map skill…

General observations (recorded across four participant-group columns: B 3/4, B 5, G 3/4, G 5):
  - quickly scribbling a double bubble; no care taken to align related bubbles
  + using tree map for facts to put into DB map
  + completed DB map before rest of group with lots of detail
  - starts a Venn Diagram when asked to compare/contrast
  + explaining how to use a double bubble to elbow partner
  + comparing DB map to the difference paragraph "bug box" in writing
  + 2 working together "divide and conquer": one uses red for compare, other uses blue for contrast
  - needs explanation from elbow partner; does not remember how to use DB
  - uses bubble map instead of DB map
  - off task and talking; no signs of map being started

Note: this week's lessons are becoming "extra", feeling like "one more thing" in an already tight schedule; have to embed brace map as a "quick check" activity (direct instruction) instead, to make time prior to quiz….

***

Entry (3/23/10): Group observation tally – behaviors during "Skeletal/Muscular" assessment (columns: B 3/4 | B 5 | G 3/4 | G 5):
  Staring at page or looking up in thought      | lll | I    | I   | II
  Using FM instead of MFM to show contractions  | II  | IIII | III | II
  Color-coding maps on test                     | I   | I    | II  | I
  Asks what map to use                          | I I (column assignment not recorded in the source)

Note: 6 absences today!! Need to determine how that will affect overall data…. Make-up exams!!
Appendix G: Interview question format

The following questions were asked of my colleagues (grade-level team members, pilot participants, and other teachers trying to emulate the pilot instruction on their own) in an effort to find common experiences and to help me make comparisons between my class and other classes at similar levels of implementation:

Quantitative
  1. Which TM do you most frequently use? Why?
  2. How often do you use each of the TMs?
  3. How many of each type of map have the students completed?
  4. Have you measured growth?
  Probe further: How long does the introduction/modeling take for each?

Qualitative
  1. What instructional strategies do you use to teach TMs?
  2. Describe the experience of using TMs in your classroom.
  3. How have you extended the use of TMs?
  Probe further: Which TMs seem easiest for your students? Most difficult?

Reflective: Outcome of Lesson
  1. How does the TM relate to the lesson being taught?
  2. How do you monitor engagement?
  3. How do you incorporate TMs into assessment?
  Probe further: How do you differentiate using TMs?

Professional Impact and Practice
  1. Describe your pacing for TM implementation.
  2. What successes have you observed?
  3. What areas for improvement have you observed?
  Probe further: How does the use of TMs support our school or grade-level agreements?

(Adapted from the literature: Leary, 1999; that study's questionnaire format was used as the model.)
Appendix H: Research Timeline

  Task | Date/Time frame | People involved
  Scope the research problem | Begin Feb. 1 (due Feb. 7) | Self
  Create timeline, or "plan of action" | Begin Feb. 1 (due Feb. 14) | Self
  Conduct the initial literature review | Begin Feb. 1 (due Apr. 4) | Self
  Design the study; plan data collection | Begin Feb. 7 (due Feb. 21) | Self; team
  Create data collection tools | Begin Feb. 7 (due Feb. 21) | Self
  Arrange to conduct research with respondents and involved educators | Begin Feb. 7 (due Feb. 21) | Self; team; students
  Create permission forms and obtain permission to conduct research at designated site | Begin Feb. 7 (due Feb. 21) | Self; team; parents
  Collect data (activity part 1) | Begin Feb. 22 (due Feb. 28) | Self; team; students
  Collect data (activity part 2) | Begin Feb. 22 (due Mar. 7) | Self; team; students
  Continue data collection | Begin Feb. 22 (due Mar. 26) | Self; team; students
  Analyze data | Begin Mar. 8 (due Mar. 28) | Self; team
  Make inferences and determine findings | Begin Mar. 8 (due Apr. 11) | Self
  Literature review (activity due) | Begin Feb. 1 (due Apr. 4) | Self
  Finalize literature review | Begin Feb. 1 (due Apr. 25) | Self
  Write final report draft | Begin Feb. 7 (due Apr. 11) | Self
  Peer edit | Begin Apr. 12 (due Apr. 18) | Self; IT 6720 cohort; team
  Review findings with key person | Begin Apr. 12 (due Apr. 18) | Self; team
  Revise final report | Begin Apr. 19 (due Apr. 25) | Self
  Submit final report | ** Due Apr. 25 ** | Self
  Prepare presentation | Begin Apr. 25 (due May 2) | Self
  Presentation: review of study and findings | ** Due May 2 ** | Self