
This article was downloaded by: [RMIT University]
On: 01 October 2013, At: 05:27
Publisher: Routledge
Informa Ltd Registered in England and Wales Registered Number: 1072954 Registered office: Mortimer House, 37-41 Mortimer Street, London W1T 3JH, UK

Educational Research and Evaluation: An International Journal on Theory and Practice
Publication details, including instructions for authors and subscription information: http://www.tandfonline.com/loi/nere20

Three metacognitive approaches to training pre-service teachers in different learning phases of technological pedagogical content knowledge
Bracha Kramarski & Tova Michalsky
School of Education, Bar-Ilan University, Ramat-Gan, Israel
Published online: 02 Dec 2009.

To cite this article: Bracha Kramarski & Tova Michalsky (2009) Three metacognitive approaches to training pre-service teachers in different learning phases of technological pedagogical content knowledge, Educational Research and Evaluation: An International Journal on Theory and Practice, 15:5, 465-485, DOI: 10.1080/13803610903444550

To link to this article: http://dx.doi.org/10.1080/13803610903444550


Taylor & Francis makes every effort to ensure the accuracy of all the information (the "Content") contained in the publications on our platform. However, Taylor & Francis, our agents, and our licensors make no representations or warranties whatsoever as to the accuracy, completeness, or suitability for any purpose of the Content. Any opinions and views expressed in this publication are the opinions and views of the authors, and are not the views of or endorsed by Taylor & Francis. The accuracy of the Content should not be relied upon and should be independently verified with primary sources of information. Taylor and Francis shall not be liable for any losses, actions, claims, proceedings, demands, costs, expenses, damages, and other liabilities whatsoever or howsoever caused arising directly or indirectly in connection with, in relation to or arising out of the use of the Content.

This article may be used for research, teaching, and private study purposes. Any substantial or systematic reproduction, redistribution, reselling, loan, sub-licensing, systematic supply, or distribution in any form to anyone is expressly forbidden. Terms & Conditions of access and use can be found at http://www.tandfonline.com/page/terms-and-conditions


Three metacognitive approaches to training pre-service teachers in different learning phases of technological pedagogical content knowledge

Bracha Kramarski* and Tova Michalsky

School of Education, Bar-Ilan University, Ramat-Gan, Israel

Our study investigated 3 metacognitive approaches provided during different phases of learning technological pedagogical content knowledge (TPCK) in a Web-based learning environment. These metacognitive approaches were based on self-question prompts (Kramarski & Mevarech, 2003) which appeared in pop-up screens and fostered the Self-Regulated Learning (SRL) of pre-service teachers (n = 144) through 1 of the 3 learning phases (Zimmerman, 2000): planning, action and performance, and evaluation. Four measures (pre/post) were administered in the study: SRL self-report questionnaires in the contexts of pedagogical learning and teaching and TPCK in the comprehension and design of lessons. Mixed quantitative and qualitative analyses showed that fostering students' SRL through the evaluation phase was the most effective for the pre-service teachers' perceived SRL in both the learning and teaching contexts and for their TPCK (comprehension and design of lessons). Furthermore, students from the planning approach outperformed the students from the action approach in most of the SRL and TPCK measures.

Keywords: pre-service teachers; TPCK; SRL; learning phases; IMPROVE self-question prompts; Web-based learning

Introduction

Educators and researchers propose that practice in Self-Regulated Learning (SRL) should begin with teachers' professional education in learning environments that encourage knowledge construction through SRL. When teachers understand more about their own SRL, they can better appreciate the value of SRL and thereby more effectively teach SRL to their students (e.g., Kramarski & Michalsky, 2009a, 2009b; National Council for Accreditation of Teacher Education, 2002; Putnam & Borko, 2000; Randi & Corno, 2000).

Educational theory points to SRL as a construct that provides a holistic view of students' cognitive and metacognitive skills and their motivation for the attainment of personal goals in an environmental context (e.g., Pintrich, 2000; Zimmerman, 2000, 2008). According to Zimmerman, SRL proceeds along three learning phases: forethought (i.e., planning), action and performance (i.e., monitoring), and evaluation (i.e., reflection). Findings indicate that SRL is a complex construct: Students performed few of the SRL activities spontaneously in their learning, and metacognitive support is important for developing SRL (e.g., Azevedo, 2005; Kramarski & Mizrachi, 2006).

*Corresponding author. Email: [email protected]

Educational Research and Evaluation
Vol. 15, No. 5, October 2009, 465–485
ISSN 1380-3611 print/ISSN 1744-4187 online
© 2009 Taylor & Francis
DOI: 10.1080/13803610903444550
http://www.informaworld.com

Our study focuses on three metacognitive approaches provided during the different phases of learning technological pedagogical content knowledge (TPCK) in a Web-based learning environment. The study aims to investigate in which learning phase the metacognitive training approach is most effective for the development of the SRL and TPCK of pre-service teachers.

Next, in our theoretical framework, we will explain Zimmerman's model (2000, 2008) of SRL learning phases. We will then present TPCK in a Web-based learning environment (WBLe), and, finally, we will discuss the theoretical basis for the metacognitive approaches (based on IMPROVE self-question prompts).

Self-regulated learning (SRL) phases

According to Zimmerman (2000), "self-regulation refers to self-generated thoughts, feelings, and actions that are planned and cyclically adapted to the attainment of personal goals" (p. 14). The SRL model includes three recursive phases, forethought, action and performance, and evaluation, and these phases are implemented in an environmental context (Zimmerman, 2000, 2008). The forethought phase includes the planning process (task analysis, goal setting, and strategic planning) and self-motivation beliefs (self-efficacy, outcome expectations, intrinsic interest/value, and goal orientation). The next phase, action and performance, includes the self-control process (self-instruction, attention focusing, and task strategies) and self-observation (metacognitive monitoring and self-recording). Finally, the evaluation phase refers to students' ability to reflect on their learning performance in order to control and adjust their learning accordingly. According to Zimmerman, the evaluation phase plays a central role in achieving SRL.

Although research indicates the importance of SRL in academic achievement, there is indication that students do not implement such behaviors spontaneously. Students are often "cognitively overloaded" during the learning process and have difficulty in self-observation and reflection (Zimmerman, 2000). Consequently, Zimmerman suggests that multicomponent teacher training is necessary to help students better interpret the SRL cyclical loop within a given learning situation.

Research has shown the effects of metacognitive training on different areas of learning, such as mathematics (e.g., Kramarski, 2004; Kramarski & Mevarech, 2003; Kramarski & Mizrachi, 2006), achievements in sciences (e.g., Michalsky, Zion, & Mevarech, 2007), pedagogical knowledge (e.g., Kramarski, 2008; Kramarski & Michalsky, 2009a, 2009b), and SRL skills (e.g., Kramarski & Gutman, 2006; Kramarski & Mizrachi, 2006). Effects were found in diverse learning environments (e.g., Azevedo, 2005; Kramarski, 2004; Kramarski & Michalsky, 2009a, 2009b). However, little attention has been given to the relative efficacy of providing metacognitive training during different learning phases. Researchers have suggested that the same metacognitive training given at different times seems to affect the cognitive learning tasks in each learning phase differently (e.g., Bannert & Mengelkamp, 2008; Michalsky, Mevarech, & Haibi, 2009; Pol, Harskamp, & Suhre, 2008). As Kauffman, Ge, Xie, and Chen (2008) argued, the utility and the success of metacognition training depend on what was activated in the cognitive process and when it was activated.

It is presently unknown whether providing metacognitive training prior to performing a TPCK task is more or less effective than providing metacognitive training during or after the performance of the task.

In the present study, we addressed this gap in the literature through a quasi-experiment comparing three kinds of metacognitive training prompts: metacognitive training provided before, during, and immediately after receiving the TPCK task in a Web-based learning environment.

TPCK within a Web-based learning environment (WBLe)

We designed a Web-based learning environment (WBLe) for the purpose of the study (described in the Method section). WBLe is a nonlinear environment in a pedagogical context. WBLe provides new possibilities for teaching about the structure of domain and pedagogical knowledge by using representations or delivery media (e.g., video clips, sound bites, graphics, hypertexts, animations). WBLe is considered a powerful cognitive tool that transforms abstract content into more concrete or realistic forms of knowledge, and, as such, it may facilitate conceptual knowledge development (Azevedo, 2005; Jacobson & Archodidou, 2000; Jonassen, 2000). Furthermore, such an environment allows student-centered activities by navigating in the environment and deciding what, when, and how to learn (Azevedo & Jacobson, 2008; Winters, Green, & Costich, 2008).

However, researchers advocate that, although WBLe can readily provide multiple tools, and opportunities to manipulate them in pedagogical uses, teaching in such an environment is a complex process (Angeli & Valanides, 2009; Kramarski & Michalsky, 2009a, 2009b; Mishra & Koehler, 2006; Niess, 2005). First, teachers must develop technological pedagogical content knowledge, which is conceptualized as TPCK. At the heart of TPCK conceptualization is the view that technology is a cognitive partner that amplifies or augments student learning "toward transformation of these contributing knowledge bases into something new" (Angeli & Valanides, 2009, p. 5).

Second, teaching with WBLe demands SRL decisions. The teacher must frequently: identify suitable tasks for teaching with WBLe resources; determine the most appropriate tools to infuse in teaching/learning; determine when and how to use these tools; and, finally, select the optimal pedagogical method to support these choices. Researchers have demonstrated that teachers often exhibit difficulties in making such decisions and suggest supporting optimal implementation of SRL in WBLe for pedagogical uses (e.g., Kramarski & Michalsky, 2009a, 2009b; Putnam & Borko, 2000; Randi & Corno, 2000).

Providing metacognitive training in different learning phases

More recently, some researchers have explored how prompting students with automated instruction supports in the form of metacognitive questions influences the students' SRL and their achievement in WBLe (e.g., Kauffman et al., 2008; Kramarski & Michalsky, 2008, 2009b). To support students' involvement in regulatory learning, Mevarech and Kramarski (1997) designed the IMPROVE metacognitive self-questioning method, which actively engages students in self-regulating their learning by using four categories of questions: comprehension, connection, strategy, and reflection. Comprehension questions help students understand the information of the task/problem to be solved (e.g., "What is the problem/task?"; "What is the meaning of . . . ?"). Connection questions prompt students to understand tasks' deeper-level relational structures by articulating thoughts and explicit explanations (e.g., "What is the difference/similarity?"; "How do you justify your conclusion?"). Strategy questions encourage students to plan an appropriate strategy and to monitor the effectiveness of their selection (e.g., "What is the strategy?"; "Why?"). Reflection questions help students evaluate their entire problem-solving processes, encouraging students to consider various perspectives and values regarding their selected solutions and modify them if necessary (e.g., "Does the solution make sense?"; "Can the solution be presented otherwise?").
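The four IMPROVE question categories lend themselves to a simple lookup structure. The sketch below is illustrative only (not the authors' software); the prompt wording paraphrases the examples above:

```python
# Illustrative mapping of IMPROVE question categories to example
# self-question prompts, paraphrased from the article text.
IMPROVE_PROMPTS = {
    "comprehension": [
        "What is the problem/task?",
        "What is the meaning of ...?",
    ],
    "connection": [
        "What is the difference/similarity?",
        "How do you justify your conclusion?",
    ],
    "strategy": [
        "What is the strategy?",
        "Why?",
    ],
    "reflection": [
        "Does the solution make sense?",
        "Can the solution be presented otherwise?",
    ],
}

def prompts_for(category: str) -> list[str]:
    """Return the self-question prompts for one IMPROVE category."""
    return IMPROVE_PROMPTS[category]
```

A lookup table like this makes it straightforward to attach a different prompt set to each experimental group, which is how the three approaches below differ.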

The previous studies investigated the IMPROVE model with the four self-questions while guiding students' thoughts and actions throughout the entire solution process (e.g., Kramarski & Zoldan, 2008) and its effects on SRL components (cognition, metacognition, and motivation; Kramarski & Michalsky, 2009a, 2009b). Until now, no study has investigated the effects of IMPROVE metacognitive question prompts directed to the different learning phases in WBLe for fostering SRL and TPCK of pre-service teachers.

Supporting students in advance (the planning phase) can be accomplished with comprehension question prompts in which students are clearly instructed to "think ahead" about the goal of the task, possible difficulties, and methods of finding solutions. The second option for prompting support occurs during the problem-solving process (the action phase, just in time) with strategy questions that guide the student to "think step-by-step" on the task with less reliance on teacher-based assistance (Pol et al., 2008). Such a process might develop a habit of strategic thinking for task solving and provide the student with more mental capacity to solve particular tasks. The third option of prompting support is by reflective questions that ask students "to think back" and evaluate the solution at the end of the process (the evaluation phase).

The current study objectives

We designed three learning approaches to explore how pre-service teachers capitalize on different question prompts (as described in the Method section). These approaches are based on the IMPROVE model directed to each learning phase in a WBLe to stimulate and support the teachers' SRL ability and TPCK. The M_P approach was directed to the planning phase with the comprehension question prompts, the M_A approach was directed to the action and performance phase with the strategy question prompts, and the M_E approach was directed to the evaluation phase with the reflection question prompts.

Educators and researchers propose (Kramarski & Michalsky, 2009b; Leelawong et al., 2002; Putnam & Borko, 2000) that preparing to teach in a self-directed, open-ended technology environment like hypermedia is tied to pre-service teachers' own self-regulation abilities in two ways. First, pre-service teachers must be able to achieve SRL for themselves (the learner's perspective in SRL), that is, be themselves self-regulated learners. Second, pre-service teachers must be able to understand how to help their students achieve SRL (the teacher's perspective in SRL). Based on these suggestions, our study aimed to assess pre-service teachers' SRL in the learning and teaching contexts with two complementary self-report measures. We used the Motivated Strategies for Learning Questionnaire (MSLQ; Pintrich, Smith, Garcia, & McKeachie, 1991). This tool assesses the SRL components cognition, metacognition, and motivation in the learning context. In addition, we implemented the Metacognitive Awareness Index (MAI; Schraw & Dennison, 1994). This tool was adapted for assessing SRL in teaching by referring to the three learning phases: planning, monitoring the action and performance, and evaluation.

In addition, we examined TPCK in reference to comprehending and designing lessons. TPCK can be developed through different skills. Comprehension skills are basic TPCK, necessitating only that teachers process data concerning existing information. Design skills in TPCK demand that teachers synthesize and create learning activities such as lesson plans using a technology environment. Unlike comprehension skills, design skills are more complex, requiring higher order thinking in TPCK (Kramarski & Michalsky, 2009b; Zohar & Schwartzer, 2005).

Taking into account the importance of reflection, which supports the use of metacognition processes after the action (Michalsky et al., 2009; Schon, 1996; Zimmerman, 2000), we assumed that providing metacognitive question prompts in the evaluation phase (M_E) would exert more positive effects on pre-service teachers' SRL (MSLQ and MAI) and TPCK (comprehending and design tasks) than would the implementation of question prompts in the two other phases (M_P and M_A). We based this assumption on the following two reasons. First, prompting at the end of the process (e.g., "Does it make sense?") encourages students to think back on the entire solution process. Second, at the end of the task, students have more time to evaluate the process than during the other two phases. Thus, the M_E participants are likely to develop more awareness of their understanding and strategies; this, in turn, may affect their SRL and TPCK.

In contrast, the M_P and M_A approaches demand that the student simultaneously implement both TPCK and SRL skills, which may increase the cognitive load during the solution process. We made no specific assumptions regarding the relative efficacies of the M_P and M_A approaches. This decision was based on the inconclusive research conclusions regarding the effect of metacognitive support during these two phases (Michalsky et al., 2009; Moreno, 2006; Pol et al., 2008).

In conclusion, the twofold purpose of this study was to investigate the effects of three metacognitive approaches (M_P, M_A, M_E) that pre-service teachers implement with question prompts in the different learning phases embedded in WBLe on (a) self-perceived SRL ability from the learning perspective (MSLQ questionnaire) and the teaching perspective (MAI questionnaire) and (b) TPCK in the comprehension and design of lessons.

Method

Participants

The study was conducted in one of the universities in central Israel and included 144 first-year pre-service teachers (62% females and 38% males) for high schools in the sciences (physics, biology, chemistry, and mathematics). Most of the students were Jewish (76%), and the rest were Arab (24%). Participants studied in a joint mandatory 1st-year course, "Designing Learning Activities with a Web-based environment". All of the pre-service teachers who were enrolled in this course were randomly assigned to one of three metacognitive approach groups: M_P (planning prompts), M_A (action and performance prompts), and M_E (reflection prompts). Statistical comparisons between the three learning approaches at the pretest interval showed no significant differences in age (M = 22.6 years, SD = 5.2), grade point average (in percent) in the major subject (M = 80, SD = 5.3), or in other demographic characteristics (e.g., gender, socioeconomic status, and ethnicity).

Teacher background and training

The three female teachers who taught the course each held a university PhD degree in education. Each teacher had more than 10 years of teaching experience and was considered by the students to be an expert teacher. For the purpose of the present study, the three teachers were randomly assigned to the three research groups (M_P, M_A, M_E).

The teachers were trained in a 2-day inservice training program (6 hr) that focused on the TPCK framework for developing teaching and learning knowledge in a technology environment (e.g., Angeli & Valanides, 2009) and on SRL sociocognitive models (Zimmerman, 2000, 2008). On the 1st day of the training seminar (lasting 3 hr), the training instructor (the second author) informed teachers that they were participating in an experiment in which new materials were being used/designed in a Web-based environment, including TPCK tasks in teaching science. All three teachers were asked to solve the tasks and to think about possible difficulties they might encounter in the class. Emphasis was given to guidance in how to use WBLe, encourage forum and class discussion, manage the teacher's role while students work with the computer, and maintain learner-centered learning by clarifying but not directly answering students' questions. The 2nd day of training (lasting 3 hr) differed for the three teachers. Each teacher was exposed to the IMPROVE self-questioning model but practiced different question prompts (comprehension, strategy, and reflection; see description in the next section).

Teachers received an introduction to the rationale and techniques of the particular approach they would be implementing. The instructor modeled ways of using pop-ups on the computer screen to introduce the question prompts. During the period of the study, the second author observed each teacher 6 times (every 2nd week) to help ensure adherence to implementation of the instructional approaches. In addition, the authors met each teacher after the observations and discussed any deviations from the approach.

Shared structure and curriculum for the three learning approaches

The three learning approaches (M_P, M_A, M_E) comprised 14 pedagogical workshops lasting 4 hr each week, for 56 hr of total training. Each of the workshops in all groups had the same structure. First, the teacher presented the lesson's subject. Practice was based on the WBLe designed by the researchers for the purpose of the study. The environment provided two kinds of TPCK tasks: analyzing video vignettes to capture recurrent and authentic classroom cases and designing lessons in WBLe. The tasks were followed by questions, and links to additional information needed to solve the task were embedded in the task. Pre-service teachers were asked to solve the given tasks by discussing the solution in the forum and presenting the conclusions to the whole class. Finally, the teacher presented a summary in the classroom, addressing any difficulties that arose.

Workshops focused on implementing theoretical teaching and learning methods for TPCK activities based on Angeli and Valanides' (2009) principles and IMPROVE self-questioning. Participants were taught that TPCK is a unique body of knowledge that is constructed from the interaction of its individual contributing knowledge bases. Teachers were exposed to computer tools (e.g., representations, animations, and hyperlinks) and appropriate strategies for the infusion of technology in the classroom. In particular, emphasis was given to strategies that put the learner at the center of the learning process.

Unique structure of the learning approaches

During the practice in the Web environment, each group (M_P, M_A, M_E) was prompted with questions (based on the IMPROVE model) adapted to the three SRL phases (planning, action and performance, and evaluation), which appeared as pop-up screens.

The M_P group. Participants were prompted with the comprehension questions to focus on the TPCK task before solving it or before designing the activity (the planning phase).

Pre-service teachers employed comprehension questions to think about the TPCK task's goal, to find links to previous knowledge, and to identify/design or select topics that were difficult for students to understand and impossible to implement via traditional means, and as such preferable to teach with WBLe. These topics could include abstract concepts that need to be visualized or phenomena that need to be animated and topics that require multimodal transformations (i.e., textual, iconic, auditory). Figure 1 summarizes the IMPROVE metacognitive self-questioning model for prompting teachers' SRL in each of the three phases in WBLe.

The M_A group. Participants were prompted with the strategy questions during the action and performance phase. Strategy questions were designed to help teachers in their decisions to employ strategies that were appropriate for solving or teaching the given TPCK task and to monitor their reasons. In addressing the strategy questions, teachers had to describe "what" strategy they selected (declarative decision), "how" they suggested it should be implemented (procedural decision), and "why" the strategy was the most appropriate one for solving or teaching the task (conditional decision). Pre-service teachers were encouraged to identify/design strategies that encourage learning processes that put the learner at the center. For example, such strategies could include exploration and discovery in virtual worlds (i.e., virtual museums), testing of hypotheses and/or application of ideas in contexts that could not possibly be experienced in real life, complex decision-making, long-distance communication, and collaboration with peers.

The M_E group. Participants were prompted with the reflection questions at the end of the process to evaluate their problem solving and design of learning activities. In addressing the reflection questions, teachers judged their understanding or solution process and modified (if needed) ways to solve problems or teaching approaches. Pre-service teachers employed reflection questions to control the infusion of technology into the classroom.

Teachers from all learning approaches practiced the same TPCK tasks (comprehending and designing). The problem-solving process was prompted differentially with IMPROVE self-question pop-ups at three intervals according to the learning approach.

The M_P students received the comprehension prompts immediately after the task appeared but before starting the solution. The M_A students received the strategy prompts immediately after they began to answer the first item, and the M_E students received the reflection prompts at the end of the solution. The pre-service teachers were asked to answer the question prompts online (in the pop-up box) and send them to the teacher. The theoretical and didactical rationale (Figure 1) for using the question prompts, and how to use the prompts, were explained and modeled by the teacher trainer in each group. Table 1 summarizes the design of the WBLe training program by learning approach, indicating that the three groups differed only in the question prompts.
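The timing rule that distinguishes the three groups can be summarized in code. The event names (`task_presented`, `first_item_begun`, `solution_submitted`) are hypothetical labels for illustration, not taken from the authors' WBLe system:

```python
# Hypothetical event labels for one pass through a TPCK task; the names
# below are illustrative assumptions, not the authors' implementation.
PROMPT_SCHEDULE = {
    "M_P": "task_presented",      # comprehension prompts, before solving
    "M_A": "first_item_begun",    # strategy prompts, while solving
    "M_E": "solution_submitted",  # reflection prompts, after solving
}

def prompt_due(group: str, event: str) -> bool:
    """True when the pop-up prompt for `group` should fire at `event`."""
    return PROMPT_SCHEDULE[group] == event
```

Because only the trigger event (and the prompt wording) varies by group, everything else in the task flow stays identical across the three conditions.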

Assessment measures

Four pre/post measures were administered in the study: SRL via two self-report questionnaires, the MSLQ and the MAI, and TPCK in comprehending and designing lessons.

Figure 1. The IMPROVE metacognitive self-questioning model for prompting teachers' SRL in each of the three phases embedded in WBLe.



Self-report SRL measures

MSLQ questionnaire. The 50-item MSLQ questionnaire (Pintrich et al., 1991) assessed pre-service teachers' self-reported cognition, metacognition, and motivation in a pedagogical learning context. Sixteen items referred to general cognition strategies: rehearsal strategies (e.g., "When I read material for the course, I say the words over and over to myself to help me remember"), elaboration strategies such as summarizing and paraphrasing (e.g., "When I study for this course, I put important ideas into my own words"), and organizational strategies (e.g., "I outline the chapters in my task to help me study"). Twenty items referred to metacognition: planning (e.g., "When I begin to work on the task for the course, I think about what would be a good way to do it"), monitoring (e.g., "During the task process I often ask myself if I am going in the right direction"), and evaluation (e.g., "At the end of the task I ask questions to make sure I know the material I have been studying"). Fourteen items referred to motivational factors: intrinsic value of learning (e.g., "I think what we are learning in this pedagogical course is interesting") and persistence in the face of difficulties (e.g., "Even when the study materials are dull and uninteresting, I keep working until I finish"). Participants rated each item on a 7-point Likert scale, ranging from 1 (not at all true for me) to 7 (very true for me). Higher scores indicated a higher level of SRL. Cronbach's alphas were 0.81, 0.74, and 0.73, respectively.
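Internal-consistency coefficients like the Cronbach's alphas reported here can be computed from raw item scores. A minimal pure-Python sketch, for illustration only (the data shapes are synthetic, not the study's data):

```python
from statistics import pvariance

def cronbach_alpha(rows):
    """Cronbach's alpha for one questionnaire scale.

    rows: one list of item scores per respondent (equal lengths).
    alpha = k/(k-1) * (1 - sum(item variances) / variance of total scores)
    """
    k = len(rows[0])                               # number of items
    item_vars = [pvariance(col) for col in zip(*rows)]
    total_var = pvariance([sum(r) for r in rows])  # variance of sum scores
    return (k / (k - 1)) * (1 - sum(item_vars) / total_var)
```

Perfectly covarying items yield an alpha of 1.0, and independent items yield an alpha near 0, which is why values such as 0.81 are read as adequate scale consistency.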

MAI questionnaire. The 19-item questionnaire was adapted from Schraw andDennison’s MAI questionnaire (1994). The questionnaire assessed pre-serviceteachers’ self-reported regulation: planning, monitoring, and evaluation in a

Table 1. Summary of the pre-service WBLe training program by three learning approaches.

Learning approaches: Planning (M_P) | Action and performance (M_A) | Evaluation (M_E)

Theoretical teaching and learning framework:
  TPCK approach – computer as a cognitive tool that amplifies student learning; focus on learner-directed learning such as active and cooperative learning and inquiry; SRL sociocognitive theories.

Teacher training (for 3 instructors):
  Two-day, 6-hr in-service training seminar and teachers’ observations.

Pre-service course on using/designing learning activities with WBLe:
  One-semester course, 14 meetings – 56 hr, for high school science teachers:
  (a) Instructor discusses pedagogical tasks of various computer tools (e.g., Internet, representations, animations, and hyperlinks) according to the TPCK principles;
  (b) Students practice TPCK tasks in WBLe and in online forum discussions; tasks demand comprehension and design skills;
  (c) Instructor presents a summary in the classroom, addressing any difficulties.

IMPROVE metacognitive question prompts:
  Pop-ups embedded in the TPCK tasks of WBLe; students answer them in writing and send the answers online to the teacher.

Adapted to each of the SRL phases:
  M_P – comprehension questions; M_A – strategy questions; M_E – reflection questions.

Educational Research and Evaluation 473


pedagogical teaching context. Seven items referred to planning (e.g., "Before I begin to teach a new topic, I ask myself what I should teach"), seven items referred to monitoring (e.g., "During my teaching of a new topic, I ask myself if I am doing well"), and five items referred to evaluation (e.g., "I know whether the lesson was good as soon as I finish teaching it"). Participants rated each item on a 5-point Likert scale, ranging from 1 (not at all true for me) to 5 (very true for me). Higher scores indicated a higher level of SRL. Cronbach’s alphas were 0.82, 0.76, and 0.72, respectively.

TPCK comprehension and design skills

Comprehension skills

To measure TPCK comprehension, pre-service teachers were given a structured study unit that referred to the pedagogical implementation of technology with pedagogical content knowledge, based on Angeli and Valanides’ (2009) principles and the Kramarski and Michalsky (2009b) study. The pretest and posttest learning units differed to avoid familiarity effects, but both were similar in structure, style, and topic (the cellular phone). At each interval, the pre-service teachers were given 1 hr to peruse the study unit and to complete a paper-and-pencil task.

The paper-and-pencil task comprised 10 open questions tapping five subscales (2 questions per level) that referred to different cognitive levels of TPCK comprehension skills. The skills were assessed according to five criteria:

(1) comprehension ("What are the goals of the teaching unit and what is required to meet them?" and "Identify the topics that are taught with technology");

(2) application (e.g., "Sort the learning activities that engage the students in dynamic activities");

(3) analysis (e.g., "What are the difficulties expected in learning/teaching strategies to be implemented by traditional means? Explain");

(4) synthesis (e.g., "Based on the present task, suggest another strategy for the infusion of technology into the classroom. Explain"); and

(5) evaluation (e.g., "What is the ideal teaching method in your opinion? Explain").

Participants’ comprehension skills were scored as low (1), medium (2), high (3), or no answer (0). Scores ranged between 0 and 30. A full answer (3) in each category required the teacher’s response to provide three elements from the teaching unit (e.g., for the evaluation question: providing two TPCK teaching methods and a clear justification). A score of 1 or 2 indicated partial provision of one or two elements, respectively (e.g., for the evaluation question: providing one to two elements without justification). Teachers’ responses were coded by two trained judges with expertise in TPCK.

Inter-judge reliability, calculated with Cohen’s kappa for the same 30% of the responses coded by both judges, yielded high reliability coefficients (comprehension: 0.95; application: 0.93; analysis: 0.92; synthesis: 0.94; and evaluation: 0.91). Disagreements on the scoring and coding of comprehension skills were resolved through discussion.
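Inter-judge agreement of this kind is computed as κ = (p_o − p_e)/(1 − p_e), where p_o is the observed proportion of agreement and p_e is the agreement expected by chance from each judge's marginal distribution of codes. A minimal sketch — the judge codes below are hypothetical, not the study's data:

```python
from collections import Counter

def cohens_kappa(rater_a, rater_b):
    """Cohen's kappa for two raters: (p_o - p_e) / (1 - p_e), where p_o is
    observed agreement and p_e is chance agreement from the raters' marginals."""
    n = len(rater_a)
    p_o = sum(a == b for a, b in zip(rater_a, rater_b)) / n
    ca, cb = Counter(rater_a), Counter(rater_b)
    p_e = sum(ca[c] * cb[c] for c in set(ca) | set(cb)) / n ** 2
    return (p_o - p_e) / (1 - p_e)

# Hypothetical 0-3 rubric codes from two judges on ten responses
a = [3, 2, 2, 1, 0, 3, 2, 1, 1, 2]
b = [3, 2, 2, 1, 0, 3, 2, 1, 2, 2]
kappa = cohens_kappa(a, b)
```

With nine agreements out of ten, kappa lands around 0.86 here — kappa discounts the agreement the judges' marginal code frequencies would produce by chance, which is why it runs below raw percent agreement.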


Design skills

Each participant was given 1.5 hr to design, in WBLe, a two-lesson study unit on the effects of solar energy. Lessons were analyzed and scored using a TPCK index (adapted from Angeli & Valanides, 2009; Kramarski & Michalsky, 2009b). This index focused on four categories: identifying learning objectives, selecting content, planning didactic material, and designing the learning environment. Each category was assessed by a rubric on a scale from low (1 – partial answer) to high (4 – full answer), with total scores ranging from 0 to 16. The four rubrics are presented in Appendix 1.
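As a concrete illustration of this index, the sketch below sums one rubric rating per category into the 0–16 total. The function name and dictionary layout are our own, and we read the stated 0 lower bound as a "no answer" rating, which the text does not spell out:

```python
# Category names follow the TPCK design index described above; the 0 rating
# ("no answer") is our assumption, needed for totals to span 0-16.
CATEGORIES = (
    "identifying learning objectives",
    "selecting content",
    "planning didactic material",
    "designing the learning environment",
)

def tpck_design_score(ratings):
    """Sum one rubric rating (0-4) per category into a total design score (0-16)."""
    if set(ratings) != set(CATEGORIES):
        raise ValueError("one rating per TPCK design category is required")
    for category, rating in ratings.items():
        if not 0 <= rating <= 4:
            raise ValueError(f"rating for {category!r} must be 0-4, got {rating}")
    return sum(ratings.values())

# A hypothetical lesson scored as a full answer on two categories, partial on two
score = tpck_design_score({
    "identifying learning objectives": 4,
    "selecting content": 3,
    "planning didactic material": 4,
    "designing the learning environment": 2,
})
```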

All coders underwent training to analyze and code the open-ended responses. Inter-judge reliability was calculated for the same 30% of the responses coded by both judges (both experts in TPCK), yielding high Cohen’s kappa reliability coefficients (identifying learning objectives: 0.92; selecting content: 0.87; planning didactic material: 0.85; and designing the learning environment: 0.87). Disagreements on the scoring and coding of design skills were resolved through discussion.

Procedure

Instruction began at the start of the second academic semester and continued for 56 hr. The WBLe learning program was the same in each classroom, but the learning approaches were adapted according to the research design. Four pre/post measures were administered by the teachers in the classroom setting on the first and last days of the course (lasting 2 hr each day). The measures were administered in the same order on both days: the two SRL aptitude measures, MSLQ and MAI, followed by TPCK (comprehending and designing lessons). Participants were informed that these measures were part of a research study to determine the effectiveness of pre-service training. All students in the course participated in the study.

Results

Self-report SRL measures

SRL learning perspective (MSLQ questionnaire)

Table 2 presents the mean scores and standard deviations of self-reported SRL (the learning perspective) for the three MSLQ components (cognition, metacognition, and motivation) by time (pretest/posttest) and learning approach (M_P, M_A, M_E). A multivariate analysis of variance (MANOVA) for the pretest results indicated that, before the course began, no significant differences emerged between the three learning approaches on any of the perceived SRL components regarding learning perspectives: simultaneously, F(4, 285) = 1.89, p > 0.27, η² = 0.12.

Analysis of variance (ANOVA) with repeated measures (2 times × 3 approaches) on each of the three components of the SRL variable indicated a significant time effect for all SRL components: cognition, F(1, 141) = 20.37, p < 0.001, η² = 0.39; metacognition, F(1, 141) = 26.36, p < 0.001, η² = 0.58; and motivation, F(1, 141) = 32.17, p < 0.001, η² = 0.52. Significant effects emerged for the interaction between the learning approach and the time of measurement for each of the three SRL components: cognition, F(2, 139) = 7.36, p < 0.001, η² = 0.22;


metacognition, F(2, 139) = 9.16, p < 0.001, η² = 0.35; and motivation, F(2, 139) = 22.31, p < 0.001, η² = 0.51.

A post-hoc analysis according to Scheffé and Cohen’s d effect size (d was calculated as the ratio between the difference of the pretest and posttest means and the average standard deviation of the pretest) at the end of the course indicated that learning with metacognitive support in the M_E phase was more effective for the various components of SRL in the learning perspective than was learning with metacognitive support in the two other phases, M_P and M_A, whereas the M_A participants gained the lowest scores on the SRL components.
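This definition of d can be checked directly against the table cells. The sketch below reads "the average standard deviation of the pretest" as each group's own pretest SD, an interpretation that reproduces the reported values (e.g., the M_E metacognition and M_P cognition cells of Table 2):

```python
def cohens_d(pre_mean, post_mean, pre_sd):
    """Effect size as defined in the text: the pretest-to-posttest mean
    difference divided by the pretest standard deviation."""
    return (post_mean - pre_mean) / pre_sd

# M_E metacognition (Table 2): pre M = 3.8, SD = 1.4; post M = 5.2
d_me_metacognition = cohens_d(3.8, 5.2, 1.4)   # ~1.0, as reported
# M_P cognition (Table 2): pre M = 4.2, SD = 1.4; post M = 4.8
d_mp_cognition = cohens_d(4.2, 4.8, 1.4)       # ~0.4, as reported
```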

SRL teaching perspective (MAI questionnaire)

Table 3 presents the mean scores and standard deviations of self-reported SRL (the teaching perspective) for the three MAI components (planning, monitoring, and evaluation) by time (pretest/posttest) and learning approach (M_P, M_A, M_E). A MANOVA for the pretest results indicated that, before the course began, no significant differences emerged between the three learning approaches on any of the perceived SRL components regarding the teaching perspective: simultaneously, F(4, 285) = 1.23, p > 0.31, η² = 0.16.

ANOVA with repeated measures (2 times × 3 approaches) on each of the three components of the SRL variable indicated a significant time effect for all SRL components: planning, F(1, 141) = 16.38, p < 0.001, η² = 0.25; monitoring, F(1, 141) = 24.19, p < 0.001, η² = 0.43; and evaluation, F(1, 141) = 36.15, p < 0.001, η² = 0.49. Significant effects emerged for the interaction between the learning environment and the time of measurement for each of the three SRL components: planning, F(2, 139) = 6.85, p < 0.001, η² = 0.34; monitoring, F(2, 139) = 4.36, p < 0.001, η² = 0.21; and evaluation, F(2, 139) = 21.17, p < 0.001, η² = 0.43.

Table 2. Pre-service teachers’ means, standard deviations, and Cohen’s d effect sizes of perceived SRL for the learner’s perspective by MSLQ component, time, and learning approach.

                        M_P (n = 45)    M_A (n = 52)    M_E (n = 47)
MSLQ component          Pre    Post     Pre    Post     Pre    Post
Cognition          M    4.2    4.8      4.1    4.4      3.9    5.2
                   SD   1.4    1.3      1.3    1.3      1.4    1.3
                   d           0.4             0.2             0.9
Metacognition      M    3.7    4.5      3.9    4.3      3.8    5.2
                   SD   1.3    1.5      1.3    1.3      1.4    1.3
                   d           0.6             0.3             1.0
Motivation         M    4.2    4.8      4.3    4.7      4.3    5.4
                   SD   1.3    1.5      1.4    1.6      1.3    1.5
                   d           0.5             0.3             0.8

Note: Scores ranged from 1 to 7 for the Motivated Strategies for Learning Questionnaire.


A post-hoc analysis according to Scheffé and Cohen’s d effect size at the end of the course indicated that learning with metacognitive support directed to the evaluation phase (M_E) was more effective for the various components of self-regulation in the teaching perspective than was learning in the other two approaches (M_P, M_A), whereas the M_A participants gained the lowest scores on the SRL components.

TPCK (comprehension and design skills)

Comprehension skills

We compared the effects of the three learning approaches (M_P, M_A, M_E) on the pre-service teachers’ development of TPCK comprehension skills. Table 4 presents the participants’ means, standard deviations, and Cohen’s d effect sizes for the TPCK comprehension skills, by time and learning group.

A MANOVA for the pretest results indicated that, before the course began, no significant differences emerged between the three learning approaches on any of the comprehension skills, F(4, 285) = 1.29, p > 0.51, η² = 0.21. The ANOVA with repeated measures (2 times × 3 approaches) on each of the components of the TPCK measure indicated a significant time effect for comprehension skills: comprehension, F(1, 141) = 42.31, p < 0.001, η² = 0.35; application, F(1, 141) = 36.12, p < 0.001, η² = 0.31; analysis, F(1, 141) = 52.13, p < 0.001, η² = 0.34; synthesis, F(1, 141) = 27.86, p < 0.001, η² = 0.31; and evaluation, F(1, 141) = 32.15, p < 0.001, η² = 0.37. Significant interaction effects also emerged between learning approaches and time of measurement for each comprehension skill: comprehension, F(2, 139) = 40.12, p < 0.001, η² = 0.32; application, F(2, 139) = 42.39, p < 0.001, η² = 0.35; analysis, F(2, 139) = 43.25, p < 0.001,

Table 3. Pre-service teachers’ means, standard deviations, and Cohen’s d effect sizes of perceived SRL for the teachers’ perspective by MAI component, time, and learning approach.

                        M_P (n = 45)    M_A (n = 52)    M_E (n = 47)
MAI component           Pre    Post     Pre    Post     Pre    Post
Planning           M    2.3    3.7      2.5    3.3      2.4    4.2
                   SD   1.5    1.6      1.2    1.3      1.2    1.4
                   d           0.9             0.7             1.5
Monitoring         M    2.1    2.9      2.3    3.1      2.2    3.9
                   SD   1.7    1.7      1.2    1.4      1.2    1.3
                   d           0.5             0.7             1.4
Evaluation         M    2.8    4.1      2.9    3.7      3.1    4.6
                   SD   1.4    1.5      1.2    1.3      1.2    1.4
                   d           0.9             0.7             1.3

Note: Scores ranged from 1 to 5 for the Metacognitive Awareness Inventory.


η² = 0.42; synthesis, F(2, 139) = 39.74, p < 0.001, η² = 0.45; and evaluation, F(2, 139) = 37.15, p < 0.001, η² = 0.31.

Post-hoc analysis according to Scheffé and Cohen’s d effect size revealed that, at the end of the study, the participants in all three learning approaches improved on the TPCK comprehension skill components. The M_E approach was more effective for the various components of TPCK comprehension skills than the other two approaches (M_P and M_A), whereas the M_P approach was more effective than the M_A approach on TPCK comprehension skills.

Design skills

We compared the differential effects of the three learning approaches (M_P, M_A, M_E) on the pre-service teachers’ development of TPCK design skills. Table 5 presents the participants’ means, standard deviations, and Cohen’s d effect sizes for the TPCK design skills, by time and learning approach.

A MANOVA for the pretest results indicated that, before the course began, no significant differences emerged between the three learning groups on any of the design skills, F(2, 192) = 3.25, p > 0.51, η² = 0.19. The ANOVA with repeated measures (2 times × 3 approaches) on each of the components of the design measure indicated a significant time effect for design skills: identifying learning

Table 4. Pre-service teachers’ means, standard deviations, and Cohen’s d effect sizes for TPCK comprehension skills by time and learning approach.

                            M_P (n = 45)    M_A (n = 52)    M_E (n = 47)
TPCK comprehension skill    Pre    Post     Pre    Post     Pre    Post
Comprehension          M    1.9    2.4      1.8    2.1      2.1    2.8
                       SD   0.5    0.6      0.4    0.5      0.5    0.6
                       d           1.0             0.8             1.5
Application            M    1.6    2.1      1.1    1.8      0.9    2.6
                       SD   0.4    0.6      0.5    0.5      0.6    0.7
                       d           1.3             1.4             2.8
Analysis               M    1.3    1.9      1.3    1.5      1.2    2.5
                       SD   0.5    0.5      0.6    0.7      0.6    0.6
                       d           1.2             0.3             2.1
Synthesis              M    1.1    1.6      0.9    1.2      1.0    2.3
                       SD   0.6    0.7      0.5    0.5      0.5    0.6
                       d           0.8             0.6             2.6
Evaluation             M    0.8    1.4      0.9    1.2      0.8    2.1
                       SD   0.4    0.4      0.5    0.6      0.5    0.7
                       d           1.5             0.6             2.6

Note: Scores ranged from 0 to 3.


objectives, F(1, 141) = 42.31, p < 0.001, η² = 0.35; selecting content, F(1, 141) = 36.12, p < 0.001, η² = 0.31; planning didactic material, F(1, 141) = 27.86, p < 0.001, η² = 0.31; and designing the learning environment, F(1, 141) = 32.15, p < 0.001, η² = 0.37. Significant interaction effects also emerged between learning approaches and time of measurement for each design skill: identifying learning objectives, F(2, 139) = 34.56, p < 0.001, η² = 0.31; selecting content, F(2, 139) = 45.36, p < 0.001, η² = 0.49; planning didactic material, F(2, 139) = 41.09, p < 0.001, η² = 0.39; and designing the learning environment, F(2, 139) = 37.19, p < 0.001, η² = 0.38.

Post-hoc analysis according to Scheffé and Cohen’s d effect size revealed that, at the end of the study, the participants in all three learning approaches improved their TPCK design skills. The M_E approach was more effective for the various components of TPCK design skills than the other two approaches (M_P and M_A), whereas the M_P approach was more effective than the M_A approach on the TPCK design skill components.

Discussion

Our findings indicated that the three approaches of metacognitive question prompts (based on the IMPROVE method) directed to each of the three learning phases (M_P, M_A, M_E) in WBLe were effective for developing pre-service teachers’ SRL and enhancing TPCK in the comprehension and design of lessons. These findings support other studies’ conclusions that automated instructional support in WBLe may act as a "more able other", prodding the students to consider issues they may

Table 5. Pre-service teachers’ means, standard deviations, and Cohen’s d effect sizes for TPCK design skills by time and learning approach.

                                        M_P (n = 45)    M_A (n = 52)    M_E (n = 47)
TPCK design skill                       Pre    Post     Pre    Post     Pre    Post
Identifying learning objectives    M    1.2    2.1      1.4    1.8      1.1    2.5
                                   SD   0.6    0.7      0.8    0.9      0.6    0.6
                                   d           1.5             0.5             2.1
Selecting content                  M    1.4    2.4      1.3    1.8      1.4    2.6
                                   SD   0.8    0.9      0.7    0.8      0.6    0.6
                                   d           1.3             0.7             2.0
Planning didactic material         M    1.7    2.8      1.9    2.4      1.6    3.1
                                   SD   0.6    0.7      0.7    0.7      0.8    0.9
                                   d           1.8             0.7             1.9
Designing the learning
  environment                      M    1.8    3.0      1.8    2.7      2.1    3.5
                                   SD   0.8    0.9      0.9    1.1      0.8    0.9
                                   d           1.5             1.0             1.8

Note: Scores ranged from 0 to 4.


not have considered otherwise (Davis, 2003; Kauffman et al., 2008; Nuckles, Hubner, & Renkl, 2009). Self-questioning can guide students’ attention to specific aspects of their learning process, thereby helping them to monitor and evaluate their learning processes (Kramarski & Michalsky, 2009a, 2009b).

As we assumed, we found that prompting pre-service teachers with reflection questions in the evaluation phase (M_E) was clearly the most effective method for developing participants’ SRL and their abilities to comprehend and design more complex lesson plans (TPCK) in WBLe. We suggest two possible reasons for the beneficial effect of the M_E approach. First, the planning approach (M_P) encouraged students to think ahead about their goals, their own understanding, their ability to make links, and the restructuring of ideas; but perhaps not all students are able to take advantage of such opportunities for integrating their knowledge. Reflection question prompts, in contrast, enable the student to look back and walk through the activities step by step and as such may help students integrate their knowledge. Second, the reflection question prompts may lessen the cognitive load on students by reminding them to evaluate the activity (Nuckles et al., 2009). Our findings support theoretical recommendations that reflection plays an important role in learning processes and constitutes an important factor in the acquisition of learning competences (e.g., Schon, 1996; Zimmerman, 2000, 2008).

Several findings need further consideration. Why was the M_P approach more beneficial for SRL and TPCK than the M_A approach? Although prompting strategy use just in time in the M_A phase is a key metacognitive strategy that supports students’ ability to apply learned knowledge to problem-solving tasks (e.g., Kramarski & Mevarech, 2003; Pol et al., 2008; Schoenfeld, 1992), being aware of employing an appropriate strategy is itself a complex cognitive task. In the context of problem solving, it requires two simultaneous, coordinated processes: one that develops a sequence of steps to solve the task and a second that monitors and evaluates the accuracy and efficiency of the problem-solving process. Analyzing discrepancies and making corrections or changing strategies adds further complexity to the self-monitoring task in the M_A approach.

Our findings are in line with other findings indicating that the use of metacognitive support before the solution process is more beneficial than its use during the process (Michalsky et al., 2009). However, there is contradictory evidence regarding metacognitive support during the action phase. Moreno (2006) suggests that, for most students, just-in-time instruction during problem solving may be more effective than instruction prior to problem solving. Pol et al. (2008) found that providing metacognitive support during the solution process is more effective than providing it after the solution process. Further research should investigate this issue among novice and expert students of different ages.

SRL – self-report questionnaires (MSLQ and MAI)

Findings on the two self-reported SRL questionnaires (MSLQ and MAI) led to complementary conclusions. These findings indicate that providing pre-service teachers with question prompts directed to one learning phase has a synergic effect on SRL as a whole, both in terms of learning (cognitive and metacognitive strategies and motivation) and in terms of teaching (planning, monitoring, and evaluation; Zimmerman, 2000).


The MSLQ self-report measure (the learner perspective) indicated that, at the end of the study, participants in the M_E approach perceived themselves as more engaged in cognition (planning and setting goals), metacognition, and evaluation at various points during the process of skill acquisition. In terms of motivational processes, these pre-service teachers reported high intrinsic interest and persistence in learning. These findings support the theoretical SRL model (Zimmerman, 2000), in which reflecting is an important component of SRL. Furthermore, the MAI self-report measure (the teacher perspective) indicated that, at the end of the study, participants in the three directed approaches reported more confidently on the planning and evaluating phases than on the monitoring phase. This finding is in line with previous conclusions that students are often "cognitively overloaded" during the learning process and have difficulty in self-observation and reflection (Michalsky et al., 2009; Zimmerman, 2000).

TPCK in WBLe

The current findings concerning TPCK comprehension skills indicated that, at the end of the study, the M_E participants outperformed the M_P and the M_A participants on all five skills, in particular on the higher order skills (synthesis and evaluation). The M_E participants were more successful in integrating their answers and TPCK teaching methods and in providing a clear justification. Furthermore, the findings on design skills indicated that participants in the M_E approach were more successful than participants in the M_P and M_A approaches in developing lesson plans that integrate technology with pedagogical content knowledge according to TPCK’s theoretical principles (Angeli & Valanides, 2009; Mishra & Koehler, 2006). Perhaps the M_E approach, which directed participants to judge and modify the solution, helped teachers to think about the content of the task, about suitable technology and teaching methods that emphasize student-centered learning, and about why these are suitable. By thinking about such considerations, teachers can become aware of possible alternatives for planning strategies, monitoring, and evaluating the solution process, which are critical aspects of SRL. These aspects might, in turn, help them integrate the multifaceted knowledge of TPCK in designing lessons. The current results coincide with the findings of TPCK researchers (Angeli & Valanides, 2009; Mishra & Koehler, 2006) that pre-service teachers must be explicitly taught about and supported in the interactions among technology knowledge and pedagogical content knowledge.

Practical implications and future research

Our study makes an important contribution to theoretical research and practical implications concerning the fostering of learning phases (planning, action and performance, and evaluation) for self-regulation in teacher technology education (TPCK). The SRL and TPCK of pre-service teachers who learn in WBLe with question prompts directed to each learning phase are a relatively new topic. Our study (which is part of an ongoing research program on developing pre-service teachers’ SRL) implemented the metacognitive IMPROVE model with question prompts embedded in WBLe for pre-service teachers, and we suggest that further studies also apply other metacognitive models in different technology environments.


We propose that future research investigate other factors that may affect SRL and TPCK, such as different types of implemented prompts and the conditions under which they appear – for example, prompts embedded in the technology versus students asking for prompts as help; mixed categories of prompts adapted to novice and expert students; and the effects of the prompts provided in the program and their connection to cognitive load. We suggest that researchers pay particular attention to different types of just-in-time instruction for fostering SRL in the action and performance phase.

Limitations of the study

We recognize several limitations inherent in the study. First, although the present research adds to the literature on SRL in learning and teaching by connecting teachers’ TPCK with SRL (MSLQ and MAI) under different question prompts, our study does not supply data about the school student outcomes obtained by the participating teachers. Future studies would do well to examine the assumption that teachers’ SRL is extremely important to their success in teaching (Perry, Phillips, & Hutchinson, 2006; Randi & Corno, 2000). Second, our SRL questionnaires provided us with complementary data referring to various SRL perspectives manifested in the learning (MSLQ) and teaching (MAI) contexts. However, these data were gathered as self-perceived reports, not in real time, at the beginning and the end of the process; thus, we cannot draw conclusions on the pattern of SRL behaviors along the course of the study. We suggest that future studies conduct a long-term follow-up (e.g., at 6 months and 12 months after the intervention), including both kinds of means of assessing SRL: as aptitudes (questionnaires) and as events (real time), such as thinking aloud, observations, log files, and forum discussions. The use of such a triangulation of measurements may shed further light on the effects of metacognitive question prompts directed to different phases on academic performance (e.g., Azevedo, 2005; Veenman, 2007; Winne & Perry, 2000). With respect to TPCK, we propose the use of delayed posttest measures for comprehending and designing tasks and observations of teaching in practice. Finally, our study implemented each approach with only one teacher. This decision could have confounded the teacher/classroom with the instructional approach.

Despite these limitations, our results support recommendations to capitalize on metacognitive question prompts in WBLe in the professional development training of pre-service teachers in order to enhance SRL and learning opportunities (e.g., Kauffman et al., 2008; Kramarski & Michalsky, 2009a, 2009b).

References

Angeli, C., & Valanides, N. (2009). Epistemological and methodological issues for the conceptualization, development, and assessment of ICT-TPCK: Advances in technological pedagogical content knowledge (TPCK). Computers & Education, 52, 154–168. doi:10.1016/j.compedu.2008.07.006

Azevedo, R. (2005). Using hypermedia as a metacognitive tool for enhancing student learning? The role of self-regulated learning. Educational Psychologist, 40, 199–209.

Azevedo, R., & Jacobson, M. (2008). Advances in scaffolding learning with hypertext and hypermedia: A summary and critical analysis. Educational Technology Research and Development, 56, 93–100.


Bannert, M., & Mengelkamp, C. (2008). Assessment of metacognitive skills by means of instruction to think aloud and reflect when prompted: Does the verbalization method affect learning? Metacognition and Learning, 3, 39–58.

Davis, E.A. (2003). Prompting middle school science students for productive reflection: Generic and directed prompts. Journal of the Learning Sciences, 12, 91–142.

Jacobson, M.J., & Archodidou, A. (2000). The design of hypermedia tools for learning: Fostering conceptual change and transfer of complex scientific knowledge. Journal of the Learning Sciences, 9, 145–199.

Jonassen, D.H. (2000). Computers as mindtools for schools: Engaging critical thinking (2nd ed.). Upper Saddle River, NJ: Prentice Hall.

Kauffman, D.F., Ge, X., Xie, K., & Chen, C.H. (2008). Prompting in web-based environments: Supporting self-monitoring and problem solving in college students. Journal of Educational Computing Research, 38, 115–137.

Kramarski, B. (2004). Making sense of graphs: Does metacognitive instruction make a difference on students’ mathematical conceptions and alternative conceptions? Learning and Instruction, 14, 593–619.

Kramarski, B. (2008). Promoting teachers’ algebraic reasoning and self-regulation with metacognitive guidance. Metacognition and Learning, 3, 83–99.

Kramarski, B., & Gutman, M. (2006). How can self-regulated learning be supported in mathematical e-learning environments? Journal of Computer Assisted Learning, 22, 24–33.

Kramarski, B., & Mevarech, Z.R. (2003). Enhancing mathematical reasoning in the classroom: Effects of cooperative learning and metacognitive training. American Educational Research Journal, 40, 281–310.

Kramarski, B., & Michalsky, T. (2008). Preparing pre-service teachers for professional education within a metacognitive computer-based learning environment. In N. Schwartz, J. Zumbach, T. Seufert, & L. Kester (Eds.), Beyond knowledge: The legacy of competence – meaningful computer-based learning environments (pp. 93–101). New York: Springer.

Kramarski, B., & Michalsky, T. (2009a). Investigating pre-service teachers’ professional growth in self-regulated learning environments. Journal of Educational Psychology, 101, 161–175.

Kramarski, B., & Michalsky, T. (2009b). Preparing preservice teachers for self-regulated learning in the context of technological pedagogical content knowledge. Learning and Instruction. doi:10.1016/j.learninstruc.2009.05.003

Kramarski, B., & Mizrachi, N. (2006). Online discussion and self-regulated learning: Effects of instructional methods on mathematical literacy. Journal of Educational Research, 99, 218–230.

Kramarski, B., & Zoldan, S. (2008). Using errors as springboards for enhancing mathematical reasoning with three metacognitive approaches. Journal of Educational Research, 102, 137–151.

Leelawong, K., Davis, J., Vye, N., Biswas, G., Schwartz, D., Belynne, K., et al. (2002, October). The effects of feedback in supporting learning by teaching in a teachable agent environment. Paper presented at the Fifth International Conference of the Learning Sciences, New York.

Mevarech, Z.R., & Kramarski, B. (1997). IMPROVE: A multidimensional method for teaching mathematics in heterogeneous classrooms. American Educational Research Journal, 34, 365–395.

Michalsky, T., Zion, M., & Mevarech, Z.R. (2007). Developing students’ metacognitive awareness in asynchronous learning networks in comparison to face-to-face discussion groups. Journal of Educational Computing Research, 36, 421–450.

Michalsky, T., Mevarech, Z.R., & Haibi, L. (2009). Elementary school children reading science texts: Effects of metacognitive instruction. The Journal of Educational Research, 102, 363–374.

Mishra, P., & Koehler, M.J. (2006). Technological pedagogical content knowledge: A framework for teacher knowledge. Teachers College Record, 108, 1017–1054.

Moreno, R. (2006). When worked examples don’t work: Is cognitive load theory at an impasse? Learning and Instruction, 16, 170–181.

National Council for Accreditation of Teacher Education. (2002). Professional standards for the accreditation of schools, colleges, and departments of education. Washington, DC: Author.


Niess, M.L. (2005). Preparing teachers to teach science and mathematics with technology:Developing a technology pedagogical content knowledge. Teaching and TeacherEducation, 21, 509–523.

Nuckles, M., Hubner, S., & Renkl, A. (2009). Enhancing self-regulated learning by writinglearning protocols. Learning and Instruction, 19, 259–271.

Perry, N.E., Phillips, L., & Hutchinson, L. (2006). Mentoring student teachers to support self-regulated learning. Elementary School Journal, 106, 237–254.

Pintrich, P.R. (2000). Multiple goals, multiple pathways: The role of goal orientation in learning and achievement. Journal of Educational Psychology, 92, 544–555.

Pintrich, P.R., Smith, D.A.F., Garcia, T., & McKeachie, W.J. (1991). A manual for the use of the Motivated Strategies for Learning Questionnaire (MSLQ). Ann Arbor, MI: University of Michigan, National Center for Research to Improve Postsecondary Teaching and Learning.

Pol, H.J., Harskamp, E.G., & Suhre, C.J.M. (2008). The effect of the timing of instructional support in a computer-supported problem-solving program for students in secondary physics education. Computers in Human Behavior, 24, 1156–1178.

Putnam, R.T., & Borko, H. (2000). What do new views of knowledge and thinking have to say about research on teacher learning? Educational Researcher, 29(1), 4–15.

Randi, J., & Corno, L. (2000). Teacher innovations in self-regulated learning. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 651–685). Orlando, FL: Academic Press.

Schoenfeld, A.H. (1992). Learning to think mathematically: Problem solving, metacognition, and sense making in mathematics. In D.A. Grouws (Ed.), Handbook of research on mathematics teaching and learning (pp. 165–197). New York: Macmillan.

Schön, D.A. (1996). Educating the reflective practitioner: Toward a new design for teaching and learning in the professions. San Francisco: Jossey-Bass.

Schraw, G., & Dennison, R.S. (1994). Assessing metacognitive awareness. Contemporary Educational Psychology, 19, 460–475.

Veenman, M.V.J. (2007). The assessment and instruction of self-regulation in computer-based environments: A discussion. Metacognition and Learning, 2, 177–183.

Winne, P.H., & Perry, N.E. (2000). Measuring self-regulated learning. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 532–566). Orlando, FL: Academic Press.

Winters, F.I., Greene, J.A., & Costich, C.M. (2008). Self-regulation of learning within computer-based learning environments: A critical analysis. Educational Psychology Review, 20, 429–444.

Zimmerman, B.J. (2000). Attaining self-regulation: A social cognitive perspective. In M. Boekaerts, P. Pintrich, & M. Zeidner (Eds.), Handbook of self-regulation (pp. 13–39). Orlando, FL: Academic Press.

Zimmerman, B.J. (2008). Investigating self-regulation and motivation: Historical background, methodological development, and future prospects. American Educational Research Journal, 45, 166–183.

Zohar, A., & Schwartzer, N. (2005). Assessing teachers’ pedagogical knowledge in the context of teaching higher order thinking. International Journal of Science Education, 27, 1595–1620.

Appendix 1. A rubric for assessing TPCK design lessons

Pre-service teachers received a full score of 4 for identifying learning objectives when their study unit design presented clear, topic-specific learning objectives, detailed the capacities that students are supposed to develop, and identified computer tools (e.g., using internet resources).

A full score (4) for selecting content referred to a study unit design that selected relevant information, experience, and computer tools and indicated the extent to which each tool could support content transformation (e.g., selecting visualization techniques to help students learn abstract concepts).

A full score (4) for planning didactic material referred to a set of materials (computer tools) for student use and justification for how these tools place the learner at the center of the learning process (e.g., selecting hypermedia for problem-solving inquiry activities).

484 B. Kramarski and T. Michalsky

A full score (4) for designing the learning environment referred to integrating three learning strategies in designing the learning environment for infusing technology in the classroom and justifying their choices (e.g., planning peer dialogue with the students during the learning, online communication, and context-sensitive feedback).

Partial answers were scored as follows: 1 when the designed lessons did not clearly indicate use of technology, 2 when the design lacked justification for a specific technology, and 3 when the design did not clearly justify such technology usage.
