Citation:

Clark, R. E., Feldon, D. F. and Jeong, S. (In Press), Fifteen Common but Questionable Principles of Multimedia Learning, In R. E. Mayer (Ed.) The Cambridge Handbook of Multimedia Learning (Chapter 3). New York: Cambridge University Press.

Fifteen Common but Questionable Principles of Multimedia Learning

Richard E. Clark Rossier School of Education

University of Southern California

David F. Feldon Department of Instructional Technology & Learning Sciences

Utah State University

Soojeong Jeong Department of Instructional Technology & Learning Sciences

Utah State University

Abstract

The chapter begins with a quick summary and research update on earlier lists of ten questionable multimedia principles (Clark & Feldon, 2005, 2014). We then add five additional principles that have gained traction in recent years. The ten previously identified questionable beliefs include the unfulfilled expectations that multimedia instruction: (1) yields more learning than live instruction or older media; (2) is more motivating than other instructional media; (3) provides animated pedagogical agents that aid learning; (4) accommodates different learning styles and so maximizes learning for more students; and also benefits learning by allowing and encouraging (5) student-managed constructivist and discovery approaches; (6) autonomy and control over the sequencing of instruction; (7) higher order thinking skills; (8) incidental learning of enriching information; (9) interactivity; and (10) authentic learning environments and activities. The more recent additions, and the focus of this discussion, are false expectations that multimedia instruction benefits learning by providing: (11) presence from virtual reality sessions, (12) virtual and remote laboratories, (13) gamification, (14) recorded lectures, and (15) brain training. We end the chapter with a suggestion for using big data and aptitude-treatment interaction research to avoid or resolve many past and future mistaken principles.


Introduction

Multimedia offers significant benefits for education when it is used to provide cost-effective access to high quality, evidence-based instruction. As this chapter is written, the world is facing a virus pandemic, and nearly all schooling at all levels has been shifted from the traditional classroom to multimedia platforms for distance education on the internet. After years of experience with multimedia, a number of mistaken or questionable beliefs about its uses have gained the status of what may seem to be principles, or "a basic truth that explains or controls how something happens or works" (Cambridge English Dictionary, n.d.). As schools, universities, and businesses move instruction and training online for the foreseeable future, we have an opportunity to avoid mistaken principles and to guide the shift to evidence-based online instruction. Some questionable principles have no impact on learning, but a few have negative effects on motivation and/or learning.

In addition to the pandemic, this third review is conducted during what has been called a "crisis" in attempts at replicating psychological studies and a dramatic increase in the use of meta-analytic techniques for summarizing studies in learning and instruction (Open Science Collaboration, 2015; Sala, Aksayli, Tatlidil, Tatsumi, Gondo, & Gobet, 2019). Because of the use of small samples, poor research designs, vague descriptions of experimental treatments, and low statistical power due to small effect sizes in published studies, attempts at validating multimedia learning studies by replicating them are failing in large numbers. It is also obvious that some journal editors are biased against even well-designed studies that result in 'no differences' and so tend not to publish them.

Most reviewers have turned to meta-analytic techniques to examine many similar replication studies in order to determine whether, on average, the replications have confirmed or refuted the original studies that have led to claims for or against a specific type of multimedia instructional approach (Sala et al., 2019; Sala, Tatlidil & Gobet, 2018). Meta-analytic reviews often exclude studies that were poorly designed and integrate the findings of adequately designed studies to overcome statistical power problems and more reliably estimate the effect size of the treatments being studied. Readers who want a clear description of meta-analytic techniques and their rationale should consult the brief and well-written articles by Turlik (2009) and Wilson (2014). As we begin this review, there are so many meta-analytic studies that researchers have begun conducting second-order meta-analyses of the meta-analyses (Schmidt & Oh, 2013), which attempt to further reduce the sampling error in existing meta-analyses and summarize the results of many meta-analytic reviews. Where possible, we have included the most accurate first- and second-order meta-analytic findings in our review.
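The inverse-variance weighting at the heart of these meta-analytic techniques can be illustrated with a short sketch. The code below is our own illustration, not drawn from any of the cited reviews, and the study data are invented; it shows how a fixed-effect meta-analysis pools per-study effect sizes so that large, precise studies count more than small, underpowered ones.

```python
# Hypothetical illustration of fixed-effect meta-analysis pooling.
# Each study i contributes an effect size d_i (e.g., a standardized
# mean difference) weighted by 1 / variance_i, so precise studies
# dominate the pooled estimate.

def pooled_effect(effects, variances):
    """Inverse-variance weighted mean effect size and its standard error."""
    if len(effects) != len(variances) or not effects:
        raise ValueError("need matching, non-empty effect/variance lists")
    weights = [1.0 / v for v in variances]
    total_w = sum(weights)
    pooled = sum(w * d for w, d in zip(weights, effects)) / total_w
    se = (1.0 / total_w) ** 0.5  # standard error of the pooled estimate
    return pooled, se

# Invented toy data: five small studies with conflicting signs.
# Pooling shows why single small studies mislead: the weighted
# average here lands near zero despite individual "positive" results.
effects = [0.40, -0.25, 0.10, -0.05, 0.02]
variances = [0.20, 0.15, 0.05, 0.04, 0.03]

d, se = pooled_effect(effects, variances)
print(round(d, 3), round(se, 3))
```

A random-effects model, the usual choice when true effects plausibly vary across studies, adds a between-study variance term to each weight, but the pooling logic is the same; second-order meta-analyses apply this aggregation once more over the pooled estimates themselves.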
Our goal is not to point out the errors in anyone's reasoning about an important and exciting educational technology. In the past, the authors have advocated some of the fifteen mistaken principles discussed here before solid evidence indicated that something would not work or would cause unexpected learning problems. We hope instead to join readers in marveling at how careful research occasionally produces counter-intuitive findings that help us all avoid mistakes and adjust practice so that it is compatible with the best available evidence. The evidence presented here is yet another indication of that old saying that research more easily and definitively informs us about what does not work than what does work.


Multimedia defined

Multimedia learning, in its most basic definition, is the process of learning from instructional materials that utilize more than one sensory channel (e.g., both visually and aurally presented information). Accordingly, any principles of multimedia learning (mistaken or otherwise) apply to a vast range of instructional formats, and research on their effectiveness must be generalizable across those contexts and formats. In its modern application, multimedia learning materials often manifest as audiovisual presentations and interactive exercises delivered via computer (e.g., animation, virtual reality, etc.) and may be used either within a classroom environment or at a distance from the school, institution, or company providing the instruction. Multimedia programs often also permit learners to interact with the computer and influence the pace, sequence and content of the presentations they receive.

Update on past mistaken principles. We begin the chapter with a brief review and update of the ten principles described in more detail in Clark and Feldon (2005, 2014). Our goal here is to give a status report on the evidence for each principle that has been published in the years since the first reports were provided. After the review and update, we describe five new mistaken principles.

Review and Update of Ten Past Mistaken Principles

1) Multimedia produces more (or less) learning than live teachers or older media such as textbooks or television

In the midst of the virus pandemic, it seems that the old mistaken principle has been reversed, because so many parents and university students have reacted negatively to the early internet translations of their classroom courses. There continues to be no credible evidence of learning benefits from any medium or combination of media, including classroom teaching or multimedia teaching, that cannot be explained by other, non-media factors such as instructional methods, different content, or mismatched assessment plans that can be presented by any medium, including teachers. For example, Triona and Klahr (2003) compared the learning of a "control of variables" approach to teaching the design of experiments with either computer-based or physical materials to 4th and 5th grade students. They reported that both multimedia and real materials were equally effective for learning. Clark (2012) has reviewed the arguments against this conclusion and argued that none are supported by plausible data from adequately designed studies.

However, there is strong positive evidence that different media can influence the number and variety of students who can access instruction and that some media are more cost-effective than others (Clark, 2012). In earlier versions of this chapter, we urged K-12 school systems and universities to analyze the cost and ease of access for online instruction, suggesting that it is these data that will identify the main benefits of the educational applications of multimedia. However, such analyses cannot ignore impacts on those who lack such access due to socioeconomic status, geography, or disability (van Dijk, 2020).


Research comparing multimedia with other media. In poorly designed experiments comparing multimedia with older forms of media (including teachers in classrooms), instructional methods that influence learning are present in one condition (often the multimedia condition) and not in a comparison condition. Examples of instructional methods are process simulations (a demonstration of how something works), practice exercises, and formative feedback. When, for example, a demonstration of the problem-solving strategy being learned is available in a multimedia presentation but not in a comparison presentation, learning differences are likely to be due to the presence of the worked example rather than its presentation in multimedia (see Clark, 2009 for a discussion of instructional methods). It is also sometimes the case that the information content that must be learned in order to pass learning tests is available in the multimedia condition but not in a comparison medium. Clark (2012) describes many of these experiments and the confounding that results. A meta-analysis of 232 adequately designed studies that compared the effects of classroom and multimedia instruction on college student achievement, attitude, and retention of learning found "effect sizes of essentially zero" (Bernard et al., 2004). Thus, nothing has changed since the reviews by Clark (1983), Bernard et al. (2004), and Clark and Feldon (2005, 2014) to modify our claim that when multimedia is used for instruction, there is no credible evidence in thousands of comparison studies that the choice of medium influences learning. It seems, however, that we might need more studies highlighting the conditions where multimedia can greatly increase student access at a lower cost than other alternatives, or where multimedia might be more motivating or engaging to students. This suggestion leads us to our next principle.

2) Multimedia instruction is more motivating than other instructional media

We also suggested that all claims for the motivating properties of multimedia were mistaken principles. We stated, "The best conclusion at this point is that overall, multimedia courses may be more attractive to students and so they tend to choose them when offered options (yet)… student interest does not result in more learning" (Clark & Feldon, 2005, p. 99). Clark (1982) reviewed 20 studies in which students were allowed to choose between more and less structured versions of the same courses and found that posttest learning scores were significantly lower than pretest scores, regardless of the version chosen. He found indications that students with lower prior knowledge were choosing courses that were discovery oriented and so provided minimal structure or guidance. Students with higher prior knowledge were choosing courses with more guidance, perhaps believing that the added guidance would allow them to finish quickly. Each group had made exactly the wrong decision based on its prior knowledge. High prior knowledge students experienced what Kalyuga (2007) described as the "expertise reversal effect" (see Principle 5 below for a description; see also Chapter 13), and their learning was damaged by attempting to replace existing, correct knowledge with a different approach. Novice students were unprepared to handle discovery learning successfully and were unable to self-structure their learning effectively because they lacked the needed prior knowledge (Clark, Kirschner & Sweller, 2012).

Subsequent reviews of the research on motivation (for example, Clark, Howard & Early, 2006; Clark & Saxberg, 2018, 2019; Schunk, Pintrich & Meece, 2008) have provided improved motivational models and added evidence for this claim. Clark,
Howard, and Early (2006) speculated that learners who are able to choose course formats that attract them do so because they assume learning will require less effort, and so do not work as hard to learn (e.g., Salomon, 1984). While Sung and Mayer (2013) found no learning benefits from using different media for instruction, they reported that students preferred individual multimedia platforms, and they interpreted this preference as a motivational benefit. We suggest caution about assuming that students' self-reported preferences indicate either motivated behavior or learning benefits. We therefore see no reason to modify our conclusion that the belief that multimedia instruction is more motivating for students than other instructional media is a mistaken principle.

3) Multimedia instruction provides animated pedagogical agents that aid learning

Multimedia technology allows for the use of animated figures that can interact with learners during instruction. These "cartoon" figures continue to be advocated as one of the benefits of multimedia (e.g., Kramer & Bente, 2010; Kim, Thayne, & Wei, 2017). It has been proposed that animated agents personalize the instructional experience and serve as beneficial pedagogical guides for learners by interacting with them and, for example, directing attention to key concepts, answering questions, giving hints, or providing feedback.

Research on animated multimedia agents. Since our first reviews of animated agents were published, studies of animated agents appear to have increased, and yet their efficacy for promoting learning or motivation to learn is still in doubt. While it is possible to locate well-designed studies that contain positive findings, such as Wang, Li, Mayer, and Liu (2018), one also finds other well-designed studies, like Domagk (2010), which find no benefits.

Schroeder and Gotch (2015) attempted to summarize the reasons for conflicts between studies with very different outcomes that were asking apparently similar questions. They concluded that, "Pedagogical agents have been researched for nearly two decades, yet the effectiveness of including an agent in a learning environment remains debatable" (p. 183). Their conclusion matched similar past attempts to make sense of this research by, for example, Kramer and Bente (2010), who concluded that beneficial effects on learning from agents had not been demonstrated in studies. On the other hand, a meta-analysis by Davis (2018) reached the opposite conclusion: "Overall, the findings revealed that PA (pedagogical agent) gestures are beneficial for student learning and perception in multimedia learning environments" (p. 193).

In the face of such a large body of conflicting studies, it seems reasonable to urge caution about the use of pedagogical agents. It appears that under some conditions, some versions of voice-only agents provide a small (ten to fifteen percent) increase in the learning of specific items being taught (Davis, 2018) over non-agent alternatives. Yet other studies have also found that learning can be damaged by agents who distract learners, increase their cognitive load (Chen, 2012), and lead to less learning than no-agent alternatives (Davis, 2018). Clark and Choi (2005) suggested design principles for research in this area that, if followed, might lead to clearer outcomes. It is troubling that the small positive effect sizes for learning (retention) and transfer of
learning are due in part to the negative impact of the cognitive load that some pedagogical agents impose on learners. However, it is promising that recent research has begun to target specific features of pedagogical agents, which can lead to evidence regarding what may help or hinder learning (e.g., Park, 2015; Yung & Paas, 2015; Wang et al., 2018). Within this context, it is important that the specific features be tested against alternatives that offer the same functionality without the use of an agent. Most often, these features are compared against the absence of the function provided by the agent (e.g., Yung & Paas, 2015), which hinders the ability to assess the value of the agent itself. As in other areas of multimedia learning research, more studies need to compare the functions performed by pedagogical agents to non-agent-based delivery of the same functions to determine whether agents are simply another medium for enacting instructional methods, or whether they add some additional value that can be capitalized upon (cf. Clark, 1983, 2012).

4) Multimedia instruction accommodates different learning styles and so maximizes learning for more students

James and Gardner (1995) define learning styles as "… an individual's natural or habitual pattern of acquiring and processing information in learning situations. A core concept is that individuals differ in how they learn" (p. 19). Many varieties of learning styles have been proposed, including visual, verbal, and kinesthetic or tactile learners; convergent, divergent, assimilating, and accommodating learners; and field dependent or field independent learners; and a number of learning styles inventories have been developed, such as the one offered by Dunn and Dunn (1978).

Research on learning styles and multimedia. Since the first two reviews in past handbooks (Clark & Feldon, 2005, 2014), the number of negative reviews of learning styles has increased (e.g., Husmann & O'Loughlin, 2018; Kirschner, 2017; Nancekivell, Shah & Gelman, 2020; Newton & Miah, 2017; Pashler, McDaniel, Rohrer, & Bjork, 2009), and most have reflected views similar to those advanced by Dembo and Howard (2007), who concluded "… there is no benefit to matching instruction to preferred learning style, and there is no evidence that understanding one's learning style improves learning and its related outcomes" (p. 107). A very careful and complete review by Pashler and colleagues (2009, p. 105) was even more devastating to learning styles research when they concluded:

Although the literature on learning styles is enormous, very few studies have even used an experimental methodology capable of testing the validity of learning styles applied to education. Moreover, of those that did use an appropriate method, several found results that flatly contradict the popular meshing hypothesis … We conclude therefore, that at present, there is no adequate evidence base to justify incorporating learning styles assessments into general educational practice … Thus, limited education resources would better be devoted to adopting other educational practices that have a strong evidence base, of which there are an increasing number.

Despite three decades of negative information about the utility of learning style measures to support student learning and motivation, it appears that their use at all levels of education continues. For example, Newton and Miah (2017) surveyed UK higher education faculty in a wide variety of disciplines and learned that a third reported using learning styles, but over two-thirds believed that people learn better if styles are considered. Nancekivell, Shah, and Gelman (2020) surveyed an even larger group of educators at all levels, as well as non-educators, and also found that approximately two-thirds of both groups believed in the utility of learning styles. Based on these and other reviews, we see no reason to modify our earlier claims that it is mistaken to expect multimedia instruction accommodating different learning styles to maximize learning for more students. While individual differences such as prior knowledge have been found to interact with different instructional methods, learning styles do not aid learning, despite persistent claims to the contrary. Indeed, it might be suggested that the continuing belief in the utility of learning styles by educators, despite decades of contrary evidence, is a distressing indicator of the lack of evidence-based practice in education and in university-based teacher education programs.

5) Multimedia instruction facilitates student-managed constructivist and discovery approaches that are beneficial to learning

Since multimedia permits students to have control over the sequencing and content of instruction, constructivist-learning advocates (e.g., Savery & Duffy, 2001) have encouraged its use to support discovery learning. In Clark and Feldon (2005) we claimed that it was not beneficial for student learning if multimedia instruction provides constructivist or discovery instruction to students. After 2005, there were many debates about this issue (Kirschner, Sweller & Clark, 2006; Tobias & Duffy, 2009). As a result of those debates, in Clark and Feldon (2014) we modified our earlier claim and stated:

Discovery-based multimedia programs seem to benefit experts or students with higher levels of prior knowledge about the topic being learned. Students with novice to intermediate levels of prior knowledge learn best from fully guided instruction. Prior knowledge is therefore an individual difference that leads to learning benefits from more guidance at low to moderate levels but not at higher levels, regardless of the media used to deliver instruction (p. 159).

The reason for this change was clear evidence from a number of studies that discovery learning was not only best for experts, but that when experts received guided learning approaches, they did worse than in discovery conditions – a situation termed the "expertise reversal effect" by Kalyuga (2007). Khacharem, Zoudji, and Kalyuga (2015) updated the finding and related it to dynamic visual presentations on multimedia. We find no reason to change the conclusion in Clark and Feldon (2014).

6) Multimedia instruction provides students with autonomy and control over the sequencing of instruction

Student control of elements of instruction is another contentious issue in educational research (Merrill, 2006; see also Chapter 35). A number of prominent multimedia advocates have argued that the decision to provide linear, autocratic, and controlling instruction has demotivated students and ignored the constructivist nature of learning (e.g., Duffy & Jonassen, 1992; Shepherd, 2003). The solution they offer is to provide students with multimedia that permits and encourages learner control of content, sequencing, and pacing. When sequencing is under student control, learners may skip or revisit topics, practice exercises, examples, and demonstrations. In essence, control of sequence permits learners to determine the order and, to some extent, the content of their instruction. This claim has been controversial and has benefitted from a clearer description of learner control possibilities (Landers & Reddick, 2017) and more comprehensive meta-analytic reviews of research (Rey, Beege, Nebel, Wirzberger, Schmitt & Schneider, 2019).

Research on sequencing control. In our previous review, we stated that there "appears to be solid evidence that student control of the pacing of instruction is beneficial for learning (e.g. Mayer & Chandler, 2001). Pacing control permits students to stop, pause, or slow down multimedia or other instructional presentations, presumably so that they have the opportunity to elaborate and remember what they have seen and/or heard before continuing" (Clark & Feldon, 2014, p. 157). This recommendation continues to be supported by more recent studies (Rey et al., 2019; see also Chapter 19).

Many other kinds of learner control have appeared in more recent studies. Landers and Reddick (2017) provide a comprehensive list of learner control types, including pacing, segmenting (forced pauses inserted in a multimedia lesson), skipping (avoiding segments of instruction), supplementing (navigating to additional content), sequencing (changing the order of lessons or lesson content), practice (skipping practice exercises), guidance (choosing to avoid suggestions after practice exercises), and scheduling (changing the duration or location of instruction).

A recent meta-analysis of learner control with multimedia technology by Karich, Burns, and Maki (2014) computed 29 outcomes across 18 studies and concluded that the effects were nearly zero. Another recent meta-analysis of pacing and segmenting research in multimedia instruction (Rey et al., 2019) reached the conclusion that:

[M]ultimedia instruction should be presented in (meaningful and coherent) learner-paced segments, rather than as continuous units, to improve learning performance and reduce the learners’ overall cognitive load. First, instructional designers should facilitate chunking and structuring due to segmenting the multimedia instruction. Second, learners should have enough time to process the multimedia instruction. Third, they should be given the possibility to adapt the presentation pace to their individual needs. Furthermore, especially learners with high rather than no or low prior domain knowledge should receive segmented learner-paced multimedia instructions rather than unsegmented system-paced instructions (p. 414).

Thus, evidence-based advice continues to encourage student pacing of instruction, while more recently suggesting that forced pauses or segments help higher prior knowledge students without hurting those with lower prior knowledge. Other forms of student control are not supported by clear evidence.

7) Multimedia instruction allows students the opportunity to practice critical and higher order thinking

The interest in teaching critical or higher order thinking (HOT) skills has not diminished since the first version of this chapter over fifteen years ago (Clark & Feldon, 2005). Yet as this chapter goes to press, there appears to be no compelling evidence supporting the expectations of, for example, Fontana, Dede, White, and Cates (1993), Scheibe and Rogow (2012), and Stoney and Oliver (1999) that multimedia is any more effective than other types of media at teaching HOT skills.

Higher order thinking defined. Facione (1990) described the critical thinker as "…habitually inquisitive, well-informed, trustful of reason, open-minded, flexible, fair-minded in evaluation, honest in facing personal biases, prudent in making judgments, willing to reconsider … and persistent in seeking results which are as precise as the subject and the circumstances of inquiry permit" (p. 3). Studies in this area generally compare multimedia-provided HOT skills with similar skills taught by different media. For example, Bagarukayo, Weide, Mbarika, and Kim (2012) taught over 200 undergraduate students similar HOT skills such as problem-solving, analysis, synthesis, and interpretation using multimedia and textbook media. They found no evidence of differences in learning of any HOT skills due to the medium used for teaching. Other studies have explored the impact of learning problem-solving skills in specific areas such as mathematics on general problem-solving skills (e.g., Elfeky, 2017). In general, these studies find success at learning the mathematics that was taught (often called "near transfer" learning) but no success at generalizing those skills to other disciplines (called "far transfer" learning). The goal of HOT research is to find ways to train higher order thinking skills so that they generalize to all disciplines (i.e., far transfer).

Research on higher order thinking. In a meta-analysis of over 100 studies involving more than 20,000 students, Abrami and colleagues (2008) provide an excellent discussion of critical and higher order thinking skill instruction. They found no advantages for multimedia and concluded that critical thinking tends to result when pedagogy involves specific learning objectives focused on clearly defined thinking skills within a specific knowledge domain, along with effective demonstration and practice of the skills being taught by well-trained instructors.

The HOT question has generated so many studies that there are now many new meta-analyses available, but they do not all agree. A recent second-order meta-analysis by Sala et al. (2019) attempts to resolve this disagreement with a meta-analysis of the meta-analytic reports on HOT studies. They concluded that "when placebo effects and publication bias were controlled for, the overall effect size and true variance equaled zero. That is, no impact on far-transfer measures was observed regardless of the type of population and cognitive-training program" (p. 18).

8) Multimedia instruction encourages incidental learning of enriching information

Incidental learning has been defined as “learning without awareness of what is being learned” (Dekeyser, 2003, p. 314) and as “automatic, associative, nonconscious and unintentional” (Kaufman et al., 2010, p. 321). It’s obvious that children learn a great deal about language and culturally acceptable behavior from observing the world around them rather than from intentional instruction by parents or others. This fact leads some to expect that all knowledge can be obtained unintentionally if only we experience rich and diverse information in multimedia and other places. However, the research on this topic since our last review gives us no reason to change our view that Sweller’s (2008) argument remains the best account of the role of incidental learning in our lives, when he explains that:

[W]hile evolution has prepared us to learn walking and talking (he calls this “primary learning”), it has not prepared us to learn mathematics, physics, history or other complex problems (called “secondary learning”). Primary learning of walking and talking is to some extent “incidental,” but all other kinds of learning are likely to be secondary learning, and therefore require guided instruction. Neither primary nor secondary learning require multimedia, however (Clark & Feldon, 2014, p. 163).

Explanations of the incidental learning evidence. The long history of studies in this area indicates that incidental learning is not a function of media or multimedia but is instead fostered by a lack of learning objectives and/or by attention-directing devices, such as hyperlinks, related to unintended outcomes. Where attention-directing devices and hyperlinks are inserted in instructional multimedia programs, it seems reasonable to expect learning related to the information they access. Since these devices must be intentionally inserted in a multimedia program, it seems unlikely that the learning that results from them qualifies as incidental.

9) Multimedia instruction promotes interactivity

When computers first became widely available, one of their key selling points for instruction was the expectation that they would promote collaborative interactivity between students and more active interactions with instruction. Accordingly, they were expected to foster more learning. This expectation remains one of the most important for multimedia advocates. In the fifteen years since we began these reviews, the issue of interactivity has persisted despite the lack of a clear, operational definition or any agreement about its benefits.

Interactivity defined. Essentially, multimedia interactivity involves collaborative interactions between learners using a multimedia (computer-based) instructional program. This active interaction on the part of students is presumed to increase both their engagement and their learning. It appears that the intent of many who advocate interactivity is to avoid learner passivity and to promote more collaborative engagement by learners, both physically (manipulating the computer and related control equipment) and cognitively (using more active processing to learn as a result of the interaction).

Research on interactivity.
Scholars who have tackled the issue of interactivity (see for example Domagk, Schwarz & Plass, 2010; Koedinger & Aleven, 2007; see also Chapter 25) have complained about the lack of specificity and rigor in definitions of interactivity. Most of the research on this topic seems to avoid a key question: “what kind of interaction is required to increase learning?” Some interactions have a positive impact on learning and other types are neutral or even damaging to learning (Clark, 1989; Domagk, Schwarz & Plass, 2010; Koedinger & Aleven, 2007).


Borokhovski, Bernard, Tamim, Schmid, and Sokolovskaya (2016) conducted a meta-analysis of well-designed studies examining the impact of planned and unplanned collaborative student interaction in a variety of multimedia courses. They found that only interactions deliberately planned into the course design led to more student collaboration with each other or with the instruction during learning; multimedia courses did not enhance interaction that was left unplanned. Bernard et al. (2009) conducted a meta-analysis of 74 studies that compared different kinds of interactions between students and course content in the same courses offered in a classroom and online using multimedia. They found no learning benefit for either multimedia or classroom delivery attributable to student-student, student-teacher, or student-content interactions. They did, however, find a sizeable benefit for all three kinds of interaction when they were present in either multimedia or classroom contexts, and planned student interactions appear to be even more important for learning when online instruction is conducted via asynchronous media and students cannot interact with teachers or each other in real time (Bernard et al., 2009).

Thus, we conclude that while different kinds of interactivity may promote cognitive engagement, and so learning, interactivity is no more effective in multimedia than in other learning contexts. While multimedia may be able to present most or all of the productive forms of interactivity, other media can also facilitate all necessary interactivity. Planned student interactivity is beneficial to learning, but it is not an exclusive benefit of multimedia.

10) Multimedia instruction permits students to experience an authentic learning environment and activities

Authentic learning environments are defined as contexts or settings for instruction that reflect the critical features of the environments where learning is expected to be transferred and applied (Herrington & Kervin, 2007). Thus, when teacher trainees learn classroom management strategies in school-like settings interacting with actors playing students, or when engineers learn their jobs in simulations that duplicate work settings, they are experiencing authentic learning environments. Those who advocate authentic environments for instruction expect that transfer of learning will be enhanced when learning and application environments are similar (Herrington, Reeves & Oliver, 2014).

Research on authentic learning environments. In the past five years there has been very little research on authentic environments. We suspect that the difficulty of specifying the elements necessary for authenticity has deterred researchers. Further, controlled studies contrasting simulation-based instruction with traditional classroom instruction do not often find significant differences when instructional features are equivalent. For example, Gulikers, Bastiaens, and Martens (2005) found no differences in motivation, learning, or transfer between business students learning within a simulated business environment and those in a classroom instructional environment. Given the lack of positive evidence in the research literature, the claims made for a unique capacity of multimedia to provide learning benefits in the form of authentic application environments are likely mistaken.


The discussion turns next to a review of five relatively new principles that have received attention in the past few years.

Review of Five New Mistaken Principles

11) A sense of “presence” in virtual multimedia training environments enhances learning outcomes.

As the technology supporting virtual learning environments has continued to evolve, it has become possible to generate ever more realistic visual representations of objects and events. Further, technologies can facilitate real-time interactions between individuals or between individuals and virtual agents intended to offer human-like communication. Enthusiasm for these technological advances has fueled beliefs that the immersive experiences they offer hold inherent advantages over training systems without such realism. The subjective sense of actually being present in the simulated space can reflect a combination of personal presence (i.e., the sense of being physically present in the simulated space due to 3D images, sound, photorealism, tactile feedback, and so on), social presence (i.e., the extent to which other people seem to exist and react realistically to the user within the virtual space), and environmental presence (i.e., the extent to which the simulated physical environment responds realistically to the user’s actions within the virtual space) (Heeter, 1992).

In support of this premise, many studies have identified correlations between presence and factors associated with successful learning, such as motivation, emotion, and engagement (Allcoat & von Mühlenen, 2018). However, studies comparing learning outcomes in high- and low-presence training conditions generally find no significant differences in learning gains (e.g., Dengel & Mägdefrau, 2019; Krassmann et al., 2020; Makransky, Borre-Gude, & Mayer, 2019). Indeed, Schrader and Bastiaens (2012) found that higher levels of presence were associated with poorer learning outcomes, potentially as a result of the high extraneous cognitive load invoked by high-presence features.

When studies do report significant differences, features of the training conditions tend to differ in multiple ways, preventing presence from being the sole viable explanation. For example, Makransky and colleagues (2019) found that training on laboratory tasks in a 3D virtual reality context was associated with better physical performance in the lab when compared with a text-based learning condition. However, it is not possible to attribute the results to the enhanced presence of a 3D environment, because training for a physical task that includes the opportunity to observe and practice physical manipulations cannot be considered equivalent to one in which participants could only read written descriptions. Indeed, performance on the written posttest in that study (for which opportunities to learn would have been equivalent) did not differ across conditions. Likewise, Allcoat and von Mühlenen (2018) reported improved recall for a higher-presence 3D training environment with interactive learning features when compared to a passively consumed instructional video or text. Here too, it is impossible to determine that the difference is attributable to the sense of presence provided by the 3D virtual reality material rather than to the availability of interactive engagement with the content to be learned.


Similarly, research on social presence is limited by the design of most studies. One recent meta-analysis observed that while social presence was a consistent predictor of self-reported satisfaction and self-reported learning, there were few studies that reported direct assessments of learning gains (Richardson, Maeda, Lv, & Caskurlu, 2017). Those studies that did use direct assessments of learning outcomes found null or negative effects (e.g., Homer, Plass, & Blake, 2008; Picciano, 2002; Wise, Chang, Duffy, & Del Valle, 2004). Likewise, Parong and Mayer (2020) found that an immersive virtual reality training system intended to invoke a high level of presence yielded significantly worse learning outcomes and transfer and higher levels of cognitive load relative to a desktop monitor presenting the relevant information as a slide show (see also Chapter 42).

12) Virtual and remote laboratories yield better learning outcomes than physical laboratories

With advancements in multimedia technologies, virtual and remote laboratories (VRLs) have become increasingly appealing alternatives to physical laboratories in science and engineering education. The advocacy of VRL use is based on the belief that they offer a variety of advantages over physical laboratories and therefore produce better learning outcomes. For instance, VRLs allow students to perform multiple experiments within a short period of time, which may strengthen corresponding conceptual knowledge and procedural skills. In particular, virtual labs afford students opportunities to experience unobservable phenomena (e.g., chemical reactions) that cannot be carried out through physical experiments. In contrast, advocates of physical labs emphasize that tactile information obtained during physical manipulation is an essential component to develop conceptual understanding and inquiry skills. In addition, in physical labs, students often experience unexpected errors in measurement and design, which may help them interpret their experimental results in different ways and thereby deepen their understanding of the experiments.

Types of laboratories defined. According to Ma and Nickerson (2006, pp. 5-6), physical laboratories are defined as those that “involve a physically real investigation process” in which “(1) all the equipment required to perform the laboratory is physically set up and (2) the students who perform the laboratory are physically present in the lab.” They then define virtual laboratories as “imitations of real experiments” in which “all the infrastructure required for laboratories is not real but simulated on computers.” Furthermore, remote laboratories are “characterized by mediated reality,” in that “experimenters obtain data by controlling geographically detached equipment.” In other words, students remotely control physical equipment and obtain data from real experiments.

Research on the comparisons between VRLs and physical laboratories. Despite the abundance of research regarding the educational benefits of VRLs and physical laboratories, only a few attempts have been made to consolidate the findings of such research in a systematic way (e.g., Brinson, 2015; Ma & Nickerson, 2006; Post, Guo, Saab, & Admiraal, 2019). And yet, even these reviews have remained descriptive rather than analytic. For instance, Ma and Nickerson (2006) reviewed 60 studies published before 2006 (20 for each lab type) and concluded that it was impossible to determine the superiority of any lab type due to large variations in study focus. For example, studies on physical labs focused on measuring design skills, while studies of VRLs relied more on measures of conceptual understanding. Following this review, Brinson (2015) analyzed articles published after 2006 but included only 56 experimental studies comparing learning outcomes, such as content knowledge, inquiry skills, and perceptions, between VRL and physical lab conditions. His findings revealed that, overall, VRLs yielded equal or better learning outcomes than physical labs. This review, however, did not report whether the selected studies met sufficient standards of methodological rigor, such as adequate sample sizes, the presence of both pre- and posttests, and control of confounding variables, all of which may diminish the validity of the findings. Indeed, a recent review by Post et al. (2019) combining the findings of 23 investigations of the effects of remote labs found that research on this topic typically suffers from methodological weaknesses; only four of the reviewed studies used a pre-posttest control group design, and one of those four had a sample size that was much too small. In contrast, a number of well-designed studies (e.g., Triona & Klahr, 2003; Wiesner & Lan, 2004; Zacharia & Constantinou, 2008) have consistently confirmed that there is no difference in conceptual understanding between virtual and physical labs when instructional content and other potential factors are sufficiently controlled.
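The pre-posttest control group design treated here as the methodological benchmark can be sketched as a standardized difference in gain scores between conditions. The helper below is a hypothetical illustration (the function name and scores are invented for the example; real studies would also check baseline equivalence and report uncertainty):

```python
from statistics import mean, stdev

def gain_effect_size(treat_pre, treat_post, ctrl_pre, ctrl_post):
    """Cohen's d computed on pre-to-post gain scores: the mean gain of a
    treatment group (e.g., a virtual lab) minus the mean gain of a control
    group (e.g., a physical lab), divided by the pooled SD of the gains."""
    t_gain = [post - pre for pre, post in zip(treat_pre, treat_post)]
    c_gain = [post - pre for pre, post in zip(ctrl_pre, ctrl_post)]
    n_t, n_c = len(t_gain), len(c_gain)
    var_t, var_c = stdev(t_gain) ** 2, stdev(c_gain) ** 2
    pooled_sd = (((n_t - 1) * var_t + (n_c - 1) * var_c) / (n_t + n_c - 2)) ** 0.5
    return (mean(t_gain) - mean(c_gain)) / pooled_sd

# Hypothetical exam scores (0-100) before and after each lab format:
d = gain_effect_size([52, 60, 55, 63], [63, 72, 65, 73],   # virtual-lab group
                     [50, 58, 57, 61], [61, 68, 68, 71])   # physical-lab group
print(round(d, 2))  # → 0.32
```

A value near zero on such a comparison is the pattern the well-designed studies cited above report: no detectable difference between lab types once instructional content and other factors are controlled.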

Given the insufficient empirical evidence and lack of methodical rigor, we therefore conclude that the belief that virtual and remote laboratories offered by multimedia technologies yield better learning gains than physical laboratories is based on a questionable principle.

13) Multimedia instruction provides gamification that promotes motivation and learning

Implementing gamification has become increasingly popular in recent years (Sailer & Homner, 2020). Multimedia enables gamification approaches, which can provide learners with game-like activities and experiences during learning. The widespread use of gamification stems from the premise that it elicits targeted learning-related behaviors that engage and motivate learners and thereby improves their learning outcomes.

Gamification defined. Deterding et al. (2011) proposed a general and now widely accepted definition of gamification as “the use of game design elements in nongame contexts” (p. 5). Applied to the context of learning, gamification refers to “the introduction of game design elements and gameful experiences in the design of learning processes” (Dichev & Dicheva, 2017, p. 2) or “a design process of adding game elements in order to change existing learning processes” (Sailer & Homner, 2020, p. 78). These definitions share the notion that learning with gamification differs from game-based learning: while the former focuses on the implementation of certain elements of games (e.g., points, badges, leaderboards, progress bars), the latter relies on fully fledged games (e.g., serious games) (Deterding et al., 2011; Sailer & Homner, 2020).

Page 16: Citation: Clark, R. E., Feldon, D. F. and Jeong, S. (In ... · Multimedia defined Multimedia learning, in its most basic definition, is the process of learning from instructional

Page 16 of 30

Research on gamification in education. Research on the effects of gamification in learning contexts has proliferated rapidly over the last decade. Following this trend, substantial effort has also been made to combine and synthesize the empirical findings of this research (e.g., Dichev & Dicheva, 2017; Dicheva et al., 2015; Faiella & Ricciardi, 2015; Sailer & Homner, 2020; Seaborn & Fels, 2015). These reviews indicate that, overall, there is a dearth of solid evidence supporting the effectiveness of gamification in education. For instance, Dichev and Dicheva (2017) reviewed approximately 40 empirical studies investigating the impact of gamification on motivational processes and learning outcomes. The authors concluded that “there is still insufficient evidence that it [gamification] (1) produces reliable, valid and long-lasting educational outcomes, or (2) does so better than traditional educational models. There is still insufficient empirical work that investigates the educational potential of gamification in a rigorous manner” (p. 21). In contrast, a recent meta-analysis by Sailer and Homner (2020) revealed some positive evidence. Analyzing the findings of about 40 published articles, the authors found significant, small effects of gamification on conceptual understanding (g = .49), motivational beliefs and attitudes (g = .36), and skill performance (g = .25). However, their subsequent analyses including only studies of high methodological quality (e.g., controlled experiments, quasi-experimental studies with both pre- and posttests) found that while gamification still had a significant, small effect on conceptual understanding (g = .42, an average increase of about 16 percent), the positive effects no longer held for either motivational outcomes or skill performance.
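To interpret standardized effects of this size in practical terms, one common translation (Cohen's U3) expresses g as the number of percentile points by which the average treated learner exceeds the control-group median, assuming normally distributed outcomes. A minimal sketch (the function names are ours, not from the cited meta-analysis):

```python
from math import erf, sqrt

def norm_cdf(x):
    """Standard normal CDF, computed via the error function."""
    return 0.5 * (1.0 + erf(x / sqrt(2.0)))

def percentile_gain(g):
    """Percentage points above the control-group median scored by the
    average treated learner, for standardized mean difference g."""
    return 100.0 * (norm_cdf(g) - 0.5)

for g in (0.49, 0.42, 0.36, 0.25):
    print(f"g = {g}: about {percentile_gain(g):.0f} percentile points")
```

Under this reading, g = .42 corresponds to roughly 16 percentile points, consistent with the figure quoted above; note that the translation assumes normality and says nothing about whether an effect persists over time.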

Given the lack of methodological soundness and definitive results, we therefore conclude that the belief that gamification offered by multimedia promotes motivation and learning is based on a questionable principle.

14) Recorded lectures lead to equivalent or better learning outcomes than live lectures.

Online courses have increasingly become a prominent segment of higher education. Colleges and universities have invested heavily in infrastructure and technologies to facilitate and expand their online programs and courses. Many institutions are also offering the same courses in both face-to-face and online formats so students can enroll in whichever they prefer.

Video- or audio-recorded lectures are often a primary component of an online course. Further, the main difference between face-to-face and online offerings within the same course is often whether the instructor’s lectures are delivered live in class or recorded and posted online. Other components of the course, such as the syllabus, textbooks, assignments, and exams, tend to remain identical. This is especially the case in recent years, in that most course activities other than lectures occur through learning management systems (LMS) even in face-to-face courses.

The premise underlying the increased prevalence of recorded lectures is that they provide at least equivalent learning outcomes for students, or may lead to even better outcomes by accommodating the diverse needs of students under different learning circumstances, such as different pacing, schedules, or locations. Unfortunately, such beliefs do not seem to be sufficiently supported by the available empirical evidence. Although few random-assignment experiments have been conducted on this issue, the findings from such studies consistently suggest that online courses yield null or negative effects on learning outcomes compared to face-to-face courses (e.g., Alpert, Couch, & Harmon, 2016; Bowen et al., 2014; Figlio, Rush, & Yin, 2013). For instance, Figlio and colleagues (2013) randomly assigned 312 students to either online or face-to-face lectures from the same course. Other aspects of the course, such as the instructor, access to help from teaching assistants, online course materials, assignments, and exams, were consistent across the two groups. Thus, the primary difference between the two groups was whether the lectures were recorded or live. Their findings revealed that students in the face-to-face section earned better scores on both the midterm and final exams than those in the online section. Similarly, a recent large-scale study by Bettinger and colleagues (2017) including approximately 230,000 students in 750 different courses reported negative outcomes from online course-taking. Their results showed that students attending lectures online had higher dropout rates and lower grades than students taking the same courses in person. Notably, such negative effects were larger for students with lower prior grade point averages (GPAs).

Recorded lectures and flipped classrooms. As the flipped classroom concept becomes more popular, the replacement of live lectures with prerecorded lectures is becoming more frequent in face-to-face courses. The essential notion of the flipped classroom approach is that students master course content before class through recorded lectures or other relevant resources in order to engage in collaborative activities and problem solving during in-person class time. However, reliable evidence supporting the advantages of flipped classrooms over traditional classrooms remains sparse, largely due to a lack of experimental studies with appropriate methodologies such as random assignment to conditions (e.g., Gillette et al., 2018; Cheng, Ritzhaupt, & Antonenko, 2019). It should also be noted that the prominent difference between flipped and traditional classroom approaches is not merely whether lectures are prerecorded or live, but whether students engage in constructive and collaborative activities during in-person class time traditionally allocated to listening to lectures. As such, any findings regarding the effectiveness of flipped classroom instruction should not be attributed exclusively to the use of recorded lectures.

Although research on this issue is still growing and more empirical work is needed, the current literature offers limited support for the benefits of recorded lectures as a substitute for live lectures. Thus, we conclude that the belief that recorded lectures result in equivalent or better learning outcomes than live lectures is based on a questionable principle.

15) Engaging in “brain training” and video games can enhance general mental abilities, such as intelligence, memory, and attention.

Over the past 20 years, various individuals and corporations have advocated the use of computer-based games that include puzzles, rapid attention shifting, and recall tasks as a viable strategy for enhancing general cognitive function (Simons et al., 2016; see also Chapter 40). Examples include Nintendo’s Brain Age and SharpBrains’s Cogmed. These brain training games purportedly improve mental processing speed, intelligence, working memory span, and other general abilities associated with success on cognitive tasks. The evidence supporting these claims, however, has been hotly contested: one recent meta-analysis (Aksayli, Sala, & Gobet, 2019) reported identifying over 2,700 scholarly publications addressing related issues, yet when those publications were screened to include only those reporting empirical results from controlled studies with both pre- and posttests, only 50 remained. Further, such meta-analyses have disagreed about the aggregate results of empirical studies (Makin, 2016).

The appeal of brain training games as a strategy for enhancing cognitive performance is intuitive. Being smarter is culturally desirable and often (and problematically) associated with material benefits including academic achievement, professional achievement, and wealth (Sternberg, Grigorenko, & Kidd, 2005). However, the claims associated with these games rest on two primary assumptions: (1) that practicing basic mental processes will lead to their improvement, and (2) that improvement on the game tasks will transfer to other tasks outside the game context.

The first assumption is well supported by research. Practicing cognitive skills with feedback leads to increased accuracy, speed, and fluidity on the tasks that are practiced (Anderson, 1982; VanLehn, 1996). However, the second assumption, that improvement of skills practiced in one domain (e.g., a specific game, task, or area of proficiency) will enhance one’s ability to perform well in a different domain, has little empirical support. Such cross-domain improvement, known as far transfer in the psychological literature, has a long history of failed attempts to produce it (Barnett & Ceci, 2002; Makin, 2016). Thus, learning and practicing chess, computer programming, or music will improve individuals’ skills in those respective domains, but it will not demonstrably enhance intelligence or improve performance in other activities (Sala & Gobet, 2016). Put another way, substantial evidence supports the notion that training has positive impacts on the learning tasks specifically targeted by that training (e.g., performance within a type of puzzle), but such training does not succeed in enhancing untargeted abilities (Kassai, Futo, Demetrovics, & Takacs, 2019).

When scholars marshal arguments in favor of the effectiveness of brain training games, they typically point to small but positive average effects across studies, ranging from 0.1 to 0.2 standard deviations, or about a 6 to 7 percent increase in outcomes (e.g., Bediou et al., 2018; Spencer-Smith & Klingberg, 2015; Karbach & Verhaeghen, 2014). However, critical analyses of those findings tend to identify influential methodological errors both in the studies selected for inclusion in the meta-analyses and in the meta-analyses themselves (e.g., Dovis, van Rentergem, & Huizenga, 2015; Redick, 2015; see Simons et al., 2016 for a discussion). Accordingly, when placebo effects and publication bias (the increased likelihood of articles being published when they report positive results) were statistically accounted for, even the small positive effects could not be detected (Sala et al., 2019; Sala & Gobet, 2020). Further, when meta-analyses differentiate between short-term and long-term outcomes, they typically find that positive effects observed immediately following training do not persist when participants are retested later (Schwaighofer, Fischer, & Bühner, 2015).


Discussion and Conclusion

As noted at the outset of this chapter, the current writing represents the third assessment of “common but questionable” principles for multimedia learning we have compiled over the past 15 years (Clark & Feldon, 2005, 2014). Recently, this trend has expanded, with multiple books dedicated specifically to the debunking of educational myths (e.g., Barton, 2019; Berliner, Glass, & Associates, 2014; Christodoulou, 2014; De Bruyckere, Kirschner, & Hulshof, 2015, 2019; Neelen & Kirschner, 2020). However, a discerning reader might notice that the evidence cited in many of these sources is not exclusively recent. Pertinent research evidence that demonstrates the failure of some ideas to successfully impact learning spans at least 50 years. For example, Mayer’s (2004) article discussing the failure of pure discovery learning is entitled “Should there be a three-strikes rule against pure discovery learning?” because extensive evidence related to this issue emerged around three bodies of research in three different decades (discovery of problem-solving rules in the 1960s, discovery of conservation strategies in the 1970s, discovery of programming concepts in the 1980s). Indeed, some references in the current chapter date back 50 years.

Why are mistaken principles so durable in the face of extensive evidence? We posit several possible answers. First, many advances in multimedia learning technologies, and much of the advocacy for their adoption, are driven by commercial enterprises (see our discussion of brain-training games in this chapter). Accordingly, there exists a profit motive to market products as new, even if the ideas that motivate them are not (e.g., technology revolutionizing learning; Cuban, 1986). Such financial incentives extend even into the academic research domain, where competition for grant dollars can lead to exaggerated claims and selective attention to evidence (Lilienfeld, 2017).

Second, in academic circles, there often exists a belief that only the most recent research is applicable to current problems. As a result, researchers may not necessarily invest effort in examining the history of an idea or the evidence that has accumulated over extended periods of time. Scholars are also not immune to the influence of trends within their fields, where an idea’s popularity may lead it to be accepted as valid without deep scrutiny of the evidence supporting it. To this point, Valsiner (1988; as cited in Gredler & Shields, 2004) wryly observed that the depth of understanding of a theory is often inversely related to its popularity.

Third, many educational researchers focus narrowly within their own subfields and give little credence to research conducted in other content areas. Shulman (1986; Shulman & Quinlan, 1996) advocated for subject-specific psychologies in terms of their ability to inform educational practice. While this perspective has been very productive for conceptualizing teaching, cognition, and instruction as having intrinsic foundations in the nature of the subject matter to be taught (Mayer, 2001; Stevens, Wineburg, Rupert Herrenkohl, & Bell, 2005), it has arguably contributed to a tendency to look for prior research only within a specific content area rather than more broadly across domains for relevant lessons learned.

Fourth, when people do search for evidence regarding the effectiveness of a particular strategy, they are often met with a vast array of published studies that may vary considerably in their methodological rigor. One general example of this challenge is the use of inappropriate comparison conditions when investigating the efficacy of technology-enhanced instruction (Clark, 1983). Too often, studies finding positive effects of media on learning do so by comparing a new intervention that is cognitively engaging and productively interactive against a poorly implemented, exaggeratedly weak version of instruction that offers only passive and inadequate learning opportunities. Unsurprisingly, students receiving the new intervention demonstrate stronger performance. However, when one compares student outcomes across interventions that offer the same learning affordances in every media condition (e.g., instructional methods, time on task, opportunities for practice and feedback), no difference can be detected. Thus, the positive effects were attributable to the lack of equivalence between experimental conditions rather than to the unique merits of the intervention.

What’s at stake and what can be done?

There are some who would question why it matters that vehicles for multimedia learning might be based on mistaken principles. People have argued that engaged technology use is, itself, a sufficient benefit to justify the adoption of technology, regardless of whether or not it represents a value-added contribution to learning outcomes (Papert, 1987; Selwyn, 2013). Others have suggested that generalizable principles for effectiveness are of limited use for learning technologies, because they must be dynamically co-constructed as an “interaction…between the designer, the situation, and the medium in which the design both shapes and is shaped by each of these factors” (Kozma, 1994, p. 17) to have a demonstrable effect on learning.

However, a single multimedia learning application has the potential to impact vast numbers of learners when implemented at scale. Over the past 10 years, the use of virtual learning platforms, digital learning management systems (LMSs), and 1-to-1 laptop programs across all levels of the education system has expanded dramatically to impact millions of students in North America alone (Wang, 2016). An application disseminated through one or more of these systems has the potential to shape the learning outcomes of substantial portions of the total student population. Further, at the time of this writing, accommodations for the COVID-19 pandemic have led thousands of schools to rapidly shift part or all of their instruction to online delivery (Ferdig, Baumgartner, Hartshorne, Kaplan-Rakowski, & Mouza, 2020; Reich et al., 2020). Consequently, a failure to implement multimedia learning principles that are generalizable and supported by rigorous research can result in tangible harms: opportunities to effectively support student learning are lost, and poorly designed instruction may actively undermine student understanding and achievement (Clark, 1989; Lohman, 1986). Likewise, schools working with limited resources collectively squander hundreds of millions of dollars invested in technology that cannot deliver its promised benefits.

An opportunity does emerge from this potentially dire landscape, however. With multimedia learning technologies deployed at scale, the opportunity to analyze learning experiences and their effectiveness has never been greater. Further, the interoperability of systems permits such analyses to include a great deal of background data about individual learners, enabling more nuanced understandings of the ways in which individuals’ characteristics (aptitudes) may interact with instructional treatments to drive learning outcomes. The goal of aptitude-by-treatment interaction (ATI) research is to identify treatments that are helpful for learners with a specific range of aptitude scores but either neutral or harmful for students with different scores on the same aptitudes (Cronbach & Snow, 1977).

Education and psychology largely abandoned ATI studies about two decades ago because researchers agreed that meaningful analyses required very large numbers of learners with measured aptitudes experiencing different, contrasting treatments (Kyllonen & Lajoie, 2003). The conditions for restarting ATI studies are now available, given the scale of internet-based multimedia learning systems, and may allow for a more accurate and helpful identification of which students achieve more under which treatments. An example of the benefits of ATI methods is described in principle five above, where studies demonstrate that guided instruction is harmful for a small number of students with the highest prior knowledge scores but is exceptionally helpful for students with lower prior knowledge. ATI studies may also identify some students whose aptitude scores indicate neither necessary benefit nor harm from guided instruction, in which case issues such as cost and accessibility become key factors in matching students to treatments. Similar findings might be forthcoming for a number of other principles on our list.
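The crossover pattern described above can be illustrated with a small simulation. The sketch below is purely illustrative (all data, effect sizes, and variable names are invented for the example, not drawn from any study cited in this chapter): it fits an ordinary least squares model with an aptitude-by-treatment interaction term and recovers a simulated expertise-reversal crossover between guided and minimally guided instruction.

```python
import numpy as np

rng = np.random.default_rng(42)
n = 10_000  # at-scale data makes even modest interactions detectable

# Hypothetical learner data: prior knowledge as the aptitude (z-scores),
# guided instruction (1) vs. minimal guidance (0) as the treatment.
prior = rng.normal(0.0, 1.0, n)
guided = rng.integers(0, 2, n)

# A simulated disordinal (crossover) ATI: guidance helps learners with low
# prior knowledge, but its benefit shrinks as prior knowledge rises.
outcome = 0.5 * prior + guided * (0.8 - 0.6 * prior) + rng.normal(0.0, 1.0, n)

# Ordinary least squares with an aptitude-by-treatment interaction term.
X = np.column_stack([np.ones(n), prior, guided, prior * guided])
beta, *_ = np.linalg.lstsq(X, outcome, rcond=None)
intercept, b_prior, b_treat, b_interaction = beta

# A negative interaction coefficient signals that the treatment's benefit
# declines with aptitude; the crossover is where that benefit reaches zero.
crossover = -b_treat / b_interaction
print(f"treatment effect at mean aptitude: {b_treat:.2f}")
print(f"aptitude x treatment interaction: {b_interaction:.2f}")
print(f"guidance stops helping above prior-knowledge z-score {crossover:.2f}")
```

A main-effects-only analysis of the same data would report one average treatment effect and hide the fact that the treatment is differentially helpful or harmful across the aptitude range.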

The inclination of researchers to focus on main effects and ignore interactions in research after the 1990s may have obscured subtle but important benefits for a minority of students who share aptitude profiles. It seems reasonable to expect that few treatments are equally beneficial (or harmful, or neutral) for all students. Our goal must be to adapt instruction to students whose individual differences indicate that a clearly specified treatment will maximize their learning. In order to make this transition, we need much more work identifying the different cognitive, affective, and motivational aptitudes that interact with instructional treatments to produce different learning outcomes under different treatments (Ackerman, 2003). Yet the larger need is to find a useful way to describe the active ingredients in instructional treatments (Clark & Saxberg, 2012). Sadly, Shulman’s (1970) critique of a half-century ago that ATI research “will likely remain an empty phrase as long as aptitudes are measured by micrometer and environments are measured by divining rod” (p. 374) is still accurate. Accordingly, more care needs to be taken in characterizing the elements of an intervention so that the relationships between aptitudes, treatments, and outcomes can be best understood.

The scale of data available has created both a need for new methods (e.g., learning analytics for “big data”) and opportunities to pursue past research strategies that were abandoned due to the inability to reach the necessary scale with older technologies (e.g., full factorial designs to test all possible combinations of features within a multimedia learning technology; Cronbach & Snow, 1977; Ackerman, 2003). Both learning analytics and at-scale factorial designs can be powerful tools to identify “what works,” for whom, and under what circumstances. However, the sheer scope of data it is possible to collect highlights the need to collect and analyze data with a purpose informed by well-supported theories of learning (Wise & Shaffer, 2015). When statistically analyzing very large data sets, it is possible to detect significant relationships that occur by chance or to mistake the directionality of a relationship between variables (Calude & Longo, 2017; Gandomi & Haider, 2015). Without a foundation in prior research that offers coherent explanations of how instruction may effectively support students’ learning, it is possible to place undue interpretive weight on spurious relationships or to derive explanations for findings that are incompatible with well-established principles based on extensive prior research. Thus, the enhanced power of big data to detect relationships in pursuit of better supporting learning carries with it the added responsibility to situate findings in the context of previous findings from well-designed and well-implemented studies. Failure to do so risks perpetuating mistaken principles across a large number of learners, ultimately harming those we want to help.
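The risk of chance findings in large data sets is easy to demonstrate. In the sketch below (all quantities are invented for illustration), an outcome and 1,000 background variables are generated completely at random, so no relationship is real by construction; yet dozens of “statistically significant” correlations still appear at the conventional .05 threshold.

```python
import numpy as np

rng = np.random.default_rng(0)
n_students, n_features = 500, 1000

# Purely random "background variables" and a random outcome: by construction,
# no feature has any real relationship to the outcome.
features = rng.normal(size=(n_students, n_features))
outcome = rng.normal(size=n_students)

# Pearson correlation of each feature with the outcome (via standardization).
fz = (features - features.mean(0)) / features.std(0)
oz = (outcome - outcome.mean()) / outcome.std()
r = fz.T @ oz / n_students

# Approximate two-tailed .05 critical value for r at this sample size.
r_crit = 1.96 / np.sqrt(n_students)
n_spurious = int(np.sum(np.abs(r) > r_crit))
print(f"'significant' correlations found by chance: {n_spurious} of {n_features}")
```

Roughly 5% of the purely random variables clear the significance threshold, which is why atheoretical mining of interoperable learner data can so easily manufacture spurious “principles.”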

References

Abrami, P. C., Bernard, R. M., Borokhovski, E., Wade, A., Surkes, M. A., Tamim, R., & Zhang, D. (2008). Instructional interventions affecting critical thinking skills and dispositions: A stage 1 meta-analysis. Review of Educational Research, 78(4), 1102-1134.

Ackerman, P. L. (2003). Cognitive ability and non-ability trait determinants of expertise. Educational Researcher, 32(8), 15-20.

Aksayli, N., Sala, G., & Gobet, F. (2019). The cognitive and academic benefits of CogMed: A meta-analysis. Educational Research Review, 27, 229-243.

Allcoat, D., & von Mühlenen, A. (2018). Learning in virtual reality: Effects on performance, emotion and engagement. Research in Learning Technology, 26, 1-13.

Alpert, W. T., Couch, K. A., & Harmon, O. R. (2016). A randomized assessment of online learning. American Economic Review, 106(5), 378-382.

Anderson, J. R. (1982). Acquisition of cognitive skill. Psychological Review, 89(4), 369–406.

Bagarukayo, E., Weide, T., Mbarika, V. & Kim, M. (2012). The impact of learning driven constructs on the perceived higher order cognitive skills improvement: Multimedia vs. text. International Journal of Education and Development using ICT, 8(2). 120-130.

Barnett, S. M., & Ceci, S. J. (2002). When and where do we apply what we learn?: A taxonomy for far transfer. Psychological Bulletin, 128(4), 612–637.

Barton, C. (Ed.) (2019). The research-ED guide to education myths: An evidence-informed guide for teachers. Melton, UK: John Catt Educational.

Bediou, B., Adams, D. M., Mayer, R. E., Tipton, E., Green, C. S., & Bavelier, D. (2018). Meta-analysis of action video game impact on perceptual, attentional, and cognitive skills. Psychological Bulletin, 144, 77–110.

Berliner, D., Glass, G., & Associates. (2014). 50 myths & lies that threaten America’s public schools: The real crisis in education. New York: Teachers College Press.

Bernard, R. M. (1990). Effects of processing instructions on the usefulness of a graphic organizer and structural cueing in text. Instructional Science, 19, 207-217.

Bernard, R. M., Abrami, P. C., Borokhovski, E., Wade, C. A., Tamim, R. M., Surkes, M. A., & Bethel, E. C. (2009). A meta-analysis of three types of interaction treatments in distance education. Review of Educational Research, 79(3), 1243-1289.

Bernard, R. M., Abrami, P. C., Lou, Y., Borokhovski, E., Wade, A., Wozney, L., ... & Huang, B. (2004). How does distance education compare with classroom instruction? A meta-analysis of the empirical literature. Review of Educational Research, 74(3), 379-439.


Bettinger, E. P., Fox, L., Loeb, S., & Taylor, E. S. (2017). Virtual classrooms: How online college courses affect student success. American Economic Review, 107(9), 2855-2875.

Borokhovski, E., Bernard, R. M., Tamim, R. M., Schmid, R. F., & Sokolovskaya, A. (2016). Technology-supported student interaction in post-secondary education: A meta-analysis of designed versus contextual treatments. Computers & Education, 96, 15-28.

Bowen, W. G., Chingos, M. M., Lack, K. A., & Nygren, T. I. (2014). Interactive learning online at public universities: Evidence from a six‐campus randomized trial. Journal of Policy Analysis and Management, 33(1), 94-111.

Brinson, J. R. (2015). Learning outcome achievement in non-traditional (virtual and remote) versus traditional (hands-on) laboratories: A review of the empirical research. Computers & Education, 87, 218-237.

Calude, C., & Longo, G. (2017). The deluge of spurious correlations in big data. Foundations of Science, 22, 595-612.

Cambridge English Dictionary (n.d.). Principle. https://dictionary.cambridge.org/us/dictionary/english/

Chen, Z. (2012). We care about you: Incorporating pet characteristics with educational agents through reciprocal caring approach. Computers & Education, 59, 1081-1088.

Cheng, L., Ritzhaupt, A. D., & Antonenko, P. (2019). Effects of the flipped classroom instructional strategy on students’ learning outcomes: A meta-analysis. Educational Technology Research and Development, 67(4), 793-824.

Christodoulou, D. (2014). Seven myths about education. New York: Routledge.

Clark, R. E. (1982). Antagonism between achievement and enjoyment in ATI studies. Educational Psychologist, 17(2), 92-101.

Clark, R. E. (1983). Reconsidering research on learning from media. Review of Educational Research, 53(4), 445-459.

Clark, R. E. (1989). When teaching kills learning: Research on mathemathantics. In H. Mandl, E. De Corte, N. Bennett, & H. F. Friedrich (Eds.), Learning and instruction: European research in an international context (Vol. II). Oxford: Pergamon.

Clark, R. E. (2009). How much and what type of guidance is optimal for learning from instruction? In S. Tobias and T.M. Duffy (Eds.) Constructivist theory applied to instruction: Success or failure? (pp. 158-183). New York: Taylor and Francis.

Clark, R. E. (2012). Learning from media: Arguments, analysis and evidence (2nd ed.). Greenwich, CT: Information Age Publishing.

Clark, R. E., & Choi, S. (2005). Five design principles for experiments on the effects of animated pedagogical agents. Journal of Educational Computing Research, 32(3), 209-225.

Clark, R. E., & Feldon, D. F. (2005). Five common but questionable principles of multimedia learning. In R. E. Mayer (Ed.), The Cambridge Handbook of Multimedia Learning (pp. 97-115). New York: Cambridge University Press.

Clark, R. E. & Feldon, D. F. (2014). Ten common but questionable principles of multimedia learning. In R. E. Mayer (Ed.) The Cambridge Handbook of Multimedia Learning (pp. 151-173). New York: Cambridge University Press.


Clark, R. E., Howard, K., & Early, S. (2006). Motivational challenges experienced in highly complex learning environments. In J. Elen & R. E. Clark (Eds.), Handling complexity in learning environments: Theory and research (pp. 27–43). Oxford: Elsevier.

Clark, R. E., Kirschner, P. A., & Sweller, J. (2012). Putting students on the path to learning: The case for fully guided instruction. American Educator, 36(1), 6-11.

Clark, R. E., & Saxberg, B. (2012). The “active ingredients” approach to the development and testing of evidence-based instruction by instructional designers. Educational Technology, 52(5), 20–25.

Clark, R. E., & Saxberg, B. (2018). Engineering motivation using the belief-expectancy-control framework. Interdisciplinary Education and Psychology, 2(1), 4-32.

Clark, R. E., & Saxberg, B. (2019, March). 4 reasons good employees lose their motivation. Harvard Business Review. Retrieved from https://hbr.org/2019/03/4-reasons-good-employees-lose-their-motivation?autocomplete=true

Cronbach, L., & Snow, R. (1977). Aptitudes and instructional methods: A handbook for research on interactions. New York: Halsted Press.

Cuban, L. (1986). Teachers and machines: The classroom use of technology since 1920. New York: Teachers College Press.

Davis, R. O. (2018). The impact of pedagogical agent gesturing in multimedia learning environments: A meta-analysis. Educational Research Review, 24, 193-209.

De Bruyckere, P., Kirschner, P., & Hulshof, C. (2015). Urban myths about learning and education. Waltham, MA: Academic Press.

De Bruyckere, P., Kirschner, P. A., & Hulshof, C. (2019). More Urban Myths about Learning and Education: Challenging Eduquacks, Extraordinary Claims, and Alternative Facts. Routledge.

DeKeyser, R. M. (2003). Implicit and explicit learning. In C. Doughty & M. Long (Eds.), The handbook of second language acquisition (pp. 313-348). Oxford: Blackwell.

Dembo, M. H., & Howard, K. (2007). Advice about the use of learning styles: A major myth in education. Journal of College Reading and Learning, 37(2), 101-109.

Dengel, A., & Mägdefrau, J. (2019). Presence is the key to understanding immersive learning. In D. Beck et al. (Eds.), Immersive Learning Research Network: iLRN 2019 (Communications in Computer and Information Science, vol. 1044, pp. 185-198). Cham: Springer.

Deterding, S., Dixon, D., Khaled, R., & Nacke, L. (2011). From game design elements to gamefulness: Defining “gamification.” In A. Lugmayr, H. Franssila, C. Safran, & I. Hammouda (Eds.), MindTrek 2011 (pp. 9–15).

Dichev, C., & Dicheva, D. (2017). Gamifying education: What is known, what is believed and what remains uncertain: A critical review. International Journal of Educational Technology in Higher Education, 14(1), 9.

Dicheva, D., Dichev, C., Agre, G., & Angelova, G. (2015). Gamification in education: a systematic mapping study. Educational Technology & Society, 18(3), 75–88.

Domagk, S. (2010). Do pedagogical agents facilitate learner motivation and learning outcomes?: The role of the appeal of agent’s appearance and voice. Journal of Media Psychology: Theories, Methods, and Applications, 22(2), 84–97.

Domagk, S., Schwarz, R. N., & Plass, J. L. (2010). Interactivity in multimedia learning: An integrated model. Computers in Human Behavior, 25(1), 1024-1033.


Dovis, S., van Rentergem, J., & Huizenga, H. (2015). Does CogMed working memory training really improve inattention in daily life? A reanalysis. PLoS ONE, 10(3), e0119522.

Duffy, T. M., & Jonassen, D. H. (Eds.). (1992). Constructivism and the technology of instruction, a conversation. Mahwah, NJ: Lawrence Erlbaum Associates.

Dunn, R, & Dunn, K. (1978). Teaching students through their individual learning styles: A practical approach. Reston, VA: Reston Publishing Company.

Elfeky, A. I. M. (2019). The effect of personal learning environments on participants’ higher order thinking skills and satisfaction. Innovations in Education and Teaching International, 56(4), 505-516.

Facione, P. A. (1990). The California Critical Thinking Skills Test—College level: Interpreting the CCTST, group norms and sub-scores (Technical Report No. 4). Millbrae: California Academic Press.

Faiella, F., & Ricciardi, M. (2015). Gamification and learning: a review of issues and research. Journal of e-Learning and Knowledge Society, 11(3), 1-12.

Ferdig, R., Baumgartner, E., Hartshorne, R., Kaplan-Rakowski, R., & Mouza, C. (Eds.). (2020). Teaching, technology, and teacher education during the COVID-19 pandemic: Stories from the field. Waynesville, NC: Association for the Advancement of Computing in Education.

Figlio, D., Rush, M., & Yin, L. (2013). Is it live or is it internet? Experimental estimates of the effects of online instruction on student learning. Journal of Labor Economics, 31(4), 763-784.

Fontana, L. A., Dede, C., White, C. S. & Cates, W. M. (1993). Multimedia: A gateway to higher-order thinking skills. Fairfax, VA: George Mason University, Center for Interactive Educational Technology.

Gandomi, A., & Haider, M. (2015). Beyond the hype: Big data concepts, methods, and analytics. International Journal of Information Management, 35(2), 137-144.

Gillette, C., Rudolph, M., Kimble, C., Rockich-Winston, N., Smith, L., & Broedel-Zaugg, K. (2018). A meta-analysis of outcomes comparing flipped classroom and lecture. American Journal of Pharmaceutical Education, 82(5), Article 6898.

Gredler, M., & Shields, C. (2004). Does no one read Vygotsky’s words? Commentary on Glassman. Educational Researcher, 33(2), 21-25.

Gulikers, J. T. M., Bastiaens, T. J., & Martens, R. L. (2005). The surplus value of an authentic learning environment. Computers in Human Behavior, 21(3), 509-521.

Heeter, C. (1992). Being there: The subjective experience of presence. Presence: Teleoperators and Virtual Environments, 1(2), 262-271.

Herrington, J., & Kervin, L. (2007). Authentic learning supported by technology: 10 suggestions and cases of integration in classrooms. Educational Media International, 44(3), 219-236.

Herrington, J., Reeves, T. C., & Oliver, R. (2014). Authentic learning environments. In J. Spector, M. Merrill, J. Elen, & M. Bishop (Eds.), Handbook of research on educational communications and technology (pp. 401-412). New York: Springer.

Homer, B., Plass, J., & Blake, L. (2008). The effects of video on cognitive load and social presence in multimedia learning. Computers in Human Behavior, 34, 786-797.


Husmann, P. R., & O’Loughlin, V. D. (2018). Another nail in the coffin for learning styles? Disparities among undergraduate anatomy students’ study strategies, class performance, and reported VARK learning styles. Anatomical Sciences Education, 12, 6-19.

James, W. B., & Gardner, D. L. (1995). Learning styles: Implications for distance learning. New Directions for Adult and Continuing Education, 67, 19-31.

Kalyuga, S. (2007). Expertise reversal effect and its implications for learner-tailored instruction. Educational Psychology Review, 19, 509-539.

Karbach, J., & Verhaeghen, P. (2014). Making working memory work: A meta-analysis of executive-control and working memory training in older adults. Psychological Science, 25(11), 2027-2037.

Karich, A. C., Burns, M. K., & Maki, K. E. (2014). Updated meta-analysis of learner control within educational technology. Review of Educational Research, 84(3), 392-410.

Kassai, R., Futo, J., Demetrovics, Z., & Takacs, Z. K. (2019). A meta-analysis of the experimental evidence on the near- and far-transfer effects among children’s executive function skills. Psychological Bulletin, 145(2), 165–188.

Kaufman, S. B., DeYoung, C. G., Gray, J. R., Jimenez, L., Brown, J., & Mackintosh, N. (2010). Implicit learning as an ability. Cognition, 116(3), 321–340.

Khacharem, A., Zoudji, B., & Kalyuga, S. (2015). Expertise reversal for different forms of instructional designs in dynamic visual representations. British Journal of Educational Technology, 46(4), 756-767.

Kim, Y., Thayne, J., & Wei, Q. (2017). An embodied agent helps anxious students in mathematics learning. Educational Technology Research and Development, 65(1), 219-235.

Kirschner, P. A. (2017). Stop propagating the learning styles myth. Computers & Education, 106, 166-171.

Kirschner, P. A., Sweller, J., & Clark, R. (2006). Why minimal guidance during instruction does not work: An analysis of the failure of constructivist, discovery, problem-based, experiential and inquiry-based teaching. Educational Psychologist, 41, 75–86.

Koedinger, K. R. & Aleven, V. (2007). The assistance dilemma in experiments with cognitive tutors. Educational Psychology Review, 19, 239-264.

Kozma, R. (1994). Will media influence learning? Reframing the debate. Educational Technology Research and Development, 42(2), 7-19.

Kramer, N. C., & Bente, G. (2010). Personalizing e-learning: The social effects of pedagogical agents. Educational Psychology Review, 22(1), 71-87.

Krassmann, A., Melo, M., Peixoto, B., Pinto, D., Bessa, M., & Bercht, M. (2020). Learning in virtual reality: Investigating the effects of immersive tendencies and sense of presence. In J. Y. C. Chen & G. Fragomeni (Eds.), International Conference on Human-Computer Interaction (HCII 2020, Lecture Notes in Computer Science, vol. 12191) (pp. 270-286). Springer, Cham.

Kyllonen, P. C., & Lajoie, S. P. (2003). Reassessing aptitude: Introduction to a special issue in honor of Richard E. Snow. Educational Psychologist, 38(2), 79-83.


Landers, R. N., & Reddock, C. M. (2017). A meta-analytic investigation of objective learner control in web-based instruction. Journal of Business and Psychology, 32(4), 455-478.

Lilienfeld, S. (2017). Psychology’s replication crisis and the grant culture: Righting the ship. Perspectives on Psychological Science, 12, 660-664.

Lohman, D. F. (1986). Predicting mathemathantic effects in the teaching of higher-order thinking skills. Educational Psychologist, 21(3), 191-208.

Ma, J., & Nickerson, J. V. (2006). Hands-on, simulated, and remote laboratories: a comparative literature review. ACM Computing Surveys, 38(3), 1-14.

Makin, S. (2016). Memory games. Nature, 531, S10-S11.

Makransky, G., Borre‐Gude, S., & Mayer, R. E. (2019). Motivational and cognitive benefits of training in immersive virtual reality based on multiple assessments. Journal of Computer Assisted Learning, 35(6), 691-707.

Mayer, R. (2001). What good is educational psychology? The case of cognition and instruction. Educational Psychologist, 36(2), 83-88.

Mayer, R. (2004). Should there be a three-strikes rule against pure discovery learning? The case for guided methods of instruction. American Psychologist, 59, 14–19.

Mayer, R. E., & Chandler, P. (2001). When learning is just a click away: Does simple user interaction foster a deeper understanding of multimedia messages? Journal of Educational Psychology, 94(2), 390-397.

Merrill, D. M. (2006). Hypothesized performance on complex tasks as a function of scaled instructional strategies. In J. Elen and R. E. Clark (Eds.), Handling complexity in learning environments: Research and theory. Oxford: Elsevier Science.

Nancekivell, S. E., Shah, P., & Gelman, S. A. (2020). Maybe they’re born with it, or maybe it’s experience: Toward a deeper understanding of the learning style myth. Journal of Educational Psychology, 112(2), 221–235.

Neelen, M., & Kirschner, P. (2020). Evidence-informed learning design: Creating training to improve performance. London: Kogan Page.

Newton, P. M., & Miah, M. (2017). Evidence-based higher education: Is the learning styles ‘myth’ important? Frontiers in Psychology, 8, 444-454.

Open Science Collaboration. (2015). Estimating the reproducibility of psychological science. Science, 349, aac4716.

Papert, S. (1987). Computer criticism vs. technocentric thinking. Educational Researcher, 16(1), 22-30.

Park, S. (2015). The effects of social cue principles on cognitive load, situational interest, motivation, and achievement in pedagogical agent multimedia learning. Educational Technology & Society, 19(4), 211-229.

Parong, J., & Mayer, R. (2020). Cognitive and affective processes for learning science in immersive virtual reality. Journal of Computer Assisted Learning. Advance online publication. DOI: 10.1111/jcal.12482.

Pashler, H., McDaniel, M., Rohrer, D., & Bjork, R. (2008). Learning styles: Concepts and evidence. Psychological Science in the Public Interest, 9(3), 105-119.

Picciano, A. (2002). Beyond student perceptions: Issues of interaction, presence, and performance in an online course. Journal of Asynchronous Learning Networks, 6(1), 21-40.


Post, L. S., Guo, P., Saab, N., & Admiraal, W. (2019). Effects of remote labs on cognitive, behavioral, and affective learning outcomes in higher education. Computers & Education, 140, 103596.

Redick, T. (2015). Working memory training and interpreting interactions in intelligence interventions. Intelligence, 50, 14-20.

Reich, J., et al. (2020). Remote learning guidance from state education agencies during the COVID-19 pandemic: A first look. Retrieved from osf.io/k6zxy/

Rey, G. D., Beege, M., Nebel, S., Wirzberger, M., Schmitt, T. H., & Schneider, S. (2019). A meta-analysis of the segmenting effect. Educational Psychology Review, 31, 389–419.

Richardson, J., Maeda, Y., Lv, J., & Caskurlu, S. (2017). Social presence in relation to students’ satisfaction and learning in the online environment: A meta-analysis. Computers in Human Behavior, 71, 402-417.

Sailer, M., & Homner, L. (2020). The gamification of learning: A meta-analysis. Educational Psychology Review, 32, 77–112.

Sala, G., Aksayli, N., Tatlidil, K., Tatsumi, T., Gondo, Y., & Gobet, F. (2019). Near and far transfer in cognitive training: A second-order meta-analysis. Collabra: Psychology, 5(1), art.18.

Sala, G., & Gobet, F. (2016). Do the benefits of chess instruction transfer to academic and cognitive skills? A meta-analysis. Educational Research Review, 18, 46-57.

Sala, G., & Gobet, F. (2020). Working memory training in typically developing children: A multilevel meta-analysis. Psychonomic Bulletin & Review, 27, 423-434.

Sala, G., Tatlidil, K. S., & Gobet, F. (2018). Video game training does not enhance cognitive ability: A comprehensive meta-analytic investigation. Psychological Bulletin, 144(2), 111–139.

Salomon, G. (1984). Television is “easy” and print is “tough”: The differential investment of mental effort in learning as a function of perceptions and attributions. Journal of Educational Psychology, 76(4), 647-658.

Savery, J. R., & Duffy, T. M. (2001). Problem based learning: An instructional model and its constructivist framework (CRLT Technical Report No. 16-01). Indiana University.

Scheibe, C., & Rogow, F. (2012). The teacher’s guide to media literacy: Critical thinking in a multimedia world. Thousand Oaks, CA: Corwin Press.

Schmidt, F. L., & Oh, I. S. (2013). Methods for second order meta-analysis and illustrative applications. Organizational Behavior and Human Decision Processes, 121(2), 204-218.

Schrader, C., & Bastiaens, T. (2012). The influence of virtual presence: Effects on experienced cognitive load and learning outcomes in educational computer games. Computers in Human Behavior, 28, 648-658.

Schroeder, N. L., & Gotch, C. M. (2015). Persisting issues in pedagogical agent research. Journal of Educational Computing Research, 53(2), 183-204.

Schunk, D. H., Pintrich, P. R., & Meece, J. L. (2008). Motivation in education (3rd ed.). Upper Saddle River, NJ: Pearson Merrill Prentice Hall.


Schwaighofer, M., Fischer, F., & Bühner, M. (2015). Does working memory training transfer? A meta-analysis including training conditions as moderators. Educational Psychologist, 50, 138-166.

Seaborn, K., & Fels, D. I. (2015). Gamification in theory and action: A survey. International Journal of Human-Computer Studies, 74, 14-31.

Selwyn, N. (2013). Education in a digital world: Global perspectives on technology and education. New York: Routledge.

Shepherd, A. (2003). Case for computer-based multimedia in adult literacy classrooms. Encyclopedia of Educational Technology. Retrieved from: http://www.etc.edu.cn/eet/articles/adultliteracy/index.htm

Shulman, L. S. (1970). Reconstruction of educational research. Review of Educational Research, 40(3), 371-396.

Shulman, L. S. (1986). Paradigms and research programs in the study of teaching: A contemporary perspective. In M. C. Wittrock (Ed.), Handbook of research on teaching (3rd ed., pp. 3–36). New York: Macmillan.

Shulman, L. S., & Quinlan, S. S. (1996). The comparative psychology of school subjects. In D. C. Berliner & R. C. Calfee (Eds.), Handbook of educational psychology (pp. 399–422). New York: Macmillan.

Simons, D. J., Boot, W. R., Charness, N., Gathercole, S. E., Chabris, C. F., Hambrick, D. Z., & Stine-Morrow, E. A. L. (2016). Do "brain-training" programs work? Psychological Science in the Public Interest, 17(3), 103-186.

Spencer-Smith, M., & Klingberg, T. (2016). Benefit of a working memory training program for inattention in daily life: A systematic review and meta-analysis. PLoS ONE, 10(3), e0119522.

Sternberg, R. J., Grigorenko, E. L., & Kidd, K. K. (2005). Intelligence, race, and genetics. American Psychologist, 60(1), 46–59.

Stevens, R., Wineburg, S., Rupert Herrenkohl, L., & Bell, P. (2005). Comparative understanding of school subjects: Past, present, future. Review of Educational Research, 75(2), 125-157.

Stoney, S., & Oliver, R. (1999). Can higher order thinking and cognitive engagement be enhanced with multimedia? Interactive Multimedia Electronic Journal of Computer-Enhanced Learning. Retrieved from: http://imej.wfu.edu/articles/1999/2/07/index.asp

Sung, E., & Mayer, R. E. (2013). Online multimedia learning with mobile devices and desktop computers: An experimental test of Clark's methods-not-media hypothesis. Computers in Human Behavior, 29, 639-647.

Sweller, J. (2008). Instructional implications of David C. Geary's evolutionary educational psychology. Educational Psychologist, 43(4), 214-216.

Tobias, S., & Duffy, T. M. (Eds.). (2009). Constructivist instruction: Success or failure. New York, NY: Routledge.

Triona, L. M., & Klahr, D. (2003). Point and click or grab and heft: Comparing the influence of physical and virtual instructional materials on elementary students' ability to design experiments. Cognition and Instruction, 21(2), 149-173.

Turlik, M. (2009). Evaluating the results of a systematic review/meta-analysis. The Foot and Ankle Online Journal, 2(7), 5.


Valsiner, J. (1988). Developmental psychology in the Soviet Union. Bloomington: Indiana University Press.

van Dijk, J. (2020). The digital divide. Medford, MA: Polity Press.

VanLehn, K. (1996). Cognitive skill acquisition. Annual Review of Psychology, 47, 513-539.

Wang, F., Li, W., Mayer, R. E., & Liu, H. (2018). Animated pedagogical agents as aids in multimedia learning: Effects on eye-fixations during learning and learning outcomes. Journal of Educational Psychology, 110(2), 250–268.

Wang, Y. (2016). Big opportunities and big concerns of big data in education. TechTrends, 60, 381-384.

Wiesner, T. F., & Lan, W. (2004). Comparison of student learning in physical and simulated unit operations experiments. Journal of Engineering Education, 93(3), 195-204.

Wilson, L. C. (2014, September). Introduction to meta-analysis: A guide for the novice. Retrieved from: https://www.psychologicalscience.org/observer/introduction-to-meta-analysis-a-guide-for-the-novice

Wise, A., Chang, J., Duffy, T., & Del Valle, R. (2004). The effects of teacher social presence on student satisfaction, engagement, and learning. Journal of Educational Computing Research, 31, 247-271.

Wise, A., & Shaffer, D. (2015). Why theory matters more than ever in the age of big data. Journal of Learning Analytics, 2(2), 2-13.

Yung, H. I., & Paas, F. (2015). Effects of cueing by a pedagogical agent in an instructional animation: A cognitive load approach. Educational Technology & Society, 18(3), 153–160.

Zacharia, Z. C., & Constantinou, C. P. (2008). Comparing the influence of physical and virtual manipulatives in the context of the Physics by Inquiry curriculum: The case of undergraduate students’ conceptual understanding of heat and temperature. American Journal of Physics, 76(4), 425-430.
