Illinois

White Paper

Next-Generation Assessment for Illinois


Contents

* Sections of this report first appeared in “Study Island Scientific Research Base” by Magnolia Consulting (September 2011).

Executive Summary

The Next Generation of Study Island in Illinois

Study Island Users in Illinois

Key Components of Study Island
    Rigorous and Engaging Content
    Differentiated Instruction and Real-Time Feedback
    Web-Based and User Friendly

Study Island Research Base
    Standards Alignment*
    Formative Assessment Feedback Loops*
    Benchmark Assessments
    Distributed, Ongoing Practice*
    Student Engagement and Motivation*
    Teacher Training and Professional Development

Efficacy of Study Island
    Preparing Students for Increasingly Rigorous High-Stakes Assessments

Case Study: Building a Foundation for Success at Leland Elementary School
    The Challenge
    How They Did It
    Success
    The Future

References



Executive Summary

As standards, assessments, and policies in Illinois change, Study Island continues to evolve, tailoring programs and materials to meet the needs of Illinois learners.

Study Island works for schools because it delivers rigorous and engaging content, differentiated instruction, and formative assessment feedback in a Web-based, mobile-ready format. The latest education research supports the design of Study Island. Researchers agree that alignment of content and assessments with standards is crucial for student success, which is why Study Island built its Illinois programs from the ground up to ensure complete alignment and coverage of content standards. Study Island also provides students and teachers with formative and interim assessment solutions, both of which have been shown to increase student achievement. Study Island builds the confidence of students with instruction differentiated to meet them where they are. It also includes the latest motivational tools, such as blue ribbon icons and mobile-ready multiplayer games, to keep students engaged in learning.

Study Island has already demonstrated success in preparing students for next-generation assessments. Many schools and districts in Illinois have found Study Island to be their solution for standards mastery and formative assessment. For example, George Leland Elementary School in Chicago started using Study Island in the 2011–12 school year. That spring, 85 percent of the school’s students met or exceeded expectations on the Illinois Standards Achievement Test (ISAT), compared with the state average of 59 percent.

[Figure: ISAT Results, Students Meeting Proficiency, 2011–12, comparing Leland Elementary, the Chicago School District, and the Illinois state average in Reading and Math.]

This paper details the latest features of Edmentum’s Study Island program, discusses its updated research base, and shares a recent success story from an Illinois school.




The Next Generation of Study Island in Illinois

Study Island’s program authors designed the Illinois formative assessments and standards mastery program to help K–12 students improve performance in all skill areas tested on the ISAT and master knowledge and skills outlined in the Illinois Learning Standards and the Common Core State Standards (CCSS).

In 2010, Illinois adopted the CCSS for English language arts and mathematics, and teachers and administrators across the state are fully implementing the new standards during the 2013–14 school year. By emphasizing depth over breadth, the CCSS ensure that students have a comprehensive understanding of key concepts. Illinois is also part of the Partnership for the Assessment of Readiness for College and Careers (PARCC) and plans to implement a new technology-enhanced assessment aligned to CCSS in the 2013–14 school year. The next generation of Study Island in Illinois has been designed from the ground up to meet these new standards. Study Island prepares students in Illinois for more rigorous, technology-enhanced assessments and provides formative data to inform instruction and learning.

Study Island Users in Illinois

More than 1,000 schools in Illinois used a Study Island product in 2013. Most schools use Study Island math and reading standards mastery products to help their students prepare for the ISAT. In addition to these core products, Study Island offers a benchmarking product that provides an interim assessment solution for districts.

In total, there were 924 schools in Illinois using Study Island math products and 941 using reading products, with 887 schools using both math and reading products during 2013.

Current Illinois Study Island School Installations

Reading: 941 schools total; 899 (95.5%) core only, 42 (4.5%) core and benchmark
Math: 924 schools total; 883 (95.6%) core only, 41 (4.4%) core and benchmark

Since 2012, Study Island has successfully delivered almost 13,000 Study Island benchmark assessments to over 7,000 students in Illinois. Students often take multiple benchmark assessments (i.e., different forms) in the same content area through the school year. Study Island provides Illinois schools with a stable connectivity platform, maintaining enough servers and network bandwidth to successfully deliver a very large number of benchmark assessments in an online environment.



Key Components of Study Island

The Study Island program was created with student success as its core mission. Users in Illinois have attributed student success to several of the program’s key components: rigorous and engaging content, differentiated instruction and real-time feedback provided to each student, and an easy-to-use Web-based format.

Rigorous and Engaging Content

Study Island comprises lessons and activities that reinforce and reward learning achievement. Each item and topic is built from the ground up to align with Illinois and Common Core standards. Built-in lessons, animations, and activities keep students engaged as they demonstrate mastery and prepare for high levels of achievement on high-stakes assessments.

Differentiated Instruction and Real-Time Feedback

Study Island is ideal for self-paced, individualized learning. Teachers can easily guide students through the program, communicate expectations, and create class assignments. Students can work through items using a variety of formats, such as a standard test format, an interactive game format, printable worksheets, and a classroom response system. Study Island gives students and teachers instant feedback on student progress and summative reports to track standards mastery.

Web-Based and User Friendly

Study Island’s mobile-optimized interface ensures seamless 24/7 access to its award-winning test preparation and standards mastery solutions. The same great resources, rigorous content, and engaging games that educators and students can access from a Web browser are also readily accessible from any tablet, smartphone, or other mobile device.

Study Island Research Base

Study Island was designed using proven education methodology, including close alignment with standards, assessment feedback loops, distributed ongoing practice, and motivational features to ensure that students are engaged and mastering the Illinois Learning Standards and the CCSS.

Standards Alignment

Typically, states complete alignment procedures to determine how well an existing state assessment aligns with state standards in an effort to demonstrate the overall strength or predictive validity of the state assessment (Roach, Elliott, and Webb, 2005). Some alignment experts have taken this a step further and have developed procedures to examine the alignment between instructional content and assessments (Porter and Smithson, 2002). Research using these procedures has shown that a strong relationship exists between alignment of instructional content with assessments and student achievement gains: the better instructional content aligns with assessments, the higher student achievement can be (Gamoran, Porter, Smithson, and White, 1997).

Study Island program authors have gone beyond traditional alignment procedures to develop the continuously updated program content from an in-depth analysis of the CCSS and Illinois Learning Standards. The content has been built from the ground up, based specifically on the standards, which ensures complete standards coverage. The instructional practice and progress monitoring tools provide precise methods to track and improve students’ progress toward meeting the Illinois New Learning Standards. The alignment process also ensures that teachers can use data from Study Island formative assessments with confidence, knowing that it is completely aligned to the skills tested on state assessments.



Formative Assessment Feedback Loops

Educational programs, especially technology-based programs such as Study Island, promote ongoing standards mastery by using the results of formative assessments to inform instruction. For illustrative purposes, one can conceptualize the interactive relationship between assessment results and instructional practice as a continual feedback loop, or cycle. For example, poor results on formative assessments can lead to remedial instruction, which automatically adapts the level of future practice within the program and creates new instructional paths designed to promote learning. The cycle is then completed or restarted through further performance evaluation. These cycles are typically of three lengths (long, medium, or short), all of which can operate concurrently. Longer cycles focus on the results of summative assessments and can last an entire school year, while medium and short cycles use formative and interim assessment results. Regardless of the length of the cycle, researchers suggest that the use of feedback is the most critical element (Duke and Pearson, 2002; Wiliam, 2006).

For a formative assessment feedback loop to be successful, the instructional delivery mechanism (whether a teacher or a computer) must be flexible and proactive, adapting the instructional content, the delivery of the instructional content, or both as needed to ensure mastery (Cassarà, 2004). Research shows that when a feedback loop is modified based on student performance, student learning can be accelerated and improved (Jinkins, 2001; Wiliam, Lee, Harrison, and Black, 2004), especially when feedback is given quickly so it impacts instruction on a day-by-day or minute-by-minute basis (Leahy, Lyon, Thompson, and Wiliam, 2005). Shorter-cycle, formative feedback loops, such as those found in Study Island, typically comprise three main functions: ongoing assessment, immediate feedback on results, and quick remediation. Ongoing, continual assessment is critical to the success of a formative assessment feedback loop.

A consensus within the research literature suggests that students who receive frequent assessments have higher achievement scores (Black and Wiliam, 1998; Fuchs and Fuchs, 1986; Wolf, 2007), especially when that assessment is cumulative (Dempster, 1991; Rohm, Sparzo, and Bennett, 1986) and provides students with opportunities to learn (Kilpatrick, Swafford, and Findell, 2001). Marzano, Pickering, and Pollock (2001) further concluded that feedback that also provides an explanation of the correct answer was the most effective. Through their meta-analysis, they additionally concluded that feedback is best when it encourages students to keep working on a task until they succeed.

Taken together, these results suggest that a cycle of ongoing formative assessment feedback followed by remediation and further assessment contributes to increases in student achievement. Study Island incorporates a formative assessment feedback loop into its design through a system of continual assessment, immediate feedback, and quick remediation. Students will know immediately whether they answered the item correctly or incorrectly, and upon completing each item in Study Island, they are given the option to view an explanation showing how to arrive at the correct answer. Further, when a student struggles with a particular concept, the Study Island system automatically moves the student to a building block that teaches the same concept at a slightly easier level. When educators integrate Study Island into their instructional practices, it acts as a formative, ongoing assessment tool that provides students with a platform to practice, or demonstrate, their knowledge of required standards and provides teachers with valuable information to inform their instruction.
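To make the short-cycle loop concrete, the sketch below models the three functions named above: assess, give immediate feedback, and remediate to an easier building block before reassessing. It is illustrative only; the names (Item, practice_session, MASTERY_THRESHOLD) and the 70 percent cutoff are assumptions for the example, not Study Island's actual implementation.

# Minimal sketch of a short-cycle formative feedback loop; hypothetical names,
# not the Study Island product's code.

from dataclasses import dataclass

MASTERY_THRESHOLD = 0.7  # assumed cutoff for moving on vs. remediating


@dataclass
class Item:
    prompt: str
    answer: str
    explanation: str


def practice_session(items, get_response, building_block=None):
    """Assess, give immediate feedback, and remediate when the student struggles."""
    correct = 0
    for item in items:
        if get_response(item.prompt) == item.answer:
            correct += 1
            print("Correct!")  # immediate feedback
        else:
            print("Incorrect. Explanation:", item.explanation)  # worked answer offered

    score = correct / len(items)
    if score < MASTERY_THRESHOLD and building_block:
        # Struggling: switch to a building block covering the same concept at an
        # easier level, then reassess, which restarts the loop.
        return practice_session(building_block, get_response)
    return score

In practice, get_response would come from the student interface, and the building-block items would cover the same standard at a gentler level, so further assessment follows remediation automatically.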

Benchmark Assessments

In addition to its core program, Study Island offers a benchmarking program, which serves as a medium-cycle assessment. The benchmarking program provides a set of four benchmark tests designed to be taken periodically throughout the school year. The benchmark tests are typically 30 to 40 items long and mirror the structure and item types that will be part of the new CCSS assessments. Therefore, the results of each benchmark test will closely reflect how students would perform on high-stakes assessments.



In “The Role of Interim Assessments in a Comprehensive Assessment System” (Perie, Marion, Gong, and Wurtzel, 2007), a paper sponsored by the National Center for the Improvement of Educational Assessment, interim assessment is defined as a medium-cycle, or benchmark, assessment used to (1) evaluate student mastery of specific goals within a specific time frame and (2) provide information to inform decisions both in the classroom and at the school or district level. The timing and administration of interim assessments are typically controlled at the school or district level, and results may be used to evaluate programs, diagnose gaps in student learning, or predict students’ ability to succeed on higher-stakes summative assessments.

The test administration process for Study Island Benchmarks is secure but flexible and can be controlled by program administrators.

Distributed, Ongoing Practice

For instruction or remediation to have a lasting effect on student knowledge and foster further learning, instructors must provide reinforcement through ongoing practice and review of learned material (Dempster, 1991; Marzano, Pickering, and Pollock, 2001). Research shows that review can affect both the quantity and quality of the material that students learn. Mayer (1983) found that after multiple presentations of text, not only did the overall amount of information that students could recall increase, but participants also were able to recall more conceptual information than technical information. This suggests that after repeated presentations, students may process material at cognitively deeper levels. Further, Marzano, Pickering, and Pollock (2001) concluded that students may need many practice sessions to reach high levels of competence. For more difficult, multistep skills, students may need to engage in focused practice, allowing them to target specific subskills within a larger skill. This research suggests that teachers should allow time for students to internalize skills through practice so they can apply concepts in unique and conceptually challenging situations.

Distributed practice of material not only affects the amount of information learned but can be a motivating factor as well. Research shows that distributed practice of material is more interesting and enjoyable to students than massed practice (Dempster, 1991; Elmes, Dye, and Herdelin, 1983), indicating that distributed practice could contribute to increased motivation to learn new material. Taken together, these results suggest that the distributed practice of instructional material could significantly impact students’ retention in addition to their understanding and enjoyment of course material.

The flexibility of the instructional framework of the Study Island program allows for ongoing skill practice and review of learned material and provides the ability to space out practice to foster higher rates of recall and retention. Teachers can customize the amount and frequency of practice that each student receives or assign students to review specific standards or learning objectives as needed. Because students do not have to complete the program lessons in any specific order, teachers can distribute the presentation of the skill practice, especially the more complex material, over multiple days. When teachers use Study Island in conjunction with classroom instruction, they can present material and assign practice on that material as needed throughout the year, creating an effective and motivating learning environment for practicing state standards.
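As a rough illustration of the spacing idea described above, the sketch below spreads practice on a single learning objective across several short sessions rather than massing it in one sitting. The helper name, the session count, and the three-day gap are assumptions for the example, not settings taken from the program.

# Illustrative sketch only: distributing practice on one objective over time.

from datetime import date, timedelta


def schedule_distributed_practice(objective, start, sessions=4, gap_days=3, items_per_session=10):
    """Return (date, objective, item count) tuples spread over multiple days."""
    return [
        (start + timedelta(days=i * gap_days), objective, items_per_session)
        for i in range(sessions)
    ]


# Example: four short sessions on one objective, three days apart.
for when, objective, n_items in schedule_distributed_practice("Grade 4 fractions", date(2014, 1, 6)):
    print(f"{when}: assign {n_items} items on {objective}")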

Student Engagement and Motivation

A variety of effective methods can be used to increase student motivation within an instructional environment (Guthrie and Davis, 2003). These strategies have been shown to be effective for increasing student motivation:

■ providing students with high-interest, diverse materials (Ivey and Broaddus, 2000; Worthy, Moorman, and Turner, 1999);

■ embedding instruction within context (Biancarosa and Snow, 2004; Dole, Sloan, and Trathen, 1995);

■ increasing students’ self-efficacy and competence (Ryan and Deci, 2000);

■ offering competitive-based rewards for performance (Reeve and Deci, 1996); and

■ providing students with choice and autonomy (Patall, Cooper, and Robinson, 2008; Zahorik, 1996).

Research has also shown that offering performance-contingent rewards, such as the chance to play a game after successfully completing a task, is more motivating than positive performance feedback alone (Harackiewicz and Manderlink, 1984). Rewards symbolize competence at an activity (Boggiano and Ruble, 1979; Harackiewicz, 1979), and research shows that in situations where performance-contingent rewards are available, individuals are more concerned about their performance and perceived competence than those receiving positive performance feedback alone. Furthermore, the personal importance of doing well at a task enhances subsequent interest in the task. Therefore, performance-contingent rewards can foster task interest via an individual’s drive to do well and thus overcome the negative anxiety-producing effects typically associated with performance evaluation (Boggiano, Harackiewicz, Bessette, and Main, 1986; Harackiewicz and Manderlink, 1984).

Additionally, Marzano, Pickering, and Pollock (2001) found that providing students with personalized recognition for their academic accomplishments, especially in the form of concrete symbols, can be a strong motivator to increase student achievement. However, to have a positive impact on students’ intrinsic motivation, recognition should be contingent on the achievement of a specific performance goal, not just for the completion of any one task. Therefore, recognition has the strongest impact on achievement when a student connects the reward to reaching a specified level of performance.

Technology-based instructional programs, such as Study Island, although inherently motivating (Relan, 1992), have a unique capacity to incorporate such motivational strategies concurrently within their instructional environments. In particular, computer programs can easily include both the flexibility and modifiability of instructional sequences. Such open architecture can provide students with a sense of autonomy and ownership in the instructional tasks. Research has shown that presenting students with choices during instruction, especially choices that enhance or affirm autonomy, augments intrinsic motivation, increases effort, improves task performance, and contributes to growth in perceived confidence (Patall et al., 2008). Likewise, Corbalan, Kester, and van Merriënboer (2006) suggest that technology-based environments that allow task personalization promote self-regulated learning and provide the learner with control over his or her environment, which can increase motivation and foster positive learning outcomes.

The Study Island program design and implementation incorporate diverse motivational factors to engage students and further program use. For instance, Study Island includes a wide variety of material covering multiple content areas and subjects within those content areas. Additionally, it builds instructional opportunities into the standards practice to motivate students to apply skills as they are learning them. Study Island aims to build student confidence and self-efficacy by providing students with sufficient practice and learning opportunities that will help them realize positive gains in achievement. Students can monitor their own progress as they complete lessons and feel successful watching their mastery levels rise. When students reach the specified mastery level of an objective, they earn a personalized reward in the form of a blue ribbon icon, which serves as a concrete symbol of recognition for their academic achievements and further motivates them to succeed.

Within the Study Island program, students also have access to a wide variety of simple and short games that they can play when they have answered a question correctly. Students compete with other Study Island users to try to achieve the highest score on the games: this competition is intended to motivate the students to perform well on assessments. One of the most significant motivational factors Study Island provides is its open architecture, which allows students the ability to complete lessons in any order and to switch between tasks as desired. This offers students ownership of their learning environment, allowing them to set their own goals, plan personalized learning experiences, execute their work with flexibility, and regulate their own progress.
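The reward mechanics described in the last two paragraphs reduce to a simple rule: a correct answer unlocks a short game, and a blue ribbon is tied to reaching a specified mastery level on an objective. The sketch below is illustrative only; the 80 percent threshold, the record structure, and the function name are assumptions rather than product details.

# Illustrative sketch only: performance-contingent rewards (game access and
# a ribbon tied to a mastery goal); hypothetical names and thresholds.

MASTERY_THRESHOLD = 0.8  # assumed "specified mastery level" for an objective


def update_progress(progress, objective, answered_correctly):
    """Update a student's running mastery for one objective and grant rewards."""
    attempts, correct = progress.get(objective, (0, 0))
    attempts += 1
    correct += int(answered_correctly)
    progress[objective] = (attempts, correct)

    mastery = correct / attempts
    rewards = {
        "play_game": answered_correctly,              # short game after a correct answer
        "blue_ribbon": mastery >= MASTERY_THRESHOLD,  # ribbon tied to a performance goal
    }
    return mastery, rewards


progress = {}
print(update_progress(progress, "Fractions", True))   # mastery rises, game unlocked
print(update_progress(progress, "Fractions", False))  # no game; ribbon depends on mastery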

Teacher Training and Professional Development

Information about students’ academic progress and preparedness is of little use if educators lack the tools necessary to access and leverage those data at the school and classroom levels (McCann and Kabaker, 2013). Research demonstrates that data usage is a critical component of instructional planning in the most effective classrooms (James-Ward, Fisher, Frey, and Lapp, 2013). Teachers with a solid working knowledge of academic data can use that information to better understand their students’ progress and tailor their instruction to help meet the needs of individual learners.

McCann and Kabaker (2013) evaluated two statewide efforts to increase data use by classroom teachers and found common themes that made the programs effective. First, few teachers have familiarity with data or the skills necessary to use data to inform instruction, so professional development sessions focused on building and maintaining data skills are necessary and important. Second, teachers greatly benefited from high-quality coaching from highly trained professionals who deliver skill-based training, instilling in teachers a strong internal working knowledge of data use and helping to build a “data-heavy” culture in schools. Finally, administrative support of data-focused efforts and allotting time during the workday for participation in training also factored into increasing data use among teachers.

Study Island helps teachers and administrators leverage student achievement data to improve instruction by providing clear reports and offering professional development. The reports generated by Study Island make data analysis easy and effective for teachers and principals. Users can choose which report(s) they would like to view from an easy-to-use menu, or they can set up reports to be automatically emailed. Reports contain real-time student performance, so users are always seeing the most recent student performance data. To ensure that teachers have the training they need to use student data, Edmentum Services provides customized, skills-based professional development delivered by highly trained professionals.

Efficacy of Study Island

The Common Core State Standards Initiative (CCSSI) is a state-led effort to define the knowledge and skills that students should attain during their K–12 education so that they will graduate with the ability to succeed in college courses and workforce training programs.

The CCSSI includes an assessment component designed to document whether students are on track to be college- and career-ready by the time they graduate from high school. Therefore, not only will students learn from a more rigorous and relevant set of standards, they will also be introduced to a new type of assessment that is significantly different from current state tests. Gone are the days when states mapped test items to a single standard and multiple-choice items were expected to be a true measure of mastery. The new Common Core assessments measure critical thinking, problem solving, and 21st century skills. They are aligned not only to factual knowledge but also to the requirements of colleges and the workplace.

To address these challenges, Study Island has developed entirely new content and item types to specifically align to the CCSS and individual state standards. Study Island can be used now to familiarize teachers and students with the new standards and assessments, including new technology-enhanced and constructed-response items. Because Study Island provides valid and reliable information on how students are progressing toward the new standards, students will have an advantage when new, more rigorous assessments are put in place.

Preparing Students for Increasingly Rigorous High-Stakes Assessments

Early indicators reveal that the transition to the new standards and assessments will be particularly challenging for students, educators, and administrators. During the 2011–12 academic year, Kentucky, the first state to adopt the CCSS, administered statewide tests that were developed explicitly to measure the CCSS. Although the new Kentucky tests are not the actual tests from either assessment consortium, they are closely aligned and are thus seen as a harbinger of the PARCC and SBAC assessments. Compared with the previous academic year (2010–11), the percentage of students scoring “proficient” or better dropped by more than 33 percent in math and reading.

Kentucky’s steep drop in proficiency rates is likely to be repeated in dozens of other states as the CCSS are implemented. The results from Kentucky confirmed what most stakeholders expected: assessments measuring the CCSS are going to result in significant challenges to educators in helping students prepare for college and the workplace. While it’s common to see performance declines when new assessment systems are introduced, a lack of available resources compounds the issue for teachers, schools, districts, and states transitioning to the CCSS. The move to CCSS will require major curriculum changes and new types of assessments. Study Island has responded to this need by investing heavily in developing resources that support classroom learning and provide a full spectrum of skills practice and year-round assessments.

Although Kentucky experienced precipitous drops in test scores, students in schools using Study Island outperformed the state average on the new assessments by as much as 14 percent. The figure below compares the performance of students who used Study Island benchmark assessment solutions during the 2011–12 academic year with the state average on Kentucky’s Core Content Test, which is based exclusively on CCSS.

[Figure: Study Island Users’ Proficiency Rates After Introduction of New Common Core Assessments, comparing the Kentucky state average with Study Island users in elementary math, elementary reading, middle school math, and middle school reading.]

Students will be challenged by the new CCSS and the corresponding assessments to measure those standards. Educational experts have warned that precipitous drops in student performance (as indicated by proficiency when measured against the more rigorous CCSS) are likely. The 2012 assessment results from Kentucky corroborate the challenges that students, educators, and administrators will encounter once the new assessments are administered operationally. Study Island provides the assessment feedback and skills practice that students need to be prepared for these new assessments.




George Leland Elementary School, Chicago, Illinois

663 students, grades pre-K to 8

Case Study: Building a Foundation for Success at Leland Elementary School

Early learning is key to building literacy skills for school and for life. With the help of Study Island, an elementary school in a challenging neighborhood of inner-city Chicago builds a foundation for unprecedented success with its young learners.

The Challenge

Based on its demographics, George Leland Elementary School in Chicago should be struggling: 96 percent of its students come from economically disadvantaged homes. The neighborhood surrounding the school is blighted. But through a commitment to early learning, Leland Elementary is outpacing the rest of the district and the state itself. With the arrival of the CCSS, the students and teachers needed a better way to prepare for the new standards or risk losing that measure of excellence.

How They Did It

Leland Elementary needed an accurate practice solution to ensure student success on the new Common Core assessments. It found that solution in Study Island, which the school has been using since 2011. “[Study Island] addresses the needs of our special students and those in need of intervention, as well as prepares students for test taking and meeting the new standards,” said Dr. Loretta Brown-Lawrence, principal of Leland Elementary.

Leland Elementary, like most other schools, does not have unlimited technology resources, which makes the success the students are experiencing even more impressive. Study Island is available to students through twice-weekly visits to the computer lab. Because the program is Web-based, however, at-home practice is possible and quite popular with the teachers and students. “The fact that it is Web-based and can be used at home is a big plus,” said Brown-Lawrence.

Success

To the credit of the staff and teachers at Leland Elementary, the success the school achieves year after year is an inspiring story; indeed, it was already considered a high-performing school by state standards even before beginning its partnership with Edmentum. Study Island has pushed Leland Elementary to be considered one of the best schools in the district and state, as well as a model for other schools nationwide.

Leland Elementary started using Study Island in the 2011–12 school year. That year, 85 percent of students met or exceeded expectations on the Illinois Standards Achievement Test (ISAT), compared with the state average of 59 percent.

97% African American; 3% Hispanic; 96% eligible for free/reduced lunch; < 1% English language learners

Dr. Loretta Brown-Lawrence, Principal



[Figure: ISAT Results, Students Meeting Proficiency, 2011–12, comparing Leland Elementary, the Chicago School District, and the Illinois state average in Reading and Math.]

The Future

Leland Elementary is growing by a factor of three! Because of school consolidation in the Chicago school district, Leland Elementary is absorbing two more local schools for the 2013–14 school year, expanding to nearly 700 students and serving grades pre-K to 8. With the abilities of its staff and teachers, as well as faith in Study Island, George Leland Elementary School is ready to excel in what would seem like an intimidating situation to most other schools.



2425 North Central Expressway, Suite 1000, Richardson, TX 75080

© 2014 EDMENTUM, INC.

edmentum.com | 800.447.5286 | [email protected] | 031114

References

Biancarosa, G., & Snow, C. E. (2004). Reading next—a vision for action and research in middle and high school literacy: A report to Carnegie Corporation for New York. Washington, DC: Alliance for Excellent Education.

Black, P., & Wiliam, D. (1998). Assessment and classroom learning. Assessment in Education: Principles, Policy & Practice, 5, 7–74.

Boggiano, A. K., Harackiewicz, J. M., Bessette, J. M., & Main, D. S. (1986). Increasing children’s interest through performance-contingent reward. Social Cognition, 3, 400–411.

Boggiano, A. K., & Ruble, D. N. (1979). Competence and the overjustification effect: A developmental study. Journal of Personality and Social Psychology, 37, 1462–1468.

Cassarà, S. (2004). Many nations: Building community in the classroom. Assessment Update, 16(3), 9–10.

Corbalan, G., Kester, L., & van Merriënboer, J. J. G. (2006). Towards a personalized task selection model with shared instructional control. Instructional Science, 34, 399–422.

Dempster, F. N. (1991). Synthesis of research on reviews and tests. Educational Leadership, 48 (7), 71–76.

Dole, J., Sloan, C., & Trathen, W. (1995). Teaching vocabulary within the context of literature. Journal of Reading, 38, 452–460.

Duke, N. K., & Pearson, P. D. (2002). Effective practices for developing reading comprehension. In A. E. Farstrup & S. J. Samuels (Eds.), What research has to say about reading instruction (pp. 205–243). Newark, DE: International Reading Association.

Elmes, D. G., Dye, C. J., & Herdelin, N. J. (1983). What is the role of affect in the spacing effect? Memory and Cognition, 11, 144–151.

Fuchs, L. S., & Fuchs, D. (1986). Effects of systematic formative evaluation: A meta-analysis. Exceptional Children, 53, 199–208.

Gamoran, A., Porter, A. C., Smithson, J., & White, P. A. (1997, Winter). Upgrading high school mathematics instruction: Improving learning opportunities for low-achieving, low-income youth. Educational Evaluation and Policy Analysis, 19, 325–338.

Guthrie, J. T., & Davis, M. H. (2003). Motivating struggling readers in middle school through an engagement model of classroom practice. Reading and Writing Quarterly, 19, 59–85.

Harackiewicz, J. M. (1979). The effects of reward contingency and performance feedback on intrinsic motivation. Journal of Personality and Social Psychology, 37, 1352–1361.

Harackiewicz, J. M., & Manderlink, G. (1984). A process analysis of the effects of performance contingent rewards on intrinsic motivation. Journal of Experimental Social Psychology, 20, 531–551.

Ivey, G., & Broaddus, K. (2000). Tailoring the fit: Reading instruction and middle school readers. The Reading Teacher, 54, 68–78.

James-Ward, C., Fisher, D., Frey, N., & Lapp, D. (2013). Using data to focus instructional improvement. Alexandria, VA: ASCD.

Jinkins, D. (2001). Impact of the implementation of the teaching/learning cycle on teacher decision making and emergent readers. Reading Psychology, 22, 267–288.

Kilpatrick, J., Swafford, J., & Findell, B. (Eds.). (2001). Adding it up: Helping children learn mathematics. Washington, DC: National Academy Press.

Leahy, S., Lyon, C., Thompson, M., & Wiliam, D. (2005). Classroom assessment: Minute by minute, day by day. Educational Leadership, 63(3), 19–24.

Marzano, R. J., Pickering, D. J., & Pollock, J. E. (2001). Classroom instruction that works: Research-based strategies for increasing student achievement. Alexandria, VA: Association for Supervision and Curriculum Development.

Mayer, R. E. (1983). Can you repeat that? Qualitative effects of repetition and advance organizers on learning from science prose. Journal of Educational Psychology, 75, 40–49.

McCann, C., & Kabaker, J. C. (2013). Promoting data in the classroom. Retrieved from http://education.newamerica.net/sites/newamerica.net/files/policydocs/Promoting%20Data_FINAL_FOR_RELEASE_0.pdf

Patall, E. A., Cooper, H., & Robinson, J. C. (2008). The effects of choice on intrinsic motivation and related outcomes: A meta-analysis of research findings. Psychological Bulletin, 134, 270–300.

Perie, M., Marion, S., Gong, B., & Wurtzel, J. (2007). The role of interim assessments in a comprehensive assessment system. Washington, DC: The Aspen Institute. Retrieved December 3, 2009.

Porter, A. C., & Smithson, J. L. (2002, April). Alignment of assessments, standards, and instruction using curriculum indicator data. Paper presented at the Annual Meeting of the American Educational Research Association, New Orleans, LA.

Reeve, J., & Deci, E. L. (1996). Elements of the competitive situation that affect intrinsic motivation. Personality and Social Psychology Bulletin, 22, 24–33.

Relan, A. (1992). Motivational strategies in computer-based instruction: Some lessons from theories and models of motivation. In proceedings of selected research and development presentations at the Convention of the Association for Educational Communications and Technology (ERIC Document Reproduction Service No. ED 348 017).

Roach, A. T., Elliott, S. N., & Webb, N. L. (2005). Alignment of an alternate assessment with state academic standards. The Journal of Special Education, 38, 218–231.

Rohm, R. A., Sparzo, F. J., & Bennett, C. M. (1986). College student performance under repeated testing and cumulative conditions: Report on five studies. The Journal of Educational Research, 80, 99–104.

Ryan, R. M., & Deci, E. L. (2000). Intrinsic and extrinsic motivations: Classic definitions and new directions. Contemporary Educational Psychology, 25, 54–67.

Wiliam, D. (2006). Formative assessment: Getting the focus right. Educational Assessment, 11(3 & 4), 283–289.

Wiliam, D., Lee, C., Harrison, C., & Black, P. (2004). Teachers developing assessment for learning: Impact on student achievement. Assessment in Education, 11, 49–65.

Wolf, P. J. (2007). Academic improvement through regular assessment. Peabody Journal of Education, 82, 690–702.

Worthy, J., Moorman, M., & Turner, M. (1999). What Johnny likes to read is hard to find in school. Reading Research Quarterly, 34, 12–27.

Zahorik, J. A. (1996). Elementary and secondary teachers’ reports of how they make learning interesting. The Elementary School Journal, 96, 551–565.


