
2001 Revised Edition

NCEA ACRE

Interpretation Manual


Department of Religious Education
National Catholic Educational Association

1077 30th Street N.W., Suite 100
Washington, DC 20007-3852

Copyright © 2002, National Catholic Educational Association

Printed in the United States of America
State of Kansas, Division of Printing


NCEA ACRE Interpretation Manual

Table of Contents

Introduction . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 1

Structure, Components and Features of the Revised NCEA ACRE . . . . . . . . . . . . 3

NCEA ACRE Part 1 - Faith Knowledge . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 6

NCEA ACRE Part 2 - Affective Assessment . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 16

NCEA ACRE Score and Summary Reports . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 27

Appendix A: Catholic Faith Literacy . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 51

Appendix B: NCEA ACRE Blueprint: Content Specifications, Domains, Student Objectives, and Elaborated Key Concepts . . . . . . . . . . . . . . . . . . . . 53

Appendix C: Examining Local Curriculum Content Standards . . . . . . . . . . . . . . 57

Appendix D: Enhancing the NCEA ACRE Assessment using Locally Selected/Constructed Questions . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 63

Contributors . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . . 64


The Revised NCEA ACRE: A New Horizon

Introduction

The convergence of several Church documents, together with the technological and psychometric advances made in the field of “tests and measurements,” prompted the National Catholic Educational Association to revise the 1992 edition of the Assessment of Catholic Religious Education (ACRE), its religious education program assessment tool. The promulgation of the Catechism of the Catholic Church in 1994, the General Directory for Catechesis in 1997, and heightened awareness of Church documents on Catholic Social Teachings brought new insights to the catechetical enterprise. These insights included the need for greater attention to a common faith language, acknowledgment of the baptismal catechumenate (Rite of Christian Initiation of Adults, 1972; General Directory for Catechesis, 1997) as inspiration for catechesis, renewed emphasis on faith formation rooted in the essentials of the faith, and focused attention on the formative role of liturgy in education for faith. When NCEA looked at these convergences in light of evolving standards in the field of assessment, along with requests from users for more informative reports, a more direct relationship between assessments and “standards” (both content and performance), the opportunity to incorporate local assessments as part of a common assessment (multiple measurements), a formal means to create informative reports to share with parents and guardians, and access to data based on advances in computer technology, a revision of the 1992 version of ACRE became even more urgent. In preparing the revised 2001 NCEA Assessment of Catechesis/Religious Education (NCEA ACRE), we have worked to make all of these aims a reality.

The 1992 edition of ACRE has been, and the 2001 edition of NCEA ACRE continues to be, a valid and reliable evaluation tool for religion programs. Whether in parish-based programs or in Catholic schools, evaluating the status of a religion program requires that educators in faith be attuned to the mission, goals and strategies inherent to the delivery of sound catechesis. Such an evaluation also calls for a thorough review of the religion curriculum, teaching methods, instructional approaches and available resources allocated to accomplish the mission. NCEA ACRE gives bishops, pastors, administrators, teacher/catechists, parents, and students a picture of the basic faith knowledge and religious perceptions of students at various levels of age and stage of development. With these data, faith education leaders and participants can explore areas of strength as well as areas that need additional attention. Using such an assessment annually allows leaders to identify the trajectory of their faith education efforts and track improvements as goals and strategies are changed or refined.

NCEA ACRE not only provides an overall picture of the catechetical/religious education effort at a Catholic school or parish religious education program (called a local site), but it now also incorporates performance standards into a local site’s group summary report as well as an individual student’s report. The individual student reports are new, optional, and intended to give more personalized, useful feedback to students, parents and institutions. The individual student reports are only available for the Part 1 Faith Knowledge portion of the assessment. NCEA ACRE reports are not intended to be used to measure the success or failure of a particular student, teacher/catechist, or religion textbook series. Many components make up a successful religion program in addition to those mentioned. Using documents provided with the assessment materials, administrators and teacher/catechists are strongly encouraged to prepare themselves before administering NCEA ACRE to the students by engaging in the curriculum alignment review (Appendix C). There is no need to hold special sessions to ensure that the students perform well. It is vitally important that all students be given the opportunity to approach NCEA ACRE under optimum conditions, unencumbered by pressure to perform at a certain level.

No single evaluative paper and pencil instrument can measure a person’s love of God or faith. However, especially in the formative years, knowing where one stands in relation to knowledge of the faith and faith awareness can help engender a desire to know more about God and the Church. The goal of catechesis/religious education is to foster a relationship with Jesus Christ that leads to maturity in faith. Such maturity will only be achieved through a life-long pursuit of prayerful faith formation based on the realization that “the love of God has been poured out in our hearts through the Holy Spirit who has been given to us” (Romans 5:5). The revised NCEA ACRE is a tool to assist school and parish catechetical leaders in providing a sound, comprehensive religion program.

The General Directory for Catechesis reminds Catholics that all catechesis is essentially ecclesial and that the six fundamental tasks of catechesis help us to know, to celebrate and to contemplate the mystery of Christ. In every culture, it takes a variety of teaching methods to accomplish these tasks: knowledge of the faith, liturgical education, moral formation, knowing how to pray, knowing how to live as a Christian community, and knowing what evangelization means. Before administering the NCEA ACRE, you were asked to attend to the Curriculum Alignment Review (Appendix C) to determine the “fit” between NCEA ACRE and your local religion program. As you approach the reports to interpret them, ask yourself how well your religion program aligns with NCEA ACRE and attends to the six tasks of catechesis.

NCEA ACRE was developed following a carefully crafted blueprint (content specifications) based on the Catechism of the Catholic Church and the most recent catechetical documents. By defining key concepts to be evaluated, the blueprint assists educators in determining how well their total religion program aligns with the assessment. Because it is built around the core essentials of the faith, NCEA ACRE is an evaluative instrument that can be used by everyone regardless of the religion textbook series, provided that the series has been found in conformity by the Office for the Catechism at the United States Conference of Catholic Bishops.

By integrating “head” knowledge with desires of the heart, that is, personal beliefs, attitudes, practices and perceptions, NCEA ACRE offers a profile of faith formation that is well integrated. Moreover, NCEA ACRE is unique in that it is the only assessment that gathers data at the national level for the purpose of helping catechetical leaders improve the total catechetical/religious education program. Field-tested with over 9,000 students to ensure high psychometric performance, submitted to a cultural bias review panel to ensure freedom from cultural bias, and circulated among dozens of religious educators to make sure that the items are age appropriate, NCEA ACRE is an assessment tool of the highest quality. You are invited to check the assessment icons at the NCEA website frequently (www.ncea.org). There you can find updates on interpreting and using NCEA ACRE data, and give NCEA feedback on your experience with this revised tool.


Structure, Components and Features of the Revised (2001) NCEA ACRE

General Information regarding Assessment Features and Development

This Interpretation Guide has been prepared to assist local parish and Catholic school personnel with their review of results from administration of the revised NCEA ACRE (2001) assessments. There are three (3) different 2001 NCEA ACRE assessments, each referred to as a “Level.” The Level designation indicates the grade(s) that the assessment is intended to serve. Each unique assessment booklet is divided into two major parts or sections. Part 1 contains the Faith Knowledge questions; Part 2 is a self-report assessment survey of students’ personal beliefs, attitudes, practices and perceptions. The chart below identifies the components of the NCEA ACRE assessments, the students to be served, and the number of questions contained in each part.

Each assessment was designed and developed with the specific grade(s) of the student in mind. On occasion, a local school or parish may desire to use an assessment at a grade other than those intended (e.g., administering the Level 3 assessment to grade 10 students). In such cases, this is acceptable only when the local parish or school has reviewed the assessment carefully and has judged it appropriate for students at the other grade level. In these situations, careful attention to the information being assessed and the reading level of the questions are important considerations in deciding the appropriateness and fairness of the assessment for “off-grade” groups.

Assessment    Students Served           Part 1: Faith Knowledge    Part 2: Beliefs, Attitudes, Practices & Perceptions
Level 1       Grade 5                   51 questions               36 statements
Level 2       Grade 8 or Grade 9        57 questions               46 statements
Level 3       Grade 11 or Grade 12      63 questions               46 statements


The appropriateness (i.e., validity) of a measure of learning should be judged based on the opportunity for a student or group to learn the information assessed and the appropriateness of the method of assessment as well. Validity is not a property of the assessment; rather, it is a property of the assessment score and its capacity to support the inferences made, actions taken, or conclusions reached. Catholic schools and parish religion programs set out to meet the learning and spiritual needs of all children. Local users need to determine the adequacy of the NCEA ACRE tools on a case by case basis. It could be that the reading level, the form of the assessment (e.g., a paper and pencil tool for a visually impaired student, etc.), or the concepts being assessed are beyond expectations for some special needs students; that is, the assessment is inappropriate for some students. Users need to determine, and when justified use, appropriate accommodations for students with special learning needs. Typically, accommodations that are part of the local instructional process are acceptable assessment accommodations as well. Also, we serve many students in our programs whose English language proficiency could limit the validity of the tool for non-native speakers. Local users should use professional judgment to determine whether the NCEA ACRE should be administered to or adapted for particular students, what, if any, accommodations are appropriate to support a valid assessment, or, if administered, whether response forms should be returned for processing and reporting.

The plan for development and the actual preparation of knowledge questions[1] and affective statements that comprise each assessment followed a “content validation” methodology. That is, the actual items appearing on an assessment were prepared to assess specific instructional objectives of religious knowledge and to survey affective characteristics judged to be central to the practices, attitudes and beliefs of the faith for students at specific grades. This conceptualization for the 2001 revised NCEA ACRE led to the formation of a blueprint (table of specifications) for each assessment level form. Each level’s blueprint details exactly what the assessment measures by way of content focus. The exact specifications of the NCEA ACRE are presented in the next section. Following is a listing of key features that supported the development of the revised NCEA ACRE.

The 2001 revised assessments contain approximately 35 to 55 percent “old” ACRE Faith Knowledge questions to maintain continuity, allowing programs to compare against their past record of performance. Linkage to the past edition is greatest for Level 1 and least for Level 3. As in the past, but to a considerably lesser extent, a few faith knowledge questions are repeated across levels. Continuity between the new (2001) and old (1992) editions has also been maintained for the personal beliefs, attitudes, practices and perceptions section of the assessment at all levels (approximately half the statements are repeated).

1 NCEA ACRE Part 1 uses exclusively a multiple-choice assessment format. Four response choices are presented and the student is directed to select the one choice that best answers the question or completes the statement. Throughout this Manual, we refer to Part 1 multiple-choice tasks interchangeably as “items” or “questions.” For the affective assessment (Part 2), we refer to the collection of personal beliefs, attitudes, practices and perceptions queries posed to students as “statements.”


The focus, substance and coverage of each revised 2001 NCEA ACRE were defined by an assessment “blueprint” (sometimes referred to as a table of specifications). The new blueprint was prepared, working from the prior ACRE blueprint, by a committee of educators, theologians and church leaders, and served as the sole guide in the construction and selection of old and new questions that make up the NCEA ACRE.

Five independent review stages served as the means for the creation, drafting, and ongoing editing of the “new” questions and affective statements. At each stage, panels of scholars and educators worked on each level’s new questions to fashion the strongest and most valid and representative collection of items. To support development, the NCEA relied continuously on national assessment experts to assure the adequacy of the assessment design and the framing of the new items.

A large pilot field tryout of all new questions was conducted involving 9,000 students in school and parish programs across the country. In addition, the attitudinal/affective statements were revised based on actual one-on-one interviews with students to further strengthen their relevance, interpretation and understanding.

During development, the NCEA systematically evaluated the questions for fairness (racial, cultural, and ethnic equity) and appropriateness (readability, clarity, etc.) for the individual students who shall participate in the NCEA ACRE evaluation process.

Beginning in the 2002-2003 academic year, score reports provide “standards-based” results. Relying on two performance standards methodologies (the Angoff and Contrasting Groups approaches to standards setting) and the input and consideration of broadly representative advisory groups to the NCEA, performance standards (cutscores) were established for the faith knowledge section of each NCEA ACRE level assessment. Performance standards categories of Advanced, Proficient, and Needs Improvement are now available to assist interpretation of student performance on the knowledge portion of the assessment.

Each of these features is presented and discussed more fully in the sections that follow. We wish to acknowledge the many individuals who participated in the formulation of the 2001 NCEA ACRE blueprint, the assessment design, and the assessment questions, and those who advised the NCEA with regard to defining and setting the performance standards. Those persons are identified in the Contributors Section at the end of this Guide. Excluded from this listing are the 9,000+ students in school and parish programs, and their instructors, who participated in the assessment during the pilot testing phase throughout the winter of 2000 and the spring of 2001. The next section presents information that describes and sets out the 2001 NCEA ACRE assessment blueprint, or content specifications.

NCEA ACRE Blueprint and Assessment Specifications

NCEA ACRE was designed to represent eight domains of Faith Knowledge and seven themes of affect (including beliefs, attitudes, practices and perceptions). The Faith Knowledge (NCEA ACRE Part 1) blueprint is presented first.


NCEA ACRE Part 1 - Faith Knowledge

The Faith Knowledge Part 1 section of NCEA ACRE is designed to measure the extent to which individuals and groups have mastered information from eight identified religious education domains. Questions were developed and selected solely based on their content to reflect the NCEA ACRE Blueprint. The process of readying items for the revised NCEA ACRE included: drafting of questions by religious education parish and school instructors who teach at the grades the assessment is intended to serve; review and editing of items by panels of experts and instructors to clarify wording and accuracy of questions; national pilot testing of all new items; external review for readability; alignment to the Blueprint; and evaluation for cultural, ethnic and linguistic fairness. These are domain-referenced assessments (that is, assessments that are especially constructed to measure well defined and very specific content areas) designed to offer information as to the extent of learning with reference to content/curricular standards and to allow for evaluation of performance expectations. The standards-based reporting format and related information underscore the “criterion-referenced” design of the assessments. The assessment questions in the Faith Knowledge portion of the NCEA ACRE assessment are grouped for reporting following two strategies: reports detailing performance based on Domains and Key Concepts, and reports that present the performance of students and groups based on the Pillars of Faith according to the Catechism of the Catholic Church. Each of these configurations is detailed below.

Information Available and Assembled by Domains and Key Concepts

The Faith Knowledge portion of each NCEA ACRE assessment was established by defining eight Domains. Domains as used here refer to broad areas of knowledge representing major areas of religious education teaching. Each Domain is then further broken down, specified and defined by Key Concepts. Key Concepts are detailed statements of knowledge that point to specific information that is expected to be taught and thus learned by students. The actual Domains and their Key Concepts are detailed in the following tables and charts. The same Domains and Key Concepts are represented across each NCEA ACRE level. Assessment questions were prepared to assess the range of Key Concepts in a representative fashion; that is, the number of questions for a particular NCEA ACRE assessment contributing to the Domain’s Key Concepts is not equal, but rather representative of the specification recommended by NCEA Advising Committees. Thus, Domains and Key Concepts judged to be more important at particular grades have more items associated with them; less crucial areas for a given grade receive less weight by presenting fewer questions or statements to be completed.

Clear learning objectives are crucial for the success of any educational effort. While the goal of catechesis/religious education is growth in faith, a reality that cannot be strictly measured, catechesis does involve systematic instruction and learning dimensions, which can be measured and evaluated. The learning objectives for a religion program are usually stated in the Teacher Manuals, Scope and Sequence charts, and Curriculum Standards provided by a diocese or that accompany a particular religion series. It is important that these be recognized and followed so that students do receive a systematic and complete catechesis on all the essentials of the Catholic faith during their participation in your program. Appendix B contains the elaborated Key Concept specifications for each Domain and the supporting student objectives that define the 2001 NCEA ACRE assessments. These specifications detail the material that students should have an opportunity to learn and be expected to know, and which is measured by the assessment.


Content coverage is, by design, integrated or spiraled across levels so that relative overall comparisons can be inferred when schools, dioceses and parish programs use the assessment at different grades, and thereby afford a basis for evaluation from one level to the next.

Identified in the charts that follow are the questions (as numbered in the 2001 NCEA ACRE assessment booklet) categorized by Domain and Key Concept for each assessment level. As you begin your review of results, consider these questions as they appear on the assessment and how they have been configured to assess the Domains and Key Concepts. Note that in the charts, areas designated as “-” signify Key Concepts for which instruction is desirable and expected, but due to assessment length considerations, no questions are included at this time. These are important areas for instruction that you may be covering now in your religion program and may receive coverage in subsequent NCEA ACRE revisions. If you are covering this key concept and want to assess it and others like it through NCEA ACRE, design locally crafted questions to cover these concepts as part of the Optional Twenty Questions section described in the Administration Manual and in Appendix D of the Interpretation Manual.

Faith Knowledge Questions Aligned to Key Concepts within Each Domain of the 2001 NCEA ACRE Assessments

Domain 1 - God: Father, Son and Holy Spirit

Key Concepts                          Level 1 Items    Level 2 Items    Level 3 Items
Trinity                               51               9, 34            26
God the Father                        8                -                50
God the Son                           16               15, 26           18, 27, 58
God the Holy Spirit                   31               1                1
Creed                                 45               42               33, 42
God’s activity in human history       9                55               -
TOTALS:                               6                7                8

Domain 2 - Church: One, Holy, Catholic and Apostolic

Key Concepts                                   Level 1 Items    Level 2 Items    Level 3 Items
Attributes of the Church                       -                -                2, 10, 16
Mary in the Church                             1                10               -
Church: People of God, Body of Christ,         17               -                19
  Communion of saints
Role of Church leaders                         32               27, 36           -
Relationship of Church to others and world     -                20, 50           23
Church’s mission and evangelization            -                43               43, 51
Church as communion                            39, 46           51, 56           59
TOTALS:                                        5                8                8


Domain 3 - Liturgy and Sacraments

Key Concepts                   Level 1 Items    Level 2 Items    Level 3 Items
Liturgical year                5, 10            11, 53           7, 11
Liturgical symbols             18               19               -
The Mass                       25, 30           28, 37           9, 20
Roles in the Liturgy           -                -                -
Celebration of sacraments      33               44               36
Sacraments of initiation       40               2                -
Sacraments of healing          47               -                -
Sacraments of vocation         23               57               52, 60
TOTALS:                        9                8                7

Domain 4 - Revelation, Scripture, and Faith

Key Concepts                              Level 1 Items    Level 2 Items    Level 3 Items
The Bible as the inspired Word of God     3, 11            -                4
Parts of the Bible and gospel writers     20               -                12, 21
Major biblical themes: Old Testament      -                12, 21           28
Major biblical themes: New Testament      22, 34, 41       29               37, 45
Fonts of revelation                       -                45               53
Responses to God’s call                   48               52, 54           61
TOTALS:                                   7                6                8

Domain 5 - Life in Christ: Personal Morality and Catholic Social Teaching

Key Concepts                                   Level 1 Items    Level 2 Items    Level 3 Items
God’s plan for Christian life                  4, 12, 19        5, 13            -
Theological virtues                            27               22, 38           13
Ongoing conversion                             -                40               22
Personal and social aspects of sin             42, 49           30               29
Catholic Social Teaching                       -                3, 18            38, 46, 54
Conscience, freedom, decision making,          35               35               17, 34, 62
  responsibility and courage to act
Morality as based on natural and divine law    -                49               55
TOTALS:                                        7                10               10


Domain 6 - Church History

Key Concepts                                          Level 1 Items    Level 2 Items    Level 3 Items
Apostolic age                                         13, 36           6, 14            6
End of persecution, Nicene creed, and                 -                23               14, 35, 44
  catechumenate
Mary: Mother of God                                   21, 28           31               30
Saints (11th Century)                                 -                39               -
Reformation and Council of Trent (16th Century)       -                -                39
Age of Enlightenment (18th Century)                   -                -                -
Church and workers (19th Century)                     -                -                -
Pius X and age of communion (20th Century forward)    -                46               47, 56
TOTALS:                                               4                6                8

Domain 7 - Prayer/Religious Practices

Key Concepts                                  Level 1 Items    Level 2 Items    Level 3 Items
The Lord’s Prayer                             6, 14            -                -
Sacramentals                                  2, 29            7                3
Devotional practices in different cultures    37               16               15, 24
Purpose and forms of prayer                   26               24               31, 40
Precepts of the Church                        -                32               48, 57
Christian Spirituality                        -                47               63
TOTALS:                                       6                5                8

Domain 8 - Catholic Faith Literacy

Key Concepts                        Level 1 Items    Level 2 Items    Level 3 Items
God                                 7, 15            8                8
Church                              24               17               -
Liturgy/Sacraments                  -                25               25
Revelation, Scripture, and Faith    38               33               32
Life in Christ/Morality             44               41, 48           5, 41, 49
Church History                      -                4                -
Prayer/Religious Practices          50               -                -
Christian Hope                      43               -                -
TOTALS:                             7                7                6


Information Available and Assembled according to the Pillars of the Faith: NCEA ACRE and the Catechism of the Catholic Church

It is useful to align the Part 1: Faith Knowledge questions based on the categorization defined by the Pillars of the Faith in the Catechism of the Catholic Church. We continue this alignment tradition established with the 1992 ACRE assessments. In 1992, on the thirtieth anniversary of the opening of the Second Vatican Council, Pope John Paul II gave the world the Apostolic Constitution Fidei Depositum: On the Publication of the Catechism of the Catholic Church. In it, the Pope describes the Catechism as “a statement of the Church’s faith and of catholic doctrine, attested to or illuminated by Sacred Scripture, the Apostolic Tradition, and the Church’s Magisterium...a sure and authentic reference text for teaching catholic doctrine.”

Five years later, the Congregation for the Clergy promulgated the General Directory for Catechesis. While retaining the basic structure of its 1971 predecessor, the General Catechetical Directory, the Preface of the GDC sets forth its unique purpose: “... to arrive at a balance between two principal requirements: – on the one hand the contextualization of catechesis in evangelization as envisaged by Evangelii Nuntiandi (Apostolic Exhortation by Pope Paul VI, 1975); – on the other the appropriation of the content of the faith as presented in the Catechism of the Catholic Church” (No. 7c).

As the Department of Religious Education began a process to prepare the National Catholic Educational Association Assessment of Catechesis/Religious Education (NCEA ACRE), these and other catechetical documents that guide Catholic catechesis also guided the development of the revised assessment instrument. Faith Knowledge Domains already judged to be theologically sound were expanded to strengthen their connection to the Catechism. Assessment objectives for each Domain were stated more succinctly, and Key Concepts that particularize the Faith Knowledge Domains were expanded to provide greater content coverage. NCEA ACRE is carefully crafted and aligned to the Catechism of the Catholic Church as well as other foundational documents that guide the Church’s worldwide commitment to evangelization and its handmaiden, catechesis.

The four major divisions of the Catechism, Part One: The Profession of Faith, Part Two: The Celebration of the Christian Mystery, Part Three: Life in Christ, and Part Four: Christian Prayer, recall the fourfold division of the Catechism of the Council of Trent (also called the Roman Catechism) promulgated in 1566. This structural arrangement, often referred to as the Pillars[2], presents a succession of movements: the Church believing, celebrating, living, and praying.

You will observe that the optional individual student reports, which aggregate data, are reported in common religious language according to this fourfold division: Creed, Liturgy and Sacraments, Morality, and Prayer. The charts below detail the configuration of the 2001 revised NCEA ACRE assessment questions as they align with the Catechism. Note that not all assessment items are used to configure the Pillar scores and the resultant performance summary reports. While the categorical labels used for some of the Pillars and the Domains described previously are often similar (e.g., Liturgy and Sacraments, etc.), the assessment questions that comprise these sections are not altogether the same, as the source descriptions and specifications are themselves not identical for the categories even though common labels are used.

2 Bernard L. Marthaler, OFM CONV, “The Catechism of the Catholic Church in U.S. Context” in Source Book for Modern Catechetics, Vol. 2, Ed. Michael Warren (Winona, MN: Saint Mary’s Press, 1997), 279.


Faith Knowledge Questions Aligned with Pillars of the Catechism for Each Level of the 2001 NCEA ACRE Assessments

PILLAR I: PROFESSION OF FAITH (CREED)

Level 1 Items: 7, 8, 9, 15, 16, 17, 22, 24, 31, 39, 45, 51 (Total: 12)
Level 2 Items: 1, 8, 9, 15, 20, 25, 26, 27, 33, 34, 36, 42, 45, 51 (Total: 14)
Level 3 Items: 1, 2, 8, 10, 16, 18, 19, 23, 26, 27, 33, 42, 45, 50, 53, 58 (Total: 16)

PILLAR II: THE CELEBRATION OF THE CHRISTIAN MYSTERY (LITURGY AND SACRAMENTS)

Level 1 Items: 5, 10, 13, 18, 23, 25, 28, 30, 33, 40, 47, 48 (Total: 12)
Level 2 Items: 2, 6, 11, 12, 19, 23, 28, 29, 31, 37, 44, 53, 54, 57 (Total: 14)
Level 3 Items: 6, 7, 9, 11, 12, 20, 25, 36, 37, 41, 44, 52, 60 (Total: 13)


PILLAR III: LIFE IN CHRIST (MORALITY)

Level 1 Items: 4, 12, 19, 27, 32, 35, 38, 41, 42, 43, 46, 49 (Total: 12)
Level 2 Items: 3, 13, 18, 21, 22, 30, 32, 35, 38, 48, 49, 52, 56 (Total: 13)
Level 3 Items: 5, 13, 17, 22, 28, 29, 34, 38, 43, 46, 51, 54, 55, 59, 62 (Total: 15)

PILLAR IV: CHRISTIAN PRAYER (PRAYER)

Level 1 Items: 1, 2, 3, 6, 11, 14, 26, 29, 37, 50 (Total: 10)
Level 2 Items: 4, 5, 7, 10, 14, 16, 17, 24, 40, 41, 47 (Total: 11)
Level 3 Items: 3, 15, 24, 30, 40, 48, 49, 56, 57, 61, 63 (Total: 11)


Links to the 1992 Edition of ACRE

Steps were taken to include in Part 1 of the 2001 revised assessment questions from the prior ACRE assessment (1992). It was judged desirable to have continuity to allow parishes and schools to monitor performance trends over time, though somewhat limited, during this shift from the “old” to the “new” NCEA ACRE. For programs that assessed students using the prior ACRE edition (1992) and who would like to “track performance” from past groups and years to groups being assessed at this time, the table below details the assessment questions that have been carried forward onto the revised 2001 NCEA ACRE. Obviously, should you be interested in comparison, you will need results from past years to compare to this year’s performance. Information in the table below allows the user to track performance on an item-by-item basis. Direct comparisons can only be made between performance on the assessment questions that have been repeated between the 1992 and 2001 NCEA ACRE editions. Should you wish to compare “scores attained across years” (a more stable/reliable measure than monitoring an individual item), then you should sum (that is, add) the percent correct values for these items for a particular year and group. Divide the sum by the number of items summed for that year and group. The result will be the average percent correct score on the common items (for the year and group), which may then be compared over the time periods being evaluated.
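For readers who want to see the arithmetic laid out, the short Python sketch below performs the calculation just described. The item numbers and percent correct values are invented for illustration only; they are not actual ACRE results.

```python
# Hypothetical percent-correct values on the repeated (common) items, keyed by
# 2001 item number, for one group in a past year and in the current year.
past_year = {4: 68.0, 5: 72.0, 6: 55.0, 8: 80.0}       # illustrative values only
current_year = {4: 74.0, 5: 75.0, 6: 61.0, 8: 83.0}    # illustrative values only

def average_percent_correct(item_percents):
    """Sum the percent-correct values for the common items and divide by the
    number of items, yielding the group's average percent correct on those items."""
    return sum(item_percents.values()) / len(item_percents)

print(round(average_percent_correct(past_year), 1))     # 68.8 for these made-up values
print(round(average_percent_correct(current_year), 1))  # 73.2 for these made-up values
```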

3 Refers to question number in Part 1 - Religious Knowledge, of the 2001 NCEA ACRE booklet

Overlapping Faith Knowledge Questions Between the 1992 and 2001 NCEA ACRE Assessments

LEVEL 1 ITEMS               LEVEL 2 ITEMS               LEVEL 3 ITEMS
2001 Item #[3]  1992 Item # 2001 Item #    1992 Item #  2001 Item #    1992 Item #
4               12          5              24           9              45
5               39          6              6            11             7
6               15          10             14           13             34
8               1           11             26           18             24
10              42          12             27           20             39
11              40          13             28           21             49
12              48          14             13           23             52
13              6           15             8            27             8
14              47          21             43           29             18
15              16          26             29           32             40
16              8           27             1            33             61
19              49          28             45           36             32
20              27          30             18           39             37
22              19          35             42           46             28
24              11          36             23           47             16
25              5           38             48           49             46
29              25          40             39           50             1
33              37          41             25           55             48
34              17          44             37           56             25
38              43          48             41           58             36
40              14          50             44           60             14
41              20          53             21           -              -
44              13          54             46           -              -
45              29          55             2            -              -
46              41          57             9            -              -
47              38          -              -            -              -
48              45          -              -            -              -
49              18          -              -            -              -
51              10          -              -            -              -
Total: 29                   Total: 25                   Total: 21



Common Faith Knowledge Questions Across Levels of the 2001 NCEA ACRE Assessments

On occasion during development it was decided that a particular assessment item could be used at more than one assessment level. This decision was made when it was judged that a question merited tracking over the grade/age spans, or that content assessed at one level was appropriate for continued assessment at a subsequent level. Though not relied on often (a change from the previous ACRE edition), the table below identifies questions that are repeated over levels of the 2001 NCEA ACRE assessments. When a local school or parish uses the NCEA ACRE at more than one level, a review of performance on these across-level common items should be informative.



5 Item 35 in Level 1 is in Domain 3 (Worship), whereas its matching item (item 38) in Level 2 is in Domain 5 (Scripture).

6 Item 38 in Level 1 is in Domain 8 (Religious Terms), whereas its matching item (item 21) in Level 2 is in Domain 4 (Sacraments).

7 Item 41 in Level 2 is in Domain 8 (Religious Terms), whereas its matching item (item 56) in Level 3 is in Domain 6 (Morality).

8 Represents the unique number of overlapping items per the column heading.

Overlapping Faith Knowledge Questions Across Levels of the 2001 NCEA ACRE Assessments

DOMAIN                                          Between Levels 1 and 2      Between Levels 2 and 3      Across All Levels
                                                (Level 1 #, Level 2 #)      (Level 2 #, Level 3 #)      (Level 1 #, Level 2 #, Level 3 #)
1. God: Father, Son and Holy Spirit             (45, 26)                    (9, 26)                     -
2. Church: One, Holy, Catholic and Apostolic    -                           (51, 16); (56, 59)          -
3. Liturgy and Sacraments                       (35, 38[5]); (33, 44)       (28, 9)                     -
4. Revelation, Scripture, and Faith             (38, 21[6])                 (29, 37)                    -
5. Life in Christ: Personal Morality and        (35, 38); (49, 30)          (30, 29)                    (49, 30, 29)
   Catholic Social Teaching
6. Church History                               (13, 6)                     (41, 56[7])                 -
7. Prayer/Religious Practices                   -                           (16, 24)                    -
8. Catholic Faith Literacy                      (38, 21)                    (41, 56); (8, 8)            -
TOTALS[8]:                                      6                           9                           1


NCEA ACRE Part 2 - Affective Assessment (Beliefs, Attitudes, Practices and Perceptions)

Part 2 of the 2001 NCEA ACRE assessment provides for an affective appraisal that addresses student personal beliefs, attitudes, practices and perceptions judged to be of interest and concern to religious educators. It is important to remember that for many students certain beliefs, attitudes, practices and perceptions may well be the direct outcome of a religious education program; for other students these may well be primarily the result of other influences such as family, peer group or personal experiences, etc. While the relationship between program content and affective behaviors is somewhat difficult to establish, the NCEA Advisory Committees hold that this form of information, which can be obtained from students, is essential for a balanced assessment and evaluation of religious education outcomes.

The affective statements prepared and assembled for the revised 2001 NCEA ACRE assessment represent seven categories or “themes.” The seven reporting categories or “themes” identified in the table below are consistent with the categories used in the 1992 edition of ACRE. Three tables follow and provide for: 1) the identification of the themes assessed and their reporting category, identification of the statements in the assessment booklets that comprise each category, and the number of statements in each category for each assessment level; 2) the identification of statements that have been repeated/continued from the 1992 ACRE assessment; and finally, 3) a cross reference table that identifies affective statements that repeat across the 2001 NCEA ACRE assessment levels. As the information contained in these tables parallels information discussed in the previous Faith Knowledge description, that presentation is not repeated here.

Affective Statements in Part 2 of the 2001 NCEA ACRE Assessment grouped by Reporting Category[9]

9 Affective themes or categories address student attitudes, beliefs, practices and perceptions judged to be of interest to catechetical/religious education leaders and teachers/catechists.

10 Numbers in the table identify the statement based on its sequence number in the assessment booklet.

Statement numbers[10] by reporting category (Level 1 contains 36 statements; Levels 2 and 3 contain 46 statements each); the count of statements per level follows each list in parentheses.

Relationship with Jesus: Level 1: 1, 7, 13, 19, 23 (5); Level 2: 1, 7, 19, 24, 27 (5); Level 3: 1, 7, 13, 19, 25 (5)
Images of God: Level 1: 2, 8, 14 (3); Level 2: 2, 8, 14 (3); Level 3: 2, 8, 14, 20 (4)
Catholic Identity: Level 1: 3, 9, 15, 20, 24 (5); Level 2: 3, 9, 15, 20, 25 (5); Level 3: 3, 9, 15, 21, 26 (5)
Morality: Level 1: 4, 10, 16 (3); Level 2: 4, 10, 16, 21, 26, 29, 32, 33 (8); Level 3: 4, 10, 16, 22, 27, 30, 33, 34 (8)
Relationships with Others: Level 1: 5, 11, 17, 21, 25 (5); Level 2: 5, 11, 13, 17, 22, 30 (6); Level 3: 5, 11, 17, 23, 28, 31 (6)
Perceptions About Your School/Parish Program: Level 1: 6, 12, 18, 22, 26, 27 (6); Level 2: 6, 12, 18, 23, 28, 31 (6); Level 3: 6, 12, 18, 24, 29, 32 (6)
Students’ Concerns: Level 1: 28, 29, 30, 31, 32, 33, 34, 35, 36 (9); Level 2: 34, 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46 (13); Level 3: 35, 36, 37, 38, 39, 40, 41, 42, 43, 44, 45, 46 (12)


Overlapping Affective Statements Between the 1992 and the 2001 NCEA ACRE Assessments

11 Values recorded in each row identify the identical statement by its booklet number as it appeared in the 2001 and in the 1992 editions of the ACRE assessments.

LEVEL 1 STATEMENTS         LEVEL 2 STATEMENTS         LEVEL 3 STATEMENTS
2001 #[11]   1992 #        2001 #       1992 #        2001 #       1992 #
1            54            1            54            1            61
3            62            3            61            6            143
6            111           6            134           7            68
7            63            7            62            10           86
10           75            10           81            11           109
11           89            11           101           12           132
12           102           12           134           13           62
13           55            13           106           14           129
14           50            14           50            15           99
15           76            15           91            16           97
16           85            16           80            17           112
17           88            17           104           18           134
18           104           18           126           20           57
19           66            19           66            21           102
20           79            20           71            22           84
21           87            21           79            24           125
25           82            23           117           25           58
27           108           25           102           26           110
-            -             26           100           27           108
-            -             27           55            28           105
-            -             28           123           29           131
-            -             30           97            30           96
-            -             31           131           31           114
-            -             -            -             32           140
-            -             -            -             33           89
Total: 18                  Total: 23                  Total: 25


Overlapping Affective Statements Across Levels of the 2001 NCEA ACRE Assessments

Relationship with Jesus
  Between Levels 1 and 2 only (Level 1 #[12], Level 2 #): none
  Between Levels 2 and 3 only (Level 2 #, Level 3 #): (24, 25)
  Across all levels (Level 1 #, Level 2 #, Level 3 #): (1, 1, 1); (7, 7, 7); (13, 27, 13); (19, 19, 19)

Images of God
  Between Levels 1 and 2 only: none
  Between Levels 2 and 3 only: none
  Across all levels: (2, 2, 2); (8, 8, 8); (14, 14, 20)

Catholic Identity
  Between Levels 1 and 2 only: none
  Between Levels 2 and 3 only: none
  Across all levels: (3, 3, 3); (9, 9, 9); (15, 15, 15); (20, 20, 21); (24, 25, 26)

Morality
  Between Levels 1 and 2 only: none
  Between Levels 2 and 3 only: (16, 16); (21, 22); (29, 30); (32, 33); (33, 34)
  Across all levels: (4, 4, 4); (10, 10, 10); (16, 26, 27)

Relationships with Others
  Between Levels 1 and 2 only: (21, 13)
  Between Levels 2 and 3 only: (17, 17); (22, 23)
  Across all levels: (5, 5, 5); (17, 11, 11); (25, 30, 28)

Perceptions About Your School/Parish Program
  Between Levels 1 and 2 only: none
  Between Levels 2 and 3 only: none
  Across all levels: (6, 6, 6); (12, 12, 12); (18, 18, 18); (22, 23, 24); (26, 28, 29); (27, 31, 32)

Students’ Concerns
  Between Levels 1 and 2 only: (28, 34)
  Between Levels 2 and 3 only: (38, 39); (39, 40); (40, 41); (42, 43); (43, 44)
  Across all levels: (29, 35, 36); (30, 44, 45); (31, 45, 46); (32, 46, 35); (33, 36, 37); (35, 41, 42); (36, 37, 38)

TOTALS: 2 statements overlap between Levels 1 and 2 only; 13 overlap between Levels 2 and 3 only; 31 overlap across all levels.

12 Refers to the number of the statement in Part 2, Personal Beliefs, Attitudes, Practices, and Perceptions, of the 2001 NCEA ACRE.


Statements in the Part 2 portion of the NCEA ACRE assessment are unique in one major respect. Unlike the faith knowledge questions, for which there are correct answers, the affective section of the assessment asks each student to self report a belief, attitude, practice or perception. As such, students respond using a scale that reflects the extent of agreement or disagreement with the statement, or indicate the extent to which they perceive a specific problem to exist (ranging from a major problem to not a problem at all). The next section offers guidance regarding how to interpret and use information from this portion of the assessment.

Part 2 of the NCEA ACRE Assessment and Confidentiality

As students prepare to respond to the affective statements that comprise Part 2, they are informed that their responses will be kept confidential. Persons administering the assessment are advised to take steps to assure respondents’ confidentiality. Without an assurance of confidentiality, data from this section can be suspect. To the degree that students believe that program staff will read and take note of their individual responses, the students will be inclined to mark the preferred responses, which may not truly reflect their own beliefs, attitudes, practices and perceptions. Remember this condition and caveat as you explore these results characterizing your students. To further support the assurance made to students, when results are summarized (as discussed in the next section), no individual student Part 2 results are given; only group data are reported, and when a group comprises fewer than six (6) students, no group summary is presented at all. When the gathering of this information follows the administration procedures detailed for students, then these data become a useful and important resource to aid with the review of your religious education programs.

NCEA ACRE Standards-Based Reporting: Information, Rationale and Description

Historically, test scores have taken on meaning when the score of a student or group was compared to the performance of others. The often used “percentile rank” score identifies one’s standing in relation to a group that the student is a member of or to which s/he aspires. The education goal that supported reliance on percentile ranks was “doing better than others,” not necessarily learning to one’s potential or with reference to what is expected or desired. Over the past decade, there has been a shift to “standards-based” scoring and reporting on the knowledge portions of most educational assessments.

Today, assessments such as the revised NCEA ACRE are crafted to reflect adequate coverage of knowledge content and specific standards or instructional objectives that are to have been learned. What content knowledge is to be assessed is thoughtfully planned and balanced to assure proper coverage, and the resultant scores of students and groups point to the percent of questions answered correctly (see the earlier presentation regarding the NCEA ACRE assessment specifications). This simple index, the percent of the questions answered correctly by a student or group, is then compared to carefully determined score cutpoints or performance standards to provide an independent criterion appraisal of the performance of the individual or group. Whereas assessment results for a long period only informed users as to how the student or group performed in relation to others, it is common for today’s assessments to inform users how the student (or group) scored in relation to independent or criterion expectations. The revised NCEA ACRE has embraced this new principle of standards-based reporting.
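As a concrete illustration of this principle (a minimal sketch only: the cutscores shown are hypothetical placeholders, not the cutscores actually established by the NCEA), the Python fragment below converts a raw Part 1 score into a percent correct and compares it to cutpoints to produce a standards-based category:

```python
def percent_correct(num_correct, num_questions):
    """Convert a raw Part 1 score into the percent of questions answered correctly."""
    return 100.0 * num_correct / num_questions

def classify(pct, proficient_cut=60.0, advanced_cut=85.0):
    """Compare a percent-correct score to cutpoints (placeholder values here)
    and return the standards-based performance category."""
    if pct >= advanced_cut:
        return "Advanced"
    if pct >= proficient_cut:
        return "Proficient"
    return "Needs Improvement"

# Example: a Level 1 student answering 41 of the 51 Faith Knowledge questions correctly.
score = percent_correct(41, 51)   # about 80.4 percent
print(classify(score))            # "Proficient" under these placeholder cutpoints
```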

ACRE Reports Enhanced

Users of the revised NCEA ACRE receive group score reports that detail the percent of their students classified as Advanced, Proficient or Needs Improvement. Users who choose to receive individual student score reports also receive specific information that classifies each student’s score on the Part 1 Faith Knowledge section of the NCEA ACRE into these standards-based performance categories.

With the introduction in the 2001-2002 academic year of the revised NCEA ACRE assessment, instructors in parish and school religious education programs were asked to express their thoughts about standards-based reporting. For the most part, the input was positive, as long as the focus and purpose would provide users with guidance to support instruction and curriculum planning information for the local religion program. Users responded affirmatively to a system that would allow a program to gauge how groups of its students performed with reference to independent and challenging knowledge-achievement standards (i.e., the Part 1 portion of the NCEA ACRE assessment alone).

Many also indicated a desire to know how each individual student had performed in relation to criterion expectations on the Part 1 portion of the assessment so they could plan instruction accordingly. However, both NCEA ACRE users and the NCEA developers of the student assessment cautioned against using performance-standard classification based on a single assessment as a “high stakes” evaluation for student proficiency that results in a student’s NCEA ACRE scores directly affecting course grading, promotion, graduation, or course placement. The same caution must be made with reference to the overall performance of a group. Using a single assessment tool is not an appropriate or sufficient basis for making generalized judgments. For example, NCEA ACRE alone is not an appropriate or sufficient basis on which to judge the merit or worth of a local religion program or instructor. The NCEA ACRE is, however, a valuable and key tool to contribute toward evaluation of the strengths and areas of concern of a local religion program. It is also a valuable tool for advising catechetical leaders regarding the strength of the various components that support faith formation: parents, community of faith, liturgy, social outreach, and formal instruction. When NCEA ACRE results are interpreted in the context of the local setting and considered along with other systematically gathered information and data, planning for change becomes intentional rather than capricious.

Arriving at the NCEA Performance Categories and Standards

The NCEA ACRE revision project put in place a process of “performance-standard setting.” The first task was to determine what performance categories would best serve the knowledge-information scores that result from the assessment. After months of discussion and consideration, relying on advice and input from users and national leaders, it was determined that NCEA would use, and NCEA ACRE faith knowledge total scores would be classified into, three categories: Advanced, Proficient and Needs Improvement. Due to the relatively small number of questions available on some of the NCEA ACRE level assessments, a decision was made not to set score expectations for Domain or Pillar scores.

The categories are used with each NCEA ACRE level assessment and serve as the performance-standards classification for both school and parish users. As the performance-standards categories were confirmed, performance category descriptions were created by the NCEA. The complete descriptions are presented in the box on a later page. As is the case in many state academic achievement assessment programs, the categories have specific and intended meanings. As established by the NCEA, the categories are intended to convey the following.

The Advanced category is intended to clearly signal strong, outstanding achievement and evidence of exceptional faith knowledge learning. This is the ultimate cognitive goal for all students in any religion program, but one that will not be achieved without effort by students, catechists, and the faith community alike.

The Proficient category represents the expected achievement level for all students. The Proficient category does nevertheless represent a challenging goal: students and religion programs that are not focused on core and essential faith knowledge and learning will not attain Proficient status.

The Needs Improvement category signals that more work is needed or that the extent of the student’s faith knowledge is insufficient even at a minimum expectation level.

The current NCEA ACRE is a longer assessment than prior editions. Individual and group score results will be useful for identifying areas of strength and weakness. For example, not only does NCEA ACRE report overall pupil performance along with the resulting performance classification, but individual and group summary reports also identify the levels of scores attained in eight key domains, with results disaggregated for local review and inspection. The current edition of the NCEA ACRE provides a comprehensive assessment, well planned and equipped to assist local program personnel in identifying strengths and weaknesses in the faith knowledge learning of students and groups of students. Standards-based reporting information provides users with a perspective on current standing and, when NCEA ACRE is used annually, on progress toward meeting expectations for religious education learning over time.

Identifying the Performance Standard Cutpoints

Throughout spring 2002, random samples of all NCEA ACRE users were invited to provide their judgment concerning suitable and appropriate “cutscores” using one of two distinct and independent procedures. A formal standard-setting judgmental procedure, known as the Angoff method, was used to solicit instructor input. Using this procedure, religious education instructors judged the difficulty of the assessments’ faith knowledge questions and provided their expectation for the scores that would need to be attained to result in classification into the three categories. A standard-setting empirical approach, known in the testing literature as the Contrasting-Groups method, was also used to obtain an independent and confirming perspective (a second opinion, so to speak). This procedure relies on teacher judgment about students they taught and who had taken the assessment. Data using this method were collected on over 10,000 students. Multiple cutscore procedures were employed because no one method is best suited to such a task, and each method is based on different assumptions from which to establish the eventual cutscores. Once information from each method was gathered and analyzed, an NCEA advisory panel was convened to review the findings from each method, synthesize the results, review actual student and group performance, and recommend cutscores to NCEA.
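For readers unfamiliar with the Angoff approach, the sketch below illustrates the basic computation in its commonly used (modified Angoff) form: each rater estimates, item by item, the probability that a minimally proficient student would answer correctly; each rater's estimates are summed; and the rater sums are averaged to suggest a cutscore. The rater values here are invented for illustration and do not reproduce the judgments gathered by the NCEA.

```python
# Illustrative modified-Angoff calculation for a ten-item example.
# Each value is one rater's judged probability that a minimally proficient
# student would answer that item correctly.
rater_estimates = [
    [0.60, 0.70, 0.55, 0.80, 0.65, 0.50, 0.75, 0.60, 0.70, 0.65],  # rater 1
    [0.55, 0.75, 0.60, 0.85, 0.60, 0.45, 0.70, 0.65, 0.75, 0.60],  # rater 2
    [0.65, 0.70, 0.50, 0.80, 0.70, 0.55, 0.80, 0.55, 0.65, 0.70],  # rater 3
]

# A rater's implied cutscore is the sum of his or her item probabilities:
# the expected number of items a minimally proficient student would answer correctly.
rater_cutscores = [sum(estimates) for estimates in rater_estimates]

# The panel's suggested cutscore is the average of the individual rater cutscores.
suggested_cutscore = sum(rater_cutscores) / len(rater_cutscores)
print([round(c, 2) for c in rater_cutscores])   # [6.5, 6.5, 6.6] for these made-up values
print(round(suggested_cutscore, 2))             # about 6.53 items correct
```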


To arrive at their recommendations, advisory panel members considered questions such as:

School and parish programs differ in their opportunities to work with students, so should cutpoints for classification differ for these two groups?
The assessment is administered at different times in the year, so should classification expectations vary in relation to which month the assessment occurs?
Should expectations intended by the common categorical labels vary by the NCEA ACRE levels?
When a form of NCEA ACRE is used at different grades (e.g., Level Three is intended for grade 11 or 12 students), should the performance expectations change to reflect the differing ages/grades of the students taking the assessment?

On these issues, the national ad hoc panel judged that the NCEA ACRE should identify high expectations for all students regardless of these factors. NCEA ACRE has one set of test specifications that concentrates on core faith essentials; therefore, it should also have only one set of performance standards. That judgment was reaffirmed by the NCEA. With input from the national ad hoc panel, the NCEA determined the cutscores that create each performance category.

Monitoring and Evaluating Achievement as Evidenced in NCEA ACRE Performance

Incorporating standards-based reporting and referencing scores based on three performance groups on the knowledge portion of the NCEA ACRE is intended to support the catechetical/religious education efforts in both school and parish religion programs. The standards-based results are not meant to hold programs strictly accountable to independent standards established by an external body. This would be a misuse and a misunderstanding of the standards reporting information that comes from the NCEA ACRE assessments.

Users and reviewers of results realize that the scores a student attains are not solely a result of a religious education instructional experience. Home, community, parish, and school, in combination with dispositional as well as student learning-cognitive readiness factors such as reading level, language proficiency, acculturation, motivation, and when and at which grade the assessment is taken, all affect and influence scores. Results must be reviewed and decisions for improvement made in consideration of these conditions, any one (or combination) of which can influence each student differently.

Establishing Local Program Performance Standards and Cutscores

Investigating the alignment of the local program’s curriculum to the NCEA ACRE content coverage needs to be the first step toward interpreting results (see Appendix C in this Manual). While NCEA has considered many issues that have a bearing on the determination and finalization of cutpoints for the assessment, local programs may wish to consider setting their own cutpoints to aid local program review and evaluation. In the very specialized and very quantitative world of educational testing and psychometrics, there are a number of formal methods and procedures that can be followed to arrive at cutscores for a cognitive assessment. While guidance can be had from applying such approaches (and indeed such approaches were at the basis for establishing the NCEA ACRE cutscores), in a local school or parish setting more sensitive, understandable, and equally defensible approaches can be used to arrive at local cutscore standards. Programs that desire to identify cutscores of their own creation could proceed as follows; a six-step process is recommended.


1. The first step needs to be conducting a local alignment evaluation as detailed in Appendix C of this manual. While it will take some time (a couple of hours), we cannot overstate the very real benefits for religious education instructional staff of reviewing systematically, as a group, their local religious education program in consideration of the NCEA ACRE religious education curriculum coverage. As your local alignment evaluation is completed (again, Appendix C provides instructions), teachers/catechists will have gained familiarity with the actual NCEA ACRE assessment items. Being knowledgeable and keenly aware of the assessment questions is a vital element in determining the cutscores for any assessment tool. The alignment evaluation and the “local standard setting review” described below should be carried out separately for each NCEA ACRE assessment and grade assessed in the local program. Dioceses and parish programs that use NCEA ACRE across different locations (schools, parishes, etc.) might wish to identify common grade cutscores for use across the multiple locations. It is important, as the assessments are very different by level, that the following activities give consideration to unique cutscores for each NCEA ACRE level and assessed grade.

2. Immediately following completion of the alignment evaluation (Step 1 above), teachers/catechists should begin to review the NCEA ACRE Performance Standards categories and definitions (see the chart below), and discuss these expectations for their students and program. This discussion needs to attend to the reasonableness of the NCEA descriptions and expectations in consideration of the local context as it impacts instruction, the local religious education offering, the community, and students. This discussion and review must occur without reference to or information regarding the actual cutscores used by the NCEA, or the scores attained by local or national students. Focus the discussion on: are the NCEA Standards reasonable or unreasonable for your program and students, and why or why not? All participants should be invited and encouraged to have input to this discussion (expected time: 15 to 20 minutes).

3. Following from the above, Step 3 is a discussion among instructors of general expectations regarding faith knowledge learning, with specific attention to the assessment itself. How do the teachers/catechists judge the NCEA ACRE assessment: do they see it as generally hard, of moderate difficulty, or easy? They need to discuss and reflect on how students will do and how they should do on the assessment; they need to revisit how well the actual assessment matches the local curriculum, referring back to the alignment evaluation in Step 1 (a strong match suggests, relatively speaking, higher cutscores, etc.); what are reasonable expectations for your students given the religion program/curriculum that is offered; when in the academic term will the assessment be administered, and how might this influence performance; and how do the instructors find the actual test items: fair, trivial, adequate, representative, relevant, etc.? (At this step, participants should avoid stating the particular score values they would recommend as cutscores.) The Step 3 discussion is intended to focus on the assessment itself and its properties and features in relation to your students and program (perhaps 30 minutes of discussion).

4. Step 4 is the point where each person participating in the review of standards, the assessment itself, and the discussion of performance expectations on the NCEA ACRE writes down what s/he judges to be appropriate cutscores that would classify students into the NCEA ACRE categories. There are two key decisions that each participant must answer: in the individual's professional opinion, what is the minimum score a student should obtain to conclude the student is Proficient (as defined by the NCEA Proficient expectation); and, second, what is the minimum NCEA ACRE assessment score that results in the Advanced classification. These two judgments are to be made "privately and alone." It is preferred/recommended that they be recorded by each participant as the number of items with a correct response (not a percent). Again, the task needs to be completed by each participant contemplating the decision alone, but informed by the activities and discussion of the prior three (3) tasks. What each participant records should be returned to one person, who prepares a summary of the "individual recommendations" (e.g., the average score recommended for each category, the highest and lowest recommendation, etc.) and then immediately returns it to participants for continuing group discussion; a small tallying sketch follows Step 6. (5 to 8 minutes)

5. In Step 5, the group reviews and discusses the summary information from the "individual recommendations." Each person should briefly explain his/her thinking and rationale for the recommended cutscores. At this stage, participants should be informed of the cutscores used by the NCEA ACRE, and portions of the preceding paragraphs should be shared with the participants as to the rationale, process and justification for the NCEA cutscores. By the end of this discussion, each participant needs to finalize her/his recommendations for the category cutscores. As the discussion winds down, participants should record their final recommendations (working independently). These recommendations should be collected and given to the person in charge of establishing the "final local standards." (15 minutes)

6. We strongly recommend that the local instructor/participant recommendations be the basis for a program's determination of its own standards in relation to its own context. As decided locally, final cutpoints can be set down by school and parish leaders in consideration of the recommendations of individual program staff. It is also advisable for the decision makers to review how students in the local program and in the national program are actually scoring on the NCEA ACRE. An important criterion for cutscores is that they present reasonable expectations for student performance. Finally, it is advisable that this process be repeated approximately every three years so that the local performance cutscores can be fine-tuned as needed and justified.
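To illustrate the tallying described in Steps 4 and 5, the short sketch below (Python; the recorded values are invented for illustration and are not NCEA recommendations) shows how the privately recorded judgments can be summarized into the average, lowest and highest number-correct value recommended for each category boundary before being handed back to the group:

    # Minimal sketch: summarize participants' private cutscore recommendations.
    # Each participant records the minimum number of items correct s/he would
    # require for "Proficient" and for "Advanced" (invented values below).
    recommendations = {
        "Proficient": [33, 35, 34, 38, 36],
        "Advanced":   [45, 47, 44, 48, 46],
    }

    for category, values in recommendations.items():
        average = sum(values) / len(values)
        print(f"{category}: average {average:.1f}, "
              f"lowest {min(values)}, highest {max(values)}")

The same summary can, of course, be prepared by hand; the point is simply that only the average, lowest and highest recommendations need to be shared back with the group.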

NOTE: Setting cutscores on assessments is a process. The process needs to assure that participants are familiar with community expectations and goals, are informed and guided by a high degree of familiarity with the assessment itself, and set out criterion category cutpoints that are not only defensible but reasonable. The steps above describe a process of review, discussion, reflection and advisement. The cutscore process does not lead to or "identify" firm, clear or known truths; it can only assure that the resulting cutscores are fair and representative. Score reports and group summaries returned to users provide extensive detail to allow a school or parish program to judge performance based on the NCEA cutscores as well as to incorporate performance standards that may be set down by the local user.


The NCEA Performance Standards, Categories and Cutscores

NCEA ACRE's performance standards categories are intended to support local programmatic use, not to usurp or interrupt local religion program planning. Standardized assessments such as NCEA ACRE are tools that can inform initiatives for change, but that information must be used in the context of local conditions, goals and expectations. NCEA ACRE reports also provide users with extensive documentation referencing national group performance, and national information should also contribute to evaluating local performance. Local context information helps to form the background in which a local assessment coordinator attends to preparation activities (i.e., the curriculum alignment review using Appendix C, and setting local standards and expectations) before actually administering NCEA ACRE to the students. A chart of the standards categories, including descriptions, and the cutscores used to classify student Part 1 faith knowledge scores follow. It is these categories and cutscores that are used to prepare the individual student and group summary statistical reports returned to NCEA ACRE users.



NCEA ACRE Performance Standards Categories and Expectations

NCEA has adopted the following performance standards definitions to assist in categorizing student and group performance on the revised NCEA ACRE. The performance standards were developed by NCEA to support standards-based program review and to set in place expectations for student performance. Score reports are configured to provide displays and tabulations with reference to group and individual student performance defined by these categories. These categories are not intended to replace or become local performance standards or expectations. Other summaries and statistical tabulations based on actual observed assessment scores (averages, percent correct, assessment-question statistics, etc.) continue to be provided to users. Review of all results is strongly encouraged.


The following table identifies the student score decision values (cutscores) that result in category classification for each NCEA ACRE level assessment. Classification occurs based on individual student performance (the number of questions answered correctly) on the Faith Knowledge portion (Part 1) of the assessment.

NCEA ACRE FAITH KNOWLEDGE CLASSIFICATION CUTSCORES

Level I   (51 Faith Knowledge questions):  Needs Improvement, fewer than 33 correct (less than 64%);  Proficient, 33 to 44 correct (64% to 87%);  Advanced, 45 or more correct (88% or higher)

Level II  (57 Faith Knowledge questions):  Needs Improvement, fewer than 37 correct (less than 64%);  Proficient, 37 to 49 correct (64% to 86%);  Advanced, 50 or more correct (87% or higher)

Level III (63 Faith Knowledge questions):  Needs Improvement, fewer than 41 correct (less than 65%);  Proficient, 41 to 54 correct (65% to 86%);  Advanced, 55 or more correct (87% or higher)

The next section of this Manual presents information specific to the group and student result reports prepared and returned to users.

Advanced Category
Students at this level consistently demonstrate superior knowledge and understanding of the faith knowledge assessed, appropriate to the student's age and stage of faith formation, with balanced and exceptional strengths in evidence relative to the key concepts supporting the core faith knowledge themes and domains assessed.

Proficient Category
Students at this level demonstrate satisfactory knowledge and understanding of the faith knowledge assessed, with evidence of acceptable command of key concepts that support core faith themes and domains assessed, appropriate to age and stage of faith formation. Performance generally is strong and competent, with few deficiencies, and these typically are not substantially concentrated in any one faith knowledge domain or theme.

Needs Improvement Category
Students at this level demonstrate a below-basic knowledge and understanding of the faith knowledge assessed that is not yet fully satisfactory for the age and stage of faith formation. There is a need for improvement, as evidenced by inconsistent and minimal command of the key concepts that support core faith themes and domains, with deficiencies substantially concentrated in single or in multiple domains or themes.
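For readers who want to see the cutscore logic spelled out, the following minimal sketch (Python; the function and dictionary names are ours, not NCEA's) applies the cutscores tabled above to a Part 1 number-correct score:

    # Minimal sketch: classify a Part 1 Faith Knowledge number-correct score
    # using the NCEA ACRE cutscores tabled above.
    # level -> (minimum correct for Proficient, minimum correct for Advanced)
    CUTSCORES = {
        "I":   (33, 45),   # 51 Faith Knowledge questions
        "II":  (37, 50),   # 57 Faith Knowledge questions
        "III": (41, 55),   # 63 Faith Knowledge questions
    }

    def classify(level, number_correct):
        proficient_cut, advanced_cut = CUTSCORES[level]
        if number_correct >= advanced_cut:
            return "Advanced"
        if number_correct >= proficient_cut:
            return "Proficient"
        return "Needs Improvement"

    # Example: a Level III student answering 44 of 63 questions correctly is Proficient.
    print(classify("III", 44))

A program that sets its own cutpoints (see the six-step process above) can substitute its local values into the same logic.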


NCEA ACRE Score and Summary Reports

Each school or parish administering the 2001 NCEA ACRE receives the following group reports.

1. Part 1 Faith Knowledge Item-by-Item Summary Reports displaying group performance on each Part 1 question, organized and presented by the eight Domains assessed (see Score Report 1 for an illustration)

2. Part 1 Total Score, Standards Classification and Domain summaries showing group performance on each Domain of the NCEA ACRE assessment (see Score Report 2 for an illustration of this score and standards-based reporting information)

3. Part 1 Score summaries showing group performance on each of the four Pillars of the Catechism of the Catholic Church based on the NCEA ACRE assessment (see Score Report 3 for an illustration)

4. Part 1 Faith Knowledge Score Frequency Distributions based on the NCEA ACRE assessment (see Score Report 4 for an illustration)

5. Affective Statement Summary Reports summarizing group responses to the Part 2 statements, organized and presented by student information categories (see Score Report 5 for an illustration)

6. Affective Statement Summary Report summarizing total group responses only to the Part 2 Student Concerns statements (see Score Report 6 for an illustration)

Score Reports 2, 3, 4, and 5 present results for four groupings of students: 1) all students combined, 2) Catholic students only, 3) non-Catholic students, and 4) students who have been in the local program for two (2) years or longer. For all score report types 1 through 6, score summaries and question-by-question summaries are given for a group only when six (6) or more students comprise the group. This restriction is imposed to limit the possibility that specific individuals may be identifiable in "summary reports" when the number of students in a group is very few.

In addition to the group reports, three other optional reports are provided when selected by the local user:

7. Local School/Parish Faith Knowledge Question Summary Report displaying group performance on each locally created and supplied faith knowledge question and for a total score across these questions (see Score Report 7, two pages, for illustrations). This option is available to users in schools or parishes desiring to add questions to the faith knowledge assessment. Appendix D provides an overview of the process and procedure that yields these reports. The service is provided at no charge.

8. Individual Student Report Summary listing showing all individual students assessed and their performance (Total Score, NCEA Performance Classification, Domain and Pillar scores) based only on responses to the Part 1 Faith Knowledge questions (see Score Report 8 for an illustration). This report becomes available only when Individual Student Reports are ordered. See next.


9. Individual Student Report, one student to a page, presenting the student's Total Score (on the Part 1 Faith Knowledge portion of the assessment only), the NCEA Performance Classification, Domain and Pillar scores, and identifying the student's performance in comparison to all students assessed locally and to national performance information for students at the same grade (see Score Report 9 for an illustration). There is a charge to obtain the Individual Student Reports. If you wish to receive these reports, contact CETE (1-866-367-2383). Ordinarily, Individual Student Reports are requested when ordering NCEA ACRE services, but these reports can be produced at any time.

Score data in the reports are computed, configured and detailed using a few common quantities: the actual score attained by the student (number correct) and the resulting performance category classification (Advanced, Proficient, Needs Improvement), the percent correct score (number correct divided by the total points possible, times 100), averages (means) of these indices for a particular group, and the score distribution of student performance by grade and student information. Question-by-question summaries identify the percent of students making a correct (or preferred) response, and the percent of students choosing each incorrect (non-preferred) response choice. Access to revised NCEA ACRE booklets may be of use when reviewing results, but the actual assessment questions have been abridged for presentation in the reports.
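As a minimal sketch of how these common quantities relate to one another (Python; the scores below are invented), percent correct scores and a group average can be computed directly from number-correct scores:

    # Minimal sketch: the common score quantities used in the reports.
    points_possible = 63                       # e.g., a Level III Faith Knowledge assessment
    number_correct = [44, 51, 38, 47, 55, 40]  # invented scores for a small group

    percent_correct = [100 * score / points_possible for score in number_correct]
    group_average = sum(percent_correct) / len(percent_correct)

    print([round(p, 1) for p in percent_correct])  # each student's percent correct score
    print(round(group_average, 1))                 # the group average (mean) percent correct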

A few notes regarding accuracy: Because percentages are used to convey findings in most of the report summaries, rounding errors can occur and will be observed; that is, sums will sometimes total 99 percent or 101 percent due to the rounding of the percent quantities at different stages in the tallying and analysis process. Also recall that student grouping information (Catholic or not, time in program, etc.) is captured as self-reported information from individual students at the start of the NCEA ACRE assessment on a response form/answer sheet. Accuracy of this classification information can only be assured by local assessment administrators, and the reports assume the information as recorded is accurate. Related to this, should any of the student identifying information be omitted by the student, that student's scores (total, classification, domain and pillar) are not included in reports that are compiled based on that information. Thus, as an example, you could find that the total number of students for an "All Students" analysis does not equal the sum of the counts shown for the Catholic and non-Catholic tabulations combined.

In addition to detailing and summarizing Local performance, many of the reports provide results referencing National performance. National summaries offer comparative statistical indices and have been computed based on all administrations of the particular NCEA ACRE assessment during the previous academic year. National statistics were computed and are reported for a particular grade. If the user assessed students at a non-traditional grade (e.g., giving the NCEA ACRE Level 3 assessment to grade 10 students), the National statistics shown are for the grade closest to the traditional grade for which national statistics were available for the NCEA ACRE administered. In this illustration, the national statistics shown would be for grade 11 students. When a report shows performance for a specific student category (e.g., Catholic students only), the National statistic summary is computed and shown for all students in that grade for that information category.
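If it helps to see the grade-matching rule concretely, this minimal sketch (Python; the function name and the list of grades with available national statistics are illustrative assumptions, not NCEA's actual implementation) selects the national-comparison grade as described above:

    # Minimal sketch: pick the national-comparison grade for a report.
    # Assumes national statistics are available only for certain grades for the
    # level administered (the grade list below is illustrative).
    def closest_national_grade(assessed_grade, grades_with_national_stats):
        # Return the grade with available national statistics nearest the assessed grade.
        return min(grades_with_national_stats, key=lambda g: abs(g - assessed_grade))

    # The example in the text: a Level 3 assessment given to grade 10 students is
    # compared against grade 11 national statistics.
    print(closest_national_grade(10, [11, 12]))   # -> 11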


Those programs ordering Individual Student Reports will very often wish to give these reports to parents (the Student Summary Listing can serve as the school/program's file copy) or share them with the student. Sharing reports with parents and the student is encouraged by the NCEA. (See the sample letter from an institution to a parent at www.ncea.org.) Should parents wish to review the assessment itself, this is allowed, but only under the supervision of a staff member or administrator. Under no condition are assessments to be copied, distributed or disseminated, or used to create instructional material. Booklets and questions will be re-used, and security is important to assuring a fair annual assessment.

A close and thoughtful review of the results is encouraged. The NCEA offers, and is available for, phone consultation, workshops and assistance in using the results to monitor and evaluate local performance. Contact the Department of Religious Education with inquiries (1-202-337-6232). Below is an overview of the information presented and summarized in each report type noted above. Exemplar reports are included to assist the presentation. While there are multiple reports for different configurations of students, only one illustration is provided for each report template. Presentation and interpretation of results is the same regardless of the group report inspected. The following reports contain imaginary data offered for illustrative purposes only.

1. School/Parish Summary Reports: Part 1 Faith Knowledge Items by Domains

In the upper right-hand corner are identified the School or Parish program, the diocese in which the program is located, the grade of the students assessed, and when the assessment was administered. Also identified is the number of students comprising each student group: all students, Catholic students, non-Catholic students, and students who have attended the school or parish program for two years or more. The question-by-question analysis shows the percent of students selecting each response choice (for each of the reported groups), including an indication of the percent of students who did not attempt the question (Omit). An asterisk denotes the correct answer to each item. Also presented is an abridged statement of the actual assessment question (Item Description) to facilitate interpretation and use.

As there are eight Faith Knowledge Domains represented in the revised NCEA ACRE, this report is prepared and organized for each Domain. NCEA ACRE items are grouped and presented by the Domain to which each item was assigned by NCEA ACRE advisors. To facilitate your use and interpretation, note that within each domain listing the items are presented in order from the easiest to the most difficult question for your students. As results for up to four different groups of students are summarized (all students assessed at the grade, Catholic students only, non-Catholic students, and students attending the school/parish program for two years or more), the ranking of the items from easiest to most difficult is based on the performance of the Catholic students. While up to four different configurations of students are possible, results for items are presented only when there are 6 or more students in a group. Reviewing briefly the summary of results offered with reference to the Score Report 1 illustration, 172 Grade 12 students were assessed from Our Lady School. In this group 126 are Catholic, 41 are non-Catholic, and 168 of the students have attended the program for two years or longer. Note that 5 students did not identify themselves as Catholic or non-Catholic. The easiest item in this Domain was question 26, followed by 18, then 58, 27 and so on (six item summaries are printed per page). Referring to Question 50 in the illustration, 78% of all examinees responded correctly (response choice B), whereas 73% of the non-Catholic students answered this question correctly. The most common incorrect response made to this question by students in any group was response choice D.


[Score Report 1 illustration: NCEA ACRE Part I Faith Knowledge Items by Domains; Domain 1: God, Father, Son and Holy Spirit. School/Parish: Our Lady School; City: Kansas City; Diocese: Kansas City, KS; Grade 12; Date of Assessment: Nov 2002; 172 total students (126 Catholic, 41 non-Catholic, 168 returning). For each item in the Domain (here items 26, 18, 58, 27 and 50, with abridged Item Descriptions such as "Jesus is truly God and truly man," "Meaning of the Trinity," "Foundation of our hope for union with God," "Identify Christian description of God," and "Jesus welcomed sinners"), the report lists the percent of students selecting each response choice A-D and Omit, together with a bar showing the percent correct, for groups A (all students), C (Catholic students only), N (non-Catholic students only) and R (returning students, attending the program two years or more). The correct answer is marked by an asterisk; within each domain, items are ordered from strongest to poorest performance for the Catholic students; when there are fewer than 6 students in a category, no summary is reported for that group.]


When reviewing the specific assessment question results, a number of elements of performance deserve your consideration. Here are a few queries to assist and guide your review. Other questions are possible and equally important.

First, do different groups of students perform similarly on the questions, or are unexpected discrepancies observed? Are the incorrect response profiles for items similar across the groups?

Continuing, does the ordering of the difficulty of the questions align with or confirm your expectations (what did your alignment review/evaluation show)? If not, why not? Is there a curricular or instructional issue that needs attention?

When an item is answered incorrectly by a sizable number of students, what does the profile of incorrect responses reveal? Do incorrect respondents converge on a particular incorrect choice, or do they spread out among the incorrect choices? Spread among the response choices suggests the content may not have been taught, or not taught effectively. When many students center on one incorrect choice, this signals that instruction is needed; the point of confusion is partly identified by the incorrect choice made, and misinformation may have been passed on to students.

How high is performance on the easiest questions for your students? Is this acceptable? In addition, what are the characteristics and content properties of the most challenging items for your students? Why are students having difficulty with these questions? What does the alignment review suggest? Can you identify a plan to correct deficient areas and information?

2. School/Parish Summary Report: Part 1 Total Score, Standards Classification and Domain Summary

At the top of each page are identified the School/Parish program, its diocesan affiliation, the group of students being summarized, the grade of these students, when students were assessed, and the number of students in the group for which information is reported. The group summarized in the report is identified in the upper right-hand corner. This report is repeated for groups comprised of six or more students (possible report summaries: all students, Catholic students, non-Catholic students, and those attending the program for at least two years).

The results are presented for the Part 1 Total Faith Knowledge score (current-year local and academic years 2001-2002 and 2002-2003 national averages) and the percent of students scoring in each of the three NCEA performance standards categories (both local and national rates are shown). Following these Total Score summaries, look on the left-hand side of the page for the Domain group average (mean) scores, the number of points possible for the Domain, and the average percent correct for the Local and National groups on each set of Domain questions. The National averages and percentages were determined using all actual results from the tens of thousands of NCEA ACRE school and parish students assessed during the baseline school years of the Revised NCEA ACRE. (National summaries are computed for the specific grade and particular student group for which results are shown.) The national statistics are based on all students sitting for the Revised NCEA ACRE assessment during academic years 2001-2002 and 2002-2003, and were computed to represent the average across these initial (or baseline) years. Annual results are available and are distributed annually by the NCEA.


[Score Report 2 illustration: NCEA ACRE Part I Total Score, Standards Classification and Domain Summary; group: All Students. School/Parish: Our Lady School; Diocese: Kansas City, KS; 172 students; Grade 12; Date of Assessment: Nov 2002. Part I Faith Knowledge (63 items): group average 70.1% correct, national average 69.6%; classification percentages, local versus national: Advanced 9.9% vs. 18.9%, Proficient 60.5% vs. 47.8%, Needs Improvement 29.7% vs. 33.3%. Domain averages, group versus national: 1. God - Father, Son and Holy Spirit 6.2/8 (78%) vs. 6.1/8 (76%); 2. Church - One, Holy, Catholic and Apostolic 6.5/8 (81%) vs. 6.4/8 (80%); 3. Liturgy and Sacraments 5.2/7 (75%) vs. 5.2/7 (74%); 4. Revelation, Scripture and Faith 4.7/8 (59%) vs. 4.8/8 (60%); 5. Life in Christ - Personal Morality and Catholic Social Teaching 6.4/10 (64%) vs. 6.3/10 (63%); 6. Church History 4.7/8 (58%) vs. 4.8/8 (60%); 7. Prayer/Religious Practices 6.3/8 (79%) vs. 5.9/8 (74%); 8. Catholic Faith Literacy 4.1/6 (68%) vs. 4.0/6 (67%). For each Domain the report also shows the percentage of students scoring in the high, middle and low number-correct ranges. Footnotes: the National Average is based on all scores from the 2001/2002 NCEA ACRE administrations; the distribution gives the percentage of your students scoring in the identified "number correct" range.]


In the Domain section of this report, on the bottom right-hand half of the page, the Domain scores of individual students have been tallied (No. Correct) to show the group's score distribution. That is, for the Number Correct score ranges shown for each Domain, the percent of students attaining scores in specific ranges is displayed. For each Domain, also reported on the bottom right-hand side is the percent of students attaining scores in three identified Domain score ranges (low, mid, and high score bands).

As mentioned, this score summary report is provided for up to four groups of students (when there are at least six students available to comprise a group). To assist your interpretation of these reported summaries, refer to the Score Report 2 illustration. Shown is the summary for all students assessed at Our Lady School. In all, 172 students took the assessment. On the Total Assessment for the Part 1 Faith Knowledge assessment, there were 63 points possible (that is, 63 total questions), and this group scored an average percent correct of 70.1%. The national average on the Total Assessment for all grade 12 students is 69.6%. On the Total Assessment, 9.9 percent of the local school group achieved percent correct scores that would classify them in the Advanced performance category, 60.5 percent had scores placing them in the Proficient performance category, and 29.7 percent of the students had scores indicating that they are in Need of Improvement, based on the performance categories' definitions and cutscores. The national percentages of grade 12 students classified in each of the three performance categories are also given: Advanced – 18.9%; Proficient – 47.8%; and Needs Improvement – 33.3%.

Considering the results reported for Domain 3 (Liturgy and Sacraments), the group had an average score of 5.2 out of a possible maximum of 7 questions, which is an average percent correct for the group of 75%. The estimated national average for this Domain for students at grade 12 is also a score of 5.2 correct out of a possible 7, which converts to a national average percent correct of 74%. Note that the percent correct values differ for what appears to be the same average number of items correct. The reason for this apparent discrepancy is that the average number-of-items-correct values reported (5.2) are rounded, while the values used in computing the percent correct are more precise. Such apparent discrepancies may occur occasionally in reported values due to the rounding of numbers for reporting, but the values reported are correct given the more precise computations. For Domain 3, 72.7% of the students achieved scores of 6 or 7 correct, 26.7% attained scores between 3 and 5, and 0.6% received scores between 0 and 2 correct.
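The rounding behavior described above is easy to reproduce. In this minimal sketch (Python; the precise averages are invented to mirror the example), two group averages that both display as 5.2 items correct yield the different percent correct values of 75% and 74%:

    # Minimal sketch: rounded display values versus precise computations.
    points_possible = 7
    precise_averages = {"Local": 5.23, "National": 5.17}   # invented precise values

    for label, average in precise_averages.items():
        displayed = round(average, 1)                        # both print as 5.2
        percent = round(100 * average / points_possible)     # computed from the precise value
        print(f"{label}: displayed {displayed}/{points_possible} = {percent}%")

    # Output: Local: displayed 5.2/7 = 75%
    #         National: displayed 5.2/7 = 74%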

As you study and review the results of your students, consider the following questions to guide your analysis and interpretation. These reports present information at the "score" level, that is, for Total Scores and Classification based on the Part 1 assessment and by Domain. The perspective one needs to take with such results is more global and holistic. One related consideration: for the most part, 6 to 10 items comprise the measurement in each Domain. The fewer the number of items (e.g., Church History, Domain 6, Level 1 includes four questions), the less stable the assessment can be. Be careful not to over-interpret results in such situations. Consider the following inquiries regarding your results:

How did our students fare, given our goals and expectations, based on the NCEA Standards classification? How "at ease" are we about these results? What were our expectations – did we have expectations similar to or different from the NCEA Standards?

Is performance in each Domain acceptable? Are there areas that seem strong, others that need improvement?

Are there large discrepancies found when our local groups are compared? Is this understandable and explainable?


How does local performance compare to the estimated National Averages shown? Given your students and your expectations, is your comparison favorable or of concern? How should your students be performing?

Is your pattern of performance across Domains similar, or are there noticeable differences? Are the differences expected or unexpected given the Domain emphases in your curriculum?

Do students distribute across the possible Domain score ranges? If some students are learning and appear to have knowledge (scoring high), why don't all or most students perform at this high level? Are there some Domain scores that are especially low? What is the coverage in your curriculum in these areas? Is some evaluation needed?

Is improvement needed for many of your students? Are some students seriously lacking content knowledge? What are the next steps you need to take/consider?

3. School/Parish Summary Report: Part I Score Summary by the Four Pillars of the Catechism of the Catholic Church

These reports, one for each of the four groups containing six or more students, parallel the Domain information in the reports discussed above (Score Report 2). What distinguishes this Pillar report from the Domain report is that scores have been computed based on the alignment of the questions to the four Pillars of the Catechism of the Catholic Church. The items that constitute this arrangement were identified earlier in this manual. Of note regarding the Pillar scores is that not all questions presented in the NCEA ACRE assessment are used to create these scores. As was done for the Domain summaries, National Averages were computed using test information and actual data from all NCEA ACRE administrations during the previous academic year. Refer to the Score Report 3 illustration and consider the following: shown in the illustration is the performance summary for the 126 grade 12 Catholic students from Our Lady Elementary. With regard to Pillar 4 (Christian Prayer), these 126 students attained an average score of 8.0 on the 11 questions that comprise the assessment for this Pillar, for an average percent correct of 73%. The National average performance on these items for students at this grade is an average score of 7.6 (or 69% correct on average). Continuing, 78.6% of the 126 students attained scores between 8 and 11, 21.4% scored from 4 to 7, and no students (0%) scored 3 or lower.

Local review of your results needs to consider issues like those posed previously for the Domain report, with the added condition of relating findings to student knowledge acquisition and curriculum emphases, as well as to expectations for faith knowledge learning with specific reference to the tenets of the Catechism. Consider also, given that the Pillar scores have been tallied based on Catechism alignment: do the Pillar results offer any sharper or clearer insights beyond the Domain scores? Also note that although Pillars 2 and 4 have some questions in common with Domains 3 and 7 respectively, the questions that make up these companion scales do differ somewhat. How did your students score across these companion scales, and what information can be gleaned from such a comparison?

4. School/Parish Summary Report: Part I Faith Knowledge Total Score Frequency Distributions

For this report, the heading information in the upper right-hand corner remains consistent with the prior reports in the series. The score information reported details the frequency tables of Part 1 total scores attained by students for each of the four student reporting groups (all, Catholic, non-Catholic, and returning students). As has been discussed, when fewer than six (6) students comprise a group, that group's table is omitted from the report. For a group that is presented, shown in each table are the individual scores that define each NCEA performance category (Advanced, Proficient, Needs Improvement) for that assessment, the number of students attaining a particular score (Number), and the percentage of local students attaining a specific score (Freq %). The bottom two rows of each table report the percent of local and national students found in the particular category for that group at the grade assessed.

Score Report 4 illustrates results summarized in the Frequency Distribution reports. NCEA ACRE total assessment scores for twelfth grade students from Our Lady are reported. In all, there were 172 grade 12 students assessed (information was available that identified 126 students as Catholic, 41 as non-Catholic, and, within the total group, 168 reported having been in the local program for at least two (2) years). Examining the "All Students" table, we are reminded that Part 1 faith knowledge total scores that range from 55 to 63 define the category Advanced (Adv), scores from 41 to 54 are categorized as Proficient (Prof), and scores of 40 and lower are classified as in Need of Improvement (NI). In the table showing the scores of All Students, no student in the group assessed (0) attained a perfect score (63), one (1) student scored 62, 2 students scored 61, etc. The two students attaining a score of 61 comprise 1.2% of the students tested. Referring to the Catholic Students display table, we observe that 10.3%, 65.1% and 24.6% of the students were classified respectively into the performance standard categories of Advanced, Proficient and in Need of Improvement (the sum of the Freq % column for each performance category). National classification percentage rates for grade 12 Catholic students were 19.9%, 48.9% and 31.2% in the Advanced, Proficient and Needs Improvement categories, respectively.

The information compiled in the Frequency Distribution tables provides a direct means of evaluating student performance within the NCEA performance standard categories. The tables detail the relationship between students' attained scores and categorical classification for different student groups. A review of the tables gives a clear sense of the pattern of student total performance, where scores in your program may be clustering, and an indication of the variability in performance (a greater spread of scores may signal unevenness in instruction, issues involving motivation, or perhaps curricular gaps in the opportunity for instruction, etc.). A careful analysis of these data displays is merited. If your program has taken time to establish "local standards," it is these tables that will inform you as to the extent to which your students are meeting your assessment performance goals and expectations. The Freq % column will be useful should you wish to tally the percent of students in performance categories when local criteria for category placement differ from those adopted by the NCEA (e.g., what percent of our students scored 53 or higher, between 45 and 50, etc.); a small re-tally sketch follows the queries below. Following are some queries to guide your local review and evaluation.

How did our students do based on the NCEA Standards classification? What were our expectations – did we have expectations similar to or different from the NCEA Standards? What success did our students have meeting our local cutscores? How do students measure up against our local performance standards?

How much change (more learning) is needed to move performance to high rates of high performance? Are there large discrepancies observed when our local groups are compared? Is this understandable and acceptable?

How does local performance compare to the baseline year National percentages shown? Given your students and your expectations, is your comparison favorable or of concern?

Do students distribute across the total score ranges or are they clustered? If some students are learning and appear to have knowledge (scoring high), why don't all or most students perform at this high level? Are there some scores that are especially low (outliers)? Are some students seriously lacking content knowledge? Is some evaluation of the needs of particular students warranted?

What are the next steps you need to take/consider?
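If your program has adopted its own cutscores, the re-tally mentioned above takes only a few lines. This minimal sketch (Python; the frequency counts and local cutpoints are invented) sums a total-score frequency table against locally chosen category boundaries:

    # Minimal sketch: re-tally a total-score frequency table against local cutscores.
    # total score -> number of students attaining that score (invented counts)
    frequency = {62: 1, 61: 2, 58: 3, 55: 4, 53: 9, 50: 11, 47: 9, 44: 8, 40: 7, 36: 5, 31: 2}
    total_students = sum(frequency.values())

    local_proficient_cut, local_advanced_cut = 43, 57   # illustrative local cutpoints

    def percent_scoring(low, high):
        count = sum(n for score, n in frequency.items() if low <= score <= high)
        return round(100 * count / total_students, 1)

    print("Advanced:", percent_scoring(local_advanced_cut, 63))
    print("Proficient:", percent_scoring(local_proficient_cut, local_advanced_cut - 1))
    print("Needs Improvement:", percent_scoring(0, local_proficient_cut - 1))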


5. School/Parish Summary Report: Affective Statements by Student Information Category

The format of these reports is comparable to, though not the same as, the displays of information associated with the question-by-question reports for the faith knowledge Domains previously discussed (Score Report 1). General information identifies the school or parish program, the grade of the students assessed, when the assessment occurred, and the number of students assessed in each of the four groups. Affective statements are grouped and displayed in seven (7) affective area Reporting Categories. The reporting format is the same for six of the seven affective categories: Relationship with Jesus, Images of God, Catholic Identity, Morality, Perceptions About Your School/Parish Program, and Relationships with Others. The Affective report format is different for the Students' Concerns category, which is described in the next section.

The report for each Reporting Category presents the affective statement as it actually appears in the NCEA ACRE booklet, and further identifies the statement by its number (as it appears in the revised 2001 NCEA ACRE booklet). Statements are listed within a category in the numerical order in which they occur in the assessment booklet. For each statement, the percent of students responding "strongly agree," "agree," "disagree" or "strongly disagree" is reported for each group. The percent of students not responding to the statement (Omit) is also shown. Asterisks denote the preferred/correct response to each statement. As with the faith knowledge cognitive reports, data are presented by response option for up to four groups of students: all students, students who report being Catholic, those students indicating they are not Catholic, and finally, students indicating they have been members of the school or parish program for two years or longer. Again, group reports are provided only when six or more students comprise a group. Graphic displays on these reports summarize student responses by combining the percentages for the preferred response options. For example, in the Score Report 5 illustration, for statement 13, 92% of all students responded either "strongly agree" or "agree," which were the preferred responses to the statement.

Your results for these affective statements merit close inspection and consideration. Unlike the faith knowledge cognitive questions (Part 1 of the NCEA ACRE assessments), where a correct response is specified, often it is the pattern of responses over statements in a category, not only the magnitude of responses to individual statements, that deserves thoughtful and reflective review. Important in the study of these data is determining when a response rate crosses a threshold that signals the need for attention. This should be considered and interpreted on a statement-by-statement basis. Unlike the cognitive questions, where perhaps a 10 to 15 percent incorrect response rate can be set aside and attributed to individual differences in cognition, the affective statements call for heightened attention to the extent and pattern of responses. Review and evaluation across groups should also be instructive. These data provide a rich resource for staff to engage students on topics, ideas, and principles of behavior and practice. The need for staff development on issues that arise should also be considered.


[Score Report 3 illustration: NCEA ACRE Part I Score Summary by the Four Pillars of the Catechism of the Catholic Church; group: Catholic Students. School/Parish: Our Lady School; Diocese: Kansas City, KS; 126 students; Grade 12; Date of Assessment: Nov 2002. Pillar averages, group versus national: Pillar I, Profession of Faith (Creed) 12.3/16 (77%) vs. 12.1/16 (76%); Pillar II, The Celebration of the Christian Mystery (Liturgy and Sacraments) 9.4/13 (72%) vs. 9.3/13 (72%); Pillar III, Life in Christ (Morality) 10.2/15 (68%) vs. 10.0/15 (67%); Pillar IV, Christian Prayer 8.0/11 (73%) vs. 7.6/11 (69%). For each Pillar the report also shows the percentage of students scoring in the high, middle and low number-correct ranges (for Pillar IV: 78.6% scored 8-11, 21.4% scored 4-7, 0.0% scored 0-3). Footnotes: the National Average is based on all scores from the 2001/2002 NCEA ACRE administrations; the distribution gives the percentage of your students scoring in the identified "number correct" range.]


[Score Report 4 illustration: NCEA ACRE Part I Faith Knowledge Score Frequency Distributions. School/Parish: Our Lady School; City: Kansas City; Diocese: Kansas City, KS; Grade 12; Date of Assessment: Nov 2002; 172 total students (126 Catholic, 41 non-Catholic, 168 returning). Four tables (All Students, Catholic Students, Non-Catholic Students, Returning Students) list every attainable total score grouped into the Advanced, Proficient and Needs Improvement ranges, the Number of students attaining each score, and the Freq % for that score. The bottom rows of each table give the local and national percentages of students classified in each category (Advanced / Proficient / Needs Improvement): All Students, local 9.9 / 60.5 / 29.7 versus national 18.9 / 47.8 / 33.3; Catholic Students, local 10.3 / 65.1 / 24.6 versus national 19.9 / 48.9 / 31.2; Non-Catholic Students, local 9.8 / 46.3 / 43.9 versus national 12.1 / 42.4 / 45.4; Returning Students, local 10.1 / 60.1 / 29.8 versus national 19.1 / 49.2 / 31.7.]


[Score Report 5 illustration: NCEA ACRE Affective Statements by Student Information Category; Reporting Category: Relationship with Jesus. School/Parish: Our Lady School; City: Kansas City; Diocese: Kansas City, KS; Grade 12; Date of Assessment: Nov 2002; 172 total students (126 Catholic, 41 non-Catholic, 168 returning). For each statement in the category (including Nos. 13, 17, 19 and 25, with statements such as "I feel Jesus really understands me," "I look upon Jesus as my Savior and friend," "Jesus' relationship with me really helps me," "One way that God speaks to me is through the Bible," and "I believe that Jesus cured the blind and raised the dead"), the report shows the percent of Strongly Agree, Agree, Disagree, Strongly Disagree and Omit responses for groups A (all), C (Catholic), N (non-Catholic) and R (returning), with the preferred responses marked by asterisks and a bar showing the combined percent of preferred responses. Statements are numbered and ordered as they occur in the booklet; when there are fewer than 6 students in a category, no summary is reported for that group.]


6. School/Parish Summary Report: Affective Statements by Reporting Category – Students' Concerns

The Students' Concerns report is patterned after the information contained in the more broadly represented affective statement reports discussed above. Though comparable in format, the Students' Concerns summary differs in a number of significant ways. Students respond to the Concerns category statements using a scale of "Not a Problem," "A Minor Problem," or "A Big Problem." Surveyed are conditions and behaviors that students may be experiencing in their local school environment (note: students participating in the assessment as members of a parish program are nevertheless directed to offer their perceptions with reference to their school environment). For the Concerns category report, no disaggregated group reports are provided; data are summarized for the total group only. Information is presented by ordering statements from the greatest to the least frequently reported issue or problem. The graphic display reveals the percent of students who responded that the statement was either a minor problem or a big problem. For example, in the illustration provided (Score Report 6), for this group of students, 54% responded that Fighting is a problem (minor or big) at their school.

When reviewing these results, it is advised that the magnitude of an issue or problem not be judged based on majority counts. Any occasion of meaningful departure from what should be expected or understood, given the students being served, merits attention and calls for some course of action by the local program. NOTE: During field trials, samples of students at every assessment level (1, 2, and 3) were interviewed to determine the appropriateness of the affective statements or questions for that grade level and the understandability of the language used.

"Local school environment" is defined at the local level and communicated to the students verbally at the time of assessment. For example, if you wish "local school environment" to mean only what happens on school property during school-day hours, tell the students before they begin to respond to those questions. If, on the other hand, you wish this phrase to mean what happens at school as well as outside of school time with school friends socially, let the students know. Since this information is for your benefit, to get a sense of how faith attitudes are or are not being appropriated or owned by the students, define "local school environment" in the way most suitable to your local situation.


[Score Report 6 illustration: NCEA ACRE Affective Statements by Student Information Category; Reporting Category: Students' Concerns. School/Parish: Our Lady School; City: Kansas City; Diocese: Kansas City, KS; Grade 12; Date of Assessment: Nov 2002; 172 total students. Statements are listed from greatest to least reported problem area, with the percent of Not a Problem, Minor Problem, Big Problem and Omit responses and a bar for the combined (Minor + Big) percent, tallied for all students only (no disaggregation): Cursing, blasphemy, swearing (No. 36) 94%; Alcohol (42) 89%; Cheating/lack of honesty (45) 89%; Marijuana (43) 88%; Other drugs (44) 80%; Fighting (46) 54%; Eating disorders (41) 47%; Sexual harassment (39) 45%; Respect for diversity (35) 34%; Racism (38) 33%; Personal safety (37) 29%; Date rape (40) 20%.]


7. School/Parish Summary Report: Local Assessment Results

As mentioned previously, these reports are provided if your program incorporated into the NCEA ACRE assessment a collection of local faith knowledge questions chosen to extend or enhance the assessment. Two distinct reports become available when local questions have been used. Refer to Appendix D for information about this feature of the 2001 NCEA ACRE program if you are interested in this opportunity for assessment in a subsequent year.

The first of the reports parallels the Faith Knowledge Domain reports. Shown is how your students performed on each assessment question you used. Item data are presented for up to four groups (all students, Catholic students, non-Catholic students, and those students involved in your program for two years or more). When the group size is sufficient (n > 5), the percent of students selecting each response choice is reported. The percent choosing the keyed "correct response" is denoted by an asterisk, along with the percent selecting each of the other response choices (including any Omits) for each group. Items are displayed in the numerical order in which you presented them in your local assessment, and performance for up to 20 questions is summarized. To illustrate, in the example provided, note that for Local Assessment Item 3, 50% of all students selected the keyed response, whereas 60% of returning students made the correct response. These results must be reviewed while referring to the questions you administered to your students. Refer to Appendix D if your results appear to be inconsistent with your expectations.

The second arrangement of results from your local assessment shows how students in different groups scored across all of your questions. Total scores for the questions you administered have been computed, and for each group (n > 5) a summary is prepared. Reported are the number of students comprising the group, the group's average score, and the average percent correct score on the questions. Also reported is a score distribution for each group showing the percent of students scoring in each designated range. For example, in the illustration provided, the entire group is made up of 20 students, and for all students the average percent correct on the local questions was 62.0%. For the group, no students attained scores of 9 or 10 (0%), 15% of the students scored 8, 10% scored 7, and 75% received scores between 0 and 6.
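A minimal sketch of the banding used in this summary (Python; the individual scores are invented so as to reproduce the distribution in the illustration) places each student's local-question total in a designated range and reports the percentage of the group in each range:

    # Minimal sketch: distribute local-question totals across designated score ranges.
    # Invented totals for 20 students on a 10-question local assessment.
    local_totals = [8, 8, 8, 7, 7] + [6] * 13 + [5, 3]
    bands = [("9-10", 9, 10), ("8", 8, 8), ("7", 7, 7), ("0-6", 0, 6)]

    for label, low, high in bands:
        count = sum(1 for total in local_totals if low <= total <= high)
        print(f"{label}: {100 * count / len(local_totals):.1f}% of students")

    average_percent = 100 * sum(local_totals) / (10 * len(local_totals))
    print(f"Group average percent correct: {average_percent:.1f}%")   # 62.0%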

As you review your results from your local assessment, consider the following:

How did the total scores attained by students compare to similar Domain and Total Assessment scores on the NCEA ACRE?

With regard to individual questions, how did your students perform on comparable NCEA ACRE questions? Did your questions seem harder, or perhaps easier? Why?

Did you find anomalies in your question-by-question analyses? That is, for some questions, are incorrect choices being selected more often than the "correct" answer? Could there be a problem with some of your questions, or do students have misinformation? (You should review your assessment to make sure that you did indeed follow the "answer key" instructions. Refer to Appendix D.)

Did your students’ performance on your self-chosen items meet your expectations?


[Score Report 7 illustration, page 1: NCEA ACRE 2001 School/Parish Summary Report, Local Assessment Results. School/Parish: Our Lady Elem; Diocese: Kansas City, KS; Grade 5; Date of Assessment: Oct 2001; 20 total students (13 Catholic, 7 non-Catholic, 10 returning). For each locally supplied item (Items 1 through 5, listed in numerical order), the report shows the percent of students selecting each response choice A-D and Omit for groups A (all), C (Catholic), N (non-Catholic) and R (returning), with the correct answer marked by an asterisk and a bar showing the percent correct (e.g., for Item 3: A 50%, C 46%, N 57%, R 60% correct). When there are fewer than 6 students in a category, no summary is reported for that group.]

SCORE REPORT – NCEA ACRE 2001 SCHOOL/PARISH SUMMARY REPORT: LOCAL ASSESSMENT SUMMARY

School/Parish: OUR LADY ELEM     Diocese: KANSAS CITY, KS
Date of Assessment: OCT 2001     Grade: 5

[Illustrative display: for each group (All Students, Catholic Students, Non Catholic Students, Returning Students), the report shows the number of students, the group's average percent correct on the local questions, and the percent of students whose number-correct score falls in each range (10, 9, 8, 7, 0-6). In the illustration, the All Students panel (20 students, group average 62%) reports 0.0% at 10, 0.0% at 9, 15.0% at 8, 10.0% at 7, and 75.0% at 0-6.]


8. Student Report Summary Listing

This listing is shown on the following illustrative display (Score Report 8). The Individual Student Report summary is provided only to programs that ordered the optional Individual Student Reports. Contained in the listing is a summary that identifies each student assessed. A separate listing is prepared for each grade assessed. Reported for each student is the individual's Total percent correct score and resulting NCEA ACRE performance level Classification, and her/his scores on each Domain and for each Pillar assessed on the NCEA ACRE. These part-test scores are shown as the percent correct attained by the individual. At the bottom of each page of these listings is shown the average percent correct score for the local and national groups, and the number of questions comprising each score reported. This information is a convenient summary listing that can be useful to individual instructors as they consider the faith knowledge learning and needs of particular students. Consider this information to help plan specific interventions and experiences for individual students.
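As one hedged way to act on such a listing, the sketch below (Python) flags, for each student, the Domains in which the student's percent correct falls below a cutoff you choose. The 60% threshold, the data layout, and the function name are illustrative assumptions, not part of the NCEA ACRE reports.

    # Illustrative sketch: flag students who may need follow-up in particular Domains.
    def flag_students(listing, threshold=60):
        """listing: list of dicts like {"name": "STUDENT NAME", "domains": {1: 88, 2: 62, ...}}."""
        flags = {}
        for student in listing:
            low = sorted(d for d, pct in student["domains"].items() if pct < threshold)
            if low:
                flags[student["name"]] = low
        return flags   # e.g., {"STUDENT NAME": [4, 6]} -> plan targeted review for those Domains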

SCORE REPORT – NCEA ACRE INDIVIDUAL STUDENT REPORT: PART I: FAITH KNOWLEDGE (Student Report Summary Listing)

School/Parish: OUR LADY SCHOOL     Diocese: KANSAS CITY, KS
Date of Assessment: Nov 2002       Grade of Students: 12

[Illustrative listing: one row per student, showing the student's name, Total percent correct, NCEA ACRE Classification, and the percent correct for each of the eight Domains and four Pillars. For example, one row reads "AARON NAME, 46%, NI; Domains 62, 62, 57, 25, 60, 38, 50, 0; Pillars 56, 38, 53, 36" and another reads "ADAM NAME, 86%, Prof; Domains 88, 88, 86, 75, 90, 88, 100, 67; Pillars 75, 92, 87, 91".]

                          Total %   Domains 1-8                        Pillars 1-4
Group Average:              70%     78, 81, 75, 59, 64, 58, 80, 68     75, 70, 67, 71
National Average:           70%     76, 80, 74, 60, 63, 60, 74, 67     74, 70, 66, 68
Questions per Category:     63       8,  8,  7,  8, 10,  8,  8,  6     16, 13, 15, 11

Classifications: Adv = Advanced; Prof = Proficient; NI = Needs Improvement
Domains: 1. God; 2. Church; 3. Liturgy and Sacraments; 4. Revelation, Scripture, Faith; 5. Life in Christ; 6. Church History; 7. Prayer/Religious Practice; 8. Faith Literacy
Pillars: 1. Creed; 2. Liturgy and Sacraments; 3. Morality; 4. Prayer

* Names appear exactly as marked on answer sheets by students.


9. Individual Student Report for parent and child

This is a detailed report of an individual student's performance on the NCEA ACRE assessment taken. An illustrative report follows for reference and examination. Reported is the student's total score and classification on the Part 1 portion of the NCEA ACRE assessment and a breakdown of performance arranged by the eight Domains covered by the assessment and the four Pillars of the Catechism of the Catholic Church. The student's scores are reported as raw scores (points attained) and as the resulting percent correct score. The student's NCEA ACRE performance level classification based on the total score attained is also reported. The report provides the NCEA statement for the interpretation of performance in the classification category attained by the student. Data are also reported to reference the average percent correct attained by the individual's peers in the local school or parish program and the national average percent correct score for the student's national peer group. Accompanying this report (printed on the back of each student's report) is text that describes the scores and gives an interpretation to inform and assist parents who may receive the report. To review the illustration: shown is the student's name (as coded on the NCEA ACRE answer sheet), grade and date of the assessment. Then, based on the NCEA ACRE Part 1 assessment alone, total score and classification results are reported, followed by the eight Domain scores and then the four Pillar results. In the illustration provided, the student scored 54 out of a possible 63 points on the Part 1 Faith Knowledge questions, which converts to 86% correct. Among peers in the program at the parish or school, the average percent correct was 70%. The national average for students at this grade is 70% correct for the total assessment. This same information is then provided for each Domain and Pillar area assessed.

These Individual Score Reports may be shared with parents or guardians, or with the student if desired. It is recommended, however, that these reports not be given to a student in the context of the religion session, but instead be mailed individually to the student's residence. When sharing information, it is important that staff be able to discuss local religious education program goals and expectations in a face-to-face meeting, and to offer interpretations that stress growth and continuous improvement in addition to the student's standing at any point in time. When sharing this report with others (student or parents/guardians, etc.), an awareness and knowledge of the NCEA performance standard categories and associated definitions, including the score performance standards (cutscores), will be essential. The appropriateness, usefulness and relevance of comparative information should be evaluated on a case-by-case basis when sharing information with others. As has been discussed previously, the national statistical averages are based on the performance of students during the baseline (or initial) school years of the Revised NCEA ACRE. Updated results are compiled and distributed annually by the NCEA.

SCORE REPORT – NCEA ACRE INDIVIDUAL STUDENT REPORT

STUDENT: ADAM NAME          GRADE: 12
SCHOOL: OUR LADY SCHOOL     DATE ASSESSED: NOV 2002
DIOCESE: KANSAS CITY, KS

OVERALL PERFORMANCE SUMMARY ON FAITH KNOWLEDGE

  Number and Percent Correct: 54/63 = 86%
  NCEA ACRE Performance Classification: Proficient*
  School/Program Average Percent Correct: 70%
  National Average Percent Correct: 70%

* Performance at this level demonstrates satisfactory knowledge and understanding of the faith knowledge assessed, with evidence of acceptable command of key concepts that support core faith themes and domains assessed, appropriate to age and stage of faith formation. Performance generally is strong and competent, with few deficiencies and typically not substantially concentrated in any one faith knowledge domain or theme.

DETAILED PERFORMANCE SUMMARY

Domains
                                      Number    Student's    School/Program    National
  Domain                              Correct   % Correct    Avg % Correct     Avg % Correct
  1. God                                7/8        88%            78%              76%
  2. Church                             7/8        88%            81%              80%
  3. Liturgy and Sacraments**           6/7        86%            75%              74%
  4. Revelation, Scripture, Faith       6/8        75%            59%              60%
  5. Life in Christ                     9/10       90%            64%              63%
  6. Church History                     7/8        88%            58%              60%
  7. Prayer/Religious Practice**        8/8       100%            79%              74%
  8. Faith Literacy                     4/6        67%            68%              67%

Pillars of the Catechism of the Catholic Church
                                      Number    Student's    School/Program    National
  Pillar                              Correct   % Correct    Avg % Correct     Avg % Correct
  1. Creed                             12/16       75%            75%              74%
  2. Liturgy and Sacraments**          12/13       92%            70%              70%
  3. Morality                          13/15       87%            67%              66%
  4. Prayer**                          10/11       91%            71%              68%

** Though similar, questions scored in these areas are not entirely the same.


Interpreting Your NCEA ACRE Results

Your immediate use and interpretation of these data and results will be straightforward. Should you have questions regarding religious education program planning, curriculum or instructional designs and decisions, please feel free to contact the Department of Religious Education at the NCEA (202-337-6232). NCEA stands ready to provide guidance and assistance as may be needed. Should you have questions regarding the data, data summaries, or report accuracy or completeness, contact the Center for Educational Testing and Evaluation (CETE) at the University of Kansas (1-866 FOR-CETE). This academic measurement, evaluation and research center is working with the NCEA in managing and offering the NCEA ACRE assessment program, and it can be of assistance as well.

As mentioned, there is not much that reaches beyond common sense in the tabulations and summaries provided. Yet there is a goal for educational assessment that is also common sense, but, as simple as it is, it can go unnoticed. Effective assessment is assessment that supports teaching and learning, and can provide a source of information to help guide continuous improvement. The NCEA ACRE was planned and developed by leading national religious educators with this purpose in mind. The challenge is to use your program's NCEA ACRE results to support, strengthen and advance local educational efforts. To facilitate this goal, documents describing an activity to guide a review of the curricular match, or alignment, between your local program and the NCEA ACRE assessment were included in the assessment materials previously sent to your program. In those documents, these materials were referred to as Appendix C. We have reproduced these alignment activity materials in Appendix C of this Manual. If you have not already done so, please refer to this activity and consider its appropriateness and utility for your setting. We underscore its importance both to future program planning and to providing a context in which to interpret your reports. An earlier section of this Manual presented a means for considering "local performance standards." Engaging in a task such as this can also greatly guide a school's or parish's plans for local curriculum or instructional aims and goals. We strongly encourage using score results together with efforts to support planning and systematic review, which can lead to revision.

The goal is to evaluate how your students have fared on the NCEA ACRE in relation to your program offerings and your expectations. (If you did not carry out the Appendix C alignment evaluation or address local performance expectations, we encourage you and your colleagues to do so at this time.) If you carried out the alignment activity, review your NCEA ACRE results in consideration of your completed worksheet data. How did you do? Give particular attention to the question-by-question summary results and to the score summaries by faith knowledge Domain.

In reviewing reports, consider the following. On those items in the Domains and for the Total Assessment, did your students do relatively well on those questions and domains that were judged to have a "strong to good" fit to your curriculum and program (that is, how did your students do on the "A" and "B" rated items of Appendix C)? What did the Domain and Total Assessment scores look like in relation to the number of "A+B" tallies? Are there surprises? Do the fit ratings correlate with student performance? If not, or if not consistently, then why not? If the material is covered in your curriculum and you are really teaching the material (i.e., fit is good), why are students having difficulty in some areas? Did you predict poor fit on some questions, yet students did well? Are there areas, i.e., Domains, that received high "fit/alignment" ratings (that is, high "A+B" counts), but on which students did not perform solidly? The Appendix C worksheets, when completed by local staff and then compared to actual student performance on the NCEA ACRE, should stimulate a conversation as to expectations, performance, and future needs and direction. Such deliberations inevitably lead to considerations of standards: how should we expect students to score if we are accomplishing our religious education goals and mission? Educational assessment is intended to support and inform such discussion and provide some evidence as to the extent to which students are meeting program expectations. NCEA ACRE is not a test; rather, it is one step and a credible source of information in an assessment program evaluation process.
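One concrete way to pursue the comparison just described is to pair each Domain's A+B tally from the Appendix C worksheet with the group's average percent correct from your score report, so mismatches stand out. The sketch below (Python) assumes you have entered both by hand into small dictionaries; the names and data structures are illustrative.

    # Illustrative sketch: compare curriculum "fit" (A+B counts) with observed performance by Domain.
    def fit_vs_performance(ab_counts, items_per_domain, avg_pct_correct):
        """All arguments are dicts keyed by Domain number (1-8)."""
        rows = []
        for d in sorted(items_per_domain):
            fit_pct = round(100 * ab_counts.get(d, 0) / items_per_domain[d])
            rows.append((d, fit_pct, avg_pct_correct.get(d)))
        return rows   # a Domain with high fit but low percent correct invites a closer look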


Appendix A: CATHOLIC FAITH LITERACY

This appendix lists fundamental terms in the Catholic vocabulary that are not used extensively in the questions, answer choices and affective statements of the 2001 NCEA ACRE. An assessment instrument of this kind cannot possibly exhaust the full range of the Catholic lexicon with a limited number of responses, so the terms in NCEA ACRE include and presume a core vocabulary. The following listing by level is a glossary of terms that support the key concepts under each of the domains.

Teachers/catechists should be familiar with this sample of terms and the concepts they represent. Every local site (school and parish) is encouraged to engage in an all-faculty process with "Appendix C," which you received with the Administration Guide (a copy is included in this Manual as Appendix C as well). Doing so will help the entire faculty identify concepts and a common language covered in the written and unwritten religion curriculum. The summary reports you receive confirm which terms, and thus which key concepts, are most or least understood by your students in the Domain "Catholic Faith Literacy."

Students involved in Catholic faith formation, either through Catholic schools or parish-based religious education programs, should know the following terms. The listing by level is cumulative; that is, advanced levels assume knowledge of terms from earlier levels.

NCEA ACRE: Level One Literacy Terms

Act of Contrition, Acts of Mercy, Advent, All Saints (Feast), All Souls Day, Anointing, Apostles' Creed, Baptism, Beatitudes, Bible, Bishop, Cardinal, Catholic, Christian, Christmas, Church, Commandment, Communion, Communion of Saints, Confess, Confession, Confirmation, Conscience, Contrition, Covenant, Creator, Creed, Cross, David, Deacon, Diocese, Discrimination, Divine, Easter, Ecumenical/Ecumenism, Epistle, Eucharist(ic), Evangelization, Excommunication, Exodus, Faith, Fasting, Father, Feast of All Saints, Forgiveness, General Intercessions, Gifts of the Holy Spirit, God, Good Friday, Gospels, Grace, Hail Mary, Heaven, Hell, Hierarchy, Holy, Holy Communion, Holy Days, Holy Orders, Holy Spirit, Holy Trinity, Hope, Immaculate Conception, Incarnation, Initiation, Israelites, Jericho, Jerusalem, Jesus, John, Judge, Laity, Last Supper, Lent, Liturgical Season, Liturgy, Lord, Love, Magisterium, Mary, Mary Magdalene, Mass, Matrimony, Memorare, Missal, Mortal Sin, Moses, Mystery, Noah, Old Testament, Original Sin, Our Father, Our Lady of Guadalupe, Parish, Paschal Mystery, Paul, Pentecost, Peter, Pharisees, Pope, Prayer, Prayer of the Faithful, Priest, Prodigal Son, Promised Land, Prophets, Psalm, Purgatory, Reconciliation, Redeemer, Redemption, Religious Life, Roman Catholic, Rosary, Sacrament, Sacrament of Penance, Sacrifice, Saint, Salvation, Samaritan, Sanctifying Grace, Savior, Sermon on the Mount, Sin, Son, Spirit, Stations of the Cross, Sunday, Tabernacle, Ten Commandments, Thanksgiving, Trinity, Venial Sin, Virtue, Vocation, Volunteer, Yahweh

NCEA ACRE: Level 2 Literacy Terms

Abortion, Abraham, Acts of the Apostles, Adultery, AIDS, Altar, Ascension, Assumption, Blasphemy, Brother (Religious), Capital Punishment, Catholic Social Teaching, Church Council, Church Fathers, Consecration, Conversion, Crusades, Disciples, Emperor, Euthanasia, Final Judgment, Homily, Inspiration, Isaac, Isaiah, Kingdom, Lector, Liturgical Year, Liturgy of the Word, Miracle, Mission, Missionary, Moral, Parable, Peacemakers, Preach, Presentation, Presentation of Jesus in the Temple, Procreation, Prophecies, Protestant, Rebecca, Resurrection, Revelation, Rite, Rite of Christian Initiation of Adults, Roman Empire, Sarah, Scripture(s), Sexual Intercourse, Sister (Religious), Spirituality, Stewardship, Temple, Tithing, Tradition, Vatican II, Virginity

NCEA ACRE: Level 3 Literacy Terms

Adoration, Annulment, Catechumen, Catechumenate, Commemoration, Community of Faith, Dark Ages, End Time, Evangelists, Genesis, Grace (Sanctifying), Indulgence, Inquisition, Kingdom of God, Monasticism, Papal Infallibility, Pastoral Letter, Precepts of the Church, Preferential Option for the Poor, Racial Discrimination, Reformation, Veneration


Appendix B: NCEA ACRE Blueprint: Content Specifications, Domains, Student Objectives, and Elaborated Key Concepts

DOMAIN 1– GOD: FATHER, SON AND HOLY SPIRIT

Student Objective:
1. To know and understand basic Catholic teaching about God as Father, Son, and Holy Spirit.

1. Trinity: a community of three persons in one indivisible God and the central mystery of faith.
2. God the Father: our loving Creator.
3. God the Son: Jesus, Savior, Life, Death and Resurrection – human and divine.
4. God the Holy Spirit: God's sanctifying power in the life of the church.
5. Creed: a summary of the faith.
6. God's activity in human history (ongoing revelation), covenant with the Jews, salvation.

DOMAIN 2– CHURCH: ONE, HOLY, CATHOLIC AND APOSTOLIC

Student Objective:
1. To understand the origin, mission, structure, community, and membership of the Church.

1. Attributes of the Church (one, holy, catholic and apostolic).
2. Mary.
3. Church: People of God, Body of Christ, communion of saints.
4. The role of Church leaders: hierarchy and laity.
5. Relationship of Church to others & the world (ecumenism).
6. Church's mission and evangelization: the responsibility of the baptized.
7. Church as communion: universal, local (diocese) and Christian faith community (parish).

DOMAIN 3– LITURGY AND SACRAMENTS

Student Objectives:
1. To be knowledgeable about the Church's liturgical life in terms of liturgical feasts, seasons, liturgical symbols, religious practices, and concepts of prayer.
2. To know and understand the sacraments as signs and instruments of grace.

1. Liturgical year.
2. Liturgical symbols.
3. The Mass: Nature, Liturgy of the Word, and Liturgy of the Eucharist.
4. Roles in the Liturgy.
5. Celebration of sacraments as signs of grace and encounters with Christ.
6. Sacraments of Initiation: Baptism, Confirmation, and Eucharist.
7. Sacraments of Healing: Penance & Anointing.
8. Sacraments of Vocation: Holy Orders & Matrimony.


DOMAIN 4– REVELATION, SCRIPTURE, AND FAITH

Student Objectives:
1. To be able to recognize Scripture as God's inspired word.
2. To know the major divisions of the Bible, the chief persons in biblical history, and major biblical themes from the Old and New Testaments.

1. The Bible as the inspired Word of God.
2. Parts of the Bible and gospel writers.
3. Major biblical themes: Old Testament – creation, sin, covenant, Exodus, law, prophets, Kingdom.
4. Major biblical themes: New Testament – parables, miracles, kingdom of God, beatitudes, paschal mystery, Jesus' mission, eternal hope.
5. Fonts of revelation: Tradition and Scripture.
6. Responses to God's call: faith (ecclesial and personal), conversion, discipleship, and justice.

DOMAIN 5– LIFE IN CHRIST: PERSONAL MORALITY AND CATHOLIC SOCIAL TEACHING

Student Objectives:
1. To be knowledgeable about the teachings of Jesus and the Church as the basis of Christian morality and to understand Catholic Social Teaching.
2. To be aware of the importance of a well-formed conscience for decision-making.

1. God's plan for Christian life: Two great commandments, Beatitudes & 10 commandments.
2. Theological virtues: God's gifts of faith, hope & love; character and Christian habits (Cardinal virtues: prudence, fortitude, temperance).
3. Ongoing conversion: commitment to discipleship.
4. Personal and social aspects of sin.
5. Catholic Social Teaching: seven principles: life and dignity of the human person; call to family, community, and participation; rights and responsibilities; preferential option for the poor and vulnerable; dignity of work and rights of workers; solidarity; care for God's creation.
6. Conscience, freedom, decision-making, responsibility, and the courage to act.
7. Morality as based on natural and divine law.


DOMAIN 6– CHURCH HISTORY

Student Objective:
1. To become familiar with the central stories, key events, and major figures that have shaped the history and development of the Church over time, as appropriate for the student's grade level.

1. Apostolic age: Pentecost, birthday of the Church; formation of Scripture.
2. End of persecution of Christians and legal acceptance of Church under Constantine; Nicene Creed; Catechumenate (4th century).
3. Mary: Mother of God.
4. St. Francis of Assisi, St. Clare of Assisi, devotional life, reform of church, monasticism, Marian traditions (11th century).
5. Reformation and Council of Trent, St. Ignatius Loyola, St. Theresa of Avila, missionary movement (16th century).
6. Age of Enlightenment, French Revolution, immigration and Church in America, Bishop Carroll, Mother Seton, Kateri Tekakwitha (18th century).
7. Church and worker; Leo XIII, Rerum Novarum; Catholic Social Teaching (19th century).
8. Pius X and age of communion; US church and immigrants, restoration of the catechumenate, Vatican II, ecumenism, Pope John Paul II, St. Katharine Drexel, Bp. Oscar Romero, Dorothy Day (20th century).

DOMAIN 7– PRAYER / RELIGIOUS PRACTICES

Student Objective:
1. To recognize and learn how to engage in Catholic forms of personal prayer and ways of deepening one's spiritual life.

1. The Lord's Prayer: summary of the Gospel; Hail Mary; Glory Be to the Father, etc.; meal prayers, sign of the cross, act of contrition, Apostles' Creed.
2. Sacramentals: Rosary, Stations of the Cross, holy water, etc.
3. Devotional practices rooted in different cultures.
4. Purpose and forms of prayer: blessing, petition, intercession, thanksgiving, adoration and praise.
5. Precepts of the Church: attend Mass on Sundays and holidays, yearly confession of serious sin, holy communion during the Easter season, observe fasting and abstinence, Church laws on marriage, support of the Church.
6. Spirituality: religious interiority marked by meditation, contemplation and reflection; nurtured by Scripture and Tradition, liturgical, communal, and private prayer; connected to age and stage of faith development.


DOMAIN 8– CATHOLIC FAITH LITERACY

Student Objective:
1. To be literate in the use of Catholic religious terminology.

Note: By having these key concepts closely parallel the Domains, we identify terminology that is central to each domain, and literacy questions can then be cross-referenced so that they appear in each domain or can be separated out for the single domain of "Catholic literacy."

1. God (anointed, ascension, Christ, Creator, cross, divine, Father, Holy, Incarnation, Jesus, Lord, mystery, one, Paschal, Redeemer, redemption, resurrection, salvation, Savior, son, Spirit, Trinity, Yahweh).
2. Church (bishop, cardinal, catholic, Christian, communion of saints, community of faith, deacon, diocese, ecumenism, evangelization, hierarchy, Infallibility, laity, magisterium, Mary, mission, Missionary, parish, pastoral letter, Pope, priest, religious brother, religious life, religious sister, Roman Catholic).
3. Liturgy / Sacraments (Advent, altar, anointing, ascension, assumption, Baptism, catechumenate, Christians, commemoration, Communion, confess, confession, confirmation, consecration, contrition, Easter, Eucharist, Feast of All Saints, forgiveness, Good Friday, holy communion, holy orders, holy, Immaculate Conception, Initiation, Last Supper, lector, Lent, liturgical year, liturgy of the Word, Mass, Matrimony, Pentecost, Prayer of the Faithful, preach, Presentation of Jesus in the Temple, Rite of Christian Initiation of Adults, reconciliation, Reconciliation, rite, sacrament, Sacrament of Penance, sacrifice, season, Sunday, tabernacle).
4. Revelation, Scripture, and Faith (Abraham, Acts of the Apostles, apostles, Bible, covenant, David, disciples, epistles, evangelists, Exodus, Genesis, gospels, inspiration, Isaac, Isaiah, Israelites, Jericho, Jerusalem, John, judge, kingdom, kingdom of God, Mary Magdalen, miracle, Moses, Noah, Old Testament, parable, Paul, Peter, Pharisees, Prodigal Son, Promised Land, prophecies, Prophets, psalm, Rebecca, Samaritan, Sarah, Sermon on the Mount, temple, Ten Commandments, Tradition).
5. Life in Christ / Morality (abortion, adultery, AIDS, annulment, beatitudes, blasphemy, capital punishment, Catholic Social Teaching, 10 commandments, conscience, conversion, discrimination, euthanasia, excommunication, faith, gifts of the Holy Spirit, grace, holy days of obligation, moral, mortal sin, original sin, peacemakers, precepts of the church, preferential option for the poor, procreation, saint, sanctifying grace, sexual intercourse, tithing, venial sin, virginity, Virtue, vocation).
6. Church History (Church Council, Church Fathers, Creed, Crusades, dark ages, emperor, Inquisition, Monasticism, Protestant, Reformation, Roman Empire, Vatican Council II).
7. Prayer / Religious Practices (act of contrition, acts of mercy, adoration, Apostles' Creed, fasting, Hail Mary, indulgences, Memorare, missal, Our Father, Our Lady of Guadalupe, prayer, rosary, spirituality, stations of the cross, stewardship, thanksgiving, tithing, veneration).
8. Christian Hope (eschatology, final judgment, heaven, hell, purgatory).


Appendix C*
Examining Local Curriculum Content Standards and NCEA ACRE Alignment:
A basis for curriculum planning and review, and gaining a perspective on local learning goals and expectations

Those who attend assessment workshops on the NCEA Assessment of Catechesis/Religious Education attest to how vitally important "Appendix C" is to the entire assessment process. We encourage everyone who will administer NCEA ACRE to students (teachers in Catholic schools and catechists in parish-based programs) to review the NCEA ACRE questions appropriate to the level of the instrument you are using against this "Appendix C." The activity, also known as the curriculum alignment review, exposes the core essentials of the Catholic faith that NCEA ACRE assesses. The process heightens awareness of the degree to which these core essentials are taught directly through instruction and indirectly through faith formation experiences. This important preassessment activity also gives a context for interpreting NCEA ACRE results. Making time for "Appendix C" before administering NCEA ACRE is the surest way to get the most benefit from the NCEA ACRE results when they arrive.

Diana Dudoit Raiche, Executive Director
NCEA Department of Religious Education

(Do these activities with the full faculty/team of catechists before you administer NCEA ACRE!)

Knowing the extent of the religious education knowledge of your students is vitally important for monitoring and evaluating religious education programs. The NCEA assessments have been designed to provide one source of information to help you judge the quality of your catechetical/religious education efforts and identify program strengths and weaknesses.

To support local interpretation and use of results, it is essential and beneficial for you and your staff to examine the NCEA ACRE assessment in detail, question by question. Following is a procedure to help you determine how your local program correlates to the NCEA ACRE assessment blueprint. This activity needs to be done without reference to student scores. The principal or DRE and local instructors are to be involved in this process. We strongly advise involving those teachers/instructors at the grade where NCEA ACRE is given, but also include instructors at all grade levels (a program reflects the work of more than one teacher for more than one year). The pastor, parents and others who may have responsibility for the religious education program might also be involved. It would not be effective to rely on the perceptions of one instructional staff member working alone to do this activity.

This "Appendix C" alignment review is to proceed as follows. One person is to serve as the coordinator/facilitator for this activity; the expectation is that this individual would be the principal, DRE or coordinator of the religious education curriculum for the school or parish program. The facilitator's role is to keep the group moving and on-task. The group (or panel) consists of the facilitator, instructors and any others you would like to participate (i.e., parents, board chair or members, Pastor, etc.). The facilitator also serves as the recorder for the group's consensus judgments. Special Worksheets have been prepared and are attached for each NCEA ACRE assessment Level (1, 2, and 3). Make additional copies as needed. Implement these steps, but modify the procedure to meet your needs and preferences. Depending on the size of the group and their familiarity with the curriculum, the process detailed below could take about 2 hours.

* The information in this Appendix is provided to parish and school programs when NCEA ACRE assessment materials are shipped. If you have not implemented this activity, users are encouraged to enact this process as described before assessment results are reviewed. As discussed, we believe it to be of special benefit.

1. The facilitator convenes the meeting of key individuals who are to provide their input toward judging local curriculum/NCEA ACRE alignment. There needs to be a separate panel for each NCEA ACRE level administered. Within a panel, initiate a discussion of your religion program's learning objectives and expectations. The panel needs to reflect upon and discuss: (1) your students and their characteristics, (2) your community and how it supports students via parish and home, (3) when during the year students will take NCEA ACRE, and (4) their first-hand knowledge of the religious instruction program that is actually offered. (Suggested time: 30 minutes.)

2. Spend a few minutes discussing the reasons for giving the NCEA ACRE assessment: to assist in program review and evaluation (not teacher evaluation, not for giving students a grade in religion, and not for the purpose of comparing school and parish programs). (Suggested time: 5-10 minutes.)

3. The following process should be followed for each NCEA ACRE level used in your school or parish program. Only the NCEA ACRE knowledge questions (Part 1) come under review in this process. Provide each participant with an NCEA ACRE assessment booklet. (The process should take approximately 1 to 1.5 hours.)

The following overview describes the activity in which participants on a panel become engaged: (a) read an NCEA ACRE assessment question, (b) hold a (brief) discussion to share information, perspectives, and beliefs, and (c) arrive at a consensus rating, then proceed to the next question, repeating the discussion, review and rating process.

Here is the essential task: as the panel reads a question, participants are to discuss, consider, explore and share their professional opinions from two central perspectives:

I. Whole community perspective:
"Given our students' formal instruction, typical parish experiences, community/parental guidance, and in consideration of when NCEA ACRE is administered during the school year, to what extent have our students had an opportunity to learn the content of this question?"

and, as the second perspective:

II. School/Parish religion program perspective:
"Considering only the opportunities available via formal instruction through our school/parish instructional program(s) and when during the school year NCEA ACRE is administered to our students, to what extent do our students have an opportunity to learn the content of this question?"

Having discussed a question from these two related perspectives, use the following central evaluative question and rating scale: Will our students have had the opportunity to acquire the content knowledge necessary to answer this question correctly?


Rate as:
A) A certainty
B) Very likely
C) Perhaps
D) Probably not
E) Not sure/don't know

During the discussion, the facilitator works to move the group to establish/confirm a consensus rating for each NCEA ACRE question. The facilitator is to record the group's consensus rating (A, B, C, D or E) on the worksheet in the box alongside the question's number. Be sure to use the appropriate worksheet (Level 1, 2, or 3) for the NCEA ACRE assessment being reviewed by the panel. When all Part 1 questions have been rated by the group (the panel's work is completed at this point), the facilitator completes Step 2.

In Step 2, the facilitator circles only those questions rated "A" or "B" that are listed in the box on the lower half of the worksheet page. In each row, the NCEA ACRE questions are clustered into the Domains that are represented on the NCEA ACRE assessment.

Finally, when all items have been accounted for in Step 2, the facilitator counts and records the number of A- and B-rated questions for each domain, and then tallies and records the total number of A's and B's. Needless to say, please check the tallying for accuracy.

Locally, the completed Worksheet: (1) serves as a continuing point of departure for discussion as you review and monitor your local religious education/catechetical program, and (2) needs to be referenced continuously as your NCEA ACRE score reports are reviewed. The higher the A+B count (for the Total Assessment and its subparts, i.e., the Domain scores), the better the "fit" between what is taught and what is assessed by NCEA ACRE; a low count means NCEA ACRE assesses knowledge/content not (as yet) provided through your instructional programs. Also realize that students' scores are directly related to the instructional opportunities available, the fundamental premise being that students should not be expected to learn what they are not directly taught. Consider this: if the number of A's and B's is fewer than 50 to 60 percent of the NCEA ACRE questions, should your students be expected to do "well" until your curriculum comes in line with the NCEA ACRE assessed content? If the "fit" between the NCEA ACRE questions and your instructional program opportunity is high (as reflected by many, 70 to 80 percent or more, of the questions rated A or B), then we should expect students to score well on NCEA ACRE if the instruction has been effective.
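If the facilitator records the panel's consensus ratings electronically, the Step 2 tallying and the overall percent of A- and B-rated questions can be reproduced with a short script such as the sketch below (Python). The A/B rule and the domain grouping come from the worksheet instructions; the function name and data structures are assumptions for illustration.

    # Illustrative sketch: tally A/B ("strong to good" fit) ratings by Domain and overall.
    def tally_fit(ratings, domain_items):
        """ratings: {item_number: "A".."E"}; domain_items: {domain_number: [item_numbers]}."""
        per_domain = {d: sum(ratings.get(i) in ("A", "B") for i in items)
                      for d, items in domain_items.items()}
        total_ab = sum(per_domain.values())
        total_items = sum(len(items) for items in domain_items.values())
        return per_domain, total_ab, round(100 * total_ab / total_items)

A fit percentage below the 50 to 60 percent range mentioned above tempers expectations for scores; a fit of 70 to 80 percent or more suggests students should score well if instruction has been effective.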

Taking the time to wrestle with content standards and curriculum provides a basis against which to interpret your assessment results. This process should result in a thorough program content review and prompt lively and productive staff discussions. The amount of time and the number of staff members available to have input into such an alignment review will vary from program to program. However, the more you prepare locally and proactively create a basis for interpreting your results, the more useful the information will be.

The Worksheets to implement and record the activity follow.

Level 1 NCEA ACRE Alignment Review Worksheet (Appendix C)

School Name: _______________________________    Facilitator: _______________________________

Step 1: Based on discussion and consensus, rate each Level 1 NCEA ACRE Part 1 item as to student "opportunity to learn" according to the following scale: A = A Certainty; B = Very Likely; C = Perhaps; D = Probably Not; E = Not Sure/don't know. Record the rating in the box by the item number.

Rating grid: Items 1-51, each with a rating box (printed in rows of items 1-11, 12-22, 23-33, 34-44, and 45-51).

Step 2: In the Domain grid below, CIRCLE each item rated A or B, then record the number circled for each Domain and the Total.

NCEA ACRE Domains – Circle each item rated A or B
  Domain 1 items:  8   9  16  31  45  51                    Number circled: ____
  Domain 2 items:  1  17  32  39  46                        Number circled: ____
  Domain 3 items:  5  10  18  23  25  30  33  40  47        Number circled: ____
  Domain 4 items:  3  11  20  22  34  41  48                Number circled: ____
  Domain 5 items:  4  12  19  27  35  42  49                Number circled: ____
  Domain 6 items: 13  21  28  36                            Number circled: ____
  Domain 7 items:  2   6  14  26  29  37                    Number circled: ____
  Domain 8 items:  7  15  24  38  43  44  50                Number circled: ____
  Total: Number of Items Circled: ____

Level 2 NCEA ACRE Alignment Review Worksheet (Appendix C)

School Name: _______________________________    Facilitator: _______________________________

Step 1: Based on discussion and consensus, rate each Level 2 NCEA ACRE Part 1 item as to student "opportunity to learn" according to the following scale: A = A Certainty; B = Very Likely; C = Perhaps; D = Probably Not; E = Not Sure/don't know. Record the rating in the box by the item number.

Rating grid: Items 1-57, each with a rating box (printed in rows of items 1-12, 13-24, 25-36, 37-48, and 49-57).

Step 2: In the Domain grid below, CIRCLE each item rated A or B, then record the number circled for each Domain and the Total.

NCEA ACRE Domains – Circle each item rated A or B
  Domain 1 items:  1   9  15  26  34  42  55                    Number circled: ____
  Domain 2 items: 10  20  27  36  43  50  56                    Number circled: ____
  Domain 3 items:  2  11  19  28  37  44  53  57                Number circled: ____
  Domain 4 items: 12  21  29  45  52  54                        Number circled: ____
  Domain 5 items:  3   5  13  18  22  30  35  38  40  49        Number circled: ____
  Domain 6 items:  6  14  23  31  39  46                        Number circled: ____
  Domain 7 items:  7  16  24  32  47                            Number circled: ____
  Domain 8 items:  4   8  17  25  33  41  48  51                Number circled: ____
  Total: Number of Items Circled: ____

Level 3 NCEA ACRE Alignment Review Worksheet (Appendix C)

School Name: _______________________________    Facilitator: _______________________________

Step 1: Based on discussion and consensus, rate each Level 3 NCEA ACRE Part 1 item as to student "opportunity to learn" according to the following scale: A = A Certainty; B = Very Likely; C = Perhaps; D = Probably Not; E = Not Sure/don't know. Record the rating in the box by the item number.

Rating grid: Items 1-63, each with a rating box (printed in rows of items 1-13, 14-26, 27-39, 40-51, and 52-63).

Step 2: In the Domain grid below, CIRCLE each item rated A or B, then record the number circled for each Domain and the Total.

NCEA ACRE Domains – Circle each item rated A or B
  Domain 1 items:  1  18  26  27  33  42  50  58                Number circled: ____
  Domain 2 items:  2  10  16  19  23  43  51  59                Number circled: ____
  Domain 3 items:  7   9  11  20  36  52  60                    Number circled: ____
  Domain 4 items:  4  12  21  28  37  45  53  61                Number circled: ____
  Domain 5 items: 13  17  22  29  34  38  46  54  55  62        Number circled: ____
  Domain 6 items:  6  14  30  35  39  44  47  56                Number circled: ____
  Domain 7 items:  3  15  24  31  40  48  57  63                Number circled: ____
  Domain 8 items:  5   8  25  32  41  49                        Number circled: ____
  Total: Number of Items Circled: ____


Appendix D: Enhancing the NCEA ACRE Assessment Using Locally Selected/Constructed Questions

(The following is provided as a reminder to local schools and parishes that there is an opportunity to enhance the assessment that NCEA ACRE provides. Consider the following when you plan your next administration of NCEA ACRE.)

Any assessment can monitor only a sample of the knowledge and understandings students are expected to learn. The NCEA ACRE assessments are no exception. While considerable planning went into creating the NCEA ACRE assessments, not all of the questions that the many development advisors wished to ask could be included, due to the limits imposed by testing time.

This opportunity is intended to help local programs extend the depth, or include a more diverse array, of faith knowledge questions to expand the evaluation of local religious education outcomes and goals. We include on Side One of each student's answer sheet an area for your use to capture student responses to additional items. If you are interested, here is how to take advantage of this opportunity.

1. Only multiple-choice questions having one correct/best answer can be used. Four (4) response choices must be presented with each question, and these choices must be labeled as: (A), (B), (C), and (D). Refer to an NCEA ACRE assessment booklet if a model is needed.

2. You decide the questions to ask. Select questions that complement your local religion instruction goals and expectations. The questions may be from public domain sources or instructional materials that provide questions, or may be locally developed questions. The appropriateness, adequacy and representativeness of the questions used are local decisions, as is participation.

3. Not more than 20 questions can be used (using fewer than 20 items is OK).
4. Different questions can be used at different grades.
5. Students are to mark their answers to the questions using the first 20 response positions in the area labeled "Additional Data Collection Area" on Side One of their answer sheet.
6. When preparing your questions, you must adhere to the following answer key. That is, question 1 must be presented so that the correct answer is "D," for question 2 the correct answer must be "A," etc. By following this answer key, we are able to score and report how your students do on your questions. Adopt this answer key for all grades (a short scoring sketch based on this key appears at the end of this appendix):

   1. D    5. B     9. A    13. B    17. A
   2. A    6. C    10. B    14. D    18. C
   3. C    7. A    11. D    15. C    19. D
   4. B    8. D    12. C    16. A    20. B

7. We recommend that your "local" questions be administered to students prior to the time they take the "formal" NCEA ACRE assessment. Consider scheduling the "local assessment" during a pre-session in which students would first complete the answer sheet information grids required by the NCEA ACRE program (about 10 minutes), then go on to answer the local questions.

There is no charge for this service. Group results for each item are reported along with the school or parish's NCEA ACRE results. Contact NCEA or CETE if there are questions or should assistance be needed.
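For programs that also keep an electronic record of students' marks in the Additional Data Collection Area, the sketch below (Python) approximates the scoring performed against the fixed answer key printed above. The response format and function name are assumptions for illustration; official scoring remains the service described in this appendix.

    # Illustrative sketch: score locally selected questions against the fixed answer key above.
    ANSWER_KEY = ["D", "A", "C", "B", "B", "C", "A", "D", "A", "B",
                  "D", "C", "B", "D", "C", "A", "A", "C", "D", "B"]

    def score_local_items(responses, num_items_used=20):
        """responses: one student's marks for positions 1-20, e.g., ["D", "A", "C", ...]."""
        key = ANSWER_KEY[:num_items_used]
        return sum(r == k for r, k in zip(responses, key))   # number correct on the local items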


Contributors

The following individuals made major contributions to the design and development of the revised 2001 NCEA ACRE. The NCEA is indebted to them for their assistance, attention, contributions and support of the assessment, and for their commitment to quality religious education programs in our parishes and schools.

Leisa Anslinger, Dir. of Catholic Formation, Immaculate Heart of Mary Parish, Cincinnati, OH
Bishop Robert Banks, Diocese of Green Bay
Bob Colbert, Executive Director, Department of Religious Education, NCEA
Paul Cooper, Director of Religious Education, St. Margaret Mary Alacoque, St. Louis, MO
Jack Craven, Director of Education, Diocese of Boise, Boise, ID
Lois DeFelice, St. Cyprian Parish, River Grove, IL
Paul E. DeZarn, Principal, St. Raphael School, Louisville, KY
Nancy Dome, Church of St. Isidore, Stow, MA
Suzanne Emsweller, DRE & Assistant Principal, St. Andrew's Parish & School, Diocese of Columbus
Emily Filippi, Assist. Dir., Office of Christian Formation, Catholic Diocese of Richmond, VA
Elizabeth Foer, Assoc. Director, Catechetical Office, Archdiocese of Newark
John Galvan, Department Chair, Religious Studies, Our Lady of Peace Academy, San Diego, CA
Dr. Doug Glasnapp, Co-director, CETE, University of Kansas
Barbara Glynn, DRE, St. Catherine of Siena, Riverside, CT
Brian Guillot, St. John Bosco, Lakewood, WA
James Hamburge, Principal, Judge Memorial Catholic High School, Salt Lake City, UT
Vicki Hawkins, DRE, Nativity Catholic Parish, Diocese of St. Petersburg
Mary Margaret Henshaw, Principal, Transfiguration High School, Tarrytown, NY
Brian Keane, Coor. of Assessment and Research, Office for the Catechism, USCCB, Wash., DC
Kathleen C. Kelley, Christ the Good Shepherd, Spring, TX
Fr. Dan Kutys, Director, Office for the Catechism, USCCB, Washington, DC
Sr. Antoine Lawlor, Camden Diocese Center, Camden, NJ
Brian Lemoi, Director of Religious Education, Diocese of St. Petersburg, FL
Eileen Loughran, Director, St. John Fisher, Rancho Palos Verdes, CA
Dr. Frank Lucido, Sec. for Educ. & DRE, Dio. of Corpus Christi; Asst. Prof., Texas A&M at Corpus Christi, TX
Sr. Jeanette Lucinio, Catholic Theological Union, Chicago, IL
Lars Lund, Assistant Superintendent, Archdiocese of San Francisco, CA
Bishop Richard Malone, Archdiocese of Boston
Patricia Manuli, DRE, St. Mary Parish, Wappingers Falls, NY
Dr. Mark Markuly, Assoc. DRE & Catechesis, Office of Education, Diocese of Belleville, IL
Catherine Martin, Ph.D., St. Raphael Parish, Livingston, NJ
Sr. Eva Regina Martin, Professor, Institute for Black Catholic Studies, Xavier Univ., New Orleans, LA
Tom McLaughlin, Office for Catechesis, Archdiocese of Chicago, IL
Robert Melevin, Low Desert Consultant, Office of Rel. Educ. & Formation, Diocese of San Bernardino
Isobel Milligan, St. Elizabeth Ann Seton, Lake Ridge, VA
Frank Montejano, Regional Supervisor & Religion Coor. for Elem. Schools, Arch. of Los Angeles, CA
Dan Mulhall, Assist. Secretary for Catechesis and Inculturation Concerns, USCCB, Washington, DC
Barbara Nichols, Holy Family Parish, Marietta, GA
William (Bill) Nolan, Chair, Secondary Rel. Dept., Totino-Grace HS, Archdiocese of St. Paul & Minneapolis
Evelyn Nordberg, Principal, St. Timothy's School, San Mateo, CA
Br. John Paige, CSC, Assistant Professor, St. Edward's University, Austin, TX
Steve Palmer, Associate Executive Director, Department of Religious Education, NCEA
Dr. Joe Pedulla, CSTEEP, Boston College, MA
Ron Pihokker, Director of Catechetics, Archdiocese of Newark, NJ
Dr. John Poggio, Co-Director, CETE, University of Kansas
Rev. John Pollard, Archdiocese of Chicago
Publishers of Catechetical / Religious Education Materials*
Kitty Quinn, Office of Education, Diocese of Columbus, OH
William Raddell, Villa Angela-St. Joseph High School & Center for Learning, Cleveland, OH
Diana Dudoit Raiche, Assistant Executive Director, Dept. of Religious Education, NCEA
Dr. Jane Regan, Assistant Professor of Theology & R.E., Boston College, MA
Larry Rilla, Director of Religious Education, Archdiocese of Washington
Sr. Dominica Rocchio, SC, Secretary of Education and Superintendent of Schools, Archdiocese of Newark, NJ
Joseph Shadle, St. John Fisher Parish, Cincinnati, OH
Jennifer Schmidt, Co-Principal, St. Thomas More School, Archdiocese of Los Angeles
Marge Schumer, DRE, Queen of All Saints Parish, Archdiocese of St. Louis
Sr. Maureen Shaughnessy, SC, Assistant Secretary for Catechesis and Leadership Formation, USCCB, Washington, DC
Joseph Sinwell, D.Min., Committee Chair, Director of Religious Education, Diocese of Providence
Carolyn Stucke, Assistant Director, Office of Religious Ed., Archdiocese of Cincinnati, OH
John Valenti, Director of Religious Education, Diocese of Jackson
Dr. Fayette Veverka, Assistant Professor, Villanova University, Drexel Hill, PA
Cris Villapando, Director of Faith Formation Programs, Diocese of Charlotte
Michele Idiart Walsh, DRE, St. Joachim Parish, Hayward, CA
Dr. Tom Walters, St. Meinrad's Seminary, Santa Claus, IN
Jacqueline Wilson, Executive Director, Office of Black Catholics, Archdiocese of Washington, DC
Bishop Donald Wuerl, Diocese of Pittsburgh

* Publishers of catechetical/religious education instructional materials were invited and did contribute to various phases of the NCEA ACRE revision process.

